1.
Clin Infect Dis ; 2024 May 14.
Article in English | MEDLINE | ID: mdl-38743579

ABSTRACT

BACKGROUND: Antibiotics are a strong risk factor for Clostridioides difficile infection (CDI), and CDI incidence is often measured as an important outcome metric for antimicrobial stewardship interventions aiming to reduce antibiotic use. However, the risk of CDI from antibiotics varies by agent and depends on the intensity (i.e., spectrum and duration) of antibiotic therapy. Thus, the impact of stewardship interventions on CDI incidence is variable, and understanding this risk requires a more granular measure of intensity of therapy than traditionally used measures such as days of therapy (DOT). METHODS: We performed a retrospective cohort study to measure the independent association between intensity of antibiotic therapy, as measured by the antibiotic spectrum index (ASI), and hospital-associated CDI (HA-CDI) at a large academic medical center between January 2018 and March 2020. We constructed a marginal Poisson regression model to generate adjusted relative risks for a unit increase in ASI per antibiotic day. RESULTS: We included 35,457 inpatient encounters in our cohort. Sixty-eight percent of patients received at least one antibiotic. We identified 128 HA-CDI cases, corresponding to an incidence rate of 4.1 cases per 10,000 patient-days. After adjusting for known confounders, each additional unit increase in ASI per antibiotic day was associated with 1.09 times the risk of HA-CDI (relative risk = 1.09; 95% confidence interval: 1.06 to 1.13). CONCLUSIONS: ASI was strongly associated with HA-CDI and could be a useful tool in evaluating the impact of antibiotic stewardship on HA-CDI rates, providing more granular information than the more commonly used days of therapy.

2.
J Gen Intern Med ; 34(11): 2443-2450, 2019 11.
Article in English | MEDLINE | ID: mdl-31420823

ABSTRACT

BACKGROUND: The continued rise in fatalities from opioid analgesics despite a steady decline in the number of individual prescriptions directing ≥ 90 morphine milligram equivalents (MME)/day may be explained by patient exposures to redundant prescriptions from multiple prescribers. OBJECTIVES: We evaluated prescribers' specialty and social network characteristics associated with high-risk opioid exposures resulting from single-prescriber high-daily dose prescriptions or multi-prescriber discoordination. DESIGN: Retrospective cohort study. PARTICIPANTS: A cohort of prescribers with opioid analgesic prescription claims for non-cancer chronic opioid users in an Illinois Medicaid managed care program in 2015-2016. MAIN MEASURES: Per prescriber rates of single-prescriber high-daily-dose prescriptions or multi-prescriber discoordination. KEY RESULTS: For 2280 beneficiaries, 36,798 opioid prescription claims were submitted by 3532 prescribers. Compared to 3% of prescriptions (involving 6% of prescribers and 7% of beneficiaries) that directed ≥ 90 MME/day, discoordination accounted for a greater share of high-risk exposures-13% of prescriptions (involving 23% of prescribers and 24% of beneficiaries). The following specialties were at highest risk of discoordinated prescribing compared to internal medicine: dental (incident rate ratio (95% confidence interval) 5.9 (4.6, 7.5)), emergency medicine (4.7 (3.8, 5.8)), and surgical subspecialties (4.2 (3.0, 5.8)). Social network analysis identified 2 small interconnected prescriber communities of high-volume pain management specialists, and 3 sparsely connected groups of predominantly low-volume primary care or emergency medicine clinicians. Using multivariate models, we found that the sparsely connected sociometric positions were a risk factor for high-risk exposures. 
CONCLUSION: Low-volume prescribers in the social network's periphery were at greater risk of high-dose or discoordinated prescribing than interconnected high-volume prescribers. Interventions addressing discoordination among low-volume opioid prescribers in non-integrated practices should be a priority. Potential policy responses include enhanced functionality and integration of Prescription Drug Monitoring Programs and referrals to specialized multidisciplinary pain management centers.


Subject(s)
Analgesics, Opioid/administration & dosage, Practice Patterns, Physicians'/statistics & numerical data, Emergency Medicine, Humans, Opioid-Related Disorders/epidemiology, Prescription Drug Monitoring Programs/statistics & numerical data, Primary Health Care, Retrospective Studies, Social Networking
3.
Clin Infect Dis ; 67(3): 407-410, 2018 07 18.
Article in English | MEDLINE | ID: mdl-29415264

ABSTRACT

Background: In 2013, New Delhi metallo-β-lactamase (NDM)-producing Escherichia coli, a type of carbapenem-resistant Enterobacteriaceae uncommon in the United States, was identified in a tertiary care hospital (hospital A) in northeastern Illinois. The outbreak was traced to a contaminated duodenoscope. Patient-sharing patterns can be described through social network analysis and ego networks, which could be used to identify hospitals most likely to accept patients from a hospital with an outbreak. Methods: Using Illinois' hospital discharge data and the Illinois extensively drug-resistant organism (XDRO) registry, we constructed an ego network around hospital A. We identified which facilities NDM outbreak patients subsequently visited and whether the facilities reported NDM cases. Results: Of the 31 outbreak cases entered into the XDRO registry who visited hospital A, 19 (61%) were subsequently admitted to 13 other hospitals during the following 12 months. Of the 13 hospitals, the majority (n = 9; 69%) were in our defined ego network, and 5 of those 9 hospitals consequently reported at least 1 additional NDM case. Ego network facilities were more likely to identify cases compared to a geographically defined group of facilities (9/22 vs 10/66; P = .01); only 1 reported case fell outside of the ego network. Conclusions: The outbreak hospital's ego network accurately predicted which hospitals the outbreak patients would visit. Many of these hospitals reported additional NDM cases. Prior knowledge of this ego network could have efficiently focused public health resources on these high-risk facilities.
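An ego network of the kind described can be built directly from a patient-sharing edge list. The hospitals, transfer counts, and labels below are invented for illustration; "A" stands in for the outbreak hospital.

```python
import networkx as nx

# Hypothetical inter-hospital patient-sharing network; edge weights are
# patient-transfer counts between facilities
transfers = [("A", "B", 40), ("A", "C", 25), ("A", "D", 5),
             ("B", "C", 10), ("D", "E", 30), ("E", "F", 8)]
G = nx.Graph()
G.add_weighted_edges_from(transfers)

# The ego network around A: A itself plus every facility it shares patients
# with -- the facilities to prioritize for case finding during an outbreak
ego = nx.ego_graph(G, "A")
high_risk = sorted(n for n in ego.nodes() if n != "A")
```

Here `high_risk` contains the facilities one patient-sharing step from the outbreak hospital; facilities such as "E" and "F", connected only indirectly, fall outside the ego network.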


Subject(s)
Carbapenem-Resistant Enterobacteriaceae/isolation & purification, Cross Infection/epidemiology, Disease Outbreaks, Enterobacteriaceae Infections/transmission, Health Facilities, Anti-Bacterial Agents/pharmacology, Cross Infection/microbiology, Drug Resistance, Multiple, Bacterial, Enterobacteriaceae/drug effects, Enterobacteriaceae Infections/epidemiology, Escherichia coli/drug effects, Humans, Illinois/epidemiology, Klebsiella pneumoniae/drug effects, Microbial Sensitivity Tests, Registries, Social Networking
4.
Clin Infect Dis ; 63(7): 889-93, 2016 10 01.
Article in English | MEDLINE | ID: mdl-27486116

ABSTRACT

BACKGROUND: Carbapenem-resistant Enterobacteriaceae (CRE) spread regionally throughout healthcare facilities through patient transfer and cause difficult-to-treat infections. We developed a state-wide patient-sharing matrix and applied social network analyses to determine whether greater connectedness (centrality) to other healthcare facilities and greater patient sharing with long-term acute care hospitals (LTACHs) predicted higher facility CRE rates. METHODS: We combined CRE case information from the Illinois extensively drug-resistant organism registry with measures of centrality calculated from a state-wide hospital discharge dataset to predict facility-level CRE rates, adjusting for hospital size and geographic characteristics. RESULTS: Higher CRE rates were observed among facilities with greater patient sharing, as measured by degree centrality. Each additional hospital connection (unit of degree) conferred a 6% increase in CRE rate in rural facilities (relative risk [RR] = 1.056; 95% confidence interval [CI], 1.030-1.082) and a 3% increase among Chicagoland and non-Chicago urban facilities (RR = 1.027; 95% CI, 1.002-1.052 and RR = 1.025; 95% CI, 1.002-1.048, respectively). Sharing 4 or more patients with LTACHs was associated with higher CRE rates, but this association may have been due to chance (RR = 2.08; 95% CI, .85-5.08; P = .11). CONCLUSIONS: Hospitals with greater connectedness to other hospitals in a statewide patient-sharing network had higher CRE burden. Centrality had a greater effect on CRE rates in rural counties, which do not have LTACHs. Social network analysis likely identifies hospitals at higher risk of CRE exposure, enabling focused clinical and public health interventions.
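Degree centrality, the measure behind the findings above, is simply the number of patient-sharing connections a facility has. The sketch below computes it on a toy graph and applies the paper's reported rural estimate (RR 1.056 per additional connection) to a made-up baseline rate; the hospitals, edges, and baseline are all assumptions.

```python
import networkx as nx

# Toy statewide patient-sharing graph (hypothetical hospitals and edges)
G = nx.Graph([("H1", "H2"), ("H1", "H3"), ("H1", "H4"),
              ("H2", "H3"), ("H4", "H5")])
degree = dict(G.degree())  # degree = number of patient-sharing connections

# Illustration only: scale a hypothetical baseline CRE rate by the reported
# rural relative risk of 1.056 per unit of degree
baseline = 0.5  # hypothetical CRE cases per 10,000 patient-days at degree 0
predicted_rate = {h: baseline * 1.056 ** d for h, d in degree.items()}
```

A facility like "H1", sharing patients with three others, gets a proportionally higher predicted CRE rate than a peripheral facility like "H5" with a single connection.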


Subject(s)
Carbapenem-Resistant Enterobacteriaceae, Cross Infection/epidemiology, Enterobacteriaceae Infections/epidemiology, Hospitals/statistics & numerical data, Aged, Female, Humans, Illinois/epidemiology, Male, Middle Aged, Models, Statistical, Prospective Studies
5.
medRxiv ; 2024 Jan 11.
Article in English | MEDLINE | ID: mdl-38260609

ABSTRACT

Background: Clinical research focused on the burden and impact of Clostridioides difficile infection (CDI) often relies upon accurate identification of cases using existing health record data. Use of diagnosis codes alone can lead to misclassification of cases. Our goal was to develop and validate a multi-component algorithm to identify hospital-associated CDI (HA-CDI) cases using electronic health record (EHR) data. Methods: We performed a validation study using a random sample of adult inpatients at a large academic hospital setting in Portland, Oregon from January 2018 to March 2020. We excluded patients with CDI on admission and those with short lengths of stay (< 4 days). We tested a multi-component algorithm to identify HA-CDI; case patients were required to have received an inpatient course of metronidazole, oral vancomycin, or fidaxomicin and have at least one of the following: a positive C. difficile laboratory test or the International Classification of Diseases, Tenth Revision (ICD-10) code for non-recurrent CDI. For a random sample of 80 algorithm-identified HA-CDI cases and 80 non-cases, we performed manual EHR review to establish a gold-standard HA-CDI diagnosis. We then calculated overall percent accuracy, sensitivity, specificity, and positive and negative predictive value for the algorithm overall and for the individual components. Results: Our case definition algorithm identified HA-CDI cases with 94% accuracy (95% Confidence Interval (CI): 88% to 97%). We achieved 100% sensitivity (94% to 100%), 89% specificity (81% to 95%), 88% positive predictive value (78% to 94%), and 100% negative predictive value (95% to 100%). Requiring a positive C. difficile test as our gold standard further improved diagnostic performance (97% accuracy [93% to 99%], 93% PPV [85% to 98%]). Conclusions: Our algorithm accurately detected true HA-CDI cases from EHR data in our patient population. A multi-component algorithm performs better than any isolated component. Requiring a positive laboratory test for C. difficile strengthens diagnostic performance even further. Accurate detection could have important implications for CDI tracking and research.
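The validation metrics reported above all come from a single 2×2 table of algorithm calls against chart review. A small sketch, with hypothetical cell counts chosen only to resemble the scale of the 160-record review (they are not the study's actual table):

```python
def diagnostic_metrics(tp, fp, fn, tn):
    """Standard measures from a 2x2 table of algorithm result vs. gold standard."""
    total = tp + fp + fn + tn
    return {
        "accuracy": (tp + tn) / total,       # overall agreement
        "sensitivity": tp / (tp + fn),       # true-positive rate
        "specificity": tn / (tn + fp),       # true-negative rate
        "ppv": tp / (tp + fp),               # positive predictive value
        "npv": tn / (tn + fn),               # negative predictive value
    }

# Hypothetical counts for a 160-record review (not the study's data)
m = diagnostic_metrics(tp=70, fp=10, fn=0, tn=80)
```

With zero false negatives, sensitivity and NPV are exactly 1.0, while false positives pull down specificity and PPV; this mirrors the pattern in the reported results.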

6.
Eur J Clin Invest ; 40(6): 497-503, 2010 Jun.
Article in English | MEDLINE | ID: mdl-20412293

ABSTRACT

BACKGROUND: Peri-operative cardiac events are common and associated with significant morbidity. A predictive biomarker would assist in pre-operative risk stratification of surgical patients. This study explored the utility of pre-operative measurements of platelet-bound CD40 ligand and other biomarkers for predicting peri-operative cardiac events in total hip or knee arthroplasty. METHODS: Blood samples were collected from 62 patients prior to surgery and tested for the biomarkers platelet CD40 ligand, platelet factor V/Va, platelet P-selectin, high-sensitivity C-reactive protein, B-type natriuretic peptide and soluble CD40 ligand. The Revised Cardiac Risk Index was also calculated. Patients were then followed up prospectively and screened for peri-operative cardiac events by means of ECG, serial troponin I, a cardiologist's review and an interview at 6 weeks post operation. RESULTS: Six of 62 (9.7%) patients had a cardiac event. Patients who experienced a cardiac event had higher pre-operative platelet CD40 ligand levels as measured by flow cytometry [median 0.55% vs. 0.29% (P = 0.02)]. In a sample of this size, platelet CD40L was the only biomarker independently associated with cardiac events (P = 0.02), with an area under the receiver-operator characteristic curve of 0.79. CONCLUSION: In a study of this size, of the six biomarkers tested, only platelet CD40 ligand was found to have a probable association with peri-operative cardiac events in hip and knee arthroplasty.


Subject(s)
Arthroplasty, Replacement, Hip, Arthroplasty, Replacement, Knee, Blood Platelets/metabolism, CD40 Ligand/blood, Heart Diseases/blood, Aged, Aged, 80 and over, Area Under Curve, Arrhythmias, Cardiac/epidemiology, Biomarkers/blood, Blood Coagulation Factors/analysis, C-Reactive Protein/analysis, Female, Flow Cytometry, Follow-Up Studies, Heart Diseases/epidemiology, Heart Failure/blood, Heart Failure/epidemiology, Humans, Male, Myocardial Infarction/blood, Myocardial Infarction/epidemiology, Natriuretic Peptide, Brain/blood, P-Selectin/blood, Predictive Value of Tests, Preoperative Care, Troponin I/blood
7.
BMJ ; 367: l6461, 2019 Dec 11.
Article in English | MEDLINE | ID: mdl-31826860

ABSTRACT

OBJECTIVES: To identify the frequency with which antibiotics are prescribed in the absence of a documented indication in the ambulatory care setting, to quantify the potential effect on assessments of appropriateness of antibiotics, and to understand patient, provider, and visit level characteristics associated with antibiotic prescribing without a documented indication. DESIGN: Cross sectional study. SETTING: 2015 National Ambulatory Medical Care Survey. PARTICIPANTS: 28 332 sample visits representing 990.9 million ambulatory care visits nationwide. MAIN OUTCOME MEASURES: Overall antibiotic prescribing and whether each antibiotic prescription was accompanied by appropriate, inappropriate, or no documented indication as identified through ICD-9-CM (international classification of diseases, 9th revision, clinical modification) codes. Survey weighted multivariable logistic regression was used to evaluate potential risk factors for receipt of an antibiotic prescription without a documented indication. RESULTS: Antibiotics were prescribed during 13.2% (95% confidence interval 11.6% to 13.7%) of the estimated 990.8 million ambulatory care visits in 2015. According to the criteria, 57% (52% to 62%) of the 130.5 million prescriptions were for appropriate indications, 25% (21% to 29%) were inappropriate, and 18% (15% to 22%) had no documented indication. This corresponds to an estimated 24 million prescriptions without a documented indication. Being an adult male, spending more time with the provider, and seeing a non-primary care specialist were significantly positively associated with antibiotic prescribing without an indication. Sulfonamides and urinary anti-infective agents were the antibiotic classes most likely to be prescribed without documentation. CONCLUSIONS: This nationally representative study of ambulatory visits identified a large number of prescriptions for antibiotics without a documented indication. 
Antibiotic prescribing in the absence of a documented indication may severely bias national estimates of appropriate antibiotic use in this setting. This study identified a wide range of factors associated with antibiotic prescribing without a documented indication, which may be useful in directing initiatives aimed at supporting better documentation.


Subject(s)
Ambulatory Care Facilities, Anti-Bacterial Agents/pharmacology, Drug Utilization/standards, Inappropriate Prescribing/statistics & numerical data, Practice Patterns, Physicians', Cross-Sectional Studies, Humans, Risk Factors, United States
8.
Open Forum Infect Dis ; 6(12): ofz483, 2019 Dec.
Article in English | MEDLINE | ID: mdl-32128328

ABSTRACT

BACKGROUND: Timely identification of patients likely to harbor carbapenem-resistant Enterobacteriaceae (CRE) can help health care facilities provide effective infection control and treatment. We evaluated whether a model utilizing prior health care information from a state hospital discharge database could predict a patient's probability of CRE colonization at the time of hospital admission. METHODS: We performed a case-control study using the Illinois hospital discharge database. From a 2014-2015 patient cohort, we defined cases as index adult patient hospital encounters with a positive CRE culture collected within the first 3 days of hospitalization, as reported to the Illinois XDRO registry; controls were all patient admissions from the same hospital and month. We split the data into training (~60%) and validation (~40%) sets and developed a logistic regression model to estimate coefficients for predictors of interest. RESULTS: We identified 486 index cases and 340,005 controls. Independent risk factors for CRE at the time of admission were age, number of short-term acute care hospital (STACH) hospitalizations in the prior 365 days, mean STACH length of stay, number of long-term acute care hospital (LTACH) hospitalizations in the prior 365 days, mean LTACH length of stay, current admission to LTACH, and prior hospital admission with an infection diagnosis code. When applying the model to the validation data set, the area under the receiver operating characteristic curve was 0.84. CONCLUSIONS: A prediction model utilizing prior health care exposure information could discriminate patients who were likely to harbor CRE at the time of hospital admission.
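The train/validate workflow described above can be sketched on simulated data. Everything here is an assumption for illustration: the predictors loosely echo the paper's risk factors (prior STACH/LTACH stays, prior infection codes), but the sample size, effect sizes, and prevalence are invented.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

# Simulated admissions with hypothetical effect sizes
rng = np.random.default_rng(1)
n = 5000
X = np.column_stack([
    rng.integers(0, 5, n),   # STACH admissions in prior 365 days
    rng.integers(0, 3, n),   # LTACH admissions in prior 365 days
    rng.integers(0, 2, n),   # prior admission with an infection diagnosis code
])
logit = -5.0 + 0.4 * X[:, 0] + 1.2 * X[:, 1] + 0.8 * X[:, 2]
y = (rng.random(n) < 1.0 / (1.0 + np.exp(-logit))).astype(int)

# ~60/40 train/validation split, mirroring the study design
X_tr, X_va, y_tr, y_va = train_test_split(X, y, test_size=0.4, random_state=0)
auc = roc_auc_score(y_va, LogisticRegression().fit(X_tr, y_tr)
                    .predict_proba(X_va)[:, 1])
```

The AUC on held-out admissions measures how well the fitted model separates CRE-positive from CRE-negative encounters, the same discrimination statistic the study reports.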

9.
Infect Control Hosp Epidemiol ; 39(4): 377-382, 2018 04.
Article in English | MEDLINE | ID: mdl-29460713

ABSTRACT

OBJECTIVE: Because antibacterial history is difficult to obtain, especially when the exposure occurred at an outside hospital, we assessed whether infection-related diagnostic billing codes, which are more readily available through hospital discharge databases, could infer prior antibacterial receipt. DESIGN: Retrospective cohort study. PARTICIPANTS: This study included 121,916 hospitalizations representing 78,094 patients across 3 hospitals. METHODS: We obtained hospital inpatient data from 3 Chicago-area hospitals. Encounters were categorized as "infection" if at least 1 International Classification of Diseases, Ninth Revision, Clinical Modification (ICD-9-CM) code indicated a bacterial infection. From medication administration records, we categorized antibacterial agents and calculated total therapy days using Centers for Disease Control and Prevention (CDC) definitions. We evaluated bivariate associations between infection encounters and 3 categories of antibacterial exposure: any, broad spectrum, or surgical prophylaxis. We constructed multivariable models to evaluate adjusted risk ratios for antibacterial receipt. RESULTS: Of the 121,916 inpatient encounters (78,094 patients) across the 3 hospitals, 24% had an associated infection code, 47% received an antibacterial, and 13% received a broad-spectrum antibacterial. Infection-related ICD-9-CM codes were associated with a 2-fold increase in antibacterial administration compared to encounters lacking such codes (RR, 2.29; 95% confidence interval [CI], 2.27-2.31) and a 5-fold increased risk for broad-spectrum antibacterial administration (RR, 5.52; 95% CI, 5.37-5.67). Encounters with infection codes had 3 times the number of antibacterial days. CONCLUSIONS: Infection diagnostic billing codes are strong surrogate markers for prior antibacterial exposure, especially to broad-spectrum antibacterial agents; this association can be used to enhance early identification of patients at risk of multidrug-resistant organism (MDRO) carriage at the time of admission.


Subject(s)
Anti-Bacterial Agents, Cross Infection, Hospitals, Anti-Bacterial Agents/classification, Anti-Bacterial Agents/pharmacology, Cross Infection/diagnosis, Cross Infection/drug therapy, Cross Infection/epidemiology, Cross Infection/prevention & control, Disease Notification/methods, Disease Notification/standards, Drug Resistance, Multiple/drug effects, Female, Hospitals/standards, Hospitals/statistics & numerical data, Humans, Illinois/epidemiology, International Classification of Diseases, Male, Medical Records/statistics & numerical data, Middle Aged, Patient Discharge/standards, Patient Discharge/statistics & numerical data, Risk Assessment
10.
Infect Control Hosp Epidemiol ; 39(7): 765-770, 2018 07.
Article in English | MEDLINE | ID: mdl-29695310

ABSTRACT

OBJECTIVE: To evaluate probiotics for the primary prevention of Clostridium difficile infection (CDI) among hospital inpatients. DESIGN: A before-and-after quality improvement intervention comparing 12-month baseline and intervention periods. SETTING: A 694-bed teaching hospital. INTERVENTION: We administered a multispecies probiotic comprising L. acidophilus (CL1285), L. casei (LBC80R), and L. rhamnosus (CLR2) to eligible antibiotic recipients within 12 hours of initial antibiotic receipt through 5 days after the final dose. We excluded (1) all patients on neonatal, pediatric and oncology wards; (2) all recipients of perioperative prophylactic antibiotics; (3) all those restricted from oral intake; and (4) those with pancreatitis, leukopenia, or posttransplant status. We defined CDI by symptoms plus C. difficile toxin detection by polymerase chain reaction (PCR). Our primary outcome was hospital-onset CDI incidence on eligible hospital units, analyzed using segmented regression. RESULTS: The study included 251 CDI episodes among 360,016 patient-days during the baseline and intervention periods, for an overall incidence rate of 7.0 per 10,000 patient-days. The incidence rate was similar during the baseline and intervention periods (6.9 vs 7.0 per 10,000 patient-days; P = .95). However, compared to the first 6 months of the intervention, we detected a significant decrease in CDI during the final 6 months (incidence rate ratio, 0.6; 95% confidence interval, 0.4-0.9; P = .009). Testing intensity remained stable between the baseline and intervention periods: 19% versus 20% of stools tested were C. difficile positive by PCR, respectively. From medical record reviews, only 26% of eligible patients received a probiotic per the protocol. CONCLUSIONS: Despite poor adherence to the protocol, there was a reduction in the incidence of CDI during the intervention, which emerged approximately 6 months after introducing the probiotic for primary prevention.


Subject(s)
Clostridium Infections/epidemiology, Clostridium Infections/prevention & control, Cross Infection/epidemiology, Cross Infection/prevention & control, Primary Prevention/methods, Probiotics/therapeutic use, Case-Control Studies, Chicago/epidemiology, Clostridioides difficile, Cross Infection/microbiology, Hospitals, Teaching, Humans, Quality Improvement, Tertiary Care Centers
11.
Eur J Cardiothorac Surg ; 29(1): 82-8, 2006 Jan.
Article in English | MEDLINE | ID: mdl-16337395

ABSTRACT

OBJECTIVE: This paper compares four techniques used to assess change in neuropsychological test scores before and after coronary artery bypass graft surgery (CABG), and includes a rationale for the classification of a patient as overall impaired. METHODS: A total of 55 patients were tested before and after surgery on the MicroCog neuropsychological test battery. A matched control group underwent the same testing regime to generate test-retest reliabilities and practice effects. Two techniques designed to assess statistical change were used: the Reliable Change Index (RCI), modified for practice, and the Standardised Regression-based (SRB) technique. These were compared against two fixed cutoff techniques (the standard deviation and 20% change methods). RESULTS: The incidence of decline across test scores varied markedly depending on which technique was used to describe change. The SRB method identified more patients as declined on most measures. In comparison, the two fixed cutoff techniques displayed relatively reduced sensitivity in the detection of change. CONCLUSIONS: Overall change in an individual can be described provided the investigators choose a rational cutoff based on the likely spread of scores due to chance. A cutoff of ≥20% of test scores declining provided acceptable probability given the number of tests commonly encountered. Investigators must also choose a test battery that minimises shared variance among test scores.
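The practice-adjusted RCI subtracts the control group's mean retest gain from the observed change and scales by the spread of control difference scores. A sketch with hypothetical numbers (the test scores, practice effect, and SD below are assumptions, not values from the study):

```python
def reliable_change_index(pre, post, practice_effect, sd_diff):
    """RCI adjusted for practice: observed change minus the control group's
    mean practice effect, scaled by the SD of control test-retest
    difference scores."""
    return ((post - pre) - practice_effect) / sd_diff

# Hypothetical values: controls gain 2 points on retest (practice effect),
# with an SD of difference scores of 4
rci = reliable_change_index(pre=50.0, post=44.0, practice_effect=2.0, sd_diff=4.0)
reliable_decline = rci <= -1.645  # one-tailed 5% criterion for decline
```

A 6-point drop that would shrink to 4 points without the practice adjustment here yields an RCI of -2.0, exceeding the conventional one-tailed criterion for a reliable decline.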


Subject(s)
Cognition Disorders/diagnosis, Coronary Artery Bypass/adverse effects, Neuropsychological Tests/statistics & numerical data, Aged, Aged, 80 and over, Case-Control Studies, Cognition Disorders/etiology, Cohort Studies, Female, Humans, Male, Middle Aged, Models, Statistical, Postoperative Period
12.
Tree Physiol ; 34(11): 1252-62, 2014 Nov.
Article in English | MEDLINE | ID: mdl-24186940

ABSTRACT

Minimizing nitrogen (N) fertilization inputs during cultivation is essential for sustainable production of bioenergy and biofuels. The biomass crop willow (Salix spp.) is considered to have low N fertilizer requirements due to efficient recycling of nutrients during the perennial cycle. To investigate how successfully different willow genotypes assimilate and allocate N during growth, and remobilize and consequently recycle N before the onset of winter dormancy, N allocation and N remobilization (to and between different organs) were examined in 14 genotypes of a genetic family using elemental analysis and ¹⁵N as a label. Cuttings were established in pots in April and sampled in June, August and at onset of senescence in October. Biomass yield of the trees correlated well with yields recorded in the field. Genotype-specific variation was observed for all traits measured and general trends spanning these sampling points were identified when trees were grouped by biomass yield. Nitrogen reserves in the cutting fuelled the entirety of the canopy establishment, yet earlier cessation of this dependency was linked to higher biomass yields. The stem was found to be the major N reserve by autumn, which constitutes a major source of N loss at harvest, typically every 2-3 years. These data contribute to understanding N remobilization in short rotation coppice willow and to the identification of traits that could potentially be selected for in breeding programmes to further improve the sustainability of biomass production.


Subject(s)
Nitrogen/metabolism, Salix/metabolism, Biofuels, Biological Transport, Biomass, Breeding, Genotype, Isotope Labeling, Nitrogen Isotopes/analysis, Organ Specificity, Phenotype, Plant Leaves/genetics, Plant Leaves/metabolism, Plant Stems/growth & development, Plant Stems/metabolism, Salix/growth & development, Trees
13.
Biotechnol Biofuels ; 5(1): 83, 2012 Nov 22.
Article in English | MEDLINE | ID: mdl-23173900

ABSTRACT

BACKGROUND: The recalcitrance of lignocellulosic cell wall biomass to deconstruction varies greatly in angiosperms, yet the source of this variation remains unclear. Here, in eight genotypes of short rotation coppice willow (Salix sp.) variability of the reaction wood (RW) response and the impact of this variation on cell wall recalcitrance to enzymatic saccharification was considered. RESULTS: A pot trial was designed to test if the 'RW response' varies between willow genotypes and contributes to the differences observed in cell wall recalcitrance to enzymatic saccharification in field-grown trees. Biomass composition was measured via wet chemistry and used with glucose release yields from enzymatic saccharification to determine cell wall recalcitrance. The levels of glucose release found for pot-grown control trees showed no significant correlation with glucose release from mature field-grown trees. However, when a RW phenotype was induced in pot-grown trees, glucose release was strongly correlated with that for mature field-grown trees. Field studies revealed a 5-fold increase in glucose release from a genotype grown at a site exposed to high wind speeds (a potentially high RW inducing environment) when compared with the same genotype grown at a more sheltered site. CONCLUSIONS: Our findings provide evidence for a new concept concerning variation in the recalcitrance to enzymatic hydrolysis of the stem biomass of different, field-grown willow genotypes (and potentially other angiosperms). Specifically, that genotypic differences in the ability to produce a response to RW inducing conditions (a 'RW response') indicate that this RW response is a primary determinant of the variation observed in cell wall glucan accessibility. 
The identification of the importance of this RW response trait in willows is likely to be valuable in selective breeding strategies in willow (and other angiosperm) biofuel crops and, with further work to dissect the nature of RW variation, could provide novel targets for genetic modification for improved biofuel feedstocks.

14.
Biotechnol Biofuels ; 4: 13, 2011 May 24.
Article in English | MEDLINE | ID: mdl-21609446

ABSTRACT

BACKGROUND: Short rotation coppice willow is a potential lignocellulosic feedstock in the United Kingdom and elsewhere; however, research on optimising willow specifically for bioethanol production has started developing only recently. We have used the feedstock Salix viminalis × Salix schwerinii cultivar 'Olof' in a three-month pot experiment with the aim of modifying cell wall composition and structure within the stem to the benefit of bioethanol production. Trees were treated for 26 or 43 days with tension wood induction and/or with an application of the cellulose synthesis inhibitor 2,6-dichlorobenzonitrile that is specific to secondary cell walls. Reaction wood (tension and opposite wood) was isolated from material that had received the 43-day tension wood induction treatment. RESULTS: Glucan content, lignin content and enzymatically released glucose were assayed. All measured parameters were altered without loss of total stem biomass yield, indicating that enzymatic saccharification yield can be enhanced by both alterations to cell wall structure and alterations to absolute contents of either glucan or lignin. CONCLUSIONS: Final glucose yields can be improved by the induction of tension wood without a detrimental impact on biomass yield. The increase in glucan accessibility to cell wall degrading enzymes could help contribute to reducing the energy and environmental impacts of the lignocellulosic bioethanol production process.

15.
Bioresour Technol ; 101(6): 1652-61, 2010 Mar.
Article in English | MEDLINE | ID: mdl-19857960

ABSTRACT

A novel three-stage bioprocess achieved 75% volatile solids (VS) removal at an organic loading rate (OLR) of 4 g VS L⁻¹ day⁻¹, a solids retention time (SRT) of 66 days, a hydraulic retention time (HRT) of 20 days, and a temperature of 35 °C. The bioprocess consisted of an anaerobic hydrolytic reactor (HR) in which the solids and liquid fractions of the organic fraction of municipal solid waste (OFMSW) were separated with a mesh. The leachate was pumped to a submerged anaerobic membrane bioreactor (SAMBR) and the treated permeate was polished in an aerobic membrane bioreactor (AMBR). Denaturing gradient gel electrophoresis (DGGE) and DNA sequencing analyses indicated that the increase in methane content in the HR caused by the excess sludge recycle from the SAMBR was associated with an increase in the number of hydrogenotrophic species, mainly Methanobrevibacter sp., Methanobacterium formicicum and Methanosarcina sp. At 20 °C, VS removal dropped to 50% in the HR and some DGGE bands disappeared relative to the 35 °C samples, while other bands, such as the one corresponding to Ruminococcus flavefaciens, were reduced in intensity. The species associated with the COD-polishing properties of the AMBR belong to the genera Pseudomonas and Hyphomonas and the family Hyphomicrobiaceae. These results highlight the positive effect of recycling the excess sludge from the SAMBR to re-inoculate the HR with hydrogenotrophic species.


Asunto(s)
Archaea/metabolismo , Bacterias/metabolismo , Reactores Biológicos/microbiología , Eliminación de Residuos/métodos , Eliminación de Residuos Líquidos/métodos , Anaerobiosis , Biotecnología/métodos , Ecología , Hidrólisis , Metano/química , Nitrógeno/química , Filogenia , Pseudomonas/metabolismo , Aguas del Alcantarillado , Temperatura
16.
Anal Quant Cytol Histol ; 32(6): 301-10, 2010 Dec.
Artículo en Inglés | MEDLINE | ID: mdl-21456341

RESUMEN

OBJECTIVE: To examine bias associated with the human-interactive components of semi-automated machine vision systems used in quantitative histometry. STUDY DESIGN: A standard set of 20 images was created from 5 nuclei sampled from hematoxylin-eosin-stained sections of benign tissue within a prostate tissue microarray, each nucleus rotated through the four cardinal directions. Four trained technicians segmented these images at the start, then at the end, of 3 daily sessions, creating a total analytic set of 480 observations. Measurements of nuclear area (NA), nuclear roundness factor (NRF) and mean optical density (MOD) were compared by segmenter, time and rotational orientation. RESULTS: NA varied significantly among sessions (p < 0.0009), and session variance differed within segmenter (p < 0.0001). NRF varied significantly among segmenters (p < 0.001) and sessions (p < 0.0001), with significant within-session (p < 0.0001) and intra-session (p = 0.026) differences. MOD varied among sessions (p < 0.0001) and within sessions (p < 0.049). CONCLUSION: Imaging systems remain vulnerable to statistically significant inter-segmenter variation despite extensive efforts to eliminate variation among individual segmenters. Because statistical significance often guides decision-making in morphometric analysis, such effects can introduce bias. Current practices and quality assurance methods require review to eliminate individual operator effects in semi-automated machine systems.
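Of the features compared above, nuclear area and roundness are computed directly from the segmented nuclear boundary, which is why segmenter variation propagates into them. A minimal sketch from a polygonal contour is below; the roundness formula used (4·pi·A / P², equal to 1.0 for a perfect circle) is one common shape descriptor and may differ from the exact NRF definition used in the study, and the contour is a hypothetical example.

```python
import math

def nuclear_area(contour):
    """Polygon area via the shoelace formula (contour: list of (x, y))."""
    a = 0.0
    n = len(contour)
    for i in range(n):
        x1, y1 = contour[i]
        x2, y2 = contour[(i + 1) % n]
        a += x1 * y2 - x2 * y1
    return abs(a) / 2.0

def perimeter(contour):
    """Sum of edge lengths around the closed contour."""
    n = len(contour)
    return sum(math.dist(contour[i], contour[(i + 1) % n]) for i in range(n))

def roundness(contour):
    """4*pi*A / P^2: 1.0 for a circle, smaller for irregular shapes."""
    p = perimeter(contour)
    return 4.0 * math.pi * nuclear_area(contour) / (p * p)

# A regular 64-gon approximating a circular nucleus of radius 5 px:
circle = [(5 * math.cos(2 * math.pi * t / 64), 5 * math.sin(2 * math.pi * t / 64))
          for t in range(64)]
print(round(roundness(circle), 3))  # close to 1.0
```

Small disagreements between segmenters about exactly which boundary pixels belong to the nucleus change both A and P, so even sub-pixel segmentation differences shift these features, consistent with the inter-segmenter effects reported above.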


Asunto(s)
Procesamiento de Imagen Asistido por Computador , Análisis por Micromatrices , Próstata/patología , Hiperplasia Prostática/patología , Humanos , Masculino , Variaciones Dependientes del Observador
17.
Anal Quant Cytol Histol ; 32(6): 311-9, 2010 Dec.
Artículo en Inglés | MEDLINE | ID: mdl-21456342

RESUMEN

OBJECTIVE: To compare manual and automated image analysis systems for morphologic analysis of nuclei from benign prostate, high-grade prostatic intraepithelial neoplasia (HGPIN) and prostate cancer (CaP). Morphologic features derived by automated image analysis may be more objective and reproducible than those from manual systems, which require humans to segment nuclei from histologic images. STUDY DESIGN: Images of hematoxylin-eosin-stained sections of a prostate tissue microarray were analyzed independently using the automated and manual systems. Mean optical density (MOD), nuclear area (NA) and nuclear roundness factor (NRF) were the morphologic features studied, and the ability of each system's features to differentiate between tissue types was compared. RESULTS: Nuclei from 17 benign prostatic hyperplasia (BPH), 4 HGPIN and 8 aggressive CaP cases were analyzed. In multivariate models, the manual system better distinguished BPH from HGPIN (p < 0.0001), whereas the automated system better distinguished BPH from CaP (p = 0.01). The manual system distinguished BPH from HGPIN using NA (p < 0.0001) and MOD (p < 0.0001), whereas the automated system distinguished BPH from CaP using MOD (p < 0.0001) and NRF (p = 0.004). CONCLUSION: The minimal human effort required for automated image analysis makes it superior to the manual system.


Asunto(s)
Procesamiento de Imagen Asistido por Computador/métodos , Análisis por Micromatrices , Próstata/patología , Automatización , Núcleo Celular/patología , Colorantes/química , Hematoxilina/química , Humanos , Masculino
18.
EuroIntervention ; 5(3): 330-5, 2009 Aug.
Artículo en Inglés | MEDLINE | ID: mdl-19736157

RESUMEN

AIMS: Our study sought to evaluate the mechanisms of current strategies for optimal anticoagulation during percutaneous coronary intervention (PCI). METHODS AND RESULTS: Thirty-two high-risk acute coronary syndrome patients were randomised to bivalirudin with provisional GPIIb/IIIa inhibition (GPIIb/IIIa) or unfractionated heparin (UFH) with mandatory GPIIb/IIIa. Flow cytometric measurements immediately after anticoagulation showed that, unlike UFH, bivalirudin did not activate platelets (as indicated by P-selectin expression and fibrinogen binding) and decreased platelet-monocyte aggregates and monocyte expression of tissue factor. UFH released tissue factor pathway inhibitor (TFPI) during and immediately after PCI, whereas bivalirudin (irrespective of GPIIb/IIIa) did not; accordingly, TFPI levels were lower with bivalirudin during and immediately after PCI (P < 0.01). Thrombin generation, as indicated by prothrombin fragment F1+2 levels, was reduced during PCI in the UFH group (P < 0.01) but not with bivalirudin. Soluble CD40 ligand, which is associated with thrombosis, was higher in the bivalirudin group irrespective of GPIIb/IIIa at the same stages (P < 0.05). CONCLUSIONS: Bivalirudin has some early advantages over UFH with respect to platelet activation. However, there are significant limitations in its mechanism of action, particularly a lack of release of tissue factor pathway inhibitor.


Asunto(s)
Síndrome Coronario Agudo/terapia , Angioplastia Coronaria con Balón , Anticoagulantes/uso terapéutico , Coagulación Sanguínea/efectos de los fármacos , Heparina/uso terapéutico , Fragmentos de Péptidos/uso terapéutico , Activación Plaquetaria/efectos de los fármacos , Inhibidores de Agregación Plaquetaria/uso terapéutico , Complejo GPIIb-IIIa de Glicoproteína Plaquetaria/antagonistas & inhibidores , Síndrome Coronario Agudo/sangre , Síndrome Coronario Agudo/tratamiento farmacológico , Anciano , Angioplastia Coronaria con Balón/efectos adversos , Anticoagulantes/efectos adversos , Aspirina/uso terapéutico , Biomarcadores/sangre , Ligando de CD40/sangre , Clopidogrel , Quimioterapia Combinada , Femenino , Fibrinógeno/metabolismo , Heparina/efectos adversos , Hirudinas/efectos adversos , Humanos , Lipoproteínas/sangre , Masculino , Glicoproteínas de Membrana/sangre , Persona de Mediana Edad , Monocitos/efectos de los fármacos , Monocitos/metabolismo , Fragmentos de Péptidos/efectos adversos , Fragmentos de Péptidos/sangre , Adhesividad Plaquetaria/efectos de los fármacos , Protrombina , Proteínas Recombinantes/efectos adversos , Proteínas Recombinantes/uso terapéutico , Tromboplastina/metabolismo , Trombosis/etiología , Trombosis/prevención & control , Ticlopidina/análogos & derivados , Ticlopidina/uso terapéutico , Resultado del Tratamiento
19.
Med J Aust ; 190(12): 665-9, 2009 Jun 15.
Artículo en Inglés | MEDLINE | ID: mdl-19527199

RESUMEN

OBJECTIVES: To determine whether redesign of pathology processes, including indicators of sample priority, could reduce patient length of stay (LOS) in an emergency department (ED), and to assess the long-term impact of two indicators of sample priority on pathology clinical performance indicators for ED samples. DESIGN, SETTING AND PARTICIPANTS: Two observational studies of de-identified data from standard databases were conducted: a single-site pilot trial of patients attending the ED of one hospital, compared with historical controls; and a multisite study of 132,521 full blood count (FBC) requests for patients attending seven EDs that utilised either of two pathology process changes (coloured specimen transport bags alone, or coloured specimen bags plus blood tubes with a priority indicator). MAIN OUTCOME MEASURES: LOS in the ED was measured for the pilot trial; collected-to-validated times for FBCs that fulfilled computer algorithm validation rules were measured for the multisite study. RESULTS: In the pilot trial, the redesigned pathology process produced a 29-minute (15.6%) reduction in median ED LOS for all patients (P < 0.001) compared with historical controls. In the multisite study, use of coloured specimen bags plus blood tubes with a priority indicator produced an 8-minute (20.1%) reduction in mean collected-to-validated times for FBC requests compared with requests using coloured specimen bags alone (P < 0.001). CONCLUSIONS: The pilot trial revealed a direct relationship between pathology process design and ED LOS, suggesting that redesigned pathology processes can significantly reduce it. The multisite study showed that collecting samples directly into blood tubes with an incorporated priority indicator reduces pathology test turnaround times. Together, these data suggest that ED LOS can be significantly reduced by simple changes to pathology processes, such as collecting samples directly into specimen containers with an incorporated priority indicator.


Asunto(s)
Técnicas de Laboratorio Clínico/estadística & datos numéricos , Servicio de Urgencia en Hospital/tendencias , Tiempo de Internación/estadística & datos numéricos , Servicio de Patología en Hospital/tendencias , Humanos , Proyectos Piloto , Estudios Prospectivos , Garantía de la Calidad de Atención de Salud , Factores de Tiempo
20.
Perfusion ; 22(1): 27-33, 2007 Jan.
Artículo en Inglés | MEDLINE | ID: mdl-17633132

RESUMEN

BACKGROUND: An analysis of neuropsychological impairment following cardiopulmonary bypass was performed in 55 patients undergoing elective coronary artery bypass grafting. METHODS: Neurocognitive function was measured preoperatively using the MicroCog: Assessment of Cognitive Functioning computer-based testing tool, and testing was repeated postoperatively immediately before discharge from hospital. Significant score decline was assessed using the standardised regression-based technique, and a patient was classified as overall impaired when ≥ 20% of test scores showed significant decline. S-100β, a proposed marker of neurological damage, was also measured. Prothrombin fragment 1+2 (F1+2) was measured as a marker of thrombin generation, to test the hypothesis that excessive haemostatic activation may lead to thromboembolic damage to the brain. RESULTS AND CONCLUSIONS: 32.7% of patients were classified as significantly impaired. No relationship was detected between F1+2 and any neuropsychological test score; however, the study was limited by its small sample size. F1+2 levels were higher in patients undergoing prolonged bypass times. Neuropsychological decline was significantly correlated with patient age, suggesting that a degree of caution is warranted when operating on an elderly cohort. An unexpected relationship was detected between higher heparin concentrations and increased risk of neuropsychological impairment; however, this finding requires re-evaluation.
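The standardised regression-based (SRB) approach mentioned above scores decline per test as the gap between the observed postoperative score and the score a regression on control data would predict. The sketch below shows one common form of this calculation; the regression coefficients, standard error of estimate (SEE), cut-offs and patient scores are invented for illustration, since a real analysis estimates them from a control or normative sample.

```python
# Hedged sketch of one common standardised regression-based (SRB) decline
# classification. All numbers below are hypothetical, not from the study.

def srb_z(pre, post, intercept, slope, see):
    """z-score of the observed post-op score vs. the score predicted from pre-op."""
    predicted = intercept + slope * pre
    return (post - predicted) / see

def overall_impaired(z_scores, z_cut=-1.645, prop_cut=0.20):
    """Classify a patient as impaired when >= 20% of tests show significant decline."""
    declined = sum(1 for z in z_scores if z <= z_cut)
    return declined / len(z_scores) >= prop_cut

# Hypothetical patient: 10 tests with pre/post scores and a shared toy regression
# (intercept 5.0, slope 0.9, SEE 3.0, as if fitted on a control sample).
pre  = [50, 48, 55, 60, 45, 52, 58, 47, 49, 53]
post = [49, 40, 54, 52, 44, 51, 57, 39, 48, 52]
zs = [srb_z(p, q, intercept=5.0, slope=0.9, see=3.0) for p, q in zip(pre, post)]

print(overall_impaired(zs))  # True: 3 of 10 tests fall below z = -1.645
```

Unlike a simple pre/post difference, the regression step corrects for practice effects and regression to the mean, which is why SRB methods are preferred for detecting genuine postoperative decline.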


Asunto(s)
Trastornos del Conocimiento/etiología , Puente de Arteria Coronaria/efectos adversos , Pruebas Neuropsicológicas , Factores de Edad , Anciano , Trastornos del Conocimiento/diagnóstico , Procedimientos Quirúrgicos Electivos , Femenino , Heparina/sangre , Humanos , Masculino , Persona de Mediana Edad , Fragmentos de Péptidos/sangre , Periodo Posoperatorio , Protrombina , Factores de Riesgo