ABSTRACT
BACKGROUND: Little is known about antibiotic prescribing for respiratory tract infections (RTIs) in virtual versus in-person urgent care within the same health system. METHODS: This is a retrospective study using electronic health record data from Cleveland Clinic Health System. We identified RTI patients via ICD-10 codes and assessed whether the visit resulted in an antibiotic prescription. We described differences in diagnoses and prescribing by type of urgent care (virtual versus in-person). We used mixed-effects logistic regression to model the odds of a patient receiving an antibiotic by urgent care setting. We applied the model first to all physicians and second only to those who saw patients in both settings. RESULTS: There were 69,189 in-person and 19,003 virtual visits. Fifty-eight percent of virtual visits resulted in an antibiotic prescription compared to 43% of in-person visits. Sinusitis diagnoses were more than twice as common in virtual versus in-person care (36% versus 14%) and were associated with high rates of prescribing in both settings (95% in person, 91% virtual). Compared to in-person care, virtual urgent care was positively associated with a prescription (OR: 1.64; 95% CI: 1.53-1.75). Among visits conducted by 39 physicians who saw patients in both settings, the odds of an antibiotic prescription in virtual care were 1.71 times higher than in in-person care (95% CI: 1.53-1.90). CONCLUSIONS: Antibiotic prescriptions were more common in virtual than in-person urgent care settings, including among physicians who provided care on both platforms. This appears to be related to the high rate of sinusitis diagnosis in virtual urgent care.
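As a quick sanity check on the direction of the reported effect, the unadjusted odds ratio implied by the headline prescribing rates (58% virtual vs 43% in-person) can be computed directly; this back-of-the-envelope figure is not the study's adjusted estimate, which comes from the mixed-effects model with physician-level clustering.

```python
# Unadjusted odds ratio implied by the reported prescribing rates
# (58% of virtual visits vs 43% of in-person visits). This is only a
# rough check; the paper's OR of 1.64 is adjusted and comes from a
# mixed-effects logistic regression with physician random effects.
p_virtual, p_in_person = 0.58, 0.43

odds_virtual = p_virtual / (1 - p_virtual)        # ~1.38
odds_in_person = p_in_person / (1 - p_in_person)  # ~0.75
unadjusted_or = odds_virtual / odds_in_person     # ~1.83

print(f"Unadjusted OR: {unadjusted_or:.2f}")
```

The unadjusted value (about 1.83) is larger than the adjusted OR of 1.64, consistent with part of the raw difference being explained by case mix and clustering.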
ABSTRACT
BACKGROUND: The 2019 ATS/IDSA community-acquired pneumonia (CAP) guidelines recommend that clinicians prescribe empiric antibiotics for MRSA or P. aeruginosa only if locally validated risk factors (or 2 generic risk factors, if local validation is not feasible) are present. It remains unknown how implementation of this recommendation would influence care. METHODS: This cross-sectional study included adults hospitalized for CAP across 50 hospitals in the Premier Healthcare Database from 2010-2015 and sought to describe how the use of extended-spectrum antibiotics (ESA) and the coverage of patients with CAP due to resistant organisms would change under the two approaches described in the 2019 ATS/IDSA guidelines. To do this, we measured the proportion of patients with CAP who received ESA and the proportion of patients with infections resistant to recommended CAP therapy who received ESA coverage. RESULTS: In the 50 hospitals, 19%-75% of patients received ESA, and 42%-100% of patients with resistant organisms received ESA. The median number of risk factors identified per hospital was 9 (interquartile range [IQR], 6-12). Overall, treatment according to locally validated risk factors reduced the proportion of patients receiving ESA by 38.8 percentage points, and treatment according to generic risk factors reduced it by 47.5 percentage points. However, the effect varied by hospital. The use of generic risk factors always resulted in less ESA use and less coverage for resistant organisms; using locally validated risk factors resulted in a similar outcome in all but one hospital. CONCLUSION: Future guidelines should explicitly define the optimal trade-off between adequate coverage for resistant organisms and ESA use.
ABSTRACT
OBJECTIVES: Do-not-resuscitate (DNR) orders are used to express patient preferences for cardiopulmonary resuscitation. This study examined whether early DNR orders are associated with differences in treatments and outcomes among patients hospitalized with pneumonia. METHODS: This is a retrospective cohort study of 768,015 adult patients hospitalized with pneumonia from 2010 to 2015 in 646 US hospitals. The exposure was DNR orders present on admission. Secondary analyses stratified patients by predicted in-hospital mortality. Main outcomes included in-hospital mortality, length of stay, cost, intensive care admission, invasive mechanical ventilation, noninvasive ventilation, vasopressors, and dialysis initiation. RESULTS: Of 768,015 patients, 94,155 (12.3%) had an early DNR order. Compared with those without, patients with DNR orders were older (mean age 80.1 ± 10.6 years vs 67.8 ± 16.4 years), with higher comorbidity burden, intensive care use (31.6% vs 30.6%), and in-hospital mortality (28.2% vs 8.5%). After adjustment via propensity score weighting, these patients had higher mortality (odds ratio [OR] 2.39, 95% confidence interval [CI] 2.33-2.45) and lower use of intensive therapies such as vasopressors (OR 0.83, 95% CI 0.81-0.85) and invasive mechanical ventilation (OR 0.68, 95% CI 0.66-0.70). Although there was little relationship between predicted mortality and DNR orders, among those with highest predicted mortality, DNR orders were associated with lower intensive care use compared with those without (66.7% vs 80.8%). CONCLUSIONS: Patients with early DNR orders have higher in-hospital mortality rates than those without, but often receive intensive care. These orders have the most impact on the care of patients with the highest mortality risk.
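The abstract reports adjustment via propensity score weighting; the sketch below shows one common variant of that idea (stabilized inverse-probability-of-treatment weights estimated with logistic regression). Column names, covariates, and the weighting scheme are illustrative assumptions, not the study's actual specification.

```python
# Minimal sketch of propensity-score (inverse probability of treatment) weighting.
# Column names and covariates are illustrative only; the study's actual model,
# covariate set, and weighting scheme are not reproduced here.
import numpy as np
import pandas as pd
from sklearn.linear_model import LogisticRegression

def iptw_weights(df: pd.DataFrame, treatment: str, covariates: list[str]) -> pd.Series:
    """Estimate propensity scores and return stabilized IPTW weights."""
    ps_model = LogisticRegression(max_iter=1000)
    ps_model.fit(df[covariates], df[treatment])
    ps = ps_model.predict_proba(df[covariates])[:, 1]  # P(exposed | covariates)
    p_exposed = df[treatment].mean()
    # Stabilized weights: exposed get p/ps, unexposed get (1-p)/(1-ps)
    return pd.Series(np.where(df[treatment] == 1,
                              p_exposed / ps,
                              (1 - p_exposed) / (1 - ps)),
                     index=df.index)

# Hypothetical usage on a data frame with a binary 'dnr' exposure column:
# df["w"] = iptw_weights(df, "dnr", ["age", "comorbidity_score", "male"])
# A weighted regression of each outcome on 'dnr' then estimates the adjusted OR.
```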
Subject(s)
Pneumonia, Resuscitation Orders, Adult, Humans, Aged, Aged 80 and over, Retrospective Studies, Hospitalization, Comorbidity, Pneumonia/therapy
ABSTRACT
This study compares radiographic osteoarthritis severity prior to total knee arthroplasty (TKA) by payer type. Five hundred and three primary TKAs were included. Preoperative radiographs were scored by the Ahlback, Kellgren-Lawrence (KL), and International Knee Documentation Committee (IKDC) classifications. Osteoarthritis severity by age and insurance type (private, Medicare, and Medicaid) was compared using Mann-Whitney U and Kruskal-Wallis testing. Three hundred and two (60%) subjects were under 65 years old, and 201 (40%) were 65 years and older. Younger subjects had no differences in radiographic severity by insurance in the KL (p = 0.268), Ahlback (p = 1), or IKDC (p = 0.948) classification. Older subjects likewise had no differences in osteoarthritis severity by insurance for the KL (p = 0.282), Ahlback (p = 0.354), or IKDC (p = 0.735) classifications. Three osteoarthritis classification systems found no difference in preoperative radiographic changes by payer type, suggesting that, in the study's population, there is no delay in appropriate surgical treatment by payer. Future studies should compare preoperative clinical symptoms. (Journal of Surgical Orthopaedic Advances 33(3):184-188, 2024).
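For readers unfamiliar with the tests named above, this is a minimal SciPy sketch of a Mann-Whitney U comparison between two payer groups and a Kruskal-Wallis comparison across three; the grade lists below are invented for illustration and are not study data.

```python
# Sketch of the nonparametric tests named in the abstract, using SciPy.
# The grade lists below are made up for illustration only.
from scipy.stats import mannwhitneyu, kruskal

private  = [2, 3, 3, 4, 4, 3]   # hypothetical Kellgren-Lawrence grades
medicare = [3, 3, 4, 4, 4, 2]
medicaid = [2, 4, 3, 4, 3, 3]

# Two-group comparison (e.g., private vs Medicare)
u_stat, p_two = mannwhitneyu(private, medicare, alternative="two-sided")

# Three-group comparison across payer types
h_stat, p_three = kruskal(private, medicare, medicaid)

print(f"Mann-Whitney U p = {p_two:.3f}, Kruskal-Wallis p = {p_three:.3f}")
```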
Subject(s)
Knee Replacement Arthroplasty, Medicare, Knee Osteoarthritis, Severity of Illness Index, Humans, Knee Replacement Arthroplasty/statistics & numerical data, Knee Osteoarthritis/surgery, Knee Osteoarthritis/diagnostic imaging, Aged, Male, Female, Middle Aged, United States, Medicaid, Radiography, Retrospective Studies, Health Insurance/statistics & numerical data, Age Factors
ABSTRACT
BACKGROUND: Debilitating symptoms of recurrent Clostridioides difficile infection (rCDI) often lead to long-term effects on health-related quality of life (HRQOL). In ECOSPOR III, SER-109, an investigational oral microbiome therapeutic, was superior to placebo in reducing rCDI. We investigated the validity, reliability, and responsiveness of a 32-item, CDI-specific questionnaire, the Clostridium difficile Quality of Life Survey (Cdiff32), across mental, physical, and social domains in patients with rCDI. METHODS: In this post hoc analysis of a phase 3 clinical trial, 182 outpatients with rCDI completed Cdiff32 and EQ-5D at baseline and at 1 and 8 weeks. Cdiff32 was evaluated for item performance, internal reliability, and convergent validity. To assess known-groups validity, Cdiff32 scores were compared by disease recurrence status at week 1; internal responsiveness was evaluated in the nonrecurrent disease group through 8 weeks by means of paired t test. RESULTS: All 182 patients (mean age [standard deviation], 65.5 [16.5] years; 59.9% female) completed baseline Cdiff32. Confirmatory factor analysis identified 3 domains (physical, mental, and social relationships) with good item fit. High internal reliability was demonstrated (Cronbach α = 0.94, with all subscales >0.80). Convergent validity was evidenced by significant correlations between Cdiff32 subscales and EQ-5D (r = 0.29-0.37; P < .001). Cdiff32 differentiated patients by disease recurrence status at week 1 (effect sizes, 0.38-0.42; P < .05 overall), with significant improvement from baseline through week 8 in patients with nonrecurrent disease at week 1 (effect sizes, 0.75-1.02; P < .001 overall). CONCLUSIONS: Cdiff32 is a valid, reliable, and responsive disease-specific HRQOL questionnaire that is fit for purpose for interventional treatment trials. The significant improvement in patients with nonrecurrent disease by 8 weeks demonstrates the negative impact of rCDI on HRQOL.
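Internal reliability here is summarized with Cronbach's α; a minimal sketch of that calculation is below, using simulated item responses rather than Cdiff32 data.

```python
# Cronbach's alpha for a set of questionnaire items (rows = respondents,
# columns = items). The data below are simulated and purely illustrative;
# they are not Cdiff32 responses.
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1).sum()
    total_var = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_vars / total_var)

rng = np.random.default_rng(0)
latent = rng.normal(size=(100, 1))                     # shared latent construct
responses = latent + 0.5 * rng.normal(size=(100, 8))   # 8 correlated items
print(f"alpha = {cronbach_alpha(responses):.2f}")
```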
Subject(s)
Clostridioides difficile, Clostridium Infections, Humans, Female, Adolescent, Male, Quality of Life, Reproducibility of Results, Clostridium Infections/drug therapy, Surveys and Questionnaires, Recurrence
ABSTRACT
BACKGROUND: Community-acquired pneumonia (CAP) is a leading cause of hospital admissions and antimicrobial use. Clinical practice guidelines recommend switching from intravenous (IV) to oral antibiotics once patients are clinically stable. METHODS: We conducted a retrospective cohort study of adults admitted with CAP and initially treated with IV antibiotics at 642 US hospitals from 2010 through 2015. Switching was defined as discontinuation of IV and initiation of oral antibiotics without interrupting therapy. Patients switched by hospital day 3 were considered early switchers. We compared length of stay (LOS), in-hospital 14-day mortality, late deterioration (intensive care unit [ICU] transfer), and hospital costs between early switchers and others, controlling for hospital characteristics, patient demographics, comorbidities, initial treatments, and predicted mortality. RESULTS: Of 378,041 CAP patients, 21,784 (6%) were switched early, most frequently to fluoroquinolones. Patients switched early had fewer days on IV antibiotics, shorter duration of inpatient antibiotic treatment, shorter LOS, and lower hospitalization costs, but no significant excess in 14-day in-hospital mortality or late ICU admission. Patients at higher mortality risk were less likely to be switched. However, even in hospitals with relatively high switch rates, <15% of very low-risk patients were switched early. CONCLUSIONS: Although early switching was not associated with worse outcomes and was associated with shorter LOS and fewer days on antibiotics, it occurred infrequently, even among very low-risk patients in hospitals with high switch rates. Our findings suggest that many more patients could be switched early without compromising outcomes.
Subject(s)
Community-Acquired Infections, Pneumonia, Adult, Humans, Retrospective Studies, Pneumonia/drug therapy, Anti-Bacterial Agents/therapeutic use, Hospitalization, Length of Stay, Community-Acquired Infections/drug therapy, Oral Administration
ABSTRACT
BACKGROUND: Although comorbidities are risk factors for recurrent Clostridioides difficile infection (rCDI), many clinical trials exclude patients with medical conditions such as malignancy or immunosuppression. In a phase 3, double-blind, placebo-controlled, randomized trial (ECOSPOR III), fecal microbiota spores, live (VOWST, Seres Therapeutics; hereafter "VOS," formerly SER-109), an oral microbiota therapeutic, significantly reduced the risk of rCDI at week 8. We evaluated the efficacy of VOS compared with placebo in patients with comorbidities and other risk factors for rCDI. METHODS: Adults with rCDI were randomized to receive VOS or placebo (4 capsules daily for 3 days) following standard-of-care antibiotics. In this post hoc analysis, the rate of rCDI through week 8 was assessed in VOS-treated participants compared with placebo for subgroups including (i) Charlson comorbidity index (CCI) score category (0, 1-2, 3-4, ≥5); (ii) baseline creatinine clearance (<30, 30-50, >50 to 80, or >80 mL/minute); (iii) number of CDI episodes, inclusive of the qualifying episode (3 and ≥4); (iv) exposure to non-CDI-targeted antibiotics after dosing; and (v) acid-suppressing medication use at baseline. RESULTS: Of 281 participants screened, 182 were randomized (59.9% female; mean age, 65.5 years). Comorbidities were common with a mean overall baseline age-adjusted CCI score of 4.1 (4.1 in the VOS arm and 4.2 in the placebo arm). Across all subgroups analyzed, VOS-treated participants had a lower relative risk of recurrence compared with placebo. CONCLUSIONS: In this post hoc analysis, VOS reduced the risk of rCDI compared with placebo, regardless of baseline characteristics, concomitant medications, or comorbidities.
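The subgroup results are expressed as relative risks of recurrence; the sketch below shows the calculation on invented counts, not ECOSPOR III data.

```python
# Relative risk of recurrence within a subgroup (e.g., CCI score >= 5).
# The counts below are invented for illustration and are not trial data.
def relative_risk(events_tx: int, n_tx: int, events_ctrl: int, n_ctrl: int) -> float:
    risk_tx = events_tx / n_tx
    risk_ctrl = events_ctrl / n_ctrl
    return risk_tx / risk_ctrl

# Hypothetical subgroup: 4/30 recurrences on VOS vs 12/28 on placebo
print(f"RR = {relative_risk(4, 30, 12, 28):.2f}")  # ~0.31
```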
Subject(s)
Clostridioides difficile, Clostridium Infections, Microbiota, Adult, Humans, Female, Aged, Male, Prevalence, Anti-Bacterial Agents/therapeutic use, Clostridium Infections/drug therapy, Recurrence
ABSTRACT
BACKGROUND & AIMS: We conducted a meta-analysis to summarize the rates of progression to and regression of nonalcoholic fatty liver (NAFL), nonalcoholic steatohepatitis (NASH), and fibrosis in adults with nonalcoholic fatty liver disease (NAFLD). METHODS: We searched PubMed/Medline and 4 other databases from 1985 through 2020. We included observational studies and randomized controlled trials in any language that used liver biopsy or imaging to diagnose NAFLD in adults with a follow-up period ≥48 weeks. Rates were calculated as incident cases per 100 person-years and pooled using the random-effects Poisson distribution model. Heterogeneity was assessed using the I2 statistic. RESULTS: We screened 9744 articles and included 54 studies involving 26,738 patients. Among observational studies, 20% of healthy adults developed NAFL (incidence rate, 4.8/100 person-years) while 21% of people with fatty liver had resolution of NAFL (incidence rate, 2.4/100 person-years) after a median of approximately 4.5 years. In addition, 31% of patients developed NASH after 4.7 years (incidence rate, 7.4/100 person-years), whereas in 29% of those with NASH, resolution occurred after a median of 3.5 years (incidence rate, 5.1/100 person-years). Time to progress by 1 fibrosis stage was 9.9, 10.3, 13.3, and 22.2 years for F0, F1, F2, and F3, respectively. Time to regress by 1 stage was 21.3, 12.5, 20.4, and 40.0 years for F4, F3, F2, and F1, respectively. Rates estimated from randomized controlled trials were higher than those from observational studies. CONCLUSIONS: In our meta-analysis, progression to NASH was more common than regression from NASH. Rates of fibrosis progression were similar across baseline stage, but patients with advanced fibrosis were more likely to regress than those with mild fibrosis.
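The rates above are incidence rates per 100 person-years; the following sketch shows how such a rate is derived from event counts and follow-up time, using a hypothetical cohort rather than the meta-analysis data.

```python
# Incidence rate per 100 person-years from event counts and follow-up time.
# The cohort below is hypothetical; it does not reproduce the pooled estimates.
def incidence_rate_per_100py(events: int, person_years: float) -> float:
    return 100.0 * events / person_years

# e.g., 48 incident NAFL cases observed over 1,000 person-years of follow-up
print(incidence_rate_per_100py(48, 1000.0))  # 4.8 per 100 person-years
```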
Subject(s)
Non-alcoholic Fatty Liver Disease, Adult, Humans, Non-alcoholic Fatty Liver Disease/diagnostic imaging, Non-alcoholic Fatty Liver Disease/epidemiology, Liver Cirrhosis/pathology, Fibrosis, Biopsy, Liver/diagnostic imaging, Liver/pathology
ABSTRACT
INTRODUCTION: Influenza is a preventable acute respiratory illness with a high potential to cause serious complications and is associated with high mortality and morbidity in the US. We aimed to determine which community-level social vulnerabilities are most predictive of influenza vaccination rates in different racial/ethnic communities. METHODS: We conducted a machine learning analysis (XGBoost) to identify community-level social vulnerability features that are predictive of influenza vaccination rates among Medicare enrollees across US counties, overall and by race/ethnicity. RESULTS: Population density per square mile is the most important feature in predicting a county's influenza vaccination rate, followed by the unemployment rate and the percentage of mobile homes; the gain-based relative importance of these features is 11.6%, 9.2%, and 9%, respectively. Among whites, population density (17% gain relative importance) was followed by the percentage of mobile homes (9%) and per capita income (8.7%). For Black/African Americans, the most important features were population density (12.8%), percentage of minorities in the county (8.0%), per capita income (6.9%), and percentage of over-occupied housing units (6.8%). Finally, for Hispanics, the top features were per capita income (8.4%), percentage of mobile homes (8.0%), percentage of non-institutionalized persons with a disability (7.9%), and population density (7.6%). CONCLUSIONS: Our study may have implications for the success of large vaccination programs in counties with high social vulnerability. Further, our findings suggest that policies and interventions seeking to increase vaccination rates in racial/ethnic minority communities may need to be tailored to address those communities' specific socioeconomic vulnerabilities.
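The "gain relative importance" figures come from gradient-boosted trees; the sketch below shows how a gain-based importance ranking is obtained from an XGBoost model, using synthetic data and placeholder feature names rather than the study's county-level inputs.

```python
# Gain-based feature importance from an XGBoost regressor, the kind of
# ranking described in the abstract. Data and feature names are synthetic
# placeholders, not the study's SVI/vaccination data.
import numpy as np
import pandas as pd
from xgboost import XGBRegressor

rng = np.random.default_rng(0)
X = pd.DataFrame({
    "population_density": rng.lognormal(3, 1, 500),
    "unemployment_rate": rng.uniform(2, 15, 500),
    "pct_mobile_homes": rng.uniform(0, 30, 500),
})
# Synthetic outcome loosely driven by density and unemployment
y = 0.5 + 0.02 * np.log(X["population_density"]) \
    - 0.01 * X["unemployment_rate"] + rng.normal(0, 0.05, 500)

model = XGBRegressor(n_estimators=200, max_depth=3, learning_rate=0.1,
                     importance_type="gain")
model.fit(X, y)

# feature_importances_ is normalized, so it can be read as "relative gain"
for name, imp in sorted(zip(X.columns, model.feature_importances_),
                        key=lambda kv: -kv[1]):
    print(f"{name}: {100 * imp:.1f}% relative gain importance")
```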
Subject(s)
Ethnicity, Human Influenza, Aged, Humans, United States, Social Vulnerability, Human Influenza/prevention & control, Medicare, Minority Groups, Vaccination
ABSTRACT
Paediatric sport participation continues to increase in the United States, with a corresponding increase in sports-related concussions or traumatic brain injuries (TBIs). It is important to recognize which sports carry elevated risk and to identify risk factors for hospital admission and length of stay (LOS). Paediatric patients (ages 5-18) from 2008 to 2014 were identified from the Healthcare Cost and Utilization Project (HCUP) National Inpatient Sample (NIS). Eight hundred and ninety-four patients were hospitalized with a TBI resulting from participation in an individual (451 patients) or team (443 patients) sport. We evaluated differences in LOS and total charges between individual and team sports and found that patients with TBI from individual sports had significantly longer (1.75 versus 1.34 days, p < 0.001) and costlier ($27,333 versus $19,069, p < 0.001) hospital stays than those from team sports. This may be due to reduced awareness of and compliance with return-to-play protocols in individual sports. Safety education at a young age, increased awareness of TBIs, and additional medical support for individual as well as team sports may help mitigate these findings.
ABSTRACT
OBJECTIVES: To compare clinical practice and outcomes in patients with severe community-acquired pneumonia (sCAP) versus nonsevere CAP, using guideline-defined criteria for sCAP. DESIGN: Retrospective observational cohort study. SETTING: One hundred seventy-seven U.S. hospitals within the Premier Healthcare Database. PATIENTS: Hospitalized adult (≥ 18 yr old) patients with pneumonia. MEASUREMENTS AND MAIN RESULTS: Adult patients (≥ 18 yr old) with a principal diagnosis of pneumonia or a secondary diagnosis of pneumonia paired with a principal diagnosis of sepsis or respiratory failure were included. Patients with at least one guideline-defined major criterion for severe pneumonia were compared with patients with nonsevere disease. Among 154,799 patients with pneumonia, 21,805 (14.1%) met criteria for sCAP. They had higher organ failure scores (1.9 vs 0.63; p < 0.001) and inpatient mortality (22.0% vs 5.0%; p < 0.001), longer lengths of stay (8 vs 5 d; p < 0.001), and higher costs ($20,046 vs $7,543; p < 0.001) than those with nonsevere disease. Patients with sCAP had twice the rate of positive blood cultures (10.0% vs 4.5%; p < 0.001) and respiratory cultures (34.2% vs 21.1%; p < 0.001) and more often had isolates resistant to first-line community-acquired pneumonia antibiotics (10% of severe vs 3.1% of nonsevere; p < 0.001). Regardless of disease severity, Streptococcus pneumoniae was the most common pathogen recovered from blood cultures, and Staphylococcus aureus and Pseudomonas species were the most common pathogens recovered from the respiratory tract. Although few patients with sCAP had cultures positive for a resistant organism, 65% received vancomycin and 42.8% received piperacillin-tazobactam. CONCLUSIONS: Patients with sCAP had worse outcomes and twice the rate of culture positivity. S. aureus and S. pneumoniae were the most common organisms in respiratory and blood specimens, respectively. Although recommended only for patients with sCAP, blood cultures were obtained in nearly all pneumonia patients, a quarter of nonsevere patients received sputum cultures, and treatment with broad-spectrum agents was widespread, indicating fertile ground for antimicrobial and diagnostic stewardship programs.
Subject(s)
Community-Acquired Infections, Pneumonia, Adult, Anti-Bacterial Agents/therapeutic use, Community-Acquired Infections/drug therapy, Humans, Pneumonia/drug therapy, Pneumonia/epidemiology, Pneumonia/etiology, Retrospective Studies, Staphylococcus aureus
ABSTRACT
Homeostasis represents the idea that a feature may remain invariant despite changes in some external parameters. We establish a connection between homeostasis and injectivity for reaction network models. In particular, we show that a reaction network cannot exhibit homeostasis if a modified version of the network (which we call homeostasis-associated network) is injective. We provide examples of reaction networks which can or cannot exhibit homeostasis by analyzing the injectivity of their homeostasis-associated networks.
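For context, the following LaTeX block states the standard notion of infinitesimal homeostasis for a parametrized steady state, which is the kind of invariance the abstract refers to; it is offered as general background, not as the paper's precise definitions of homeostasis-associated networks or injectivity.

```latex
% Standard notion of (infinitesimal) homeostasis for a parametrized
% steady state; background only, not the paper's exact formulation.
\[
  \dot{x} = f(x,\lambda), \qquad f\big(x^{*}(\lambda),\lambda\big) = 0 ,
\]
\[
  \text{the output } x_{o} \text{ exhibits infinitesimal homeostasis at } \lambda_{0}
  \iff \frac{d\, x_{o}^{*}}{d\lambda}\Big|_{\lambda_{0}} = 0 .
\]
```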
Subject(s)
Biological Models, Homeostasis
ABSTRACT
PURPOSE: Understanding the anatomy of the deep neurovascular structures of the hand is essential in surgical planning. There is a lack of literature regarding hand size and its influence on branching variation and the distances between branches of various neurovascular structures. Our study quantifies the variation in branching distances of the deep ulnar nerve and deep palmar arch branches. METHODS: Twenty-five fresh-frozen cadaveric hands were dissected. Each branch of the deep ulnar nerve and deep palmar arch was identified. The distance from the most distal portion of the pisiform to the proximal aspect of each branch was measured. The relationship between the length of the third metacarpal and the distance of each branch from the pisiform was examined. RESULTS: There was no relationship between the branching distances of the deep ulnar nerve and the length of the third metacarpal. There was a significant association between the length of the third metacarpal and the second, third, and fourth branches of the deep palmar arch (p < 0.05). CONCLUSIONS: Our study found a significant association between the branching distances of the second, third, and fourth branches of the deep palmar arch and hand size as measured by the length of the third metacarpal.
Subject(s)
Hand, Ulnar Nerve, Humans, Ulnar Nerve/anatomy & histology, Cadaver, Hand/blood supply
ABSTRACT
BACKGROUND: Several studies have investigated the utility of electronic decision support alerts in diagnostic stewardship for Clostridioides difficile infection (CDI). However, it is unclear if alerts are effective in reducing inappropriate CDI testing and/or CDI rates. The aim of this systematic review was to determine if alerts related to CDI diagnostic stewardship are effective at reducing inappropriate CDI testing volume and CDI rates among hospitalized adult patients. METHODS: We searched Ovid Medline and 5 other databases for original studies evaluating the association between alerts for CDI diagnosis and CDI testing volume and/or CDI rate. Two investigators independently extracted data on study characteristics, study design, alert triggers, cointerventions, and study outcomes. RESULTS: Eleven studies met criteria for inclusion. Studies varied significantly in alert triggers and in study outcomes. Six of 11 studies demonstrated a statistically significant decrease in CDI testing volume, 6 of 6 studies evaluating appropriateness of CDI testing found a significant reduction in the proportion of inappropriate testing, and 4 of 7 studies measuring CDI rate demonstrated a significant decrease in the CDI rate in the postintervention vs preintervention period. The magnitude of the increase in appropriate CDI testing varied, with some studies reporting an increase with minimal clinical significance. CONCLUSIONS: The use of electronic alerts for diagnostic stewardship for C. difficile was associated with reductions in CDI testing, the proportion of inappropriate CDI testing, and rates of CDI in most studies. However, broader concerns related to alerts remain understudied, including unintended adverse consequences and alert fatigue.
Subject(s)
Clostridioides difficile, Clostridium Infections, Clinical Decision Support Systems, Adult, Clostridioides, Clostridium Infections/diagnosis, Humans
ABSTRACT
BACKGROUND: For patients at risk for multidrug-resistant organisms, IDSA/ATS guidelines recommend empiric therapy against methicillin-resistant Staphylococcus aureus (MRSA) and Pseudomonas. Following negative cultures, the guidelines recommend antimicrobial de-escalation. We assessed antibiotic de-escalation practices across hospitals and their associations with outcomes in hospitalized patients with pneumonia and negative cultures. METHODS: We included adults admitted with pneumonia in 2010-2015 to 164 US hospitals if they had negative blood and/or respiratory cultures and received both anti-MRSA and antipseudomonal agents other than quinolones. De-escalation was defined as stopping both empiric drugs by hospital day 4 while continuing another antibiotic. Patients were propensity adjusted for de-escalation and compared on in-hospital 14-day mortality, late deterioration (ICU transfer), length of stay (LOS), and costs. We also compared adjusted outcomes across hospital de-escalation rate quartiles. RESULTS: Of 14,170 patients, 1,924 (13%) had both initial empiric drugs stopped by hospital day 4. Hospital de-escalation rates ranged from 2% to 35%, and hospital de-escalation rate quartile was not significantly associated with outcomes. At hospitals in the top quartile of de-escalation, even among patients at lowest risk for mortality, de-escalation rates were <50%. In propensity-adjusted analysis, patients with de-escalation had lower odds of subsequent transfer to the ICU (adjusted odds ratio, 0.38; 95% CI, 0.18-0.79) and lower LOS (adjusted ratio of means, 0.76; 95% CI, 0.75-0.78) and costs (0.74; 95% CI, 0.72-0.76). CONCLUSIONS: A minority of eligible patients with pneumonia had antibiotics de-escalated by hospital day 4 following negative cultures, and de-escalation rates varied widely between hospitals. Adherence to recent guidelines will require substantial changes in practice.
Subject(s)
Anti-Infective Agents, Methicillin-Resistant Staphylococcus aureus, Pneumonia, Adult, Anti-Bacterial Agents/therapeutic use, Hospital Mortality, Humans, Pneumonia/drug therapy, Retrospective Studies
ABSTRACT
BACKGROUND: Wallerian degeneration (WD) following peripheral nerve injury (PNI) is an area of growing focus for pharmacological developments. Clinically, WD presents challenges in achieving full functional recovery following PNI, as prolonged denervation of distal tissues for an extended period of time can irreversibly destabilize sensory and motor targets with secondary tissue atrophy. Our objective is to improve upon histological assessments of WD. METHODS: Conventional methods utilize a qualitative system simply describing the presence or absence of WD in nerve fibers. We propose a three-category assessment that allows more quantification: A fibers appear normal, B fibers have moderate WD (altered axoplasm), and C fibers have extensive WD (myelin figures). Analysis was by light microscopy (LM) on semithin sections stained with toluidine blue in three rat tibial nerve lesion models (crush, partial transection, and complete transection) at 5 days postop and 5 mm distal to the injury site. The LM criteria were verified at the ultrastructural level. This early outcome measure was compared with the loss of extensor postural thrust and the absence of muscle atrophy. RESULTS: The results showed good to excellent internal consistency among counters, demonstrating a significant difference between the crush and transection lesion models. A significant decrease in fiber density in the injured nerves due to inflammation/edema was observed. The growth cones of regenerating axons were evident in the crush lesion group. CONCLUSION: The ABC method of histological assessment is a consistent and reliable method that will be useful to quantify the effects of different interventions on the WD process.
Subject(s)
Peripheral Nerve Injuries, Wallerian Degeneration, Animals, Axons/pathology, Nerve Crush, Nerve Regeneration, Peripheral Nerve Injuries/pathology, Rats, Sciatic Nerve/pathology, Tibial Nerve/surgery, Wallerian Degeneration/pathology
ABSTRACT
BACKGROUND: Choice of empiric therapy for pneumonia depends on risk for antimicrobial resistance. Models to predict resistance are derived from blood and respiratory culture results. We compared these results to understand if organisms and resistance patterns differed by site. We also compared characteristics and outcomes of patients with positive cultures by site. METHODS: We studied adult patients discharged from 177 US hospitals from July 2010 through June 2015, with principal diagnoses of pneumonia, or principal diagnoses of respiratory failure, acute respiratory distress syndrome, respiratory arrest, or sepsis with a secondary diagnosis of pneumonia, and who had blood or respiratory cultures performed. Demographics, treatment, microbiologic results, and outcomes were examined. RESULTS: Among 138,561 hospitalizations of patients with pneumonia who had blood or respiratory cultures obtained at admission, 12,888 (9.3%) yielded positive cultures: 6,438 respiratory cultures, 5,992 blood cultures, and 458 both respiratory and blood cultures. Forty-two percent had isolates resistant to first-line therapy for community-acquired pneumonia. Isolates from respiratory samples were more often resistant than were isolates from blood (54.2% vs 26.6%; P < .001). Patients with both culture sites positive had higher case-fatality, longer lengths of stay, and higher costs than patients who had only blood or respiratory cultures positive. Among respiratory cultures, the most common pathogens were Staphylococcus aureus (34%) and Pseudomonas aeruginosa (17%), whereas blood cultures most commonly grew Streptococcus pneumoniae (33%), followed by S. aureus (22%). CONCLUSIONS: Patients with positive respiratory tract cultures are clinically different from those with positive blood cultures, and resistance patterns differ by source. Models of antibiotic resistance should account for culture source.
Subject(s)
Community-Acquired Infections, Pneumonia, Adult, Anti-Bacterial Agents/therapeutic use, Blood Culture, Community-Acquired Infections/drug therapy, Microbial Drug Resistance, Humans, Pneumonia/drug therapy, Pneumonia/epidemiology, Staphylococcus aureus
ABSTRACT
OBJECTIVES: Despite the common occurrence of brain injury in patients undergoing extracorporeal membrane oxygenation, it is unclear which cannulation method carries a higher risk of brain injury. We compared the prevalence of brain injury between patients undergoing venoarterial and venovenous extracorporeal membrane oxygenation. DATA SOURCES: PubMed and six other databases from inception to April 2020. STUDY SELECTION: Observational studies and randomized clinical trials in adult patients undergoing venoarterial extracorporeal membrane oxygenation or venovenous extracorporeal membrane oxygenation reporting brain injury. DATA EXTRACTION: Two independent reviewers extracted the data from the studies. Random-effects meta-analyses were used to pool data. DATA SYNTHESIS: Seventy-three studies (n = 16,063) met inclusion criteria encompassing 8,211 patients (51.2%) undergoing venoarterial extracorporeal membrane oxygenation and 7,842 (48.8%) undergoing venovenous extracorporeal membrane oxygenation. Venoarterial extracorporeal membrane oxygenation patients had more overall brain injury compared with venovenous extracorporeal membrane oxygenation (19% vs 10%; p = 0.002). Venoarterial extracorporeal membrane oxygenation patients had more ischemic stroke (10% vs 1%; p < 0.001), hypoxic-ischemic brain injury (13% vs 1%; p < 0.001), and brain death (11% vs 1%; p = 0.001). In contrast, rates of intracerebral hemorrhage (6% vs 8%; p = 0.35) did not differ. Survival was lower in venoarterial extracorporeal membrane oxygenation (48%) than venovenous extracorporeal membrane oxygenation (64%) (p < 0.001). After excluding studies that included extracorporeal cardiopulmonary resuscitation, no significant difference was seen in the rate of overall acute brain injury between venoarterial extracorporeal membrane oxygenation and venovenous extracorporeal membrane oxygenation (13% vs 10%; p = 0.4). However, ischemic stroke (10% vs 1%; p < 0.001), hypoxic-ischemic brain injury (7% vs 1%; p = 0.02), and brain death (9% vs 1%; p = 0.005) remained more frequent in nonextracorporeal cardiopulmonary resuscitation venoarterial extracorporeal membrane oxygenation compared with venovenous extracorporeal membrane oxygenation. CONCLUSIONS: Brain injury was more common in venoarterial extracorporeal membrane oxygenation compared with venovenous extracorporeal membrane oxygenation. While ischemic brain injury was more common in venoarterial extracorporeal membrane oxygenation patients, the rates of intracranial hemorrhage were similar between venoarterial extracorporeal membrane oxygenation and venovenous extracorporeal membrane oxygenation. Further research on mechanism, timing, and effective monitoring of acute brain injury and its management is necessary.
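The pooling described as "random-effects meta-analyses" is commonly implemented as DerSimonian-Laird pooling of logit-transformed proportions; the sketch below illustrates that generic approach with invented study counts and does not reproduce the review's actual model or software.

```python
# DerSimonian-Laird random-effects pooling of proportions on the logit scale,
# a common implementation of "random-effects meta-analysis" for event rates.
# The per-study counts below are invented for illustration.
import numpy as np

events = np.array([12, 30, 7, 22])     # patients with brain injury per study
n      = np.array([80, 150, 45, 120])  # patients per study

p = events / n
yi = np.log(p / (1 - p))               # logit-transformed proportions
vi = 1 / events + 1 / (n - events)     # approximate variance of the logit

w_fixed = 1 / vi
y_fixed = np.sum(w_fixed * yi) / np.sum(w_fixed)
q = np.sum(w_fixed * (yi - y_fixed) ** 2)
dof = len(yi) - 1
c = np.sum(w_fixed) - np.sum(w_fixed ** 2) / np.sum(w_fixed)
tau2 = max(0.0, (q - dof) / c)         # DerSimonian-Laird between-study variance

w_re = 1 / (vi + tau2)
y_re = np.sum(w_re * yi) / np.sum(w_re)
pooled = 1 / (1 + np.exp(-y_re))       # back-transform to a proportion
print(f"pooled prevalence = {100 * pooled:.1f}% (tau^2 = {tau2:.3f})")
```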
Subject(s)
Brain Injuries/etiology, Extracorporeal Membrane Oxygenation/adverse effects, Extracorporeal Membrane Oxygenation/methods, Humans, Risk Factors
ABSTRACT
OBJECTIVES: Extracorporeal cardiopulmonary resuscitation has shown survival benefit in select patients with refractory cardiac arrest, but there is insufficient data on the frequency of different types of brain injury. We aimed to systematically review the prevalence of, predictors of, and survival from neurologic complications in patients who have undergone extracorporeal cardiopulmonary resuscitation. DATA SOURCES: MEDLINE (PubMed) and six other databases (EMBASE, Cochrane Library, CINAHL Plus, Web of Science, and Scopus) from inception to August 2019. STUDY SELECTION: Randomized controlled trials and observational studies in patients older than 18 years. DATA EXTRACTION: Two independent reviewers extracted the data. Study quality was assessed by the Cochrane Risk of Bias tool for randomized controlled trials, the Newcastle-Ottawa Scale for cohort and case-control studies, and the Murad tool for case series. Random-effects meta-analyses were used to pool data. DATA SYNTHESIS: The 78 studies included in our analysis encompassed 50,049 patients, of whom 6,261 (12.5%) received extracorporeal cardiopulmonary resuscitation. Among extracorporeal cardiopulmonary resuscitation patients, the median age was 56 years (interquartile range, 52-59 yr), 3,933 were male (63%), 3,019 had out-of-hospital cardiac arrest (48%), and 2,289 had an initial shockable heart rhythm (37%). The most common etiology of cardiac arrest was acute coronary syndrome (n = 1,657, 50% of reported). The median extracorporeal cardiopulmonary resuscitation duration was 3.2 days (interquartile range, 2.1-4.9 d). Overall, 27% (95% CI, 17-39%) had at least one neurologic complication, 23% (95% CI, 14-32%) had hypoxic-ischemic brain injury, 6% (95% CI, 2-11%) had ischemic stroke, 6% (95% CI, 1-16%) had seizures, and 4% (95% CI, 1-10%) had intracerebral hemorrhage. Seventeen percent (95% CI, 12-23%) developed brain death. The overall survival rate after extracorporeal cardiopulmonary resuscitation was 29% (95% CI, 26-33%), and a good neurologic outcome was achieved in 24% (95% CI, 21-28%). CONCLUSIONS: One in four patients developed acute brain injury after extracorporeal cardiopulmonary resuscitation, and the most common type was hypoxic-ischemic brain injury. One in four extracorporeal cardiopulmonary resuscitation patients achieved a good neurologic outcome. Further research on assessing predictors of extracorporeal cardiopulmonary resuscitation-associated brain injury is necessary.
Subject(s)
Brain Injuries/etiology, Extracorporeal Membrane Oxygenation/adverse effects, Cardiac Arrest/therapy, Humans, Treatment Outcome
ABSTRACT
Enzymes are central to both metabolism and information processing in cells. In both cases, an enzyme's ability to accelerate a reaction without being consumed in the reaction is crucial. Nevertheless, enzymes are transiently sequestered when they bind to their substrates; this sequestration limits activity and potentially compromises information processing and signal transduction. In this article, we analyse the mechanism of enzyme-substrate catalysis from the perspective of minimizing the load on the enzymes through sequestration, while maintaining at least a minimum reaction flux. In particular, we ask: which binding free energies of the enzyme-substrate and enzyme-product reaction intermediates minimize the fraction of enzymes sequestered in complexes, while sustaining a certain minimal flux? Under reasonable biophysical assumptions, we find that the optimal design will saturate the bound on the minimal flux and reflects a basic trade-off in catalytic operation. If both binding free energies are too high, there is low sequestration, but the effective progress of the reaction is hampered. If both binding free energies are too low, there is high sequestration, and the reaction flux may also be suppressed in extreme cases. The optimal binding free energies are therefore neither too high nor too low, but in fact moderate. Moreover, the optimal difference in substrate and product binding free energies, which contributes to the thermodynamic driving force of the reaction, is in general strongly constrained by the intrinsic free-energy difference between products and reactants. Both the strategies of using a negative binding free-energy difference to drive the catalyst-bound reaction forward and of using a positive binding free-energy difference to enhance detachment of the product are limited in their efficacy.
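The trade-off described above can be made concrete with textbook Michaelis-Menten bookkeeping under the quasi-steady-state approximation; the block below is offered as background, not as the article's full optimization over binding free energies.

```latex
% Textbook Michaelis-Menten bookkeeping that makes the trade-off concrete;
% offered as background, not the article's full optimization.
\[
  E + S \underset{k_{-1}}{\overset{k_{1}}{\rightleftharpoons}} ES
  \xrightarrow{\;k_{\mathrm{cat}}\;} E + P ,
  \qquad K_{M} = \frac{k_{-1} + k_{\mathrm{cat}}}{k_{1}} .
\]
\[
  \text{flux } v = \frac{k_{\mathrm{cat}}\, E_{\mathrm{tot}}\, [S]}{K_{M} + [S]},
  \qquad
  \text{sequestered fraction } \frac{[ES]}{E_{\mathrm{tot}}} = \frac{[S]}{K_{M} + [S]} .
\]
% Under the quasi-steady-state approximation, tighter substrate binding
% (a more negative binding free energy) lowers K_M, which raises both the
% flux at a given [S] and the sequestered fraction; this is the competition
% the article weighs when locating the optimal, moderate binding free energies.
```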