ABSTRACT
Clinical tools for the prediction of antimicrobial resistance have been derived and validated, but their implementation in clinical practice has not been examined. This study examined the impact of utilization of the extended-spectrum beta-lactamase (ESBL) prediction score on the time to initiation of appropriate antimicrobial therapy for bloodstream infection (BSI). The quasi-experimental cohort study included hospitalized adults with BSI due to ceftriaxone-resistant (CRO-R) Enterobacterales at three community hospitals in Columbia, South Carolina, USA before (January 2010 to December 2013) and after (January 2014 to December 2019) implementation of an antimicrobial stewardship intervention. In total, 45 and 101 patients with BSI due to CRO-R Enterobacterales were included before and after the intervention, respectively. Overall, the median age was 66 years, 85 (58%) were men, and 86 (59%) had a urinary source of infection. The mean time to appropriate antimicrobial therapy was 78 h before and 46 h after implementation of the antimicrobial stewardship intervention (p = 0.04). Application of the ESBL prediction score as part of an antimicrobial stewardship intervention was associated with a significant reduction in time to appropriate antimicrobial therapy in patients with BSI due to CRO-R Enterobacterales. Utilization of advanced rapid diagnostics may be necessary for a further reduction in time to appropriate antimicrobial therapy in this population.
ABSTRACT
PURPOSE: Early clinical failure criteria (ECFC) were recently introduced to predict unfavorable outcomes in patients with Gram-negative bloodstream infections (BSI). ECFC include hypotension, tachycardia, tachypnea or mechanical ventilation, altered mental status, and leukocytosis evaluated at 72-96 h after BSI. The aim of this retrospective cohort study was to assess the performance of ECFC in predicting 28-day mortality in Enterococcus species BSI. METHODS: Hospitalized adults with Enterococcus species BSI at Prisma Health hospitals from 1 January 2015 to 31 July 2018 were identified. Multivariate logistic regression was used to determine the association between ECFC and 28-day mortality. Area under the receiver operating characteristic (AUROC) curve was used to measure model discrimination. RESULTS: Among 157 patients, 28 (18%) died within 28 days of BSI. After adjustment in the multivariate model, the risk of 28-day mortality increased in the presence of each additional ECFC (OR 1.6, 95% CI 1.2-2.3, p = 0.005). Infective endocarditis (OR 3.9, 95% CI 1.4-10.7, p = 0.01) was independently associated with 28-day mortality. The AUROC of the ECFC model in predicting 28-day mortality was 0.74, with an ECFC of 2 identified as the optimal breakpoint. Mortality was 8% in patients with ECFC < 2 compared to 33% in those with ECFC ≥ 2 (p < 0.001). CONCLUSION: ECFC had good discrimination in predicting 28-day mortality in patients with Enterococcus species BSI. These criteria may have utility in future clinical investigations.
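The per-criterion odds ratio reported above implies a simple multiplicative logistic relationship. As an illustrative sketch only (the OR of 1.6 and the 8% low-score mortality are taken from the abstract; the baseline odds calibration and function are hypothetical, not the study's fitted model):

```python
def mortality_prob(n_criteria, baseline_odds, or_per_criterion=1.6):
    """Predicted 28-day mortality probability under a logistic model:
    each additional early clinical failure criterion multiplies the
    odds of death by the per-criterion odds ratio."""
    odds = baseline_odds * or_per_criterion ** n_criteria
    return odds / (1 + odds)

# Baseline odds chosen so that 0 criteria matches the ~8% mortality
# observed in the low-score (ECFC < 2) group; illustrative only.
base = 0.08 / (1 - 0.08)
for k in range(5):
    print(k, round(mortality_prob(k, base), 3))
```

The sketch shows why risk climbs steeply across score values even with a modest per-criterion OR: the odds compound with each additional criterion.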
Subject(s)
Bacteremia, Sepsis, Adult, Area Under the Curve, Bacteremia/diagnosis, Enterococcus, Humans, Retrospective Studies, Risk Factors
ABSTRACT
BACKGROUND: The role of follow-up blood cultures (FUBC) in the management of gram-negative bloodstream infection (GN-BSI) remains controversial. This retrospective cohort study examines the association between obtaining FUBC and mortality in GN-BSI. METHODS: Hospitalized adults with community-onset GN-BSI at Prisma Health-Midlands hospitals in South Carolina, USA from January 1, 2010 to June 30, 2015 were identified. Patients who died or were discharged from hospital within 72 h were excluded to minimize the impact of survival and selection biases on results, respectively. Multivariate Cox proportional hazards regression was used to examine the association between obtaining FUBC and 28-day all-cause mortality after adjustment for the propensity to obtain FUBC. FINDINGS: Among 766 patients with GN-BSI, 219 (28.6%) had FUBC obtained, and 15 of 219 (6.8%) FUBC were persistently positive. Overall, the median age was 67 years, 438 (57%) were women, 457 (60%) had a urinary source of infection, and 426 (56%) had BSI due to Escherichia coli. Mortality was significantly lower in patients who had FUBC obtained than in those who did not (6.3% vs. 11.7%, log-rank p = 0.03). Obtaining FUBC was independently associated with reduced mortality (hazard ratio 0.47, 95% confidence interval: 0.23-0.87; p = 0.02) after adjustment for age, chronic comorbidities, acute severity of illness, appropriateness of empirical antimicrobial therapy, and propensity to obtain FUBC. INTERPRETATION: Improved survival in hospitalized patients with GN-BSI who had FUBC is consistent with the results of recent publications from Italy and North Carolina supporting utilization of FUBC in the management of GN-BSI. FUNDING: This study had no funding source.
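The crude (unadjusted) comparison behind the reported mortality figures can be reconstructed from the abstract's counts. This is a minimal sketch of that arithmetic only; the adjusted hazard ratio of 0.47 comes from the propensity-adjusted Cox model, which this sketch does not reproduce:

```python
# Crude 28-day mortality comparison from the abstract's counts
# (219 of 766 patients had FUBC obtained; rates 6.3% vs. 11.7%).
fubc_n, no_fubc_n = 219, 766 - 219
fubc_rate, no_fubc_rate = 0.063, 0.117

# Back-calculated approximate death counts in each group.
fubc_deaths = round(fubc_rate * fubc_n)
no_fubc_deaths = round(no_fubc_rate * no_fubc_n)

risk_ratio = fubc_rate / no_fubc_rate
print(f"Unadjusted risk ratio: {risk_ratio:.2f}")  # ~0.54
```

The crude risk ratio (~0.54) is close to, but not the same as, the adjusted hazard ratio (0.47), illustrating that the mortality benefit persisted after covariate and propensity adjustment.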
ABSTRACT
Expanding pharmacist-driven antimicrobial stewardship efforts in the emergency department (ED) can improve antibiotic management for both admitted and discharged patients. We piloted a pharmacist-driven culture and rapid diagnostic technology (RDT) follow-up program in patients discharged from the ED. This was a single-center, pre- and post-implementation cohort study examining the impact of a pharmacist-driven culture/RDT follow-up program in the ED. Adult patients discharged from the ED with subsequent positive cultures and/or RDT during the pre-implementation (21 August 2018-18 November 2018) and post-implementation (19 November 2018-15 February 2019) periods were screened for inclusion. The primary endpoints were time from ED discharge to culture/RDT review and completion of follow-up. Secondary endpoints included antimicrobial agent prescribed during outpatient follow-up, repeat ED encounters within 30 days, and hospital admissions within 30 days. Baseline characteristics were analyzed using descriptive statistics. Time-to-event data were analyzed using the Wilcoxon signed-rank test. One hundred twenty-seven patients were included, 64 in the pre-implementation group and 63 in the post-implementation group. There was a 36.3% reduction in the mean time to culture/RDT data review in the post-implementation group (75.2 h vs. 47.9 h, p < 0.001). There was a significant reduction in fluoroquinolone prescribing in the post-implementation group (18.1% vs. 5.4%, p = 0.036). The proportion of patients who had a repeat ED encounter or hospital admission within 30 days was not significantly different between the pre- and post-implementation groups (15.6% vs. 19.1%, p = 0.78 and 9.4% vs. 7.9%, p = 1.0, respectively). Introduction of a pharmacist-driven culture and RDT follow-up program in the ED reduced the time to culture/RDT data review and outpatient intervention and reduced outpatient fluoroquinolone prescribing.
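The reported 36.3% reduction can be verified directly from the two mean review times given in the abstract; a one-line arithmetic check:

```python
# Relative reduction in mean time to culture/RDT review,
# using the pre- and post-implementation means from the abstract.
pre_h, post_h = 75.2, 47.9
reduction = (pre_h - post_h) / pre_h
print(f"{reduction:.1%}")  # 36.3%
```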
ABSTRACT
For decades, the performance of antimicrobial stewardship programs (ASPs) has been measured by incidence rates of hospital-onset Clostridioides difficile and other infections due to multidrug-resistant bacteria. However, these represent indirect and nonspecific ASP metrics. They are often confounded by factors beyond an ASP's control, such as changes in diagnostic testing methods or algorithms and the potential of patient-to-patient transmission. Whereas these metrics remain useful for global assessment of healthcare systems, antimicrobial use represents a direct metric that separates the performance of an ASP from other safety and quality teams within an institution. The evolution of electronic medical records and healthcare informatics has made measurements of antimicrobial use a reality. The US Centers for Disease Control and Prevention's initiative for reporting antimicrobial use and standardized antimicrobial administration ratio in hospitals is highly welcomed. Ultimately, ASPs should be evaluated based on what they do best and what they can control, that is, antimicrobial use within their own institution. This narrative review critically appraises existing stewardship metrics and advocates for adopting antimicrobial use as the primary performance measure. It proposes novel formulas to adjust antimicrobial use based on quality of care and microbiological burden at each institution to allow for meaningful inter-network and inter-facility comparisons.
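The CDC metric mentioned above, the standardized antimicrobial administration ratio, compares observed antimicrobial use against a risk-adjusted prediction. A minimal sketch of that ratio (the function and example numbers are hypothetical, not the CDC's implementation or the review's proposed formulas):

```python
def saar(observed_days_of_therapy, predicted_days_of_therapy):
    """Standardized antimicrobial administration ratio (SAAR):
    observed antimicrobial days of therapy divided by the days
    predicted by a risk-adjusted model. A value above 1 indicates
    more use than predicted; below 1, less."""
    return observed_days_of_therapy / predicted_days_of_therapy

# Hypothetical example: a unit used 820 antimicrobial days of
# therapy where the risk-adjusted model predicted 1000.
print(round(saar(820, 1000), 2))  # 0.82
```

A ratio structured this way is what makes inter-facility comparison meaningful: the denominator absorbs differences in patient mix, so the ratio reflects prescribing behavior rather than case severity.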