ABSTRACT
BACKGROUND: Antibiotic overuse at hospital discharge is common, but there is no metric to evaluate hospital performance at this transition of care. We built a risk-adjusted metric for comparing hospitals on their overall post-discharge antibiotic use. METHODS: This was a retrospective study across all acute-care admissions within the Veterans Health Administration during 2018-2021. For patients discharged to home, we collected data on antibiotics and relevant covariates. We built a zero-inflated negative binomial mixed model with 2 random intercepts for each hospital to predict post-discharge antibiotic exposure and length of therapy (LOT). Data were split into training and testing sets to evaluate model performance using absolute error. Hospital performance was determined by the predicted random intercepts. RESULTS: 1 804 300 patient-admissions across 129 hospitals were included. Antibiotics were prescribed to 41.5% of patients while hospitalized and to 19.5% at discharge. Median LOT among those prescribed post-discharge antibiotics was 7 (IQR, 4-10) days. The predictive model detected post-discharge antibiotic use with fidelity, including accurate identification of any exposure (area under the precision-recall curve = 0.97) and reliable prediction of post-discharge LOT (mean absolute error = 1.48). Based on this model, 39 (30.2%) hospitals prescribed antibiotics less often than expected at discharge and used shorter LOT than expected, while 28 (21.7%) hospitals prescribed antibiotics more often than expected at discharge and used longer LOT than expected. CONCLUSIONS: A model using electronically available data was able to predict antibiotic use at hospital discharge and showed that some hospitals were more successful in reducing antibiotic overuse at this transition of care. This metric may help hospitals identify opportunities for improved antibiotic stewardship at discharge.
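A minimal sketch of how the hospital-level classification above could work, assuming the fitted zero-inflated negative binomial mixed model exposes two predicted random intercepts per hospital (one for the odds of any post-discharge antibiotic exposure, one for LOT). The hospital names and intercept values below are hypothetical, and the sign convention (negative = less use than expected) is an assumption, not the authors' code.

```python
# Sketch (not the study's code): flag hospitals from the two predicted
# random intercepts of the mixed model. A negative intercept on both parts
# suggests less antibiotic use than expected; positive on both, more.

def classify_hospital(exposure_intercept: float, lot_intercept: float) -> str:
    """Classify a hospital by its two predicted random intercepts."""
    if exposure_intercept < 0 and lot_intercept < 0:
        return "lower-than-expected"
    if exposure_intercept > 0 and lot_intercept > 0:
        return "higher-than-expected"
    return "mixed"

# Hypothetical intercepts on the model's link scales.
hospitals = {
    "A": (-0.31, -0.12),  # prescribes less often, shorter LOT
    "B": (0.22, 0.40),    # prescribes more often, longer LOT
    "C": (-0.05, 0.18),   # discordant signals
}
labels = {name: classify_hospital(*ri) for name, ri in hospitals.items()}
```

In the study's framing, hospitals like "A" would fall in the 30.2% group and hospitals like "B" in the 21.7% group.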
Subject(s)
Anti-Bacterial Agents , Hospitals , Patient Discharge , Humans , Anti-Bacterial Agents/therapeutic use , Patient Discharge/statistics & numerical data , Retrospective Studies , Female , Male , Hospitals/statistics & numerical data , Aged , Middle Aged , United States , Antimicrobial Stewardship , Risk Adjustment/methods , Practice Patterns, Physicians'/statistics & numerical data , United States Department of Veterans Affairs , Inappropriate Prescribing/statistics & numerical data
ABSTRACT
BACKGROUND: Urine cultures are nonspecific and often lead to misdiagnosis of urinary tract infection and unnecessary antibiotics. Diagnostic stewardship is a set of procedures that modifies test ordering, processing, and reporting to optimize diagnosis and downstream treatment. In this study, we aimed to develop expert guidance on best practices for urine culture diagnostic stewardship. METHODS: A RAND-modified Delphi approach with a multidisciplinary expert panel was used to ascertain diagnostic stewardship best practices. Clinical questions to guide recommendations were grouped into three thematic areas (ordering, processing, reporting) in practice settings of emergency department, inpatient, ambulatory, and long-term care. Fifteen experts ranked recommendations on a 9-point Likert scale. Recommendations on which the panel did not reach agreement were discussed during a virtual meeting, then a second round of ranking by email was completed. After secondary review of results and panel discussion, a series of guidance statements was developed. RESULTS: One hundred and sixty-five questions were reviewed. The panel reached agreement on 104, leading to 18 overarching guidance statements. The following strategies were recommended to optimize ordering of urine cultures: requiring documentation of symptoms, sending alerts to discourage ordering in the absence of symptoms, and cancelling repeat cultures. For urine culture processing, conditional urine cultures and urine white blood cell count as criteria were supported. For urine culture reporting, appropriate practices included nudges to discourage treatment under specific conditions and selective reporting of antibiotics to guide therapy decisions. CONCLUSIONS: These 18 guidance statements can optimize use of urine cultures for better patient outcomes.
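A sketch of how panel agreement might be scored from the 9-point Likert ratings. The classification rule below is a common RAND/UCLA-style convention (median 7-9 appropriate, 1-3 inappropriate, 4-6 uncertain, with disagreement when a third of panelists rate in each extreme); it is an assumption for illustration, not necessarily this panel's exact rule, and the ratings are hypothetical.

```python
from statistics import median

def classify_recommendation(ratings):
    """Classify one recommendation from 9-point Likert ratings.

    Assumed RAND/UCLA-style rule (illustrative, not the panel's exact one):
    disagreement if at least one third of panelists rate 1-3 AND at least
    one third rate 7-9; otherwise the median decides.
    """
    n = len(ratings)
    low = sum(1 for r in ratings if r <= 3)
    high = sum(1 for r in ratings if r >= 7)
    if low >= n / 3 and high >= n / 3:
        return "no agreement"
    m = median(ratings)
    if m >= 7:
        return "appropriate"
    if m <= 3:
        return "inappropriate"
    return "uncertain"

# 15 hypothetical panelists rating "require symptom documentation".
ratings = [8, 9, 7, 8, 9, 8, 7, 9, 8, 7, 8, 9, 7, 8, 6]
```

Recommendations classified as "no agreement" would be the ones carried into the virtual-meeting discussion and second ranking round.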
Subject(s)
Urinalysis , Urinary Tract Infections , Anti-Bacterial Agents/therapeutic use , Delphi Technique , Humans , Urinary Tract Infections/diagnosis
ABSTRACT
BACKGROUND: Dexamethasone decreases mortality in coronavirus disease 2019 (COVID-19) patients on intensive respiratory support (IRS) but is of uncertain benefit in less severely ill patients. We determined whether early (within 48 h) dexamethasone was associated with mortality in patients hospitalised with COVID-19 not on IRS. METHODS: We included patients admitted to US Veterans Affairs hospitals between 7 June 2020 and 31 May 2021 within 14 days after a positive test for severe acute respiratory syndrome coronavirus 2. Exclusions included recent prior corticosteroids and IRS within 48 h. We used inverse probability of treatment weighting (IPTW) to balance exposed and unexposed groups, and Cox proportional hazards models to determine 90-day all-cause mortality. RESULTS: Of 19 973 total patients (95% men, median age 71 years, 27% black), 15 404 (77%) were without IRS within 48 h. Of these, 3514 out of 9450 (34%) patients on no oxygen received dexamethasone and 1042 (11%) died; 4472 out of 5954 (75%) patients on low-flow nasal cannula (NC) only received dexamethasone and 857 (14%) died. In IPTW-stratified models, patients on no oxygen who received dexamethasone experienced a 76% increased risk of 90-day mortality (hazard ratio (HR) 1.76, 95% CI 1.47-2.12); there was no association with mortality among patients on NC only (HR 1.08, 95% CI 0.86-1.36). CONCLUSIONS: In patients hospitalised with COVID-19, early initiation of dexamethasone was common and was associated with no mortality benefit among those on no oxygen or NC only in the first 48 h; instead, we found evidence of potential harm. These real-world findings do not support the use of early dexamethasone in hospitalised COVID-19 patients without IRS.
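The IPTW step above weights each patient by the inverse probability of the treatment they actually received. A minimal sketch of the standard weight formula follows; the propensity scores and cohort are hypothetical, and the real analysis would feed these weights into a Cox model rather than stop here.

```python
# Minimal sketch of inverse probability of treatment weighting (IPTW):
# given a propensity score p = P(treated | covariates), treated patients
# get weight 1/p and untreated patients get weight 1/(1 - p), so the
# weighted sample balances measured covariates across exposure groups.

def iptw_weight(treated: bool, propensity: float) -> float:
    return 1.0 / propensity if treated else 1.0 / (1.0 - propensity)

# Hypothetical patients: (received early dexamethasone?, propensity score)
cohort = [(True, 0.8), (True, 0.4), (False, 0.5), (False, 0.2)]
weights = [iptw_weight(t, p) for t, p in cohort]
```

Note that a treated patient who was unlikely to be treated (low propensity) gets a large weight, which is why extreme propensity scores are usually trimmed or stabilized in practice.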
Subject(s)
COVID-19 Drug Treatment , Aged , Dexamethasone/therapeutic use , Female , Hospitalization , Humans , Male , SARS-CoV-2
ABSTRACT
We characterized serology following a nursing home outbreak of severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2) where residents were serially tested by reverse-transcription polymerase chain reaction (RT-PCR) and positive residents were cohorted. When tested 46-76 days later, 24 of 26 RT-PCR-positive residents were seropositive; none of the 124 RT-PCR-negative residents had confirmed seropositivity, supporting serial SARS-CoV-2 RT-PCR testing and cohorting in nursing homes.
Subject(s)
COVID-19 , SARS-CoV-2 , Disease Outbreaks , Humans , Polymerase Chain Reaction , Skilled Nursing Facilities
ABSTRACT
BACKGROUND: The 2019 American Thoracic Society/Infectious Diseases Society of America guidelines for community-acquired pneumonia (CAP) revised recommendations for culturing and empiric broad-spectrum antibiotics. We simulated guideline adoption in Veterans Affairs (VA) inpatients. METHODS: For all VA acute hospitalizations for CAP from 2006-2016 nationwide, we compared observed with guideline-expected proportions of hospitalizations with initial blood and respiratory cultures obtained, empiric antibiotic therapy with activity against methicillin-resistant Staphylococcus aureus (anti-MRSA) or Pseudomonas aeruginosa (antipseudomonal), empiric "overcoverage" (receipt of anti-MRSA/antipseudomonal therapy without eventual detection of MRSA/P. aeruginosa on culture), and empiric "undercoverage" (lack of anti-MRSA/antipseudomonal therapy with eventual detection on culture). RESULTS: Of 115 036 CAP hospitalizations over 11 years, 17 877 (16%) were admitted to an intensive care unit (ICU). Guideline adoption would slightly increase respiratory culture (30% to 36%) and decrease blood culture proportions (93% to 36%) in hospital wards and increase both respiratory (40% to 100%) and blood (95% to 100%) cultures in ICUs. Adoption would decrease empiric selection of anti-MRSA (ward: 27% to 1%; ICU: 61% to 8%) and antipseudomonal (ward: 25% to 1%; ICU: 54% to 9%) therapies. This would correspond to greatly decreased MRSA overcoverage (ward: 27% to 1%; ICU: 56% to 8%) and slightly increased MRSA undercoverage (ward: 0.6% to 1.3%; ICU: 0.5% to 3.3%), with similar findings for P. aeruginosa. For all comparisons, P < .001. CONCLUSIONS: Adoption of the 2019 CAP guidelines in this population would substantially change culturing and empiric antibiotic selection practices, with a decrease in overcoverage and slight increase in undercoverage for MRSA and P. aeruginosa.
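The over/undercoverage definitions above reduce to a simple two-flag classification per admission. A sketch, with hypothetical admissions (the study derived both flags from pharmacy and culture data):

```python
# Sketch of the definitions in the abstract: "overcoverage" is empiric
# anti-MRSA (or antipseudomonal) therapy without eventual detection of the
# organism on culture; "undercoverage" is the reverse.

def coverage_status(empiric_coverage: bool, organism_detected: bool) -> str:
    if empiric_coverage and not organism_detected:
        return "overcoverage"
    if not empiric_coverage and organism_detected:
        return "undercoverage"
    return "concordant"

# Hypothetical admissions: (empiric anti-MRSA given?, MRSA eventually cultured?)
admissions = [(True, False), (True, True), (False, False), (False, True)]
statuses = [coverage_status(e, d) for e, d in admissions]
```

The observed-versus-expected comparison in the paper is then a matter of tallying these statuses under current practice versus under simulated guideline-concordant ordering.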
Subject(s)
Community-Acquired Infections , Methicillin-Resistant Staphylococcus aureus , Pneumonia , Veterans , Anti-Bacterial Agents/therapeutic use , Community-Acquired Infections/drug therapy , Humans , Pneumonia/drug therapy
ABSTRACT
BACKGROUND: The Core Elements of Outpatient Antibiotic Stewardship provide a framework to improve antibiotic use. We report the impact of core elements implementation within Veterans Health Administration sites. METHODS: In this quasi-experimental controlled study, effects of an intervention targeting antibiotic prescription for uncomplicated acute respiratory tract infections (ARIs) were assessed. Outcomes included per-visit antibiotic prescribing, treatment appropriateness, ARI revisits, hospitalization, and ARI diagnostic changes over a 3-year pre-implementation period and 1-year post-implementation period. Logistic regression adjusted for covariates (odds ratio [OR], 95% confidence interval [CI]) and a difference-in-differences analysis compared outcomes between intervention and control sites. RESULTS: From 2014-2019, there were 16 712 and 51 275 patient visits within 10 intervention and 40 control sites, respectively. Antibiotic prescribing rates pre- and post-implementation within intervention sites were 59.7% and 41.5%, compared to 73.5% and 67.2% within control sites, respectively (difference-in-differences, P < .001). The pre- to post-implementation OR to receive appropriate therapy increased within intervention sites (OR, 1.67; 95% CI, 1.31-2.14) but remained unchanged within control sites (OR, 1.04; 95% CI, .91-1.19). ARI-related return visits post-implementation (-1.3% vs -2.0%; difference-in-differences P = .76) did not differ, but all-cause hospitalization was lower within intervention sites (-0.5% vs -0.2%; difference-in-differences P = .02). The OR to diagnose non-specific ARI compared with non-ARI diagnoses increased post-implementation for intervention (OR, 1.27; 95% CI, 1.21-1.34) but not control (OR, 0.97; 95% CI, .94-1.01) sites. CONCLUSIONS: Implementation of the core elements was associated with reduced antibiotic prescribing for ARIs and a reduction in hospitalizations. Diagnostic coding changes were observed.
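The headline difference-in-differences result can be reproduced from the four prescribing rates reported above. A worked sketch (the paper's model additionally adjusts for covariates, so this crude contrast is illustrative only):

```python
# Difference-in-differences: the pre-to-post change at intervention sites
# minus the pre-to-post change at control sites, in percentage points.

def did(pre_treat, post_treat, pre_ctrl, post_ctrl):
    return (post_treat - pre_treat) - (post_ctrl - pre_ctrl)

# Rates from the abstract: intervention 59.7% -> 41.5%, control 73.5% -> 67.2%.
effect = did(59.7, 41.5, 73.5, 67.2)
# (41.5 - 59.7) - (67.2 - 73.5) = -18.2 - (-6.3) = -11.9 percentage points:
# an extra ~11.9-point drop in ARI prescribing at intervention sites, under
# the parallel-trends assumption.
```

The control-site change (-6.3 points) absorbs the secular trend, which is why the estimate is smaller than the raw 18.2-point drop at intervention sites.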
Subject(s)
Antimicrobial Stewardship , Respiratory Tract Infections , Anti-Bacterial Agents/therapeutic use , Emergency Service, Hospital , Humans , Inappropriate Prescribing , Outpatients , Practice Patterns, Physicians' , Primary Health Care , Respiratory Tract Infections/drug therapy , Veterans Health
ABSTRACT
INTRODUCTION: The optimal method for implementing hospital-level restrictions for antibiotics that carry a high risk of Clostridioides difficile infection has not been identified. We aimed to explore barriers and facilitators to implementing restrictions for fluoroquinolones and third/fourth-generation cephalosporins. METHODS: This mixed-methods study across a purposeful sample of 15 acute-care, geographically dispersed Veterans Health Administration hospitals included electronic surveys and semi-structured interviews (September 2018 to May 2019). Surveys on stewardship strategies were administered at each hospital and summarized with descriptive statistics. Interviews were performed with 30 antibiotic stewardship programme (ASP) champions across all 15 sites and 19 additional stakeholders at a subset of 5 sites; transcripts were analysed using thematic content analysis. RESULTS: The most restricted agent was moxifloxacin, which was restricted at 12 (80%) sites. None of the 15 hospitals restricted ceftriaxone. Interviews identified differing opinions on the feasibility of restricting third/fourth-generation cephalosporins and fluoroquinolones. Some participants felt that restrictions could be implemented in a way that was not burdensome to clinicians and did not interfere with timely antibiotic administration. Others expressed concerns about restricting these agents, particularly through prior approval, given their frequent use, the difficulty of enforcing restrictions and potential unintended consequences of steering clinicians towards non-restricted antibiotics. A variety of stewardship strategies were perceived to be effective at reducing the use of these agents. CONCLUSIONS: Across 15 hospitals, there were differing opinions on the feasibility of implementing antibiotic restrictions for third/fourth-generation cephalosporins and fluoroquinolones. 
While barriers to implementing restrictions were often perceived as high, many hospitals were using restrictions effectively and reported few barriers in practice.
Subject(s)
Cephalosporins , Fluoroquinolones , Anti-Bacterial Agents/therapeutic use , Feasibility Studies , Hospitals , Humans , Veterans Health
ABSTRACT
BACKGROUND: People with human immunodeficiency virus (PWH) face increased risks for heart failure and adverse heart failure outcomes. Myocardial steatosis predisposes to diastolic dysfunction, a heart failure precursor. We aimed to characterize myocardial steatosis and associated potential risk factors among a subset of the Randomized Trial to Prevent Vascular Events in HIV (REPRIEVE) participants. METHODS: Eighty-two PWH without known heart failure successfully underwent cardiovascular magnetic resonance spectroscopy, yielding data on intramyocardial triglyceride (IMTG) content (a continuous marker for myocardial steatosis extent). Logistic regression models were applied to investigate associations between select clinical characteristics and odds of increased or markedly increased IMTG content. RESULTS: Median (Q1, Q3) IMTG content was 0.59% (0.28%, 1.15%). IMTG content was increased (> 0.5%) among 52% and markedly increased (> 1.5%) among 22% of participants. Parameters associated with increased IMTG content included age (P = .013), body mass index (BMI) ≥ 25 kg/m2 (P = .055), history of intravenous drug use (IVDU) (P = .033), and nadir CD4 count < 350 cells/mm³ (P = .055). Age and BMI ≥ 25 kg/m2 were additionally associated with increased odds of markedly increased IMTG content (P = .049 and P = .046, respectively). CONCLUSIONS: A substantial proportion of antiretroviral therapy-treated PWH exhibited myocardial steatosis. Age, BMI ≥ 25 kg/m2, low nadir CD4 count, and history of IVDU emerged as possible risk factors for myocardial steatosis in this group. CLINICAL TRIALS REGISTRATION: NCT02344290; NCT03238755.
Subject(s)
Cardiomyopathies/epidemiology , Cardiomyopathies/pathology , Adipose Tissue , Anti-Retroviral Agents/therapeutic use , Body Mass Index , CD4 Lymphocyte Count , Female , HIV Infections/drug therapy , Heart Disease Risk Factors , Humans , Magnetic Resonance Imaging , Magnetic Resonance Spectroscopy , Male , Middle Aged , Triglycerides
ABSTRACT
BACKGROUND: Antimicrobial stewards may benefit from comparative data to inform interventions that promote optimal inpatient antimicrobial use. METHODS: Antimicrobial stewards from 8 geographically dispersed Veterans Affairs (VA) inpatient facilities participated in the development of antimicrobial use visualization tools that allowed for comparison to facilities of similar complexity. The visualization tools consisted of an interactive web-based antimicrobial dashboard and, later, a standardized antimicrobial usage report updated at user-selected intervals. Stewards participated in monthly learning collaboratives. The percent change in average monthly antimicrobial use (all antimicrobial agents, anti-methicillin-resistant Staphylococcus aureus [anti-MRSA] agents, and antipseudomonal agents) was analyzed using a pre-post (January 2014-January 2016 vs July 2016-January 2018) design with segmented regression and external comparison with uninvolved control facilities (n = 118). RESULTS: Intervention sites demonstrated a 2.1% decrease (95% confidence interval [CI], -5.7% to 1.6%) in total antimicrobial use pre-post intervention vs a 2.5% increase (95% CI, 0.8% to 4.1%) in nonintervention sites (absolute difference, 4.6%; P = .025). Anti-MRSA antimicrobial use decreased 11.3% (95% CI, -16.0% to -6.3%) at intervention sites vs a 6.6% decrease (95% CI, -9.1% to -3.9%) at nonintervention sites (absolute difference, 4.7%; P = .092). Antipseudomonal antimicrobial use decreased 3.4% (95% CI, -8.2% to 1.7%) at intervention sites vs a 3.6% increase (95% CI, 0.8% to 6.5%) at nonintervention sites (absolute difference, 7.0%; P = .018). CONCLUSIONS: Comparative data visualization tool use by stewards at 8 VA facilities was associated with significant reductions in overall antimicrobial and antipseudomonal use relative to uninvolved facilities.
Subject(s)
Anti-Infective Agents , Antimicrobial Stewardship , Methicillin-Resistant Staphylococcus aureus , Anti-Bacterial Agents/therapeutic use , Anti-Infective Agents/therapeutic use , Electronics , Humans
ABSTRACT
BACKGROUND: Norovirus is an important cause of epidemic acute gastroenteritis (AGE), yet the burden of endemic disease in adults has not been well documented. We estimated the prevalence and incidence of outpatient and community-acquired inpatient norovirus AGE at 4 Veterans Affairs Medical Centers (VAMC) (Atlanta, Georgia; Bronx, New York; Houston, Texas; and Los Angeles, California) and examined trends over 4 surveillance years. METHODS: From November 2011 to September 2015, stool specimens collected within 7 days of AGE symptom onset for clinician-requested diagnostic testing were tested for norovirus, and positive samples were genotyped. Incidence was calculated by multiplying norovirus prevalence among tested specimens by AGE-coded outpatient encounters and inpatient discharges, and dividing by the number of unique patients served. RESULTS: Of 1603 stool specimens tested, 6% were positive for norovirus; GII.4 viruses (GII.4 New Orleans [17%] and GII.4 Sydney [47%]) were the most common genotypes. Overall prevalence and outpatient and inpatient community-acquired incidence followed a seasonal pattern, with higher median rates during November-April (9.2%, 376/100 000, and 45/100 000, respectively) compared to May-October (3.0%, 131/100 000, and 13/100 000, respectively). An alternate-year pattern was also detected, with highest peak prevalence and outpatient and inpatient community-acquired norovirus incidence rates in the first and third years of surveillance (14%-25%, 349-613/100 000, and 43-46/100 000, respectively). CONCLUSIONS: This multiyear analysis of laboratory-confirmed AGE surveillance from 4 VAMCs demonstrates dynamic intra- and interannual variability in prevalence and incidence of outpatient and inpatient community-acquired norovirus in US Veterans, highlighting the burden of norovirus disease in this adult population.
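The METHODS incidence calculation is a direct formula: prevalence among tested specimens, scaled by AGE-coded encounters over the unique-patient denominator. A sketch with hypothetical counts (not the study's actual tallies):

```python
# Incidence per 100 000 as described in METHODS: norovirus prevalence among
# tested specimens, multiplied by AGE-coded encounters, divided by the
# number of unique patients served.

def incidence_per_100k(positives, tested, age_encounters, unique_patients):
    prevalence = positives / tested
    return prevalence * age_encounters / unique_patients * 100_000

# Hypothetical winter season: 9.2% prevalence (92/1000 specimens),
# 4000 AGE-coded outpatient encounters, 98 000 unique patients served.
rate = incidence_per_100k(92, 1000, 4_000, 98_000)
```

This indirect approach lets surveillance sites estimate population incidence without testing every AGE encounter, at the cost of assuming tested specimens are representative.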
Subject(s)
Caliciviridae Infections , Gastroenteritis , Norovirus , Veterans , Adult , Caliciviridae Infections/epidemiology , Feces , Gastroenteritis/epidemiology , Genotype , Georgia/epidemiology , Humans , Incidence , Infant , Los Angeles , New York , Norovirus/genetics , Phylogeny , Texas , United States/epidemiology
ABSTRACT
BACKGROUND: The effect of human immunodeficiency virus (HIV) on the development of peripheral artery disease (PAD) remains unclear. We investigated whether HIV infection is associated with an increased risk of PAD after adjustment for traditional atherosclerotic risk factors in a large cohort of HIV-infected (HIV+) and demographically similar HIV-uninfected veterans. METHODS: We studied participants in the Veterans Aging Cohort Study from April 1, 2003 through December 31, 2014. We excluded participants with known prior PAD or prevalent cardiovascular disease (myocardial infarction, stroke, coronary heart disease, and congestive heart failure) and analyzed the effect of HIV status on the risk of incident PAD events after adjusting for demographics, PAD risk factors, substance use, CD4 cell count, HIV-1 ribonucleic acid, and antiretroviral therapy. The primary outcome was incident peripheral artery disease events. Secondary outcomes included mortality and amputation in subjects with incident PAD events by HIV infection status, viral load, and CD4 count. RESULTS: Among 91 953 participants, over a median follow-up of 9.0 years, there were 7708 incident PAD events. Rates of incident PAD events per 1000 person-years were higher among HIV+ (11.9; 95% confidence interval [CI], 11.5-12.4) than uninfected veterans (9.9; 95% CI, 9.6-10.1). After adjustment for demographics, PAD risk factors, and other covariates, HIV+ veterans had an increased risk of incident PAD events compared with uninfected veterans (hazard ratio [HR], 1.19; 95% CI, 1.13-1.25). This risk was highest among those with time-updated HIV viral load >500 copies/mL (HR, 1.51; 95% CI, 1.38-1.65) and CD4 cell counts <200 cells/mm3 (HR, 1.91; 95% CI, 1.71-2.13). In contrast, HIV+ veterans with time-updated CD4 cell count ≥500 cells/mm3 had no increased risk of PAD (HR, 1.03; 95% CI, 0.96-1.11). Mortality rates after incident PAD events were high regardless of HIV status.
HIV infection did not affect rates of amputation after incident PAD events. CONCLUSIONS: Infection with HIV is associated with a 19% increased risk of PAD beyond that explained by traditional atherosclerotic risk factors. However, for those with sustained CD4 cell counts <200 cells/mm3, the risk of incident PAD events is nearly 2-fold higher whereas for those with sustained CD4 cell counts ≥500 cells/mm3 there is no excess risk of incident PAD events compared with uninfected people.
Subject(s)
HIV Infections/epidemiology , HIV-1/physiology , Peripheral Arterial Disease/epidemiology , Adult , CD4 Lymphocyte Count , Cohort Studies , Female , Follow-Up Studies , HIV Infections/diagnosis , HIV Infections/mortality , Humans , Male , Middle Aged , Peripheral Arterial Disease/diagnosis , Peripheral Arterial Disease/mortality , Prognosis , Risk , Survival Analysis , United States/epidemiology , Veterans
ABSTRACT
Background: Viral suppression is a primary marker of HIV treatment success. Persons with HIV are at increased risk for AIDS-defining cancer (ADC) and several types of non-AIDS-defining cancer (NADC), some of which are caused by oncogenic viruses. Objective: To determine whether viral suppression is associated with decreased cancer risk. Design: Prospective cohort. Setting: Department of Veterans Affairs. Participants: HIV-positive veterans (n = 42 441) and demographically matched uninfected veterans (n = 104 712) from 1999 to 2015. Measurements: Standardized cancer incidence rates and Poisson regression rate ratios (RRs; HIV-positive vs. uninfected persons) by viral suppression status (unsuppressed: person-time with HIV RNA levels ≥500 copies/mL; early suppression: initial 2 years with HIV RNA levels <500 copies/mL; long-term suppression: person-time after early suppression with HIV RNA levels <500 copies/mL). Results: Cancer incidence for HIV-positive versus uninfected persons was highest for unsuppressed persons (RR, 2.35 [95% CI, 2.19 to 2.51]), lower among persons with early suppression (RR, 1.99 [CI, 1.87 to 2.12]), and lowest among persons with long-term suppression (RR, 1.52 [CI, 1.44 to 1.61]). This trend was strongest for ADC (unsuppressed: RR, 22.73 [CI, 19.01 to 27.19]; early suppression: RR, 9.48 [CI, 7.78 to 11.55]; long-term suppression: RR, 2.22 [CI, 1.69 to 2.93]), much weaker for NADC caused by viruses (unsuppressed: RR, 3.82 [CI, 3.24 to 4.49]; early suppression: RR, 3.42 [CI, 2.95 to 3.97]; long-term suppression: RR, 3.17 [CI, 2.78 to 3.62]), and absent for NADC not caused by viruses. Limitation: Lower viral suppression thresholds, duration of long-term suppression, and effects of CD4+ and CD8+ T-cell counts were not thoroughly evaluated. Conclusion: Antiretroviral therapy resulting in long-term viral suppression may contribute to cancer prevention, to a greater degree for ADC than for NADC. 
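The rate ratios above compare cancer incidence between exposure groups. A minimal sketch of a crude incidence rate ratio with a Wald-type 95% CI on the log scale follows; the paper's RRs come from Poisson regression with standardization, so this unadjusted version, with hypothetical counts, is illustrative only.

```python
import math

# Crude incidence rate ratio (exposed vs unexposed) with an approximate
# 95% CI: exp(ln RR +/- 1.96 * sqrt(1/a + 1/b)), where a and b are the
# case counts in each group. Illustrative, not the study's adjusted model.

def rate_ratio_ci(cases_exposed, py_exposed, cases_unexposed, py_unexposed):
    rr = (cases_exposed / py_exposed) / (cases_unexposed / py_unexposed)
    se = math.sqrt(1 / cases_exposed + 1 / cases_unexposed)
    lo = math.exp(math.log(rr) - 1.96 * se)
    hi = math.exp(math.log(rr) + 1.96 * se)
    return rr, lo, hi

# Hypothetical counts: 400 cancers over 20 000 person-years (HIV-positive)
# vs 200 cancers over 20 000 person-years (uninfected).
rr, lo, hi = rate_ratio_ci(400, 20_000, 200, 20_000)
```

With these made-up counts the RR is 2.0 with a CI of roughly 1.69 to 2.37; the study's stratification by suppression status simply computes such contrasts within each person-time stratum.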
Patients with long-term viral suppression still had excess cancer risk. Primary Funding Source: National Cancer Institute and National Institute on Alcohol Abuse and Alcoholism of the National Institutes of Health.
Subject(s)
HIV Infections/complications , Neoplasms/etiology , Adult , Aged , Anti-HIV Agents/therapeutic use , Case-Control Studies , Female , HIV Infections/drug therapy , Humans , Incidence , Male , Middle Aged , Neoplasms/epidemiology , Poisson Distribution , Prospective Studies , Risk Factors , United States/epidemiology , Veterans/statistics & numerical data , Viral Load , Young Adult
ABSTRACT
Objectives: Inappropriate antibiotic use poses a serious threat to patient safety. Antimicrobial stewardship programmes (ASPs) may optimize antimicrobial use and improve patient outcomes, but their implementation remains an organizational challenge. Using the Promoting Action on Research Implementation in Health Services (PARiHS) framework, this study aimed to identify organizational factors that may facilitate ASP design, development and implementation. Methods: Among 130 Veterans Affairs facilities that offered acute care, we classified organizational variables supporting antimicrobial stewardship activities into three PARiHS domains: evidence to encompass sources of knowledge; contexts to translate evidence into practice; and facilitation to enhance the implementation process. We conducted a series of exploratory factor analyses to identify conceptually linked factor scales. Cronbach's alphas were calculated. Variables with large uniqueness values were left as single factors. Results: We identified 32 factors, including six constructs derived from factor analyses under the three PARiHS domains. In the evidence domain, four factors described guidelines and clinical pathways. The context domain was broken into three main categories: (i) receptive context (15 factors describing resources, affiliations/networks, formalized policies/practices, decision-making, receptiveness to change); (ii) team functioning (1 factor); and (iii) evaluation/feedback (5 factors). Within facilitation, two factors described facilitator roles and tasks and five captured skills and training. Conclusions: We mapped survey data onto PARiHS domains to identify factors that may be adapted to facilitate ASP uptake. Our model encompasses mostly mutable factors whose relationships with performance outcomes may be explored to optimize antimicrobial use. 
Our framework also provides an analytical model for determining whether leveraging existing organizational processes can potentially optimize ASP performance.
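The internal-consistency check above relies on Cronbach's alpha for each factor-derived scale. A self-contained sketch of the standard formula, alpha = k/(k-1) * (1 - sum of item variances / variance of totals), with hypothetical survey data (the study's items and facilities are not reproduced here):

```python
from statistics import pvariance

# Cronbach's alpha for a multi-item scale: measures whether items grouped
# together (e.g., by factor analysis) vary consistently across respondents.

def cronbach_alpha(items):
    """items: one list of scores per survey item, aligned by respondent."""
    k = len(items)
    item_vars = sum(pvariance(scores) for scores in items)
    totals = [sum(vals) for vals in zip(*items)]
    return k / (k - 1) * (1 - item_vars / pvariance(totals))

# Hypothetical 3-item "receptive context" scale answered by 5 facilities.
scale = [
    [4, 5, 3, 5, 4],
    [4, 4, 3, 5, 4],
    [5, 5, 3, 4, 4],
]
alpha = cronbach_alpha(scale)
```

Values near 1 indicate a coherent scale; items that drag alpha down are candidates to be left as single factors, as the study did for variables with large uniqueness values.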
Subject(s)
Antimicrobial Stewardship/organization & administration , United States Department of Veterans Affairs/organization & administration , Veterans , Emergency Medical Services , Factor Analysis, Statistical , Health Facilities , Humans , United States
ABSTRACT
BACKGROUND: Bacteriuria contributes to antibiotic overuse through treatment of asymptomatic bacteriuria (ASB) and long durations of therapy for symptomatic urinary tract infections (UTIs), yet large-scale evaluations of bacteriuria management among inpatients are lacking. METHODS: Inpatients with bacteriuria were classified as asymptomatic or symptomatic based on established criteria applied to data collected by manual chart review. We examined frequency of treatment of ASB, factors associated with treatment of ASB, durations of therapy, and frequency of complications including Clostridium difficile infection, readmission, and all-cause mortality within 28 days of discharge. RESULTS: Among 2225 episodes of bacteriuria, 64% were classified as ASB. After excluding patients with non-UTI indications for antibiotics, 72% of patients with ASB received antibiotics. When evaluating only patients not meeting systemic inflammatory response syndrome (SIRS) criteria, 68% of patients with ASB received antibiotics. The mean (±SD) days of antibiotic therapy for ASB, cystitis, catheter-associated UTI (CA-UTI), and pyelonephritis were 10.0 (4.5), 11.4 (4.7), 12.0 (6.1), and 13.6 (5.3), respectively. In sum, 14% of patients with ASB were treated for greater than 14 days, and fluoroquinolones were the most commonly used empiric antibiotic for ASB (245/691 [35%]). Complications were rare but more common among patients with ASB treated with antibiotics. CONCLUSIONS: The majority of bacteriuria among inpatient veterans is due to ASB, with high rates of treatment of ASB and prolonged durations of therapy for both ASB and symptomatic UTIs.
Subject(s)
Anti-Bacterial Agents/therapeutic use , Asymptomatic Infections/therapy , Bacteriuria/drug therapy , Fluoroquinolones/therapeutic use , Hospitals, Veterans , Aged , Aged, 80 and over , Anti-Bacterial Agents/administration & dosage , Anti-Bacterial Agents/adverse effects , Bacteriuria/etiology , Catheter-Related Infections/drug therapy , Catheter-Related Infections/etiology , Cause of Death , Clostridioides difficile , Clostridium Infections/chemically induced , Cystitis/drug therapy , Female , Fluoroquinolones/administration & dosage , Fluoroquinolones/adverse effects , Humans , Male , Middle Aged , Patient Readmission , Pyelonephritis/drug therapy , Urinary Catheters/adverse effects
ABSTRACT
Background: Drivers of differences in Clostridium difficile incidence across acute and long-term care facilities are poorly understood. We sought to obtain a comprehensive picture of C. difficile incidence and risk factors in acute and long-term care. Methods: We conducted a case-cohort study of persons spending at least 3 days in one of 131 acute care or 120 long-term care facilities managed by the United States Veterans Health Administration between 2006 and 2012. Patient (n = 8) and facility factors (n = 5) were included in analyses. The outcome was the incidence of facility-onset laboratory-identified C. difficile infection (CDI), defined as a person with a positive C. difficile test without a positive test in the prior 8 weeks. Results: CDI incidence in acute care was 5 times that observed in long-term care (median, 15.6 vs 3.2 per 10000 person-days). History of antibiotic use was greater in acute care compared to long-term care (median, 739 vs 513 per 1000 person-days) and explained 72% of the variation in C. difficile rates. Importation of C. difficile cases (acute care: patients with recent long-term care attributable infection; long-term care: residents with recent acute care attributable infection) was 3 times higher in long-term care as compared to acute care (median, 52.3 vs 16.2 per 10000 person-days). Conclusions: Facility-level antibiotic use was the main factor driving differences in CDI incidence between acute and long-term care. Importation of acute care C. difficile cases was a greater concern for long-term care as compared to importation of long-term care cases for acute care.
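The outcome definition above (a positive test without a positive test in the prior 8 weeks) is the standard laboratory-identified (LabID) event rule and can be sketched directly. The test dates below are hypothetical; the real analysis applied this per patient and facility.

```python
from datetime import date

# Sketch of the LabID event rule in METHODS: a positive C. difficile test
# counts as an incident infection only if the patient had no positive test
# in the prior 8 weeks (56 days).

def labid_events(test_dates, window_days=56):
    """Return the positive-test dates that count as incident events."""
    events = []
    last_positive = None
    for d in sorted(test_dates):
        if last_positive is None or (d - last_positive).days > window_days:
            events.append(d)
        last_positive = d  # any positive resets the lookback window
    return events

# One hypothetical patient's positive tests.
positives = [date(2012, 1, 5), date(2012, 1, 20), date(2012, 4, 1)]
incident = labid_events(positives)
```

Here the January 20 positive is suppressed (15 days after the first) while the April 1 positive, 72 days later, counts as a new event; facility incidence is then events divided by person-days at risk.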
Subject(s)
Clostridium Infections/epidemiology , Cross Infection/epidemiology , Aged , Aged, 80 and over , Anti-Bacterial Agents/administration & dosage , Anti-Bacterial Agents/therapeutic use , Clostridioides difficile , Clostridium Infections/drug therapy , Cross Infection/drug therapy , Female , Humans , Incidence , Male , Middle Aged , Multilevel Analysis , Patient Transfer , Retrospective Studies , Risk Factors , Treatment Outcome , United States/epidemiology
ABSTRACT
BACKGROUND: Patients with human immunodeficiency virus (HIV) and/or chronic hepatitis C virus (HCV) infection may be prescribed statins as treatment for metabolic/cardiovascular disease, but it remains unclear if the risk of acute liver injury (ALI) is increased for statin initiators compared to nonusers in groups classified by HIV/HCV status. METHODS: We conducted a cohort study to compare rates of ALI in statin initiators vs nonusers among 7686 HIV/HCV-coinfected, 8155 HCV-monoinfected, 17 739 HIV-monoinfected, and 36 604 uninfected persons in the Veterans Aging Cohort Study (2000-2012). We determined development of (1) liver aminotransferases >200 U/L, (2) severe ALI (coagulopathy with hyperbilirubinemia), and (3) death, all within 18 months. Cox regression was used to determine propensity score-adjusted hazard ratios (HRs) with 95% confidence intervals (CIs) of outcomes in statin initiators compared to nonusers across the groups. RESULTS: Among HIV/HCV-coinfected patients, statin initiators had lower risks of aminotransferase levels >200 U/L (HR, 0.66 [95% CI, .53-.83]), severe ALI (HR, 0.23 [95% CI, .12-.46]), and death (HR, 0.36 [95% CI, .28-.46]) compared with statin nonusers. In the setting of chronic HCV alone, statin initiators had lower risks of aminotransferase elevations (HR, 0.57 [95% CI, .45-.72]), severe ALI (HR, 0.15 [95% CI, .06-.37]), and death (HR, 0.42 [95% CI, .32-.54]) than nonusers. Among HIV-monoinfected patients, statin initiators had lower risks of aminotransferase increases (HR, 0.52 [95% CI, .40-.66]), severe ALI (HR, 0.26 [95% CI, .13-.55]), and death (HR, 0.19 [95% CI, .16-.23]) compared with nonusers. Results were similar among uninfected persons. CONCLUSIONS: Regardless of HIV and/or chronic HCV status, statin initiators had a lower risk of ALI and death within 18 months compared with statin nonusers.
Subject(s)
Chemical and Drug Induced Liver Injury/epidemiology , HIV Infections/epidemiology , Hepatitis C, Chronic/epidemiology , Hydroxymethylglutaryl-CoA Reductase Inhibitors/adverse effects , Chemical and Drug Induced Liver Injury/mortality , Female , HIV Infections/complications , Hepatitis C, Chronic/complications , Humans , Incidence , Male , Middle Aged , Retrospective Studies , Risk Factors
ABSTRACT
OBJECTIVES: To understand clinicians' impressions of and decision-making processes regarding an informatics-supported antibiotic timeout program to re-evaluate the appropriateness of continuing vancomycin and piperacillin/tazobactam. METHODS: We implemented a multi-pronged informatics intervention, based on Dual Process Theory, to prompt discontinuation of unwarranted vancomycin and piperacillin/tazobactam on or after day three in a large Veterans Affairs Medical Center. Two workflow changes were introduced to facilitate cognitive deliberation about continuing antibiotics at day three: (1) teams completed an electronic template note, and (2) a paper summary of clinical and antibiotic-related information was provided to clinical teams. Shortly after starting the intervention, six focus groups were conducted with users or potential users. Interviews were recorded and transcribed. Iterative thematic analysis identified recurrent themes from feedback. RESULTS: Themes that emerged are represented by the following quotations: (1) captures and controls attention ("it reminds us to think about it"), (2) enhances informed and deliberative reasoning ("it makes you think twice"), (3) redirects decision direction ("because [there was no indication] I just [discontinued] it without even trying"), (4) fosters autonomy and improves team empowerment ("the template forces the team to really discuss it"), and (5) limits use of emotion-based heuristics ("my clinical concern is high enough I think they need more aggressive therapy"). CONCLUSIONS: Requiring template completion to continue antibiotics nudged clinicians to re-assess the appropriateness of specified antibiotics. Antibiotic timeouts can encourage deliberation on overprescribed antibiotics without substantially curtailing autonomy. An effective nudge should take into account clinicians' time, workflow, and thought processes.
Subject(s)
Anti-Bacterial Agents/administration & dosage , Decision Making , Practice Patterns, Physicians' , Cognition , Hospitals, Veterans , Humans
ABSTRACT
PURPOSE: Among patients dually infected with human immunodeficiency virus (HIV) and chronic hepatitis C virus (HCV), use of antiretroviral therapy (ART) containing mitochondrial toxic nucleoside reverse transcriptase inhibitors (mtNRTIs) might induce chronic hepatic injury, which could accelerate HCV-associated liver fibrosis and increase the risk of hepatic decompensation and death. METHODS: We conducted a cohort study among 1747 HIV/HCV-coinfected patients initiating NRTI-containing ART within the Veterans Aging Cohort Study (2002-2009) to determine if cumulative mtNRTI use increased the risk of hepatic decompensation and death among HIV/HCV-coinfected patients. Separate marginal structural models were used to estimate hazard ratios (HRs) of each outcome associated with cumulative exposure to ART regimens that contain mtNRTIs versus regimens that contain other NRTIs. RESULTS: Over 7033 person-years, we observed 97 (5.6%) decompensation events (incidence rate, 13.8 events/1000 person-years) and 125 (7.2%) deaths (incidence rate, 17.8 events/1000 person-years). The risk of hepatic decompensation increased with cumulative mtNRTI use (1-11 mo: HR, 1.79 [95% confidence interval (CI), 0.74-4.31]; 12-35 mo: HR, 1.39 [95% CI, 0.68-2.87]; 36-71 mo: HR, 2.27 [95% CI, 0.92-5.60]; >71 mo: HR, 4.66 [95% CI, 1.04-20.83]; P = .045) versus nonuse. Cumulative mtNRTI use also increased risk of death (1-11 mo: HR, 2.24 [95% CI, 1.04-4.81]; 12-35 mo: HR, 2.05 [95% CI, 0.68-6.20]; 36-71 mo: HR, 3.04 [95% CI, 1.12-8.26]; >71 mo: HR, 3.93 [95% CI, 0.75-20.50]; P = .030). CONCLUSIONS: These findings suggest that cumulative mtNRTI use may increase the risk of hepatic decompensation and death in HIV/HCV coinfection. These drugs should be avoided when alternatives exist for HIV/HCV-coinfected patients.
Subject(s)
Chemical and Drug Induced Liver Injury, Chronic/complications , Coinfection/drug therapy , HIV Infections/drug therapy , Hepatitis C, Chronic/drug therapy , Liver Failure/epidemiology , Mitochondria/drug effects , Reverse Transcriptase Inhibitors/adverse effects , Chemical and Drug Induced Liver Injury, Chronic/etiology , Female , Humans , Incidence , Liver/drug effects , Liver Cirrhosis/complications , Liver Failure/etiology , Male , Middle Aged , Proportional Hazards Models , Retrospective Studies
ABSTRACT
BACKGROUND: Guidelines now recommend limited use of routine CD4 cell count testing in human immunodeficiency virus (HIV)-infected patients with successful viral control who are not immunocompromised. METHODS: CD4 and viral load tests for patients receiving HIV care from the US Department of Veterans Affairs during 2009-2013 were evaluated to determine trends in CD4 testing frequency and the number, cost, and results of CD4 tests considered optional under the guidelines. RESULTS: There were 28 530 individuals with sufficient testing to be included. At the time of the last CD4 test, 19.8% of the cohort was eligible for optional monitoring and 15.6% for minimal monitoring. CD4 testing frequency declined by 10.8% over 4 years, reducing the direct cost of testing by US$196 000 per year. Full implementation of new treatment guidelines could reduce CD4 testing a further 28.9%, an additional annual savings of US$600 000. CD4 tests conducted during periods of potentially reduced monitoring were rarely <200 cells/µL: 1.1% of the tests conducted when minimal monitoring was recommended and just 0.3% of tests conducted when optional monitoring was recommended were less than this value. CONCLUSIONS: Reduced CD4 monitoring of HIV-infected patients would result in modest cost savings and likely reduce patient anxiety, with little or no impact on the quality of care. Veterans Affairs has made substantial progress in reducing the frequency of optional CD4 testing, but further reductions may still be warranted.
Subject(s)
CD4 Lymphocyte Count , Cost-Benefit Analysis , HIV Infections , Veterans , Anti-HIV Agents/therapeutic use , CD4 Lymphocyte Count/economics , CD4 Lymphocyte Count/statistics & numerical data , HIV Infections/diagnosis , HIV Infections/drug therapy , HIV Infections/economics , HIV Infections/virology , Humans , Practice Guidelines as Topic , Viral Load
ABSTRACT
BACKGROUND: After adjustment for cardiovascular risk factors and despite higher mortality, those with human immunodeficiency virus (HIV+) have a greater risk of acute myocardial infarction (AMI) than uninfected individuals. METHODS: We included HIV+ individuals who started combination antiretroviral therapy (cART) in the Veterans Aging Cohort Study (VACS) from 1996 to 2012. We fit multivariable proportional hazards models for baseline, time-updated, and cumulative measures of HIV-1 RNA, CD4 counts, and the VACS Index. We used the trapezoidal rule to build the following cumulative measures: viremia copy-years, CD4-years, and VACS Index score-years, captured from 180 days after cART initiation until AMI, death, last clinic visit, or 30 September 2012. The primary outcomes were incident AMI (Medicaid, Medicare, and Veterans Affairs International Classification of Diseases-9 codes) and death. RESULTS: A total of 8168 HIV+ individuals (53 861 person-years) were analyzed with 196 incident AMIs and 1710 deaths. Controlling for known cardiovascular risk factors, 6 of the 9 metrics predicted AMI and all metrics predicted mortality. Time-updated VACS Index had the lowest Akaike information criterion among all models for both outcomes. A time-updated VACS Index score of 55+ was associated with a hazard ratio (HR) of 3.31 (95% confidence interval [CI], 2.11-5.20) for AMI and an HR of 31.77 (95% CI, 26.17-38.57) for mortality. CONCLUSIONS: Time-updated VACS Index provided better AMI and mortality prediction than CD4 count and HIV-1 RNA, suggesting that current health determines risk more accurately than prior history and that risk assessment can be improved by biomarkers of organ injury.
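The cumulative measures described above (viremia copy-years, CD4-years, VACS Index score-years) are areas under a time-updated biomarker curve computed with the trapezoidal rule. A minimal sketch of that construction, not the authors' code, using invented measurement times and viral loads (the abstract gives no patient-level data):

```python
# Illustrative only: "viremia copy-years" as the trapezoidal area under a
# patient's HIV-1 RNA curve. Times and values below are hypothetical.
def copy_years(times, values):
    """Trapezoidal area under the curve: sum of 0.5*(y1+y2)*(t2-t1)."""
    return sum(
        0.5 * (values[i] + values[i + 1]) * (times[i + 1] - times[i])
        for i in range(len(times) - 1)
    )

# Three measurements over two years of follow-up:
t = [0.0, 1.0, 2.0]            # years since start of capture window
vl = [10000.0, 1000.0, 100.0]  # HIV-1 RNA, copies/mL
print(copy_years(t, vl))       # 5500.0 + 550.0 = 6050.0
```

The same routine yields CD4-years or VACS Index score-years by substituting the relevant biomarker series for the viral loads.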