ABSTRACT
BACKGROUND: A study previously conducted in primary care practices found that implementation of an educational session and peer comparison feedback was associated with reduced antibiotic prescribing for respiratory tract diagnoses (RTDs). Here, we assess the long-term effects of this intervention on antibiotic prescribing following cessation of feedback. METHODS: RTD encounters were grouped into tiers based on antibiotic prescribing appropriateness: tier 1, almost always indicated; tier 2, possibly indicated; and tier 3, rarely indicated. A χ2 test was used to compare prescribing between 3 time periods: pre-intervention, intervention, and post-intervention (14 months following cessation of feedback). A mixed-effects multivariable logistic regression analysis was performed to assess the association between period and prescribing. RESULTS: We analyzed 260 900 RTD encounters from 29 practices. Antibiotic prescribing was more frequent in the post-intervention period than in the intervention period (28.9% vs 23.0%, P < .001) but remained lower than the 35.2% pre-intervention rate (P < .001). In multivariable analysis, the odds of prescribing were higher in the post-intervention period than in the intervention period for tier 2 (odds ratio [OR], 1.19; 95% confidence interval [CI]: 1.10-1.30; P < .05) and tier 3 (OR, 1.20; 95% CI: 1.12-1.30) indications but were lower than in the pre-intervention period for each tier (tier 2: OR, 0.66; 95% CI: 0.59-0.73; tier 3: OR, 0.68; 95% CI: 0.61-0.75). CONCLUSIONS: The intervention effects appeared to last beyond the intervention period. However, without ongoing provider feedback, there was a trend toward increased prescribing. Future studies are needed to determine optimal strategies to sustain intervention effects.
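The period-to-period comparison above is a standard χ2 test on two proportions. A minimal sketch with scipy, using hypothetical encounter counts (the abstract reports only percentages, not per-period denominators):

```python
from scipy.stats import chi2_contingency

# Hypothetical encounter counts (NOT the study's actual per-period data):
# rows = [antibiotic prescribed, not prescribed]
# columns = [intervention period, post-intervention period]
table = [
    [23000, 28900],   # prescribed
    [77000, 71100],   # not prescribed
]

chi2, p, dof, expected = chi2_contingency(table)
print(f"chi2={chi2:.1f}, dof={dof}, p={p:.3g}")
```

With samples this large, even a few-percentage-point difference in prescribing yields P < .001, as reported.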
Subject(s)
Anti-Bacterial Agents , Practice Patterns, Physicians' , Primary Health Care , Respiratory Tract Infections , Humans , Anti-Bacterial Agents/therapeutic use , Practice Patterns, Physicians'/statistics & numerical data , Male , Female , Respiratory Tract Infections/drug therapy , Middle Aged , Adult , Feedback , Aged , Antimicrobial Stewardship/methods , Inappropriate Prescribing/prevention & control , Inappropriate Prescribing/statistics & numerical data
ABSTRACT
BACKGROUND: The epidemiology of extended-spectrum cephalosporin-resistant Enterobacterales (ESCrE) in low- and middle-income countries (LMICs) is poorly described. Identifying risk factors for ESCrE colonization is critical to inform antibiotic resistance reduction strategies because colonization is typically a precursor to infection. METHODS: From 15 January 2020 to 4 September 2020, we surveyed a random sample of clinic patients at 6 sites in Botswana. We also invited each enrolled participant to refer up to 3 adults and children. All participants had rectal swabs collected that were inoculated onto chromogenic media followed by confirmatory testing. Data were collected on demographics, comorbidities, antibiotic use, healthcare exposures, travel, and farm and animal contact. Participants with ESCrE colonization (cases) were compared with noncolonized participants (controls) to identify risk factors for ESCrE colonization using bivariable, stratified, and multivariable analyses. RESULTS: A total of 2000 participants were enrolled. There were 959 (48.0%) clinic participants, 477 (23.9%) adult community participants, and 564 (28.2%) child community participants. The median (interquartile range) age was 30 (12-41) years and 1463 (73%) were women. There were 555 cases and 1445 controls (ie, 27.8% of participants were ESCrE colonized). Independent risk factors (adjusted odds ratio [95% confidence interval]) for ESCrE included healthcare exposure (1.37 [1.08-1.73]), foreign travel (1.98 [1.04-3.77]), tending livestock (1.34 [1.03-1.73]), and presence of an ESCrE-colonized household member (1.57 [1.08-2.27]). CONCLUSIONS: Our results suggest healthcare exposure may be important in driving ESCrE colonization. The strong links to livestock exposure and household member ESCrE colonization highlight the potential role of common exposures and household transmission. These findings are critical to inform strategies to curb further emergence of ESCrE in LMICs.
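The adjusted odds ratios above come from multivariable analysis, but the bivariable version of each risk-factor comparison reduces to a 2x2 table. A minimal sketch of an unadjusted odds ratio with a Wald 95% confidence interval, using hypothetical counts rather than the study's data:

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Unadjusted odds ratio with a Wald 95% CI from a 2x2 table:
    a = exposed cases, b = exposed controls,
    c = unexposed cases, d = unexposed controls."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1/a + 1/b + 1/c + 1/d)  # SE of log(OR)
    lo, hi = (math.exp(math.log(or_) + s * z * se) for s in (-1, 1))
    return or_, lo, hi

# Hypothetical counts for one exposure (illustrative only):
or_, lo, hi = odds_ratio_ci(a=120, b=250, c=435, d=1195)
print(f"OR={or_:.2f} (95% CI {lo:.2f}-{hi:.2f})")
```

A CI excluding 1.0, as for each factor reported above, indicates a statistically significant association at the .05 level.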
Subject(s)
Anti-Bacterial Agents , Cephalosporins , Female , Humans , Male , Anti-Bacterial Agents/pharmacology , Anti-Bacterial Agents/therapeutic use , Botswana/epidemiology , Drug Resistance, Microbial , Hospitals , Monobactams , Prospective Studies , Risk Factors , Child , Adolescent , Young Adult , Adult
ABSTRACT
BACKGROUND: Inappropriate antibiotic prescribing is common in primary care (PC), particularly for respiratory tract diagnoses (RTDs). However, the optimal approach for improving prescribing remains unknown. METHODS: We conducted a stepped-wedge study in PC practices within a health system to assess the impact of a provider-targeted intervention on antibiotic prescribing for RTDs. RTDs were grouped into tiers based on appropriateness of antibiotic prescribing: tier 1 (almost always indicated), tier 2 (may be indicated), and tier 3 (rarely indicated). Providers received education on appropriate RTD prescribing followed by monthly peer comparison feedback on antibiotic prescribing for (1) all tiers and (2) tier 3 RTDs. A χ2 test was used to compare the proportion of visits with antibiotic prescriptions before and during the intervention. Mixed-effects multivariable logistic regression analysis was performed to assess the association between the intervention and antibiotic prescribing. RESULTS: Across 30 PC practices and 185 755 total visits, overall antibiotic prescribing was reduced with the intervention, from 35.2% to 23.0% of visits (P < .001). In multivariable analysis, the intervention was associated with a reduced odds of antibiotic prescription for tiers 2 (odds ratio [OR] 0.57; 95% confidence interval [CI] .52-.62) and 3 (OR 0.57; 95% CI .53-.61) but not for tier 1 (OR 0.98; 95% CI .83-1.16). CONCLUSIONS: A provider-focused intervention reduced overall antibiotic prescribing for RTDs without affecting prescribing for infections that likely require antibiotics. Future research should examine the sustainability of such interventions, potential unintended adverse effects on patient health or satisfaction, and provider perceptions and acceptability.
Subject(s)
Antimicrobial Stewardship , Respiratory Tract Infections , Anti-Bacterial Agents/therapeutic use , Humans , Inappropriate Prescribing/prevention & control , Outpatients , Practice Patterns, Physicians' , Primary Health Care , Respiratory Tract Infections/drug therapy
ABSTRACT
BACKGROUND: The impact of the US Centers for Medicare & Medicaid Services (CMS) Severe Sepsis and Septic Shock: Management Bundle (SEP-1) core measure on overall antibacterial utilization is unknown. METHODS: We performed a retrospective multicenter longitudinal cohort study with interrupted time-series analysis to determine the impact of SEP-1 implementation on antibacterial utilization and patient outcomes. All adult patients admitted to 26 hospitals between 1 October 2014 and 30 September 2015 (SEP-1 preparation period) and between 1 November 2015 and 31 October 2016 (SEP-1 implementation period) were evaluated for inclusion. The primary outcome was total antibacterial utilization, measured as days of therapy (DOT) per 1000 patient-days. RESULTS: The study cohort included 701 055 eligible patient admissions and 4.2 million patient-days. Overall antibacterial utilization increased 2% each month during SEP-1 preparation (relative rate [RR], 1.02 per month [95% confidence interval {CI}, 1.00-1.04]; P = .02). Cumulatively, the mean monthly DOT per 1000 patient-days increased 24.4% (95% CI, 18.0%-38.8%) over the entire study period (October 2014-October 2016). The rate of sepsis diagnosis/1000 patients increased 2% each month during SEP-1 preparation (RR, 1.02 per month [95% CI, 1.00-1.04]; P = .04). The all-cause mortality rate per 1000 patients decreased during the study period (RR for SEP-1 preparation, 0.95 [95% CI, .92-.98; P = .001]; RR for SEP-1 implementation, 0.98 [95% CI, .97-1.00; P = .01]). Cumulatively, the monthly mean all-cause mortality rate/1000 patients declined 38.5% (95% CI, 25.9%-48.0%) over the study period. CONCLUSIONS: Announcement and implementation of the CMS SEP-1 process measure was associated with increased diagnosis of sepsis and antibacterial utilization and decreased mortality rate among hospitalized patients.
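The interrupted time-series analysis described here is commonly fit as a segmented regression with an underlying trend term and a level-change indicator at the policy date. A minimal sketch on synthetic monthly utilization data (the breakpoint and values are illustrative, not the study's):

```python
import numpy as np

# Synthetic monthly DOT per 1000 patient-days with a level shift at
# month 12, mimicking a policy rollout (illustrative values only).
rng = np.random.default_rng(0)
months = np.arange(24)                  # months 0-11 pre, 12-23 post
post = (months >= 12).astype(float)     # level-change indicator
y = 500 + 2.0 * months + 10 * post + rng.normal(0, 1, 24)

# Design matrix: intercept, underlying monthly trend, level change
X = np.column_stack([np.ones_like(months, dtype=float), months, post])
coef, *_ = np.linalg.lstsq(X, y, rcond=None)
intercept, trend, level_change = coef
print(f"trend={trend:.2f}/month, level change={level_change:.2f}")
```

A fuller analysis would also include a slope-change term after the breakpoint and account for autocorrelation, as interrupted time-series studies typically do.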
Subject(s)
Patient Care Bundles , Sepsis , Adult , Aged , Anti-Bacterial Agents/therapeutic use , Cohort Studies , Humans , Longitudinal Studies , Medicaid , Medicare , Retrospective Studies , United States
ABSTRACT
BACKGROUND: Multidrug-resistant organisms (MDROs) frequently contaminate hospital environments. We performed a multicenter, cluster-randomized, crossover trial of 2 methods for monitoring of terminal cleaning effectiveness. METHODS: Six intensive care units (ICUs) at 3 medical centers received both interventions sequentially, in randomized order. Ten surfaces each were surveyed weekly in 5 rooms after terminal cleaning, with adenosine triphosphate (ATP) monitoring or an ultraviolet fluorescent marker (UV/F). Results were delivered to environmental services staff in real time, with failing surfaces recleaned. We measured monthly rates of MDRO infection or colonization, including methicillin-resistant Staphylococcus aureus, Clostridioides difficile, vancomycin-resistant Enterococcus, and MDR gram-negative bacilli (MDR-GNB) during a 12-month baseline period and sequential 6-month intervention periods, separated by a 2-month washout. Primary analysis compared only the randomized intervention periods, whereas secondary analysis included the baseline. RESULTS: The ATP method was associated with a reduction in incidence rate of MDRO infection or colonization compared with the UV/F period (incidence rate ratio [IRR] 0.876; 95% confidence interval [CI], 0.807-0.951; P = .002). Including the baseline period, the ATP method was associated with reduced infection with MDROs (IRR 0.924; 95% CI, 0.855-0.998; P = .04), and MDR-GNB infection or colonization (IRR 0.856; 95% CI, 0.825-0.887; P < .001). The UV/F intervention was not associated with a statistically significant impact on these outcomes. Room turnaround time increased by a median of 1 minute with the ATP intervention and 4.5 minutes with UV/F compared with baseline. CONCLUSIONS: Intensive monitoring of ICU terminal room cleaning with an ATP modality is associated with a reduction of MDRO infection and colonization.
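The incidence rate ratios above compare event counts over accumulated time between periods. A minimal sketch of an unadjusted IRR with a Wald 95% CI, using hypothetical counts rather than the trial's data:

```python
import math

def irr_ci(events1, time1, events0, time0, z=1.96):
    """Incidence rate ratio (group 1 vs group 0) with a Wald 95% CI,
    using person-time or unit-time denominators (e.g. patient-days)."""
    irr = (events1 / time1) / (events0 / time0)
    se = math.sqrt(1/events1 + 1/events0)  # SE of log(IRR)
    lo, hi = (math.exp(math.log(irr) + s * z * se) for s in (-1, 1))
    return irr, lo, hi

# Hypothetical MDRO event counts per monitoring period (illustrative):
irr, lo, hi = irr_ci(events1=350, time1=100_000, events0=400, time0=100_000)
print(f"IRR={irr:.3f} (95% CI {lo:.3f}-{hi:.3f})")
```

The trial's estimates would additionally reflect the crossover design and clustering by unit; this sketch shows only the crude rate comparison.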
Subject(s)
Cross Infection , Methicillin-Resistant Staphylococcus aureus , Vancomycin-Resistant Enterococci , Adenosine Triphosphate , Cross Infection/epidemiology , Cross Infection/prevention & control , Drug Resistance, Multiple, Bacterial , Gram-Negative Bacteria , Humans , Intensive Care Units , Vancomycin
ABSTRACT
We conducted a retrospective study to assess performance of provider-selected antibiotic indication (PSI) in identifying hospitalized adults with community-acquired pneumonia. PSI showed moderate sensitivity (64.4%) and high specificity (96.3%). PSI has potential utility for targeted real-time antibiotic stewardship interventions, though future research should investigate methods to improve sensitivity.
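Sensitivity and specificity of an indicator like PSI are simple functions of confusion-matrix counts. A minimal sketch, with hypothetical counts chosen to land near the reported 64.4% and 96.3% (the abstract does not give denominators):

```python
def sens_spec(tp, fn, tn, fp):
    """Sensitivity and specificity from confusion-matrix counts:
    tp/fn = true/false negatives among true cases,
    tn/fp = true negatives/false positives among non-cases."""
    return tp / (tp + fn), tn / (tn + fp)

# Hypothetical counts (illustrative, not the study's data):
sens, spec = sens_spec(tp=161, fn=89, tn=2567, fp=98)
print(f"sensitivity={sens:.1%}, specificity={spec:.1%}")
```

High specificity with moderate sensitivity, as reported, means PSI rarely flags non-pneumonia patients but misses roughly a third of true community-acquired pneumonia cases.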
ABSTRACT
Objective: To determine antibiotic prescribing appropriateness for respiratory tract diagnoses (RTD) by season. Design: Retrospective cohort study. Setting: Primary care practices in a university health system. Patients: Patients who were seen at an office visit with a diagnostic code for RTD. Methods: Office visits for the entire cohort were categorized based on ICD-10 codes by the likelihood that an antibiotic was indicated (tier 1: always indicated; tier 2: sometimes indicated; tier 3: rarely indicated). Medical records were reviewed for 1,200 randomly selected office visits to determine appropriateness. Based on this reference standard, metrics and prescriber characteristics associated with inappropriate antibiotic prescribing were determined. Characteristics of antibiotic prescribing were compared between winter and summer months. Results: A significantly greater proportion of RTD visits had an antibiotic prescribed in winter [20,558/51,090 (40.2%)] than in summer months [11,728/38,537 (30.4%)] [standardized difference (SD) = 0.21]. A significantly greater proportion of winter than summer visits was associated with tier 2 RTDs (29.4% vs 23.4%, SD = 0.14), but fewer with tier 3 RTDs (68.4% vs 74.4%, SD = 0.13). A greater proportion of visits in winter than in summer months had an antibiotic prescribed for tier 2 RTDs (80.2% vs 74.2%, SD = 0.14) and tier 3 RTDs (22.9% vs 16.2%, SD = 0.17). The proportion of inappropriate antibiotic prescribing was higher in winter than in summer months (72.4% vs 62.0%, P < .01). Conclusions: Increases in antibiotic prescribing for RTD visits from summer to winter were likely driven by shifts in diagnoses as well as increases in prescribing for certain diagnoses. At least some of this increased prescribing was inappropriate.
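The standardized difference (SD) used above for two proportions is (p1 - p2) / sqrt((p1(1-p1) + p2(1-p2)) / 2). A short check that reproduces the reported winter-vs-summer value from the abstract's own counts:

```python
import math

def std_diff_prop(p1, p2):
    """Standardized difference between two proportions
    (pooled-variance form commonly used in cohort comparisons)."""
    return (p1 - p2) / math.sqrt((p1 * (1 - p1) + p2 * (1 - p2)) / 2)

# Counts reported in the abstract:
winter = 20558 / 51090   # 40.2% of winter RTD visits prescribed
summer = 11728 / 38537   # 30.4% of summer RTD visits prescribed
print(round(std_diff_prop(winter, summer), 2))  # prints 0.21
```

Unlike a P value, the standardized difference does not shrink with sample size, which is why it is preferred for describing effect magnitude in very large cohorts like this one.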
ABSTRACT
Introduction. Lack of laboratory capacity hampers consistent national antimicrobial resistance (AMR) surveillance. Chromogenic media may provide a practical screening tool for detection of individuals colonized by extended-spectrum beta-lactamase (ESBL)-producing organisms. Hypothesis. CHROMagar ESBL media represent an adequate screening method for the detection of extended-spectrum cephalosporin-resistant Enterobacterales (ESCrE) isolated from rectal swabs. Aim. To evaluate the performance of CHROMagar ESBL media to accurately identify ESCrE isolates from rectal swab samples obtained from hospitalized and community participants. Methodology. All participants provided informed consent prior to enrolment. Rectal swabs from 2469 hospital and community participants were inoculated onto CHROMagar ESBL. The performance of CHROMagar ESBL to differentiate Escherichia coli and Klebsiella spp., Enterobacter spp. and Citrobacter spp. (KEC spp.), as well as to select for extended-spectrum cephalosporin resistance, was compared to matrix-assisted laser desorption/ionization-time-of-flight MS (MALDI-TOF-MS) and VITEK-2 automated susceptibility testing. Results. CHROMagar ESBL had a positive and negative agreement of 91.2% (95% CI, 88.4-93.3) and 86.8% (95% CI, 82.0-90.7) for E. coli and 88.1% (95% CI, 83.2-92.1) and 87.6% (95% CI, 84.7-90.2) for KEC spp. differentiation, respectively, when compared to species ID by MALDI-TOF-MS. When evaluated for phenotypic susceptibilities (VITEK-2), 88.1% (714/810) of the isolates recovered on the selective agar exhibited resistance to third-generation cephalosporins. Conclusion. The performance characteristics of CHROMagar ESBL media suggest that they may be a viable screening tool for the identification of ESCrE from hospitalized and community participants and could be used to inform infection prevention and control practices in Botswana and potentially other low- and middle-income countries (LMICs).
Further studies are required to analyse the costs and the impact on time-to-result of the media in comparison with available laboratory methods for ESCrE surveillance in the country.
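The 88.1% (714/810) phenotypic-resistance figure above is a binomial proportion, and a Wilson score interval is a common way to attach the kind of 95% CI reported for the agreement statistics. A minimal sketch using the reported counts:

```python
import math

def wilson_ci(k, n, z=1.96):
    """Wilson score 95% CI for a binomial proportion k/n."""
    p = k / n
    denom = 1 + z * z / n
    center = (p + z * z / (2 * n)) / denom
    half = z * math.sqrt(p * (1 - p) / n + z * z / (4 * n * n)) / denom
    return p, center - half, center + half

# Reported: 714 of 810 isolates recovered on the selective agar were
# resistant to third-generation cephalosporins.
p, lo, hi = wilson_ci(714, 810)
print(f"{p:.1%} (95% CI {lo:.1%}-{hi:.1%})")
```

The abstract does not state which interval method the authors used, so the exact CI bounds here are illustrative; the point estimate matches the reported 88.1%.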
Subject(s)
Cephalosporins , Gammaproteobacteria , Humans , Cephalosporins/pharmacology , Botswana , Escherichia coli , Monobactams , Agar , Hydrolases
ABSTRACT
Background: Reported ß-lactam allergies (BLAs) are common and frequently inaccurate, but there are limited data on the clinical implications of BLA among solid organ transplant (SOT) recipients. We examined the impact of BLA on clinical outcomes and antibiotic use among SOT recipients. Methods: This retrospective cohort study included adult patients undergoing single-organ heart, kidney, liver, lung, or pancreas transplant at a United States academic medical center from 1 April 2017 to 31 December 2020. Demographic and clinical data were collected from the electronic health record. Multivariate median regression was performed to evaluate the association between BLA and days alive and out of the hospital in the first 180 days posttransplant (DAOH180). Multivariate logistic regression was performed to evaluate the association between BLA and antibiotic use. Results: Among 1700 SOT recipients, 285 (16.8%) had a BLA at the time of transplant. BLA was not associated with DAOH180 (adjusted median difference, -0.8 days [95% confidence interval {CI}, -2.7 to 1.2]; P = .43). Patients with BLA were more likely to receive intravenous vancomycin (adjusted odds ratio [aOR], 1.8 [95% CI, 1.3-2.6]; P < .001), clindamycin (aOR, 9.9 [95% CI, 5.1-18.9]; P < .001), aztreonam (aOR, 19.6 [95% CI, 5.9-64.4]; P < .001), fluoroquinolones (aOR, 3.8 [95% CI, 2.8-5.0]; P < .001), or aminoglycosides (aOR, 3.9 [95% CI, 2.5-6.2]; P < .001). Conclusions: BLA was associated with use of ß-lactam alternative antibiotics but not DAOH180 among SOT recipients.
ABSTRACT
OBJECTIVE: To determine metrics and provider characteristics associated with inappropriate antibiotic prescribing for respiratory tract diagnoses (RTDs). DESIGN: Retrospective cohort study. SETTING: Primary care practices in a university health system. PARTICIPANTS: Patients seen by an attending physician or advanced practice provider (APP) at their primary care office visit with International Classification of Disease, Tenth Revision, Clinical Modification (ICD-10-CM)-coded RTDs. METHODS: Medical records were reviewed for 1,200 randomly selected office visits in which an antibiotic was prescribed to determine appropriateness. Based on this gold standard, metrics and provider characteristics associated with inappropriate antibiotic prescribing were determined. RESULTS: Overall, 69% of antibiotics were inappropriate. Metrics utilizing prespecified RTDs most strongly associated with inappropriate prescribing were (1) proportion prescribing for RTDs for which antibiotics are almost never required (eg, bronchitis) and (2) proportion prescribing for any RTD. Provider characteristics associated with inappropriate antibiotic prescribing were APP versus physician (72% vs 58%; P = .02), family medicine versus internal medicine (76% vs 63%; P = .01), board certification 1997 or later versus board certification before 1997 (75% vs 63%; P = .02), nonteaching versus teaching practice (73% vs 51%; P < .01), and nonurban vs urban practice (77% vs 57%; P < .01). CONCLUSIONS: Metrics utilizing proportion prescribing for RTDs for which antibiotics are almost never required and proportion prescribing for any RTD were most strongly associated with inappropriate prescribing. APPs and clinicians with family medicine training, with board certification 1997 or later, and who worked in nonteaching or nonurban practices had higher proportions of inappropriate prescribing. 
These findings could inform design of interventions to improve prescribing and could represent an efficient way to track inappropriate prescribing.
Subject(s)
Antimicrobial Stewardship , Respiratory Tract Infections , Anti-Bacterial Agents/therapeutic use , Benchmarking , Humans , Inappropriate Prescribing/prevention & control , Outpatients , Practice Patterns, Physicians' , Respiratory System , Respiratory Tract Infections/drug therapy , Retrospective Studies
ABSTRACT
Background: A major challenge for antibiotic stewardship programs is the lack of accurate and accessible electronic data to target interventions. We developed and validated separate electronic algorithms to identify inappropriate antibiotic use for adult outpatients with bronchitis and pharyngitis. Methods: We used International Classification of Diseases, 10th Revision, diagnostic codes to identify patient encounters for acute bronchitis and pharyngitis at outpatient practices between 3/15/17 and 3/14/18. Exclusion criteria included immunocompromising conditions, complex chronic conditions, and concurrent infections. We randomly selected 300 eligible subjects each with bronchitis and pharyngitis. Inappropriate antibiotic use based on chart review served as the gold standard for assessment of the electronic algorithm, which was constructed using only data in the electronic data warehouse. Criteria for appropriate prescribing, choice of antibiotic, and duration were based on established guidelines. Results: Of 300 subjects with bronchitis, 167 (55.7%) received an antibiotic inappropriately based on chart review. The electronic algorithm demonstrated 100% sensitivity and 95.3% specificity for detection of inappropriate prescribing. Of 300 subjects with pharyngitis, 94 (31.3%) had an incorrect prescribing decision. Among 29 subjects with a positive rapid streptococcal antigen test, 27 (93.1%) received an appropriate antibiotic and 29 (100%) received the correct duration. The electronic algorithm demonstrated very high sensitivity and specificity for all outcomes. Conclusions: Inappropriate antibiotic prescribing for bronchitis and pharyngitis is common. Electronic algorithms for identifying inappropriate prescribing, antibiotic choice, and duration showed excellent test characteristics. These algorithms could be used to efficiently assess prescribing among practices and individual clinicians. Interventions based on these algorithms should be tested in future work.
ABSTRACT
OBJECTIVES: Although extended-spectrum cephalosporin-resistant Enterobacterales (ESCrE) and carbapenem-resistant Enterobacterales (CRE) are a global challenge, data on these organisms in low- and middle-income countries are limited. In this study, we sought to characterize colonization data critical for greater antibiotic resistance surveillance efforts. METHODS: This study was conducted in three hospitals and six clinics in Botswana. We conducted ongoing surveillance of adult patients in hospitals and clinics and adults and children in the community. All participants underwent rectal swab sampling to identify ESCrE and CRE. RESULTS: Enrollment occurred from January 15, 2020, to September 4, 2020, but paused from April 2, 2020, to May 21, 2020, because of a countrywide COVID-19 lockdown. Of 5088 individuals approached, 2469 (49%) participated. ESCrE colonization prevalence was 30.7% overall (43% for hospital participants, 31% for clinic participants, 24% for adult community participants, and 26% for child community participants) (P <0.001). A total of 42 (1.7%) participants were colonized with CRE. CRE colonization prevalence was 1.7% overall (6.8% for hospital participants, 0.7% for clinic participants, 0.2% for adult community participants, and 0.5% for child community participants) (P <0.001). ESCrE and CRE prevalence varied substantially across regions and was significantly higher prelockdown versus postlockdown. CONCLUSIONS: ESCrE colonization was high in all settings in Botswana. CRE prevalence in hospitals was also considerable. Colonization prevalence varied by region and clinical setting and decreased after a countrywide lockdown.
Subject(s)
COVID-19 , Enterobacteriaceae Infections , Adult , Anti-Bacterial Agents/pharmacology , Anti-Bacterial Agents/therapeutic use , Botswana/epidemiology , Carbapenems/pharmacology , Carbapenems/therapeutic use , Cephalosporins , Child , Communicable Disease Control , Delivery of Health Care , Drug Resistance, Microbial , Enterobacteriaceae Infections/drug therapy , Enterobacteriaceae Infections/epidemiology , Hospitals , Humans
ABSTRACT
OBJECTIVES: Real-world data regarding the effectiveness of meropenem/vaborbactam (MVB) in the treatment of carbapenem-resistant Enterobacterales (CRE) infections remain limited. In this retrospective case series, we describe the outcomes of patients who received MVB for serious CRE infections. METHODS: This study included adult patients with MVB-susceptible CRE infection who received ≥48 h of MVB. Clinical and microbiological outcomes were ascertained via chart review. RESULTS: Among 15 patients with CRE infection who were treated with MVB, 9 (60.0%) had a positive clinical response. Among five patients with CRE bone and joint infection, three (60.0%) experienced a positive clinical response. One patient developed a microbiologically confirmed recurrent CRE infection and one patient developed Clostridioides difficile infection. CONCLUSION: MVB was well tolerated and effective for the majority of patients in this case series.
Subject(s)
Anti-Bacterial Agents , Carbapenems , Adult , Anti-Bacterial Agents/therapeutic use , Boronic Acids , Carbapenems/therapeutic use , Humans , Meropenem/therapeutic use , Retrospective Studies
ABSTRACT
Severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2) is the cause of the global outbreak of COVID-19. Evidence suggests that the virus is evolving to allow efficient spread through the human population, including vaccinated individuals. Here, we report a study of viral variants from surveillance of the Delaware Valley, including the city of Philadelphia, and variants infecting vaccinated subjects. We sequenced and analyzed complete viral genomes from 2621 surveillance samples from March 2020 to September 2021 and compared them to genome sequences from 159 vaccine breakthroughs. In the early spring of 2020, all detected variants were of the B.1 and closely related lineages. A mixture of lineages followed, notably including B.1.243 followed by B.1.1.7 (alpha), with other lineages present at lower levels. Later isolations were dominated by B.1.617.2 (delta) and other delta lineages; delta was the exclusive variant present by the last time sampled. To investigate whether any variants appeared preferentially in vaccine breakthroughs, we devised a model based on Bayesian autoregressive moving average logistic multinomial regression to allow rigorous comparison. This revealed that B.1.617.2 (delta) showed 3-fold enrichment in vaccine breakthrough cases (odds ratio of 3; 95% credible interval 0.89-11). Viral point substitutions could also be associated with vaccine breakthroughs, notably the N501Y substitution found in the alpha, beta and gamma variants (odds ratio 2.04; 95% credible interval 1.25-3.18). This study thus provides an overview of viral evolution and vaccine breakthroughs in the Delaware Valley and introduces a rigorous statistical approach to interrogating enrichment of breakthrough variants against a changing background. IMPORTANCE SARS-CoV-2 vaccination is highly effective at reducing viral infection, hospitalization and death.
However, vaccine breakthrough infections have been widely observed, raising the question of whether particular viral variants or viral mutations are associated with breakthrough. Here, we report analysis of 2621 surveillance isolates from people diagnosed with COVID-19 in the Delaware Valley in southeastern Pennsylvania, allowing rigorous comparison to 159 vaccine breakthrough case specimens. Our best estimate is a 3-fold enrichment for some lineages of delta among breakthroughs, and enrichment of a notable spike substitution, N501Y. We introduce statistical methods that should be widely useful for evaluating vaccine breakthroughs and other viral phenotypes.
Subject(s)
COVID-19 , Vaccines , Humans , SARS-CoV-2 , Bayes Theorem , COVID-19 Vaccines , Delaware
ABSTRACT
Severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2) is the cause of the global outbreak of COVID-19. Evidence suggests that the virus is evolving to allow efficient spread through the human population, including vaccinated individuals. Here we report a study of viral variants from surveillance of the Delaware Valley, including the city of Philadelphia, and variants infecting vaccinated subjects. We sequenced and analyzed complete viral genomes from 2621 surveillance samples from March 2020 to September 2021 and compared them to genome sequences from 159 vaccine breakthroughs. In the early spring of 2020, all detected variants were of the B.1 and closely related lineages. A mixture of lineages followed, notably including B.1.243 followed by B.1.1.7 (alpha), with other lineages present at lower levels. Later isolations were dominated by B.1.617.2 (delta) and other delta lineages; delta was the exclusive variant present by the last time sampled. To investigate whether any variants appeared preferentially in vaccine breakthroughs, we devised a model based on Bayesian autoregressive moving average logistic multinomial regression to allow rigorous comparison. This revealed that B.1.617.2 (delta) showed three-fold enrichment in vaccine breakthrough cases (odds ratio of 3; 95% credible interval 0.89-11). Viral point substitutions could also be associated with vaccine breakthroughs, notably the N501Y substitution found in the alpha, beta and gamma variants (odds ratio 2.04; 95% credible interval of 1.25-3.18). This study thus provides a detailed picture of viral evolution in the Delaware Valley and a geographically matched analysis of vaccine breakthroughs; it also introduces a rigorous statistical approach to interrogating enrichment of viral variants.
ABSTRACT
BACKGROUND: Clostridioides difficile infection (CDI) is the leading cause of antibiotic-associated and health care-associated diarrhea in humans. Recurrent CDI (R-CDI) occurs in ~20%-30% of patients with CDI and results in increased morbidity, mortality, and hospital costs. Genomic analyses have shown overlap of C. difficile isolates from animals and people, suggesting that a zoonotic reservoir may contribute to recurrence. The objective of this study was to determine whether pet ownership is a risk factor for recurrence of CDI. METHODS: We conducted a case-control study among patients with recurrent CDI (cases; n = 86) and patients with nonrecurrent CDI (controls; n = 146). Multivariable logistic regression modeling was used to determine the association between recurrence of CDI and pet ownership while accounting for patient-level risk factors. RESULTS: Pet ownership was not significantly associated with recurrence of CDI (odds ratio [OR], 1.02; 95% confidence interval [CI], 0.38-2.72; P = 0.965) among all patients (n = 232). However, among the subset of patients with community-associated or community-onset health care facility-acquired CDI (n = 127), increasing contact with pets was increasingly protective against recurrence: for every point increase in a pet contact score (out of 7 possible points), the odds of recurrence decreased by 14% (OR, 0.86; 95% CI, 0.74-1.00; P = 0.051). CONCLUSIONS: Close interactions with pets appear protective against the recurrence of community-acquired CDI. A potential mechanism may involve beneficial contributions to the microbiota of pet owners afflicted with CDI, as has been observed for other conditions such as atopy, obesity, and food allergies. However, more research is needed to understand the interactions between pets, owners, and their microbiota.