ABSTRACT
BACKGROUND: Nosocomial spread of Severe Acute Respiratory Syndrome Coronavirus 2 (SARS-CoV-2) has been widely reported, but the transmission pathways among patients and healthcare workers (HCWs) are unclear. Identifying the risk factors and drivers of these nosocomial transmissions is critical for infection prevention and control interventions. The main aim of our study was to quantify the relative importance of different transmission pathways of SARS-CoV-2 in the hospital setting. METHODS AND FINDINGS: This is an observational cohort study using data from 4 teaching hospitals in Oxfordshire, United Kingdom, from January to October 2020. Associations between infectious SARS-CoV-2 individuals and infection risk were quantified using logistic, generalised additive and linear mixed models. Cases were classified as community- or hospital-acquired using likely incubation periods of 3 to 7 days. Of 66,184 patients who were hospitalised during the study period, 920 had a positive SARS-CoV-2 PCR test within the same period (1.4%). The mean age was 67.9 (±20.7) years, 49.2% were female, and 68.5% were from the white ethnic group. Of these, 571 patients had their first positive PCR test while hospitalised (62.1%), and 97 of these occurred at least 7 days after admission (10.5%). Among the 5,596 HCWs, 615 (11.0%) tested positive during the study period using PCR or serological tests. The mean age was 39.5 (±11.1) years, 78.9% were female, and 49.8% were nurses. For susceptible patients, 1 day in the same ward as another patient with hospital-acquired SARS-CoV-2 was associated with an additional 7.5 infections per 1,000 susceptible patients per day (95% credible interval (CrI) 5.5 to 9.5). Exposure to an infectious patient with community-acquired Coronavirus Disease 2019 (COVID-19) or to an infectious HCW was associated with substantially lower infection risks (2.0/1,000 susceptible patients/day, 95% CrI 1.6 to 2.2). For HCW infections, exposure to an infectious patient with hospital-acquired SARS-CoV-2 and exposure to an infectious HCW were each associated with an additional 0.8 infections per 1,000 susceptible HCWs per day (95% CrI 0.3 to 1.6 and 0.6 to 1.0, respectively). Exposure to an infectious patient with community-acquired SARS-CoV-2 was associated with less than half this risk (0.2/1,000 susceptible HCWs/day, 95% CrI 0.2 to 0.2). These assumptions, including the likely incubation periods used to classify cases, were tested in sensitivity analyses, which showed broadly similar results. The main limitations were that symptom onset dates and HCW absence days were not available. CONCLUSIONS: In this study, we observed that exposure to patients with hospital-acquired SARS-CoV-2 is associated with a substantial infection risk to both HCWs and other hospitalised patients. Infection control measures to limit nosocomial transmission must be optimised to protect both staff and patients from SARS-CoV-2 infection.
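As an illustration of the kind of exposure-day analysis described above (not the authors' Bayesian generalised additive and mixed models), the sketch below fits a simple per-patient-day logistic regression relating daily infection risk to counts of infectious contacts on the same ward; the input file and column names are hypothetical.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# One row per susceptible patient per day at risk (hypothetical file and columns).
df = pd.read_csv("patient_days.csv")

# infected: 1 if infection is attributed to this patient-day;
# n_hosp_acq / n_comm_acq / n_hcw: infectious contacts of each type on the ward that day.
fit = smf.logit("infected ~ n_hosp_acq + n_comm_acq + n_hcw + age + sex", data=df).fit()

# Odds ratios per additional infectious contact-day, with 95% confidence intervals.
print(pd.concat([np.exp(fit.params).rename("OR"),
                 np.exp(fit.conf_int()).rename(columns={0: "2.5%", 1: "97.5%"})],
                axis=1).round(3))
```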
Subject(s)
COVID-19 , Community-Acquired Infections , Cross Infection/epidemiology , Health Personnel , Hospitals , Infectious Disease Transmission, Patient-to-Professional , Infectious Disease Transmission, Professional-to-Patient , Adult , Aged , Aged, 80 and over , COVID-19/transmission , Cohort Studies , Female , Hospitalization , Hospitals/statistics & numerical data , Humans , Infection Control , Infectious Disease Transmission, Patient-to-Professional/statistics & numerical data , Infectious Disease Transmission, Professional-to-Patient/statistics & numerical data , Male , Middle Aged , Nurses , Risk Factors , SARS-CoV-2 , United Kingdom/epidemiology
ABSTRACT
BACKGROUND: Candida auris is an emerging and multidrug-resistant pathogen. Here we report the epidemiology of a hospital outbreak of C. auris colonization and infection. METHODS: After identification of a cluster of C. auris infections in the neurosciences intensive care unit (ICU) of the Oxford University Hospitals, United Kingdom, we instituted an intensive patient and environmental screening program and package of interventions. Multivariable logistic regression was used to identify predictors of C. auris colonization and infection. Isolates from patients and from the environment were analyzed by whole-genome sequencing. RESULTS: A total of 70 patients were identified as being colonized or infected with C. auris between February 2, 2015, and August 31, 2017; of these patients, 66 (94%) had been admitted to the neurosciences ICU before diagnosis. Invasive C. auris infections developed in 7 patients. When length of stay in the neurosciences ICU and patient vital signs and laboratory results were controlled for, the predictors of C. auris colonization or infection included the use of reusable skin-surface axillary temperature probes (multivariable odds ratio, 6.80; 95% confidence interval [CI], 2.96 to 15.63; P<0.001) and systemic fluconazole exposure (multivariable odds ratio, 10.34; 95% CI, 1.64 to 65.18; P=0.01). C. auris was rarely detected in the general environment. However, it was detected in isolates from reusable equipment, including multiple axillary skin-surface temperature probes. Despite a bundle of infection-control interventions, the incidence of new cases was reduced only after removal of the temperature probes. All outbreak sequences formed a single genetic cluster within the C. auris South African clade. The sequenced isolates from reusable equipment were genetically related to isolates from the patients. CONCLUSIONS: The transmission of C. auris in this hospital outbreak was found to be linked to reusable axillary temperature probes, indicating that this emerging pathogen can persist in the environment and be transmitted in health care settings. (Funded by the National Institute for Health Research Health Protection Research Unit in Healthcare Associated Infections and Antimicrobial Resistance at Oxford University and others.).
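A minimal sketch of the multivariable logistic regression described above, reporting odds ratios with 95% confidence intervals; the input file, column names, and adjustment variables are hypothetical placeholders, not the study dataset.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

cases = pd.read_csv("neuro_icu_admissions.csv")  # hypothetical per-patient file

# c_auris_positive: colonized or infected; axillary_probe_use and systemic_fluconazole
# are the two key exposures; icu_days and news_score stand in for the adjustment variables.
fit = smf.logit(
    "c_auris_positive ~ axillary_probe_use + systemic_fluconazole + icu_days + news_score",
    data=cases,
).fit()

ci = np.exp(fit.conf_int())
summary = pd.DataFrame({"odds_ratio": np.exp(fit.params),
                        "ci_lower": ci[0], "ci_upper": ci[1],
                        "p_value": fit.pvalues})
print(summary.round(3))
```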
Subject(s)
Candida , Candidiasis/epidemiology , Cross Infection/epidemiology , Disease Outbreaks , Equipment Contamination , Equipment Reuse , Infection Control/methods , Intensive Care Units , Thermometers/microbiology , Adult , Candida/genetics , Candida/isolation & purification , Candidiasis/mortality , Candidiasis/transmission , Case-Control Studies , Cross Infection/mortality , Cross Infection/transmission , Female , Hospital Departments , Humans , Incidence , Male , Microbial Sensitivity Tests , Middle Aged , Multivariate Analysis , Neurology , Phylogeny , Risk Factors , United Kingdom/epidemiology
Subject(s)
Candida , Candidiasis/epidemiology , Candidemia/epidemiology , Critical Care , Disease Outbreaks , Humans
ABSTRACT
OBJECTIVES: Unexplained chronic cough (UCC) is common and has significant impacts on quality of life. Ongoing cough can sensitize the larynx, increasing the urge to cough and perpetuating the cycle of chronic cough. Vibrotactile stimulation (VTS) of the larynx is a noninvasive stimulation technique that can modulate laryngeal somatosensory and motor activity. Study objectives were to assess the feasibility and acceptability of VTS use by people with UCC. Secondarily, changes in cough-related quality of life measures were assessed. METHODS: Adults with UCC recorded cough measures at baseline and after completing 2 weeks of daily VTS. Feasibility and acceptability were assessed through participant-reported device use and structured feedback. Cough-related quality of life measures were the Leicester Cough Questionnaire (LCQ) and the Newcastle Laryngeal Hypersensitivity Questionnaire (NLHQ). RESULTS: Nineteen adults participated, with mean age 67 years and cough duration 130 months. Notably, 93% of planned VTS sessions were logged, 94% of participants found the device comfortable to wear, 89% found it easy to operate, and 79% would recommend it to others. Pre-post LCQ change achieved a minimal important difference (MID) (mean 1.3 [SD 2.4, p = 0.015]). NLHQ scores improved, but did not reach an MID. CONCLUSIONS: Laryngeal VTS was feasible and acceptable for use by patients with UCC and was associated with a meaningful improvement in cough-related quality of life. Future studies will include VTS dose refinement and a comparison arm to further assess the potential for laryngeal VTS as a novel treatment modality for UCC. LEVEL OF EVIDENCE: 4.
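A brief sketch of the pre/post comparison reported above: a paired t test on LCQ totals and a check of the mean change against the commonly cited LCQ minimal important difference of 1.3 points; the file and column names are hypothetical.

```python
import pandas as pd
from scipy import stats

df = pd.read_csv("lcq_scores.csv")           # hypothetical: one row per participant
change = df["lcq_post"] - df["lcq_baseline"]  # positive change = improvement on the LCQ

t_stat, p_value = stats.ttest_rel(df["lcq_post"], df["lcq_baseline"])
print(f"mean change = {change.mean():.1f} (SD {change.std(ddof=1):.1f}), p = {p_value:.3f}")

LCQ_MID = 1.3  # commonly cited minimal important difference for the LCQ
print("meets MID:", change.mean() >= LCQ_MID)
```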
ABSTRACT
OBJECTIVE: To determine whether there is a difference in patient satisfaction between in-person and virtual voice therapy. METHODS: Patient satisfaction answers to the National Research Corporation (NRC) Health patient survey were retrieved for two separate 11-month periods. The first was for an in-person cohort, from April 2019 to February 2020. The second was for a virtual cohort, from April 2020 to February 2021. Two-group t tests or Wilcoxon rank sum tests were used to compare responses between the in-person and virtual cohorts. The effect of modality of therapy by gender, age, and race was examined by testing interactions with separate ANOVA models. RESULTS: Responses were compared between 224 patient satisfaction surveys for the virtual cohort and 309 patient satisfaction surveys for the in-person cohort. Overall, responses were highly favorable in all categories. There were no differences between the in-person and virtual cohorts' responses with respect to three main categories: likelihood of future referral of clinic or provider; communication with provider; and comprehension of the treatment plan. The interaction between modality of therapy delivery and age was significant for the question, "Did you know what to do after your visit," with 18-44 year olds in the in-person group reporting a better understanding of the treatment plan compared to the 18-44 year olds in the virtual therapy cohort (P = 0.004). There were no interactions between modality of therapy and gender or race. CONCLUSION: Virtual delivery of voice therapy was associated with visit satisfaction scores comparable to in-person delivery, with both delivery modalities demonstrating very high satisfaction. Future studies are needed to identify which patients and conditions are most suited for virtual versus in-person delivery of speech-language pathology services in voice clinics.
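A hedged sketch of the between-cohort comparison described above, applying a two-group Welch t test and a Wilcoxon rank-sum (Mann-Whitney) test to one survey item; the export file and column names are hypothetical.

```python
import pandas as pd
from scipy import stats

surveys = pd.read_csv("nrc_surveys.csv")  # hypothetical survey export
virtual = surveys.loc[surveys["cohort"] == "virtual", "communication_score"]
in_person = surveys.loc[surveys["cohort"] == "in_person", "communication_score"]

# Parametric and non-parametric comparisons of the same item.
t_stat, t_p = stats.ttest_ind(virtual, in_person, equal_var=False)
u_stat, u_p = stats.mannwhitneyu(virtual, in_person, alternative="two-sided")
print(f"Welch t test p = {t_p:.3f}, Wilcoxon rank-sum p = {u_p:.3f}")
```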
ABSTRACT
Background: Gram-negative organisms are common causes of bloodstream infection (BSI) during the neonatal period and early childhood. Whilst several large studies have characterised these isolates in adults, equivalent data (particularly incorporating whole-genome sequencing) are lacking in the paediatric population. Methods: We perform an epidemiological and sequencing-based analysis of Gram-negative bloodstream infections (327 isolates (296 successfully sequenced) from 287 patients) in children <18 years old between 2008 and 2018 in Oxfordshire, UK. Results: Here we show that the burden of infection lies predominantly in neonates and that most infections are caused by Escherichia coli, Klebsiella spp. and Enterobacter hormaechei. There is no evidence in our setting that the proportion of antimicrobial-resistant isolates is increasing in the paediatric population, although we identify some evidence of sub-breakpoint increases in gentamicin resistance. The population structure of E. coli BSI isolates in neonates and children mirrors that in adults, with a predominance of STs 131/95/73/69 and the same proportions of O-antigen serotypes. In most cases in our setting there is no evidence of transmission/point-source acquisition, and we demonstrate the utility of whole-genome sequencing to refute a previously suspected outbreak. Conclusions: Our findings support continued use of current empirical treatment guidelines and suggest that O-antigen-targeted vaccines may have a role in reducing the incidence of neonatal sepsis.
ABSTRACT
BACKGROUND: Gram-negative bloodstream infection (GNBSI) is a threat to public health in terms of mortality and antibiotic resistance. The hepatopancreatobiliary (HPB) cohort accounts for 15%-20% of GNBSI, yet few strategies have been explored to reduce HPB GNBSI. AIM: To identify clinical factors contributing to HPB GNBSI and strategies for its prevention. METHODS: We performed a retrospective analysis of 433 cases of HPB GNBSI presenting to four hospitals between April 2015 and May 2019. We extracted key data from hospital and primary care records including: the underlying source of GNBSI; previous documentation of biliary disease; and any previous surgical or non-surgical management. FINDINGS: Out of 433 cases of HPB GNBSI, 388 had clear evidence of HPB origin. The source of GNBSI was related to gallstone disease in 282 of the 388 cases (73%) and to HPB malignancy in 70 cases (18%). Of the gallstone-related cases, 117 had previously been diagnosed with symptomatic gallstones. Of the 117 with a previous presentation, 93 could have been prevented with a laparoscopic cholecystectomy at the first presentation of gallstones, while 18 could have been prevented if intraoperative biliary tract imaging had been performed during a prior cholecystectomy. Of the 70 malignant cases, five could have been prevented through earlier biliary stenting, use of metal stents instead of plastic stents or earlier pancreaticoduodenectomy. DISCUSSION: The incidence of HPB GNBSI could be reduced by up to 30% by the implementation of alternative management strategies in this cohort.
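A worked check of the "up to 30%" figure, using only the counts reported above.

```python
# 93 preventable by cholecystectomy at first gallstone presentation,
# 18 by intraoperative biliary imaging at a prior cholecystectomy,
# 5 via the malignant pathway (earlier stenting, metal stents, or earlier surgery).
preventable = 93 + 18 + 5
hpb_cases = 388  # cases with clear evidence of HPB origin
print(f"{preventable}/{hpb_cases} = {preventable / hpb_cases:.1%}")  # ~29.9%, i.e. "up to 30%"
```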
ABSTRACT
OBJECTIVES: Despite robust efforts, patients and staff acquire SARS-CoV-2 infection in hospitals. We investigated whether whole-genome sequencing enhanced the epidemiological investigation of healthcare-associated SARS-CoV-2 acquisition. METHODS: From 17 November 2020 to 5 January 2021, 803 inpatients and 329 staff were diagnosed with SARS-CoV-2 infection at four Oxfordshire hospitals. We classified cases using epidemiological definitions, looked for a potential source for each nosocomial infection, and evaluated genomic evidence supporting transmission. RESULTS: Using national epidemiological definitions, 109/803 (14%) inpatient infections were classified as definite/probable nosocomial, 615 (77%) as community-acquired and 79 (10%) as indeterminate. There was strong epidemiological evidence to support definite/probable cases as nosocomial. Many indeterminate cases were likely infected in hospital: 53/79 (67%) had a prior negative PCR and 75 (95%) had contact with a potential source. 89/615 (11% of all 803 patients) with apparent community-onset infection had a recent hospital exposure. Among the 764 samples sequenced, 607 genomic clusters were identified (clusters separated by >1 SNP). Only 43/607 (7%) clusters contained evidence of onward transmission (subsequent cases within ≤1 SNP). 20/21 epidemiologically identified outbreaks contained multiple genomic introductions. Most (80%) nosocomial acquisition occurred in rapid super-spreading events in settings with a mix of COVID-19 and non-COVID-19 patients. CONCLUSIONS: Current surveillance definitions underestimate nosocomial acquisition. Most nosocomial transmission occurs from a relatively limited number of highly infectious individuals.
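The clustering rule described above (clusters separated by >1 SNP, onward transmission within ≤1 SNP) can be illustrated with single-linkage clustering on a pairwise SNP-distance matrix; the sketch below assumes a hypothetical distance file and is not the authors' pipeline.

```python
import pandas as pd
from scipy.cluster.hierarchy import linkage, fcluster
from scipy.spatial.distance import squareform

# Square, symmetric matrix of pairwise SNP distances between sequenced samples (hypothetical file).
snp_dist = pd.read_csv("pairwise_snp_distances.csv", index_col=0)
condensed = squareform(snp_dist.values, checks=False)

# Single linkage at a 1-SNP threshold: each cluster member is within <=1 SNP of
# another member, and distinct clusters are >1 SNP apart.
clusters = fcluster(linkage(condensed, method="single"), t=1, criterion="distance")
assignments = pd.Series(clusters, index=snp_dist.index, name="cluster")

# Clusters with more than one member are candidates for onward transmission.
sizes = assignments.value_counts()
print(f"{(sizes > 1).sum()} of {sizes.size} clusters contain >1 sequence")
```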
Subject(s)
COVID-19 , Cross Infection , Cross Infection/epidemiology , Disease Outbreaks , Hospitals , Humans , SARS-CoV-2
ABSTRACT
PROBLEM: Suicide is now the second leading cause of death among teenagers and young adults aged 10-24 years. Many people who die by suicide visit a healthcare provider in the months before their death. Unfortunately, many healthcare clinicians do not routinely screen for mental health concerns such as suicide risk, even though the American Academy of Pediatrics recommends screening adolescents for suicide risk. METHODS: The Ask Suicide-Screening Questions (ASQ), a four-question screening instrument, was administered by nurses to all patients 12 years and older admitted to the general pediatric wards of a tertiary children's hospital. Nursing feedback and comfort levels were assessed before and after the 6-week pilot program. FINDINGS: During the 6 weeks, 152 eligible children were admitted to the general pediatric wards and 67 were screened using the ASQ; 3/67 had a nonacute "positive" screen and received a further psychiatric assessment. CONCLUSIONS: This pilot quality improvement initiative showed that suicide screening is feasible and acceptable to patients and families in a general pediatric inpatient setting. However, nurses would benefit from further teaching and training around asking suicide screening questions.
Subject(s)
Child, Hospitalized , Hospitals, Pediatric , Inpatients , Psychiatric Status Rating Scales , Risk Assessment/methods , Suicide , Adolescent , Adult , Child , Female , Humans , Male , Quality Improvement , Young Adult
ABSTRACT
OBJECTIVES: The purpose of this qualitative study was to examine relationships between psychological factors, particularly perceived control, and voice symptoms in adults seeking treatment for a voice problem. METHODS: Semistructured interviews of adult patients with a clinical diagnosis of muscle tension dysphonia were conducted and transcribed. Follow-up interviews were conducted as needed for further information or clarification. A multidisciplinary team analyzed interview content using inductive techniques. Common themes and subthemes were identified. A conceptual model was developed describing the association between voice symptoms, psychological factors, precipitants of ongoing voice symptoms, and perceived control. RESULTS: Thematic saturation was reached after 23 interviews. No participants reported a direct psychological cause for their voice problem, although half described significant life events preceding voice problem onset (eg, miscarriage and other health events, interpersonal conflicts, and family members' illnesses, injuries, and deaths). Participants described psychological influences on voice symptoms that led to rapid exacerbation of their voice symptoms. Participants described the helpfulness of speech therapy and sometimes also challenges of applying techniques in daily life. They also discussed personal coping strategies that included behavioral (eg, avoiding triggers and seeking social support) and psychological (eg, mind-body awareness and emotion regulation) components. Voice-related perceived control was associated with adaptive emotional and behavioral responses, which appeared to facilitate symptom improvement. CONCLUSIONS: In this qualitative pilot study, participant narratives suggested that psychological factors and emotions influence voice symptoms, facilitating development of a preliminary conceptual model of how adaptive and maladaptive responses develop and how they influence vocal function.
Subject(s)
Dysphonia/psychology , Emotions , Life Change Events , Self-Control , Stress, Psychological/psychology , Voice Quality , Adaptation, Psychological , Adult , Aged , Dysphonia/diagnosis , Dysphonia/physiopathology , Dysphonia/therapy , Female , Humans , Interviews as Topic , Male , Middle Aged , Pilot Projects , Qualitative Research , Stress, Psychological/diagnosis , Stress, Psychological/physiopathology , Stress, Psychological/therapy , Time Factors , Voice Training , Young Adult
ABSTRACT
BACKGROUND: Reporting of strategic healthcare-associated infections (HCAIs) to Public Health England is mandatory for all acute hospital trusts in England, via a web-based HCAI Data Capture System (HCAI-DCS). AIM: To investigate the feasibility of automating the current, manual HCAI reporting using linked electronic health records (linked-EHR), and to assess its level of accuracy. METHODS: All data previously submitted through the HCAI-DCS by the Oxford University Hospitals infection control (IC) team for methicillin-resistant and methicillin-susceptible Staphylococcus aureus (MRSA, MSSA), Clostridium difficile, and Escherichia coli up to March 2017 were downloaded and compared with outputs created from linked-EHR, with detailed comparisons for 2013-2017. FINDINGS: Total MRSA, MSSA, E. coli and C. difficile cases entered by the IC team vs linked-EHR were 428 vs 432, 795 vs 816, 2454 vs 2450 and 3365 vs 3393, respectively. From 2013 to 2017, most discrepancies (32/37 (86%)) were likely due to IC recording errors. Patient and specimen identifiers were completed for >98% of cases by both methods, with very high agreement (>97%). Fields relating to the patient at the time the specimen was taken were complete to a similarly high level (>99% IC, >97% linked-EHR), and agreement was fairly good (>80%) except for the main and treatment specialties (57% and 54%, respectively) and the patient category (55%). Optional, organism-specific data fields were less complete by both methods. Where comparisons were possible, agreement was reasonably high (mostly 70-90%). CONCLUSION: Basic factual information, such as demographic data, is almost certainly better automated, and many other data fields can potentially be populated successfully from linked-EHR. Manual data collection is time-consuming and inefficient; automated electronic data collection would leave healthcare professionals free to focus on clinical rather than administrative work.
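A minimal sketch of the record comparison described above: merge the manual HCAI-DCS entries with the linked-EHR extract on a specimen identifier and report per-field completeness and agreement; file and column names are hypothetical.

```python
import pandas as pd

ic = pd.read_csv("hcai_dcs_manual.csv")      # infection-control team entries (hypothetical)
ehr = pd.read_csv("linked_ehr_output.csv")   # automated linked-EHR extract (hypothetical)

merged = ic.merge(ehr, on="specimen_id", suffixes=("_ic", "_ehr"))

for field in ["nhs_number", "specimen_date", "main_specialty", "patient_category"]:
    a, b = merged[f"{field}_ic"], merged[f"{field}_ehr"]
    complete = a.notna() & b.notna()           # both methods populated the field
    agree = (a == b) & complete                # populated and identical
    print(f"{field}: complete {complete.mean():.1%}, "
          f"agreement {agree.sum() / complete.sum():.1%}")
```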
Subject(s)
Cross Infection/epidemiology , Electronic Health Records/statistics & numerical data , Epidemiological Monitoring , Infection Control/methods , Public Health Informatics/methods , Datasets as Topic , Disease Notification/methods , Disease Notification/statistics & numerical data , England/epidemiology , Health Plan Implementation/organization & administration , Health Plan Implementation/statistics & numerical data , Hospitals, University/statistics & numerical data , Humans , Infection Control/organization & administration , Mandatory Programs/organization & administration , Mandatory Programs/statistics & numerical data , Program Evaluation , Public Health Administration , Public Health Informatics/statistics & numerical data , Time Factors
ABSTRACT
BACKGROUND: Previously, we reported that the Brompton Harefield Infection Score (BHIS) accurately predicts surgical site infection (SSI) after coronary artery bypass grafting (CABG). The BHIS was developed using two-centre data and stratifies SSI risk into three groups based on female gender, diabetes or HbA1c > 7.5%, body mass index ≥ 35, left ventricular ejection fraction < 45% and emergency surgery. The purpose of this study was to prospectively evaluate the BHIS both internally and externally. METHODS: A multi-centre prospective evaluation involving three tertiary centres took place between October 2012 and November 2015. SSI was classified using the Public Health England protocol. Receiver operating characteristic (ROC) curves assessed predictive accuracy. RESULTS: Across the four hospital sites, 168 of 4308 (3.9%) CABG patients had an SSI. Categorising patients by BHIS score revealed that 65% were low risk (BHIS 0-1), 26% were medium risk (BHIS 2-3) and 8% were high risk (BHIS ≥ 4). The area under the ROC curve ranged from 0.702 to 0.785 across sites; the overall area under the ROC curve was 0.709. CONCLUSIONS: The BHIS provides a novel, internally and externally evaluated score for a patient's risk of SSI after CABG. It enables clinicians to focus on strategies to prospectively identify high-risk patients and improve outcomes.
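For illustration only, the sketch below builds a simple additive count of the five BHIS components named above (the published score uses its own weighting, which is not reproduced here), stratifies it into the reported low/medium/high bands, and computes the area under the ROC curve for SSI; the data file and column names are hypothetical.

```python
import pandas as pd
from sklearn.metrics import roc_auc_score

cabg = pd.read_csv("cabg_cohort.csv")  # hypothetical per-patient file with 0/1 indicators

components = ["female", "diabetes_or_hba1c_gt_7_5", "bmi_ge_35",
              "lvef_lt_45", "emergency_surgery"]
cabg["risk_count"] = cabg[components].sum(axis=1)  # illustrative equal weighting

# Discrimination of the count for SSI (ssi: 1 = surgical site infection).
print("AUC:", round(roc_auc_score(cabg["ssi"], cabg["risk_count"]), 3))

# Stratify into the three published bands (low 0-1, medium 2-3, high >=4).
bands = pd.cut(cabg["risk_count"], bins=[-1, 1, 3, 10],
               labels=["low", "medium", "high"])
print(bands.value_counts(normalize=True).round(2))
```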
ABSTRACT
Group B coxsackievirus (CVB) infection commonly causes viral myocarditis. Mice are protected from CVB3 myocarditis by gene-targeted knockout of p56Lck (Lck), the Src family kinase (Src) essential for T cell activation. Extracellular signal-regulated kinases 1 and 2 (ERK-1/2) can influence cell function downstream of Lck. Using T cell lines and neonatal cardiac myocytes, we investigated the role of ERK-1/2 in CVB3 infection. In Jurkat T cells, ERK-1/2 is rapidly activated by CVB3, but this response is absent in Lck-negative JCaM T cells. Inhibition of ERK-1/2 with U0126 reduced CVB3 titers in Jurkat cells, but not in JCaM cells. In cardiac myocytes, CVB3 activation of ERK-1/2 is blocked by the Src inhibitor PP2. In addition, viral production in myocytes is decreased by Src or ERK-1/2 inhibition. In vitro, in both immune and myocardial cells, ERK-1/2 is activated by CVB3 downstream of Lck and other Src family kinases and is necessary for efficient CVB3 replication. In vivo, following CVB3 infection, ERK-1/2 activation is evident in the myocardium. ERK-1/2 activation is intense in the hearts of myocarditis-susceptible A/J mice. In contrast, significantly less ERK-1/2 activation is found in the hearts of myocarditis-resistant C57BL/6 mice. Therefore, the ERK-1/2 response to CVB3 infection may contribute to differential host susceptibility to viral myocarditis.
Subject(s)
Enterovirus B, Human/physiology , Enterovirus Infections/enzymology , MAP Kinase Signaling System , Mitogen-Activated Protein Kinase 1/metabolism , Mitogen-Activated Protein Kinases/metabolism , Myocarditis/enzymology , Animals , Disease Models, Animal , Disease Susceptibility , Enterovirus B, Human/growth & development , Enterovirus Infections/pathology , Enterovirus Infections/virology , Enzyme Activation , Humans , Jurkat Cells , Lymphocyte Specific Protein Tyrosine Kinase p56(lck)/genetics , Lymphocyte Specific Protein Tyrosine Kinase p56(lck)/metabolism , Mice , Mice, Inbred A , Mice, Inbred C57BL , Mitogen-Activated Protein Kinase 3 , Myocarditis/pathology , Myocarditis/virology , Myocardium/pathology , src-Family Kinases/metabolism
ABSTRACT
AIM: To compare 2-day and 4-day colon cleansing protocols. METHODS: Children scheduled for colonoscopy (2010-2012) for various medical reasons were recruited from the pediatric gastroenterology clinic at Marshall University School of Medicine, Huntington, WV. Patients were excluded if they were allergic to the medications used in the protocols [polyethylene glycol (PEG) 3350, bisacodyl] or had metabolic or renal disease. Two PEG 3350 protocols, 4 days (group A) and 2 days (group B), were prescribed as previously described. A questionnaire recording the volume of PEG consumed, clinical data, and side effects was completed. Colon preparation was graded by two observers according to a previously described method. MAIN OUTCOME MEASUREMENTS: Rate of adequate colon preparation. RESULTS: A total of 78 patients were included in the final analysis (group A: 40, group B: 38). Age and stool consistency on the last day were comparable in both groups, but the number of stools per day was significantly higher in group B (P = 0.001). Adequate colon preparation was achieved in 57.5% (A) and 73.6% (B), respectively (P = 0.206). Side effects were minimal and comparable in both groups. There was no difference in children's age, stool characteristics, or side effects between children with adequate and inadequate colon preparation. Correlation and agreement between observers were excellent (Pearson correlation = 0.972, kappa = 1.0). CONCLUSION: No difference between protocols was observed, but the 2-day protocol was preferable because of its shorter duration. Direct comparison between different colon cleansing protocols is crucial to establish the "gold standard" protocol for children.
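A short sketch of the inter-observer agreement reported above: Pearson correlation on the numeric preparation grades and Cohen's kappa on the adequate/inadequate classification; file and column names are hypothetical.

```python
import pandas as pd
from scipy.stats import pearsonr
from sklearn.metrics import cohen_kappa_score

# One row per colonoscopy, with each observer's numeric grade and adequacy call (hypothetical file).
grades = pd.read_csv("prep_grades.csv")

r, p = pearsonr(grades["observer_1_grade"], grades["observer_2_grade"])
kappa = cohen_kappa_score(grades["observer_1_adequate"], grades["observer_2_adequate"])
print(f"Pearson r = {r:.3f} (p = {p:.3g}), kappa = {kappa:.2f}")
```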