Results 1 - 20 of 44
1.
PLoS Comput Biol ; 17(7): e1009053, 2021 07.
Article in English | MEDLINE | ID: mdl-34228716

ABSTRACT

Drug-drug interactions account for up to 30% of adverse drug reactions. The increasing prevalence of electronic health records (EHRs) offers a unique opportunity to build machine learning algorithms that identify drug-drug interactions driving adverse events. In this study, we investigated hospitalization data to study interactions of non-steroidal anti-inflammatory drugs (NSAIDs) with other drugs that result in drug-induced liver injury (DILI). We propose a logistic regression-based machine learning algorithm that unearths several known interactions from an EHR dataset of about 400,000 hospitalizations. Our proposed modeling framework detects 87.5% of the positive controls, defined as drugs known to interact with diclofenac causing an increased risk of DILI, and correctly ranks the aggregate risk of DILI for eight commonly prescribed NSAIDs. We found that our modeling framework is particularly successful at inferring associations of drug-drug interactions from relatively small EHR datasets. Furthermore, we identified a novel and potentially hepatotoxic interaction that might occur during concomitant use of meloxicam and esomeprazole, which are commonly prescribed together to allay NSAID-induced gastrointestinal (GI) bleeding. Empirically, we validate our approach against prior methods for signal detection on EHR datasets; our proposed approach outperforms all the compared methods across most metrics, such as area under the receiver operating characteristic curve (AUROC) and area under the precision-recall curve (AUPRC).


Subjects
Anti-Inflammatory Agents, Non-Steroidal/adverse effects , Chemical and Drug Induced Liver Injury , Drug Interactions , Electronic Health Records/statistics & numerical data , Machine Learning , Adolescent , Adult , Aged , Aged, 80 and over , Algorithms , Chemical and Drug Induced Liver Injury/epidemiology , Chemical and Drug Induced Liver Injury/etiology , Computational Biology , Female , Humans , Liver/drug effects , Male , Middle Aged , Models, Statistical , Retrospective Studies , Young Adult
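The evaluation metrics named in this abstract, AUROC and AUPRC, can be computed directly from predicted scores and binary labels. The sketch below is a minimal pure-Python illustration with invented example data, not the paper's data or code:

```python
def auroc(labels, scores):
    # Mann-Whitney formulation: probability that a random positive
    # outranks a random negative (ties count half).
    pos = [s for y, s in zip(labels, scores) if y == 1]
    neg = [s for y, s in zip(labels, scores) if y == 0]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

def auprc(labels, scores):
    # Average-precision form: step through examples in descending
    # score order, accumulating precision at each recall increment.
    order = sorted(range(len(scores)), key=lambda i: -scores[i])
    tp = fp = 0
    total_pos = sum(labels)
    area, prev_recall = 0.0, 0.0
    for i in order:
        if labels[i] == 1:
            tp += 1
        else:
            fp += 1
        recall = tp / total_pos
        area += (recall - prev_recall) * (tp / (tp + fp))
        prev_recall = recall
    return area

labels = [1, 0, 1, 1, 0, 0, 1, 0]            # invented outcomes
scores = [0.9, 0.6, 0.8, 0.4, 0.3, 0.7, 0.55, 0.2]  # invented model scores
print(auroc(labels, scores), auprc(labels, scores))
```

In practice a library routine (e.g. a scikit-learn metric) would be used; the point is only that both areas reduce to rank statistics over score/label pairs.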
2.
Clin Infect Dis ; 73(2): 213-222, 2021 07 15.
Article in English | MEDLINE | ID: mdl-32421195

ABSTRACT

BACKGROUND: Quantifying the amount and diversity of antibiotic use in United States hospitals assists antibiotic stewardship efforts but is hampered by limited national surveillance. Our study aimed to address this knowledge gap by examining adult antibiotic use across 576 hospitals and nearly 12 million encounters in 2016-2017. METHODS: We conducted a retrospective study of patients aged ≥ 18 years discharged from hospitals in the Premier Healthcare Database between 1 January 2016 and 31 December 2017. Using daily antibiotic charge data, we mapped antibiotics to mutually exclusive classes and to spectrum of activity categories. We evaluated relationships between facility and case-mix characteristics and antibiotic use in negative binomial regression models. RESULTS: The study included 11 701 326 admissions, totaling 64 064 632 patient-days, across 576 hospitals. Overall, patients received antibiotics in 65% of hospitalizations, at a crude rate of 870 days of therapy (DOT) per 1000 patient-days. By class, use was highest among β-lactam/β-lactamase inhibitor combinations, third- and fourth-generation cephalosporins, and glycopeptides. Teaching hospitals averaged lower rates of total antibiotic use than nonteaching hospitals (834 vs 957 DOT per 1000 patient-days; P < .001). In adjusted models, teaching hospitals remained associated with lower use of third- and fourth-generation cephalosporins and antipseudomonal agents (adjusted incidence rate ratio [95% confidence interval], 0.92 [.86-.97] and 0.91 [.85-.98], respectively). Significant regional differences in total and class-specific antibiotic use also persisted in adjusted models. CONCLUSIONS: Adult inpatient antibiotic use remains high, driven predominantly by broad-spectrum agents. A better understanding of the reasons for interhospital usage differences, including by region and teaching status, may inform efforts to reduce inappropriate antibiotic prescribing.


Subjects
Anti-Bacterial Agents , Antimicrobial Stewardship , Adult , Anti-Bacterial Agents/therapeutic use , Hospitals , Humans , Patient Discharge , Retrospective Studies , United States
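The crude usage rate reported above, days of therapy (DOT) per 1000 patient-days, is simple arithmetic over admission-level records. A minimal sketch with invented numbers, not the study's data:

```python
# Each record: (antibiotic_days, patient_days) for one admission (invented).
admissions = [(4, 6), (0, 3), (7, 10), (2, 2), (0, 5)]

dot = sum(a for a, _ in admissions)            # total days of therapy
patient_days = sum(d for _, d in admissions)   # total patient-days at risk
rate = 1000 * dot / patient_days               # DOT per 1000 patient-days
share_treated = sum(1 for a, _ in admissions if a > 0) / len(admissions)

print(rate, share_treated)
```

The study's adjusted comparisons then model such DOT counts with negative binomial regression (patient-days as exposure), which requires a statistics library rather than the arithmetic shown here.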
3.
Clin Infect Dis ; 73(11): e4484-e4492, 2021 12 06.
Article in English | MEDLINE | ID: mdl-32756970

ABSTRACT

BACKGROUND: The Centers for Disease Control and Prevention (CDC) uses standardized antimicrobial administration ratios (SAARs)-that is, observed-to-predicted ratios-to compare antibiotic use across facilities. CDC models adjust for facility characteristics when predicting antibiotic use but do not include patient diagnoses and comorbidities that may also affect utilization. This study aimed to identify comorbidities causally related to appropriate antibiotic use and to compare models that include these comorbidities and other patient-level claims variables to a facility model for risk-adjusting inpatient antibiotic utilization. METHODS: The study included adults discharged from Premier Database hospitals in 2016-2017. For each admission, we extracted facility, claims, and antibiotic data. We evaluated 7 models to predict an admission's antibiotic days of therapy (DOTs): a CDC facility model, models that added patient clinical constructs in varying layers of complexity, and an external validation of a published patient-variable model. We calculated hospital-specific SAARs to quantify effects on hospital rankings. Separately, we used Delphi Consensus methodology to identify Elixhauser comorbidities associated with appropriate antibiotic use. RESULTS: The study included 11 701 326 admissions across 576 hospitals. Compared to a CDC-facility model, a model that added Delphi-selected comorbidities and a bacterial infection indicator was more accurate for all antibiotic outcomes. For total antibiotic use, it was 24% more accurate (respective mean absolute errors: 3.11 vs 2.35 DOTs), resulting in 31-33% more hospitals moving into bottom or top usage quartiles postadjustment. CONCLUSIONS: Adding electronically available patient claims data to facility models consistently improved antibiotic utilization predictions and yielded substantial movement in hospitals' utilization rankings.


Assuntos
Antibacterianos , Hospitais , Adulto , Antibacterianos/uso terapêutico , Centers for Disease Control and Prevention, U.S. , Comorbidade , Humanos , Pacientes Internados , Estados Unidos/epidemiologia
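A standardized antimicrobial administration ratio (SAAR) is observed use divided by model-predicted use, and hospital rankings follow from sorting those ratios. A minimal sketch with hypothetical hospitals (the predicted values stand in for output of a risk-adjustment model):

```python
# Hypothetical observed and model-predicted antibiotic days of therapy.
hospitals = {
    "A": {"observed_dot": 900.0, "predicted_dot": 1000.0},
    "B": {"observed_dot": 1300.0, "predicted_dot": 1000.0},
    "C": {"observed_dot": 1000.0, "predicted_dot": 1000.0},
}

# SAAR = observed / predicted; 1.0 means use matches the model's prediction.
saar = {h: v["observed_dot"] / v["predicted_dot"] for h, v in hospitals.items()}
ranking = sorted(saar, key=saar.get)  # lowest adjusted use first
print(saar, ranking)
```

The paper's finding is that adding patient-level predictors changes the `predicted_dot` denominator enough to move roughly a third of hospitals across usage quartiles.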
4.
Am J Epidemiol ; 190(4): 539-552, 2021 04 06.
Article in English | MEDLINE | ID: mdl-33351077

ABSTRACT

There are limited data on longitudinal outcomes for coronavirus disease 2019 (COVID-19) hospitalizations that account for transitions between clinical states over time. Using electronic health record data from a hospital network in the St. Louis, Missouri, region, we performed multistate analyses to examine longitudinal transitions and outcomes among hospitalized adults with laboratory-confirmed COVID-19 with respect to 15 mutually exclusive clinical states. Between March 15 and July 25, 2020, a total of 1,577 patients in the network were hospitalized with COVID-19 (49.9% male; median age, 63 years (interquartile range, 50-75); 58.8% Black). Overall, 34.1% (95% confidence interval (CI): 26.4, 41.8) had an intensive care unit admission and 12.3% (95% CI: 8.5, 16.1) received invasive mechanical ventilation (IMV). The risk of decompensation peaked immediately after admission; discharges peaked around days 3-5, and deaths plateaued between days 7 and 16. At 28 days, 12.6% (95% CI: 9.6, 15.6) of patients had died (4.2% (95% CI: 3.2, 5.2) had received IMV) and 80.8% (95% CI: 75.4, 86.1) had been discharged. Among those receiving IMV, 35.1% (95% CI: 28.2, 42.0) remained intubated after 14 days; after 28 days, 37.6% (95% CI: 30.4, 44.7) had died and only 37.7% (95% CI: 30.6, 44.7) had been discharged. Multistate methods offer granular characterizations of the clinical course of COVID-19 and provide essential information for guiding both clinical decision-making and public health planning.


Subjects
COVID-19/epidemiology , Hospitalization/trends , Intensive Care Units/statistics & numerical data , Pandemics , Respiration, Artificial/methods , SARS-CoV-2 , Aged , COVID-19/therapy , Female , Humans , Male , Middle Aged , Retrospective Studies , United States/epidemiology
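At its core, a multistate analysis rests on counting observed transitions between clinical states and normalizing them into transition probabilities. A toy sketch using a coarse 4-state simplification of the paper's 15 states, with invented daily patient paths:

```python
from collections import Counter

# Invented daily state sequences for hospitalized patients.
paths = [
    ["ward", "ward", "icu", "icu", "discharged"],
    ["ward", "discharged"],
    ["ward", "icu", "dead"],
    ["ward", "ward", "ward", "discharged"],
]

# Count observed one-day transitions (state today -> state tomorrow).
counts = Counter()
for p in paths:
    for a, b in zip(p, p[1:]):
        counts[(a, b)] += 1

# Row-normalize counts into empirical one-day transition probabilities.
states = sorted({s for p in paths for s in p})
P = {a: {b: 0.0 for b in states} for a in states}
for (a, b), n in counts.items():
    P[a][b] = float(n)
for a in states:
    row = sum(P[a].values())
    if row:
        P[a] = {b: v / row for b, v in P[a].items()}

print(P["ward"])
```

Real multistate methods (e.g. Aalen-Johansen estimators) additionally handle censoring and continuous time; this sketch shows only the counting idea behind them.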
5.
Clin Infect Dis ; 65(5): 803-810, 2017 Sep 01.
Article in English | MEDLINE | ID: mdl-28481976

ABSTRACT

BACKGROUND: Healthcare-associated infections such as surgical site infections (SSIs) are used by the Centers for Medicare and Medicaid Services (CMS) as pay-for-performance metrics. Risk adjustment allows a fairer comparison of SSI rates across hospitals. Until 2016, Centers for Disease Control and Prevention (CDC) risk adjustment models for pay-for-performance SSI did not adjust for patient comorbidities. New 2016 CDC models only adjust for body mass index and diabetes. METHODS: We performed a multicenter retrospective cohort study of patients undergoing surgical procedures at 28 US hospitals. Demographic data and International Classification of Diseases, Ninth Revision codes were obtained on patients undergoing colectomy, hysterectomy, and knee and hip replacement procedures. Complex SSIs were identified by infection preventionists at each hospital using CDC criteria. Model performance was evaluated using measures of discrimination and calibration. Hospitals were ranked by SSI proportion and risk-adjusted standardized infection ratios (SIR) to assess the impact of comorbidity adjustment on public reporting. RESULTS: Of 45 394 patients at 28 hospitals, 573 (1.3%) developed a complex SSI. A model containing procedure type, age, race, smoking, diabetes, liver disease, obesity, renal failure, and malnutrition showed good discrimination (C-statistic, 0.73) and calibration. When comparing hospital rankings by crude proportion to risk-adjusted ranks, 24 of 28 (86%) hospitals changed ranks, 16 (57%) changed by ≥2 ranks, and 4 (14%) changed by >10 ranks. CONCLUSIONS: We developed a well-performing risk adjustment model for SSI using electronically available comorbidities. Comorbidity-based risk adjustment should be strongly considered by the CDC and CMS to adequately compare SSI rates across hospitals.


Subjects
Surgical Wound Infection/epidemiology , Adult , Aged , Comorbidity , Female , Humans , Male , Middle Aged , Retrospective Studies , Risk Adjustment , Risk Factors , United States/epidemiology
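The paper's central point, that risk adjustment can reorder hospital rankings, can be illustrated with a standardized infection ratio (SIR = observed/expected infections). A toy example with two hypothetical hospitals whose ranks flip after adjustment:

```python
# name: (observed_ssi, procedures, expected_ssi_from_risk_model) -- invented.
hospitals = {
    "H1": (10, 1000, 12.0),  # sicker case mix: model expects more infections
    "H2": (8, 1000, 5.0),    # healthier case mix: model expects fewer
}

crude = {h: obs / n for h, (obs, n, _) in hospitals.items()}
sir = {h: obs / exp for h, (obs, _, exp) in hospitals.items()}

rank_crude = sorted(crude, key=crude.get)  # best (lowest) first
rank_sir = sorted(sir, key=sir.get)
print(rank_crude, rank_sir)
```

H2 looks better on crude proportions (0.8% vs 1.0%) but worse after adjustment (SIR 1.6 vs 0.83), mirroring the rank changes the study observed in 86% of hospitals.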
6.
Health Care Manag Sci ; 19(3): 291-9, 2016 Sep.
Article in English | MEDLINE | ID: mdl-25876516

ABSTRACT

We compare statistical approaches for predicting the likelihood that individual patients will require readmission to hospital within 30 days of their discharge and for setting quality-control standards in that regard. Logistic regression, neural networks and decision trees are found to have comparable discriminating power when applied to cases that were not used to calibrate the respective models. Significant factors for predicting likelihood of readmission are the patient's medical condition upon admission and discharge, length (days) of the hospital visit, care rendered during the hospital stay, size and role of the medical facility, the type of medical insurance, and the environment into which the patient is discharged. Separately constructed models for major medical specialties (Surgery/Gynecology, Cardiorespiratory, Cardiovascular, Neurology, and Medicine) can improve the ability to identify high-risk patients for possible intervention, while consolidated models (with indicator variables for the specialties) can serve well for assessing overall quality of care.


Subjects
Patient Readmission/statistics & numerical data , Age Factors , Aged , Decision Trees , Environment , Hospital Bed Capacity/statistics & numerical data , Humans , Insurance, Health/statistics & numerical data , Length of Stay/statistics & numerical data , Logistic Models , Medicine/statistics & numerical data , Middle Aged , Neural Networks, Computer , Patient Discharge/statistics & numerical data , Risk Assessment , Risk Factors , Severity of Illness Index
7.
J Patient Exp ; 8: 23743735211034064, 2021.
Article in English | MEDLINE | ID: mdl-34423122

ABSTRACT

Transitioning from one electronic health record (EHR) system to another is one of the most disruptive events in health care, and research about its impact on the inpatient experience is limited. This study aimed to assess the impact of an EHR transition on patient experience as measured by the Hospital Consumer Assessment of Healthcare Providers and Systems composites and global items. An interrupted time series study was conducted to evaluate quarter-specific changes in patient experience following implementation of a new EHR at a Midwest health care system during 2017 to 2018. The first quarter post-implementation was associated with statistically significant decreases in Communication with Nurses (-1.82; 95% CI, -3.22 to -0.43; P = .0101), Responsiveness of Hospital Staff (-2.73; 95% CI, -4.90 to -0.57; P = .0131), Care Transition (-2.01; 95% CI, -3.96 to -0.07; P = .0426), and Recommend the Hospital (-2.42; 95% CI, -4.36 to -0.49; P = .0142). No statistically significant changes were observed in the transition, second, or third quarters post-implementation. Patient experience scores returned to baseline levels after two quarters, and the impact of the EHR transition appeared to be temporary.
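The interrupted-time-series logic used in this study can be sketched in miniature: fit the pre-implementation trend, project it into the post-period, and read the immediate level change as the gap between observed and projected scores. The quarterly numbers below are invented, not the study's data:

```python
# Invented quarterly experience scores before and after an EHR go-live.
pre = [80.0, 80.5, 81.0, 81.5]   # quarters before implementation
post = [79.6, 80.9, 82.4, 83.1]  # quarters after implementation

# Ordinary least-squares line through the pre-period points.
n = len(pre)
xs = list(range(n))
x_mean = sum(xs) / n
y_mean = sum(pre) / n
slope = sum((x - x_mean) * (y - y_mean) for x, y in zip(xs, pre)) / \
        sum((x - x_mean) ** 2 for x in xs)
intercept = y_mean - slope * x_mean

# Project the trend to the first post-implementation quarter (time index n)
# and take the observed-minus-projected gap as the immediate level change.
projected = intercept + slope * n
level_change = post[0] - projected
print(round(level_change, 2))
```

A full segmented regression would fit level and slope changes jointly with standard errors; this sketch isolates only the level-change idea behind the quarter-specific estimates quoted above.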

8.
Learn Health Syst ; 5(1): e10235, 2021 Jan.
Article in English | MEDLINE | ID: mdl-32838037

ABSTRACT

Problem: The current coronavirus disease 2019 (COVID-19) pandemic underscores the need for building and sustaining public health data infrastructure to support a rapid local, regional, national, and international response. Despite a historical context of public health crises, data sharing agreements and transactional standards do not uniformly exist between institutions, which hampers the foundational infrastructure needed to meet data sharing and integration needs for the advancement of public health. Approach: There is a growing need to apply population health knowledge with technological solutions to data transfer, integration, and reasoning, to improve health in a broader learning health system ecosystem. To achieve this, data must be combined from healthcare provider organizations, public health departments, and other settings. Public health entities are in a unique position to consume these data; however, most do not yet have the infrastructure required to integrate data sources and apply computable knowledge to combat this pandemic. Outcomes: Herein, we describe lessons learned and a framework to address these needs, which focus on: (a) identifying and filling technology "gaps"; (b) pursuing collaborative design of data sharing requirements and transmission mechanisms; (c) facilitating cross-domain discussions involving legal and research compliance; and (d) establishing or participating in multi-institutional convening or coordinating activities. Next steps: While by no means a comprehensive evaluation of such issues, we envision that many of our experiences are universal. We hope those elucidated here can serve as the catalyst for a robust community-wide dialogue on what steps can and should be taken to ensure that our regional and national health care systems can truly learn, in a rapid manner, to respond to this and future emergent public health crises.

9.
IEEE J Biomed Health Inform ; 25(6): 2204-2214, 2021 06.
Article in English | MEDLINE | ID: mdl-33095721

ABSTRACT

Machine learning, combined with a proliferation of electronic healthcare records (EHR), has the potential to transform medicine by identifying previously unknown interventions that reduce the risk of adverse outcomes. To realize this potential, machine learning must leave the conceptual 'black box' in complex domains to overcome several pitfalls, such as the presence of confounding variables. These variables predict outcomes but are not causal, often yielding uninformative models. In this work, we envision a 'conversational' approach to designing machine learning models, which couples modeling decisions to domain expertise. We demonstrate this approach via a retrospective cohort study to identify factors that affect the risk of hospital-acquired venous thromboembolism (HA-VTE). Using logistic regression for modeling, we identified drugs that reduce the risk of HA-VTE. Our analysis reveals that ondansetron, an anti-nausea and anti-emetic medication commonly used to treat side effects of chemotherapy and in the period after general anesthesia, substantially reduces the risk of HA-VTE when compared to aspirin (11% vs. 15% relative risk reduction, or RRR, respectively). The low cost and low morbidity of ondansetron may justify further inquiry into its use as a preventative agent for HA-VTE. This case study highlights the importance of engaging domain expertise while applying machine learning in complex domains.


Subjects
Venous Thromboembolism , Hospitals , Humans , Machine Learning , Ondansetron/therapeutic use , Retrospective Studies , Risk Factors , Venous Thromboembolism/epidemiology , Venous Thromboembolism/prevention & control
10.
JAMA Netw Open ; 4(9): e2123374, 2021 09 01.
Article in English | MEDLINE | ID: mdl-34468756

ABSTRACT

Importance: In the absence of a national strategy in response to the COVID-19 pandemic, many public health decisions fell to local elected officials and agencies. Outcomes of such policies depend on a complex combination of local epidemic conditions and demographic features as well as the intensity and timing of such policies and are therefore unclear. Objective: To use a decision analytical model of the COVID-19 epidemic to investigate potential outcomes if actual policies enacted in March 2020 (during the first wave of the epidemic) in the St Louis region of Missouri had been delayed. Design, Setting, and Participants: A previously developed, publicly available, open-source modeling platform (Local Epidemic Modeling for Management & Action, version 2.1) designed to enable localized COVID-19 epidemic projections was used. The compartmental epidemic model is programmed in R and Stan, uses bayesian inference, and accepts user-supplied demographic, epidemiologic, and policy inputs. Hospital census data for 1.3 million people from St Louis City and County from March 14, 2020, through July 15, 2020, were used to calibrate the model. Exposures: Hypothetical delays in actual social distancing policies (which began on March 13, 2020) by 1, 2, or 4 weeks. Sensitivity analyses were conducted that explored plausible spontaneous behavior change in the absence of social distancing policies. Main Outcomes and Measures: Hospitalizations and deaths. Results: A model of 1.3 million residents of the greater St Louis, Missouri, area found an initial reproductive number (indicating transmissibility of an infectious agent) of 3.9 (95% credible interval [CrI], 3.1-4.5) in the St Louis region before March 15, 2020, which fell to 0.93 (95% CrI, 0.88-0.98) after social distancing policies were implemented between March 15 and March 21, 2020. 
By June 15, a 1-week delay in policies would have increased cumulative hospitalizations from an observed actual number of 2246 hospitalizations to a projected 8005 hospitalizations (75% CrI, 3973-15 236 hospitalizations) and increased deaths from an observed actual number of 482 deaths to a projected 1304 deaths (75% CrI, 656-2428 deaths). By June 15, a 2-week delay would have yielded 3292 deaths (75% CrI, 2104-4905 deaths), an additional 2810 deaths (a 583% increase) beyond what was actually observed. Sensitivity analyses incorporating a range of spontaneous behavior changes did not avert severe epidemic projections. Conclusions and Relevance: The results of this decision analytical model study suggest that, in the St Louis region, timely social distancing policies were associated with improved population health outcomes, and even small delays likely would have led to a COVID-19 epidemic similar to those in the most heavily affected areas in the US. These findings indicate that an open-source modeling platform designed to accept user-supplied local and regional data may provide projections tailored to, and more relevant for, local settings.


Subjects
COVID-19/mortality , Health Policy , Hospitalization/statistics & numerical data , Physical Distancing , Bayes Theorem , Female , Hospital Mortality/trends , Humans , Male , Missouri , Pandemics , SARS-CoV-2
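The effect of delaying mitigation can be illustrated with a far simpler model than the paper's Bayesian platform: a discrete-time SIR simulation in which the reproductive number drops from 3.9 to 0.93 (the abstract's estimates) at different times. Population size, seed infections, recovery rate, and switch days below are illustrative assumptions, not the study's calibrated values:

```python
def run_sir(r0_by_day, n=1_300_000, i0=100, gamma=0.1, days=60):
    """Discrete-time SIR; returns the peak number of concurrent infections."""
    s, i, r = n - i0, float(i0), 0.0
    peak = i
    for t in range(days):
        beta = r0_by_day(t) * gamma        # transmission rate from current R0
        new_inf = beta * s * i / n         # S -> I flow this day
        new_rec = gamma * i                # I -> R flow this day
        s, i, r = s - new_inf, i + new_inf - new_rec, r + new_rec
        peak = max(peak, i)
    return peak

# R0 falls from 3.9 to 0.93 when distancing takes effect.
timely = run_sir(lambda t: 3.9 if t < 14 else 0.93)   # policies at day 14
delayed = run_sir(lambda t: 3.9 if t < 21 else 0.93)  # one-week delay
print(timely, delayed)
```

Because infections grow roughly exponentially while R0 > 1, even a one-week delay multiplies the peak severalfold, which is the qualitative mechanism behind the paper's projected hospitalization and death increases.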
11.
Am J Infect Control ; 49(5): 646-648, 2021 05.
Article in English | MEDLINE | ID: mdl-32860846

ABSTRACT

Ultraviolet light (UVL) room disinfection has emerged as an adjunct to manual cleaning of patient rooms. Two different no-touch UVL devices were implemented in 3 health system hospitals to reduce Clostridioides difficile infections (CDI). CDI rates at all 3 facilities remained unchanged following implementation of UVL disinfection. Preintervention CDI rates were generally low, and data from one hospital showed high compliance with manual cleaning, which may have limited the impact of UVL disinfection.


Subjects
Clostridioides difficile , Clostridium Infections , Cross Infection , Clostridioides , Clostridium Infections/prevention & control , Cross Infection/prevention & control , Disinfection , Humans , Ultraviolet Rays
12.
Clin Infect Dis ; 50(4): 459-64, 2010 Feb 15.
Article in English | MEDLINE | ID: mdl-20064039

ABSTRACT

BACKGROUND: Influenza vaccination of health care workers has been recommended since 1984. Multiple strategies to enhance vaccination rates have been suggested, but national rates have remained low. METHODS: BJC HealthCare is a large Midwestern health care organization with approximately 26,000 employees. Because organizational vaccination rates remained below target levels, influenza vaccination was made a condition of employment for all employees in 2008. Medical or religious exemptions could be requested. Predetermined medical contraindications included hypersensitivity to eggs, prior hypersensitivity reaction to influenza vaccine, and history of Guillain-Barré syndrome. Medical exemption requests were reviewed by occupational health nurses and their medical directors. Employees who were neither vaccinated nor exempted by 15 December 2008 were not scheduled for work. Employees still not vaccinated or exempt by 15 January 2009 were terminated. RESULTS: Overall, 25,561 (98.4%) of 25,980 active employees were vaccinated. Ninety employees (0.3%) received religious exemptions, and 321 (1.2%) received medical exemptions. Eight employees (0.03%) were not vaccinated or exempted. Reasons for medical exemption included allergy to eggs (107 [33%]), prior allergic reaction or allergy to other vaccine components (83 [26%]), history of Guillain-Barré syndrome (15 [5%]), and other (116 [36%]), including 14 because of pregnancy. Many requests reflected misinformation about the vaccine. CONCLUSIONS: A mandatory influenza vaccination campaign successfully increased vaccination rates. Fewer employees sought medical or religious exemptions than had signed declination statements during the previous year. A standardized medical exemption request form would simplify the request and review process for employees, their physicians, and occupational health, and will be used next year.


Subjects
Health Personnel/statistics & numerical data , Influenza Vaccines/administration & dosage , Influenza, Human/prevention & control , Mandatory Programs/statistics & numerical data , Mass Vaccination/statistics & numerical data , Female , Health Facilities , Humans , Male , Pregnancy
13.
Crit Care Med ; 38(8 Suppl): S399-404, 2010 Aug.
Article in English | MEDLINE | ID: mdl-20647798

ABSTRACT

The potential to automate at least part of the surveillance process for healthcare-associated infections was recognized as soon as hospitals began to implement computer systems, and progress toward automated surveillance has been ongoing for the last several decades. As more information becomes available electronically in the healthcare setting, the promise of electronic surveillance for healthcare-associated infections has come closer to reality. Although truly fully automated surveillance is not here yet, significant progress is being made at a number of centers on electronic surveillance of central catheter-associated bloodstream infections, ventilator-associated pneumonia, and other healthcare-associated infections. We review the progress that has been made in this area and issues that need to be addressed as surveillance systems are implemented, as well as promising areas for future development.


Subjects
Cross Infection/prevention & control , Hospital Information Systems , Sentinel Surveillance , Catheter-Related Infections/diagnosis , Humans , Pneumonia, Ventilator-Associated/diagnosis
14.
Int J Med Microbiol ; 300(7): 503-11, 2010 Nov.
Article in English | MEDLINE | ID: mdl-20510651

ABSTRACT

Endotracheal (ET) tubes accumulate a biofilm during use, which can harbor potentially pathogenic microorganisms. The enrichment of pathogenic strains in the biofilm may lead to ventilator-associated pneumonia (VAP), with an increased morbidity rate in intensive care units. We used quantitative PCR (qPCR) and gene surveys targeting 16S rRNA genes to quantify and identify the bacterial community, including fastidious/nonculturable organisms, present in extubated ET tubes. We collected eight ET tubes, with intubation periods between 12 h and 23 d, from different patients in a surgical and a medical intensive care unit. Our qPCR data showed that ET tubes were colonized within 24 h. However, the variation between patients was too high to find a positive correlation between bacterial load and intubation period. We obtained 1263 near full-length 16S rRNA gene sequences from the diverse bacterial communities. Over 70% of these sequences were associated with genera of typical oral flora, while only 6% were associated with gastrointestinal flora. The most common genus identified was Streptococcus (348/1263), followed by Prevotella (179/1263) and Neisseria (143/1263), with the highest relative concentrations in ET tubes with short intubation periods, indicating oral inoculation of the ET tubes. Our study also shows that even though potentially pathogenic bacteria existed in ET tube biofilms within 24 h of intubation, a longer intubation period increases the opportunity for these organisms to proliferate. In the ET tube that was in place for 23 d, 95% of the sequences belonged to Pseudomonas aeruginosa, a bacterial pathogen known to outcompete commensal bacteria in biofilms, especially during periods of antibiotic treatment. Harboring such pathogens in ET biofilms may increase the chance of VAP and should be aggressively monitored and prevented.


Subjects
Bacteria/classification , Bacterial Infections/microbiology , Biodiversity , Carrier State/microbiology , Equipment and Supplies/microbiology , Intubation, Intratracheal , Trachea/microbiology , Adolescent , Adult , Aged , Aged, 80 and over , Bacteria/genetics , Bacteria/isolation & purification , Bacterial Load , Biofilms/growth & development , DNA, Bacterial/chemistry , DNA, Bacterial/genetics , DNA, Ribosomal/chemistry , DNA, Ribosomal/genetics , Gastrointestinal Tract/microbiology , Humans , Middle Aged , Molecular Sequence Data , Mouth/microbiology , RNA, Ribosomal, 16S/genetics , Sequence Analysis, DNA , Young Adult
15.
JAMA ; 304(18): 2035-41, 2010 Nov 10.
Article in English | MEDLINE | ID: mdl-21063013

ABSTRACT

CONTEXT: Central line-associated bloodstream infection (BSI) rates, determined by infection preventionists using the Centers for Disease Control and Prevention (CDC) surveillance definitions, are increasingly published to compare the quality of patient care delivered by hospitals. However, such comparisons are valid only if surveillance is performed consistently across institutions. OBJECTIVE: To assess institutional variation in performance of traditional central line-associated BSI surveillance. DESIGN, SETTING, AND PARTICIPANTS: We performed a retrospective cohort study of 20 intensive care units among 4 medical centers (2004-2007). Unit-specific central line-associated BSI rates were calculated for 12-month periods. Infection preventionists, blinded to study participation, performed routine prospective surveillance using CDC definitions. A computer algorithm reference standard was applied retrospectively using criteria that adapted the same CDC surveillance definitions. MAIN OUTCOME MEASURES: Correlation of central line-associated BSI rates as determined by infection preventionist vs the computer algorithm reference standard. Variation in performance was assessed by testing for institution-dependent heterogeneity in a linear regression model. RESULTS: Forty-one unit-periods among 20 intensive care units were analyzed, representing 241,518 patient-days and 165,963 central line-days. The median infection preventionist and computer algorithm central line-associated BSI rates were 3.3 (interquartile range [IQR], 2.0-4.5) and 9.0 (IQR, 6.3-11.3) infections per 1000 central line-days, respectively. 
Overall correlation between computer algorithm and infection preventionist rates was weak (ρ = 0.34), and when stratified by medical center, point estimates for institution-specific correlations ranged widely: medical center A: 0.83; 95% confidence interval (CI), 0.05 to 0.98; P = .04; medical center B: 0.76; 95% CI, 0.32 to 0.93; P = .003; medical center C: 0.50; 95% CI, -0.11 to 0.83; P = .10; and medical center D: 0.10; 95% CI, -0.53 to 0.66; P = .77. Regression modeling demonstrated significant heterogeneity among medical centers in the relationship between computer algorithm and expected infection preventionist rates (P < .001). The medical center that had the lowest rate by traditional surveillance (2.4 infections per 1000 central line-days) had the highest rate by computer algorithm (12.6 infections per 1000 central line-days). CONCLUSIONS: Institutional variability of infection preventionist rates relative to a computer algorithm reference standard suggests that there is significant variation in the application of standard central line-associated BSI surveillance definitions across medical centers. Variation in central line-associated BSI surveillance practice may complicate interinstitutional comparisons of publicly reported central line-associated BSI rates.


Subjects
Bacteremia/epidemiology , Catheter-Related Infections/epidemiology , Cross Infection/epidemiology , Population Surveillance , Quality Assurance, Health Care , Academic Medical Centers/statistics & numerical data , Algorithms , Bacteremia/classification , Catheter-Related Infections/classification , Centers for Disease Control and Prevention, U.S. , Cohort Studies , Cross Infection/classification , Humans , Infection Control , Intensive Care Units/statistics & numerical data , Reproducibility of Results , Retrospective Studies , Single-Blind Method , Terminology as Topic , United States/epidemiology
16.
J Am Med Inform Assoc ; 27(7): 1142-1146, 2020 07 01.
Artigo em Inglês | MEDLINE | ID: mdl-32333757

ABSTRACT

Data and information technology are key to every aspect of our response to the current coronavirus disease 2019 (COVID-19) pandemic, including the diagnosis of patients and delivery of care, the development of predictive models of disease spread, and the management of personnel and equipment. The increasing engagement of informaticians at the forefront of these efforts has marked a fundamental shift from an academic to an operational role. However, the history of informatics as a scientific domain and an area of applied practice provides little guidance or prologue for the incredible challenges we now face. Building on our recent experiences, we present 4 critical lessons learned that have helped shape our scalable, data-driven response to COVID-19. We describe each of these lessons within the context of specific solutions and strategies we applied in addressing the challenges that we faced.


Subjects
Betacoronavirus , Coronavirus Infections/epidemiology , Electronic Health Records , Medical Informatics , Pandemics , Pneumonia, Viral/epidemiology , COVID-19 , Datasets as Topic , Humans , SARS-CoV-2
17.
Mo Med ; 106(4): 274-6, 2009.
Article in English | MEDLINE | ID: mdl-19753919

ABSTRACT

Infections caused by community-acquired methicillin-resistant Staphylococcus aureus (CA-MRSA) have become epidemic over the last decade. It causes a spectrum of diseases in humans but skin and soft tissue infections predominate. Molecular virulence factors in CA-MRSA are incompletely understood. In this article, the epidemiology, presentation, treatment, and surveillance of skin and soft tissue infections due to CA-MRSA are reviewed.


Subjects
Methicillin-Resistant Staphylococcus aureus , Skin Diseases, Infectious/microbiology , Soft Tissue Infections/microbiology , Staphylococcal Infections/complications , Community-Acquired Infections , Humans , Skin Diseases, Infectious/diagnosis , Soft Tissue Infections/diagnosis , Staphylococcal Infections/epidemiology
18.
Jt Comm J Qual Patient Saf ; 45(7): 480-486, 2019 07.
Article in English | MEDLINE | ID: mdl-31133536

ABSTRACT

Medical errors are a significant source of morbidity and mortality, and while focused efforts to prevent harm have been made, sustaining reductions across multiple categories of patient harm remains a challenge. In 2008 BJC HealthCare initiated a systemwide program to eliminate all major causes of preventable harm and mortality over a five-year period with a goal of sustaining these reductions over the subsequent five years. METHODS: Areas of focus included pressure ulcers, adverse drug events, falls with injury, health care-associated infections, and venous thromboembolism. Initial efforts involved building system-level multidisciplinary teams, utilizing standardized project management methods, and establishing standard surveillance methods. Evidence-based interventions were deployed across the system; core standards were established while allowing for flexibility in local implementation. Improvements were tracked using actual numbers of events rather than rates to increase meaning and interpretability by patients and frontline staff. RESULTS: Over the course of the five-year intervention period, total harm events were reduced by 51.6% (10,371 events in 2009 to 5,018 events in 2012). Continued improvement efforts over the subsequent five years led to additional harm reduction (2,605 events in 2017; a 74.9% reduction since 2009). CONCLUSION: A combination of project management discipline, rigorous surveillance, and focused interventions, along with system-level support of local hospital improvement efforts, led to dramatic reductions in preventable harm and long-term sustainment of progress.
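The reported reductions follow directly from the event counts in the abstract; a quick check of the arithmetic:

```python
def pct_reduction(baseline, current):
    """Percent reduction in event count relative to a baseline year."""
    return 100.0 * (baseline - current) / baseline

# Event counts as reported in the abstract:
intervention = pct_reduction(10371, 5018)  # end of the intervention period
sustained = pct_reduction(10371, 2605)     # through 2017
```

Both values round to the 51.6% and 74.9% figures reported, illustrating why the authors tracked raw event counts: the percentages are recoverable, while the counts stay interpretable for frontline staff.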


Subjects
Iatrogenic Disease/prevention & control , Quality Improvement/organization & administration , Accidental Falls/prevention & control , Cross Infection/prevention & control , Drug-Related Side Effects and Adverse Reactions/prevention & control , Electronic Health Records/standards , Humans , Medical Errors/prevention & control , Patient Safety , Pressure Ulcer/prevention & control , Quality Improvement/standards , Venous Thromboembolism/prevention & control
20.
Am J Infect Control ; 35(5): 315-8, 2007 Jun.
Article in English | MEDLINE | ID: mdl-17577478

ABSTRACT

BACKGROUND: Clostridium difficile spores can contaminate the hospital environment. Little is known about the prevalence and strain variability of C. difficile environmental contamination in health care facilities. The objective of this study was to assess C. difficile environmental contamination at various health care facilities in a metropolitan area and determine if the North American pulsed field gel electrophoresis type 1 (NAP1) strain was present. METHODS: A cross-sectional pilot survey was conducted. Forty-eight environmental samples were collected from six health care facilities. Samples were cultured for the presence of C. difficile, and positive samples underwent pulsed field gel electrophoresis, toxinotyping, and detection of binary toxin and/or tcdC deletion. RESULTS: C. difficile was cultured from 13 of 48 (27%) samples. Rooms housing a patient with C. difficile-associated disease (CDAD) were more likely to be culture positive than non-CDAD patient rooms (100% vs. 33%; P < 0.01); C. difficile was not isolated outside of patient rooms (0 of 12 samples). The NAP1 epidemic strain was found in 5 out of 6 facilities. CONCLUSION: C. difficile spores frequently contaminated the hospital environment. Rooms with a CDAD patient were more likely to be contaminated than rooms without a CDAD patient. The NAP1 strain was prevalent throughout the metropolitan area.
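The CDAD-room comparison above (100% vs. 33% culture positive; P < .01) is the kind of 2×2 contrast commonly evaluated with Fisher's exact test for small samples. A self-contained sketch, using hypothetical cell counts because the abstract does not report the exact denominators:

```python
from math import comb

def fisher_exact_two_sided(a, b, c, d):
    """Two-sided Fisher's exact test for the 2x2 table [[a, b], [c, d]]:
    sums hypergeometric probabilities no larger than the observed table's."""
    row1, row2, col1, n = a + b, c + d, a + c, a + b + c + d
    def hyperg(x):
        # Probability of x successes in row 1 given fixed margins
        return comb(row1, x) * comb(row2, col1 - x) / comb(n, col1)
    p_obs = hyperg(a)
    lo, hi = max(0, col1 - row2), min(row1, col1)
    return sum(hyperg(x) for x in range(lo, hi + 1)
               if hyperg(x) <= p_obs * (1 + 1e-9))

# Hypothetical counts: 6/6 CDAD rooms positive vs. 4/12 non-CDAD rooms positive
p = fisher_exact_two_sided(6, 0, 4, 8)
```

With samples this small, the exact test is preferred over a chi-square approximation, which is presumably why the study reports an exact-style P value.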


Subjects
Bacterial Toxins/classification , Clostridioides difficile/classification , Clostridioides difficile/isolation & purification , Equipment Contamination/statistics & numerical data , Health Facilities , Bacterial Toxins/isolation & purification , Clostridioides difficile/pathogenicity , Clostridium Infections/prevention & control , Cross Infection/prevention & control , Cross-Sectional Studies , Dysentery/prevention & control , Electrophoresis, Gel, Pulsed-Field , Environmental Monitoring , Epidemiological Monitoring , Humans , Missouri/epidemiology , Prevalence