ABSTRACT
COVID-19 manifests with a wide spectrum of clinical phenotypes that are characterized by exaggerated and misdirected host immune responses [1-6]. Although pathological innate immune activation is well documented in severe disease [1], the effect of autoantibodies on disease progression is less well defined. Here we use a high-throughput autoantibody discovery technique known as rapid extracellular antigen profiling [7] to screen a cohort of 194 individuals infected with SARS-CoV-2, comprising 172 patients with COVID-19 and 22 healthcare workers with mild disease or asymptomatic infection, for autoantibodies against 2,770 extracellular and secreted proteins (members of the exoproteome). We found that patients with COVID-19 exhibit marked increases in autoantibody reactivities compared with uninfected individuals, and show a high prevalence of autoantibodies against immunomodulatory proteins (including cytokines, chemokines, complement components and cell-surface proteins). We established that these autoantibodies perturb immune function and impair virological control by inhibiting immunoreceptor signalling and by altering peripheral immune cell composition, and found that mouse surrogates of these autoantibodies increase disease severity in a mouse model of SARS-CoV-2 infection. Our analysis of autoantibodies against tissue-associated antigens revealed associations with specific clinical characteristics. Our findings suggest a pathological role for exoproteome-directed autoantibodies in COVID-19, with diverse effects on immune functionality and associations with clinical outcomes.
Subjects
Autoantibodies/analysis, Autoantibodies/immunology, COVID-19/immunology, COVID-19/metabolism, Proteome/immunology, Proteome/metabolism, Animals, Surface Antigens/immunology, COVID-19/pathology, COVID-19/physiopathology, Case-Control Studies, Complement System Proteins/immunology, Cytokines/immunology, Animal Disease Models, Disease Progression, Female, Humans, Male, Mice, Organ Specificity/immunology
ABSTRACT
Recent studies have provided insights into the pathogenesis of coronavirus disease 2019 (COVID-19) [1-4]. However, the longitudinal immunological correlates of disease outcome remain unclear. Here we serially analysed immune responses in 113 patients with moderate or severe COVID-19. Immune profiling revealed an overall increase in innate cell lineages, with a concomitant reduction in T cell number. An early elevation in cytokine levels was associated with worse disease outcomes. Following an early increase in cytokines, patients with moderate COVID-19 displayed a progressive reduction in type 1 (antiviral) and type 3 (antifungal) responses. By contrast, patients with severe COVID-19 maintained these elevated responses throughout the course of the disease. Moreover, severe COVID-19 was accompanied by an increase in multiple type 2 (anti-helminth) effectors, including interleukin-5 (IL-5), IL-13, immunoglobulin E and eosinophils. Unsupervised clustering analysis identified four immune signatures, representing growth factors (A), type-2/3 cytokines (B), mixed type-1/2/3 cytokines (C), and chemokines (D), that correlated with three distinct disease trajectories. The immune profiles of patients who recovered from moderate COVID-19 were enriched in the tissue-reparative growth factor signature A, whereas the profiles of those who developed severe disease had elevated levels of all four signatures. Thus, we have identified a maladapted immune response profile associated with severe COVID-19 and poor clinical outcome, as well as early immune signatures that correlate with divergent disease trajectories.
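The four-signature decomposition described above lends itself to a generic unsupervised pipeline. The following is a minimal sketch, not the authors' code: the cytokine panel, the log/z-score preprocessing, and the choice of k = 4 clusters are assumptions made for illustration.

```python
# Minimal sketch of signature discovery via unsupervised clustering,
# assuming a patients-x-cytokines matrix; illustrative only.
import numpy as np
from sklearn.preprocessing import StandardScaler
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
X = rng.lognormal(mean=2.0, sigma=1.0, size=(113, 40))  # placeholder plasma cytokine levels

Xz = StandardScaler().fit_transform(np.log1p(X))  # log-transform, then z-score each analyte

# Cluster analytes (columns) rather than patients, grouping co-varying
# markers into signatures; k = 4 mirrors the four signatures reported above.
km = KMeans(n_clusters=4, n_init=10, random_state=0).fit(Xz.T)
for sig in range(4):
    members = np.where(km.labels_ == sig)[0]
    print(f"signature {chr(65 + sig)}: analyte columns {members[:8]}")
```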
Subjects
Coronavirus Infections/immunology, Coronavirus Infections/physiopathology, Cytokines/analysis, Viral Pneumonia/immunology, Viral Pneumonia/physiopathology, Adult, Aged, Aged 80 and over, COVID-19, Cluster Analysis, Cytokines/immunology, Eosinophils/immunology, Female, Humans, Immunoglobulin E/analysis, Immunoglobulin E/immunology, Interleukin-13/analysis, Interleukin-13/immunology, Interleukin-5/analysis, Interleukin-5/immunology, Male, Middle Aged, Pandemics, T Lymphocytes/cytology, T Lymphocytes/immunology, Viral Load, Young Adult
ABSTRACT
There is increasing evidence that coronavirus disease 2019 (COVID-19) produces more severe symptoms and higher mortality among men than among women1-5. However, whether immune responses against severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2) differ between sexes, and whether such differences correlate with the sex difference in the disease course of COVID-19, is currently unknown. Here we examined sex differences in viral loads, SARS-CoV-2-specific antibody titres, plasma cytokines and blood-cell phenotyping in patients with moderate COVID-19 who had not received immunomodulatory medications. Male patients had higher plasma levels of innate immune cytokines such as IL-8 and IL-18, along with more robust induction of non-classical monocytes. By contrast, female patients had more robust T cell activation than male patients during SARS-CoV-2 infection. Notably, we found that a poor T cell response negatively correlated with patients' age and was associated with worse disease outcome in male patients, but not in female patients. By contrast, higher levels of innate immune cytokines were associated with worse disease progression in female patients, but not in male patients. These findings provide a possible explanation for the observed sex biases in COVID-19, and provide an important basis for the development of a sex-based approach to the treatment and care of male and female patients with COVID-19.
Subjects
COVID-19/immunology, Cytokines/immunology, Innate Immunity/immunology, SARS-CoV-2/immunology, Sex Characteristics, T Lymphocytes/immunology, COVID-19/blood, COVID-19/virology, Chemokines/blood, Chemokines/immunology, Cohort Studies, Cytokines/blood, Disease Progression, Female, Humans, Lymphocyte Activation, Male, Monocytes/immunology, Phenotype, Prognosis, Viral RNA/analysis, SARS-CoV-2/pathogenicity, Viral Load
ABSTRACT
Significant variations have been observed in viral copies generated during SARS-CoV-2 infections. However, the factors that impact viral copies and infection dynamics are not fully understood, and may be inherently dependent upon different viral and host factors. Here, we conducted virus whole genome sequencing and measured viral copies using RT-qPCR from 9,902 SARS-CoV-2 infections over a 2-year period to examine the impact of virus genetic variation on changes in viral copies adjusted for host age and vaccination status. Using a genome-wide association study (GWAS) approach, we identified multiple single-nucleotide polymorphisms (SNPs) corresponding to amino acid changes in the SARS-CoV-2 genome associated with variations in viral copies. We further applied a marginal epistasis test to detect interactions among SNPs and identified multiple pairs of substitutions located in the spike gene that have non-linear effects on viral copies. We also analyzed the temporal patterns and found that SNPs associated with increased viral copies were predominantly observed in Delta and Omicron BA.2/BA.4/BA.5/XBB infections, whereas those associated with decreased viral copies were only observed in infections with Omicron BA.1 variants. Our work showcases how GWAS can be a useful tool for probing phenotypes related to SNPs in viral genomes that are worth further exploration. We argue that this approach can be used more broadly across pathogens to characterize emerging variants and monitor therapeutic interventions.
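The per-variant association scan can be illustrated with a simple adjusted regression. This is a hedged sketch of the general approach, not the study's pipeline; the column names, the 0/1 substitution encoding, and the plain OLS model are assumptions (a real GWAS would also correct for multiple testing and population structure).

```python
# Sketch of a per-SNP association scan on viral genomes, assuming a 0/1
# substitution matrix and log10 viral copies; illustrative, not the study code.
import pandas as pd
import statsmodels.api as sm

def snp_scan(genotypes: pd.DataFrame, log10_copies: pd.Series,
             age: pd.Series, vaccinated: pd.Series) -> pd.DataFrame:
    """Regress viral copies on each SNP, adjusted for host age and vaccination."""
    covars = pd.DataFrame({"age": age, "vaccinated": vaccinated.astype(int)})
    results = []
    for snp in genotypes.columns:
        X = sm.add_constant(pd.concat([genotypes[snp], covars], axis=1))
        fit = sm.OLS(log10_copies, X).fit()
        results.append({"snp": snp, "beta": fit.params[snp], "p": fit.pvalues[snp]})
    return pd.DataFrame(results).sort_values("p")  # smallest p-values first
```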
Subjects
COVID-19, Viral Genome, Genome-Wide Association Study, Single-Nucleotide Polymorphism, SARS-CoV-2, Single-Nucleotide Polymorphism/genetics, Humans, SARS-CoV-2/genetics, Genome-Wide Association Study/methods, COVID-19/genetics, COVID-19/virology, Viral Genome/genetics, Coronavirus Spike Glycoprotein/genetics, Middle Aged, Adult, Male, Female, Viral Load/genetics, Aged, Whole Genome Sequencing/methods
ABSTRACT
Over the last century, outbreaks and pandemics have occurred with disturbing regularity, necessitating advance preparation and large-scale, coordinated responses. Here, we developed a machine learning predictive model of disease severity and length of hospitalization for COVID-19, which can be utilized as a platform for future unknown viral outbreaks. We combined untargeted metabolomics of plasma obtained from COVID-19 patients during hospitalization (n = 111) and healthy controls (n = 342) with clinical and comorbidity data (n = 508) to build this patient triage platform, which consists of three parts: (i) a clinical decision tree, which, among other biomarkers, showed that increased eosinophils are associated with worse disease prognosis and can serve as a new potential biomarker with high accuracy (AUC = 0.974); (ii) an estimate of patient hospitalization length with an error of ± 5 days (R² = 0.9765); and (iii) a prediction of disease severity and of the need for patient transfer to the intensive care unit. We report a significant decrease in serotonin levels in patients who needed positive airway pressure oxygen and/or were intubated. Furthermore, 5-hydroxytryptophan, allantoin, and glucuronic acid metabolites were increased in COVID-19 patients, and collectively they can serve as biomarkers to predict disease progression. The ability to quickly identify which patients will develop life-threatening illness would allow the efficient allocation of medical resources and implementation of the most effective medical interventions. We would advocate that the same approach be utilized in future viral outbreaks to help hospitals triage patients more effectively and improve patient outcomes while optimizing healthcare resources.
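A compact illustration of the triage idea, a shallow decision tree over combined metabolomic and clinical features scored by AUC, is sketched below on synthetic data; the feature set and labels are invented, and the tree is not the published model.

```python
# Decision-tree triage sketch on synthetic data; not the published model.
import numpy as np
from sklearn.tree import DecisionTreeClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(1)
n = 453                                  # e.g., 111 patients + 342 controls
X = rng.normal(size=(n, 5))              # placeholder features (eosinophils, serotonin, ...)
y = (X[:, 0] + 0.5 * rng.normal(size=n) > 0.5).astype(int)  # synthetic severity label

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=1, stratify=y)
tree = DecisionTreeClassifier(max_depth=3, random_state=1).fit(X_tr, y_tr)
auc = roc_auc_score(y_te, tree.predict_proba(X_te)[:, 1])
print(f"hold-out AUC: {auc:.3f}")        # the paper reports AUC = 0.974 on real data
```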
Subjects
COVID-19, Humans, COVID-19/epidemiology, Triage, Allantoin, Disease Outbreaks, Machine Learning
ABSTRACT
BACKGROUND: Improving hypertension control is a public health priority. However, consistent identification of uncontrolled hypertension using computable definitions in electronic health records (EHR) across health systems remains uncertain. METHODS: In this retrospective cohort study, we applied two computable definitions to EHR data to identify patients with controlled and uncontrolled hypertension and to evaluate differences in characteristics, treatment, and clinical outcomes between these patient populations. We included adult patients (≥ 18 years) with hypertension (based on either ICD-10 codes for hypertension or two elevated blood pressure [BP] measurements) receiving ambulatory care within Yale New Haven Health System (YNHHS; a large US health system) and the OneFlorida Clinical Research Consortium (OneFlorida; a clinical research network comprising 16 health systems) between October 2015 and December 2018. We identified patients with controlled and uncontrolled hypertension based on either a single BP measurement from a randomly selected visit or all BP measurements recorded between hypertension identification and the randomly selected visit. RESULTS: Overall, 253,207 and 182,827 adults at YNHHS and OneFlorida, respectively, were identified as having hypertension. Of these patients, 83.1% at YNHHS and 76.8% at OneFlorida were identified using ICD-10-CM codes, whereas 16.9% and 23.2%, respectively, were identified using elevated BP measurements (≥ 140/90 mmHg). A total of 24.1% of patients at YNHHS and 21.6% at OneFlorida had both a diagnosis code for hypertension and elevated BP measurements. Uncontrolled hypertension was observed among 32.5% and 43.7% of patients at YNHHS and OneFlorida, respectively. Uncontrolled hypertension was disproportionately more common among Black patients than among White patients (38.9% versus 31.5% at YNHHS, p < 0.001; 49.7% versus 41.2% at OneFlorida, p < 0.001). Medication prescription for hypertension management was more common in patients with uncontrolled hypertension than in those with controlled hypertension (overall treatment rate: 39.3% versus 37.3% at YNHHS, p = 0.04; 42.2% versus 34.8% at OneFlorida, p < 0.001). Patients with controlled and uncontrolled hypertension had similar incidence rates of death, cardiovascular disease (CVD) events, and healthcare visits at 3, 6, 12, and 24 months. The two computable definitions generated consistent results. CONCLUSIONS: While current EHR systems are not fully optimized for disease surveillance and stratification, our findings illustrate the potential of leveraging EHR data to conduct digital population surveillance in the realm of hypertension management.
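The two computable definitions can be expressed as small functions over dated BP readings. This sketch follows the ≥ 140/90 mmHg threshold quoted above; how multiple readings are aggregated under the second definition is not specified in the abstract, so the majority rule used here is an assumption.

```python
# Sketch of the two computable definitions of uncontrolled hypertension,
# over a list of (visit date, systolic, diastolic) readings; illustrative.
import random
from datetime import date

Reading = tuple[date, int, int]  # (visit date, systolic, diastolic)

def elevated(systolic: int, diastolic: int) -> bool:
    return systolic >= 140 or diastolic >= 90

def uncontrolled_single_visit(readings: list[Reading], seed: int = 0) -> bool:
    """Definition 1: one BP measurement from a randomly selected visit."""
    _, sbp, dbp = random.Random(seed).choice(readings)
    return elevated(sbp, dbp)

def uncontrolled_all_visits(readings: list[Reading], identified: date,
                            index_visit: date) -> bool:
    """Definition 2: all BPs between identification and the index visit;
    the majority rule used here is an assumption, not stated in the abstract."""
    window = [(s, d) for dt, s, d in readings if identified <= dt <= index_visit]
    if not window:
        return False
    return sum(elevated(s, d) for s, d in window) > len(window) / 2
```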
Subjects
Antihypertensive Agents, Blood Pressure, Electronic Health Records, Hypertension, Humans, Hypertension/diagnosis, Hypertension/physiopathology, Hypertension/drug therapy, Hypertension/epidemiology, Male, Female, Retrospective Studies, Middle Aged, Antihypertensive Agents/therapeutic use, Aged, Blood Pressure/drug effects, Adult, Treatment Outcome, United States/epidemiology, Time Factors
ABSTRACT
BACKGROUND: The impact that variant-specific immune evasion and waning protection have on declining coronavirus disease 2019 (COVID-19) vaccine effectiveness (VE) remains unclear. Using whole-genome sequencing (WGS), we examined the contribution of these factors to the decline that followed the introduction of the Delta variant. Furthermore, we evaluated calendar-period-based classification as an alternative to WGS. METHODS: We conducted a test-negative case-control study among people tested for severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2) between 1 April and 24 August 2021. Variants were classified using WGS and calendar period. RESULTS: We included 2,029 cases (positive, sequenced samples) and 343,727 controls (negative tests). VE 14-89 days after the second dose was significantly higher against Alpha (84.4%; 95% confidence interval [CI], 75.6%-90.0%) than Delta infection (68.9%; 95% CI, 58.0%-77.1%). The odds of Delta infection were significantly higher 90-149 days than 14-89 days after the second dose (P = .003). Calendar-period-classified VE estimates approximated WGS-classified estimates; however, calendar-period-based classification was subject to misclassification (35% Alpha, 4% Delta). CONCLUSIONS: Both waning protection and variant-specific immune evasion contributed to the lower effectiveness. While calendar-period-classified VE estimates mirrored WGS-classified estimates, our analysis highlights the need for WGS when variants are cocirculating and misclassification is likely.
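In a test-negative design, VE is derived from the odds ratio as VE = (1 − OR) × 100, with the OR estimated by logistic regression of case status on vaccination. The sketch below illustrates that calculation; the covariates and column names are assumptions, not the study's exact model.

```python
# Test-negative VE sketch: VE = (1 - OR_vaccinated) * 100, with the OR taken
# from a logistic regression of case status on vaccination; illustrative.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

def vaccine_effectiveness(df: pd.DataFrame) -> tuple[float, tuple[float, float]]:
    """df needs columns: case (0/1), vaccinated (0/1), age, test_week."""
    fit = smf.logit("case ~ vaccinated + age + C(test_week)", data=df).fit(disp=0)
    beta = fit.params["vaccinated"]
    lo, hi = fit.conf_int().loc["vaccinated"]
    ve = (1 - np.exp(beta)) * 100
    ci = ((1 - np.exp(hi)) * 100, (1 - np.exp(lo)) * 100)  # CI bounds flip under 1 - OR
    return ve, ci
```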
Subjects
COVID-19, Hepatitis D, Humans, COVID-19 Vaccines, Case-Control Studies, Immune Evasion, SARS-CoV-2, Vaccine Efficacy
ABSTRACT
Graph data models are an emerging approach to structuring clinical and biomedical information. These models offer intriguing opportunities for novel approaches in healthcare, such as disease phenotyping, risk prediction, and personalized precision care. The combination of data and information in a graph model to create knowledge graphs has rapidly expanded in biomedical research, but the integration of real-world data from the electronic health record (EHR) has been limited. To broadly apply knowledge graphs to EHR and other real-world data, a deeper understanding of how to represent these data in a standardized graph model is needed. We provide an overview of the state-of-the-art research on clinical and biomedical data integration and summarize the potential to accelerate healthcare and precision medicine research through insight generation from integrated knowledge graphs.
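As a concrete (toy) illustration of the idea, EHR facts can be loaded into a property graph whose nodes are patients, conditions, and medications and whose edges carry clinical relations; the vocabulary below is invented, not a published standard.

```python
# Toy knowledge-graph representation of EHR facts; invented vocabulary.
import networkx as nx

G = nx.MultiDiGraph()
G.add_node("patient:001", type="Patient")
G.add_node("condition:I10", type="Condition", label="Essential hypertension")
G.add_node("drug:lisinopril", type="Medication")

G.add_edge("patient:001", "condition:I10", relation="HAS_DIAGNOSIS", date="2023-04-01")
G.add_edge("patient:001", "drug:lisinopril", relation="PRESCRIBED", date="2023-04-02")
G.add_edge("drug:lisinopril", "condition:I10", relation="TREATS")

# A phenotyping query becomes a graph traversal: patients with this diagnosis.
hypertensive = [
    p for p, c, d in G.edges(data=True)
    if d.get("relation") == "HAS_DIAGNOSIS" and c == "condition:I10"
]
print(hypertensive)
```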
Subjects
Algorithms, Biomedical Research, Humans, Automated Pattern Recognition, Phenotype, Precision Medicine
ABSTRACT
BACKGROUND: The benefit of primary and booster vaccination in people who experienced a prior Severe Acute Respiratory Syndrome Coronavirus 2 (SARS-CoV-2) infection remains unclear. The objective of this study was to estimate the effectiveness of primary (two-dose series) and booster (third dose) mRNA vaccination against Omicron (lineage BA.1) infection among people with a prior documented infection. METHODS AND FINDINGS: We conducted a test-negative case-control study of reverse transcription PCRs (RT-PCRs) analyzed with the TaqPath (Thermo Fisher Scientific) assay and recorded in the Yale New Haven Health system from November 1, 2021, to April 30, 2022. Overall, 11,307 cases (positive TaqPath-analyzed RT-PCRs with S-gene target failure [SGTF]) and 130,041 controls (negative TaqPath-analyzed RT-PCRs) were included (median age: cases: 35 years, controls: 39 years). Among cases and controls, 5.9% and 8.1%, respectively, had a documented prior infection (a positive SARS-CoV-2 test record ≥90 days prior to the included test). We estimated the effectiveness of primary and booster vaccination against SGTF-defined Omicron (lineage BA.1) variant infection using a logistic regression adjusted for date of test, age, sex, race/ethnicity, insurance, comorbidities, social vulnerability index, municipality, and healthcare utilization. The effectiveness of primary vaccination 14 to 149 days after the second dose was 41.0% (95% confidence interval (CI): 14.1% to 59.4%, p = 0.006) and 27.1% (95% CI: 18.7% to 34.6%, p < 0.001) for people with and without a documented prior infection, respectively. The effectiveness of booster vaccination (≥14 days after booster dose) was 47.1% (95% CI: 22.4% to 63.9%, p = 0.001) and 54.1% (95% CI: 49.2% to 58.4%, p < 0.001) in people with and without a documented prior infection, respectively. To test whether booster vaccination reduced the risk of infection beyond that of the primary series, we compared the odds of infection among boosted (≥14 days after booster dose) and booster-eligible people (≥150 days after second dose). The odds ratio (OR) comparing boosted and booster-eligible people with a documented prior infection was 0.79 (95% CI: 0.54 to 1.16, p = 0.222), whereas the OR comparing boosted and booster-eligible people without a documented prior infection was 0.54 (95% CI: 0.49 to 0.59, p < 0.001). This study's limitations include the risk of residual confounding, the use of data from a single system, and the reliance on TaqPath-analyzed RT-PCR results. CONCLUSIONS: In this study, we observed that primary vaccination provided significant but limited protection against Omicron (lineage BA.1) infection among people with and without a documented prior infection. While booster vaccination was associated with additional protection against Omicron BA.1 infection in people without a documented prior infection, it was not found to be associated with additional protection among people with a documented prior infection. These findings support primary vaccination in people regardless of documented prior infection status but suggest that infection history may impact the relative benefit of booster doses.
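The boosted versus booster-eligible comparison reduces to an odds ratio within each prior-infection stratum. The sketch below shows the crude 2×2 calculation on hypothetical counts; the study's ORs were additionally adjusted for test date, demographics, and other covariates.

```python
# Crude odds ratio for infection, boosted (exposed) vs booster-eligible
# (unexposed), within one prior-infection stratum; counts are hypothetical.
def crude_or(cases_exposed: int, cases_unexposed: int,
             controls_exposed: int, controls_unexposed: int) -> float:
    return (cases_exposed / controls_exposed) / (cases_unexposed / controls_unexposed)

or_prior = crude_or(cases_exposed=40, cases_unexposed=55,
                    controls_exposed=1200, controls_unexposed=1300)
print(f"crude OR, prior-infection stratum: {or_prior:.2f}")  # <1 implies added protection
```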
Subjects
COVID-19, Humans, Adult, COVID-19/epidemiology, COVID-19/prevention & control, SARS-CoV-2/genetics, Case-Control Studies, Odds Ratio, Vaccination
ABSTRACT
BACKGROUND: Modern artificial intelligence (AI) and machine learning (ML) methods are now capable of completing tasks with performance characteristics comparable to those of expert human operators. As a result, many areas throughout healthcare are incorporating these technologies, including in vitro diagnostics and, more broadly, laboratory medicine. However, there are few literature reviews of the current landscape, likely future, and challenges of the application of AI/ML in laboratory medicine. CONTENT: In this review, we begin with a brief introduction to AI and its subfield of ML. The ensuing sections describe ML systems that are currently in clinical laboratory practice or are being proposed for such use in recent literature, ML systems that use laboratory data outside the clinical laboratory, challenges to the adoption of ML, and future opportunities for ML in laboratory medicine. SUMMARY: AI and ML have influenced, and will continue to influence, the practice and scope of laboratory medicine dramatically. This has been made possible by advancements in modern computing and the widespread digitization of health information. These technologies are being rapidly developed and described, but in comparison, their implementation thus far has been modest. To spur the implementation of reliable and sophisticated ML-based technologies, we need to further establish best practices and improve our information system and communication infrastructure. The participation of the clinical laboratory community is essential to ensure that laboratory data are sufficiently available and incorporated conscientiously into robust, safe, and clinically effective ML-supported clinical diagnostics.
Subjects
Artificial Intelligence, Medicine, Delivery of Health Care, Humans, Laboratories, Machine Learning
ABSTRACT
BACKGROUND: Clinical babesiosis is diagnosed, and parasite burden is determined, by microscopic inspection of a thick or thin Giemsa-stained peripheral blood smear. However, quantitative analysis by manual microscopy is subject to error. As such, methods for the automated measurement of percent parasitemia in digital microscopic images of peripheral blood smears could improve clinical accuracy relative to the predicate method. METHODS: Individual erythrocyte images were manually labeled as "parasite" or "normal" and were used to train a model for binary image classification. The best model was then used to calculate percent parasitemia from a clinical validation dataset, and values were compared to a clinical reference value. Lastly, model interpretability was examined using integrated gradients to identify the pixels most likely to influence classification decisions. RESULTS: The precision and recall of the model during development testing were 0.92 and 1.00, respectively. In clinical validation, the model returned increasing positive signal with increasing mean reference value. However, there were 2 highly erroneous false positive values returned by the model. Further, the model incorrectly assessed 3 cases well above the clinical threshold of 10%. The integrated gradients suggested potential sources of false positives, including rouleaux formations, cell boundaries, and precipitate, as deterministic factors in negative erythrocyte images. CONCLUSIONS: While the model demonstrated highly accurate single-cell classification and correctly assessed most slides, several false positives were highly incorrect. This project highlights the need for integrated testing of machine learning-based models, even when models in the development phase perform well.
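Percent parasitemia follows directly from the per-cell binary classifications, and the development-phase metrics are standard precision/recall. A minimal sketch, with placeholder labels rather than real model output:

```python
# Percent parasitemia from per-cell binary classifications, plus the
# precision/recall check described above; labels here are placeholders.
import numpy as np
from sklearn.metrics import precision_score, recall_score

def percent_parasitemia(cell_predictions: np.ndarray) -> float:
    """cell_predictions: 1 = parasitized erythrocyte, 0 = normal."""
    return 100.0 * cell_predictions.mean()

y_true = np.array([0, 0, 1, 0, 1, 0, 0, 0, 1, 0])  # manual labels
y_pred = np.array([0, 0, 1, 0, 1, 1, 0, 0, 1, 0])  # model output (toy)

print(f"precision:   {precision_score(y_true, y_pred):.2f}")  # paper reports 0.92 on real data
print(f"recall:      {recall_score(y_true, y_pred):.2f}")     # paper reports 1.00 on real data
print(f"parasitemia: {percent_parasitemia(y_pred):.1f}%")
```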
Subjects
Babesia, Parasitemia, Erythrocytes, Humans, Microscopy/methods, Computer Neural Networks, Parasitemia/diagnosis
ABSTRACT
RATIONALE & OBJECTIVE: Although coronavirus disease 2019 (COVID-19) has been associated with acute kidney injury (AKI), it is unclear whether this association is independent of traditional risk factors such as hypotension, nephrotoxin exposure, and inflammation. We tested the independent association of COVID-19 with AKI. STUDY DESIGN: Multicenter, observational, cohort study. SETTING & PARTICIPANTS: Patients admitted to 1 of 6 hospitals within the Yale New Haven Health System between March 10, 2020, and August 31, 2020, with results for severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2) testing via polymerase chain reaction of a nasopharyngeal sample. EXPOSURE: Positive test for SARS-CoV-2. OUTCOME: AKI by KDIGO (Kidney Disease: Improving Global Outcomes) criteria. ANALYTICAL APPROACH: Evaluated the association of COVID-19 with AKI after controlling for time-invariant factors at admission (eg, demographic characteristics, comorbidities) and time-varying factors updated continuously during hospitalization (eg, vital signs, medications, laboratory results, respiratory failure) using time-updated Cox proportional hazard models. RESULTS: Of the 22,122 patients hospitalized, 2,600 tested positive and 19,522 tested negative for SARS-CoV-2. Compared with patients who tested negative, patients with COVID-19 had more AKI (30.6% vs 18.2%; absolute risk difference, 12.5% [95% CI, 10.6%-14.3%]) and dialysis-requiring AKI (8.5% vs 3.6%) and lower rates of recovery from AKI (58% vs 69.8%). Compared with patients without COVID-19, patients with COVID-19 had higher inflammatory marker levels (C-reactive protein, ferritin) and greater use of vasopressors and diuretic agents. Compared with patients without COVID-19, patients with COVID-19 had a higher rate of AKI in univariable analysis (hazard ratio, 1.84 [95% CI, 1.73-1.95]). In a fully adjusted model controlling for demographic variables, comorbidities, vital signs, medications, and laboratory results, COVID-19 remained associated with a high rate of AKI (adjusted hazard ratio, 1.40 [95% CI, 1.29-1.53]). LIMITATIONS: Possibility of residual confounding. CONCLUSIONS: COVID-19 is associated with high rates of AKI not fully explained by adjustment for known risk factors. This suggests the presence of mechanisms of AKI not accounted for in this analysis, which may include a direct effect of COVID-19 on the kidney or other unmeasured mediators. Future studies should evaluate the possible unique pathways by which COVID-19 may cause AKI.
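The time-updated Cox model can be sketched in counting-process (start/stop) format, in which each row is an interval of a hospitalization carrying the covariate values current during that interval. The data, columns, and covariates below are illustrative, not the study's.

```python
# Counting-process sketch of a time-updated Cox model for the COVID-19 vs AKI
# association; rows, columns, and covariates are illustrative only.
import pandas as pd
from lifelines import CoxTimeVaryingFitter

# Each row covers one interval of a hospitalization (days since admission).
df = pd.DataFrame({
    "id":         [1, 1, 2, 3, 3, 4],
    "start":      [0, 2, 0, 0, 3, 0],
    "stop":       [2, 6, 4, 3, 8, 5],
    "covid":      [1, 1, 1, 0, 0, 0],   # exposure, fixed at admission
    "on_pressor": [0, 1, 0, 0, 1, 0],   # time-varying covariate
    "aki":        [0, 1, 0, 0, 1, 0],   # event indicator at interval end
})

ctv = CoxTimeVaryingFitter()
ctv.fit(df, id_col="id", event_col="aki", start_col="start", stop_col="stop")
ctv.print_summary()  # hazard ratio for covid ~ adjusted association with AKI
```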
Subjects
Acute Kidney Injury/epidemiology, COVID-19/epidemiology, Acute Kidney Injury/blood, Acute Kidney Injury/therapy, Aged, C-Reactive Protein/metabolism, COVID-19/metabolism, COVID-19/therapy, Cohort Studies, Creatinine/blood, Diuretics/therapeutic use, Female, Hospital Mortality, Humans, Intensive Care Units, Length of Stay, Male, Middle Aged, Proportional Hazards Models, Renal Dialysis, Chronic Renal Insufficiency/blood, Chronic Renal Insufficiency/epidemiology, Artificial Respiration, Risk Factors, SARS-CoV-2, Severity of Illness Index, United States/epidemiology, Vasoconstrictor Agents/therapeutic use
ABSTRACT
BACKGROUND: The electronic health record (EHR) holds the prospect of providing more complete and timely access to clinical information for biomedical research, quality assessments, and quality improvement compared to other data sources, such as administrative claims. In this study, we sought to assess the completeness and timeliness of structured diagnoses in the EHR compared to computed diagnoses for hypertension (HTN), hyperlipidemia (HLD), and diabetes mellitus (DM). METHODS: We determined the amount of time for a structured diagnosis to be recorded in the EHR from when an equivalent diagnosis could be computed from other structured data elements, such as vital signs and laboratory results. We used EHR data for encounters from January 1, 2012 through February 10, 2019 from an academic health system. Diagnoses for HTN, HLD, and DM were computed for patients with at least two observations above threshold separated by at least 30 days, where the thresholds were outpatient blood pressure of ≥ 140/90 mmHg, any low-density lipoprotein ≥ 130 mg/dl, or any hemoglobin A1c ≥ 6.5%, respectively. The primary measure was the length of time between the computed diagnosis and the time at which a structured diagnosis could be identified within the EHR history or problem list. RESULTS: We found that 39.8% of those with HTN, 21.6% with HLD, and 5.2% with DM did not receive a corresponding structured diagnosis recorded in the EHR. For those who received a structured diagnosis, a mean of 389, 198, and 166 days elapsed before the patient had the corresponding diagnosis of HTN, HLD, or DM, respectively, recorded in the EHR. CONCLUSIONS: We found a marked temporal delay between when a diagnosis can be computed or inferred and when an equivalent structured diagnosis is recorded within the EHR. These findings demonstrate the continued need for additional study of the EHR to avoid bias when using observational data and reinforce the need for computational approaches to identify clinical phenotypes.
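The computed-diagnosis rule, two above-threshold observations at least 30 days apart, and the lag to the structured diagnosis can be expressed in a few lines. The thresholds follow the abstract; the example dates and EHR entry are hypothetical.

```python
# Computed-diagnosis rule: two above-threshold observations >= 30 days apart,
# then the lag to the first structured EHR diagnosis; example data invented.
from datetime import date
from typing import Optional

def computed_diagnosis_date(obs: list[tuple[date, float]],
                            threshold: float) -> Optional[date]:
    """Date of the second qualifying observation, per the two-reading rule."""
    above = sorted(d for d, v in obs if v >= threshold)
    for first in above:
        later = [d for d in above if (d - first).days >= 30]
        if later:
            return min(later)
    return None

# Example: HbA1c >= 6.5% (the abstract's diabetes threshold).
a1c = [(date(2018, 1, 5), 6.7), (date(2018, 3, 20), 6.9)]
computed = computed_diagnosis_date(a1c, threshold=6.5)
structured = date(2018, 9, 1)                        # hypothetical EHR entry
print((structured - computed).days, "days of lag")   # -> 165 days
```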
Subjects
Diabetes Mellitus, Hypertension, Diabetes Mellitus/diagnosis, Diabetes Mellitus/epidemiology, Electronic Health Records, Humans, Hypertension/diagnosis, Hypertension/epidemiology, Information Storage and Retrieval, Outpatients
ABSTRACT
Pathogen-reduced (PR) platelets are routinely used in many countries. Some studies have reported changes in platelet and red blood cell (RBC) transfusion requirements in patients who received PR platelets compared with conventional (CONV) platelets. Over a 28-month period we retrospectively analysed platelet utilisation, RBC transfusion trends, and transfusion reaction rates for all adult patients transfused at Yale New Haven Hospital, New Haven, CT, USA. We determined the number of RBC and platelet components administered between 2 and 24, 48, 72 or 96 h. A total of 3,767 patients received 21,907 platelet components (CONV = 8,912; PR = 12,995); 1,087 patients received only CONV platelets (1,578 components) and 1,466 patients received only PR platelets (2,604 components). The number of subsequently transfused platelet components was slightly higher following PR platelet components (P < 0·05); however, fewer RBCs were transfused following PR platelet administration (P < 0·05). The mean time-to-next platelet component transfusion was slightly shorter following PR platelet transfusion (P = 0·002). The rate of non-septic transfusion reactions did not differ (all P > 0·05). Septic transfusion reactions (N = 5) were seen only after CONV platelet transfusions (P = 0·011). These results provide evidence for comparable clinical efficacy of PR and CONV platelets. PR platelets eliminated septic transfusion reactions without an increased risk of other types of transfusion reactions, with only a slight increase in platelet utilisation.
Subjects
Blood Platelets, Disinfection, Platelet Transfusion/adverse effects, Transfusion Reaction/epidemiology, Adult, Female, Humans, Male, Middle Aged
ABSTRACT
STUDY OBJECTIVE: The goal of this study is to create a predictive, interpretable model of early hospital respiratory failure among emergency department (ED) patients admitted with coronavirus disease 2019 (COVID-19). METHODS: This was an observational, retrospective, cohort study from a 9-ED health system of admitted adult patients with severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2) infection and an oxygen requirement less than or equal to 6 L/min. We sought to predict respiratory failure within 24 hours of admission as defined by oxygen requirement of greater than 10 L/min by low-flow device, high-flow device, noninvasive or invasive ventilation, or death. Predictive models were compared with the Elixhauser Comorbidity Index, quick Sequential [Sepsis-related] Organ Failure Assessment, and the CURB-65 pneumonia severity score. RESULTS: During the study period, from March 1 to April 27, 2020, 1,792 patients were admitted with COVID-19, 620 (35%) of whom had respiratory failure in the ED. Of the remaining 1,172 admitted patients, 144 (12.3%) met the composite endpoint within the first 24 hours of hospitalization. On the independent test cohort, both a novel bedside scoring system, the quick COVID-19 Severity Index (area under the receiver operating characteristic curve mean 0.81 [95% confidence interval {CI} 0.73 to 0.89]), and a machine-learning model, the COVID-19 Severity Index (mean 0.76 [95% CI 0.65 to 0.86]), outperformed the Elixhauser mortality index (mean 0.61 [95% CI 0.51 to 0.70]), CURB-65 (0.50 [95% CI 0.40 to 0.60]), and quick Sequential [Sepsis-related] Organ Failure Assessment (0.59 [95% CI 0.50 to 0.68]). A low quick COVID-19 Severity Index score was associated with a less than 5% risk of respiratory decompensation in the validation cohort. CONCLUSION: A significant proportion of admitted COVID-19 patients progress to respiratory failure within 24 hours of admission. These events are accurately predicted with bedside respiratory examination findings within a simple scoring system.
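The quick COVID-19 Severity Index is built from three bedside respiratory variables (respiratory rate, pulse oximetry, and O2 flow rate). The cutpoints in this sketch are recalled from the published score and should be treated as an assumption rather than as values quoted in this abstract.

```python
# qCSI-style bedside score; cutpoints are an assumption recalled from the
# published score, not values quoted in this abstract.
def quick_covid_severity_index(resp_rate: int, spo2_pct: int, o2_flow_lpm: int) -> int:
    score = 0
    if resp_rate > 28:        # breaths per minute
        score += 2
    elif resp_rate >= 23:
        score += 1
    if spo2_pct <= 88:        # lowest pulse oximetry, percent
        score += 5
    elif spo2_pct <= 92:
        score += 2
    if o2_flow_lpm >= 5:      # supplemental oxygen flow, L/min
        score += 5
    elif o2_flow_lpm >= 3:
        score += 4
    return score              # low scores corresponded to <5% decompensation risk

print(quick_covid_severity_index(resp_rate=24, spo2_pct=91, o2_flow_lpm=2))  # -> 3
```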
Subjects
Coronavirus Infections/complications, Coronavirus Infections/diagnosis, Hospital Emergency Service, Viral Pneumonia/complications, Viral Pneumonia/diagnosis, Respiratory Insufficiency/virology, Severity of Illness Index, Adolescent, Adult, Aged, Betacoronavirus, COVID-19, COVID-19 Testing, Clinical Laboratory Techniques, Coronavirus Infections/therapy, Female, Humans, Male, Middle Aged, Oxygen Therapy, Pandemics, Viral Pneumonia/therapy, Respiratory Insufficiency/therapy, Retrospective Studies, Risk Assessment/methods, SARS-CoV-2, Young Adult
ABSTRACT
Background: As cardiovascular risk increases in China, interest in strategies to mitigate it is growing. However, national information about the prevalence and treatment of high cardiovascular disease (CVD) risk is limited. Objective: To assess the prevalence and treatment of high CVD risk as well as variations in risk across population subgroups. Design: National project of CVD screening and management. Setting: 141 county-level regions in all 31 provinces of China. Participants: Local residents aged 35 to 75 years. Measurements: Rates of high CVD risk were assessed both in the overall study population and by age, sex, body mass index, geographic region, and socioeconomic status. Multivariable mixed models were fitted to assess the associations between individual characteristics and high CVD risk. Statin and aspirin use was evaluated among persons at high risk for CVD. Results: Among 1,680,126 participants, 9.5% (95% CI, 9.5% to 9.6%) had high risk for CVD. Mixed models identified persons who were of Han ethnicity, had medical insurance, were currently using alcohol, or were obese as more likely to be at high risk for CVD. Of those with high CVD risk, only 0.6% (CI, 0.5% to 0.6%) and 2.4% (CI, 2.3% to 2.5%) reported using statins and aspirin, respectively. Among persons with high CVD risk and hypertension, 31.8% were receiving antihypertensive medications. Limitation: Samples were not nationally representative. Conclusion: Of the 1.7 million participants, 1 in 10 had a high risk for CVD; among those at high risk, fewer than 3% were receiving statins or aspirin. An immense opportunity exists for risk mitigation in this substantial population. Primary Funding Source: Ministry of Finance and National Health Commission, China.
Subjects
Cardiovascular Diseases/epidemiology, Mass Screening/methods, Risk Assessment/methods, Adult, Aged, China/epidemiology, Female, Humans, Male, Middle Aged, Prevalence, Risk Factors, Social Class
ABSTRACT
The ongoing coronavirus disease outbreak demonstrates the need for novel applications of real-time data to produce timely information about incident cases. Using health information technology (HIT) and real-world data, we sought to produce an interface that could, in near real time, identify patients presenting with suspected respiratory tract infection and enable monitoring of test results related to specific pathogens, including severe acute respiratory syndrome coronavirus 2. This tool was built upon our computational health platform, which provides access to near real-time data from disparate HIT sources across our health system. This combination of technology allowed us to rapidly prototype, iterate, and deploy a platform to support a cohesive organizational response to a rapidly evolving outbreak. Platforms that allow for agile analytics are needed to keep pace with evolving needs within the health care system.
Subjects
Betacoronavirus, Coronavirus Infections/epidemiology, Delivery of Health Care/statistics & numerical data, Medical Informatics/methods, Viral Pneumonia/epidemiology, Public Health Surveillance/methods, COVID-19, Disease Outbreaks/statistics & numerical data, Humans, Pandemics, SARS-CoV-2, Time Factors
ABSTRACT
OBJECTIVES: To assess the safety and efficacy of a Food and Drug Administration-approved pathogen-reduced platelet (PLT) product in children, as questions regarding their use in this population remain. STUDY DESIGN: We report findings from a quality assurance review of PLT utilization, associated red blood cell transfusion trends, and short-term safety of conventional vs pathogen-reduced PLTs over a 21-month period while transitioning from conventional to pathogen-reduced PLTs at a large, tertiary care hospital. We assessed utilization in neonatal intensive care unit (NICU) patients, infants 0-1 year of age not in the NICU, and children aged 1-18 years (PED). RESULTS: In the 48 hours after an index conventional or pathogen-reduced platelet transfusion, respectively, NICU patients received 1.0 ± 1.4 (n = 91 transfusions) compared with 1.2 ± 1.3 (n = 145) additional platelet doses (P = .29); infants 0-1 year of age not in the NICU received 2.8 ± 3.0 (n = 125) vs 2.6 ± 2.6 (n = 254) additional platelet doses (P = .57); and PED patients received 0.9 ± 1.6 (n = 644) vs 1.4 ± 2.2 (n = 673) additional doses (P < .001). Time to subsequent transfusion and red cell utilization were similar in every group (P > .05). The number and type of transfusion reactions did not vary significantly by PLT type, and no rashes were reported in NICU patients receiving phototherapy and pathogen-reduced PLTs. CONCLUSIONS: Conventional and pathogen-reduced PLTs had similar utilization patterns in our pediatric populations. A small, but statistically significant, increase in transfusions was noted following pathogen-reduced PLT transfusion in PED patients, but not in other groups. Red cell utilization and transfusion reactions were similar for both products in all age groups.