Results 1 - 14 of 14
1.
Article in English | MEDLINE | ID: mdl-38058222

ABSTRACT

OBJECTIVE: We aimed to investigate the potential of Growth Differentiation Factor 15 (GDF-15) as a novel biomarker for disease activity in Juvenile Dermatomyositis (JDM). METHODS: We recruited children with juvenile myositis, including juvenile dermatomyositis (n = 77) and polymyositis (n = 6), along with healthy controls (n = 22). GDF-15 levels in plasma were measured using ELISA. Statistical analyses were performed using non-parametric tests. RESULTS: Levels of GDF-15 were significantly elevated in JDM compared with healthy controls (p < 0.001). GDF-15 levels exhibited strong positive correlations with disease activity scores, including the Disease Activity Score (DAS) total score, DAS skin score, DAS muscle score, and the Childhood Myositis Assessment Scale (CMAS). Additionally, GDF-15 levels could differentiate between active disease and remission based on the Physician Global Assessment of muscle score. Positive correlations were observed between levels of GDF-15 and creatine kinase, neopterin, and nailfold end row loops, indicating the potential involvement of GDF-15 in muscle damage, immune activation, and vascular pathology. ROC curve analysis showed GDF-15 to be more effective than creatine kinase in assessing disease activity in JDM (AUC 0.77, p = 0.001 and AUC 0.6369, p = 0.0738, respectively). CONCLUSION: GDF-15 may serve as a valuable biomarker for assessing disease activity in JDM. It exhibits better sensitivity and specificity than creatine kinase, and its levels correlate with various disease activity scores and functional measures. GDF-15 may provide valuable information for treatment decision-making and monitoring disease progression in JDM.
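The AUC comparison reported above rests on the standard rank interpretation of the ROC curve: the AUC equals the probability that a randomly chosen active-disease sample has a higher biomarker level than a randomly chosen remission sample (ties count one half). A minimal sketch of that computation, using hypothetical GDF-15 values rather than the study's data:

```python
from itertools import product

def roc_auc(scores, labels):
    """AUC as the probability that a randomly chosen positive case
    scores higher than a randomly chosen negative case (ties = 0.5)."""
    pos = [s for s, y in zip(scores, labels) if y == 1]
    neg = [s for s, y in zip(scores, labels) if y == 0]
    wins = sum(1.0 if p > n else 0.5 if p == n else 0.0
               for p, n in product(pos, neg))
    return wins / (len(pos) * len(neg))

# Hypothetical plasma levels; 1 = active disease, 0 = remission.
gdf15 = [1200, 950, 800, 700, 400, 350, 300, 250]
active = [1, 1, 1, 0, 1, 0, 0, 0]
print(roc_auc(gdf15, active))
```

An AUC of 0.5 corresponds to a biomarker with no discriminative value; the study's comparison of 0.77 (GDF-15) against 0.6369 (creatine kinase) is a comparison of exactly this statistic.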

2.
Cell Mol Biol (Noisy-le-grand) ; 68(10): 117-123, 2022 Sep 30.
Article in English | MEDLINE | ID: mdl-37114261

ABSTRACT

A group of protozoan parasites known as Leishmania species can cause a variety of chronic illnesses, ranging from self-healing lesions to fatal outcomes. Drug-resistant pathogens have become common due to the lack of safe and effective medications, which has spurred the development of new therapeutic interventions, particularly plant-based natural extracts. As a way to avoid the side effects of chemotherapy, natural herbal remedies have drawn increasing attention. In addition to having anti-inflammatory, anticancer, and cosmetic properties, the secondary metabolites of plants, such as phenolic compounds, flavonoids, alkaloids, and terpenes, have a number of positive effects on health. Natural metabolites with antileishmanial and antiprotozoal activity, such as naphthoquinones, alkaloids, and benzophenones, have been the subject of extensive research. This review concludes that these natural extracts can be developed into promising therapeutic agents against leishmaniasis.


Subjects
Alkaloids , Antiprotozoal Agents , Leishmania , Leishmaniasis , Humans , Plant Extracts/pharmacology , Plant Extracts/therapeutic use , Alkaloids/pharmacology , Alkaloids/therapeutic use , Leishmaniasis/drug therapy , Antiprotozoal Agents/pharmacology , Antiprotozoal Agents/therapeutic use
3.
Bioconjug Chem ; 32(10): 2154-2166, 2021 10 20.
Article in English | MEDLINE | ID: mdl-34499487

ABSTRACT

Translation of intravenously administered nanomaterials to the clinic is limited by adverse infusion reactions. While these reactions are infrequent, affecting up to 10% of patients, they can be severe and life-threatening. The complement activation pathway, one of the innate immune pathways, plays a significant role in mediating this response. Nanoparticle surface properties are a relevant design feature, as they control which blood proteins the nanoparticles interact with and determine whether the nanoparticles evade the immune reaction. PEGylation of nanosurfaces is critical for improving the blood circulation of nanoparticles and reducing opsonization. Our goal was to understand whether modifying the surface architecture by varying the PEG density and architecture can affect the complement response in vitro. We utilized block copolymers of poly(lactic acid)-b-poly(ethylene glycol) prepared with poly(ethylene glycol) macroinitiators of molecular weights 3400 and 5000 Da. Tracking the complement biomarker C5a, we monitored the impact of changing the PEGylation of the nanoparticles. We also investigated how changing the PEG length on the nanoparticle surface further strengthens the stealth properties. Lastly, we determined which cytokines change upon blood incubation with nanoparticles in vitro, to understand the extent to which inflammation may occur and the crosstalk between the complement and immune responses. Increasing PEGylation reduced the generation of the complement-mediated anaphylatoxin C5a in vitro, with 5000 Da PEG reducing levels of generated C5a more effectively than 3400 Da PEG. The insights gathered regarding the impact of PEG density and PEG chain length should prove critical for developing stealth nanoparticles that do not provoke infusion reactions upon intravenous administration.


Subjects
Opsonization , Polyesters , Lactates , Nanoparticles , Polyethylene Glycols
4.
Europace ; 23(8): 1179-1191, 2021 08 06.
Article in English | MEDLINE | ID: mdl-33564873

ABSTRACT

In the past decade, deep learning, a subset of artificial intelligence and machine learning, has been used to identify patterns in big healthcare datasets for disease phenotyping, event prediction, and complex decision making. Public datasets for electrocardiograms (ECGs) have existed since the 1980s and have been used for very specific tasks in cardiology, such as arrhythmia, ischemia, and cardiomyopathy detection. Recently, private institutions have begun curating large ECG databases that are orders of magnitude larger than the public databases for ingestion by deep learning models. These efforts have demonstrated not only improved performance and generalizability on the aforementioned tasks but also application to novel clinical scenarios. This review focuses on orienting the clinician towards the fundamental tenets of deep learning, the state of the art prior to its use for ECG analysis, and current applications of deep learning to ECGs, as well as their limitations and future areas of improvement.


Subjects
Cardiology , Deep Learning , Artificial Intelligence , Electrocardiography , Humans , Machine Learning
5.
J Gen Intern Med ; 35(10): 2838-2844, 2020 10.
Article in English | MEDLINE | ID: mdl-32815060

ABSTRACT

BACKGROUND: Data on patients with coronavirus disease 2019 (COVID-19) who return to hospital after discharge are scarce. Characterization of these patients may inform post-hospitalization care. OBJECTIVE: To describe clinical characteristics of patients with COVID-19 who returned to the emergency department (ED) or required readmission within 14 days of discharge. DESIGN: Retrospective cohort study of SARS-CoV-2-positive patients with index hospitalization between February 27 and April 12, 2020, with ≥ 14-day follow-up. Significance was defined as P < 0.05 after multiplying P by the 125 study-wide comparisons. PARTICIPANTS: Hospitalized patients with confirmed SARS-CoV-2 discharged alive from five New York City hospitals. MAIN MEASURES: Readmission or return to the ED following discharge. RESULTS: Of 2864 discharged patients, 103 (3.6%) returned for emergency care after a median of 4.5 days, with 56 requiring inpatient readmission. The most common reason for return was respiratory distress (50%). Compared with patients who did not return, those who returned had higher proportions of COPD (6.8% vs 2.9%) and hypertension (36% vs 22.1%). Patients who returned also had a shorter median length of stay (LOS) during index hospitalization (4.5 [2.9, 9.1] vs 6.7 [3.5, 11.5] days; adjusted P = 0.006) and were less likely to have required intensive care on index hospitalization (5.8% vs 19%; adjusted P = 0.001). A trend towards an association between absence of in-hospital treatment-dose anticoagulation on index admission and return to hospital was also observed (20.9% vs 30.9%; adjusted P = 0.06). On readmission, rates of intensive care and death were 5.8% and 3.6%, respectively. CONCLUSIONS: Return to hospital after admission for COVID-19 was infrequent within 14 days of discharge. The most common cause for return was respiratory distress. Patients who returned were more likely to have COPD and hypertension, had a shorter LOS on index hospitalization, and had lower rates of in-hospital treatment-dose anticoagulation. Future studies should focus on whether these comorbid conditions, longer LOS, and anticoagulation are associated with reduced readmissions.
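The study-wide significance rule described above (P < 0.05 after multiplying each raw P by the 125 comparisons) is a Bonferroni correction, which reduces to one line of arithmetic. A sketch:

```python
def bonferroni(p_raw, n_comparisons=125):
    """Study-wide Bonferroni adjustment: multiply the raw p-value by the
    number of comparisons and cap the result at 1.0."""
    return min(1.0, p_raw * n_comparisons)

# A raw p of 0.0004 becomes 0.0004 * 125 = 0.05: borderline study-wide.
print(bonferroni(0.0004))
# A raw p of 0.02 becomes 2.5 and is capped at 1.0: clearly non-significant.
print(bonferroni(0.02))
```

This is why an adjusted P of 0.06 in the abstract is reported only as a "trend": the raw P was small, but not small enough to survive 125 comparisons.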


Subjects
Coronavirus Infections/epidemiology , Emergency Service, Hospital/statistics & numerical data , Patient Readmission/statistics & numerical data , Pneumonia, Viral/epidemiology , Aged , Anticoagulants/administration & dosage , Betacoronavirus , COVID-19 , Case-Control Studies , Comorbidity , Coronavirus Infections/therapy , Female , Humans , Hypertension/epidemiology , Length of Stay/statistics & numerical data , Male , Middle Aged , New York City/epidemiology , Pandemics , Pneumonia, Viral/therapy , Pulmonary Disease, Chronic Obstructive/epidemiology , Respiratory Distress Syndrome/epidemiology , Retrospective Studies , SARS-CoV-2
6.
J Med Internet Res ; 22(11): e24018, 2020 11 06.
Article in English | MEDLINE | ID: mdl-33027032

ABSTRACT

BACKGROUND: COVID-19 has infected millions of people worldwide and is responsible for several hundred thousand fatalities. The COVID-19 pandemic has necessitated thoughtful resource allocation and early identification of high-risk patients. However, effective methods to meet these needs are lacking. OBJECTIVE: The aims of this study were to analyze the electronic health records (EHRs) of patients who tested positive for COVID-19 and were admitted to hospitals in the Mount Sinai Health System in New York City; to develop machine learning models for making predictions about the hospital course of the patients over clinically meaningful time horizons based on patient characteristics at admission; and to assess the performance of these models at multiple hospitals and time points. METHODS: We used Extreme Gradient Boosting (XGBoost) and baseline comparator models to predict in-hospital mortality and critical events at time windows of 3, 5, 7, and 10 days from admission. Our study population included harmonized EHR data from five hospitals in New York City for 4098 COVID-19-positive patients admitted from March 15 to May 22, 2020. The models were first trained on patients from a single hospital (n=1514) before or on May 1, externally validated on patients from four other hospitals (n=2201) before or on May 1, and prospectively validated on all patients after May 1 (n=383). Finally, we established model interpretability to identify and rank variables that drive model predictions. RESULTS: Upon cross-validation, the XGBoost classifier outperformed baseline models, with an area under the receiver operating characteristic curve (AUC-ROC) for mortality of 0.89 at 3 days, 0.85 at 5 and 7 days, and 0.84 at 10 days. XGBoost also performed well for critical event prediction, with an AUC-ROC of 0.80 at 3 days, 0.79 at 5 days, 0.80 at 7 days, and 0.81 at 10 days. In external validation, XGBoost achieved an AUC-ROC of 0.88 at 3 days, 0.86 at 5 days, 0.86 at 7 days, and 0.84 at 10 days for mortality prediction. Similarly, the unimputed XGBoost model achieved an AUC-ROC of 0.78 at 3 days, 0.79 at 5 days, 0.80 at 7 days, and 0.81 at 10 days. Trends in performance on prospective validation sets were similar. At 7 days, acute kidney injury on admission, elevated LDH, tachypnea, and hyperglycemia were the strongest drivers of critical event prediction, while higher age, anion gap, and C-reactive protein were the strongest drivers of mortality prediction. CONCLUSIONS: We externally and prospectively trained and validated machine learning models for mortality and critical events for patients with COVID-19 at different time horizons. These models identified at-risk patients and uncovered underlying relationships that predicted outcomes.
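The fixed prediction horizons described above imply a labeling step: each admission is assigned one binary outcome per window, depending on whether the event occurred within 3, 5, 7, or 10 days of admission. A minimal sketch of that labeling logic, with hypothetical dates (the study's actual preprocessing is not described in this abstract):

```python
from datetime import datetime, timedelta

def window_labels(admit, event_time, horizons=(3, 5, 7, 10)):
    """For each horizon (days from admission), label 1 if the event
    (e.g., death or a critical event) occurred within that window,
    else 0. A patient with no event (None) is 0 at every horizon."""
    return {h: int(event_time is not None
                   and event_time <= admit + timedelta(days=h))
            for h in horizons}

admit = datetime(2020, 4, 1)
death = datetime(2020, 4, 5, 12)   # hypothetical: 4.5 days after admission
print(window_labels(admit, death))
```

A death at 4.5 days is a negative for the 3-day model but a positive for the 5-, 7-, and 10-day models, which is why the same cohort yields a different AUC-ROC at each horizon.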


Subjects
Coronavirus Infections/diagnosis , Coronavirus Infections/mortality , Machine Learning/standards , Pneumonia, Viral/diagnosis , Pneumonia, Viral/mortality , Acute Kidney Injury/epidemiology , Adolescent , Adult , Aged , Aged, 80 and over , Betacoronavirus , COVID-19 , Cohort Studies , Electronic Health Records , Female , Hospital Mortality , Hospitalization/statistics & numerical data , Hospitals , Humans , Male , Middle Aged , New York City/epidemiology , Pandemics , Prognosis , ROC Curve , Risk Assessment/methods , Risk Assessment/standards , SARS-CoV-2 , Young Adult
7.
Am J Nephrol ; 47(6): 441-449, 2018.
Article in English | MEDLINE | ID: mdl-29895030

ABSTRACT

BACKGROUND: Various medications are cleared by the kidneys; therefore, patients with impaired renal function, especially dialysis patients, are at risk for adverse drug events (ADEs). There are limited studies on ADEs in maintenance dialysis patients. METHODS: We utilized a nationally representative database, the Nationwide Emergency Department Sample, from 2008 to 2013 to compare emergency department (ED) visits for dialysis and propensity-matched non-dialysis patients. Log-binomial regression was used to calculate the relative risk of hospital admission, and logistic regression to calculate ORs for in-hospital mortality, while adjusting for patient and hospital characteristics. RESULTS: While ED visits for ADEs decreased in both groups, they were over 10-fold higher in dialysis patients than in non-dialysis patients (65.8-88.5 per 1,000 patients vs. 4.6-5.4 per 1,000 patients, respectively; p < 0.001). The top medication category associated with ED visits for ADEs in dialysis patients was agents primarily affecting blood constituents, and visits associated with this category increased over the study period. After propensity matching, hospital admission was more frequent in dialysis patients than in non-dialysis patients (88 vs. 76%, p < 0.001). Dialysis was associated with a 3% increase in risk of admission and 3 times the odds of in-hospital mortality (adjusted OR 3.0, 95% CI 2.7-3.3). CONCLUSIONS: ED visits for ADEs are substantially higher in dialysis patients than in non-dialysis patients. In dialysis patients, ADEs associated with agents primarily affecting blood constituents are on the rise, and ED visits for ADEs carry higher rates of inpatient admission and in-hospital mortality. Further studies are needed to identify and implement measures aimed at reducing ADEs in dialysis patients.
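The log-binomial model used above reports effects on the relative-risk scale. For a single 2x2 comparison, the crude relative risk and its normal-approximation 95% CI can be computed directly; the sketch below uses hypothetical counts chosen only to echo the reported 88% vs 76% admission proportions, not the matched study data:

```python
import math

def relative_risk(exposed_events, exposed_total, control_events, control_total):
    """Crude risk ratio with a 95% CI via the log-RR normal approximation."""
    rr = (exposed_events / exposed_total) / (control_events / control_total)
    se = math.sqrt(1 / exposed_events - 1 / exposed_total
                   + 1 / control_events - 1 / control_total)
    lo, hi = (math.exp(math.log(rr) + z * se) for z in (-1.96, 1.96))
    return rr, lo, hi

# Hypothetical: 880/1000 dialysis ED visits admitted vs 760/1000 matched controls.
rr, lo, hi = relative_risk(880, 1000, 760, 1000)
print(round(rr, 3), round(lo, 3), round(hi, 3))
```

A regression model would additionally adjust for patient and hospital characteristics, as the study did; this crude calculation shows only the scale the estimates live on.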


Subjects
Drug-Related Side Effects and Adverse Reactions/epidemiology , Emergency Service, Hospital/statistics & numerical data , Facilities and Services Utilization/statistics & numerical data , Facilities and Services Utilization/trends , Hospitalization/statistics & numerical data , Renal Dialysis , Female , Humans , Male , Middle Aged , United States
8.
J Prim Care Community Health ; 15: 21501319231223437, 2024.
Article in English | MEDLINE | ID: mdl-38185870

ABSTRACT

INTRODUCTION/OBJECTIVE: The KidneyIntelX is a multiplex, bioprognostic immunoassay consisting of 3 plasma biomarkers and clinical variables that uses machine learning to predict a patient's risk of a progressive decline in kidney function over 5 years. We report the 1-year pre- and post-test clinical impact on care management, eGFR slope, and A1C, along with engagement of population health clinical pharmacists and patient coordinators, to promote a program of sustainable kidney, metabolic, and cardiac health. METHODS: The KidneyIntelX in vitro prognostic test, previously validated in patients with type 2 diabetes and diabetic kidney disease (DKD) to predict kidney function decline within 5 years, was introduced into a real-world evidence (RWE) study (NCT04802395) across the Health System as part of a population health chronic disease management program from November 2020 to April 2023. Pre- and post-test patients with a minimum of 12 months of follow-up post KidneyIntelX were assessed across all aspects of the program. RESULTS: A total of 5348 patients with DKD had a KidneyIntelX assay. The median age was 68 years, 52% were female, 27% self-identified as Black, and 89% had hypertension. The median baseline eGFR was 62 ml/min/1.73 m2, urine albumin-creatinine ratio (UACR) was 54 mg/g, and A1C was 7.3%. The KidneyIntelX risk level was low in 49%, intermediate in 40%, and high in 11% of cases. New prescriptions for SGLT2i or GLP-1 RA, or referral to a specialist, were noted in 19%, 33%, and 43% of low-, intermediate-, and high-risk patients, respectively. The median A1C decreased from 8.2% pre-test to 7.5% post-test in the high-risk group (P < .001). UACR levels in intermediate-risk patients with albuminuria were reduced by 20%, and in a subgroup treated with new prescriptions for SGLT2i, UACR levels were lowered by approximately 50%. The median eGFR slope improved from -7.08 ml/min/1.73 m2/year to -4.27 ml/min/1.73 m2/year in high-risk patients (P = .0003), from -2.65 to -1.04 in intermediate-risk patients, and from -3.26 ml/min/1.73 m2/year to +0.45 ml/min/1.73 m2/year in low-risk patients (P < .001). CONCLUSIONS: Deployment and risk stratification by KidneyIntelX was associated with an escalation in action taken to optimize cardio-kidney-metabolic health, including medications and specialist referrals. Glycemic control and kidney function trajectories improved post-KidneyIntelX testing, with the greatest improvements observed in those scored as high risk.
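The eGFR slopes quoted above (ml/min/1.73 m2 per year) are, in their simplest form, least-squares slopes of serial eGFR measurements regressed against time. An illustrative sketch with made-up readings (the study does not specify its slope-estimation method in this abstract, and mixed-effects models are common in practice):

```python
def slope_per_year(times_years, egfr_values):
    """Ordinary least-squares slope of eGFR (ml/min/1.73 m2) vs. time in years."""
    n = len(times_years)
    mean_t = sum(times_years) / n
    mean_e = sum(egfr_values) / n
    num = sum((t - mean_t) * (e - mean_e)
              for t, e in zip(times_years, egfr_values))
    den = sum((t - mean_t) ** 2 for t in times_years)
    return num / den

# Hypothetical quarterly readings declining 1 ml/min/1.73 m2 per quarter.
t = [0.0, 0.25, 0.5, 0.75, 1.0]
egfr = [62.0, 61.0, 60.0, 59.0, 58.0]
print(slope_per_year(t, egfr))  # -4.0
```

On this scale, a change from -7.08 to -4.27 in the high-risk group means kidney function was still declining after testing, but markedly more slowly.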


Subjects
Diabetes Mellitus, Type 2 , Diabetic Nephropathies , Humans , Female , Aged , Male , Diabetic Nephropathies/therapy , Diabetes Mellitus, Type 2/complications , Diabetes Mellitus, Type 2/therapy , Glycated Hemoglobin , Precision Medicine , Albuminuria
9.
Chem Biol Drug Des ; 102(2): 332-356, 2023 08.
Article in English | MEDLINE | ID: mdl-36872849

ABSTRACT

In tropical and subtropical regions of the world, leishmaniasis is endemic and causes a range of clinical manifestations, from severe tegumentary forms (such as cutaneous, mucocutaneous, and diffuse leishmaniasis) to lethal visceral forms. Leishmaniasis, caused by protozoan parasites of the genus Leishmania, remains a significant public health issue according to the World Health Organization (2022). Public concern about this neglected tropical disease is growing as new foci of the illness arise, exacerbated by behavioral changes, environmental change, and an enlarged range of sand fly vectors. Leishmania research has advanced significantly during the past three decades along several avenues. Despite numerous studies on Leishmania, many issues, such as disease control, parasite resistance, and parasite clearance, remain unresolved. This paper comprehensively discusses the key virulence factors that shape the host-pathogen relationship and pathogenicity of the parasite. Important Leishmania virulence factors, such as Kinetoplastid Membrane Protein-11 (KMP-11), Leishmanolysin (GP63), Proteophosphoglycan (PPG), Lipophosphoglycan (LPG), and Glycosylinositol Phospholipids (GIPL), influence the pathophysiology of the disease and enable the parasite to spread infection. Because infection depends on these virulence factors, targeting them with medications or vaccines could make treatment more prompt and might greatly shorten its duration. Additionally, we present modeled structures of several putative virulence factors that might aid in the development of new chemotherapeutic approaches for the treatment of leishmaniasis. The predicted structures of these virulence proteins can be used to design novel drugs, therapeutic targets, and immunizations, drawing considerable advantage from an improved understanding of the host immune response.


Subjects
Leishmania , Leishmaniasis , Humans , Leishmaniasis/drug therapy , Host-Pathogen Interactions , Virulence Factors
10.
J Prim Care Community Health ; 13: 21501319221138196, 2022.
Article in English | MEDLINE | ID: mdl-36404761

ABSTRACT

INTRODUCTION AND OBJECTIVE: The lack of precision in identifying patients with early-stage diabetic kidney disease (DKD) at near-term risk of progressive decline in kidney function results in poor disease management, often leading to kidney failure requiring unplanned dialysis. The KidneyIntelX is a multiplex, bioprognostic immunoassay consisting of 3 plasma biomarkers and clinical variables that uses machine learning to generate a risk score for progressive decline in kidney function over 5 years in adults with early-stage DKD. Our objective was to assess the impact of KidneyIntelX on management and outcomes in a Health System in a real-world evidence (RWE) study. METHODS: KidneyIntelX was introduced into a large metropolitan Health System via a population health-defined approved care pathway for patients with stages 1 to 3 DKD between November 2020 and March 2022. Decision impact on visit frequency, medication management, specialist referral, and selected lab values was assessed. We performed an interim analysis of patients through 6 months post-test date to evaluate the association of risk level with clinical decision-making and outcomes. RESULTS: A total of 1686 patients were enrolled in the RWE study and underwent KidneyIntelX testing and subsequent care pathway management. The median age was 68 years, 52% were female, 26% self-identified as Black, and 94% had hypertension. The median baseline eGFR was 59 ml/minute/1.73 m2, urine albumin-creatinine ratio was 69 mg/g, and HbA1c was 7.7%. After testing, a clinical encounter in the first month occurred in 13%, 43%, and 53% of low-risk, intermediate-risk, and high-risk patients, respectively, and 46%, 61%, and 71% had at least 1 action taken within the first 6 months. High-risk patients were more likely to be placed on SGLT2 inhibitors (OR = 4.56; 95% CI 3.00-6.91) and more likely to be referred to a specialist such as a nephrologist, endocrinologist, or dietitian (OR = 2.49; 95% CI 1.53-4.01), compared with low-risk patients. CONCLUSIONS: The combination of KidneyIntelX, clinical guidelines, and educational support resulted in changes in clinical management by clinicians. After testing, there was an increase in visit frequency, referrals for disease management, and introduction of guideline-recommended medications. These differed by risk category, indicating an impact of KidneyIntelX risk stratification on clinical care.
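The odds ratios with 95% CIs reported above can be illustrated with the crude Woolf method on a 2x2 table. The counts below are hypothetical and do not reproduce the study's OR of 4.56, which would come from the actual (possibly adjusted) group sizes:

```python
import math

def odds_ratio(a, b, c, d):
    """Crude OR for the 2x2 table [[a, b], [c, d]] with a Woolf 95% CI:
    exp(ln(OR) +/- 1.96 * sqrt(1/a + 1/b + 1/c + 1/d))."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    lo, hi = (math.exp(math.log(or_) + z * se) for z in (-1.96, 1.96))
    return or_, lo, hi

# Hypothetical: SGLT2i started in 60 of 140 high-risk patients
# vs 40 of 360 low-risk patients.
or_, lo, hi = odds_ratio(60, 80, 40, 320)
print(round(or_, 1))  # 6.0
```

A CI that excludes 1.0, as both reported intervals do, indicates an association beyond the 5% significance threshold.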


Subjects
Diabetes Mellitus , Diabetic Nephropathies , Sodium-Glucose Transporter 2 Inhibitors , Adult , Humans , Female , Aged , Male , Diabetic Nephropathies/drug therapy , Sodium-Glucose Transporter 2 Inhibitors/therapeutic use , Biomarkers , Renal Dialysis , Risk Factors , Diabetes Mellitus/drug therapy
11.
Asian Pac J Cancer Prev ; 22(9): 2781-2788, 2021 Sep 01.
Article in English | MEDLINE | ID: mdl-34582646

ABSTRACT

OBJECTIVE: This study aims to assess the correlation of exhaled CO and nicotine dependence with the occurrence of oral mucosal lesions, while also taking into consideration socio-demographic, clinical, and anthropometric characteristics of participants. METHODS: An observational cross-sectional study was carried out among smokers who visited the tobacco cessation center at a tertiary care dental hospital in Goa, India. An intra-oral soft tissue examination for detecting the presence of oral mucosal lesions was followed by a questionnaire-based interview measuring exposure, sociodemographic factors, body mass index, cooking habits, and nicotine dependence. Exhaled CO levels were measured with a CO breath analyzer. Statistical analysis was performed using IBM SPSS version 20.0. Descriptive statistics were calculated, and multivariable analysis was done to assess the association of different variables with oral mucosal lesions and carbon monoxide levels. A p-value ≤ 0.05 was considered statistically significant. RESULTS: Of the 173 subjects enrolled in the study, 69.36% were without any lesions, while 30.63% were diagnosed with a lesion. In the regression analysis, physical activity (present vs absent, OR: 5.808), exhaled CO levels (OR: 1.098), and nicotine dependence (mild vs moderate, OR: 6.518) were significant risk factors for the presence of oral mucosal lesions. Smokers who used both cigarettes and bidis exhibited the highest mean exhaled CO values (19.67 ± 1.506 ppm). Exhaled CO levels were significantly higher in smokers who were overweight (14.96 ± 9.14 ppm), physically inactive (13.98 ± 8.26 ppm), highly nicotine dependent (20.67 ± 8.30 ppm), and used coal for cooking (12.55 ± 8.17 ppm). CONCLUSION: A robust correlation between exhaled CO levels, nicotine dependence, and the occurrence of oral mucosal lesions was established. The multifactorial nature of exhaled CO, which is affected by smoked tobacco as well as by physical activity, BMI, cooking habits, and type of smoking habit, should be noted.


Subjects
Carbon Monoxide/analysis , Exhalation , Mouth Neoplasms/epidemiology , Tobacco Use Disorder/epidemiology , Adult , Breath Tests , Cross-Sectional Studies , Dental Service, Hospital , Female , Humans , Incidence , India/epidemiology , Male , Middle Aged , Risk Factors , Smokers/statistics & numerical data , Surveys and Questionnaires , Tertiary Care Centers
12.
ArXiv ; 2021 Jan 11.
Article in English | MEDLINE | ID: mdl-33442560

ABSTRACT

Machine Learning (ML) models typically require large-scale, balanced training data to be robust, generalizable, and effective in the context of healthcare. This has been a major issue for developing ML models for the coronavirus disease 2019 (COVID-19) pandemic, where data are highly imbalanced, particularly within electronic health record (EHR) research. Conventional approaches in ML use cross-entropy loss (CEL), which often suffers from poor margin classification. For the first time, we show that contrastive loss (CL) improves performance over CEL, especially for imbalanced EHR data and the related COVID-19 analyses. This study was approved by the Institutional Review Board at the Icahn School of Medicine at Mount Sinai. We use EHR data from five hospitals within the Mount Sinai Health System (MSHS) to predict mortality, intubation, and intensive care unit (ICU) transfer in hospitalized COVID-19 patients over 24- and 48-hour time windows. We train two sequential architectures (RNN and RETAIN) using two loss functions (CEL and CL). Models are tested on a full data set containing all available data and on a restricted data set that emulates higher class imbalance. CL models consistently outperform CEL models on the restricted data set for these tasks, with differences ranging from 0.04 to 0.15 for AUPRC and 0.05 to 0.1 for AUROC. For the restricted sample, only the CL model maintains proper clustering and is able to identify important features, such as pulse oximetry. CL outperforms CEL in instances of severe class imbalance on three EHR outcomes with respect to three performance metrics: predictive power, clustering, and feature importance. We believe that the developed CL framework can be expanded and used for EHR ML work in general.
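Contrastive loss comes in several variants, and this abstract does not specify which one was used. One classic pair-based formulation (a Hadsell-style margin loss, shown here only to illustrate the idea) pulls same-class embeddings together via squared distance and pushes different-class embeddings apart up to a margin:

```python
import math

def pairwise_contrastive_loss(z1, z2, same_class, margin=1.0):
    """Pair-based contrastive loss on two embedding vectors:
    same-class pairs pay the squared Euclidean distance; different-class
    pairs pay only when closer than the margin."""
    d = math.sqrt(sum((a - b) ** 2 for a, b in zip(z1, z2)))
    if same_class:
        return d ** 2
    return max(0.0, margin - d) ** 2

# Identical same-class embeddings incur no loss...
print(pairwise_contrastive_loss([0.1, 0.2], [0.1, 0.2], True))  # 0.0
# ...while nearby different-class embeddings are penalized.
print(pairwise_contrastive_loss([0.1, 0.2], [0.2, 0.2], False))
```

The intuition for imbalanced data is that the loss is driven by the geometry of pairs rather than per-class frequencies, which encourages the wider classification margins and cleaner clustering the abstract describes.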

13.
Patterns (N Y) ; 2(12): 100389, 2021 Dec 10.
Article in English | MEDLINE | ID: mdl-34723227

ABSTRACT

Deep learning (DL) models typically require large-scale, balanced training data to be robust, generalizable, and effective in the context of healthcare. This has been a major issue for developing DL models for the coronavirus disease 2019 (COVID-19) pandemic, where data are highly class imbalanced. Conventional approaches in DL use cross-entropy loss (CEL), which often suffers from poor margin classification. We show that contrastive loss (CL) improves performance over CEL, especially in imbalanced electronic health record (EHR) data for COVID-19 analyses. We use a diverse EHR dataset to predict three outcomes: mortality, intubation, and intensive care unit (ICU) transfer in hospitalized COVID-19 patients over multiple time windows. To compare the performance of CEL and CL, models are tested on the full dataset and a restricted dataset. CL models consistently outperform CEL models, with differences ranging from 0.04 to 0.15 for area under the precision-recall curve (AUPRC) and 0.05 to 0.1 for area under the receiver operating characteristic curve (AUROC).

14.
medRxiv ; 2020 May 22.
Article in English | MEDLINE | ID: mdl-32511547

ABSTRACT

Background: Data on patients with coronavirus disease 2019 (COVID-19) who return to hospital after discharge are scarce. Characterization of these patients may inform post-hospitalization care. Methods and Findings: Retrospective cohort study of patients with confirmed SARS-CoV-2 discharged alive from five hospitals in New York City, with index hospitalization between February 27 and April 12, 2020, and follow-up of ≥14 days. Significance was defined as P < 0.05 after multiplying P by the 125 study-wide comparisons. Of 2,864 discharged patients, 103 (3.6%) returned for emergency care after a median of 4.5 days, with 56 requiring inpatient readmission. The most common reason for return was respiratory distress (50%). Compared with patients who did not return, those who returned had a higher proportion of COPD (6.8% vs 2.9%) and hypertension (36% vs 22.1%). Patients who returned also had a shorter median length of stay (LOS) during index hospitalization (4.5 [2.9, 9.1] vs 6.7 [3.5, 11.5] days; adjusted P = 0.006) and were less likely to have required intensive care on index hospitalization (5.8% vs 19%; adjusted P = 0.001). A trend towards an association between absence of in-hospital anticoagulation on index admission and return to hospital was also observed (20.9% vs 30.9%; adjusted P = 0.064). On readmission, rates of intensive care and death were 5.8% and 3.6%, respectively. Conclusions: Return to hospital after admission for COVID-19 was infrequent within 14 days of discharge. The most common cause for return was respiratory distress. Patients who returned had a higher proportion of COPD and hypertension, a shorter LOS on index hospitalization, and a trend towards lower rates of in-hospital treatment-dose anticoagulation. Future studies should focus on whether these comorbid conditions, longer LOS, and anticoagulation are associated with reduced readmissions.
