A Transformer-Based Model Trained on Large Scale Claims Data for Prediction of Severe COVID-19 Disease Progression (preprint)
medRxiv; 2022.
Preprint in English | medRxiv | ID: ppzbmed-10.1101.2022.11.29.22282632
ABSTRACT
In situations like the COVID-19 pandemic, healthcare systems are under enormous pressure and can rapidly collapse under the burden of the crisis. Machine learning (ML) based risk models could lift this burden by identifying patients at high risk of severe disease progression. Electronic Health Records (EHRs) provide crucial sources of information for developing such models because they consist of routinely collected healthcare data. However, EHR data is challenging for training ML models because it contains irregularly timestamped diagnosis, prescription, and procedure codes. Transformer-based models are promising for such data. We extended the previously published Med-BERT model by including age, sex, medications, quantitative clinical measures, and state information. After pre-training on approximately 988 million EHRs from 3.5 million patients, we developed models to predict Acute Respiratory Manifestations (ARM) risk from the medical history of 80,211 COVID-19 patients. Compared to XGBoost and Random Forests, our transformer-based models more accurately forecast the risk of developing ARM after COVID-19 infection. We used Integrated Gradients and Bayesian networks to understand the links between the model's most important features. Finally, we evaluated adapting our model to Austrian in-patient data. Our study highlights the promise of predictive transformer-based models for precision medicine.
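The abstract describes a Med-BERT-style claims transformer extended with patient-level inputs (age, sex, medications, quantitative measures, state) and fine-tuned to predict ARM risk. The following is only a minimal sketch of how such an extended encoder could be assembled in PyTorch; the class name, dimensions, embedding choices, and pooling strategy are hypothetical assumptions and are not taken from the preprint.

```python
# Illustrative sketch, not the authors' implementation: a BERT-like encoder over
# claims codes, with assumed age and sex embeddings added to every position and a
# binary head for severe-progression (ARM) risk after fine-tuning.
import torch
import torch.nn as nn

class ClaimsTransformer(nn.Module):
    def __init__(self, vocab_size=30000, n_ages=120, n_sexes=3,
                 d_model=288, n_heads=6, n_layers=6, max_len=512):
        super().__init__()
        # Embeddings for diagnosis / prescription / procedure codes (0 = padding).
        self.code_emb = nn.Embedding(vocab_size, d_model, padding_idx=0)
        # Hypothetical patient-level features broadcast to every sequence position.
        self.age_emb = nn.Embedding(n_ages, d_model)
        self.sex_emb = nn.Embedding(n_sexes, d_model)
        self.pos_emb = nn.Embedding(max_len, d_model)
        layer = nn.TransformerEncoderLayer(d_model=d_model, nhead=n_heads,
                                           batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers=n_layers)
        # Binary classification head used during fine-tuning.
        self.classifier = nn.Linear(d_model, 1)

    def forward(self, codes, age, sex, pad_mask):
        # codes: (B, L) code ids; age, sex: (B,) ints; pad_mask: (B, L) bool, True = padding.
        pos = torch.arange(codes.size(1), device=codes.device).unsqueeze(0)
        x = (self.code_emb(codes) + self.pos_emb(pos)
             + self.age_emb(age).unsqueeze(1) + self.sex_emb(sex).unsqueeze(1))
        h = self.encoder(x, src_key_padding_mask=pad_mask)
        # Mean-pool over non-padded positions, then predict a single risk logit.
        keep = (~pad_mask).unsqueeze(-1).float()
        pooled = (h * keep).sum(dim=1) / keep.sum(dim=1).clamp(min=1.0)
        return self.classifier(pooled).squeeze(-1)

# Minimal usage example with random inputs.
model = ClaimsTransformer()
codes = torch.randint(1, 30000, (2, 64))
pad_mask = torch.zeros(2, 64, dtype=torch.bool)
logits = model(codes, torch.tensor([64, 71]), torch.tensor([1, 2]), pad_mask)
probs = torch.sigmoid(logits)  # per-patient ARM risk estimate
```

In a setup like this, feature attributions such as Integrated Gradients would typically be computed on the fine-tuned model's input embeddings; the abstract does not specify how the authors applied it, so this sketch leaves that step out.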
Full text: Available | Collections: Preprints | Database: medRxiv | Main subject: COVID-19 | Language: English | Publication year: 2022 | Document type: Preprint