ABSTRACT
BACKGROUND: Atrial fibrillation (AF) is the most common heart arrhythmia worldwide and is linked to a higher risk of mortality and morbidity. To predict AF and AF-related complications, clinical risk scores are commonly employed, but their predictive accuracy is generally limited, given the inherent complexity and heterogeneity of patients with AF. By classifying different presentations of AF into coherent and manageable clinical phenotypes, the development of tailored prevention and treatment strategies can be facilitated. In this study, we propose an artificial intelligence (AI)-based methodology to derive meaningful clinical phenotypes of AF in the general and critical care populations. METHODS: Our approach employs generative topographic mapping, a probabilistic machine learning method, to identify micro-clusters of patients with similar characteristics. It then identifies macro-cluster regions (clinical phenotypes) in the latent space using Ward's minimum variance method. We applied it to two large cohort databases (UK Biobank and MIMIC-IV) representing general and critical care populations. FINDINGS: The proposed methodology showed its ability to derive meaningful clinical phenotypes of AF. Because of its probabilistic foundations, it can enhance the robustness of patient stratification. It also produced interpretable visualisations of complex high-dimensional data, enhancing understanding of the derived phenotypes and their key characteristics. Using our methodology, we identified and characterised clinical phenotypes of AF across diverse patient populations. INTERPRETATION: Our methodology is robust to noise, can uncover hidden patterns and subgroups, and can elucidate more specific patient profiles, contributing to more robust patient stratification, which could facilitate the tailoring of prevention and treatment programmes specific to each phenotype. It can also be applied to other datasets to derive clinically meaningful phenotypes of other conditions.
FUNDING: This study was funded by the DECIPHER project (LJMU QR-PSF) and the EU project TARGET (10113624).
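The two-stage micro-to-macro clustering described above can be sketched as follows. Generative topographic mapping is not available in scikit-learn, so this illustration substitutes k-means micro-clusters for the GTM latent grid; Ward's minimum variance linkage then merges micro-clusters into candidate phenotypes. The data, cluster counts, and all parameters here are illustrative stand-ins, not the study's implementation.

```python
import numpy as np
from sklearn.cluster import KMeans, AgglomerativeClustering

rng = np.random.default_rng(0)
# synthetic stand-in for patient features (n_patients x n_variables)
X = np.vstack([rng.normal(m, 0.5, size=(100, 4)) for m in (0.0, 3.0, 6.0)])

# Stage 1: micro-clusters of similar patients (stand-in for GTM reference vectors)
micro = KMeans(n_clusters=30, n_init=10, random_state=0).fit(X)

# Stage 2: Ward's minimum variance method on micro-cluster centres -> phenotypes
ward = AgglomerativeClustering(n_clusters=3, linkage="ward")
macro_of_micro = ward.fit_predict(micro.cluster_centers_)

# Map each patient to a phenotype via its micro-cluster
phenotype = macro_of_micro[micro.labels_]
print("patients per phenotype:", np.bincount(phenotype).tolist())
```

In practice the number of macro-clusters would be chosen from the dendrogram or a stability criterion rather than fixed in advance.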
Subjects
Artificial Intelligence , Atrial Fibrillation , Critical Care , Phenotype , Atrial Fibrillation/diagnosis , Humans , Critical Care/methods , Machine Learning , Female , Algorithms , Male
ABSTRACT
Background We aim to determine which electrocardiogram (ECG) data format is optimal for machine learning (ML) modelling, in the context of myocardial infarction prediction. We will also address the auxiliary objective of evaluating the viability of using digitised ECG signals for ML modelling. Methods Two ECG arrangements displaying 10 s and 2.5 s of data for each lead were used. For each arrangement, conservative and speculative data cohorts were generated from the PTB-XL dataset. All ECGs were represented in three different data formats: Signal ECGs, Image ECGs, and Extracted Signal ECGs, with 8358 and 11,621 ECGs in the conservative and speculative cohorts, respectively. ML models were trained using the three data formats in both data cohorts. Results For ECGs that contained 10 s of data, Signal and Extracted Signal ECGs were optimal and statistically similar, with AUCs [95% CI] of 0.971 [0.961, 0.981] and 0.974 [0.965, 0.984], respectively, for the conservative cohort; and 0.931 [0.918, 0.945] and 0.919 [0.903, 0.934], respectively, for the speculative cohort. For ECGs that contained 2.5 s of data, the Image ECG format was optimal, with AUCs of 0.960 [0.948, 0.973] and 0.903 [0.886, 0.920], for the conservative and speculative cohorts, respectively. Conclusion When available, Signal ECG data should be preferred for ML modelling. If not, the optimal format depends on the data arrangement within the ECG: if the Image ECG contains 10 s of data for each lead, the Extracted Signal ECG is optimal; however, if it only uses 2.5 s, then using the Image ECG data is optimal for ML performance.
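The AUC comparisons above quote 95% confidence intervals; one common way to obtain them is the percentile bootstrap sketched here. The labels and scores are synthetic stand-ins for real model outputs on PTB-XL, and the bootstrap settings are illustrative.

```python
import numpy as np
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(42)
y = rng.integers(0, 2, size=500)                 # synthetic ground truth
scores = y * 1.0 + rng.normal(0, 0.8, size=500)  # informative but noisy scores

boot = []
for _ in range(1000):
    idx = rng.integers(0, len(y), size=len(y))   # resample with replacement
    if len(np.unique(y[idx])) < 2:               # resample must hold both classes
        continue
    boot.append(roc_auc_score(y[idx], scores[idx]))

auc = roc_auc_score(y, scores)
lo, hi = np.percentile(boot, [2.5, 97.5])        # percentile 95% CI
print(f"AUC {auc:.3f} [{lo:.3f}, {hi:.3f}]")
```

Comparing two models on the same test set, as the study does, would normally bootstrap the paired difference in AUC rather than the two intervals separately.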
Subjects
Electrocardiography , Machine Learning , Myocardial Infarction , Electrocardiography/methods , Humans , Myocardial Infarction/diagnosis , Predictive Value of Tests
ABSTRACT
BACKGROUND: The association between dietary iron intake and the risk of type 2 diabetes mellitus (T2DM) remains inconsistent. In this study, we aimed to investigate the relationship between trajectories of dietary iron intake and risk of T2DM. METHODS: This study comprised a total of 61,115 participants without prior T2DM from the UK Biobank database. We used the group-based trajectory model (GBTM) to identify different dietary iron intake trajectories. Cox proportional hazards models were used to evaluate the relationship between trajectories of dietary iron intake and risk of T2DM. RESULTS: During a mean follow-up of 4.8 years, a total of 677 T2DM events were observed. Four trajectory groups of dietary iron intake were characterized by the GBTM: trajectory group 1 (with a mean dietary iron intake of 10.9 mg/day), 2 (12.3 mg/day), 3 (14.1 mg/day) and 4 (17.6 mg/day). Trajectory group 3 was significantly associated with a 38% decreased risk of T2DM when compared with trajectory group 1 (hazard ratio [HR] = 0.62, 95% confidence interval [CI]: 0.49-0.79), while group 4 was significantly associated with a 30% risk reduction (HR = 0.70, 95% CI: 0.54-0.91). Significant effect modifications by obesity (p = 0.04) and history of cardiovascular disease (p < 0.01) were found for the relationship between trajectories of dietary iron intake and the risk of T2DM. CONCLUSIONS: We found that trajectories of dietary iron intake were significantly associated with the risk of T2DM, with the lowest T2DM risk observed in trajectory group 3, which had a mean iron intake of 14.1 mg/day. These findings may highlight the importance of adequate dietary iron intake to T2DM prevention from a public health perspective. Further studies are needed to assess the relationship between dietary iron intake and risk of T2DM, as well as intervention studies to mitigate the risks of T2DM associated with dietary iron changes.
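The Cox proportional hazards step used above to compare trajectory groups can be sketched from first principles: maximise the partial likelihood and read off the hazard ratio as exp(beta). This is a minimal single-covariate illustration on synthetic data (a dedicated survival library such as lifelines would normally be used); the simulated hazard ratio of roughly 0.61 is an arbitrary stand-in, not the study's estimate.

```python
import numpy as np
from scipy.optimize import minimize_scalar

rng = np.random.default_rng(1)
n = 2000
x = rng.integers(0, 2, size=n)              # 1 = hypothetical higher-iron group
t = rng.exponential(scale=np.exp(0.5 * x))  # true hazard ratio exp(-0.5) ~ 0.61
event = rng.random(n) < 0.8                 # ~20% randomly censored

def neg_log_partial_likelihood(beta):
    order = np.argsort(t)                   # ascending survival times
    xs, es = x[order], event[order]
    eta = beta * xs
    # log of the risk-set sum at each time: cumulative logsumexp from the end
    log_risk = np.logaddexp.accumulate(eta[::-1])[::-1]
    return -np.sum(es * (eta - log_risk))   # events contribute to the likelihood

res = minimize_scalar(neg_log_partial_likelihood, bounds=(-3.0, 3.0),
                      method="bounded")
hazard_ratio = float(np.exp(res.x))
print(f"estimated hazard ratio = {hazard_ratio:.2f}")
```

With continuous event times there are no ties, so the Breslow and Efron partial likelihoods coincide here.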
Subjects
Diabetes Mellitus, Type 2 , Humans , Diabetes Mellitus, Type 2/epidemiology , Diabetes Mellitus, Type 2/prevention & control , Iron, Dietary , Iron , Prospective Studies , Diet , Risk Factors
ABSTRACT
The standard treatment in glioblastoma includes maximal safe resection followed by concomitant radiotherapy plus chemotherapy and adjuvant temozolomide. The first follow-up study to evaluate treatment response is performed 1 month after concomitant treatment, when contrast-enhancing regions may appear that can correspond to true progression or pseudoprogression. We retrospectively evaluated 31 consecutive patients at the first follow-up after concomitant treatment to check whether the metabolic pattern assessed with multivoxel MRS was predictive of treatment response 2 months later. We extracted the underlying metabolic patterns of the contrast-enhancing regions with a blind-source separation method and mapped them over the reference images. Pattern heterogeneity was calculated using entropy, and association between patterns and outcomes was measured with Cramér's V. We identified three distinct metabolic patterns-proliferative, necrotic, and responsive, which were associated with status 2 months later. Individually, 70% of the patients showed metabolically heterogeneous patterns in the contrast-enhancing regions. Metabolic heterogeneity was not related to the regions' size and only stable patients were less heterogeneous than the rest. Contrast-enhancing regions are also metabolically heterogeneous 1 month after concomitant treatment. This could explain the reported difficulty in finding robust pseudoprogression biomarkers.
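The blind-source separation and entropy steps described above can be sketched as follows. Non-negative matrix factorisation stands in here for the blind-source separation method (the study's exact algorithm is not assumed), and Shannon entropy of the normalised per-voxel mixing weights serves as the heterogeneity measure. All spectra are synthetic.

```python
import numpy as np
from sklearn.decomposition import NMF

rng = np.random.default_rng(7)
# three hypothetical source patterns over 60 spectral points
sources = np.abs(rng.normal(size=(3, 60)))
weights = rng.dirichlet(np.ones(3), size=200)          # mixing per voxel
V = weights @ sources + 0.01 * rng.random((200, 60))   # observed spectra

model = NMF(n_components=3, init="nndsvda", max_iter=500, random_state=0)
W = model.fit_transform(V)                             # recovered mixing weights

# entropy of normalised weights: 0 = pure pattern, log(3) = fully mixed
P = W / (W.sum(axis=1, keepdims=True) + 1e-12)
entropy = -(P * np.log(P + 1e-12)).sum(axis=1)
print(f"mean voxel entropy = {entropy.mean():.2f} (max {np.log(3):.2f})")
```

Mapping the dominant component per voxel back onto the reference image would give the pattern maps the study describes.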
Subjects
Brain Neoplasms , Glioblastoma , Humans , Glioblastoma/therapy , Glioblastoma/drug therapy , Follow-Up Studies , Retrospective Studies , Dacarbazine/therapeutic use , Chemoradiotherapy/methods , Disease Progression , Brain Neoplasms/therapy , Brain Neoplasms/drug therapy , Magnetic Resonance Imaging/methods
ABSTRACT
The integration of artificial intelligence (AI) technologies is evolving in different fields of cardiology and in particular in sports cardiology. Artificial intelligence offers significant opportunities to enhance risk assessment, diagnosis, treatment planning, and monitoring of athletes. This article explores the application of AI in various aspects of sports cardiology, including imaging techniques, genetic testing, and wearable devices. The use of machine learning and deep neural networks enables improved analysis and interpretation of complex datasets. However, ethical and legal dilemmas must be addressed, including informed consent, algorithmic fairness, data privacy, and intellectual property issues. The integration of AI technologies should complement the expertise of physicians, allowing for a balanced approach that optimizes patient care and outcomes. Ongoing research and collaborations are vital to harness the full potential of AI in sports cardiology and advance our management of cardiovascular health in athletes.
Subjects
Cardiology , Cardiomegaly, Exercise-Induced , Sports , Humans , Artificial Intelligence , Cardiology/methods , Neural Networks, Computer
ABSTRACT
Observational studies using causal inference frameworks can provide a feasible alternative to randomized controlled trials. Advances in statistics, machine learning, and access to big data facilitate unraveling complex causal relationships from observational data across healthcare, social sciences, and other fields. However, challenges like evaluating models and bias amplification remain.
Subjects
Big Data , Machine Learning , Humans , Causality
ABSTRACT
Background: Sepsis is a life-threatening disease commonly complicated by activation of coagulation and immune pathways. Sepsis-induced coagulopathy (SIC) is associated with micro- and macrothrombosis, but its relation to other cardiovascular complications remains less clear. In this study, we explored associations between SIC and the occurrence of atrial fibrillation (AF) in patients admitted to the Intensive Care Unit (ICU) in sinus rhythm. We also aimed to identify predictive factors for the development of AF in patients with and without SIC. Methods: Data were extracted from the publicly available AmsterdamUMCdb database. Patients with sepsis and documented sinus rhythm on admission to ICU were included. Patients were stratified into those who fulfilled the criteria for SIC and those who did not. Following univariate analysis, logistic regression models were developed to describe the association between routinely documented demographics and blood results and the development of at least one episode of AF. Machine learning methods (gradient boosting machines and random forest) were applied to define the predictive importance of factors contributing to the development of AF. Results: Age was the strongest predictor for the development of AF in patients with and without SIC. The routine coagulation tests activated partial thromboplastin time (aPTT) and international normalized ratio (INR), together with C-reactive protein (CRP) as a marker of inflammation, were also associated with AF occurrence in SIC-positive and SIC-negative patients. Cardiorespiratory parameters (oxygen requirements and heart rate) showed predictive potential. Conclusion: Higher INR, elevated CRP, increased heart rate and more severe respiratory failure are risk factors for the occurrence of AF in critical illness, suggesting an association between cardiac, respiratory, immune and coagulation pathways. However, age was the dominant factor predicting first episodes of AF in patients admitted in sinus rhythm, with and without SIC.
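The variable-importance step described above (gradient boosting machines and random forest) can be sketched with a random forest's impurity-based importances. The features and risk model below are synthetic stand-ins chosen so that age dominates, mirroring the study's finding; none of it is the study's data.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(3)
n = 1500
age = rng.normal(65, 12, n)   # years
inr = rng.normal(1.2, 0.3, n)
crp = rng.normal(80, 40, n)   # mg/L
hr = rng.normal(95, 18, n)    # beats/min, pure noise here
X = np.column_stack([age, inr, crp, hr])

# age dominates the synthetic risk, as in the study's findings
logit = 0.08 * (age - 65) + 0.5 * (inr - 1.2) + 0.004 * (crp - 80)
y = rng.random(n) < 1.0 / (1.0 + np.exp(-logit))

rf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X, y)
names = ["age", "INR", "CRP", "heart rate"]
for name, imp in sorted(zip(names, rf.feature_importances_),
                        key=lambda p: -p[1]):
    print(f"{name:10s} {imp:.3f}")
```

Permutation importance on held-out data is a less biased alternative when features differ in cardinality or scale.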
ABSTRACT
BACKGROUND: Glioblastoma (GB) is a malignant brain tumour that is challenging to treat, often relapsing even after aggressive therapy. Evaluating therapy response relies on magnetic resonance imaging (MRI) following the Response Assessment in Neuro-Oncology (RANO) criteria. However, early assessment is hindered by phenomena such as pseudoprogression and pseudoresponse. Magnetic resonance spectroscopy (MRS/MRSI) provides metabolomic information but is underutilised due to a lack of familiarity and standardisation. METHODS: This study explores the potential of spectroscopic imaging (MRSI) in combination with several machine learning approaches, including one-dimensional convolutional neural networks (1D-CNNs), to improve therapy response assessment. A preclinical GB model (GL261 tumour-bearing mice) was studied for method optimisation and validation. RESULTS: The proposed 1D-CNN models successfully identify different regions of tumours sampled by MRSI, i.e., normal brain (N), control/unresponsive tumour (T), and tumour responding to treatment (R). Class activation maps generated with Grad-CAM enabled the study of the key areas relevant to the models, providing model explainability. The generated colour-coded maps showing the N, T and R regions were highly accurate (according to Dice scores) when compared against ground truth and outperformed our previous method. CONCLUSIONS: The proposed methodology may provide new and better opportunities for therapy response assessment, potentially providing earlier hints of tumour relapsing stages.
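The Dice score used above to compare colour-coded maps against ground truth is straightforward to compute per class; a minimal sketch on a synthetic label map (the label values and map are illustrative only):

```python
import numpy as np

def dice(pred, truth, label):
    """Dice = 2|A∩B| / (|A|+|B|) for one class label."""
    a, b = pred == label, truth == label
    denom = a.sum() + b.sum()
    return 2.0 * np.logical_and(a, b).sum() / denom if denom else 1.0

truth = np.zeros((10, 10), dtype=int)
truth[2:8, 2:8] = 1                      # hypothetical "tumour" region
pred = np.zeros_like(truth)
pred[3:8, 2:8] = 1                       # slightly shifted prediction

print(f"Dice(tumour) = {dice(pred, truth, 1):.3f}")  # prints: Dice(tumour) = 0.909
```

Averaging the per-class scores over N, T and R would give a single summary figure per map.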
ABSTRACT
BACKGROUND: Evidence about the association between calculated remnant cholesterol (RC) and risk of heart failure (HF) in participants with diabetes mellitus (DM) remains sparse and limited. METHODS: We included a total of 22,230 participants with DM from the UK Biobank for analyses. Participants were categorized into three groups based on their baseline RC measures: low (with a mean RC of 0.41 mmol/L), moderate (0.66 mmol/L), and high (1.04 mmol/L). Cox proportional hazards models were used to evaluate the relationship between RC groups and HF risk. We performed discordance analysis to evaluate whether RC was associated with HF risk independently of low-density lipoprotein cholesterol (LDL-C). RESULTS: During a mean follow-up period of 11.5 years, there were a total of 2232 HF events observed. The moderate RC group was significantly associated with a 15% increased risk of HF when compared with the low RC group (hazard ratio [HR] = 1.15, 95% confidence interval [CI]: 1.01-1.32), while the high RC group was associated with a 23% higher HF risk (HR = 1.23, 95% CI: 1.05-1.43). There was a significant relationship between RC as a continuous measure and increased HF risk (P < 0.01). The association between RC and risk of HF was stronger in participants with an HbA1c level ≥ 53 mmol/mol than in those with HbA1c < 53 mmol/mol (P for interaction = 0.02). Results from discordance analyses showed that RC was significantly related to HF risk independent of LDL-C measures. CONCLUSIONS: Elevated RC was significantly associated with risk of HF in patients with DM. Moreover, RC was significantly related to HF risk independent of LDL-C measures. These findings may highlight the importance of RC management to HF risk reduction in patients with DM.
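The discordance analysis mentioned above can be sketched by splitting participants at the median of each lipid marker and comparing outcome rates where the two markers disagree. The distributions, thresholds, and effect sizes below are invented for illustration; the study's actual analysis will differ.

```python
import numpy as np

rng = np.random.default_rng(5)
n = 10000
rc = rng.lognormal(-0.5, 0.4, n)              # remnant cholesterol, mmol/L
ldl = 0.3 * rc + rng.lognormal(1.0, 0.3, n)   # LDL-C, weakly correlated with RC
risk = 1.0 / (1.0 + np.exp(-(-3.0 + 1.2 * rc)))  # HF risk driven by RC only
hf = rng.random(n) < risk

high_rc = rc > np.median(rc)
high_ldl = ldl > np.median(ldl)
# discordant cells: the two markers point in opposite directions
for name, mask in [("RC high / LDL-C low", high_rc & ~high_ldl),
                   ("RC low / LDL-C high", ~high_rc & high_ldl)]:
    print(f"{name}: HF rate {hf[mask].mean():.3f} (n={mask.sum()})")
```

If RC carries risk information beyond LDL-C, as simulated here, the RC-high/LDL-C-low cell shows the higher event rate.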
Subjects
Diabetes Mellitus , Heart Failure , Humans , Cholesterol, LDL , Glycated Hemoglobin , Risk Factors , Diabetes Mellitus/epidemiology , Cholesterol , Heart Failure/epidemiology , Heart Failure/etiology
ABSTRACT
AIMS: We sought to assess and compare the association of epicardial adipose tissue (EAT) with cardiovascular disease (CVD) in HIV-positive and HIV-negative groups. METHODS AND RESULTS: Using existing clinical databases, we analyzed 700 patients (195 HIV-positive, 505 HIV-negative). CVD was quantified by the presence of coronary calcification from both dedicated cardiac computed tomography (CT) and non-dedicated CT of the thorax. EAT was quantified using dedicated software. The HIV-positive group had a lower mean age (49.2 versus 57.8 years, p < 0.005), a higher proportion of male sex (75.9% versus 48.1%, p < 0.005), and lower rates of coronary calcification (29.2% versus 58.2%, p < 0.005). Mean EAT volume was also lower in the HIV-positive group (68 mm³ versus 118.3 mm³, p < 0.005). Multiple linear regression demonstrated that EAT volume was associated with hepatosteatosis (HS) in the HIV-positive group but not the HIV-negative group after adjustment for body mass index (BMI; p < 0.005 versus p = 0.066). In the multivariate analysis of the HIV-positive group, after adjustment for CVD risk factors, age, sex, statin use, and BMI, EAT volume and hepatosteatosis were significantly associated with coronary calcification (odds ratio [OR] 1.14, p < 0.005 and OR 3.17, p < 0.005, respectively). In the HIV-negative group, the only significant association with EAT volume after adjustment was total cholesterol (OR 0.75, p = 0.012). CONCLUSIONS: We demonstrated a strong and significant independent association of EAT volume with coronary calcium, after adjustment, in the HIV-positive group but not in the HIV-negative group. This result hints at differences in the mechanistic drivers of atherosclerosis between HIV-positive and HIV-negative groups.
Subjects
Cardiovascular Diseases , Coronary Artery Disease , HIV Seropositivity , Vascular Calcification , Humans , Male , Coronary Artery Disease/complications , Coronary Artery Disease/diagnostic imaging , Coronary Artery Disease/epidemiology , Calcium , Risk Factors , Pericardium/diagnostic imaging , Adipose Tissue/diagnostic imaging
ABSTRACT
Lal and colleagues [1] reported an integrative approach, combining transcriptomics, iPSCs, and epidemiological evidence, to identify and repurpose metformin, a first-line medication for the treatment of type 2 diabetes, as an effective risk reducer for atrial fibrillation.
Subjects
Atrial Fibrillation , Diabetes Mellitus, Type 2 , Metformin , Humans , Metformin/therapeutic use , Atrial Fibrillation/drug therapy , Atrial Fibrillation/epidemiology , Diabetes Mellitus, Type 2/drug therapy
ABSTRACT
BACKGROUND: Intense training exercise regimes cause physiological changes within the heart to help cope with the increased stress, known as the "athlete's heart". These changes can mask pathological changes, making them harder to diagnose and increasing the risk of an adverse cardiac outcome. AIM: This paper reviews which machine learning (ML) techniques are being used within athlete's heart research and how they are being implemented, and assesses the uptake of these techniques within this area of research. METHODS: Searches were carried out on the Scopus and PubMed online databases and a scoping review was conducted on the studies which were identified. RESULTS: Twenty-eight studies were included within the review, with ML being directly referenced within 16 (57%). A total of 12 different techniques were used, with the most popular being artificial neural networks and the most common implementation being to perform classification tasks. The review also highlighted the subgroups of interest: predictive modelling, reviews, and wearables, with most of the studies attributed to the predictive modelling subgroup. The most common type of data used was the electrocardiogram (ECG), with echocardiograms the second most common. CONCLUSION: The results show that over the last 11 years there has been a growing desire to leverage ML techniques to further the understanding of the athlete's heart, whether by expanding knowledge of the physiological changes or by improving the accuracy of models to help improve treatments and disease management.
ABSTRACT
The most limiting factor in heart transplantation is the lack of donor organs. With enhanced prediction of outcome, it may be possible to increase the life-years from the organs that become available. Applications of machine learning to tabular data, typical of clinical decision support, pose the practical question of interpretation, which has technical and potential ethical implications. In particular, there is an issue of principle about the predictability of complex data and whether this is inherent in the data or strongly dependent on the choice of machine learning model, leading to the so-called accuracy-interpretability trade-off. We model 1-year mortality in heart transplantation data with a self-explaining neural network, which is benchmarked against a deep learning model on the same development data, in an external validation study with two data sets: (1) UNOS transplants in 2017-2018 (n = 4750), for which the self-explaining and deep learning models are comparable in their AUROC, 0.628 [0.602, 0.654] cf. 0.635 [0.609, 0.662]; and (2) Scandinavian transplants during 1997-2018 (n = 2293), showing good calibration with AUROCs of 0.626 [0.588, 0.665] and 0.634 [0.570, 0.698], respectively, with and without missing data (n = 982). This shows that for tabular data, predictive models can be transparent and capture important nonlinearities, retaining full predictive performance.
Subjects
Artificial Intelligence , Heart Transplantation , Retrospective Studies , Machine Learning , Neural Networks, Computer
ABSTRACT
Whether physical activity (PA) modifies the association between metabolic status and atrial fibrillation (AF) in obesity remains unknown. We aimed to investigate the independent and joint associations of metabolic status and PA with the risk of AF in an obese population. Based on data from the UK Biobank study, we used Cox proportional hazards models for analyses. Metabolic status was categorized into metabolically healthy obesity (MHO) and metabolically unhealthy obesity (MUO). PA was categorized into four groups according to the level of moderate-to-vigorous PA (MVPA): none, low, medium, and high. A total of 119,424 obese participants were included for analyses. MHO was significantly associated with a 35% reduced AF risk compared with MUO (HR = 0.65, 95% CI: 0.57-0.73). No significant modification of AF risk by PA was found among individuals with MHO. Among the MUO participants, individuals with medium and high PA had significantly lower AF risk compared with no MVPA (HR = 0.84, 95% CI: 0.74-0.95, and HR = 0.87, 95% CI: 0.78-0.96, for medium and high PA, respectively). As the severity of MUO increased, the modification of AF risk by PA increased accordingly. To conclude, MHO was significantly associated with a reduced risk of AF when compared with MUO in obese participants. PA significantly modified the relationship between metabolic status and risk of AF among MUO participants, with the benefits of PA for reduced AF risk becoming more pronounced as MUO severity increased.
Subjects
Atrial Fibrillation , Metabolic Syndrome , Obesity, Metabolically Benign , Atrial Fibrillation/diagnosis , Atrial Fibrillation/epidemiology , Atrial Fibrillation/prevention & control , Body Mass Index , Exercise , Humans , Obesity/complications , Obesity/diagnosis , Obesity/epidemiology , Obesity, Metabolically Benign/diagnosis , Obesity, Metabolically Benign/epidemiology , Risk Factors
ABSTRACT
Breast cancer is the most commonly diagnosed female malignancy globally, with better survival rates if diagnosed early. Mammography is the gold standard in screening programmes for breast cancer, but despite technological advances, high error rates are still reported. Machine learning techniques, and in particular deep learning (DL), have been successfully used for breast cancer detection and classification. However, the added complexity that makes DL models so successful reduces their ability to explain which features are relevant to the model, or whether the model is biased. The main aim of this study is to propose a novel visualisation to help characterise breast cancer patients using Fisher Information Networks (FINs) on features extracted from mammograms by a DL model. In the proposed visualisation, patients are mapped out according to their similarities and can be used to study new patients as a 'patient-like-me' approach. When applied to the CBIS-DDSM dataset, the methodology proved competitive, showing that it can (i) facilitate the analysis and decision-making process in breast cancer diagnosis with the assistance of the FIN visualisations and 'patient-like-me' analysis, and (ii) help improve diagnostic accuracy and reduce overdiagnosis by identifying the most likely diagnosis based on clinical similarities with neighbouring patients.
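The 'patient-like-me' idea above can be sketched with k-nearest-neighbour retrieval in an embedding space. Random vectors stand in here for DL mammogram features and the Fisher Information Network metric; the neighbour count and labels are illustrative.

```python
import numpy as np
from sklearn.neighbors import NearestNeighbors

rng = np.random.default_rng(11)
features = rng.normal(size=(500, 16))   # stand-in patient embeddings
labels = rng.integers(0, 2, size=500)   # hypothetical benign/malignant labels

nn = NearestNeighbors(n_neighbors=6).fit(features)
query = features[0]                     # query an already-embedded patient
dist, idx = nn.kneighbors(query[None, :])

neighbours = idx[0][1:]                 # drop the query itself (distance 0)
print("nearest neighbours:", neighbours.tolist())
print("majority label among neighbours:", int(labels[neighbours].mean() > 0.5))
```

The majority label among a new patient's neighbours is what supports the "most likely diagnosis based on clinical similarities" step.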
Subjects
Breast Neoplasms , Deep Learning , Breast/pathology , Breast Neoplasms/diagnostic imaging , Breast Neoplasms/pathology , Female , Humans , Information Services , Mammography/methods
ABSTRACT
Sepsis is a heterogeneous syndrome characterized by a variety of clinical features. Analysis of large clinical datasets may serve to define groups of sepsis with different risks of adverse outcomes. Clinical experience supports the concept that prognosis, treatment, severity, and time course of sepsis vary depending on the source of infection. We analyzed a large publicly available database to test this hypothesis. In addition, we developed prognostic models for the three main types of sepsis: pulmonary, urinary, and abdominal sepsis. We used logistic regression with routinely available clinical data for mortality prediction in each of these groups. The data were extracted from the eICU collaborative research database, a multi-center intensive care unit database with over 200,000 admissions. Sepsis cohorts were defined using admission diagnosis codes. We used univariate and multivariate analyses to establish factors relevant for outcome prediction in all three cohorts of sepsis (pulmonary, urinary and abdominal). For logistic regression, input variables were automatically selected using a sequential forward search algorithm over 10 dataset instances. Receiver operating characteristic curves were generated for each model and compared with established prognostication tools (APACHE IV and SOFA). A total of 3,958 sepsis admissions were included in the analysis. Sepsis in-hospital mortality differed depending on the source of infection: abdominal 18.93%, pulmonary 19.27%, and urinary 12.81%. Higher average heart rate was associated with increased mortality risk. Increased average mean arterial pressure (MAP) was associated with reduced mortality risk across all sepsis groups. The logistic regression models identified significant factors relevant to specific sepsis groups. Our models outperformed APACHE IV and SOFA scores with AUC between 0.63 and 0.74. Predictive power decreased over time, with the best results achieved for data extracted for the first 24 h of admission.
Mortality varied significantly between the three sepsis groups. We also demonstrate that factors of importance show considerable heterogeneity depending on the source of infection. The factors influencing in-hospital mortality vary depending on the source of sepsis, which may explain why most sepsis trials have failed to identify an effective treatment. The source of infection should therefore be considered when assessing mortality risk, and planning of sepsis treatment trials may benefit from risk stratification based on the source of infection.
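The sequential forward search over input variables described above can be sketched with scikit-learn's SequentialFeatureSelector wrapping a logistic regression. The data are a synthetic stand-in (not the eICU cohort), and the use of 10-fold cross-validation here is an illustrative choice, not the study's exact "10 dataset instances" protocol.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.feature_selection import SequentialFeatureSelector
from sklearn.linear_model import LogisticRegression

# synthetic mortality-prediction stand-in: 12 candidate clinical variables
X, y = make_classification(n_samples=800, n_features=12, n_informative=4,
                           n_redundant=2, random_state=0)

sfs = SequentialFeatureSelector(
    LogisticRegression(max_iter=1000),
    n_features_to_select=5,        # illustrative stopping point
    direction="forward",           # greedy forward search
    scoring="roc_auc",
    cv=10)                         # assumption: 10-fold CV as the resampling
sfs.fit(X, y)
print("selected feature indices:", np.flatnonzero(sfs.get_support()).tolist())
```

At each step the selector adds the variable that most improves cross-validated AUC, which is the greedy behaviour the abstract describes.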
ABSTRACT
OBJECTIVE: We develop and externally validate two models for radiological knee osteoarthritis (KOA): a diagnostic model for KOA and a prognostic model of time to onset of KOA. Model development and optimisation used data from the Osteoarthritis Initiative (OAI), and external validation for both models was by application to data from the Multicenter Osteoarthritis Study (MOST). MATERIALS AND METHODS: The diagnostic model at first presentation comprises subjects in the OAI with and without KOA (n = 2006), modelled with multivariate logistic regression. The prognostic sample involves 5-year follow-up of subjects presenting without clinical KOA (n = 1155), modelled with Cox regression. In both instances the models used training data sets of n = 1353 and 1002 subjects, and optimisation used test data sets of n = 1354 and 1003. The external validation data sets for the diagnostic and prognostic models comprised n = 2006 and n = 1155 subjects, respectively. RESULTS: The classification performance of the diagnostic model on the test data has an AUC of 0.748 (0.721-0.774), and 0.670 (0.631-0.708) in external validation. The survival model has concordance scores of 0.74 (0.7325-0.7439) for the OAI test set and 0.72 (0.7190-0.7373) in external validation. The survival approach stratified the population into two risk cohorts, and the separation between the cohorts remains when the model is applied to the validation data. DISCUSSION: The models produced are interpretable, with app interfaces that implement nomograms. The apps may be used for stratification and for patient education on the impact of modifiable risk factors. The externally validated results, obtained by application to data from a substantial prospective observational study, show the robustness of models for the likelihood of presenting with KOA at an initial assessment based on risk factors identified by the OAI protocol, and for stratification of the risk of developing KOA in the next five years.
CONCLUSION: Modelling clinical KOA from OAI data validates well for the MOST data set. Both risk models identified key factors for differentiation of the target population from commonly available variables. With this analysis there is potential to improve clinical management of patients.
Subjects
Osteoarthritis, Knee , Disease Progression , Humans , Knee Joint , Osteoarthritis, Knee/diagnostic imaging , Osteoarthritis, Knee/epidemiology , Radiography , Risk Factors
ABSTRACT
The occurrence of atrial fibrillation (AF) represents clinical deterioration in acutely unwell patients and leads to increased morbidity and mortality. Prediction of the development of AF allows early intervention. Using the AmsterdamUMCdb, clinically relevant variables from patients admitted in sinus rhythm were extracted over the full duration of the ICU stay or until the first recorded AF episode occurred. Multiple logistic regression was performed to identify risk factors for AF. Input variables were automatically selected by a sequential forward search algorithm using cross-validation. We developed three different models: for the overall cohort, for ventilated patients, and for non-ventilated patients. 16,144 out of 23,106 admissions met the inclusion criteria. 2,374 (12.8%) patients had at least one AF episode during their ICU stay. Univariate analysis revealed that a higher percentage of AF patients were older than 70 years (60% versus 32%) and died in ICU (23.1% versus 7.1%) compared to non-AF patients. Multivariate analysis revealed age to be the dominant risk factor for developing AF, with a doubling of age leading to a 10-fold increased risk. Our logistic regression models showed excellent performance, with AUROC > 0.82 and > 0.91 in the ventilated and non-ventilated cohorts, respectively. Increasing age was the dominant risk factor for the development of AF in both ventilated and non-ventilated critically ill patients. In non-ventilated patients, the risk of developing AF was significantly higher than in ventilated patients. Further research is warranted to identify the role of ventilatory settings on the risk of AF in critical illness and to optimise predictive models.