ABSTRACT
INTRODUCTION: Cellular senescence is the irreversible growth arrest that follows oncogenic mutations, DNA damage, or metabolic insult. Senescence is associated with ageing and chronic age-associated diseases such as cardiovascular disease and diabetes. The involvement of cellular senescence in acute kidney injury (AKI) and chronic kidney disease (CKD) is not fully understood. However, recent studies suggest that such patients have a higher-than-normal level of cellular senescence and accelerated ageing. METHODS: This study aimed to discover key biomarkers of senescence in AKI and CKD patients, compared with controls with other chronic ageing diseases, using Olink proteomics. RESULTS: We show that the senescence proteins CKAP4 (p-value < 0.0001) and PTX3 (p-value < 0.0001) are upregulated in AKI and CKD patients compared with controls with chronic diseases, suggesting that these proteins may play a role in overall kidney disease development. CONCLUSIONS: CKAP4 was found to be differentially expressed in both AKI and CKD when compared with UHCs; hence, this biomarker could be a prognostic senescence biomarker of both AKI and CKD.
Subjects
Biomarkers, C-Reactive Protein, Cellular Senescence, Chronic Renal Insufficiency, Humans, Biomarkers/metabolism, Chronic Renal Insufficiency/metabolism, Chronic Renal Insufficiency/genetics, Chronic Renal Insufficiency/pathology, Cellular Senescence/genetics, C-Reactive Protein/metabolism, Male, Serum Amyloid P-Component/metabolism, Serum Amyloid P-Component/genetics, Acute Kidney Injury/metabolism, Female, Middle Aged, Aged
ABSTRACT
Background: The COVID-19 pandemic, caused by the novel coronavirus SARS-CoV-2, has posed unprecedented challenges to healthcare systems worldwide. Here, we identified proteomic and genetic signatures for improved prognosis, which is vital for COVID-19 research. Methods: We investigated the proteomic and genomic profiles of COVID-19-positive patients (n = 400 for proteomics, n = 483 for genomics), focusing on differential regulation between hospitalised and non-hospitalised COVID-19 patients. The predictive capabilities of the signatures were tested using independent machine learning models: Support Vector Machine (SVM), Random Forest (RF) and Logistic Regression (LR). Results: This study identified 224 differentially expressed proteins involved in various inflammatory and immunological pathways in hospitalised COVID-19 patients compared with non-hospitalised COVID-19 patients. LGALS9 (p-value < 0.001), LAMP3 (p-value < 0.001), PRSS8 (p-value < 0.001) and AGRN (p-value < 0.001) were identified as the most statistically significant proteins. Several hundred rsIDs were queried across the top 10 significant signatures, identifying three significant SNPs on the FSTL3 gene that correlated with hospitalisation status. Conclusions: Our study has not only identified key signatures of COVID-19 patients with worsened health but has also demonstrated their predictive capability as potential biomarkers, suggesting a central role in the worsened health effects caused by COVID-19.
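As a rough illustration of the kind of signature validation described in the methods, the sketch below cross-validates SVM, RF and LR classifiers on a few of the reported protein signatures; the file name, column names and label encoding are assumptions for illustration only, not the study's actual pipeline.

```python
# Hypothetical sketch: evaluating protein signatures as predictors of
# hospitalisation with SVM, Random Forest and Logistic Regression.
import pandas as pd
from sklearn.model_selection import cross_val_score, StratifiedKFold
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC
from sklearn.ensemble import RandomForestClassifier
from sklearn.linear_model import LogisticRegression

# Protein expression matrix (rows = patients) with a binary hospitalisation
# label; "olink_npx.csv" and the column names are placeholders.
data = pd.read_csv("olink_npx.csv")
X = data[["LGALS9", "LAMP3", "PRSS8", "AGRN"]]   # top signatures from the abstract
y = data["hospitalised"]                          # 1 = hospitalised, 0 = not

cv = StratifiedKFold(n_splits=5, shuffle=True, random_state=0)
models = {
    "SVM": make_pipeline(StandardScaler(), SVC(probability=True)),
    "RF": RandomForestClassifier(n_estimators=500, random_state=0),
    "LR": make_pipeline(StandardScaler(), LogisticRegression(max_iter=1000)),
}
for name, model in models.items():
    auc = cross_val_score(model, X, y, cv=cv, scoring="roc_auc")
    print(f"{name}: mean AUC = {auc.mean():.3f} ± {auc.std():.3f}")
```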
Subjects
Biomarkers, Blood Proteins, COVID-19, Hospitalization, SARS-CoV-2, Adult, Aged, Female, Humans, Male, Middle Aged, Biomarkers/blood, Blood Proteins/genetics, Blood Proteins/metabolism, COVID-19/genetics, COVID-19/epidemiology, Galectins/genetics, Lysosomal Membrane Proteins/genetics, Prognosis, Proteomics/methods, SARS-CoV-2/isolation & purification
ABSTRACT
BACKGROUND: The presence of coronary plaques with high-risk characteristics is strongly associated with adverse cardiac events beyond the identification of coronary stenosis. Testing by coronary computed tomography angiography (CCTA) enables the identification of high-risk plaques (HRP). Referral for CCTA is presently based on pre-test probability estimates including clinical risk factors (CRFs); however, proteomic and/or genetic information could potentially improve patient selection for CCTA and, hence, identification of HRP. We aimed to (1) identify proteomic and genetic features associated with HRP presence and (2) investigate the effect of combining CRFs, proteomics, and genetics to predict HRP presence. METHODS: Consecutive chest pain patients (n = 1462) undergoing CCTA to diagnose obstructive coronary artery disease (CAD) were included. Coronary plaques were assessed using a semi-automatic plaque analysis tool. Measurements of 368 circulating proteins were obtained with targeted Olink panels, and DNA genotyping was performed in all patients. Imputed genetic variants were used to compute a multi-trait multi-ancestry genome-wide polygenic score (GPSMult). HRP presence was defined as plaques with two or more high-risk characteristics (low attenuation, spotty calcification, positive remodeling, and napkin ring sign). Prediction of HRP presence was performed using the glmnet algorithm with repeated fivefold cross-validation, using CRFs, proteomics, and GPSMult as input features. RESULTS: HRPs were detected in 165 (11%) patients, and 15 input features were associated with HRP presence. Prediction of HRP presence based on CRFs yielded a mean area under the receiver operating characteristic curve (AUC) ± standard error of 73.2 ± 0.1, versus 69.0 ± 0.1 for proteomics and 60.1 ± 0.1 for GPSMult. Combining CRFs with GPSMult increased prediction accuracy (AUC 74.8 ± 0.1; P = 0.004), while the inclusion of proteomics provided no significant improvement to either the CRF (AUC 73.2 ± 0.1, P = 1.00) or the CRF + GPSMult (AUC 74.6 ± 0.1, P = 1.00) model. CONCLUSIONS: In patients with suspected CAD, incorporating genetic data with either clinical or proteomic data improves the prediction of high-risk plaque presence. TRIAL REGISTRATION: https://clinicaltrials.gov/ct2/show/NCT02264717 (September 2014).
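For readers unfamiliar with the modelling step, the sketch below shows a Python analogue of a glmnet-style penalised regression with repeated fivefold cross-validation over different feature sets; glmnet itself is an R package, and the file name, column names and hyperparameters here are purely illustrative assumptions.

```python
# Illustrative Python analogue of a glmnet-style model: elastic-net logistic
# regression evaluated with repeated five-fold cross-validation.
import pandas as pd
from sklearn.model_selection import RepeatedStratifiedKFold, cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.linear_model import LogisticRegression

df = pd.read_csv("ccta_cohort.csv")            # placeholder cohort file
y = df["hrp_present"]                           # 1 = high-risk plaque present

feature_sets = {
    "CRF": ["age", "sex", "smoking", "diabetes", "hypertension", "ldl_c"],
    "CRF+GPSmult": ["age", "sex", "smoking", "diabetes", "hypertension",
                    "ldl_c", "gps_mult"],
}
cv = RepeatedStratifiedKFold(n_splits=5, n_repeats=10, random_state=1)
model = make_pipeline(
    StandardScaler(),
    LogisticRegression(penalty="elasticnet", solver="saga",
                       l1_ratio=0.5, C=1.0, max_iter=5000),
)
for name, cols in feature_sets.items():
    aucs = cross_val_score(model, df[cols], y, cv=cv, scoring="roc_auc")
    print(f"{name}: AUC {aucs.mean():.3f} (SE {aucs.std() / len(aucs) ** 0.5:.3f})")
```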
Subjects
Coronary Artery Disease, Atherosclerotic Plaque, Humans, Coronary Artery Disease/diagnosis, Coronary Artery Disease/genetics, Genetic Risk Stratification, Proteomics, Coronary Angiography/methods, Atherosclerotic Plaque/genetics, Atherosclerotic Plaque/complications, Risk Factors
ABSTRACT
BACKGROUND: Health organizations and countries around the world have found it difficult to control the spread of COVID-19. To minimize the future impact on the UK National Health Service and improve patient care, there is a pressing need to identify individuals who are at a higher risk of being hospitalized because of severe COVID-19. Early targeted work was successful in identifying angiotensin-converting enzyme-2 receptors and type II transmembrane serine protease dependency as drivers of severe infection. Although a targeted approach highlights key pathways, a multiomics approach will provide a clearer and more comprehensive picture of severe COVID-19 etiology and progression. OBJECTIVE: The COVID-19 Response Study aims to carry out an integrated multiomics analysis to identify biomarkers in blood and saliva that could contribute to host susceptibility to SARS-CoV-2 and the development of severe COVID-19. METHODS: The COVID-19 Response Study aims to recruit 1000 people who recovered from SARS-CoV-2 infection in both community and hospital settings on the island of Ireland. This protocol describes the retrospective observational study component carried out in Northern Ireland (NI; Cohort A); the Republic of Ireland cohort will be described separately. For all NI participants (n=519), SARS-CoV-2 infection has been confirmed by reverse transcription-quantitative polymerase chain reaction. A prospective Cohort B of 40 patients is also being followed up at 1, 3, 6, and 12 months postinfection to assess longitudinal symptom frequency and immune response. Data will be sourced from whole blood and saliva samples, together with clinical data from electronic care records, a general health questionnaire, and a 12-item general health questionnaire mental health survey. Saliva and blood samples were processed to extract DNA and RNA before whole-genome sequencing, RNA sequencing, DNA methylation analysis, and microbiome (16S ribosomal RNA gene) sequencing were performed; proteomic analysis was performed on plasma. Multiomics data will be combined with clinical data to produce sensitive and specific prognostic models for severity risk. RESULTS: An initial demographic and clinical profile of the NI Cohort A has been completed. A total of 249 hospitalized patients and 270 nonhospitalized patients were recruited, of whom 184 (64.3%) were female, and the mean age was 45.4 (SD 13) years. High levels of comorbidity were evident in the hospitalized cohort, with cardiovascular disease and metabolic and respiratory disorders being the most significant (P<.001) when grouped according to International Classification of Diseases 10 codes. CONCLUSIONS: This study will provide a comprehensive opportunity to study the mechanisms of COVID-19 severity in recontactable participants. INTERNATIONAL REGISTERED REPORT IDENTIFIER (IRRID): DERR1-10.2196/50733.
ABSTRACT
BACKGROUND: Patients with de novo chest pain, referred for evaluation of possible coronary artery disease (CAD), frequently have an absence of CAD, resulting in millions of tests without any clinical impact. The objective of this study was to investigate whether polygenic risk scores and targeted proteomics improve the prediction of the absence of CAD in patients with suspected CAD, when added to the PROMISE (Prospective Multicenter Imaging Study for Evaluation of Chest Pain) minimal risk score (PMRS). METHODS: Genotyping and targeted plasma proteomics (N=368 proteins) were performed in 1440 patients with symptoms suspected to be caused by CAD undergoing coronary computed tomography angiography. Based on individual genotypes, a polygenic risk score for CAD (PRSCAD) was calculated. Prediction was performed using combinations of PRSCAD, proteins, and PMRS as features in models using stability selection and machine learning. RESULTS: Prediction of the absence of CAD yielded an area under the curve of 0.64±0.03 for the PRSCAD model, 0.58±0.03 for the proteomic model, and 0.76±0.02 for the PMRS model. No significant correlation was found between the genetic and proteomic risk scores (Pearson correlation coefficient, -0.04; P=0.13). Optimal predictive ability was achieved by the full model (PRSCAD+protein+PMRS), yielding an area under the curve of 0.80±0.02 for the absence of CAD, significantly better than the PMRS model alone (P<0.001). For reclassification purposes, the full model enabled down-classification of 49% (324 of 661) of the patients with a 5% to 15% pretest probability and 18% (113 of 611) of those with a >15% pretest probability. CONCLUSIONS: For patients with chest pain and low-intermediate CAD risk, incorporating targeted proteomics and polygenic risk scores into the risk assessment substantially improved the ability to predict the absence of CAD. Genetics and proteomics seem to add complementary information to the clinical risk factors and improve risk stratification in this large patient group. REGISTRATION: URL: https://www.clinicaltrials.gov; Unique identifier: NCT02264717.
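A minimal sketch of stability selection, one of the techniques named in the methods, is shown below: an L1-penalised logistic regression is refitted on many random subsamples and features are kept only if selected in most fits. The data loading, column names and thresholds are assumptions for illustration, not the study's implementation.

```python
# Minimal stability-selection sketch under assumed data and thresholds.
import numpy as np
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.preprocessing import StandardScaler

df = pd.read_csv("promise_like_cohort.csv")        # placeholder file
y = df["absence_of_cad"].to_numpy()                 # 1 = no CAD on CCTA
features = df.drop(columns=["absence_of_cad"]).columns
X = StandardScaler().fit_transform(df[features])

rng = np.random.default_rng(0)
n_runs, frac = 200, 0.5                              # 200 half-size subsamples
counts = np.zeros(X.shape[1])
for _ in range(n_runs):
    idx = rng.choice(len(y), size=int(frac * len(y)), replace=False)
    clf = LogisticRegression(penalty="l1", solver="liblinear", C=0.1)
    clf.fit(X[idx], y[idx])
    counts += (clf.coef_[0] != 0).astype(int)        # which features survived

stable = features[counts / n_runs >= 0.8]            # 80% selection threshold
print("Stably selected features:", list(stable))
```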
Subjects
Coronary Artery Disease, Humans, Coronary Artery Disease/diagnosis, Coronary Artery Disease/genetics, Proteomics, Prospective Studies, Coronary Angiography/methods, Risk Factors, Chest Pain/diagnosis, Chest Pain/genetics
ABSTRACT
BACKGROUND: In the last decade, percutaneous coronary intervention (PCI) has evolved toward the treatment of complex disease in patients with multiple comorbidities. Whilst there are several definitions of complexity, it is unclear whether cardiologists agree when classifying the complexity of cases. Inconsistent identification of complex PCI can lead to significant variation in clinical decision-making. AIM: This study aimed to determine the inter-rater agreement in rating the complexity and risk of PCI procedures. METHOD: An online survey was designed and disseminated amongst interventional cardiologists by the European Association of Percutaneous Cardiovascular Intervention (EAPCI) board. The survey presented four patient vignettes, which study participants assessed to classify their complexity. RESULTS: From 215 respondents, there was poor inter-rater agreement in classifying the complexity level (k = 0.1) and fair agreement (k = 0.31) in classifying the risk level. The experience level of participants did not have any significant impact on the inter-rater agreement for rating either the complexity level or the risk level. There was a good level of agreement between participants in rating 26 factors for classifying complex PCI. The top five factors were (1) impaired left ventricular function, (2) concomitant severe aortic stenosis, (3) last remaining vessel PCI, (4) requirement for calcium modification and (5) significant renal impairment. CONCLUSION: Agreement among cardiologists in classifying the complexity of PCI is poor, which may lead to suboptimal clinical decision-making, procedural planning and long-term management. Consensus is needed to define complex PCI, and this requires clear criteria incorporating both lesion and patient characteristics.
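Agreement statistics of the kind reported here (k values) are chance-corrected. The sketch below shows how Fleiss' kappa for multiple raters can be computed with statsmodels, using a small synthetic ratings matrix rather than the actual survey data.

```python
# Illustrative Fleiss' kappa computation for multiple raters classifying
# cases into complexity levels. The ratings matrix is synthetic.
import numpy as np
from statsmodels.stats.inter_rater import aggregate_raters, fleiss_kappa

# rows = cases (vignettes), columns = raters, values = chosen complexity level
ratings = np.array([
    [0, 1, 1, 2, 0, 1],
    [2, 2, 1, 2, 2, 2],
    [0, 0, 1, 0, 2, 1],
    [1, 2, 2, 1, 1, 2],
])
table, _ = aggregate_raters(ratings)   # case-by-category count table
print(f"Fleiss' kappa = {fleiss_kappa(table):.2f}")
```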
Subjects
Cardiologists, Coronary Artery Disease, Percutaneous Coronary Intervention, Humans, Percutaneous Coronary Intervention/methods, Treatment Outcome, Surveys and Questionnaires, Consensus, Coronary Artery Disease/diagnostic imaging, Coronary Artery Disease/therapy, Coronary Artery Disease/etiology
ABSTRACT
BACKGROUND: The application of artificial intelligence to interpret the electrocardiogram (ECG) has predominantly involved knowledge-engineered rule-based algorithms, which are widely used in clinical practice today. However, over recent decades, there has been a steady increase in the number of research studies that use machine learning (ML) to read or interrogate ECG data. OBJECTIVE: The aim of this study is to review the use of ML with ECG data using a time series approach. METHODS: Papers that address the subject of ML and the ECG were identified by systematically searching databases that archive papers from January 1995 to October 2019. Time series analysis was used to study the changing popularity of the different types of ML algorithms that have been used with ECG data over the past two decades. Finally, a meta-analysis of how various ML techniques performed for various diagnostic classifications was also undertaken. RESULTS: A total of 757 papers were identified. Based on the results, the use of ML with ECG data started to increase sharply (p < 0.001) from 2012. Healthcare applications, especially heart abnormality classification, were the most common application of ML using ECG data (p < 0.001). However, many new emerging applications include using ML and the ECG for biometrics and driver drowsiness detection. The support vector machine was the technique of choice for a decade. However, since 2018, deep learning has been trending upwards and is likely to be the leading technique in the coming few years. Despite the accuracy paradox, accuracy was the most frequently used metric in the studies reviewed, followed by sensitivity, specificity, F1 score and then AUC. CONCLUSION: Applying ML using ECG data has shown promise. Data scientists and physicians should collaborate to ensure that clinical knowledge is being applied appropriately and is informing the design of ML algorithms. Data scientists also need to consider knowledge-guided feature engineering and the explainability of the ML algorithm, as well as being transparent about the algorithm's performance, to appropriately calibrate human-AI trust. Future work is required to enhance ML performance in ECG classification.
Subjects
Artificial Intelligence, Benchmarking, Algorithms, Electrocardiography, Humans, Machine Learning, Time Factors
ABSTRACT
In this commentary paper, we discuss the use of the electrocardiogram (ECG) to help clinicians make diagnostic and patient referral decisions in acute care settings. The paper discusses the factors that are likely to contribute to the variability and noise in the clinical decision-making process for catheterization lab activation. These factors include variable competence in reading ECGs, intra-/inter-rater reliability, the lack of standard ECG training, varying ECG machine and filter settings, cognitive biases (such as automation bias, the tendency to agree with the computer-aided or AI diagnosis), the order in which information is received, tiredness or decision fatigue, as well as ECG artefacts such as signal noise or lead misplacement. We also discuss potential research questions and tools that could be used to mitigate this 'noise' and improve the quality of ECG-based decision making.
Subjects
Computer-Assisted Diagnosis, Electrocardiography, Clinical Decision-Making, Decision Making, Humans, Reproducibility of Results
ABSTRACT
BACKGROUND: Treatment decisions in myocardial infarction (MI) are currently stratified by ST elevation (ST-elevation myocardial infarction [STEMI]) or lack of ST elevation (non-ST elevation myocardial infarction [NSTEMI]) on the electrocardiogram. This arose from the assumption that ST elevation indicated acute coronary artery occlusion (OMI). However, one-quarter of all NSTEMI cases are an OMI and have a higher mortality. The purpose of this study was to identify features that could help identify OMI. METHODS: Prospectively collected data from patients undergoing percutaneous coronary intervention (PCI) were analyzed. Data included presentation characteristics, comorbidities, treatments, and outcomes. Latent class analysis was undertaken to determine patterns of presentation and history associated with OMI. RESULTS: A total of 1412 patients underwent PCI for acute MI, and 263 were diagnosed as OMI. Compared with nonocclusive MI, OMI patients were more likely to have fewer comorbidities, no difference in cerebrovascular disease, and increased acute mortality (4.2% vs. 1.1%; p < .001). Of OMI patients, 29.5% had delays to treatments such as immediate reperfusion therapy. With latent class analysis, while clusters of similar patients were observed in the dataset, the data available did not usefully distinguish patients with OMI from those without. CONCLUSION: Features of OMI and STEMI are broadly very similar. However, there was no difference in age or risk of cerebrovascular disease between the OMI and non-OMI groups. There are therefore no reliable characteristics for identifying OMI versus non-OMI. Delays to treatment also suggest that OMI patients are still missing out on optimal treatment. An alternative strategy is required to improve the identification of OMI patients.
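As a rough consistency check of the reported acute mortality difference, the sketch below runs a two-proportion z-test with death counts approximated from the stated percentages and group sizes; the exact counts are assumptions, not the registry data.

```python
# Approximate check of the reported acute mortality difference (4.2% vs 1.1%).
from statsmodels.stats.proportion import proportions_ztest

n_omi, n_non_omi = 263, 1412 - 263
deaths = [round(0.042 * n_omi), round(0.011 * n_non_omi)]   # ~11 vs ~13 deaths
stat, p = proportions_ztest(deaths, [n_omi, n_non_omi])
print(f"z = {stat:.2f}, p = {p:.4f}")   # consistent with p < .001
```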
Subjects
Myocardial Infarction, Non-ST Elevation Myocardial Infarction, Percutaneous Coronary Intervention, ST Elevation Myocardial Infarction, Humans, Latent Class Analysis, Myocardial Infarction/diagnosis, Myocardial Infarction/therapy, Non-ST Elevation Myocardial Infarction/diagnosis, Percutaneous Coronary Intervention/adverse effects, Registries, ST Elevation Myocardial Infarction/diagnostic imaging, ST Elevation Myocardial Infarction/surgery, Treatment Outcome
ABSTRACT
Background and aims: TACE/ADAM17 is a membrane-bound metalloprotease that cleaves substrates involved in immune and inflammatory responses and plays a role in coronary artery disease (CAD). We measured TACE and its substrates in CAD patients to identify potential biomarkers within this molecular pathway for the prediction of acute coronary syndrome (ACS) and major adverse cardiovascular events (MACE). Methods: Blood samples were obtained from consecutive patients (n = 229) with coronary angiographic evidence of CAD admitted with ACS or electively. MACE were recorded after a median 3-year follow-up. Controls (n = 115) had a <10% CAD risk as per the HeartSCORE. TACE and TIMP3 protein and mRNA levels were measured by ELISA and RT-qPCR, respectively. TACE substrates were measured using a multiplex proximity extension assay. Results: TACE mRNA and cell protein levels (p < 0.01) and plasma levels of the TACE substrates LDLR (p = 0.006), TRANCE (p = 0.045), LAG-3 (p < 0.001) and ACE2 (p < 0.001) were significantly higher in CAD patients versus controls. mRNA levels of the TACE inhibitor TIMP3 were significantly lower in CAD patients and tended to be lower in the ACS population (p < 0.05). The TACE substrates TNFR1 (OR: 3.237, CI: 1.514-6.923, p = 0.002), HB-EGF (OR: 0.484, CI: 0.288-0.813, p = 0.006) and Ep-CAM (OR: 0.555, CI: 0.327-0.829, p = 0.004) accurately classified ACS patients, with HB-EGF and Ep-CAM levels being lower compared with electively admitted patients. TNFR1 (OR: 2.317, CI: 1.377-3.898, p = 0.002) and TNFR2 (OR: 1.902, CI: 1.072-3.373, p = 0.028) were significantly higher on admission in those patients who developed MACE within 3 years. Conclusions: We demonstrate a possible role of the TACE substrates LAG-3, HB-EGF and Ep-CAM in atherosclerotic plaque development and stability. We also underline the importance of measuring TNFR1 and TNFR2 earlier than previously appreciated for MACE prediction. We report an important role of TIMP3 in regulating TACE levels.
ABSTRACT
Glaucoma is a group of optic neuropathies characterised by the degeneration of retinal ganglion cells (RGCs), resulting in damage to the optic nerve head (ONH) and loss of vision in one or both eyes. Increased intraocular pressure (IOP) is one of the major aetiological risk factors in glaucoma, and is currently the only modifiable risk factor. However, 30-40% of glaucoma patients do not present with elevated IOP and still proceed to lose vision. The pathophysiology of glaucoma is therefore not completely understood, and there is a need for the development of IOP-independent neuroprotective therapies to preserve vision. Neuroinflammation has been shown to play a key role in glaucoma and, specifically, the NLRP3 inflammasome, a key driver of inflammation, has recently been implicated. The NLRP3 inflammasome is expressed in the eye, and its activation is reported in pre-clinical studies of glaucoma. Activation of the NLRP3 inflammasome results in IL-1β processing. This pro-inflammatory cytokine is elevated in the blood of glaucoma patients and is believed to drive neurotoxic inflammation, resulting in axon degeneration and the death of RGCs. This review discusses glaucoma as an inflammatory disease and evaluates targeting the NLRP3 inflammasome as a therapeutic strategy. A hypothetical mechanism for the action of the NLRP3 inflammasome in glaucoma is presented.
Subjects
Glaucoma/metabolism, Glaucoma/therapy, Inflammasomes/metabolism, NLR Family Pyrin Domain-Containing 3 Protein/metabolism, Retinal Ganglion Cells/metabolism, Animals, Anti-Inflammatory Agents/chemistry, Axons, Humans, Inflammation, Interleukin-1beta/metabolism, Mice, Neuroprotection, Reactive Oxygen Species, Pattern Recognition Receptors, Risk Factors
ABSTRACT
Vitamin D and cholesterol metabolism overlap significantly in the pathways that contribute to their biosynthesis. However, our understanding of their independent and co-regulation is limited. Cardiovascular disease is the leading cause of death globally, and atherosclerosis, the pathology associated with elevated cholesterol, is the leading cause of cardiovascular disease. It is therefore important to understand vitamin D metabolism as a contributory factor. From the literature, we compile evidence of how these systems interact, relating the understanding of the molecular mechanisms involved to the results from observational studies. We also present the first systems biology pathway map of the joint cholesterol and vitamin D metabolisms, made available using the Systems Biology Graphical Notation (SBGN) Markup Language (SBGNML). It is shown that the relationships between vitamin D supplementation, total cholesterol, and LDL-C status, and between latitude, vitamin D, and cholesterol status, are consistent with our knowledge of molecular mechanisms. We also highlight the results that cannot be explained with our current knowledge of molecular mechanisms: (i) vitamin D supplementation mitigates the side-effects of statin therapy; (ii) statin therapy does not impact upon vitamin D status; and critically (iii) vitamin D supplementation does not improve cardiovascular outcomes, despite improving cardiovascular risk factors. For (iii), we present a hypothesis, based on observations in the literature, that describes how vitamin D regulates the balance between cellular and plasma cholesterol. Answering these questions will create significant opportunities for advancement in our understanding of cardiovascular health.
Subjects
Cardiovascular Diseases/metabolism, Cholesterol/metabolism, Dyslipidemias/metabolism, Vitamin D Deficiency/metabolism, Vitamin D/metabolism, Animals, Cardiovascular Diseases/epidemiology, Cardiovascular Diseases/prevention & control, Cholesterol/blood, LDL Cholesterol/metabolism, Dyslipidemias/drug therapy, Dyslipidemias/epidemiology, Heart Disease Risk Factors, Humans, Hydroxymethylglutaryl-CoA Reductase Inhibitors/therapeutic use, Biological Models, Prognosis, Risk Assessment, Systems Biology, Vitamin D/therapeutic use, Vitamin D Deficiency/drug therapy, Vitamin D Deficiency/epidemiology
ABSTRACT
BACKGROUND: Even in the era of digital technology, several hospitals still rely on paper-based forms for data entry for patient admission, triage, drug prescriptions, and procedures. Paper-based forms can be quick and convenient to complete but often at the expense of data quality, completeness, sustainability, and automated data analytics. Digital forms can improve data quality by assisting the user when deciding on the appropriate response to certain data inputs (eg, classifying symptoms). Greater data quality via digital form completion not only helps with auditing, service improvement, and patient record keeping but also helps with novel data science and machine learning research. Although digital forms are becoming more prevalent in health care, there is a lack of empirical best practices and guidelines for their design. The study hospital had a definite plan to abolish the paper form; hence, it was not necessary to compare the digital forms with the paper form. OBJECTIVE: This study aims to assess the usability of three different interactive forms: a single-page digital form (in which all data input is required on one web page), a multipage digital form, and a conversational digital form (a chatbot). METHODS: The three digital forms were developed as candidates to replace the current paper-based form used to record patient referrals to an interventional cardiology department (Cath-Lab) at Altnagelvin Hospital. We recorded usability data in a counterbalanced usability test (60 usability tests: 20 subjects × 3 forms). The usability data included task completion times, System Usability Scale (SUS) scores, User Experience Questionnaire data, and data from a postexperiment questionnaire. RESULTS: We found that the single-page form outperformed the other two digital forms in almost all usability metrics. The mean SUS score for the single-page form was 76 (SD 15.8; P=.01) compared with the multipage form, which had a mean score of 67 (SD 17); the conversational form attained the lowest scores in usability testing and was the least preferred choice of users, with a mean score of 57 (SD 24). An SUS score of >68 is considered above average. The single-page form also achieved the shortest task completion time of the three digital form styles. CONCLUSIONS: The digital single-page form outperformed the other two forms in almost all usability metrics and had the shortest task completion time. Moreover, in answers to the open-ended question in the final customized postexperiment questionnaire, the single-page form was the preferred choice.
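The SUS scores quoted above follow the standard 0-100 scoring rule: odd items contribute (response − 1), even items contribute (5 − response), and the total is multiplied by 2.5. A minimal sketch with synthetic responses is shown below.

```python
# Standard SUS scoring: 10 items answered on a 1-5 scale.
def sus_score(responses):
    """responses: list of 10 integers in 1..5, in questionnaire order."""
    assert len(responses) == 10
    odd = sum(r - 1 for r in responses[0::2])    # items 1, 3, 5, 7, 9
    even = sum(5 - r for r in responses[1::2])   # items 2, 4, 6, 8, 10
    return 2.5 * (odd + even)

print(sus_score([4, 2, 4, 1, 5, 2, 4, 2, 4, 1]))  # synthetic responses -> 82.5
```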
ABSTRACT
Cellular senescence is a state of growth arrest that occurs after cells encounter various stresses. Senescence contributes to tumour suppression, embryonic development, and wound healing. Senescent cells affect the pathology of various diseases by secreting inflammatory chemokines, immune modulators and other bioactive factors. These secretory biosignatures ultimately cause inflammation, tissue fibrosis, immunosenescence and many ageing-related diseases such as atrial fibrillation (AF). Because the molecular mechanisms underpinning AF development remain unclear, current treatments are suboptimal and have serious side effects. In this review, we summarize recent results describing the role of senescence in AF. We propose that senescence factors induce AF and have a causative role. Hence, targeting senescence and its secretory phenotype may attenuate AF.
Subjects
Atrial Fibrillation, Immunosenescence, Atrial Fibrillation/drug therapy, Cellular Senescence, Drug Development, Fibrosis, Humans
ABSTRACT
BACKGROUND: A 12-lead electrocardiogram (ECG) is the most commonly used method to diagnose patients with cardiovascular diseases. However, there are a number of possible misinterpretations of the ECG that can be caused by several different factors, such as the misplacement of chest electrodes. OBJECTIVE: The aim of this study is to build advanced algorithms to detect precordial (chest) electrode misplacement. METHODS: In this study, we used traditional machine learning (ML) and deep learning (DL) to autodetect the misplacement of electrodes V1 and V2 using features from the resultant ECG. The algorithms were trained using data extracted from high-resolution body surface potential maps of patients who were diagnosed with myocardial infarction or left ventricular hypertrophy, or who had a normal ECG. RESULTS: DL achieved the highest accuracy in this study for detecting V1 and V2 electrode misplacement, with an accuracy of 93.0% (95% CI 91.46-94.53) for misplacement in the second intercostal space. The performance of DL in the second intercostal space was benchmarked against physicians (n=11; mean age 47.3 years, SD 15.5) who were experienced in reading ECGs (mean number of ECGs read in the past year 436.54, SD 397.9). Physicians were poor at recognizing chest electrode misplacement on the ECG and achieved a mean accuracy of 60% (95% CI 56.09-63.90), which was significantly poorer than that of DL (P<.001). CONCLUSIONS: DL provides the best performance for detecting chest electrode misplacement when compared with the ability of experienced physicians. DL and ML could be used to help flag ECGs that have been incorrectly recorded, indicating that the data may be flawed, which could reduce the number of erroneous diagnoses.
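The accuracy confidence intervals reported above can be reproduced from test-set counts with a simple binomial interval; the sketch below uses hypothetical counts chosen only to illustrate the calculation, not the study's actual test set.

```python
# Sketch: point estimate and 95% CI for classification accuracy from counts.
from statsmodels.stats.proportion import proportion_confint

correct, total = 930, 1000          # assumed counts giving 93.0% accuracy
low, high = proportion_confint(correct, total, alpha=0.05, method="normal")
print(f"accuracy = {correct / total:.1%}, 95% CI {low:.2%}-{high:.2%}")
```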
ABSTRACT
BACKGROUND: When a patient is suspected of having an acute myocardial infarction, they are accepted or declined for primary percutaneous coronary intervention partly based on clinical assessment of their 12-lead electrocardiogram (ECG) and ST-elevation myocardial infarction criteria. OBJECTIVE: We retrospectively determined the agreement rate between human (specialists called activator nurses) and computer interpretations of the ECGs of patients who were declined for primary percutaneous coronary intervention. METHODS: Various features of patients who were referred for primary percutaneous coronary intervention were analyzed. Both the human and computer ECG interpretations were simplified to either "suggesting" or "not suggesting" acute myocardial infarction to avoid analysis of complex, heterogeneous, and synonymous diagnostic terms. Agreement analyses were carried out, together with logistic regression to determine whether these ECG interpretations and other variables (such as patient age and chest pain) could predict patient mortality. RESULTS: Of a total of 1464 patients referred to and declined for primary percutaneous coronary intervention, 722 (49.3%) computer diagnoses suggested acute myocardial infarction, whereas 634 (43.3%) of the human interpretations suggested acute myocardial infarction (P<.001). The human and computer agreed that there was a possible acute myocardial infarction for 342 of 1464 (23.3%) patients. However, there was a higher rate of human-computer agreement for patients not having acute myocardial infarctions (450/1464, 30.7%). The overall agreement rate was 54.1% (792/1464). Cohen κ showed poor agreement (κ=0.08, P=.001). Only the age (odds ratio [OR] 1.07, 95% CI 1.05-1.09) and chest pain (OR 0.59, 95% CI 0.39-0.89) independent variables were statistically significant (P=.008) in predicting mortality after 30 days and 1 year. The odds of mortality within 1 year of referral were lower in patients with chest pain compared with those without chest pain. A referral being out of hours was a trending variable (OR 1.41, 95% CI 0.95-2.11, P=.09) for predicting the odds of 1-year mortality. CONCLUSIONS: Mortality in patients who were declined for primary percutaneous coronary intervention was higher than the reported mortality for ST-elevation myocardial infarction patients at 1 year. Agreement between computerized and human ECG interpretation is poor, perhaps leading to a high rate of inappropriate referrals. Work is needed to improve computer and human decision making when reading ECGs to ensure that patients are referred to the correct treatment facility for time-critical therapy.
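The reported Cohen κ can be recomputed directly from the agreement counts given in the results; the sketch below reconstructs the 2×2 table and recovers an observed agreement of about 54% and κ of about 0.08.

```python
# Recompute Cohen's kappa from the human-computer agreement counts reported
# in the abstract.
n_total = 1464
both_mi = 342             # both human and computer suggested MI
both_no_mi = 450          # both suggested no MI
human_mi, computer_mi = 634, 722

po = (both_mi + both_no_mi) / n_total                       # observed agreement
pe = (human_mi / n_total) * (computer_mi / n_total) + \
     ((n_total - human_mi) / n_total) * ((n_total - computer_mi) / n_total)
kappa = (po - pe) / (1 - pe)
print(f"observed agreement = {po:.3f}, kappa = {kappa:.2f}")  # ~0.541, ~0.08
```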
ABSTRACT
INTRODUCTION: Electrode misplacement and interchange errors are known problems when recording the 12-lead electrocardiogram (ECG). Automatic detection of these errors could play an important role in improving clinical decision making and outcomes in cardiac care. The objectives of this systematic review and meta-analysis are to 1) study the impact of electrode misplacement on ECG signals and ECG interpretation, 2) determine the most challenging electrode misplacements to detect using machine learning (ML), 3) analyse the ML performance of algorithms that detect electrode misplacement or interchange in terms of sensitivity and specificity, and 4) identify the most commonly used ML techniques for detecting electrode misplacement/interchange. This review analysed the current literature regarding electrode misplacement/interchange recognition accuracy using machine learning techniques. METHOD: A search of three online databases, including IEEE, PubMed and ScienceDirect, identified 228 articles, while three articles were included from additional sources suggested by co-authors. According to the eligibility criteria, 14 articles were selected. The selected articles were considered for qualitative analysis and meta-analysis. RESULTS: The articles showed the effect of lead interchange on ECG morphology and, as a consequence, on patient diagnoses. Statistical analysis of the included articles found that machine learning performance is high in detecting electrode misplacement/interchange, except for left arm/left leg interchange. CONCLUSION: This review emphasises the importance of detecting electrode misplacement in ECG diagnosis and its effects on decision making. Machine learning shows promise in detecting lead misplacement/interchange and highlights an opportunity for developing and operationalising deep learning algorithms such as convolutional neural networks (CNNs) to detect electrode misplacement/interchange.
Subjects
Electrocardiography, Machine Learning, Algorithms, Electrodes, Humans, Neural Networks (Computer)
ABSTRACT
OBJECTIVES: Timely prehospital diagnosis and treatment of acute coronary syndrome (ACS) are required to achieve optimal outcomes. Clinical decision support systems (CDSS) are platforms designed to integrate multiple data sources and can aid with management decisions in the prehospital environment. The review aim was to describe the accuracy of CDSS and their individual components in prehospital ACS management. METHODS: This systematic review examined the current literature regarding the accuracy of CDSS for ACS in the prehospital setting, the influence of computer-aided decision-making, and of 4 components: electrocardiogram, biomarkers, patient history, and examination findings. The impact of these components on sensitivity, specificity, and positive and negative predictive values was assessed. RESULTS: A total of 11,439 articles were identified from a search of databases, of which 199 were screened against the eligibility criteria. Eight studies were found to meet the eligibility and quality criteria. There was marked heterogeneity between studies, which precluded formal meta-analysis. However, analysis of individual components found that patient history led to significant improvement in sensitivity and negative predictive values. CDSS that incorporated all 4 components tended to show higher sensitivities and negative predictive values. CDSS incorporating computer-aided electrocardiogram diagnosis showed higher specificities and positive predictive values. CONCLUSIONS: Although heterogeneity precluded meta-analysis, this review emphasizes the potential of prehospital ACS CDSS that incorporate patient history in addition to integrating multiple components. The higher sensitivity of certain components, along with the higher specificity of computer-aided decision-making, highlights the opportunity for developing an integrated algorithm with computer-aided decision support.
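For reference, the four metrics assessed in this review derive directly from a 2×2 confusion matrix; a minimal helper with synthetic counts is sketched below.

```python
# Minimal helper for sensitivity, specificity, PPV and NPV from a 2x2 table.
def diagnostic_metrics(tp, fp, fn, tn):
    return {
        "sensitivity": tp / (tp + fn),
        "specificity": tn / (tn + fp),
        "ppv": tp / (tp + fp),
        "npv": tn / (tn + fn),
    }

print(diagnostic_metrics(tp=80, fp=30, fn=20, tn=170))   # synthetic counts
```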
Subjects
Acute Coronary Syndrome/diagnosis, Algorithms, Clinical Decision Support Systems/organization & administration, Electrocardiography, Emergency Medical Services/methods, Humans, Predictive Value of Tests
ABSTRACT
This article retrospectively analyses a primary percutaneous coronary intervention dataset comprising patient referrals that were accepted for percutaneous coronary intervention and those that were turned down between January 2015 and December 2018 at Altnagelvin Hospital (United Kingdom). Time series analysis of these referrals was undertaken to analyse the referral rates per year, month, day and hour. Of the overall referrals, 70 per cent (n = 1466, p < 0.001) were male. Of the total referrals, 65 per cent (p < 0.001) were 'out of hours'. Seasonality decomposition shows a peak in referrals on average every 3 months (standard deviation = 0.83). No significant correlation (R = 0.03, p = 0.86; R = -0.11, p = 0.62) was found between the referral numbers and the turndown rate. Being female increased the probability of an out-of-hours referral in all groups. The 30-day mortality was higher in the turndown group. The time series of all referrals shows variation over months and days that is not the same each year. The average age of the patients in the turndown group is higher. The number of referrals does not impact the turndown rate or clinical decision making. Most patients are being referred out of hours, especially females. This analysis emphasises the importance of a 24/7 Cath-Lab service.
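Seasonality decomposition of the kind described here can be reproduced with standard time series tooling; the sketch below decomposes a synthetic monthly referral series with an approximately 3-month cycle, purely as an illustration of the method rather than the hospital's data.

```python
# Illustrative seasonal decomposition of a synthetic monthly referral series.
import numpy as np
import pandas as pd
from statsmodels.tsa.seasonal import seasonal_decompose

rng = np.random.default_rng(42)
idx = pd.date_range("2015-01-01", "2018-12-01", freq="MS")   # monthly counts
referrals = pd.Series(
    50 + 10 * np.sin(np.arange(len(idx)) * 2 * np.pi / 3) + rng.normal(0, 3, len(idx)),
    index=idx,
)
result = seasonal_decompose(referrals, model="additive", period=3)
print(result.seasonal.head(6))   # shows the roughly 3-month seasonal component
```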
Subjects
Percutaneous Coronary Intervention, ST Elevation Myocardial Infarction, Female, Humans, Male, Referral and Consultation, Retrospective Studies, Risk Factors, Time Factors, Treatment Outcome, United Kingdom
ABSTRACT
BACKGROUND: Electrocardiogram (ECG) lead misplacement can adversely affect ECG diagnosis and subsequent clinical decisions. V1 and V2 are commonly placed superior to their correct position. The aim of the current study was to use machine learning approaches to detect V1 and V2 lead misplacement to enhance ECG data quality. METHOD: ECGs for 453 patients (normal n = 151, Left Ventricular Hypertrophy (LVH) n = 151, Myocardial Infarction n = 151) were extracted from body surface potential maps. These were used to extract both the correctly and incorrectly placed V1 and V2 leads. The prevalence of correct and incorrect leads was 50%. Sixteen features were extracted in three different domains: time-based, statistical and time-frequency features using a wavelet transform. A hybrid feature selection approach was applied to select an optimal set of features. To ensure optimal model selection, five classifiers were used and compared. The aforementioned feature selection approach and classifiers were applied for V1 and V2 misplacement in three different positions: the first, second and third intercostal spaces (ICS). RESULTS: The accuracy for V1 misplacement detection was 93.9%, 89.3% and 72.8% in the first, second and third ICS, respectively. In V2, the accuracy was 93.6%, 86.6% and 68.1% in the first, second and third ICS, respectively. There is a noticeable decline in accuracy when detecting misplacement in the third ICS, which is expected.
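The time-frequency features mentioned above are typically obtained by wavelet decomposition followed by simple per-sub-band summaries; a minimal sketch with a synthetic signal and an assumed wavelet choice is shown below.

```python
# Sketch of wavelet-based time-frequency feature extraction for one lead.
import numpy as np
import pywt

def wavelet_features(signal, wavelet="db4", level=4):
    coeffs = pywt.wavedec(signal, wavelet, level=level)
    feats = []
    for c in coeffs:                                   # approximation + detail sub-bands
        feats += [np.mean(c), np.std(c), np.sum(c ** 2)]   # mean, spread, energy
    return np.array(feats)

ecg_lead = np.sin(np.linspace(0, 20 * np.pi, 1000))    # placeholder V1 signal
print(wavelet_features(ecg_lead).shape)                # (15,) for level = 4
```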