Results 1 - 20 of 35
1.
Sensors (Basel); 23(1), 2022 Dec 29.
Article in English | MEDLINE | ID: mdl-36616958

ABSTRACT

Inertial sensors are widely used in human motion monitoring, with orientation and position being the two most commonly used measurements. Tracking with multiple inertial sensors is based on kinematic modelling, which achieves a good level of accuracy when biomechanical constraints are applied. More recently, there has been growing interest in tracking motion with a single inertial sensor to simplify the measurement system. The dead reckoning method is commonly used to estimate position from inertial sensors. However, significant errors arise after applying dead reckoning because of sensor offsets and drift. These errors limit the feasibility of monitoring upper limb motion with a single inertial sensing system. In this paper, error correction methods, including zero velocity update, wavelet analysis and high-pass filtering, are evaluated to investigate the feasibility of using a single sensor to track the movement of one upper limb segment. The experiments were carried out using the nine-hole peg test. The results show that zero velocity update is the most effective method for correcting drift in dead reckoning-based position tracking; with this correction, the use of a single inertial sensor to track the movement of a single limb segment is feasible.
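The drift-correction idea in this abstract can be illustrated with a short sketch. The following is a minimal, hypothetical Python example (not the authors' implementation) of dead reckoning with a zero velocity update; the sampling rate, stillness threshold and synthetic data are assumptions for illustration only.

```python
import numpy as np

FS = 100.0                  # assumed sampling rate (Hz)
DT = 1.0 / FS

def dead_reckon_with_zupt(accel, still_threshold=0.05):
    """Integrate linear acceleration (gravity removed, m/s^2) into velocity
    and position, resetting velocity to zero whenever the sensor appears
    stationary so that integration drift cannot accumulate."""
    velocity = np.zeros_like(accel)
    position = np.zeros_like(accel)
    for k in range(1, len(accel)):
        velocity[k] = velocity[k - 1] + accel[k] * DT
        if np.linalg.norm(accel[k]) < still_threshold:   # zero velocity update
            velocity[k] = 0.0
        position[k] = position[k - 1] + velocity[k] * DT
    return velocity, position

# Toy example: a short burst of movement followed by rest.
t = np.arange(0.0, 2.0, DT)
accel = np.where(t < 0.5, 0.5, 0.0)[:, None] * np.ones((1, 3))
velocity, position = dead_reckon_with_zupt(accel)
print(position[-1])          # final position estimate (m)
```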


Subject(s)
Movement, Upper Extremity, Humans, Motion, Biomechanical Phenomena
2.
Eur J Public Health ; 29(2): 320-328, 2019 04 01.
Article in English | MEDLINE | ID: mdl-30239699

ABSTRACT

BACKGROUND: Research into the use of digital technology for weight loss maintenance (intentionally losing at least 10% of initial body weight and actively maintaining it) is limited. The aim of this article was to systematically review randomized controlled trials (RCTs) reporting on the use of digital technologies for communicating on weight loss maintenance, to determine their effectiveness and to identify gaps and areas for further research. METHODS: A systematic literature review was conducted by searching electronic databases for publications dated between 2006 and February 2018. Inclusion criteria were applied, and RCTs using digital technologies for weight loss maintenance were selected. RESULTS: Seven RCTs were selected from a total of 6541 hits after de-duplication and application of the criteria. Three trials used text messaging, one used e-mail, one used a web-based system and two compared such a system with face-to-face contact. Of the seven RCTs, one included children (n = 141) and reported no difference in BMI standard deviation between groups. Four of the seven trials reported that technology is effective for significantly aiding weight loss maintenance compared with control (no contact) or face-to-face contact in the short term (between 3 and 24 months). CONCLUSIONS: Digital technologies have the potential to be effective communication tools for significantly aiding weight loss maintenance, especially in the short term (3 to 24 months). Further research is required into the long-term effectiveness of contemporary technologies.


Subject(s)
Electronic Mail, Text Messaging, Weight Reduction Programs/methods, Body Mass Index, Cost-Benefit Analysis, Humans, Internet, Randomized Controlled Trials as Topic
3.
J Electrocardiol ; 57S: S51-S55, 2019.
Article in English | MEDLINE | ID: mdl-31668699

ABSTRACT

BACKGROUND: Body surface potential mapping (BSPM) provides additional electrophysiological information that can be useful for the detection of cardiac diseases. Moreover, BSPMs are currently utilized in electrocardiographic imaging (ECGI) systems within clinical practice. Missing information due to noisy recordings or poor electrode contact is inevitable. In this study, we present an interpolation method that combines Laplacian minimization and principal component analysis (PCA) techniques for interpolating this missing information. METHOD: The dataset consisted of 117-lead BSPMs recorded from 744 subjects (a training set of 384 subjects and a test set of 360). This dataset is a mixture of normal subjects and subjects with old myocardial infarction or left ventricular hypertrophy. Missing data were simulated by ignoring data recorded from 7 regions: the first region represents three rows of five electrodes on the anterior torso surface (a high potential gradient region), and the other six regions were realistic patterns drawn from clinical data, representing the most likely regions of broken electrodes. Three interpolation methods (PCA-based interpolation, Laplacian interpolation and a hybrid Laplacian-PCA interpolation) were used to interpolate the missing data from the remaining electrodes. In the simulated regions of missing data, the potentials calculated by each interpolation method were compared with the measured potentials using relative error (RE) and correlation coefficient (CC) over time. In the hybrid Laplacian-PCA method, the missing data are first interpolated using Laplacian interpolation; the resulting BSPM of 117 potentials is then multiplied by the (117 × 117) coefficient matrix calculated from the training set to obtain the principal components. Of the 117 principal components (PCs), the first 15 were used for the second stage of interpolation, as they gave the best interpolation performance. RESULTS: The differences in the median relative error (RE) between the Laplacian and hybrid methods ranged from 0.01 to 0.35 (p < 0.001), while the differences in the median correlation between them ranged from 0.0006 to 0.034 (p < 0.001). The PCA-based interpolation method performed poorly, especially in scenarios where 12 or more electrodes were missing, creating a large region of missing data. The median RE for the PCA method was between 0.05 and 0.6 lower than that of the hybrid method (p < 0.001); however, the median correlation was between 0.0002 and 0.26 lower than that of the hybrid method (p < 0.001). CONCLUSION: Comparison of the three interpolation methods (Laplacian, PCA, hybrid) in reconstructing missing BSPM data showed that the hybrid method was better than the other methods in all scenarios, irrespective of whether the number of missing electrodes was high or low and of the location of the missing electrodes.
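As a rough illustration of the second (PCA) stage of the hybrid method described above, the sketch below projects a Laplacian-pre-filled 117-lead map onto principal components learned from a training set and uses the reconstruction to refine the missing leads. The array shapes, the random placeholder data and the choice to overwrite only the missing leads are assumptions for illustration, not the authors' implementation.

```python
import numpy as np
from sklearn.decomposition import PCA

N_LEADS = 117
N_PCS = 15            # the paper retains the first 15 principal components

# training_maps: (n_subjects, 117) matrix of complete training BSPMs
training_maps = np.random.randn(384, N_LEADS)            # placeholder data
pca = PCA(n_components=N_PCS).fit(training_maps)

def refine_with_pca(laplacian_filled_map, missing_idx):
    """Project a Laplacian-interpolated 117-lead map onto the training PCs
    and replace only the missing leads with the PCA reconstruction."""
    scores = pca.transform(laplacian_filled_map[None, :])      # (1, 15)
    reconstruction = pca.inverse_transform(scores)[0]          # (117,)
    refined = laplacian_filled_map.copy()
    refined[missing_idx] = reconstruction[missing_idx]
    return refined

# Example: pretend leads 10-24 were missing and already Laplacian-filled.
filled_map = np.random.randn(N_LEADS)
print(refine_with_pca(filled_map, missing_idx=np.arange(10, 25)))
```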


Subject(s)
Body Surface Potential Mapping, Electrocardiography, Myocardial Infarction, Electrodes, Humans, Left Ventricular Hypertrophy, Myocardial Infarction/diagnosis
5.
J Electrocardiol ; 51(6S): S6-S11, 2018.
Article in English | MEDLINE | ID: mdl-30122457

ABSTRACT

INTRODUCTION: Interpretation of the 12-lead electrocardiogram (ECG) is normally assisted by an automated diagnosis (AD), which can facilitate an 'automation bias' whereby interpreters become anchored to the suggestion. In this paper, we studied: 1) the effect of an incorrect AD on interpretation accuracy and interpreter confidence (a proxy for uncertainty), and 2) whether confidence and other interpreter features can predict interpretation accuracy using machine learning. METHODS: This study analysed 9000 ECG interpretations from cardiology and non-cardiology fellows (CFs and non-CFs). One third of the ECGs involved no AD, one third were presented with an AD (half of which were incorrect) and one third had multiple ADs. Interpretations were scored, and interpreter confidence was recorded for each interpretation and subsequently standardised using sigma scaling. Spearman coefficients were used for correlation analysis, and C5.0 decision trees were used to predict interpretation accuracy from basic interpreter features such as confidence, age, experience and designation. RESULTS: Interpretation accuracies achieved by CFs and non-CFs dropped by 43.20% and 58.95% respectively when an incorrect AD was presented (p < 0.001). The overall correlation between scaled confidence and interpretation accuracy was higher amongst CFs. However, the correlation between confidence and interpretation accuracy decreased for both groups when an incorrect AD was presented. We found that an incorrect AD disturbs the reliability of interpreter confidence in predicting accuracy, and that an incorrect AD has a greater effect on the confidence of non-CFs (not statistically significant, but close to the threshold, p = 0.065). The best C5.0 decision tree achieved an accuracy of 64.67% (p < 0.001); however, this is only 6.56% greater than the no-information rate. CONCLUSION: Incorrect ADs reduce the interpreter's diagnostic accuracy, indicating an automation bias. Non-CFs tend to agree more with the ADs than CFs, hence less expert physicians are more affected by automation bias. Incorrect ADs reduce the interpreter's confidence and also reduce the predictive power of confidence for predicting accuracy (even more so for non-CFs). Whilst a statistically significant model was developed, it is difficult to predict interpretation accuracy using machine learning on basic features such as interpreter confidence, age, reader experience and designation.
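To make the prediction task concrete, here is a hedged sketch of a decision-tree classifier over basic interpreter features. The study used C5.0 trees, which are not available in scikit-learn, so an ordinary CART tree stands in purely for illustration; the simulated features and labels below are placeholders, not the study data.

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(0)
n = 1000
X = np.column_stack([
    rng.normal(0, 1, n),          # sigma-scaled confidence
    rng.integers(25, 60, n),      # age
    rng.integers(0, 30, n),       # years of experience
    rng.integers(0, 2, n),        # designation (0 = non-CF, 1 = CF)
])
y = rng.integers(0, 2, n)         # 1 = correct interpretation (placeholder labels)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
tree = DecisionTreeClassifier(max_depth=4).fit(X_tr, y_tr)   # CART stand-in for C5.0
print("held-out accuracy:", tree.score(X_te, y_te))
```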


Subject(s)
Cardiac Arrhythmias/diagnosis, Automation, Clinical Competence, Diagnostic Errors/statistics & numerical data, Electrocardiography, Bias, Decision Trees, Humans, Observer Variation, Uncertainty
6.
Rural Remote Health ; 18(4): 4618, 2018 10.
Article in English | MEDLINE | ID: mdl-30368234

ABSTRACT

INTRODUCTION: People who experience an ST-elevation myocardial infarction (STEMI) due to an occluded coronary artery require prompt treatment. Treatments to open a blocked artery are called reperfusion therapies (RTs) and include intravenous pharmacological thrombolysis (TL) or primary percutaneous coronary intervention (pPCI) in a cardiac catheterisation laboratory (cath lab). Optimal RT (ORT) with pPCI or TL reduces morbidity and mortality. In remote areas, a number of geographical and organisational barriers may influence access to ORT. These are not well understood, and the exact proportion of patients who receive ORT, and its relationship to time of day and remoteness from the cardiac cath lab, is unknown. The aim of this retrospective study was to compare the characteristics of ORT delivery in central and remote locations in the north of Scotland and to identify potential barriers to optimal care with a view to service redesign. METHOD: The study was set in the north of Scotland. All patients who attended hospital with a STEMI between March 2014 and April 2015 were identified from national coding data. A data collection form was developed by the research team in several iterative stages. Clinical details were collected retrospectively from patients' discharge letters. Data included treatment location, date of admission, distance of the patient from the cath lab, route of access to health care, left ventricular function and RT received. Patients were classed as remote if they were more than 90 minutes of driving time from the cardiac cath lab and central if they were 90 minutes or less of driving time from the regional centre. For patients who made contact in a pre-hospital setting, ORT was defined as pre-hospital TL (PHT) or pPCI. For patients who self-presented to the hospital first, ORT was defined as in-hospital TL or pPCI. Data were described as mean (standard deviation) as appropriate. Chi-squared and Student's t-tests were used as appropriate. Each case was reviewed to determine if ORT was received; if ORT was not received, the reasons were recorded to identify potentially modifiable barriers. RESULTS: Of 627 acute myocardial infarction patients initially identified, 131 had a STEMI; the others were non-STEMI. From this STEMI cohort, 82 (62%) patients were classed as central and 49 (38%) as remote. In terms of initial therapy, 26 (20%) received pPCI, 19 (15%) received PHT, 52 (40%) received in-hospital TL and 33 (25%) received no initial RT. ORT was received by 53 (65%) central and 20 (41%) remote patients (χ²=7.05, degrees of freedom=130, p<0.01). Several recurring barriers were identified. CONCLUSION: This study has demonstrated a significant health inequality between the treatment of STEMI in remote compared with central locations. Potential barriers identified include staffing availability and training, public awareness and inter-hospital communication. This suggests that there remain significant opportunities to improve STEMI care for people living in the north of Scotland.


Subject(s)
Delivery of Health Care/standards, ST Elevation Myocardial Infarction/therapy, Aged, Female, Humans, Male, Middle Aged, Residence Characteristics, Retrospective Studies, Scotland, Time-to-Treatment, Travel, Treatment Outcome
7.
J Electrocardiol ; 50(6): 781-786, 2017.
Article in English | MEDLINE | ID: mdl-28903861

ABSTRACT

BACKGROUND: The 12-lead electrocardiogram (ECG) has been used to detect cardiac abnormalities in the same format for more than 70 years. However, due to the complex nature of 12-lead ECG interpretation, a significant cognitive workload is required from the interpreter. This complexity often leads to errors in diagnosis and subsequent treatment. We have previously reported on the development of an ECG interpretation support system designed to augment the human interpretation process, a computerised decision support system named 'Interactive Progressive based Interpretation' (IPI). In this study, a decision support algorithm was built into the IPI system to suggest potential diagnoses based on the interpreter's annotations of the 12-lead ECG. We hypothesise that semi-automatic interpretation using a digital assistant can be an optimal man-machine model for ECG interpretation. OBJECTIVES: To improve interpretation accuracy and reduce missed co-abnormalities. METHODS: The Differential Diagnoses Algorithm (DDA) was developed using web technologies: diagnostic ECG criteria are defined in an open storage format, JavaScript Object Notation (JSON), which is queried using a rule-based reasoning algorithm to suggest diagnoses. To test our hypothesis, a counterbalanced trial was designed in which subjects interpreted ECGs using the conventional approach and using the IPI+DDA approach. RESULTS: A total of 375 interpretations were collected. The IPI+DDA approach improved diagnostic accuracy by 8.7% (although not statistically significant, p=0.1852), and the IPI+DDA suggested the correct interpretation more often than the human interpreter in 7/10 cases (with varying statistical significance). Human interpretation accuracy increased to 70% when seven suggestions were generated. CONCLUSION: Although the results were not statistically significant, we found that: 1) our decision support tool increased the number of correct interpretations, 2) the DDA algorithm suggested the correct interpretation more often than humans, and 3) as many as seven computerised diagnostic suggestions augmented human decision making in ECG interpretation. Statistical significance may be achieved by expanding the sample size.
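A minimal sketch of the DDA idea (JSON-encoded diagnostic criteria queried by a rule-based matcher over the interpreter's annotations) is shown below. The criteria, feature names and thresholds are invented placeholders rather than the published rule set.

```python
import json

criteria_json = """
[
  {"diagnosis": "Sinus bradycardia",
   "rules": [{"feature": "heart_rate", "op": "<", "value": 60},
             {"feature": "rhythm", "op": "==", "value": "sinus"}]},
  {"diagnosis": "First-degree AV block",
   "rules": [{"feature": "pr_interval_ms", "op": ">", "value": 200}]}
]
"""

OPS = {"<": lambda a, b: a < b, ">": lambda a, b: a > b, "==": lambda a, b: a == b}

def rule_holds(annotations, rule):
    value = annotations.get(rule["feature"])
    return value is not None and OPS[rule["op"]](value, rule["value"])

def suggest_diagnoses(annotations, criteria):
    """Return every diagnosis whose rules are all satisfied by the
    interpreter's annotations of the 12-lead ECG."""
    return [item["diagnosis"] for item in criteria
            if all(rule_holds(annotations, r) for r in item["rules"])]

annotations = {"heart_rate": 52, "rhythm": "sinus", "pr_interval_ms": 180}
print(suggest_diagnoses(annotations, json.loads(criteria_json)))  # ['Sinus bradycardia']
```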


Subject(s)
Algorithms, Clinical Decision Support Systems, Diagnostic Errors/prevention & control, Electrocardiography, Clinical Competence, Differential Diagnosis, Humans, Man-Machine Systems, Software
8.
J Electrocardiol ; 50(6): 776-780, 2017.
Article in English | MEDLINE | ID: mdl-28843654

ABSTRACT

BACKGROUND: In clinical practice, data archiving of resting 12-lead electrocardiograms (ECGs) is mainly achieved by storing a PDF report in the hospital electronic health record (EHR). When available, digital ECG source data (raw samples) are retained only within the ECG management system. OBJECTIVE: The widespread availability of ECG source data would undoubtedly permit subsequent analysis and facilitate longitudinal studies, with both scientific and diagnostic benefits. METHODS & RESULTS: PDF-ECG is a hybrid archival format that allows the standard graphical report of an ECG and its source data (waveforms) to be stored in the same file. Using PDF-ECG as a model to address the challenges of ECG data portability, long-term archiving and documentation, a real-world proof-of-concept test was conducted in a hospital in northern Italy. A set of volunteers underwent a basic ECG using routine hospital equipment and the source data were captured. Using dedicated web services, PDF-ECG documents were then generated and seamlessly uploaded to the hospital EHR, replacing the standard PDF reports automatically generated at the time of acquisition. Finally, the PDF-ECG files could be successfully retrieved and re-analyzed. CONCLUSION: Adding PDF-ECG to an existing EHR had a minimal impact on the hospital's workflow, while preserving the ECG digital data.


Subject(s)
Electrocardiography, Electronic Health Records, Information Storage and Retrieval/methods, Humans, Software, Systems Integration, Workflow
9.
J Biomed Inform ; 64: 93-107, 2016 12.
Article in English | MEDLINE | ID: mdl-27687552

ABSTRACT

INTRODUCTION: The 12-lead electrocardiogram (ECG) presents a plethora of information and demands extensive knowledge and a high cognitive workload to interpret. Whilst the ECG is an important clinical tool, it is frequently interpreted incorrectly. Even expert clinicians are known to impulsively provide a diagnosis based on their first impression and often miss co-abnormalities. Given that a lack of competency in ECG interpretation is widely reported, it is imperative to optimise the interpretation process. The ECG interpretation process remains predominantly paper based, and whilst computer algorithms assist interpreters by providing printed computerised diagnoses, there is a lack of interactive human-computer interfaces to guide and assist the interpreter. METHODS: An interactive computing system was developed to guide the decision making process of a clinician when interpreting the ECG. The system decomposes the interpretation process into a series of interactive sub-tasks and encourages the clinician to systematically interpret the ECG. We have named this model 'Interactive Progressive based Interpretation' (IPI) as the user cannot 'progress' unless they complete each sub-task. Using this model, the ECG is segmented into five parts and presented over five user interfaces (1: rhythm interpretation; 2: interpretation of the P-wave morphology; 3: limb lead interpretation; 4: QRS morphology interpretation with chest lead and rhythm strip presentation; 5: final review of the 12-lead ECG). The IPI model was implemented using emerging web technologies (i.e. HTML5, CSS3, AJAX, PHP and MySQL). It was hypothesised that this system would reduce the number of interpretation errors and increase diagnostic accuracy in ECG interpreters. To test this, we compared the diagnostic accuracy of clinicians who used the standard approach (control cohort) with clinicians who interpreted the same ECGs using the IPI approach (IPI cohort). RESULTS: For the control cohort, the mean ECG interpretation accuracy was 45.45% (SD=18.1%; CI=42.07, 48.83). The mean ECG interpretation accuracy for the IPI cohort was 58.85% (SD=42.4%; CI=49.12, 68.58), indicating a positive mean difference of 13.4% (CI=4.45, 22.35). An N-1 chi-square test of independence indicated a 92% chance that the IPI cohort will have a higher accuracy rate. Interpreter self-rated confidence also increased between cohorts, from a mean of 4.9/10 in the control cohort to 6.8/10 in the IPI cohort (p=0.06). Whilst the IPI cohort had greater diagnostic accuracy, the duration of ECG interpretation was six times longer compared with the control cohort. CONCLUSIONS: We have developed a system that segments and presents the ECG across five graphical user interfaces. Results indicate that this approach improves diagnostic accuracy but at the expense of time, which is a valuable resource in medical practice.
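The 'progressive' gating that gives the IPI model its name can be summarised in a few lines. The sketch below is an assumed illustration in Python (the published system was built with HTML5/PHP/MySQL); the class, method names and sub-task labels are paraphrased from the abstract rather than taken from the authors' code.

```python
SUB_TASKS = [
    "Rhythm interpretation",
    "P-wave morphology",
    "Limb lead interpretation",
    "QRS morphology (chest leads + rhythm strip)",
    "Final review of the 12-lead ECG",
]

class IPISession:
    """Ordered sub-tasks; the next stage is unlocked only after the current
    sub-task has been completed, mirroring the IPI 'progress' constraint."""
    def __init__(self):
        self.current = 0
        self.findings = {}

    def submit(self, annotation):
        if not annotation:
            raise ValueError("Sub-task must be completed before progressing.")
        self.findings[SUB_TASKS[self.current]] = annotation
        if self.current < len(SUB_TASKS) - 1:
            self.current += 1
        return SUB_TASKS[self.current]

session = IPISession()
print(session.submit("Sinus rhythm, ~75 bpm"))   # unlocks 'P-wave morphology'
```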


Subject(s)
Algorithms, Clinical Decision-Making, Electrocardiography, Heart Diseases/diagnosis, User-Computer Interface, Humans
10.
J Electrocardiol ; 49(6): 871-876, 2016.
Article in English | MEDLINE | ID: mdl-27717571

ABSTRACT

Automated detection of atrial fibrillation (AF) from the electrocardiogram (ECG) remains a challenge. In this study, we investigated two multivariate classification techniques, Random Forests (RF) and k-nearest neighbour (k-nn), for improved automated detection of AF from the ECG. We compiled a new database from ECG data taken from existing sources. R-R intervals were then analysed using four previously described R-R irregularity measurements: (1) the coefficient of sample entropy (CoSEn), (2) the coefficient of variance (CV), (3) the root mean square of the successive differences (RMSSD), and (4) the median absolute deviation (MAD). Using the outputs from all four R-R irregularity measurements, RF and k-nn models were trained. RF classification improved AF detection over CoSEn, with overall specificity of 80.1% vs. 98.3% and positive predictive value of 51.8% vs. 92.1% (CoSEn vs. RF), with a reduction in sensitivity, 97.6% vs. 92.8%. k-nn also improved specificity and PPV over CoSEn; however, the sensitivity of this approach was considerably reduced (68.0%).
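A hedged sketch of the feature-and-classifier pipeline described above follows: three of the four R-R irregularity measures (CV, RMSSD and MAD) are computed from windows of R-R intervals and fed to a Random Forest. CoSEn is omitted for brevity, and the simulated R-R data are placeholders rather than the study database.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

def rr_features(rr):
    """Compute irregularity features from a window of R-R intervals (seconds)."""
    diffs = np.diff(rr)
    cv = np.std(rr) / np.mean(rr)                    # coefficient of variance
    rmssd = np.sqrt(np.mean(diffs ** 2))             # RMS of successive differences
    mad = np.median(np.abs(rr - np.median(rr)))      # median absolute deviation
    return [cv, rmssd, mad]

rng = np.random.default_rng(1)
sinus = [rng.normal(0.85, 0.02, 30) for _ in range(200)]   # regular R-R windows
af = [rng.normal(0.70, 0.15, 30) for _ in range(200)]      # irregular R-R windows
X = np.array([rr_features(w) for w in sinus + af])
y = np.array([0] * len(sinus) + [1] * len(af))              # 1 = AF

clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X, y)
print(clf.predict([rr_features(rng.normal(0.7, 0.18, 30))]))  # expect AF (1)
```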


Subject(s)
Algorithms, Atrial Fibrillation/diagnosis, Computer-Assisted Diagnosis/methods, Electrocardiography/methods, Heart Rate Determination/methods, Automated Pattern Recognition/methods, Adolescent, Adult, Aged, Aged 80 and over, Female, Humans, Male, Middle Aged, Reproducibility of Results, Sensitivity and Specificity, Young Adult
11.
J Electrocardiol ; 49(6): 794-799, 2016.
Article in English | MEDLINE | ID: mdl-27609012

ABSTRACT

The 'spatial QRS-T angle' (SA) is frequently determined using linear lead transformation matrices that require the entire 12-lead electrocardiogram (ECG). While this approach is adequate when using 12-lead ECG data recorded in the resting supine position, it is not optimal in monitoring applications, because maintaining a good quality recording of the complete 12-lead ECG during monitoring is difficult. In this research, we assessed the differences between the 'gold standard' SA as determined using the Frank VCG and the SA as determined using different reduced lead systems (RLSs). The random error component (span of the Bland-Altman 95% limits of agreement) of the differences between the 'gold standard' SA and the SA values based upon the different RLSs was quantified. This was performed for all 62 RLSs that can be constructed from Mason-Likar (ML) limb leads I, II and all possible precordial lead subsets that contain between one and five of the precordial leads V1 to V6. The RLS with the smallest lead set size that produced SA estimates of a quality similar to that achieved using the ML 12-lead ECG was based upon ML limb leads I, II and precordial leads V1, V3 and V6. The random error components (mean [95% confidence interval]) associated with this RLS and the ML 12-lead ECG were 40.74° [35.56°-49.29°] and 39.57° [33.78°-45.70°], respectively. Our findings suggest that an RLS based upon the ML limb leads I and II and the three best precordial leads can yield SA estimates of a quality similar to that achieved when using the complete ML 12-lead ECG.
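For reference, the spatial QRS-T angle itself reduces to the angle between the mean QRS and mean T vectors in vectorcardiographic X, Y, Z space. The sketch below assumes pre-identified QRS and T windows and uses synthetic data; it illustrates the definition rather than the authors' pipeline.

```python
import numpy as np

def spatial_qrs_t_angle(vcg, qrs_window, t_window):
    """vcg: (n_samples, 3) array of X, Y, Z leads; windows are (start, end)
    sample indices for the QRS complex and the T wave."""
    qrs_mean = vcg[slice(*qrs_window)].mean(axis=0)   # mean QRS vector
    t_mean = vcg[slice(*t_window)].mean(axis=0)       # mean T vector
    cosine = np.dot(qrs_mean, t_mean) / (
        np.linalg.norm(qrs_mean) * np.linalg.norm(t_mean))
    return np.degrees(np.arccos(np.clip(cosine, -1.0, 1.0)))

# Example with synthetic vectorcardiographic data (placeholder values).
vcg = np.random.randn(600, 3)
print(spatial_qrs_t_angle(vcg, qrs_window=(100, 180), t_window=(300, 450)))
```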


Subject(s)
Algorithms, Computer-Assisted Diagnosis/methods, Electrocardiography/instrumentation, Electrocardiography/methods, Left Ventricular Hypertrophy/diagnosis, Myocardial Infarction/diagnosis, Adult, Aged, Humans, Middle Aged, Reproducibility of Results, Sensitivity and Specificity
12.
J Electrocardiol ; 49(6): 911-918, 2016.
Article in English | MEDLINE | ID: mdl-27662775

ABSTRACT

INTRODUCTION: The CardioQuick Patch® (CQP) has been developed to assist operators in accurately positioning precordial electrodes during 12-lead electrocardiogram (ECG) acquisition. This study describes the CQP design and assesses the device in comparison to conventional electrode application. METHODS: Twenty ECG technicians were recruited and a total of 60 ECG acquisitions were performed on the same patient model over four phases: (1) all participants applied single electrodes to the patient; (2) all participants were then re-trained on electrode placement and on how to use the CQP; (3) participants were randomly divided into two groups: the standard group applied single electrodes and the CQP group used the CQP; (4) after a one-day interval, the same participants returned to carry out the same procedure on the same patient (measuring intra-practitioner variability). Accuracy was measured with reference to correct locations pre-marked using ultraviolet ink. The NASA-TLX was used to measure cognitive workload and the System Usability Scale (SUS) was used to quantify the usability of the CQP. RESULTS: There was a large difference between the minimum times taken to complete each approach (CQP=38.58s vs. 65.96s). The standard group exhibited significant levels of electrode placement error (V1=25.35mm±29.33, V2=18.1mm±24.49, V3=38.65mm±15.57, V4=37.73mm±12.14, V5=35.75mm±15.61, V6=44.15mm±14.32). The CQP group had statistically greater accuracy when placing five of the six electrodes (V1=6.68mm±8.53 [p<0.001], V2=8.8mm±9.64 [p=0.122], V3=6.83mm±8.99 [p<0.001], V4=14.90mm±11.76 [p<0.001], V5=8.63mm±10.70 [p<0.001], V6=18.13mm±14.37 [p<0.001]). There was less intra-practitioner variability when using the CQP on the same patient model. The NASA-TLX revealed that the CQP slightly increased cognitive workload, although not significantly (CQP group=16.51%±8.11 vs. 12.22%±8.07 [p=0.251]). The CQP also achieved a high SUS score of 91±7.28. CONCLUSION: The CQP significantly improved the reproducibility and accuracy of placing precordial electrodes V1 and V3-V6 with little additional cognitive effort, and with a high degree of usability.


Subject(s)
Clinical Competence, Diagnostic Errors/prevention & control, Electrocardiography/instrumentation, Electrocardiography/methods, Electrodes, Man-Machine Systems, Adult, Equipment Design, Equipment Failure Analysis, Ergonomics/instrumentation, Female, Humans, Male, Reproducibility of Results, Sensitivity and Specificity
13.
J Biomed Inform ; 56: 30-41, 2015 Aug.
Article in English | MEDLINE | ID: mdl-25998520

ABSTRACT

Reablement is a new paradigm for increasing independence in the home amongst the ageing population, and it remains a challenge to design an optimal electronic system to streamline and integrate reablement into current healthcare infrastructure. Furthermore, given that reablement requires collaboration with a range of organisations (including national healthcare institutions and community/voluntary service providers), such a system needs to be co-created with all of the stakeholders involved. Thus, the purpose of this study is (1) to bring together stakeholder groups to elicit a comprehensive set of requirements for a digital reablement system, (2) to utilise emerging technologies to implement a system and a data model based on the requirements gathered, and (3) to involve user groups in a usability assessment of the system. We employed a mixed qualitative approach that included a series of stakeholder-involved activities. Collectively, 73 subjects were recruited to participate in an ideation event, a quasi-hackathon and a usability study. The study unveiled stakeholder-led requirements, which resulted in a novel cloud-based system created using emerging web technologies. The system is driven by a unique data model and includes interactive features that are necessary for streamlining the reablement care model. In summary, this system allows community-based interventions (or services) to be prescribed to occupants whilst also monitoring the occupant's progress towards independent living.


Subject(s)
Computer Security/instrumentation, Medical Informatics/methods, Physiologic Monitoring/instrumentation, Aged, Aged 80 and over, Aging, Cloud Computing, Computer Graphics, Computer Systems, Data Collection, Delivery of Health Care, Electronics, Geography, Humans, Internet, Physiologic Monitoring/methods, Software, User-Computer Interface
14.
J Electrocardiol ; 48(6): 1017-21, 2015.
Article in English | MEDLINE | ID: mdl-26410197

ABSTRACT

This study investigates the use of multivariate linear regression to estimate three bipolar ECG leads from the 12-lead ECG in order to improve P-wave signal strength. The study population consisted of body surface potential maps recorded from 229 healthy subjects. P-waves were isolated and population-based transformation weights were developed. A derived P-lead (measured between the right sternoclavicular joint and a point midway along the costal margin in line with the seventh intercostal space) demonstrated a significant improvement in median P-wave root mean square (RMS) signal strength when compared to lead II (94µV vs. 76µV, p<0.001). A derived ES lead (from the EASI lead system) also showed a small but significant improvement in median P-wave RMS (79µV vs. 76µV, p=0.0054). Finally, a derived modified Lewis lead did not improve median P-wave RMS when compared to lead II; however, this derived lead improved the atrioventricular RMS ratio. P-wave leads derived from the 12-lead ECG can improve the signal-to-noise ratio of the P-wave; this may improve the performance of detection algorithms that rely on P-wave analysis.
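The signal-strength measures reported here (P-wave RMS and the atrioventricular RMS ratio) are straightforward to compute once the P-wave and QRS windows of a beat are known. The sketch below uses synthetic placeholder data and assumed window positions, purely to illustrate the measures rather than reproduce the study's processing.

```python
import numpy as np

def rms(x):
    return np.sqrt(np.mean(np.asarray(x, dtype=float) ** 2))

def p_wave_metrics(lead, p_window, qrs_window):
    """Return (P-wave RMS, atrioventricular P/QRS RMS ratio) for one beat."""
    p_rms = rms(lead[slice(*p_window)])
    qrs_rms = rms(lead[slice(*qrs_window)])
    return p_rms, p_rms / qrs_rms

# Example: compare a derived P-lead against lead II (synthetic placeholders, µV).
lead_ii = np.random.randn(500) * 50
derived_p_lead = np.random.randn(500) * 60
for name, signal in [("lead II", lead_ii), ("derived P-lead", derived_p_lead)]:
    p_rms, av_ratio = p_wave_metrics(signal, p_window=(40, 90), qrs_window=(120, 180))
    print(f"{name}: P-wave RMS = {p_rms:.1f} uV, AV RMS ratio = {av_ratio:.2f}")
```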


Subject(s)
Algorithms, Atrial Fibrillation/diagnosis, Body Surface Potential Mapping/instrumentation, Body Surface Potential Mapping/methods, Computer-Assisted Diagnosis/methods, Equipment Design, Equipment Failure Analysis, Humans, Reproducibility of Results, Sensitivity and Specificity
15.
J Electrocardiol ; 48(6): 1045-52, 2015.
Article in English | MEDLINE | ID: mdl-26381798

ABSTRACT

Research has shown that the 'spatial QRS-T angle' (SA) and the 'spatial ventricular gradient' (SVG) have clinical value in a number of different applications. The determination of the SA and the SVG requires vectorcardiographic data. Such data is seldom recorded in clinical practice. The SA and the SVG are therefore frequently derived from 12-lead electrocardiogram (ECG) data using linear lead transformation matrices. This research compares the performance of two previously published linear lead transformation matrices (Kors and ML2VCG) in deriving the SA and the SVG from Mason-Likar (ML) 12-lead ECG data. This comparison was performed through an analysis of the estimation errors that are made when deriving the SA and the SVG for all 181 subjects in the study population. The estimation errors were quantified as the systematic error (mean difference) and the random error (span of the Bland-Altman 95% limits of agreement). The random error was found to be the dominating error component for both the Kors and the ML2VCG matrix. The random error [ML2VCG; Kors; result of the paired, two-sided Pitman-Morgan test for statistical significance of differences in the error variance between ML2VCG and Kors] for the vectorcardiographic parameters SA, magnitude of the SVG, elevation of the SVG and azimuth of the SVG were found to be [37.33°; 50.52°; p<0.001], [30.17mVms; 39.09mVms; p<0.001], [36.77°; 47.62°; p=0.001] and [63.45°; 80.32°; p<0.001] respectively. The findings of this research indicate that in comparison to the Kors matrix the ML2VCG provides greater precision for estimating the SA and SVG from ML 12-lead ECG data.
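The two error measures used in this comparison can be computed as follows: the systematic error is the mean difference between derived and reference values, and the random error is the span of the Bland-Altman 95% limits of agreement. The simulated angles below are placeholders for illustration only.

```python
import numpy as np

def bland_altman_errors(reference, derived):
    """Return (systematic error, random error) for paired measurements."""
    diff = np.asarray(derived, dtype=float) - np.asarray(reference, dtype=float)
    systematic = diff.mean()
    # The 95% limits of agreement are mean ± 1.96 SD; their span is 2 * 1.96 * SD.
    random_error = 2 * 1.96 * diff.std(ddof=1)
    return systematic, random_error

# Example with simulated spatial QRS-T angles (degrees, placeholder data).
rng = np.random.default_rng(2)
sa_reference = rng.uniform(20, 120, 181)
sa_derived = sa_reference + rng.normal(1.0, 10.0, 181)
print(bland_altman_errors(sa_reference, sa_derived))
```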


Subject(s)
Cardiac Arrhythmias/diagnosis, Cardiac Arrhythmias/physiopathology, Body Surface Potential Mapping/methods, Computer-Assisted Diagnosis/methods, Heart Conduction System/physiopathology, Heart Ventricles/physiopathology, Computer Simulation, Humans, Cardiovascular Models, Reproducibility of Results, Sensitivity and Specificity, Spatio-Temporal Analysis
16.
J Electrocardiol ; 46(3): 182-96, 2013.
Article in English | MEDLINE | ID: mdl-23462202

ABSTRACT

INTRODUCTION: The electrocardiogram (ECG) is a recording of the electrical activity of the heart. It is commonly used to non-invasively assess the cardiac activity of a patient. Since 1938, ECG data have been visualised as 12 scalar traces (known as the standard 12-lead ECG). Although this is the standard approach, a myriad of alternative methods have been proposed to visualise ECG data. The purpose of this paper is to provide an overview of these methods and to introduce the field of ECG visualisation to early stage researchers. A further purpose is to consider the future of ECG visualisation within routine clinical practice. METHODS: This paper structures the different ECG visualisation methods into four categories: temporal, vectorial, spatial and interactive. Temporal methods present the data with respect to time, vectorial methods present data with respect to direction and magnitude, spatial methods present data in 2D or 3D space, and interactive methods utilise interactive computing to facilitate efficient interrogation of ECG data at different levels of detail. CONCLUSION: Spatial visualisation has been around since its introduction by Waller, and vector-based visualisation has been around since the 1920s. Given that these approaches have already stood the 'test of time', they are unlikely to be replaced as the standard in the near future. Instead of being replaced, the standard is more likely to be 'supplemented'. However, the design and presentation of these ECG visualisation supplements need to be universally standardised. Subsequent to the development of 'standardised supplements', as a requirement, they could then be integrated into all ECG machines. We recognise that without intuitive software and interactivity on mobile devices (e.g. tablet PCs), it is impractical to integrate the more advanced ECG visualisation methods into routine practice (e.g. epicardial mapping using an inverse solution).


Subject(s)
Algorithms, Computer Graphics, Computer-Assisted Diagnosis/methods, Electrocardiography/methods, Heart Rate/physiology, Information Storage and Retrieval/methods, User-Computer Interface, Factual Databases, Humans
17.
Comput Methods Programs Biomed ; 241: 107780, 2023 Nov.
Article in English | MEDLINE | ID: mdl-37651816

ABSTRACT

BACKGROUND AND OBJECTIVE: Quantitative measures extracted from the ventricular fibrillation (VF) waveform reflect the metabolic state of the myocardium and are associated with survival outcome. The quality of chest compressions delivered during cardiopulmonary resuscitation is also linked with survival. The aim of this research is to explore the viability and effectiveness of a thoracic impedance (TI) based chest compression (CC) guidance system to control CC depth within individual subjects and influence VF waveform properties. METHODS: This porcine investigation includes an analysis of two protocols. CC were delivered in 2-min episodes at a constant rate of 110 CC min⁻¹. Subject-specific CC depth was controlled using a TI-thresholding system in which CC were performed according to the amplitude (ZRMS, 0.125 to 1.250 Ω) of a band-passed TI signal (ZCC). Protocol A was a retrospective analysis of a 12-porcine study to characterise the response of two VF waveform metrics, amplitude spectrum area (AMSA) and mean slope (MS), to varying CC quality. Protocol B was a prospective 12-porcine study to determine whether changes in VF waveform metrics due to CC quality were associated with defibrillation outcome. RESULTS: Protocol A: a directly proportional relationship was observed between ZRMS and the CC depth applied within each subject (r = 0.90; p < 0.001). A positive relationship was observed between ZRMS and both AMSA (p < 0.001) and MS (p < 0.001), where greater TI thresholds were associated with greater waveform metrics. Protocol B: MS was associated with return of circulation following defibrillation (odds ratio = 2.657; p = 0.043). CONCLUSION: TI-thresholding was an effective way to control CC depth within subjects. Compressions applied according to higher TI thresholds evoked an increase in AMSA and MS. The response in MS due to deeper CC resulted in a greater incidence of ROSC compared with shallow chest compressions.
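For orientation, the two VF waveform metrics named above can be sketched as follows: AMSA as the frequency-weighted sum of the amplitude spectrum over a band (assumed here to be 2-48 Hz), and mean slope as the mean absolute sample-to-sample slope. The band limits, windowing and toy signal are assumptions and may differ from the study's exact definitions.

```python
import numpy as np

def amsa(vf_segment, fs, f_lo=2.0, f_hi=48.0):
    """Amplitude spectrum area of a VF segment (fs in Hz, assumed 2-48 Hz band)."""
    spectrum = np.abs(np.fft.rfft(vf_segment))
    freqs = np.fft.rfftfreq(len(vf_segment), d=1.0 / fs)
    band = (freqs >= f_lo) & (freqs <= f_hi)
    return np.sum(spectrum[band] * freqs[band])

def mean_slope(vf_segment, fs):
    """Mean absolute slope of the VF waveform (amplitude units per second)."""
    return np.mean(np.abs(np.diff(vf_segment))) * fs

fs = 250.0
t = np.arange(0, 4, 1 / fs)
vf = 0.3 * np.sin(2 * np.pi * 5 * t) + 0.05 * np.random.randn(t.size)  # toy VF-like signal
print(amsa(vf, fs), mean_slope(vf, fs))
```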


Subject(s)
Amsacrine, Ventricular Fibrillation, Swine, Animals, Ventricular Fibrillation/therapy, Electric Impedance, Prospective Studies, Retrospective Studies
18.
Npj Ment Health Res ; 2(1): 13, 2023 Aug 22.
Article in English | MEDLINE | ID: mdl-38609479

ABSTRACT

This paper makes a case for digital mental health and provides insights into how digital technologies can enhance (but not replace) existing mental health services. We describe digital mental health by presenting a suite of digital technologies (from digital interventions to the application of artificial intelligence). We discuss the benefits of digital mental health; for example, a digital intervention can be an accessible stepping-stone to receiving support. The paper does, however, present less-discussed benefits through new concepts such as 'poly-digital', where many different apps/features (e.g. a sleep app, a mood logging app and a mindfulness app) can each address different factors of wellbeing, perhaps resulting in an aggregation of marginal gains. Another benefit is that digital mental health offers the ability to collect high-resolution real-world client data and provide client monitoring outside of therapy sessions. These data can be collected using digital phenotyping and ecological momentary assessment techniques (i.e. repeated mood or scale measures via an app). This allows digital mental health tools and real-world data to inform therapists and enrich face-to-face sessions. This can be referred to as blended care/adjunctive therapy, where service users can engage in 'channel switching' between digital and non-digital (face-to-face) interventions, providing a more integrated service. This digital integration can be referred to as a kind of 'digital glue' that helps join up in-person sessions with the real world. The paper also presents the challenges: for example, the majority of mental health apps may be of inadequate quality and there is a lack of user retention. There are also ethical challenges, for example the perceived 'over-promotion' of screen time, the perceived reduction in care when replacing humans with 'computers', and the trap of 'technological solutionism' whereby technology is naively presumed to solve all problems. Finally, we argue for the need to take an evidence-based, systems-thinking and co-production approach, in the form of stakeholder-centred design, when developing digital mental health services based on these technologies. The main contribution of this paper is the integration of ideas from many different disciplines, as well as the framework for blended care using 'channel switching' to showcase how digital data and technology can enrich physical services. Another contribution is the emergence of 'poly-digital' and a discussion of the challenges of digital mental health, specifically 'digital ethics'.

19.
J Electrocardiol ; 45(6): 604-8, 2012.
Article in English | MEDLINE | ID: mdl-23022301

ABSTRACT

BACKGROUND: Reduced lead systems utilizing patient-specific transformation weights have been reported to achieve superior estimates to those utilizing population-based transformation weights. We report upon the effects of ischemic-type electrocardiographic changes on the estimation performance of a reduced lead system when utilizing patient-specific and population-based transformation weights. METHOD: A reduced lead system that used leads I, II, V2 and V5 to estimate leads V1, V3, V4 and V6 was investigated. Patient-specific transformation weights were developed on electrocardiograms containing no ischemic-type changes. Patient-specific and population-based transformation weights were assessed on 45 electrocardiograms with ischemic-type changes and 59 electrocardiograms without ischemic-type changes. RESULTS: For patient-specific transformation weights, the estimation performance measured as median root mean squared error values (no ischemic-type changes vs. ischemic-type changes) was found to be (V1, 27.5 µV vs. 95.8 µV, P<.001; V3, 33.9 µV vs. 65.2 µV, P<.001; V4, 24.8 µV vs. 62.0 µV, P<.001; V6, 11.7 µV vs. 51.5 µV, P<.001). The median magnitude of the ST-amplitude difference 60 ms after the J-point between patient-specific estimated leads and actual recorded leads (no ischemic-type changes vs. ischemic-type changes) was found to be (V1, 18.9 µV vs. 61.4 µV, P<.001; V3, 14.3 µV vs. 61.1 µV, P<.001; V4, 9.7 µV vs. 61.3 µV, P<.001; V6, 5.9 µV vs. 46.0 µV, P<.001). CONCLUSION: The estimation performance of patient-specific transformation weights can deteriorate when ischemic-type changes develop. Performance assessment of patient-specific transformation weights should be performed using electrocardiographic data that represent the monitoring situation for which the reduced lead system is targeted.
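A minimal sketch of how patient-specific transformation weights can be derived and assessed is shown below: ordinary least squares fits weights from leads I, II, V2 and V5 to a target lead on ischaemia-free data, and RMSE summarises performance on new data. The shapes, synthetic signals and 'true' weights are invented for illustration and are not the study's data or coefficients.

```python
import numpy as np

rng = np.random.default_rng(3)
n_train, n_test = 5000, 2000

# Columns: leads I, II, V2, V5 (synthetic placeholders for recorded samples).
X_train = rng.normal(size=(n_train, 4))
true_w = np.array([0.2, -0.1, 0.8, 0.3])                  # invented "true" weights
v3_train = X_train @ true_w + rng.normal(0, 0.02, n_train)

# Patient-specific weights: ordinary least squares on that patient's own data.
weights, *_ = np.linalg.lstsq(X_train, v3_train, rcond=None)

# Assess on new data (e.g. ECGs that now contain ischaemic-type changes).
X_test = rng.normal(size=(n_test, 4))
v3_test = X_test @ true_w + rng.normal(0, 0.10, n_test)   # altered morphology/noise
rmse = np.sqrt(np.mean((X_test @ weights - v3_test) ** 2))
print(f"RMSE of estimated V3: {rmse:.3f}")
```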


Subject(s)
Algorithms, Computer-Assisted Diagnosis/methods, Electrocardiography/instrumentation, Electrocardiography/methods, Myocardial Infarction/diagnosis, Adult, Female, Humans, Male, Middle Aged, Reproducibility of Results, Sensitivity and Specificity
20.
Front Physiol ; 13: 760000, 2022.
Article in English | MEDLINE | ID: mdl-35399264

ABSTRACT

Introduction: Representation learning allows artificial intelligence (AI) models to learn useful features from large, unlabelled datasets. This can reduce the need for labelled data across a range of downstream tasks. It was hypothesised that wave segmentation would be a useful form of electrocardiogram (ECG) representation learning. In addition to reducing labelled data requirements, segmentation masks may provide a mechanism for explainable AI. This study details the development and evaluation of a Wave Segmentation Pretraining (WaSP) application. Materials and Methods: Pretraining: A non-AI-based ECG signal and image simulator was developed to generate ECGs and wave segmentation masks. U-Net models were trained to segment waves from synthetic ECGs. Dataset: The raw sample files from the PTB-XL dataset were downloaded. Each ECG was also plotted into an image. Fine-tuning and evaluation: A hold-out approach was used with a 60:20:20 training/validation/test set split. The encoder portions of the U-Net models were fine-tuned to classify PTB-XL ECGs for two tasks: sinus rhythm (SR) vs atrial fibrillation (AF), and myocardial infarction (MI) vs normal ECGs. The fine-tuning was repeated without pretraining. Results were compared. Explainable AI: an example pipeline combining AI-derived segmentation masks and a rule-based AF detector was developed and evaluated. Results: WaSP consistently improved model performance on downstream tasks for both ECG signals and images. The difference between non-pretrained models and models pretrained for wave segmentation was particularly marked for ECG image analysis. A selection of segmentation masks are shown. An AF detection algorithm comprising both AI and rule-based components performed less well than end-to-end AI models but its outputs are proposed to be highly explainable. An example output is shown. Conclusion: WaSP using synthetic data and labels allows AI models to learn useful features for downstream ECG analysis with real-world data. Segmentation masks provide an intermediate output that may facilitate confidence calibration in the context of end-to-end AI. It is possible to combine AI-derived segmentation masks and rule-based diagnostic classifiers for explainable ECG analysis.
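To illustrate the fine-tuning step described above, here is a minimal PyTorch sketch in which a toy segmentation-pretrained encoder is reused with a small classification head for a downstream task such as SR vs AF. The architecture, weight file name and tensor shapes are assumptions, not the published WaSP model.

```python
import torch
import torch.nn as nn

class Encoder(nn.Module):
    """Toy 1-D convolutional encoder standing in for the U-Net encoder."""
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv1d(12, 32, kernel_size=7, padding=3), nn.ReLU(), nn.MaxPool1d(4),
            nn.Conv1d(32, 64, kernel_size=7, padding=3), nn.ReLU(),
            nn.AdaptiveAvgPool1d(1),
        )

    def forward(self, x):                     # x: (batch, 12 leads, samples)
        return self.features(x).squeeze(-1)   # (batch, 64)

encoder = Encoder()
# encoder.load_state_dict(torch.load("wasp_pretrained_encoder.pt"))  # hypothetical weights
classifier = nn.Sequential(encoder, nn.Linear(64, 2))   # 2 classes: SR vs AF

optimiser = torch.optim.Adam(classifier.parameters(), lr=1e-4)
loss_fn = nn.CrossEntropyLoss()

ecgs = torch.randn(8, 12, 5000)              # placeholder batch of 10 s, 500 Hz ECGs
labels = torch.randint(0, 2, (8,))
optimiser.zero_grad()
loss = loss_fn(classifier(ecgs), labels)     # one fine-tuning step on the downstream task
loss.backward()
optimiser.step()
```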
