ABSTRACT
PURPOSE: Congenital heart disease (CHD) is the most common birth defect among live births, and a proportion of these patients have chronic hypoxia. Chronic hypoxia leads to secondary erythrocytosis, resulting in microvascular dysfunction and an increased risk of thrombosis. The conjunctival microcirculation is easily accessible for imaging and quantitative assessment. It has not previously been studied in adult CHD patients with cyanosis (CCHD). METHODS: We assessed the conjunctival microcirculation of CCHD patients and matched healthy controls to determine whether measured microcirculatory parameters differed. We acquired images using an iPhone 6s and a slit-lamp biomicroscope. Parameters measured included diameter, axial velocity, wall shear rate and blood volume flow. Axial velocity was estimated by applying the 1D + T continuous wavelet transform (CWT). Results are for all vessels, as they were not sub-classified into arterioles or venules. RESULTS: 11 CCHD patients and 14 healthy controls were recruited to the study. CCHD patients were markedly more hypoxic than the healthy controls (oxygen saturations 84% vs 98%, p = 0.001). A total of 736 vessels (292 vs 444) were suitable for analysis. Mean microvessel diameter (D) did not differ significantly between the CCHD patients and controls (20.4 ± 2.7 µm vs 20.2 ± 2.6 µm, p = 0.86). Axial velocity (Va) was lower in the CCHD patients (0.47 ± 0.06 mm/s vs 0.53 ± 0.05 mm/s, p = 0.03). Blood volume flow (Q) was lower for CCHD patients (121 ± 30 pl/s vs 145 ± 50 pl/s), although this overall difference was not significant (p = 0.65); the greatest differences were observed in vessels >22 µm in diameter (216 ± 121 pl/s vs 258 ± 154 pl/s, p = 0.001). Wall shear rate (WSR) was significantly lower for the CCHD group (153 ± 27 s-1 vs 174 ± 22 s-1, p = 0.04). CONCLUSIONS: This iPhone and slit-lamp combination assessment of conjunctival vessels found lower axial velocity, lower wall shear rate and, in the largest vessel group, lower blood volume flow in chronically hypoxic patients with congenital heart disease. With further study, this assessment method may have utility in the evaluation of patients with chronic hypoxia.
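For orientation, the sketch below shows how such haemodynamic parameters are commonly derived from the two measured quantities (vessel diameter and centreline axial velocity). This is a minimal illustration, not the authors' published pipeline; it assumes a Poiseuille-type velocity profile and the axial-to-mean velocity conversion factor of 1.6 often used in microcirculation work.

```python
import math

def haemodynamics(diameter_um, axial_velocity_mm_s, profile_factor=1.6):
    """Return (mean velocity mm/s, volume flow pl/s, wall shear rate 1/s)
    for one vessel, assuming an axisymmetric Poiseuille-type profile."""
    # Mean cross-sectional velocity from centreline (axial) velocity
    v_mean_um_s = (axial_velocity_mm_s / profile_factor) * 1000.0  # um/s
    area_um2 = math.pi * (diameter_um / 2.0) ** 2                  # um^2
    # Volume flow Q = mean velocity x cross-sectional area; 1 pl = 1000 um^3
    q_pl_s = v_mean_um_s * area_um2 / 1000.0
    # Wall shear rate for Poiseuille flow: WSR = 8 * v_mean / D
    wsr_per_s = 8.0 * v_mean_um_s / diameter_um
    return v_mean_um_s / 1000.0, q_pl_s, wsr_per_s

# Example: a 20 um vessel with 0.5 mm/s axial velocity
print(haemodynamics(20.0, 0.5))
```

With these assumptions a typical 20 µm vessel at 0.5 mm/s gives a flow on the order of 100 pl/s and a wall shear rate near 125 s-1, which is consistent in magnitude with the values reported above.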
Subjects
Conjunctiva/blood supply; Cyanosis/diagnosis; Heart Defects, Congenital/diagnosis; Microcirculation; Slit Lamp Microscopy; Adult; Blood Flow Velocity; Case-Control Studies; Cyanosis/etiology; Cyanosis/physiopathology; Female; Heart Defects, Congenital/complications; Heart Defects, Congenital/physiopathology; Humans; Male; Middle Aged; Predictive Value of Tests; Regional Blood Flow; Slit Lamp; Slit Lamp Microscopy/instrumentation; Smartphone; Stress, Mechanical; Young Adult
ABSTRACT
PURPOSE: The conjunctival microcirculation is a readily-accessible vascular bed for quantitative haemodynamic assessment and has previously been studied using a digital charge-coupled device (CCD). Smartphone video imaging of the conjunctiva, with haemodynamic parameter quantification, represents a novel approach. We report the feasibility of smartphone video acquisition and subsequent semi-automated quantification of haemodynamic measures. METHODS: Using an Apple iPhone 6s and a Topcon SL-D4 slit-lamp biomicroscope, we obtained videos of the conjunctival microcirculation in 4 fields of view per patient for 17 low cardiovascular risk patients. After image registration and processing, we quantified the diameter, mean axial velocity, mean blood volume flow, and wall shear rate for each vessel studied. Vessels were grouped into quartiles based on their diameter, i.e. group 1 (<11 µm), 2 (11-16 µm), 3 (16-22 µm) and 4 (>22 µm). RESULTS: From the 17 healthy controls (mean QRISK3 6.6%), we obtained quantifiable haemodynamics from 626 vessel segments. The mean diameter of microvessels, across all sites, was 21.1 µm (range 5.8-58 µm). Mean axial velocity was 0.50 mm/s (range 0.11-1 mm/s) and showed a modest positive correlation (r = 0.322) with increasing diameter, best appreciated when comparing group 4 to the remaining groups (p < 0.0001). Blood volume flow (mean 145.61 pl/s, range 7.05-1178.81 pl/s) was strongly correlated with increasing diameter (r = 0.943, p < 0.0001) and wall shear rate (mean 157.31 s-1, range 37.37-841.66 s-1) was negatively correlated with increasing diameter (r = -0.703, p < 0.0001). CONCLUSIONS: We report, for the first time, the successful assessment and quantification of conjunctival microcirculatory haemodynamics using a smartphone-based system.
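The diameter-group analysis described above can be illustrated as follows (a sketch with invented numbers, not study data): vessels are binned at the 11, 16 and 22 µm cut-offs and the diameter-flow association is tested with a standard Pearson correlation.

```python
import numpy as np
from scipy import stats

# Hypothetical per-vessel measurements (diameter in um, flow in pl/s)
diameters = np.array([8.5, 12.1, 14.9, 18.3, 21.0, 25.7, 30.2, 41.8])
flows = np.array([12.0, 35.0, 52.0, 80.0, 110.0, 190.0, 260.0, 530.0])

# Group vessels using the diameter cut-offs quoted in the abstract
bins = [0.0, 11.0, 16.0, 22.0, np.inf]
groups = np.digitize(diameters, bins)  # group labels 1..4

for g in range(1, 5):
    sel = groups == g
    print(f"group {g}: n={sel.sum()}, mean flow={flows[sel].mean():.1f} pl/s")

# Pearson correlation between diameter and flow across all vessels
r, p = stats.pearsonr(diameters, flows)
print(f"r = {r:.3f}, p = {p:.4f}")
```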
Subjects
Cardiovascular Diseases/diagnosis; Conjunctiva/blood supply; Diagnostic Techniques, Ophthalmological/instrumentation; Hemodynamics; Microcirculation; Slit Lamp; Smartphone; Adult; Blood Flow Velocity; Cardiovascular Diseases/physiopathology; Case-Control Studies; Feasibility Studies; Female; Hemorheology; Humans; Image Interpretation, Computer-Assisted; Male; Middle Aged; Mobile Applications; Models, Cardiovascular; Predictive Value of Tests; Regional Blood Flow
ABSTRACT
BACKGROUND: Body surface potential mapping (BSPM) provides additional electrophysiological information that can be useful for the detection of cardiac diseases. Moreover, BSPMs are currently utilized in electrocardiographic imaging (ECGI) systems within clinical practice. Missing information due to noisy recordings or poor electrode contact is inevitable. In this study, we present an interpolation method that combines Laplacian minimization and principal component analysis (PCA) techniques for interpolating this missing information. METHOD: The dataset used consisted of 117-lead BSPMs recorded from 744 subjects (a training set of 384 subjects and a test set of 360). This dataset is a mixture of normal subjects and subjects with old myocardial infarction or left ventricular hypertrophy. Missing data were simulated by ignoring the data recorded from 7 regions: the first region represents three rows of five electrodes on the anterior torso surface (a high potential gradient region), and the other six regions were realistic patterns drawn from clinical data, representing the most likely regions of broken electrodes. Three interpolation methods (PCA-based interpolation, Laplacian interpolation, and hybrid Laplacian-PCA interpolation) were used to interpolate the missing data from the remaining electrodes. In the simulated region of missing data, the potentials calculated by each interpolation method were compared with the measured potentials using relative error (RE) and the correlation coefficient (CC) over time. In the hybrid Laplacian-PCA method, the missing data are first interpolated using Laplacian interpolation; the resulting BSPM of 117 potentials is then multiplied by a (117 × 117) coefficient matrix, calculated from the training set, to obtain the principal components. Of the 117 principal components (PCs), the first 15 were utilized for the second stage of interpolation; this number was chosen because it gave the best interpolation performance. RESULTS: The differences in median relative error (RE) between the Laplacian and hybrid methods ranged from 0.01 to 0.35 (p < 0.001), while the differences in median correlation between them ranged from 0.0006 to 0.034 (p < 0.001). The PCA-based method performed poorly, especially in scenarios where 12 or more electrodes were missing, producing a large region of missing data. The median RE for the PCA method was between 0.05 and 0.6 higher than that for the hybrid method (p < 0.001), and the median correlation was between 0.0002 and 0.26 lower than the figure for the hybrid method (p < 0.001). CONCLUSION: Comparison of the three interpolation methods (Laplacian, PCA, hybrid) in reconstructing missing BSPM data showed that the hybrid method was always better than the other methods in all scenarios, whether the number of missing electrodes was high or low and irrespective of the location of the missing electrodes.
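The second (PCA) stage of the hybrid method can be sketched as below. This is an illustrative reconstruction using random stand-in data and an SVD-derived basis rather than the authors' exact coefficient-matrix formulation, and it assumes the Laplacian stage has already filled the missing leads.

```python
import numpy as np

def pca_refine(bspm, missing_idx, mean_vec, components, n_pcs=15):
    """Project a Laplacian-interpolated 117-lead BSPM onto the first
    n_pcs principal components (learned from training data) and replace
    the missing leads with their PCA reconstruction."""
    centred = bspm - mean_vec
    scores = components[:n_pcs] @ centred           # project onto PCs
    recon = mean_vec + components[:n_pcs].T @ scores
    out = bspm.copy()
    out[missing_idx] = recon[missing_idx]           # keep measured leads
    return out

# Training: PCA basis from an (n_subjects x 117) matrix of training BSPMs
train = np.random.randn(384, 117)                   # stand-in for real data
mean_vec = train.mean(axis=0)
_, _, vt = np.linalg.svd(train - mean_vec, full_matrices=False)
components = vt                                     # rows are PCs

# One time instant, leads 10-24 already filled by Laplacian interpolation
bspm = np.random.randn(117)
refined = pca_refine(bspm, np.arange(10, 25), mean_vec, components)
```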
Subjects
Body Surface Potential Mapping; Electrocardiography; Myocardial Infarction; Electrodes; Humans; Hypertrophy, Left Ventricular; Myocardial Infarction/diagnosis
ABSTRACT
BACKGROUND: In clinical practice, archiving of resting 12-lead electrocardiograms (ECGs) is mainly achieved by storing a PDF report in the hospital electronic health record (EHR). When available, digital ECG source data (raw samples) are retained only within the ECG management system. OBJECTIVE: The widespread availability of ECG source data would undoubtedly permit subsequent analysis and facilitate longitudinal studies, with both scientific and diagnostic benefits. METHODS & RESULTS: PDF-ECG is a hybrid archival format which allows both the standard graphical report of an ECG and its source data (waveforms) to be stored in the same file. Using PDF-ECG as a model to address the challenges of ECG data portability, long-term archiving and documentation, a real-world proof-of-concept test was conducted in a hospital in northern Italy. A set of volunteers underwent a basic ECG using routine hospital equipment and the source data were captured. Using dedicated web services, PDF-ECG documents were then generated and seamlessly uploaded to the hospital EHR, replacing the standard PDF reports automatically generated at the time of acquisition. Finally, the PDF-ECG files could be successfully retrieved and re-analyzed. CONCLUSION: Adding PDF-ECG to an existing EHR had a minimal impact on the hospital's workflow, while preserving the digital ECG data.
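To illustrate the general idea of a hybrid file (a human-readable report with the machine-readable source samples embedded in the same document), the sketch below attaches a waveform file to an existing PDF using pypdf. This is only a conceptual analogue of the approach, not the actual PDF-ECG specification, and the file names are hypothetical.

```python
from pypdf import PdfReader, PdfWriter

# Hypothetical inputs: the routine graphical report and the raw samples
reader = PdfReader("ecg_report.pdf")
writer = PdfWriter()
writer.append_pages_from_reader(reader)

# Embed the source waveform data so report and raw data travel together
with open("ecg_waveforms.xml", "rb") as f:
    writer.add_attachment("ecg_waveforms.xml", f.read())

with open("ecg_report_hybrid.pdf", "wb") as f:
    writer.write(f)
```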
Subjects
Electrocardiography; Electronic Health Records; Information Storage and Retrieval/methods; Humans; Software; Systems Integration; Workflow
ABSTRACT
BACKGROUND: The 12-lead electrocardiogram (ECG) has been used to detect cardiac abnormalities in the same format for more than 70 years. However, due to the complex nature of 12-lead ECG interpretation, a significant cognitive workload is required from the interpreter. This complexity often leads to errors in diagnosis and subsequent treatment. We have previously reported on the development of an ECG interpretation support system designed to augment the human interpretation process. This computerised decision support system has been named 'Interactive Progressive based Interpretation' (IPI). In this study, a decision support algorithm was built into the IPI system to suggest potential diagnoses based on the interpreter's annotations of the 12-lead ECG. We hypothesised that semi-automatic interpretation using a digital assistant could be an optimal man-machine model for ECG interpretation. OBJECTIVES: To improve interpretation accuracy and reduce missed co-abnormalities. METHODS: The Differential Diagnoses Algorithm (DDA) was developed using web technologies, with diagnostic ECG criteria defined in an open storage format, JavaScript Object Notation (JSON), which is queried using a rule-based reasoning algorithm to suggest diagnoses. To test our hypothesis, a counterbalanced trial was designed in which subjects interpreted ECGs using the conventional approach and using the IPI+DDA approach. RESULTS: A total of 375 interpretations were collected. The IPI+DDA approach improved diagnostic accuracy by 8.7% (although this was not statistically significant, p = 0.1852), and the IPI+DDA suggested the correct interpretation more often than the human interpreter in 7/10 cases (with varying statistical significance). Human interpretation accuracy increased to 70% when seven suggestions were generated. CONCLUSION: Although the results were not statistically significant, we found that: (1) our decision support tool increased the number of correct interpretations; (2) the DDA algorithm suggested the correct interpretation more often than humans; and (3) as many as 7 computerised diagnostic suggestions augmented human decision making in ECG interpretation. Statistical significance may be achieved by expanding the sample size.
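A minimal sketch of the kind of rule-based matching described is shown below. The criteria and annotation strings here are invented for illustration; the real DDA criteria set is richer.

```python
import json

# Hypothetical diagnostic criteria in an open JSON format: each
# diagnosis lists the annotations that must all be present.
CRITERIA = json.loads("""
{
  "Atrial fibrillation": ["irregularly irregular rhythm", "absent P waves"],
  "First degree AV block": ["PR interval > 200 ms"],
  "Left bundle branch block": ["QRS > 120 ms", "broad notched R in V6"]
}
""")

def suggest_diagnoses(annotations):
    """Rule-based reasoning: return every diagnosis whose criteria
    are a subset of the interpreter's annotations."""
    found = set(annotations)
    return [dx for dx, rules in CRITERIA.items() if set(rules) <= found]

print(suggest_diagnoses(["absent P waves", "irregularly irregular rhythm"]))
```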
Subjects
Algorithms; Decision Support Systems, Clinical; Diagnostic Errors/prevention & control; Electrocardiography; Clinical Competence; Diagnosis, Differential; Humans; Man-Machine Systems; Software
ABSTRACT
INTRODUCTION: The 12-lead electrocardiogram (ECG) presents a plethora of information and demands extensive knowledge and a high cognitive workload to interpret. Whilst the ECG is an important clinical tool, it is frequently interpreted incorrectly. Even expert clinicians are known to impulsively provide a diagnosis based on their first impression, and often miss co-abnormalities. Given the widely reported lack of competency in ECG interpretation, it is imperative to optimise the interpretation process. The interpretation process remains predominantly paper-based, and whilst computer algorithms are used to assist interpreters by providing printed computerised diagnoses, there is a lack of interactive human-computer interfaces to guide and assist the interpreter. METHODS: An interactive computing system was developed to guide the decision-making process of a clinician when interpreting the ECG. The system decomposes the interpretation process into a series of interactive sub-tasks and encourages the clinician to systematically interpret the ECG. We have named this model 'Interactive Progressive based Interpretation' (IPI), as the user cannot 'progress' unless they complete each sub-task. Using this model, the ECG is segmented into five parts and presented over five user interfaces (1: rhythm interpretation; 2: interpretation of the P-wave morphology; 3: limb lead interpretation; 4: QRS morphology interpretation with chest lead and rhythm strip presentation; 5: final review of the 12-lead ECG). The IPI model was implemented using emerging web technologies (i.e. HTML5, CSS3, AJAX, PHP and MySQL). It was hypothesised that this system would reduce the number of interpretation errors and increase diagnostic accuracy among ECG interpreters. To test this, we compared the diagnostic accuracy of clinicians who used the standard approach (control cohort) with that of clinicians who interpreted the same ECGs using the IPI approach (IPI cohort). RESULTS: For the control cohort, mean ECG interpretation accuracy was 45.45% (SD = 18.1%; CI = 42.07-48.83). For the IPI cohort, mean interpretation accuracy was 58.85% (SD = 42.4%; CI = 49.12-68.58), a positive mean difference of 13.4% (CI = 4.45-22.35). An N-1 chi-squared test of independence indicated a 92% chance that the IPI cohort would have a higher accuracy rate. Interpreter self-rated confidence also increased between cohorts, from a mean of 4.9/10 in the control cohort to 6.8/10 in the IPI cohort (p = 0.06). Whilst the IPI cohort had greater diagnostic accuracy, the duration of ECG interpretation was six times longer than in the control cohort. CONCLUSIONS: We have developed a system that segments and presents the ECG across five graphical user interfaces. Results indicate that this approach improves diagnostic accuracy, but at the expense of time, which is a valuable resource in medical practice.
Subjects
Algorithms; Clinical Decision-Making; Electrocardiography; Heart Diseases/diagnosis; User-Computer Interface; Humans
ABSTRACT
Automated detection of atrial fibrillation (AF) from the electrocardiogram (ECG) still remains a challenge. In this study, we investigated two multivariate classification techniques, Random Forests (RF) and k-nearest neighbour (k-nn), for improved automated detection of AF from the ECG. We compiled a new database from ECG data taken from existing sources. R-R intervals were then analyzed using four previously described R-R irregularity measurements: (1) the coefficient of sample entropy (CoSEn), (2) the coefficient of variance (CV), (3) the root mean square of the successive differences (RMSSD), and (4) the median absolute deviation (MAD). Using the outputs from all four R-R irregularity measurements, RF and k-nn models were trained. RF classification improved AF detection over CoSEn, with overall specificity of 98.3% vs. 80.1% and positive predictive value of 92.1% vs. 51.8%, albeit with a reduction in sensitivity (92.8% vs. 97.6%). k-nn also improved specificity and PPV over CoSEn; however, the sensitivity of this approach was considerably reduced (68.0%).
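Three of the four R-R irregularity measures have closed forms and can be sketched directly, as below. The R-R data here are synthetic and purely illustrative; CoSEn requires an entropy estimator and is omitted for brevity.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

def rr_features(rr):
    """Compute CV, RMSSD and MAD for a window of R-R intervals (s)."""
    rr = np.asarray(rr, dtype=float)
    cv = rr.std() / rr.mean()                       # coefficient of variance
    rmssd = np.sqrt(np.mean(np.diff(rr) ** 2))      # RMS of successive diffs
    mad = np.median(np.abs(rr - np.median(rr)))     # median absolute deviation
    return [cv, rmssd, mad]

# Hypothetical training windows with AF labels (0 = non-AF, 1 = AF);
# AF windows tend to have far more irregular R-R intervals.
windows = [np.random.normal(0.8, 0.02, 60), np.random.normal(0.7, 0.15, 60)]
labels = [0, 1]
X = [rr_features(w) for w in windows]
clf = RandomForestClassifier(n_estimators=100).fit(X, labels)
print(clf.predict([rr_features(np.random.normal(0.75, 0.12, 60))]))
```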
Subjects
Algorithms; Atrial Fibrillation/diagnosis; Diagnosis, Computer-Assisted/methods; Electrocardiography/methods; Heart Rate Determination/methods; Pattern Recognition, Automated/methods; Adolescent; Adult; Aged; Aged, 80 and over; Female; Humans; Male; Middle Aged; Reproducibility of Results; Sensitivity and Specificity; Young Adult
ABSTRACT
The 'spatial QRS-T angle' (SA) is frequently determined using linear lead transformation matrices that require the entire 12-lead electrocardiogram (ECG). While this approach is adequate for 12-lead ECG data recorded in the resting supine position, it is not optimal in monitoring applications, because maintaining a good quality recording of the complete 12-lead ECG during monitoring is difficult. In this research, we assessed the differences between the 'gold standard' SA, as determined using the Frank VCG, and the SA as determined using different reduced lead systems (RLSs). The random error component (the span of the Bland-Altman 95% limits of agreement) of the differences between the 'gold standard' SA and the SA values based upon the different RLSs was quantified. This was performed for all 62 RLSs that can be constructed from Mason-Likar (ML) limb leads I and II and all possible precordial lead subsets containing between one and five of the precordial leads V1 to V6. The RLS with the smallest lead set size that produced SA estimates of a quality similar to that achieved using the ML 12-lead ECG was based upon ML limb leads I, II and precordial leads V1, V3 and V6. The random error components (mean [95% confidence interval]) associated with this RLS and with the ML 12-lead ECG were 40.74° [35.56°-49.29°] and 39.57° [33.78°-45.70°], respectively. Our findings suggest that an RLS based upon ML limb leads I and II and the three best precordial leads can yield SA estimates of a quality similar to that achieved when using the complete ML 12-lead ECG.
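For reference, once a VCG is available (recorded or derived), the SA is simply the angle between representative QRS and T vectors. A sketch under one common convention (mean vectors over each interval; exact definitions vary between studies):

```python
import numpy as np

def spatial_qrs_t_angle(qrs_xyz, t_xyz):
    """Angle (degrees) between the mean QRS and mean T vectors of a VCG.
    qrs_xyz, t_xyz: (n_samples, 3) arrays of X, Y, Z lead samples over
    the QRS complex and T-wave, respectively."""
    mean_qrs = np.asarray(qrs_xyz).mean(axis=0)
    mean_t = np.asarray(t_xyz).mean(axis=0)
    cos_a = np.dot(mean_qrs, mean_t) / (
        np.linalg.norm(mean_qrs) * np.linalg.norm(mean_t))
    # Clip to guard against floating-point values just outside [-1, 1]
    return np.degrees(np.arccos(np.clip(cos_a, -1.0, 1.0)))
```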
Subjects
Algorithms; Diagnosis, Computer-Assisted/methods; Electrocardiography/instrumentation; Electrocardiography/methods; Hypertrophy, Left Ventricular/diagnosis; Myocardial Infarction/diagnosis; Adult; Aged; Humans; Middle Aged; Reproducibility of Results; Sensitivity and Specificity
ABSTRACT
INTRODUCTION: The CardioQuick Patch® (CQP) has been developed to assist operators in accurately positioning the precordial electrodes during 12-lead electrocardiogram (ECG) acquisition. This study describes the CQP design and assesses the device in comparison to conventional electrode application. METHODS: Twenty ECG technicians were recruited and a total of 60 ECG acquisitions were performed on the same patient model over four phases: (1) all participants applied single electrodes to the patient; (2) all participants were then re-trained on electrode placement and on how to use the CQP; (3) participants were randomly divided into two groups, with the standard group applying single electrodes and the CQP group using the CQP; (4) after a one-day interval, the same participants returned to carry out the same procedure on the same patient (measuring intra-practitioner variability). Accuracy was measured with reference to correct locations pre-marked using ultraviolet ink. The NASA-TLX was used to measure cognitive workload and the System Usability Scale (SUS) was used to quantify the usability of the CQP. RESULTS: There was a large difference between the minimum times taken to complete each approach (CQP = 38.58 s vs. 65.96 s). The standard group exhibited significant levels of electrode placement error (V1 = 25.35 mm ± 29.33, V2 = 18.1 mm ± 24.49, V3 = 38.65 mm ± 15.57, V4 = 37.73 mm ± 12.14, V5 = 35.75 mm ± 15.61, V6 = 44.15 mm ± 14.32). The CQP group had statistically greater accuracy when placing five of the six electrodes (V1 = 6.68 mm ± 8.53 [p < 0.001], V2 = 8.8 mm ± 9.64 [p = 0.122], V3 = 6.83 mm ± 8.99 [p < 0.001], V4 = 14.90 mm ± 11.76 [p < 0.001], V5 = 8.63 mm ± 10.70 [p < 0.001], V6 = 18.13 mm ± 14.37 [p < 0.001]). There was less intra-practitioner variability when using the CQP on the same patient model. The NASA-TLX revealed that the CQP slightly increased cognitive workload, though not significantly (CQP group = 16.51% ± 8.11 vs. 12.22% ± 8.07 [p = 0.251]). The CQP also achieved a high SUS score of 91 ± 7.28. CONCLUSION: The CQP significantly improved the reproducibility and accuracy of placing precordial electrodes V1 and V3-V6 with little additional cognitive effort and a high degree of usability.
Subjects
Clinical Competence; Diagnostic Errors/prevention & control; Electrocardiography/instrumentation; Electrocardiography/methods; Electrodes; Man-Machine Systems; Adult; Equipment Design; Equipment Failure Analysis; Ergonomics/instrumentation; Female; Humans; Male; Reproducibility of Results; Sensitivity and Specificity
ABSTRACT
Reablement is a new paradigm for increasing independence in the home amongst the ageing population, yet it remains a challenge to design an optimal electronic system that streamlines and integrates reablement into current healthcare infrastructure. Furthermore, given that reablement requires collaboration between a range of organisations (including national healthcare institutions and community/voluntary service providers), such a system needs to be co-created with all stakeholders involved. Thus, the purpose of this study was: (1) to bring together stakeholder groups to elicit a comprehensive set of requirements for a digital reablement system; (2) to utilise emerging technologies to implement a system and a data model based on the requirements gathered; and (3) to involve user groups in a usability assessment of the system. We employed a mixed qualitative approach that included a series of stakeholder-involved activities. Collectively, 73 subjects were recruited to participate in an ideation event, a quasi-hackathon and a usability study. The study elicited stakeholder-led requirements, which resulted in a novel cloud-based system built using emerging web technologies. The system is driven by a unique data model and includes interactive features necessary for streamlining the reablement care model. In summary, this system allows community-based interventions (or services) to be prescribed to occupants whilst also monitoring each occupant's progress towards independent living.
Subjects
Computer Security/instrumentation; Medical Informatics/methods; Monitoring, Physiologic/instrumentation; Aged; Aged, 80 and over; Aging; Cloud Computing; Computer Graphics; Computer Systems; Data Collection; Delivery of Health Care; Electronics; Geography; Humans; Internet; Monitoring, Physiologic/methods; Software; User-Computer Interface
ABSTRACT
Research has shown that the 'spatial QRS-T angle' (SA) and the 'spatial ventricular gradient' (SVG) have clinical value in a number of different applications. Determination of the SA and the SVG requires vectorcardiographic data, which are seldom recorded in clinical practice. The SA and the SVG are therefore frequently derived from 12-lead electrocardiogram (ECG) data using linear lead transformation matrices. This research compares the performance of two previously published linear lead transformation matrices (Kors and ML2VCG) in deriving the SA and the SVG from Mason-Likar (ML) 12-lead ECG data. This comparison was performed through an analysis of the estimation errors made when deriving the SA and the SVG for all 181 subjects in the study population. The estimation errors were quantified as the systematic error (mean difference) and the random error (span of the Bland-Altman 95% limits of agreement). The random error was found to be the dominating error component for both the Kors and the ML2VCG matrix. The random error [ML2VCG; Kors; result of the paired, two-sided Pitman-Morgan test for statistical significance of differences in the error variance between ML2VCG and Kors] for the vectorcardiographic parameters SA, magnitude of the SVG, elevation of the SVG and azimuth of the SVG was found to be [37.33°; 50.52°; p < 0.001], [30.17 mV·ms; 39.09 mV·ms; p < 0.001], [36.77°; 47.62°; p = 0.001] and [63.45°; 80.32°; p < 0.001], respectively. The findings of this research indicate that, in comparison to the Kors matrix, the ML2VCG provides greater precision for estimating the SA and SVG from ML 12-lead ECG data.
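Complementing the SA sketch given earlier, the SVG can be illustrated as the time integral of the heart vector over the QRST interval. This is a sketch only: axis and angle conventions differ between studies, so the elevation/azimuth formulas below are one assumption, not this paper's definition.

```python
import numpy as np

def svg_components(x, y, z, dt_ms):
    """Spatial ventricular gradient from VCG leads X, Y, Z sampled over
    the QRST interval: returns (magnitude mV.ms, elevation deg,
    azimuth deg) under one possible axis convention."""
    # Integrate each component of the heart vector over QRST
    gx, gy, gz = (np.trapz(v, dx=dt_ms) for v in (x, y, z))
    mag = np.sqrt(gx**2 + gy**2 + gz**2)
    elevation = np.degrees(np.arccos(gy / mag))   # angle from vertical axis
    azimuth = np.degrees(np.arctan2(gz, gx))      # angle in transverse plane
    return mag, elevation, azimuth
```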
Subjects
Arrhythmias, Cardiac/diagnosis; Arrhythmias, Cardiac/physiopathology; Body Surface Potential Mapping/methods; Diagnosis, Computer-Assisted/methods; Heart Conduction System/physiopathology; Heart Ventricles/physiopathology; Computer Simulation; Humans; Models, Cardiovascular; Reproducibility of Results; Sensitivity and Specificity; Spatio-Temporal Analysis
ABSTRACT
This study investigates the use of multivariate linear regression to estimate three bipolar ECG leads from the 12-lead ECG in order to improve P-wave signal strength. The study population consisted of body surface potential maps recorded from 229 healthy subjects. P-waves were isolated and population-based transformation weights developed. A derived P-lead (measured between the right sternoclavicular joint and a point midway along the costal margin in line with the seventh intercostal space) demonstrated a significant improvement in median P-wave root mean square (RMS) signal strength compared to lead II (94 µV vs. 76 µV, p < 0.001). A derived ES lead (from the EASI lead system) also showed a small but significant improvement in median P-wave RMS (79 µV vs. 76 µV, p = 0.0054). Finally, a derived modified Lewis lead did not improve median P-wave RMS compared to lead II; however, this derived lead improved the atrioventricular RMS ratio. P-wave leads derived from the 12-lead ECG can improve the signal-to-noise ratio of the P-wave; this may improve the performance of detection algorithms that rely on P-wave analysis.
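The population-based derivation described can be sketched as an ordinary least-squares fit. The arrays below are random stand-ins; the real weights were trained on isolated P-waves from the BSPM dataset.

```python
import numpy as np

# Hypothetical training data: simultaneous 12-lead P-wave samples
# (n_samples x 12) and the target bipolar lead (n_samples,)
leads12 = np.random.randn(5000, 12)
target = np.random.randn(5000)

# Population-based transformation weights via multivariate least squares
weights, *_ = np.linalg.lstsq(leads12, target, rcond=None)

def derive_lead(ecg12):
    """Derive the bipolar P-lead from a matrix of 12-lead samples."""
    return ecg12 @ weights

def rms(signal_uv):
    """RMS signal strength (uV), e.g. of an isolated P-wave segment."""
    return np.sqrt(np.mean(np.square(signal_uv)))
```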
Subjects
Algorithms; Atrial Fibrillation/diagnosis; Body Surface Potential Mapping/instrumentation; Body Surface Potential Mapping/methods; Diagnosis, Computer-Assisted/methods; Equipment Design; Equipment Failure Analysis; Humans; Reproducibility of Results; Sensitivity and Specificity
ABSTRACT
INTRODUCTION: The electrocardiogram (ECG) is a recording of the electrical activity of the heart. It is commonly used to non-invasively assess the cardiac activity of a patient. Since 1938, ECG data have been visualised as 12 scalar traces (known as the standard 12-lead ECG). Although this is the standard approach, a myriad of alternative methods have been proposed to visualise ECG data. The purpose of this paper is to provide an overview of these methods and to introduce the field of ECG visualisation to early stage researchers. A further purpose is to consider the future of ECG visualisation within routine clinical practice. METHODS: This paper structures the different ECG visualisation methods into four categories: temporal, vectorial, spatial and interactive. Temporal methods present the data with respect to time, vectorial methods present the data with respect to direction and magnitude, spatial methods present the data in 2D or 3D space, and interactive methods utilise interactive computing to facilitate efficient interrogation of ECG data at different levels of detail. CONCLUSION: Spatial visualisation has been around since its introduction by Waller, and vector-based visualisation has been around since the 1920s. Given these approaches have already stood the 'test of time', they are unlikely to be replaced as the standard in the near future. Instead of being replaced, the standard is more likely to be 'supplemented'. However, the design and presentation of these ECG visualisation supplements need to be universally standardised. Once 'standardised supplements' have been developed, they could, as a requirement, be integrated into all ECG machines. We recognise that without intuitive software and interactivity on mobile devices (e.g. tablet PCs), it is impractical to integrate the more advanced ECG visualisation methods (e.g. epicardial mapping using an inverse solution) into routine practice.
Subjects
Algorithms; Computer Graphics; Diagnosis, Computer-Assisted/methods; Electrocardiography/methods; Heart Rate/physiology; Information Storage and Retrieval/methods; User-Computer Interface; Databases, Factual; Humans
ABSTRACT
BACKGROUND AND OBJECTIVE: Quantitative measures extracted from the ventricular fibrillation (VF) waveform reflect the metabolic state of the myocardium and are associated with survival outcome. The quality of chest compressions delivered during cardiopulmonary resuscitation is also linked with survival. The aim of this research was to explore the viability and effectiveness of a thoracic impedance (TI) based chest compression (CC) guidance system for controlling CC depth within individual subjects and influencing VF waveform properties. METHODS: This porcine investigation comprised an analysis of two protocols. CCs were delivered in 2-min episodes at a constant rate of 110 CC min-1. Subject-specific CC depth was controlled using a TI-thresholding system, in which CCs were performed according to the amplitude (ZRMS, 0.125 to 1.250 Ω) of a band-passed TI signal (ZCC). Protocol A was a retrospective analysis of a 12-subject porcine study to characterise the response of two VF waveform metrics, amplitude spectrum area (AMSA) and mean slope (MS), to varying CC quality. Protocol B was a prospective 12-subject porcine study to determine whether changes in VF waveform metrics due to CC quality were associated with defibrillation outcome. RESULTS: Protocol A: a directly proportional relationship was observed between ZRMS and the CC depth applied within each subject (r = 0.90; p < 0.001). A positive relationship was observed between ZRMS and both AMSA (p < 0.001) and MS (p < 0.001), with greater TI thresholds associated with greater waveform metrics. Protocol B: MS was associated with return of spontaneous circulation (ROSC) following defibrillation (odds ratio = 2.657; p = 0.043). CONCLUSION: TI-thresholding was an effective way to control CC depth within subjects. Compressions applied according to higher TI thresholds evoked an increase in AMSA and MS. The response in MS due to deeper CCs resulted in a greater incidence of ROSC compared to shallow chest compressions.
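Both waveform metrics have standard formulations that can be sketched briefly. The signal below is synthetic, and band edges and scaling conventions for AMSA vary across the literature, so treat these as assumptions rather than this study's exact definitions.

```python
import numpy as np

def amsa(vf, fs, band=(4.0, 48.0)):
    """Amplitude spectrum area: sum of amplitude x frequency over a
    band of the VF spectrum (band edges differ between studies)."""
    spec = np.abs(np.fft.rfft(vf)) / len(vf)          # amplitude spectrum
    freqs = np.fft.rfftfreq(len(vf), d=1.0 / fs)
    sel = (freqs >= band[0]) & (freqs <= band[1])
    return np.sum(spec[sel] * freqs[sel])

def mean_slope(vf, fs):
    """Mean absolute first difference of the waveform, scaled to per-second."""
    return np.mean(np.abs(np.diff(vf))) * fs

fs = 250.0
vf = np.random.randn(int(4 * fs))  # stand-in for a 4 s VF segment
print(amsa(vf, fs), mean_slope(vf, fs))
```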
Subjects
Amsacrine; Ventricular Fibrillation; Swine; Animals; Ventricular Fibrillation/therapy; Electric Impedance; Prospective Studies; Retrospective Studies
ABSTRACT
BACKGROUND: Reduced lead systems utilizing patient-specific transformation weights have been reported to achieve superior estimates compared with those utilizing population-based transformation weights. We report upon the effects of ischemic-type electrocardiographic changes on the estimation performance of a reduced lead system when utilizing patient-specific and population-based transformation weights. METHOD: A reduced lead system that used leads I, II, V2 and V5 to estimate leads V1, V3, V4 and V6 was investigated. Patient-specific transformation weights were developed on electrocardiograms containing no ischemic-type changes. Patient-specific and population-based transformation weights were assessed on 45 electrocardiograms with ischemic-type changes and 59 electrocardiograms without ischemic-type changes. RESULTS: For patient-specific transformation weights, the estimation performance measured as median root mean squared error values (no ischemic-type changes vs. ischemic-type changes) was found to be: V1, 27.5 µV vs. 95.8 µV, p < 0.001; V3, 33.9 µV vs. 65.2 µV, p < 0.001; V4, 24.8 µV vs. 62.0 µV, p < 0.001; V6, 11.7 µV vs. 51.5 µV, p < 0.001. The median magnitude of the ST-amplitude difference 60 ms after the J-point between patient-specific estimated leads and actual recorded leads (no ischemic-type changes vs. ischemic-type changes) was found to be: V1, 18.9 µV vs. 61.4 µV, p < 0.001; V3, 14.3 µV vs. 61.1 µV, p < 0.001; V4, 9.7 µV vs. 61.3 µV, p < 0.001; V6, 5.9 µV vs. 46.0 µV, p < 0.001. CONCLUSION: The estimation performance of patient-specific transformation weights can deteriorate when ischemic-type changes develop. Performance assessment of patient-specific transformation weights should be performed using electrocardiographic data that represent the monitoring situation for which the reduced lead system is targeted.
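A sketch of the patient-specific scheme with synthetic stand-in data: weights are fitted on a baseline, ischaemia-free segment and then evaluated, per derived lead, on later monitoring data.

```python
import numpy as np

def fit_patient_weights(basis, targets):
    """Fit patient-specific transformation weights on a baseline ECG.
    basis (n x 4): leads I, II, V2, V5; targets (n x 4): V1, V3, V4, V6."""
    w, *_ = np.linalg.lstsq(basis, targets, rcond=None)
    return w

def rmse_per_lead(basis, targets, w):
    """Root mean squared estimation error for each derived lead (uV)."""
    err = targets - basis @ w
    return np.sqrt(np.mean(err ** 2, axis=0))

# Stand-in data: a baseline (no ischaemia) and a later monitoring segment
baseline_b, baseline_t = np.random.randn(5000, 4), np.random.randn(5000, 4)
monitor_b, monitor_t = np.random.randn(5000, 4), np.random.randn(5000, 4)
w = fit_patient_weights(baseline_b, baseline_t)
print(rmse_per_lead(monitor_b, monitor_t, w))
```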
Subjects
Algorithms; Diagnosis, Computer-Assisted/methods; Electrocardiography/instrumentation; Electrocardiography/methods; Myocardial Infarction/diagnosis; Adult; Female; Humans; Male; Middle Aged; Reproducibility of Results; Sensitivity and Specificity
ABSTRACT
Microvascular haemodynamic alterations are associated with coronary artery disease (CAD). The conjunctival microcirculation can easily be assessed non-invasively; however, it has not previously been explored in clinical algorithms aimed at identifying patients with CAD. This case-control study involved 66 post-myocardial infarction patients and 66 gender-matched healthy controls. Haemodynamic properties of the conjunctival microcirculation were assessed with a validated iPhone and slit-lamp-based imaging tool. Haemodynamic properties were extracted with semi-automated software and compared between groups. Biomarkers implicated in the development of CAD were assessed in combination with the conjunctival microcirculatory parameters. The conjunctival blood vessel parameters and biomarkers were used to derive an algorithm to aid in screening patients for CAD. Conjunctival blood velocity measured in combination with the blood biomarkers N-terminal pro-brain natriuretic peptide and adiponectin had an area under the receiver operating characteristic curve (AUROC) of 0.967, with sensitivity of 93.0% and specificity of 91.5% for CAD. This study demonstrated that the novel algorithm, combining conjunctival blood vessel haemodynamic properties and blood-based biomarkers, could be used as a potential screening tool for CAD and should be validated for potential utility in asymptomatic individuals.
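A sketch of how such a screening model and its AUROC might be assembled (placeholder features and in-sample evaluation for brevity; the study's actual model and validation procedure will differ):

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

# Placeholder features per subject: conjunctival blood velocity (mm/s),
# NT-proBNP and adiponectin (arbitrary units); 66 cases and 66 controls
rng = np.random.default_rng(0)
X = rng.normal(size=(132, 3))
y = np.array([1] * 66 + [0] * 66)  # 1 = CAD case, 0 = healthy control

model = LogisticRegression().fit(X, y)
scores = model.predict_proba(X)[:, 1]
print("in-sample AUROC:", roc_auc_score(y, scores))
```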
Subjects
Algorithms; Conjunctiva; Biomarkers; Blood Flow Velocity; Case-Control Studies; Conjunctiva/blood supply; Humans; Microcirculation
ABSTRACT
INTRODUCTION: Studies have shown that between 0.4% and 4% of all 12-lead electrocardiograms (ECGs) are recorded using incorrect electrode positions. Electrode misplacement can cause a misdiagnosis, either by concealing a pathology or, conversely, by emulating one. Despite this, ECG textbooks contain little or no information regarding the effects of electrode misplacement. Moreover, current pedagogic tools, including physical mannequins, do not allow for the free positioning of electrodes to demonstrate these effects. In recognition of this, an electrode misplacement simulator (EMS) was developed in this study. METHODS: The EMS is a web-based simulation developed using Adobe Flash technology. The software allows the user to position the electrodes anywhere on the torso while rendering the corresponding ECG leads using body surface potential maps. A beta version of the EMS has been made available on the Internet. RESULTS: The EMS was briefly evaluated by a random selection of delegates (n = 17) from the 37th Annual Conference on Computing in Cardiology. After completing representative tasks and using the EMS for approximately 30 minutes, all 17 participants completed a questionnaire. Overall, the responsiveness of the EMS was rated between 4 and 5 on a Likert scale, 94% of participants rated its "ease of use" between 4 and 5, and 88% of participants rated its "look and feel" between 4 and 5. CONCLUSION: The EMS has the potential to support researchers in enhancing the criteria currently used for detecting electrode misplacement. It could also be used to assist academic staff in teaching the effects of electrode misplacement. In this respect, it is currently being used as part of an undergraduate "Clinical Physiology" degree program at the University of Ulster.
Subjects
Cardiology/education; Computer Simulation; Electrocardiography/instrumentation; Electrodes; Body Surface Potential Mapping/instrumentation; Humans; Internet; Software; Surveys and Questionnaires
ABSTRACT
Electrocardiographic imaging is a recently introduced imaging modality that helps visualise the electrical activity of the heart and consequently guide ablation therapy for ventricular arrhythmias. One of the main challenges of this modality is that the electrocardiographic signals recorded at the torso surface are contaminated with noise from different sources. Low-amplitude leads are more affected by noise due to their low peak-to-peak amplitude. In this paper, we studied 6 datasets from two torso-tank experiments (the Bordeaux and Utah experiments) to investigate the impact of removing or interpolating these low-amplitude leads on the inverse reconstruction of cardiac electrical activity. The body surface potential maps used were calculated using the full set of recorded leads; by removing 1, 6, 11, 16 or 21 low-amplitude leads; or by interpolating 1, 6, 11, 16 or 21 low-amplitude leads using one of three interpolation methods: Laplacian interpolation, hybrid interpolation, or inverse-forward interpolation. The epicardial potential maps and activation time maps were computed from these body surface potential maps and compared with those recorded directly from the heart surface in the torso-tank experiments. There was no significant change in the potential maps and activation time maps after the removal of up to 11 low-amplitude leads. Laplacian interpolation and hybrid interpolation improved the inverse reconstruction in some datasets and worsened it in the rest. Inverse-forward interpolation of low-amplitude leads improved the reconstruction in two of the 6 datasets and left it at least unchanged in the others. It was also noticed that, after inverse-forward interpolation, the selected lambda (regularisation parameter) value was closer to the optimum value, i.e. the value that gives the inverse solution best correlated with the recorded potentials.
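The lambda mentioned above arises from Tikhonov regularisation, which is commonly used to stabilise the ECGI inverse problem. A generic zero-order sketch with random stand-in matrices (not the specific solver or geometry used in these experiments):

```python
import numpy as np

def tikhonov_inverse(A, b, lam):
    """Zero-order Tikhonov solution of the ECGI inverse problem:
    minimise ||A h - b||^2 + lam^2 ||h||^2 for heart-surface potentials
    h, given forward (transfer) matrix A and body-surface potentials b."""
    n = A.shape[1]
    return np.linalg.solve(A.T @ A + lam**2 * np.eye(n), A.T @ b)

# Stand-in dimensions: 192 torso leads, 490 heart-surface nodes
A = np.random.randn(192, 490)
b = np.random.randn(192)
h = tikhonov_inverse(A, b, lam=0.01)
```

In practice lambda is chosen per time instant or per beat (e.g. via the L-curve), which is why the interpolation method's effect on the selected lambda matters for reconstruction quality.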
ABSTRACT
BACKGROUND AND OBJECTIVE: Cloud computing has the ability to offload processing tasks to remote computing resources. At present, most biomedical digital signal processing involves a ground-up approach, writing code in a variety of languages. This can consume time that a researcher or health professional could otherwise spend processing data, and it raises the barrier to entry for those with little or no software development experience. In this study, we aimed to provide a service capable of handling and processing biomedical data via a code-free interface. Furthermore, our solution should support multiple file formats and processing languages while saving user inputs for repeated use. METHODS: A web interface built on the Python-based Django framework was developed with the potential to shorten the time taken to create an algorithm, encourage code reuse, and democratise digital signal processing tasks for non-technical users via a code-free user interface. A user can upload data, create an algorithm and download the result. Using discrete functions and multi-lingual scripts (e.g. MATLAB or Python), the user can manipulate data rapidly in a repeatable manner. Multiple data file formats are supported by a decision-based file handler and a user-authentication-based storage allocation method. RESULTS: The proposed system was demonstrated to be effective in handling multiple input data types in various programming languages, including Python and MATLAB. This, in turn, has the potential to reduce the bottlenecks currently experienced in cross-platform development of bio-signal processing algorithms. The source code for the system has been made available to encourage reuse. A cloud service for digital signal processing can reduce apparent complexity and abstract away the need to understand the intricacies of signal processing. CONCLUSION: We have introduced a web-based system capable of reducing the barrier to entry for inexperienced programmers. Furthermore, our system is reproducible and scalable for use in a variety of clinical and research fields.
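The "decision-based file handler" concept can be illustrated with a simple extension-based dispatcher. This is a sketch only; the actual system's supported formats, data keys and Django wiring are not shown, and the field names below are assumptions.

```python
import json
import numpy as np
from scipy.io import loadmat

def load_signal(path):
    """Decision-based file handler: dispatch on the file extension and
    return a numpy array of samples (formats and keys are illustrative)."""
    if path.endswith(".csv"):
        return np.loadtxt(path, delimiter=",")
    if path.endswith(".json"):
        with open(path) as f:
            return np.asarray(json.load(f)["samples"])
    if path.endswith(".mat"):
        mat = loadmat(path)            # MATLAB .mat container
        return np.asarray(mat["samples"])
    raise ValueError(f"unsupported file format: {path}")
```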
Subjects
Cloud Computing; Software; Algorithms; Programming Languages; Signal Processing, Computer-Assisted
ABSTRACT
Microcirculatory dysfunction occurs early in cardiovascular disease (CVD) development, and acute myocardial infarction (MI) is a late consequence of CVD. The conjunctival microcirculation is readily accessible for quantitative assessment but has not previously been studied in MI patients. We compared the conjunctival microcirculation of acute MI patients and age/sex-matched healthy controls to determine whether there were differences in microcirculatory parameters. We acquired images using an iPhone 6s and a slit-lamp biomicroscope. Parameters measured included diameter, axial velocity, wall shear rate and blood volume flow. Results are for all vessels, as they were not sub-classified into arterioles or venules. The conjunctival microcirculation was assessed in 56 controls and 59 inpatients with a presenting diagnosis of MI. Mean vessel diameter for the controls was 21.41 ± 7.57 µm compared to 22.32 ± 7.66 µm for the MI patients (p < 0.001). Axial velocity for the controls was 0.53 ± 0.15 mm/s compared to 0.49 ± 0.17 mm/s for the MI patients (p < 0.001). Wall shear rate was higher for controls than MI patients (162 ± 93 s-1 vs 145 ± 88 s-1, p < 0.001). Blood volume flow did not differ significantly between the controls and MI patients (153 ± 124 pl/s vs 154 ± 125 pl/s, p = 0.84). This pilot iPhone and slit-lamp assessment of the conjunctival microcirculation found lower axial velocity and wall shear rate in patients with acute MI. Further study is required to corroborate these findings and to assess long-term outcomes in this patient group with a severe CVD phenotype.