Results 1 - 20 of 31

1.
Sensors (Basel) ; 22(20)2022 Oct 14.
Article in English | MEDLINE | ID: mdl-36298178

ABSTRACT

Power line infrastructure is available almost everywhere. Positioning systems aim to estimate where a device or target is located. Consequently, there may be an opportunity to use power lines for positioning purposes. This survey article reports the different efforts, working principles, and possibilities for implementing positioning systems that rely on power line infrastructure, here termed power line positioning systems (PLPS). Since Power Line Communication (PLC) systems of different characteristics have been deployed to provide communication services over the existing mains, we also address how PLC systems may be employed to build positioning systems. Although some efforts exist, PLPS are still prospective and thus open to research and development, and we indicate possible directions and potential applications for PLPS.

2.
Sensors (Basel) ; 21(1)2021 Jan 04.
Article in English | MEDLINE | ID: mdl-33406684

ABSTRACT

The aim of this paper is to formulate the physical layer of the broadband and narrowband power line communication (PLC) systems described in the IEEE 1901 and IEEE 1901.2 standards, which address new communication technologies over electrical networks for Smart Grid and Internet of Things applications. Specifically, this paper presents a matrix-based mathematical formulation of a transmitter and receiver system based on windowed OFDM. The proposed formulation is essential for obtaining the input-output relation, as well as for analysing the interference present in the system. It is very useful for simulating PLC systems with software designed to operate primarily on whole matrices and arrays, such as Matlab. In addition, it eases the analysis and design of different receiver configurations, simply by modifying or adding a matrix. Since the relevant standards only describe the blocks corresponding to the transmitter and leave the set-up of the receiver open to the manufacturer, we analysed four possible schemes that include window functions in different configurations. In simulations, the behaviour of each of these schemes is analysed in terms of bit error rate and achievable data rate using artificial and real noise.
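
As a rough illustration of such a matrix formulation (not the paper's notation; sizes, window shape, and names are assumptions), a windowed OFDM transmitter can be written as a single matrix acting on the symbol vector:

    import numpy as np

    N, Ncp = 8, 2                      # subcarriers and cyclic-prefix length (toy sizes)
    n = np.arange(N)
    Finv = np.exp(2j * np.pi * np.outer(n, n) / N) / np.sqrt(N)   # unitary IDFT matrix

    # Cyclic-prefix insertion: copy the last Ncp samples to the front
    P = np.vstack([np.eye(N)[-Ncp:], np.eye(N)])                  # (N+Ncp) x N

    # Diagonal window matrix tapering the prefix (illustrative raised-cosine edge)
    w = np.ones(N + Ncp)
    w[:Ncp] = 0.5 * (1 - np.cos(np.pi * (np.arange(Ncp) + 0.5) / Ncp))
    W = np.diag(w)

    T = W @ P @ Finv                   # transmitter as one matrix: x = T s
    s = (np.random.randn(N) + 1j * np.random.randn(N)) / np.sqrt(2)   # data symbols
    x = T @ s                          # time-domain windowed OFDM symbol

The input-output relation of the whole link then follows by stacking channel and receiver matrices around T, which is the kind of analysis the formulation enables.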

3.
Sensors (Basel) ; 20(11)2020 Jun 01.
Article in English | MEDLINE | ID: mdl-32492938

ABSTRACT

In recent years, Electrocardiographic Imaging (ECGI) has emerged as a powerful and promising clinical tool to support cardiologists. Starting from a plurality of potential measurements on the torso, ECGI yields a noninvasive estimate of the epicardial potentials that cause them. This unprecedented amount of measured cardiac signals needs to be conditioned and adapted to current knowledge and methods in cardiac electrophysiology in order to maximize its support to clinical practice. In this setting, many cardiac indices are defined in terms of so-called bipolar electrograms, which correspond to differential potentials between two spatially close potential measurements. Our aim was to contribute to the usefulness of ECGI recordings within the current knowledge and methods of cardiac electrophysiology. For this purpose, we first analyzed the basic stages of conventional cardiac signal processing and scrutinized the implications of the spatial-temporal nature of signals in ECGI scenarios. Specifically, the stages of baseline wander removal, low-pass filtering, and beat segmentation and synchronization were considered. We also aimed to establish a mathematical operator that provides suitable bipolar electrograms from the ECGI-estimated epicardial potentials. Results were obtained on data from an infarction patient and from a healthy subject. First, the low-frequency and high-frequency noise components are shown to be non-independently distributed in the ECGI-estimated recordings due to their spatial dimension. Second, bipolar electrograms are better estimated using the criterion of the maximum-amplitude difference between spatial neighbors, but a delay of about 40 samples in discrete time also has to be included to obtain the morphology usual in clinical bipolar electrograms from catheters. We conclude that spatial-temporal digital signal processing and bipolar electrograms can pave the way towards the usefulness of ECGI recordings in clinical cardiology practice. The companion paper is devoted to analyzing clinical indices obtained from ECGI epicardial electrograms that measure waveform variability and repolarization tissue properties.
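
A minimal sketch of such a bipolar-electrogram operator, assuming one unipolar signal per epicardial node and a precomputed neighbor list (names, the delay handling, and the amplitude-criterion implementation are illustrative assumptions):

    import numpy as np

    def bipolar_egms(unipolar, neighbors, delay=40):
        """unipolar: (n_nodes, n_samples) ECGI-estimated epicardial potentials.
        neighbors: dict mapping each node to its spatially close nodes."""
        n_nodes, n_samples = unipolar.shape
        bipolar = np.zeros((n_nodes, n_samples))
        for i in range(n_nodes):
            # Delay one electrode, then keep the neighbor giving the
            # maximum-amplitude difference signal
            diffs = [unipolar[i] - np.roll(unipolar[j], delay)
                     for j in neighbors[i]]
            bipolar[i] = max(diffs, key=np.ptp)
        return bipolar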


Subjects
Body Surface Potential Mapping; Electrocardiography; Pericardium/physiology; Signal Processing, Computer-Assisted; Diagnostic Imaging; Humans
4.
Sensors (Basel) ; 20(11)2020 May 29.
Article in English | MEDLINE | ID: mdl-32485879

ABSTRACT

In recent years, the first commercially available equipment for Electrocardiographic Imaging (ECGI), a new cardiac diagnostic tool that opens up a new field of diagnostic possibilities, has attracted both attention and controversy. The previous knowledge and criteria of cardiologists using intracardiac electrograms (EGM) should be revisited in light of the newly available spatial-temporal potentials, and digital signal processing should be readapted to this new data structure. Aiming to contribute to the usefulness of ECGI recordings within the current knowledge and methods of cardiac electrophysiology, we previously presented two results: first, spatial consistency can be observed even in very basic cardiac signal processing stages (such as baseline wander removal and low-pass filtering); second, useful bipolar EGMs can be obtained by a digital processing operator that searches for the maximum amplitude and includes a time delay. In addition, this work aims to demonstrate the functionality of ECGI for cardiac electrophysiology from a twofold view, namely, through the analysis of the EGM waveforms and by studying the ventricular repolarization properties. The former is scrutinized in terms of the clustering properties of the unipolar and bipolar EGM waveforms in control and myocardial infarction subjects, and the latter is analyzed using the properties of T-wave alternans (TWA) in control and Long-QT syndrome (LQTS) example subjects. Clustered regions of the EGMs were spatially consistent and congruent with the presence of infarcted tissue in unipolar EGMs, and bipolar EGMs with adequate signal processing operators kept this consistency and yielded a larger, yet moderate, number of spatial-temporal regions. In terms of the alternans amplitude estimated from the unipolar EGMs, TWA was present in the LQTS subject but not in the control; moreover, higher spatial-temporal variation was present in the LQTS torso and epicardium measurements, and this was consistent across three different methods of alternans estimation. We conclude that spatial-temporal analysis of EGMs in ECGI will pave the way towards enhanced usefulness in clinical practice, so current signal processing approaches should be revisited to deal with the great amount of information that ECGI conveys to the clinician.


Subjects
Arrhythmias, Cardiac; Electrocardiography; Electrophysiologic Techniques, Cardiac; Arrhythmias, Cardiac/diagnosis; Body Surface Potential Mapping; Cluster Analysis; Humans
5.
Sensors (Basel) ; 18(11)2018 Nov 05.
Article in English | MEDLINE | ID: mdl-30400587

ABSTRACT

In recent years, a number of proposals for electrocardiogram (ECG) monitoring based on mobile systems have been presented. We propose here an STM32F-microcontroller-based mobile ECG system providing both long-term (several weeks) Holter monitoring and 12-lead ECG recording, according to the clinical standard requirements for these kinds of recordings, which in addition can apply digital compression at stages close to the acquisition. The system can be especially useful in rural areas of developing countries, where the lack of specialized medical personnel justifies the introduction of telecardiology services, and where the coverage and bandwidth limitations of cellular networks require efficient signal compression systems. The prototype was implemented using a small architecture with 16-bits-per-sample resolution. We used the low-noise TI ADS1198 instrumentation amplifier, which has a multiplexer and an analog-to-digital converter (16 bits, 8 channels) connected to the STM32F processor, whose architecture incorporates a digital signal processing unit and a floating-point unit. On the one hand, the portability of the system allows the user to carry the prototype in a pocket and to perform an ECG examination, either under 12-lead controlled conditions or in Holter monitoring, according to the required clinical scenario. An app on the smartphone gives the user a friendly interface to set up the system. On the other hand, the patients' electronic health records are registered in a web application, which in turn allows them to connect to the Internet from their cellphones, and the ECG signals are then sent through a web server for subsequent, ubiquitous analysis by doctors at any convenient terminal device. In order to determine the quality of the received signals, system testing was performed in the following three scenarios: (1) the prototype was connected to the patient and the signals were stored; (2) the prototype was connected to the patient and the data were transferred to the cellphone; (3) the prototype was connected to the patient, and the data were transferred to the cellphone and to the web via the Internet. An additional benchmarking test with expert clinicians confirmed the clinical quality provided by the system. The proposed ECG system is a first step and paves the way toward mobile cardiac monitors compatible with electrocardiographic practice, including long-term monitoring, 12-lead usability, and the possibility of incorporating signal compression at the early stages of ECG acquisition.


Subjects
Electrocardiography/instrumentation; Signal Processing, Computer-Assisted; Telemedicine/instrumentation; Calibration; Cell Phone; Electrodes; Humans; Internet; Reproducibility of Results; Smartphone; Software
6.
Sensors (Basel) ; 18(5)2018 May 01.
Article in English | MEDLINE | ID: mdl-29723990

ABSTRACT

Despite the wide literature on R-wave detection algorithms for ECG Holter recordings, long-term monitoring applications bring new requirements, and it is not clear that the existing methods can be used straightforwardly in those scenarios. Our aim in this work was twofold: first, we scrutinized the scope and limitations of existing Holter-monitoring methods when moving to long-term monitoring; second, we proposed and benchmarked a beat detection method with adequate accuracy and usefulness in long-term scenarios. A longitudinal study was made with the most widely used waveform analysis algorithms, which allowed us to tune the free parameters of the required blocks, and a transversal study analyzed how these parameters change when moving to different databases. With all the above, the extension to long-term monitoring in a database of 7-day Holter recordings was proposed and analyzed, using optimized simultaneous multilead processing. We considered both our own and public databases. In this new scenario, noise-avoidance mechanisms become more important due to the amount of noise in these recordings; moreover, computational efficiency is a key parameter for exporting the algorithm to clinical practice. The method based on a polling function outperformed the others in terms of accuracy and computational efficiency, yielding 99.48% sensitivity, 99.54% specificity, 99.69% positive predictive value, 99.46% accuracy, and 0.85% error on the MIT-BIH arrhythmia database. We conclude that the method can be used in long-term Holter monitoring systems.
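
The polling-function detector itself is not described in the abstract; purely as a reference point, a generic single-lead R-wave detector in the Pan-Tompkins style (bandpass, differentiate, square, integrate, threshold) looks as follows, with all parameters being illustrative assumptions:

    import numpy as np
    from scipy.signal import butter, filtfilt

    def detect_r_peaks(ecg, fs=360.0):
        # Band-pass around the dominant QRS energy (roughly 5-15 Hz)
        b, a = butter(2, [5 / (fs / 2), 15 / (fs / 2)], btype="band")
        x = filtfilt(b, a, ecg)
        # Differentiate, square, and integrate over a ~150 ms moving window
        win = int(0.15 * fs)
        e = np.convolve(np.diff(x, prepend=x[0]) ** 2,
                        np.ones(win) / win, mode="same")
        thr = 0.35 * np.percentile(e, 99)     # crude global threshold
        refractory = int(0.2 * fs)            # 200 ms refractory period
        peaks, last = [], -refractory
        for i in range(1, len(e) - 1):
            if e[i] > thr and e[i - 1] <= e[i] >= e[i + 1] and i - last > refractory:
                peaks.append(i)
                last = i
        return np.array(peaks)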

7.
Sensors (Basel) ; 17(11)2017 Oct 25.
Article in English | MEDLINE | ID: mdl-29068362

ABSTRACT

Noise and artifacts are inherent contaminating components and are particularly present in Holter electrocardiogram (ECG) monitoring. The presence of noise is even more significant in long-term monitoring (LTM) recordings, as these are collected over several days while patients follow their daily activities; hence, strong artifact components can temporarily impair the clinical measurements from the LTM recordings. Traditionally, noise has been dealt with as a problem of removing non-desirable components by means of quantitative signal metrics such as the signal-to-noise ratio (SNR), but current systems do not provide any information about the true impact of noise on the clinical evaluation of the ECG. As a first step towards an alternative to classical approaches, this work assesses ECG quality under the assumption that an ECG has good quality when it is clinically interpretable. Our hypotheses are therefore that it is possible (a) to create a clinical severity score for the effect of noise on the ECG, (b) to characterize its consistency in terms of its temporal and statistical distribution, and (c) to use it for signal quality evaluation in LTM scenarios. For this purpose, a database of external event recorder (EER) signals was assembled and labeled from a clinical point of view for use as the gold standard of noise severity categorization. These devices are assumed to capture the signal segments most prone to be corrupted with noise during long-term periods. The ECG noise is then characterized by comparing these clinical severity criteria with conventional quantitative metrics taken from traditional noise-removal approaches, and noise maps are proposed as a novel representation tool for this comparison. Our results showed that none of the benchmarked quantitative noise measurement criteria give an accurate enough estimate of the clinical severity of the noise. A case study on long-term ECG is reported, showing the statistical and temporal correspondences and properties with respect to the EER signals used to create the clinical-noise gold standard. The proposed noise maps, together with the statistical consistency of the characterization of clinical noise severity, pave the way towards systems that provide the user with maps of the clinical severity of noise, allowing different ECG segments to be processed with different techniques and in terms of different measured clinical parameters.


Subjects
Electrocardiography/methods; Algorithms; Artifacts; Electrocardiography/standards; Electrocardiography, Ambulatory; Humans; Signal-To-Noise Ratio
8.
Comput Methods Programs Biomed ; 249: 108157, 2024 Jun.
Article in English | MEDLINE | ID: mdl-38582037

ABSTRACT

BACKGROUND AND OBJECTIVE: T-wave alternans (TWA) is a fluctuation in the repolarization morphology of the ECG. It is associated with cardiac instability and risk of sudden cardiac death. Diverse methods have been proposed for TWA analysis. However, TWA detection in ambulatory settings remains a challenge due to the absence of standardized evaluation metrics and detection thresholds. METHODS: In this work we use traditional signal-processing TWA analysis methods for feature extraction, and two machine learning (ML) methods, namely K-nearest neighbors (KNN) and random forest (RF), for TWA detection, addressing hyper-parameter tuning and feature selection. The final goal is the detection of short, non-sustained, and sparse TWA events in ambulatory recordings. RESULTS: We train the ML methods to detect a wide range of alternans voltages, from 20 to 100 µV, i.e., from non-visible micro-alternans to TWA of higher amplitudes, in concordance with risk stratification. In classification, RF significantly outperforms the recall of the signal processing methods, at the expense of a small loss in precision. Although ambulatory detection is an imbalanced-class context, the trained ML systems always outperform the signal processing methods. CONCLUSIONS: We propose a comprehensive integration of multiple variables inspired by TWA signal processing methods to feed learning-based methods. ML models consistently outperform the best signal processing methods, yielding superior recall scores.
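
A minimal scikit-learn sketch of the detection stage described (random forest fed with features from classical TWA analysis, with hyper-parameter tuning); the feature set and grid here are placeholder assumptions:

    import numpy as np
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.model_selection import GridSearchCV

    # X: one row per analysis window, columns = features taken from classical
    # TWA analysis (e.g., spectral K-score, MMA alternans voltage, ...)
    rng = np.random.default_rng(0)
    X = rng.normal(size=(500, 6))           # placeholder features
    y = rng.integers(0, 2, size=500)        # placeholder TWA / no-TWA labels

    # Hyper-parameter tuning as mentioned in the abstract
    grid = {"n_estimators": [100, 300], "max_depth": [None, 5, 10]}
    clf = GridSearchCV(RandomForestClassifier(class_weight="balanced"),
                       grid, scoring="recall", cv=5)
    clf.fit(X, y)
    print(clf.best_params_, clf.best_score_)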


Subjects
Arrhythmias, Cardiac; Electrocardiography, Ambulatory; Humans; Electrocardiography, Ambulatory/methods; Heart Rate; Arrhythmias, Cardiac/diagnosis; Death, Sudden, Cardiac; Signal Processing, Computer-Assisted; Electrocardiography/methods
9.
Med Biol Eng Comput ; 61(9): 2227-2240, 2023 Sep.
Article in English | MEDLINE | ID: mdl-37010711

ABSTRACT

Noise and artifacts strongly affect the quality of the electrocardiogram (ECG) in long-term ECG monitoring (LTM), making some of its parts useless for diagnosis. The clinical severity of noise defines a qualitative quality score according to the way clinicians interpret the ECG, in contrast to assessing noise from a quantitative standpoint. Clinical noise thus refers to a scale of qualitative noise-severity levels that aims to elucidate which ECG fragments are valid for diagnosis from a clinical point of view, unlike the traditional approach, which assesses noise in terms of quantitative severity. This work proposes the use of machine learning (ML) techniques to categorize qualitative noise severity, using a database annotated according to a clinical noise taxonomy as the gold standard. A comparative study is carried out using five representative ML methods, namely K-nearest neighbors, decision trees, support vector machines, single-layer perceptrons, and random forests. The models are fed with signal quality indexes characterizing the waveform in the time and frequency domains, as well as from a statistical viewpoint, to distinguish clinically valid ECG segments from invalid ones. A solid methodology to prevent overfitting to both the dataset and the patient is developed, taking into account class balance, patient separation, and patient rotation in the test set. All the proposed learning systems demonstrated good classification performance, attaining recall, precision, and F1 scores up to 0.78, 0.80, and 0.77, respectively, in the test set with a single-layer perceptron approach. These systems provide a classification solution for assessing the clinical quality of ECGs taken from LTM recordings. Graphical Abstract: Clinical Noise Severity Classification based on Machine Learning techniques towards Long-Term ECG Monitoring.
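
A minimal sketch of the patient-separation idea (segments from one patient never appear in both training and test folds), using a scikit-learn single-layer perceptron as in the abstract; features and labels are placeholders:

    import numpy as np
    from sklearn.linear_model import Perceptron
    from sklearn.model_selection import GroupKFold, cross_val_score

    rng = np.random.default_rng(1)
    X = rng.normal(size=(600, 10))            # signal quality indexes per segment
    y = rng.integers(0, 2, size=600)          # clinically valid / invalid label
    patient = rng.integers(0, 30, size=600)   # patient ID per segment

    # GroupKFold keeps every segment of a patient inside a single fold
    scores = cross_val_score(Perceptron(max_iter=500), X, y,
                             groups=patient, cv=GroupKFold(n_splits=5),
                             scoring="f1")
    print(scores.mean())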


Subjects
Electrocardiography; Neural Networks, Computer; Humans; Electrocardiography/methods; Random Forest; Artifacts; Machine Learning; Algorithms
10.
Heliyon ; 9(1): e12947, 2023 Jan.
Article in English | MEDLINE | ID: mdl-36699267

ABSTRACT

Background and objective: T-wave alternans (TWA) is a fluctuation of the ST-T complex of the surface electrocardiogram (ECG) on an every-other-beat basis. It has been shown to be clinically helpful for sudden cardiac death stratification, though the lack of a gold standard to benchmark detection methods limits its application and impairs the development of alternative techniques. In this work, a novel approach based on machine learning for TWA detection is proposed. Additionally, a complete experimental setup is presented for benchmarking TWA detection methods. Methods: The proposed experimental setup is based on the use of open-source databases to enable experiment replication and on the use of real ECG signals with added TWA episodes. Intra-patient overfitting and class imbalance have also been carefully avoided. The Spectral Method (SM), the Modified Moving Average Method (MMA), and the Time Domain Method (TM) are used to obtain input features for the machine learning (ML) algorithms, namely K-nearest neighbors, decision trees, random forest, support vector machine, and multi-layer perceptron. Results: No large differences were found in the performance of the different ML algorithms. Decision trees showed the best overall performance (accuracy 0.88 ± 0.04, precision 0.89 ± 0.05, recall 0.90 ± 0.05, F1 score 0.89 ± 0.03). Compared to the SM (accuracy 0.79, precision 0.93, recall 0.64, F1 score 0.76), there was an improvement in every metric except precision. Conclusions: In this work, a realistic database to test for the presence of TWA using ML algorithms was assembled. The ML algorithms overall outperformed the SM used as a gold standard. Learning from data to identify alternans yields a substantial increase in detection at the expense of a small increase in false alarms.
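
For reference, a compact numpy sketch of the Spectral Method used as the baseline: stack aligned ST-T segments, look at the beat-to-beat spectrum, and compare the power at 0.5 cycles/beat against a nearby noise band. The band limits and normalization are illustrative assumptions:

    import numpy as np

    def spectral_method_kscore(stt):
        """stt: (n_beats, n_samples) aligned ST-T segments of consecutive
        beats; assumes enough beats (e.g., 128) for a usable spectrum."""
        n_beats = stt.shape[0]
        spec = np.abs(np.fft.rfft(stt, axis=0)) ** 2 / n_beats
        agg = spec.sum(axis=1)                    # aggregate spectrum over samples
        f = np.fft.rfftfreq(n_beats)              # frequency in cycles/beat
        alt = agg[np.argmin(np.abs(f - 0.5))]     # alternans bin (0.5 cycles/beat)
        noise = agg[(f >= 0.33) & (f <= 0.48)]    # spectral noise band
        return (alt - noise.mean()) / noise.std() # K-score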

11.
Comput Biol Med ; 38(1): 1-13, 2008 Jan.
Article in English | MEDLINE | ID: mdl-17669389

ABSTRACT

The electrocardiogram (ECG) is widely used for the diagnosis of heart diseases. Good-quality ECGs are used by physicians for the interpretation and identification of physiological and pathological phenomena. However, in real situations, ECG recordings are often corrupted by artifacts. Two dominant artifacts present in ECG recordings are: (1) high-frequency noise caused by electromyogram-induced noise, power line interference, or mechanical forces acting on the electrodes; (2) baseline wander (BW), which may be due to respiration or to the motion of the patient or the instruments. These artifacts severely limit the utility of recorded ECGs and thus need to be removed for better clinical evaluation. Several methods have been developed for ECG enhancement. In this paper, we propose a new ECG enhancement method based on the recently developed empirical mode decomposition (EMD). The proposed EMD-based method is able to remove both high-frequency noise and BW with minimal signal distortion. The method is validated through experiments on the MIT-BIH databases. Both quantitative and qualitative results are given. The simulations show that the proposed EMD-based method provides very good results for denoising and BW removal.
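
A minimal sketch of EMD-based enhancement in this spirit, using the PyEMD package (a later open-source implementation, not the paper's code); how many IMFs to discard at each end is an illustrative choice:

    import numpy as np
    from PyEMD import EMD   # pip install EMD-signal

    def emd_enhance(ecg, drop_first=1, drop_last=1):
        imfs = EMD().emd(ecg)                    # (n_imfs, n_samples)
        # First IMFs carry high-frequency noise; last ones carry baseline wander
        kept = imfs[drop_first:len(imfs) - drop_last]
        return kept.sum(axis=0)                  # reconstructed, enhanced ECG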


Subjects
Artifacts; Electrocardiography/methods; Signal Processing, Computer-Assisted; Algorithms; Arrhythmias, Cardiac/diagnosis; Arrhythmias, Cardiac/physiopathology; Databases, Factual; Diagnosis, Computer-Assisted/methods; Humans; Reproducibility of Results
12.
IEEE Trans Biomed Eng ; 54(4): 766-9, 2007 Apr.
Article in English | MEDLINE | ID: mdl-17405386

ABSTRACT

Most of the recent electrocardiogram (ECG) compression approaches developed with the wavelet transform are implemented using the discrete wavelet transform. Conversely, wavelet packets (WP) are not extensively used, although they provide an adaptive decomposition for representing signals. In this paper, we present a thresholding-based method to encode ECG signals using WP. The design of the compressor was carried out according to two main goals: (1) the scheme should be simple, to allow real-time implementation; (2) quality: the reconstructed signal should be as similar as possible to the original signal. The proposed scheme is versatile, as neither QRS detection nor a priori signal information is required; as such, it can be applied to any ECG. Results show that WP perform efficiently and can now be considered an alternative in ECG compression applications.
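
A minimal PyWavelets sketch of thresholding-based WP coding (a stand-in for the paper's encoder; wavelet, depth, and threshold are assumptions):

    import numpy as np
    import pywt

    def wp_threshold_encode(ecg, wavelet="db4", level=4, thr_frac=0.05):
        wp = pywt.WaveletPacket(ecg, wavelet, maxlevel=level)
        nodes = wp.get_level(level, order="natural")
        thr = thr_frac * max(np.abs(node.data).max() for node in nodes)
        kept = 0
        for node in nodes:
            mask = np.abs(node.data) >= thr      # keep only significant coefficients
            node.data = node.data * mask
            kept += int(mask.sum())
        return wp.reconstruct(update=False), kept   # reconstruction + kept count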


Subjects
Algorithms; Artifacts; Data Compression/methods; Electrocardiography/methods; Signal Processing, Computer-Assisted; Feasibility Studies; Humans; Reproducibility of Results; Sensitivity and Specificity
13.
Comput Methods Programs Biomed ; 145: 147-155, 2017 Jul.
Article in English | MEDLINE | ID: mdl-28552120

ABSTRACT

BACKGROUND AND OBJECTIVE: T-wave alternans (TWA) is a fluctuation of the ST-T complex occurring on an every-other-beat basis in the surface electrocardiogram (ECG). It has been shown to be an informative risk stratifier for sudden cardiac death, though the lack of a gold standard to benchmark detection methods has promoted the use of synthetic signals. This work proposes a novel signal model to study the performance of TWA detection. Additionally, the methodological validation of a denoising technique based on empirical mode decomposition (EMD), which is used here along with the spectral method (SM), is also tackled. METHODS: The proposed test bed system is based on the following guidelines: (1) use of open-source databases to enable experimental replication; (2) use of real ECG signals and physiological noise; (3) inclusion of randomized TWA episodes. Sensitivity (Se) and specificity (Sp) are analyzed separately. A nonparametric hypothesis test, based on bootstrap resampling, is also used to determine whether the presence of the EMD block actually improves performance. RESULTS: The results show outstanding specificity when the EMD block is used, even in very noisy conditions (0.96 compared to 0.72 for SNR = 8 dB), always superior to that of the conventional SM alone. Regarding sensitivity, the EMD method also outperforms in noisy conditions (0.57 compared to 0.46 for SNR = 8 dB), while it decreases in noiseless conditions. CONCLUSIONS: The proposed test setting guarantees that the actual physiological variability of the cardiac system is reproduced. The use of the EMD-based block in noisy environments enables the identification of most patients with fatal arrhythmias.
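
A minimal sketch of a bootstrap-resampling comparison of this kind: resample the paired per-episode detection outcomes of the two pipelines and estimate how often the EMD-based one fails to improve sensitivity (the data layout is an assumption):

    import numpy as np

    def bootstrap_pvalue(hits_emd, hits_sm, n_boot=2000, seed=0):
        """hits_*: paired 0/1 detection outcomes per simulated TWA episode."""
        rng = np.random.default_rng(seed)
        n = len(hits_emd)
        diffs = np.empty(n_boot)
        for b in range(n_boot):
            s = rng.integers(0, n, size=n)            # resample with replacement
            diffs[b] = hits_emd[s].mean() - hits_sm[s].mean()
        return float((diffs <= 0).mean())             # one-sided p-value estimate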


Subjects
Arrhythmias, Cardiac/diagnosis; Electrocardiography/standards; Benchmarking; Humans; Sensitivity and Specificity
14.
IEEE Trans Biomed Eng ; 53(10): 1943-53, 2006 Oct.
Article in English | MEDLINE | ID: mdl-17019858

ABSTRACT

Voice diseases have been increasing dramatically in recent times, due mainly to unhealthy social habits and voice abuse. These diseases must be diagnosed and treated at an early stage, especially in the case of larynx cancer. It is widely recognized that vocal and voice diseases do not necessarily cause changes in voice quality as perceived by a listener, so acoustic analysis could be a useful tool to diagnose this type of disease. Preliminary research has shown that the detection of voice alterations can be carried out by means of Gaussian mixture models and short-term mel-cepstral parameters complemented by frame energy, together with their first and second derivatives. This paper, using the F-ratio and Fisher's discriminant ratio, demonstrates that the detection of voice impairments can be performed using both mel-cepstral vectors and their first derivative, ignoring the second derivative.
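
A minimal sketch of this feature pipeline with librosa and scikit-learn (file name, GMM size, and scoring scheme are illustrative assumptions; frame energy is appended explicitly):

    import numpy as np
    import librosa
    from sklearn.mixture import GaussianMixture

    y, sr = librosa.load("voice_sample.wav", sr=None)        # hypothetical recording
    mfcc = librosa.feature.mfcc(y=y, sr=sr, n_mfcc=13)
    energy = librosa.feature.rms(y=y)                        # frame energy complement
    base = np.vstack([mfcc, energy])
    feats = np.vstack([base, librosa.feature.delta(base)]).T # static + first derivative

    # One GMM per class (normal / pathological), trained on that class's frames;
    # a recording is assigned to the class with the higher mean log-likelihood
    gmm_normal = GaussianMixture(n_components=8).fit(feats)
    print(gmm_normal.score(feats))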


Subjects
Diagnosis, Computer-Assisted/methods; Models, Biological; Sound Spectrography/methods; Speech Production Measurement/methods; Voice Disorders/diagnosis; Voice Disorders/physiopathology; Voice Quality; Computer Simulation; Humans; Models, Statistical; Normal Distribution; Reproducibility of Results; Sensitivity and Specificity
15.
Front Physiol ; 7: 82, 2016.
Article in English | MEDLINE | ID: mdl-27014083

ABSTRACT

Great effort has been devoted in recent years to the development of sudden cardiac risk predictors based on cardiac electrical signals, mainly obtained from electrocardiogram (ECG) analysis. However, these prediction techniques are still seldom used in clinical practice, partly due to their limited diagnostic accuracy and to the lack of consensus about the appropriate computational signal processing implementation. This paper takes a three-fold, ECG-index-based approach to structure this review of sudden cardiac risk stratification: first, through the computational techniques that have been widely proposed in the technical literature for obtaining these indices; second, through the scientific evidence, which, although supported by observational clinical studies, is not always representative enough; and third, through the limited technology transfer of academy-accepted algorithms, which requires further reflection for future systems. We focus on three families of ECG-derived indices, tackled from the aforementioned viewpoints, namely heart rate turbulence (HRT), heart rate variability (HRV), and T-wave alternans (TWA). In terms of computational algorithms, we still need clearer scientific evidence, standardization, and benchmarking, built on advanced algorithms applied over large and representative datasets. New scenarios, such as electronic health records, big data, long-term monitoring, and cloud databases, will eventually open new frameworks in which to foresee suitable new paradigms in the near future.

16.
Med Eng Phys ; 27(9): 798-802, 2005 Nov.
Article in English | MEDLINE | ID: mdl-15869896

ABSTRACT

The quality of the reconstructed signal in an electrocardiogram (ECG) compression scheme must be measured by objective means, with the percentage root-mean-square difference (PRD) being the most widely used. However, this parameter depends on the dc level of the signal, so its use can lead to confusion when evaluating ECG compressors. In this communication, we show that if the performance of an ECG coder is evaluated only in terms of quality, considering exclusively the PRD, incorrect conclusions can be drawn. The objective of this work is to propose the joint use of several parameters; as the simulations show, the effectiveness and performance of the ECG coder are then evaluated with more precision, and the conclusions inferred from the results are more reliable.
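
In the usual notation (a sketch of the standard definitions, with x the original signal, \hat{x} the reconstruction, and \bar{x} the signal mean), the dc dependence is visible directly:

    PRD = 100 \sqrt{ \frac{\sum_{n=1}^{N} \left( x[n] - \hat{x}[n] \right)^2}
                          {\sum_{n=1}^{N} x[n]^2} },
    \qquad
    PRD_1 = 100 \sqrt{ \frac{\sum_{n=1}^{N} \left( x[n] - \hat{x}[n] \right)^2}
                            {\sum_{n=1}^{N} \left( x[n] - \bar{x} \right)^2} }

A large dc offset inflates the denominator of PRD and makes any coder look artificially good, while the mean-removed variant PRD_1 does not suffer from this.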


Subjects
Algorithms; Data Compression/methods; Diagnosis, Computer-Assisted/methods; Electrocardiography/methods; Signal Processing, Computer-Assisted; Data Interpretation, Statistical; Humans; Models, Cardiovascular; Models, Statistical
17.
Physiol Meas ; 36(9): 1981-94, 2015 Sep.
Article in English | MEDLINE | ID: mdl-26260978

ABSTRACT

The aim of electrocardiogram (ECG) compression is to reduce the amount of data as much as possible while preserving the information that is significant for diagnosis. Objective metrics derived directly from the signal are suitable for controlling the quality of compressed ECGs in practical applications. Many approaches have employed figures of merit based on the percentage root mean square difference (PRD) for this purpose. The benefits and drawbacks of PRD measures, along with other metrics for quality assessment in ECG compression, are analysed in this work. We propose the use of the root mean square error (RMSE) for quality control because it provides a clearer and more stable idea of how far the retrieved ECG waveform, which is the reference signal for establishing a diagnosis, deviates from the original. For this reason, the RMSE is applied here as the target metric in a thresholding algorithm that relies on the retained energy. A state-of-the-art compressor based on this approach, and its PRD-based counterpart, are implemented to test the actual capabilities of the proposed technique. Both compression schemes are employed in several experiments with the whole MIT-BIH Arrhythmia Database to assess both global and local signal distortion. The results show that, using the RMSE for quality control, the distortion of the reconstructed signal is better controlled without reducing the compression ratio.
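
A minimal numpy sketch of the quality-control idea: for an orthonormal transform, Parseval's relation makes the reconstruction RMSE equal to the RMS of the discarded coefficients, so the threshold can be chosen to meet an RMSE target (orthonormality and this search layout are assumptions, not the paper's algorithm):

    import numpy as np

    def threshold_for_rmse(coeffs, rmse_target):
        """Largest amplitude threshold whose discarded energy keeps the
        reconstruction RMSE below target (orthonormal transform assumed;
        ties at the threshold may slightly exceed the target)."""
        c = np.sort(np.abs(coeffs))                  # ascending magnitudes
        budget = rmse_target ** 2 * coeffs.size      # discardable energy (Parseval)
        k = np.searchsorted(np.cumsum(c ** 2), budget)
        return c[k - 1] if k > 0 else 0.0            # zero all |coeff| <= threshold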


Subjects
Data Compression/methods; Electrocardiography/methods; Algorithms; Arrhythmias, Cardiac/physiopathology; Data Compression/standards; Databases, Factual; Electrocardiography/standards; Quality Control
18.
IEEE J Biomed Health Inform ; 19(2): 508-19, 2015 Mar.
Article in English | MEDLINE | ID: mdl-24846672

ABSTRACT

Recent results in telecardiology show that compressed sensing (CS) is a promising tool to lower energy consumption in wireless body area networks for electrocardiogram (ECG) monitoring. However, the performance of current CS-based algorithms, in terms of compression rate and reconstruction quality of the ECG, still falls short of the performance attained by state-of-the-art wavelet-based algorithms. In this paper, we propose to exploit the structure of the wavelet representation of the ECG signal to boost the performance of CS-based methods for compression and reconstruction of ECG signals. More precisely, we incorporate prior information about the wavelet dependencies across scales into the reconstruction algorithms and exploit the high fraction of common support of the wavelet coefficients of consecutive ECG segments. Experimental results utilizing the MIT-BIH Arrhythmia Database show that significant performance gains, in terms of compression rate and reconstruction quality, can be obtained by the proposed algorithms compared to current CS-based methods.
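
A minimal sketch of CS recovery with a sparsity prior, via iterative soft-thresholding (ISTA) in a transform domain; this is a generic baseline, not the paper's structured algorithm (which additionally models wavelet dependencies across scales and the common support of consecutive segments):

    import numpy as np

    def ista(y, Phi, lam=0.05, n_iter=200):
        """Recover sparse x from y = Phi @ x by minimizing
        0.5*||y - Phi x||^2 + lam*||x||_1 via soft-thresholding iterations."""
        L = np.linalg.norm(Phi, 2) ** 2        # Lipschitz constant of the gradient
        x = np.zeros(Phi.shape[1])
        for _ in range(n_iter):
            z = x - Phi.T @ (Phi @ x - y) / L  # gradient step
            x = np.sign(z) * np.maximum(np.abs(z) - lam / L, 0.0)   # shrinkage
        return x

    # Usage: random Gaussian sensing of a synthetic sparse coefficient vector
    rng = np.random.default_rng(0)
    n, m = 256, 96
    x_true = np.zeros(n)
    x_true[rng.choice(n, 10, replace=False)] = rng.normal(size=10)
    Phi = rng.normal(size=(m, n)) / np.sqrt(m)
    x_hat = ista(Phi @ x_true, Phi)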


Subjects
Data Compression/methods; Electrocardiography/methods; Algorithms; Databases, Factual; Humans; Remote Sensing Technology; Wavelet Analysis; Wireless Technology
19.
Med Eng Phys ; 26(7): 553-68, 2004 Sep.
Article in English | MEDLINE | ID: mdl-15271283

ABSTRACT

In this work, a filter bank-based algorithm for electrocardiogram (ECG) signal compression is proposed. The new coder consists of three different stages. In the first stage, the subband decomposition stage, we compare the performance of a nearly perfect reconstruction (N-PR) cosine-modulated filter bank with the wavelet packet (WP) technique. Both schemes use the same coding algorithm, thus permitting an effective comparison. The target of the comparison is the quality of the reconstructed signal, which must remain within predetermined accuracy limits. We employ the most widely used quality criterion for compressed ECGs, the percentage root-mean-square difference (PRD), complemented by the maximum amplitude error (MAX). The tests were done on the 12 principal cardiac leads, and the amount of compression is evaluated by means of the mean number of bits per sample (MBPS) and the compression ratio (CR). The implementation cost of both the filter bank and the WP technique has also been studied. The results show that the N-PR cosine-modulated filter bank method outperforms the WP technique in both quality and efficiency.
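
For reference, in a cosine-modulated filter bank all M analysis filters are modulated versions of a single prototype p(n); a standard form (the paper's exact design choices may differ) is:

    h_k(n) = 2\, p(n) \cos\!\left( \frac{\pi}{M}\left(k + \frac{1}{2}\right)\left(n - \frac{N}{2}\right) + (-1)^k \frac{\pi}{4} \right), \qquad k = 0, \dots, M-1,

where N is the prototype filter order; an N-PR design relaxes the perfect-reconstruction conditions on p(n) in exchange for a better prototype, e.g., higher stopband attenuation.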


Subjects
Algorithms; Electrocardiography; Models, Cardiovascular; Signal Processing, Computer-Assisted; Biomedical Engineering; Data Interpretation, Statistical; Time Factors
20.
Med Eng Phys ; 34(7): 892-9, 2012 Sep.
Article in English | MEDLINE | ID: mdl-22056794

ABSTRACT

The use of long-term recordings in electroencephalography is becoming more frequent due to their diagnostic potential and the growth of novel signal processing methods that deal with these types of recordings. In these cases, the considerable volume of data to be managed makes compression necessary to reduce the bit rate for transmission and storage applications. In this paper, a new compression algorithm specifically designed to encode electroencephalographic (EEG) signals is proposed. Cosine-modulated filter banks are used to decompose the EEG signal into a set of subbands well adapted to the characteristic frequency bands of the EEG. Given that no regular pattern can easily be extracted from the signal in the time domain, a thresholding-based method is applied to quantize the samples. The retained-energy method is designed to efficiently compute the threshold in the decomposition domain, which at the same time allows the quality of the reconstructed EEG to be controlled. The experiments are conducted over a large set of signals taken from two public databases available at Physionet, and the results show that the compression scheme yields better compression than other reported methods.
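
A minimal sketch of the retained-energy rule referred to above: choose the threshold so that the kept subband coefficients hold a fixed fraction of the total energy (the fraction and data layout are illustrative assumptions):

    import numpy as np

    def retained_energy_threshold(coeffs, retain=0.999):
        """Smallest amplitude threshold such that the kept coefficients
        hold at least `retain` of the total energy."""
        c = np.sort(np.abs(coeffs))[::-1]            # descending magnitudes
        energy = np.cumsum(c ** 2)
        k = np.searchsorted(energy, retain * energy[-1]) + 1
        return c[min(k, c.size) - 1]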


Subjects
Electroencephalography/methods; Signal Processing, Computer-Assisted; Adolescent; Algorithms; Child; Child, Preschool; Databases, Factual; Entropy; Female; Humans; Infant; Male; Young Adult