1.
Sensors (Basel) ; 23(5)2023 Feb 24.
Article in English | MEDLINE | ID: mdl-36904731

ABSTRACT

The causes of ventricular fibrillation (VF) are not yet elucidated, and it has been proposed that different mechanisms might exist. Moreover, conventional analysis methods do not seem to provide time or frequency domain features that allow for recognition of different VF patterns in electrode-recorded biopotentials. The present work aims to determine whether low-dimensional latent spaces could exhibit discriminative features for different mechanisms or conditions during VF episodes. For this purpose, manifold learning using autoencoder neural networks was analyzed based on surface ECG recordings. The recordings covered the onset of the VF episode as well as the next 6 min, and comprised an experimental database based on an animal model with five situations, including control, drug intervention (amiodarone, diltiazem, and flecainide), and autonomic nervous system blockade. The results show that latent spaces from unsupervised and supervised learning schemes yielded moderate though quite noticeable separability among the different types of VF according to their type or intervention. In particular, unsupervised schemes reached a multi-class classification accuracy of 66%, while supervised schemes improved the separability of the generated latent spaces, providing a classification accuracy of up to 74%. Thus, we conclude that manifold learning schemes can provide a valuable tool for studying different types of VF while working in low-dimensional latent spaces, as the machine-learning generated features exhibit separability among different VF types. This study confirms that latent variables are better VF descriptors than conventional time or frequency domain features, making this technique useful in current VF research on the elucidation of the underlying VF mechanisms.
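The dimensionality-reduction idea above can be illustrated with a minimal linear autoencoder in NumPy. This is only a sketch under stated assumptions, not the authors' network: the function name and the gradient-descent training loop are illustrative, and a linear single-layer model is used instead of the deeper architectures typically applied to ECG features.

```python
import numpy as np

def train_linear_autoencoder(X, latent_dim=2, lr=1e-2, epochs=500, seed=0):
    """Minimal linear autoencoder trained by gradient descent (sketch).

    X: (n_samples, n_features) matrix of ECG-derived feature vectors.
    Returns the encoder weights W_e and the latent codes Z = X @ W_e.
    """
    rng = np.random.default_rng(seed)
    n, d = X.shape
    W_e = rng.normal(scale=0.1, size=(d, latent_dim))  # encoder weights
    W_d = rng.normal(scale=0.1, size=(latent_dim, d))  # decoder weights
    for _ in range(epochs):
        Z = X @ W_e            # project to the latent space
        X_hat = Z @ W_d        # reconstruct the input
        E = X_hat - X          # reconstruction error
        # Gradients of the mean squared reconstruction error
        gW_d = Z.T @ E / n
        gW_e = X.T @ (E @ W_d.T) / n
        W_d -= lr * gW_d
        W_e -= lr * gW_e
    return W_e, X @ W_e
```

In this linear case the latent space spans roughly the same subspace as PCA; the nonlinear autoencoders used for manifold learning replace the projections with multilayer networks.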


Subjects
Electrocardiography, Ventricular Fibrillation, Animals, Electrocardiography/methods, Neural Networks, Computer
2.
Int J Sports Med ; 42(2): 138-146, 2021 Feb.
Article in English | MEDLINE | ID: mdl-32842158

ABSTRACT

The aim of this study was to validate the measurements of the beat intervals taken at rest by the Omegawave® device by comparing them to an ambulatory electrocardiogram system. For this purpose, the electrocardiogram was digitally processed, time-aligned, and scrutinized for its suitable use as a gold standard. Resting measurements were taken for 10 minutes on 5 different days in 10 men and 3 women (24.8±5.05 years; 71.82±11.02 kg; 174.35±9.13 cm). RR intervals were simultaneously recorded using the Omegawave device and a Holter electrocardiogram. The processing of Holter electrocardiogram signals included the detrending of baseline noise and a high-pass filtering for emphasizing the QRS complexes and attenuating the T waves. After obtaining the RR intervals from the electrocardiogram, those from the Omegawave device were automatically aligned to them with cross-correlation digital processing techniques and compared to check whether both measurements could be considered superimposable. A Bland-Altman analysis was applied to the 5 measurements made for all subjects. The Omegawave device exhibited very strong agreement with a quality-controlled Holter electrocardiogram. Deviations not exceeding 25 ms could be expected in 95% of the cases, which is within manageable ranges both for clinical practice and for sports.
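The two processing steps named above, cross-correlation alignment and Bland-Altman agreement, can be sketched in a few lines of NumPy. The function names are illustrative, not those of the study's pipeline:

```python
import numpy as np

def align_by_xcorr(x, y):
    """Lag (in samples) by which series x is delayed with respect to y,
    estimated from the peak of the full cross-correlation."""
    x = np.asarray(x, float) - np.mean(x)
    y = np.asarray(y, float) - np.mean(y)
    return int(np.argmax(np.correlate(x, y, mode="full")) - (len(y) - 1))

def bland_altman(rr_a, rr_b):
    """Bland-Altman agreement between two paired RR-interval series (ms).

    Returns the mean difference (bias) and the 95% limits of agreement,
    bias +/- 1.96 standard deviations of the paired differences."""
    a, b = np.asarray(rr_a, float), np.asarray(rr_b, float)
    diff = a - b
    bias = diff.mean()
    sd = diff.std(ddof=1)
    return bias, (bias - 1.96 * sd, bias + 1.96 * sd)
```

The reported result (deviations under 25 ms in 95% of cases) corresponds to the half-width of these limits of agreement.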


Subjects
Electrocardiography, Ambulatory/instrumentation, Electrocardiography, Ambulatory/standards, Adult, Female, Humans, Male, Young Adult
3.
Sensors (Basel) ; 20(11)2020 Jun 01.
Article in English | MEDLINE | ID: mdl-32492938

ABSTRACT

During the last years, Electrocardiographic Imaging (ECGI) has emerged as a powerful and promising clinical tool to support cardiologists. Starting from a plurality of potential measurements on the torso, ECGI yields a noninvasive estimation of the causative potentials on the epicardium. This unprecedented amount of measured cardiac signals needs to be conditioned and adapted to current knowledge and methods in cardiac electrophysiology in order to maximize its support to the clinical practice. In this setting, many cardiac indices are defined in terms of the so-called bipolar electrograms, which correspond to differential potentials between two spatially close potential measurements. Our aim was to contribute to the usefulness of ECGI recordings in the current knowledge and methods of cardiac electrophysiology. For this purpose, we first analyzed the basic stages of conventional cardiac signal processing and scrutinized the implications of the spatial-temporal nature of signals in ECGI scenarios. Specifically, the stages of baseline wander removal, low-pass filtering, and beat segmentation and synchronization were considered. We also aimed to establish a mathematical operator to provide suitable bipolar electrograms from the ECGI-estimated epicardium potentials. Results were obtained on data from an infarction patient and from a healthy subject. First, the low-frequency and high-frequency noises are shown to be non-independently distributed in the ECGI-estimated recordings due to their spatial dimension. Second, bipolar electrograms are better estimated when using the criterion of the maximum-amplitude difference between spatial neighbors, but also a temporal delay in discrete time of about 40 samples has to be included to obtain the usual morphology in clinical bipolar electrograms from catheters.
We conclude that spatial-temporal digital signal processing and bipolar electrograms can pave the way towards the usefulness of ECGI recordings in the cardiological clinical practice. The companion paper is devoted to analyzing clinical indices obtained from ECGI epicardial electrograms measuring waveform variability and repolarization tissue properties.
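The bipolar-electrogram operator described above (maximum-amplitude difference between spatial neighbors, plus a discrete-time delay) can be sketched as follows. This is an assumption-laden illustration, not the paper's implementation: the function name, the neighbor-list representation, and the use of peak-to-peak amplitude as the selection criterion are all illustrative choices.

```python
import numpy as np

def bipolar_egm(unipolar, neighbors, delay=40):
    """Build bipolar EGMs from ECGI-estimated unipolar potentials (sketch).

    unipolar: (n_nodes, n_samples) epicardial potentials.
    neighbors: list of neighbor-index lists, one per node.
    For each node, the spatial neighbor maximizing the peak-to-peak
    amplitude of the difference signal is chosen; the neighbor is delayed
    by `delay` samples (about 40 in the text) before subtraction, to mimic
    the morphology of clinical catheter bipolar recordings."""
    out = np.zeros_like(unipolar)
    for i in range(len(unipolar)):
        best, best_amp = None, -np.inf
        for j in neighbors[i]:
            shifted = np.roll(unipolar[j], delay)
            diff = unipolar[i] - shifted
            amp = diff.max() - diff.min()  # peak-to-peak amplitude
            if amp > best_amp:
                best, best_amp = diff, amp
        out[i] = best
    return out
```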


Subjects
Body Surface Potential Mapping, Electrocardiography, Pericardium/physiology, Signal Processing, Computer-Assisted, Diagnostic Imaging, Humans
4.
Sensors (Basel) ; 20(11)2020 May 29.
Article in English | MEDLINE | ID: mdl-32485879

ABSTRACT

During the last years, attention and controversy have been present for the first commercially available equipment being used in Electrocardiographic Imaging (ECGI), a new cardiac diagnostic tool which opens up a new field of diagnostic possibilities. Previous knowledge and criteria of cardiologists using intracardiac Electrograms (EGM) should be revisited from the newly available spatial-temporal potentials, and digital signal processing should be readapted to this new data structure. Aiming to contribute to the usefulness of ECGI recordings in the current knowledge and methods of cardiac electrophysiology, we previously presented two results: First, spatial consistency can be observed even for very basic cardiac signal processing stages (such as baseline wander and low-pass filtering); second, useful bipolar EGMs can be obtained by a digital processing operator searching for the maximum amplitude and including a time delay. In addition, this work aims to demonstrate the functionality of ECGI for cardiac electrophysiology from a twofold view, namely, through the analysis of the EGM waveforms, and by studying the ventricular repolarization properties. The former is scrutinized in terms of the clustering properties of the unipolar and bipolar EGM waveforms, in control and myocardial infarction subjects, and the latter is analyzed using the properties of T-wave alternans (TWA) in control and in Long-QT syndrome (LQTS) example subjects. Clustered regions of the EGMs were spatially consistent and congruent with the presence of infarcted tissue in unipolar EGMs, and bipolar EGMs with adequate signal processing operators held this consistency and yielded a larger, yet moderate, number of spatial-temporal regions.
In terms of the alternans amplitude estimated from the unipolar EGMs, TWA was absent in the control subject but present in the LQTS subject; moreover, higher spatial-temporal variation was present in the LQTS torso and epicardium measurements, which was consistent across three different methods of alternans estimation. We conclude that spatial-temporal analysis of EGMs in ECGI will pave the way towards enhanced usefulness in clinical practice, and that signal processing approaches should be conveniently revisited to deal with the great amount of information that ECGI conveys to the clinician.


Subjects
Arrhythmias, Cardiac, Electrocardiography, Electrophysiologic Techniques, Cardiac, Arrhythmias, Cardiac/diagnosis, Body Surface Potential Mapping, Cluster Analysis, Humans
5.
Sensors (Basel) ; 20(15)2020 Jul 27.
Article in English | MEDLINE | ID: mdl-32726931

ABSTRACT

Ventricular fibrillation (VF) signals are characterized by highly volatile and erratic electrical impulses, the analysis of which is difficult given the complex behavior of the heart rhythms in the left (LV) and right ventricles (RV), as sometimes shown in intracardiac recorded Electrograms (EGM). However, there are few studies that analyze VF in humans according to the simultaneous behavior of heart signals in the two ventricles. The objective of this work was to perform a spectral and a non-linear analysis of the recordings of 22 patients with Congestive Heart Failure (CHF) and clinical indication for a cardiac resynchronization device, simultaneously obtained in the LV and RV during induced VF in patients with a Biventricular Implantable Cardioverter Defibrillator (BICD), Contak Renewal IV™ (Boston Scientific). The Fourier Transform was used to identify the spectral content of the first six seconds of signals recorded in the RV and LV simultaneously. In addition, measurements that were based on Information Theory were scrutinized, including Entropy and Mutual Information. The results showed that in most patients the spectral envelopes of the EGM sources of RV and LV were complex, different, and with several frequency peaks. In addition, the Dominant Frequency (DF) in the LV was higher than in the RV, while the Organization Index (OI) had the opposite trend. The entropy measurements were more regular in the RV than in the LV, thus supporting the spectral findings. We can conclude that basic stochastic processing techniques should be scrutinized with caution and from basic to elaborated techniques, but they can provide us with useful information on the biosignals from both ventricles during VF.
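The two spectral indices named above, Dominant Frequency (DF) and Organization Index (OI), can be sketched with the FFT. This is a hedged illustration: the band limits, bandwidth, and number of harmonics are assumptions (definitions of OI vary across the literature), not the values used in the study.

```python
import numpy as np

def dominant_frequency(x, fs, fmin=1.0, fmax=10.0):
    """Dominant frequency (Hz) of a VF electrogram segment via the FFT.

    The spectrum is restricted to a plausible VF band (fmin-fmax, an
    assumption here) before taking the maximum spectral peak."""
    x = np.asarray(x, float) - np.mean(x)
    freqs = np.fft.rfftfreq(len(x), 1.0 / fs)
    power = np.abs(np.fft.rfft(x)) ** 2
    band = (freqs >= fmin) & (freqs <= fmax)
    return freqs[band][np.argmax(power[band])]

def organization_index(x, fs, bw=0.75, n_harmonics=2):
    """Organization Index (one common formulation): power in narrow bands
    around the DF and its harmonics, divided by the total power."""
    x = np.asarray(x, float) - np.mean(x)
    freqs = np.fft.rfftfreq(len(x), 1.0 / fs)
    power = np.abs(np.fft.rfft(x)) ** 2
    df = dominant_frequency(x, fs)
    near = np.zeros(len(freqs), dtype=bool)
    for h in range(1, n_harmonics + 1):
        near |= np.abs(freqs - h * df) <= bw
    return power[near].sum() / power[freqs > 0].sum()
```

A highly organized rhythm (power concentrated at the DF and its harmonics) gives OI close to 1, while disorganized VF spreads power across the band and lowers OI.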


Subjects
Ventricular Fibrillation, Arrhythmias, Cardiac, Defibrillators, Implantable, Electrocardiography, Electrophysiologic Techniques, Cardiac, Heart Failure, Heart Ventricles, Humans, Ventricular Fibrillation/diagnosis
6.
J Electrocardiol ; 52: 99-100, 2019.
Article in English | MEDLINE | ID: mdl-30529813

ABSTRACT

Autonomic regulation plays a role in the progression of heart failure with reduced ejection fraction (HFrEF). In twenty-one HFrEF patients, 60.8 ± 13.1 years, angiotensin inhibition was replaced by an angiotensin receptor-neprilysin inhibitor (ARNI). A 24-hour Holter recording was performed before and after 3 months of the maximum tolerated dose of ARNI. We evaluated changes in autonomic tone using heart rate variability (SDNN, rMSSD, pNN50, LF, HF, LF/HF, α1, α2) and heart rate turbulence (TO and TS). ARNI was up-titrated to a maximum daily dose of 190 ± 102 mg, 47.5% of the target dose. ARNI therapy was not associated with improvement in any of the parameters related to heart rate variability or heart rate turbulence (p > 0.05 for all). ARNI use at lower than target doses did not improve autonomic cardiac tone as evaluated by 24-hour Holter monitoring.
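The time-domain HRV indices listed above (SDNN, rMSSD, pNN50) have standard definitions that can be computed directly from the RR series. A minimal sketch, with an illustrative function name:

```python
import numpy as np

def hrv_time_domain(rr_ms):
    """Standard time-domain HRV indices from an RR-interval series in ms:
    SDNN  - standard deviation of all RR intervals,
    rMSSD - root mean square of successive RR differences,
    pNN50 - percentage of successive differences exceeding 50 ms."""
    rr = np.asarray(rr_ms, float)
    d = np.diff(rr)
    return {
        "SDNN": rr.std(ddof=1),
        "rMSSD": np.sqrt(np.mean(d ** 2)),
        "pNN50": 100.0 * np.mean(np.abs(d) > 50.0),
    }
```

The frequency-domain (LF, HF) and fractal (α1, α2) indices require spectral estimation and detrended fluctuation analysis, respectively, and are not sketched here.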


Subjects
Aminobutyrates/administration & dosage, Angiotensin Receptor Antagonists/administration & dosage, Autonomic Nervous System/drug effects, Heart Failure/drug therapy, Tetrazoles/administration & dosage, Biphenyl Compounds, Dose-Response Relationship, Drug, Drug Combinations, Electrocardiography, Ambulatory, Female, Heart Rate Determination, Humans, Male, Middle Aged, Stroke Volume, Valsartan
7.
Sensors (Basel) ; 19(18)2019 Sep 14.
Article in English | MEDLINE | ID: mdl-31540042

ABSTRACT

During the last decades there has been a rapidly growing elderly population, and the number of patients with chronic heart-related diseases has exploded. Many of them (such as those with congestive heart failure or some types of arrhythmias) require close medical supervision, thus imposing a heavy burden on healthcare costs in most western economies. Specifically, continuous or frequent Arterial Blood Pressure (ABP) and electrocardiogram (ECG) monitoring are important tools in the follow-up of many of these patients. In this work, we present a novel remote non-ambulatory and clinically validated heart self-monitoring system, which allows ABP and ECG monitoring to effectively identify clinically relevant arrhythmias. The system integrates digital transmission of the ECG and tensiometer measurements, within a patient-comfortable support, easy to recharge and with multi-function software, all adapted to elderly users. The main novelty is that both physiological variables (ABP and ECG) are simultaneously measured in an ambulatory environment, which to the best of our knowledge is not readily available in the clinical market. Different processing techniques were implemented to analyze the heart rhythm, including pause detection, rhythm alterations, and atrial fibrillation, hence allowing early detection of these conditions. Our results achieved clinical quality both for in-lab hardware testing and for ambulatory scenario validations. The proposed active assisted living (AAL) sensor-based system is an end-to-end multidisciplinary system, fully connected to a platform and tested by the clinical team from beginning to end.
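Of the rhythm-analysis blocks mentioned above, pause detection is the simplest to sketch: flag RR intervals exceeding a clinical threshold. The 3-second threshold below is a common clinical criterion, assumed here; the abstract does not specify the rule the system actually uses.

```python
import numpy as np

def detect_pauses(r_peaks_s, pause_threshold_s=3.0):
    """Flag RR intervals longer than a pause threshold (sketch).

    r_peaks_s: R-peak times in seconds, in increasing order.
    Returns a list of (start_time, end_time) pairs for detected pauses."""
    t = np.asarray(r_peaks_s, float)
    rr = np.diff(t)
    idx = np.where(rr > pause_threshold_s)[0]
    return [(t[i], t[i + 1]) for i in idx]
```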


Subjects
Assisted Living Facilities, Heart/physiology, Monitoring, Physiologic/instrumentation, Telemedicine/instrumentation, Algorithms, Electrocardiography, Humans, Mobile Applications, Signal Processing, Computer-Assisted, User-Computer Interface
8.
Entropy (Basel) ; 21(4)2019 Apr 19.
Article in English | MEDLINE | ID: mdl-33267133

ABSTRACT

Customer Relationship Management (CRM) is a fundamental tool in the hospitality industry nowadays, which can be seen as a big-data scenario due to the large amount of recordings which are annually handled by managers. Data quality is crucial for the success of these systems, and one of the main issues to be solved by businesses in general and by hospitality businesses in particular in this setting is the identification of duplicated customers, which has not received much attention in recent literature, probably and partly because it is not an easy-to-state problem in statistical terms. In the present work, we address the problem statement of duplicated customer identification as a large-scale data analysis, and we propose and benchmark a general-purpose solution for it. Our system consists of four basic elements: (a) A generic feature representation for the customer fields in a simple table-shape database; (b) An efficient distance for comparison among feature values, in terms of the Wagner-Fischer algorithm to calculate the Levenshtein distance; (c) A big-data implementation using basic map-reduce techniques to readily support the comparison of strategies; (d) An X-from-M criterion to identify those possible neighbors to a duplicated-customer candidate. We analyze the probability mass function of the distances in the CRM text-based fields and characterize their behavior and consistency in terms of the entropy and of the mutual information for these fields. Our experiments in a large CRM from a multinational hospitality chain show that the distance distributions are statistically consistent for each feature, and that neighbourhood thresholds are automatically adjusted by the system in a first step and can be subsequently more finely tuned according to the manager experience.
The entropy distributions for the different variables, as well as the mutual information between pairs, are characterized by multimodal profiles, where a wide gap between close and far fields is often present. This motivates the proposal of the so-called X-from-M strategy, which is shown to be computationally affordable, and can provide the expert with a reduced number of duplicated candidates to supervise, with low X values being enough to warrant the sensitivity required at the automatic detection stage. The proposed system again encourages and supports the benefits of big-data technologies in CRM scenarios for hotel chains, and rather than the use of ad-hoc heuristic rules, it promotes the research and development of theoretically principled approaches.
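The field-comparison core named in element (b) is the Wagner-Fischer dynamic program for the Levenshtein distance, which can be written compactly with a rolling row. The function name is illustrative:

```python
def levenshtein(a, b):
    """Wagner-Fischer dynamic programming for the Levenshtein distance:
    the minimum number of single-character insertions, deletions, and
    substitutions turning string a into string b. Keeps only one previous
    row of the DP table, so memory is O(len(b))."""
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        cur = [i]
        for j, cb in enumerate(b, 1):
            cur.append(min(prev[j] + 1,                 # deletion
                           cur[j - 1] + 1,              # insertion
                           prev[j - 1] + (ca != cb)))   # substitution
        prev = cur
    return prev[-1]
```

In the duplicate-detection setting, distances like `levenshtein(name_1, name_2)` are computed per field and then thresholded against the field's distance distribution.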

9.
Entropy (Basel) ; 21(6)2019 Jun 15.
Article in English | MEDLINE | ID: mdl-33267308

ABSTRACT

The identification of patients with increased risk of Sudden Cardiac Death (SCD) has been widely studied during recent decades, and several quantitative measurements have been proposed from the analysis of the electrocardiogram (ECG) stored in 1-day Holter recordings. Indices based on nonlinear dynamics of Heart Rate Variability (HRV) have been shown to convey predictive information in terms of factors related with the cardiac regulation by the autonomic nervous system, and among them, multiscale methods aim to provide more complete descriptions than single-scale based measures. However, there is limited knowledge on the suitability of nonlinear measurements to characterize the cardiac dynamics in current long-term monitoring scenarios of several days. Here, we scrutinized the long-term robustness properties of three nonlinear methods for HRV characterization, namely, the Multiscale Entropy (MSE), the Multiscale Time Irreversibility (MTI), and the Multifractal Spectrum (MFS). These indices were selected because all of them have been theoretically designed to take into account the multiple time scales inherent in healthy and pathological cardiac dynamics, and they have been analyzed so far when monitoring up to 24 h of ECG signals, corresponding to about 20 time scales. We analyzed them in 7-day Holter recordings from two data sets, namely, patients with Atrial Fibrillation and with Congestive Heart Failure, by reaching up to 100 time scales. In addition, a new comparison procedure is proposed to statistically compare the population multiscale representations in different patient or processing conditions, in terms of the non-parametric estimation of confidence intervals for the averaged median differences. Our results show that variance reduction is actually obtained in the multiscale estimators.
The MSE (MTI) exhibited the lowest (largest) bias and variance at large scales, whereas all the methods exhibited a consistent description of the large-scale processes in terms of multiscale index robustness. In all the methods, the algorithms used could yield some inconsistency in the multiscale profile, which was checked not to be due to the presence of artifacts, although its origin remains unclear. The reduction in standard error for several-day recordings compared to one-day recordings was more evident in MSE, whereas bias was more patently present in MFS. Our results pave the way for these techniques towards their use, with improved algorithmic implementations and nonparametric statistical tests, in long-term cardiac Holter monitoring scenarios.
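The MSE construction referred to above can be sketched in NumPy: coarse-grain the RR series at each scale and compute the sample entropy of the coarse-grained series, with the tolerance fixed from the original series' standard deviation. This is a simplified sketch (the pair counting is slightly simplified with respect to the canonical SampEn definition), not the optimized implementation the study calls for.

```python
import numpy as np

def sample_entropy(x, m=2, tol=0.2):
    """Sample entropy SampEn(m, tol) with an absolute tolerance (sketch).

    Counts template pairs within Chebyshev distance tol for lengths m and
    m+1; SampEn is minus the log of their ratio."""
    x = np.asarray(x, float)
    def count(mm):
        t = np.array([x[i:i + mm] for i in range(len(x) - mm + 1)])
        c = 0
        for i in range(len(t) - 1):
            d = np.max(np.abs(t[i + 1:] - t[i]), axis=1)
            c += int(np.sum(d <= tol))
        return c
    b, a = count(m), count(m + 1)
    return -np.log(a / b) if a > 0 and b > 0 else np.inf

def multiscale_entropy(x, max_scale=5, m=2, r=0.2):
    """Standard MSE: average the series in non-overlapping windows of
    length s (the scale) and compute SampEn of each coarse series, with
    tolerance r times the SD of the original series."""
    x = np.asarray(x, float)
    tol = r * x.std()
    profile = []
    for s in range(1, max_scale + 1):
        n = len(x) // s
        coarse = x[:n * s].reshape(n, s).mean(axis=1)
        profile.append(sample_entropy(coarse, m, tol))
    return profile
```

Reaching 100 scales on 7-day recordings, as in the study, mainly requires enough coarse-grained samples per scale and an efficient pair-counting implementation.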

10.
Biomed Eng Online ; 17(1): 86, 2018 Jun 20.
Article in English | MEDLINE | ID: mdl-29925384

ABSTRACT

BACKGROUND: The inverse problem in electrophysiology consists of the accurate estimation of the intracardiac electrical sources from a reduced set of electrodes at short distances and from outside the heart. This estimation can provide an image with relevant knowledge on arrhythmia mechanisms for the clinical practice. Methods based on truncated singular value decomposition (TSVD) and regularized least squares require a matrix inversion, which limits their resolution due to the unavoidable low-pass filter effect of the Tikhonov regularization techniques. METHODS: We propose to use, for the first time, a Mercer's kernel given by the Laplacian of the distance in the quasielectrostatic field equations, hence providing a Support Vector Regression (SVR) formulation by following the Dual Signal Model (DSM) principles for creating kernel algorithms. RESULTS: Simulations in one- and two-dimensional models show the performance of our Laplacian distance kernel technique versus several conventional methods. Firstly, the one-dimensional model is adjusted for yielding recorded electrograms, similar to the ones that are usually observed in electrophysiological studies, and a suitable strategy is designed for the free-parameter search. Secondly, simulations both in one- and two-dimensional models show larger noise sensitivity in the estimated transfer matrix than in the observation measurements, and DSM-SVR is shown to be more robust to noisy transfer matrices than TSVD. CONCLUSION: These results suggest that our proposed DSM-SVR with Laplacian distance kernel can be an efficient alternative to improve the resolution in current and emerging intracardiac imaging systems.
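The TSVD baseline against which DSM-SVR is compared can be sketched in a few lines: keep only the k largest singular values of the transfer matrix when inverting, so that noise-dominated directions are discarded. The function name is illustrative.

```python
import numpy as np

def tsvd_solve(A, b, k):
    """Truncated-SVD solution of A x = b (sketch of the TSVD baseline).

    A: transfer matrix mapping sources to measurements.
    b: measurement vector.
    k: number of singular values to retain; the remaining (small,
       noise-amplifying) singular values are not inverted."""
    U, s, Vt = np.linalg.svd(A, full_matrices=False)
    inv = np.zeros_like(s)
    inv[:k] = 1.0 / s[:k]          # invert only the k largest modes
    return Vt.T @ (inv * (U.T @ b))
```

The truncation level k plays the same role as the Tikhonov regularization parameter: too small over-smooths the estimated sources, too large amplifies measurement noise.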


Subjects
Electrophysiological Phenomena, Heart/physiology, Models, Cardiovascular, Electroencephalography, Least-Squares Analysis, Signal-To-Noise Ratio, Support Vector Machine
11.
Sensors (Basel) ; 18(11)2018 Nov 05.
Article in English | MEDLINE | ID: mdl-30400587

ABSTRACT

In recent years, a number of proposals for electrocardiogram (ECG) monitoring based on mobile systems have been delivered. We propose here an STM32F-microcontroller-based ECG mobile system providing both long-term (several weeks) Holter monitoring and 12-lead ECG recording, according to the clinical standard requirements for these kinds of recordings, which in addition can yield further digital compression at stages close to the acquisition. The system can be especially useful in rural areas of developing countries, where the lack of specialized medical personnel justifies the introduction of telecardiology services, and the limitations of coverage and bandwidth of cellular networks require the use of efficient signal compression systems. The prototype was implemented using a small architecture, with a 16-bit-per-sample resolution. We also used a low-noise instrumentation amplifier TI ADS1198, which has a multiplexer and an analog-to-digital converter (16 bits and 8 channels) connected to the STM32F processor, the architecture of which incorporates a digital signal processing unit and a floating-point unit. On the one hand, the system portability allows the user to take the prototype in her/his pocket and to perform an ECG examination, either in 12-lead controlled conditions or in Holter monitoring, according to the required clinical scenario. An app on the smartphone is responsible for giving the users a friendly interface to set up the system. On the other hand, the electronic health records of the patients are stored in a web application, which in turn allows users to connect to the Internet from their cellphones, and the ECG signals are then sent through a web server for subsequent and ubiquitous analysis by doctors at any convenient terminal device.
In order to determine the quality of the received signals, system testing was performed in the three following scenarios: (1) The prototype was connected to the patient and the signals were subsequently stored; (2) the prototype was connected to the patient and the data were subsequently transferred to the cellphone; (3) the prototype was connected to the patient, and the data were transferred to the cellphone and to the web via the Internet. An additional benchmarking test with expert clinicians showed the clinical quality provided by the system. The proposed ECG system is the first step and paves the way toward mobile cardiac monitors in terms of compatibility with the electrocardiographic practice, including the long-term monitoring, the usability with 12 leads, and the possibility of incorporating signal compression at the early stages of the ECG acquisition.


Subjects
Electrocardiography/instrumentation, Signal Processing, Computer-Assisted, Telemedicine/instrumentation, Calibration, Cell Phone, Electrodes, Humans, Internet, Reproducibility of Results, Smartphone, Software
12.
Sensors (Basel) ; 18(8)2018 Aug 04.
Article in English | MEDLINE | ID: mdl-30081559

ABSTRACT

In recent years, attention has been paid to wireless sensor networks (WSNs) applied to precision agriculture. However, few studies have compared the technologies of different communication standards in terms of topology and energy efficiency. This paper presents the design and implementation of the hardware and software of three WSNs with different technologies and topologies of wireless communication for tomato greenhouses in the Andean region of Ecuador, as well as the comparative study of the performance of each of them. Two companion papers describe the study of the dynamics of the energy consumption and of the monitored variables. Three WSNs were deployed, two of them with the IEEE 802.15.4 standard with star and mesh topologies (ZigBee and DigiMesh, respectively), and a third with the IEEE 802.11 standard with access point topology (WiFi). The measured variables were selected after investigation of the climatic conditions required for efficient tomato growth. The measurements for each variable could be displayed in real time using either a laboratory virtual instrument engineering workbench (LabVIEW™) interface or an Android mobile application. The comparative study of the three networks made evident that the configuration of the DigiMesh network is the most complex for adding new nodes, due to its mesh topology. However, DigiMesh maintains the bit rate and prevents data loss by the location of the nodes as a function of crop height. It has also been shown that the WiFi network has better stability with larger precision in its measurements.

13.
Sensors (Basel) ; 18(8)2018 Aug 04.
Article in English | MEDLINE | ID: mdl-30081565

ABSTRACT

Tomato greenhouses are a crucial element in the Ecuadorian economy. Wireless sensor networks (WSNs) have received much attention in recent years in specialized applications such as precision farming. The energy consumption in WSNs is relevant nowadays for their adequate operation, and attention is being paid to analyzing the affecting factors, energy optimization techniques working on the network hardware or software, and characterizing the consumption in the nodes (especially in the ZigBee standard). However, limited information exists on the analysis of the consumption dynamics in each node, across different network technologies and communication topologies, or on the incidence of data transmission speed. The present study aims to provide a detailed analysis of the dynamics of the energy consumption for tomato greenhouse monitoring in Ecuador, in three types of WSNs, namely, ZigBee with star topology, ZigBee with mesh topology (referred to here as DigiMesh), and WiFi with access point topology. The networks were installed and maintained in operation with a line of sight between nodes and a 2 m separation, whereas the energy consumption measurements of each node were acquired and stored in the laboratory. Each experiment was repeated ten times, and consumption measurements were taken every ten milliseconds, with fifty thousand samples for each realization. The dynamics were scrutinized by analyzing the recorded time series using stochastic-process analysis methods, including amplitude probability functions and temporal autocorrelation, as well as bootstrap resampling techniques and representations of various realizations with the so-called M-mode plots. Our results show that the energy consumption of each network strongly depends on the type of sensors installed in the nodes and on the network topology. Specifically, the CO2 sensor has the highest power consumption because its chemical sensing element requires preheating to start logging measurements.
The ZigBee network is more efficient in energy saving independently of the transmission rate, since the communication modules have lower average consumption in data transmission, in contrast to the DigiMesh network, whose consumption is high due to its topology. Results also show that the average energy consumption in WiFi networks is the highest, given that the coordinator node is a Meshlium™ router with larger energy demand. The transmission duration in the ZigBee network is lower than in the other two networks. In conclusion, the ZigBee network with star topology is the most energy-suitable one when designing wireless monitoring systems in greenhouses. The proposed methodology for consumption dynamics analysis in tomato greenhouse WSNs can be applied to other scenarios where the practical choice of an energy-efficient network is necessary due to energy constrains in the sensor and coordinator nodes.

14.
Sensors (Basel) ; 18(8)2018 Aug 04.
Article in English | MEDLINE | ID: mdl-30081567

ABSTRACT

World population growth currently brings unequal access to food, whereas crop yields are not increasing at a similar rate, so that future food demand could be unmet. Many recent research works address the use of optimization techniques and technological resources on precision agriculture, especially in large demand crops, including climatic variables monitoring using wireless sensor networks (WSNs). However, few studies have focused on analyzing the dynamics of the environmental measurement properties in greenhouses. In the two companion papers, we describe the design and implementation of three WSNs with different technologies and topologies further scrutinizing their comparative performance, and a detailed analysis of their energy consumption dynamics is also presented, both considering tomato greenhouses in the Andean region of Ecuador. The three WSNs use ZigBee with star topology, ZigBee with mesh topology (referred to here as DigiMesh), and WiFi with access point topology. The present study provides a systematic and detailed analysis of the environmental measurement dynamics from multiparametric monitoring in Ecuadorian tomato greenhouses. A set of monitored variables (including CO2, air temperature, and wind direction, among others) are first analyzed in terms of their intrinsic variability and their short-term (circadian) rhythmometric behavior. Then, their cross-information is scrutinized in terms of scatter representations and mutual information analysis. Based on Bland-Altman diagrams, good quality rhythmometric models were obtained at high-rate sampling signals during four days when using moderate regularization and preprocessing filtering with 100-coefficient order. Accordingly, and especially for the adjustment of fast transition variables, it is appropriate to use high sampling rates and then to filter the signal to discriminate against false peaks and noise.
In addition, for variables with similar behavior, a longer period of data acquisition is required for adequate processing, which makes the long-term modeling of the environmental signals more precise.
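Circadian rhythmometric behavior of the kind analyzed above is commonly modeled with a single-component cosinor fit, which reduces to linear least squares on cosine and sine regressors. This is a generic sketch of the cosinor technique, not the study's specific regularized model; the function name and returned parameterization are illustrative.

```python
import numpy as np

def cosinor_fit(t_hours, y, period=24.0):
    """Single-component cosinor fit: y ~ M + A*cos(2*pi*t/period + phi).

    Solved as linear least squares on cos/sin regressors; returns the
    MESOR M (rhythm-adjusted mean), amplitude A, and acrophase phi (rad).
    """
    t = np.asarray(t_hours, float)
    w = 2 * np.pi / period
    X = np.column_stack([np.ones_like(t), np.cos(w * t), np.sin(w * t)])
    m, beta, gamma = np.linalg.lstsq(X, np.asarray(y, float), rcond=None)[0]
    amplitude = np.hypot(beta, gamma)
    acrophase = np.arctan2(-gamma, beta)
    return m, amplitude, acrophase
```

For noisy high-rate signals, the text's recommendation corresponds to low-pass filtering the series before fitting, so that false peaks do not bias the estimated amplitude and acrophase.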

15.
Sensors (Basel) ; 18(5)2018 May 01.
Article in English | MEDLINE | ID: mdl-29723990

ABSTRACT

Despite the wide literature on R-wave detection algorithms for ECG Holter recordings, the long-term monitoring applications are bringing new requirements, and it is not clear that the existing methods can be straightforwardly used in those scenarios. Our aim in this work was twofold: First, we scrutinized the scope and limitations of existing methods for Holter monitoring when moving to long-term monitoring; second, we proposed and benchmarked a beat detection method with adequate accuracy and usefulness in long-term scenarios. A longitudinal study was made with the most widely used waveform analysis algorithms, which allowed us to tune the free parameters of the required blocks, and a transversal study analyzed how these parameters change when moving to different databases. With all the above, the extension to long-term monitoring in a database of 7-day Holter monitoring was proposed and analyzed, by using an optimized simultaneous-multilead processing. We considered both own and public databases. In this new scenario, the noise-avoidance mechanisms are more important due to the amount of noise that exists in these recordings; moreover, computational efficiency is a key parameter in order to export the algorithm to clinical practice. The method based on a Polling function outperformed the others in terms of accuracy and computational efficiency, yielding 99.48% sensitivity, 99.54% specificity, 99.69% positive predictive value, 99.46% accuracy, and 0.85% error for the MIT-BIH arrhythmia database. We conclude that the method can be used in long-term Holter monitoring systems.
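The basic blocks of a beat detector can be illustrated with a deliberately naive single-lead sketch: an adaptive amplitude threshold plus a refractory period. This is not the paper's multilead Polling method; the function name, thresholding rule, and refractory value are assumptions for illustration only.

```python
import numpy as np

def detect_r_peaks(ecg, fs, threshold_frac=0.6, refractory_s=0.25):
    """Naive single-lead R-peak detector (sketch, not the Polling method).

    A sample is accepted as an R peak if it exceeds a fraction of the
    global maximum amplitude, is a local maximum, and lies at least one
    refractory period after the previous accepted peak."""
    x = np.asarray(ecg, float)
    thr = threshold_frac * np.max(np.abs(x))
    refractory = int(refractory_s * fs)     # minimum peak spacing, samples
    peaks, last = [], -refractory
    for i in range(1, len(x) - 1):
        if (x[i] > thr and x[i] >= x[i - 1] and x[i] >= x[i + 1]
                and i - last >= refractory):
            peaks.append(i)
            last = i
    return np.array(peaks)
```

Production detectors add band-pass filtering, adaptive (rather than global) thresholds, and, as in the paper, multilead combination to survive the noise levels of 7-day recordings.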

16.
Sensors (Basel) ; 18(5)2018 May 02.
Article in English | MEDLINE | ID: mdl-29724033

ABSTRACT

Intracardiac electrical activation maps are commonly used as a guide in the ablation of cardiac arrhythmias. Catheters with force sensors have been proposed to determine whether the electrode is in contact with the tissue during the registration of intracardiac electrograms (EGM). Although threshold criteria on force signals are often used to determine catheter contact, such a criterion may be limited due to the complexity of heart dynamics and cardiac vorticity. The present paper is devoted to determining the criteria and force-signal profiles that guarantee contact of the electrode with the tissue. In this study, we analyzed 1391 force signals and their associated EGM, recorded during 2 and 8 s, respectively, in 17 patients (82 ± 60 points per patient). We aimed to establish a contact pattern by first visually examining and classifying the signals according to their joint profile of likely contact, following the suggestions of experts in doubtful cases. First, we used Principal Component Analysis to scrutinize the force-signal dynamics by analyzing the main eigendirections, first globally and then grouped according to the certainty of tissue-catheter contact. Second, we used two different linear classifiers (Fisher discriminant and support vector machines) to identify the most relevant components of the previous signal models. We obtained three main types of eigenvectors, namely, pulsatile relevant, non-pulsatile relevant, and irrelevant components. The classifiers reached a moderate to sufficient discrimination capacity (areas under the curve between 0.84 and 0.95, depending on the contact certainty and on the classifier), which allowed us to analyze the relevant properties of the force signals. We conclude that catheter-tissue contact profiles in force recordings are complex and depend not only on the signal intensity being above a threshold at a single time instant, but also on time pulsatility and trends.
These findings pave the way towards a subsystem that can be included in current intracardiac navigation systems assisted by force-contact sensors, providing the clinician with an estimate of the reliability of tissue-catheter contact in the point-by-point EGM acquisition procedure.
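The PCA-then-linear-classifier pipeline described in this abstract can be sketched as follows on synthetic force-like signals. The "contact"/"no-contact" data, the use of only three leading components, and the Fisher-style discriminant are illustrative assumptions, not the paper's exact configuration:

```python
import numpy as np

rng = np.random.default_rng(1)
# Toy stand-ins for 2-s force recordings (50 samples each): "contact"
# signals carry a baseline offset plus a cardiac-like pulsatile
# component; "no-contact" signals are mostly noise.
t = np.linspace(0, 2, 50)
contact = 10 + 3*np.sin(2*np.pi*1.2*t) + rng.standard_normal((40, 50))
no_contact = rng.standard_normal((40, 50))
X = np.vstack([contact, no_contact])
y = np.array([1]*40 + [0]*40)

# PCA via SVD of the centered data matrix; rows of Vt are eigendirections
Xc = X - X.mean(axis=0)
U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
scores = Xc @ Vt[:3].T               # projections on 3 leading components

# Fisher-style linear discriminant in the reduced space
m1, m0 = scores[y == 1].mean(0), scores[y == 0].mean(0)
Sw = np.cov(scores[y == 1].T) + np.cov(scores[y == 0].T)
w = np.linalg.solve(Sw, m1 - m0)     # discriminant direction
proj = scores @ w
thr = 0.5 * (proj[y == 1].mean() + proj[y == 0].mean())
acc = np.mean((proj > thr).astype(int) == y)
```

Inspecting `Vt[:3]` in such a pipeline is what lets one label components as pulsatile relevant, non-pulsatile relevant, or irrelevant, as the abstract reports.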


Subjects
Arrhythmias, Cardiac/therapy ; Catheter Ablation/methods ; Electrophysiologic Techniques, Cardiac/standards ; Humans ; Reproducibility of Results
17.
Sensors (Basel) ; 17(10)2017 Oct 16.
Article in English | MEDLINE | ID: mdl-29035333

ABSTRACT

Pollution in water resources is usually analyzed with monitoring campaigns, which consist of programmed sampling, measurement, and recording of the most representative water quality parameters. These campaign measurements yield a non-uniformly sampled spatio-temporal data structure for characterizing complex dynamic phenomena. In this work, we propose an enhanced statistical interpolation method to provide water quality managers with statistically interpolated representations of spatio-temporal dynamics. Specifically, our proposal makes efficient use of the a priori available information on the quality parameter measurements through Support Vector Regression (SVR) based on Mercer's kernels. The methods are benchmarked against previously proposed methods in three segments of the Machángara River and one segment of the San Pedro River in Ecuador, and their different dynamics are shown by statistically interpolated spatio-temporal maps. The best interpolation performance in terms of mean absolute error was obtained by SVR with a Mercer kernel given by either the Mahalanobis spatio-temporal covariance matrix or the bivariate estimated autocorrelation function. In particular, the autocorrelation kernel provides a significant improvement in estimation quality, consistently for all six water quality variables, which highlights the relevance of including a priori knowledge of the problem.
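The idea of kernel interpolation over non-uniform (space, time) samples can be illustrated with a Mahalanobis-metric Mercer kernel. For self-containment this sketch substitutes kernel ridge regression for the paper's SVR, and the metric, regularization, and synthetic river data are assumptions:

```python
import numpy as np

def mahalanobis_rbf(A, B, Cinv):
    """Mercer kernel with a Mahalanobis spatio-temporal metric:
    k(x, z) = exp(-0.5 * (x-z)^T Cinv (x-z))."""
    d = A[:, None, :] - B[None, :, :]
    return np.exp(-0.5 * np.einsum("ijk,kl,ijl->ij", d, Cinv, d))

rng = np.random.default_rng(2)
# Toy non-uniform (space, time) samples of a water quality parameter
X = rng.uniform(0, 10, size=(60, 2))   # columns: km along river, days
y = np.sin(X[:, 0]) + 0.5*np.cos(0.8*X[:, 1]) + 0.05*rng.standard_normal(60)

Cinv = np.diag([1.0, 1.0])             # assumed isotropic metric
lam = 1e-3                             # ridge regularization
K = mahalanobis_rbf(X, X, Cinv)
alpha = np.linalg.solve(K + lam*np.eye(len(X)), y)

# Statistically interpolate at a new spatio-temporal point
x_new = np.array([[5.0, 5.0]])
y_hat = mahalanobis_rbf(x_new, X, Cinv) @ alpha
```

Replacing `Cinv` with an estimated spatio-temporal covariance inverse, or replacing the kernel with a bivariate estimated autocorrelation function, is what injects the a priori knowledge the abstract credits for the improvement.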

18.
Sensors (Basel) ; 17(11)2017 Oct 25.
Article in English | MEDLINE | ID: mdl-29068362

ABSTRACT

Noise and artifacts are inherent contaminating components and are particularly present in Holter electrocardiogram (ECG) monitoring. Noise is even more significant in long-term monitoring (LTM) recordings, as these are collected for several days while patients follow their daily activities; hence, strong artifact components can temporarily impair the clinical measurements obtained from LTM recordings. Traditionally, noise has been treated as a problem of removing non-desirable components by means of quantitative signal metrics such as the signal-to-noise ratio (SNR), but current systems do not provide any information about the true impact of noise on the clinical evaluation of the ECG. As a first step towards an alternative to classical approaches, this work assesses ECG quality under the assumption that an ECG has good quality when it is clinically interpretable. Our hypotheses are therefore that it is possible (a) to create a clinical severity score for the effect of noise on the ECG, (b) to characterize its consistency in terms of its temporal and statistical distribution, and (c) to use it for signal quality evaluation in LTM scenarios. For this purpose, a database of external event recorder (EER) signals was assembled and labeled from a clinical point of view, for use as the gold standard of noise severity categorization. These devices are assumed to capture the signal segments most prone to noise corruption over long-term periods. Then, the ECG noise is characterized by comparing these clinical severity criteria with conventional quantitative metrics taken from traditional noise-removal approaches, and noise maps are proposed as a novel representation tool for this comparison. Our results showed that none of the benchmarked quantitative noise measurement criteria represents an accurate enough estimate of the clinical severity of the noise.
A case study of long-term ECG is reported, showing the statistical and temporal correspondences and properties with respect to the EER signals used to create the clinical-noise gold standard. The proposed noise maps, together with the statistical consistency of the characterization of the clinical severity of noise, pave the way towards systems that provide the user with clinical-severity noise maps, allowing different ECG segments to be processed with different techniques and in terms of different measured clinical parameters.
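A per-segment SNR estimate is the kind of conventional quantitative metric the abstract benchmarks against clinical severity. This sketch is a generic illustration (the band split, segment length, and synthetic signal are assumptions, not the paper's metric):

```python
import numpy as np

def segment_snr_db(ecg, fs, seg_s=10.0, lowpass_hz=40.0):
    """Classical per-segment SNR estimate: treat a moving-average
    low-pass band as 'signal' and the residual as 'noise'. The
    abstract's point is that such metrics need not track the
    clinical severity of the noise."""
    win = max(1, int(fs / lowpass_hz))
    smooth = np.convolve(ecg, np.ones(win)/win, mode="same")
    noise = ecg - smooth
    n = int(seg_s * fs)
    snrs = []
    for start in range(0, len(ecg) - n + 1, n):
        ps = np.mean(smooth[start:start+n]**2)
        pn = np.mean(noise[start:start+n]**2) + 1e-12
        snrs.append(10*np.log10(ps/pn))
    return np.array(snrs)

# 30 s synthetic recording: clean first half, heavily corrupted second half
fs = 200
t = np.arange(0, 30, 1/fs)
clean = np.sin(2*np.pi*1.0*t)
rng = np.random.default_rng(3)
noisy = clean + np.concatenate([0.01*rng.standard_normal(3000),
                                1.0*rng.standard_normal(3000)])
snrs = segment_snr_db(noisy, fs)
```

The metric cleanly ranks these segments, yet two segments with identical SNR can differ completely in clinical interpretability, which is the gap the proposed clinical severity score and noise maps address.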


Subjects
Electrocardiography/methods ; Algorithms ; Artifacts ; Electrocardiography/standards ; Electrocardiography, Ambulatory ; Humans ; Signal-To-Noise Ratio
19.
J Biomed Inform ; 61: 87-96, 2016 06.
Article in English | MEDLINE | ID: mdl-26980235

ABSTRACT

OBJECTIVE: In this work, we developed a learning system capable of exploiting the information conveyed by longitudinal Electronic Health Records (EHRs) for the prediction of a common postoperative complication, anastomosis leakage (AL), in a data-driven way, by fusing temporal population data from different and heterogeneous sources in the EHRs. MATERIALS AND METHODS: We used linear and non-linear kernel methods individually for each data source, and leveraged multiple kernel learning for their effective combination. To validate the system, we used data from the EHR of the gastrointestinal department at a university hospital. RESULTS: We first investigated the early prediction performance of each data source separately, obtaining Area Under the Curve values of 0.83 for processed free text, 0.74 for blood tests, and 0.65 for vital signs. When the heterogeneous data sources were combined using the composite kernel framework, the prediction capability increased considerably (0.92). Finally, posterior probabilities were evaluated for patient risk assessment, as an aid for clinicians to raise alertness at an early stage and act promptly to avoid AL complications. DISCUSSION: A machine-learning statistical model built from EHR data can be useful for predicting surgical complications. Combining free text extracted from the EHR, blood sample values, and patient vital signs improves the model performance. These results can be used as a framework for preoperative clinical decision support.
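The composite-kernel fusion of heterogeneous sources can be sketched as a convex combination of per-source Mercer kernels. For self-containment this uses random stand-in features, fixed mixing weights, and a kernel ridge learner in place of the paper's SVM-style learner; the in-sample AUC computation is the standard pairwise-ranking estimate:

```python
import numpy as np

rng = np.random.default_rng(4)
n = 100
# Toy stand-ins for two heterogeneous EHR sources per patient
text_feat = rng.standard_normal((n, 20))   # e.g. free-text representation
labs_feat = rng.standard_normal((n, 5))    # e.g. blood-test values
y = (text_feat[:, 0] + labs_feat[:, 0] > 0).astype(float)

def rbf(X, gamma):
    sq = ((X[:, None, :] - X[None, :, :])**2).sum(-1)
    return np.exp(-gamma * sq)

# Composite kernel: convex combination of per-source Mercer kernels
mu = 0.5                                    # assumed fixed mixing weight
K = mu * rbf(text_feat, 1/20) + (1 - mu) * rbf(labs_feat, 1/5)

# Kernel ridge classifier on the composite kernel
alpha = np.linalg.solve(K + 1e-2*np.eye(n), y - y.mean())
scores = K @ alpha + y.mean()

# In-sample AUC: fraction of (positive, negative) pairs ranked correctly
wins = sum(si > sj for si in scores[y == 1] for sj in scores[y == 0])
auc = wins / (np.sum(y == 1) * np.sum(y == 0))
```

In a full multiple-kernel-learning setup the weight `mu` is itself learned from data, which is what allows the combined model to outperform each single source, as the abstract's 0.92 versus 0.83/0.74/0.65 AUCs show.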


Subjects
Digestive System Surgical Procedures ; Electronic Health Records ; Postoperative Complications ; Anastomotic Leak ; Colon/surgery ; Humans ; Models, Statistical ; Rectum/surgery ; Risk Assessment ; Support Vector Machine