Results 1 - 20 of 27
1.
IEEE J Biomed Health Inform ; 28(4): 1993-2004, 2024 Apr.
Article in English | MEDLINE | ID: mdl-38241105

ABSTRACT

Electrocardiogram (ECG) signals frequently encounter diverse types of noise, such as baseline wander (BW), electrode motion (EM) artifacts, muscle artifact (MA), and others. These noises often occur in combination during actual data acquisition, leading to erroneous or confusing interpretations for cardiologists. To suppress random mixed noise (RMN) in ECG with less distortion, we propose a Transformer-based Convolutional Denoising AutoEncoder model (TCDAE) in this study. The encoder of TCDAE is composed of three stacked gated convolutional layers and a Transformer encoder block with a point-wise multi-head self-attention module. To obtain minimal distortion in both the time and frequency domains, we also propose a frequency-weighted Huber loss function for the training phase to better approximate the original signals. The TCDAE model is trained and tested on the QT Database (QTDB) and the MIT-BIH Noise Stress Test Database (NSTDB), with the training and testing data coming from different records. Compared with the baseline methods, TCDAE achieves the most robust results on all metrics for RMN removal, both over the overall noise intervals and within the separate noise-type intervals. We also conduct generalization tests on the Icentia11k database, where TCDAE outperforms the state-of-the-art models, with a 55% reduction in false positives in R peak detection after denoising. The TCDAE model approximates the short-term and long-term characteristics of ECG signals and has higher stability even under extreme RMN corruption. The memory consumption and inference speed of TCDAE also make it feasible for deployment in clinical applications.
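As a rough illustration of the loss design described above, the sketch below combines a time-domain Huber term with a Huber penalty on weighted magnitude spectra. It assumes PyTorch; the freq_weights profile and the equal weighting of the two terms are placeholders, since the paper's exact weighting scheme is not given in the abstract.

```python
# A minimal sketch of a frequency-weighted Huber loss, assuming PyTorch.
import torch

def freq_weighted_huber_loss(denoised, clean, freq_weights=None, delta=1.0):
    """Time-domain Huber loss plus a Huber penalty on weighted magnitude spectra.
    `freq_weights` (hypothetical) emphasizes selected frequency bins."""
    # Time-domain Huber term
    time_loss = torch.nn.functional.huber_loss(denoised, clean, delta=delta)

    # Frequency-domain term on magnitude spectra (rFFT along the last axis)
    d_mag = torch.fft.rfft(denoised, dim=-1).abs()
    c_mag = torch.fft.rfft(clean, dim=-1).abs()
    if freq_weights is None:
        freq_weights = torch.ones_like(c_mag)
    diff = (d_mag - c_mag) * freq_weights
    quad = 0.5 * diff ** 2
    lin = delta * (diff.abs() - 0.5 * delta)
    freq_loss = torch.where(diff.abs() <= delta, quad, lin).mean()

    return time_loss + freq_loss

# Toy usage: a batch of 4 denoised/clean 10 s segments sampled at 250 Hz
denoised = torch.randn(4, 2500, requires_grad=True)
clean = torch.randn(4, 2500)
loss = freq_weighted_huber_loss(denoised, clean)
loss.backward()
```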


Subject(s)
Algorithms , Signal Processing, Computer-Assisted , Humans , Electrocardiography/methods , Exercise Test , Artifacts , Signal-To-Noise Ratio
2.
Comput Methods Programs Biomed ; 238: 107565, 2023 Aug.
Article in English | MEDLINE | ID: mdl-37210927

ABSTRACT

BACKGROUND AND OBJECTIVE: Automatic recognition of wearable dynamic electrocardiographic (ECG) signals is a difficult problem in biomedical signal processing. With the widespread use of long-term ambulatory ECG, a large number of real-time ECG signals are generated in the clinic, and it is very difficult for clinicians to perform timely atrial fibrillation (AF) diagnosis. Therefore, developing a new AF diagnosis algorithm can relieve the pressure on the healthcare system and improve the efficiency of AF screening. METHODS: In this study, a self-complementary attentional convolutional neural network (SCCNN) was designed to accurately identify AF in wearable dynamic ECG signals. First, a 1D ECG signal was converted into a 2D ECG matrix using the proposed Z-shaped signal reconstruction method. Then, a 2D convolutional network was used to extract shallow features from both neighboring sampling points (short range) and interval sampling points (long range) in the ECG signal. A self-complementary attention mechanism (SCNet) was used to focus on and fuse channel information with spatial information. Finally, the fused feature sequences were used to detect AF. RESULTS: The accuracies of the proposed method on three public databases were 99.79%, 95.51%, and 98.80%. The AUC values were 99.79%, 95.51%, and 98.77%, respectively. The sensitivity on the clinical database was as high as 99.62%. CONCLUSIONS: These results show that the proposed method can accurately identify AF and generalizes well.
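The reshaping step can be pictured with the toy sketch below, which fills a 2-D matrix row by row and reverses every other row. This zig-zag fill is only one plausible reading of the "Z-shaped" reconstruction named in the abstract; the authors' exact layout may differ.

```python
# A minimal sketch of turning a 1-D ECG segment into a 2-D matrix; the zig-zag
# row fill is an assumption, not the paper's verified construction.
import numpy as np

def z_shaped_matrix(ecg_1d, n_rows):
    """Reshape a 1-D ECG segment into an (n_rows, n_cols) matrix so that both
    neighboring samples (within a row) and distant samples (across rows) end up
    spatially adjacent for a 2-D convolution."""
    ecg_1d = np.asarray(ecg_1d, dtype=float)
    n_cols = ecg_1d.size // n_rows
    mat = ecg_1d[: n_rows * n_cols].reshape(n_rows, n_cols).copy()
    mat[1::2] = mat[1::2, ::-1].copy()   # reverse every other row (zig-zag path)
    return mat

segment = np.sin(np.linspace(0, 20 * np.pi, 2500))  # stand-in for a 10 s, 250 Hz ECG
image = z_shaped_matrix(segment, n_rows=50)          # 50 x 50 input for a 2-D CNN
```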


Subject(s)
Atrial Fibrillation , Humans , Atrial Fibrillation/diagnosis , Electrocardiography/methods , Neural Networks, Computer , Algorithms , Electrocardiography, Ambulatory
3.
Comput Biol Med ; 150: 106081, 2022 11.
Article in English | MEDLINE | ID: mdl-36130422

ABSTRACT

Accurate segmentation of electrocardiogram (ECG) waves is crucial for diagnosing cardiovascular diseases (CVDs). In this study, a bidirectional hidden semi-Markov model (BI-HSMM) based on the probability distributions of ECG waveform durations was proposed for ECG wave segmentation. Four feature vectors of the ECG signal were extracted as the observation sequence of the hidden Markov model (HMM), and the statistical probability distribution of each waveform duration was estimated. Logistic regression (LR) was used to train the model parameters. The starting and ending positions of the QRS complex were detected first, and bidirectional prediction was then employed for the other waves: in the forward direction, the ST segment, T wave, and TP segment were predicted; in the backward direction, the P wave and PQ segment were detected. The Viterbi algorithm was improved by integrating the recursive formulas of forward prediction and backward backtracking. On the QT database, the proposed method demonstrated excellent performance (Acc = 97.98%, F1 score of P wave = 98.37%, F1 score of QRS wave = 97.60%, F1 score of T wave = 97.79%). For the wearable dynamic electrocardiography (DCG) signals collected by Shandong Provincial Hospital (SPH), the detection accuracy was 99.71% and the F1 score of each waveform was above 99%. The experimental results and validation on real DCG signals confirmed that the proposed BI-HSMM method can reliably segment both resting ECG and DCG signals, which is conducive to the detection and monitoring of CVDs.
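For background on the decoding step, the sketch below implements plain Viterbi decoding for a discrete HMM in NumPy. The paper's BI-HSMM additionally models waveform-duration distributions and decodes bidirectionally from the QRS anchors, which is not reproduced here.

```python
# A minimal sketch of standard Viterbi decoding for a discrete HMM (background only).
import numpy as np

def viterbi(obs, log_pi, log_A, log_B):
    """obs: observation indices (T,); log_pi: (S,) initial log-probs;
    log_A: (S, S) transition log-probs; log_B: (S, O) emission log-probs."""
    T, S = len(obs), len(log_pi)
    delta = np.zeros((T, S))
    psi = np.zeros((T, S), dtype=int)
    delta[0] = log_pi + log_B[:, obs[0]]
    for t in range(1, T):
        scores = delta[t - 1][:, None] + log_A      # rows: previous state, cols: next
        psi[t] = scores.argmax(axis=0)
        delta[t] = scores.max(axis=0) + log_B[:, obs[t]]
    # Backtrack the most likely state path
    path = np.zeros(T, dtype=int)
    path[-1] = delta[-1].argmax()
    for t in range(T - 2, -1, -1):
        path[t] = psi[t + 1, path[t + 1]]
    return path

# Toy usage with 2 states and 3 observation symbols
log_pi = np.log([0.6, 0.4])
log_A = np.log([[0.7, 0.3], [0.4, 0.6]])
log_B = np.log([[0.5, 0.4, 0.1], [0.1, 0.3, 0.6]])
print(viterbi(np.array([0, 1, 2, 2]), log_pi, log_A, log_B))
```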


Subject(s)
Cardiovascular Diseases , Signal Processing, Computer-Assisted , Humans , Algorithms , Electrocardiography/methods , Electrocardiography, Ambulatory , Arrhythmias, Cardiac
4.
Front Physiol ; 13: 905447, 2022.
Article in English | MEDLINE | ID: mdl-35845989

ABSTRACT

With the rapid development of wearable devices and Internet of Things technologies, real-time monitoring of ECG signals is critical for cardiovascular disease (CVD) management. However, dynamic ECG signals recorded in free-living conditions suffer from severe noise contamination. Most existing algorithms for ECG signal quality assessment (SQA) were designed to divide signals into acceptable and unacceptable classes, which is not sufficient for real-time cardiovascular disease monitoring. In this study, a wearable ECG quality database with 50,085 recordings was built, covering three quality grades, A/B/C (A: high-quality signals that can be used for CVD detection; B: slightly contaminated signals that can be used for heart rate extraction; C: heavily polluted signals that need to be discarded). A new SQA classification method based on a three-layer wavelet scattering network and a transfer learning LSTM was proposed, which can extract more systematic and comprehensive characteristics by analyzing the signals thoroughly and deeply. Experimental results (mACC = 98.56%, mF1 = 98.55%, SeA = 97.90%, SeB = 98.16%, SeC = 99.60%, +PA = 98.52%, +PB = 97.60%, +PC = 99.54%, F1A = 98.20%, F1B = 97.90%, F1C = 99.60%) and real-data validation showed that the proposed method achieves high accuracy, robustness, and computational efficiency, and can evaluate the quality of long-term dynamic ECG signals. It can therefore promote cardiovascular disease monitoring by removing contaminated segments and selecting high-quality signal segments for further analysis.
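The classification stage can be pictured with the sketch below: an LSTM head mapping a sequence of precomputed wavelet scattering coefficients to one of the three quality grades. The scattering front end, the transfer learning step, and all dimensions here are illustrative assumptions, not the paper's configuration.

```python
# A minimal sketch of the classification stage only, assuming PyTorch and
# precomputed scattering features of illustrative size.
import torch
import torch.nn as nn

class QualityLSTM(nn.Module):
    def __init__(self, n_features=64, hidden=128, n_classes=3):
        super().__init__()
        self.lstm = nn.LSTM(n_features, hidden, batch_first=True)
        self.head = nn.Linear(hidden, n_classes)

    def forward(self, x):                 # x: (batch, time, n_features)
        _, (h_n, _) = self.lstm(x)        # use the final hidden state
        return self.head(h_n[-1])         # (batch, n_classes) logits for A/B/C

# Stand-in batch: 8 ECG segments, 32 scattering time frames, 64 coefficients each
features = torch.randn(8, 32, 64)
logits = QualityLSTM()(features)
grades = logits.argmax(dim=1)             # 0 = A, 1 = B, 2 = C
```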

5.
Comput Methods Programs Biomed ; 216: 106651, 2022 Apr.
Article in English | MEDLINE | ID: mdl-35104686

ABSTRACT

BACKGROUND AND OBJECTIVE: Craniopharyngioma (CP) is a histologically benign brain tumor. However, it can be clinically aggressive and have severe manifestations, such as increased intracranial pressure, hypothalamic-pituitary dysfunction, and visual impairment. Predicting the invasiveness of craniopharyngioma from MRI images is considered challenging for radiologists. Therefore, developing a non-invasive method that can predict the invasiveness and boundary of CP before surgery is of clinical value for making more appropriate and individualized treatment decisions and for reducing inappropriate surgical plans. METHODS: The MT-Brain system consists of two pathways: a 2D CNN sub-network that captures features from each MRI slice, and a 3D sub-network that captures additional contextual information between slices. With this two-path architecture, the system can make full use of the fusion of 2D and 3D features for classification. Furthermore, position encoding and mask-guided attention were introduced to improve segmentation and diagnosis performance. To verify the performance of the MT-Brain system, we enrolled 1032 patients with craniopharyngioma (302 with invasion and 730 without), segmented the tumors on postcontrast coronal T1WI, and randomly split them into a training set and a testing set at a ratio of 8:2. RESULTS: The MT-Brain system achieved remarkable performance in diagnosing the invasiveness of craniopharyngioma, with an AUC of 83.84%, an accuracy of 77.94%, a sensitivity of 70.97%, and a specificity of 80.99%. In the lesion segmentation task, the predicted lesion boundaries were similar to those labeled by radiologists, with a Dice coefficient of 66.36%. In addition, the interpretability of the deep learning model was explored, illustrating its reliability. CONCLUSIONS: To the best of our knowledge, this study is the first to develop an integrated deep learning model to predict the invasiveness of craniopharyngioma preoperatively and to locate the lesion boundary synchronously on MRI. The excellent performance indicates that the MT-Brain system has great potential in real-world clinical applications.


Subject(s)
Craniopharyngioma , Deep Learning , Pituitary Neoplasms , Craniopharyngioma/diagnostic imaging , Humans , Image Processing, Computer-Assisted , Magnetic Resonance Imaging , Pituitary Neoplasms/diagnostic imaging , Reproducibility of Results
6.
Entropy (Basel) ; 23(9)2021 Sep 11.
Article in English | MEDLINE | ID: mdl-34573824

ABSTRACT

The entropy algorithm is an important nonlinear method for cardiovascular disease detection because of its power in analyzing short-term time series. In a previous study, we proposed a new entropy-based atrial fibrillation (AF) detector, EntropyAF, which showed high classification accuracy in identifying AF and non-AF rhythms. As a variant of entropy measures, EntropyAF has two parameters that need to be initialized before calculation: (1) the tolerance threshold r and (2) the similarity weight n. In this study, a comprehensive analysis of how to determine these two parameters was presented, aiming to achieve high detection accuracy for AF events. Data were from the MIT-BIH AF database. RR interval recordings were segmented using a 30-beat time window. The parameters r and n were initialized at relatively small values and then gradually increased, and the best parameter combination was finally determined using a grid search. Area under the receiver operating characteristic curve (AUC) values were compared under different combinations of r and n, and the results demonstrated that the selection of these two parameters plays an important role in AF/non-AF classification; small values of r and n lead to better detection accuracy than other selections. The best AUC value for AF detection was 98.15%, and the corresponding parameter combinations for EntropyAF were as follows: r = 0.01 and n = 0.0625, 0.125, 0.25, or 0.5; r = 0.05 and n = 0.0625, 0.125, or 0.25; and r = 0.10 and n = 0.0625 or 0.125.
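A structure-only sketch of the parameter search is given below. The fuzzy_entropy helper is a simplified fuzzy-entropy stand-in for EntropyAF (the full measure adds density estimation and a heart-rate adjustment), so the code only illustrates how the (r, n) grid is scanned and scored by AUC.

```python
# A minimal sketch of the (r, n) grid search; fuzzy_entropy() is an illustrative
# stand-in, not the paper's EntropyAF measure.
import itertools
import numpy as np
from sklearn.metrics import roc_auc_score

def fuzzy_entropy(x, m=1, r=0.1, n=0.25):
    """Fuzzy entropy of a normalized RR segment with membership exp(-(d**n)/r),
    where d is the Chebyshev distance between template vectors."""
    x = (np.asarray(x, float) - np.mean(x)) / (np.std(x) + 1e-12)
    def phi(mm):
        templates = np.array([x[i:i + mm] for i in range(len(x) - mm)])
        d = np.max(np.abs(templates[:, None, :] - templates[None, :, :]), axis=2)
        sim = np.exp(-np.power(d, n) / r)
        np.fill_diagonal(sim, 0.0)                  # exclude self-matches
        return sim.sum() / (len(templates) * (len(templates) - 1)) + 1e-12
    return -np.log(phi(m + 1) / phi(m))

def grid_search(segments, labels,
                r_grid=(0.01, 0.05, 0.10, 0.15, 0.20),
                n_grid=(0.0625, 0.125, 0.25, 0.5, 1, 2)):
    """Return the (r, n) pair that maximizes AUC for AF vs. non-AF segments."""
    best_params, best_auc = None, -np.inf
    for r, n in itertools.product(r_grid, n_grid):
        scores = [fuzzy_entropy(seg, r=r, n=n) for seg in segments]
        auc = roc_auc_score(labels, scores)
        if auc > best_auc:
            best_params, best_auc = (r, n), auc
    return best_params, best_auc

# Toy usage: random stand-ins for 30-beat RR segments, non-AF (0) vs. AF (1)
rng = np.random.default_rng(0)
segments, labels = [], []
for lab in (0, 1) * 10:
    base = 0.8 + 0.02 * np.sin(np.arange(30))            # regular rhythm
    segments.append(base + rng.normal(0, 0.002 + 0.05 * lab, 30))
    labels.append(lab)
print(grid_search(segments, labels))
```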

7.
Comput Biol Med ; 137: 104814, 2021 10.
Article in English | MEDLINE | ID: mdl-34481179

ABSTRACT

Automatic classification of heart sounds plays an important role in the diagnosis of cardiovascular diseases. In this study, a heart sound classification method based on quality assessment and the wavelet scattering transform was proposed. First, the ratio of zero crossings (RZC) and the root mean square of successive differences (RMSSD) were used to assess the quality of the heart sound signal. For each continuous heart sound recording, the first segment meeting the threshold criteria was selected as the analysis sample. Using the wavelet scattering transform, the wavelet scattering coefficients were expanded along the wavelet scale dimension to obtain the features. A support vector machine (SVM) was used for classification, and the sample-level classification results were obtained by voting across the wavelet scale dimension. The effects of RZC and RMSSD on the results are discussed in detail. On the PhysioNet Computing in Cardiology Challenge 2016 (CinC 2016) database, the proposed method yields 92.23% accuracy (Acc), 96.62% sensitivity (Se), 90.65% specificity (Sp), and 93.64% mean accuracy (MAcc). The results show that the proposed method can effectively classify normal and abnormal heart sound samples with high accuracy.
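The two quality indices named above are simple to compute; the sketch below shows one way to do so on a zero-mean heart sound segment. The sampling rate and any threshold values are assumptions, since the paper's thresholds are not stated in the abstract.

```python
# A minimal sketch of the RZC and RMSSD quality indices on a heart sound segment.
import numpy as np

def rzc(x):
    """Fraction of successive sample pairs whose sign changes (zero-crossing ratio)."""
    x = np.asarray(x, float) - np.mean(x)
    return np.mean(np.signbit(x[:-1]) != np.signbit(x[1:]))

def rmssd(x):
    """Root mean square of successive differences of the segment."""
    x = np.asarray(x, float)
    return np.sqrt(np.mean(np.diff(x) ** 2))

fs = 2000                                    # assumed sampling rate, Hz
t = np.arange(0, 3, 1 / fs)
segment = np.sin(2 * np.pi * 30 * t)         # stand-in for a 3 s heart sound segment
print(rzc(segment), rmssd(segment))
```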


Subject(s)
Heart Sounds , Algorithms , Databases, Factual , Signal Processing, Computer-Assisted , Support Vector Machine , Wavelet Analysis
8.
IEEE J Biomed Health Inform ; 25(11): 4175-4184, 2021 11.
Article in English | MEDLINE | ID: mdl-34077377

ABSTRACT

The classification of heartbeats is an important method for cardiac arrhythmia analysis. This study proposes a novel heartbeat classification method using hybrid time-frequency analysis and transfer learning based on ResNet-101. The proposed method has the following major advantages over existing methods: it avoids the manual feature extraction required by traditional machine learning methods; it utilizes 2-D time-frequency diagrams, which provide not only frequency and energy information but also preserve the morphological characteristics of the ECG recordings; and the network is deep enough to exploit the representational power of CNNs. The method deploys a hybrid time-frequency analysis based on the Hilbert transform (HT) and the Wigner-Ville distribution (WVD) to transform 1-D ECG recordings into 2-D time-frequency diagrams, which are then fed into a transfer learning classifier based on ResNet-101 for two classification tasks: the 5 heartbeat categories defined by the ANSI/AAMI standard (N, V, S, Q and F) and the 14 original beat types of the MIT-BIH arrhythmia database. For the 5-category task, the F1-scores of the N, V, S, Q and F categories are 0.9899, 0.9845, 0.9376, 0.9968 and 0.8889, respectively, and the overall F1-score is 0.9595 when using combined data balancing. For the 14 beat types of the MIT-BIH arrhythmia database, the average accuracy, sensitivity, specificity, predictive value and F1-score on the test set are 99.75%, 91.36%, 99.85%, 90.81% and 0.9016, respectively. Compared with other methods, the proposed method yields more accurate results.
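For context on the time-frequency front end, the sketch below computes a discrete Wigner-Ville distribution of the analytic signal obtained with the Hilbert transform. Details such as windowing, smoothing, and image sizing for ResNet-101 are omitted or assumed here.

```python
# A minimal sketch of a discrete Wigner-Ville distribution on the analytic signal;
# smoothing and resizing choices for the CNN input are not reproduced.
import numpy as np
from scipy.signal import hilbert

def wigner_ville(x):
    """Return an (N, N) time-frequency matrix: rows = frequency bins, cols = time.
    With this lag convention, FFT bin b corresponds to frequency b * fs / (2 * N)."""
    z = hilbert(np.asarray(x, float))          # analytic signal suppresses cross-terms
    N = len(z)
    W = np.zeros((N, N), dtype=complex)
    for n in range(N):
        tau_max = min(n, N - 1 - n)
        tau = np.arange(-tau_max, tau_max + 1)
        kernel = np.zeros(N, dtype=complex)
        kernel[tau % N] = z[n + tau] * np.conj(z[n - tau])
        W[:, n] = np.fft.fft(kernel)
    return 2 * np.real(W)

fs = 360                                       # MIT-BIH sampling rate
t = np.arange(0, 0.7, 1 / fs)
beat = np.exp(-((t - 0.35) ** 2) / 0.002)      # stand-in for a single heartbeat
tfd = wigner_ville(beat)
```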


Subject(s)
Arrhythmias, Cardiac , Electrocardiography , Algorithms , Arrhythmias, Cardiac/diagnosis , Databases, Factual , Heart Rate , Humans , Machine Learning , Signal Processing, Computer-Assisted
9.
Med Biol Eng Comput ; 58(9): 2039-2047, 2020 Sep.
Article in English | MEDLINE | ID: mdl-32638275

ABSTRACT

We propose a novel method that combines the modified frequency slice wavelet transform (MFSWT) and a convolutional neural network (CNN) for classifying normal and abnormal heart sounds. A hidden Markov model is used to find the position of each cardiac cycle in the heart sound signal and to determine the exact positions of the four periods: S1, S2, systole, and diastole. The one-dimensional cardiac cycle signal is then converted into a two-dimensional time-frequency picture using the MFSWT. Finally, two CNN models are trained on these pictures, and sample entropy (SampEn) is used to decide which of the two models classifies a given heart sound signal. We evaluated our method on the public heart sound dataset provided by the PhysioNet Computing in Cardiology Challenge 2016. Classification performance from 10-fold cross-validation showed that sensitivity (Se), specificity (Sp) and mean accuracy (MAcc) were 0.95, 0.93, and 0.94, respectively. The results show that the proposed method can classify normal and abnormal heart sounds efficiently and with high accuracy. Graphical abstract: block diagram of the heart sound classification method.
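The model selection statistic, sample entropy, can be sketched as follows; the embedding dimension m = 2 and tolerance r = 0.2 SD are common defaults and assumptions here, not values taken from the paper.

```python
# A minimal sketch of sample entropy (SampEn) with conventional default parameters.
import numpy as np

def sample_entropy(x, m=2, r=0.2):
    """SampEn(m, r) with tolerance r given as a fraction of the signal's SD."""
    x = np.asarray(x, float)
    tol = r * np.std(x)
    def count_matches(mm):
        templates = np.array([x[i:i + mm] for i in range(len(x) - mm)])
        d = np.max(np.abs(templates[:, None, :] - templates[None, :, :]), axis=2)
        # Count template pairs within tolerance, excluding self-matches
        return (np.sum(d <= tol) - len(templates)) / 2
    B, A = count_matches(m), count_matches(m + 1)
    return -np.log(A / B) if A > 0 and B > 0 else np.inf

cycle = np.random.default_rng(0).normal(size=500)   # stand-in for one cardiac cycle
print(sample_entropy(cycle))
```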


Subject(s)
Heart Sounds/physiology , Models, Cardiovascular , Neural Networks, Computer , Wavelet Analysis , Algorithms , Biomedical Engineering , Cardiovascular Diseases/diagnosis , Cardiovascular Diseases/physiopathology , Diagnosis, Computer-Assisted/methods , Diagnosis, Computer-Assisted/statistics & numerical data , Humans , Markov Chains , Phonocardiography/statistics & numerical data , Signal Processing, Computer-Assisted
10.
Comput Math Methods Med ; 2019: 3130527, 2019.
Article in English | MEDLINE | ID: mdl-31065291

ABSTRACT

BACKGROUND: The T wave in the ECG reflects ventricular repolarization. During myocardial ischemia, the first significant change appears in the ST segment, followed by changes in other waves such as the P wave and QRS complex, so reliable T wave detection is required. To offer guidance in clinical diagnosis, decision-making, and daily mobile ECG monitoring, the T wave must first be detected. Recently, the sliding-area-based method has received increasing attention because of its robustness and low computational burden. However, the parameter setting of the search window boundaries in this method is not adaptive. Therefore, in this study, we propose an improved sliding window area method with a more adaptive parameter setting for T wave detection. METHODS: First, k-means clustering was applied to the annotated MIT QT database to generate three piecewise functions describing the relationship between the RR interval and the interval from the R peak to the T wave onset, and between the RR interval and the interval from the R peak to the T wave offset. Then, a grid search combined with 5-fold cross-validation was used to select a suitable parameter combination for the sliding window area method. RESULTS: For onset detection in the QT database, F1 improved from 54.70% to 70.46% and from 54.05% to 72.94% for the first and second ECG channels, respectively. For offset detection, F1 also improved in both channels, as it did in the European ST-T database. CONCLUSIONS: F1 results from the improved algorithm were higher than those from the traditional method, indicating a potentially useful application of the proposed method in ECG monitoring.


Subject(s)
Algorithms , Diagnosis, Computer-Assisted/statistics & numerical data , Electrocardiography/statistics & numerical data , Computational Biology , Databases, Factual , Humans , Mathematical Computing , Myocardial Ischemia/diagnosis , Signal Processing, Computer-Assisted
11.
Physiol Meas ; 39(11): 115005, 2018 11 26.
Article in English | MEDLINE | ID: mdl-30475743

ABSTRACT

OBJECTIVE: Sleep quality reflects physical and mental condition, and efficient sleep stage scoring promises considerable advantages for health care. The aim of this study is to propose a simple and efficient sleep classification method based on entropy features and a support vector machine classifier, named SC-En&SVM. APPROACH: Entropy features, including fuzzy measure entropy (FuzzyMEn), fuzzy entropy, and sample entropy, are applied for the analysis and classification of sleep stages. FuzzyMEn has been used for heart rate variability analysis since it was proposed, but this is the first time it has been used for sleep scoring. The three features are extracted from 63,767 30 s epochs of Fpz-Cz electroencephalogram (EEG), Pz-Oz EEG and horizontal electrooculogram (EOG) signals in the sleep-EDF database. The independent-samples t-test shows that the entropy values differ significantly among the six sleep stages. A multi-class support vector machine (SVM) with a one-against-all scheme is utilized for this application for the first time. We perform 10-fold cross-validation as well as leave-one-subject-out cross-validation for 61 subjects to test the effectiveness and reliability of SC-En&SVM. MAIN RESULTS: The 10-fold cross-validation shows effective performance with high stability. The average accuracies and standard deviations for 2-6 states are 97.02 ± 0.58%, 92.74 ± 1.32%, 89.08 ± 0.90%, 86.02 ± 1.06% and 83.94 ± 1.61%, respectively. For a more practical evaluation, the subject-independent scheme was further performed, and our method achieved similar or slightly better average accuracies for 2-6 states of 94.15%, 85.06%, 80.96%, 78.68% and 75.98% compared with state-of-the-art methods. The corresponding kappa coefficients (0.81, 0.74, 0.72, 0.71, 0.67) indicate substantial agreement of the classification. SIGNIFICANCE: We propose a novel sleep stage scoring method, SC-En&SVM, with easily accessible features and a simple classification algorithm, without reducing classification performance compared with other approaches.
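The feature-plus-classifier pipeline can be sketched with scikit-learn as below. The feature matrix here is a random stand-in for the per-epoch entropy values (FuzzyMEn, fuzzy entropy and SampEn per channel), and the RBF kernel and C value are assumptions.

```python
# A minimal sketch of a one-against-all multi-class SVM over per-epoch entropy
# features; the features themselves are random stand-ins.
import numpy as np
from sklearn.multiclass import OneVsRestClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 9))        # one row per 30 s epoch: 3 entropies x 3 channels
y = rng.integers(0, 5, size=100)     # stand-in 5-state labels: W, N1, N2, N3, REM

clf = make_pipeline(StandardScaler(),
                    OneVsRestClassifier(SVC(kernel="rbf", C=1.0)))
clf.fit(X, y)
print(clf.predict(X[:5]))
```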


Subject(s)
Entropy , Signal Processing, Computer-Assisted , Sleep/physiology , Support Vector Machine , Adult , Aged , Aged, 80 and over , Electroencephalography , Electrooculography , Female , Healthy Volunteers , Heart Rate , Humans , Male , Middle Aged
12.
J Healthc Eng ; 2018: 2102918, 2018.
Article in English | MEDLINE | ID: mdl-30057730

ABSTRACT

Atrial fibrillation (AF) is a serious cardiovascular disease characterized by an irregular heartbeat and is a major cause of a variety of heart diseases, such as myocardial infarction. Automatic AF beat detection is still a challenging task that needs further exploration. A new framework combining the modified frequency slice wavelet transform (MFSWT) and convolutional neural networks (CNNs) was proposed for automatic AF beat identification. MFSWT was used to transform 1 s electrocardiogram (ECG) segments into time-frequency images, and the images were then fed into a 12-layer CNN for feature extraction and AF/non-AF beat classification. On the MIT-BIH Atrial Fibrillation Database, a mean accuracy (Acc) of 81.07% from 5-fold cross-validation was achieved for the test data. The corresponding sensitivity (Se), specificity (Sp) and area under the ROC curve (AUC) were 74.96%, 86.41%, and 0.88, respectively. When an ECG recording of extremely poor signal quality was excluded from the test data, a mean Acc of 84.85% was achieved, with corresponding Se, Sp and AUC values of 79.05%, 89.99%, and 0.92. This study indicates that it is possible to accurately identify AF or non-AF ECGs from a short-term signal episode.


Subject(s)
Atrial Fibrillation/diagnosis , Electrocardiography/methods , Neural Networks, Computer , Wavelet Analysis , Algorithms , Humans
13.
IEEE Trans Med Imaging ; 37(10): 2176-2184, 2018 10.
Article in English | MEDLINE | ID: mdl-29993826

ABSTRACT

Fluorescence molecular tomography (FMT), a promising imaging modality in preclinical research, can obtain three-dimensional (3-D) position information of stem cells in mice. However, because of the ill-posed nature of the inverse problem and its sensitivity to noise, it is challenging to develop a robust reconstruction method that can accurately locate the stem cells and delineate their distribution. In this paper, we propose a sparsity adaptive correntropy matching pursuit (SACMP) method. The SACMP method does not depend on the noise distribution of the measurements; it adaptively assigns small weights to severely corrupted data entries and large weights to clean ones. These properties make it particularly suitable for in vivo experiments. To analyze the robustness and practicability of SACMP, we conducted numerical simulations and in vivo mouse experiments. The results demonstrated that SACMP achieved the highest robustness and accuracy in locating stem cells and depicting their distribution compared with the stagewise orthogonal matching pursuit and sparsity adaptive subspace pursuit reconstruction methods. To the best of our knowledge, this is the first study to achieve such accurate and robust FMT reconstruction for stem cell tracking in the mouse brain. This promotes the application of FMT to stem cell localization and distribution reconstruction in practical mouse brain injury models.
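As background for the reconstruction family used here, the sketch below implements plain orthogonal matching pursuit (OMP) for a sparse linear inverse problem. SACMP's correntropy-based adaptive weighting and sparsity adaptation are not reproduced.

```python
# A minimal sketch of plain orthogonal matching pursuit for y = A x (background only).
import numpy as np

def omp(A, y, k, tol=1e-8):
    """Recover a k-sparse x from y = A x by greedy atom selection."""
    m, n = A.shape
    residual, support = y.copy(), []
    x = np.zeros(n)
    for _ in range(k):
        # Pick the column most correlated with the current residual
        idx = int(np.argmax(np.abs(A.T @ residual)))
        if idx not in support:
            support.append(idx)
        # Least-squares fit on the selected support, then update the residual
        coef, *_ = np.linalg.lstsq(A[:, support], y, rcond=None)
        residual = y - A[:, support] @ coef
        if np.linalg.norm(residual) < tol:
            break
    x[support] = coef
    return x

rng = np.random.default_rng(0)
A = rng.normal(size=(60, 200))                 # stand-in system (sensitivity) matrix
x_true = np.zeros(200)
x_true[[5, 50, 120]] = [1.0, -0.5, 2.0]
x_hat = omp(A, A @ x_true, k=3)
```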


Subject(s)
Brain/diagnostic imaging , Cell Tracking/methods , Imaging, Three-Dimensional/methods , Optical Imaging/methods , Stem Cell Transplantation/methods , Algorithms , Animals , Brain/cytology , Brain/surgery , Entropy , Female , Mice , Mice, Nude , Stem Cells/cytology , Tomography/methods
14.
Contrast Media Mol Imaging ; 2018: 9321862, 2018.
Article in English | MEDLINE | ID: mdl-29853812

ABSTRACT

Complete eradication of tumor cells is a crucial measure of antitumor efficacy, and difficulty in defining tumor margins and finding satellite metastases is a key reason for tumor recurrence. A synergistic method based on multimodality molecular imaging therefore needs to be developed to achieve complete elimination of tumor cells. In this study, graphene oxide conjugated with gold nanostars and chelated with Gd through 1,4,7,10-tetraazacyclododecane-N,N',N'',N'''-tetraacetic acid (DOTA) (GO-AuNS-DOTA-Gd) was prepared to target HCC-LM3-fLuc cells and used for therapy. For subcutaneous tumors, multimodality molecular imaging, including photoacoustic imaging (PAI) and magnetic resonance imaging (MRI), and the related processing techniques were used to monitor the pharmacokinetics of GO-AuNS-DOTA-Gd and to determine the optimal time for treatment. For orthotopic tumors, MRI was used to delineate the tumor location and margin in vivo before treatment, and a handheld photoacoustic imaging system was then used to determine the tumor location during surgery and to guide photothermal therapy. The experimental results in the orthotopic tumor model demonstrated that this synergistic method reduced residual tumor and satellite metastases by 85.71% compared with routine photothermal therapy without handheld PAI guidance. These results indicate that multimodality molecular imaging-guided photothermal therapy is promising, with good prospects for clinical application.


Subject(s)
Carcinoma, Hepatocellular , Chelating Agents , Gadolinium , Gold , Graphite , Hyperthermia, Induced , Liver Neoplasms, Experimental , Magnetic Resonance Imaging , Metal Nanoparticles , Phototherapy , Animals , Carcinoma, Hepatocellular/diagnostic imaging , Carcinoma, Hepatocellular/metabolism , Carcinoma, Hepatocellular/therapy , Cell Line, Tumor , Chelating Agents/chemistry , Chelating Agents/pharmacology , Gadolinium/chemistry , Gadolinium/pharmacology , Gold/chemistry , Gold/pharmacology , Graphite/chemistry , Graphite/pharmacology , Humans , Liver Neoplasms, Experimental/diagnostic imaging , Liver Neoplasms, Experimental/metabolism , Liver Neoplasms, Experimental/therapy , Metal Nanoparticles/chemistry , Metal Nanoparticles/therapeutic use , Mice , Mice, Inbred BALB C
15.
J Healthc Eng ; 2018: 9050812, 2018.
Article in English | MEDLINE | ID: mdl-29854370

ABSTRACT

In this study, a systematic evaluation of ten widely used and efficient QRS detection algorithms was performed, aiming to verify their performance and usefulness in different application situations. Four experiments were carried out on six internationally recognized databases. First, in the comparison of high-quality versus low-quality ECG databases, all ten algorithms achieved very high detection accuracy on the high-quality signals (F1 > 99%), whereas F1 decreased significantly on the poor-quality signals (all < 80%). Second, in the comparison of normal versus arrhythmia ECG databases, all ten algorithms achieved good F1 results on both databases (all > 95%, except the RS slope algorithm with 94.24% on the normal database and 94.44% on the arrhythmia database). Third, on the paced-rhythm ECG database, all algorithms were immune to paced beats (> 94%) except the RS slope method, which yielded a low F1 of only 78.99%. Finally, detection accuracy decreased markedly on dynamic telehealth ECG signals (all < 80%), except for the OKB algorithm (80.43%). Furthermore, the time cost of analyzing a 10 s ECG segment was reported as a quantitative index of computational complexity. All algorithms were computationally efficient (< 4 ms) except the RS slope (94.07 ms) and sixth-power (8.25 ms) algorithms, and the OKB algorithm was the fastest (1.54 ms).
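Beat-level F1 scoring of a QRS detector can be sketched as below: a detection counts as a true positive when it lies within a matching tolerance of an unmatched reference beat. The 150 ms tolerance is a common choice and an assumption here.

```python
# A minimal sketch of beat-level F1 scoring for QRS detection with a matching tolerance.
import numpy as np

def qrs_f1(ref_samples, det_samples, fs, tol_s=0.150):
    """Return (F1, TP, FP, FN) for detected vs. reference QRS locations (in samples)."""
    tol = int(round(tol_s * fs))
    ref = sorted(ref_samples)
    matched = np.zeros(len(ref), dtype=bool)
    tp = 0
    for d in sorted(det_samples):
        # Greedily match each detection to the nearest unmatched reference beat
        diffs = [abs(d - r) if not matched[i] else np.inf for i, r in enumerate(ref)]
        i = int(np.argmin(diffs)) if diffs else 0
        if diffs and diffs[i] <= tol:
            matched[i] = True
            tp += 1
    fp = len(det_samples) - tp
    fn = len(ref) - tp
    f1 = 2 * tp / (2 * tp + fp + fn) if (2 * tp + fp + fn) else 0.0
    return f1, tp, fp, fn

# Toy check: one missed beat and one spurious detection at 360 Hz
print(qrs_f1([100, 460, 820, 1180], [102, 461, 900, 1181], fs=360))
```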


Subject(s)
Electrocardiography/methods , Signal Processing, Computer-Assisted , Algorithms , Arrhythmias, Cardiac/diagnosis , Humans
16.
Technol Health Care ; 26(S1): 113-119, 2018.
Article in English | MEDLINE | ID: mdl-29710744

ABSTRACT

BACKGROUND: Changes in pulse transit time (PTT) induced by arm position have been studied for a single arm; however, the consistency of these PTT changes between the two arms has not been validated. OBJECTIVE: We aimed to quantify the PTT changes between horizontal and non-horizontal positions for the right and left arms in order to explore the consistency between the two sides. METHODS: Twenty-four healthy subjects aged 21-50 years (14 male, 10 female) were enrolled. Left and right radial artery pulses were recorded synchronously with one arm (left or right) at five angles (90°, 45°, 0°, -45° and -90°) and the other arm at the horizontal level (0°) as reference. RESULTS: The overall mean PTT changes at the five angles (from 90° to -90°) were 16.1, 12.3, -0.5, -2.5 and -2.6 ms in the left arm (right arm as reference) and 18.0, 12.6, 1.6, -1.6 and -2.0 ms in the right arm (left arm as reference), respectively. CONCLUSIONS: No obvious differences were found in the PTT changes between the two arms (left arm moving or right arm moving) at any of the five positions (all P > 0.05).


Subject(s)
Arm/physiology , Arteries/physiology , Blood Pressure/physiology , Heart Rate/physiology , Monitoring, Physiologic/methods , Pulse Wave Analysis/methods , Adult , Female , Humans , Male , Middle Aged , Young Adult
17.
Entropy (Basel) ; 20(12)2018 Nov 26.
Article in English | MEDLINE | ID: mdl-33266628

ABSTRACT

Entropy-based atrial fibrillation (AF) detectors have been applied to short-term electrocardiogram (ECG) analysis, but existing methods suffer from several limitations. To enhance the performance of entropy-based AF detectors, we developed a new entropy measure, named EntropyAF, which includes the following improvements: (1) use of a ranged function rather than the Chebyshev function to define vector distance, (2) use of a fuzzy function to determine vector similarity, (3) replacement of probability estimation with density estimation for the entropy calculation, (4) use of a flexible distance threshold parameter, and (5) adjustment of the entropy results for the heart rate effect. EntropyAF was trained on the MIT-BIH Atrial Fibrillation (AF) database and tested on clinical long-term wearable AF recordings. Three previous entropy-based AF detectors were used for comparison: sample entropy (SampEn), fuzzy measure entropy (FuzzyMEn) and the coefficient of sample entropy (COSEn). For classifying AF and non-AF rhythms in the MIT-BIH AF database, EntropyAF achieved the highest area under the receiver operating characteristic curve (AUC), 98.15%, when using a 30-beat time window, compared with 91.86% for COSEn; SampEn and FuzzyMEn gave much lower AUCs of 74.68% and 79.24%, respectively. For classifying AF and non-AF rhythms in the clinical wearable AF database, EntropyAF also generated the largest Youden index (77.94%), sensitivity (92.77%), specificity (85.17%), accuracy (87.10%), positive predictivity (68.09%) and negative predictivity (97.18%). COSEn had the second-best accuracy of 78.63%, followed by 65.08% for FuzzyMEn and 59.91% for SampEn. The newly proposed EntropyAF also produced the highest classification accuracy when using a 12-beat time window, and a time cost analysis verified its efficiency. This study showed the better discrimination ability of EntropyAF for identifying AF, indicating that it would be useful for practical clinical wearable AF screening.

18.
Technol Health Care ; 25(3): 435-445, 2017.
Article in English | MEDLINE | ID: mdl-27911348

ABSTRACT

BACKGROUND: The usefulness of heart rate variability (HRV) in clinical research has been verified in numerous studies. However, whether pulse rate variability (PRV) can be used as a surrogate for HRV in different clinical applications remains controversial. OBJECTIVE: We aimed to investigate whether PRV extracted from the finger photoplethysmography (Pleth) signal could substitute for HRV derived from the ECG signal during different sleep stages, by analyzing common time-domain, frequency-domain and non-linear indices. METHODS: Seventy-five sleep apnea patients were enrolled. For each patient, ECG and Pleth signals were recorded simultaneously for the whole night using the Alice Sleepware polysomnographic system, and sleep stages were scored automatically by this system. Time-domain, frequency-domain and non-linear indices of both HRV and PRV were calculated for each sleep stage. RESULTS: The Mann-Whitney U-test showed no statistical differences between HRV and PRV results for the time-domain and frequency-domain indices in any of the four sleep stages. For the non-linear indices, sample entropy showed statistical differences between HRV and PRV for N1, N2 and REM sleep (all P < 0.01), whereas fuzzy measure entropy showed statistical differences only for REM sleep (P < 0.05). The SDNN, LF and LF/HF indices decreased for both HRV and PRV as sleep deepened, while the HF and non-linear indices increased. In addition, there were strong and significant correlations between HRV and PRV indices in all four sleep stages (all P < 0.01). CONCLUSIONS: PRV measurement yields results similar to HRV analysis for sleep apnea patients across different sleep stages.
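The standard indices compared in the study can be computed from an RR (or pulse-to-pulse) interval series as sketched below; the cubic resampling at 4 Hz and the Welch PSD settings are conventional choices, not necessarily those used by the authors.

```python
# A minimal sketch of common HRV/PRV indices (SDNN, RMSSD, LF, HF, LF/HF).
import numpy as np
from scipy.interpolate import interp1d
from scipy.signal import welch

def hrv_indices(rr_s, fs_resample=4.0):
    """rr_s: RR (or pulse-to-pulse) intervals in seconds."""
    rr_s = np.asarray(rr_s, float)
    sdnn = np.std(rr_s, ddof=1) * 1000.0                   # ms
    rmssd = np.sqrt(np.mean(np.diff(rr_s) ** 2)) * 1000.0  # ms

    # Evenly resample the tachogram, then estimate the PSD with Welch's method
    t = np.cumsum(rr_s)
    ti = np.arange(t[0], t[-1], 1.0 / fs_resample)
    rr_i = interp1d(t, rr_s, kind="cubic")(ti)
    f, pxx = welch(rr_i - rr_i.mean(), fs=fs_resample, nperseg=min(256, len(rr_i)))

    df = f[1] - f[0]
    lf = pxx[(f >= 0.04) & (f < 0.15)].sum() * df           # low-frequency power
    hf = pxx[(f >= 0.15) & (f < 0.40)].sum() * df           # high-frequency power
    return {"SDNN": sdnn, "RMSSD": rmssd, "LF": lf, "HF": hf, "LF/HF": lf / hf}

rr = 0.8 + 0.05 * np.sin(2 * np.pi * 0.1 * np.arange(300))  # stand-in 5 min tachogram
print(hrv_indices(rr))
```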


Subject(s)
Heart Rate/physiology , Pulse , Sleep Apnea Syndromes/physiopathology , Sleep Stages/physiology , Electrocardiography , Female , Humans , Male , Middle Aged , Photoplethysmography , Polysomnography/methods
19.
J Med Biol Eng ; 36(5): 625-634, 2016.
Article in English | MEDLINE | ID: mdl-27853413

ABSTRACT

The poor quality of wireless electrocardiography (ECG) recordings can lead to misdiagnosis and waste of medical resources. This study presents an interpretation of Lempel-Ziv (LZ) complexity in terms of ECG quality assessment and verifies its performance on real ECG signals. First, LZ complexities were analyzed for typical signals, namely high-frequency (HF) noise, low-frequency (LF) noise, power-line (PL) noise, impulse (IM) noise, clean artificial ECG signals, and ECG signals with various types of noise added (ECG plus HF, LF, PL, and IM noise, respectively). Then, the effects of noise, signal length, and signal-to-noise ratio (SNR) on the LZ complexity of ECG signals were analyzed. The simulation results show that the LZ complexity of HF noise was clearly different from that of PL and LF noise, so the LZ value can be used to determine the presence of HF noise. ECG plus HF noise had the highest LZ values, whereas the other noise types had low LZ values. Signal lengths of over 40 s had only a small effect on LZ values. The LZ values for ECG plus noise increased monotonically with decreasing SNR for all noise types except LF and PL noise. In a test on real ECG signals with three types of added noise, namely muscle artefacts (MA), baseline wander (BW), and electrode motion (EM) artefacts, LZ complexity varied markedly with increasing MA level but not with BW or EM noise. This study demonstrates that LZ complexity is sensitive to noise level (especially HF noise) and can thus serve as a valuable reference index for assessing ECG signal quality.
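One common way to compute LZ complexity for a physiological signal is sketched below: binarize at the median and count LZ76 phrases. The binarization rule and normalization are typical choices and assumptions here, not necessarily the paper's exact procedure.

```python
# A minimal sketch of normalized Lempel-Ziv (LZ76) complexity on a median-binarized signal.
import numpy as np

def lz_complexity(x):
    """Normalized LZ76 complexity of a 1-D signal binarized at its median."""
    bits = np.asarray(x, float) > np.median(x)
    seq = "".join("1" if b else "0" for b in bits)
    n = len(seq)
    i, c, k = 0, 1, 1
    # Classic LZ76 parsing: count the number of new "phrases"
    while i + k < n:
        if seq[i:i + k] in seq[:i + k - 1]:
            k += 1
        else:
            c += 1
            i += k
            k = 1
    # Normalize so random sequences tend toward 1 for long signals
    return c * np.log2(n) / n

fs = 360
t = np.arange(0, 10, 1 / fs)
clean = np.sin(2 * np.pi * 1.2 * t)                          # stand-in periodic signal
noisy = clean + 0.8 * np.random.default_rng(0).normal(size=len(t))
print(lz_complexity(clean), lz_complexity(noisy))            # noisy typically scores higher
```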

20.
Physiol Meas ; 37(8): 1298-312, 2016 08.
Article in English | MEDLINE | ID: mdl-27454710

ABSTRACT

False alarm (FA) rates as high as 86% have been reported in intensive care unit monitors. High FA rates decrease the quality of care by slowing staff response times while increasing patient burden and stress. In this study, we propose a rule-based, multi-channel information fusion method for accurately classifying true and false alarms for five life-threatening arrhythmias: asystole (ASY), extreme bradycardia (EBR), extreme tachycardia (ETC), ventricular tachycardia (VTA) and ventricular flutter/fibrillation (VFB). The proposed method consists of five steps: (1) signal pre-processing, (2) feature detection and validation, (3) true/false alarm determination for each channel, (4) 'real-time' true/false alarm determination and (5) 'retrospective' true/false alarm determination (if needed). Up to four signal channels, that is, two electrocardiogram signals, one arterial blood pressure signal and/or one photoplethysmogram signal, were included in the analysis. Two events were set for method validation: event 1 for 'real-time' and event 2 for 'retrospective' alarm classification. On the training set, a 100% true positive rate (i.e. sensitivity) was obtained for the ASY, EBR, ETC and VFB types and 94% for the VTA type, with corresponding true negative rates (i.e. specificity) of 93%, 81%, 78%, 85% and 50%, respectively, giving score values of 96.50, 90.70, 88.89, 92.31 and 64.90, and final scores of 80.57 for event 1 and 79.12 for event 2. On the test set, the proposed method obtained scores of 88.73 for ASY, 77.78 for EBR, 89.92 for ETC, 67.74 for VFB and 61.04 for VTA, with final scores of 71.68 for event 1 and 75.91 for event 2.


Subject(s)
Algorithms , Arrhythmias, Cardiac/diagnosis , Clinical Alarms , Intensive Care Units , Signal Processing, Computer-Assisted , Arrhythmias, Cardiac/physiopathology , Blood Pressure , Electrocardiography/instrumentation , False Positive Reactions , Humans , Monitoring, Physiologic/instrumentation , Photoplethysmography/instrumentation