An accurate valvular heart disorders detection model based on a new dual symmetric tree pattern using stethoscope sounds.
Comput Biol Med. 2022 Jul; 146: 105599.
Article in English | MEDLINE | ID: mdl-35609471
BACKGROUND AND PURPOSE: Valvular heart disease (VHD) is an important cause of morbidity and mortality. Echocardiography is the reference standard for VHD diagnosis but is not universally accessible, and manual cardiac auscultation is inadequate for screening VHD. Machine learning models using heart sounds acquired with an electronic stethoscope may improve the accuracy of VHD diagnosis. We aimed to develop an accurate sound classification model for VHD diagnosis. MATERIALS AND METHODS: A new large stethoscope sound dataset containing 10,366 heart sounds divided into ten categories (nine VHD and one healthy) was prospectively collected. We developed a handcrafted learning model comprising multilevel feature extraction based on a dual symmetric tree pattern (DSTP) and multilevel discrete wavelet transform (DWT), feature selection, and classification. The multilevel DWT was used to create subbands from which features were extracted at both high and low levels. Iterative neighborhood component analysis was then used to select the 512 most discriminative features from the generated feature vector. In the classification phase, a support vector machine (SVM) was used with 10-fold cross-validation (CV) and leave-one-subject-out (LOSO) CV. RESULTS: Our proposed DSTP-based model attained 99.58% and 99.84% classification accuracies using an SVM classifier with 10-fold CV and LOSO CV, respectively. CONCLUSIONS: The presented DSTP-based classification model attained excellent multiclass classification performance on a large prospective heart sound dataset at a low computational cost.
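The abstract outlines a pipeline of multilevel DWT subband generation, DSTP feature extraction, selection of 512 features by iterative neighborhood component analysis, and SVM classification with 10-fold CV. The sketch below illustrates that overall flow only; the DSTP operator and the iterative NCA selector are defined in the paper itself, so simple placeholder features and a generic univariate selector stand in for them, and the function names, wavelet choice, and decomposition depth are assumptions for illustration.

```python
# Minimal sketch of the abstract's pipeline, assuming PyWavelets and scikit-learn.
# DSTP features and iterative NCA selection are replaced by stand-ins.
import numpy as np
import pywt                                           # multilevel DWT
from sklearn.feature_selection import SelectKBest, f_classif
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

def subband_features(signal, wavelet="db4", levels=5):
    """Decompose one heart-sound recording into DWT subbands and compute
    simple statistics per band (placeholder for the paper's DSTP features)."""
    coeffs = pywt.wavedec(signal, wavelet, level=levels)
    feats = []
    for band in coeffs:
        feats.extend([band.mean(), band.std(),
                      np.abs(band).max(), np.sum(band ** 2)])
    return np.array(feats)

def evaluate(heart_sounds, labels):
    """heart_sounds: list of 1-D arrays; labels: ten-class targets (assumed inputs).
    Returns 10-fold CV accuracies for an SVM on the selected features."""
    X = np.vstack([subband_features(s) for s in heart_sounds])
    model = make_pipeline(
        StandardScaler(),
        SelectKBest(f_classif, k=min(512, X.shape[1])),  # stand-in for iterative NCA
        SVC(kernel="rbf"),
    )
    return cross_val_score(model, X, labels, cv=10)
```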
Database: MEDLINE
Main subject: Heart Valve Diseases / Models, Theoretical
Study type: Diagnostic study / Guideline / Observational study / Prognostic study
Limits: Humans
Language: English
Publication year: 2022
Document type: Article