1.
Heliyon ; 10(5): e27200, 2024 Mar 15.
Article in English | MEDLINE | ID: mdl-38486759

ABSTRACT

Arrhythmia, a frequently encountered and life-threatening cardiac disorder, can manifest as a transient or isolated event. Traditional automatic arrhythmia detection methods have predominantly relied on QRS-wave signal detection. Contemporary research has focused on using wearable devices for continuous monitoring of heart rate and rhythm through single-lead electrocardiogram (ECG), which holds the potential to detect arrhythmias promptly. In this study, however, we employed a convolutional neural network (CNN) to classify distinct arrhythmias without a QRS-wave detection step. The ECG data utilized in this study were sourced from the publicly accessible PhysioNet databases. Taking into account the impact of ECG signal duration on accuracy, this study trained one-dimensional CNN models on 5-s and 10-s segments, respectively, and compared their results. The CNN model was able to differentiate between Normal Sinus Rhythm (NSR) and various arrhythmias, including Atrial Fibrillation (AFIB), Atrial Flutter (AFL), Wolff-Parkinson-White syndrome (WPW), Ventricular Fibrillation (VF), Ventricular Tachycardia (VT), Ventricular Flutter (VFL), Mobitz II AV Block (MII), and Sinus Bradycardia (SB). The 10-s and 5-s ECG segments yielded comparable results, with an average classification accuracy of 97.31%. This demonstrates the feasibility of using even shorter 5-s recordings for detecting arrhythmias in everyday scenarios. Detecting arrhythmias from a single lead aligns well with the practicality of wearable devices for daily use, and shorter detection times also support their clinical utility in emergency situations.
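The preprocessing step this abstract implies, slicing a continuous single-lead ECG into fixed-length 5-s or 10-s windows with no QRS detection, can be sketched as below. This is a minimal illustration, not the authors' code; the 360 Hz sampling rate is an assumption (common for PhysioNet MIT-BIH recordings but not stated in the abstract), as is the choice of non-overlapping windows.

```python
# Sketch: window a continuous single-lead ECG into fixed-length segments
# suitable for a 1D CNN classifier (no QRS-wave detection step).
# Assumption: 360 Hz sampling rate, non-overlapping windows.

FS = 360  # samples per second (assumed)

def segment_ecg(signal, seconds, fs=FS):
    """Return non-overlapping windows of `seconds` duration; drop any remainder."""
    win = int(seconds * fs)
    return [signal[i:i + win] for i in range(0, len(signal) - win + 1, win)]

# A 60-s dummy recording yields 12 five-second segments of 1800 samples each.
ecg = [0.0] * (60 * FS)
segs = segment_ecg(ecg, seconds=5)
```

Each resulting segment would then be fed to the 1D CNN as one training or inference example.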

2.
Front Med (Lausanne) ; 10: 1178798, 2023.
Article in English | MEDLINE | ID: mdl-37593404

ABSTRACT

Introduction: Rib fractures are a prevalent injury among trauma patients, and accurate, timely diagnosis is crucial to mitigate associated risks. Unfortunately, missed rib fractures are common, leading to heightened morbidity and mortality rates. While more sensitive imaging modalities exist, their practicality is limited by cost and radiation exposure. Point-of-care ultrasound offers an alternative but has drawbacks in terms of procedural time and operator expertise. Therefore, this study aims to explore the potential of deep convolutional neural networks (DCNNs) in identifying rib fractures on chest radiographs. Methods: We assembled a comprehensive retrospective dataset of chest radiographs with formal image reports documenting rib fractures from a single medical center over the last five years. The DCNN models were trained using 2000 region-of-interest (ROI) slices for each category: fractured ribs, non-fractured ribs, and background regions. To optimize training of the deep learning models (DLMs), the images were segmented into 128 × 128-pixel patches. Results: The trained DCNN models demonstrated remarkable validation accuracies: AlexNet achieved 92.6%, GoogLeNet 92.2%, EfficientNetb3 92.3%, DenseNet201 92.4%, and MobileNetV2 91.2%. Discussion: By integrating DCNN models capable of rib fracture recognition into clinical decision support systems, the incidence of missed rib fracture diagnoses can be significantly reduced, yielding tangible decreases in morbidity and mortality among trauma patients. This approach holds the potential to improve the diagnosis and treatment of chest trauma, ultimately leading to better clinical outcomes for individuals affected by these injuries. The use of DCNNs for rib fracture detection on chest radiographs addresses the limitations of other imaging modalities, offering a promising and practical way to improve patient care and management.
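The ROI-based training setup described in Methods can be sketched as a simple patch crop. Only the 128 × 128 patch size comes from the abstract; the image dimensions, center coordinates, and clamping behavior below are illustrative assumptions.

```python
# Sketch: crop a fixed-size ROI patch from a radiograph for DCNN training.
# The image is represented as a plain nested list (rows of pixel values);
# 512x512 input size and the ROI center are illustrative assumptions.

def extract_roi(image, cy, cx, size=128):
    """Crop a size x size ROI centered at (cy, cx), clamped to image bounds."""
    h, w = len(image), len(image[0])
    y0 = max(0, min(cy - size // 2, h - size))
    x0 = max(0, min(cx - size // 2, w - size))
    return [row[x0:x0 + size] for row in image[y0:y0 + size]]

# Usage: crop a 128x128 patch from a dummy 512x512 radiograph.
img = [[0] * 512 for _ in range(512)]
roi = extract_roi(img, cy=100, cx=400)
```

Patches cropped this way from fractured-rib, non-fractured-rib, and background regions would form the three training categories the abstract describes.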
