Real-time carotid plaque recognition from dynamic ultrasound videos based on artificial neural network.
Ultraschall Med; 45(5): 493-500, 2024 Oct.
Article in En | MEDLINE | ID: mdl-38113893
ABSTRACT
PURPOSE:
Carotid ultrasound allows noninvasive assessment of vascular anatomy and function with real-time display. Transfer learning methods have yielded a series of results on the recognition and analysis of static images. Real-time ultrasound detection of carotid plaque, however, places high demands on purpose-built algorithms. This study aims to establish an automatic recognition system, Be Easy to Use (BETU), for the real-time, synchronous diagnosis of carotid plaque from ultrasound videos based on an artificial neural network.
MATERIALS AND METHODS:
445 participants (mean age, 54.6±7.8 years; 227 men) were evaluated. Radiologists labeled a total of 3259 segmented ultrasound images from 445 videos with the diagnosis of carotid plaque; 2725 images were collected as a training dataset and 554 images as a testing dataset. The automatic plaque recognition system BETU was established based on an artificial neural network, and remote application in a 5G environment was performed to test its diagnostic performance.
RESULTS:
The diagnostic accuracy of BETU (98.5%) was consistent with the radiologist's (Kappa = 0.967, P < 0.001). Remote diagnostic feedback based on BETU-processed ultrasound videos could be obtained within 150 ms across a distance of 1023 km between the ultrasound/BETU station and the consultation workstation.
CONCLUSION:
Given BETU's good performance in real-time plaque recognition from ultrasound videos, 5G plus artificial intelligence (AI)-assisted real-time carotid plaque screening and diagnosis was achieved.
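The agreement figure reported in the results (Kappa = 0.967) is Cohen's kappa, which corrects raw percent agreement between two raters for the agreement expected by chance. A minimal sketch of the computation from a 2x2 confusion matrix follows; the counts used are illustrative, not the study's data:

```python
def cohens_kappa(tp: int, fp: int, fn: int, tn: int) -> float:
    """Cohen's kappa for two raters on a binary label (plaque / no plaque).

    tp/fp/fn/tn are confusion-matrix counts of the AI system's calls
    against the radiologist's reference reading.
    """
    n = tp + fp + fn + tn
    observed = (tp + tn) / n  # raw percent agreement
    # Chance agreement: product of the raters' marginal "positive" rates
    # plus product of their marginal "negative" rates.
    expected = ((tp + fp) * (tp + fn) + (fn + tn) * (fp + tn)) / n ** 2
    return (observed - expected) / (1 - expected)

# Illustrative counts only (not the study's data):
print(round(cohens_kappa(tp=40, fp=5, fn=5, tn=50), 3))  # → 0.798
```

A kappa near 1, as reported, indicates almost perfect agreement between BETU and the radiologist after discounting chance.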
Full text: 1
Collections: 01-internacional
Database: MEDLINE
Main subject: Ultrasonography / Neural Networks, Computer
Limits: Adult / Female / Humans / Male / Middle Aged
Language: En
Publication year: 2024
Document type: Article