Active learning for left ventricle segmentation in echocardiography.
Comput Methods Programs Biomed; 248: 108111, 2024 May.
Article in English | MEDLINE | ID: mdl-38479147
ABSTRACT
BACKGROUND AND OBJECTIVE:
Training deep learning models for medical image segmentation requires large annotated datasets, which are expensive and time-consuming to create. Active learning is a promising approach to reducing this burden by strategically selecting the most informative samples for annotation. This study investigates the use of active learning for efficient left ventricle segmentation in echocardiography with sparse expert annotations.
METHODS:
We adapt and evaluate various sampling techniques, demonstrating their effectiveness in judiciously selecting samples for annotation. Additionally, we introduce a novel strategy, Optimised Representativeness Sampling, which combines feature-based outliers with the most representative samples to improve annotation efficiency.
RESULTS:
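The combination of feature-based outliers with the most representative samples can be sketched roughly as follows. This is an illustrative assumption, not the paper's actual implementation: the function name, the centroid-distance heuristic for representativeness/outlierness, and the 50/50 split between the two groups are all hypothetical choices.

```python
import numpy as np

def optimised_representativeness_sampling(features, budget, outlier_frac=0.5):
    """Hypothetical sketch of a representativeness-plus-outlier selection
    step: spend part of the annotation budget on samples closest to the
    feature-space centroid (most representative) and the rest on samples
    farthest from it (feature-based outliers)."""
    centroid = features.mean(axis=0)
    dists = np.linalg.norm(features - centroid, axis=1)
    n_out = int(budget * outlier_frac)          # slots for outliers
    n_rep = budget - n_out                      # slots for representatives
    order = np.argsort(dists)                   # closest to centroid first
    representative = order[:n_rep]              # most representative samples
    outliers = order[::-1][:n_out]              # farthest = outliers
    return np.concatenate([representative, outliers])

# Example: pick 10 of 100 unlabelled images from (hypothetical) image features.
features = np.random.default_rng(0).normal(size=(100, 8))
selected = optimised_representativeness_sampling(features, budget=10)
```

The indices returned by such a step would then be sent to the expert annotator, and the segmentation model retrained on the enlarged labelled set before the next selection round.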
Our findings demonstrate a substantial reduction in annotation costs: the models reached 99% of upper-bound performance while using only 20% of the labelled data, a saving of 1680 images that would otherwise need annotation in our dataset. On a publicly available dataset, our approach reduced the required annotation effort by 70%, a significant advance over baseline active learning strategies, which achieved only a 50% reduction. Our experiments also highlight the nuanced performance of diverse sampling strategies across datasets within the same domain.
CONCLUSIONS:
The study provides a cost-effective approach to the challenge of limited expert annotations in echocardiography. By introducing a distinct dataset, made publicly available for research purposes, our work contributes to the field's understanding of efficient annotation strategies in medical image segmentation.
Full text: 1
Database: MEDLINE
Main subject: Echocardiography / Heart Ventricles
Language: En
Publication year: 2024
Document type: Article