Deep active learning for suggestive segmentation of biomedical image stacks via optimisation of Dice scores and traced boundary length.
Med Image Anal 2022 Oct; 81: 102549.
Article | En | MEDLINE | ID: mdl-36113320
Manual segmentation of stacks of 2D biomedical images (e.g., histology) is a time-consuming task that can be sped up with semi-automated techniques. In this article, we present a suggestive deep active learning framework that seeks to minimise the annotation effort required to achieve a certain level of accuracy when labelling such a stack. The framework suggests, at every iteration, a specific region of interest (ROI) in one of the images for manual delineation. Using a deep segmentation neural network and a mixed cross-entropy loss function, we propose a principled strategy to estimate class probabilities for the whole stack, conditioned on heterogeneous partial segmentations of the 2D images, as well as on weak supervision in the form of image indices that bound each ROI. Using the estimated probabilities, we propose a novel active learning criterion based on predictions for the estimated segmentation performance and delineation effort, measured with average Dice scores and total delineated boundary length, respectively, rather than common surrogates such as entropy. The query strategy suggests the ROI that is expected to maximise the ratio between performance and effort, while considering the adjacency of structures that may already have been labelled, which decreases the length of the boundary that needs to be traced. We provide quantitative results on synthetically deformed MRI scans and real histological data, showing that our framework can reduce labelling effort by up to 60-70% without compromising accuracy.
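The query strategy can be illustrated with a short sketch. The Python snippet below is a minimal, hypothetical rendering of the performance-per-effort criterion described above: the function names, the use of low-confidence pixel mass as a stand-in for the predicted Dice gain, and the label-transition count as a stand-in for traced boundary length are assumptions made for illustration, not the paper's actual implementation.

```python
import numpy as np

def predicted_boundary_length(prob_map, roi_mask, labelled_mask=None):
    """Rough proxy for the boundary length a user would trace in an ROI.

    prob_map: (C, H, W) softmax class probabilities for one slice (assumed shape).
    roi_mask: (H, W) boolean mask of the candidate ROI.
    labelled_mask: (H, W) boolean mask of already-delineated pixels whose shared
        boundaries need not be retraced (an assumption, not the paper's exact formulation).
    """
    hard = prob_map.argmax(axis=0)
    # Count label transitions between 4-connected neighbours inside the ROI
    # as a proxy for total contour length.
    horiz = (hard[:, :-1] != hard[:, 1:]) & roi_mask[:, :-1] & roi_mask[:, 1:]
    vert = (hard[:-1, :] != hard[1:, :]) & roi_mask[:-1, :] & roi_mask[1:, :]
    length = horiz.sum() + vert.sum()
    if labelled_mask is not None:
        # Boundaries adjacent to structures that were already labelled do not
        # need to be traced again, so subtract them from the effort estimate.
        shared_h = horiz & (labelled_mask[:, :-1] | labelled_mask[:, 1:])
        shared_v = vert & (labelled_mask[:-1, :] | labelled_mask[1:, :])
        length -= shared_h.sum() + shared_v.sum()
    return max(int(length), 1)  # avoid division by zero

def expected_dice_gain(prob_map, roi_mask):
    """Crude stand-in for the predicted Dice improvement from labelling this ROI:
    the mass of low-confidence pixels inside it (the paper instead predicts Dice
    scores directly from the estimated class probabilities)."""
    confidence = prob_map.max(axis=0)
    return float(((1.0 - confidence) * roi_mask).sum())

def select_roi(prob_map, candidate_rois, labelled_mask=None):
    """Pick the candidate ROI maximising predicted performance gain per unit effort."""
    scores = [
        expected_dice_gain(prob_map, roi) /
        predicted_boundary_length(prob_map, roi, labelled_mask)
        for roi in candidate_rois
    ]
    return int(np.argmax(scores))
```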
Full text:
1
Collection:
01-internacional
Database:
MEDLINE
Main subject:
Magnetic Resonance Imaging
/
Neural Networks, Computer
Study type:
Prognostic_studies
Limit:
Humans
Language:
En
Journal:
Med Image Anal
Journal subject:
DIAGNOSTIC IMAGING
Year:
2022
Document type:
Article