Multi-scale and local feature guidance network for corneal nerve fiber segmentation.
Phys Med Biol; 68(9). 2023 May 03.
Article in English | MEDLINE | ID: mdl-37054733
Objective. Corneal confocal microscopy (CCM) is a rapid, non-invasive ophthalmic imaging technique that can reveal corneal nerve fibers. Automatic segmentation of corneal nerve fibers in CCM images is vital for subsequent abnormality analysis, which is the main basis for early diagnosis of degenerative systemic neurological diseases such as diabetic peripheral neuropathy.

Approach. In this paper, a multi-scale and local feature guidance neural network (MLFGNet) based on a U-shaped encoder-decoder structure is proposed for automatic corneal nerve fiber segmentation in CCM images. Three novel modules are proposed: a multi-scale progressive guidance (MFPG) module, a local feature guided attention (LFGA) module, and a multi-scale deep supervision (MDS) module, applied in the skip connections, at the bottom of the encoder, and along the decoder path, respectively. They are designed from the perspectives of both multi-scale information fusion and local information extraction, to enhance the network's ability to discriminate the global and local structure of nerve fibers. The proposed MFPG module resolves the imbalance between semantic and spatial information; the LFGA module enables the network to capture attention relationships on local feature maps; and the MDS module fully exploits the relationship between high-level and low-level features for feature reconstruction in the decoder path.

Main results. The proposed MLFGNet is evaluated on three CCM image datasets, on which the Dice coefficients reach 89.33%, 89.41%, and 88.29%, respectively.

Significance. The proposed method achieves excellent segmentation performance for corneal nerve fibers and outperforms other state-of-the-art methods.
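The Dice coefficient reported in the results measures the overlap between a predicted segmentation mask and the ground truth. The sketch below is a generic illustration of this metric for binary masks, not the authors' evaluation code; the toy masks and the `eps` smoothing term are assumptions for the example.

```python
import numpy as np

def dice_coefficient(pred, target, eps=1e-7):
    """Dice similarity 2*|A∩B| / (|A| + |B|) for two binary masks."""
    pred = np.asarray(pred, dtype=bool)
    target = np.asarray(target, dtype=bool)
    intersection = np.logical_and(pred, target).sum()
    # eps avoids division by zero when both masks are empty
    return (2.0 * intersection + eps) / (pred.sum() + target.sum() + eps)

# Hypothetical 4x4 masks: predicted fiber pixels vs. ground truth
pred = np.array([[0, 1, 1, 0],
                 [0, 1, 1, 0],
                 [0, 0, 0, 0],
                 [0, 0, 0, 0]])
gt   = np.array([[0, 1, 1, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 0],
                 [0, 0, 0, 0]])
print(round(dice_coefficient(pred, gt), 4))  # 2*3 / (4+3) ≈ 0.8571
```

A Dice of 1.0 means perfect overlap, so the reported values near 89% indicate close agreement between predicted and annotated fiber masks.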
Full text: 1
Collections: 01-international
Database: MEDLINE
Main subject: Eye / Face
Study type: Guideline / Screening_studies
Language: English
Journal: Phys Med Biol
Publication year: 2023
Document type: Article