Enhancing medical image analysis with unsupervised domain adaptation approach across microscopes and magnifications.
Comput Biol Med; 170: 108055, 2024 Mar.
Article | En | MEDLINE | ID: mdl-38295480
ABSTRACT
In the domain of medical image analysis, deep learning models are driving a revolution, especially in detecting the complex and nuanced features characteristic of diseases such as tumors and cancers. However, the robustness and adaptability of these models across varied imaging conditions and magnifications remain a formidable challenge. This paper introduces the Fourier Adaptive Recognition System (FARS), a pioneering model primarily engineered to address adaptability in malarial parasite recognition. The foundational principles guiding FARS, however, lend themselves to broader applications, including tumor and cancer diagnostics. FARS capitalizes on the untapped potential of transitioning from bounding box labels to richer semantic segmentation labels, enabling a more refined examination of microscopy slides. With the integration of adversarial training and Color Domain Aware Fourier Domain Adaptation (F2DA), the model ensures consistent feature extraction across diverse microscopy configurations. The further inclusion of category-dependent context attention amplifies FARS's cross-domain versatility. Evidenced by a substantial improvement in cross-magnification performance from 31.3% mAP to 55.19% mAP and a 15.68% boost in cross-domain adaptability, FARS positions itself as a significant advancement in malarial parasite recognition. Furthermore, the core methodologies of FARS can serve as a blueprint for enhancing precision in other realms of medical image analysis, especially in the complex terrains of tumor and cancer imaging. The code is available at: https://github.com/Mr-TalhaIlyas/FARS.
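The F2DA component builds on the Fourier domain adaptation idea of exchanging low-frequency amplitude spectra between domains while preserving phase, so that stain and illumination statistics are transferred without altering image structure. The Python/NumPy sketch below illustrates only that generic amplitude-swap step under stated assumptions; it does not reproduce the color-domain-aware weighting, adversarial training, or context attention of FARS, and the function name fourier_style_transfer and the beta parameter are illustrative, not the paper's API.

    # Minimal sketch of Fourier-based style transfer between domains (assumption:
    # this generic amplitude swap approximates the core step that F2DA extends).
    import numpy as np

    def fourier_style_transfer(src: np.ndarray, tgt: np.ndarray, beta: float = 0.05) -> np.ndarray:
        """Replace the low-frequency amplitude of `src` with that of `tgt`.

        src, tgt: float arrays of shape (H, W, C), same size, values in [0, 1].
        beta: fraction of the spectrum (per side) treated as "low frequency".
        """
        out = np.empty_like(src, dtype=np.float64)
        h, w = src.shape[:2]
        # Half-size of the centered low-frequency square to swap.
        b = int(np.floor(min(h, w) * beta))
        cy, cx = h // 2, w // 2
        for c in range(src.shape[2]):
            fft_src = np.fft.fftshift(np.fft.fft2(src[:, :, c]))
            fft_tgt = np.fft.fftshift(np.fft.fft2(tgt[:, :, c]))
            amp_src, pha_src = np.abs(fft_src), np.angle(fft_src)
            amp_tgt = np.abs(fft_tgt)
            # Transfer the target's low-frequency amplitude block (the "style").
            amp_src[cy - b:cy + b, cx - b:cx + b] = amp_tgt[cy - b:cy + b, cx - b:cx + b]
            # Recombine the swapped amplitude with the source phase and invert.
            fft_new = amp_src * np.exp(1j * pha_src)
            out[:, :, c] = np.real(np.fft.ifft2(np.fft.ifftshift(fft_new)))
        return np.clip(out, 0.0, 1.0)

In such a pipeline, source-domain slides transformed this way would be fed to the segmentation network during training so that extracted features remain consistent across microscopes and magnifications.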
Full text: 1
Collection: 01-internacional
Database: MEDLINE
Main subject: Microscopy / Neoplasms
Study type: Prognostic_studies
Limits: Humans
Language: En
Journal: Comput Biol Med
Year: 2024
Document type: Article