Improved breast ultrasound tumor classification using dual-input CNN with GAP-guided attention loss.
Zou, Xiao; Zhai, Jintao; Qian, Shengyou; Li, Ang; Tian, Feng; Cao, Xiaofei; Wang, Runmin.
Affiliation
  • Zou X; School of Physics and Electronics, Hunan Normal University, Changsha 410081, China.
  • Zhai J; School of Physics and Electronics, Hunan Normal University, Changsha 410081, China.
  • Qian S; School of Physics and Electronics, Hunan Normal University, Changsha 410081, China.
  • Li A; School of Physics and Electronics, Hunan Normal University, Changsha 410081, China.
  • Tian F; School of Physics and Electronics, Hunan Normal University, Changsha 410081, China.
  • Cao X; College of Information Science and Engineering, Hunan Normal University, Changsha 410081, China.
  • Wang R; College of Information Science and Engineering, Hunan Normal University, Changsha 410081, China.
Math Biosci Eng; 20(8): 15244-15264, 2023 Jul 20.
Article in En | MEDLINE | ID: mdl-37679179
ABSTRACT
Ultrasonography is a widely used medical imaging technique for detecting breast cancer. Manual diagnosis is time-consuming and subject to inter-observer variability, whereas computer-aided diagnostic (CAD) methods have proven more efficient. However, current CAD approaches neglect the impact of noise and artifacts on the accuracy of image analysis. To enhance the precision of breast ultrasound image analysis for identifying tissues, organs and lesions, we propose a novel approach to tumor classification based on a dual-input model and a global average pooling (GAP)-guided attention loss function. Our approach leverages a convolutional neural network with a transformer architecture and modifies the single-input model for dual input. The method employs a fusion module together with the GAP-guided attention loss function to supervise the extraction of effective features from the target region and to mitigate misclassification caused by information loss or redundancy. Our proposed method has three key features: (i) ResNet and MobileViT are combined to enhance local and global information extraction, and a dual-input channel is designed to take both attention images and original breast ultrasound images, mitigating the impact of noise and artifacts in ultrasound images. (ii) A fusion module and a GAP-guided attention loss function are proposed to improve the fusion of dual-channel feature information and to supervise and constrain the weight the attention mechanism places on the fused focus region. (iii) ResNet18 is pre-trained on a collected uterine fibroid ultrasound dataset and the resulting weights are loaded for transfer; experiments on the public BUSI and BUSC datasets demonstrate that the proposed method outperforms several state-of-the-art methods. The code will be publicly released at https://github.com/425877/Improved-Breast-Ultrasound-Tumor-Classification.
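The pipeline the abstract describes (two input branches, per-branch GAP descriptors, a fusion step, and a GAP-guided attention penalty) can be illustrated with a minimal NumPy sketch. The function names, weight shapes, sigmoid gating, and the exact mean-squared form of the attention loss are illustrative assumptions for exposition, not the paper's actual implementation:

```python
import numpy as np


def global_average_pool(feat):
    """Reduce a (C, H, W) feature map to a (C,) channel descriptor."""
    return feat.mean(axis=(1, 2))


def fuse_and_classify(feat_orig, feat_attn, w_fuse, w_cls):
    """Fuse descriptors from the original-image and attention-image branches.

    feat_orig, feat_attn: (C, H, W) backbone feature maps (e.g. from
    ResNet / MobileViT branches; backbones omitted here).
    w_fuse: (2C, 2C) weights producing channel-attention logits.
    w_cls:  (K, 2C) classifier weights for K tumor classes.
    """
    gap_orig = global_average_pool(feat_orig)
    gap_attn = global_average_pool(feat_attn)
    fused = np.concatenate([gap_orig, gap_attn])          # (2C,)
    # Sigmoid gate over fused channels: a simple stand-in for the
    # paper's attention mechanism on the fused focus region.
    attn = 1.0 / (1.0 + np.exp(-(w_fuse @ fused)))        # (2C,) in (0, 1)
    logits = w_cls @ (attn * fused)                       # (K,)
    return logits, attn


def gap_attention_loss(attn, target_descriptor, lam=0.1):
    """Hypothetical GAP-guided penalty: pull the attention weights toward a
    target descriptor derived from the lesion region's GAP statistics."""
    return lam * np.mean((attn - target_descriptor) ** 2)
```

In training, this penalty would be added to the usual classification loss, so gradients steer the gate toward channels that respond to the lesion region rather than to noise or artifacts.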
Subject(s)
Keywords

Full text: 1 Collection: 01-international Database: MEDLINE Main subject: Breast Ultrasonography / Neoplasms Study type: Guideline Limit: Female / Humans Language: En Journal: Math Biosci Eng Year: 2023 Document type: Article Country of affiliation: China