BraNet: a mobile application for breast image classification based on deep learning algorithms.
Jiménez-Gaona, Yuliana; Álvarez, María José Rodríguez; Castillo-Malla, Darwin; García-Jaen, Santiago; Carrión-Figueroa, Diana; Corral-Domínguez, Patricio; Lakshminarayanan, Vasudevan.
Affiliations
  • Jiménez-Gaona Y; Departamento de Química y Ciencias Exactas, Universidad Técnica Particular de Loja, San Cayetano Alto s/n CP1101608, Loja, Ecuador. ydjimenez@utpl.edu.ec.
  • Álvarez MJR; Instituto de Instrumentación para la Imagen Molecular I3M, Universitat Politécnica de Valencia, 46022, Valencia, Spain.
  • Castillo-Malla D; Theoretical and Experimental Epistemology Lab, School of Optometry and Vision Science, University of Waterloo, N2L 3G1, Waterloo, Canada.
  • García-Jaen S; Instituto de Instrumentación para la Imagen Molecular I3M, Universitat Politécnica de Valencia, 46022, Valencia, Spain.
  • Carrión-Figueroa D; Departamento de Química y Ciencias Exactas, Universidad Técnica Particular de Loja, San Cayetano Alto s/n CP1101608, Loja, Ecuador.
  • Corral-Domínguez P; Instituto de Instrumentación para la Imagen Molecular I3M, Universitat Politécnica de Valencia, 46022, Valencia, Spain.
  • Lakshminarayanan V; Theoretical and Experimental Epistemology Lab, School of Optometry and Vision Science, University of Waterloo, N2L 3G1, Waterloo, Canada.
Med Biol Eng Comput ; 62(9): 2737-2756, 2024 Sep.
Article in En | MEDLINE | ID: mdl-38693328
ABSTRACT
Mobile health apps are widely used for breast cancer detection with artificial intelligence algorithms, providing radiologists with second opinions and reducing false diagnoses. This study aims to develop an open-source mobile app named "BraNet" for 2D breast imaging segmentation and classification using deep learning algorithms. In the offline phase, an SNGAN model was first trained for synthetic image generation, and these synthetic images were then used to pre-train the SAM segmentation and ResNet18 classification models. In the online phase, the BraNet app was developed using the React Native framework, offering a modular deep-learning pipeline for digital mammography (DM) and ultrasound (US) breast image classification. The application operates on a client-server architecture, was implemented in Python, and runs on iOS and Android devices. Two diagnostic radiologists then completed a reading test of 290 original ROI images, assigning the perceived breast tissue type to each; inter-reader agreement was assessed with the kappa coefficient. The BraNet mobile app achieved its highest accuracy on benign/malignant classification of US images (94.7%/93.6%), compared with DM during training I (80.9%/76.9%) and training II (73.7%/72.3%). This contrasts with the radiologists' accuracy: both readers reached 29% on DM classification versus 70% on US, i.e., they classified US ROIs more accurately than DM images. The kappa values indicate fair agreement (0.3) for DM images and moderate agreement (0.4) for US images for both readers. These results suggest that the amount of training data alone is not sufficient for deep learning algorithms; the variety of abnormalities must also be considered, especially in mammography data, where several BI-RADS findings (microcalcifications, nodules, masses, asymmetry, and dense breasts) are present and can affect the accuracy of the model behind the API.
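The inter-reader agreement reported above is the Cohen's kappa coefficient, which corrects raw percent agreement for the agreement expected by chance. As a minimal illustration (the reader labels below are hypothetical examples, not the study's data), kappa for two readers can be computed as:

```python
from collections import Counter


def cohen_kappa(reader_a, reader_b):
    """Cohen's kappa between two raters over the same items.

    kappa = (p_o - p_e) / (1 - p_e), where p_o is observed agreement
    and p_e is chance agreement from each rater's label frequencies.
    """
    assert len(reader_a) == len(reader_b) and reader_a
    n = len(reader_a)
    # Observed agreement: fraction of items both readers labeled identically
    p_o = sum(a == b for a, b in zip(reader_a, reader_b)) / n
    # Expected chance agreement from the marginal label frequencies
    count_a, count_b = Counter(reader_a), Counter(reader_b)
    p_e = sum(count_a[lab] * count_b[lab]
              for lab in set(reader_a) | set(reader_b)) / (n * n)
    return (p_o - p_e) / (1 - p_e)


# Hypothetical benign ("b") / malignant ("m") calls from two readers
kappa = cohen_kappa(["b", "b", "m", "m"], ["b", "m", "m", "m"])
print(round(kappa, 2))  # 0.5
```

By convention (Landis and Koch), values around 0.21-0.40 are read as "fair" and 0.41-0.60 as "moderate" agreement, which is how the 0.3 (DM) and 0.4 (US) values in the abstract are interpreted.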
Full text: 1 Collections: 01-international Database: MEDLINE Main subject: Algorithms / Breast / Breast Neoplasms / Mammography / Deep Learning Limits: Female / Humans Language: En Journal: Med Biol Eng Comput Publication year: 2024 Document type: Article