J Med Imaging (Bellingham); 10(6): 061108, 2023 Nov.
Article in English | MEDLINE | ID: mdl-38106815

ABSTRACT

Purpose: Breast ultrasound suffers from low positive predictive value and specificity. Artificial intelligence (AI) promises to improve accuracy, reduce false negatives, reduce inter- and intra-observer variability, and decrease the rate of benign biopsies. Incorporating AI-based models into clinical practice risks perpetuating racial/ethnic disparities in healthcare and patient outcomes; therefore, such models must be validated as free of bias before clinical use.

Approach: Our retrospective review assessed whether our AI decision support (DS) system demonstrates racial/ethnic bias by evaluating its performance on 1810 biopsy-proven cases from nine breast imaging facilities within our health system between January 1, 2018, and October 28, 2021. Patient age, gender, race/ethnicity, AI DS output, and pathology results were obtained.

Results: Significant differences in breast pathology incidence were observed across racial and ethnic groups. Stratified analysis showed that differences in AI DS output were attributable to underlying differences in pathology incidence within our cohort and did not reflect statistically significant bias among racial/ethnic groups, suggesting similar effectiveness of our AI DS system across races (p > 0.05 for all).

Conclusions: Our study shows promise that an AI DS system may serve as a valuable second opinion in the detection of breast cancer on diagnostic ultrasound without significant racial or ethnic bias. AI tools are not meant to replace the radiologist, but rather to aid in screening and diagnosis without perpetuating racial/ethnic disparities.
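The abstract does not specify which statistical test underlies the stratified analysis, so the following is only a minimal sketch of one way such a check could be implemented: within each pathology stratum (benign vs. malignant), a chi-square test compares the distribution of AI DS output categories across racial/ethnic groups. The column names (race_ethnicity, pathology, ai_ds_output) and the toy data are hypothetical placeholders, not the authors' actual variables or results.

```python
# Hedged sketch of a stratified bias check, assuming one row per biopsy-proven case.
# Assumed (hypothetical) columns: race_ethnicity, pathology, ai_ds_output.
import pandas as pd
from scipy.stats import chi2_contingency

def stratified_bias_check(df: pd.DataFrame, alpha: float = 0.05) -> pd.DataFrame:
    """Within each pathology stratum, test whether the AI DS output
    distribution differs across race/ethnic groups (chi-square test)."""
    rows = []
    for pathology, stratum in df.groupby("pathology"):
        # Contingency table: race/ethnicity (rows) x AI DS output category (columns)
        table = pd.crosstab(stratum["race_ethnicity"], stratum["ai_ds_output"])
        chi2, p, dof, _ = chi2_contingency(table)
        rows.append({"pathology": pathology, "chi2": chi2, "dof": dof,
                     "p_value": p, "significant": p < alpha})
    return pd.DataFrame(rows)

# Toy example (fabricated counts, for illustration only):
cases = pd.DataFrame({
    "race_ethnicity": ["White", "Black", "Hispanic", "Asian"] * 50,
    "pathology":      ["benign"] * 120 + ["malignant"] * 80,
    "ai_ds_output":   (["low suspicion"] * 80 + ["high suspicion"] * 40
                       + ["low suspicion"] * 20 + ["high suspicion"] * 60),
})
print(stratified_bias_check(cases))
```

Stratifying by pathology before testing is what lets the analysis separate genuine differences in disease incidence between groups from bias in the model's output, which is the distinction the abstract's Results section draws.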
