Bias Perpetuates Bias: ChatGPT Learns Gender Inequities in Academic Surgery Promotions.
J Surg Educ; 81(11): 1553-1557, 2024 Nov.
Article in English | MEDLINE | ID: mdl-39232303
ABSTRACT
OBJECTIVE:
Gender inequities persist in academic surgery, with implicit bias impacting hiring and promotion at all levels. We hypothesized that creating letters of recommendation for both female and male candidates for academic promotion in surgery using an AI platform, ChatGPT, would elucidate the entrained gender biases already present in the promotion process.
DESIGN:
Using ChatGPT, we generated 6 letters of recommendation for "a phenomenal surgeon applying for job promotion to associate professor position", specifying "female" or "male" before surgeon in the prompt. We compared the 3 "female" letters to the 3 "male" letters for differences in length, language, and tone.
RESULTS:
The letters written for female candidates averaged 298 words, compared to 314 for male candidates. Female letters more frequently referred to "compassion", "empathy", and "inclusivity", whereas male letters referred to "respect", "reputation", and "skill".
CONCLUSIONS:
These findings highlight the gender bias present in promotion letters generated by ChatGPT, reiterating existing literature regarding real letters of recommendation in academic surgery. Our study suggests that surgeons should use AI tools, such as ChatGPT, with caution when writing letters of recommendation for academic surgery faculty promotion.
Full text: 1
Database: MEDLINE
Main subject: Faculty, Medical / Sexism
Language: En
Year of publication: 2024
Document type: Article