An in-depth evaluation of federated learning on biomedical natural language processing for information extraction.
Peng, Le; Luo, Gaoxiang; Zhou, Sicheng; Chen, Jiandong; Xu, Ziyue; Sun, Ju; Zhang, Rui.
Affiliation
  • Peng L; Department of Computer Science and Engineering, University of Minnesota, Minneapolis, MN, USA.
  • Luo G; Department of Computer and Information Science, University of Pennsylvania, Philadelphia, PA, USA.
  • Zhou S; Institute for Health Informatics, University of Minnesota, Minneapolis, MN, USA.
  • Chen J; Institute for Health Informatics, University of Minnesota, Minneapolis, MN, USA.
  • Xu Z; Nvidia Corporation, Santa Clara, CA, USA.
  • Sun J; Department of Computer Science and Engineering, University of Minnesota, Minneapolis, MN, USA. jusun@umn.edu.
  • Zhang R; Division of Computational Health Sciences, Department of Surgery, University of Minnesota, Minneapolis, MN, USA. zhan1386@umn.edu.
NPJ Digit Med; 7(1): 127, 2024 May 15.
Article in En | MEDLINE | ID: mdl-38750290
ABSTRACT
Language models (LMs) such as BERT and GPT have revolutionized natural language processing (NLP). However, the medical field faces challenges in training LMs due to limited data access and the privacy constraints imposed by regulations such as the Health Insurance Portability and Accountability Act (HIPAA) and the General Data Protection Regulation (GDPR). Federated learning (FL) offers a decentralized solution that enables collaborative learning while preserving data privacy. In this study, we evaluated FL on 2 biomedical NLP tasks spanning 8 corpora using 6 LMs. Our results show that (1) FL models consistently outperformed models trained on individual clients' data and sometimes performed comparably with models trained on pooled data; (2) with a fixed total amount of data, FL models trained across more clients produced inferior performance, although pre-trained transformer-based models exhibited great resilience; and (3) FL models significantly outperformed pre-trained LLMs with few-shot prompting.
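To make the collaborative-learning idea in the abstract concrete, the sketch below illustrates federated averaging (FedAvg), a common FL aggregation scheme: each client trains locally on its private shard and only model weights are shared and averaged by the server. The logistic-regression model, simulated data, and hyperparameters here are illustrative assumptions, not the study's actual setup, which fine-tuned pre-trained LMs on biomedical corpora.

# Minimal FedAvg sketch (assumed aggregation scheme, illustrative model and data).
import numpy as np

def local_update(weights, X, y, lr=0.1, epochs=5):
    """One client's local training: gradient descent on the logistic loss."""
    w = weights.copy()
    for _ in range(epochs):
        preds = 1.0 / (1.0 + np.exp(-X @ w))   # sigmoid predictions
        grad = X.T @ (preds - y) / len(y)      # logistic-loss gradient
        w -= lr * grad
    return w

def fedavg(client_data, dim, rounds=20):
    """Server loop: broadcast global weights, collect local updates, and average
    them weighted by each client's example count. Raw data never leaves a client."""
    global_w = np.zeros(dim)
    total = sum(len(y) for _, y in client_data)
    for _ in range(rounds):
        updates = [local_update(global_w, X, y) for X, y in client_data]
        global_w = sum(w * len(y) / total
                       for w, (_, y) in zip(updates, client_data))
    return global_w

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    true_w = rng.normal(size=5)
    # Simulate 4 clients, each holding a private shard of labeled examples.
    clients = []
    for _ in range(4):
        X = rng.normal(size=(200, 5))
        y = (X @ true_w + 0.1 * rng.normal(size=200) > 0).astype(float)
        clients.append((X, y))
    w = fedavg(clients, dim=5)
    print("cosine similarity to true weights:",
          float(true_w @ w / (np.linalg.norm(true_w) * np.linalg.norm(w))))

The weighted average mirrors the abstract's comparison points: training only on one client's shard corresponds to a single local_update loop, while pooled training would concatenate all shards before fitting; FedAvg sits between the two without centralizing the data.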

Full text: 1 | Collections: 01-international | Database: MEDLINE | Language: En | Journal: NPJ Digit Med | Publication year: 2024 | Document type: Article | Affiliation country: United States