Clinically relevant pretraining is all you need.
Bear Don't Walk IV, Oliver J; Sun, Tony; Perotte, Adler; Elhadad, Noémie.
Affiliation
  • Bear Don't Walk IV OJ; Department of Biomedical Informatics, Columbia University, New York, New York, USA.
  • Sun T; Department of Biomedical Informatics, Columbia University, New York, New York, USA.
  • Perotte A; Department of Biomedical Informatics, Columbia University, New York, New York, USA.
  • Elhadad N; Department of Biomedical Informatics, Columbia University, New York, New York, USA.
J Am Med Inform Assoc; 28(9): 1970-1976, 2021 Aug 13.
Article in English | MEDLINE | ID: mdl-34151966
ABSTRACT
Clinical notes present a wealth of information for applications in the clinical domain, but heterogeneity across clinical institutions and settings presents challenges for their processing. The clinical natural language processing field has made strides in overcoming domain heterogeneity, and pretrained deep learning models present opportunities to transfer knowledge from one task to another. Pretrained models have performed well when transferred to new tasks; however, it is not well understood whether these models generalize across differences in institutions and settings within the clinical domain. We explore whether institution- or setting-specific pretraining is necessary for pretrained models to perform well when transferred to new tasks. We find no significant performance difference between models pretrained across institutions and settings, indicating that clinically pretrained models transfer well across such boundaries. Given a clinically pretrained model, clinical natural language processing researchers may forgo the time-consuming pretraining step without a significant performance drop.
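To make the abstract's takeaway concrete, the following is a minimal sketch of reusing an existing clinically pretrained model rather than pretraining a new one per institution or setting. It assumes the Hugging Face transformers library and the publicly available Bio_ClinicalBERT checkpoint; both are illustrative choices for this record, not the models or tooling used in the paper itself.

from transformers import AutoTokenizer, AutoModelForSequenceClassification

# Illustrative checkpoint (an assumption): a publicly available clinically
# pretrained encoder; the paper does not prescribe this specific model.
MODEL_ID = "emilyalsentzer/Bio_ClinicalBERT"

tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
# Attach a fresh classification head for a hypothetical binary clinical task.
model = AutoModelForSequenceClassification.from_pretrained(MODEL_ID, num_labels=2)

# Fine-tune on the target institution's task data as usual; per the paper's
# finding, institution- or setting-specific pretraining can be skipped.
inputs = tokenizer("Patient presents with chest pain.", return_tensors="pt")
logits = model(**inputs).logits  # shape: (1, 2)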
Full text: 1 Collections: 01-international Database: MEDLINE Main subject: Deep Learning Limits: Humans Language: English Year of publication: 2021 Document type: Article
