Hierarchical Pretraining on Multimodal Electronic Health Records.
Wang, Xiaochen; Luo, Junyu; Wang, Jiaqi; Yin, Ziyi; Cui, Suhan; Zhong, Yuan; Wang, Yaqing; Ma, Fenglong.
Affiliations
  • Wang X; The Pennsylvania State University.
  • Luo J; The Pennsylvania State University.
  • Wang J; The Pennsylvania State University.
  • Yin Z; The Pennsylvania State University.
  • Cui S; The Pennsylvania State University.
  • Zhong Y; The Pennsylvania State University.
  • Wang Y; Google Research.
  • Ma F; The Pennsylvania State University.
Proc Conf Empir Methods Nat Lang Process; 2023: 2839-2852, 2023 Dec.
Article in English | MEDLINE | ID: mdl-38600913
ABSTRACT
Pretraining has proven to be a powerful technique in natural language processing (NLP), exhibiting remarkable success across a range of NLP downstream tasks. In the medical domain, however, existing pretrained models on electronic health records (EHR) fail to capture the hierarchical nature of EHR data, which limits the ability of a single pretrained model to generalize across diverse downstream tasks. To tackle this challenge, this paper introduces MedHMP, a novel, general, and unified pretraining framework specifically designed for hierarchically multimodal EHR data. The effectiveness of the proposed MedHMP is demonstrated through experimental results on eight downstream tasks spanning three levels. Comparisons against eighteen baselines further highlight the efficacy of our approach.
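To make the "hierarchical nature of EHR data" concrete, below is a minimal Python sketch of the kind of nested, multimodal record the abstract alludes to. The three levels (patient, admission, clinical stay) mirror the abstract's mention of tasks "spanning three levels"; all field names and modality choices here are illustrative assumptions, not the actual MedHMP schema.

```python
from dataclasses import dataclass, field
from typing import List

# Hypothetical hierarchy, assuming stay -> admission -> patient nesting.

@dataclass
class Stay:
    """Lowest level: a single clinical stay with fine-grained modalities."""
    vitals: List[float]            # time-series modality (monitored signals)
    notes: List[str]               # free-text modality (clinical notes)

@dataclass
class Admission:
    """Middle level: one hospital admission containing multiple stays."""
    diagnoses: List[str]           # coded modality (e.g., ICD codes)
    medications: List[str]         # coded modality (drug codes)
    stays: List[Stay] = field(default_factory=list)

@dataclass
class Patient:
    """Top level: the full longitudinal record across admissions."""
    admissions: List[Admission] = field(default_factory=list)
```

A pretrained model that flattens such a record into one sequence discards this nesting; a hierarchical pretraining objective can instead learn representations at each level, which is why a single pretrained model of this kind can serve stay-, admission-, and patient-level downstream tasks.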

Full text: 1 Database: MEDLINE Language: English Journal: Proc Conf Empir Methods Nat Lang Process Year: 2023 Document type: Article
