Multimodal Representation Learning via Maximization of Local Mutual Information.
Liao, Ruizhi; Moyer, Daniel; Cha, Miriam; Quigley, Keegan; Berkowitz, Seth; Horng, Steven; Golland, Polina; Wells, William M.
Affiliations
  • Liao R; CSAIL, Massachusetts Institute of Technology, Cambridge, MA, USA.
  • Moyer D; CSAIL, Massachusetts Institute of Technology, Cambridge, MA, USA.
  • Cha M; MIT Lincoln Laboratory, Lexington, MA, USA.
  • Quigley K; MIT Lincoln Laboratory, Lexington, MA, USA.
  • Berkowitz S; Beth Israel Deaconess Medical Center, Harvard Medical School, Boston, MA, USA.
  • Horng S; Beth Israel Deaconess Medical Center, Harvard Medical School, Boston, MA, USA.
  • Golland P; CSAIL, Massachusetts Institute of Technology, Cambridge, MA, USA.
  • Wells WM; CSAIL, Massachusetts Institute of Technology, Cambridge, MA, USA.
Article in English | MEDLINE | ID: mdl-36282980
ABSTRACT
We propose and demonstrate a representation learning approach that maximizes the mutual information between local features of images and text. The goal of this approach is to learn useful image representations by taking advantage of the rich information contained in the free text that describes the findings in the image. Our method trains image and text encoders by encouraging the resulting representations to exhibit high local mutual information. We make use of recent advances in mutual information estimation with neural network discriminators. We argue that the sum of local mutual information is typically a lower bound on the global mutual information. Our experimental results on downstream image classification tasks demonstrate the advantages of using local features for image-text representation learning. Our code is available at https://github.com/RayRuizhiLiao/mutual_info_img_txt.
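The abstract describes training image and text encoders so that their local features exhibit high mutual information, estimated with a neural network discriminator. The sketch below is not the authors' released implementation; it illustrates the idea with a learned critic that scores every (image patch, text token) pair and maximizes an InfoNCE-style lower bound on the mutual information between matched image-report pairs in a batch. The class and function names, projection sizes, and the averaging over local pairs are illustrative assumptions.

```python
# Minimal sketch (assumptions, not the authors' code): local mutual-information
# maximization between image patch features and report token features.
import torch
import torch.nn as nn
import torch.nn.functional as F


class LocalMICritic(nn.Module):
    """Projects local image and text features into a shared space and scores pairs."""

    def __init__(self, img_channels: int, txt_dim: int, emb_dim: int = 256):
        super().__init__()
        self.img_proj = nn.Conv2d(img_channels, emb_dim, kernel_size=1)  # per-patch projection
        self.txt_proj = nn.Linear(txt_dim, emb_dim)                      # per-token projection

    def forward(self, img_feats: torch.Tensor, txt_feats: torch.Tensor) -> torch.Tensor:
        # img_feats: (B, C, H, W) local CNN features; txt_feats: (B, T, D) token features.
        v = self.img_proj(img_feats).flatten(2)             # (B, E, H*W) patch embeddings
        t = self.txt_proj(txt_feats)                        # (B, T, E) token embeddings
        # Critic score for every (image, report) pair in the batch: average the
        # dot products over all (patch, token) combinations, i.e. a sum of local scores.
        pair_scores = torch.einsum('bep,ate->bapt', v, t)   # (B, B, H*W, T)
        return pair_scores.mean(dim=(2, 3))                 # (B, B)


def local_mi_nce_loss(scores: torch.Tensor, temperature: float = 0.1) -> torch.Tensor:
    """InfoNCE objective: matched image-report pairs lie on the diagonal.

    Minimizing this loss maximizes a lower bound on the mutual information
    between the two modalities, up to a log(batch_size) constant.
    """
    labels = torch.arange(scores.size(0), device=scores.device)
    return F.cross_entropy(scores / temperature, labels)


if __name__ == "__main__":
    # Toy example with random features standing in for encoder outputs.
    img_feats = torch.randn(4, 512, 7, 7)    # e.g. a final conv feature map
    txt_feats = torch.randn(4, 32, 768)      # e.g. transformer token embeddings
    critic = LocalMICritic(img_channels=512, txt_dim=768)
    loss = local_mi_nce_loss(critic(img_feats, txt_feats))
    loss.backward()                          # gradients flow into both projections
    print(f"local MI NCE loss: {loss.item():.4f}")
```

In a full training setup, the image and text encoders would be optimized jointly with the critic, so that lowering this loss raises the local mutual information between their representations.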
Full text: 1 Database: MEDLINE Language: English Year of publication: 2021 Document type: Article