Unsupervised graph-level representation learning with hierarchical contrasts.
Ju, Wei; Gu, Yiyang; Luo, Xiao; Wang, Yifan; Yuan, Haochen; Zhong, Huasong; Zhang, Ming.
Affiliation
  • Ju W; School of Computer Science, Peking University, Beijing, 100871, China.
  • Gu Y; School of Computer Science, Peking University, Beijing, 100871, China.
  • Luo X; Department of Computer Science, University of California, Los Angeles, 90095, USA. Electronic address: xiaoluo@cs.ucla.edu.
  • Wang Y; School of Computer Science, Peking University, Beijing, 100871, China.
  • Yuan H; School of Computer Science, Peking University, Beijing, 100871, China.
  • Zhong H; Alibaba Group, Hangzhou, 311100, China.
  • Zhang M; School of Computer Science, Peking University, Beijing, 100871, China. Electronic address: mzhang_cs@pku.edu.cn.
Neural Netw; 158: 359-368, 2023 Jan.
Article in English | MEDLINE | ID: mdl-36516542
ABSTRACT
Unsupervised graph-level representation learning has recently shown great potential in a variety of domains, ranging from bioinformatics to social networks. Many graph contrastive learning methods have been proposed to generate discriminative graph-level representations. They typically design multiple types of graph augmentations and enforce consistent representations of a graph across the different views. However, these techniques mostly neglect the intrinsic hierarchical structure of the graph, limiting the semantic information they can exploit for graph representation. Moreover, they often rely on a large number of negative samples to prevent collapse into trivial solutions, and this heavy demand for negative samples can cause memory issues during optimization in graph domains. To address these two issues, this paper develops an unsupervised graph-level representation learning framework named Hierarchical Graph Contrastive Learning (HGCL), which investigates the hierarchical structural semantics of a graph at both the node and graph levels. Specifically, HGCL consists of three parts, i.e., node-level contrastive learning, graph-level contrastive learning, and mutual contrastive learning, to capture graph semantics hierarchically. Furthermore, a Siamese network with momentum updates is employed to relax the demand for excessive negative samples. Finally, experimental results on both benchmark datasets for graph classification and large-scale OGB datasets for transfer learning demonstrate that the proposed HGCL significantly outperforms a broad range of state-of-the-art baselines.
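
The abstract names three contrastive objectives (node-level, graph-level, and mutual) combined with a Siamese network and momentum updates to avoid large negative-sample banks. As a rough illustration of how such pieces could fit together, below is a minimal PyTorch sketch. The MLP encoder (standing in for a GNN), the additive-noise "augmentation", the mean-pooling readout, the negative-sample-free BYOL-style loss, and all hyperparameters are assumptions for illustration only, not the paper's actual implementation.

import copy
import torch
import torch.nn as nn
import torch.nn.functional as F

class Encoder(nn.Module):
    # Stand-in node encoder: an MLP here; the real method would use a GNN.
    def __init__(self, in_dim, hid_dim):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(in_dim, hid_dim), nn.ReLU(),
                                 nn.Linear(hid_dim, hid_dim))

    def forward(self, x):
        return self.net(x)  # (num_nodes, hid_dim) node embeddings

def readout(node_emb):
    # Mean pooling: node embeddings -> one graph-level embedding.
    return node_emb.mean(dim=0, keepdim=True)

def contrast_loss(p, z):
    # Negative cosine similarity between prediction p and target z,
    # with a stop-gradient on the target: a negative-sample-free
    # objective consistent with the Siamese + momentum design.
    p = F.normalize(p, dim=-1)
    z = F.normalize(z.detach(), dim=-1)
    return 2 - 2 * (p * z).sum(dim=-1).mean()

# Online and target (momentum) branches of the Siamese network.
online = Encoder(in_dim=16, hid_dim=32)
predictor = nn.Linear(32, 32)
target = copy.deepcopy(online)
for p in target.parameters():
    p.requires_grad_(False)

opt = torch.optim.Adam(list(online.parameters()) + list(predictor.parameters()), lr=1e-3)

x = torch.randn(10, 16)  # toy node features for one graph
# Two "views"; the paper would use graph augmentations instead of noise.
view1 = x + 0.1 * torch.randn_like(x)
view2 = x + 0.1 * torch.randn_like(x)

h1, h2 = online(view1), target(view2)  # node embeddings per branch
g1, g2 = readout(h1), readout(h2)      # graph embeddings per branch

loss = (contrast_loss(predictor(h1), h2)                  # node-level contrast
        + contrast_loss(predictor(g1), g2)                # graph-level contrast
        + contrast_loss(predictor(h1), g2.expand_as(h1))) # mutual (node-graph) contrast

opt.zero_grad(); loss.backward(); opt.step()

# Momentum (EMA) update of the target branch; m = 0.99 is an assumed value.
m = 0.99
with torch.no_grad():
    for po, pt in zip(online.parameters(), target.parameters()):
        pt.mul_(m).add_((1 - m) * po)

Because the target branch is updated by exponential moving average and the loss uses stop-gradients rather than explicit negatives, memory usage does not grow with the number of contrastive pairs, which matches the motivation stated in the abstract.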
Full text: 1 Database: MEDLINE Main subject: Deep Learning / Learning Language: English Publication year: 2023 Document type: Article