Self-supervised Pre-training via Multi-view Graph Information Bottleneck for Molecular Property Prediction.
Article in English | MEDLINE | ID: mdl-38959149
ABSTRACT
Molecular representation learning has remarkably accelerated the development of drug analysis and discovery. It applies machine learning methods to encode molecules as embeddings for diverse downstream drug-related tasks. Because labeled molecular data are scarce, self-supervised molecular pre-training is promising: it can exploit large-scale unlabeled molecular data to promote representation learning. Although many universal graph pre-training methods have been successfully introduced into molecular learning, limitations remain. Many graph augmentation methods, such as atom deletion and bond perturbation, tend to destroy the intrinsic properties and connections of molecules. In addition, identifying the subgraphs that are important to specific chemical properties is challenging for molecular learning. To address these limitations, we propose the self-supervised Molecular Graph Information Bottleneck (MGIB) model for molecular pre-training. MGIB observes molecular graphs from the atom view and the motif view, deploys a learnable graph compression process to extract the core subgraphs, and extends the graph information bottleneck into the self-supervised molecular pre-training framework. Model analysis validates the contribution of the self-supervised graph information bottleneck and illustrates the interpretability of MGIB through the extracted subgraphs. Extensive experiments on molecular property prediction, covering 7 binary classification tasks and 6 regression tasks, demonstrate the effectiveness and superiority of the proposed MGIB.
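To make the kind of objective described above concrete, the following is a minimal, hypothetical sketch of a multi-view graph information bottleneck pre-training loss. It assumes PyTorch, an InfoNCE surrogate for the cross-view agreement term, and a KL-to-prior penalty on learnable node-keep probabilities as the compression term; the names (mgib_style_loss, z_atom, z_motif, keep_atom, keep_motif, beta) are illustrative and do not come from the paper, and this is not the authors' implementation.

```python
# Hypothetical sketch of a multi-view information-bottleneck pre-training loss.
# Not the paper's code; the InfoNCE and KL surrogates are assumptions.
import torch
import torch.nn.functional as F

def info_nce(z1, z2, temperature=0.1):
    """Lower-bound surrogate for mutual information between the two views."""
    z1, z2 = F.normalize(z1, dim=-1), F.normalize(z2, dim=-1)
    logits = z1 @ z2.t() / temperature               # pairwise view similarities
    labels = torch.arange(z1.size(0), device=z1.device)
    return F.cross_entropy(logits, labels)           # matched molecules are positives

def compression_penalty(node_keep_probs):
    """KL(Bernoulli(p) || Bernoulli(0.5)) pushes the learnable subgraph-selection
    probabilities away from the uninformative prior, a common surrogate for the
    'compress the input graph' term of an information bottleneck."""
    p = node_keep_probs.clamp(1e-6, 1 - 1e-6)
    return (p * (p / 0.5).log() + (1 - p) * ((1 - p) / 0.5).log()).mean()

def mgib_style_loss(z_atom, z_motif, keep_atom, keep_motif, beta=0.1):
    # Agreement between atom-view and motif-view subgraph embeddings ...
    agreement = info_nce(z_atom, z_motif)
    # ... plus a beta-weighted compression term for each view's selected subgraph.
    compress = compression_penalty(keep_atom) + compression_penalty(keep_motif)
    return agreement + beta * compress
```

In a full pre-training loop, a loss of this shape would be minimized over mini-batches of unlabeled molecules, with the keep probabilities produced by each view's learnable compression module.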

Full text: 1 | Database: MEDLINE | Language: English | Journal: IEEE J Biomed Health Inform | Publication year: 2024 | Document type: Article