A lightweight CNN-based knowledge graph embedding model with channel attention for link prediction.
Math Biosci Eng; 20(6): 9607-9624, 2023 Mar 21.
Article in English | MEDLINE | ID: mdl-37322903
Knowledge graph (KG) embedding maps the entities and relations of a KG into a low-dimensional continuous vector space while preserving their intrinsic semantic associations. One of the most important applications of knowledge graph embedding (KGE) is link prediction (LP), which aims to predict missing fact triples in the KG. A promising approach to improving KGE performance on LP is to increase the feature interactions between entities and relations so as to express richer semantics between them. Convolutional neural networks (CNNs) have therefore become one of the most popular KGE models owing to their strong expressive power and generalization ability. To further exploit the favorable features produced by these increased interactions, we propose a lightweight CNN-based KGE model called IntSE in this paper. Specifically, IntSE not only increases the feature interactions between the components of entity and relation embeddings with more efficient CNN components, but also incorporates a channel attention mechanism that adaptively recalibrates channel-wise feature responses by modeling the interdependencies between channels, enhancing useful features while suppressing useless ones to improve LP performance. Experimental results on public datasets confirm that IntSE outperforms state-of-the-art CNN-based KGE models for link prediction in KGs.
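The channel attention described in the abstract is commonly realized as a squeeze-and-excitation block that reweights the channels of a convolutional feature map. The following is a minimal sketch of that idea applied to stacked entity/relation embeddings; the class name, dimensions, reduction ratio, and reshaping are illustrative assumptions, not the authors' released implementation.

# Hedged sketch: SE-style channel attention over CNN feature maps built from
# entity/relation embeddings. All names and sizes are assumptions for illustration.
import torch
import torch.nn as nn

class ChannelAttention(nn.Module):
    """Recalibrates channel-wise responses of a convolutional feature map."""
    def __init__(self, channels: int, reduction: int = 4):
        super().__init__()
        self.pool = nn.AdaptiveAvgPool2d(1)  # squeeze: global average per channel
        self.fc = nn.Sequential(             # excitation: model channel interdependencies
            nn.Linear(channels, channels // reduction),
            nn.ReLU(inplace=True),
            nn.Linear(channels // reduction, channels),
            nn.Sigmoid(),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        b, c, _, _ = x.shape
        w = self.fc(self.pool(x).view(b, c)).view(b, c, 1, 1)
        return x * w  # scale useful channels up, suppress useless ones

if __name__ == "__main__":
    # Toy usage: stack an (entity, relation) embedding pair into a 2-row "image",
    # convolve for feature interactions, then reweight the resulting channels.
    emb_dim = 200
    e, r = torch.randn(8, emb_dim), torch.randn(8, emb_dim)
    x = torch.stack([e, r], dim=1).view(8, 1, 2, emb_dim)
    feats = nn.Conv2d(1, 32, kernel_size=3, padding=1)(x)
    out = ChannelAttention(32)(feats)
    print(out.shape)  # torch.Size([8, 32, 2, 200])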
Full text: 1
Database: MEDLINE
Main subject: Pattern Recognition, Automated / Neural Networks, Computer
Study type: Prognostic_studies / Risk_factors_studies
Language: En
Journal: Math Biosci Eng
Year: 2023
Document type: Article
Country of affiliation: China