1.
Article in English | MEDLINE | ID: mdl-37018088

ABSTRACT

Although Transformer has achieved success in language and vision tasks, its capacity for knowledge graph (KG) embedding has not been fully exploited. Using the self-attention (SA) mechanism in Transformer to model the subject-relation-object triples in KGs suffers from training inconsistency as SA is invariant to the order of input tokens. As a result, it is unable to distinguish a (real) relation triple from its shuffled (fake) variants (e.g., object-relation-subject) and, thus, fails to capture the correct semantics. To cope with this issue, we propose a novel Transformer architecture, namely, for KG embedding. It incorporates relational compositions in entity representations to explicitly inject semantics and capture the role of an entity based on its position (subject or object) in a relation triple. The relational composition for a subject (or object) entity of a relation triple refers to an operator on the relation and the object (or subject). We borrow ideas from the typical translational and semantic-matching embedding techniques to design relational compositions. We carefully design a residual block to integrate relational compositions into SA and efficiently propagate the composed relational semantics layer by layer. We formally prove that the SA with relational compositions is able to distinguish the entity roles in different positions and correctly capture relational semantics. Extensive experiments and analyses on six benchmark datasets show that achieves state-of-the-art performance on both link prediction and entity alignment.
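The order-invariance problem described in this abstract can be shown in a few lines. The sketch below is a minimal illustration, not the paper's architecture (whose name is elided above): plain single-head self-attention is permutation-equivariant, so an entity receives the same representation whether it sits in the subject or object slot; adding a hypothetical role-dependent, TransE-style translational composition (the projections Ws, Wo and the exact operators are assumptions for illustration) makes the two roles distinguishable.

```python
import numpy as np

rng = np.random.default_rng(0)
d = 4

def self_attention(X, Wq, Wk, Wv):
    # Single-head self-attention over a set of token embeddings with no
    # positional information, hence permutation-equivariant.
    scores = (X @ Wq) @ (X @ Wk).T / np.sqrt(d)
    weights = np.exp(scores)
    weights /= weights.sum(axis=1, keepdims=True)
    return weights @ (X @ Wv)

Wq, Wk, Wv = (rng.standard_normal((d, d)) for _ in range(3))
s, r, o = (rng.standard_normal(d) for _ in range(3))

out_real = self_attention(np.stack([s, r, o]), Wq, Wk, Wv)  # subject-relation-object
out_fake = self_attention(np.stack([o, r, s]), Wq, Wk, Wv)  # shuffled (fake) triple

# The entity s gets the same representation whether it appears in the
# subject slot (real) or the object slot (fake): SA cannot tell roles apart.
assert np.allclose(out_real[0], out_fake[2])

# Injecting role-dependent relational compositions (a hypothetical
# translational sketch: subject ~ object - relation, object ~ subject + relation,
# with illustrative role projections Ws, Wo) breaks the symmetry.
Ws, Wo = rng.standard_normal((d, d)), rng.standard_normal((d, d))

def compose(subj, rel, obj):
    return np.stack([Ws @ subj + (obj - rel),   # subject role composition
                     rel,
                     Wo @ obj + (subj + rel)])  # object role composition

out_real_rc = self_attention(compose(s, r, o), Wq, Wk, Wv)
out_fake_rc = self_attention(compose(o, r, s), Wq, Wk, Wv)
assert not np.allclose(out_real_rc[0], out_fake_rc[2])
```

The first assertion passing is exactly the training inconsistency the abstract describes; the second shows that once an entity's token encodes its role via a relational composition, the shuffled triple is no longer confusable with the real one.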

2.
Neural Netw; 20(7): 799-809, 2007 Sep.
Article in English | MEDLINE | ID: mdl-17714914

ABSTRACT

This paper addresses the robust stability of stochastic interval delayed additive neural networks (SIDANN) with Markovian switching, where the time delay is assumed to be time-varying. These networks combine the features of stochastic systems, interval systems, time-varying delay systems, and Markovian switching. A mathematical model of this class of neural networks is first proposed. Global exponential stability in the mean square is then studied for the SIDANN with Markovian switching: based on the Lyapunov method, several stability conditions are derived, each expressible in terms of linear matrix inequalities (LMIs). As a corollary, stochastic interval additive neural networks with time-varying delay are also discussed, and a sufficient condition for their stability is given. Finally, two simulation examples illustrate the effectiveness of the developed results.


Subject(s)
Markov Chains , Neural Networks, Computer , Stochastic Processes , Algorithms , Artificial Intelligence , Computer Simulation , Feedback , Fuzzy Logic , Information Storage and Retrieval , Linear Models , Mathematics , Neurons/physiology , Nonlinear Dynamics , Pattern Recognition, Automated , Systems Theory , Time Factors
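The Lyapunov/LMI route used in the second abstract can be illustrated in miniature. The sketch below drops the delays, interval uncertainty, and Markovian switching that the paper actually treats, and checks only the basic continuous-time Lyapunov inequality A'P + PA < 0 for a hypothetical stable linear system dx/dt = Ax; the matrix A is an assumption for illustration, not taken from the paper.

```python
import numpy as np
from scipy.linalg import solve_continuous_lyapunov

# Hypothetical stable system matrix (eigenvalues -2 and -3); illustrative only.
A = np.array([[-2.0, 1.0],
              [0.0, -3.0]])

# Solve the Lyapunov equation A^T P + P A = -I for P. For a linear system,
# exponential stability is equivalent to this LMI admitting a positive
# definite solution P (V(x) = x^T P x is then a Lyapunov function).
P = solve_continuous_lyapunov(A.T, -np.eye(2))

# P must be symmetric positive definite ...
assert np.all(np.linalg.eigvalsh((P + P.T) / 2) > 0)
# ... and must satisfy the Lyapunov equation it was solved from.
assert np.allclose(A.T @ P + P @ A, -np.eye(2))
```

The paper's conditions are richer (mode-dependent Lyapunov functions for the Markovian switching, delay terms handled inside the LMIs), but they are verified numerically in the same spirit: feasibility of a matrix inequality certifies mean-square exponential stability.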