Results 1 - 5 of 5
1.
PLoS Comput Biol ; 17(9): e1009037, 2021 09.
Article in English | MEDLINE | ID: mdl-34570773

ABSTRACT

Graph representations are traditionally used to represent protein structures in sequence design protocols in which the protein backbone conformation is known. This infrequently extends to machine learning projects: existing graph convolution algorithms have shortcomings when representing protein environments. One reason for this is the lack of emphasis on edge attributes during message-passing operations. Another reason is the traditionally shallow nature of graph neural network architectures. Here we introduce an improved message-passing operation that is better equipped to model local kinematics problems such as protein design. Our approach, XENet, pays special attention to both incoming and outgoing edge attributes. We compare XENet against existing graph convolutions in an attempt to decrease rotamer sample counts in Rosetta's rotamer substitution protocol, used for protein side-chain optimization and sequence design. This use case is motivating because it both reduces the size of the search space for classical side-chain optimization algorithms, and allows larger protein design problems to be solved with quantum algorithms on near-term quantum computers with limited qubit counts. XENet outperformed competing models while also displaying a greater tolerance for deeper architectures. We found that XENet was able to decrease rotamer counts by 40% without loss in quality. This decreased the memory consumption for classical pre-computation of rotamer energies in our use case by more than a factor of 3, the qubit consumption for an existing sequence design quantum algorithm by 40%, and the size of the solution space by a factor of 165. Additionally, XENet displayed an ability to handle deeper architectures than competing convolutions.
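The XENet layer itself is not reproduced here, but the idea of a message-passing step that conditions each message on both the incoming and the outgoing edge attributes can be sketched roughly as follows. This is a minimal NumPy illustration under simplifying assumptions, not the authors' implementation; the function and weight names (xenet_style_message_passing, W_msg, W_upd) are hypothetical and the shapes are chosen only for readability.

```python
import numpy as np

def relu(x):
    return np.maximum(x, 0.0)

def xenet_style_message_passing(X, E, W_msg, W_upd):
    """One edge-aware message-passing step in the spirit of XENet (sketch only).

    X : (N, F)    node features.
    E : (N, N, S) edge features; E[i, j] holds the attributes of the directed
                  edge i -> j (all zeros where no edge exists).

    For every node i, a message is built from each neighbour j using the node
    features of both endpoints together with the attributes of BOTH the
    incoming edge (j -> i) and the outgoing edge (i -> j); the messages are
    summed and passed through an update transformation.  W_msg has shape
    (2F + 2S, M) and W_upd has shape (F + M, F_out).
    """
    N, F = X.shape
    M = W_msg.shape[1]
    H = np.zeros((N, W_upd.shape[1]))
    for i in range(N):
        msgs = []
        for j in range(N):
            if i == j or (not E[i, j].any() and not E[j, i].any()):
                continue
            pair = np.concatenate([X[i], X[j], E[j, i], E[i, j]])
            msgs.append(relu(pair @ W_msg))
        agg = np.sum(msgs, axis=0) if msgs else np.zeros(M)
        H[i] = relu(np.concatenate([X[i], agg]) @ W_upd)
    return H
```

The sketch only shows a node update; it is meant to convey why distinguishing the two edge directions gives the layer more information about the local environment than node-feature-only aggregation.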


Subjects
Algorithms, Computer Graphics, Computer-Aided Design, Machine Learning, Proteins/chemistry, Computational Biology, Computers, Molecular Models, Neural Networks (Computer), Protein Conformation
2.
Article in English | MEDLINE | ID: mdl-35862329

ABSTRACT

Many recent works in the field of graph machine learning have introduced pooling operators to reduce the size of graphs. In this article, we present an operational framework to unify this vast and diverse literature by describing pooling operators as the combination of three functions: selection, reduction, and connection (SRC). We then introduce a taxonomy of pooling operators, based on some of their key characteristics and implementation differences under the SRC framework. Finally, we propose three criteria to evaluate the performance of pooling operators and use them to investigate the behavior of different operators on a variety of tasks.
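As a rough illustration of the select/reduce/connect decomposition described above, the skeleton below expresses a pooling operator as three interchangeable functions. This is a schematic NumPy sketch, not code from the article; the toy top-K selection used to instantiate it is hypothetical.

```python
import numpy as np

def src_pool(X, A, select, reduce_fn, connect_fn):
    """Generic pooling in select/reduce/connect (SRC) form.

    X : (N, F) node features, A : (N, N) adjacency matrix.
    select     : (X, A) -> S, a (K, N) selection/assignment matrix.
    reduce_fn  : (S, X) -> pooled node features (K, F').
    connect_fn : (S, A) -> pooled adjacency (K, K).
    """
    S = select(X, A)
    return reduce_fn(S, X), connect_fn(S, A)

def topk_by_norm(K):
    """Toy selection: keep the K nodes with the largest feature norm."""
    def select(X, A):
        idx = np.argsort(-np.linalg.norm(X, axis=1))[:K]
        S = np.zeros((K, X.shape[0]))
        S[np.arange(K), idx] = 1.0
        return S
    return select

# One possible instantiation: selected rows are copied over (reduce) and the
# pooled graph is the induced sub-adjacency S A S^T (connect).
X_pool, A_pool = src_pool(
    np.random.rand(10, 4), np.random.rand(10, 10),
    topk_by_norm(3), lambda S, X: S @ X, lambda S, A: S @ A @ S.T)
```

Different published pooling operators would plug different learned or fixed functions into the same three slots, which is what makes the taxonomy possible.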

3.
IEEE Trans Neural Netw Learn Syst ; 33(5): 2195-2207, 2022 May.
Article in English | MEDLINE | ID: mdl-33382662

ABSTRACT

In graph neural networks (GNNs), pooling operators compute local summaries of input graphs to capture their global properties, and they are fundamental for building deep GNNs that learn hierarchical representations. In this work, we propose Node Decimation Pooling (NDP), a pooling operator for GNNs that generates coarser graphs while preserving the overall graph topology. During training, the GNN learns new node representations and fits them to a pyramid of coarsened graphs, which is computed offline in a preprocessing stage. NDP consists of three steps. First, a node decimation procedure selects the nodes belonging to one side of the partition identified by a spectral algorithm that approximates the MAXCUT solution. Afterward, the selected nodes are connected with Kron reduction to form the coarsened graph. Finally, since the resulting graph is very dense, we apply a sparsification procedure that prunes the adjacency matrix of the coarsened graph to reduce the computational cost in the GNN. Notably, we show that it is possible to remove many edges without significantly altering the graph structure. Experimental results show that NDP is more efficient than state-of-the-art graph pooling operators while, at the same time, reaching competitive performance on a significant variety of graph classification tasks.
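A stripped-down version of the three steps described above (spectral MAXCUT-style decimation, Kron reduction, sparsification) might look like the following. It is a NumPy sketch under simplifying assumptions (connected, symmetric graph; a crude relative threshold for sparsification) and is not the authors' implementation.

```python
import numpy as np

def ndp_coarsen(A, eps=1e-2):
    """One coarsening level in the spirit of Node Decimation Pooling (sketch).

    A : (N, N) symmetric adjacency matrix of a connected graph.
    1. Spectral MAXCUT approximation: partition nodes by the sign of the
       Laplacian eigenvector associated with the largest eigenvalue and keep
       one side of the cut.
    2. Kron reduction: eliminate the dropped nodes from the Laplacian via a
       Schur complement, giving the Laplacian of the coarsened graph.
    3. Sparsification: prune edges whose weight falls below eps times the
       largest remaining weight (a crude stand-in for the paper's scheme).
    """
    L = np.diag(A.sum(1)) - A
    w, V = np.linalg.eigh(L)
    keep = V[:, -1] >= 0                      # one side of the approximate MAXCUT
    drop = ~keep
    L_kk = L[np.ix_(keep, keep)]
    L_kd = L[np.ix_(keep, drop)]
    L_dd = L[np.ix_(drop, drop)]
    L_new = L_kk - L_kd @ np.linalg.solve(L_dd, L_kd.T)   # Kron reduction
    A_new = np.diag(np.diag(L_new)) - L_new               # back to adjacency form
    A_new[A_new < eps * A_new.max()] = 0.0                # crude sparsification
    return keep, A_new
```

Because the pyramid of coarsened graphs depends only on the topology, it can be computed once in preprocessing, exactly as the abstract describes.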

4.
IEEE Trans Pattern Anal Mach Intell ; 44(7): 3496-3507, 2022 07.
Article in English | MEDLINE | ID: mdl-33497331

ABSTRACT

Popular graph neural networks implement convolution operations on graphs based on polynomial spectral filters. In this paper, we propose a novel graph convolutional layer inspired by the auto-regressive moving average (ARMA) filter that, compared to polynomial ones, provides a more flexible frequency response, is more robust to noise, and better captures the global graph structure. We propose a graph neural network implementation of the ARMA filter with a recursive and distributed formulation, obtaining a convolutional layer that is efficient to train, localized in the node space, and can be transferred to new graphs at test time. We perform a spectral analysis to study the filtering effect of the proposed ARMA layer and report experiments on four downstream tasks: semi-supervised node classification, graph signal classification, graph classification, and graph regression. Results show that the proposed ARMA layer brings significant improvements over graph neural networks based on polynomial filters.
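To make the recursive formulation concrete, here is a rough NumPy sketch of one ARMA-style layer: K parallel stacks each iterate a first-order recursion driven by the normalized adjacency plus a skip connection to the input, and their outputs are averaged. Weight handling, initialization, and the choice of nonlinearity are simplifying assumptions, not the published implementation.

```python
import numpy as np

def arma_like_conv(X, A, W, V, T=4):
    """Sketch of an ARMA-style graph convolution (illustrative only).

    Each of the K parallel stacks iterates
        Xbar <- tanh( Ahat @ Xbar @ W[k] + X @ V[k] )
    for T steps, where Ahat = D^{-1/2} A D^{-1/2} is the symmetrically
    normalized adjacency; the stack outputs are averaged.
    W[k] : (F_out, F_out), V[k] : (F_in, F_out).
    """
    d = A.sum(1)
    D_inv_sqrt = np.diag(1.0 / np.sqrt(np.maximum(d, 1e-12)))
    Ahat = D_inv_sqrt @ A @ D_inv_sqrt
    outs = []
    for Wk, Vk in zip(W, V):
        Xbar = X @ Vk                        # skip term also initializes the stack
        for _ in range(T):
            Xbar = np.tanh(Ahat @ Xbar @ Wk + X @ Vk)
        outs.append(Xbar)
    return np.mean(outs, axis=0)

# Example with K = 2 stacks, 5 nodes, 3 input and 4 output features.
rng = np.random.default_rng(0)
X = rng.normal(size=(5, 3))
A = (rng.random((5, 5)) > 0.5).astype(float); A = np.maximum(A, A.T)
W = [rng.normal(size=(4, 4)) * 0.1 for _ in range(2)]
V = [rng.normal(size=(3, 4)) * 0.1 for _ in range(2)]
out = arma_like_conv(X, A, W, V)             # shape (5, 4)
```

The recursion is what gives the layer a rational (rather than polynomial) frequency response while keeping every operation localized in the node space.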


Subjects
Algorithms, Neural Networks (Computer)
5.
IEEE Trans Neural Netw Learn Syst ; 31(6): 1856-1869, 2020 06.
Article in English | MEDLINE | ID: mdl-31380770

ABSTRACT

The space of graphs is often characterized by a nontrivial geometry, which complicates learning and inference in practical applications. A common approach is to use embedding techniques to represent graphs as points in a conventional Euclidean space, but non-Euclidean spaces have often been shown to be better suited for embedding graphs. Among these, constant-curvature Riemannian manifolds (CCMs) offer embedding spaces suitable for studying the statistical properties of a graph distribution, as they provide ways to easily compute metric geodesic distances. In this paper, we focus on the problem of detecting changes in stationarity in a stream of attributed graphs. To this end, we introduce a novel change detection framework based on neural networks and CCMs, which takes into account the non-Euclidean nature of graphs. Our contribution in this paper is twofold. First, via a novel approach based on adversarial learning, we compute graph embeddings by training an autoencoder to represent graphs on CCMs. Second, we introduce two novel change detection tests operating on CCMs. We perform experiments on synthetic data, as well as two real-world application scenarios: the detection of epileptic seizures using functional connectivity brain networks and the detection of hostility between two subjects, using human skeletal graphs. Results show that the proposed methods are able to detect even small changes in a graph-generating process, consistently outperforming approaches based on Euclidean embeddings.
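The statistical machinery of the paper is not reproduced here, but its basic ingredients, geodesic distances on a constant-curvature manifold and a sequential test applied to a stream of embeddings, can be illustrated with a toy example. The sketch below assumes spherical (positive-curvature) embeddings and uses a generic CUSUM-style statistic; it is not one of the two tests proposed by the authors, and the names and thresholds are illustrative.

```python
import numpy as np

def sphere_geodesic(x, y):
    """Geodesic distance between unit vectors on a hypersphere,
    a constant positive-curvature manifold."""
    return np.arccos(np.clip(np.dot(x, y), -1.0, 1.0))

def cusum_change_detection(embeddings, threshold=5.0, drift=0.1):
    """Toy change-detection test on a stream of manifold embeddings.

    Embeddings are assumed to lie on (or be normalized onto) the unit
    hypersphere, e.g. as produced by an autoencoder with normalized outputs.
    The statistic accumulates how much each new embedding's geodesic distance
    to a reference point exceeds the running average; an alarm is raised when
    the accumulated excess crosses `threshold`.
    """
    ref = embeddings[0] / np.linalg.norm(embeddings[0])
    dists, g = [], 0.0
    for t, e in enumerate(embeddings):
        e = e / np.linalg.norm(e)
        d = sphere_geodesic(ref, e)
        dists.append(d)
        g = max(0.0, g + (d - np.mean(dists)) - drift)
        if g > threshold:
            return t            # index at which a change is flagged
    return None                 # no change detected
```

The point of working on the manifold is that the distances fed to the test respect the curvature of the embedding space instead of treating the embeddings as ordinary Euclidean vectors.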
