3.
IEEE Trans Neural Netw; 8(6): 1351-8, 1997.
Article in English | MEDLINE | ID: mdl-18255737

ABSTRACT

In this paper we consider how to determine the structure of the high-order Boltzmann machine (HOBM), a stochastic recurrent network for approximating probability distributions. We obtain the structure of the HOBM, its hypergraph of connections, from the conditional independencies of the probability distribution to be modeled. We assume that an expert provides these conditional independencies, and from them we build independence maps, Markov and Bayesian networks, which represent conditional independencies through undirected graphs and directed acyclic graphs, respectively. From these independence maps we construct the HOBM hypergraph. The central aim of this paper is to obtain a minimal hypergraph. Since different orderings of the variables in general yield different Bayesian networks, we define their intersection hypergraph. We prove that the intersection hypergraph of all N! Bayesian networks of the distribution is contained in the hypergraph of the Markov network and is simpler, and we give a procedure to determine a subset of the Bayesian networks that satisfies this property. We also prove that the Markov network graph establishes a minimum connectivity for the hypergraphs derived from Bayesian networks.
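
A rough illustration of the kind of construction this abstract describes: the Python fragment below derives a candidate HOBM connection hypergraph from a Markov network by emitting one hyperedge per subset of each maximal clique. This is a hypothetical sketch of the general idea, not the paper's procedure: the clique-subset rule, the function names, and the use of networkx are all assumptions.

```python
# Hypothetical sketch: derive a candidate HOBM connection hypergraph from a
# Markov network (undirected independence map) by taking every subset of
# every maximal clique as a hyperedge. The clique-subset rule, the function
# names, and the use of networkx are illustrative assumptions.
from itertools import combinations

import networkx as nx

def hobm_hyperedges(markov_net: nx.Graph) -> set:
    """Return hyperedges (frozensets of variables) for the HOBM consensus."""
    edges = set()
    for clique in nx.find_cliques(markov_net):        # maximal cliques
        for k in range(1, len(clique) + 1):           # all orders up to |clique|
            for subset in combinations(sorted(clique), k):
                edges.add(frozenset(subset))
    return edges

# A chain X1 - X2 - X3 (X1 and X3 conditionally independent given X2)
# yields no third-order connection, only singletons and the two pair terms.
G = nx.Graph([("X1", "X2"), ("X2", "X3")])
print(sorted(tuple(sorted(e)) for e in hobm_hyperedges(G)))
```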

4.
Neural Netw; 9(9): 1561-1567, 1996 Dec.
Article in English | MEDLINE | ID: mdl-12662553

ABSTRACT

The high-order Boltzmann machine (HOBM) approximates probability distributions defined on a set of binary variables through a learning algorithm that uses Monte Carlo methods. The approximating distribution is a normalized exponential of a consensus function formed by high-degree terms, and the structure of the HOBM is given by the set of weighted connections. We prove the convexity of the Kullback-Leibler divergence between the distribution to be learned and the approximating distribution of the HOBM. We prove that the learning algorithm converges to the strict global minimum of the divergence, which corresponds to the maximum-likelihood estimate of the connection weights, thereby establishing the uniqueness of the solution. These theoretical results do not hold for the conventional Boltzmann machine, where the consensus function has first- and second-degree terms and hidden units are used. Copyright 1996 Elsevier Science Ltd.
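
For a visible-only HOBM, the gradient of the Kullback-Leibler divergence yields the familiar rule: each connection weight moves to match the data and model expectations of the corresponding product of units. The sketch below illustrates this under stated assumptions: it enumerates all 2^n states exactly instead of using the paper's Monte Carlo estimates, and the edge set, names, and learning rate are illustrative.

```python
# Minimal sketch of the learning rule implied by the KL-divergence gradient.
# Exact enumeration replaces Monte Carlo for clarity; toy sizes only.
from itertools import product
import math

def consensus(state, weights):
    # Consensus function: a weighted sum of products of units over hyperedges.
    return sum(w * math.prod(state[i] for i in edge)
               for edge, w in weights.items())

def model_dist(weights, n):
    # Approximating distribution: normalized exponential of the consensus.
    states = list(product([0, 1], repeat=n))
    unnorm = [math.exp(consensus(s, weights)) for s in states]
    Z = sum(unnorm)
    return {s: u / Z for s, u in zip(states, unnorm)}

def learn(target, weights, n, lr=0.5, steps=500):
    # Gradient descent on the KL divergence (ascent on log-likelihood);
    # convexity makes the reached minimum global and the weights unique.
    for _ in range(steps):
        model = model_dist(weights, n)
        for edge in list(weights):
            data_exp = sum(p * math.prod(s[i] for i in edge)
                           for s, p in target.items())
            model_exp = sum(p * math.prod(s[i] for i in edge)
                            for s, p in model.items())
            weights[edge] += lr * (data_exp - model_exp)
    return weights

# Toy usage: a 3-bit parity distribution, which needs the third-order term.
n = 3
target = {s: 0.2 if (s[0] ^ s[1] ^ s[2]) else 0.05
          for s in product([0, 1], repeat=n)}
edges = [(0,), (1,), (2,), (0, 1), (0, 2), (1, 2), (0, 1, 2)]
weights = learn(target, {e: 0.0 for e in edges}, n)
```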

5.
IEEE Trans Neural Netw; 6(3): 767-70, 1995.
Article in English | MEDLINE | ID: mdl-18263362

ABSTRACT

In this paper we give a formal definition of the high-order Boltzmann machine (BM) and extend the well-known results on the convergence of the learning algorithm of the second-order BM. From the Bahadur-Lazarsfeld expansion we characterize the probability distribution learned by the high-order BM. Likewise, a criterion is given to establish the topology of the BM based on the significant correlations of the particular probability distribution to be learned.
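
One plausible reading of the topology criterion is to estimate the Bahadur-Lazarsfeld correlation coefficient of each subset of standardized binary variables from data and keep a connection only where the coefficient is significant. The sketch below implements that reading; the threshold, the significance rule, and all names are assumptions, not the paper's exact criterion.

```python
# Hedged sketch: choose BM connections from "significant" empirical
# Bahadur-Lazarsfeld correlation coefficients. Assumes no variable is
# constant in the sample (p of 0 or 1 would break the standardization).
from itertools import combinations

import numpy as np

def bahadur_correlations(X: np.ndarray, max_order: int) -> dict:
    """X: (samples, variables) 0/1 array. Returns {index tuple: r}."""
    p = X.mean(axis=0)
    Z = (X - p) / np.sqrt(p * (1 - p))            # standardized binary variables
    corr = {}
    for k in range(2, max_order + 1):
        for idx in combinations(range(X.shape[1]), k):
            corr[idx] = Z[:, list(idx)].prod(axis=1).mean()
    return corr

def select_topology(X: np.ndarray, max_order: int = 3,
                    threshold: float = 0.2) -> list:
    # Keep a hyperedge wherever the empirical coefficient clears the
    # (arbitrary, illustrative) threshold.
    return [I for I, r in bahadur_correlations(X, max_order).items()
            if abs(r) > threshold]

# XOR example: x3 = x1 XOR x2 has no pairwise correlation at all, so only
# the third-order connection (0, 1, 2) should survive.
rng = np.random.default_rng(0)
x1 = rng.integers(0, 2, 1000)
x2 = rng.integers(0, 2, 1000)
X = np.column_stack([x1, x2, x1 ^ x2])
print(select_topology(X))                         # expect [(0, 1, 2)]
```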
