Results 1 - 9 of 9
1.
Neural Comput; 35(5): 930-957, 2023 Apr 18.
Article in English | MEDLINE | ID: mdl-36944235

ABSTRACT

Hebb's learning traces its origin to Pavlov's classical conditioning; however, while the former has been extensively modeled in the past decades (e.g., by the Hopfield model and countless variations on the theme), modeling of the latter has remained largely unaddressed so far. Furthermore, a mathematical bridge connecting these two pillars is totally lacking. The main difficulty toward this goal lies in the intrinsically different scales of the information involved: Pavlov's theory is about correlations between concepts that are (dynamically) stored in the synaptic matrix, as exemplified by the celebrated experiment starring a dog and a ringing bell; conversely, Hebb's theory is about correlations between pairs of neurons, as summarized by the famous statement that neurons that fire together wire together. In this letter, we rely on stochastic process theory to prove that, as long as neurons' and synapses' timescales are kept widely separated, Pavlov's mechanism spontaneously takes place and ultimately gives rise to synaptic weights that recover the Hebbian kernel.
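A minimal toy simulation of the timescale-separation mechanism described above (the variable names, noise level, and relaxation rule below are illustrative assumptions, not the letter's actual equations): fast, noisy neural responses to presented concepts drive a slowly relaxing synaptic matrix, which ends up close to the Hebbian kernel built from those concepts.

```python
import numpy as np

rng = np.random.default_rng(0)
N, P = 50, 3                              # neurons and stored "concepts"
xi = rng.choice([-1, 1], size=(P, N))     # concepts, e.g. "bell" and "food"

tau_synapse = 1e3                         # synapses much slower than neurons
dt, T = 0.1, 20_000
J = np.zeros((N, N))                      # synaptic matrix, initially empty

for _ in range(T):
    mu = rng.integers(P)                                    # present one concept
    sigma = xi[mu] * np.sign(rng.random(N) - 0.05)          # fast, noisy neural response
    J += (dt / tau_synapse) * (np.outer(sigma, sigma) - J)  # slow drift toward neural correlations

J_hebb = xi.T @ xi / P                    # Hebbian kernel built from the concepts
print("correlation with Hebbian kernel:",
      np.corrcoef(J.ravel(), J_hebb.ravel())[0, 1])
```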

2.
Proc Natl Acad Sci U S A; 120(11): e2122352120, 2023 Mar 14.
Article in English | MEDLINE | ID: mdl-36897966

ABSTRACT

A crucial challenge in medicine is choosing which drug (or combination) will be the most advantageous for a particular patient. Usually, drug response rates differ substantially, and the reasons for this unpredictability remain unclear. Consequently, it is essential to identify the features that contribute to the observed drug response variability. Pancreatic cancer is one of the deadliest cancers, with limited therapeutic achievements due to the massive presence of stroma, which generates an environment that enables tumor growth, metastasis, and drug resistance. To understand the cancer-stroma cross talk within the tumor microenvironment and to develop personalized adjuvant therapies, there is a need for effective approaches that offer measurable data to monitor the effect of drugs at the single-cell level. Here, we develop a computational approach, based on cell imaging, that quantifies the cellular cross talk between pancreatic tumor cells (L3.6pl or AsPC1) and pancreatic stellate cells (PSCs), coordinating their kinetics in the presence of the chemotherapeutic agent gemcitabine. We report significant heterogeneity in the organization of cellular interactions in response to the drug. For L3.6pl cells, gemcitabine markedly decreases stroma-stroma interactions but increases stroma-cancer interactions, overall enhancing motility and crowding. In the AsPC1 case, gemcitabine promotes the interactions among tumor cells, but it does not affect the stroma-cancer interplay, possibly suggesting a milder effect of the drug on cell dynamics.
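As a hedged illustration of how such interaction counts can be quantified from segmented cell positions (the contact radius, function names, and nearest-neighbour criterion are assumptions for this sketch, not the pipeline actually used in the paper):

```python
import numpy as np
from scipy.spatial import cKDTree

def interaction_counts(tumor_xy, stroma_xy, radius=30.0):
    """Count cell-cell contacts within `radius` (e.g. micrometres) in one imaging
    frame; the radius and the contact definition are illustrative assumptions."""
    t_tree, s_tree = cKDTree(tumor_xy), cKDTree(stroma_xy)
    tt = t_tree.count_neighbors(t_tree, radius) - len(tumor_xy)   # tumor-tumor, minus self-pairs
    ss = s_tree.count_neighbors(s_tree, radius) - len(stroma_xy)  # stroma-stroma, minus self-pairs
    ts = t_tree.count_neighbors(s_tree, radius)                   # tumor-stroma cross talk
    return {"tumor-tumor": tt // 2, "stroma-stroma": ss // 2, "tumor-stroma": ts}

# toy frame: random positions for 200 tumor cells and 300 stellate cells
rng = np.random.default_rng(1)
print(interaction_counts(rng.uniform(0, 1000, (200, 2)),
                         rng.uniform(0, 1000, (300, 2))))
```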


Subjects
Pancreatic Ductal Carcinoma, Pancreatic Neoplasms, Humans, Pancreatic Ductal Carcinoma/pathology, Pancreatic Neoplasms/pathology, Gemcitabine, Cell Communication, Tumor Cell Line, Tumor Microenvironment
3.
Article in English | MEDLINE | ID: mdl-35724278

ABSTRACT

Inspired by a formal equivalence between the Hopfield model and restricted Boltzmann machines (RBMs), we design a Boltzmann machine, referred to as the dreaming Boltzmann machine (DBM), which achieves better performance than the standard one. The novelty of our model lies in a precise prescription for intralayer connections among hidden neurons, whose strengths depend on feature correlations. We analyze learning and retrieval capabilities in DBMs, both theoretically and numerically, and compare them to the RBM reference. We find that, in a supervised scenario, the former significantly outperforms the latter. Furthermore, in the unsupervised case, the DBM achieves better performance both in feature extraction and in representation learning, especially when the network is properly pretrained. Finally, we compare both models on simple classification tasks and find that the DBM again outperforms the RBM reference.
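A loose sketch of the general idea of correlation-dependent intralayer couplings (this is an illustrative guess built from correlations between feature vectors; it is not the precise prescription derived in the paper):

```python
import numpy as np

def dbm_hidden_couplings(W, scale=1.0):
    """Illustrative intralayer couplings for hidden units, proportional to the
    correlation between their feature vectors (the columns of the visible-to-
    hidden weight matrix W). Assumption for this sketch, not the paper's rule."""
    N_v, _ = W.shape
    C = W.T @ W / N_v              # feature-feature correlation matrix (N_h x N_h)
    np.fill_diagonal(C, 0.0)       # no self-couplings
    return scale * C

# example: random features for a 100-visible / 10-hidden machine
rng = np.random.default_rng(2)
W = rng.normal(0.0, 1.0 / np.sqrt(100), size=(100, 10))
D = dbm_hidden_couplings(W)
print(D.shape, np.allclose(D, D.T))
```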

4.
Sci Rep; 10(1): 15353, 2020 Sep 18.
Article in English | MEDLINE | ID: mdl-32948805

ABSTRACT

In this work we apply statistical mechanics tools to infer cardiac pathologies over a sample of M patients whose heart rate variability has been recorded via a 24 h Holter device and who are divided into different classes according to their clinical status (providing a repository of labelled data). Considering the set of inter-beat interval sequences [Formula: see text], with [Formula: see text], we estimate their probability distribution [Formula: see text] exploiting the maximum entropy principle. By setting constraints on the first and on the second moment we obtain an effective pairwise [Formula: see text] model, whose parameters are shown to depend on the clinical status of the patient. To check this framework, we generate synthetic data from our model and show that their distribution is in excellent agreement with the one obtained from experimental data. Further, our model can be related to a one-dimensional spin glass with quenched long-range couplings decaying with the spin-spin distance as a power law. This allows us to speculate that the 1/f noise typical of heart-rate variability may stem from the interplay between the parasympathetic and orthosympathetic systems.
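For reference, a maximum-entropy estimate constrained to reproduce the first and second moments always takes a pairwise, Boltzmann-like form; in generic notation (the symbols below are not necessarily those used in the paper) it reads:

```latex
P(r_1,\dots,r_L) \;=\; \frac{1}{Z}\,
\exp\!\Big(\sum_{i} h_i\, r_i \;+\; \sum_{i<j} J_{ij}\, r_i r_j\Big),
```

where the fields h_i and couplings J_{ij} are Lagrange multipliers fixed by the measured first and second moments, and Z is the normalization (a sum or an integral over the inter-beat intervals, depending on how they are discretized).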


Subjects
Heart Rate/physiology, Cardiovascular Models, Electrocardiography, Entropy, Humans, Statistical Models
5.
Sci Rep; 10(1): 8845, 2020 Jun 01.
Article in English | MEDLINE | ID: mdl-32483156

ABSTRACT

In this paper we develop statistical algorithms to infer possible cardiac pathologies, based on data collected from 24 h Holter recordings over a sample of 2829 labelled patients; labels indicate whether a patient suffers from cardiac pathologies. In the first part of the work we analyze statistically the heart-beat series associated to each patient and process them to obtain a coarse-grained description of heart variability in terms of 49 markers well established in the reference community. These markers are then used as inputs for a multi-layer feed-forward neural network that we train to classify patients. Before training the network, however, preliminary operations are in order to check the effective number of markers (via principal component analysis) and to achieve data augmentation (because of the broadness of the input data). With such groundwork, we finally train the network and show that it can distinguish with high accuracy (up to ~85% successful identifications) healthy patients from those displaying atrial fibrillation or congestive heart failure. In the second part of the work, we again start from raw data and obtain a classification of pathologies in terms of their related networks: patients are associated to nodes and links are drawn according to a similarity measure between the related heart-beat series. We study the emergent properties of these networks, looking for features (e.g., degree, clustering, clique proliferation) able to robustly discriminate between networks built over healthy patients and those built over patients suffering from cardiac pathologies. Overall, we find very good agreement between the two routes.
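A minimal sketch of the first route (dimensionality check via PCA followed by a feed-forward classifier), using synthetic stand-in data; the hyperparameters, layer sizes, and preprocessing choices below are illustrative assumptions rather than the pipeline actually used in the paper:

```python
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.decomposition import PCA
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import cross_val_score

# X: one row per patient with 49 heart-rate-variability markers;
# y: 0 = healthy, 1 = atrial fibrillation, 2 = congestive heart failure.
# Synthetic placeholder data, not the clinical dataset used in the paper.
rng = np.random.default_rng(3)
X = rng.normal(size=(600, 49))
y = rng.integers(0, 3, size=600)

clf = make_pipeline(
    StandardScaler(),
    PCA(n_components=0.95),          # keep components explaining 95% of the variance
    MLPClassifier(hidden_layer_sizes=(32, 16), max_iter=2000, random_state=0),
)
print("cross-validated accuracy:", cross_val_score(clf, X, y, cv=5).mean())
```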


Subjects
Atrial Fibrillation/pathology, Biomarkers/metabolism, Heart Failure/pathology, Heart Rate/physiology, Machine Learning, Atrial Fibrillation/diagnosis, Cluster Analysis, Factual Databases, Heart Failure/diagnosis, Humans, Principal Component Analysis
6.
Neural Netw; 128: 254-267, 2020 Aug.
Article in English | MEDLINE | ID: mdl-32454370

ABSTRACT

In this work we develop analytical techniques to investigate a broad class of associative neural networks set in the high-storage regime. These techniques translate the original statistical-mechanical problem into an analytical-mechanical one which implies solving a set of partial differential equations, rather than tackling the canonical probabilistic route. We test the method on the classical Hopfield model - where the cost function includes only two-body interactions (i.e., quadratic terms) - and on the "relativistic" Hopfield model - where the (expansion of the) cost function includes p-body (i.e., of degree p) contributions. Under the replica-symmetric assumption, we draw the phase diagrams of these models by obtaining the explicit expression of their free energy as a function of the model parameters (i.e., noise level and memory storage). Further, since for non-pairwise models ergodicity breaking is not necessarily a critical phenomenon, we develop a fluctuation analysis and find that criticality is preserved in the relativistic model.
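For orientation, the two-body cost function referred to here is the standard Hopfield one; in the usual notation (σ_i = ±1 neurons, ξ^µ stored patterns, m_µ Mattis overlaps) it reads:

```latex
H_N(\sigma\,|\,\xi) \;=\; -\frac{1}{2N}\sum_{\mu=1}^{P}\sum_{i,j=1}^{N}\xi_i^{\mu}\xi_j^{\mu}\,\sigma_i\sigma_j
\;=\; -\frac{N}{2}\sum_{\mu=1}^{P} m_{\mu}^{2},
\qquad
m_{\mu}=\frac{1}{N}\sum_{i=1}^{N}\xi_i^{\mu}\sigma_i .
```

The "relativistic" cost function replaces the quadratic dependence on the Mattis overlaps with a non-polynomial one whose expansion generates the p-body terms mentioned above (its explicit form is recalled under entry 9 below).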


Subjects
Neural Networks (Computer)
7.
Phys Rev Lett; 124(2): 028301, 2020 Jan 17.
Article in English | MEDLINE | ID: mdl-32004010

ABSTRACT

We consider a three-layer Sejnowski machine and show that features learnt via contrastive divergence have a dual representation as patterns in a dense associative memory of order P=4. The latter is known to be able to store, in a Hebbian fashion, a number of patterns scaling as N^{P-1}, where N denotes the number of constituting binary neurons interacting P-wise. We also prove that, by keeping the dense associative network far from the saturation regime (namely, allowing for a number of patterns scaling only linearly with N, while P>2), such a system is able to perform pattern recognition far below the standard signal-to-noise threshold. In particular, a network with P=4 is able to retrieve information whose intensity is O(1) even in the presence of noise of order O(√N) in the large-N limit. This striking skill stems from a redundant representation of patterns, which is afforded by the (relatively) low-load information storage, and it helps explain the impressive abilities in pattern recognition exhibited by new-generation neural networks. The whole theory is developed rigorously, at the replica-symmetric level of approximation, and corroborated by signal-to-noise analysis and Monte Carlo simulations.
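One standard way to write the energy of a dense (P-body) associative memory of this kind, with K stored patterns, is sketched below; normalization conventions vary across papers, so the prefactor here is an assumption:

```latex
H_N(\sigma\,|\,\xi) \;=\; -\frac{1}{P!\,N^{P-1}}\sum_{\mu=1}^{K}\Big(\sum_{i=1}^{N}\xi_i^{\mu}\sigma_i\Big)^{P},
```

With this scaling, Hebbian storage supports up to K ~ N^{P-1} patterns, while the regime considered here keeps K linear in N, leaving the resulting redundancy available for recognition under strong noise.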

8.
Neural Netw; 112: 24-40, 2019 Apr.
Article in English | MEDLINE | ID: mdl-30735914

ABSTRACT

The standard Hopfield model for associative neural networks accounts for biological Hebbian learning and acts as the harmonic oscillator of pattern recognition; however, its maximal storage capacity is α∼0.14, far from the theoretical bound for symmetric networks, i.e. α=1. Inspired by sleeping and dreaming mechanisms in mammal brains, we propose an extension of this model displaying the standard on-line (awake) learning mechanism (which allows the storage of external information in terms of patterns) and an off-line (sleep) unlearning-and-consolidating mechanism (which allows spurious-pattern removal and pure-pattern reinforcement): the resulting daily prescription is able to saturate the theoretical bound α=1 while also remaining extremely robust against thermal noise. The emergent neural and synaptic features are analyzed both analytically and numerically. In particular, beyond obtaining a phase diagram for the neural dynamics, we focus on synaptic plasticity and give explicit prescriptions for the temporal evolution of the synaptic matrix. We analytically prove that our algorithm makes the Hebbian kernel converge with high probability to the projection matrix built over the pure stored patterns. Furthermore, we obtain a sharp and explicit estimate for the "sleep rate" needed to ensure such a convergence. Finally, we run extensive numerical simulations (mainly Monte Carlo sampling) to check the approximations underlying the analytical investigations (e.g., the whole theory is developed at the so-called replica-symmetric level, as standard in the Amit-Gutfreund-Sompolinsky reference framework) and possible finite-size effects, finding overall full agreement with the theory.
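A hedged numerical illustration of the convergence target stated above, not of the sleep algorithm itself: it compares the standard Hebbian kernel with the projection matrix built over the stored patterns, showing why the latter stabilizes the pure patterns perfectly at finite load while the former does not (the network size and load are arbitrary choices for this sketch).

```python
import numpy as np

rng = np.random.default_rng(4)
N, P = 200, 40                          # neurons and stored patterns (load alpha = 0.2)
xi = rng.choice([-1.0, 1.0], size=(N, P))

J_hebb = xi @ xi.T / N                  # standard Hebbian kernel (awake learning only)
# Projection matrix onto the span of the stored patterns: the kernel that the
# sleep (unlearning-and-consolidating) dynamics is proven to approach.
J_proj = xi @ np.linalg.inv(xi.T @ xi) @ xi.T

# Fraction of local fields aligned with the corresponding pattern entries.
stab_hebb = np.mean(np.sign(J_hebb @ xi) == np.sign(xi))
stab_proj = np.mean(np.sign(J_proj @ xi) == np.sign(xi))
print(f"aligned local fields - Hebb: {stab_hebb:.3f}, projection: {stab_proj:.3f}")
```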


Subjects
Memory/physiology, Neural Networks (Computer), Psychological Reinforcement, Algorithms, Monte Carlo Method, Neuronal Plasticity/physiology
9.
Neural Netw; 106: 205-222, 2018 Oct.
Article in English | MEDLINE | ID: mdl-30081347

ABSTRACT

We propose a modification of the cost function of the Hopfield model whose salient features shine in its Taylor expansion and result in more-than-pairwise interactions with alternating signs, suggesting a unified framework for handling both deep learning and network pruning. In our analysis, we rely heavily on the Hamilton-Jacobi correspondence relating the statistical model to a mechanical system. In this picture, our model is nothing but the relativistic extension of the original Hopfield model (whose cost function is a quadratic form in the Mattis magnetization and plays the role of the non-relativistic counterpart, i.e., the classical limit). We focus on the low-storage regime and solve the model analytically by taking advantage of the mechanical analogy, thus obtaining a complete characterization of the free energy and the associated self-consistency equations in the thermodynamic limit. Further, on the numerical side, we test the performance of our proposal with extensive Monte Carlo simulations, showing that the stability of spurious states (which limits the capabilities of the standard Hebbian construction) is markedly reduced due to the presence of unlearning contributions that prune them massively.
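In the single-pattern (low-storage) notation, with m the Mattis magnetization, one common way to write the two cost functions, up to overall constants and normalizations (which are assumptions here), is:

```latex
H^{\mathrm{classical}}(m)\;\propto\;-\frac{m^{2}}{2},
\qquad
H^{\mathrm{relativistic}}(m)\;\propto\;-\sqrt{1+m^{2}}
\;=\;-\Big(1+\frac{m^{2}}{2}-\frac{m^{4}}{8}+\frac{m^{6}}{16}-\dots\Big),
```

so that the Taylor expansion of the relativistic form reproduces the quadratic Hopfield term at leading order and adds higher-order interactions with alternating signs, the negative ones acting as the unlearning (pruning) contributions mentioned above.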


Subjects
Monte Carlo Method, Neural Networks (Computer), Algorithms, Statistical Models, Thermodynamics