Results 1 - 20 of 30
1.
Phys Rev E ; 108(2-1): 024401, 2023 Aug.
Article in English | MEDLINE | ID: mdl-37723769

ABSTRACT

Eukaryotic cells maintain their inner order by a hectic process of sorting and distillation of molecular factors taking place on their lipid membranes. A similar sorting process is implied in the assembly and budding of enveloped viruses. To understand the properties of this molecular sorting process, we have recently proposed a physical model [Zamparo et al., Phys. Rev. Lett. 126, 088101 (2021); doi:10.1103/PhysRevLett.126.088101], based on (1) the phase separation of a single, initially dispersed molecular species into spatially localized sorting domains on the lipid membrane and (2) domain-induced membrane bending leading to the nucleation of submicrometric lipid vesicles, naturally enriched in the molecules of the engulfed sorting domain. The analysis of the model showed the existence of an optimal region of parameter space where sorting is most efficient. Here the model is extended to account for the simultaneous distillation of a pool of distinct molecular species. We find that the mean time spent by sorted molecules on the membrane increases with the heterogeneity of the pool (i.e., the number of distinct molecular species sorted) according to a simple scaling law, and that a large number of distinct molecular species can in principle be sorted in parallel on cell membranes without significantly interfering with each other. Moreover, sorting is found to be most efficient when the distinct molecular species have comparable homotypic affinities. We also consider how valence (i.e., the average number of interacting neighbors of a molecule in a sorting domain) affects the sorting process, finding that higher-valence molecules can be sorted with greater efficiency than lower-valence molecules.


Subject(s)
Lipids , Cell Membrane , Cell Division , Cell Movement
2.
Sci Rep ; 13(1): 7350, 2023 May 05.
Article in English | MEDLINE | ID: mdl-37147382

ABSTRACT

Estimating observables from conditioned dynamics is typically computationally hard. While obtaining independent samples efficiently from unconditioned dynamics is usually feasible, most of them do not satisfy the imposed conditions and must be discarded. On the other hand, conditioning breaks the causal properties of the dynamics, which ultimately renders the sampling of the conditioned dynamics non-trivial and inefficient. In this work, a Causal Variational Approach is proposed as an approximate method to generate independent samples from a conditioned distribution. The procedure relies on learning the parameters of a generalized dynamical model that optimally describes the conditioned distribution in a variational sense. The outcome is an effective and unconditioned dynamical model from which one can trivially obtain independent samples, effectively restoring the causality of the conditioned dynamics. The consequences are twofold: the method allows one to efficiently compute observables from the conditioned dynamics by averaging over independent samples; moreover, it provides an effective unconditioned distribution that is easy to interpret. This approximation can be applied to virtually any dynamics. The application of the method to epidemic inference is discussed in detail. The results of direct comparison with state-of-the-art inference methods, including the soft-margin approach and mean-field methods, are promising.
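
The abstract above notes that naively sampling a conditioned dynamics by drawing unconditioned trajectories and discarding the incompatible ones becomes inefficient as the condition grows atypical. A minimal sketch of that rejection-sampling baseline (not of the Causal Variational Approach itself) is given below for a random walk conditioned on its endpoint; all function and parameter names are invented for illustration.

    import random

    def random_walk(n_steps, p_up=0.5):
        """Sample one unconditioned +/-1 random-walk trajectory."""
        x, traj = 0, [0]
        for _ in range(n_steps):
            x += 1 if random.random() < p_up else -1
            traj.append(x)
        return traj

    def rejection_sample(n_steps, target, n_attempts=100000):
        """Inefficient baseline: keep only trajectories satisfying x(T) == target."""
        accepted = []
        for _ in range(n_attempts):
            traj = random_walk(n_steps)
            if traj[-1] == target:
                accepted.append(traj)
        return accepted

    if __name__ == "__main__":
        n_attempts = 100000
        samples = rejection_sample(n_steps=20, target=10, n_attempts=n_attempts)
        print("acceptance rate for an atypical endpoint:", len(samples) / n_attempts)

The acceptance rate printed at the end is exactly the quantity that collapses for rare conditions, which is what motivates learning an effective unconditioned model instead.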

3.
Phys Rev E ; 106(4-1): 044412, 2022 Oct.
Article in English | MEDLINE | ID: mdl-36397477

ABSTRACT

Molecular sorting is a fundamental process that allows eukaryotic cells to distill and concentrate specific chemical factors in appropriate cell membrane subregions, thus endowing them with different chemical identities and functional properties. A phenomenological theory of this molecular distillation process has recently been proposed [M. Zamparo, D. Valdembri, G. Serini, I. V. Kolokolov, V. V. Lebedev, L. Dall'Asta, and A. Gamba, Phys. Rev. Lett. 126, 088101 (2021); doi:10.1103/PhysRevLett.126.088101], based on the idea that molecular sorting emerges from the combination of (a) phase-separation-driven formation of sorting domains and (b) domain-induced membrane bending, leading to the production of submicrometric lipid vesicles enriched in the sorted molecules. In this framework, a natural parameter controlling the efficiency of molecular distillation is the critical size of phase-separated domains. In the experiments, sorting domains appear to fall into two classes: unproductive domains, characterized by short lifetimes and a low probability of extraction, and productive domains, which evolve into vesicles that ultimately detach from the membrane system. It is tempting to link these two classes to the different fates predicted by classical phase separation theory for subcritical and supercritical phase-separated domains. Here, we discuss the implications of this picture in the framework of the previously introduced phenomenological theory of molecular sorting. Several predictions of the theory are verified by numerical simulations of a lattice-gas model. Sorting is observed to be most efficient when the number of sorting domains is close to a minimum. To help in the analysis of experimental data, an operational definition of the critical size of sorting domains is proposed. Comparison with experimental results shows that the statistical properties of productive and unproductive domains inferred from experimental data are in agreement with those predicted from numerical simulations of the model, consistent with the hypothesis that molecular sorting is driven by a phase separation process.

4.
Sci Rep ; 12(1): 19673, 2022 11 16.
Article in English | MEDLINE | ID: mdl-36385141

ABSTRACT

The reconstruction of missing information in epidemic spreading on contact networks can be essential for prevention and containment strategies. The identification and warning of infectious but asymptomatic individuals (i.e., contact tracing), the well-known patient-zero problem, and the inference of infectivity values in structured populations are examples of significant epidemic inference problems. As the number of possible epidemic cascades grows exponentially with the number of individuals involved, and only an almost negligible subset of them is compatible with the observations (e.g., medical tests), epidemic inference in contact networks poses formidable computational challenges. We present a new generative neural network framework that learns to generate the most probable infection cascades compatible with observations. The proposed method achieves results that are better (in some cases, significantly better) than or comparable to those of existing methods in all problems considered, on both synthetic and real contact networks. Given its generality and its clear Bayesian and variational nature, the presented framework paves the way to solving fundamental epidemic inference problems with high precision in small and medium-sized real-world scenarios such as the spread of infections in workplaces and hospitals.


Subject(s)
Epidemics , Humans , Bayes Theorem , Epidemics/prevention & control , Contact Tracing , Neural Networks, Computer
5.
Proc Natl Acad Sci U S A ; 118(32), 2021 08 10.
Article in English | MEDLINE | ID: mdl-34312253

ABSTRACT

Contact tracing is an essential tool to mitigate the impact of a pandemic, such as the COVID-19 pandemic. Digital devices can play an important role in achieving efficient and scalable contact tracing in real time. While a lot of attention has been paid to analyzing the privacy and ethical risks of the associated mobile applications, so far much less research has been devoted to optimizing their performance and assessing their impact on the mitigation of the epidemic. We develop Bayesian inference methods to estimate the risk that an individual is infected. This inference is based on the list of their recent contacts and those contacts' own risk levels, as well as personal information such as test results or the presence of symptoms. We propose to use probabilistic risk estimation to optimize testing and quarantining strategies for the control of an epidemic. Our results show that in some range of epidemic spreading (typically when the manual tracing of all contacts of infected people becomes practically impossible but before the fraction of infected people reaches the scale where a lockdown becomes unavoidable), this inference of individuals at risk could be an efficient way to mitigate the epidemic. Our approaches translate into fully distributed algorithms that only require communication between individuals who have recently been in contact. Such communication may be encrypted and anonymized, and thus it is compatible with privacy-preserving standards. We conclude that probabilistic risk estimation is capable of enhancing the performance of digital contact tracing and should be considered in such mobile applications.


Subject(s)
Contact Tracing/methods , Epidemics/prevention & control , Algorithms , Bayes Theorem , COVID-19/epidemiology , COVID-19/prevention & control , Contact Tracing/statistics & numerical data , Humans , Mobile Applications , Privacy , Risk Assessment , SARS-CoV-2
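
The distributed risk estimation summarized in item 5 above can be caricatured in a few lines: each individual repeatedly combines a prior, its own test result, and the current risk of its recent contacts, with messages exchanged only between contacts. The sketch below is a toy heuristic written under these assumptions, not the Bayesian algorithm of the paper; the names (update_risks, p_transmit, and so on) are invented for illustration.

    from collections import defaultdict

    def update_risks(contacts, test_results, prior=0.01, p_transmit=0.05, n_rounds=3):
        """Toy distributed risk estimate: each node repeatedly combines its prior,
        its own test result, and the current risk of its recent contacts."""
        nodes = {u for edge in contacts for u in edge}
        risk = {u: 1.0 if test_results.get(u) == "+" else prior for u in nodes}
        neighbors = defaultdict(set)
        for u, v in contacts:
            neighbors[u].add(v)
            neighbors[v].add(u)
        for _ in range(n_rounds):
            new_risk = {}
            for u in nodes:
                if test_results.get(u) == "+":
                    new_risk[u] = 1.0
                    continue
                # probability of having escaped infection from every recent contact
                p_escape = 1.0
                for v in neighbors[u]:
                    p_escape *= 1.0 - p_transmit * risk[v]
                new_risk[u] = 1.0 - (1.0 - prior) * p_escape
            risk = new_risk
        return risk

    if __name__ == "__main__":
        contacts = [("a", "b"), ("b", "c"), ("c", "d")]
        print(update_risks(contacts, test_results={"a": "+"}))

Running it on a short contact chain with one positive test shows how the estimated risk decays with distance from the known case, using only communication between individuals who have been in contact.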
6.
Comput Struct Biotechnol J ; 19: 3225-3233, 2021.
Article in English | MEDLINE | ID: mdl-34141141

ABSTRACT

Compartmentalization of cellular functions is at the core of the physiology of eukaryotic cells. Recent evidence indicates that a universal organizing process - phase separation - supports the partitioning of biomolecules into distinct phases from a single homogeneous mixture, a landmark event in both the biogenesis and the maintenance of membrane-bound and non-membrane-bound organelles. In the cell, 'passive' (non-energy-consuming) mechanisms are flanked by 'active' mechanisms of separation into phases of distinct density and stoichiometry, which allow for increased partitioning flexibility and programmability. A convergence of physical and biological approaches is leading to new insights into the inner functioning of this driver of intracellular order, holding promise for future advances in both biological research and biotechnological applications.

7.
Phys Rev Lett ; 126(8): 088101, 2021 Feb 26.
Article in English | MEDLINE | ID: mdl-33709726

ABSTRACT

We introduce a simple physical picture to explain the process of molecular sorting, whereby specific proteins are concentrated and distilled into submicrometric lipid vesicles in eukaryotic cells. To this purpose, we formulate a model based on the coupling of spontaneous molecular aggregation with vesicle nucleation. Its implications are studied by means of a phenomenological theory describing the diffusion of molecules toward multiple sorting centers that grow due to molecule absorption and are extracted when they reach a sufficiently large size. The predictions of the theory are compared with numerical simulations of a lattice-gas realization of the model and with experimental observations. The efficiency of the distillation process is found to be optimal for intermediate aggregation rates, where the density of sorted molecules is minimal and the process obeys simple scaling laws. Quantitative measures of endocytic sorting performed in primary endothelial cells are compatible with the hypothesis that these optimal conditions are realized in living cells.


Subject(s)
Eukaryotic Cells/metabolism , Membrane Lipids/metabolism , Models, Biological , Proteins/metabolism , Diffusion , Transport Vesicles/metabolism
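
The lattice-gas realization mentioned in item 7 above combines three ingredients: insertion of molecules, diffusion with aggregation, and extraction of sufficiently large domains. The sketch below is a deliberately crude caricature of such dynamics, written under assumed rules and parameter values (INSERT_RATE, EXTRACT_SIZE, and so on are illustrative), and is not the model actually simulated in the paper.

    import random
    from collections import deque

    L, INSERT_RATE, EXTRACT_SIZE, STEPS = 40, 0.2, 8, 20000  # illustrative values

    def neighbors(i, j):
        return [((i + 1) % L, j), ((i - 1) % L, j), (i, (j + 1) % L), (i, (j - 1) % L)]

    def cluster(grid, start):
        """Connected cluster of occupied sites containing `start` (flood fill)."""
        seen, queue = {start}, deque([start])
        while queue:
            site = queue.popleft()
            for n in neighbors(*site):
                if grid[n] and n not in seen:
                    seen.add(n)
                    queue.append(n)
        return seen

    grid = {(i, j): False for i in range(L) for j in range(L)}
    for t in range(STEPS):
        # insertion: a new molecule lands on a random site at a fixed rate
        if random.random() < INSERT_RATE:
            grid[(random.randrange(L), random.randrange(L))] = True
        # diffusion with crude aggregation: a molecule moves to an empty neighbor
        # only if it is not already touching another molecule
        occupied = [s for s, filled in grid.items() if filled]
        if not occupied:
            continue
        s = random.choice(occupied)
        if not any(grid[n] for n in neighbors(*s)):
            target = random.choice(neighbors(*s))
            if not grid[target]:
                grid[s], grid[target] = False, True
                s = target
        # extraction: a domain that reaches the threshold size leaves the membrane
        domain = cluster(grid, s)
        if len(domain) >= EXTRACT_SIZE:
            for site in domain:
                grid[site] = False

    print("residual density of unsorted molecules:", sum(grid.values()) / L**2)

Varying the insertion and aggregation rules in such a toy and tracking the residual density is the kind of numerical experiment in which an optimal, minimal-density regime can be looked for.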
8.
Phys Rev Lett ; 123(2): 020604, 2019 Jul 12.
Article in English | MEDLINE | ID: mdl-31386499

ABSTRACT

Computing marginal distributions of discrete or semidiscrete Markov random fields (MRFs) is a fundamental, generally intractable problem with a vast number of applications in virtually all fields of science. We present a new family of computational schemes to approximately calculate the marginals of discrete MRFs. This method shares some desirable properties with belief propagation, in particular providing exact marginals on acyclic graphs, but it differs from the latter in that it includes some loop corrections; i.e., it takes into account correlations coming from all cycles in the factor graph. It is also similar to the adaptive Thouless-Anderson-Palmer method, but it differs from the latter in that the consistency is not on the first two moments of the distribution but rather on the value of its density on a subset of values. The results on finite-dimensional Ising-like models show a significant improvement with respect to the Bethe-Peierls (tree) approximation in all cases, and with respect to the plaquette cluster variational method approximation in many cases. In particular, for the critical inverse temperature β_c of the homogeneous hypercubic lattice, the expansion of (dβ_c)^{-1} around d=∞ of the proposed scheme is exact up to d^{-4} order, whereas the latter two are exact only up to d^{-2} order.
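
As a point of reference for the property mentioned above (schemes of this family, like belief propagation, return exact marginals on acyclic graphs), the sketch below runs plain sum-product message passing on a small Ising chain and checks it against brute-force enumeration. It illustrates only that baseline exactness on trees, not the loop-corrected scheme proposed in the paper; couplings and fields are arbitrary illustrative values.

    import itertools
    import numpy as np

    # Small Ising chain: p(s) ~ exp(sum_i h[i]*s[i] + sum_i J[i]*s[i]*s[i+1]), s[i] = +/-1
    h = np.array([0.3, -0.2, 0.5, 0.0])
    J = np.array([0.8, -0.4, 0.6])
    n = len(h)
    spins = np.array([-1, 1])  # index 0 <-> spin -1, index 1 <-> spin +1

    def brute_force_marginals():
        p = np.zeros((n, 2))
        for config in itertools.product(spins, repeat=n):
            s = np.array(config)
            w = np.exp(np.dot(h, s) + np.dot(J, s[:-1] * s[1:]))
            for i in range(n):
                p[i, (s[i] + 1) // 2] += w
        return p / p.sum(axis=1, keepdims=True)

    def bp_marginals():
        # forward and backward sum-product messages along the chain;
        # exact here because a chain is acyclic
        fwd = [np.ones(2)]
        for i in range(n - 1):
            m = np.array([(fwd[i] * np.exp(h[i] * spins) *
                           np.exp(J[i] * spins * t)).sum() for t in spins])
            fwd.append(m / m.sum())
        bwd = [np.ones(2)] * n
        for i in range(n - 2, -1, -1):
            m = np.array([(bwd[i + 1] * np.exp(h[i + 1] * spins) *
                           np.exp(J[i] * spins * t)).sum() for t in spins])
            bwd[i] = m / m.sum()
        p = np.array([fwd[i] * np.exp(h[i] * spins) * bwd[i] for i in range(n)])
        return p / p.sum(axis=1, keepdims=True)

    print(np.allclose(brute_force_marginals(), bp_marginals()))  # True on a chain

On a loopy graph the same message-passing scheme would only be approximate, which is where the loop corrections discussed in the abstract come in.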

9.
PLoS One ; 12(5): e0176859, 2017.
Article in English | MEDLINE | ID: mdl-28475583

ABSTRACT

BACKGROUND AND AIM: Sarcoidosis is a systemic granulomatous inflammatory disease whose causes are still unknown and for which epidemiological data are often discordant. The aim of our study is to investigate the prevalence and spatial distribution of cases, and to identify environmental exposures associated with sarcoidosis in an Italian province. METHODS: After georeferencing the cases, the area under study was subdivided by municipality, health district, and altitude in order to identify zonal differences in prevalence. The bioaccumulation levels of 12 metals in lichen tissues were analyzed in order to determine sources of air pollution. Finally, the correlations between metals and between collection stations were analyzed. RESULTS: 223 patients were identified (58.3% female, 41.7% male), with a mean age of 50.6±15.4 years (53.5±15.5 years for females, 46.5±14.4 years for males). The mean prevalence was 49 per 100,000 individuals. However, we observed a very heterogeneous prevalence across the area under study. The correlations among metals revealed different deposition patterns in the lowland area with respect to the hilly and mountain areas. CONCLUSIONS: The study highlights a high prevalence of sarcoidosis, characterized by a very inhomogeneous and patchy distribution with phenomena of local aggregation. Moreover, bioaccumulation analysis was an effective method to identify the mineral particles that contribute most to air pollution in the different areas, but it was not sufficient to establish a clear correlation between the onset of sarcoidosis and environmental risk factors.


Subject(s)
Sarcoidosis/epidemiology , Adult , Aged , Environmental Monitoring , Female , Geographic Information Systems , Humans , Italy/epidemiology , Male , Middle Aged , Prevalence , Risk Factors
10.
PLoS One ; 12(4): e0176376, 2017.
Article in English | MEDLINE | ID: mdl-28445537

ABSTRACT

The massive employment of computational models in network epidemiology calls for the development of improved inference methods for epidemic forecasting. For simple compartment models, such as the Susceptible-Infected-Recovered (SIR) model, Belief Propagation was proved to be a reliable and efficient method to identify the origin of an observed epidemic. Here we show that the same method can be applied to predict the future evolution of an epidemic outbreak from partial observations at the early stage of the dynamics. The results obtained using Belief Propagation are compared with Monte Carlo direct sampling in the case of the SIR model on random (regular and power-law) graphs for different observation methods, and on an example of a real-world contact network. Belief Propagation gives, in general, a better prediction than direct sampling, although the quality of the prediction depends on the quantity under study (e.g., marginals of individual states, epidemic size, extinction-time distribution) and on the actual number of observed nodes that are infected before the observation time.


Subject(s)
Models, Theoretical , Area Under Curve , Bayes Theorem , Communicable Diseases/epidemiology , Epidemics , Humans , Monte Carlo Method , ROC Curve
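
The Monte Carlo direct-sampling baseline used for comparison in item 10 above can be sketched as rejection sampling: simulate many unconditioned SIR cascades on the contact graph, keep only those whose state at the observation time matches the partial observation, and average the final epidemic size over the survivors. The code below is a minimal illustration of that baseline under assumed parameters (p_inf, p_rec, the observed snapshot), not the Belief Propagation method of the paper.

    import random
    import networkx as nx

    def sir_cascade(g, seed, p_inf=0.3, p_rec=0.5, t_max=20):
        """Discrete-time SIR sample; returns the node states after each time step."""
        state = {u: "S" for u in g}
        state[seed] = "I"
        history = []
        for _ in range(t_max):
            new_state = dict(state)
            for u in g:
                if state[u] == "I":
                    for v in g.neighbors(u):
                        if state[v] == "S" and random.random() < p_inf:
                            new_state[v] = "I"
                    if random.random() < p_rec:
                        new_state[u] = "R"
            state = new_state
            history.append(dict(state))
        return history

    def forecast_size(g, observed, t_obs, n_samples=5000):
        """Rejection sampling: keep cascades matching the observation at t_obs,
        then average the final epidemic size over the accepted samples."""
        sizes = []
        for _ in range(n_samples):
            hist = sir_cascade(g, seed=random.choice(list(g)))
            snap = hist[t_obs]
            if all(snap[u] == s for u, s in observed.items()):
                final = hist[-1]
                sizes.append(sum(1 for u in g if final[u] != "S"))
        return (sum(sizes) / len(sizes), len(sizes)) if sizes else (None, 0)

    if __name__ == "__main__":
        g = nx.random_regular_graph(4, 50, seed=1)
        observed = {0: "I", 1: "S"}          # partial snapshot at an early time
        mean_size, kept = forecast_size(g, observed, t_obs=2)
        print("forecast final size:", mean_size, "from", kept, "accepted samples")

Belief Propagation replaces this brute-force filtering with local message passing, which is what makes it usable when the observations are informative and accepted samples become rare.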
11.
Proc Natl Acad Sci U S A ; 113(44): 12368-12373, 2016 11 01.
Article in English | MEDLINE | ID: mdl-27791075

ABSTRACT

We study the network dismantling problem, which consists of determining a minimal set of vertices whose removal leaves the network broken into connected components of subextensive size. For a large class of random graphs, this problem is tightly connected to the decycling problem (the removal of vertices that leaves the graph acyclic). Exploiting this connection and recent works on epidemic spreading, we present precise predictions for the minimal size of a dismantling set in a large random graph with a prescribed (light-tailed) degree distribution. Building on the statistical mechanics perspective, we propose a three-stage Min-Sum algorithm for efficiently dismantling networks, including heavy-tailed ones for which the dismantling and decycling problems are not equivalent. We also provide additional insights into the dismantling problem, concluding that it is an intrinsically collective problem and that optimal dismantling sets cannot be viewed as a collection of individually well-performing nodes.
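
A common point of comparison for dismantling algorithms such as the Min-Sum scheme described above is the adaptive high-degree heuristic: repeatedly remove the node of largest current degree until the giant component drops below a target fraction of the network. The sketch below implements only that simple baseline (using networkx; the target fraction and graph are illustrative), not the three-stage algorithm of the paper.

    import networkx as nx

    def giant_component_size(g):
        return max((len(c) for c in nx.connected_components(g)), default=0)

    def dismantle_by_degree(g, target_fraction=0.01):
        """Adaptive high-degree removal: delete the current highest-degree node
        until the giant component is below target_fraction of the original size."""
        g = g.copy()
        n = g.number_of_nodes()
        removed = []
        while giant_component_size(g) > target_fraction * n:
            u = max(g.degree, key=lambda pair: pair[1])[0]
            g.remove_node(u)
            removed.append(u)
        return removed

    if __name__ == "__main__":
        g = nx.erdos_renyi_graph(2000, 3.5 / 2000, seed=0)
        removed = dismantle_by_degree(g, target_fraction=0.01)
        print(f"removed {len(removed)} of {g.number_of_nodes()} nodes")

Message-passing dismantlers are reported to need fewer removals than such degree-based baselines on sparse random graphs, which is the kind of gap the paper quantifies.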

12.
Sci Rep ; 6: 20706, 2016 Feb 12.
Article in English | MEDLINE | ID: mdl-26869210

ABSTRACT

The controllability of a network is a theoretical problem of relevance in a variety of contexts ranging from financial markets to the brain. Until now, network controllability has been characterized only on isolated networks, while the vast majority of complex systems are formed by multilayer networks. Here we build a theoretical framework for the linear controllability of multilayer networks by mapping the problem into a combinatorial matching problem. We find that correlating the external signals in the different layers can significantly reduce the robustness of the multiplex network to node removal, an effect that can be seen in conjunction with a hybrid phase transition occurring in interacting Poisson networks. Moreover, we observe that the multilayer structure can stabilize the fully controllable multiplex configuration even in cases where full controllability of the individual networks, taken in isolation, is not stable.

13.
PLoS One ; 10(12): e0145222, 2015.
Article in English | MEDLINE | ID: mdl-26710102

ABSTRACT

We present a message-passing algorithm to solve a series of edge-disjoint path problems on graphs, based on the zero-temperature cavity equations. Edge-disjoint path problems are important in the general context of routing, which can be defined by incorporating both traffic optimization and total path-length minimization under a unified framework. The computation of the cavity equations can be performed efficiently by exploiting a mapping of a generalized edge-disjoint path problem on a star graph onto a weighted maximum matching problem. We perform extensive numerical simulations on random graphs of various types to test the performance both in terms of path-length minimization and of maximization of the number of accommodated paths. In addition, we test the performance on benchmark instances on various graphs by comparison with state-of-the-art algorithms and results found in the literature. Our message-passing algorithm always outperforms the others in terms of the number of accommodated paths when considering nontrivial instances (otherwise it gives the same trivial results). Remarkably, the largest improvement in performance with respect to the other methods is found in the case of benchmarks with meshes, where the hypotheses behind message passing are expected to be least valid. In these cases, even though the exact message-passing equations do not converge, by introducing a reinforcement parameter to force convergence towards a suboptimal solution we were able to always outperform the other algorithms, with a peak improvement of 27% in terms of accommodated paths. On random graphs, we numerically observe two separate regimes: one in which all paths can be accommodated and one in which this is not possible. We also investigate the behavior of both the number of paths to be accommodated and their minimum total length.


Subject(s)
Algorithms , Artificial Intelligence , Computer Simulation , Automobiles , Computer Communication Networks , Computer Graphics , Travel
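
A simple reference point for the problem described in item 13 above is the greedy baseline: route each source-target pair along a shortest path in the residual graph and delete the edges it uses, so that later paths remain edge-disjoint. The sketch below implements only this baseline with networkx (graph and pairs are illustrative); it is not the cavity/message-passing algorithm of the paper.

    import networkx as nx

    def greedy_edge_disjoint(g, pairs):
        """Greedy baseline: route each (source, target) pair on a shortest path in
        the remaining graph and remove the used edges to keep paths edge-disjoint."""
        g = g.copy()
        routed = {}
        for s, t in pairs:
            try:
                path = nx.shortest_path(g, s, t)
            except nx.NetworkXNoPath:
                continue
            routed[(s, t)] = path
            g.remove_edges_from(zip(path, path[1:]))
        return routed

    if __name__ == "__main__":
        g = nx.random_regular_graph(3, 60, seed=2)
        pairs = [(0, 30), (1, 31), (2, 32), (3, 33)]
        routed = greedy_edge_disjoint(g, pairs)
        print(f"accommodated {len(routed)} of {len(pairs)} paths")
        print("total length:", sum(len(p) - 1 for p in routed.values()))

Counting accommodated paths and total length for such a baseline gives a feel for the two quantities the paper optimizes.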
14.
PLoS One ; 10(7): e0119286, 2015.
Article in English | MEDLINE | ID: mdl-26177449

ABSTRACT

We study a class of games which models the competition among agents to access some service provided by distributed service units and which exhibits congestion and frustration phenomena when service units have limited capacity. We propose a technique, based on the cavity method of statistical physics, to characterize the full spectrum of Nash equilibria of the game. The analysis reveals a large variety of equilibria, with very different statistical properties. Natural selfish dynamics, such as best response, usually tend to large-utility equilibria, even though those of smaller utility are exponentially more numerous. Interestingly, the latter can actually be reached by selecting the initial conditions of the best-response dynamics close to the saturation limit of the service unit capacities. We also study a more realistic stochastic variant of the game by means of a simple and effective approximation of the average over the random parameters, showing that the properties of the average-case Nash equilibria are qualitatively similar to those of the deterministic ones.


Subject(s)
Game Theory , Entropy , Models, Theoretical , Stochastic Processes
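
Best-response dynamics of the kind mentioned in the abstract above can be illustrated with a toy capacity-constrained service-selection game: each agent picks one unit, the cost of a unit rises steeply once its load exceeds capacity, and agents keep switching to their current best response until nobody wants to move. The sketch below uses an invented cost function and parameter values and is only a caricature of the class of games studied, not the cavity-method analysis.

    import random

    def best_response_dynamics(n_agents=60, n_units=6, capacity=12, n_sweeps=200, seed=0):
        """Toy congestion game: agents choose service units; best-response sweeps
        stop when no agent wants to switch (a pure Nash equilibrium)."""
        rng = random.Random(seed)
        choice = [rng.randrange(n_units) for _ in range(n_agents)]

        def load(unit):
            return sum(1 for c in choice if c == unit)

        def cost(unit_load):
            # cost grows steeply once the unit load exceeds its capacity
            return unit_load + (0.0 if unit_load <= capacity else 10.0 * (unit_load - capacity))

        for _ in range(n_sweeps):
            moved = False
            for agent in rng.sample(range(n_agents), n_agents):
                current = choice[agent]
                # cost of joining each unit, counting the agent itself
                best = min(range(n_units),
                           key=lambda u: cost(load(u) + (0 if u == current else 1)))
                if best != current:
                    choice[agent] = best
                    moved = True
            if not moved:
                break
        return [load(u) for u in range(n_units)]

    if __name__ == "__main__":
        print("equilibrium loads:", best_response_dynamics())

Because this toy is a congestion (potential) game, the sweeps stop at a pure Nash equilibrium; initializing the agents near the saturation limit instead of at random is the kind of choice the abstract mentions for reaching the lower-utility equilibria.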
15.
Article in English | MEDLINE | ID: mdl-25871077

ABSTRACT

In subdivided populations, migration acts together with selection and genetic drift and determines their evolution. Building upon a recently proposed method, which hinges on the emergence of a time scale separation between local and global dynamics, we study the fixation properties of subdivided populations in the presence of balancing selection. The approximation implied by the method is accurate when the effective selection strength is small and the number of subpopulations is large. In particular, it predicts a phase transition between species coexistence and biodiversity loss in the infinite-size limit and, in finite populations, a nonmonotonic dependence of the mean fixation time on the migration rate. In order to investigate the fixation properties of the subdivided population for stronger selection, we introduce an effective coarser description of the dynamics in terms of a voter model with intermediate states, which highlights the basic mechanisms driving the evolutionary process.


Subject(s)
Evolution, Molecular , Models, Genetic , Selection, Genetic , Genetic Drift , Genetics, Population , Time Factors
16.
Phys Rev Lett ; 113(7): 078701, 2014 Aug 15.
Article in English | MEDLINE | ID: mdl-25170736

ABSTRACT

The problem of controllability of the dynamical state of a network is central in network theory and has wide applications ranging from network medicine to financial markets. The driver nodes of the network are the nodes that can bring the network to the desired dynamical state if an external signal is applied to them. Using the framework of structural controllability, here we show that the density of nodes with in-degree and out-degree equal to one and two determines the number of driver nodes in the network. Moreover, we show that random networks with minimum in-degree and out-degree greater than two are always fully controllable by an infinitesimal fraction of driver nodes, regardless of the other properties of the degree distribution. Finally, based on these results, we propose an algorithm to improve the controllability of networks.


Subject(s)
Algorithms , Models, Theoretical , Computer Simulation
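
In the structural controllability framework referenced above, the minimum number of driver nodes of a directed network is usually computed from a maximum matching of its bipartite representation (out-copies on one side, in-copies on the other): unmatched in-copies correspond to driver nodes. The sketch below implements that standard matching computation with networkx on an illustrative random digraph; it is the generic construction, not the degree-based criterion derived in the paper.

    import networkx as nx
    from networkx.algorithms import bipartite

    def num_driver_nodes(d):
        """Minimum number of driver nodes under structural controllability:
        nodes unmatched in a maximum matching of the bipartite graph whose
        left copies are out-endpoints and right copies are in-endpoints."""
        b = nx.Graph()
        left = [("out", u) for u in d]
        right = [("in", u) for u in d]
        b.add_nodes_from(left, bipartite=0)
        b.add_nodes_from(right, bipartite=1)
        b.add_edges_from((("out", u), ("in", v)) for u, v in d.edges())
        matching = bipartite.maximum_matching(b, top_nodes=left)
        matched_targets = sum(1 for node in right if node in matching)
        return max(d.number_of_nodes() - matched_targets, 1)

    if __name__ == "__main__":
        d = nx.gnp_random_graph(200, 3.0 / 200, directed=True, seed=3)
        print("driver nodes:", num_driver_nodes(d))

On sparse random digraphs this count is dominated by nodes of low in-degree and out-degree, which is the observation the paper makes precise.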
17.
Phys Rev Lett ; 112(11): 118701, 2014 Mar 21.
Article in English | MEDLINE | ID: mdl-24702425

ABSTRACT

We study several Bayesian inference problems for irreversible stochastic epidemic models on networks from a statistical physics viewpoint. We derive equations which allow us to accurately compute the posterior distribution of the time evolution of the state of each node given some observations. Unlike most existing methods, we allow very general observation models, including unobserved nodes, state observations made at different or unknown times, and observations of infection times, possibly mixed together. Our method, which is based on the belief propagation algorithm, is efficient, naturally distributed, and exact on trees. As a particular case, we consider the problem of finding the "zero patient" of a susceptible-infected-recovered or susceptible-infected epidemic given a snapshot of the state of the network at a later unknown time. Numerical simulations show that our method outperforms previous ones on both synthetic and real networks, often by a very large margin.


Subject(s)
Bayes Theorem , Contact Tracing/methods , Epidemiologic Methods , Models, Statistical , Stochastic Processes
18.
Phys Rev Lett ; 112(14): 148101, 2014 Apr 11.
Article in English | MEDLINE | ID: mdl-24766019

ABSTRACT

The influence of migration on the stochastic dynamics of subdivided populations is still an open issue in various evolutionary models. Here, we develop a self-consistent mean-field-like method in order to determine the effects of migration on relevant nonequilibrium properties, such as the mean fixation time. If evolution strongly favors coexistence of species (e.g., balancing selection), the mean fixation time develops an unexpected minimum as a function of the migration rate. Our analysis hinges only on the presence of a separation of time scales between local and global dynamics, and therefore, it carries over to other nonequilibrium processes in physics, biology, ecology, and social sciences.


Subject(s)
Ecosystem , Genetics, Population/methods , Models, Genetic , Animal Migration , Biological Evolution , Competitive Behavior , Population Dynamics
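
The quantity at the center of the abstract above, the mean fixation time of a subdivided population under balancing selection and migration, can be estimated by brute-force simulation of a toy model. The sketch below uses an invented Moran-like update and exchange migration between demes (all parameter names and values are illustrative) and simply measures fixation times for a few migration rates; it is not the self-consistent mean-field method of the paper.

    import random

    def fixation_time(n_demes=5, deme_size=10, migration=0.05, selection=0.1,
                      max_steps=200000, seed=None):
        """Toy two-type Moran-like dynamics on subdivided demes with balancing
        selection (type A favored when locally rare) and exchange migration.
        Returns the number of updates until global fixation (or max_steps)."""
        rng = random.Random(seed)
        n_a = [deme_size // 2 for _ in range(n_demes)]  # A individuals per deme
        total = n_demes * deme_size
        steps = 0
        while 0 < sum(n_a) < total and steps < max_steps:
            steps += 1
            d = rng.randrange(n_demes)
            x = n_a[d] / deme_size
            f_a = 1.0 + selection * (0.5 - x)            # balancing selection
            prob_a_birth = f_a * x / (f_a * x + (1.0 - x))
            birth_is_a = rng.random() < prob_a_birth
            death_is_a = rng.random() < x
            n_a[d] += (1 if birth_is_a else 0) - (1 if death_is_a else 0)
            # exchange migration: swap an A from deme d1 with a non-A from deme d2
            if rng.random() < migration:
                d1, d2 = rng.randrange(n_demes), rng.randrange(n_demes)
                if rng.random() < n_a[d1] / deme_size and n_a[d1] > 0 and n_a[d2] < deme_size:
                    n_a[d1] -= 1
                    n_a[d2] += 1
        return steps

    if __name__ == "__main__":
        for m in (0.001, 0.01, 0.1):
            runs = [fixation_time(migration=m, seed=i) for i in range(5)]
            print(f"migration {m}: mean fixation time ~ {sum(runs) / len(runs):.0f} steps")

Scanning a finer grid of migration rates with many more runs is how a minimum of the mean fixation time, like the one predicted in the paper, would show up in such a toy.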
19.
Article in English | MEDLINE | ID: mdl-23767494

ABSTRACT

We propose a modified voter model with locally conserved magnetization and investigate its phase-ordering dynamics in two dimensions by numerical simulations. Imposing a local constraint on the dynamics has the surprising effect of speeding up the phase-ordering process. The system is shown to exhibit a scaling regime characterized by algebraic domain growth, at odds with the logarithmic coarsening of the standard voter model. A phenomenological approach based on cluster diffusion, similar to Smoluchowski ripening, correctly predicts the observed scaling regime. Our analysis exposes unexpected complexity in phase-ordering dynamics in the absence of a thermodynamic potential.


Subject(s)
Algorithms , Models, Statistical , Phase Transition , Thermodynamics , Computer Simulation
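
Conserving the magnetization locally means that opinions are exchanged between neighboring sites rather than copied. The sketch below is a crude caricature written under assumed rules: opposite neighboring spins are swapped only when the swap does not increase local disagreement, and coarsening is tracked through the density of active interfaces. It is not the modified voter model actually studied in the paper; lattice size and sweep count are arbitrary.

    import random

    L, SWEEPS = 48, 100  # lattice side and number of Monte Carlo sweeps (illustrative)

    def neighbors(i, j):
        return [((i + 1) % L, j), ((i - 1) % L, j), (i, (j + 1) % L), (i, (j - 1) % L)]

    def interface_density(spins):
        """Fraction of nearest-neighbor bonds joining opposite spins."""
        active = sum(spins[(i, j)] != spins[nb]
                     for i in range(L) for j in range(L)
                     for nb in (((i + 1) % L, j), (i, (j + 1) % L)))
        return active / (2 * L * L)

    def pair_disagreement(spins, site, other):
        """Disagreeing bonds around the pair, excluding the bond inside the pair."""
        return sum(spins[s] != spins[n]
                   for s in (site, other)
                   for n in neighbors(*s) if n not in (site, other))

    # random initial condition; the exchanges below conserve magnetization exactly
    spins = {(i, j): random.choice((-1, 1)) for i in range(L) for j in range(L)}

    for sweep in range(SWEEPS):
        for _ in range(L * L):
            site = (random.randrange(L), random.randrange(L))
            other = random.choice(neighbors(*site))
            if spins[site] == spins[other]:
                continue
            # swap the two opposite spins only if local disagreement does not increase:
            # a crude locally conserved exchange rule, not the exact model of the paper
            before = pair_disagreement(spins, site, other)
            spins[site], spins[other] = spins[other], spins[site]
            if pair_disagreement(spins, site, other) > before:
                spins[site], spins[other] = spins[other], spins[site]  # reject the move
        if sweep % 20 == 0:
            print(f"sweep {sweep:3d}  interface density {interface_density(spins):.3f}")

Plotting the printed interface density against time on a log-log scale is the standard way to distinguish algebraic coarsening from the logarithmic decay of the ordinary voter model.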
20.
Proc Natl Acad Sci U S A ; 109(12): 4395-400, 2012 Mar 20.
Article in English | MEDLINE | ID: mdl-22383559

ABSTRACT

The very notion of a social network implies that linked individuals interact repeatedly with each other. This allows them not only to learn successful strategies and adapt to them, but also to condition their own behavior on the behavior of others in a strategic, forward-looking manner. The theory of repeated games shows that these circumstances are conducive to the emergence of collaboration in simple two-player games. We investigate the extension of this concept to the case where players are engaged in a local contribution game and show that rationality and credibility of threats identify a class of Nash equilibria, which we call "collaborative equilibria", that have a precise interpretation in terms of subgraphs of the social network. For large network games, the number of such equilibria is exponentially large in the number of players. When incentives to defect are small, equilibria are supported by local structures, whereas when incentives exceed a threshold they acquire a nonlocal nature, which requires a "critical mass" of more than a given fraction of the players to collaborate. Therefore, when incentives are high, an individual deviation typically causes the collapse of collaboration across the whole system. At the same time, higher incentives to defect typically support equilibria with a higher density of collaborators. The resulting picture conforms with several results in sociology and in the experimental literature on game theory, such as the prevalence of collaboration in denser groups and in the structural hubs of sparse networks.


Subject(s)
Social Support , Algorithms , Communication , Cooperative Behavior , Game Theory , Humans , Models, Psychological , Models, Statistical , Models, Theoretical