ABSTRACT
Contact tracing is an essential tool to mitigate the impact of a pandemic, such as the COVID-19 pandemic. In order to achieve efficient and scalable contact tracing in real time, digital devices can play an important role. While a lot of attention has been paid to analyzing the privacy and ethical risks of the associated mobile applications, so far much less research has been devoted to optimizing their performance and assessing their impact on the mitigation of the epidemic. We develop Bayesian inference methods to estimate the risk that an individual is infected. This inference is based on the list of their recent contacts and their own risk levels, as well as personal information such as results of tests or presence of symptoms. We propose to use probabilistic risk estimation to optimize testing and quarantining strategies for the control of an epidemic. Our results show that in some range of epidemic spreading (typically when the manual tracing of all contacts of infected people becomes practically impossible but before the fraction of infected people reaches the scale where a lockdown becomes unavoidable), this inference of individuals at risk could be an efficient way to mitigate the epidemic. Our approaches translate into fully distributed algorithms that only require communication between individuals who have recently been in contact. Such communication may be encrypted and anonymized, and thus it is compatible with privacy-preserving standards. We conclude that probabilistic risk estimation is capable of enhancing the performance of digital contact tracing and should be considered in mobile applications.
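A minimal sketch of the kind of probabilistic risk update described above, assuming a simple independent-transmission model: an individual combines a prior risk, the broadcast risk levels of recent contacts, and an optional test result. The function name, the transmission probability, and the handling of negative tests are illustrative assumptions, not the paper's algorithm.

```python
# Minimal sketch of distributed probabilistic risk estimation from contacts.
# Illustrative only; the update rule and parameter values are assumptions.

def update_risk(prior_risk, contact_risks, p_transmit=0.05, test_result=None):
    """Estimate an individual's infection risk from a prior, the risk levels
    of recent contacts, and an optional test result (True/False/None)."""
    if test_result is True:
        return 1.0          # positive test: treat as infected
    if test_result is False:
        prior_risk *= 0.1   # negative test: strongly reduce, but keep some risk
    # Probability of escaping infection from every contact independently.
    p_no_infection = 1.0 - prior_risk
    for r in contact_risks:
        p_no_infection *= (1.0 - p_transmit * r)
    return 1.0 - p_no_infection


# Each individual only needs the risk levels broadcast by recent contacts,
# so the computation can run in a fully distributed fashion.
my_risk = update_risk(prior_risk=0.01, contact_risks=[0.3, 0.05, 0.6])
print(f"estimated infection risk: {my_risk:.3f}")
```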
Subject(s)
Contact Tracing/methods, Epidemics/prevention & control, Algorithms, Bayes Theorem, COVID-19/epidemiology, COVID-19/prevention & control, Contact Tracing/statistics & numerical data, Humans, Mobile Applications, Privacy, Risk Assessment, SARS-CoV-2
ABSTRACT
We consider a class of spreading processes on networks, which generalize commonly used epidemic models such as the SIR model or the SIS model with a bounded number of reinfections. We analyze the related problem of inferring the dynamics from partial observations. We study these inference problems on random networks via a message-passing algorithm derived from the belief propagation (BP) equations. We investigate whether this algorithm solves the problems in a Bayes-optimal way, i.e., whether no other algorithm can reach better performance. For this, we leverage the so-called Nishimori conditions that must be satisfied by a Bayes-optimal algorithm. We also probe for phase transitions by considering the convergence time and by initializing the algorithm both randomly and in an informed way, comparing the resulting fixed points. We present the corresponding phase diagrams. We find large regions of parameters where, even for moderate system sizes, the BP algorithm converges and closely satisfies the Nishimori conditions, and the problem is thus conjectured to be solved optimally in those regions. In other, more limited regions of parameter space, the Nishimori conditions are no longer satisfied and the BP algorithm struggles to converge. No sign of a phase transition is detected, however, and we attribute this failure of optimality to finite-size effects. The article is accompanied by a Python implementation of the algorithm that is easy to use or adapt.
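As a rough illustration of how the Nishimori conditions can be probed numerically, the sketch below compares two overlaps that should coincide on average in a Bayes-optimal setting: the overlap of the (approximate) posterior marginals with the planted configuration, and the expected overlap of two independent posterior samples. This is a generic check, not the accompanying implementation; the array names and toy data are assumptions.

```python
import numpy as np

# Hedged sketch of an empirical Nishimori check on posterior marginals.
# marginals[i, s] is the (approximate, e.g. BP) posterior probability that
# individual i is in state s; ground_truth[i] is the planted state of i.

def nishimori_overlaps(marginals, ground_truth):
    n = len(ground_truth)
    # Overlap between the marginals and the planted configuration.
    truth_overlap = np.mean(marginals[np.arange(n), ground_truth])
    # Expected overlap between two independent samples of the posterior.
    self_overlap = np.mean(np.sum(marginals ** 2, axis=1))
    return truth_overlap, self_overlap

# Toy usage with random (non-optimal) marginals: the two numbers will differ.
rng = np.random.default_rng(0)
m = rng.dirichlet(np.ones(3), size=100)      # 100 individuals, 3 states (S, I, R)
truth = rng.integers(0, 3, size=100)
print(nishimori_overlaps(m, truth))
```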
ABSTRACT
The reconstruction of missing information in epidemic spreading on contact networks can be essential for prevention and containment strategies. The identification and warning of infectious but asymptomatic individuals (i.e., contact tracing), the well-known patient-zero problem, and the inference of infectivity values in structured populations are examples of significant epidemic inference problems. As the number of possible epidemic cascades grows exponentially with the number of individuals involved, and only an almost negligible subset of them is compatible with the observations (e.g., medical tests), epidemic inference on contact networks poses formidable computational challenges. We present a new generative neural network framework that learns to generate the most probable infection cascades compatible with observations. The proposed method achieves results comparable to, or better (in some cases significantly better) than, those of existing methods in all problems considered, both on synthetic and real contact networks. Given its generality and its clear Bayesian and variational nature, the presented framework paves the way to solving fundamental epidemic inference problems with high precision in small and medium-sized real-world scenarios such as the spread of infections in workplaces and hospitals.
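To make the inference task concrete, the sketch below uses naive rejection sampling: forward SIR cascades are simulated on a small contact network and only those compatible with the observed infections are kept, yielding a crude posterior over the patient zero. The paper replaces this brute-force step with a learned generative neural network; every function name and parameter below is illustrative.

```python
import random

# Naive rejection-sampling sketch of epidemic inference: simulate forward SIR
# cascades and keep only those compatible with the observations. Illustrative
# only; not the generative neural network framework described above.

def simulate_sir(contacts_by_time, source, p_transmit=0.3):
    """Return the set of individuals ever infected in one random cascade."""
    infected = {source}
    for contacts in contacts_by_time:              # list of (i, j) pairs per time step
        new = {j for i, j in contacts if i in infected and random.random() < p_transmit}
        new |= {i for i, j in contacts if j in infected and random.random() < p_transmit}
        infected |= new
    return infected

def posterior_source(contacts_by_time, observed_infected, candidates, n_samples=2000):
    """Crude posterior over the patient zero, given who was observed infected."""
    weights = {s: 0 for s in candidates}
    for s in candidates:
        for _ in range(n_samples):
            if simulate_sir(contacts_by_time, s) >= observed_infected:
                weights[s] += 1
    total = sum(weights.values()) or 1
    return {s: w / total for s, w in weights.items()}

# Tiny example: a chain 0-1-2-3 observed fully infected; who is the likely source?
contacts = [[(0, 1)], [(1, 2)], [(2, 3)]]
print(posterior_source(contacts, {0, 1, 2, 3}, candidates=[0, 1, 2, 3]))
```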
Subject(s)
Epidemics, Humans, Bayes Theorem, Epidemics/prevention & control, Contact Tracing, Neural Networks, Computer
ABSTRACT
We know that maximal efficiency in physical systems is attained by reversible processes. It is then interesting to see how irreversibility affects efficiency in other systems, e.g., in a city. In this study, we focus on a cyclic process of movements (home to workplace and back home) in a city to investigate this question. To this end, we present a minimal model of the movements, along with plausible definitions of the efficiency and irreversibility of the process; more precisely, we take the inverse of the average travel time per trip as the efficiency and the relative entropy between the forward and backward flow distributions as the irreversibility of the process. We perform numerical simulations of the model for reasonable choices of the population distribution, the mobility law, and the movement strategy. The results show that the efficiency of movements is indeed negatively correlated with the above measure of irreversibility. The structure of the network and the impact of the flows on the travel times are the main factors affecting the time intervals of arriving at destinations and returning to origins, which are usually larger than the time interval of the departures. This in turn causes the backward flows to diverge from the forward ones and results in entropy (disorder or uncertainty) production in the system. The findings of this study might help characterize city efficiency more accurately and improve our understanding of the main working principles of these complex systems.
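A minimal sketch of the two quantities as defined above: the efficiency as the inverse of the average travel time per trip, and the irreversibility as the relative entropy (KL divergence) between the normalized forward and backward flow distributions. The variable names and toy numbers are assumptions.

```python
import numpy as np

# Sketch of the stated definitions; illustrative values only.

def efficiency(travel_times):
    """Inverse of the average travel time per trip (trips per unit time)."""
    return len(travel_times) / np.sum(travel_times)

def irreversibility(forward_flows, backward_flows, eps=1e-12):
    """D_KL(forward || backward) between normalized flow distributions."""
    p = np.asarray(forward_flows, dtype=float)
    q = np.asarray(backward_flows, dtype=float)
    p, q = p / p.sum(), q / q.sum()
    return float(np.sum(p * np.log((p + eps) / (q + eps))))

# Toy example: three links with asymmetric morning/evening flows.
print(efficiency([20.0, 35.0, 25.0]))                 # trips per unit time
print(irreversibility([100, 50, 10], [70, 60, 30]))   # nats; 0 means reversible
```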
ABSTRACT
In recent decades, the acceleration of urban growth has led to an unprecedented level of urban interactions and interdependence. This situation calls for a significant effort from the scientific community to come up with engaging and meaningful visualizations and accessible scenario simulation engines. The present paper contributes in this direction by providing general methods to evaluate accessibility in cities based on public transportation data. Based on the notion of isochrones, the proposed accessibility quantities measure how well transport systems connect places and people in urban systems. We then introduce scores that rank cities according to their overall accessibility. We highlight significant inequalities in the distribution of these measures across the population, which are found to be strikingly similar across various urban environments. Our results are released through the interactive platform www.citychrone.org, aimed at providing the community at large with a useful tool for awareness and decision-making.
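As a sketch of an isochrone-based accessibility measure, the code below counts the population reachable from an origin within several time budgets and averages the result. The scores actually used on www.citychrone.org are defined differently; the function names and budgets here are illustrative.

```python
import numpy as np

# Hedged sketch of an isochrone-based accessibility measure; not the
# definitions used on www.citychrone.org.

def reachable_population(travel_times, populations, time_budget):
    """People living in cells reachable within `time_budget` minutes."""
    travel_times = np.asarray(travel_times, dtype=float)
    populations = np.asarray(populations, dtype=float)
    return populations[travel_times <= time_budget].sum()

def accessibility_score(travel_times, populations, budgets=(15, 30, 45, 60)):
    """Average reachable population over several isochrone time budgets."""
    return np.mean([reachable_population(travel_times, populations, t) for t in budgets])

# Toy example: travel times (minutes) from one origin to five population cells.
times = [5, 20, 35, 50, 90]
pop = [1000, 5000, 3000, 8000, 12000]
print(accessibility_score(times, pop))
```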
ABSTRACT
We study the behavior of an algorithm derived from the cavity method for the prize-collecting Steiner tree (PCST) problem on graphs. The algorithm is based on the zero-temperature limit of the cavity equations and as such is formally simple (a fixed-point equation solved by iteration) and distributed (parallelizable). We provide a detailed comparison with state-of-the-art algorithms on a wide range of existing benchmarks, networks, and random graphs. Specifically, we consider an enhanced derivative of the Goemans-Williamson heuristic and the dhea solver, a branch-and-cut integer linear programming approach. The comparison shows that the cavity algorithm outperforms both algorithms on most large instances, in both running time and solution quality. Finally, we prove a few optimality properties of the solutions provided by our algorithm, including optimality under the two postprocessing procedures defined in the Goemans-Williamson derivative and global optimality in some limit cases.
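The sketch below shows only the generic structure implied by the description above: messages on the directed edges of a graph iterated in parallel, with damping, until a fixed point is reached. The actual zero-temperature cavity update for PCST is not reproduced here; `update` is a placeholder callback supplied by the caller.

```python
import numpy as np

# Generic skeleton of a distributed fixed-point message iteration. The PCST
# cavity update rule itself is not implemented; `update` stands in for it.

def iterate_messages(edges, update, init, damping=0.5, tol=1e-6, max_iter=1000):
    """Iterate edge messages msg[(i, j)] until the largest change drops below tol.

    edges  : iterable of directed edges (i, j)
    update : callable (i, j, msg) -> new message for edge (i, j)
    init   : initial message value (scalar or array-like)
    """
    msg = {e: np.array(init, dtype=float) for e in edges}
    for _ in range(max_iter):
        # Parallel update of all messages from the current ones.
        new = {(i, j): update(i, j, msg) for (i, j) in edges}
        diff = max(np.max(np.abs(new[e] - msg[e])) for e in edges)
        # Damping stabilizes the iteration toward the fixed point.
        msg = {e: damping * msg[e] + (1 - damping) * new[e] for e in edges}
        if diff < tol:
            break
    return msg
```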
Subject(s)
Statistical Models, Physics/methods, Algorithms, Biophysics/methods, Computational Biology/methods, Computer Simulation, Theoretical Models, Signal Transduction, Software
ABSTRACT
We extend our theory of amorphous packings of hard spheres to binary mixtures and, more generally, to multicomponent systems. The theory is based on the assumption that amorphous packings produced by typical experimental or numerical protocols can be identified with the infinite-pressure limit of long-lived metastable glassy states. We test this assumption against numerical and experimental data and show that the theory correctly reproduces the variation of structural observables, such as the total packing fraction and the partial coordination numbers, with mixture composition.
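For concreteness, a small sketch of the structural observables mentioned above for a multicomponent hard-sphere configuration: the total packing fraction and a partial coordination number (the mean number of contacts a sphere of species a has with species b). This illustrates only the observables, not the theory itself; the cubic-box geometry, the absence of periodic boundaries, and the contact tolerance are assumptions.

```python
import numpy as np

# Illustrative computation of packing observables for a hard-sphere mixture.

def packing_fraction(radii, box_side):
    """Total packing fraction of spheres with given radii in a cubic box."""
    return np.sum(4.0 / 3.0 * np.pi * np.asarray(radii) ** 3) / box_side ** 3

def partial_coordination(positions, radii, species, a, b, tol=1e-8):
    """Average number of b-spheres in contact with each a-sphere."""
    pos, rad, spc = map(np.asarray, (positions, radii, species))
    idx_a, idx_b = np.where(spc == a)[0], np.where(spc == b)[0]
    contacts = 0
    for i in idx_a:
        d = np.linalg.norm(pos[idx_b] - pos[i], axis=1)
        touching = np.abs(d - (rad[i] + rad[idx_b])) < tol   # contact: gap ~ 0
        touching &= idx_b != i                               # exclude the sphere itself
        contacts += np.count_nonzero(touching)
    return contacts / max(len(idx_a), 1)
```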