Results 1 - 13 of 13
1.
Phys Rev E ; 109(1-1): 014110, 2024 Jan.
Article in English | MEDLINE | ID: mdl-38366404

ABSTRACT

Birth and death Markov processes can model stochastic physical systems from percolation to disease spread and, in particular, wildfires. We introduce and analyze a birth-death-suppression Markov process as a model of controlled culling of an abstract, dynamic population. Using analytic techniques, we characterize the probabilities and timescales of outcomes like absorption at zero (extinguishment) and the probability of the cumulative population (burned area) reaching a given size. The latter requires control over the embedded Markov chain: this discrete process is solved using the Pollaczek orthogonal polynomials, a deformation of the Gegenbauer/ultraspherical polynomials. This allows analysis of processes with bounded cumulative population, corresponding to finite burnable substrate in the wildfire interpretation, with probabilities represented as spectral integrals. This technology is developed to lay the foundations for a dynamic decision support framework. We devise real-time risk metrics and suggest future directions for determining optimal suppression strategies, including multievent resource allocation problems and potential applications for reinforcement learning.
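The process is specified in the abstract only at this level of generality. As a rough sketch (assuming linear rates, birth λn, death μn, suppression κn, with all parameter names hypothetical), the embedded jump chain has n-independent step probabilities and can be simulated directly:

```python
import random

def simulate_bds(n0, birth, death, supp, n_cap, rng):
    """One realization of the embedded jump chain of a linear birth-death
    process with an extra per-individual suppression rate.  Because all
    rates are linear in n, the jump probabilities do not depend on n.
    Runs until absorption at zero (extinguishment) or until the population
    reaches n_cap (a cap standing in for finite burnable substrate).
    Returns the final population and the total number of individuals
    ever born (the "burned area")."""
    n, cumulative = n0, n0
    p_birth = birth / (birth + death + supp)
    while 0 < n < n_cap:
        if rng.random() < p_birth:
            n += 1
            cumulative += 1
        else:
            n -= 1
    return n, cumulative

def extinguish_probability(n0, birth, death, supp, n_cap, trials, seed=0):
    """Monte Carlo estimate of the probability of absorption at zero."""
    rng = random.Random(seed)
    hits = sum(simulate_bds(n0, birth, death, supp, n_cap, rng)[0] == 0
               for _ in range(trials))
    return hits / trials
```

For a single initial individual the estimate should approach the branching-process value min(1, (μ+κ)/λ) as the cap grows, which gives a quick sanity check on the simulation.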

2.
Nat Commun ; 14(1): 186, 2023 01 17.
Article in English | MEDLINE | ID: mdl-36650144

ABSTRACT

Dynamic processes on networks, be it information transfer in the Internet, contagious spreading in a social network, or neural signaling, take place along shortest or nearly shortest paths. Computing shortest paths is a straightforward task when the network of interest is fully known, and there is a plethora of computational algorithms for this purpose. Unfortunately, our maps of most large networks are substantially incomplete, due to the highly dynamic nature of networks, the high cost of network measurements, or both, rendering traditional path-finding methods inefficient. We find that shortest paths in large real networks, such as the network of protein-protein interactions and the Internet at the autonomous system level, are not random but are organized according to latent-geometric rules. If nodes of these networks are mapped to points in latent hyperbolic spaces, shortest paths align along the geodesic curves connecting their endpoint nodes. We find that this alignment is sufficiently strong to allow for the identification of shortest-path nodes even in substantially incomplete networks, where the number of missing links exceeds the number of observable links. We demonstrate the utility of latent-geometric path finding in problems of cellular pathway reconstruction and communication security.
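The latent-geometric idea can be sketched as greedy forwarding toward the destination in an assumed hyperbolic embedding (a minimal illustration, not the authors' method; node coordinates are taken as given here, whereas in practice they come from an embedding step the abstract does not detail):

```python
import math

def hyperbolic_distance(p, q):
    """Distance between points p = (r, theta) in the native representation
    of the hyperbolic plane (curvature -1)."""
    r1, t1 = p
    r2, t2 = q
    dtheta = math.pi - abs(math.pi - abs(t1 - t2))  # angular gap in [0, pi]
    if dtheta == 0.0:
        return abs(r1 - r2)
    arg = (math.cosh(r1) * math.cosh(r2)
           - math.sinh(r1) * math.sinh(r2) * math.cos(dtheta))
    return math.acosh(max(1.0, arg))  # guard against rounding below 1

def greedy_path(adj, coords, src, dst):
    """Greedy geodesic routing: repeatedly forward to the neighbour
    closest (in hyperbolic distance) to the destination.  Returns the
    node sequence, or None if the walk gets stuck in a local minimum."""
    path = [src]
    current = src
    while current != dst:
        nxt = min(adj[current],
                  key=lambda v: hyperbolic_distance(coords[v], coords[dst]))
        if nxt in path:  # revisiting a node: greedy forwarding has failed
            return None
        path.append(nxt)
        current = nxt
    return path
```

Greedy forwarding succeeds exactly when no intermediate node is a local minimum of hyperbolic distance to the destination; returning None flags such failures rather than looping forever.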


Subject(s)
Algorithms, Signal Transduction, Communication, Cell Communication
3.
Risk Anal ; 43(8): 1694-1707, 2023 Aug.
Article in English | MEDLINE | ID: mdl-36229425

ABSTRACT

The Mission Dependency Index (MDI) is a risk metric used by US military services and federal agencies for guiding operations, management, and funding decisions for facilities. Despite its broad adoption for guiding the expenditure of billions in federal funds, several studies on MDI suggest it may have flaws that limit its efficacy. We present a detailed technical analysis of MDI to show how its flaws impact infrastructure decisions. We present the MDI used by the US Navy and develop a critique of current methods. We identify six problems with MDI that stem from its interpretation, use, and mathematical formulation, and we provide examples demonstrating how these flaws can bias decisions. We provide recommendations to overcome flaws for infrastructure risk decision making but ultimately recommend the US government develop a new metric less susceptible to bias.

4.
Risk Anal ; 39(9): 1870-1884, 2019 09.
Article in English | MEDLINE | ID: mdl-31100198

ABSTRACT

The concept of "resilience analytics" has recently been proposed as a means to leverage the promise of big data to improve the resilience of interdependent critical infrastructure systems and the communities supported by them. Given recent advances in machine learning and other data-driven analytic techniques, as well as the prevalence of high-profile natural and man-made disasters, the temptation to pursue resilience analytics without question is almost overwhelming. Indeed, we find big data analytics capable of supporting resilience to rare, situational surprises captured in analytic models. Nonetheless, this article examines the efficacy of resilience analytics by answering a single motivating question: Can big data analytics help cyber-physical-social (CPS) systems adapt to surprise? This article explains the limitations of resilience analytics when critical infrastructure systems are challenged by fundamental surprises never conceived during model development. In these cases, adoption of resilience analytics may prove either useless for decision support or harmful by increasing dangers during unprecedented events. We demonstrate that these dangers are not limited to a single CPS context by highlighting the limits of analytic models during hurricanes, dam failures, blackouts, and stock market crashes. We conclude that resilience analytics alone are not able to adapt to the very events that motivate their use and may, ironically, make CPS systems more vulnerable. We present avenues for future research to address this deficiency, with emphasis on improvisation to adapt CPS systems to fundamental surprise.

5.
Phys Rev E ; 97(1-1): 012309, 2018 Jan.
Article in English | MEDLINE | ID: mdl-29448477

ABSTRACT

We analyze the stability of a network's giant connected component under the impact of adverse events, which we model through link percolation. Specifically, we quantify the extent to which the largest connected component of a network consists of the same nodes, regardless of the specific set of deactivated links. Our results are intuitive in the case of single-layered systems: the presence of large-degree nodes in a single-layered network ensures both its robustness and stability. In contrast, we find that interdependent networks that are robust to adverse events have unstable connected components. Our results bring novel insights to the design of resilient network topologies and the reinforcement of existing networked systems.
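The stability notion can be made concrete with a small sketch (an illustrative paraphrase, not the paper's exact estimator): remove each link independently, record the node set of the largest surviving component, and measure the mean Jaccard overlap between repetitions.

```python
import random

def largest_component(n, edges):
    """Union-find with path halving; returns the node set of the
    largest connected component of an n-node graph."""
    parent = list(range(n))
    def find(x):
        while parent[x] != x:
            parent[x] = parent[parent[x]]
            x = parent[x]
        return x
    for u, v in edges:
        ru, rv = find(u), find(v)
        if ru != rv:
            parent[ru] = rv
    comps = {}
    for node in range(n):
        comps.setdefault(find(node), set()).add(node)
    return max(comps.values(), key=len)

def component_stability(n, edges, keep_prob, trials, seed=0):
    """Mean pairwise Jaccard similarity between the largest components
    obtained under independent random link removals."""
    rng = random.Random(seed)
    samples = []
    for _ in range(trials):
        kept = [e for e in edges if rng.random() < keep_prob]
        samples.append(largest_component(n, kept))
    sims = []
    for i in range(len(samples)):
        for j in range(i + 1, len(samples)):
            a, b = samples[i], samples[j]
            sims.append(len(a & b) / len(a | b))
    return sum(sims) / len(sims)
```

A stability of 1 means every percolation realization leaves the same largest component; values well below 1 indicate that which nodes survive in the giant component depends strongly on which particular links fail.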

7.
Risk Anal ; 35(4): 562-86, 2015 Apr.
Article in English | MEDLINE | ID: mdl-25808298

ABSTRACT

We propose a definition of infrastructure resilience that is tied to the operation (or function) of an infrastructure as a system of interacting components and that can be objectively evaluated using quantitative models. Specifically, for any particular system, we use quantitative models of system operation to represent the decisions of an infrastructure operator who guides the behavior of the system as a whole, even in the presence of disruptions. Modeling infrastructure operation in this way makes it possible to systematically evaluate the consequences associated with the loss of infrastructure components, and leads to a precise notion of "operational resilience" that facilitates model verification, validation, and reproducible results. Using a simple example of a notional infrastructure, we demonstrate how to use these models for (1) assessing the operational resilience of an infrastructure system, (2) identifying critical vulnerabilities that threaten its continued function, and (3) advising policymakers on investments to improve resilience.

8.
PLoS One ; 10(2): e0115826, 2015.
Article in English | MEDLINE | ID: mdl-25688857

ABSTRACT

Developing robust, quantitative methods to optimize resource allocations in response to epidemics has the potential to save lives and minimize health care costs. In this paper, we develop and apply a computationally efficient algorithm that enables us to calculate the complete probability distribution for the final epidemic size in a stochastic Susceptible-Infected-Recovered (SIR) model. Based on these results, we determine the optimal allocations of a limited quantity of vaccine between two non-interacting populations. We compare the stochastic solution to results obtained for the traditional, deterministic SIR model. For intermediate quantities of vaccine, the deterministic model is a poor estimate of the optimal strategy for the more realistic, stochastic case.
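The paper's own algorithm is not reproduced in the abstract; one standard way to obtain the exact final-size distribution, sketched here, is dynamic programming over the embedded jump chain of the stochastic SIR process, where from state (S, I) the next event is an infection with probability βS/(βS + γ) and a recovery otherwise:

```python
from fractions import Fraction

def final_size_distribution(s0, i0, beta, gamma):
    """Exact distribution of the total number ever infected in a
    stochastic SIR model with infection rate beta*S*I and recovery rate
    gamma*I, computed over the embedded (jump-chain) Markov process.
    Returns {final_size: probability} using exact rational arithmetic."""
    beta, gamma = Fraction(beta), Fraction(gamma)
    prob = {(s0, i0): Fraction(1)}
    dist = {}
    for s in range(s0, -1, -1):           # susceptibles only decrease
        for i in range(s0 - s + i0, 0, -1):
            p = prob.pop((s, i), Fraction(0))
            if p == 0:
                continue
            p_inf = beta * s / (beta * s + gamma) if s > 0 else Fraction(0)
            if p_inf:                     # infection: (s, i) -> (s-1, i+1)
                key = (s - 1, i + 1)
                prob[key] = prob.get(key, Fraction(0)) + p * p_inf
            key = (s, i - 1)              # recovery: (s, i) -> (s, i-1)
            prob[key] = prob.get(key, Fraction(0)) + p * (1 - p_inf)
        # i == 0 is absorbing; final size = initial infected + infections
        q = prob.pop((s, 0), Fraction(0))
        if q:
            dist[i0 + (s0 - s)] = q
    return dist
```

For S(0) = I(0) = 1 and β = γ the epidemic ends at size 1 or 2 with probability 1/2 each, which the recursion reproduces exactly.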


Subject(s)
Communicable Disease Control, Epidemics/prevention & control, Models, Theoretical, Vaccination, Algorithms, Humans
9.
PLoS One ; 9(2): e87380, 2014.
Article in English | MEDLINE | ID: mdl-24520331

ABSTRACT

Identifying and quantifying factors influencing human decision making remains an outstanding challenge, impacting the performance and predictability of social and technological systems. In many cases, system failures are traced to human factors including congestion, overload, miscommunication, and delays. Here we report results of a behavioral network science experiment targeting decision making in a natural disaster. In a controlled laboratory setting, our results quantify several key factors influencing individual evacuation decision making. The experiment includes tensions between broadcast and peer-to-peer information, and contrasts the effects of temporal urgency associated with the imminence of the disaster and the effects of limited shelter capacity for evacuees. Based on empirical measurements of the cumulative rate of evacuations as a function of the instantaneous disaster likelihood, we develop a quantitative model for decision making that captures remarkably well the main features of observed collective behavior across many different scenarios. Moreover, this model captures the sensitivity of individual- and population-level decision behaviors to external pressures, and systematic deviations from the model provide meaningful estimates of variability in the collective response. Identification of robust methods for quantifying human decisions in the face of risk has implications for policy in disasters and other threat scenarios, specifically the development and testing of robust strategies for training and control of evacuations that account for human behavior and network topologies.


Subject(s)
Cooperative Behavior, Decision Support Techniques, Disaster Planning, Attitude, Disasters, Humans, Risk, Social Networking, Time Factors
10.
Phys Rev E Stat Nonlin Soft Matter Phys ; 86(3 Pt 2): 036105, 2012 Sep.
Article in English | MEDLINE | ID: mdl-23030978

ABSTRACT

We develop a sequence of models describing information transmission and decision dynamics for a network of individual agents subject to multiple sources of influence. Our general framework is set in the context of an impending natural disaster, where individuals, represented by nodes on the network, must decide whether or not to evacuate. Sources of influence include a one-to-many externally driven global broadcast as well as pairwise interactions, across links in the network, in which agents transmit either continuous opinions or binary actions. We consider both uniform and variable threshold rules on the individual opinion as baseline models for decision making. Our results indicate that (1) social networks lead to clustering and cohesive action among individuals, (2) binary information introduces high temporal variability and stagnation, and (3) information transmission over the network can either facilitate or hinder action adoption, depending on the influence of the global broadcast relative to the social network. Our framework highlights the essential role of local interactions between agents in predicting collective behavior of the population as a whole.
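A minimal sketch of this class of model (update rule, parameter names, and threshold mechanics here are illustrative assumptions, not the paper's specification): each agent's opinion is a convex combination of the global broadcast signal and the mean opinion of its network neighbours, and action adoption is an irreversible threshold crossing.

```python
def simulate_thresholds(adj, thresholds, broadcast, alpha, steps):
    """Agents hold a continuous opinion in [0, 1], updated each step as
    alpha * (broadcast signal) + (1 - alpha) * (mean neighbour opinion).
    An agent adopts the action irreversibly once its opinion reaches its
    individual threshold.  Returns the list of adoption flags."""
    n = len(adj)
    opinion = [0.0] * n
    acted = [False] * n
    for t in range(steps):
        # hold the last broadcast value if the signal sequence runs out
        signal = broadcast[min(t, len(broadcast) - 1)]
        new = []
        for i in range(n):
            nbrs = adj[i]
            peer = (sum(opinion[j] for j in nbrs) / len(nbrs)
                    if nbrs else opinion[i])
            new.append(alpha * signal + (1 - alpha) * peer)
        opinion = new
        for i in range(n):
            if opinion[i] >= thresholds[i]:
                acted[i] = True
    return acted
```

The parameter alpha plays the role of the broadcast's influence relative to the social network: at alpha = 1 the population responds to the global signal alone, while at small alpha adoption must propagate through peer interactions.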


Subject(s)
Cooperative Behavior, Decision Making, Decision Support Techniques, Game Theory, Models, Theoretical, Computer Simulation
11.
PLoS One ; 7(4): e33285, 2012.
Article in English | MEDLINE | ID: mdl-22514605

ABSTRACT

Challenges associated with the allocation of limited resources to mitigate the impact of natural disasters inspire fundamentally new theoretical questions for dynamic decision making in coupled human and natural systems. Wildfires are one of several types of disaster phenomena, including oil spills and disease epidemics, where (1) the disaster evolves on the same timescale as the response effort, and (2) delays in response can lead to increased disaster severity and thus greater demand for resources. We introduce a minimal stochastic process to represent wildfire progression that nonetheless accurately captures the heavy tailed statistical distribution of fire sizes observed in nature. We then couple this model for fire spread to a series of response models that isolate fundamental tradeoffs both in the strength and timing of response and also in division of limited resources across multiple competing suppression efforts. Using this framework, we compute optimal strategies for decision making scenarios that arise in fire response policy.


Subject(s)
Fires, Models, Theoretical, Resource Allocation, Decision Making, Disaster Planning
12.
Phys Rev E Stat Nonlin Soft Matter Phys ; 75(4 Pt 2): 046102, 2007 Apr.
Article in English | MEDLINE | ID: mdl-17500956

ABSTRACT

A popular approach for describing the structure of many complex networks focuses on graph theoretic properties that characterize their large-scale connectivity. While it is generally recognized that such descriptions based on aggregate statistics do not uniquely characterize a particular graph and also that many such statistical features are interdependent, the relationship between competing descriptions is not entirely understood. This paper lends perspective on this problem by showing how the degree sequence and other constraints (e.g., connectedness, no self-loops or parallel edges) on a particular graph play a primary role in dictating many features, including its correlation structure. Building on recent work, we show how a simple structural metric characterizes key differences between graphs having the same degree sequence. More broadly, we show how the (often implicit) choice of a background set against which to measure graph features has serious implications for the interpretation and comparability of graph theoretic descriptions.
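The abstract does not name the metric; assuming it is the s-metric (the sum of degree products over edges, a standard choice for separating graphs with the same degree sequence), a sketch:

```python
def s_metric(edges):
    """s(g) = sum over edges (u, v) of d_u * d_v, where d_u is the degree
    of node u.  Graphs with identical degree sequences but different
    wiring generally have different s values; a high s indicates that
    high-degree nodes tend to link to other high-degree nodes."""
    degree = {}
    for u, v in edges:
        degree[u] = degree.get(u, 0) + 1
        degree[v] = degree.get(v, 0) + 1
    return sum(degree[u] * degree[v] for u, v in edges)
```

A star and a chain on four nodes give s = 9 and s = 8 respectively: concentrating links on a hub raises the degree products.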

13.
Proc Natl Acad Sci U S A ; 102(41): 14497-502, 2005 Oct 11.
Article in English | MEDLINE | ID: mdl-16204384

ABSTRACT

The search for unifying properties of complex networks is popular, challenging, and important. For modeling approaches that focus on robustness and fragility as unifying concepts, the Internet is an especially attractive case study, mainly because its applications are ubiquitous and pervasive, and widely available exposition exists at every level of detail. Nevertheless, alternative approaches to modeling the Internet often make extremely different assumptions and derive opposite conclusions about fundamental properties of one and the same system. Fortunately, a detailed understanding of Internet technology combined with a unique ability to measure the network means that these differences can be understood thoroughly and resolved unambiguously. This article aims to make recent results of this process accessible beyond Internet specialists to the broader scientific community and to clarify several sources of basic methodological differences that are relevant beyond either the Internet or the two specific approaches focused on here (i.e., scale-free networks and highly optimized tolerance networks).
