Results 1 - 9 of 9
1.
Risk Anal; 43(12): 2659-2670, 2023 Dec.
Article in English | MEDLINE | ID: mdl-36810893

ABSTRACT

Planning for community resilience through public infrastructure projects often engenders problems associated with social dilemmas, but little work has been done to understand how individuals respond when presented with opportunities to invest in such developments. Using statistical learning techniques trained on the results of a web-based common pool resource game, we analyze participants' decisions to invest in hypothetical public infrastructure projects that bolster their community's resilience to disasters. Given participants' dispositions and in-game circumstances, Bayesian additive regression tree (BART) models are able to accurately predict deviations from players' decisions that would reasonably lead to Pareto-efficient outcomes for their communities. Participants tend to overcontribute relative to these Pareto-efficient strategies, indicating general risk aversion that is analogous to individuals purchasing disaster insurance even though it exceeds expected actuarial costs. However, higher trait Openness scores reflect an individual's tendency to follow a risk-neutral strategy, and fewer available resources predict lower perceived utilities derived from the infrastructure developments. In addition, several input variables have nonlinear effects on decisions, suggesting that it may be warranted to use more sophisticated statistical learning methods to reexamine results from previous studies that assume linear relationships between individuals' dispositions and responses in applications of game theory or decision theory.
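The modeling step described above can be sketched as follows. BART itself requires a dedicated package, so scikit-learn's gradient-boosted trees stand in here as the tree-ensemble learner; the features (trait Openness, available resources) and the nonlinear synthetic response are assumptions drawn from the abstract, not the study's data.

```python
# Sketch: predicting deviations from Pareto-efficient contributions
# with a tree ensemble (stand-in for BART), fit to synthetic data.
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor

rng = np.random.default_rng(0)
n = 500
openness = rng.normal(0, 1, n)        # trait Openness score (assumed feature)
resources = rng.uniform(10, 100, n)   # in-game resources available (assumed)
# Synthetic response with a nonlinear term, mimicking the reported
# nonlinear effects of input variables on decisions
deviation = (0.5 - 0.3 * openness
             + 0.0002 * (resources - 55) ** 2
             + rng.normal(0, 0.1, n))

X = np.column_stack([openness, resources])
model = GradientBoostingRegressor(random_state=0).fit(X, deviation)
preds = model.predict(X)
print(round(float(model.score(X, deviation)), 2))
```

A tree ensemble is a natural choice here because, like BART, it captures nonlinear effects without assuming a linear disposition-response relationship.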


Subjects
Disaster Planning; Disasters; Resilience, Psychological; Humans; Bayes Theorem; Game Theory; Decision Making
2.
Risk Anal; 40(9): 1795-1810, 2020 Sep.
Article in English | MEDLINE | ID: mdl-32583477

ABSTRACT

The concepts of vulnerability and resilience help explain why natural hazards of similar type and magnitude can have disparate impacts on varying communities. Numerous frameworks have been developed to measure these concepts, but a clear and consistent method of comparing them is lacking. Here, we develop a data-driven approach for reconciling a popular class of frameworks known as vulnerability and resilience indices. In particular, we conduct an exploratory factor analysis on a comprehensive set of variables from established indices measuring community vulnerability and resilience at the U.S. county level. The resulting factor model suggests that 50 of the 130 analyzed variables effectively load onto five dimensions: wealth, poverty, agencies per capita, elderly populations, and non-English-speaking populations. Additionally, the factor structure establishes an objective and intuitive schema for relating the constituent elements of vulnerability and resilience indices, in turn affording researchers a flexible yet robust baseline for validating and expanding upon current approaches.
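The exploratory-factor-analysis step can be illustrated with a minimal sketch: standardize county-level indicator variables and inspect which load onto common latent factors. The variables and two-factor structure below are synthetic stand-ins, not the 130 indicators analyzed in the paper.

```python
# Sketch: exploratory factor analysis on synthetic county-level
# indicators that load onto two latent dimensions.
import numpy as np
from sklearn.decomposition import FactorAnalysis
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(1)
n_counties = 300
wealth = rng.normal(0, 1, n_counties)    # latent "wealth" dimension
elderly = rng.normal(0, 1, n_counties)   # latent "elderly" dimension
# Observed indicators: noisy mixtures of the latent factors
X = np.column_stack([
    wealth + 0.1 * rng.normal(size=n_counties),    # e.g., median income
    wealth + 0.1 * rng.normal(size=n_counties),    # e.g., home value
    elderly + 0.1 * rng.normal(size=n_counties),   # e.g., share over 65
    elderly + 0.1 * rng.normal(size=n_counties),   # e.g., retirees share
])
Xs = StandardScaler().fit_transform(X)
fa = FactorAnalysis(n_components=2, random_state=0).fit(Xs)
loadings = fa.components_.T              # (variables x factors)
print(loadings.shape)
```

Variables that "effectively load" onto a factor are those with large absolute loadings in the corresponding column.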

3.
Risk Anal; 39(9): 1930-1948, 2019 Sep.
Article in English | MEDLINE | ID: mdl-31287575

ABSTRACT

The ability to accurately measure the recovery rate of infrastructure systems and communities impacted by disasters is vital to ensure effective response and resource allocation before, during, and after a disruption. However, a challenge in quantifying such measures resides in the lack of data, as community recovery information is seldom recorded. To provide accurate community recovery measures, a hierarchical Bayesian kernel model (HBKM) is developed to predict the recovery rate of communities experiencing power outages during storms. The performance of the proposed method is evaluated using cross-validation and compared with two models, the hierarchical Bayesian regression model and the Poisson generalized linear model. A case study focusing on the recovery of communities in Shelby County, Tennessee after severe storms between 2007 and 2017 is presented to illustrate the proposed approach. The predictive accuracy of the models is evaluated using the log-likelihood and root mean squared error. The HBKM yields on average the highest out-of-sample predictive accuracy. This approach can help assess the recoverability of a community when data are scarce and inform decision making in the aftermath of a disaster. An illustrative example is presented demonstrating how accurate measures of community resilience can help reduce the cost of infrastructure restoration.
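The HBKM itself calls for custom Bayesian inference, but the benchmark comparison can be sketched with the Poisson GLM the paper evaluates against, scored by cross-validated RMSE. The data below are synthetic outage-recovery counts, and the feature names are assumptions.

```python
# Sketch: cross-validated Poisson GLM baseline for recovery-rate
# prediction on synthetic storm-outage data.
import numpy as np
from sklearn.linear_model import PoissonRegressor
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(2)
n = 400
wind_speed = rng.uniform(20, 90, n)      # storm intensity proxy (assumed)
crews = rng.integers(1, 10, n)           # restoration crews deployed (assumed)
lam = np.exp(0.02 * wind_speed - 0.1 * crews + 1.0)
customers_restored = rng.poisson(lam)    # recovery counts per period

X = np.column_stack([wind_speed, crews])
glm = PoissonRegressor(alpha=1e-3, max_iter=1000)
rmse = -cross_val_score(glm, X, customers_restored, cv=5,
                        scoring="neg_root_mean_squared_error")
print(rmse.mean() > 0)
```

The same cross-validation loop would score the hierarchical Bayesian competitors, enabling the out-of-sample comparison the abstract describes.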

4.
Risk Anal; 39(11): 2479-2498, 2019 Nov.
Article in English | MEDLINE | ID: mdl-31290175

ABSTRACT

Communities are complex systems subject to a variety of hazards that can result in significant disruption to critical functions. Community resilience assessment is rapidly gaining popularity as a means to help communities better prepare for, respond to, and recover from disruption. Sustainable resilience, a recently developed concept, requires communities to assess system-wide capability to maintain desired performance levels while simultaneously evaluating impacts to resilience due to changes in hazards and vulnerability over extended periods of time. To enable assessment of community sustainable resilience, we review current literature, consolidate available indicators and metrics, and develop a classification scheme and organizational structure to aid in identification, selection, and application of indicators within a dynamic assessment framework. A nonduplicative set of community sustainable resilience indicators and metrics is provided that can be tailored to a community's needs, thereby enhancing the ability to operationalize the assessment process.

5.
Risk Anal; 39(9): 1913-1929, 2019 Sep.
Article in English | MEDLINE | ID: mdl-31173664

ABSTRACT

Managing risk in infrastructure systems implies dealing with interdependent physical networks and their relationships with the natural and societal contexts. Computational tools are often used to support operational decisions aimed at improving resilience, whereas economics-related tools tend to be used to address broader societal and policy issues in infrastructure management. We propose an optimization-based framework for infrastructure resilience analysis that incorporates organizational and socioeconomic aspects into operational problems, making it possible to understand relationships between decisions at the policy level (e.g., regulation) and the technical level (e.g., optimal infrastructure restoration). We focus on three issues that arise when integrating such levels. First, optimal restoration strategies driven by financial and operational factors evolve differently compared to those driven by socioeconomic and humanitarian factors. Second, regulatory aspects have a significant impact on recovery dynamics (e.g., effective recovery is most challenging in societies with weak institutions and regulation, where individual interests may compromise societal well-being). And third, the decision space (i.e., available actions) in postdisaster phases is strongly determined by predisaster decisions (e.g., resource allocation). The proposed optimization framework addresses these issues by using: (1) parametric analyses to test the influence of operational and socioeconomic factors on optimization outcomes, (2) regulatory constraints to model and assess the cost and benefit (for a variety of actors) of enforcing specific policy-related conditions for the recovery process, and (3) sensitivity analyses to capture the effect of predisaster decisions on recovery. We illustrate our methodology with an example regarding the recovery of interdependent water, power, and gas networks in Shelby County, TN (USA), with exposure to natural hazards.
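The interplay between a regulatory constraint and optimal restoration can be shown with a toy linear program: allocate a repair budget across water, power, and gas networks to maximize restored service, while a regulation forces a minimum water-service investment. All coefficients are illustrative assumptions, not the paper's model.

```python
# Toy sketch: budget allocation across interdependent networks with a
# regulatory floor on water restoration, solved as a linear program.
from scipy.optimize import linprog

# Service restored per unit of budget (water, power, gas) - assumed
gains = [0.8, 1.0, 0.6]
budget = 100.0

# Maximize gains @ x  ->  minimize -gains @ x
res = linprog(
    c=[-g for g in gains],
    A_ub=[[1, 1, 1]], b_ub=[budget],            # total budget constraint
    bounds=[(20, None), (0, None), (0, None)],  # regulation: water >= 20
)
print([round(v, 1) for v in res.x])
```

Without the regulatory bound, the entire budget would flow to the highest-gain network (power); the constraint diverts part of it to water, illustrating how a policy-level rule reshapes the technical-level optimum.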

6.
Risk Anal; 35(4): 642-662, 2015 Apr.
Article in English | MEDLINE | ID: mdl-24924523

ABSTRACT

Recent studies in system resilience have proposed metrics to understand the ability of systems to recover from a disruptive event, often offering a qualitative treatment of resilience. This work provides a quantitative treatment of resilience and focuses specifically on measuring resilience in infrastructure networks. Inherent cost metrics are introduced: loss of service cost and total network restoration cost. Further, "costs" of network resilience are often shared across multiple infrastructures and industries that rely upon those networks, particularly when such networks become inoperable in the face of disruptive events. As such, this work integrates the quantitative resilience approach with a model describing the regional, multi-industry impacts of a disruptive event to measure the interdependent impacts of network resilience. The approaches discussed in this article are deployed in a case study of an inland waterway transportation network, the Mississippi River Navigation System.
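A loss-of-service cost of the kind introduced above can be sketched as the area between full performance and the observed recovery curve, priced at an assumed cost per unit of unmet service per day. The performance curve and cost rate below are illustrative, not the article's case-study data.

```python
# Sketch: loss-of-service cost as the priced shortfall area under a
# network performance curve following a disruption.
import numpy as np

t = np.arange(0, 11)                 # days since disruption
performance = np.array([1.0, 0.3, 0.35, 0.45, 0.55, 0.65,
                        0.75, 0.85, 0.92, 0.97, 1.0])
cost_per_unit_day = 1_000_000        # assumed $ per unit shortfall-day

shortfall = 1.0 - performance
# Trapezoidal integration of the shortfall over time (dt = 1 day)
area = np.sum((shortfall[:-1] + shortfall[1:]) / 2)
loss_of_service_cost = area * cost_per_unit_day
print(round(float(loss_of_service_cost)))
```

Total network restoration cost would be added on top of this figure, and a regional multi-industry model would then propagate both across dependent sectors.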

7.
Risk Anal; 34(7): 1317-1335, 2014 Jul.
Article in English | MEDLINE | ID: mdl-24576121

ABSTRACT

Given the ubiquitous nature of infrastructure networks in today's society, there is a global need to understand, quantify, and plan for the resilience of these networks to disruptions. This work defines network resilience along dimensions of reliability, vulnerability, survivability, and recoverability, and quantifies network resilience as a function of component and network performance. The treatment of vulnerability and recoverability as random variables leads to stochastic measures of resilience, including time to total system restoration, time to full system service resilience, and time to a specific α% resilience. Ultimately, a means to optimize network resilience strategies is discussed, primarily through an adaptation of the Copeland Score for nonparametric stochastic ranking. The measures of resilience and optimization techniques are applied to inland waterway networks, an important mode in the larger multimodal transportation network upon which we rely for the flow of commodities. We provide a case study analyzing and planning for the resilience of commodity flows along the Mississippi River Navigation System to illustrate the usefulness of the proposed metrics.
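The Copeland-style ranking step can be sketched directly: given pairwise comparisons of candidate resilience strategies (here, a synthetic win matrix standing in for stochastic simulation results), each strategy's Copeland score is its pairwise wins minus its losses. The strategy names are hypothetical.

```python
# Sketch: Copeland score ranking of resilience strategies from a
# synthetic pairwise-comparison matrix.
import numpy as np

strategies = ["harden_locks", "add_dredging", "stockpile_barges"]
# wins[i][j] = 1 if strategy i beat strategy j in a majority of
# simulated disruption scenarios (synthetic example)
wins = np.array([
    [0, 1, 1],
    [0, 0, 1],
    [0, 0, 0],
])
copeland = wins.sum(axis=1) - wins.sum(axis=0)  # wins minus losses
ranking = [strategies[i] for i in np.argsort(-copeland)]
print(ranking)
```

Because the score depends only on pairwise win/loss outcomes, it ranks strategies without assuming any parametric form for the underlying stochastic resilience measures.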

8.
PLoS One; 17(8): e0271230, 2022.
Article in English | MEDLINE | ID: mdl-35921327

ABSTRACT

A spatially-resolved understanding of the intensity of a flood hazard is required for accurate predictions of infrastructure reliability and losses in the aftermath. Currently, researchers who wish to predict flood losses or infrastructure reliability following a flood usually rely on computationally intensive hydrodynamic modeling or on flood hazard maps (e.g., the 100-year floodplain) to build a spatially-resolved understanding of the flood's intensity. However, both have specific limitations. The former requires both subject matter expertise to create the models and significant computation time, while the latter is a static metric that provides no variation among specific events. The objective of this work is to develop an integrated data-driven approach to rapidly predict flood damages using two emerging flood intensity heuristics, namely the Flood Peak Ratio (FPR) and NASA's Giovanni Flooded Fraction (GFF). This study uses data on flood claims from the National Flood Insurance Program (NFIP) to proxy flood damage, along with other well-established flood exposure variables, such as regional slope and population. The approach uses statistical learning methods to generate predictive models at two spatial levels: nationwide and statewide for the entire contiguous United States. A variable importance analysis demonstrates the significance of FPR and GFF data in predicting flood damage. In addition, the model performance at the state level was higher than in the nationwide analysis, indicating the effectiveness of both FPR and GFF models at the regional level. A data-driven approach to predicting flood damage using the FPR and GFF data offers promise given the heuristics' relative simplicity, their reliance on publicly accessible data, and their comparatively fast computational speed.
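The statistical-learning step can be sketched with a random forest predicting flood-claim totals from the two heuristics plus exposure variables, followed by the variable-importance readout. The data are simulated, not NFIP records, and the linear data-generating formula is an assumption for illustration only.

```python
# Sketch: random-forest flood-damage model with variable importance,
# fit to synthetic data mimicking FPR/GFF-style predictors.
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(3)
n = 600
fpr = rng.uniform(0.5, 3.0, n)       # Flood Peak Ratio (synthetic)
gff = rng.uniform(0.0, 0.6, n)       # Giovanni Flooded Fraction (synthetic)
slope = rng.uniform(0.0, 10.0, n)    # regional slope (synthetic)
population = rng.uniform(1e3, 1e6, n)
claims = (5e4 * fpr + 2e5 * gff - 1e3 * slope
          + 0.05 * population + rng.normal(0, 1e4, n))

X = np.column_stack([fpr, gff, slope, population])
rf = RandomForestRegressor(n_estimators=200, random_state=0).fit(X, claims)
importance = dict(zip(["FPR", "GFF", "slope", "population"],
                      rf.feature_importances_))
print(max(importance, key=importance.get))
```

Fitting the same model separately per state versus once nationwide would reproduce the two spatial levels of analysis the abstract compares.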


Subjects
Floods; Reproducibility of Results; United States
9.
Accid Anal Prev; 165: 106501, 2022 Feb.
Article in English | MEDLINE | ID: mdl-34929574

ABSTRACT

In the last fifty years, researchers have developed statistical, data-driven, analytical, and algorithmic approaches for designing and improving emergency response management (ERM) systems. The problem has been noted as inherently difficult and constitutes spatio-temporal decision making under uncertainty, which has been addressed in the literature with varying assumptions and approaches. This survey provides a detailed review of these approaches, focusing on the key challenges and issues regarding four sub-processes: (a) incident prediction, (b) incident detection, (c) resource allocation, and (d) computer-aided dispatch for emergency response. We highlight the strengths and weaknesses of prior work in this domain and explore the similarities and differences between different modeling paradigms. We conclude by illustrating open challenges and opportunities for future research in this complex domain.


Subjects
Accidents, Traffic; Resource Allocation; Humans; Uncertainty