1.
Front Bioeng Biotechnol ; 11: 1250298, 2023.
Article in English | MEDLINE | ID: mdl-37711457

ABSTRACT

In the last 20 years, the field of biotechnology has made significant progress and attracted substantial investments, leading to different paths of technological modernization among nations. As a result, there is now an international divide in the commercial and intellectual capabilities of biotechnology, and the implications of this divergence are not well understood. This raises important questions about why global actors are motivated to participate in biotechnology modernization, the challenges they face in achieving their goals, and the possible future direction of global biotechnology development. Using the framework of prospect theory, this paper explores the role of risk culture as a fundamental factor contributing to this divergence. It aims to assess the risks and benefits associated with the early adoption of biotechnology and the regulatory frameworks that shape the development and acceptance of biotechnological innovations. By doing so, it provides valuable insights into the future of biotechnology development and its potential impact on the global landscape.

2.
Ann Oper Res ; : 1-38, 2022 Mar 06.
Article in English | MEDLINE | ID: mdl-35283547

ABSTRACT

Mitigating the impacts of COVID-19 comes with the evaluation of tradeoffs. However, the exact magnitude of the tradeoffs being made cannot be known ahead of time. There are three major concerns to balance: life, liberty, and economy. Here, we create a multi-attribute utility function including those three attributes and provide reasonable bounds on the weights of each. No one set of weights on the utility function can be considered "correct." Furthermore, the outcomes of each mitigation strategy are deeply uncertain. Not only must we account for the characteristics of the disease, but also for the efficacy of the mitigation strategies and how each outcome would be evaluated. To handle this, we use Robust Decision Making methods to simulate plausible outcomes for various strategies and evaluate those outcomes using different weights on the multi-attribute utility function. The simulation is done with a compartmental epidemiological model combined with a simple economic model and a model of liberty costs. Rather than trying to optimize likely outcomes for a particular version of the utility function, we find which strategies are robust across a wide range of plausible scenarios even when there is disagreement about how to weigh the competing values of life, liberty, and economy.
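
The weighting and robustness logic described above can be illustrated with a minimal sketch, assuming a three-attribute utility, hypothetical strategies, and synthetic scenario draws; none of these numbers come from the paper's epidemiological or economic models.

    import itertools
    import random

    # Illustrative three-attribute utility over life, liberty, and economy.
    # Each attribute is assumed scaled to [0, 1], where 1 is the best outcome.
    def utility(outcome, weights):
        return sum(weights[k] * outcome[k] for k in ("life", "liberty", "economy"))

    # Hypothetical mitigation strategies mapping a scenario severity factor s to outcomes.
    strategies = {
        "strict_distancing": lambda s: {"life": 0.9 * s, "liberty": 0.3, "economy": 0.4},
        "targeted_measures": lambda s: {"life": 0.7 * s, "liberty": 0.7, "economy": 0.7},
        "minimal_response":  lambda s: {"life": 0.4 * s, "liberty": 1.0, "economy": 0.8},
    }

    # Robust Decision Making flavour: sample plausible scenarios and weightings,
    # then favour the strategy whose worst-case regret (gap to the best achievable
    # utility) stays smallest across the whole ensemble.
    random.seed(0)
    scenarios = [random.uniform(0.5, 1.0) for _ in range(100)]
    weight_sets = []
    for _ in range(50):
        w = [random.random() for _ in range(3)]
        total = sum(w)
        weight_sets.append({"life": w[0] / total, "liberty": w[1] / total, "economy": w[2] / total})

    max_regret = {name: 0.0 for name in strategies}
    for s, w in itertools.product(scenarios, weight_sets):
        utils = {name: utility(f(s), w) for name, f in strategies.items()}
        best = max(utils.values())
        for name, u in utils.items():
            max_regret[name] = max(max_regret[name], best - u)

    print("Most robust strategy:", min(max_regret, key=max_regret.get))
    print(max_regret)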

5.
Risk Anal ; 41(1): 3-15, 2021 01.
Article in English | MEDLINE | ID: mdl-32818299

ABSTRACT

Desirable system performance in the face of threats has been characterized by various management concepts. Through semistructured interviews with editors of journals in the fields of emergency response and systems management, a literature review, and professional judgment, we identified nine related and often interchangeably used system performance concepts: adaptability, agility, reliability, resilience, resistance, robustness, safety, security, and sustainability. A better understanding of these concepts will allow system planners to pursue management strategies best suited to their unique system dynamics and specific objectives of good performance. We analyze expert responses and review the linguistic definitions and mathematical framing of these concepts to understand their applications. We find a lack of consensus on their usage among interview subjects, but by using the mathematical framing to enrich the linguistic definitions, we formulate comparative visualizations and propose distinct definitions for the nine concepts. We present a conceptual framing to relate the concepts for management purposes.

6.
Risk Anal ; 41(9): 1513-1521, 2021 09.
Article in English | MEDLINE | ID: mdl-33174246

ABSTRACT

Recent guidelines for risk-informed decision making (RIDM) provide a gold standard for how to incorporate probabilistic risk models, in conjunction with other considerations, in a decision process. Nevertheless, risk quantification using probabilistic and statistical methods is difficult in situations where threat, vulnerability, and consequences are highly uncertain. In such situations a wider variety of methods could be employed, which we call decision making informed by risk (DMIR), combining risk and decision analytics; RIDM can be considered a special case of DMIR. Multicriteria decision analysis (MCDA) often serves as a basis for DMIR because it can flexibly accommodate different levels of analytical detail. DMIR often involves artful use of proxy variables that correlate with, and are more measurable than, underlying factors of interest. This article introduces the notion of DMIR and discusses the use of MCDA in its application to risk-based problems. MCDA-based risk analyses identify metrics associated with threats of concern and system vulnerabilities, characterize the way in which alternative actions can affect these threats and vulnerabilities, and ultimately synthesize this information to compare, prioritize, or select alternative mitigation strategies. Simple linear additive MCDA models often integrate these inputs, but the same simplicity can limit such approaches and create pitfalls, and more advanced models, including multiplicative relationships, can be warranted. This essay qualitatively explores the critical practitioner questions of how and when the use of linear multicriteria models creates significant problems, and how to avoid them.
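
A minimal sketch of the aggregation issue discussed above, with hypothetical criteria, weights, and scores: a linear additive score lets strong performance elsewhere fully compensate for a near-failing criterion, whereas a multiplicative (weighted geometric) aggregation does not.

    # Hypothetical criterion scores in [0, 1] for two mitigation alternatives.
    # Alternative B scores well on most criteria but nearly fails on one.
    weights = {"threat_reduction": 0.4, "vulnerability_reduction": 0.3, "consequence_reduction": 0.3}
    alternatives = {
        "A": {"threat_reduction": 0.60, "vulnerability_reduction": 0.60, "consequence_reduction": 0.60},
        "B": {"threat_reduction": 0.90, "vulnerability_reduction": 0.05, "consequence_reduction": 0.90},
    }

    def additive(scores):
        # Linear additive aggregation: weakness on one criterion can be fully
        # compensated by strength elsewhere.
        return sum(weights[c] * scores[c] for c in weights)

    def multiplicative(scores):
        # Weighted geometric (multiplicative) aggregation: a near-zero criterion
        # drags the whole score down, which is often desirable in risk settings.
        result = 1.0
        for c in weights:
            result *= scores[c] ** weights[c]
        return result

    for name, scores in alternatives.items():
        print(name, "additive:", round(additive(scores), 3),
              "multiplicative:", round(multiplicative(scores), 3))
    # The additive model ranks B above A; the multiplicative model reverses that ranking.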


Subjects
Decision Making; Guidelines as Topic; Risk Assessment; Models, Theoretical
7.
Environ Sci Technol ; 54(8): 4706-4708, 2020 04 21.
Article in English | MEDLINE | ID: mdl-32223156
8.
Risk Anal ; 40(1): 183-199, 2020 01.
Article in English | MEDLINE | ID: mdl-28873246

ABSTRACT

Risk assessors and managers face many difficult challenges related to novel cyber systems. Among these challenges are the constantly changing nature of cyber systems caused by technical advances, their distribution across the physical, information, and sociocognitive domains, and the complex network structures often including thousands of nodes. Here, we review probabilistic and risk-based decision-making techniques applied to cyber systems and conclude that existing approaches typically do not address all components of the risk assessment triplet (threat, vulnerability, consequence) and lack the ability to integrate across multiple domains of cyber systems to provide guidance for enhancing cybersecurity. We present a decision-analysis-based approach that quantifies threat, vulnerability, and consequences through a set of criteria designed to assess the overall utility of cybersecurity management alternatives. The proposed framework bridges the gap between risk assessment and risk management, allowing an analyst to ensure a structured and transparent process of selecting risk management alternatives. The use of this technique is illustrated for a hypothetical, but realistic, case study exemplifying the process of evaluating and ranking five cybersecurity enhancement strategies. The approach presented does not necessarily eliminate the biases and subjectivity inherent in selecting countermeasures, but it provides justifiable methods for selecting risk management actions consistent with stakeholder and decision-maker values and technical data.
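
As a rough illustration of ranking management alternatives against the risk triplet, the sketch below combines threat, vulnerability, and consequence as a simple product reduced by each alternative; this is a generic construction with hypothetical strategies and numbers, not the criteria-based utility model of the article.

    # Hypothetical baseline risk components on a 0-1 scale, and five candidate
    # cybersecurity enhancement strategies; each entry is the factor by which the
    # strategy scales a component (1.0 = unchanged), plus a notional cost.
    baseline = {"threat": 0.8, "vulnerability": 0.6, "consequence": 0.9}
    strategies = {
        "patch_management":     {"vulnerability": 0.5, "cost": 1.0},
        "network_segmentation": {"consequence": 0.6, "cost": 2.0},
        "threat_intelligence":  {"threat": 0.7, "cost": 1.5},
        "employee_training":    {"threat": 0.9, "vulnerability": 0.8, "cost": 0.5},
        "status_quo":           {"cost": 0.0},
    }

    def residual_risk(strategy):
        # Classic triplet product: threat x vulnerability x consequence,
        # with the strategy's reduction factors applied component-wise.
        r = 1.0
        for component, value in baseline.items():
            r *= value * strategy.get(component, 1.0)
        return r

    # Rank by residual risk, breaking ties by cost.
    for name in sorted(strategies, key=lambda s: (residual_risk(strategies[s]), strategies[s]["cost"])):
        print(name, "residual risk:", round(residual_risk(strategies[name]), 3),
              "cost:", strategies[name]["cost"])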

9.
ALTEX ; 36(3): 353-362, 2019.
Article in English | MEDLINE | ID: mdl-30662994

ABSTRACT

The adverse outcome pathway (AOP) framework is a conceptual construct that mechanistically links molecular initiating events to adverse biological outcomes through a series of causal key events (KEs) that represent the perturbation of the biological system. Quantitative, predictive AOPs are necessary for screening emerging contaminants and potential substitutes to inform their prioritization for testing. In practice, they are not widely used because they can be costly to develop and validate. A modular approach for assembly of quantitative AOPs, based on existing knowledge, would allow for rapid development of biological pathway models to screen contaminants for potential hazards and prioritize them for subsequent testing and modeling. For each pair of KEs, a quantitative KE relationship (KER) can be derived as a response-response function or a conditional probability matrix describing the anticipated change in a KE based on the response of the prior KE. This transfer of response across KERs can be used to assemble a quantitative AOP. Here we demonstrate the use of the proposed approach in two cases: inhibition of cytochrome P450 aromatase leading to reduced fecundity in fathead minnows, and ionotropic glutamate receptor-mediated excitotoxicity leading to memory impairment in rodents. The models created from these chains have value in characterizing the pathway and the potential or relative level of toxicological effect anticipated. This approach to simple, modular AOP models has wide applicability for rapid development of biological pathway models.
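
A minimal sketch of the chaining idea, with each KER discretized into three response levels and expressed as a conditional probability matrix; the intermediate event names, matrices, and exposure distribution are hypothetical placeholders, not values from the demonstrated case studies.

    import numpy as np

    # Each key event (KE) is discretized into three response levels: low, medium, high.
    # A quantitative KE relationship (KER) is a conditional probability matrix
    # P(downstream KE level | upstream KE level); rows = upstream level, rows sum to 1.
    ker_inhibition_to_estradiol = np.array([
        [0.80, 0.15, 0.05],
        [0.20, 0.60, 0.20],
        [0.05, 0.25, 0.70],
    ])
    ker_estradiol_to_vitellogenin = np.array([
        [0.90, 0.08, 0.02],
        [0.30, 0.50, 0.20],
        [0.10, 0.30, 0.60],
    ])
    ker_vitellogenin_to_fecundity = np.array([
        [0.85, 0.10, 0.05],
        [0.25, 0.55, 0.20],
        [0.05, 0.35, 0.60],
    ])

    # Assembling the quantitative AOP is then a chain of matrix products: propagate
    # the distribution over the molecular initiating event through successive KERs.
    mie_distribution = np.array([0.0, 0.2, 0.8])  # exposure causing mostly high inhibition
    outcome = (mie_distribution
               @ ker_inhibition_to_estradiol
               @ ker_estradiol_to_vitellogenin
               @ ker_vitellogenin_to_fecundity)
    print("P(fecundity reduction = low, medium, high):", outcome.round(3))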


Subjects
Adverse Outcome Pathways; Biomedical Research; Toxicology; Animals; Humans
12.
Nature ; 555(7694): 30, 2018 Mar.
Article in English | MEDLINE | ID: mdl-32094907
13.
Nat Nanotechnol ; 12(8): 740-743, 2017 08 04.
Article in English | MEDLINE | ID: mdl-28775358

ABSTRACT

Two analytic perspectives on environmental assessment dominate environmental policy and decision-making: risk analysis (RA) and life-cycle assessment (LCA). RA focuses on management of a toxicological hazard in a specific exposure scenario, while LCA seeks a holistic estimation of impacts of thousands of substances across multiple media, including non-toxicological and non-chemically deleterious effects. While recommendations to integrate the two approaches have remained a consistent feature of environmental scholarship for at least 15 years, the current perception is that progress is slow largely because of practical obstacles, such as a lack of data, rather than insurmountable theoretical difficulties. Nonetheless, the emergence of nanotechnology presents a serious challenge to both perspectives. Because the pace of nanomaterial innovation far outstrips acquisition of environmentally relevant data, it is now clear that a further integration of RA and LCA based on dataset completion will remain futile. In fact, the two approaches are suited for different purposes and answer different questions. A more pragmatic approach to providing better guidance to decision-makers is to apply the two methods in parallel, integrating only after obtaining separate results.

14.
Sci Adv ; 3(12): e1701079, 2017 12.
Article in English | MEDLINE | ID: mdl-29291243

ABSTRACT

Urban transportation systems are vulnerable to congestion, accidents, weather, special events, and other costly delays. Whereas typical policy responses prioritize reduction of delays under normal conditions to improve the efficiency of urban road systems, analytic support for investments that improve resilience (defined as system recovery from additional disruptions) is still scarce. In this effort, we represent paved roads as a transportation network by mapping intersections to nodes and road segments between the intersections to links. We built road networks for 40 of the urban areas defined by the U.S. Census Bureau. We developed and calibrated a model to evaluate traffic delays using link loads. The loads may be regarded as traffic-based centrality measures, estimating the number of individuals using corresponding road segments. Efficiency was estimated as the average annual delay per peak-period auto commuter, and modeled results were found to be close to observed data, with the notable exception of New York City. Resilience was estimated as the change in efficiency resulting from roadway disruptions and was found to vary between cities, with increased delays due to a 5% random loss of road linkages ranging from 9.5% in Los Angeles to 56.0% in San Francisco. The results demonstrate that many urban road systems that operate inefficiently under normal conditions are nevertheless resilient to disruption, whereas some more efficient cities are more fragile. The implication is that resilience, not just efficiency, should be considered explicitly in roadway project selection and can justify investments related to disasters and other disruptions.
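
A toy version of the efficiency and resilience calculation can be sketched as follows, assuming the networkx library is available and using a synthetic grid network with a shortest-path delay proxy in place of the calibrated, load-weighted delay model described above.

    import random
    import networkx as nx

    def average_travel_cost(graph):
        # Efficiency proxy: mean shortest-path travel time over all origin-destination
        # pairs (stands in for the load-weighted delay per peak-period commuter).
        lengths = dict(nx.all_pairs_dijkstra_path_length(graph, weight="time"))
        costs = [d for src, dests in lengths.items() for dst, d in dests.items() if src != dst]
        return sum(costs) / len(costs)

    def delay_increase(graph, fraction=0.05, trials=20, seed=0):
        # Resilience proxy: relative increase in average travel cost after randomly
        # removing a small fraction of road segments (links).
        rng = random.Random(seed)
        base = average_travel_cost(graph)
        increases = []
        for _ in range(trials):
            g = graph.copy()
            k = max(1, int(fraction * g.number_of_edges()))
            g.remove_edges_from(rng.sample(list(g.edges()), k))
            giant = max(nx.connected_components(g), key=len)  # keep path lengths finite
            increases.append(average_travel_cost(g.subgraph(giant)) / base - 1.0)
        return sum(increases) / len(increases)

    # Toy "city": intersections as nodes, road segments as links with unit travel times.
    city = nx.grid_2d_graph(10, 10)
    nx.set_edge_attributes(city, 1.0, "time")
    print("Average delay increase after 5% link loss:", round(delay_increase(city), 3))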

15.
Risk Anal ; 37(9): 1644-1651, 2017 09.
Article in English | MEDLINE | ID: mdl-27935146

ABSTRACT

Recent cyber attacks provide evidence of increased threats to our critical systems and infrastructure. A common reaction to a new threat is to harden the system by adding new rules and regulations. As federal and state governments request new procedures to follow, each of their organizations implements its own cyber defense strategies. This unintentionally increases the time and effort that employees spend on training and policy implementation and decreases the time and latitude to perform critical job functions, thus raising overall levels of stress. People's performance under stress, coupled with an overabundance of information, results in even more vulnerabilities for adversaries to exploit. In this article, we embed a simple regulatory model that accounts for cybersecurity human factors and an organization's regulatory environment in a model of a corporate cyber network under attack. The resulting model demonstrates the effect of under- and overregulation on an organization's resilience with respect to insider threats. Currently, there is a tendency to use ad hoc approaches to account for human factors rather than to incorporate them into cyber resilience modeling. A systematic approach that incorporates established behavioral science into cyber resilience assessment would provide a more holistic view for decision-makers.

17.
Sci Rep ; 6: 19540, 2016 Jan 19.
Article in English | MEDLINE | ID: mdl-26782180

ABSTRACT

Building resilience into today's complex infrastructures is critical to the daily functioning of society and its ability to withstand and recover from natural disasters, epidemics, and cyber-threats. This study proposes quantitative measures that capture and implement the definition of engineering resilience advanced by the National Academy of Sciences. The approach is applicable across physical, information, and social domains. It evaluates the critical functionality, defined as a performance function of time set by the stakeholders. Critical functionality is a source of valuable information, such as the system resilience integrated over a time interval and the system's robustness. The paper demonstrates the formulation on two classes of models: 1) multi-level directed acyclic graphs, and 2) interdependent coupled networks. For both models, synthetic case studies are used to explore trends. For the first class, the approach is also applied to the Linux operating system. Results indicate that desired resilience and robustness levels are achievable by trading off different design parameters, such as redundancy, node recovery time, and available backup supply. The nonlinear relationship between network parameters and resilience levels confirms the utility of the proposed approach, which is of benefit to analysts and designers of complex systems and networks.
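
A minimal sketch of the resilience metric described above: critical functionality K(t) is integrated over the control time interval and normalized, with robustness read off as the minimum functionality reached; the disruption and recovery profile here is synthetic.

    import numpy as np

    def resilience(time, functionality):
        # Resilience as the normalized area under the critical functionality curve K(t)
        # over the control interval [t0, t1].
        dt = np.diff(time)
        area = np.sum(dt * (functionality[:-1] + functionality[1:]) / 2.0)  # trapezoid rule
        return area / (time[-1] - time[0])

    # Synthetic critical functionality profile: full function until a disruption at t = 2,
    # an immediate drop to 0.4, then linear recovery that completes by t = 8.
    t = np.linspace(0.0, 10.0, 101)
    K = np.ones_like(t)
    recovering = (t >= 2.0) & (t < 8.0)
    K[recovering] = 0.4 + 0.6 * (t[recovering] - 2.0) / 6.0

    print("Resilience over [0, 10]:", round(resilience(t, K), 3))
    print("Robustness (minimum functionality):", round(K.min(), 2))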


Subjects
Models, Theoretical; Software; Disasters; National Academy of Sciences, U.S.; United States
18.
Environ Sci Technol ; 50(1): 349-58, 2016 Jan 05.
Article in English | MEDLINE | ID: mdl-26580228

ABSTRACT

Emerging technologies present significant challenges to researchers, decision-makers, industry professionals, and other stakeholder groups due to the lack of quantitative risk, benefit, and cost data associated with their use. Multi-criteria decision analysis (MCDA) can support early decisions for emerging technologies when data is too sparse or uncertain for traditional risk assessment. It does this by integrating expert judgment with available quantitative and qualitative inputs across multiple criteria to provide relative technology scores. Here, an MCDA framework provides preliminary insights on the suitability of emerging technologies for environmental remediation by comparing nanotechnology and synthetic biology to conventional remediation methods. Subject matter experts provided judgments regarding the importance of criteria used in the evaluations and scored the technologies with respect to those criteria. The results indicate that synthetic biology may be preferred over nanotechnology and conventional methods for high expected benefits and low deployment costs but that conventional technology may be preferred over emerging technologies for reduced risks and development costs. In the absence of field data regarding the risks, benefits, and costs of emerging technologies, structuring evidence-based expert judgment through a weighted hierarchy of topical questions may be helpful to inform preliminary risk governance and guide emerging technology development and policy.


Subjects
Environmental Restoration and Remediation; Decision Support Techniques; Environmental Restoration and Remediation/economics; Environmental Restoration and Remediation/methods; Environmental Restoration and Remediation/trends; Risk Assessment
19.
Nat Nanotechnol ; 11(2): 198-203, 2016 Feb.
Article in English | MEDLINE | ID: mdl-26551015

ABSTRACT

Risk research for nanomaterials is currently prioritized by means of expert workshops and other deliberative processes. However, analytical techniques that quantify and compare alternative research investments are increasingly recommended. Here, we apply value of information and portfolio decision analysis, methods commonly applied in financial and operations management, to prioritize risk research for multiwalled carbon nanotubes and nanoparticulate silver and titanium dioxide. We modify the widely accepted CB Nanotool hazard evaluation framework, which combines nano- and bulk-material properties into a hazard score, to operate probabilistically with uncertain inputs. The literature is reviewed to develop uncertain estimates for each input parameter, and a Monte Carlo simulation is applied to assess how different research strategies can improve hazard classification. The relative cost of each research experiment is elicited from experts, which enables identification of efficient research portfolios: combinations of experiments that lead to the greatest improvement in hazard classification at the lowest cost. In our results, nanoparticle shape, diameter, solubility, and surface reactivity were most frequently identified within efficient portfolios.
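
The Monte Carlo logic can be sketched roughly as follows, using an additive hazard score with uncertain inputs and a crude value-of-information proxy; the parameters, ranges, and score bands are illustrative and do not reproduce the CB Nanotool scheme or the elicited experiment costs.

    import random

    random.seed(1)

    # Uncertain hazard inputs for a hypothetical nanomaterial, each contributing a
    # sub-score in the (min, max) range below; ranges stand in for a literature review.
    priors = {"shape": (2, 9), "diameter": (1, 7), "solubility": (3, 10), "surface_reactivity": (0, 8)}

    def classify(total):
        # Illustrative bands mapping the summed score to a hazard class.
        return "high" if total >= 28 else "medium" if total >= 18 else "low"

    def classification_uncertainty(fixed=None, n=2000):
        # Fraction of Monte Carlo draws falling outside the most frequent class;
        # lower values mean the hazard classification is better resolved.
        counts = {}
        for _ in range(n):
            total = sum(fixed[p] if fixed and p in fixed else random.uniform(lo, hi)
                        for p, (lo, hi) in priors.items())
            label = classify(total)
            counts[label] = counts.get(label, 0) + 1
        return 1.0 - max(counts.values()) / n

    baseline = classification_uncertainty()
    # Crude value-of-information proxy: expected residual uncertainty if one input were
    # measured exactly, averaged over where that measurement might plausibly land.
    for p, (lo, hi) in priors.items():
        resolved = sum(classification_uncertainty(fixed={p: random.uniform(lo, hi)})
                       for _ in range(20)) / 20
        print(p, "expected uncertainty reduction:", round(baseline - resolved, 3))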

20.
Beilstein J Nanotechnol ; 6: 1594-600, 2015.
Article in English | MEDLINE | ID: mdl-26425410

ABSTRACT

The increase in nanomaterial research has resulted in a corresponding increase in nanomaterial data. The next challenge is to meaningfully integrate and interpret these data for better and more efficient decisions. Due to the complex nature of nanomaterials, rapid changes in technology, and disunified testing and data publishing strategies, information regarding material properties is often elusive, uncertain, and/or of varying quality, which limits the ability of researchers and regulatory agencies to process and use the data. The vision of nanoinformatics is to address this problem by identifying the information necessary to support specific decisions (a top-down approach) and collecting and visualizing these relevant data (a bottom-up approach). Current nanoinformatics efforts, however, have yet to efficiently focus data acquisition efforts on the research most relevant for bridging specific nanomaterial data gaps. Collecting unnecessary data and visualizing irrelevant information are expensive activities that overwhelm decision makers. We propose that the decision analytic techniques of multicriteria decision analysis (MCDA), value of information (VOI), weight of evidence (WOE), and portfolio decision analysis (PDA) can bridge the gap from current data collection and visualization efforts to present information relevant to specific decision needs. Decision analytic and Bayesian models could be a natural extension of mechanistic and statistical models for nanoinformatics practitioners to master in solving complex nanotechnology challenges.
