Results 1 - 20 of 54

1.
Environ Sci Technol ; 58(12): 5347-5356, 2024 Mar 26.
Article in English | MEDLINE | ID: mdl-38478968

ABSTRACT

Dechlorination is one of the main processes for the natural degradation of polychlorinated biphenyls (PCBs) in an anaerobic environment. However, PCB dechlorination pathways and products vary with PCB congeners, types of functional dechlorinating bacteria, and environmental conditions. The present study develops a novel model for determining dechlorination pathways and fluxes by tracking redox potential variability, transforming the complex dechlorination process into a stepwise sequence. The redox potential is calculated via the Gibbs free energy of formation, PCB concentrations in reactants and products, and environmental conditions. Thus, the continuous change in the PCB congener composition can be tracked during dechlorination processes. The new model is assessed against four measurements from several published studies on PCB dechlorination. The simulation errors across the four measurements range from 2.67% with the minimum number of co-eluters (n = 0) to 35.1% with the maximum (n = 34). The dechlorination fluxes for para-dechlorination pathways dominate PCB dechlorination in all measurements. Furthermore, the model considers multiple-step dechlorination pathways containing intermediate PCB congeners absent in both the reactants and the products. The present study indicates that redox potential might be an appropriate indicator for predicting PCB dechlorination pathways and fluxes even without prior knowledge of the functional dechlorinating bacteria.
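
The redox-potential bookkeeping described above can be illustrated with a short, hedged sketch (not the authors' model): the potential of one dechlorination step computed from Gibbs free energies of formation and a simplified reaction quotient; all numerical values are placeholder assumptions.

```python
# Minimal sketch (not the published model): redox potential of a single
# dechlorination step from Gibbs free energies and concentrations,
# via dG = dG0 + RT*ln(Q) and E = -dG/(nF).
import math

R = 8.314      # J/(mol K)
F = 96485.0    # C/mol
T = 298.15     # K

def step_redox_potential(dGf_react, dGf_prod, conc_react, conc_prod, n_electrons=2):
    """Redox potential (V) of one dechlorination step.

    dGf_* are summed Gibbs free energies of formation (J/mol) for the
    reactant and product sides; conc_* form a simplified reaction quotient.
    All inputs here are placeholders, not measured values.
    """
    dG0 = dGf_prod - dGf_react          # standard reaction free energy
    Q = conc_prod / conc_react          # simplified reaction quotient
    dG = dG0 + R * T * math.log(Q)      # free energy at actual conditions
    return -dG / (n_electrons * F)      # Nernst-type conversion

# Illustrative (made-up) numbers: a step ranks as more favorable
# when its computed potential is higher.
print(step_redox_potential(dGf_react=-150e3, dGf_prod=-220e3,
                           conc_react=1e-6, conc_prod=1e-8))
```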


Subject(s)
Polychlorinated Biphenyls; Polychlorinated Biphenyls/analysis; Polychlorinated Biphenyls/metabolism; Biodegradation, Environmental; Geologic Sediments/microbiology; Bacteria/metabolism; Oxidation-Reduction; Chlorine/metabolism
2.
Environ Sci Technol ; 57(1): 842-851, 2023 01 10.
Article in English | MEDLINE | ID: mdl-36563039

ABSTRACT

Following an exceedance of the lead action level for drinking water in 2016, the Pittsburgh Water and Sewer Authority (PWSA) undertook two sampling programs: the required biannual Lead and Copper Rule (LCR) compliance testing and a home sampling program based on customer requests. The LCR sampling results, at locations expected to be elevated when corrosion is not well controlled, had higher concentrations than customer-requested homes, with 90th percentile values for the LCR sites exceeding the action level through 2019 (except for June 2018). Customer-requested concentrations showed greater variability, with the median lead concentration for customer-requested samples below detection for each year of sampling, suggesting only some homes show elevated lead when corrosion control is not fully effective. Corrosion control adjustments brought the utility back into compliance in 2020 (LCR 90th percentile of 5.1 ppb in June 2020); customer-requested sampling after the addition of orthophosphate indicated below detection levels for 59% of samples. Monte Carlo simulations indicate LCR samples do not all represent high lead risk sites, and the application of corrosion control more significantly affects higher lead concentration sites. Broader water quality sampling provides information about specific homes but is not well suited to assessing the efficacy of corrosion control efforts by utilities.
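
A hedged sketch of the kind of Monte Carlo comparison mentioned above (not PWSA's actual analysis): the 90th-percentile lead statistic is simulated for a sampling pool in which only a fraction of sites are truly high-risk; the distributions, fractions, and site counts are illustrative assumptions.

```python
# Illustrative sketch: Monte Carlo draws of an LCR-style sampling pool in which
# only some sites are high-risk, then the 90th-percentile statistic compared to
# the 15 ppb action level. Concentration distributions are invented.
import numpy as np

rng = np.random.default_rng(0)
ACTION_LEVEL = 15.0  # ppb

def simulate_90th(n_sites=50, frac_high_risk=0.6, n_trials=5000):
    p90 = np.empty(n_trials)
    for i in range(n_trials):
        high = rng.random(n_sites) < frac_high_risk
        conc = np.where(high,
                        rng.lognormal(mean=2.5, sigma=0.8, size=n_sites),
                        rng.lognormal(mean=0.5, sigma=0.7, size=n_sites))
        p90[i] = np.percentile(conc, 90)
    return p90

p90 = simulate_90th()
print("P(90th percentile exceeds action level) =", np.mean(p90 > ACTION_LEVEL))
```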


Subject(s)
Drinking Water; Water Pollutants, Chemical; Lead/analysis; Water Supply; Water Pollutants, Chemical/analysis; Water Quality; Corrosion; Copper/analysis
3.
Environ Sci Technol ; 57(46): 18215-18224, 2023 Nov 21.
Article in English | MEDLINE | ID: mdl-37776276

ABSTRACT

Sustainability challenges, such as solid waste management, are usually scientifically complex and data scarce, which makes them not amenable to science-based analytical forms or data-intensive learning paradigms. Deep integration between data science and sustainability science in highly complementary manners offers new opportunities for tackling these conundrums. This study develops a novel hybrid neural network (HNN) model that imposes the holistic decision-making context of solid waste management systems (SWMS) on a traditional neural network (NN) architecture. Equipped with adaptable hybridization designs of hand-crafted model structure, constrained or predetermined parameters, and a customized loss function, the HNN model is capable of learning various technical, economic, and social aspects of SWMS from a small and heterogeneous data set. In comparison, the versatile HNN model not only outperforms traditional NN models in convergence rates, which leads to a 22% lower mean testing error of 0.20, but also offers superior interpretability. The HNN model is capable of generating insights into the enabling factors, policy interventions, and driving forces of SWMS, laying a solid foundation for data-driven decision making.
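
As a rough, hedged illustration of the hybridization idea (not the published HNN), the sketch below builds a tiny network that combines a learned block with a fixed, domain-informed coefficient and trains it with a customized loss; the layer sizes, the fixed coefficient, and the penalty term are assumptions for illustration.

```python
# Hedged sketch of the general idea, not the paper's model: a small network
# whose structure mixes a learned block with a fixed (untrained) coefficient,
# trained with a custom loss that penalizes implausible (negative) outputs.
import torch
import torch.nn as nn

class TinyHybridNet(nn.Module):
    def __init__(self, n_in=6):
        super().__init__()
        self.learned = nn.Sequential(nn.Linear(n_in, 8), nn.ReLU(), nn.Linear(8, 1))
        # domain-informed coefficient held fixed (not a trainable parameter)
        self.register_buffer("fixed_coeff", torch.tensor(0.3))

    def forward(self, x):
        # hybrid output: learned term plus a hand-crafted linear term
        return self.learned(x) + self.fixed_coeff * x[:, :1]

def custom_loss(pred, target):
    mse = torch.mean((pred - target) ** 2)
    negativity_penalty = torch.mean(torch.relu(-pred))  # outputs should be >= 0
    return mse + 10.0 * negativity_penalty

model = TinyHybridNet()
opt = torch.optim.Adam(model.parameters(), lr=1e-2)
x, y = torch.rand(32, 6), torch.rand(32, 1)   # stand-in data
for _ in range(200):
    opt.zero_grad()
    loss = custom_loss(model(x), y)
    loss.backward()
    opt.step()
print(float(loss))
```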


Subject(s)
Solid Waste; Waste Management; Machine Learning; Neural Networks, Computer
4.
Environ Sci Technol ; 56(4): 2709-2717, 2022 02 15.
Article in English | MEDLINE | ID: mdl-35089697

ABSTRACT

In a world of finite metallic minerals, demand forecasting is crucial for managing the stocks and flows of these critical resources. Previous studies have projected copper supply and demand at the global level and the regional level of EU and China. However, no comprehensive study exists for the U.S., which has displayed unique copper consumption and dematerialization trends. In this study, we adapted the stock dynamics approach to forecast the U.S. copper in-use stock (IUS), consumption, and end-of-life (EOL) flows from 2016 to 2070 under various U.S.-specific scenarios. Assuming different socio-technological development trajectories, our model results are consistent with a stabilization range of 215-260 kg/person for the IUS. This is projected along with steady growth in the annual copper consumption and EOL copper generation driven mainly by the growing U.S. population. This stabilization trend of per capita IUS indicates that future copper consumption will largely recuperate IUS losses, allowing 34-39% of future demand to be met potentially by recycling 43% of domestic EOL copper. Despite the recent trends of "dematerialization", adaptive policies still need to be designed for enhancing the EOL recovery, especially in light of a potential transitioning to a "green technology" future with increased electrification dictating higher copper demand.
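
A minimal stock-dynamics sketch under stated assumptions (not the paper's calibrated model): per-capita in-use stock relaxes toward a saturation level, end-of-life outflow is stock divided by an average lifetime, and consumption closes the mass balance; every number below is illustrative.

```python
# Simplified stock-dynamics sketch with made-up parameters.
def project_copper(years=range(2016, 2071),
                   pop0=323e6, pop_growth=0.006,          # persons, fraction/yr
                   stock_pc0=220.0, stock_pc_sat=245.0,   # kg per person
                   lifetime=30.0):                        # mean service life, yr
    pop, stock_pc = pop0, stock_pc0
    prev_stock = pop * stock_pc / 1e3                     # tonnes
    results = []
    for yr in years:
        pop *= 1.0 + pop_growth
        stock_pc += 0.05 * (stock_pc_sat - stock_pc)      # relax toward saturation
        stock = pop * stock_pc / 1e3                      # tonnes
        eol_outflow = stock / lifetime                    # end-of-life flow
        consumption = (stock - prev_stock) + eol_outflow  # mass balance
        results.append((yr, stock, consumption, eol_outflow))
        prev_stock = stock
    return results

for yr, s, c, e in project_copper()[-3:]:
    print(yr, round(s / 1e6, 1), "Mt stock,", round(c / 1e3), "kt consumed,",
          round(e / 1e3), "kt EOL")
```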


Subject(s)
Copper; Recycling; China; Forecasting; Humans; Minerals
5.
Risk Anal ; 41(7): 1118-1128, 2021 07.
Article in English | MEDLINE | ID: mdl-30698283

ABSTRACT

There is a growing number of decision aids made available to the general public by those working on hazard and disaster management. When based on high-quality scientific studies across disciplines and designed to provide a high level of usability and trust, decision aids become more likely to improve the quality of hazard risk management and response decisions. Interdisciplinary teams have a vital role to play in this process, ensuring the scientific validity and effectiveness of a decision aid across the physical science, social science, and engineering dimensions of hazard awareness, option identification, and the decisions made by individuals and communities. Often, however, these aids are not evaluated before being widely distributed, due to a lack of dedicated resources and guidance on how to do so systematically; such evaluation could improve their impact. In this Perspective, we present a decision-centered method for evaluating the impact of hazard decision aids on decision-maker preferences and choice during the design and development phase, drawing from the social and behavioral sciences and a value of information framework to inform the content, complexity, format, and overall evaluation of the decision aid. The first step involves quantifying the added value of the information contained in the decision aid. The second involves identifying the extent to which the decision aid is usable. Our method can be applied to a variety of hazards and disasters, and will allow interdisciplinary teams to more effectively evaluate the extent to which an aid can inform and improve decision making.
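
The value-of-information step can be made concrete with a toy calculation that is not drawn from the article: a two-action, two-state decision in which a decision aid provides an imperfect binary forecast; probabilities, payoffs, and forecast accuracy are placeholder assumptions.

```python
# Toy value-of-information calculation: expected loss avoided by consulting an
# imperfect hazard forecast before choosing to protect or not. All numbers are
# illustrative, not from the article.
def expected_value_of_information(p_hazard=0.1, accuracy=0.8,
                                  loss_unprotected=100.0, cost_protect=10.0):
    # Without the aid: act on the prior, picking the cheaper expected loss.
    loss_no_aid = min(cost_protect, p_hazard * loss_unprotected)

    # With the aid: observe a binary forecast, update by Bayes, act optimally.
    loss_with_aid = 0.0
    for signal_says_hazard in (True, False):
        if signal_says_hazard:
            p_signal = accuracy * p_hazard + (1 - accuracy) * (1 - p_hazard)
            p_post = accuracy * p_hazard / p_signal
        else:
            p_signal = (1 - accuracy) * p_hazard + accuracy * (1 - p_hazard)
            p_post = (1 - accuracy) * p_hazard / p_signal
        loss_with_aid += p_signal * min(cost_protect, p_post * loss_unprotected)
    return loss_no_aid - loss_with_aid   # expected loss avoided by the aid

print(expected_value_of_information())
```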


Subject(s)
Decision Support Techniques; Interdisciplinary Research; Research Personnel; Risk Assessment; Humans; Models, Theoretical
6.
Environ Sci Technol ; 54(14): 8857-8867, 2020 07 21.
Article in English | MEDLINE | ID: mdl-32579849

ABSTRACT

The historical use of lead in potable water plumbing systems has caused significant public health challenges. The Lead and Copper Rule requires utilities to take action if the 90th percentile lead concentration exceeds the action level (AL) of 15 ppb. Assessment of the AL is based on a sample of homes representing a relatively small fraction of connections. Due to the intentional nonrepresentative sampling approach, the full set of conditions influencing lead concentrations in a large distribution system may be poorly characterized. Further, there is uncertainty in assessing statistical parameters such as the 90th percentile concentration. This work demonstrates methods to compute the uncertainty in the 90th percentile statistic and assesses the associated effect on compliance outcomes. The method is demonstrated on four utilities in southwest Pennsylvania (referred to as A, B, C, and D). For Utility A, evaluation of the 90th percentile showed an increase over time in observed and estimated values and the value's uncertainty. This type of change in the uncertainty might have served as an early warning of the exceedance that followed. This could have triggered more timely review of operational changes in order to avoid the effects of noncompliance on utility costs and consumer confidence.
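
One simple, hedged way to quantify uncertainty in the 90th-percentile statistic is a nonparametric bootstrap, sketched below with made-up sample values; the paper's specific estimation method may differ.

```python
# Bootstrap confidence interval for the 90th-percentile lead statistic.
# The sample values are hypothetical, not utility data.
import numpy as np

rng = np.random.default_rng(1)
lead_ppb = np.array([1, 2, 2, 3, 3, 4, 5, 6, 8, 9, 11, 13, 14, 16, 18,
                     21, 25, 3, 4, 7], dtype=float)

boot_p90 = np.array([np.percentile(rng.choice(lead_ppb, size=lead_ppb.size,
                                               replace=True), 90)
                     for _ in range(10000)])
lo, hi = np.percentile(boot_p90, [2.5, 97.5])
print(f"90th percentile = {np.percentile(lead_ppb, 90):.1f} ppb, "
      f"95% CI ({lo:.1f}, {hi:.1f}) ppb; "
      f"P(above 15 ppb AL) = {np.mean(boot_p90 > 15):.2f}")
```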


Subject(s)
Drinking Water; Lead/analysis; Pennsylvania; Uncertainty; Water Supply
7.
Environ Health ; 18(1): 23, 2019 03 22.
Article in English | MEDLINE | ID: mdl-30902096

ABSTRACT

Conventional environmental-health risk-assessment methods are often limited in their ability to account for uncertainty in contaminant exposure, chemical toxicity and resulting human health risk. Exposure levels and toxicity are both subject to significant measurement errors, and many predicted risks are well below those distinguishable from background incidence rates in target populations. To address these issues, methods are needed to characterize uncertainties in observations and inferences, including the ability to interpret the influence of improved measurements and larger datasets. Here we develop a Bayesian network (BN) model to quantify the joint effects of measurement errors and different sample sizes on an illustrative exposure-response system. Categorical variables are included in the network to describe measurement accuracies, actual and measured exposures, actual and measured response, and the true strength of the exposure-response relationship. Network scenarios are developed by fixing combinations of the exposure-response strength of relationship (none, medium or strong) and the accuracy of exposure and response measurements (low, high, perfect). Multiple cases are simulated for each scenario, corresponding to a synthetic exposure-response study sampled from the known scenario population. A learn-from-cases algorithm is then used to assimilate the synthetic observations into an uninformed prior network, yielding updated probabilities for the strength of relationship. Ten replicate studies are simulated for each scenario and sample size, and results are presented for individual trials and their mean prediction. The model as parameterized yields little to no convergence when low-accuracy measurements are used, and progressively faster convergence when high-accuracy or perfect measurements are employed. Inferences from the model are particularly efficient at smaller sample sizes when the true strength of relationship is none or strong. The tool developed in this study can help in the screening and design of exposure-response studies to better anticipate where such outcomes can occur under different levels of measurement error. It may also serve to inform methods of analysis for other network models that consider multiple streams of evidence from multiple studies of cumulative exposure and effects.
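
A compact sketch of the learn-from-cases idea (not the published BN implementation): categorical exposure and response observations are simulated with misclassification error and used to update a uniform prior over the true strength of relationship; all probabilities are illustrative assumptions.

```python
# Simplified stand-in for the BN update: simulate noisy categorical data and
# compute the posterior over "none" vs "strong" relationship by likelihood.
import math
import numpy as np

rng = np.random.default_rng(2)

def response_prob(exposed, strength):
    return 0.2 + (0.3 if (strength == "strong" and exposed) else 0.0)

def posterior_strength(n=200, true_strength="strong", accuracy=0.7):
    loglik = {"none": 0.0, "strong": 0.0}     # uniform prior in log space
    for _ in range(n):
        exposed = int(rng.random() < 0.5)
        response = int(rng.random() < response_prob(exposed, true_strength))
        # imperfect measurements: each flips with probability 1 - accuracy
        m_exp = exposed if rng.random() < accuracy else 1 - exposed
        m_resp = response if rng.random() < accuracy else 1 - response
        for s in loglik:
            p = 0.0
            for te in (0, 1):                  # true exposure
                for tr in (0, 1):              # true response
                    p_tr = response_prob(te, s) if tr else 1 - response_prob(te, s)
                    p_obs = ((accuracy if te == m_exp else 1 - accuracy) *
                             (accuracy if tr == m_resp else 1 - accuracy))
                    p += 0.5 * p_tr * p_obs
            loglik[s] += math.log(p)
    m = max(loglik.values())
    w = {s: math.exp(v - m) for s, v in loglik.items()}
    z = sum(w.values())
    return {s: w[s] / z for s in w}

print(posterior_strength())
```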


Subject(s)
Bayes Theorem; Environmental Exposure; Models, Statistical; Research Design; Risk Assessment; Humans
8.
Environ Sci Technol ; 49(4): 2188-98, 2015 Feb 17.
Article in English | MEDLINE | ID: mdl-25611369

ABSTRACT

Engineered nanoparticles (NPs) released into natural environments will interact with natural organic matter (NOM) or humic substances, which will change their fate and transport behavior. Quantitative predictions of the effects of NOM are difficult because of its heterogeneity and variability. Here, the effects of six types of NOM and molecular weight fractions of each on the aggregation of citrate-stabilized gold NPs are investigated. Correlations of NP aggregation rates with electrophoretic mobility and the molecular weight distribution and chemical attributes of NOM (including UV absorptivity or aromaticity, functional group content, and fluorescence) are assessed. In general, the >100 kg/mol components provide better stability than lower molecular weight components for each type of NOM, and they contribute to the stabilizing effect of the unfractionated NOM even in small proportions. In many cases, unfractionated NOM provided better stability than its separated components, indicating a synergistic effect between the high and low molecular weight fractions for NP stabilization. Weight-averaged molecular weight was the best single explanatory variable for NP aggregation rates across all NOM types and molecular weight fractions. NP aggregation showed poorer correlation with UV absorptivity, but the exponential slope of the UV-vis absorbance spectrum was a better surrogate for molecular weight. Functional group data (including reduced sulfur and total nitrogen content) were explored as possible secondary parameters to explain the strong stabilizing effect of a low molecular weight Pony Lake fulvic acid sample to the gold NPs. These results can inform future correlations and measurement requirements to predict NP attachment in the presence of NOM.
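
A tiny sketch of the correlation analysis described above, using invented data: aggregation rate regressed against (log) weight-averaged molecular weight across NOM samples.

```python
# Regression of NP aggregation rate on log molecular weight (data invented).
import numpy as np
from scipy.stats import linregress

mw_kg_mol = np.array([0.8, 1.5, 3.0, 10.0, 50.0, 120.0, 300.0])   # hypothetical Mw
agg_rate = np.array([2.1, 1.8, 1.3, 0.9, 0.5, 0.3, 0.2])          # hypothetical rates

fit = linregress(np.log10(mw_kg_mol), agg_rate)
print(f"slope = {fit.slope:.2f}, r^2 = {fit.rvalue**2:.2f}")
```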


Subject(s)
Chemical Engineering/methods; Electrolytes/chemistry; Gold/chemistry; Humic Substances/analysis; Metal Nanoparticles/chemistry; Benzopyrans/chemistry; Citric Acid; Electrophoretic Mobility Shift Assay; Molecular Weight
9.
Environ Sci Technol ; 49(2): 1215-24, 2015 Jan 20.
Article in English | MEDLINE | ID: mdl-25551254

ABSTRACT

This work uses probabilistic methods to simulate a hypothetical geologic CO2 storage site in a depleted oil and gas field, where the large number of legacy wells would make it cost-prohibitive to sample all wells for all measurements as part of the postinjection site care. Deep well leakage potential scores were assigned to the wells using a random subsample of 100 wells from a detailed study of 826 legacy wells that penetrate the basal Cambrian formation on the U.S. side of the U.S./Canadian border. Analytical solutions and Monte Carlo simulations were used to quantify the statistical power of selecting a leaking well. Power curves were developed as a function of (1) the number of leaking wells within the Area of Review; (2) the sampling design (random or judgmental, choosing first the wells with the highest deep leakage potential scores); (3) the number of wells included in the monitoring sampling plan; and (4) the relationship between a well's leakage potential score and its relative probability of leakage. Cases where the deep well leakage potential scores are fully or partially informative of the relative leakage probability are compared to a noninformative base case in which leakage is equiprobable across all wells in the Area of Review. The results show that accurate prior knowledge about the probability of well leakage adds measurable value to the ability to detect a leaking well during the monitoring program, and that the loss in detection ability due to imperfect knowledge of the leakage probability can be quantified. This work underscores the importance of a data-driven, risk-based monitoring program that incorporates uncertainty quantification into long-term monitoring sampling plans at geologic CO2 storage sites.
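
The sampling-power comparison can be sketched under simplified assumptions (this is not the paper's scored well inventory): wells receive leakage-potential scores, one well leaks with probability proportional to its score in the informative case, and random versus judgmental monitoring plans are compared by Monte Carlo.

```python
# Monte Carlo comparison of random vs judgmental well-monitoring plans.
# Scores and probabilities are synthetic, not from the basal Cambrian study.
import numpy as np

rng = np.random.default_rng(3)

def detection_power(n_wells=100, n_monitored=10, informative=True, trials=20000):
    scores = rng.uniform(0, 1, n_wells)
    judgmental = np.argsort(scores)[::-1][:n_monitored]   # highest scores first
    p_leak = scores / scores.sum() if informative else np.full(n_wells, 1 / n_wells)
    hits_rand = hits_judg = 0
    for _ in range(trials):
        leaker = rng.choice(n_wells, p=p_leak)
        hits_judg += leaker in judgmental
        hits_rand += leaker in rng.choice(n_wells, n_monitored, replace=False)
    return hits_rand / trials, hits_judg / trials

print("informative scores:   ", detection_power(informative=True))
print("noninformative scores:", detection_power(informative=False))
```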


Subject(s)
Carbon Dioxide/analysis; Oil and Gas Fields; Water Pollutants/analysis; Water Wells; Canada; Carbon Dioxide/chemistry; Computer Simulation; Environment; Environmental Monitoring/methods; Geology; Models, Statistical; Monte Carlo Method; Permeability; Probability; Uncertainty; United States
10.
Proc Natl Acad Sci U S A ; 109(9): 3247-52, 2012 Feb 28.
Article in English | MEDLINE | ID: mdl-22331894

ABSTRACT

The U.S. Department of Energy has estimated that if the United States is to generate 20% of its electricity from wind, over 50 GW will be required from shallow offshore turbines. Hurricanes are a potential risk to these turbines. Turbine tower buckling has been observed in typhoons, but no offshore wind turbines have yet been built in the United States. We present a probabilistic model to estimate the number of turbines that would be destroyed by hurricanes in an offshore wind farm. We apply this model to estimate the risk to offshore wind farms in four representative locations in the Atlantic and Gulf Coastal waters of the United States. In the most vulnerable areas now being actively considered by developers, nearly half the turbines in a farm are likely to be destroyed in a 20-y period. Reasonable mitigation measures--increasing the design reference wind load, ensuring that the nacelle can be turned into rapidly changing winds, and building most wind plants in the areas with lower risk--can greatly enhance the probability that offshore wind can help to meet the United States' electricity needs.
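
A back-of-the-envelope sketch of the damage estimate, not the paper's probabilistic model: hurricanes arrive as a Poisson process and each storm destroys a turbine independently with some fragility probability; the rate and fragility values are illustrative assumptions.

```python
# Poisson-arrival, independent-fragility sketch of turbine losses over 20 years.
import numpy as np

rng = np.random.default_rng(4)

def fraction_destroyed(years=20, storms_per_year=0.4, p_destroy_per_storm=0.08,
                       n_turbines=50, trials=10000):
    frac = np.empty(trials)
    for i in range(trials):
        n_storms = rng.poisson(storms_per_year * years)
        survives = (1.0 - p_destroy_per_storm) ** n_storms   # per-turbine survival
        frac[i] = np.mean(rng.random(n_turbines) > survives)
    return frac.mean()

print(f"expected fraction destroyed in 20 y: {fraction_destroyed():.2f}")
```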

11.
Environ Sci Technol ; 48(6): 3420-9, 2014 Mar 18.
Article in English | MEDLINE | ID: mdl-24564549

ABSTRACT

The United States Geological Survey (USGS) reports that U.S. water withdrawals have been steady since 1980, but the population and economy have grown since then. This implies that other factors have contributed to offsetting decreases in water withdrawals. Using water withdrawal data from USGS and economic data from Bureau of Economic Analysis (BEA), direct and total water withdrawals were estimated for 134 industrial summary sectors in the 1997 U.S. economic input-output (EIO) table and 136 industrial sectors in the 2002 EIO table. Using structural decomposition analysis (SDA), the change in water withdrawals for the economy from 1997 to 2002 was allocated to changes in population, GDP per capita, water use intensity, production structure, and consumption patterns. The changes in population, GDP per capita, and water use intensity led to increased water withdrawals, while the changes in production structure and consumption patterns decreased water withdrawals from 1997 to 2002. Consumption patterns change was the largest net contributor to the change in water withdrawals. The model was used to predict aggregate changes in total water withdrawals from 2002 to 2010 due to known changes in population and GDP per capita; a more complete model assessment must await release of updated data on USGS water withdrawals and EIO data.
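
A simplified structural-decomposition sketch (not the USGS/BEA input-output calculation): withdrawals are written as population × GDP per capita × sector shares × sector water intensities, and the change between two years is split exactly among drivers by averaging marginal contributions over all factor orderings; the data are made up.

```python
# Toy structural decomposition analysis over four drivers with invented data.
import itertools
import numpy as np

def withdrawals(pop, gdp_pc, shares, intensity):
    return pop * gdp_pc * float(np.sum(shares * intensity))

def sda(f0, f1):
    """Exact decomposition: average each factor's marginal contribution
    over all orderings in which factors switch from year-0 to year-1 values."""
    names = list(f0)
    contrib = {k: 0.0 for k in names}
    perms = list(itertools.permutations(names))
    for order in perms:
        current = dict(f0)
        for k in order:
            before = withdrawals(**current)
            current[k] = f1[k]
            contrib[k] += (withdrawals(**current) - before) / len(perms)
    return contrib

y1997 = dict(pop=272e6, gdp_pc=31e3, shares=np.array([0.4, 0.6]),
             intensity=np.array([8e-6, 2e-6]))      # illustrative units
y2002 = dict(pop=288e6, gdp_pc=33e3, shares=np.array([0.35, 0.65]),
             intensity=np.array([7e-6, 2e-6]))
parts = sda(y1997, y2002)
print(parts)
print("sum of parts =", sum(parts.values()),
      "actual change =", withdrawals(**y2002) - withdrawals(**y1997))
```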


Subject(s)
Industry/statistics & numerical data; Water Supply/statistics & numerical data; United States; Water
12.
Environ Sci Technol ; 48(11): 6247-55, 2014 Jun 03.
Article in English | MEDLINE | ID: mdl-24824160

ABSTRACT

Carbon capture and sequestration (CCS) is a technology that provides a near-term solution to reduce anthropogenic CO2 emissions to the atmosphere and reduce our impact on the climate system. Assessments of carbon sequestration resources that have been made for North America using existing methodologies likely underestimate uncertainty and variability in the reservoir parameters. This paper describes a geostatistical model developed to estimate the CO2 storage resource in sedimentary formations. The proposed stochastic model accounts for the spatial distribution of reservoir properties and is implemented in a case study of the Oriskany Formation of the Appalachian sedimentary basin. Results indicate that the CO2 storage resource for the Pennsylvania part of the Oriskany Formation has substantial spatial variation due to heterogeneity of formation properties and basin geology, leading to significant uncertainty in the storage assessment. The sequestration resource estimate for the Pennsylvania part of the Oriskany Formation, calculated with an effective efficiency factor of E = 5%, ranges from 0.15 to 1.01 gigatonnes (Gt) of CO2, with a mean value of 0.52 Gt. The methodology is generalizable to other sedimentary formations in which site-specific trend analyses and statistical models are developed to estimate the CO2 sequestration storage capacity and its uncertainty. More precise CO2 storage resource estimates will provide better recommendations for government and industry leaders and inform their decisions on which greenhouse gas mitigation measures are best fit for their regions.
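
The volumetric side of such an assessment can be sketched with a hedged Monte Carlo version of the standard G = A·h·φ·ρ·E formula; the paper's model is geostatistical and spatially explicit, and the distributions below are illustrative assumptions, not Oriskany data.

```python
# Monte Carlo volumetric storage estimate with illustrative parameter ranges.
import numpy as np

rng = np.random.default_rng(5)
N = 100000
area = 50e9                                 # m^2 (assumed constant)
thickness = rng.triangular(3, 8, 20, N)     # m
porosity = rng.uniform(0.02, 0.12, N)       # -
rho_co2 = rng.uniform(600, 750, N)          # kg/m^3 at reservoir conditions
E = 0.05                                    # storage efficiency factor

G_gt = area * thickness * porosity * rho_co2 * E / 1e12   # Gt CO2
print(f"P10-P90: {np.percentile(G_gt, 10):.2f}-{np.percentile(G_gt, 90):.2f} Gt, "
      f"mean {G_gt.mean():.2f} Gt")
```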


Subject(s)
Carbon Dioxide/chemistry; Carbon Sequestration; Geologic Sediments/chemistry; Models, Theoretical; Carbon Dioxide/analysis; Ecology; Pennsylvania; Stochastic Processes
13.
Environ Sci Technol ; 48(15): 8289-97, 2014.
Article in English | MEDLINE | ID: mdl-24983403

ABSTRACT

A broad assessment is provided of the current state of knowledge regarding the risks associated with shale gas development and their governance. For the principal domains of risk, we identify observed and potential hazards and promising mitigation options to address them, characterizing current knowledge and research needs. Important unresolved research questions are identified for each area of risk; however, certain domains exhibit especially acute deficits of knowledge and attention, including integrated studies of public health, ecosystems, air quality, socioeconomic impacts on communities, and climate change. For these, current research and analysis are insufficient to either confirm or preclude important impacts. The rapidly evolving landscape of shale gas governance in the U.S. is also assessed, noting challenges and opportunities associated with the current decentralized (state-focused) system of regulation. We briefly review emerging approaches to shale gas governance in other nations, and consider new governance initiatives and options in the U.S. involving voluntary industry certification, comprehensive development plans, financial instruments, and possible future federal roles. To encompass the multiple relevant disciplines, address the complexities of the evolving shale gas system, and reduce the key uncertainties that stand in the way of improved management, a coordinated multiagency federal research effort will need to be implemented.


Subject(s)
Extraction and Processing Industry; Natural Gas; Risk; Climate Change; Government Regulation; Humans; Public Health; United States
14.
Risk Anal ; 34(11): 1978-94, 2014 Nov.
Article in English | MEDLINE | ID: mdl-24954376

ABSTRACT

While scientific studies may help conflicting stakeholders come to agreement on a best management option or policy, often they do not. We review the factors affecting trust in the efficacy and objectivity of scientific studies in an analytical-deliberative process where conflict is present, and show how they may be incorporated in an extension to the traditional Bayesian decision model. The extended framework considers stakeholders who differ in their prior beliefs regarding the probability of possible outcomes (in particular, whether a proposed technology is hazardous), differ in their valuations of these outcomes, and differ in their assessment of the ability of a proposed study to resolve the uncertainty in the outcomes and their hazards--as measured by their perceived false positive and false negative rates for the study. The Bayesian model predicts stakeholder-specific preposterior probabilities of consensus, as well as pathways for increasing these probabilities, providing important insights into the value of scientific information in an analytic-deliberative decision process where agreement is sought. It also helps to identify the interactions among perceived risk and benefit allocations, scientific beliefs, and trust in proposed scientific studies when determining whether a consensus can be achieved. The article provides examples to illustrate the method, including an adaptation of a recent decision analysis for managing the health risks of electromagnetic fields from high voltage transmission lines.
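
A toy rendering of the preposterior-consensus idea, not the article's full framework: two stakeholders hold different priors that a technology is hazardous and different perceived false-positive/false-negative rates for a proposed study, and we compute the probability that both end up on the same side of a decision threshold after seeing the result; all numbers are illustrative.

```python
# Preposterior probability of consensus under a simple Bayesian update.
def posterior(prior, fp, fn, result_positive):
    sens, spec = 1 - fn, 1 - fp
    if result_positive:
        num, den = sens * prior, sens * prior + fp * (1 - prior)
    else:
        num, den = fn * prior, fn * prior + spec * (1 - prior)
    return num / den

def prob_consensus(stakeholders, threshold=0.5):
    # Marginal probability of a positive study result, from an assumed
    # analyst reference prior (a simplification of the article's setup).
    ref_prior, ref_fp, ref_fn = 0.3, 0.1, 0.1
    p_pos = (1 - ref_fn) * ref_prior + ref_fp * (1 - ref_prior)
    p_consensus = 0.0
    for result_positive, p_r in ((True, p_pos), (False, 1 - p_pos)):
        verdicts = [posterior(*s, result_positive) > threshold for s in stakeholders]
        if all(verdicts) or not any(verdicts):   # everyone on the same side
            p_consensus += p_r
    return p_consensus

# (prior that the technology is hazardous, perceived FP rate, perceived FN rate)
stakeholders = [(0.7, 0.05, 0.20),   # skeptical of the technology, trusts the study
                (0.2, 0.30, 0.05)]   # favorable to the technology, distrusts positives
print(prob_consensus(stakeholders))
```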

15.
Environ Sci Technol ; 47(3): 1407-15, 2013 Feb 05.
Article in English | MEDLINE | ID: mdl-23253153

ABSTRACT

The prediction of carbon dioxide solubility in brine at conditions relevant to carbon sequestration (i.e., high temperature, pressure, and salt concentration (T-P-X)) is crucial when this technology is applied. Eleven mathematical models for predicting CO(2) solubility in brine are compared and considered for inclusion in a multimodel predictive system. Model goodness of fit is evaluated over the temperature range 304-433 K, pressure range 74-500 bar, and salt concentration range 0-7 m (NaCl equivalent), using 173 published CO(2) solubility measurements, particularly selected for those conditions. The performance of each model is assessed using various statistical methods, including the Akaike Information Criterion (AIC) and the Bayesian Information Criterion (BIC). Different models emerge as best fits for different subranges of the input conditions. A classification tree is generated using machine learning methods to predict the best-performing model under different T-P-X subranges, allowing development of a multimodel predictive system (MMoPS) that selects and applies the model expected to yield the most accurate CO(2) solubility prediction. Statistical analysis of the MMoPS predictions, including a stratified 5-fold cross validation, shows that MMoPS outperforms each individual model and increases the overall accuracy of CO(2) solubility prediction across the range of T-P-X conditions likely to be encountered in carbon sequestration applications.
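
A minimal sketch of the model-selection idea behind such a system (not the published MMoPS): T-P-X conditions labeled with whichever candidate model fit best are used to train a classification tree that routes new conditions to a model; the training labels below are synthetic stand-ins.

```python
# Classification tree that picks a "best model" label from T-P-X conditions.
import numpy as np
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(6)
n = 500
T = rng.uniform(304, 433, n)          # K
P = rng.uniform(74, 500, n)           # bar
X = rng.uniform(0, 7, n)              # m NaCl equivalent

# Synthetic labels: pretend model A wins at low salinity,
# model B at high temperature and high salinity, model C otherwise.
best = np.where(X < 2, "A", np.where(T > 380, "B", "C"))

tree = DecisionTreeClassifier(max_depth=3).fit(np.column_stack([T, P, X]), best)
print(tree.predict([[350.0, 200.0, 1.0], [420.0, 300.0, 5.0]]))
```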


Subject(s)
Carbon Dioxide/chemistry; Models, Chemical; Salinity; Water/chemistry; Carbon/analysis; Solubility
16.
Risk Anal ; 33(12): 2126-41, 2013 Dec.
Article in English | MEDLINE | ID: mdl-23763387

ABSTRACT

The U.S. Department of Energy has estimated that over 50 GW of offshore wind power will be required for the United States to generate 20% of its electricity from wind. Developers are actively planning offshore wind farms along the U.S. Atlantic and Gulf coasts and several leases have been signed for offshore sites. These planned projects are in areas that are sometimes struck by hurricanes. We present a method to estimate the catastrophe risk to offshore wind power using simulated hurricanes. Using this method, we estimate the fraction of offshore wind power simultaneously offline and the cumulative damage in a region. In Texas, the most vulnerable region we studied, 10% of offshore wind power could be offline simultaneously because of hurricane damage with a 100-year return period and 6% could be destroyed in any 10-year period. We also estimate the risks to single wind farms in four representative locations; we find the risks are significant but lower than those estimated in previously published results. Much of the hurricane risk to offshore wind turbines can be mitigated by designing turbines for higher maximum wind speeds, ensuring that turbine nacelles can turn quickly to track the wind direction even when grid power is lost, and building in areas with lower risk.
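
One hedged way to turn simulated annual hurricane damage into a return-period statistic is sketched below; it uses an invented damage distribution rather than the paper's hurricane catalog.

```python
# Annual fraction of regional capacity offline, read off at the level exceeded
# on average once per 100 years. Storm frequency and damage are illustrative.
import numpy as np

rng = np.random.default_rng(7)
n_years = 100000

n_storms = rng.poisson(0.6, n_years)                 # landfalling storms per year
frac_offline = np.zeros(n_years)
for i, k in enumerate(n_storms):
    if k:
        # each storm knocks a Beta-distributed fraction of capacity offline
        frac_offline[i] = 1 - np.prod(1 - rng.beta(0.5, 8.0, k))

return_period_100 = np.quantile(frac_offline, 1 - 1 / 100)
print(f"fraction offline with 100-year return period: {return_period_100:.2f}")
```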

17.
Langmuir ; 28(28): 10334-47, 2012 Jul 17.
Article in English | MEDLINE | ID: mdl-22708677

ABSTRACT

Soft particle electrokinetic models have been used to determine adsorbed nonionic polymer and polyelectrolyte layer properties on nanoparticles or colloids by fitting electrophoretic mobility data. Ohshima first established the formalism for these models and provided analytical approximations ( Ohshima, H. Adv. Colloid Interface Sci.1995, 62, 189 ). More recently, exact numerical solutions have been developed, which account for polarization and relaxation effects and require fewer assumptions on the particle and soft layer properties. This paper characterizes statistical uncertainty in the polyelectrolyte layer charge density, layer thickness, and permeability (Brinkman screening length) obtained from fitting data to either the analytical or numerical electrokinetic models. Various combinations of particle core and polymer layer properties are investigated to determine the range of systems for which this analysis can provide a solution with reasonably small uncertainty bounds, particularly for layer thickness. Identifiability of layer thickness in the analytical model ranges from poor confidence for cases with thick, highly charged coatings, to good confidence for cases with thin, low-charged coatings. Identifiability is similar for the numerical model, except that sensitivity is improved at very high charge and permeability, where polarization and relaxation effects are significant. For some poorly identifiable cases, parameter reduction can reduce collinearity to improve identifiability. Analysis of experimental data yielded results consistent with expectations from the simulated theoretical cases. Identifiability of layer charge density and permeability is also evaluated. Guidelines are suggested for evaluation of statistical confidence in polymer and polyelectrolyte layer parameters determined by application of the soft particle electrokinetic theory.
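
The uncertainty-quantification step can be illustrated generically (the sketch below uses a placeholder two-parameter function, not Ohshima's soft-particle model): fit synthetic mobility data by nonlinear least squares and report approximate 95% confidence intervals from the fit covariance.

```python
# Generic nonlinear fit with parameter uncertainty; the model function is a
# stand-in, not the soft-particle electrokinetic theory.
import numpy as np
from scipy.optimize import curve_fit

def mobility_model(ionic_strength, layer_charge, layer_thickness):
    # placeholder functional form for illustration only
    return layer_charge * layer_thickness / (1.0 + np.sqrt(ionic_strength) * layer_thickness)

rng = np.random.default_rng(8)
I = np.logspace(-3, -1, 12)                                    # ionic strength, M
mu = mobility_model(I, 2.0, 5.0) + rng.normal(0, 0.05, I.size) # synthetic data

popt, pcov = curve_fit(mobility_model, I, mu, p0=(1.0, 1.0))
se = np.sqrt(np.diag(pcov))
for name, p, s in zip(("layer_charge", "layer_thickness"), popt, se):
    print(f"{name}: {p:.2f} +/- {1.96 * s:.2f}")
```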


Subject(s)
Polymers/analysis; Colloids/chemistry; Electrochemistry; Electrolytes/analysis; Particle Size; Surface Properties
18.
Environ Manage ; 50(6): 1204-18, 2012 Dec.
Article in English | MEDLINE | ID: mdl-23052473

ABSTRACT

We present a decision support framework for science-based assessment and multi-stakeholder deliberation. The framework consists of two parts: a DPSIR (Drivers-Pressures-States-Impacts-Responses) analysis to identify the important causal relationships among anthropogenic environmental stressors, processes, and outcomes; and a Decision Landscape analysis to depict the legal, social, and institutional dimensions of environmental decisions. The Decision Landscape incorporates interactions among government agencies, regulated businesses, non-government organizations, and other stakeholders. It also identifies where scientific information regarding environmental processes is collected and transmitted to improve knowledge about elements of the DPSIR and to improve the scientific basis for decisions. Our application of the decision support framework to coral reef protection and restoration in the Florida Keys, focusing on anthropogenic stressors such as wastewater, proved to be successful and offered several insights. Using information from a management plan, it was possible to capture the current state of the science with a DPSIR analysis, as well as important decision options, decision makers, and applicable laws with the Decision Landscape analysis. A structured elicitation of values and beliefs conducted at a coral reef management workshop held in Key West, Florida, provided a diversity of opinion and also indicated a prioritization of several environmental stressors affecting coral reef health. The integrated DPSIR/Decision Landscape framework for the Florida Keys, developed based on the elicited opinion and the DPSIR analysis, can be used to inform management decisions, to reveal the role that further scientific information and research might play in populating the framework, and to facilitate better-informed agreement among participants.


Subject(s)
Decision Making; Ecosystem; Environmental Monitoring
19.
Environ Sci Technol ; 45(15): 6380-7, 2011 Aug 01.
Article in English | MEDLINE | ID: mdl-21732603

ABSTRACT

A methodology is developed for predicting the performance of near-surface CO(2) leak detection systems at geologic sequestration sites. The methodology integrates site characterization and modeling to predict the statistical properties of natural CO(2) fluxes, the transport of CO(2) from potential subsurface leakage points, and the detection of CO(2) surface fluxes by the monitoring network. The probability of leak detection is computed as the probability that the leakage signal is sufficient to increase the total flux beyond a statistically determined threshold. The methodology is illustrated for a highly idealized site monitored with CO(2) accumulation chamber measurements taken on a uniform grid. The TOUGH2 code is used to predict the spatial profile of surface CO(2) fluxes resulting from different leakage rates and different soil permeabilities. A response surface is fit to the TOUGH2 results to allow interpolation across a continuous range of values of permeability and leakage rate. The spatial distribution of leakage probability is assumed uniform in this application. Nonlinear, nonmonotonic relationships of network performance to soil permeability and network density are evident. In general, dense networks (with ∼10-20 m between monitors) are required to ensure a moderate to high probability of leak detection.
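
A simplified sketch of the detection-probability calculation (the paper couples TOUGH2 transport simulations with site-characterized background fluxes; here both the background flux and the leakage signal at the nearest monitor are assumed to follow simple illustrative distributions).

```python
# Probability that background-plus-leakage flux exceeds a statistical threshold
# set from the background distribution. All parameters are illustrative.
import numpy as np

rng = np.random.default_rng(9)

def prob_detection(signal_at_monitor, bg_mean=2.0, bg_sd=0.8,
                   false_positive_rate=0.05, n=200000):
    background = rng.normal(bg_mean, bg_sd, n)
    threshold = np.quantile(background, 1 - false_positive_rate)
    return np.mean(background + signal_at_monitor > threshold)

# Made-up decay of the leakage signal with monitor grid spacing.
for grid_spacing, signal in [(10, 2.5), (20, 1.0), (40, 0.2)]:
    print(f"{grid_spacing} m grid: P(detect) = {prob_detection(signal):.2f}")
```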


Subject(s)
Carbon Dioxide/analysis; Environmental Monitoring/methods; Probability; Computer Simulation; Linear Models; Markov Chains; Montana; Monte Carlo Method; Soil/chemistry; Surface Properties; Temperature
20.
Risk Anal ; 31(10): 1561-75, 2011 Oct.
Article in English | MEDLINE | ID: mdl-21388425

ABSTRACT

A methodology is presented for assessing the information value of an additional dosage experiment in existing bioassay studies. The analysis demonstrates the potential reduction in the uncertainty of toxicity metrics derived from expanded studies, providing insights for future studies. Bayesian methods are used to fit alternative dose-response models using Markov chain Monte Carlo (MCMC) simulation for parameter estimation and Bayesian model averaging (BMA) is used to compare and combine the alternative models. BMA predictions for benchmark dose (BMD) are developed, with uncertainty in these predictions used to derive the lower bound BMDL. The MCMC and BMA results provide a basis for a subsequent Monte Carlo analysis that backcasts the dosage where an additional test group would have been most beneficial in reducing the uncertainty in the BMD prediction, along with the magnitude of the expected uncertainty reduction. Uncertainty reductions are measured in terms of reduced interval widths of predicted BMD values and increases in BMDL values that occur as a result of this reduced uncertainty. The methodology is illustrated using two existing data sets for TCDD carcinogenicity, fitted with two alternative dose-response models (logistic and quantal-linear). The example shows that an additional dose at a relatively high value would have been most effective for reducing the uncertainty in BMA BMD estimates, with predicted reductions in the widths of uncertainty intervals of approximately 30%, and expected increases in BMDL values of 5-10%. The results demonstrate that dose selection for studies that subsequently inform dose-response models can benefit from consideration of how these models will be fit, combined, and interpreted.
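
A worked sketch of the BMD and model-averaging bookkeeping only: benchmark doses for a 10% extra risk are computed under a logistic and a quantal-linear model and combined with assumed BMA weights; the parameter values are not fitted with MCMC as in the paper and are purely illustrative.

```python
# BMD inversion for two dose-response forms plus a weighted model average.
import math

BMR = 0.10   # benchmark response (extra risk)

def extra_risk_logistic(dose, a=-3.0, b=0.8):
    p0 = 1 / (1 + math.exp(-a))
    p = 1 / (1 + math.exp(-(a + b * dose)))
    return (p - p0) / (1 - p0)

def extra_risk_quantal_linear(dose, beta=0.15):
    return 1 - math.exp(-beta * dose)          # background term cancels in extra risk

def bmd(extra_risk, lo=0.0, hi=100.0, tol=1e-6):
    """Invert extra_risk(dose) = BMR by bisection (extra risk increases with dose)."""
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        lo, hi = (mid, hi) if extra_risk(mid) < BMR else (lo, mid)
    return 0.5 * (lo + hi)

bmds = {"logistic": bmd(extra_risk_logistic),
        "quantal-linear": bmd(extra_risk_quantal_linear)}
weights = {"logistic": 0.6, "quantal-linear": 0.4}     # assumed BMA weights
bma_bmd = sum(weights[m] * bmds[m] for m in bmds)
print(bmds, "BMA BMD =", round(bma_bmd, 3))
```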


Subject(s)
Models, Theoretical; Uncertainty; Animals; Bayes Theorem; Dose-Response Relationship, Drug; Humans; Monte Carlo Method