Results 1 - 20 of 93
1.
J Chem Inf Model ; 64(4): 1277-1289, 2024 Feb 26.
Article in English | MEDLINE | ID: mdl-38359461

ABSTRACT

Predicting the synthesizability of a new molecule remains an unsolved challenge that chemists have long tackled with heuristic approaches. Here, we report a new method for predicting synthesizability using a simple yet accurate thermochemical descriptor. We introduce Emin, the energy difference between a molecule and its lowest energy constitutional isomer, as a synthesizability predictor that is accurate, physically meaningful, and first-principles based. We apply Emin to 134,000 molecules in the QM9 data set and find that Emin is accurate when used alone and reduces incorrect predictions of "synthesizable" by up to 52% when used to augment commonly used prediction methods. Our work illustrates how first-principles thermochemistry and heuristic approximations for molecular stability are complementary, opening a new direction for synthesizability prediction methods.


Subject(s)
Heuristics , Isomerism
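The Emin descriptor above reduces to a simple energy difference. A minimal sketch, using hypothetical isomer energies (the paper computes these from first-principles thermochemistry):

```python
# Sketch of the Emin descriptor from the abstract above: the energy gap
# between a molecule and its lowest-energy constitutional isomer.
# All energy values below are hypothetical illustrative numbers.

def emin(molecule_energy, isomer_energies):
    """Energy difference (same units as inputs) between a molecule and
    the most stable constitutional isomer in the supplied list."""
    lowest = min(isomer_energies + [molecule_energy])
    return molecule_energy - lowest

# Hypothetical energies in kcal/mol for one molecule and three isomers.
target = -10.0
isomers = [-25.0, -18.0, -7.0]
gap = emin(target, isomers)  # -10.0 - (-25.0) = 15.0
# A low Emin means the molecule sits close to its most stable isomer,
# which the paper uses as a signal of synthesizability.
```

A threshold on this gap then augments heuristic synthesizability scores, per the abstract.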
2.
Proc Natl Acad Sci U S A ; 118(21)2021 05 25.
Article in English | MEDLINE | ID: mdl-33972410

ABSTRACT

The genome of severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2) has a capping modification at the 5'-untranslated region (UTR) to prevent its degradation by host nucleases. These modifications are performed by the Nsp10/14 and Nsp10/16 heterodimers using S-adenosylmethionine as the methyl donor. The Nsp10/16 heterodimer is responsible for methylation at the ribose 2'-O position of the first nucleotide. To investigate the conformational changes of the complex during 2'-O methyltransferase activity, we used a fixed-target serial synchrotron crystallography method at room temperature. We determined crystal structures of Nsp10/16 with substrates and products that revealed the states before and after methylation, occurring within the crystals during the experiments. Here we report the crystal structure of Nsp10/16 in complex with the Cap-1 analog (m7GpppAm2'-O). Inhibition of Nsp16 activity may reduce viral proliferation, making this protein an attractive drug target.


Subject(s)
RNA Caps/metabolism , RNA, Messenger/metabolism , RNA, Viral/metabolism , SARS-CoV-2/chemistry , Crystallography , Methylation , Methyltransferases/chemistry , Methyltransferases/metabolism , Multiprotein Complexes/chemistry , Multiprotein Complexes/metabolism , RNA Cap Analogs/chemistry , RNA Cap Analogs/metabolism , RNA Caps/chemistry , RNA, Messenger/chemistry , RNA, Viral/chemistry , S-Adenosylhomocysteine/chemistry , S-Adenosylhomocysteine/metabolism , S-Adenosylmethionine/chemistry , S-Adenosylmethionine/metabolism , SARS-CoV-2/genetics , SARS-CoV-2/metabolism , Synchrotrons , Viral Nonstructural Proteins/chemistry , Viral Nonstructural Proteins/metabolism , Viral Regulatory and Accessory Proteins/chemistry , Viral Regulatory and Accessory Proteins/metabolism
3.
J Chem Phys ; 159(2)2023 Jul 14.
Article in English | MEDLINE | ID: mdl-37428051

ABSTRACT

Machine learning interatomic potentials have emerged as a powerful tool for bypassing the spatiotemporal limitations of ab initio simulations, but major challenges remain in their efficient parameterization. We present AL4GAP, an ensemble active learning software workflow for generating multicomposition Gaussian approximation potentials (GAP) for arbitrary molten salt mixtures. The workflow capabilities include: (1) setting up user-defined combinatorial chemical spaces of charge-neutral molten salt mixtures spanning 11 cations (Li, Na, K, Rb, Cs, Mg, Ca, Sr, Ba, and two heavy species, Nd and Th) and 4 anions (F, Cl, Br, and I), (2) configurational sampling using low-cost empirical parameterizations, (3) active learning for down-selecting configurational samples for single-point density functional theory calculations at the level of the Strongly Constrained and Appropriately Normed (SCAN) exchange-correlation functional, and (4) Bayesian optimization for hyperparameter tuning of two-body and many-body GAP models. We apply the AL4GAP workflow to showcase high-throughput generation of five independent GAP models for multicomposition binary-mixture melts, each of increasing complexity with respect to charge valency and electronic structure, namely: LiCl-KCl, NaCl-CaCl2, KCl-NdCl3, CaCl2-NdCl3, and KCl-ThCl4. Our results indicate that GAP models can accurately predict structure for diverse molten salt mixtures with density functional theory (DFT)-SCAN accuracy, capturing the intermediate-range ordering characteristic of the multivalent cationic melts.
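The active-learning step (3) above down-selects the configurations on which an ensemble of surrogate models disagrees most. A minimal sketch of that selection logic, with toy linear functions standing in for GAP ensemble members:

```python
# Minimal sketch of ensemble-based active-learning selection as
# described in the abstract above: configurations with the highest
# ensemble prediction variance are sent on for single-point DFT.
# The "models" here are hypothetical toy functions, not GAP models.
from statistics import pvariance

def select_for_dft(configs, ensemble, k):
    """Return the k configurations with the highest ensemble variance."""
    scored = [(pvariance([m(x) for m in ensemble]), x) for x in configs]
    scored.sort(reverse=True)
    return [x for _, x in scored[:k]]

# Toy ensemble: three surrogates that agree near x=0 and diverge for
# large x, so large-x configurations are the most uncertain.
ensemble = [lambda x: 1.0 * x, lambda x: 1.1 * x, lambda x: 0.9 * x]
configs = [0.1, 1.0, 5.0, 10.0]
picked = select_for_dft(configs, ensemble, 2)  # [10.0, 5.0]
```

Only the selected configurations would then incur the cost of a SCAN-level DFT calculation.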

4.
Proc Natl Acad Sci U S A ; 117(13): 7071-7081, 2020 03 31.
Article in English | MEDLINE | ID: mdl-32179678

ABSTRACT

A limited nuclear war between India and Pakistan could ignite fires large enough to emit more than 5 Tg of soot into the stratosphere. Climate model simulations have shown severe resulting climate perturbations with declines in global mean temperature by 1.8 °C and precipitation by 8%, for at least 5 y. Here we evaluate impacts for the global food system. Six harmonized state-of-the-art crop models show that global caloric production from maize, wheat, rice, and soybean falls by 13 (±1)%, 11 (±8)%, 3 (±5)%, and 17 (±2)% over 5 y. Total single-year losses of 12 (±4)% quadruple the largest observed historical anomaly and exceed impacts caused by historic droughts and volcanic eruptions. Colder temperatures drive losses more than changes in precipitation and solar radiation, leading to strongest impacts in temperate regions poleward of 30°N, including the United States, Europe, and China for 10 to 15 y. Integrated food trade network analyses show that domestic reserves and global trade can largely buffer the production anomaly in the first year. Persistent multiyear losses, however, would constrain domestic food availability and propagate to the Global South, especially to food-insecure countries. By year 5, maize and wheat availability would decrease by 13% globally and by more than 20% in 71 countries with a cumulative population of 1.3 billion people. In view of increasing instability in South Asia, this study shows that a regional conflict using <1% of the worldwide nuclear arsenal could have adverse consequences for global food security unmatched in modern history.


Subject(s)
Climate , Edible Grain , Food Supply , Models, Biological , Nuclear Warfare , Glycine max
5.
Proc Natl Acad Sci U S A ; 117(25): 14005-14014, 2020 06 23.
Article in English | MEDLINE | ID: mdl-32513736

ABSTRACT

Paleozoic and Precambrian sedimentary successions frequently contain massive dolomicrite [CaMg(CO3)2] units despite kinetic inhibitions to nucleation and precipitation of dolomite at Earth surface temperatures (<60 °C). This paradoxical observation is known as the "dolomite problem." Accordingly, the genesis of these dolostones is usually attributed to burial-hydrothermal dolomitization of primary limestones (CaCO3) at temperatures of >100 °C, thus raising doubt about the validity of these deposits as archives of Earth surface environments. We present a high-resolution, >63-My-long clumped-isotope temperature (TΔ47) record of shallow-marine dolomicrites from two drillcores of the Ediacaran (635 to 541 Ma) Doushantuo Formation in South China. Our TΔ47 record indicates that a majority (87%) of these dolostones formed at temperatures of <100 °C. When considering the regional thermal history, modeling of the influence of solid-state reordering on our TΔ47 record further suggests that most of the studied dolostones formed at temperatures of <60 °C, providing direct evidence of a low-temperature origin of these dolostones. Furthermore, calculated δ18O values of diagenetic fluids, rare earth element plus yttrium compositions, and petrographic observations of these dolostones are consistent with an early diagenetic origin in a rock-buffered environment. We thus propose that a precursor precipitate from seawater was subsequently dolomitized during early diagenesis in a near-surface setting to produce the large volume of dolostones in the Doushantuo Formation. Our findings suggest that the preponderance of dolomite in Paleozoic and Precambrian deposits likely reflects oceanic conditions specific to those eras and that dolostones can be faithful recorders of environmental conditions in the early oceans.

6.
J Synchrotron Radiat ; 29(Pt 5): 1141-1151, 2022 Sep 01.
Article in English | MEDLINE | ID: mdl-36073872

ABSTRACT

Serial synchrotron crystallography enables the study of protein structures under physiological temperature and reduced radiation damage by collection of data from thousands of crystals. The Structural Biology Center at Sector 19 of the Advanced Photon Source has implemented a fixed-target approach with a new 3D-printed mesh-holder optimized for sample handling. The holder immobilizes a crystal suspension or droplet emulsion on a nylon mesh, trapping and sealing a near-monolayer of crystals in its mother liquor between two thin Mylar films. Data are collected rapidly in scan mode using piezoelectric linear stages assembled in an XYZ arrangement, controlled with a graphical user interface, and analyzed in near real time using a high-performance computing pipeline. Here, the system was applied to two β-lactamases: a class D serine β-lactamase from Chitinophaga pinensis DSM 2588 and the L1 metallo-β-lactamase from Stenotrophomonas maltophilia K279a.


Subject(s)
Stenotrophomonas maltophilia , Biology , Crystallography , Proteins
7.
J Chem Inf Model ; 62(1): 116-128, 2022 01 10.
Article in English | MEDLINE | ID: mdl-34793155

ABSTRACT

Despite the recent availability of vaccines against the acute respiratory syndrome coronavirus 2 (SARS-CoV-2), the search for inhibitory therapeutic agents has assumed importance especially in the context of emerging new viral variants. In this paper, we describe the discovery of a novel noncovalent small-molecule inhibitor, MCULE-5948770040, that binds to and inhibits the SARS-CoV-2 main protease (Mpro) by employing a scalable high-throughput virtual screening (HTVS) framework and a targeted compound library of over 6.5 million molecules that could be readily ordered and purchased. Our HTVS framework leverages the U.S. supercomputing infrastructure, achieving nearly 91% resource utilization and nearly 126 million docking calculations per hour. Downstream biochemical assays validate this Mpro inhibitor with an inhibition constant (Ki) of 2.9 µM (95% CI 2.2, 4.0). Furthermore, using room-temperature X-ray crystallography, we show that MCULE-5948770040 binds to a cleft in the primary binding site of Mpro, forming stable hydrogen-bond and hydrophobic interactions. We then used multiple µs-timescale molecular dynamics (MD) simulations and machine learning (ML) techniques to elucidate how the bound ligand alters the conformational states accessed by Mpro, involving motions both proximal and distal to the binding site. Together, our results demonstrate how MCULE-5948770040 inhibits Mpro and offers a springboard for further therapeutic design.


Subject(s)
COVID-19 , Protease Inhibitors , Antiviral Agents , Coronavirus 3C Proteases , Humans , Molecular Docking Simulation , Molecular Dynamics Simulation , Orotic Acid/analogs & derivatives , Piperazines , SARS-CoV-2
8.
J Phys Chem A ; 126(27): 4528-4536, 2022 Jul 14.
Article in English | MEDLINE | ID: mdl-35786965

ABSTRACT

G4MP2 theory has proven to be a reliable and accurate quantum chemical composite method for the calculation of molecular energies, using an approximation based on second-order perturbation theory to lower computational costs compared to G4 theory. However, it has been found to have significantly increased errors when applied to larger organic molecules with 10 or more nonhydrogen atoms. We report here on an investigation of the cause of the failure of G4MP2 theory for such larger molecules. One source of error is found to be the "higher-level correction (HLC)", which is meant to correct for deficiencies in correlation contributions to the calculated energies. This is because the HLC assumes that the contribution is independent of the element and the type of bonding involved, both of which become more important with larger molecules. We address this problem by adding an atom-specific correction, dependent on atom type but not bond type, to the higher-level correction. We find that a G4MP2 method incorporating this modified higher-level correction, referred to as G4MP2A, matches the accuracy of G4 theory for computing enthalpies of formation, at much lower computational cost, both for a test set of molecules with fewer than 10 nonhydrogen atoms and for the set considered here with 10-14 such atoms. The G4MP2A method is also found to significantly improve ionization potentials and electron affinities. Finally, we implemented the G4MP2A energies in a machine learning method to predict molecular energies.
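The atom-specific correction described above amounts to a per-element additive term on top of the composite energy. A minimal sketch, with hypothetical correction values (not the fitted G4MP2A parameters):

```python
# Sketch of the atom-specific correction idea from the abstract above:
# add a per-element (bond-type-independent) correction term to a
# G4MP2-style energy. Correction values below are hypothetical.

def corrected_energy(e_g4mp2, atom_counts, corrections):
    """Apply an element-specific additive correction to an energy."""
    return e_g4mp2 + sum(corrections[el] * n for el, n in atom_counts.items())

# Hypothetical corrections in kcal/mol per atom of each element.
corrections = {"C": -0.10, "N": -0.15, "O": -0.05}
# A molecule with 6 carbons, 1 nitrogen, and 2 oxygens:
e = corrected_energy(-100.0, {"C": 6, "N": 1, "O": 2}, corrections)
# -100.0 + (6*-0.10 + 1*-0.15 + 2*-0.05) = -100.85
```

In the paper, the correction coefficients would be fitted against reference enthalpies of formation rather than chosen by hand.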

9.
Int J High Perform Comput Appl ; 36(5-6): 603-623, 2022 Nov.
Article in English | MEDLINE | ID: mdl-38464362

ABSTRACT

The severe acute respiratory syndrome coronavirus-2 (SARS-CoV-2) replication transcription complex (RTC) is a multi-domain protein responsible for replicating and transcribing the viral mRNA inside a human cell. Attacking RTC function with pharmaceutical compounds is a pathway to treating COVID-19. Conventional tools, e.g., cryo-electron microscopy and all-atom molecular dynamics (AAMD), do not provide sufficiently high resolution or timescale to capture important dynamics of this molecular machine. Consequently, we develop an innovative workflow that bridges the gap between these resolutions, using mesoscale fluctuating finite element analysis (FFEA) continuum simulations and a hierarchy of AI methods that continually learn and infer features for maintaining consistency between AAMD and FFEA simulations. We leverage a multi-site distributed workflow manager to orchestrate AI, FFEA, and AAMD jobs, providing optimal resource utilization across HPC centers. Our workflow provides unprecedented access to the SARS-CoV-2 RTC machinery, along with a general capability for AI-enabled multi-resolution simulations at scale.

10.
J Chem Inf Model ; 61(12): 5793-5803, 2021 12 27.
Article in English | MEDLINE | ID: mdl-34905348

ABSTRACT

Perfluoroalkyl and polyfluoroalkyl substances (PFAS) pose a significant hazard because of their widespread industrial uses, environmental persistence, and bioaccumulation. A growing, increasingly diverse inventory of PFAS, including 8163 chemicals, has recently been updated by the U.S. Environmental Protection Agency. However, with the exception of a handful of well-studied examples, little is known about their human toxicity potential because of the substantial resources required for in vivo toxicity experiments. We tackle the problem of expensive in vivo experiments by evaluating multiple machine learning (ML) methods, including random forests, deep neural networks (DNN), graph convolutional networks, and Gaussian processes, for predicting acute toxicity (e.g., median lethal dose, or LD50) of PFAS compounds. To address the scarcity of toxicity information for PFAS, publicly available datasets of oral rat LD50 for all organic compounds are aggregated and used to develop state-of-the-art ML source models for transfer learning. A total of 519 fluorinated compounds containing two or more C-F bonds with known toxicity are used for knowledge transfer to ensembles of the best-performing source model, DNN, to generate the target models for the PFAS domain with access to uncertainty. This study predicts toxicity for PFAS with a defined chemical structure. To further inform prediction confidence, the transfer-learned model is embedded within a SelectiveNet architecture, where the model is allowed to identify regions of prediction with greater confidence and abstain from those with high uncertainty using a calibrated cutoff rate.


Subject(s)
Fluorocarbons , Animals , Fluorocarbons/chemistry , Fluorocarbons/toxicity , Machine Learning , Neural Networks, Computer , Rats , Uncertainty
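The calibrated-cutoff abstention described in the abstract above can be sketched with a simple ensemble-spread rule. This is a stand-in for the SelectiveNet selection head, not its actual architecture; all values are hypothetical:

```python
# Sketch of selective prediction with abstention, as described in the
# abstract above: an ensemble predicts a toxicity value, and predictions
# whose ensemble spread exceeds a calibrated cutoff are abstained from.
# This toy rule stands in for the SelectiveNet selection mechanism.
from statistics import mean, pstdev

def predict_or_abstain(ensemble_preds, cutoff):
    """Return the mean prediction, or None when uncertainty is too high."""
    if pstdev(ensemble_preds) > cutoff:
        return None  # abstain: outside the model's confident region
    return mean(ensemble_preds)

confident = predict_or_abstain([2.0, 2.1, 1.9], cutoff=0.5)  # ~2.0
uncertain = predict_or_abstain([1.0, 5.0, 9.0], cutoff=0.5)  # None
```

The cutoff itself would be calibrated so that the model abstains at a target rate on held-out data.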
11.
J Phys Chem A ; 125(27): 5990-5998, 2021 Jul 15.
Article in English | MEDLINE | ID: mdl-34191512

ABSTRACT

The solvation properties of molecules, often estimated using quantum chemical simulations, are important in the synthesis of energy storage materials, drugs, and industrial chemicals. Here, we develop machine learning models of solvation energies to replace expensive quantum chemistry calculations with inexpensive-to-compute message-passing neural network models that require only the molecular graph as inputs. Our models are trained on a new database of solvation energies for 130,258 molecules taken from the QM9 dataset computed in five solvents (acetone, ethanol, acetonitrile, dimethyl sulfoxide, and water) via an implicit solvent model. Our best model achieves a mean absolute error of 0.5 kcal/mol for molecules with nine or fewer non-hydrogen atoms and 1 kcal/mol for molecules with between 10 and 14 non-hydrogen atoms. We make the entire dataset of 651,290 computed entries openly available and provide simple web and programmatic interfaces to enable others to run our solvation energy model on new molecules. This model calculates the solvation energies for molecules using only the SMILES string and also provides an estimate of whether each molecule is within the domain of applicability of our model. We envision that the dataset and models will provide the functionality needed for the rapid screening of large chemical spaces to discover improved molecules for many applications.
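The error metrics quoted above are mean absolute errors bucketed by molecule size. A minimal sketch of that bookkeeping, with hypothetical molecule records:

```python
# Sketch of the error reporting used in the abstract above: mean
# absolute error of predicted solvation energies, bucketed by
# heavy-atom count. All molecule records here are hypothetical.

def mae_by_bucket(records):
    """records: (n_heavy_atoms, predicted, reference) tuples.
    Returns MAE for molecules with <=9 and with 10-14 heavy atoms."""
    small = [abs(p - r) for n, p, r in records if n <= 9]
    large = [abs(p - r) for n, p, r in records if 10 <= n <= 14]
    return sum(small) / len(small), sum(large) / len(large)

data = [  # (heavy atoms, predicted, reference) in kcal/mol
    (8, -5.2, -5.0),
    (9, -3.1, -3.5),
    (12, -7.0, -6.0),
    (14, -2.0, -3.0),
]
mae_small, mae_large = mae_by_bucket(data)
# mae_small = (0.2 + 0.4) / 2 = 0.3; mae_large = (1.0 + 1.0) / 2 = 1.0
```

This mirrors how the paper reports 0.5 kcal/mol for nine-or-fewer heavy atoms and 1 kcal/mol for 10-14.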

12.
J Med Internet Res ; 23(12): e20028, 2021 12 02.
Article in English | MEDLINE | ID: mdl-34860667

ABSTRACT

BACKGROUND: The National Cancer Institute Informatics Technology for Cancer Research (ITCR) program provides a series of funding mechanisms to create an ecosystem of open-source software (OSS) that serves the needs of cancer research. As the ITCR ecosystem substantially grows, it faces the challenge of the long-term sustainability of the software being developed by ITCR grantees. To address this challenge, the ITCR sustainability and industry partnership working group (SIP-WG) was convened in 2019. OBJECTIVE: The charter of the SIP-WG is to investigate options to enhance the long-term sustainability of the OSS being developed by ITCR, in part by developing a collection of business model archetypes that can serve as sustainability plans for ITCR OSS development initiatives. The working group assembled models from the ITCR program, from other studies, and from the engagement of its extensive network of relationships with other organizations (eg, Chan Zuckerberg Initiative, Open Source Initiative, and Software Sustainability Institute) in support of this objective. METHODS: This paper reviews the existing sustainability models and describes 10 OSS use cases disseminated by the SIP-WG and others, including 3D Slicer, Bioconductor, Cytoscape, Globus, i2b2 (Informatics for Integrating Biology and the Bedside) and tranSMART, Insight Toolkit, Linux, Observational Health Data Sciences and Informatics tools, R, and REDCap (Research Electronic Data Capture), in 10 sustainability aspects: governance, documentation, code quality, support, ecosystem collaboration, security, legal, finance, marketing, and dependency hygiene. RESULTS: Information available to the public reveals that all 10 OSS have effective governance, comprehensive documentation, high code quality, reliable dependency hygiene, strong user and developer support, and active marketing. 
These OSS include a variety of licensing models (eg, general public license version 2, general public license version 3, Berkeley Software Distribution, and Apache 3) and financial models (eg, federal research funding, industry and membership support, and commercial support). However, detailed information on ecosystem collaboration and security is not publicly provided by most OSS. CONCLUSIONS: We recommend 6 essential attributes for research software: alignment with unmet scientific needs, a dedicated development team, a vibrant user community, a feasible licensing model, a sustainable financial model, and effective product management. We also stress important actions to be considered in future ITCR activities that involve the discussion of the sustainability and licensing models for ITCR OSS, the establishment of a central library, the allocation of consulting resources to code quality control, ecosystem collaboration, security, and dependency hygiene.


Subject(s)
Ecosystem , Neoplasms , Humans , Informatics , Neoplasms/therapy , Research , Software , Technology
13.
Philos Trans A Math Phys Eng Sci ; 378(2166): 20190056, 2020 Mar 06.
Article in English | MEDLINE | ID: mdl-31955678

ABSTRACT

As noted in Wikipedia, skin in the game refers to having 'incurred risk by being involved in achieving a goal', where 'skin is a synecdoche for the person involved, and game is the metaphor for actions on the field of play under discussion'. For exascale applications under development in the US Department of Energy Exascale Computing Project, nothing could be more apt, with the skin being exascale applications and the game being delivering comprehensive science-based computational applications that effectively exploit exascale high-performance computing technologies to provide breakthrough modelling and simulation and data science solutions. These solutions will yield high-confidence insights and answers to the most critical problems and challenges for the USA in scientific discovery, national security, energy assurance, economic competitiveness and advanced healthcare. This article is part of a discussion meeting issue 'Numerical algorithms for high-performance computational science'.

14.
J Phys Chem A ; 124(28): 5804-5811, 2020 Jul 16.
Article in English | MEDLINE | ID: mdl-32539388

ABSTRACT

High-fidelity quantum-chemical calculations can provide accurate predictions of molecular energies, but their high computational costs limit their utility, especially for larger molecules. We have shown in previous work that machine learning models trained on high-level quantum-chemical calculations (G4MP2) for organic molecules with one to nine non-hydrogen atoms can provide accurate predictions for other molecules of comparable size at much lower costs. Here we demonstrate that such models can also be used to effectively predict energies of molecules larger than those in the training set. To implement this strategy, we first established a set of 191 molecules with 10-14 non-hydrogen atoms having reliable experimental enthalpies of formation. We then assessed the accuracy of computed G4MP2 enthalpies of formation for these 191 molecules. The error in the G4MP2 results was somewhat larger than that for smaller molecules, and the reason for this increase is discussed. Two density functional methods, B3LYP and ωB97X-D, were also used on this set of molecules, with ωB97X-D found to perform better than B3LYP at predicting energies. The G4MP2 energies for the 191 molecules were then predicted using these two functionals with two machine learning methods, the FCHL-Δ and SchNet-Δ models, with the learning done on calculated energies of the one to nine non-hydrogen atom molecules. The better-performing model, FCHL-Δ, gave atomization energies of the 191 organic molecules with 10-14 non-hydrogen atoms within 0.4 kcal/mol of their G4MP2 energies. Thus, this work demonstrates that quantum-chemically informed machine learning can be used to successfully predict the energies of large organic molecules whose size is beyond that in the training set.
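The FCHL-Δ and SchNet-Δ models above are instances of Δ-learning: the model is trained on the difference between high-level and low-level energies, and predictions add that correction back onto the cheap calculation. A minimal sketch, with a trivial constant correction standing in for the real models:

```python
# Sketch of the Delta-learning setup described in the abstract above:
# learn the gap between high-level (G4MP2) and low-level (DFT) energies,
# then add the learned correction to a cheap DFT value at predict time.
# The "model" is a toy constant correction; all energies are hypothetical.
from statistics import mean

def train_delta(dft_energies, g4mp2_energies):
    """Learn a (toy) constant correction from paired training energies."""
    return mean(h - l for l, h in zip(dft_energies, g4mp2_energies))

def predict(dft_energy, delta):
    """Estimate the high-level energy from a low-level one."""
    return dft_energy + delta

delta = train_delta([-100.0, -200.0], [-101.0, -201.0])  # learns -1.0
estimate = predict(-150.0, delta)  # -151.0
```

In the paper, the learned correction is of course molecule-dependent, which is what lets the approach extrapolate to molecules larger than the training set.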

15.
J Opt Soc Am A Opt Image Sci Vis ; 37(3): 422-434, 2020 Mar 01.
Article in English | MEDLINE | ID: mdl-32118926

ABSTRACT

Synchrotron-based x-ray tomography is a noninvasive imaging technique that allows for reconstructing the internal structure of materials at high spatial resolutions from tens of micrometers to a few nanometers. In order to resolve sample features at smaller length scales, however, a higher radiation dose is required. Therefore, the limitation on the achievable resolution is set primarily by noise at these length scales. We present TomoGAN, a denoising technique based on generative adversarial networks, for improving the quality of reconstructed images for low-dose imaging conditions. We evaluate our approach in two photon-budget-limited experimental conditions: (1) sufficient number of low-dose projections (based on Nyquist sampling), and (2) insufficient or limited number of high-dose projections. In both cases, the angular sampling is assumed to be isotropic, and the photon budget throughout the experiment is fixed based on the maximum allowable radiation dose on the sample. Evaluation with both simulated and experimental datasets shows that our approach can significantly reduce noise in reconstructed images, improving the structural similarity score of simulation and experimental data from 0.18 to 0.9 and from 0.18 to 0.41, respectively. Furthermore, the quality of the reconstructed images with filtered back projection followed by our denoising approach exceeds that of reconstructions with the simultaneous iterative reconstruction technique, showing the computational superiority of our approach.

16.
J Soils Sediments ; 20(12): 4160-4193, 2020.
Article in English | MEDLINE | ID: mdl-33239964

ABSTRACT

PURPOSE: This review of sediment source fingerprinting assesses the current state-of-the-art, remaining challenges and emerging themes. It combines inputs from international scientists either with track records in the approach or with expertise relevant to progressing the science. METHODS: Web of Science and Google Scholar were used to review published papers spanning the period 2013-2019, inclusive, to confirm publication trends in quantities of papers by study area country and the types of tracers used. The most recent (2018-2019, inclusive) papers were also benchmarked using a methodological decision-tree published in 2017. SCOPE: Areas requiring further research and international consensus on methodological detail are reviewed, and these comprise spatial variability in tracers and corresponding sampling implications for end-members, temporal variability in tracers and sampling implications for end-members and target sediment, tracer conservation and knowledge-based pre-selection, the physico-chemical basis for source discrimination and dissemination of fingerprinting results to stakeholders. Emerging themes are also discussed: novel tracers, concentration-dependence for biomarkers, combining sediment fingerprinting and age-dating, applications to sediment-bound pollutants, incorporation of supportive spatial information to augment discrimination and modelling, aeolian sediment source fingerprinting, integration with process-based models and development of open-access software tools for data processing. CONCLUSIONS: The popularity of sediment source fingerprinting continues on an upward trend globally, but with this growth comes issues surrounding lack of standardisation and procedural diversity. 
Nonetheless, the last 2 years have also evidenced growing uptake of critical requirements for robust applications and this review is intended to signpost investigators, both old and new, towards these benchmarks and remaining research challenges for, and emerging options for different applications of, the fingerprinting approach.

18.
Proc Natl Acad Sci U S A ; 112(47): 14569-74, 2015 Nov 24.
Article in English | MEDLINE | ID: mdl-26554009

ABSTRACT

A scientist's choice of research problem affects his or her personal career trajectory. Scientists' combined choices affect the direction and efficiency of scientific discovery as a whole. In this paper, we infer preferences that shape problem selection from patterns of published findings and then quantify their efficiency. We represent research problems as links between scientific entities in a knowledge network. We then build a generative model of discovery informed by qualitative research on scientific problem selection. We map salient features from this literature to key network properties: an entity's importance corresponds to its degree centrality, and a problem's difficulty corresponds to the network distance it spans. Drawing on millions of papers and patents published over 30 years, we use this model to infer the typical research strategy used to explore chemical relationships in biomedicine. This strategy generates conservative research choices focused on building up knowledge around important molecules. These choices become more conservative over time. The observed strategy is efficient for initial exploration of the network and supports scientific careers that require steady output, but is inefficient for science as a whole. Through supercomputer experiments on a sample of the network, we study thousands of alternatives and identify strategies much more efficient at exploring mature knowledge networks. We find that increased risk-taking and the publication of experimental failures would substantially improve the speed of discovery. We consider institutional shifts in grant making, evaluation, and publication that would help realize these efficiencies.


Subject(s)
Research , Science , Humans , Publications , Qualitative Research , Risk-Taking
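The two network properties the abstract above maps onto, importance as degree centrality and difficulty as network distance, can be sketched directly. The toy graph is hypothetical:

```python
# Sketch of the two network properties used in the abstract above:
# an entity's importance as its degree centrality, and a problem's
# difficulty as the shortest-path distance its link would span.
# The toy entity graph below is hypothetical.
from collections import deque

graph = {  # undirected adjacency: entity -> linked entities
    "A": {"B", "C"}, "B": {"A", "C"}, "C": {"A", "B", "D"},
    "D": {"C", "E"}, "E": {"D"},
}

def degree_centrality(g, node):
    """Number of existing links touching an entity."""
    return len(g[node])

def distance(g, src, dst):
    """Breadth-first shortest-path length between two entities."""
    seen, queue = {src}, deque([(src, 0)])
    while queue:
        node, d = queue.popleft()
        if node == dst:
            return d
        for nxt in g[node] - seen:
            seen.add(nxt)
            queue.append((nxt, d + 1))
    return None  # unreachable

# "C" is the most central entity (degree 3); a problem linking A and E
# spans a distance of 3, so it would count as difficult in the model.
```

Under the paper's inferred strategy, researchers favor links around high-degree entities and short distances.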
19.
Proc Natl Acad Sci U S A ; 111(24): 8776-81, 2014 Jun 17.
Article in English | MEDLINE | ID: mdl-24872455

ABSTRACT

Interest in estimating the potential socioeconomic costs of climate change has led to the increasing use of dynamical downscaling--nested modeling in which regional climate models (RCMs) are driven with general circulation model (GCM) output--to produce fine-spatial-scale climate projections for impacts assessments. We evaluate here whether this computationally intensive approach significantly alters projections of agricultural yield, one of the greatest concerns under climate change. Our results suggest that it does not. We simulate US maize yields under current and future CO2 concentrations with the widely used Decision Support System for Agrotechnology Transfer crop model, driven by a variety of climate inputs including two GCMs, each in turn downscaled by two RCMs. We find that no climate model output can reproduce yields driven by observed climate unless a bias correction is first applied. Once a bias correction is applied, GCM- and RCM-driven US maize yields are essentially indistinguishable in all scenarios (<10% discrepancy, equivalent to error from observations). Although RCMs correct some GCM biases related to fine-scale geographic features, errors in yield are dominated by broad-scale (100s of kilometers) GCM systematic errors that RCMs cannot compensate for. These results support previous suggestions that the benefits for impacts assessments of dynamically downscaling raw GCM output may not be sufficient to justify its computational demands. Progress on fidelity of yield projections may benefit more from continuing efforts to understand and minimize systematic error in underlying climate projections.


Subject(s)
Agriculture/methods , Conservation of Natural Resources , Algorithms , Carbon Dioxide , Climate , Climate Change , Computer Simulation , Crops, Agricultural , Food Supply , Forecasting , Geography , Models, Theoretical , North America , Probability , Reproducibility of Results , Zea mays
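The bias correction the abstract above finds essential can, in its simplest additive ("delta") form, be sketched as shifting model output so its reference-period mean matches observations. Real crop-model pipelines use more elaborate corrections; values here are hypothetical:

```python
# Sketch of the simplest additive bias correction referenced above:
# shift model output so its mean over a reference period matches
# observations, then apply the same shift to future projections.
# Temperatures below are hypothetical illustrative values.
from statistics import mean

def bias_correct(model_hist, observed_hist, model_future):
    """Additive 'delta' correction of future model output."""
    bias = mean(model_hist) - mean(observed_hist)
    return [x - bias for x in model_future]

# A model that runs 2 degrees C too warm over the reference period:
corrected = bias_correct(
    model_hist=[22.0, 24.0], observed_hist=[20.0, 22.0],
    model_future=[25.0, 26.0],
)
# corrected == [23.0, 24.0]
```

Only after a correction of this kind, per the abstract, do GCM- and RCM-driven yields become indistinguishable.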
20.
Proc Natl Acad Sci U S A ; 111(9): 3239-44, 2014 Mar 04.
Article in English | MEDLINE | ID: mdl-24344283

ABSTRACT

We compare ensembles of water supply and demand projections from 10 global hydrological models and six global gridded crop models. These are produced as part of the Inter-Sectoral Impacts Model Intercomparison Project, with coordination from the Agricultural Model Intercomparison and Improvement Project, and driven by outputs of general circulation models run under representative concentration pathway 8.5 as part of the Fifth Coupled Model Intercomparison Project. Models project that direct climate impacts to maize, soybean, wheat, and rice involve losses of 400-1,400 Pcal (8-24% of present-day total) when CO2 fertilization effects are accounted for or 1,400-2,600 Pcal (24-43%) otherwise. Freshwater limitations in some irrigated regions (western United States; China; and West, South, and Central Asia) could necessitate the reversion of 20-60 Mha of cropland from irrigated to rainfed management by end-of-century, and a further loss of 600-2,900 Pcal of food production. In other regions (northern/eastern United States, parts of South America, much of Europe, and South East Asia) surplus water supply could in principle support a net increase in irrigation, although substantial investments in irrigation infrastructure would be required.


Subject(s)
Agricultural Irrigation/methods , Agriculture/methods , Climate Change , Models, Theoretical , Water Supply/statistics & numerical data , Agricultural Irrigation/economics , Agriculture/economics , Carbon Dioxide/analysis , Computer Simulation , Forecasting