1.
Nature ; 605(7910): 551-560, 2022 05.
Article in English | MEDLINE | ID: mdl-35332283

ABSTRACT

The design of proteins that bind to a specific site on the surface of a target protein using no information other than the three-dimensional structure of the target remains a challenge1-5. Here we describe a general solution to this problem that starts with a broad exploration of the vast space of possible binding modes to a selected region of a protein surface, and then intensifies the search in the vicinity of the most promising binding modes. We demonstrate the broad applicability of this approach through the de novo design of binding proteins to 12 diverse protein targets with different shapes and surface properties. Biophysical characterization shows that the binders, which are all smaller than 65 amino acids, are hyperstable and, following experimental optimization, bind their targets with nanomolar to picomolar affinities. We succeeded in solving crystal structures of five of the binder-target complexes, and all five closely match the corresponding computational design models. Experimental data on nearly half a million computational designs and hundreds of thousands of point mutants provide detailed feedback on the strengths and limitations of the method and of our current understanding of protein-protein interactions, and should guide improvements of both. Our approach enables the targeted design of binders to sites of interest on a wide variety of proteins for therapeutic and diagnostic applications.


Subject(s)
Carrier Proteins , Proteins , Amino Acids/metabolism , Binding Sites , Carrier Proteins/metabolism , Protein Binding , Proteins/chemistry
2.
Annu Rev Pharmacol Toxicol ; 64: 191-209, 2024 Jan 23.
Article in English | MEDLINE | ID: mdl-37506331

ABSTRACT

Traditionally, chemical toxicity is determined by in vivo animal studies, which are low throughput, expensive, and sometimes fail to predict compound toxicity in humans. Due to the increasing number of chemicals in use and the high rate of drug candidate failure due to toxicity, it is imperative to develop in vitro, high-throughput screening methods to determine toxicity. The Tox21 program, a unique research consortium of federal public health agencies, was established to address and identify toxicity concerns in a high-throughput, concentration-responsive manner using a battery of in vitro assays. In this article, we review the advancements in high-throughput robotic screening methodology and informatics processes to enable the generation of toxicological data, and their impact on the field; further, we discuss the future of assessing environmental toxicity utilizing efficient and scalable methods that better represent the corresponding biological and toxicodynamic processes in humans.


Subject(s)
High-Throughput Screening Assays , Toxicology , Animals , Humans , High-Throughput Screening Assays/methods , Toxicology/methods
3.
Environ Sci Technol ; 58(4): 2027-2037, 2024 Jan 30.
Article in English | MEDLINE | ID: mdl-38235672

ABSTRACT

The presence of numerous chemical contaminants from industrial, agricultural, and pharmaceutical sources in water supplies poses a potential risk to human and ecological health. Current chemical analyses suffer from limitations, including chemical coverage and high cost, and broad-coverage in vitro assays such as transcriptomics may further improve water quality monitoring by assessing a large range of possible effects. Here, we used high-throughput transcriptomics to assess the activity induced by field-derived water extracts in MCF7 breast carcinoma cells. Wastewater and surface water extracts induced the largest changes in expression among cell proliferation-related genes and neurological, estrogenic, and antibiotic pathways, whereas drinking and reclaimed water extracts that underwent advanced treatment showed substantially reduced bioactivity on both gene and pathway levels. Importantly, reclaimed water extracts induced fewer changes in gene expression than laboratory blanks, which reinforces previous conclusions based on targeted assays and improves confidence in bioassay-based monitoring of water quality.


Subject(s)
Water Pollutants, Chemical , Water Purification , Humans , Environmental Monitoring , Water Pollutants, Chemical/analysis , Water Quality , Gene Expression Profiling , Biological Assay
4.
Regul Toxicol Pharmacol ; 148: 105579, 2024 Mar.
Article in English | MEDLINE | ID: mdl-38309424

ABSTRACT

Chemical safety assessment begins with defining the lowest level of chemical that alters one or more measured endpoints. This critical effect level, along with factors to account for uncertainty, is used to derive limits for human exposure. In the absence of data regarding the specific mechanisms or biological pathways affected, non-specific endpoints such as body weight and non-target organ weight changes are used to set critical effect levels. Specific apical endpoints such as impaired reproductive function or altered neurodevelopment have also been used to set chemical safety limits; however, in test guidelines designed for specific apical effect(s), concurrently measured non-specific endpoints may be equally or more sensitive than specific endpoints. This means that rather than predicting a specific toxicological response, animal data are often used to develop protective critical effect levels, without assuming the same change would be observed in humans. This manuscript is intended to encourage a rethinking of how adverse chemical effects are interpreted: non-specific endpoints from in vivo toxicological studies are often used to derive points of departure for use with safety assessment factors to create recommended exposure levels that are broadly protective but not necessarily target-specific.


Subject(s)
Toxicity Tests , Animals , Humans , Risk Assessment
5.
Risk Anal ; 43(3): 498-515, 2023 03.
Article in English | MEDLINE | ID: mdl-35460101

ABSTRACT

A number of investigators have explored the use of value of information (VOI) analysis to evaluate alternative information collection procedures in diverse decision-making contexts. This paper presents an analytic framework for determining the value of toxicity information used in risk-based decision making. The framework is specifically designed to explore the trade-offs between cost, timeliness, and uncertainty reduction associated with different toxicity-testing methodologies. The use of the proposed framework is demonstrated by two illustrative applications which, although based on simplified assumptions, show the insights that can be obtained through the use of VOI analysis. Specifically, these results suggest that the timeliness of information collection has a significant impact on estimates of the VOI of chemical toxicity tests, even when faster tests deliver smaller reductions in uncertainty. The framework introduces the concept of the expected value of delayed sample information, as an extension of the usual expected value of sample information, to accommodate the reductions in value resulting from delayed decision making. Our analysis also suggests that lower-cost, higher-throughput testing may be beneficial in terms of public health by increasing the number of substances that can be evaluated within a given budget. When the relative value is expressed in terms of return-on-investment per testing strategy, the differences can be substantial.


Subject(s)
Decision Support Techniques , Uncertainty , Cost-Benefit Analysis
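The VOI framework in the abstract above can be sketched numerically. The sketch below is illustrative only, not taken from the paper: every number (prior toxicity probability, losses, test accuracies, delay, discount rate) is a hypothetical placeholder, and the delay penalty is modelled as simple exponential discounting of the expected value of sample information.

```python
# Illustrative sketch (not from the paper): expected value of delayed
# sample information for a binary "toxic vs. non-toxic" regulatory
# decision. All priors, losses, accuracies, delays, and the discount
# rate are hypothetical placeholders.

def expected_loss(p_toxic, loss_regulate, loss_ignore):
    """Expected loss of the best action under current beliefs."""
    # Regulating costs loss_regulate regardless of the truth; ignoring
    # costs loss_ignore only when the chemical is actually toxic.
    return min(loss_regulate, p_toxic * loss_ignore)

def evsi(p_toxic, sensitivity, specificity, loss_regulate, loss_ignore):
    """Expected value of sample information from one toxicity test."""
    p_pos = sensitivity * p_toxic + (1 - specificity) * (1 - p_toxic)
    p_neg = 1 - p_pos
    # Bayesian update of the toxicity probability for each test outcome.
    post_pos = sensitivity * p_toxic / p_pos
    post_neg = (1 - sensitivity) * p_toxic / p_neg
    loss_with_info = (p_pos * expected_loss(post_pos, loss_regulate, loss_ignore)
                      + p_neg * expected_loss(post_neg, loss_regulate, loss_ignore))
    return expected_loss(p_toxic, loss_regulate, loss_ignore) - loss_with_info

def evdsi(delay_years, discount_rate, **kw):
    """Discount the EVSI for the decision delay a slower test imposes."""
    return evsi(**kw) / (1 + discount_rate) ** delay_years

fast_test = evdsi(delay_years=0.5, discount_rate=0.15,
                  p_toxic=0.3, sensitivity=0.9, specificity=0.9,
                  loss_regulate=10.0, loss_ignore=100.0)
slow_test = evdsi(delay_years=4.0, discount_rate=0.15,
                  p_toxic=0.3, sensitivity=0.95, specificity=0.95,
                  loss_regulate=10.0, loss_ignore=100.0)
```

With these placeholder values the quicker but noisier test ends up more valuable than the slower, more accurate one, which is the qualitative point the abstract makes about timeliness.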
6.
J Environ Manage ; 325(Pt A): 116393, 2023 Jan 01.
Article in English | MEDLINE | ID: mdl-36270126

ABSTRACT

Brownfield redevelopment is a complex process often involving a wide range of stakeholders holding differing priorities and opinions. The use of digital systems and products for decision making, modelling, and supporting discussion has been recognised throughout literature and industry. The inclusion of stakeholder preferences is an important consideration in the design and development of impactful digital tools and decision support systems. In this study, we present findings from stakeholder consultation with professionals from the UK brownfield sector with the aim of informing the design of future digital tools and systems. Our research investigates two broad themes; digitalisation and the use of digital tools across the sector; and perceptions of key brownfield challenge areas where digital tools could help better inform decision-makers. The methodology employed for this study comprises the collection of data and information using a combination of interviews and an online questionnaire. The results from these methods were evaluated both qualitatively and quantitatively. Findings reveal a disparity in levels of digital capability between stakeholder groups including between technical stakeholder types, and that cross-discipline communication of important issues may be aided by the development of carefully designed digital tools. To this end, we present seven core principles to guide the design and implementation of future digital tools for the brownfield sector. These principles are that future digital tools should be: (1) Stakeholder driven, (2) Problem centred, (3) Visual, (4) Intuitive, (5) Interactive, (6) Interoperable, and (7) Geospatial data driven.


Subject(s)
Communication , Industry
7.
J Environ Manage ; 347: 119145, 2023 Dec 01.
Article in English | MEDLINE | ID: mdl-37806270

ABSTRACT

This research evaluates a novel decision support system (DSS) for planning brownfield redevelopment. The DSS is implemented within a web-based geographical information system that contains the spatial data informing three modules comprising land use suitability, economic viability, and ground risk. Using multi-criteria decision analysis, an evaluation was conducted on 31,942 ha of post-industrial land in and around Liverpool, UK. The representativeness and credibility of the DSS outputs were evaluated through user trials with fifteen land-use planning and development stakeholders from the Liverpool City Region Combined Authority. Stakeholders used the DSS to explore land use planning scenarios and assessed its potential to support decision making. Our research reveals that the DSS has the potential to positively inform the identification of brownfield redevelopment opportunities by offering a reliable, carefully curated, and user-driven digital evidence base. This expedites the traditionally manual process of conducting assessments of land suitability and viability. This research has important implications for assessing the impact of current and future planning policy and the potential for the use of digital tools for land use planning and sustainability in the UK and globally.


Subject(s)
Geographic Information Systems , Industry , Power, Psychological
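The multi-criteria decision analysis step described in the abstract above can be sketched as a weighted sum over the three module scores. Everything below is invented for illustration: the parcel names, the 0-1 scores (where a higher ground_risk score means lower risk), and the stakeholder weights are hypothetical, and the real DSS operates on geospatial layers rather than a dictionary.

```python
# Toy multi-criteria decision analysis of the kind the DSS performs:
# each parcel is scored 0-1 on the three modules (higher is better,
# so a high ground_risk score means low risk) and the scores are
# combined with stakeholder-supplied weights. All values are invented.

def mcda_score(scores, weights):
    """Weighted-sum MCDA score; weights are normalised to sum to 1."""
    total = sum(weights.values())
    return sum(scores[c] * w / total for c, w in weights.items())

weights = {"suitability": 3, "viability": 2, "ground_risk": 1}
parcels = {
    "dockside": {"suitability": 0.9, "viability": 0.4, "ground_risk": 0.3},
    "gasworks": {"suitability": 0.6, "viability": 0.7, "ground_risk": 0.2},
}
# Rank parcels from most to least promising under these weights.
ranked = sorted(parcels, key=lambda p: mcda_score(parcels[p], weights),
                reverse=True)
```

Changing the weights re-ranks the parcels, which is how such a tool lets stakeholders with differing priorities explore scenarios interactively.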
8.
Risk Anal ; 42(4): 707-729, 2022 04.
Article in English | MEDLINE | ID: mdl-34490933

ABSTRACT

Regulatory agencies are required to evaluate the impacts of thousands of chemicals. Toxicological tests currently used in such evaluations are time-consuming and resource intensive; however, advances in toxicology and related fields are providing new testing methodologies that reduce the cost and time required for testing. The selection of a preferred methodology is challenging because the new methodologies vary in duration and cost, and the data they generate vary in the level of uncertainty. This article presents a framework for performing cost-effectiveness analyses (CEAs) of toxicity tests that account for cost, duration, and uncertainty. This is achieved by using an output metric-the cost per correct regulatory decision-that reflects the three elements. The framework is demonstrated in two example CEAs, one for a simple decision of risk acceptability and a second, more complex decision, involving the selection of regulatory actions. Each example CEA evaluates five hypothetical toxicity-testing methodologies which differ with respect to cost, time, and uncertainty. The results of the examples indicate that either a fivefold reduction in cost or duration can be a larger driver of the selection of an optimal toxicity-testing methodology than a fivefold reduction in uncertainty. Uncertainty becomes of similar importance to cost and duration when decisionmakers are required to make more complex decisions that require the determination of small differences in risk predictions. The framework presented in this article may provide a useful basis for the identification of cost-effective methods for toxicity testing of large numbers of chemicals.


Subject(s)
Toxicity Tests , Cost-Benefit Analysis , Uncertainty
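The output metric named in the abstract above, cost per correct regulatory decision, can be sketched in a few lines. The strategy names, costs, accuracies, and delay penalty below are hypothetical placeholders, not figures from the article's CEAs.

```python
# Hypothetical sketch of the article's output metric, "cost per correct
# regulatory decision". Costs, decision accuracies, and the annual cost
# of delayed decisions are invented for illustration.

def cost_per_correct_decision(cost_per_test, p_correct, delay_years,
                              annual_delay_cost):
    """Total expected outlay divided by the probability that the
    resulting regulatory decision is correct."""
    total_cost = cost_per_test + delay_years * annual_delay_cost
    return total_cost / p_correct

strategies = {
    "in vivo two-year bioassay": cost_per_correct_decision(
        cost_per_test=1_000_000, p_correct=0.90,
        delay_years=3.0, annual_delay_cost=50_000),
    "high-throughput in vitro": cost_per_correct_decision(
        cost_per_test=20_000, p_correct=0.75,
        delay_years=0.25, annual_delay_cost=50_000),
}
best = min(strategies, key=strategies.get)
```

With these made-up numbers the cheaper, faster method wins despite its lower accuracy, mirroring the article's finding that cost and duration can dominate uncertainty for simple decisions.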
9.
Chem Res Toxicol ; 34(2): 189-216, 2021 02 15.
Article in English | MEDLINE | ID: mdl-33140634

ABSTRACT

Since 2009, the Tox21 project has screened ∼8500 chemicals in more than 70 high-throughput assays, generating upward of 100 million data points, with all data publicly available through partner websites at the United States Environmental Protection Agency (EPA), National Center for Advancing Translational Sciences (NCATS), and National Toxicology Program (NTP). Underpinning this public effort is the largest compound library ever constructed specifically for improving understanding of the chemical basis of toxicity across research and regulatory domains. Each Tox21 federal partner brought specialized resources and capabilities to the partnership, including three approximately equal-sized compound libraries. All Tox21 data generated to date have resulted from a confluence of ideas, technologies, and expertise used to design, screen, and analyze the Tox21 10K library. The different programmatic objectives of the partners led to three distinct, overlapping compound libraries that, when combined, not only covered a diversity of chemical structures, use-categories, and properties but also incorporated many types of compound replicates. The history of development of the Tox21 "10K" chemical library and data workflows implemented to ensure quality chemical annotations and allow for various reproducibility assessments are described. Cheminformatics profiling demonstrates how the three partner libraries complement one another to expand the reach of each individual library, as reflected in coverage of regulatory lists, predicted toxicity end points, and physicochemical properties. ToxPrint chemotypes (CTs) and enrichment approaches further demonstrate how the combined partner libraries amplify structure-activity patterns that would otherwise not be detected. 
Finally, CT enrichments are used to probe global patterns of activity in combined ToxCast and Tox21 activity data sets relative to test-set size and chemical versus biological end point diversity, illustrating the power of CT approaches to discern patterns in chemical-activity data sets. These results support a central premise of the Tox21 program: A collaborative merging of programmatically distinct compound libraries would yield greater rewards than could be achieved separately.


Subject(s)
Small Molecule Libraries/toxicity , Toxicity Tests , High-Throughput Screening Assays , Humans , United States , United States Environmental Protection Agency
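A chemotype enrichment calculation of the kind the abstract above describes can be sketched as a 2x2 contingency analysis: does a ToxPrint chemotype occur more often among active chemicals than in the library at large? The counts below are invented, not Tox21 data, and the one-sided hypergeometric p-value stands in for whatever enrichment statistic the authors actually used.

```python
# Sketch of a chemotype enrichment test: a chemotype appears in k of n
# active chemicals, and in K of N library chemicals overall. Compute an
# odds ratio and a one-sided hypergeometric p-value. Counts are made up.
from math import comb

def hypergeom_sf(k, N, K, n):
    """P(X >= k) for X ~ Hypergeometric(N, K, n)."""
    return sum(comb(K, i) * comb(N - K, n - i)
               for i in range(k, min(K, n) + 1)) / comb(N, n)

def enrichment(k, n, K, N):
    """Odds ratio of chemotype occurrence in actives vs. the rest."""
    a, b = k, n - k                    # actives with / without chemotype
    c, d = K - k, (N - n) - (K - k)    # inactives with / without
    odds = (a * d) / (b * c) if b * c else float("inf")
    return odds, hypergeom_sf(k, N, K, n)

# Chemotype in 30 of 100 actives but only 50 of 1000 library chemicals.
odds, p = enrichment(k=30, n=100, K=50, N=1000)
```

A large odds ratio with a tiny p-value flags the chemotype as enriched among actives; run across all chemotypes, this is how structure-activity patterns surface from combined screening data.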
10.
Regul Toxicol Pharmacol ; 125: 105020, 2021 Oct.
Article in English | MEDLINE | ID: mdl-34333066

ABSTRACT

Omics methodologies are widely used in toxicological research to understand modes and mechanisms of toxicity. Increasingly, these methodologies are being applied to questions of regulatory interest such as molecular point-of-departure derivation and chemical grouping/read-across. Despite its value, widespread regulatory acceptance of omics data has not yet occurred. Barriers to the routine application of omics data in regulatory decision making have been: 1) lack of transparency for data processing methods used to convert raw data into an interpretable list of observations; and 2) lack of standardization in reporting to ensure that omics data, associated metadata and the methodologies used to generate results are available for review by stakeholders, including regulators. Thus, in 2017, the Organisation for Economic Co-operation and Development (OECD) Extended Advisory Group on Molecular Screening and Toxicogenomics (EAGMST) launched a project to develop guidance for the reporting of omics data aimed at fostering further regulatory use. Here, we report on the ongoing development of the first formal reporting framework describing the processing and analysis of both transcriptomic and metabolomic data for regulatory toxicology. We introduce the modular structure, content, harmonization and strategy for trialling this reporting framework prior to its publication by the OECD.


Subject(s)
Metabolomics/standards , Organisation for Economic Co-Operation and Development/standards , Toxicogenetics/standards , Toxicology/standards , Transcriptome/physiology , Documentation/standards , Humans
11.
Bioinformatics ; 35(10): 1780-1782, 2019 05 15.
Article in English | MEDLINE | ID: mdl-30329029

ABSTRACT

SUMMARY: A new version (version 2) of the genomic dose-response analysis software, BMDExpress, has been created. The software addresses the increasing use of transcriptomic dose-response data in toxicology, drug design, risk assessment and translational research. In this new version, we have implemented additional statistical filtering options (e.g. Williams' trend test), curve fitting models, Linux and Macintosh compatibility and support for additional transcriptomic platforms with up-to-date gene annotations. Furthermore, we have implemented extensive data visualizations, on-the-fly data filtering, and a batch-wise analysis workflow. We have also significantly re-engineered the code base to reflect contemporary software engineering practices and streamline future development. The first version of BMDExpress was developed in 2007 to meet an unmet demand for easy-to-use transcriptomic dose-response analysis software. Since its original release, however, transcriptomic platforms, technologies, pathway annotations and quantitative methods for data analysis have changed substantially, necessitating a significant redevelopment of BMDExpress. To that end, as of 2016, the National Toxicology Program assumed stewardship of BMDExpress. The result is a modernized and updated BMDExpress 2 that addresses the needs of the growing toxicogenomics user community. AVAILABILITY AND IMPLEMENTATION: BMDExpress 2 is available at https://github.com/auerbachs/BMDExpress-2/releases. SUPPLEMENTARY INFORMATION: Supplementary data are available at Bioinformatics online.


Subject(s)
Transcriptome , Workflow , Genome , Molecular Sequence Annotation , Software
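One step that dose-response software like the tool above automates, deriving a benchmark dose (BMD) from a fitted curve, can be sketched for the Hill model, one of several model families such tools typically fit. The Hill parameters and the benchmark response (BMR) below are hypothetical stand-ins for a per-gene curve fit, and the inversion is done analytically rather than by the software's actual fitting machinery.

```python
# Sketch of benchmark dose derivation: invert a fitted Hill model to
# find the dose at which expression rises a chosen benchmark response
# (BMR) above background. Parameters below are hypothetical.

def hill(dose, background, top, kd, n):
    """Hill dose-response: background + top * dose^n / (kd^n + dose^n)."""
    return background + top * dose**n / (kd**n + dose**n)

def hill_bmd(bmr, top, kd, n):
    """Dose at which the Hill response rises `bmr` units above background."""
    if not 0 < bmr < top:
        raise ValueError("BMR must lie between 0 and the curve's top")
    # Solve background + top*d^n/(kd^n + d^n) = background + bmr for d.
    return kd * (bmr / (top - bmr)) ** (1.0 / n)

params = dict(background=1.0, top=4.0, kd=10.0, n=2.0)
bmd = hill_bmd(bmr=0.4, top=4.0, kd=10.0, n=2.0)  # BMR = 10% of top
```

Plugging the BMD back into the fitted curve returns exactly background + BMR, which is a convenient self-check when implementing the inversion.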
12.
Chem Res Toxicol ; 31(5): 287-290, 2018 05 21.
Article in English | MEDLINE | ID: mdl-29600706

ABSTRACT

Changes in chemical regulations worldwide have increased the demand for new data on chemical safety. New approach methodologies (NAMs) are defined broadly here as including in silico approaches and in chemico and in vitro assays, as well as the inclusion of information from the exposure of chemicals in the context of hazard [European Chemicals Agency, " New Approach Methodologies in Regulatory Science ", 2016]. NAMs for toxicity testing, including alternatives to animal testing approaches, have shown promise to provide a large amount of data to fill information gaps in both hazard and exposure. To increase experience with the new data and to advance the applications of NAM data in evaluating the safety of data-poor chemicals, demonstration case studies must be developed to build confidence in their usability. Case studies can be used to explore the domains of applicability of the NAM data and identify areas that would benefit from further research, development, and application. To ensure that this science evolves with direct input from and engagement by risk managers and regulatory decision makers, a workshop was convened among senior leaders from international regulatory agencies to identify common barriers for using NAMs and to propose next steps to address them. Central to the workshop were a series of collaborative case studies designed to explore areas where the benefits of NAM data could be demonstrated. These included use of in vitro bioassay data in combination with exposure estimates to derive a quantitative assessment of risk, use of NAMs for updating chemical categorizations, and use of NAMs to increase understanding of exposure and human health toxicity of various chemicals. The case study approach proved effective in building collaborations and engagement with regulatory decision makers and in promoting the importance of data and knowledge sharing among international regulatory agencies. 
The case studies will be continued to explore new ways of describing hazard (i.e., pathway perturbations as a measure of adversity) and new ways of describing risk (i.e., using NAMs to identify protective levels without necessarily being predictive of a specific hazard). Importantly, the case studies also highlighted the need for increased training and communication across the various communities including the risk assessors, regulators, stakeholders (e.g., industry, non-governmental organizations), and the general public. The development and application of NAMs will play an increasing role in filling important data gaps on the safety of chemicals, but confidence in NAMs will only come with learning by doing and sharing in the experience.


Subject(s)
Animal Testing Alternatives , Organic Chemicals/adverse effects , Toxicity Tests , Animals , Humans , Organic Chemicals/toxicity , Risk Assessment
13.
Environ Sci Technol ; 52(5): 3125-3135, 2018 03 06.
Article in English | MEDLINE | ID: mdl-29405058

ABSTRACT

A two-dimensional gas chromatography-time-of-flight/mass spectrometry (GC×GC-TOF/MS) suspect screening analysis method was used to rapidly characterize chemicals in 100 consumer products-which included formulations (e.g., shampoos, paints), articles (e.g., upholsteries, shower curtains), and foods (cereals)-and therefore supports broader efforts to prioritize chemicals based on potential human health risks. Analyses yielded 4270 unique chemical signatures across the products, with 1602 signatures tentatively identified using the National Institute of Standards and Technology 2008 spectral database. Chemical standards confirmed the presence of 119 compounds. Of the 1602 tentatively identified chemicals, 1404 were not present in a public database of known consumer product chemicals. Reported data and model predictions of chemical functional use were applied to evaluate the tentative chemical identifications. Estimated chemical concentrations were compared to manufacturer-reported values and other measured data. Chemical presence and concentration data can now be used to improve estimates of chemical exposure, and refine estimates of risk posed to human health and the environment.


Subject(s)
Household Products , Gas Chromatography-Mass Spectrometry , Humans
14.
Chem Res Toxicol ; 30(11): 2046-2059, 2017 11 20.
Article in English | MEDLINE | ID: mdl-28768096

ABSTRACT

Animal testing alone cannot practically evaluate the health hazard posed by tens of thousands of environmental chemicals. Computational approaches making use of high-throughput experimental data may provide more efficient means to predict chemical toxicity. Here, we use a supervised machine learning strategy to systematically investigate the relative importance of study type, machine learning algorithm, and type of descriptor on predicting in vivo repeat-dose toxicity at the organ-level. A total of 985 compounds were represented using chemical structural descriptors, ToxPrint chemotype descriptors, and bioactivity descriptors from ToxCast in vitro high-throughput screening assays. Using ToxRefDB, a total of 35 target organ outcomes were identified that contained at least 100 chemicals (50 positive and 50 negative). Supervised machine learning was performed using Naïve Bayes, k-nearest neighbor, random forest, classification and regression trees, and support vector classification approaches. Model performance was assessed based on F1 scores using 5-fold cross-validation with balanced bootstrap replicates. Fixed effects modeling showed the variance in F1 scores was explained mostly by target organ outcome, followed by descriptor type, machine learning algorithm, and interactions between these three factors. A combination of bioactivity and chemical structure or chemotype descriptors were the most predictive. Model performance improved with more chemicals (up to a maximum of 24%), and these gains were correlated (ρ = 0.92) with the number of chemicals. Overall, the results demonstrate that a combination of bioactivity and chemical descriptors can accurately predict a range of target organ toxicity outcomes in repeat-dose studies, but specific experimental and methodologic improvements may increase predictivity.


Subject(s)
Environmental Pollutants/toxicity , Machine Learning , Toxicity Tests/methods , Animals , Databases, Factual , Environmental Pollutants/chemistry , Humans , Models, Biological , Quantitative Structure-Activity Relationship
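The evaluation loop described in the abstract above, cross-validated classifiers scored by F1, can be re-created in miniature with the standard library. The sketch below uses a k-nearest-neighbour classifier on synthetic random descriptors; in the study the rows would be chemicals, the columns ToxCast bioactivity or chemotype descriptors, and the algorithms would include random forests and SVMs rather than this single toy learner.

```python
# Minimal stdlib sketch of 5-fold cross-validation of a k-nearest-
# neighbour classifier scored by F1. Data are synthetic: two noisy
# Gaussian clusters standing in for active/inactive chemicals.
import random

def knn_predict(train, test_x, k=5):
    """Majority vote among the k nearest training rows (squared L2)."""
    votes = sorted(train, key=lambda r: sum((a - b) ** 2
                                            for a, b in zip(r[0], test_x)))[:k]
    return int(sum(label for _, label in votes) * 2 >= k)

def f1_score(y_true, y_pred):
    tp = sum(t and p for t, p in zip(y_true, y_pred))
    fp = sum((not t) and p for t, p in zip(y_true, y_pred))
    fn = sum(t and (not p) for t, p in zip(y_true, y_pred))
    return 2 * tp / (2 * tp + fp + fn) if tp else 0.0

def cross_val_f1(data, folds=5):
    """Mean F1 over `folds` interleaved train/test splits."""
    scores = []
    for i in range(folds):
        test = data[i::folds]
        train = [r for j, r in enumerate(data) if j % folds != i]
        preds = [knn_predict(train, x) for x, _ in test]
        scores.append(f1_score([y for _, y in test], preds))
    return sum(scores) / folds

random.seed(0)
# Label 1 rows centred at +1 in each of 4 dimensions, label 0 at -1.
data = [([random.gauss(2 * y - 1, 0.5) for _ in range(4)], y)
        for y in [0, 1] * 50]
mean_f1 = cross_val_f1(data)
```

Because the clusters are well separated the mean F1 is near 1 here; on real descriptor data the scores, and their dependence on descriptor type, are what the fixed-effects analysis in the paper dissects.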
15.
Chem Res Toxicol ; 30(4): 946-964, 2017 04 17.
Article in English | MEDLINE | ID: mdl-27933809

ABSTRACT

Testing thousands of chemicals to identify potential androgen receptor (AR) agonists or antagonists would cost millions of dollars and take decades to complete using current validated methods. High-throughput in vitro screening (HTS) and computational toxicology approaches can more rapidly and inexpensively identify potential androgen-active chemicals. We integrated 11 HTS ToxCast/Tox21 in vitro assays into a computational network model to distinguish true AR pathway activity from technology-specific assay interference. The in vitro HTS assays probed perturbations of the AR pathway at multiple points (receptor binding, coregulator recruitment, gene transcription, and protein production) and multiple cell types. Confirmatory in vitro antagonist assay data and cytotoxicity information were used as additional flags for potential nonspecific activity. Validating such alternative testing strategies requires high-quality reference data. We compiled 158 putative androgen-active and -inactive chemicals from a combination of international test method validation efforts and semiautomated systematic literature reviews. Detailed in vitro assay information and results were compiled into a single database using a standardized ontology. Reference chemical concentrations that activated or inhibited AR pathway activity were identified to establish a range of potencies with reproducible reference chemical results. Comparison with existing Tier 1 AR binding data from the U.S. EPA Endocrine Disruptor Screening Program revealed that the model identified binders at relevant test concentrations (<100 µM) and was more sensitive to antagonist activity. The AR pathway model based on the ToxCast/Tox21 assays had balanced accuracies of 95.2% for agonist (n = 29) and 97.5% for antagonist (n = 28) reference chemicals. 
Out of 1855 chemicals screened in the AR pathway model, 220 chemicals demonstrated AR agonist or antagonist activity and an additional 174 chemicals were predicted to have potential weak AR pathway activity.


Subject(s)
Androgen Receptor Antagonists/metabolism , Androgens/metabolism , Models, Theoretical , Receptors, Androgen/metabolism , Androgen Receptor Antagonists/chemistry , Androgen Receptor Antagonists/pharmacology , Androgens/chemistry , Androgens/pharmacology , Area Under Curve , High-Throughput Screening Assays , Humans , Protein Binding , ROC Curve , Receptors, Androgen/chemistry , Receptors, Androgen/genetics , Transcriptional Activation/drug effects
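The "balanced accuracy" figures quoted in the abstract above average sensitivity and specificity so that an imbalanced reference set cannot inflate the score. The confusion-matrix counts below are illustrative only, not the paper's actual tallies.

```python
# Sketch of the balanced accuracy metric: the mean of sensitivity
# (true-positive rate) and specificity (true-negative rate).
# The counts below are invented for illustration.

def balanced_accuracy(tp, fn, tn, fp):
    sensitivity = tp / (tp + fn)   # e.g. reference agonists called active
    specificity = tn / (tn + fp)   # e.g. reference inactives called inactive
    return (sensitivity + specificity) / 2

# Hypothetical: 28 of 29 agonists detected, 27 of 28 inactives clean.
score = balanced_accuracy(tp=28, fn=1, tn=27, fp=1)
```

Plain accuracy on the same counts would mask a model that simply predicts the majority class, which is why balanced accuracy is the customary report for reference-chemical validation sets.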
16.
Am J Public Health ; 107(7): 1032-1039, 2017 07.
Article in English | MEDLINE | ID: mdl-28520487

ABSTRACT

Preventing adverse health effects of environmental chemical exposure is fundamental to protecting individual and public health. When done efficiently and properly, chemical risk assessment enables risk management actions that minimize the incidence and effects of environmentally induced diseases related to chemical exposure. However, traditional chemical risk assessment is faced with multiple challenges with respect to predicting and preventing disease in human populations, and epidemiological studies increasingly report observations of adverse health effects at exposure levels predicted from animal studies to be safe for humans. This discordance reinforces concerns about the adequacy of contemporary risk assessment practices for protecting public health. It is becoming clear that to protect public health more effectively, future risk assessments will need to use the full range of available data, draw on innovative methods to integrate diverse data streams, and consider health endpoints that also reflect the range of subtle effects and morbidities observed in human populations. Considering these factors, there is a need to reframe chemical risk assessment to be more clearly aligned with the public health goal of minimizing environmental exposures associated with disease.


Subject(s)
Data Interpretation, Statistical , Environmental Exposure/adverse effects , Public Health/trends , Risk Assessment/methods , Animals , Environmental Exposure/prevention & control , Forecasting , Humans , Incidence , Models, Animal
17.
Rapid Commun Mass Spectrom ; 31(15): 1239-1249, 2017 Aug 15.
Article in English | MEDLINE | ID: mdl-28494122

ABSTRACT

RATIONALE: Coal tars are a mixture of organic and inorganic compounds that were produced as a by-product from the manufactured gas and coke making industries. The composition of the tar produced varies depending on many factors; these include the temperature of production and the type of retort used. As different production processes produce different tars, a comprehensive database of the compounds present within coal tars from different production processes is a valuable resource. Such a database would help to understand how their chemical properties differ and what hazards the compounds present within these tars might pose. This study focuses on the aliphatic and aromatic compounds present in a database of 16 different tars from five different production processes. METHODS: Samples of coal tar were extracted using accelerated solvent extraction (ASE) and derivatised post-extraction using N,O-bis(trimethylsilyl)trifluoroacetamide (BSTFA) with 1% trimethylchlorosilane (TMCS). The derivatised samples were analysed using two-dimensional gas chromatography combined with time-of-flight mass spectrometry (GCxGC/TOFMS). RESULTS: A total of 198 individual aliphatic and 951 individual aromatic compounds were detected within 16 tar samples produced by five different production processes. The polycyclic aromatic hydrocarbon (PAH) content of coal tars varies greatly depending on the production process used to obtain the tars and this is clearly demonstrated within the results. The aliphatic composition of the tars provided an important piece of analytical information that would have otherwise been missed with the detection of petrogenic compounds such as alkyl cyclohexanes. CONCLUSIONS: The aromatic compositions of the tar samples varied greatly between the different production processes investigated and useful analytical information was obtained about the individual production process groups. 
Alkyl cyclohexanes were detected in all samples from sites known to operate Carbureted Water Gas plants and not detected in those that did not. This suggests that petrogenic material may be expected at many UK gaswork sites.

18.
Rapid Commun Mass Spectrom ; 31(15): 1231-1238, 2017 Aug 15.
Article in English | MEDLINE | ID: mdl-28488792

ABSTRACT

RATIONALE: Coal tars are mixtures of organic and inorganic compounds produced as by-products of the manufactured gas and coke making industries. Different manufacturing processes have resulted in the production of distinctly different tar compositions. This study presents a comprehensive database of compounds produced using two-dimensional gas chromatography combined with time-of-flight mass spectrometry (GCxGC/TOFMS), analysing 16 tar samples produced by five distinct production processes. METHODS: Samples of coal tar were extracted using accelerated solvent extraction (ASE) and derivatised post-extraction using N,O-bis(trimethylsilyl)trifluoroacetamide (BSTFA) with 1% trimethylchlorosilane (TMCS). The derivatised samples were analysed using GCxGC/TOFMS. RESULTS: A total of 16 tar samples originating from five different production processes (Low Temperature Horizontal Retorts, Horizontal Retorts, Vertical Retorts, Carbureted Water Gas and Coke Ovens) were analysed. A total of 2369 unique compounds were detected, comprising 948 aromatic compounds, 196 aliphatic compounds, 380 sulfur-containing compounds, 209 oxygen-containing compounds, 262 nitrogen-containing compounds and 15 mixed heterocycles. Derivatisation allowed the detection of 359 unique compounds, the majority hydroxylated polycyclic aromatic hydrocarbons, many of which would not have been detected otherwise. Of the 2369 unique compounds detected, 173 were present within all samples. CONCLUSIONS: A unique comprehensive database of compounds detected within 16 tar samples from five different production processes was produced. The 173 compounds identified within every sample may be of particular importance from a regulatory standpoint.
This initial study indicates that different production processes produce tars with different chemical signatures; it can be extended through in-depth analysis of the different compound types. The number of compounds presented within this database clearly demonstrates the analytical power of GCxGC/TOFMS.
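Identifying the compounds present in every sample, as with the 173 reported above, is a set-intersection operation over the per-sample detection lists. A minimal sketch, using made-up compound lists rather than the study's database:

```python
# Illustrative sketch: compounds detected in every tar sample are the
# intersection of the per-sample compound sets. Sample names and
# compound lists are invented for demonstration.
sample_compounds = {
    "tar_01": {"naphthalene", "phenol", "dibenzofuran", "pyrene"},
    "tar_02": {"naphthalene", "phenol", "pyrene"},
    "tar_03": {"naphthalene", "phenol", "carbazole", "pyrene"},
}

# Compounds common to all samples: intersect the per-sample sets.
common = set.intersection(*sample_compounds.values())
print(sorted(common))
```

With real data the same one-liner would reduce thousands of detections per sample to the shared core of regulatory interest.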

19.
Rapid Commun Mass Spectrom ; 31(15): 1250-1260, 2017 Aug 15.
Article in English | MEDLINE | ID: mdl-28514513

ABSTRACT

RATIONALE: Coal tars are mixtures of organic and inorganic compounds produced as by-products of the manufactured gas and coke making industries. The tar compositions varied depending on many factors, such as the temperature of production and the type of retort used. For this reason, a comprehensive database of the compounds found in different tar types is valuable for understanding both how their compositions differ and what potential chemical hazards are present. This study focuses on the heterocyclic and hydroxylated compounds present in a database produced from 16 different tars from five different production processes. METHODS: Samples of coal tar were extracted using accelerated solvent extraction (ASE) and derivatized post-extraction using N,O-bis(trimethylsilyl)trifluoroacetamide (BSTFA) with 1% trimethylchlorosilane (TMCS). The derivatized samples were analysed using two-dimensional gas chromatography combined with time-of-flight mass spectrometry (GCxGC/TOFMS). RESULTS: A total of 865 heterocyclic compounds and 359 hydroxylated polycyclic aromatic hydrocarbons (PAHs) were detected in 16 tar samples produced by five different processes. The heterocyclic and hydroxylated PAH contents both varied greatly with the production process used, with the heterocyclic compounds giving information about the feedstock used. Of the 359 hydroxylated PAHs detected, the majority would not have been detected without derivatization. CONCLUSIONS: Coal tars produced using different production processes and feedstocks yielded tars with significantly different heterocyclic and hydroxylated contents. The concentrations of the individual heterocyclic compounds varied greatly even within the different production processes and provided information about the feedstock used to produce the tars. The hydroxylated PAH content of the samples provided important analytical information that would not have been obtained without derivatization and GCxGC/TOFMS.

20.
Arch Toxicol ; 91(5): 2045-2065, 2017 May.
Article in English | MEDLINE | ID: mdl-27928627

ABSTRACT

There is increasing interest in the use of quantitative transcriptomic data to determine benchmark doses (BMDs) and estimate a point of departure (POD) for human health risk assessment. Although studies have shown that transcriptional PODs correlate with those derived from apical endpoint changes, there is no consensus on the process used to derive a transcriptional POD. Specifically, the subsets of informative genes that produce BMDs best approximating the doses at which adverse apical effects occur have not been defined. To determine the best way to select predictive groups of genes, we used published microarray data from dose-response studies on six chemicals in rats exposed orally for 5, 14, 28, and 90 days. We evaluated eight approaches for selecting genes for POD derivation and three previously proposed approaches (the lowest pathway BMD, and the mean and median BMD of all genes). The relationship between transcriptional BMDs derived using these 11 approaches and PODs derived from apical data that might be used in chemical risk assessment was examined. Transcriptional BMD values for all 11 approaches were remarkably well aligned with the corresponding apical PODs, with the vast majority of toxicogenomics PODs falling within tenfold of those derived from apical endpoints. We identified at least four approaches that produce BMDs serving as effective estimates of apical PODs across multiple sampling time points. Our results indicate that a variety of approaches can be used to derive reproducible transcriptional PODs that are consistent with PODs produced from traditional methods for chemical risk assessment.
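The three previously proposed summary approaches named above reduce a table of per-gene BMDs to a single candidate POD. A minimal sketch under stated assumptions: the gene names, pathway assignments, and dose values are invented, and the pathway-level BMD is taken here as the median of its member genes' BMDs, which is one common aggregation choice rather than the paper's prescribed method.

```python
# Hypothetical per-gene BMD values (mg/kg-day) and pathway assignments;
# all names and numbers are illustrative, not from the study.
from statistics import mean, median

gene_bmds = {"geneA": 12.0, "geneB": 3.5, "geneC": 8.0, "geneD": 40.0}
pathways = {
    "oxidative stress": ["geneA", "geneB"],
    "cell cycle": ["geneC", "geneD"],
}

# Approaches 1 and 2: mean and median BMD across all genes.
mean_pod = mean(gene_bmds.values())
median_pod = median(gene_bmds.values())

# Approach 3: lowest pathway BMD -- aggregate each pathway's gene BMDs
# (median, by assumption here), then take the minimum across pathways.
pathway_bmds = {p: median(gene_bmds[g] for g in genes)
                for p, genes in pathways.items()}
lowest_pathway_pod = min(pathway_bmds.values())

print(mean_pod, median_pod, lowest_pathway_pod)
```

In practice each summary value would then be compared against the apical-endpoint POD, as the correlation analysis in the abstract describes.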


Subject(s)
Dose-Response Relationship, Drug , Gene Expression Regulation/drug effects , Risk Assessment/methods , Toxicogenetics/methods , Animals , Bromobenzenes/administration & dosage , Bromobenzenes/toxicity , Chlorophenols/administration & dosage , Chlorophenols/toxicity , Female , Humans , Male , Nitrosamines/administration & dosage , Nitrosamines/toxicity , Rats, Inbred F344 , Rats, Sprague-Dawley , Transcriptome