1.
Chem Res Toxicol ; 34(2): 189-216, 2021 02 15.
Article in English | MEDLINE | ID: mdl-33140634

ABSTRACT

Since 2009, the Tox21 project has screened ∼8500 chemicals in more than 70 high-throughput assays, generating upward of 100 million data points, with all data publicly available through partner websites at the United States Environmental Protection Agency (EPA), National Center for Advancing Translational Sciences (NCATS), and National Toxicology Program (NTP). Underpinning this public effort is the largest compound library ever constructed specifically for improving understanding of the chemical basis of toxicity across research and regulatory domains. Each Tox21 federal partner brought specialized resources and capabilities to the partnership, including three approximately equal-sized compound libraries. All Tox21 data generated to date have resulted from a confluence of ideas, technologies, and expertise used to design, screen, and analyze the Tox21 10K library. The different programmatic objectives of the partners led to three distinct, overlapping compound libraries that, when combined, not only covered a diversity of chemical structures, use-categories, and properties but also incorporated many types of compound replicates. The history of development of the Tox21 "10K" chemical library and data workflows implemented to ensure quality chemical annotations and allow for various reproducibility assessments are described. Cheminformatics profiling demonstrates how the three partner libraries complement one another to expand the reach of each individual library, as reflected in coverage of regulatory lists, predicted toxicity end points, and physicochemical properties. ToxPrint chemotypes (CTs) and enrichment approaches further demonstrate how the combined partner libraries amplify structure-activity patterns that would otherwise not be detected. 
Finally, CT enrichments are used to probe global patterns of activity in combined ToxCast and Tox21 activity data sets relative to test-set size and chemical versus biological end point diversity, illustrating the power of CT approaches to discern patterns in chemical-activity data sets. These results support a central premise of the Tox21 program: A collaborative merging of programmatically distinct compound libraries would yield greater rewards than could be achieved separately.


Subject(s)
Small Molecule Libraries/toxicity , Toxicity Tests , High-Throughput Screening Assays , Humans , United States , United States Environmental Protection Agency
2.
Chem Res Toxicol ; 31(5): 287-290, 2018 05 21.
Article in English | MEDLINE | ID: mdl-29600706

ABSTRACT

Changes in chemical regulations worldwide have increased the demand for new data on chemical safety. New approach methodologies (NAMs) are defined broadly here as including in silico approaches and in chemico and in vitro assays, as well as the inclusion of information from the exposure of chemicals in the context of hazard [European Chemicals Agency, " New Approach Methodologies in Regulatory Science ", 2016]. NAMs for toxicity testing, including alternatives to animal testing approaches, have shown promise to provide a large amount of data to fill information gaps in both hazard and exposure. In order to increase experience with the new data and to advance the applications of NAM data to evaluate the safety of data-poor chemicals, demonstration case studies need to be developed to build confidence in their usability. Case studies can be used to explore the domains of applicability of the NAM data and identify areas that would benefit from further research, development, and application. To ensure that this science evolves with direct input from and engagement by risk managers and regulatory decision makers, a workshop was convened among senior leaders from international regulatory agencies to identify common barriers for using NAMs and to propose next steps to address them. Central to the workshop were a series of collaborative case studies designed to explore areas where the benefits of NAM data could be demonstrated. These included use of in vitro bioassay data in combination with exposure estimates to derive a quantitative assessment of risk, use of NAMs for updating chemical categorizations, and use of NAMs to increase understanding of exposure and human health toxicity of various chemicals. The case study approach proved effective in building collaborations and engagement with regulatory decision makers and in promoting the importance of data and knowledge sharing among international regulatory agencies. 
The case studies will continue, exploring new ways of describing hazard (i.e., pathway perturbations as a measure of adversity) and new ways of describing risk (i.e., using NAMs to identify protective levels without necessarily being predictive of a specific hazard). Importantly, the case studies also highlighted the need for increased training and communication across the various communities, including risk assessors, regulators, stakeholders (e.g., industry, non-governmental organizations), and the general public. The development and application of NAMs will play an increasing role in filling important data gaps on the safety of chemicals, but confidence in NAMs will only come with learning by doing and sharing in the experience.


Subject(s)
Animal Testing Alternatives , Organic Chemicals/adverse effects , Toxicity Tests , Animals , Humans , Organic Chemicals/toxicity , Risk Assessment
3.
Am J Public Health ; 107(7): 1032-1039, 2017 07.
Article in English | MEDLINE | ID: mdl-28520487

ABSTRACT

Preventing adverse health effects of environmental chemical exposure is fundamental to protecting individual and public health. When done efficiently and properly, chemical risk assessment enables risk management actions that minimize the incidence and effects of environmentally induced diseases related to chemical exposure. However, traditional chemical risk assessment is faced with multiple challenges with respect to predicting and preventing disease in human populations, and epidemiological studies increasingly report observations of adverse health effects at exposure levels predicted from animal studies to be safe for humans. This discordance reinforces concerns about the adequacy of contemporary risk assessment practices for protecting public health. It is becoming clear that to protect public health more effectively, future risk assessments will need to use the full range of available data, draw on innovative methods to integrate diverse data streams, and consider health endpoints that also reflect the range of subtle effects and morbidities observed in human populations. Considering these factors, there is a need to reframe chemical risk assessment to be more clearly aligned with the public health goal of minimizing environmental exposures associated with disease.


Subject(s)
Data Interpretation, Statistical , Environmental Exposure/adverse effects , Public Health/trends , Risk Assessment/methods , Animals , Environmental Exposure/prevention & control , Forecasting , Humans , Incidence , Models, Animal
5.
Environ Health Perspect ; 124(6): 713-21, 2016 06.
Article in English | MEDLINE | ID: mdl-26600562

ABSTRACT

BACKGROUND: A recent review by the International Agency for Research on Cancer (IARC) updated the assessments of the > 100 agents classified as Group 1, carcinogenic to humans (IARC Monographs Volume 100, parts A-F). This exercise was complicated by the absence of a broadly accepted, systematic method for evaluating mechanistic data to support conclusions regarding human hazard from exposure to carcinogens. OBJECTIVES AND METHODS: IARC therefore convened two workshops in which an international Working Group of experts identified 10 key characteristics, one or more of which are commonly exhibited by established human carcinogens. DISCUSSION: These characteristics provide the basis for an objective approach to identifying and organizing results from pertinent mechanistic studies. The 10 characteristics are the abilities of an agent to 1) act as an electrophile either directly or after metabolic activation; 2) be genotoxic; 3) alter DNA repair or cause genomic instability; 4) induce epigenetic alterations; 5) induce oxidative stress; 6) induce chronic inflammation; 7) be immunosuppressive; 8) modulate receptor-mediated effects; 9) cause immortalization; and 10) alter cell proliferation, cell death, or nutrient supply. CONCLUSION: We describe the use of the 10 key characteristics to conduct a systematic literature search focused on relevant end points and construct a graphical representation of the identified mechanistic information. Next, we use benzene and polychlorinated biphenyls as examples to illustrate how this approach may work in practice. The approach described is similar in many respects to those currently being implemented by the U.S. EPA's Integrated Risk Information System Program and the U.S. National Toxicology Program. CITATION: Smith MT, Guyton KZ, Gibbons CF, Fritz JM, Portier CJ, Rusyn I, DeMarini DM, Caldwell JC, Kavlock RJ, Lambert P, Hecht SS, Bucher JR, Stewart BW, Baan R, Cogliano VJ, Straif K. 2016. 
Key characteristics of carcinogens as a basis for organizing data on mechanisms of carcinogenesis. Environ Health Perspect 124:713-721; http://dx.doi.org/10.1289/ehp.1509912.


Subject(s)
Carcinogenicity Tests/methods , Carcinogens/toxicity , Animals , Benzene/toxicity , Carcinogenesis , Carcinogenicity Tests/standards , Carcinogens/standards , Humans , Polychlorinated Biphenyls/toxicity , Risk Assessment/methods , Risk Assessment/standards
6.
Environ Health Perspect ; 124(7): 910-9, 2016 07.
Article in English | MEDLINE | ID: mdl-26473631

ABSTRACT

BACKGROUND: High-content imaging (HCI) allows simultaneous measurement of multiple cellular phenotypic changes and is an important tool for evaluating the biological activity of chemicals. OBJECTIVES: Our goal was to analyze dynamic cellular changes using HCI to identify the "tipping point" at which the cells did not show recovery toward a normal phenotypic state. METHODS: HCI was used to evaluate the effects of 967 chemicals (at concentrations ranging from 0.4 to 200 µM) on HepG2 cells over a 72-hr exposure period. The HCI end points included p53, c-Jun, histone H2A.x, α-tubulin, histone H3, mitochondrial membrane potential, mitochondrial mass, cell cycle arrest, nuclear size, and cell number. A computational model was developed to interpret HCI responses as cell-state trajectories. RESULTS: Analysis of cell-state trajectories showed that 336 chemicals produced tipping points and that HepG2 cells were resilient to the effects of 334 chemicals up to the highest concentration (200 µM) and duration (72 hr) tested. Tipping points were identified as concentration-dependent transitions in system recovery, and the corresponding critical concentrations were generally between 5 and 15 times (25th and 75th percentiles, respectively) lower than the concentration that produced any significant effect on HepG2 cells. The remaining 297 chemicals require more data before they can be placed in either of these categories. CONCLUSIONS: These findings show the utility of HCI data for reconstructing cell state trajectories and provide insight into the adaptation and resilience of in vitro cellular systems based on tipping points. Cellular tipping points could be used to define a point of departure for risk-based prioritization of environmental chemicals. CITATION: Shah I, Setzer RW, Jack J, Houck KA, Judson RS, Knudsen TB, Liu J, Martin MT, Reif DM, Richard AM, Thomas RS, Crofton KM, Dix DJ, Kavlock RJ. 2016. 
Using ToxCast™ data to reconstruct dynamic cell state trajectories and estimate toxicological points of departure. Environ Health Perspect 124:910-919; http://dx.doi.org/10.1289/ehp.1409029.
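
The tipping-point idea in this abstract — a concentration above which a perturbed cell state no longer trends back toward control — can be illustrated with a toy trajectory model. The dynamics, rates, and threshold below are invented for illustration and are not the paper's fitted model:

```python
# Toy tipping-point analysis: simulate a perturbed cell-state variable
# relaxing toward baseline, with the recovery rate shrinking as the
# concentration rises. All parameters here are illustrative.
def recovers(conc_um, k0=1.0, c_crit=12.5, steps=72):
    """True if the state trends back to baseline within the time course."""
    k = k0 * (1.0 - conc_um / c_crit)   # recovery rate; negative past c_crit
    state = 1.0                          # initial perturbation from baseline
    for _ in range(steps):
        state -= 0.05 * k * state        # relax (or diverge) each hour
    return abs(state) < 0.5              # heading back toward baseline?

# A hypothetical dilution series; the "tipping point" is the lowest
# tested concentration at which the system fails to recover.
concs = [0.4, 1.6, 6.3, 25.0, 100.0, 200.0]
tipping = next((c for c in concs if not recovers(c)), None)
```

With these made-up parameters the trajectory recovers at low concentrations and diverges once the recovery rate turns negative, so the transition lands between 6.3 and 25 µM — a sketch of the concentration-dependent transition in system recovery the abstract describes.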


Subject(s)
Environmental Pollutants/toxicity , Toxicity Tests/methods , High-Throughput Screening Assays , Membrane Potential, Mitochondrial , Risk Assessment
7.
Environ Health Perspect ; 123(11): A268-72, 2015 Nov.
Article in English | MEDLINE | ID: mdl-26523530

ABSTRACT

Biomedical developments in the 21st century provide an unprecedented opportunity to gain a dynamic systems-level and human-specific understanding of the causes and pathophysiologies of disease. This understanding is a vital need, in view of continuing failures in health research, drug discovery, and clinical translation. The full potential of advanced approaches may not be achieved within a 20th-century conceptual framework dominated by animal models. Novel technologies are being integrated into environmental health research and are also applicable to disease research, but these advances need a new medical research and drug discovery paradigm to gain maximal benefits. We suggest a new conceptual framework that repurposes the 21st-century transition underway in toxicology. Human disease should be conceived as resulting from integrated extrinsic and intrinsic causes, with research focused on modern human-specific models to understand disease pathways at multiple biological levels that are analogous to adverse outcome pathways in toxicology. Systems biology tools should be used to integrate and interpret data about disease causation and pathophysiology. Such an approach promises progress in overcoming the current roadblocks to understanding human disease and successful drug discovery and translation. A discourse should begin now to identify and consider the many challenges and questions that need to be solved.


Subject(s)
Biomedical Research/methods , Systems Biology/methods , Toxicology/methods , Animal Testing Alternatives , Computer Simulation , Drug Discovery , Genomics , Humans
9.
Sci Rep ; 4: 5664, 2014 Jul 11.
Article in English | MEDLINE | ID: mdl-25012808

ABSTRACT

The U.S. Tox21 program has screened a library of approximately 10,000 (10K) environmental chemicals and drugs in three independent runs for estrogen receptor alpha (ERα) agonist and antagonist activity using two types of ER reporter gene cell lines, one with an endogenous full length ERα (ER-luc; BG1 cell line) and the other with a transfected partial receptor consisting of the ligand binding domain (ER-bla; ERα ß-lactamase cell line), in a quantitative high-throughput screening (qHTS) format. The ability of the two assays to correctly identify ERα agonists and antagonists was evaluated using a set of 39 reference compounds with known ERα activity. Although both assays demonstrated adequate (i.e., >80%) predictivity, the ER-luc assay was more sensitive and the ER-bla assay more specific. The qHTS assay results were compared with results from previously published ERα binding assay data and showed >80% consistency. Actives identified from both the ER-bla and ER-luc assays were analyzed for structure-activity relationships (SARs), revealing known and potentially novel ERα active structure classes. The results demonstrate the feasibility of qHTS to identify environmental chemicals with the potential to interact with the ERα signaling pathway, and that using two different assay formats improves confidence in correctly identifying these chemicals.
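
Performance figures like the >80% predictivity reported above, and the sensitivity/specificity trade-off between the two assays, are ordinary confusion-matrix statistics computed against reference compounds. A minimal sketch with a hypothetical reference set (not the actual 39-compound panel):

```python
# Known activity (1 = active, 0 = inactive) for a hypothetical
# reference panel, and one assay's calls for the same compounds.
reference = {"estradiol": 1, "genistein": 1, "bisphenol_a": 1,
             "atrazine": 0, "glyphosate": 0, "caffeine": 0}
assay_calls = {"estradiol": 1, "genistein": 1, "bisphenol_a": 0,
               "atrazine": 0, "glyphosate": 0, "caffeine": 0}

tp = sum(1 for c, r in reference.items() if r == 1 and assay_calls[c] == 1)
fn = sum(1 for c, r in reference.items() if r == 1 and assay_calls[c] == 0)
tn = sum(1 for c, r in reference.items() if r == 0 and assay_calls[c] == 0)
fp = sum(1 for c, r in reference.items() if r == 0 and assay_calls[c] == 1)

sensitivity = tp / (tp + fn)               # fraction of true actives called
specificity = tn / (tn + fp)               # fraction of inactives cleared
predictivity = (tp + tn) / len(reference)  # overall agreement with reference
```

A "more sensitive" assay raises the first number at the cost of the second; a "more specific" assay does the reverse, which is the comparison the abstract draws between ER-luc and ER-bla.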


Subject(s)
Estrogen Receptor alpha/agonists , Estrogen Receptor alpha/antagonists & inhibitors , Signal Transduction/drug effects , Small Molecule Libraries/pharmacology , Cell Line , Genes, Reporter/drug effects , HEK293 Cells , High-Throughput Screening Assays/methods , Humans , Ligands , Protein Binding/drug effects , Structure-Activity Relationship
10.
Nat Biotechnol ; 32(6): 583-91, 2014 Jun.
Article in English | MEDLINE | ID: mdl-24837663

ABSTRACT

Addressing the safety aspects of drugs and environmental chemicals has historically been undertaken through animal testing. However, the quantity of chemicals in need of assessment and the challenges of species extrapolation require the development of alternative approaches. Our approach, the US Environmental Protection Agency's ToxCast program, utilizes a large suite of in vitro and model organism assays to interrogate important chemical libraries and computationally analyze bioactivity profiles. Here we evaluated one component of the ToxCast program, the use of primary human cell systems, by screening for chemicals that disrupt physiologically important pathways. Chemical-response signatures for 87 endpoints covering molecular functions relevant to toxic and therapeutic pathways were generated in eight cell systems for 641 environmental chemicals and 135 reference pharmaceuticals and failed drugs. Computational clustering of the profiling data provided insights into the polypharmacology and potential off-target effects for many chemicals that have limited or no toxicity information. The endpoints measured can be closely linked to in vivo outcomes, such as the upregulation of tissue factor in endothelial cell systems by compounds linked to the risk of thrombosis in vivo. Our results demonstrate that assaying complex biological pathways in primary human cells can identify potential chemical targets, toxicological liabilities and mechanisms useful for elucidating adverse outcome pathways.


Subject(s)
Animal Testing Alternatives/methods , High-Throughput Screening Assays/methods , Models, Biological , Small Molecule Libraries , Toxicity Tests/methods , Animals , Computer Simulation , Humans , Mice , Phenotype , Rats , United States , United States Environmental Protection Agency
11.
Chem Res Toxicol ; 27(3): 314-29, 2014 Mar 17.
Article in English | MEDLINE | ID: mdl-24446777

ABSTRACT

Systems Toxicology is the integration of classical toxicology with quantitative analysis of large networks of molecular and functional changes occurring across multiple levels of biological organization. Society demands increasingly close scrutiny of the potential health risks associated with exposure to chemicals present in our everyday life, leading to an increasing need for more predictive and accurate risk-assessment approaches. Developing such approaches requires a detailed mechanistic understanding of the ways in which xenobiotic substances perturb biological systems and lead to adverse outcomes. Thus, Systems Toxicology approaches offer modern strategies for gaining such mechanistic knowledge by combining advanced analytical and computational tools. Furthermore, Systems Toxicology is a means for the identification and application of biomarkers for improved safety assessments. In Systems Toxicology, quantitative systems-wide molecular changes in the context of an exposure are measured, and a causal chain of molecular events linking exposures with adverse outcomes (i.e., functional and apical end points) is deciphered. Mathematical models are then built to describe these processes in a quantitative manner. The integrated data analysis leads to the identification of how biological networks are perturbed by the exposure and enables the development of predictive mathematical models of toxicological processes. This perspective integrates current knowledge regarding bioanalytical approaches, computational analysis, and the potential for improved risk assessment.


Subject(s)
Models, Theoretical , Risk Assessment , Animals , Biomarkers/metabolism , Chemical and Drug Induced Liver Injury/etiology , Chemical and Drug Induced Liver Injury/metabolism , Disease Models, Animal , Environmental Exposure , Humans , Metabolomics , Proteomics , Xenobiotics/chemistry , Xenobiotics/toxicity
12.
Drug Discov Today ; 18(15-16): 716-23, 2013 Aug.
Article in English | MEDLINE | ID: mdl-23732176

ABSTRACT

Since its establishment in 2008, the US Tox21 inter-agency collaboration has made great progress in developing and evaluating, as a proof of principle, cellular models for the evaluation of environmental chemicals. Currently, the program has entered its production phase (Tox21 Phase II), focusing initially on the areas of modulation of nuclear receptors and stress response pathways. During Tox21 Phase II, the set of chemicals to be tested has been expanded to nearly 10,000 (10K) compounds and a fully automated screening platform has been implemented. The Tox21 robotic system combined with informatics efforts is capable of screening and profiling the collection of 10K environmental chemicals in triplicate in a week. In this article, we describe the Tox21 screening process, compound library preparation, data processing, and robotic system validation.
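
The scale of a triplicate 10K qHTS run can be estimated with simple plate arithmetic. The titration-point count and the number of control wells reserved per plate below are assumptions for illustration, not figures from the article:

```python
# Back-of-envelope plate count for a triplicate qHTS campaign in
# 1536-well plates. Concentration points and control wells per plate
# are assumed values, not taken from the Tox21 protocol.
compounds = 10_000
replicates = 3
concentrations = 15          # assumed titration points per compound
wells_per_plate = 1536
control_wells = 128          # assumed wells reserved for controls

usable = wells_per_plate - control_wells
wells_needed = compounds * replicates * concentrations
plates = -(-wells_needed // usable)   # ceiling division
```

Under these assumptions the run needs 450,000 compound wells across a few hundred 1536-well plates, which makes clear why a fully automated robotic platform is required to finish in a week.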


Subject(s)
Environmental Monitoring/methods , Environmental Pollutants/chemistry , Robotics/methods , United States Environmental Protection Agency , Animals , Environmental Monitoring/instrumentation , Humans , Robotics/instrumentation , United States , United States Environmental Protection Agency/trends
13.
Chem Res Toxicol ; 26(7): 1097-107, 2013 Jul 15.
Article in English | MEDLINE | ID: mdl-23682706

ABSTRACT

High-throughput screening (HTS) assays capable of profiling thousands of environmentally relevant chemicals for in vitro biological activity provide useful information on the potential for disrupting endocrine pathways. Disruption of the estrogen signaling pathway has been implicated in a variety of adverse health effects including impaired development, reproduction, and carcinogenesis. The estrogen-responsive human mammary ductal carcinoma cell line T-47D was exposed to 1815 ToxCast chemicals comprising pesticides, industrial chemicals, pharmaceuticals, personal care products, cosmetics, food ingredients, and other chemicals with known or suspected human exposure potential. Cell growth kinetics were evaluated using real-time cell electronic sensing. T-47D cells were exposed to eight concentrations (0.006-100 µM), and measurements of cellular impedance were repeatedly recorded for 105 h. Chemical effects were evaluated based on potency (concentration at which response occurs) and efficacy (extent of response). A linear growth response was observed in response to prototypical estrogen receptor agonists (17ß-estradiol, genistein, bisphenol A, nonylphenol, and 4-tert-octylphenol). Several compounds, including bisphenol A and genistein, induced cell growth comparable in efficacy to that of 17ß-estradiol, but with decreased potency. Progestins, androgens, and corticosteroids invoked a biphasic growth response indicative of changes in cell number or cell morphology. Results from this cell growth assay were compared with results from additional estrogen receptor (ER) binding and transactivation assays. Chemicals detected as active in both the cell growth and ER receptor binding assays demonstrated potencies highly correlated with two ER transactivation assays (r = 0.72; r = 0.70). 
While ER binding assays detected chemicals that were highly potent or efficacious in the T-47D cell growth and transactivation assays, the binding assays lacked sensitivity in detecting weakly active compounds. In conclusion, this cell-based assay rapidly detects chemical effects on T-47D growth and shows potential, in combination with other HTS assays, to detect environmentally relevant chemicals with potential estrogenic activity.
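
The potency/efficacy distinction drawn above — compounds such as bisphenol A matching 17ß-estradiol's maximal response but at higher concentrations — can be read directly off fitted concentration-response curves. A minimal sketch using a Hill function with invented parameters (equal efficacy, very different potency):

```python
def hill(conc_um, top, ec50_um, slope):
    """Hill concentration-response: predicted response at a concentration.
    `top` is the efficacy (maximal response); `ec50_um` sets the potency."""
    return top / (1.0 + (ec50_um / conc_um) ** slope)

# Hypothetical fits for two agonists with equal efficacy (same top)
# but different potency (EC50). Values are illustrative, not measured.
estradiol = dict(top=100.0, ec50_um=0.0001, slope=1.0)
bisphenol_a = dict(top=100.0, ec50_um=1.0, slope=1.0)

# At 0.01 µM the more potent compound is near its maximum while the
# less potent one has barely begun to respond.
r_e2 = hill(0.01, **estradiol)
r_bpa = hill(0.01, **bisphenol_a)
```

Both curves eventually plateau at the same response (comparable efficacy), but the concentrations needed to get there differ by orders of magnitude (decreased potency), which is exactly the pattern the abstract reports.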


Subject(s)
Breast Neoplasms/metabolism , Carcinoma, Ductal, Breast/metabolism , Environmental Pollutants/toxicity , Hormones/metabolism , Molecular Mimicry , Toxicity Tests , Breast Neoplasms/pathology , Carcinoma, Ductal, Breast/pathology , Cell Line, Tumor , Cell Proliferation/drug effects , Female , High-Throughput Screening Assays , Humans , Kinetics , Receptors, Estrogen/metabolism , Time Factors
14.
Chem Res Toxicol ; 26(6): 878-95, 2013 Jun 17.
Article in English | MEDLINE | ID: mdl-23611293

ABSTRACT

Understanding potential health risks is a significant challenge due to the large numbers of diverse chemicals with poorly characterized exposures and mechanisms of toxicity. The present study analyzes 976 chemicals (including failed pharmaceuticals, alternative plasticizers, food additives, and pesticides) in Phases I and II of the U.S. EPA's ToxCast project across 331 cell-free enzymatic and ligand-binding high-throughput screening (HTS) assays. Half-maximal activity concentrations (AC50) were identified for 729 chemicals in 256 assays (7,135 chemical-assay pairs). Some of the most commonly affected assays were CYPs (CYP2C9 and CYP2C19), transporters (mitochondrial TSPO, norepinephrine, and dopaminergic), and GPCRs (aminergic). Heavy metals, surfactants, and dithiocarbamate fungicides showed promiscuous but distinctly different patterns of activity, whereas many of the pharmaceutical compounds showed promiscuous activity across GPCRs. Literature analysis confirmed >50% of the activities for the most potent chemical-assay pairs (54) but also revealed 10 missed interactions. Twenty-two chemicals with known estrogenic activity were correctly identified for the majority (77%), missing only the weaker interactions. In many cases, novel findings for previously unreported chemical-target combinations clustered with known chemical-target interactions. Results from this large inventory of chemical-biological interactions can inform read-across methods as well as link potential targets to molecular initiating events in adverse outcome pathways for diverse toxicities.
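
An AC50 is the concentration producing half of a chemical's maximal activity in an assay. Given a fitted activity curve, one simple way to extract it is bisection in log-concentration; the curve parameters here are invented for illustration, not ToxCast fits:

```python
def response(conc, top=80.0, ac50=3.2, slope=1.5):
    # Hill-shaped activity curve (illustrative parameters);
    # half-maximal activity occurs exactly at conc == ac50.
    return top / (1.0 + (ac50 / conc) ** slope)

def find_ac50(curve, top, lo=1e-4, hi=1e4, iters=80):
    """Bisect for the concentration giving half-maximal response.
    Uses the geometric midpoint, i.e. bisection on a log axis, since
    concentration-response data span orders of magnitude."""
    target = top / 2.0
    for _ in range(iters):
        mid = (lo * hi) ** 0.5
        if curve(mid) < target:
            lo = mid
        else:
            hi = mid
    return (lo * hi) ** 0.5

est = find_ac50(response, top=80.0)   # recovers the curve's AC50
```

In practice the curve itself would first be fit to the screening data (e.g., by nonlinear least squares) before the half-maximal point is read off; the sketch only shows the second step.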


Subject(s)
Enzymes/metabolism , High-Throughput Screening Assays , Organic Chemicals/toxicity , Signal Transduction/drug effects , Animals , Guinea Pigs , Humans , Membrane Transport Proteins/metabolism , Rats , Receptors, G-Protein-Coupled/antagonists & inhibitors , Receptors, G-Protein-Coupled/metabolism
15.
Environ Health Perspect ; 121(7): 756-65, 2013 Jul.
Article in English | MEDLINE | ID: mdl-23603828

ABSTRACT

BACKGROUND: In 2008, the National Institute of Environmental Health Sciences/National Toxicology Program, the U.S. Environmental Protection Agency's National Center for Computational Toxicology, and the National Human Genome Research Institute/National Institutes of Health Chemical Genomics Center entered into an agreement on "high throughput screening, toxicity pathway profiling, and biological interpretation of findings." In 2010, the U.S. Food and Drug Administration (FDA) joined the collaboration, known informally as Tox21. OBJECTIVES: The Tox21 partners agreed to develop a vision and devise an implementation strategy to shift the assessment of chemical hazards away from traditional experimental animal toxicology studies to one based on target-specific, mechanism-based, biological observations largely obtained using in vitro assays. DISCUSSION: Here we outline the efforts of the Tox21 partners up to the time the FDA joined the collaboration, describe the approaches taken to develop the science and technologies that are currently being used, assess the current status, and identify problems that could impede further progress as well as suggest approaches to address those problems. CONCLUSION: Tox21 faces some very difficult issues. However, we are making progress in integrating data from diverse technologies and end points into what is effectively a systems-biology approach to toxicology. This can be accomplished only when comprehensive knowledge is obtained with broad coverage of chemical and biological/toxicological space. The efforts thus far reflect the initial stage of an exceedingly complicated program, one that will likely take decades to fully achieve its goals. However, even at this stage, the information obtained has attracted the attention of the international scientific community, and we believe these efforts foretell the future of toxicology.


Subject(s)
Environmental Pollutants/toxicity , Hazardous Substances/toxicity , Toxicology/methods , Animals , Computational Biology/methods , Genomics/methods , Humans , National Institute of Environmental Health Sciences (U.S.) , National Institutes of Health (U.S.) , Toxicity Tests/methods , United States , United States Environmental Protection Agency , United States Food and Drug Administration
16.
Toxicol In Vitro ; 27(4): 1320-46, 2013 Jun.
Article in English | MEDLINE | ID: mdl-23453986

ABSTRACT

The thyroid hormone (TH) system is involved in several important physiological processes, including regulation of energy metabolism, growth and differentiation, development and maintenance of brain function, thermo-regulation, osmo-regulation, regulation of other endocrine systems, sexual behaviour and fertility, and cardiovascular function. Therefore, concern about TH disruption (THD) has resulted in strategies being developed to identify THD chemicals (THDCs). Information on the potential of chemicals to cause THD is typically derived from animal studies. For the majority of chemicals, however, this information is either limited or unavailable. It is also unlikely that animal experiments will be performed for all THD-relevant chemicals in the near future for ethical, financial and practical reasons. In addition, typical animal experiments often do not provide information on the mechanism of action of THDCs, making it harder to extrapolate results across species. Relevant effects may not be identified in animal studies when the effects are delayed, life stage specific, not assessed by the experimental paradigm (e.g., behaviour) or only occur when an organism has to adapt to environmental factors by modulating TH levels. Therefore, in vitro and in silico alternatives to identify THDCs and quantify their potency are needed. THDCs have many potential mechanisms of action, including altered hormone production, transport, metabolism, receptor activation and disruption of several feed-back mechanisms. In vitro assays are available for many of these endpoints, and the application of modern '-omics' technologies, applicable for in vivo studies, can help to reveal relevant and possibly new endpoints for inclusion in a targeted THDC in vitro test battery. 
Within the framework of the ASAT initiative (Assuring Safety without Animal Testing), an international group consisting of experts in the areas of thyroid endocrinology, toxicology of endocrine disruption, neurotoxicology, high-throughput screening, computational biology, and regulatory affairs has reviewed the state of the science for (1) known mechanisms for THD plus examples of THDCs; (2) in vitro THD tests currently available or under development related to these mechanisms; and (3) in silico methods for estimating the blood levels of THDCs. Based on this scientific review, the panel has recommended a battery of test methods to classify chemicals as of low or high concern for further hazard and risk assessment for THD. In addition, research gaps and needs are identified to optimize and validate the targeted THD in vitro test battery for a mechanism-based strategy for a decision to opt out or to proceed with further testing for THD.


Subject(s)
Endocrine Disruptors/toxicity , Thyroid Hormones/metabolism , Animals , Biological Assay , Humans , Models, Biological
17.
ALTEX ; 30(1): 51-6, 2013.
Article in English | MEDLINE | ID: mdl-23338806

ABSTRACT

In vitro high-throughput screening (HTS) assays are seeing increasing use in toxicity testing. HTS assays can simultaneously test many chemicals but have seen limited use in the regulatory arena, in part because of the need to undergo rigorous, time-consuming formal validation. Here we discuss streamlining the validation process, specifically for prioritization applications. By prioritization, we mean a process in which less complex, less expensive, and faster assays are used to prioritize which chemicals are subjected first to more complex, expensive, and slower guideline assays. Data from the HTS prioritization assays is intended to provide a priori evidence that certain chemicals have the potential to lead to the types of adverse effects that the guideline tests are assessing. The need for such prioritization approaches is driven by the fact that there are tens of thousands of chemicals to which people are exposed, but the yearly throughput of most guideline assays is small in comparison. The streamlined validation process would continue to ensure the reliability and relevance of assays for this application. We discuss the following practical guidelines: (1) follow current validation practice to the extent possible and practical; (2) make increased use of reference compounds to better demonstrate assay reliability and relevance; (3) de-emphasize the need for cross-laboratory testing; and (4) implement a web-based, transparent, and expedited peer review process.


Subject(s)
Animal Testing Alternatives/methods , Animal Testing Alternatives/standards , High-Throughput Screening Assays/methods , Toxicity Tests/methods , Toxicity Tests/standards , Animal Testing Alternatives/trends , Animals , High-Throughput Screening Assays/standards , High-Throughput Screening Assays/trends , Humans , Reproducibility of Results , Toxicity Tests/trends
18.
Toxicol Sci ; 131(1): 40-55, 2013 Jan.
Article in English | MEDLINE | ID: mdl-23024176

ABSTRACT

Thousands of untested chemicals in the environment require efficient characterization of carcinogenic potential in humans. A proposed solution is rapid testing of chemicals using in vitro high-throughput screening (HTS) assays for targets in pathways linked to disease processes to build models for priority setting and further testing. We describe a model for predicting rodent carcinogenicity based on HTS data from 292 chemicals tested in 672 assays mapping to 455 genes. All data come from the EPA ToxCast project. The model was trained on a subset of 232 chemicals with in vivo rodent carcinogenicity data in the Toxicity Reference Database (ToxRefDB). Individual HTS assays strongly associated with rodent cancers in ToxRefDB were linked to genes, pathways, and hallmark processes documented to be involved in tumor biology and cancer progression. Rodent liver cancer endpoints were linked to well-documented pathways such as peroxisome proliferator-activated receptor signaling and TP53 and novel targets such as PDE5A and PLAUR. Cancer hallmark genes associated with rodent thyroid tumors were found to be linked to human thyroid tumors and autoimmune thyroid disease. A model was developed in which these genes/pathways function as hypothetical enhancers or promoters of rat thyroid tumors, acting secondary to the key initiating event of thyroid hormone disruption. A simple scoring function was generated to identify chemicals with significant in vitro evidence that was predictive of in vivo carcinogenicity in different rat tissues and organs. This scoring function was applied to an external test set of 33 compounds with carcinogenicity classifications from the EPA's Office of Pesticide Programs and successfully (p = 0.024) differentiated between chemicals classified as "possible"/"probable"/"likely" carcinogens and those designated as "not likely" or with "evidence of noncarcinogenicity." This model represents a chemical carcinogenicity prioritization tool supporting targeted testing and functional validation of cancer pathways.
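A "simple scoring function" of the kind this abstract describes can be sketched as a weighted sum of pathway-mapped HTS hits, with chemicals above a cutoff flagged as higher carcinogenicity priority. The sketch below is a generic illustration of that shape, not the published model: the assay names, weights, and cutoff are all hypothetical.

```python
# Hedged sketch of an additive carcinogenicity-priority score: sum weighted
# hits across cancer-pathway HTS assays. Assay names, weights, and the
# cutoff are hypothetical illustrations, not the paper's actual model.

def cancer_priority_score(assay_hits, weights):
    """assay_hits: dict of assay -> 1 if active, 0 if inactive.
    weights: dict of assay -> pathway-relevance weight."""
    return sum(weights.get(assay, 0.0) * hit for assay, hit in assay_hits.items())

score = cancer_priority_score(
    {"PPARg_agonism": 1, "TP53_activation": 0, "PDE5A_inhibition": 1},
    {"PPARg_agonism": 2.0, "TP53_activation": 1.5, "PDE5A_inhibition": 1.0},
)
is_priority = score >= 2.5  # hypothetical cutoff
# score == 3.0; is_priority == True
```

In the published work, the discrimination achieved by such a score on an external test set (p = 0.024) is what supports its use as a prioritization tool.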


Subject(s)
Biomarkers, Tumor/analysis , Carcinogens/toxicity , High-Throughput Screening Assays , Neoplasms/chemically induced , Animals , Biomarkers, Tumor/genetics , Carcinogenicity Tests , Carcinogens/chemistry , Databases, Factual , Endpoint Determination , In Vitro Techniques , Mice , Neoplasms/genetics , Predictive Value of Tests , Rats , Species Specificity , Thyroid Neoplasms/chemically induced , Thyroid Neoplasms/genetics
20.
Chem Res Toxicol ; 25(7): 1287-302, 2012 Jul 16.
Article in English | MEDLINE | ID: mdl-22519603

ABSTRACT

The field of toxicology is on the cusp of a major transformation in how the safety and hazard of chemicals are evaluated for potential effects on human health and the environment. Brought on by the recognition of the limitations of the current paradigm in terms of cost, time, and throughput, combined with the ever-increasing power of modern biological tools to probe mechanisms of chemical-biological interactions at finer and finer resolution, 21st century toxicology is rapidly taking shape. A key element of the new approach is a focus on the molecular and cellular pathways that are the targets of chemical interactions. By understanding toxicity in this manner, we begin to learn how chemicals cause toxicity, as opposed to merely what diseases or health effects they might cause. This deeper understanding leads to increasing confidence in identifying which populations might be at risk, significant susceptibility factors, and key influences on the shape of the dose-response curve. The U.S. Environmental Protection Agency (EPA) initiated the ToxCast, or "toxicity forecaster," program five years ago to understand the strengths and limitations of the new approach by testing relatively large numbers (hundreds) of chemicals against an equally large number of biological assays. Using computational approaches, the EPA is building decision support tools based on ToxCast in vitro screening results to help prioritize chemicals for further investigation, as well as developing predictive models for a number of health outcomes. This perspective summarizes the initial, proof-of-concept Phase I of ToxCast, which has laid the groundwork for the next phases and future directions of the program.


Subject(s)
Environmental Pollutants/toxicity , Risk Management , Biological Assay , Decision Support Techniques , Environmental Pollutants/chemistry , Humans , Program Development , United States , United States Environmental Protection Agency