ABSTRACT
Exposure levels without appreciable human health risk may be determined by dividing a point of departure on a dose-response curve (e.g., a benchmark dose) by a composite adjustment factor (AF). An "effect severity" AF (ESAF) is employed in some regulatory contexts. An ESAF of 10 may be incorporated into the derivation of a health-based guidance value (HBGV) when a "severe" toxicological endpoint, such as teratogenicity, irreversible reproductive effects, neurotoxicity, or cancer, was observed in the reference study. Although mutation data have historically been used for hazard identification, this endpoint is also suitable for quantitative dose-response modeling and risk assessment. As part of the 8th International Workshops on Genotoxicity Testing, a sub-group of the Quantitative Analysis Work Group (WG) explored how the concept of effect severity could be applied to mutation. To approach this question, the WG reviewed the prevailing regulatory guidance on how an ESAF is incorporated into risk assessments, evaluated current knowledge of associations between germline or somatic mutation and severe disease risk, and mined available data on the fraction of human germline mutations expected to cause severe disease. Based on this review, and given that mutations are irreversible and some cause severe human disease, a majority of the WG recommends applying an ESAF value between 2 and 10 when deriving an HBGV from mutation data in regulatory settings where an ESAF is used. This recommendation may need to be revisited if direct measurement of disease-causing mutations by error-corrected next-generation sequencing clarifies the selection of ESAF values.
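The arithmetic the abstract describes — dividing a point of departure by a composite adjustment factor that may include an ESAF — can be sketched in a few lines. The BMDL value and the interspecies/intraspecies defaults of 10 below are illustrative assumptions; only the ESAF range of 2-10 comes from the text.

```python
def derive_hbgv(bmdl, af_interspecies=10.0, af_intraspecies=10.0, esaf=1.0):
    """Divide a point of departure (e.g., a BMDL) by the composite adjustment factor.

    The defaults of 10 for inter- and intraspecies factors are illustrative
    assumptions; an ESAF > 1 is applied only for severe endpoints.
    """
    return bmdl / (af_interspecies * af_intraspecies * esaf)

# Hypothetical BMDL of 1.0 mg/kg/day:
hbgv_default = derive_hbgv(1.0)             # no effect-severity adjustment
hbgv_severe = derive_hbgv(1.0, esaf=10.0)   # ESAF of 10 for a severe endpoint
```

With these placeholder values, adding an ESAF of 10 lowers the HBGV by a further order of magnitude, which is the practical consequence of treating mutation as a severe endpoint.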
ABSTRACT
Lidocaine has not been associated with cancer in humans despite eight decades of therapeutic use. Its metabolite, 2,6-xylidine, is a rat carcinogen believed to induce genotoxicity via N-hydroxylation and DNA adduct formation, a non-threshold mechanism of action. To better understand this dichotomy, we review the literature on the metabolic activation and genotoxicity of 2,6-xylidine, identifying that it appears resistant to N-hydroxylation and instead metabolises almost exclusively to DMAP (an aminophenol). At high exposures (sufficient to saturate phase II metabolism), DMAP may undergo metabolic threshold-dependent activation to a quinone-imine with the potential to redox cycle, producing reactive oxygen species (ROS) and inducing cytotoxicity and genotoxicity. A new rat study found no evidence of genotoxicity in vivo, based on micronuclei in bone marrow and comets in nasal tissue or female liver, despite high-level exposure to 2,6-xylidine (including its metabolites). In male liver, weak dose-related increases in comet response, within the historical control range, were associated with metabolic overload and acute systemic toxicity. Benchmark dose analysis confirmed a non-linear dose response. The weight of evidence indicates that 2,6-xylidine is a non-direct-acting (metabolic threshold-dependent) genotoxin and is not genotoxic in vivo in rats in the absence of acute systemic toxic effects, which occur at levels 35× above lidocaine-related exposure in humans.
Subjects
Aniline Compounds/toxicity, Mutagens/toxicity, Metabolic Activation, Anesthetics, Local/pharmacokinetics, Anesthetics, Local/toxicity, Aniline Compounds/pharmacokinetics, Animals, Humans, Lidocaine/pharmacokinetics, Lidocaine/toxicity, Mutagenicity Tests, Mutagens/pharmacokinetics
ABSTRACT
The purpose of the present investigation is to analyze the in vivo genotoxicity dose-response data for ethylene oxide (EO) and the applicability of the derived point-of-departure (PoD) values for estimating permitted daily exposure (PDE) values. A total of 40 data sets were identified from the literature, and benchmark dose analyses were conducted using PROAST software to identify a PoD value. Studies employing the inhalation route of exposure and assessing gene or chromosomal mutations and chromosomal damage in various tissues were considered the most relevant for assessing risk from EO, since these effects are likely to contribute to adverse health consequences in exposed individuals. The PoD estimates were screened for precision, and the values were divided by data-derived adjustment factors. For gene mutations, the lowest PDE was 285 parts per trillion (ppt), based on the induction of lacI mutations in the testes of mice following 48 weeks of exposure to EO. The corresponding lowest PDE value for chromosomal mutations was 1,175 ppt, for heritable translocations in mice following 8.5 weeks of EO exposure. The lowest PDE for chromosomal aberrations was 238 ppt, in mouse peripheral blood lymphocytes following 48 weeks of inhalation exposure. The diverse dose-response data for EO-induced genotoxicity enabled the derivation of PoDs for various endpoints, tissues, and species and identified 238 ppt as the lowest PDE in this retrospective analysis.
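The final selection step — taking the minimum PDE across endpoint classes — can be sketched as follows, using the three values reported above (the dictionary keys are shorthand labels, not terms from the study):

```python
# PDE values (parts per trillion) reported in the abstract for each endpoint class
pdes_ppt = {
    "gene mutation (lacI, mouse testes)": 285.0,
    "chromosomal mutation (heritable translocation, mouse)": 1175.0,
    "chromosomal aberration (mouse peripheral blood lymphocytes)": 238.0,
}

# The overall PDE is driven by the most sensitive (lowest-value) endpoint
limiting_endpoint = min(pdes_ppt, key=pdes_ppt.get)
lowest_pde = pdes_ppt[limiting_endpoint]  # 238.0 ppt
```

This mirrors the retrospective analysis's conclusion that the chromosomal aberration endpoint, at 238 ppt, sets the limiting PDE.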
Subjects
Ethylene Oxide/toxicity, Mutagens/toxicity, Animals, Chromosome Aberrations/drug effects, Dose-Response Relationship, Drug, Ethylene Oxide/administration & dosage, Mice, Mutagenicity Tests, Mutagens/administration & dosage, Mutation/drug effects, Rats, Risk Assessment, Translocation, Genetic/drug effects
ABSTRACT
We recently published a next generation framework for assessing the risk of genomic damage via exposure to chemical substances. The framework entails a systematic approach with the aim to quantify risk levels for substances that induce genomic damage contributing to human adverse health outcomes. Here, we evaluated the utility of the framework for assessing the risk of industrial chemicals, using the case of benzene. Benzene is a well-studied substance that is generally considered a genotoxic carcinogen and is known to cause leukemia. The case study limits its focus to occupational and general population health as it relates to benzene exposure. Using the framework as guidance, available data on benzene considered relevant for assessment of genetic damage were collected. Based on these data, we were able to conduct quantitative analyses for relevant data sets to estimate acceptable exposure levels and to characterize the risk of genetic damage. Key observations include the need for robust exposure assessments, the importance of information on toxicokinetic properties, and the benefits of cheminformatics. The framework points to the need for further improvement in understanding of the mechanism(s) of action involved, which would also provide support for the use of targeted tests rather than a prescribed set of assays. Overall, this case study demonstrates the utility of the next generation framework to quantitatively model human risk on the basis of genetic damage, thereby enabling a new, innovative risk assessment concept. Environ. Mol. Mutagen. 61:94-113, 2020. © 2019 The Authors. Environmental and Molecular Mutagenesis published by Wiley Periodicals, Inc. on behalf of Environmental Mutagen Society.
Subjects
Benzene/toxicity, Carcinogens/toxicity, Mutagenesis/drug effects, Mutagens/toxicity, Animals, Benzene/metabolism, Carcinogens/metabolism, DNA Damage/drug effects, Environmental Exposure/adverse effects, Humans, Leukemia/chemically induced, Leukemia/genetics, Mutagenicity Tests/methods, Mutagens/metabolism, Occupational Exposure/adverse effects, Risk Assessment/methods
ABSTRACT
The screen-and-bin approach for interpretation of genotoxicity data is predicated on three false assumptions: that genotoxicants are rare, that genotoxicity dose-response functions do not contain a low-dose region mechanistically characterized by zero-order kinetics, and that genotoxicity is not a bona fide toxicological endpoint. Consequently, there is a need to develop and implement quantitative methods to interpret genotoxicity dose-response data for risk assessment and regulatory decision-making. Standardized methods to analyze dose-response data and determine point-of-departure (PoD) metrics have been established; the most robust PoD is the benchmark dose (BMD). However, there are no standards for regulatory interpretation of mutagenicity BMDs. Although 5-10% is often used as a critical effect size (CES) for BMD determination, values for genotoxicity endpoints have not been established. The use of BMDs to determine health-based guidance values (HBGVs) requires assessment factors (AFs) to account for interspecies differences and variability in human sensitivity. Default AFs used for other endpoints may not be appropriate for interpretation of in vivo mutagenicity BMDs. Analyses of published dose-response data showing the effects of compensatory pathway deficiency indicate that AFs for sensitivity differences should be in the range of 2-20. Additional analyses indicate that the AF to compensate for short treatment durations should be in the range of 5-15. Future work should use available data to empirically determine endpoint-specific CES values and, similarly, AF values for BMD adjustment. Future work should also evaluate the ability to use in vitro dose-response data for risk assessment, and the utility of probabilistic methods for determination of mutagenicity HBGVs. Environ. Mol. Mutagen. 61:66-83, 2020. © 2019 Her Majesty the Queen in Right of Canada.
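Because assessment factors are multiplicative, the endpoint-specific ranges quoted above (2-20 for sensitivity differences, 5-15 for short treatment duration) imply a wide span of composite adjustments. A minimal sketch, in which the interspecies default of 10 is an assumption not taken from the text:

```python
def composite_af(af_sensitivity, af_duration, af_interspecies=10.0):
    """Product of assessment factors applied to a mutagenicity BMD.

    af_sensitivity: 2-20 range for compensatory-pathway deficiency (from the text)
    af_duration:    5-15 range for short treatment durations (from the text)
    af_interspecies: default of 10 is an illustrative assumption
    """
    return af_sensitivity * af_duration * af_interspecies

# Bounds implied by the quoted ranges:
composite_low = composite_af(2, 5)     # most permissive combination
composite_high = composite_af(20, 15)  # most conservative combination
```

The 30-fold spread between the two bounds illustrates why the abstract calls for empirically determined, endpoint-specific AF values rather than defaults.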
Subjects
Mutagenicity Tests/methods, Mutagens/toxicity, Animals, Dose-Response Relationship, Drug, Humans, Mutation/drug effects, No-Observed-Adverse-Effect Level, Risk Assessment/methods
ABSTRACT
For several decades, regulatory testing schemes for genetic damage have been standardized around assays that examine mutations and structural and numerical chromosomal damage. This has served the genetic toxicity community well when most of the substances being tested were amenable to such assays. The outcome from this testing is usually a dichotomous (yes/no) evaluation of test results, and in many instances the information is used only to determine whether a substance has carcinogenic potential. Over the same time period, mechanisms and modes of action (MOAs) that elucidate a wider range of genomic damage involved in many adverse health outcomes have been recognized. In addition, a paradigm shift in applied genetic toxicology is moving the field toward a more quantitative dose-response analysis and point-of-departure (PoD) determination with a focus on risks to exposed humans. This is directing emphasis toward genomic damage that is likely to induce changes associated with a variety of adverse health outcomes. This paradigm shift is moving the testing emphasis for genetic damage from a hazard identification-only evaluation to a more comprehensive risk assessment approach that provides more insightful information for decision makers regarding the potential risk of genetic damage to exposed humans. To enable this broader context for examining genetic damage, a next generation testing strategy needs to take a broader, more flexible approach to testing, and ultimately modeling, of genomic damage as it relates to human exposure. This is consistent with the larger risk assessment context being used in regulatory decision making. As presented here, this flexible approach for examining genomic damage focuses on testing for relevant genomic effects that can be, as best as possible, associated with an adverse health effect.
The most desired linkage for risk to humans would be changes in loci associated with human diseases, whether in somatic or germ cells. The outline of a flexible approach and associated considerations is presented in a series of nine steps, some of which can occur in parallel, developed through a collaborative effort by leading genetic toxicologists from academia, government, and industry through the International Life Sciences Institute (ILSI) Health and Environmental Sciences Institute (HESI) Genetic Toxicology Technical Committee (GTTC). The ultimate goal is to provide quantitative data to model the potential risk levels of substances that induce genomic damage contributing to human adverse health outcomes. Any good risk assessment begins with asking the appropriate risk management questions in a planning and scoping effort. This step sets up the problem to be addressed (e.g., broadly, does genomic damage need to be addressed, and if so, how to proceed). The next two steps assemble what is known about the problem by building a knowledge base about the substance of concern and developing a rational biological argument for why testing for genomic damage is needed or not. With the risk management problem and the potential genomic damage of concern in focus, the next step, assay selection, takes place. The work-up of the problem during the earlier steps provides insight into which assays would most likely produce the most meaningful data. This discussion does not detail the wide range of genomic damage tests available, but points to types of testing systems that can be very useful. Once the assays are performed and analyzed, the relevant data sets are selected for modeling potential risk. From this point on, the data are evaluated and modeled as they are for any other toxicology endpoint. Any observed genomic damage/effects (or genetic events) can be modeled via a dose-response analysis and determination of an estimated PoD.
When a quantitative risk analysis is needed for decision making, a parallel exposure assessment effort is performed (exposure assessment is not detailed here as it is not the focus of this discussion; guidelines for this assessment exist elsewhere). Then the PoD for genomic damage is used with the exposure information to develop risk estimations (e.g., using reference dose (RfD) or margin of exposure (MOE) approaches) in a risk characterization and presented to risk managers to inform decision making. This approach is applicable now for incorporating genomic damage results into the decision-making process for assessing potential adverse outcomes in chemically exposed humans and is consistent with the ILSI HESI Risk Assessment in the 21st Century (RISK21) roadmap. It applies to any substance to which humans are exposed, including pharmaceuticals, agricultural products, food additives, and other chemicals. It is time for regulatory bodies to incorporate the broader knowledge and insights provided by genomic damage results into their assessments of risk, to more fully understand the potential for adverse outcomes in chemically exposed humans and thus improve the assessment of risk due to genomic damage. The historical use of genomic damage data as a yes/no gateway for possible cancer risk has been too narrowly focused in risk assessment. Recent advances in assaying for and understanding genomic damage, including eventually epigenetic alterations, add a greater wealth of information for determining potential risk to humans. Regulatory bodies need to embrace this paradigm shift from hazard identification to quantitative analysis and to incorporate the wider range of genomic damage in their assessments of risk to humans. The quantitative analyses and methodologies discussed here can be readily applied to genomic damage testing results now.
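The two risk characterization metrics named above can be sketched in a few lines; the PoD, factor, and exposure values below are hypothetical placeholders, not figures from the text.

```python
def reference_dose(pod, composite_uf):
    """RfD: point of departure divided by the product of uncertainty factors."""
    return pod / composite_uf

def margin_of_exposure(pod, estimated_exposure):
    """MOE: ratio of the PoD to the estimated human exposure (same units)."""
    return pod / estimated_exposure

# Hypothetical values: PoD of 0.5 mg/kg/day, composite UF of 100,
# and an estimated human exposure of 0.001 mg/kg/day
rfd = reference_dose(0.5, 100.0)
moe = margin_of_exposure(0.5, 0.001)
```

Note the complementary logic: the RfD converts the PoD into an acceptable exposure level, while the MOE compares the PoD against a known or anticipated exposure; both require the parallel exposure assessment the abstract describes.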
Indeed, with the passage of the recent update to the Toxic Substances Control Act (TSCA) in the US, the new generation testing strategy for genomic damage described here provides a regulatory agency (here the US Environmental Protection Agency (EPA), but suitable for others) a golden opportunity to reexamine the way it addresses risk-based genomic damage testing (including hazard identification and exposure). Environ. Mol. Mutagen. 58:264-283, 2017. © 2016 The Authors. Environmental and Molecular Mutagenesis Published by Wiley Periodicals, Inc.
Subjects
Genomics/methods, Mutagenicity Tests/trends, Animals, Environmental Health, Humans, Models, Theoretical, Mutagenicity Tests/standards, Mutagens/toxicity, Risk Assessment
ABSTRACT
Genotoxicity tests have traditionally been used only for hazard identification, with qualitative dichotomous groupings being used to identify compounds that have the capacity to induce mutations and/or cytogenetic alterations. However, there is increasing interest in employing quantitative analysis of in vivo dose-response data to derive point of departure (PoD) metrics that can be used to establish human exposure limits or margins of exposure (MOEs), thereby supporting human health risk assessments and regulatory decisions. This work is an extension of our companion article on in vitro dose-response analyses and outlines how the combined benchmark dose (BMD) approach across included covariates can be used to improve the analysis and interpretation of in vivo genetic toxicity dose-response data. Using the BMD-covariate approach, we show that empirical comparison of micronucleus frequency dose-response data across multiple studies justifies dataset merging, with subsequent analyses improving the precision of BMD estimates and permitting attendant potency ranking of seven clastogens. Similarly, empirical comparison of Pig-a mutant phenotype frequency data collected in males and females justified dataset merging across sex. This permitted more effective scrutiny of the effect of post-exposure sampling time on the mutagenicity of N-ethyl-N-nitrosourea observed in reticulocytes and erythrocytes in the Pig-a assay. The BMD-covariate approach revealed tissue-specific differences in the induction of lacZ transgene mutations in Muta™Mouse specimens exposed to benzo[a]pyrene (BaP), with the results permitting the formulation of mechanistic hypotheses regarding the observed potency ranking. Lastly, we illustrate how historical dose-response data from assessments that examined numerous doses (i.e., induced lacZ mutant frequency (MF) across 10 doses of BaP) can be used to improve the precision of BMDs derived from datasets with far fewer doses (i.e., lacZ MF for 3 doses of dibenz[a,h]anthracene). Collectively, the presented examples illustrate how innovative use of the BMD approach can permit refinement of the use of in vivo data, improving the efficacy of experimental animal use in genetic toxicology without sacrificing PoD precision.
Subjects
DNA Damage, Models, Animal, Mutagenicity Tests/methods, Mutagens/toxicity, Animals, DNA/drug effects, Female, Genetics, Humans, Male, Models, Biological, Mutagens/pharmacology, Mutation, Reticulocytes/drug effects, Toxicology
ABSTRACT
Applied genetic toxicology is undergoing a transition from qualitative hazard identification to quantitative dose-response analysis and risk assessment. To facilitate this change, the Health and Environmental Sciences Institute (HESI) Genetic Toxicology Technical Committee (GTTC) sponsored a workshop held in Lancaster, UK on July 10-11, 2014. The event included invited speakers from several institutions, and the content was divided into three themes: (1) Point-of-Departure Metrics for Quantitative Dose-Response Analysis in Genetic Toxicology; (2) Measurement and Estimation of Exposures for Better Extrapolation to Humans; and (3) The Use of Quantitative Approaches in Genetic Toxicology for Human Health Risk Assessment (HHRA). A host of pertinent issues were discussed relating to the use of in vitro and in vivo dose-response data, the development of methods for in vitro to in vivo extrapolation, and approaches to using in vivo dose-response data to determine human exposure limits for regulatory evaluations and decision-making. This Special Issue, which was inspired by the workshop, contains a series of papers that collectively address topics related to the aforementioned themes. The Issue includes contributions that collectively evaluate, describe, and discuss in silico, in vitro, in vivo, and statistical approaches that are facilitating the shift from qualitative hazard evaluation to quantitative risk assessment. The use and application of the benchmark dose approach was a central theme in many of the workshop presentations and discussions, and the Special Issue includes several contributions that outline novel applications for the analysis and interpretation of genetic toxicity data.
Although the contents of the Special Issue constitute an important step towards the adoption of quantitative methods for regulatory assessment of genetic toxicity, formal acceptance of quantitative methods for HHRA and regulatory decision-making will require consensus regarding the relationships between genetic damage and disease, and the concomitant ability to use genetic toxicity results per se.
Subjects
Mutagenicity Tests, Risk Assessment, Animals, Genetics, Humans, Toxicology
ABSTRACT
This report summarizes the discussion, conclusions, and points of consensus of the IWGT Working Group on Quantitative Approaches to Genetic Toxicology Risk Assessment (QWG), based on a meeting in Foz do Iguaçu, Brazil, October 31-November 2, 2013. Topics addressed included (1) the need for quantitative dose-response analysis, (2) methods to analyze exposure-response relationships and derive point of departure (PoD) metrics, (3) PoD and mechanistic threshold considerations, (4) approaches to define exposure-related risks, (5) empirical relationships between genetic damage (mutation) and cancer, and (6) extrapolations across test systems and species. This report discusses the first three of these topics, and a companion report discusses the latter three. The working group critically examined methods for determining PoD metrics that could be used to estimate low-dose risk of genetic damage and from which extrapolation to acceptable exposure levels could be made using appropriate mode of action information and uncertainty factors. These included benchmark doses (BMDs) derived from fitting families of exponential models, the No Observed Genotoxic Effect Level (NOGEL), and "threshold" or breakpoint dose (BPD) levels derived from bilinear models when mechanistic data supported this approach. The QWG recognizes that scientific evidence suggests that thresholds below which genotoxic effects do not occur likely exist for both DNA-reactive and DNA-nonreactive substances, but notes that small increments above the spontaneous level cannot be unequivocally excluded either by experimental measurement or by mathematical modeling. Therefore, rather than debating the theoretical possibility of such low-dose effects, emphasis should be placed on determination of PoDs from which acceptable exposure levels can be determined by extrapolation using available mechanistic information and appropriate uncertainty factors.
This approach places the focus on minimization of genotoxic risk, which protects against the development of diseases resulting from genetic damage. Based on analysis of the strengths and weaknesses of each method, the QWG concluded that the order of preference of PoD metrics is the statistical lower bound on the BMD > the NOGEL > a statistical lower bound on the BPD. A companion report discusses the use of these metrics in genotoxicity risk assessment, including scaling and uncertainty factors to be considered when extrapolating below the PoD and/or across test systems and to the human.
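The BMD idea behind the preferred metric can be illustrated with a minimal sketch: a one-term exponential model fitted by log-linear least squares, with the BMD read off at a 10% critical effect size. The dose-response data are invented for illustration, and a real analysis would fit a family of models and report a lower confidence bound (BMDL), as the QWG recommends.

```python
import numpy as np

# Hypothetical dose-response data: mutant frequency rising with dose (invented values)
doses = np.array([0.0, 5.0, 10.0, 20.0, 40.0])   # e.g., mg/kg/day
response = np.array([2.0, 2.6, 3.4, 5.8, 16.5])  # e.g., mutant frequency x 1e-5

# Fit f(d) = a * exp(b * d) by regressing log(response) on dose
b_hat, log_a_hat = np.polyfit(doses, np.log(response), 1)

# BMD at a 10% critical effect size (CES): solve a*exp(b*BMD) = a*(1 + CES)
ces = 0.10
bmd = np.log(1.0 + ces) / b_hat
```

Because the CES condition cancels the background parameter `a`, the BMD for this model depends only on the fitted slope `b`, one reason simple exponential families are convenient for genotoxicity dose-response data.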
Subjects
DNA, Models, Genetic, Mutagens/analysis, Mutagens/toxicity, Mutation, Neoplasms, DNA/genetics, DNA/metabolism, Humans, Mutagenicity Tests/methods, Mutagenicity Tests/standards, Neoplasms/chemically induced, Neoplasms/genetics, Neoplasms/metabolism, Neoplasms/pathology, Risk Assessment
ABSTRACT
This is the second of two reports from the International Workshops on Genotoxicity Testing (IWGT) Working Group on Quantitative Approaches to Genetic Toxicology Risk Assessment (the QWG). The first report summarized the discussions and recommendations of the QWG related to the need for quantitative dose-response analysis of genetic toxicology data, the existence and appropriate evaluation of threshold responses, and methods to analyze exposure-response relationships and derive points of departure (PoDs) from which acceptable exposure levels could be determined. This report summarizes the QWG discussions and recommendations regarding appropriate approaches to evaluate exposure-related risks of genotoxic damage, including extrapolation below identified PoDs and across test systems and species. Recommendations include the selection of appropriate genetic endpoints and target tissues, uncertainty factors and extrapolation methods to be considered, the importance and use of information on mode of action, toxicokinetics, metabolism, and exposure biomarkers when using quantitative exposure-response data to determine acceptable exposure levels in human populations or to assess the risk associated with known or anticipated exposures. The empirical relationship between genetic damage (mutation and chromosomal aberration) and cancer in animal models was also examined. It was concluded that there is a general correlation between cancer induction and mutagenic and/or clastogenic damage for agents thought to act via a genotoxic mechanism, but that the correlation is limited due to an inadequate number of cases in which mutation and cancer can be compared at a sufficient number of doses in the same target tissues of the same species and strain exposed under directly comparable routes and experimental protocols.