Results 1 - 13 of 13
1.
Regul Toxicol Pharmacol; 149: 105615, 2024 May.
Article in English | MEDLINE | ID: mdl-38555098

ABSTRACT

RIVM convened a workshop on the use of New Approach Methodologies (NAMs) for the ad hoc human health risk assessment of food and non-food products. Central to the workshop were two case studies of marketed products with a potential health concern: the botanical Tabernanthe iboga, which is used to facilitate mental or spiritual insight or to (illegally) treat drug addiction and is associated with cardiotoxicity, and dermal creams containing female sex hormones, intended for use by perimenopausal women to reduce menopause symptoms without medical supervision. The workshop participants recognized that data from NAM approaches added valuable information for the ad hoc risk assessment of these products, although the available approaches were inadequate for deriving health-based guidance values. Recommendations were provided on how to further enhance and implement NAM approaches in regulatory risk assessment, covering scientific and technical aspects as well as stakeholder engagement.


Subjects
Consumer Product Safety, Humans, Risk Assessment
2.
Front Toxicol; 4: 933197, 2022.
Article in English | MEDLINE | ID: mdl-36199824

ABSTRACT

Next generation risk assessment is defined as a knowledge-driven system that allows for cost-efficient assessment of human health risk related to chemical exposure, without animal experimentation. One of the key features of next generation risk assessment is to facilitate prioritization of chemical substances that need a more extensive toxicological evaluation, in order to address the need to assess an increasing number of substances. In this case study focusing on chemicals in food, we explored how exposure data combined with the Threshold of Toxicological Concern (TTC) concept could be used to prioritize chemicals, both for existing substances and new substances entering the market. Using a database of existing chemicals relevant for dietary exposure we calculated exposure estimates, followed by application of the TTC concept to identify substances of higher concern. Subsequently, a selected set of these priority substances was screened for toxicological potential using high-throughput screening (HTS) approaches. Remarkably, this approach resulted in alerts for a selection of substances that are already on the market and represent relevant exposure in consumers. Taken together, the case study provides proof-of-principle for the approach taken to identify substances of concern, and this approach can therefore be considered a supportive element to a next generation risk assessment strategy.
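As an illustration of the prioritization logic described above, here is a minimal Python sketch. The substances and exposure values are hypothetical, and the thresholds are the commonly cited Cramer-class TTC values in µg/kg bw/day; the actual thresholds and decision rules used in the case study may differ.

```python
# Sketch of TTC-based prioritization: flag substances whose estimated dietary
# exposure exceeds the TTC threshold for their Cramer class. Exposure values
# are hypothetical; thresholds are the commonly cited values in ug/kg bw/day.

TTC_THRESHOLDS = {"I": 30.0, "II": 9.0, "III": 1.5}

def prioritize(substances):
    """Return the substances whose exposure estimate exceeds their TTC value."""
    return [name for name, cramer_class, exposure in substances
            if exposure > TTC_THRESHOLDS[cramer_class]]

# Hypothetical (substance, Cramer class, exposure in ug/kg bw/day) records.
examples = [
    ("substance_A", "III", 2.4),   # above 1.5 -> candidate for HTS follow-up
    ("substance_B", "I", 12.0),    # below 30  -> lower priority
    ("substance_C", "II", 15.0),   # above 9   -> candidate for HTS follow-up
]
print(prioritize(examples))  # -> ['substance_A', 'substance_C']
```

Flagged substances would then move on to the high-throughput screening step described in the abstract.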

3.
Chem Res Toxicol; 34(2): 452-459, 2021 Feb 15.
Article in English | MEDLINE | ID: mdl-33378166

ABSTRACT

Recently, we reported an in vitro toxicogenomics comparison approach to categorize chemical substances according to similarities in their proposed toxicological modes of action. Use of such an approach for regulatory purposes requires, among other things, insight into the extent of biological concordance between in vitro and in vivo findings. To that end, we applied the comparison approach to transcriptomics data from the Open TG-GATEs database for 137 substances with diverging modes of action and evaluated the outcomes obtained for rat primary hepatocytes and for rat liver. The results showed that a relatively small number of matches observed in vitro were also observed in vivo, whereas quite a large number of matches between substances were found to be relevant solely in vivo or in vitro. The latter could not be explained by physicochemical properties leading to insufficient bioavailability or poor water solubility. Nevertheless, pathway analyses indicated that for relevant matches the mechanisms perturbed in vitro are consistent with those perturbed in vivo. These findings support the utility of the comparison approach as a tool in mechanism-based risk assessment.


Subjects
Chemical and Drug Induced Liver Injury/genetics, Hepatocytes/metabolism, Liver/metabolism, Organic Chemicals/toxicity, Animals, Chemical and Drug Induced Liver Injury/metabolism, Databases, Factual, Databases, Genetic, Dose-Response Relationship, Drug, Hepatocytes/drug effects, Liver/drug effects, Organic Chemicals/administration & dosage, Rats, Risk Assessment, Transcriptome
4.
Food Chem Toxicol; 142: 111440, 2020 Aug.
Article in English | MEDLINE | ID: mdl-32473292

ABSTRACT

Physiologically based toxicokinetic (PBTK) models are important tools for in vitro-to-in vivo or inter-species extrapolation in the health risk assessment of foodborne and non-foodborne chemicals. Here we present a generic PBTK model implemented in the EuroMix toolbox MCRA 9 and use it to predict the internal kinetics of nine chemicals (three endocrine disrupters, three liver steatosis inducers, and three developmental toxicants) in data-rich and data-poor conditions, as increasingly complex levels of parametrization are applied. At the first stage, only QSAR models were used to determine substance-specific parameters; then some parameter values were refined with estimates from substance-specific or high-throughput in vitro experiments. At the last stage, elimination or absorption parameters were calibrated against available in vivo kinetic data. The results illustrate that parametrization plays a crucial role in the output of the PBTK model, as it can change how chemicals are prioritized based on internal concentration factors. In data-poor situations, estimates can be far from observed values. In many cases of chronic exposure, the PBTK model can be summarized by an external-to-internal dose factor, and interspecies concentration factors can be used to perform interspecies extrapolation. We finally discuss the implementation and use of the model in the MCRA risk assessment platform.
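The external-to-internal dose factor mentioned above can be illustrated with a one-compartment steady-state sketch: under chronic exposure, a full PBTK model collapses to a single ratio between internal concentration and external dose rate. The parameter names and values below are hypothetical and not taken from the EuroMix/MCRA implementation.

```python
# One-compartment steady-state sketch of the "external to internal dose factor"
# idea. All parameter values are hypothetical illustration values.

def internal_dose_factor(f_abs, clearance_l_per_h):
    """Steady-state concentration per unit chronic dose rate: C_ss / R = f_abs / CL."""
    return f_abs / clearance_l_per_h

def steady_state_conc(dose_rate_ug_per_h, f_abs, clearance_l_per_h):
    """C_ss = f_abs * R / CL (ug/L) for a chronic dose rate R (ug/h)."""
    return dose_rate_ug_per_h * internal_dose_factor(f_abs, clearance_l_per_h)

# Hypothetical chemical: 80% absorbed, clearance 4 L/h, chronic intake 10 ug/h.
print(steady_state_conc(10.0, 0.8, 4.0))  # -> 2.0 (ug/L)
```

Once such a factor is known, external exposure estimates can be converted to internal concentrations without rerunning the full kinetic model, which is the simplification the abstract describes for chronic exposure.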


Subjects
Hazardous Substances/toxicity, Models, Biological, Toxicokinetics, Animals, Humans, Probability, Risk Assessment
6.
Food Chem Toxicol; 138: 111223, 2020 Apr.
Article in English | MEDLINE | ID: mdl-32088251

ABSTRACT

Mixtures of substances to which humans are exposed may lead to cumulative exposure and health effects. To study their effects, it is first necessary to identify a cumulative assessment group (CAG) of substances for risk assessment or hazard testing. Excluding substances from consideration before there is sufficient evidence may underestimate the risk. Conversely, including everything and treating the inevitable uncertainties with conservative assumptions is inefficient and may overestimate the risk, with an unknown level of protection. An efficient, transparent strategy is described to retain a large group, quantifying the uncertainty of group membership and other uncertainties. Iterative refinement of the CAG then focuses on adding information for the substances with a high probability of contributing significantly to the risk. Probabilities can be estimated using expert opinion or derived from data on substance properties. An example is presented with 100 pesticides, in which the retain step identified a single substance to target for refinement. Using an updated hazard characterisation for this substance reduced the mean exposure estimate from 0.43 to 0.28 µg/kg bw per day and the 99.99th percentile exposure from 24.9 to 5.1 µg/kg bw per day. Other retained substances contributed little to the risk estimates, even after accounting for uncertainty.


Subjects
Food Contamination/analysis, Pesticides/analysis, Environmental Exposure, Environmental Monitoring, Humans, Risk Assessment, Uncertainty
7.
Food Chem Toxicol; 138: 111185, 2020 Apr.
Article in English | MEDLINE | ID: mdl-32058012

ABSTRACT

A model and data toolbox is presented to assess risks from combined exposure to multiple chemicals using probabilistic methods. The Monte Carlo Risk Assessment (MCRA) toolbox, also known as the EuroMix toolbox, has more than 40 modules addressing all areas of risk assessment, and includes a data repository with data collected in the EuroMix project. This paper gives an introduction to the toolbox and illustrates its use with examples from the EuroMix project. The toolbox can be used for hazard identification, hazard characterisation, exposure assessment and risk characterisation. Examples for hazard identification are selection of substances relevant for a specific adverse outcome based on adverse outcome pathways and QSAR models. Examples for hazard characterisation are calculation of benchmark doses and relative potency factors with uncertainty from dose-response data, and use of kinetic models to perform in vitro to in vivo extrapolation. Examples for exposure assessment are assessing cumulative exposure at the external or internal level, where the latter option is needed when dietary and non-dietary routes have to be aggregated. Finally, risk characterisation is illustrated by calculation and display of the margin of exposure for single substances and for the cumulative assessment, including uncertainties derived from exposure and hazard characterisation estimates.
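The margin-of-exposure calculation in the last sentence can be sketched as follows. The point-of-departure (POD), exposure, and relative potency factor (RPF) values are hypothetical; RPF-weighted summation to index-compound equivalents is one common way to cumulate mixture exposure, and the MCRA toolbox may implement the details differently.

```python
# Sketch of margin-of-exposure (MoE) risk characterisation: MoE = point of
# departure / exposure, for a single substance and for a cumulative assessment
# in which exposures are scaled to an index compound by relative potency
# factors (RPFs). All numbers are hypothetical.

def margin_of_exposure(pod, exposure):
    """MoE for a single substance (same dose units for POD and exposure)."""
    return pod / exposure

def cumulative_moe(pod_index, exposures_and_rpfs):
    """MoE for a mixture, with exposure expressed in index-compound equivalents."""
    index_equivalents = sum(exposure * rpf for exposure, rpf in exposures_and_rpfs)
    return pod_index / index_equivalents

# Single substance: POD 100, exposure 0.5 (ug/kg bw/day) -> MoE of 200.
print(margin_of_exposure(100.0, 0.5))  # -> 200.0

# Mixture of (exposure, RPF) pairs against an index-compound POD of 100.
print(cumulative_moe(100.0, [(0.5, 1.0), (2.0, 0.1), (1.0, 0.3)]))
```

In a probabilistic setting the same ratio would be evaluated over sampled exposure and POD distributions, yielding an MoE distribution with uncertainty bounds rather than a single number.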


Subjects
Monte Carlo Method, Risk Assessment, Adverse Outcome Pathways, Animals, Benchmarking, Data Analysis, Databases, Factual, Environmental Exposure, Hazardous Substances, Humans, Models, Statistical, No-Observed-Adverse-Effect Level, Quantitative Structure-Activity Relationship, Uncertainty
8.
Chem Res Toxicol; 33(3): 834-848, 2020 Mar 16.
Article in English | MEDLINE | ID: mdl-32041405

ABSTRACT

The ongoing developments in chemical risk assessment have led to new concepts building on integration of sophisticated nonanimal models for hazard characterization. Here we explore a pragmatic approach for implementing such concepts, using a case study of three triazole fungicides, namely, flusilazole, propiconazole, and cyproconazole. The strategy applied starts with evaluating the overall level of concern by comparing exposure estimates to toxicological potential, followed by a combination of in silico tools and literature-derived high-throughput screening assays and computational elaborations to obtain insight into potential toxicological mechanisms and targets in the organism. Additionally, some targeted in vitro tests were evaluated for their utility to confirm suspected mechanisms of toxicity and to generate points of departure. Toxicological mechanisms instead of the current "end point-by-end point" approach should guide the selection of methods and assays that constitute a toolbox for next-generation risk assessment. Comparison of the obtained in silico and in vitro results with data from traditional in vivo testing revealed that, overall, nonanimal methods for hazard identification can produce adequate qualitative hazard information for risk assessment. Follow-up studies are needed to further refine the proposed approach, including the composition of the toolbox, toxicokinetics models, and models for exposure assessment.


Subjects
Fungicides, Industrial/toxicity, High-Throughput Screening Assays, Silanes/toxicity, Toxicity Tests, Triazoles/toxicity, Humans, Molecular Structure, Risk Assessment
9.
Crit Rev Toxicol; 48(6): 500-511, 2018 Jul.
Article in English | MEDLINE | ID: mdl-29745287

ABSTRACT

Non-genotoxic carcinogens (NGTXCs) do not cause direct DNA damage but induce cancer via other mechanisms. In risk assessment of chemicals and pharmaceuticals, carcinogenic risks are determined using carcinogenicity studies in rodents. With the aim to reduce animal testing, REACH legislation states that carcinogenicity studies are only allowed when specific concerns are present; risk assessment of compounds that are potentially carcinogenic by a non-genotoxic mode of action is usually based on subchronic toxicity studies. Health-based guidance values (HBGVs) of NGTXCs may therefore be based on data from carcinogenicity or subchronic toxicity studies, depending on the legal framework that applies. HBGVs are usually derived from No-Observed-Adverse-Effect Levels (NOAELs). Here, we investigate whether current risk assessment of NGTXCs based on NOAELs is protective against cancer. To answer this question, we estimated benchmark doses (BMDs) for carcinogenicity data of 44 known NGTXCs. These BMDs were compared to the NOAELs derived from the same carcinogenicity studies, as well as to the NOAELs derived from the associated subchronic studies. The results lead to two main conclusions. First, a NOAEL derived from a subchronic study is similar to a NOAEL based on cancer effects from a carcinogenicity study, supporting the current practice in REACH. Second, both the subchronic and cancer NOAELs are, on average, associated with a cancer risk of around 1% in rodents. This implies that for those chemicals that are potentially carcinogenic in humans, current risk assessment of NGTXCs may not be completely protective against cancer. Our results call for a broader discussion within the scientific community, followed by discussions among risk assessors, policy makers, and other stakeholders, as to whether or not the potential cancer risk levels that appear to be associated with currently derived HBGVs of NGTXCs are acceptable.
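The roughly 1% residual risk at a NOAEL can be illustrated with a toy dose-response calculation. The exponential model and the doses below are invented for illustration and are not the BMD methodology used in the study.

```python
# Toy illustration of the ~1% finding: anchor a one-parameter exponential
# dose-response model so that the BMD10 gives exactly 10% extra risk, then
# evaluate the extra risk at a NOAEL. Model choice and doses are illustrative.
import math

def extra_risk(dose, bmd10):
    """Extra risk 1 - exp(-b * dose), with b fixed by extra_risk(bmd10) = 0.10."""
    b = -math.log(0.9) / bmd10
    return 1.0 - math.exp(-b * dose)

# A hypothetical NOAEL one order of magnitude below the BMD10 corresponds to
# roughly 1% extra risk, in line with the average reported above.
print(round(extra_risk(10.0, 100.0), 4))  # -> 0.0105
```

The point of the sketch is that a dose with no *observed* adverse effect can still sit on the rising part of the dose-response curve, so a non-zero risk remains.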


Subjects
Carcinogenicity Tests/methods, Carcinogens/toxicity, Neoplasms/chemically induced, Animals, Carcinogenicity Tests/standards, DNA Damage, Female, Humans, Male, No-Observed-Adverse-Effect Level, Risk Assessment/methods, Risk Assessment/standards
10.
Environ Mol Mutagen; 58(5): 264-283, 2017 Jun.
Article in English | MEDLINE | ID: mdl-27650663

ABSTRACT

For several decades, regulatory testing schemes for genetic damage have been standardized where the tests being utilized examined mutations and structural and numerical chromosomal damage. This has served the genetic toxicity community well when most of the substances being tested were amenable to such assays. The outcome from this testing is usually a dichotomous (yes/no) evaluation of test results, and in many instances, the information is only used to determine whether a substance has carcinogenic potential or not. Over the same time period, mechanisms and modes of action (MOAs) that elucidate a wider range of genomic damage involved in many adverse health outcomes have been recognized. In addition, a paradigm shift in applied genetic toxicology is moving the field toward a more quantitative dose-response analysis and point-of-departure (PoD) determination with a focus on risks to exposed humans. This is directing emphasis on genomic damage that is likely to induce changes associated with a variety of adverse health outcomes. This paradigm shift is moving the testing emphasis for genetic damage from a hazard identification only evaluation to a more comprehensive risk assessment approach that provides more insightful information for decision makers regarding the potential risk of genetic damage to exposed humans. To enable this broader context for examining genetic damage, a next generation testing strategy needs to take into account a broader, more flexible approach to testing, and ultimately modeling, of genomic damage as it relates to human exposure. This is consistent with the larger risk assessment context being used in regulatory decision making. As presented here, this flexible approach for examining genomic damage focuses on testing for relevant genomic effects that can be, as best as possible, associated with an adverse health effect. 
The most desired linkage for risk to humans would be changes in loci associated with human diseases, whether in somatic or germ cells. The outline of a flexible approach and associated considerations are presented in a series of nine steps, some of which can occur in parallel, which was developed through a collaborative effort by leading genetic toxicologists from academia, government, and industry through the International Life Sciences Institute (ILSI) Health and Environmental Sciences Institute (HESI) Genetic Toxicology Technical Committee (GTTC). The ultimate goal is to provide quantitative data to model the potential risk levels of substances, which induce genomic damage contributing to human adverse health outcomes. Any good risk assessment begins with asking the appropriate risk management questions in a planning and scoping effort. This step sets up the problem to be addressed (e.g., broadly, does genomic damage need to be addressed, and if so, how to proceed). The next two steps assemble what is known about the problem by building a knowledge base about the substance of concern and developing a rational biological argument for why testing for genomic damage is needed or not. By focusing on the risk management problem and potential genomic damage of concern, the next step of assay(s) selection takes place. The work-up of the problem during the earlier steps provides the insight to which assays would most likely produce the most meaningful data. This discussion does not detail the wide range of genomic damage tests available, but points to types of testing systems that can be very useful. Once the assays are performed and analyzed, the relevant data sets are selected for modeling potential risk. From this point on, the data are evaluated and modeled as they are for any other toxicology endpoint. Any observed genomic damage/effects (or genetic event(s)) can be modeled via a dose-response analysis and determination of an estimated PoD. 
When a quantitative risk analysis is needed for decision making, a parallel exposure assessment effort is performed (exposure assessment is not detailed here as this is not the focus of this discussion; guidelines for this assessment exist elsewhere). Then the PoD for genomic damage is used with the exposure information to develop risk estimations (e.g., using reference dose (RfD), margin of exposure (MOE) approaches) in a risk characterization and presented to risk managers for informing decision making. This approach is applicable now for incorporating genomic damage results into the decision-making process for assessing potential adverse outcomes in chemically exposed humans and is consistent with the ILSI HESI Risk Assessment in the 21st Century (RISK21) roadmap. This applies to any substance to which humans are exposed, including pharmaceuticals, agricultural products, food additives, and other chemicals. It is time for regulatory bodies to incorporate the broader knowledge and insights provided by genomic damage results into the assessments of risk to more fully understand the potential of adverse outcomes in chemically exposed humans, thus improving the assessment of risk due to genomic damage. The historical use of genomic damage data as a yes/no gateway for possible cancer risk has been too narrowly focused in risk assessment. The recent advances in assaying for and understanding genomic damage, including eventually epigenetic alterations, obviously add a greater wealth of information for determining potential risk to humans. Regulatory bodies need to embrace this paradigm shift from hazard identification to quantitative analysis and to incorporate the wider range of genomic damage in their assessments of risk to humans. The quantitative analyses and methodologies discussed here can be readily applied to genomic damage testing results now. 
Indeed, with the passage of the recent update to the Toxic Substances Control Act (TSCA) in the US, the new generation testing strategy for genomic damage described here provides a regulatory agency (here the US Environmental Protection Agency (EPA), but suitable for others) a golden opportunity to reexamine the way it addresses risk-based genomic damage testing (including hazard identification and exposure).


Subjects
Genomics/methods, Mutagenicity Tests/trends, Animals, Environmental Health, Humans, Models, Theoretical, Mutagenicity Tests/standards, Mutagens/toxicity, Risk Assessment
11.
Crit Rev Toxicol; 44(10): 876-94, 2014 Nov.
Article in English | MEDLINE | ID: mdl-25058877

ABSTRACT

Regulatory toxicology urgently needs applicable alternative test systems that reduce animal use, testing time, and cost. European regulation on cosmetic ingredients has already banned animal experimentation for hazard identification, and public awareness is driving toward additional restrictions in other regulatory frameworks as well. In addition, scientific progress stimulates a more mechanistic approach to hazard identification. Nevertheless, the implementation of alternative methods lags far behind their development. In search of general bottlenecks for the implementation of alternative methods, this manuscript reviews the state of the art in the development and implementation of 10 diverse test systems in various areas of toxicological hazard assessment. They vary widely in complexity and regulatory acceptance status. The assays are reviewed with respect to the parameters assessed, the biological system involved, standardization, interpretation of results, extrapolation to human hazard, position in testing strategies, and current regulatory acceptance status. Given the diversity of alternative methods in many respects, no common bottlenecks could be identified that hamper the implementation of individual alternative assays in general. However, specific issues for regulatory acceptance and application were identified for each assay. Acceptance of one-in-one replacement of complex in vivo tests by relatively simple in vitro assays is not feasible. Rather, innovative approaches using test batteries are required, together with metabolic information and in vitro-to-in vivo dose extrapolation, to convincingly provide the same level of information as current in vivo tests. A mechanistically based alternative approach using the Adverse Outcome Pathway concept could stimulate further (regulatory) acceptance of non-animal tests.


Subjects
Animal Testing Alternatives/methods, Hazardous Substances/toxicity, Toxicity Tests/methods, Animals, Disease Models, Animal, Humans, Risk Assessment
12.
Environ Toxicol Chem; 33(2): 293-301, 2014 Feb.
Article in English | MEDLINE | ID: mdl-24122976

ABSTRACT

Comparative toxicity potentials (CTPs) quantify the potential ecotoxicological impacts of chemicals per unit of emission. They are the product of a substance's environmental fate, exposure, and hazardous concentration. When empirical data are lacking, substance properties can be predicted. The goal of the present study was to assess the influence of uncertainty in substance property predictions on the CTPs of triazoles. Physicochemical and toxic properties were predicted with quantitative structure-activity relationships (QSARs), and uncertainty in the predictions was quantified using the data underlying the QSARs. Degradation half-lives were based on a probability distribution representing experimental half-lives of triazoles. Uncertainty related to the species sample size, present in the prediction of the hazardous aquatic concentration, was also included. All parameter uncertainties were treated as probability distributions and propagated by Monte Carlo simulations. The 90% confidence interval of the CTPs typically spanned nearly 4 orders of magnitude. The CTP uncertainty was mainly determined by uncertainty in soil sorption and soil degradation rates, together with the small number of species sampled. In contrast, uncertainty in species-specific toxicity predictions contributed relatively little. The findings imply that the reliability of CTP predictions for the chemicals studied can be improved particularly by including experimental data for soil sorption and soil degradation, and by developing toxicity QSARs for more species.
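The Monte Carlo propagation described above can be sketched as follows. The lognormal parameter distributions and their spreads are invented for illustration; the sketch merely shows how a 90% interval spanning several orders of magnitude arises when multiplicative factors are each uncertain.

```python
# Sketch of Monte Carlo uncertainty propagation for a comparative toxicity
# potential (CTP): CTP ~ fate * exposure / hazardous concentration, each factor
# sampled from an invented lognormal uncertainty distribution.
import math
import random

random.seed(1)

def sample_ctp():
    fate = random.lognormvariate(0.0, 1.0)      # e.g. soil sorption/degradation
    exposure = random.lognormvariate(0.0, 0.5)
    hc = random.lognormvariate(0.0, 1.5)        # hazardous concentration (QSAR)
    return fate * exposure / hc

draws = sorted(sample_ctp() for _ in range(10_000))
lo, hi = draws[len(draws) // 20], draws[-len(draws) // 20]  # ~5th/95th percentiles
print(f"90% interval spans {math.log10(hi / lo):.1f} orders of magnitude")
```

Because the log-variances of independent multiplicative factors add, even moderate spreads in the individual parameters compound into a wide overall CTP interval, which mirrors the sensitivity finding in the abstract.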


Subjects
Models, Theoretical, Quantitative Structure-Activity Relationship, Triazoles/toxicity, Water Pollutants, Chemical/toxicity, Adsorption, Animals, Chlorophyta, Daphnia, Half-Life, Monte Carlo Method, Oncorhynchus mykiss, Reproducibility of Results, Risk Assessment/methods, Sample Size, Soil/chemistry, Triazoles/chemistry, Uncertainty, Water Pollutants, Chemical/chemistry
13.
Toxicol Lett; 198(2): 255-62, 2010 Oct 05.
Article in English | MEDLINE | ID: mdl-20633615

ABSTRACT

REACH requires all available (eco)toxicological information, whether protocol studies, other experiments, or non-testing approaches such as read-across or (Q)SAR, to be collected and evaluated. However, guidance documents provide only limited direction on how the adequacy of (eco)toxicological information can be assessed consistently and transparently. We propose an Integrated Assessment Scheme (IAS) for the evaluation of (eco)toxicological data. The IAS consists of three modules: (i) the reliability of the data; (ii) the validity of the methods from which the data are generated; and (iii) the regulatory need for the data. Each module is assessed and documented using adjusted OECD principles for the validation of (Q)SARs. These adjusted principles provide a harmonised set of criteria for the evaluation of all types of (eco)toxicological data. Assessment codes, similar to Klimisch codes, are assigned to the evaluated information in each module. The coherent combination of the assessment codes of all three modules determines the overall adequacy of the information for fulfilling the information requirement in REACH, and can serve as a weight in a Weight of Evidence procedure as mentioned in REACH Annex XI.
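The combination of module-level assessment codes might look like the following sketch. The abstract does not give the actual IAS combination rules, so the Klimisch-like codes and the weakest-module rule below are invented for illustration only.

```python
# Invented illustration of combining the three module codes into an overall
# adequacy call: each module gets a Klimisch-like code (1 = best, higher =
# weaker), and the weakest module drives the outcome. The actual IAS
# combination rules are not given in the abstract.

def overall_adequacy(reliability, validity, regulatory_need):
    worst = max(reliability, validity, regulatory_need)  # higher code = weaker
    if worst == 1:
        return "adequate as key information"
    if worst == 2:
        return "adequate as supporting information"
    return "inadequate for the REACH information requirement"

print(overall_adequacy(1, 2, 1))  # -> adequate as supporting information
```

A weakest-link rule is only one plausible "coherent combination"; a real scheme could weight the modules differently or treat regulatory need as a gate rather than a score.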


Subjects
Databases, Factual, Ecotoxicology, Hazardous Substances, Animals, Ecotoxicology/legislation & jurisprudence, Ecotoxicology/methods, Ecotoxicology/standards, Endpoint Determination, European Union, Government Regulation, Guidelines as Topic, Hazardous Substances/classification, Hazardous Substances/toxicity, Quantitative Structure-Activity Relationship, Reproducibility of Results, Risk Assessment/legislation & jurisprudence, Risk Assessment/methods, Risk Assessment/standards