1.
Front Toxicol ; 6: 1370045, 2024.
Article in English | MEDLINE | ID: mdl-38646442

ABSTRACT

The ICH S1B carcinogenicity global testing guideline has recently been revised with a novel addendum that describes a comprehensive integrated Weight of Evidence (WoE) approach to determine the need for a 2-year rat carcinogenicity study. In the present work, experts from different organizations have joined efforts to standardize, as much as possible, a procedural framework for integrating the evidence associated with the different ICH S1B(R1) WoE criteria. The framework uses a pragmatic consensus procedure for carcinogenicity hazard assessment to facilitate transparent, consistent, and documented decision-making, and it discusses best practices both for the organization of studies and for the presentation of data in a format suitable for regulatory review. First, it is acknowledged that the six WoE factors described in the addendum form an integrated network of evidence within a holistic assessment framework that is used synergistically to analyze and explain safety signals. Second, the proposed standardized procedure builds upon different considerations related to the primary sources of evidence, mechanistic analysis, alternative methodologies and novel investigative approaches, metabolites, and the reliability of the data and other acquired information. Each of the six WoE factors is described, highlighting how it can contribute evidence to the overall WoE assessment. A suggested reporting format to summarize the cross-integration of evidence from the different WoE factors is also presented. Finally, this work notes that even if a 2-year rat study is ultimately required, creating a WoE assessment is valuable because it characterizes the specific factors and levels of human carcinogenic risk more fully than the 2-year rat bioassay alone.
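The abstract describes a reporting format that cross-integrates evidence from the six WoE factors. A minimal, purely illustrative sketch of how such an assessment record might be structured is given below; the factor labels are loosely paraphrased from the addendum and the field layout and example entries are assumptions for illustration, not the format proposed by the authors.

```python
from dataclasses import dataclass
from typing import List

# Hypothetical labels loosely based on the ICH S1B(R1) WoE factors;
# wording and structure are assumptions for illustration only.
WOE_FACTORS = [
    "Target biology / primary pharmacology",
    "Secondary pharmacology",
    "Histopathology from chronic toxicity studies",
    "Hormonal perturbation",
    "Genotoxicity",
    "Immune modulation",
]

@dataclass
class WoEEntry:
    factor: str
    evidence: str       # short narrative summary of the findings
    direction: str      # "raises concern", "reduces concern", or "neutral"
    reliability: str    # e.g. "high", "medium", "low"

def summarize(entries: List[WoEEntry]) -> str:
    """Render a simple cross-factor summary table for review."""
    lines = [f"{'Factor':45} {'Direction':16} {'Reliability':10}"]
    for e in entries:
        lines.append(f"{e.factor:45} {e.direction:16} {e.reliability:10}")
    return "\n".join(lines)

if __name__ == "__main__":
    example = [
        WoEEntry(WOE_FACTORS[4], "Negative in vitro and in vivo battery", "reduces concern", "high"),
        WoEEntry(WOE_FACTORS[2], "No proliferative findings in 6-month rat study", "reduces concern", "medium"),
    ]
    print(summarize(example))
```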

2.
Toxicol Rep ; 12: 215-223, 2024 Jun.
Article in English | MEDLINE | ID: mdl-38322170

ABSTRACT

N-nitrosamines, a very heterogeneous class of chemicals, may enter humans in small amounts through various sources and are also produced endogenously. Some are known to be mutagenic carcinogens and have recently been detected as impurities in several marketed pharmaceuticals. Despite their known mutagenic properties, the suitability of the bacterial reverse mutation (Ames) assay, and in particular the use of induced rat liver S9, for detecting their mutagenic potential is often debated. It has recently been demonstrated that induced rat liver S9 is capable of metabolizing small alkyl nitrosamines so that they exert their mutagenic potential (Bringezu & Simon, 2022). In this project, the in vitro mutagenic potential of nitrosamines was investigated under different S9 conditions using the preincubation protocol and OECD 471-compliant standard Ames test recommendations. These conditions included various amounts of S9 fraction from hamster and rat, uninduced or induced with Aroclor 1254 or phenobarbital/beta-naphthoflavone (PB/NF). The findings indicated that, in addition to induced S9, uninduced hamster S9 was also effective. Moreover, both rat and hamster S9 fractions exhibited suitable responses in terms of mutation frequencies. Increasing the S9 content did not increase the sensitivity of the Ames test. However, above 20% S9, reduced mutation frequency was observed in the higher concentration range, suggesting cytotoxicity to the bacteria. Thus, limiting the S9 content to 10% provides reliable results and reduces the number of animals required for S9 production, in line with the 3R principles (replace, reduce, refine) for animal testing. In addition, the results show that uninduced and induced hamster S9 are similarly effective, calling into question the need to pretreat animals with enzyme inducers. Further investigations to compare mutagenicity data and rat and hamster S9 proteome analyses are ongoing.
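As a purely illustrative aside, the kind of per-condition analysis described above (fold increase in revertants, with a flag for the drop at high concentrations that suggests bacterial cytotoxicity) could be sketched as follows. The revertant counts, condition names, and the cytotoxicity heuristic are invented for demonstration and are not data or criteria from the study.

```python
# Illustrative sketch: flag S9 conditions where revertant counts rise and then
# fall at the top test-item concentrations, a pattern often read as cytotoxicity.
# All numbers and thresholds are invented for demonstration purposes.

def fold_increase(counts, solvent_control):
    return [c / solvent_control for c in counts]

def looks_cytotoxic(counts, drop_fraction=0.5):
    """Crude heuristic: the top-concentration count falls below half the peak count."""
    peak = max(counts)
    return counts[-1] < drop_fraction * peak

# revertant counts per plate at increasing nitrosamine concentrations
conditions = {
    "10% induced hamster S9": [35, 90, 210, 480, 520],
    "30% induced hamster S9": [35, 80, 190, 260, 110],
}
solvent_control = 30

for name, counts in conditions.items():
    folds = fold_increase(counts, solvent_control)
    print(f"{name}: max fold increase {max(folds):.1f}, "
          f"cytotoxicity suspected: {looks_cytotoxic(counts)}")
```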

3.
Regul Toxicol Pharmacol ; 148: 105583, 2024 Mar.
Article in English | MEDLINE | ID: mdl-38401761

ABSTRACT

The alkaline comet assay is frequently used as an in vivo follow-up test in different regulatory environments to characterize the DNA-damaging potential of test items. The corresponding OECD Test Guideline 489 highlights the importance of statistical analyses and historical control data (HCD) but does not provide detailed procedures. Therefore, the working group "Statistics" of the German-speaking Society for Environmental Mutation Research (GUM) collected HCD from five laboratories and more than 200 comet assay studies and performed several statistical analyses. Key results included that (I) the large inter-laboratory effects observed argue against the use of absolute quality thresholds, (II) more than 50% zero values on a slide are considered problematic because of their influence on slide- or animal-level summary statistics, and (III) the choice of summarizing measure for single-cell data (e.g., median, arithmetic mean, or geometric mean) can lead to large differences in the resulting animal tail intensities and study outcomes in the HCD. These summarizing values increase the reliability of analysis results by better meeting statistical model assumptions, but at the cost of some information loss. Furthermore, based on ratio, difference, and quantile analyses, the separation between negative and positive control groups in the data set was always satisfactory.
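To illustrate point (III), the sketch below shows how the three summarizing measures can diverge on skewed, zero-inflated single-cell tail-intensity data. The numbers are invented, and the geometric mean uses a small offset to cope with zero values, one common but not universal convention; neither reflects the working group's actual data or recommendations.

```python
import math
import statistics

# Invented single-cell % tail intensities for one slide: skewed and zero-inflated,
# as comet assay data often are.
tail_intensities = [0.0, 0.0, 0.0, 0.2, 0.5, 1.1, 2.4, 5.8, 14.0, 41.0]

def geometric_mean(values, offset=0.1):
    """Geometric mean with a small offset so zero values do not collapse the result."""
    logs = [math.log(v + offset) for v in values]
    return math.exp(sum(logs) / len(logs)) - offset

median = statistics.median(tail_intensities)
arith = statistics.mean(tail_intensities)
geom = geometric_mean(tail_intensities)

print(f"median={median:.2f}  arithmetic mean={arith:.2f}  geometric mean={geom:.2f}")
# The three slide-level summaries differ substantially, and those differences
# propagate into animal- and study-level statistics as described in the abstract.
```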


Subjects
DNA Damage, Research Design, Animals, Comet Assay/methods, Reproducibility of Results, Mutation
4.
ALTEX ; 41(2): 282-301, 2024.
Article in English | MEDLINE | ID: mdl-38043132

ABSTRACT

Historical data from control groups in animal toxicity studies are currently used mainly for comparative purposes, to assess the validity and robustness of study results. Because of the highly controlled environment in which the studies are performed and the homogeneity of the animal cohorts, it has been proposed to use these historical data to build so-called virtual control groups, which could partly or entirely replace the concurrent control group. This would constitute a substantial contribution to the reduction of animal use in safety studies. Before the concept can be implemented, the prerequisites regarding data collection, curation, and statistical evaluation, together with a validation strategy, need to be identified to avoid any impairment of the study outcome and subsequent consequences for human risk assessment. To further assess and develop the concept of virtual control groups, the transatlantic think tank for toxicology (t4) sponsored a workshop in Washington in March 2023 with stakeholders from the pharmaceutical and chemical industries, academia, the FDA, contract research organizations (CROs), and non-governmental organizations. This report summarizes the current efforts of a European initiative to collect, share, and curate animal control data in a centralized database, first approaches to identifying optimal matching criteria between virtual controls and the treatment arms of a study, and initial reflections on strategies for a qualification procedure and potential pitfalls of the concept.


Animal safety studies are usually performed with three groups of animals that receive increasing amounts of the test chemical and one control group that does not receive the test chemical. The design of such studies, the characteristics of the animals, and the measured parameters are often very similar from study to study. It has therefore been suggested that measurement data from control groups could be reused across studies to lower the total number of animals per study. This could reduce animal use by up to 25% for such standardized studies. A workshop was held to discuss the pros and cons of this concept and what would have to be done to implement it without threatening the reliability of the study outcome or the resulting human risk assessment.
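The abstract above mentions first approaches to identifying optimal matching criteria between virtual controls and the treatment arms of a study. A minimal sketch of how historical control records might be filtered on such criteria is shown below; the field names and criteria (strain, sex, route, study duration) are assumptions chosen for illustration, not the matching rules proposed by the initiative.

```python
from dataclasses import dataclass
from typing import List

@dataclass
class ControlRecord:
    study_id: str
    strain: str
    sex: str
    route: str
    duration_weeks: int
    body_weight_g: float   # example endpoint; a real record would hold many endpoints

def match_virtual_controls(records: List[ControlRecord], *, strain: str, sex: str,
                           route: str, duration_weeks: int,
                           duration_tolerance: int = 1) -> List[ControlRecord]:
    """Select historical control records compatible with a study's design metadata."""
    return [
        r for r in records
        if r.strain == strain
        and r.sex == sex
        and r.route == route
        and abs(r.duration_weeks - duration_weeks) <= duration_tolerance
    ]

historical = [
    ControlRecord("S-001", "Wistar", "M", "oral gavage", 13, 412.0),
    ControlRecord("S-002", "Wistar", "M", "oral gavage", 4, 310.0),
    ControlRecord("S-003", "Sprague-Dawley", "M", "oral gavage", 13, 455.0),
]

virtual_controls = match_virtual_controls(
    historical, strain="Wistar", sex="M", route="oral gavage", duration_weeks=13)
print([r.study_id for r in virtual_controls])   # -> ['S-001']
```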


Subjects
Research, Animals, Control Groups, Pharmaceutical Preparations
6.
Comput Toxicol ; 21, 2022 Feb.
Article in English | MEDLINE | ID: mdl-35368849

ABSTRACT

Understanding the reliability and relevance of a toxicological assessment is important for gauging overall confidence in it and for communicating the associated degree of uncertainty. The process for assessing reliability and relevance is well defined for experimental data. Similar criteria need to be established for in silico predictions, which are becoming increasingly important for filling data gaps and need to be soundly integrated as additional lines of evidence. In silico assessments could then be communicated with greater confidence and in a more harmonized manner. The current work expands on previous definitions of reliability, relevance, and confidence and establishes a conceptual framework for applying them to in silico data. The approach is used in two case studies: 1) phthalic anhydride, for which experimental data are readily available, and 2) 4-hydroxy-3-propoxybenzaldehyde, a data-poor case that relies predominantly on in silico methods. The case studies show that the reliability, relevance, and confidence of in silico assessments can be communicated effectively within Integrated Approaches to Testing and Assessment (IATA).
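A minimal sketch of how a single in silico line of evidence might be annotated with reliability, relevance, and a derived confidence statement is given below. The scoring scale and the combination rule are invented for illustration and are not the framework proposed in the paper.

```python
from dataclasses import dataclass

@dataclass
class InSilicoEvidence:
    endpoint: str          # e.g. "bacterial mutagenicity"
    method: str            # e.g. "expert rule-based system" or "statistical QSAR"
    prediction: str        # e.g. "negative"
    reliability: int       # 1 (low) .. 3 (high): quality of the prediction itself
    relevance: int         # 1 (low) .. 3 (high): fit of endpoint and domain to the question

    def confidence(self) -> str:
        """Invented combination rule: confidence is limited by the weaker of the two scores."""
        score = min(self.reliability, self.relevance)
        return {1: "low", 2: "moderate", 3: "high"}[score]

evidence = InSilicoEvidence(
    endpoint="bacterial mutagenicity",
    method="statistical QSAR, query substance inside applicability domain",
    prediction="negative",
    reliability=3,
    relevance=2,
)
print(evidence.confidence())   # -> "moderate"
```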

7.
Toxicol Rep ; 9: 250-255, 2022.
Article in English | MEDLINE | ID: mdl-35198408

ABSTRACT

Humans are exposed to low levels of N-nitrosamines from different sources. N-Nitrosamines have recently been detected as impurities in various marketed drugs, and they are known mutagenic carcinogens belonging to the cohort of concern referred to in the ICH M7 guideline. Despite their well-known mutagenic properties, there is ongoing discussion about the suitability of the bacterial reverse mutation assay, and of induced rat liver S9 as the external source of metabolism, for detecting their mutagenic potential. Therefore, we investigated the mutagenic potential of N-nitrosodimethylamine, N-nitrosodiethylamine, N-nitrosodipropylamine, and N-nitrosodibutylamine in vitro under various conditions. Our work showed that the bacterial reverse mutation assay, applying plate incorporation or preincubation protocols and using Salmonella typhimurium strains TA100 and TA1535 and E. coli WP2 uvrA, is suitable for predicting the mutagenicity of N-nitrosamines in the presence of phenobarbital/β-naphthoflavone-induced rat liver S9.

8.
Methods Mol Biol ; 2425: 119-131, 2022.
Article in English | MEDLINE | ID: mdl-35188630

ABSTRACT

The pharmaceutical industry would benefit from collaborating with academic groups on the development of predictive safety models using the newest computational technologies. However, such collaboration is sometimes hampered by the handling of confidential proprietary information and by the different working practices of the two environments. In this manuscript, we propose a strategy for facilitating this collaboration, based on modeling frameworks designed to handle sensitive data and to support the development, interchange, hosting, and production use of predictive models. The strategy is illustrated with a real example in which we used Flame, an open-source modeling framework developed in our group, to develop an in silico eye irritation model. The model was based on bibliographic data, refined during the company-academia collaboration, and enriched by incorporating confidential data, yielding a useful model that was validated experimentally.
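The workflow described, building a classification model from bibliographic data and later rebuilding it once additional confidential records are merged in, can be sketched generically as follows. This sketch uses scikit-learn rather than Flame's own API, and the file names, descriptor columns, and label column are placeholders, so it illustrates the idea rather than the published model.

```python
# Generic sketch of the two-stage workflow: train on public (bibliographic) data,
# then retrain after confidential records are added. scikit-learn stands in for
# the actual modeling framework; file names and features are placeholders.
import pandas as pd
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

def train_eye_irritation_model(csv_path: str) -> RandomForestClassifier:
    data = pd.read_csv(csv_path)                # molecular descriptors + outcome label
    X = data.drop(columns=["irritant"])         # "irritant": hypothetical 0/1 outcome column
    y = data["irritant"]
    model = RandomForestClassifier(n_estimators=200, random_state=0)
    scores = cross_val_score(model, X, y, cv=5, scoring="balanced_accuracy")
    print(f"5-fold balanced accuracy: {scores.mean():.2f}")
    model.fit(X, y)
    return model

# Stage 1: model built from curated bibliographic data only.
public_model = train_eye_irritation_model("bibliographic_eye_irritation.csv")

# Stage 2: rebuilt once confidential records are merged in; the same code can run
# on-premises so the sensitive data never leaves the company.
# refined_model = train_eye_irritation_model("merged_confidential_eye_irritation.csv")
```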


Subjects
Drug Industry, Computer Simulation
9.
Pharmaceuticals (Basel) ; 14(3), 2021 Mar 08.
Article in English | MEDLINE | ID: mdl-33800393

ABSTRACT

eTRANSAFE is a research project funded within the Innovative Medicines Initiative (IMI) that aims to develop integrated databases and computational tools (the eTRANSAFE ToxHub) supporting the translational safety assessment of new drugs, using legacy data provided by the pharmaceutical companies that participate in the project. The project objectives include the development of databases containing preclinical and clinical data; computational systems for translational analysis, including tools for data query, analysis, and visualization; and computational models to explain and predict drug safety events.

10.
Comput Toxicol ; 20, 2021 Nov.
Article in English | MEDLINE | ID: mdl-35368437

ABSTRACT

Historically, identifying carcinogens has relied primarily on tumor studies in rodents, which require enormous resources in both money and time. In silico models have been developed for predicting rodent carcinogens but have not yet found general regulatory acceptance, in part due to the lack of a generally accepted protocol for performing such an assessment as well as limitations in predictive performance and scope. There remains a need for additional, improved in silico carcinogenicity models, especially ones that are more human-relevant, for use in research and regulatory decision-making. As part of an international effort to develop in silico toxicological protocols, a consortium of toxicologists, computational scientists, and regulatory scientists across several industries and governmental agencies evaluated the extent to which in silico models exist for each of the recently defined 10 key characteristics (KCs) of carcinogens. This position paper summarizes the current status of in silico tools for the assessment of each KC and identifies the data gaps that need to be addressed before a comprehensive in silico carcinogenicity protocol can be developed for regulatory use.
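A minimal sketch of the kind of gap analysis the abstract describes, recording for each key characteristic whether in silico models are available, is shown below. The KC labels are paraphrased from the published key characteristics of carcinogens and the coverage values are placeholders, so both should be read as illustrative assumptions rather than the position paper's conclusions.

```python
# Illustrative gap-analysis table: for each key characteristic (KC) of carcinogens,
# record whether in silico models are considered available. Labels are paraphrased
# and coverage values are invented placeholders.
kc_coverage = {
    "KC1  electrophilicity / metabolic activation": "models available",
    "KC2  genotoxicity": "models available",
    "KC3  altered DNA repair / genomic instability": "data gap",
    "KC4  epigenetic alterations": "data gap",
    "KC5  oxidative stress": "partial coverage",
    "KC6  chronic inflammation": "data gap",
    "KC7  immunosuppression": "data gap",
    "KC8  receptor-mediated effects": "partial coverage",
    "KC9  immortalization": "data gap",
    "KC10 altered proliferation, cell death, or nutrient supply": "partial coverage",
}

gaps = [kc for kc, status in kc_coverage.items() if status == "data gap"]
print(f"{len(gaps)} of {len(kc_coverage)} KCs currently lack in silico coverage:")
for kc in gaps:
    print(" -", kc)
```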
