1.
Regul Toxicol Pharmacol ; : 105645, 2024 May 16.
Article in English | MEDLINE | ID: mdl-38761967

ABSTRACT

ICH Q3A/B guidelines provide qualification thresholds for impurities or degradation products in new drug substances and products. However, the guidelines note that certain impurities/degradation products may warrant further safety evaluation because they are unusually potent or toxic. The purpose of this study was to confirm that especially toxic non-mutagenic compounds are rare and to identify classes of compounds that could warrant lower qualification thresholds. A total of 2,815 compounds were evaluated, of which 2,213 were assessed as non-mutagenic. For the purpose of this analysis, compounds were considered potent when the point of departure was ≤ 0.2 mg/kg/day, based on the qualification threshold for a new drug substance (1 mg/day, or 0.02 mg/kg/day for a 50 kg human) with an additional 10-fold margin. Only 54 of the entire set (2.4%) would be considered potent based on this conservative potency analysis, confirming that the existing ICH Q3A/B qualification thresholds are appropriate for the majority of impurities. If the Q3A/B threshold without the additional 10-fold margin is used, 14 compounds (0.6%) are considered "highly potent". Only a few non-mutagenic structural classes that correlate with potentially high potency were identified, including organothiophosphates and their derivatives, polychlorinated benzenes, and polychlorinated polycyclic aliphatics, consistent with prior publications.
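
To make the threshold arithmetic explicit, here is a minimal sketch of the potency-flagging logic described above; the classify helper and the example point-of-departure (PoD) values are illustrative assumptions, not data from the study.

```python
# Sketch of the qualification-threshold arithmetic; example PoDs are hypothetical.
QUALIFICATION_THRESHOLD_MG_DAY = 1.0   # ICH Q3A qualification threshold (mg/day)
BODY_WEIGHT_KG = 50.0                  # adult body weight assumed in the abstract
MARGIN = 10.0                          # additional 10-fold margin used in the analysis

threshold = QUALIFICATION_THRESHOLD_MG_DAY / BODY_WEIGHT_KG   # 0.02 mg/kg/day
potency_cutoff = threshold * MARGIN                           # 0.2 mg/kg/day

def classify(pod_mg_kg_day: float) -> str:
    """Flag a compound by its point of departure (PoD, mg/kg/day)."""
    if pod_mg_kg_day <= threshold:
        return "highly potent"   # at or below the unadjusted Q3A/B threshold
    if pod_mg_kg_day <= potency_cutoff:
        return "potent"          # at or below the 10-fold-margin cutoff
    return "covered by the existing Q3A/B thresholds"

for pod in (0.005, 0.1, 5.0):    # hypothetical PoD values
    print(pod, "->", classify(pod))
```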

2.
Regul Toxicol Pharmacol ; 150: 105632, 2024 Apr 27.
Article in English | MEDLINE | ID: mdl-38679316

ABSTRACT

The replacement of a proportion of concurrent controls by virtual controls in nonclinical safety studies has gained traction over the last few years. This is supported by foundational work, encouraged by regulators, and aligned with societal expectations regarding the use of animals in research. This paper provides an overview of the points to consider for any institution on the verge of implementing this concept, with emphasis on database creation, risks, and discipline-specific perspectives.

3.
ALTEX ; 41(2): 282-301, 2024.
Article in English | MEDLINE | ID: mdl-38043132

ABSTRACT

Historical data from control groups in animal toxicity studies are currently used mainly for comparative purposes, to assess the validity and robustness of study results. Because of the highly controlled environment in which the studies are performed and the homogeneity of the animal collectives, it has been proposed to use the historical data to build so-called virtual control groups, which could partly or entirely replace the concurrent controls. This would constitute a substantial contribution to the reduction of animal use in safety studies. Before the concept can be implemented, the prerequisites regarding data collection, curation, and statistical evaluation, together with a validation strategy, need to be identified to avoid any impairment of the study outcome and subsequent consequences for human risk assessment. To further assess and develop the concept of virtual control groups, the transatlantic think tank for toxicology (t4) sponsored a workshop in Washington in March 2023 with stakeholders from the pharmaceutical and chemical industries, academia, the FDA, contract research organizations (CROs), and non-governmental organizations. This report summarizes the current efforts of a European initiative to share, collect, and curate animal control data in a centralized database, first approaches to identify optimal matching criteria between virtual controls and the treatment arms of a study, and initial reflections on strategies for a qualification procedure and potential pitfalls of the concept.


Animal safety studies are usually performed with three groups of animals that receive increasing amounts of the test chemical and one control group that does not receive the test chemical. The design of such studies, the characteristics of the animals, and the measured parameters are often very similar from study to study. It has therefore been suggested that measurement data from the control groups could be reused from study to study to lower the total number of animals per study. This could reduce animal use by up to 25% for such standardized studies. A workshop was held to discuss the pros and cons of this concept and what would have to be done to implement it without threatening the reliability of the study outcome or the resulting human risk assessment.
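
A minimal sketch of the underlying arithmetic and of a toy "virtual control" selection is given below. The matching criteria (strain, sex, route, study duration) and all record values are illustrative assumptions, not the workshop's matching scheme.

```python
# Sketch only: study-design figures and matching criteria are assumptions.
n_dose_groups, n_control_groups, animals_per_group = 3, 1, 10
total = (n_dose_groups + n_control_groups) * animals_per_group
saving = n_control_groups * animals_per_group / total
print(f"Replacing the concurrent control saves up to {saving:.0%} of animals per study")

# Toy virtual-control selection: keep historical control records whose study
# conditions match the planned study.
historical_controls = [
    {"strain": "Wistar", "sex": "M", "route": "oral", "weeks": 4,  "alt_iu_l": 31},
    {"strain": "Wistar", "sex": "M", "route": "oral", "weeks": 4,  "alt_iu_l": 28},
    {"strain": "SD",     "sex": "F", "route": "iv",   "weeks": 13, "alt_iu_l": 35},
]
planned_study = {"strain": "Wistar", "sex": "M", "route": "oral", "weeks": 4}

virtual_controls = [rec for rec in historical_controls
                    if all(rec[k] == v for k, v in planned_study.items())]
print("matched virtual-control records:", len(virtual_controls))
```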


Subjects
Research, Animals, Control Groups, Pharmaceutical Preparations
4.
Toxicol Ind Health ; 39(12): 687-699, 2023 Dec.
Article in English | MEDLINE | ID: mdl-37860984

ABSTRACT

Acute oral toxicity (AOT) data inform the acute toxicity potential of a compound and guide occupational safety and transportation practices. AOT data enable the categorization of a chemical into the appropriate AOT Globally Harmonized System (GHS) category based on the severity of the hazard. AOT data are also used to identify compounds that are Dangerous Goods (DGs) and to determine the transportation requirements for shipping these hazardous materials. Proper identification of DGs is challenging for novel compounds that lack data, and it is not feasible to err on the side of caution and designate all compounds lacking AOT data as DGs, since shipping a compound as a DG has cost, resource, and time implications. With the wealth of available historical AOT data, AOT testing approaches are evolving, and in silico AOT models are emerging as tools that can be used with confidence to assess the acute toxicity potential of de novo molecules. Such approaches align with the 3R principles, offering a reduction or even replacement of traditional in vivo testing methods, and can also be leveraged for product stewardship purposes. Using proprietary historical in vivo AOT data for 210 pharmaceutical compounds (PCs), we evaluated the performance of two established in silico AOT programs: the Leadscope AOT Model Suite and the Collaborative Acute Toxicity Modeling Suite. These models accurately identified 94% and 97% of compounds that were not DGs (GHS categories 4, 5, and not classified (NC)), suggesting that the models are fit-for-purpose for identifying PCs with low acute oral toxicity potential (LD50 > 300 mg/kg). Using these models to identify compounds that are not DGs allows such compounds to be de-prioritized for in vivo testing. This manuscript provides a detailed evaluation and assessment of the two models and recommends the most suitable applications of such models.
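
As a point of reference for the LD50 > 300 mg/kg cutoff mentioned above, the sketch below bands an LD50 into the standard GHS acute-oral-toxicity categories and flags categories 1-3 as Dangerous Goods; the example LD50 values are hypothetical.

```python
# Sketch: band an LD50 into GHS acute-oral-toxicity categories and flag DGs.
GHS_BANDS = [(5, "Category 1"), (50, "Category 2"), (300, "Category 3"),
             (2000, "Category 4"), (5000, "Category 5")]

def ghs_category(ld50_mg_kg: float) -> str:
    for upper_bound, label in GHS_BANDS:
        if ld50_mg_kg <= upper_bound:
            return label
    return "Not classified"

def is_dangerous_good(ld50_mg_kg: float) -> bool:
    # Mirrors the abstract's "not DG" cutoff of LD50 > 300 mg/kg (categories 4, 5, NC).
    return ld50_mg_kg <= 300

for ld50 in (25, 450, 6000):   # hypothetical predicted LD50 values (mg/kg)
    flag = "DG" if is_dangerous_good(ld50) else "not DG"
    print(f"LD50 {ld50} mg/kg -> {ghs_category(ld50)} ({flag})")
```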


Subjects
Hazardous Substances, Acute Toxicity Tests/methods, Hazardous Substances/toxicity, Computer Simulation
5.
Comput Toxicol ; 21, 2022 Feb.
Article in English | MEDLINE | ID: mdl-35368849

ABSTRACT

Understanding the reliability and relevance of a toxicological assessment is important for gauging overall confidence and communicating the degree of uncertainty related to it. The process for assessing reliability and relevance is well defined for experimental data. Similar criteria need to be established for in silico predictions, as they become increasingly important for filling data gaps and need to be reasonably integrated as additional lines of evidence. In silico assessments could then be communicated with greater confidence and in a more harmonized manner. The current work expands on previous definitions of reliability, relevance, and confidence and establishes a conceptual framework for applying them to in silico data. The approach is used in two case studies: (1) phthalic anhydride, where experimental data are readily available, and (2) 4-hydroxy-3-propoxybenzaldehyde, a data-poor case that relies predominantly on in silico methods, showing that the reliability, relevance, and confidence of in silico assessments can be effectively communicated within integrated approaches to testing and assessment (IATA).

6.
Int J Mol Sci ; 24(1), 2022 Dec 30.
Article in English | MEDLINE | ID: mdl-36614078

ABSTRACT

Due to challenges with historical data and the diversity of assay formats, in silico models for safety-related endpoints are often based on discretized data instead of data on a natural continuous scale. Models for discretized endpoints have limitations in usage and interpretation that can impact compound design. Here, we present a consistent data-inference approach, exemplified on two data sets of human Ether-à-go-go-Related Gene (hERG) K+ channel inhibition data, for dose-response and screening experiments that is generally applicable to in vitro assays. hERG inhibition has been associated with severe cardiac effects and is one of the more prominent safety targets assessed in drug development, using a wide array of in vitro and in silico screening methods. In this study, the IC50 for hERG inhibition is estimated from diverse historical proprietary data. The IC50 derived from a two-point proprietary screening data set demonstrated high correlation (R = 0.98, MAE = 0.08) with IC50s derived from six-point dose-response curves. Similar IC50 estimation accuracy was obtained on a public thallium flux assay data set (R = 0.90, MAE = 0.2). The IC50 data were used to develop a robust quantitative model. The model's MAE (0.47) and R2 (0.46) were on par with literature statistics and approached assay reproducibility. Using a continuous model has high value for pharmaceutical projects, as it enables rank ordering of compounds and evaluation of compounds against project-specific inhibition thresholds. This data-inference approach can be applied widely to assays with quantitative readouts and has the potential to impact experimental design and improve model performance, interpretation, and acceptance across many standard safety endpoints.
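
The abstract does not give the inference equations, but one simple way to back out an IC50 (and Hill slope) from a two-point screen is to invert a standard Hill concentration-response model; the sketch below does exactly that under the assumption of 0% and 100% asymptotes, with hypothetical inhibition values.

```python
import math

def ic50_from_two_points(c1, f1, c2, f2):
    """Estimate IC50 and Hill slope from fractional inhibition (0 < f < 1)
    measured at two concentrations, assuming a Hill curve running from
    0% to 100% inhibition."""
    logit = lambda f: math.log(f / (1.0 - f))
    hill = (logit(f2) - logit(f1)) / (math.log(c2) - math.log(c1))
    log_ic50 = math.log(c1) - logit(f1) / hill
    return math.exp(log_ic50), hill

# Hypothetical two-point screen: 32% inhibition at 1 uM and 78% at 10 uM
ic50, n = ic50_from_two_points(1.0, 0.32, 10.0, 0.78)
print(f"estimated IC50 = {ic50:.2f} uM, Hill slope = {n:.2f}")
```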


Subjects
Ether-A-Go-Go Potassium Channels, Potassium Channel Blockers, Ether-A-Go-Go Potassium Channels/genetics, Reproducibility of Results, Computer Simulation, Potassium Channel Blockers/pharmacology
7.
Comput Toxicol ; 24, 2022 Nov.
Article in English | MEDLINE | ID: mdl-36818760

ABSTRACT

Acute toxicity in silico models are being used to support an increasing number of application areas, including (1) product research and development, (2) product approval and registration, and (3) the transport, storage, and handling of chemicals. The adoption of such models is being hindered, in part, by a lack of guidance describing how to perform and document an in silico analysis. To address this issue, a framework for an acute toxicity hazard assessment is proposed. This framework combines results from different sources, including in silico methods and in vitro or in vivo experiments. In silico methods that can assist the prediction of in vivo outcomes (i.e., LD50) are analyzed, concluding that predictions obtained using in silico approaches are now well suited to reliably support assessment of LD50-based acute toxicity for the purpose of GHS classification. A general overview is provided of the endpoints from in vitro studies commonly evaluated for predicting acute toxicity (e.g., cytotoxicity/cytolethality as well as assays targeting specific mechanisms). The increased understanding of the pathways and key triggering mechanisms underlying toxicity and the increased availability of in vitro data allow a shift away from assessments based solely on endpoints such as LD50 toward mechanism-based endpoints that can be accurately assessed in vitro or by using in silico prediction models. This paper also highlights the importance of an expert review of all available information using weight-of-evidence considerations and illustrates, using a series of diverse practical use cases, how in silico approaches support the assessment of acute toxicity.

8.
Comput Toxicol ; 20, 2021 Nov.
Article in English | MEDLINE | ID: mdl-35340402

ABSTRACT

Hepatotoxicity is one of the most frequently observed adverse effects resulting from exposure to a xenobiotic. For example, in pharmaceutical research and development it is one of the major reasons for drug withdrawals, clinical failures, and discontinuation of drug candidates. The development of faster and cheaper methods to assess hepatotoxicity that are both more sustainable and more informative is critically needed. The biological mechanisms and processes underpinning hepatotoxicity are summarized and experimental approaches to support the prediction of hepatotoxicity are described, including toxicokinetic considerations. The paper describes the increasingly important role of in silico approaches and highlights challenges to the adoption of these methods including the lack of a commonly agreed upon protocol for performing such an assessment and the need for in silico solutions that take dose into consideration. A proposed framework for the integration of in silico and experimental information is provided along with a case study describing how computational methods have been used to successfully respond to a regulatory question concerning non-genotoxic impurities in chemically synthesized pharmaceuticals.

9.
Comput Toxicol ; 20, 2021 Nov.
Article in English | MEDLINE | ID: mdl-35721273

ABSTRACT

The kidneys, heart, and lungs are vital organ systems evaluated as part of acute or chronic toxicity assessments, and new methodologies are being developed to predict these adverse effects based on in vitro and in silico approaches. This paper reviews the current state of the art in predicting these organ toxicities. It outlines the biological basis, processes, and endpoints for kidney toxicity, pulmonary toxicity, respiratory irritation and sensitization, and functional and structural cardiac toxicities. The review also covers current experimental approaches, including off-target panels from secondary pharmacology batteries. Current in silico approaches for predicting these effects and mechanisms are described, as are obstacles to the use of in silico methods. Ultimately, a commonly accepted protocol for performing such assessments would be a valuable resource for expanding the use of these approaches across different regulatory and industrial applications. However, a number of factors impede their widespread deployment: the lack of a comprehensive mechanistic understanding; limited in vitro testing approaches and in vivo databases suitable for modeling; a limited understanding of how to incorporate absorption, distribution, metabolism, and excretion (ADME) considerations into the overall process; the lack of in silico models designed to predict a safe dose; and the absence of an accepted framework for organizing the key characteristics of these organ toxicants.

10.
Comput Toxicol ; 20, 2021 Nov.
Article in English | MEDLINE | ID: mdl-35368437

ABSTRACT

Historically, identifying carcinogens has relied primarily on tumor studies in rodents, which require enormous resources in both money and time. In silico models have been developed for predicting rodent carcinogens but have not yet found general regulatory acceptance, in part due to the lack of a generally accepted protocol for performing such an assessment as well as limitations in predictive performance and scope. There remains a need for additional, improved in silico carcinogenicity models, especially ones that are more human-relevant, for use in research and regulatory decision-making. As part of an international effort to develop in silico toxicological protocols, a consortium of toxicologists, computational scientists, and regulatory scientists across several industries and governmental agencies evaluated the extent to which in silico models exist for each of the recently defined 10 key characteristics (KCs) of carcinogens. This position paper summarizes the current status of in silico tools for the assessment of each KC and identifies the data gaps that need to be addressed before a comprehensive in silico carcinogenicity protocol can be developed for regulatory use.

11.
Regul Toxicol Pharmacol ; 116: 104688, 2020 Oct.
Article in English | MEDLINE | ID: mdl-32621976

ABSTRACT

The assessment of skin sensitization has evolved over the past few years to include in vitro assessments of key events along the adverse outcome pathway and to opportunistically capitalize on the strengths of in silico methods to support a weight-of-evidence assessment without conducting a test in animals. While in silico methods vary greatly in their purpose and format, there is a need to standardize the underlying principles on which such models are developed and to make transparent the implications for the uncertainty in the overall assessment. In this contribution, the relationships between skin sensitization-relevant effects, mechanisms, and endpoints are built into a hazard assessment framework. Based on the relevance of the mechanisms and effects, as well as the strengths and limitations of the experimental systems used to identify them, rules and principles are defined for deriving skin sensitization in silico assessments. Further, the assignment of reliability and confidence scores that reflect the overall strength of the assessment is discussed. This skin sensitization protocol supports the implementation and acceptance of in silico approaches for the prediction of skin sensitization.


Subjects
Allergens/toxicity, Haptens/toxicity, Risk Assessment/methods, Animal Testing Alternatives, Animals, Computer Simulation, Dendritic Cells/drug effects, Contact Dermatitis/etiology, Humans, Keratinocytes/drug effects, Lymphocytes/drug effects
12.
Regul Toxicol Pharmacol ; 107: 104403, 2019 Oct.
Article in English | MEDLINE | ID: mdl-31195068

ABSTRACT

In silico toxicology (IST) approaches are used to rapidly assess chemical hazard, and the use of such methods is increasing in all applications, especially for regulatory submissions such as the assessment of chemicals under REACH and of drug impurities under the ICH M7 guideline. There are a number of obstacles to performing an IST assessment, including uncertainty in how such an assessment and the associated expert review should be performed or what is fit for purpose, as well as a lack of confidence that the results will be accepted by colleagues, collaborators, and regulatory authorities. To address this, a project to develop a series of IST protocols for different hazard endpoints has been initiated, and this paper describes the genetic toxicity in silico (GIST) protocol. The protocol outlines a hazard assessment framework including key effects/mechanisms and their relationships to endpoints such as gene mutation and clastogenicity. IST models and data that support the assessment of these effects/mechanisms are reviewed, along with defined approaches for combining the information and evaluating confidence in the assessment. This protocol has been developed through a consortium of toxicologists, computational scientists, and regulatory scientists across several industries to support the implementation and acceptance of in silico approaches.


Subjects
Theoretical Models, Mutagens/toxicity, Research Design, Toxicology/methods, Animals, Computer Simulation, Humans, Mutagenicity Tests, Risk Assessment
13.
Mutagenesis ; 34(1): 67-82, 2019 03 06.
Article in English | MEDLINE | ID: mdl-30189015

ABSTRACT

(Quantitative) structure-activity relationship, or (Q)SAR, predictions of DNA-reactive mutagenicity are important to support both the design of new chemicals and the assessment of impurities, degradants, metabolites, extractables and leachables, as well as existing chemicals. Aromatic N-oxides represent a class of compounds that are often considered alerting for mutagenicity, yet the scientific rationale for this structural alert is not clear and has been questioned. Because aromatic N-oxide-containing compounds may be encountered as impurities, degradants, and metabolites, it is important to accurately predict the mutagenicity of this chemical class. This article analysed a series of publicly available aromatic N-oxide data in search of supporting information. It also used a previously developed structure-activity relationship (SAR) fingerprint methodology in which a series of aromatic N-oxide substructures was generated and matched against public and proprietary databases, including pharmaceutical data. An assessment of the number of mutagenic and non-mutagenic compounds matching each substructure across all sources was used to understand whether the general class, or any specific subclasses, appears to lead to mutagenicity. This analysis resulted in a downgrade of the general aromatic N-oxide alert. However, it was determined that there were enough public and proprietary data to assign quindioxin and related chemicals, as well as benzo[c][1,2,5]oxadiazole 1-oxides, as alerting subclasses. The overall results of this analysis were incorporated into Leadscope's expert rule-based model to enhance its predictive accuracy.
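
A minimal sketch of the match-and-count step behind such a SAR-fingerprint analysis is shown below, using RDKit substructure matching. The SMILES, SMARTS patterns, and mutagenicity labels are illustrative assumptions, not the datasets analysed in the paper.

```python
# Match each candidate substructure against a labelled dataset and tally
# mutagenic vs non-mutagenic hits.
from rdkit import Chem

dataset = [                                   # (SMILES, Ames label: 1 = mutagenic)
    ("O=[N+]([O-])c1ccccc1", 1),              # hypothetical label
    ("[O-][n+]1ccccc1", 0),                   # hypothetical label
    ("c1ccncc1", 0),                          # hypothetical label
]
substructures = {
    "aromatic N-oxide": Chem.MolFromSmarts("[n+][O-]"),
    "aromatic nitro":   Chem.MolFromSmarts("[c][N+](=O)[O-]"),
}

for name, query in substructures.items():
    mutagenic = non_mutagenic = 0
    for smiles, label in dataset:
        mol = Chem.MolFromSmiles(smiles)
        if mol is not None and mol.HasSubstructMatch(query):
            mutagenic += label
            non_mutagenic += 1 - label
    print(f"{name}: {mutagenic} mutagenic vs {non_mutagenic} non-mutagenic matches")
```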


Subjects
Cyclic N-Oxides/chemistry, DNA Damage/drug effects, Mutagens/chemistry, Quantitative Structure-Activity Relationship, Cyclic N-Oxides/toxicity, Mutagenesis/drug effects, Mutagenicity Tests, Mutagens/toxicity
14.
Regul Toxicol Pharmacol ; 102: 53-64, 2019 Mar.
Article in English | MEDLINE | ID: mdl-30562600

ABSTRACT

The International Council for Harmonization (ICH) M7 guideline describes a hazard assessment process for impurities that have the potential to be present in a drug substance or drug product. In the absence of adequate experimental bacterial mutagenicity data, (Q)SAR analysis may be used as a test to predict an impurity's DNA-reactive (mutagenic) potential. However, in certain situations, (Q)SAR software is unable to generate a positive or negative prediction, either because of conflicting information or because the impurity is outside the applicability domain of the model. Such results present challenges in generating an overall mutagenicity prediction and highlight the importance of performing a thorough expert review. The following paper reviews pharmaceutical industry and regulatory experience in handling such situations. The paper also presents an analysis of proprietary data to help understand the likelihood of misclassifying a mutagenic impurity as non-mutagenic based on different combinations of (Q)SAR results. This information may be taken into consideration when supporting the (Q)SAR results with an expert review, especially when out-of-domain results are generated during a (Q)SAR evaluation.
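
As a purely illustrative sketch of how two (Q)SAR calls (for example, an expert rule-based and a statistical system) might be rolled up into an overall flag for expert review, consider the helper below; the handling of out-of-domain results and the wording of the outcomes are simplifying assumptions, not the decision scheme of this paper or of ICH M7.

```python
# Illustrative roll-up of two (Q)SAR calls; not the paper's or ICH M7's scheme.

def overall_call(expert_rule: str, statistical: str) -> str:
    """Each input is 'positive', 'negative', or 'out_of_domain'."""
    calls = {expert_rule, statistical}
    if "positive" in calls:
        return "potentially mutagenic - expert review, consider Ames testing"
    if calls == {"negative"}:
        return "predicted non-mutagenic - document with supporting expert review"
    # no positive call, but at least one system gave no prediction
    return "inconclusive - expert review required (out-of-domain result)"

for pair in [("negative", "negative"), ("negative", "out_of_domain"),
             ("positive", "out_of_domain")]:
    print(pair, "->", overall_call(*pair))
```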


Subjects
Drug Contamination, Guidelines as Topic, Mutagens/classification, Quantitative Structure-Activity Relationship, Drug Industry, Government Agencies, Mutagens/toxicity, Risk Assessment
15.
Regul Toxicol Pharmacol ; 96: 1-17, 2018 Jul.
Article in English | MEDLINE | ID: mdl-29678766

ABSTRACT

The present publication surveys several applications of in silico (i.e., computational) toxicology approaches across different industries and institutions. It highlights the need to develop standardized protocols when conducting toxicity-related predictions. This contribution articulates the information needed for protocols to support in silico predictions for major toxicological endpoints of concern (e.g., genetic toxicity, carcinogenicity, acute toxicity, reproductive toxicity, developmental toxicity) across several industries and regulatory bodies. Such novel in silico toxicology (IST) protocols, when fully developed and implemented, will ensure that in silico toxicological assessments are performed and evaluated in a consistent, reproducible, and well-documented manner across industries and regulatory bodies, supporting wider uptake and acceptance of the approaches. The development of IST protocols is an initiative of an international consortium and reflects the state of the art in in silico toxicology for hazard identification and characterization. A general outline for describing the development of such protocols is included, based on in silico predictions and/or available experimental data for a defined series of relevant toxicological effects or mechanisms. The publication presents a novel approach for determining the reliability of in silico predictions alongside experimental data. In addition, we discuss how to determine the level of confidence in the assessment based on the relevance and reliability of the information.


Subjects
Computer Simulation, Toxicity Tests/methods, Toxicology/methods, Animals, Humans
16.
Toxicol Sci ; 162(1): 287-300, 2018 03 01.
Article in English | MEDLINE | ID: mdl-29155963

ABSTRACT

Over the past decades, pharmaceutical companies have conducted a large number of high-quality in vivo repeat-dose toxicity (RDT) studies for regulatory purposes. As part of the eTOX project, many of these studies have been compiled and integrated into a database. This valuable resource can be queried directly, but it can be further exploited to build predictive models. As the studies were originally conducted to investigate the properties of individual compounds, the experimental conditions across the studies are highly heterogeneous. Consequently, the original data required normalization/standardization, filtering, categorization, and integration to make any data analysis (such as building predictive models) possible. Additionally, the primary objective of the RDT studies was to identify toxicological findings, most of which do not translate directly into in vivo endpoints. This article describes a method to extract datasets containing comparable toxicological properties for a series of compounds, amenable to building predictive models. The proposed strategy starts with the normalization of the terms used within the original reports. Then, comparable datasets are extracted from the database by applying filters based on the experimental conditions. Finally, carefully selected profiles of toxicological findings are mapped to endpoints of interest, generating QSAR-like tables. In this work, we describe in detail the strategy and tools used for carrying out these transformations and illustrate their application to a data sample extracted from the eTOX database. The suitability of the resulting tables for developing hazard-predicting models was investigated by building proof-of-concept models for in vivo liver endpoints.
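
A toy illustration of the final mapping step, pivoting normalized findings into a compound-by-endpoint, QSAR-like table, is sketched below; the column names, finding terms, and mapping rule are assumptions for illustration, not the eTOX ontology.

```python
import pandas as pd

# Normalized findings (terms and columns are assumptions for the sketch)
findings = pd.DataFrame([
    {"compound": "CMP-1", "dose_mg_kg": 100, "organ": "liver",  "finding": "hepatocellular hypertrophy"},
    {"compound": "CMP-1", "dose_mg_kg": 300, "organ": "liver",  "finding": "single cell necrosis"},
    {"compound": "CMP-2", "dose_mg_kg": 300, "organ": "kidney", "finding": "tubular basophilia"},
])

# Map selected finding terms to a composite "liver injury" endpoint
liver_injury_terms = {"single cell necrosis", "hepatocellular necrosis", "bile duct hyperplasia"}
findings["liver_injury"] = findings["finding"].isin(liver_injury_terms).astype(int)

# QSAR-like table: one row per compound, 1 if any qualifying finding was seen
qsar_table = findings.groupby("compound", as_index=False)["liver_injury"].max()
print(qsar_table)
```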


Subjects
Factual Databases, Preclinical Drug Evaluation/methods, Drug-Related Side Effects and Adverse Reactions, Endpoint Determination, Theoretical Models, Toxicity Tests/methods, Data Mining, Preclinical Drug Evaluation/standards, Preclinical Drug Evaluation/statistics & numerical data, Forecasting, Information Dissemination, Risk Assessment, Toxicity Tests/standards, Toxicity Tests/statistics & numerical data
17.
J Chem Inf Model ; 54(9): 2411-22, 2014 Sep 22.
Article in English | MEDLINE | ID: mdl-25137615

ABSTRACT

Chemical structure data and corresponding measured bioactivities of compounds are now readily available from public and commercial databases. However, these databases contain heterogeneous data from different laboratories determined under different protocols, and sometimes even erroneous entries. In this study, we evaluated the use of data from bioactivity databases for the generation of high-quality in silico models for off-target-mediated toxicity as decision support in early drug discovery and crop-protection research. We chose human acetylcholinesterase (hAChE) inhibition as an exemplary endpoint for our case study. A standardized and thorough quality-management routine was established for input data consisting of more than 2,200 chemical entities from bioactivity databases. This procedure enables the development of predictive QSAR models based on heterogeneous in vitro data from multiple laboratories. An extended applicability-domain approach was used, and regression results were refined by an error-estimation routine. Subsequent classification, augmented by special consideration of borderline candidates, leads to high accuracy in external validation, achieving correct predictive classification of 96%. The standardized process described herein is implemented as a (semi)automated workflow and is thus easily transferable to other off-targets and assay readouts.
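
The sketch below illustrates the general pattern of regression followed by classification with a "borderline" band around the activity cutoff; the random descriptors, the random-forest regressor, the cutoff, and the band width are all assumptions for illustration and not the published workflow.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(1)
X = rng.normal(size=(500, 20))                                 # stand-in descriptors
y = 1.5 * X[:, 0] - X[:, 1] + rng.normal(scale=0.3, size=500)  # stand-in activity values

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
model = RandomForestRegressor(n_estimators=200, random_state=0).fit(X_tr, y_tr)
pred = model.predict(X_te)

cutoff, band = 1.0, 0.3   # activity threshold and error-based borderline band (assumed)
labels = np.where(pred > cutoff + band, "active",
                  np.where(pred < cutoff - band, "inactive", "borderline"))
print(dict(zip(*np.unique(labels, return_counts=True))))
```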


Subjects
Theoretical Models, Algorithms, Artificial Intelligence, Computer Simulation
18.
Bioorg Med Chem ; 20(18): 5352-65, 2012 Sep 15.
Article in English | MEDLINE | ID: mdl-22560839

ABSTRACT

The pregnane X receptor (PXR), a member of the nuclear hormone receptor superfamily, regulates the expression of several enzymes and transporters involved in metabolically relevant processes. The significant induction of CYP450 enzymes by PXR, in particular CYP3A4, may significantly alter the metabolism of prescribed drugs. To identify molecules with the potential to activate PXR as an antitarget early in drug discovery, we developed fast and reliable in silico filters using ligand-based QSAR techniques. Two classification models were established on a diverse dataset of 434 drug-like molecules; a second, augmented set allowed focusing on interesting regions of chemical space. These classifiers are based on decision trees combined with genetic algorithm-based variable selection to arrive at predictive models. The classifier for the first dataset, built on 29 descriptors, showed good performance, correctly classifying 100% of both PXR activators and non-activators in the test set and 87% of activators and 83% of non-activators in an external dataset. The second classifier correctly predicts 97% of activators and 91% of non-activators in its test set, and 94% of activators and 64% of non-activators in an external set of 50 molecules, which still qualifies it for application as a filter focused on PXR activators. Finally, a quantitative model for PXR activation was derived for a subset of these molecules using a regression-tree approach combined with GA variable selection. This final model shows a predictive r² of 0.774 for the test set and 0.452 for an external set of 33 molecules. Thus, the combination of these filters consistently provides guidelines for lowering PXR activation in novel candidate molecules.
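
A compact, self-contained sketch of the general technique (a decision-tree classifier whose descriptor subset is chosen by a small genetic algorithm) is given below. The synthetic data, population size, and GA operators are assumptions for illustration; they do not reproduce the published models.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import cross_val_score
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(0)
X, y = make_classification(n_samples=300, n_features=29, n_informative=8, random_state=0)

def fitness(mask):
    """Cross-validated accuracy of a shallow tree on the selected descriptors."""
    if mask.sum() == 0:
        return 0.0
    tree = DecisionTreeClassifier(max_depth=4, random_state=0)
    return cross_val_score(tree, X[:, mask.astype(bool)], y, cv=5).mean()

pop = rng.integers(0, 2, size=(20, X.shape[1]))   # population of descriptor bitmasks
for generation in range(15):
    scores = np.array([fitness(m) for m in pop])
    parents = pop[np.argsort(scores)[::-1][:10]]  # keep the fitter half
    children = []
    for _ in range(10):
        a, b = parents[rng.integers(0, 10, size=2)]
        cut = rng.integers(1, X.shape[1])
        child = np.concatenate([a[:cut], b[cut:]])   # one-point crossover
        flip = rng.random(X.shape[1]) < 0.05          # bit-flip mutation
        children.append(np.where(flip, 1 - child, child))
    pop = np.vstack([parents, children])

best = pop[np.argmax([fitness(m) for m in pop])]
print("selected descriptor indices:", np.flatnonzero(best))
print("cross-validated accuracy:   ", round(fitness(best), 3))
```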


Subjects
Computational Biology, Drug Discovery, Steroid Receptors/metabolism, Pharmaceutical Databases, Ligands, Molecular Structure, Pregnane X Receptor, Quantitative Structure-Activity Relationship, Steroid Receptors/antagonists & inhibitors, Steroid Receptors/chemistry
19.
Arch Toxicol ; 85(6): 555-63, 2011 Jun.
Article in English | MEDLINE | ID: mdl-21046363

ABSTRACT

Our study was performed in the context of an in vitro primary hepatic cell culture as an alternative to the in vivo carcinogenicity bioassay. The 29 substances to be used in the in vitro primary hepatic cell culture have been tested in 2-year bioassays and a 14-day short-term study. The aim of this modelling study was to simulate the concentration-time profiles of the compounds when given by the oral route at the doses tested in the previous studies, taking into account the percentage of the dose absorbed. The model contained seven tissue compartments, with uptake from the gastrointestinal tract into the portal vein. Because the primary hepatic cell culture is metabolically competent and the primary interest was to model the concentrations in the portal vein, the hepatic vein, and the systemic circulation (blood), we did not include elimination at the outset. Partitioning between blood and tissues was calculated according to a published biologically based algorithm. The substances' kinetic profiles differed according to their blood:tissue partitioning. Maximal concentrations in the portal vein, the hepatic vein, and the blood depended mainly on the dose and the fraction absorbed, which were the most critical parameters in this respect. Our study demonstrates an application of BPTK modelling to simulate concentrations for planning the doses of an in vitro study. BPTK modelling seems to be a better approach than using data from in vitro studies on cytotoxicity.
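
A highly simplified, flow-limited sketch of this kind of simulation (three compartments rather than the paper's seven, no elimination, first-order absorption) is shown below; all parameter values are illustrative assumptions, not those of the study.

```python
import numpy as np
from scipy.integrate import solve_ivp

dose_mg, fa, ka = 100.0, 0.8, 1.0       # oral dose, fraction absorbed, absorption rate (1/h)
q_h = 80.0                              # hepatic (portal) blood flow, L/h
v_liver, v_blood = 1.5, 5.0             # compartment volumes, L
kp_liver = 4.0                          # liver:blood partition coefficient

def rhs(t, y):
    a_gut, c_liver, c_blood = y
    r_abs = fa * ka * a_gut                              # mg/h entering portal blood
    c_portal = c_blood + r_abs / q_h                     # portal inflow concentration (mg/L)
    dc_liver = q_h * (c_portal - c_liver / kp_liver) / v_liver
    dc_blood = q_h * (c_liver / kp_liver - c_blood) / v_blood
    return [-ka * a_gut, dc_liver, dc_blood]             # no elimination in this sketch

sol = solve_ivp(rhs, (0, 12), [dose_mg, 0.0, 0.0], dense_output=True)
t = np.linspace(0, 12, 5)
a_gut, c_liver, c_blood = sol.sol(t)
c_portal = c_blood + fa * ka * a_gut / q_h
print("t (h):      ", t)
print("C_portal:   ", np.round(c_portal, 3))
print("C_systemic: ", np.round(c_blood, 3))
```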


Subjects
Biological Models, Pharmacokinetics, Toxicity Tests, Animal Testing Alternatives, Animals, Biomarkers/blood, Biotransformation, Carcinogens/chemistry, Carcinogens/metabolism, Carcinogens/pharmacokinetics, Cultured Cells, Chemical Phenomena, Computer Simulation, Hepatocytes/drug effects, Hepatocytes/metabolism, Humans, Mutagens/chemistry, Mutagens/metabolism, Mutagens/pharmacokinetics, Osmolar Concentration, Rats, Tissue Distribution, Toxicity Tests/methods, Toxicity Tests/standards
20.
J Am Chem Soc ; 131(17): 6096-8, 2009 May 06.
Article in English | MEDLINE | ID: mdl-19364103

ABSTRACT

Isolated cobalt phthalocyanine (CoPc) molecules were moved across a monolayer of CoPc on Cu(111) using an STM tip. If placed almost on top of another molecule, the CoPc molecule in the second layer locks into place, and the STM image at negative bias changes substantially. Density functional theory calculations explain the nature of the bonding mode and the change in the STM image.


Subjects
Indoles/chemistry, Organometallic Compounds/chemical synthesis, Cobalt/chemistry, Computer Simulation, Copper, Dimerization, Isoindoles, Scanning Tunneling Microscopy, Chemical Models, Organometallic Compounds/chemistry, Quantum Theory, Surface Properties