Results 1 - 20 of 23
1.
Comput Toxicol ; 24: 1-11, 2022 Nov.
Article in English | MEDLINE | ID: mdl-36405647

ABSTRACT

The Threshold of Toxicological Concern (TTC) is a pragmatic approach used to establish safe thresholds below which there is no appreciable risk to human health. Here, a large inventory of ~45,000 substances (referred to as the LRI dataset) was profiled through the Kroes TTC decision module within Toxtree v3.1 to assign substances to their respective TTC categories. Four thousand and two substances were found to be not applicable to the TTC approach. However, closer examination of these substances uncovered several implementation issues: substances represented in their salt forms were automatically assigned as not appropriate for TTC, even though many of these contained essential metals as counterions, which would render them TTC applicable. High Potency Carcinogens and dioxin-like substances were not fully captured based on the rules currently implemented in the software. Phosphorus-containing substances were treated as exclusions even though many of them would be appropriate for TTC. Refinements were proposed to address the limitations in the current software implementation. A second component of the study explored a set of substances representative of those released from medical devices and compared them to the LRI dataset as well as other toxicity datasets to investigate their structural similarity. A third component of the study sought to extend the exclusion rules to address application to substances released from medical devices that lack toxicity data. The refined rules were then applied to this dataset and the TTC assignments were compared. This case study demonstrated the importance of evaluating the software implementation of an established TTC workflow, identified certain limitations and explored potential refinements when applying these concepts to medical devices.
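
For orientation, the sketch below mirrors the tiered logic of the Kroes TTC decision scheme using its commonly cited thresholds (0.15, 18, 90, 540 and 1800 µg/person/day); the boolean input flags and the helper function are hypothetical simplifications, and the actual Toxtree module evaluates many additional structural rules.

```python
# Minimal sketch of Kroes-style TTC tiering (thresholds in ug/person/day).
# The boolean flags are hypothetical inputs; real implementations such as
# Toxtree derive them from structural alerts and Cramer classification rules.

def kroes_ttc(excluded: bool, genotoxic_alert: bool, organophosphate: bool,
              cramer_class: int) -> tuple[bool, float | None]:
    """Return (TTC applicable?, threshold in ug/day)."""
    if excluded:                 # e.g. high-potency carcinogens, dioxin-like substances
        return False, None
    if genotoxic_alert:
        return True, 0.15        # tier for substances with a genotoxicity alert
    if organophosphate:
        return True, 18.0        # organophosphate/carbamate tier
    return True, {1: 1800.0, 2: 540.0, 3: 90.0}[cramer_class]   # Cramer class tiers

print(kroes_ttc(excluded=False, genotoxic_alert=False,
                organophosphate=False, cramer_class=3))   # -> (True, 90.0)
```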

2.
Comput Toxicol ; 24, 2022 Nov.
Article in English | MEDLINE | ID: mdl-36818760

ABSTRACT

Acute toxicity in silico models are being used to support an increasing number of application areas including (1) product research and development, (2) product approval and registration as well as (3) the transport, storage and handling of chemicals. The adoption of such models is being hindered, in part, because of a lack of guidance describing how to perform and document an in silico analysis. To address this issue, a framework for an acute toxicity hazard assessment is proposed. This framework combines results from different sources including in silico methods and in vitro or in vivo experiments. In silico methods that can assist the prediction of in vivo outcomes (i.e., LD50) are analyzed concluding that predictions obtained using in silico approaches are now well-suited for reliably supporting assessment of LD50-based acute toxicity for the purpose of GHS classification. A general overview is provided of the endpoints from in vitro studies commonly evaluated for predicting acute toxicity (e.g., cytotoxicity/cytolethality as well as assays targeting specific mechanisms). The increased understanding of pathways and key triggering mechanisms underlying toxicity and the increased availability of in vitro data allow for a shift away from assessments solely based on endpoints such as LD50, to mechanism-based endpoints that can be accurately assessed in vitro or by using in silico prediction models. This paper also highlights the importance of an expert review of all available information using weight-of-evidence considerations and illustrates, using a series of diverse practical use cases, how in silico approaches support the assessment of acute toxicity.
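
As one concrete illustration of the GHS classification step mentioned above, the sketch below maps a predicted or measured oral LD50 onto the standard GHS acute oral toxicity categories; the function name is ours.

```python
# Sketch: map an oral LD50 (mg/kg body weight) onto a GHS acute oral toxicity
# category using the standard GHS cut-off values.

def ghs_acute_oral_category(ld50_mg_per_kg: float) -> str:
    if ld50_mg_per_kg <= 5:
        return "Category 1"
    if ld50_mg_per_kg <= 50:
        return "Category 2"
    if ld50_mg_per_kg <= 300:
        return "Category 3"
    if ld50_mg_per_kg <= 2000:
        return "Category 4"
    if ld50_mg_per_kg <= 5000:
        return "Category 5"   # optional category under GHS
    return "Not classified"

print(ghs_acute_oral_category(750))   # -> Category 4
```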

4.
ALTEX ; 38(1): 140-150, 2021.
Article in English | MEDLINE | ID: mdl-33452529

ABSTRACT

The use of new approach methodologies (NAMs) in support of read-across (RAx) approaches for regulatory purposes is a main goal of the EU-ToxRisk project. To bring this forward, EU-ToxRisk partners convened a workshop in close collaboration with regulatory representatives from key organizations, including European regulatory agencies such as the European Chemicals Agency (ECHA) and the European Food Safety Authority (EFSA), the Scientific Committee on Consumer Safety (SCCS), national agencies from several European countries, Japan, Canada and the USA, as well as the Organisation for Economic Co-operation and Development (OECD). More than a hundred people actively participated in the discussions, bringing together diverse viewpoints across academia, regulators and industry. The discussion was organized around five practical cases of RAx applied to specific problems, which offered the opportunity to consider real examples. There was general consensus that NAMs can improve confidence in RAx, in particular in defining category boundaries as well as characterizing the similarities/dissimilarities between source and target substances. In addition to describing dynamics, NAMs can be helpful in terms of kinetics and metabolism, which may play an important role in the demonstration of similarity or dissimilarity among the members of a category. NAMs were also noted as effective in providing quantitative data correlated with traditional no observed adverse effect levels (NOAELs) used in risk assessment, while reducing the uncertainty of the final conclusion. An interesting point of view was the advice on calibrating the number of new tests, which should be carefully selected, avoiding the allure of "the more, the better". Unfortunately, yet unsurprisingly, no single approach befits every case, and careful analysis is required to delineate the optimal approach. Expert analysis and assessment of each specific case is still an important step in the process.


Subjects
Animal Testing Alternatives/methods, Data Analysis, Structure-Activity Relationship, Toxicity Tests/methods, Animals, Computer Simulation, European Union, Humans, Drug Legislation, No-Observed-Adverse-Effect Level, Organisation for Economic Co-operation and Development, Risk Assessment/methods
5.
J Clin Epidemiol ; 129: 138-150, 2021 Jan.
Article in English | MEDLINE | ID: mdl-32980429

ABSTRACT

OBJECTIVES: The objective of the study is to present the Grading of Recommendations Assessment, Development, and Evaluation (GRADE) conceptual approach to the assessment of certainty of evidence from modeling studies (i.e., certainty associated with model outputs). STUDY DESIGN AND SETTING: Expert consultations and an international multidisciplinary workshop informed development of a conceptual approach to assessing the certainty of evidence from models within the context of systematic reviews, health technology assessments, and health care decisions. The discussions also clarified selected concepts and terminology used in the GRADE approach and by the modeling community. Feedback from experts in a broad range of modeling and health care disciplines addressed the content validity of the approach. RESULTS: Workshop participants agreed that the domains determining the certainty of evidence previously identified in the GRADE approach (risk of bias, indirectness, inconsistency, imprecision, reporting bias, magnitude of an effect, dose-response relation, and the direction of residual confounding) also apply when assessing the certainty of evidence from models. The assessment depends on the nature of model inputs and the model itself and on whether one is evaluating evidence from a single model or multiple models. We propose a framework for selecting the best available evidence from models: 1) developing, de novo, a model specific to the situation of interest, 2) identifying an existing model, the outputs of which provide the highest certainty evidence for the situation of interest, either "off-the-shelf" or after adaptation, and 3) using outputs from multiple models. We also present a summary of preferred terminology to facilitate communication among modeling and health care disciplines. CONCLUSION: This conceptual GRADE approach provides a framework for using evidence from models in health decision-making and the assessment of certainty of evidence from a model or models. The GRADE Working Group and the modeling community are currently developing the detailed methods and related guidance for assessing specific domains determining the certainty of evidence from models across health care-related disciplines (e.g., therapeutic decision-making, toxicology, environmental health, and health economics).


Subjects
GRADE Approach, Systematic Reviews as Topic/standards, Clinical Decision-Making/methods, Evidence-Based Medicine/methods, Evidence-Based Medicine/standards, Humans, Interdisciplinary Communication, Professional Competence/standards, Publication Bias, Biomedical Technology Assessment/methods, Biomedical Technology Assessment/organization & administration
6.
Comput Toxicol ; 16, 2020 Nov 01.
Article in English | MEDLINE | ID: mdl-34124416

ABSTRACT

The toxicokinetic (TK) parameters fraction of the chemical unbound to plasma proteins and metabolic clearance are critical for relating exposure and internal dose when building in vitro-based risk assessment models. However, experimental toxicokinetic studies have only been carried out on a limited number of chemicals of environmental interest (~1000 chemicals with TK data relative to tens of thousands of chemicals of interest). This work evaluated the utility of chemical structure information to predict TK parameters in silico; development of cluster-based read-across and quantitative structure-activity relationship models of fraction unbound or fub (regression) and intrinsic clearance or Clint (classification and regression) using a dataset of 1487 chemicals; utilization of predicted TK parameters to estimate uncertainty in steady-state plasma concentration (Css); and subsequent in vitro-in vivo extrapolation analyses to derive a bioactivity-exposure ratio (BER) plot to compare human oral equivalent doses and exposure predictions, using androgen and estrogen receptor activity data for 233 chemicals as an example dataset. The results demonstrate that fub is structurally more predictable than Clint. The model with the highest observed performance for fub had an external test set RMSE/σ = 0.62 and R² = 0.61; the best Clint classification model had an external test set accuracy of 65.9%; and the best Clint regression model had an external test set RMSE/σ = 0.90 and R² = 0.20. This relatively low performance is in part due to the large uncertainty in the underlying Clint data. We show that Css is relatively insensitive to uncertainty in Clint. The models were benchmarked against the ADMET Predictor software. Finally, the BER analysis allowed identification of 14 out of 136 chemicals for further risk assessment, demonstrating the utility of these models in aiding risk-based chemical prioritization.
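
For readers unfamiliar with how fub and Clint feed into Css and the BER, the sketch below uses the analytical steady-state approximation commonly applied in high-throughput toxicokinetics (e.g., in the httk package); the physiological constants and example inputs are rough illustrative assumptions, not values from the study.

```python
# Sketch of a common analytical steady-state approximation: renal clearance as
# GFR * fub plus a well-stirred hepatic term, at a constant 1 mg/kg/day oral dose.
# gfr and q_liver are body-weight-scaled flows in L/h/kg; values are illustrative.

def css_1mgkg(fub: float, clint: float, gfr: float = 0.10, q_liver: float = 1.25) -> float:
    """Css (mg/L) at 1 mg/kg/day; clint is scaled intrinsic clearance in L/h/kg."""
    hepatic_cl = (q_liver * fub * clint) / (q_liver + fub * clint)
    renal_cl = gfr * fub
    dose_rate = 1.0 / 24.0                      # mg/kg/h for a 1 mg/kg/day dose
    return dose_rate / (renal_cl + hepatic_cl)

def bioactivity_exposure_ratio(ac50_um: float, mol_weight: float,
                               exposure_mg_per_kg_day: float,
                               fub: float, clint: float) -> float:
    css = css_1mgkg(fub, clint)                 # mg/L per 1 mg/kg/day
    ac50_mg_per_l = ac50_um * mol_weight / 1000.0
    oral_equivalent_dose = ac50_mg_per_l / css  # reverse dosimetry, mg/kg/day
    return oral_equivalent_dose / exposure_mg_per_kg_day

print(bioactivity_exposure_ratio(ac50_um=5.0, mol_weight=250.0,
                                 exposure_mg_per_kg_day=1e-4, fub=0.1, clint=2.0))
```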

7.
Regul Toxicol Pharmacol ; 106: 278-291, 2019 Aug.
Article in English | MEDLINE | ID: mdl-31121201

ABSTRACT

Traditional approaches for chemical risk assessment cannot keep pace with the number of substances requiring assessment. Thus, in a global effort to expedite and modernize chemical risk assessment, New Approach Methodologies (NAMs) are being explored and developed. Included in this effort is the OECD Integrated Approaches for Testing and Assessment (IATA) program, which provides a forum for OECD member countries to develop and present case studies illustrating the application of NAM in various risk assessment contexts. Here, we present an IATA case study for the prediction of estrogenic potential of three target phenols: 4-tert-butylphenol, 2,4-di-tert-butylphenol and octabenzone. Key features of this IATA include the use of two computational approaches for analogue selection for read-across, data collected from traditional and NAM sources, and a workflow to generate predictions regarding the targets' ability to bind the estrogen receptor (ER). Endocrine disruption can occur when a chemical substance mimics the activity of natural estrogen by binding to the ER and, if potency and exposure are sufficient, alters the function of the endocrine system to cause adverse effects. The data indicated that of the three target substances that were considered herein, 4-tert-butylphenol is a potential endocrine disruptor. Further, this IATA illustrates that the NAM approach explored is health protective when compared to in vivo endpoints traditionally used for human health risk assessment.
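
As an illustration of one common way to shortlist analogues for read-across, the sketch below ranks candidate structures by Tanimoto similarity of Morgan fingerprints with RDKit; the candidate set is illustrative, and this is not the pair of analogue-selection tools actually used in the case study.

```python
# Sketch: rank candidate analogues by Tanimoto similarity of Morgan fingerprints.
from rdkit import Chem, DataStructs
from rdkit.Chem import AllChem

target = Chem.MolFromSmiles("CC(C)(C)c1ccc(O)cc1")             # 4-tert-butylphenol
candidates = {"phenol": "Oc1ccccc1",
              "4-n-butylphenol": "CCCCc1ccc(O)cc1"}            # illustrative analogues

fp_target = AllChem.GetMorganFingerprintAsBitVect(target, 2, nBits=2048)
scores = {}
for name, smiles in candidates.items():
    fp = AllChem.GetMorganFingerprintAsBitVect(Chem.MolFromSmiles(smiles), 2, nBits=2048)
    scores[name] = DataStructs.TanimotoSimilarity(fp_target, fp)

for name, score in sorted(scores.items(), key=lambda kv: -kv[1]):
    print(f"{name}: {score:.2f}")
```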


Subjects
Benzophenones/pharmacology, Phenols/pharmacology, Estrogen Receptors/metabolism, Benzophenones/chemistry, Humans, Molecular Structure, Phenols/chemistry, Risk Assessment
8.
Regul Toxicol Pharmacol ; 101: 12-23, 2019 Feb.
Article in English | MEDLINE | ID: mdl-30359698

ABSTRACT

The application of toxic equivalency factors (TEFs) or toxic units to estimate toxic potencies for mixtures of chemicals which contribute to a biological effect through a common mechanism is one approach for filling data gaps. Toxic Equivalents (TEQ) have been used to express the toxicity of dioxin-like compounds (i.e., dioxins, furans, and dioxin-like polychlorinated biphenyls (PCBs)) in terms of the most toxic form of dioxin: 2,3,7,8-tetrachlorodibenzo-p-dioxin (2,3,7,8-TCDD). This study sought to integrate two data gap filling techniques, quantitative structure-activity relationships (QSARs) and TEFs, to predict neurotoxicity TEQs for PCBs. Simon et al. (2007) previously derived neurotoxic equivalent (NEQ) values for a dataset of 87 PCB congeners, of which 83 congeners had experimental data. These data were taken from a set of four different studies measuring different effects related to neurotoxicity, each of which tested overlapping subsets of the 83 PCB congeners. The goals of the current study were to: (i) evaluate alternative neurotoxic equivalent factor (NEF) derivations from an expanded dataset, relative to those derived by Simon et al., and (ii) develop QSAR models to provide NEF estimates for the large number of untested PCB congeners. The models used multiple linear regression, support vector regression, k-nearest neighbor and random forest algorithms within a 5-fold cross-validation scheme, with position-specific chlorine substitution patterns on the biphenyl scaffold as descriptors. Alternative NEF values were derived, but the resulting QSAR models had relatively low predictivity (RMSE ∼0.24). This was mostly driven by the large uncertainties in the underlying data and NEF values. The derived NEFs and the QSAR-predicted NEFs used to fill data gaps should be applied with caution.
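
For context, the additivity calculation underlying TEQ- and NEQ-style potency scaling is simply a factor-weighted sum over congeners; the sketch below shows it with invented placeholder concentrations and equivalency factors.

```python
# Sketch of equivalency-factor additivity: TEQ (or NEQ) = sum_i factor_i * conc_i.
# The concentrations and factors below are invented placeholders, not study values.

def toxic_equivalents(concentrations: dict[str, float], factors: dict[str, float]) -> float:
    return sum(factors[congener] * conc for congener, conc in concentrations.items())

sample  = {"PCB-126": 0.02, "PCB-118": 1.5, "PCB-52": 3.0}       # ng/g, placeholder
factors = {"PCB-126": 0.005, "PCB-118": 0.003, "PCB-52": 0.02}   # placeholder NEF-like values
print(toxic_equivalents(sample, factors))   # 0.0646 ng NEQ/g
```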


Subjects
Environmental Pollutants/toxicity, Neurotoxicity Syndromes, Polychlorinated Biphenyls/toxicity, Animals, Brain/metabolism, Calcium/metabolism, Dopamine/metabolism, Environmental Pollutants/chemistry, PC12 Cells, Polychlorinated Biphenyls/chemistry, Protein Kinase C/metabolism, Quantitative Structure-Activity Relationship, Rats, Risk Assessment, Ryanodine Receptor Calcium Release Channel/metabolism
9.
Toxicol In Vitro ; 52: 131-145, 2018 Oct.
Article in English | MEDLINE | ID: mdl-29908304

ABSTRACT

New approaches are needed to assess the effects of inhaled substances on human health. These approaches will be based on mechanisms of toxicity, an understanding of dosimetry, and the use of in silico modeling and in vitro test methods. In order to accelerate wider implementation of such approaches, development of adverse outcome pathways (AOPs) can help identify and address gaps in our understanding of relevant parameters for model input and mechanisms, and optimize non-animal approaches that can be used to investigate key events of toxicity. This paper describes the AOPs and the toolbox of in vitro and in silico models that can be used to assess the key events leading to toxicity following inhalation exposure. Because the optimal testing strategy will vary depending on the substance of interest, here we present a decision tree approach to identify an appropriate non-animal integrated testing strategy that incorporates consideration of a substance's physicochemical properties, relevant mechanisms of toxicity, and available in silico models and in vitro test methods. This decision tree can facilitate standardization of the testing approaches. Case study examples are presented to provide a basis for proof-of-concept testing to illustrate the utility of non-animal approaches to inform hazard identification and risk assessment of humans exposed to inhaled substances.


Subjects
Animal Testing Alternatives, Acute Toxicity Tests, Administration by Inhalation, Decision Trees, Humans
10.
J Appl Toxicol ; 38(1): 41-50, 2018 Jan.
Article in English | MEDLINE | ID: mdl-28543848

ABSTRACT

There is an expectation that to meet regulatory requirements, and avoid or minimize animal testing, integrated approaches to testing and assessment will be needed that rely on assays representing key events (KEs) in the skin sensitization adverse outcome pathway. Three non-animal assays have been formally validated and adopted for regulatory use: the direct peptide reactivity assay (DPRA), the KeratinoSens™ assay and the human cell line activation test (h-CLAT). There have been many efforts to develop integrated approaches to testing and assessment, with the "two out of three" approach attracting much attention. Here a set of 271 chemicals with mouse, human and non-animal sensitization test data was evaluated to compare the predictive performances of the three individual non-animal assays, their binary combinations and the "two out of three" approach in predicting skin sensitization potential. The most predictive approach was to use both the DPRA and h-CLAT as follows: (1) perform DPRA - if positive, classify as sensitizing; and (2) if negative, perform h-CLAT - a positive outcome denotes a sensitizer, a negative, a non-sensitizer. With this approach, 85% (local lymph node assay) and 93% (human) of non-sensitizer predictions were correct, whereas the "two out of three" approach had 69% (local lymph node assay) and 79% (human) of non-sensitizer predictions correct. The findings are consistent with the argument, supported by published quantitative mechanistic models, that only the first KE needs to be modeled. All three assays model this KE to an extent. The value of using more than one assay depends on how the different assays compensate for each other's technical limitations.
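
The two decision strategies compared above are simple enough to state directly in code; a minimal sketch follows, with assay calls treated as plain booleans (True meaning a positive result) and function names that are ours.

```python
# Sketch of the two strategies compared in the paper (function names are ours).
# Inputs are assay calls: True = positive result in that assay.

def sequential_dpra_hclat(dpra_positive: bool, hclat_positive: bool) -> bool:
    """Most predictive strategy reported: run DPRA first; if positive, classify as a
    sensitizer; if negative, run h-CLAT and let its outcome decide."""
    return True if dpra_positive else hclat_positive

def two_out_of_three(dpra_positive: bool, keratinosens_positive: bool,
                     hclat_positive: bool) -> bool:
    """Classify as a sensitizer when at least two of the three assays are positive."""
    return sum([dpra_positive, keratinosens_positive, hclat_positive]) >= 2

print(sequential_dpra_hclat(False, True))    # True  -> sensitizer
print(two_out_of_three(False, True, False))  # False -> non-sensitizer
```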


Subjects
Animal Testing Alternatives, Allergic Contact Dermatitis/etiology, Hazardous Substances/toxicity, Skin/drug effects, Toxicity Tests/methods, Animals, Cell Line, Allergic Contact Dermatitis/immunology, Humans, Local Lymph Node Assay, Mice, Predictive Value of Tests, Skin/immunology
11.
Adv Exp Med Biol ; 856: 165-187, 2016.
Article in English | MEDLINE | ID: mdl-27671722

ABSTRACT

In this chapter, we provide an overview of how (Quantitative) Structure-Activity Relationships, (Q)SARs, are validated and applied for regulatory purposes. We outline how chemical categories are derived to facilitate endpoint-specific read-across using tools such as the OECD QSAR Toolbox and discuss some of the current difficulties in addressing the residual uncertainties of read-across. Finally, we put forward a perspective on how non-testing approaches may evolve in light of the advances in new and emerging technologies and how these fit within the Adverse Outcome Pathway (AOP) framework.


Subjects
Quantitative Structure-Activity Relationship, Validation Studies as Topic, Organisation for Economic Co-operation and Development
12.
Adv Exp Med Biol ; 856: 317-342, 2016.
Article in English | MEDLINE | ID: mdl-27671729

ABSTRACT

In this chapter, we explain how Integrated Approaches to Testing and Assessment (IATA) offer a means of integrating and translating the data generated by toxicity testing methods, thereby serving as flexible and suitable tools for toxicological decision making in the twenty-first century. In addition to traditional in vitro and in vivo testing methods, IATA are increasingly incorporating newly developed in vitro systems and measurement technologies such as high throughput screening and high content imaging. Computational approaches are also being used in IATA development, as a means of generating data (e.g. QSARs), interpreting data (bioinformatics and chemoinformatics), and integrating multiple sources of data (e.g. expert systems, Bayesian models). Decision analytic methods derived from socioeconomic theory can also play a role in developing flexible and optimal IATA solutions. Some of the challenges involved in the development, validation and implementation of IATA are also discussed.
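
As a minimal illustration of the Bayesian data integration mentioned above, the sketch below combines several assay outcomes under a conditional-independence (naive Bayes) assumption; the prior and the sensitivity/specificity figures are invented placeholders, not validated assay statistics.

```python
# Naive Bayes-style integration of multiple assay results into a posterior
# probability of hazard. All performance figures below are invented placeholders.

def posterior_probability(prior: float, results: list[tuple[bool, float, float]]) -> float:
    """results: one (positive?, sensitivity, specificity) triple per assay."""
    odds = prior / (1.0 - prior)
    for positive, sensitivity, specificity in results:
        # Likelihood ratio of the observed result under the hazard hypothesis.
        lr = sensitivity / (1.0 - specificity) if positive else (1.0 - sensitivity) / specificity
        odds *= lr
    return odds / (1.0 + odds)

assays = [(True, 0.80, 0.85), (False, 0.75, 0.70), (True, 0.85, 0.80)]
print(f"P(hazard | evidence) = {posterior_probability(0.30, assays):.2f}")
```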


Subjects
Risk Assessment/methods, Toxicity Tests/methods, Animals, Computational Biology, Humans, Skin/drug effects, Validation Studies as Topic
13.
Regul Toxicol Pharmacol ; 70(3): 629-40, 2014 Dec.
Article in English | MEDLINE | ID: mdl-25261300

ABSTRACT

Chemical regulation is challenged by the large number of chemicals requiring assessment for potential human health and environmental impacts. Current approaches are too resource-intensive in terms of time, money and animal use to evaluate all chemicals under development or already on the market. The need for timely and robust decision making demands that regulatory toxicity testing becomes more cost-effective and efficient. One way to realize this goal is by being more strategic in directing testing resources; focusing on chemicals of highest concern, limiting testing to the most probable hazards, or targeting the most vulnerable species. Hypothesis-driven Integrated Approaches to Testing and Assessment (IATA) have been proposed as practical solutions to such strategic testing. In parallel, the development of the Adverse Outcome Pathway (AOP) framework, which provides information on the causal links between a molecular initiating event (MIE), intermediate key events (KEs) and an adverse outcome (AO) of regulatory concern, offers the biological context to facilitate development of IATA for regulatory decision making. This manuscript summarizes discussions at the Workshop entitled "Advancing AOPs for Integrated Toxicology and Regulatory Applications" with particular focus on the role AOPs play in informing the development of IATA for different regulatory purposes.
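
To make the AOP structure concrete, here is a small sketch of the MIE → KEs → AO chain as a data structure, using the well-known skin sensitization AOP as the example; the class design itself is ours, not part of the workshop report.

```python
# Sketch: an AOP as a causal chain from a molecular initiating event (MIE)
# through intermediate key events (KEs) to an adverse outcome (AO).
from dataclasses import dataclass, field

@dataclass
class AdverseOutcomePathway:
    mie: str
    key_events: list[str] = field(default_factory=list)
    adverse_outcome: str = ""

    def chain(self) -> str:
        return " -> ".join([self.mie, *self.key_events, self.adverse_outcome])

skin_sensitization_aop = AdverseOutcomePathway(
    mie="Covalent binding to skin proteins",
    key_events=["Keratinocyte activation", "Dendritic cell activation",
                "T-cell proliferation"],
    adverse_outcome="Allergic contact dermatitis (skin sensitization)",
)
print(skin_sensitization_aop.chain())
```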


Subjects
Risk Assessment/methods, Animal Testing Alternatives, Animals, Computer Simulation, Decision Making, Government Regulation, High-Throughput Screening Assays, Humans, Toxicity Tests
14.
Regul Toxicol Pharmacol ; 69(3): 529-45, 2014 Aug.
Article in English | MEDLINE | ID: mdl-24928565

ABSTRACT

Since the OECD published the Adverse Outcome Pathway (AOP) for skin sensitization, many efforts have focused on how to integrate and interpret nonstandard information generated for key events in a manner that can be practically useful for decision making. These types of frameworks are known as Integrated Approaches to Testing and Assessment (IATA). Here we have outlined an IATA for skin sensitization which focuses on existing information, including non-testing approaches such as QSAR and read-across. The IATA was implemented into a pipeline tool using OASIS technology to provide a means of systematically collating and compiling relevant information which could be used in an assessment of skin sensitization potential. A test set of 100 substances with available skin sensitization information was profiled using the pipeline IATA. In silico and in chemico profiling information alone was able to predict skin sensitization potential with a preliminary accuracy of 73.85%. Information from other relevant endpoints (e.g., Ames mutagenicity) was found to improve the accuracy (to 87.6%) when coupled with a mechanistic understanding of the underlying reaction chemistry. This pipeline platform could be useful in the assessment of skin sensitization potential and marks a step change in how non-testing approaches can be practically applied.


Subjects
Allergens/chemistry, Allergens/immunology, Skin/immunology, Tumor Cell Line, Allergic Contact Dermatitis/etiology, Allergic Contact Dermatitis/immunology, Humans, Organisation for Economic Co-operation and Development, Protein Binding/immunology, Quantitative Structure-Activity Relationship, Risk Assessment, Th1 Cells, U937 Cells
15.
J Appl Toxicol ; 34(4): 436-40, 2014 Apr.
Article in English | MEDLINE | ID: mdl-24122899

ABSTRACT

A Bayesian integrated testing strategy (ITS) approach, aiming to assess skin sensitization potency, has been presented, in which data from various types of in vitro assays are integrated and assessed in combination for their ability to predict in vivo skin sensitization data. Here we discuss this approach and compare it to our quantitative mechanistic modeling (QMM) approach based on physical organic chemistry. The main findings of the Bayesian study are consistent with our chemistry-based approach and our previously published assessment of the key determinants of sensitization potency, in particular the relatively high predictive value found for chemical reactivity data and the relatively low predictive value for bioavailability parameters. As it stands at present the Bayesian approach does not utilize the full range of predictive capability that is already available, and aims only to assign potency categories rather than numerical potency values per se. In contrast, for many chemicals the QMM approach can already provide numerical potency predictions. However, the Bayesian approach may have potential for those chemicals where a chemistry modeling approach cannot provide a complete answer (e.g. pro-electrophiles whose in cutaneo activation cannot currently be modeled confidently). Nonetheless, our main message is of the importance of leveraging chemistry insights and read-across approaches to the fullest extent possible.


Subjects
Allergic Contact Dermatitis/etiology, Allergic Contact Dermatitis/immunology, Chemical Models, Skin Tests, Bayes Theorem, Humans, Hydrophobic and Hydrophilic Interactions, Kinetics, Maleic Anhydrides/chemistry, Maleic Anhydrides/immunology, Maleic Anhydrides/toxicity, Phthalic Anhydrides/chemistry, Phthalic Anhydrides/immunology, Phthalic Anhydrides/toxicity, Risk Assessment
16.
Front Biosci (Elite Ed) ; 5(2): 418-34, 2013 Jan 01.
Article in English | MEDLINE | ID: mdl-23276999

ABSTRACT

Chemical regulation and the means by which data are generated for the purposes of risk assessment are undergoing a tremendous shift. There is a strong impetus in Europe, in particular, to move towards non-animal approaches to address data gaps for specific endpoints, either in lieu of testing or as part of weight-of-evidence approaches within integrated testing strategies (ITS). An exposure assessment considering workers and/or consumers is a critical component of a robust risk assessment. The EU chemicals legislation REACH, for example, provides considerable flexibility in the application of non-testing approaches such as (Q)SARs, chemical categories and read-across for data gap filling. There have been a number of efforts aimed at developing technical guidance, tools, and techniques for non-testing and tiered exposure approaches. Despite these efforts, there remains limited practical insight into how these approaches can be applied in the assessment of substances. Here, we first provide a background of the available approaches and how they can and should be practically utilised to address REACH requirements.


Subjects
Chemical Industry/legislation & jurisprudence, Ecotoxicology/methods, Environmental Exposure, Methacrylates/toxicity, Quantitative Structure-Activity Relationship, Risk Assessment/methods, Risk Assessment/trends, Environmental Biodegradation, European Union, Methacrylates/chemistry, Risk Assessment/legislation & jurisprudence
17.
Regul Toxicol Pharmacol ; 65(2): 226-8, 2013 Mar.
Article in English | MEDLINE | ID: mdl-23266660

ABSTRACT

Read-across has generated much attention, since it may be used as an alternative approach for addressing the information requirements under REACH. Experience in the application of read-across has undoubtedly been gained within the context of the 2010 registrations (>1000 tonnes/annum). Industry, the European Chemicals Agency (ECHA) and EU Member States all conceptually accept read-across approaches, but difficulties remain in applying them consistently in practice. A workshop on the 'Use of Read-Across for Chemical Safety Assessment under REACH', organised by ECHA with the active support of Cefic LRI, was held on 3 October 2012 to gain insight into how ECHA evaluates read-across justifications, to share industry experiences with read-across approaches and to discuss practical strategies to help develop scientifically valid read-across for future submissions.


Subjects
Chemical Safety/methods, Hazardous Substances/toxicity, Risk Assessment/methods, Safety Management/methods, Toxicity Tests/methods, Animals, Chemical Safety/standards, European Union, Humans, Safety Management/organization & administration, Toxicity Tests/standards
18.
Curr Pharm Des ; 16(24): 2737-64, 2010.
Article in English | MEDLINE | ID: mdl-20642428

ABSTRACT

Quantitative Structure-Activity Relationship (QSAR) models have been used in pharmaceutical design and Medicinal Chemistry for the discovery of antiparasitic drugs. QSAR models predict biological activity using different types of structural parameters of molecules as input. Topological Indices (TIs) are a very interesting class of these parameters. We can derive TIs from graph representations based only on nodes (atoms) and edges (chemical bonds). TIs are computationally inexpensive because they depend only on atom-atom connectivity information. This information, expressed in the molecular graphs, can be tabulated as adjacency matrices that are easy to manipulate with computers. Consequently, TIs allow the rapid collection, annotation, retrieval, comparison and mining of molecular structures within large databases. Interest in TIs has exploded because they can also be used to describe macromolecular and macroscopic systems represented by complex networks of interactions (links) between the different parts of a system (nodes), such as drug-target, protein-protein, metabolic, host-parasite, brain cortex, parasite disease spreading, Internet, or social networks. In this work, we review and comment on the following topics related to the use of TIs in antiparasitic drug and target discovery. The first topic reviewed was Topological Indices and QSAR for antiparasitic drugs, covering the theoretical background, QSAR for anti-malaria drugs, and QSAR for anti-Toxoplasma drugs. The second topic was the TOMO-COMD approach to QSAR of antiparasitic drugs, including the TOMO-COMD theoretical background and TOMO-COMD models for antihelmintic activity and for anti-Trichomonas, anti-malarial and anti-trypanosome compounds. The third section discusses Topological Indices in the context of complex networks. The last section is devoted to the MARCH-INSIDE approach to QSAR of antiparasitic drugs and targets. This begins with a theoretical background for drugs and parameters for proteins. Next, we reviewed MARCH-INSIDE models for the pharmaceutical design of antiparasitic drugs, including flukicidal and anti-coccidial drugs. We close the MARCH-INSIDE topic with a review of multi-target QSAR of antiparasitic drugs and the MARCH-INSIDE assembly of complex networks of antiparasitic drugs, together with the prediction of proteins in parasites and the MARCH-INSIDE web-servers for protein-protein interactions in parasites: Plasmod-PPI and Trypano-PPI. We closed this review with an important section devoted to some legal issues related to QSAR models.
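
Because the chapter's argument rests on TIs being computable from adjacency information alone, here is a short sketch that derives a classic TI, the Wiener index (the sum of topological distances over all atom pairs), from an adjacency matrix; the n-butane example is illustrative.

```python
# Sketch: compute the Wiener index of a hydrogen-suppressed molecular graph
# directly from its adjacency matrix.
import numpy as np
from scipy.sparse.csgraph import shortest_path

# n-butane heavy-atom skeleton: C1-C2-C3-C4
adjacency = np.array([[0, 1, 0, 0],
                      [1, 0, 1, 0],
                      [0, 1, 0, 1],
                      [0, 0, 1, 0]])

distance_matrix = shortest_path(adjacency, unweighted=True)  # topological distances
wiener_index = int(distance_matrix.sum() / 2)                # count each atom pair once
print(wiener_index)  # 10 for n-butane
```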


Subjects
Antiparasitic Agents, Drug Design, Molecular Targeted Therapy, Parasitic Diseases/drug therapy, Quantitative Structure-Activity Relationship, Animals, Antiparasitic Agents/chemistry, Antiparasitic Agents/pharmacology, Computer Simulation, Factual Databases/legislation & jurisprudence, Protein Databases, Humans, Markov Chains, Molecular Models, Molecular Structure, Parasitic Diseases/classification, Protein Interaction Mapping, Structure-Activity Relationship
20.
Curr Top Med Chem ; 8(18): 1666-75, 2008.
Article in English | MEDLINE | ID: mdl-19075773

ABSTRACT

In recent times, there has been increased use of software and computational models in Medicinal Chemistry, both for the prediction of effects such as drug-target interactions and for the development of (Quantitative) Structure-Activity Relationships ((Q)SAR). Whilst the ultimate goal of Medicinal Chemistry research is the discovery of new drug candidates, a secondary yet important outcome is the creation of new computational tools. The adoption of computational tools by medicinal chemists is, sadly and all too often, accompanied by a lack of understanding of the legal aspects related to software and model use, that is, the copyright protection of new medicinal chemistry software and of products discovered with the aid of such software. This article aims to provide a reference to the various legal avenues that are available for the protection of software, and the acceptance and legal treatment of scientific results and techniques derived from such software. An overview of relevant international tax issues is also presented. We have considered cases of patents protecting software, models, and/or new compounds discovered using methods such as molecular modeling or QSAR. This paper has been written and compiled by the authors as a review of current topics and trends on the legal issues in certain fields of Medicinal Chemistry and as such is not intended to be exhaustive.


Subjects
Pharmaceutical Chemistry/legislation & jurisprudence, Intellectual Property, Software/legislation & jurisprudence, Pharmaceutical Chemistry/economics, Computational Biology, Copyright, Drug Design, Quantitative Structure-Activity Relationship, Software/economics, Taxes