ABSTRACT
With current progress in science, there is growing interest in developing and applying Physiologically Based Kinetic (PBK) models in chemical risk assessment, as knowledge of internal exposure to chemicals is critical to understanding potential effects in vivo. In particular, a new generation of PBK models is being developed in which the model parameters are derived from in silico and in vitro methods. To increase the acceptance and use of these "Next Generation PBK models", there is a need to demonstrate their validity. However, this is challenging for data-poor chemicals, which lack the in vivo kinetic data against which predictive capacity could be assessed. The aim of this work is to lay down the fundamental steps in using a read-across framework to inform modellers and risk assessors on how to develop, or evaluate, PBK models for chemicals without in vivo kinetic data. The application of a PBK model that takes into account the absorption, distribution, metabolism and excretion characteristics of the chemical reduces the uncertainties in the biokinetics and biotransformation of the chemical of interest. A strategic flow chart, proposed herein, allows users to identify the minimum information needed to perform a read-across from a data-rich chemical to its data-poor analogue(s). The workflow is illustrated by means of a real case study using the alkenylbenzene class of chemicals, showing the reliability and potential of this approach. A consistent quantitative relationship between model simulations was demonstrated using models for estragole and safrole (source chemicals) when applied to methyleugenol (target chemical). When the PBK model code for the source chemicals was adapted to use input values relevant to the target chemical, simulations were consistent between the models. The resulting PBK model for methyleugenol was further evaluated by comparing its results to an existing, published model for methyleugenol, providing further evidence that the approach was successful. This can be considered a "read-across" approach, enabling a valid PBK model to be derived to aid the assessment of a data-poor chemical.
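To make the read-across of model code concrete, the sketch below shows, in highly simplified form, how a single kinetic model structure can be re-parameterised with chemical-specific inputs. It is a generic one-compartment illustration with hypothetical absorption, distribution-volume and clearance values, not the multi-compartment PBK models published for the alkenylbenzenes; all function and parameter names are placeholders.

# Generic one-compartment oral kinetics sketch with hypothetical parameters.
# The model code stays fixed while chemical-specific inputs (absorption rate,
# volume of distribution, clearance) are swapped in for the source or the
# target chemical of a read-across.
import numpy as np
from scipy.integrate import odeint

def kinetics(y, t, ka, vd, cl):
    a_gut, c_plasma = y                   # amount in gut (mg), plasma conc. (mg/L)
    da_gut = -ka * a_gut                  # first-order absorption from the gut
    dc_plasma = (ka * a_gut - cl * c_plasma) / vd
    return [da_gut, dc_plasma]

def simulate(dose_mg, params, hours=24.0):
    t = np.linspace(0.0, hours, 200)
    conc = odeint(kinetics, [dose_mg, 0.0], t, args=tuple(params))[:, 1]
    return t, conc

# Hypothetical chemical-specific inputs: (ka in 1/h, Vd in L, CL in L/h)
source_params = (1.2, 40.0, 15.0)         # data-rich source analogue
target_params = (1.0, 45.0, 12.0)         # target values derived in vitro / in silico
for name, params in [("source", source_params), ("target", target_params)]:
    t, c = simulate(dose_mg=70.0, params=params)
    print(f"{name}: Cmax = {c.max():.3f} mg/L, AUC = {np.trapz(c, t):.2f} mg*h/L")

Comparing the simulated concentration-time profiles of source and target in this way is the quantitative step that supports, or refutes, the read-across hypothesis.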
ABSTRACT
The EU Directive 2010/63/EU on the protection of animals used for scientific purposes and other EU regulations, such as REACH and the Cosmetic Products Regulation, advocate a change in the way toxicity testing is conducted. Whilst the Cosmetic Products Regulation bans animal testing altogether, REACH aims for a progressive shift from in vivo testing towards quantitative in vitro and computational approaches. Several endpoints can already be addressed using non-animal approaches, including skin corrosion and irritation, serious eye damage and irritation, skin sensitisation, and mutagenicity and genotoxicity. However, for systemic effects such as acute toxicity, repeated dose toxicity and reproductive and developmental toxicity, evaluation of chemicals under REACH still relies heavily on animal tests. Here we summarise current EU regulatory requirements for the human health assessment of chemicals under REACH and the Cosmetic Products Regulation, considering the most critical endpoints and identifying the main challenges in introducing alternative methods into regulatory testing practice. This supports a recent initiative taken by the International Cooperation on Alternative Test Methods (ICATM) to summarise the current regulatory requirements for the assessment of chemicals and cosmetic products across several human health-related endpoints, with the aim of comparing different jurisdictions and coordinating the promotion and, ultimately, the implementation of non-animal approaches worldwide. Recent initiatives undertaken at the European level to promote the 3Rs and the use of alternative methods in current regulatory practice are also discussed.
Subjects
Animal Testing Alternatives/legislation & jurisprudence, Cosmetics/legislation & jurisprudence, Toxicity Tests/methods, Animal Testing Alternatives/methods, Animals, Cosmetics/toxicity, European Union, Humans, International Cooperation, Risk Assessment/legislation & jurisprudence, Risk Assessment/methods
ABSTRACT
In view of the need to enhance the assessment of consumer products called for in the EU Chemicals Strategy for Sustainability, we developed a methodology for evaluating hazard by combining information across different systemic toxicity endpoints and integrating it with new approach methodologies. The methodology brings in mechanistic information with a view to avoiding redundant in vivo studies, minimising reliance on apical endpoint tests and, ultimately, devising efficient testing strategies. Here, we present the application of our methodology to carcinogenicity assessment, mapping the available information from toxicity test methods across endpoints to the key characteristics of carcinogens. Test methods are deconstructed to allow the information they provide to be organised in a systematic way, enabling the description of the toxicity mechanisms leading to the adverse outcome. This integrated approach provides a flexible and resource-efficient means of fully exploiting test methods for which test guidelines are available, in order to fulfil regulatory requirements for systemic toxicity assessment, as well as identifying where new methods can be integrated.
Subjects
Carcinogenicity Tests/methods, Carcinogens/toxicity, Risk Assessment/methods, Toxicity Tests/methods, Animals, Endpoint Determination, European Union, Humans
ABSTRACT
This paper outlines a new concept to optimise testing strategies for improving the efficiency of chemical testing for hazard-based risk management. While chemical classification based on standard checklists of information triggers risk management measures, the link is not one-to-one: toxicity testing may be performed with no impact on the safe use of chemicals. Not every hazard class and category is assigned a unique pictogram, and for the purposes of this proof-of-concept study the level of concern a chemical poses for the population and the environment is simplistically considered to be reflected by the hazard pictograms. Using active substances in biocides and plant protection products as a dataset, three testing strategies were built with the boundary condition that an optimal approach must indicate a given level of concern while requiring less testing (strategy B), prioritising new approach methodologies (strategy C) or combining the two considerations (strategy D). The implementation of strategies B and D reduced the number of tests performed by 6.0% and 8.8%, respectively, while strategy C relied the least on in vivo methods. The intentionally simplistic approach to optimised testing strategies presented here could be applied beyond the assessment of biocides and plant protection products to gain efficiencies in the safety assessment of other chemical groups, saving animals and making regulatory testing more time- and cost-efficient.
Subjects
Chemical Safety/methods, Environmental Pollutants/toxicity, Hazardous Substances/toxicity, Toxicity Tests/methods, Chemical Safety/legislation & jurisprudence, Environmental Pollutants/classification, European Union, Government Regulation, Hazardous Substances/classification, Humans, Risk Assessment, Risk Management
ABSTRACT
Human biomonitoring (HBM) data can provide insight into co-exposure patterns resulting from exposure to multiple chemicals from various sources and over time. Such data are therefore particularly valuable for assessing potential risks from combined exposure to multiple chemicals. One way to interpret HBM data is to establish safe levels in blood or urine, called Biomonitoring Equivalents (BEs) or HBM health-based guidance values (HBM-HBGVs). These can be derived by converting established external reference values, such as tolerable daily intake (TDI) values. HBM-HBGV or BE values have so far been agreed for only a very limited number of chemicals. They can be established using physiologically based kinetic (PBK) modelling, which usually requires substance-specific models and the collection of many input parameters that are often unavailable or difficult to find in the literature. The aim of this study was to investigate the suitability and limitations of generic PBK models for deriving BE values for several compounds, with a view to facilitating the use of HBM data in the screening-level assessment of chemical mixtures. The focus was on testing the methodology with two generic models, the IndusChemFate tool and the High-Throughput Toxicokinetics package, for two different classes of compounds, phenols and phthalates. HBM data on Danish children and on Norwegian mothers and children were used to evaluate the quality of the predictions and to illustrate, by means of a case study, the overall approach of applying PBK models to chemical classes with HBM data in the context of chemical mixture risk assessment. Application of PBK models provides a better understanding and interpretation of HBM data. However, the study shows that establishing safety threshold levels in urine is a difficult and complex task. The approach may be more straightforward for more persistent chemicals that are analysed as parent compounds in blood, but high uncertainties remain around simulated metabolite concentrations in urine. Refining the models may reduce these uncertainties and improve predictions. Based on the experience gained in this study, the performance of the models for other chemicals could be investigated to improve the accuracy of the simulations.
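For non-persistent chemicals measured as urinary metabolites, a biomonitoring equivalent is often approximated by a simple steady-state mass balance rather than a full PBK model. The sketch below illustrates that reasoning only; the function name and all numerical inputs are illustrative placeholders, not values used or agreed in the study.

# Steady-state mass-balance sketch: convert an external guidance value (TDI)
# into the urinary concentration expected if intake equals the TDI.
# All numbers are illustrative placeholders.
def urinary_be(tdi_ug_per_kg_day, body_weight_kg, urinary_excretion_fraction,
               metabolite_to_parent_mass_ratio, urine_volume_l_per_day):
    """Urinary concentration (ug/L) of the biomarker matching intake at the TDI."""
    excreted_ug_per_day = (tdi_ug_per_kg_day * body_weight_kg
                           * urinary_excretion_fraction
                           * metabolite_to_parent_mass_ratio)
    return excreted_ug_per_day / urine_volume_l_per_day

# e.g. TDI 50 ug/kg bw/day, 70 kg adult, 60% of the dose excreted as the
# measured metabolite, metabolite/parent molecular-mass ratio 0.8, 1.6 L urine/day
print(round(urinary_be(50, 70, 0.6, 0.8, 1.6), 1), "ug/L")

A generic PBK model replaces this single mass-balance step with a description of absorption, distribution, metabolism and excretion over time, which is where the uncertainties around simulated metabolite concentrations arise.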
Subjects
Biological Monitoring, Environmental Monitoring, Child, Humans, No-Observed-Adverse-Effect Level, Reference Values, Risk Assessment
ABSTRACT
As the basis for managing the risks of chemical exposure, the Chemical Risk Assessment (CRA) process can impact a substantial part of the economy, the health of hundreds of millions of people, and the condition of the environment. However, the number of properly assessed chemicals falls short of societal needs due to a lack of experts for evaluation, the interference of third-party interests, and the sheer volume of potentially relevant information on chemicals from disparate sources. In order to explore ways in which computational methods may help overcome this discrepancy between the number of chemical risk assessments required on the one hand and the number and adequacy of assessments actually being conducted on the other, the European Commission's Joint Research Centre organised a workshop on Artificial Intelligence for Chemical Risk Assessment (AI4CRA). The workshop identified a number of areas where Artificial Intelligence could potentially increase the number and quality of regulatory risk management decisions based on CRA: process simulation, evaluation support, problem identification, facilitation of collaboration, finding experts, evidence gathering, systematic review, knowledge discovery, and the building of cognitive models. Although these areas are interconnected, they are organised and discussed under two main themes: the scientific-technical process, and social aspects and the decision-making process.
ABSTRACT
BACKGROUND: In light of the vulnerability of the developing brain, mixture risk assessment (MRA) for the evaluation of developmental neurotoxicity (DNT) should be implemented, since infants and children are co-exposed to more than one chemical at a time. One possible approach to tackle MRA could be to cluster DNT chemicals in a mixture on the basis of their mode of action (MoA) into 'similar' and 'dissimilar', but still contributing to the same adverse outcome, and to anchor DNT assays to common key events (CKEs) identified in DNT-specific adverse outcome pathways (AOPs). Moreover, the use of human in vitro models, such as induced pluripotent stem cell (hiPSC)-derived neuronal and glial cultures, would enable mechanistic understanding of chemically induced adverse effects, avoiding species extrapolation. METHODS: HiPSC-derived neural progenitors differentiated into mixed cultures of neurons and astrocytes were used to assess the effects of acute (3 days) and repeated dose (14 days) treatments with single chemicals and with mixtures of chemicals belonging to different classes (i.e., lead(II) chloride and methylmercury chloride (heavy metals), chlorpyrifos (pesticide), bisphenol A (organic compound and endocrine disrupter), valproic acid (drug), and PCB138 (persistent organic pollutant and endocrine disrupter)), all of which are associated with cognitive deficits, including learning and memory impairment, in children. The selected chemicals were grouped based on their MoA into 'similar' and 'dissimilar' MoA compounds, and their effects on synaptogenesis, neurite outgrowth, and brain-derived neurotrophic factor (BDNF) protein levels, identified as CKEs in currently available AOPs relevant to DNT, were evaluated by immunocytochemistry and high-content imaging analysis. RESULTS: Chemicals acting through a similar MoA (i.e., alteration of BDNF levels), at non-cytotoxic (IC20/100), very low toxic (IC5), or moderately toxic (IC20) concentrations, induce DNT effects in mixtures, as shown by an increased number of neurons, impairment of neurite outgrowth and synaptogenesis (the most sensitive endpoint, as confirmed by mathematical modelling) and increased BDNF levels, to a certain extent reproducing the autism-like cellular changes observed in the brains of autistic children. CONCLUSIONS: Our findings suggest that the use of human iPSC-derived mixed neuronal/glial cultures applied to a battery of assays anchored to key events of an AOP network represents a valuable approach to identify mixtures of chemicals with the potential to cause learning and memory impairment in children.
Subjects
Adverse Outcome Pathways, Environmental Pollutants/toxicity, Neurotoxicity Syndromes/etiology, Neurotoxins/toxicity, Endocrine Disruptors/toxicity, Humans, Induced Pluripotent Stem Cells/drug effects, Heavy Metals/toxicity, Neural Stem Cells/drug effects, Pesticides/toxicity, Polychlorinated Biphenyls/toxicity, Risk Assessment, Toxicity Tests
ABSTRACT
Canine degenerative lumbosacral stenosis (DLSS) is a syndrome of low back pain, with or without neurologic dysfunction, associated with compression of the cauda equina. It occurs most commonly in medium- to large-breed dogs of middle to older age, with German Shepherds and working dogs predisposed. Diagnosis is based on a combination of clinical signs, advanced imaging and the exclusion of other differential diagnoses. The volume of the intervertebral foramina at the lumbosacral junction is naturally reduced in extension, but degenerative changes lead to a more marked reduction that can impinge on the L7 nerve roots. Evidence on which to base treatment decisions for dogs with DLSS is lacking. However, surgical intervention may be indicated in dogs that do not respond to conservative management, or in working dogs whose duties prevent lifestyle adjustments. Improvements in electrodiagnosis and novel intradiscal treatments may improve the management of DLSS in the future.
ABSTRACT
OBJECTIVES: There is growing evidence that single substances present below their individual effect thresholds may still contribute to combined effects. In component-based mixture risk assessment (MRA), these risks can be addressed using information on the mixture components. This is, however, often hampered by the limited availability of ecotoxicity data. Here, the possible use of ecotoxicological threshold concentrations of no concern (i.e. the 5th percentile of a statistical distribution of ecotoxicological values) to fill data gaps in MRA is investigated. METHODS: For chemicals without available aquatic toxicity data, ecotoxicological threshold concentrations of no concern were derived from Predicted No Effect Concentration (PNEC) distributions and from chemical toxicity distributions, using the EnviroTox tool, with and without considering the chemical mode of action. For exposure, chemical monitoring data from European rivers were used to illustrate four realistic co-exposure scenarios. Based on those monitoring data and on available ecotoxicity data, or threshold concentrations when no data were available, Risk Quotients for individual chemicals were calculated and then combined into a mixture Risk Quotient (RQmix). RESULTS: A risk was identified in two of the four scenarios. Threshold concentrations contributed between 24% and 95% of the overall RQmix and thus have a large impact on the predicted mixture risk. They should therefore be used to fill data gaps for only a limited number of chemicals in a mixture. The use of mode of action information to derive more specific threshold values could be a helpful refinement in some cases.
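The sketch below spells out the RQmix calculation described above: each component contributes a risk quotient equal to its measured concentration divided by its effect threshold (a PNEC, or a threshold concentration of no concern where no data exist), and the quotients are summed. The concentrations shown are illustrative placeholders, not the European monitoring data used in the study.

# Component-based mixture risk quotient: RQ_i = MEC_i / threshold_i, RQmix = sum(RQ_i).
def mixture_risk_quotient(components):
    """components: iterable of (measured conc., effect threshold) pairs in the same units."""
    rqs = [mec / threshold for mec, threshold in components]
    return rqs, sum(rqs)

components = [
    (0.10, 1.0),   # component with an experimental PNEC (ug/L)
    (0.05, 0.5),   # another data-rich component
    (0.02, 0.1),   # data-poor component assessed with a threshold of no concern
]
rqs, rq_mix = mixture_risk_quotient(components)
print("individual RQs:", [round(rq, 3) for rq in rqs])
print("RQmix:", round(rq_mix, 3), "-", "risk indicated" if rq_mix > 1 else "no risk indicated")

Because conservative thresholds of no concern tend to be small, components assessed with them can dominate the sum, which is why their use needs to be limited to a few data-poor components per mixture.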
ABSTRACT
In silico chemical safety assessment can support the evaluation of hazard and risk following potential exposure to a substance. A symposium identified a number of opportunities and challenges in implementing in silico methods, such as quantitative structure-activity relationships (QSARs) and read-across, to assess the potential harm of a substance in a variety of exposure scenarios, e.g. pharmaceuticals, personal care products, and industrial chemicals. To initiate the process of in silico safety assessment, clear and unambiguous problem formulation is required to provide the context for these methods. The approaches must be built on data of defined quality, while acknowledging the possibility of novel data resources tapping into ongoing progress in data sharing. Models need to be developed that cover appropriate toxicity and kinetic endpoints and that are documented appropriately, with defined uncertainties. The application and implementation of in silico models in chemical safety requires a flexible technological framework that enables the integration of multiple strands of data and evidence. The findings of the symposium allowed priorities to be identified for progressing in silico chemical safety assessment towards the animal-free assessment of chemicals.
ABSTRACT
Cancer is a key public health concern, being the second leading cause of worldwide morbidity and mortality after cardiovascular diseases. At the global level, cancer prevalence, incidence and mortality rates are increasing. These trends are not fully explained by a growing and ageing population: marked regional and socioeconomic disparities, lifestyle factors, the resources dedicated to preventive medicine, and the occupational and environmental control of hazardous chemicals all play a role. While it is difficult to establish the contribution of chemical exposure to the societal burden of cancer, a number of measures can be taken to better assess the carcinogenic properties of chemicals and manage their risks. This paper discusses how these measures can be informed not only by the traditional data streams of regulatory toxicology, but also by new toxicological assessment methods, along with indicators of public health status based on biomonitoring. These diverse evidence streams have the potential to form the basis of an integrated and more effective approach to cancer prevention.
Subjects
Carcinogenicity Tests/methods, Carcinogens/toxicity, Environmental Exposure/adverse effects, Environmental Monitoring/methods, Hazardous Substances/adverse effects, Public Health/methods, Animals, Carcinogenesis/chemically induced, Humans, Mice, Rats
ABSTRACT
This paper summarizes current challenges, the potential use of novel scientific methodologies, and ways forward in the risk assessment and risk management of mixtures. Methodologies to address mixtures have generally been agreed; however, several data and methodological gaps remain to be addressed. New approach methodologies can support the filling of knowledge gaps on the toxicity and mode(s) of action of individual chemicals. (Bio)monitoring, modeling, and better data sharing will support the derivation of more realistic co-exposure scenarios. As knowledge and data gaps often hamper an in-depth assessment of specific chemical mixtures, the option of taking account of possible mixture effects in single-substance risk assessments is briefly discussed. To allow risk managers to take informed decisions, transparent documentation of assumptions and related uncertainties is recommended, indicating their potential impact on the assessment. Considering the large number of possible combinations of chemicals in mixtures, prioritization is needed so that actions first address the mixtures of highest concern and the chemicals that drive the mixture risk. As chemicals with different applications that are regulated separately might lead to similar toxicological effects, it is important to consider chemical mixtures across legislative sectors.
Subjects
Environmental Exposure, Environmental Policy, Hazardous Substances, Humans, Risk Assessment
ABSTRACT
Humans are continuously exposed to low levels of thousands of industrial chemicals, most of which are poorly characterised in terms of their potential toxicity. The new paradigm in chemical risk assessment (CRA) aims to rely on animal-free testing, with kinetics being a key determinant of toxicity when moving from traditional animal studies to integrated in vitro-in silico approaches. In a kinetically informed CRA, membrane transporters, which have been intensively studied during drug development, are an essential piece of information. However, how existing knowledge on transporters gained in the drug field can be applied to CRA is not yet fully understood. This review outlines the opportunities, challenges and existing tools for investigating chemical-transporter interactions in kinetically informed CRA without animal studies. Various environmental chemicals acting as substrates, inhibitors or modulators of transporter activity or expression have been shown to affect toxicokinetics (TK), just as drugs do. However, because pollutant concentrations in humans are often lower than those of drugs, and because exposure levels and internal chemical doses are usually not known, in contrast to drugs, new approaches are required to translate transporter data and reasoning from the drug sector to CRA. Here, the generation of in vitro chemical-transporter interaction data and the development of transporter databases and classification systems trained on chemical datasets (and not only drugs) are proposed. Furthermore, improving the use of human biomonitoring data to evaluate in vitro-in silico transporter-related predictions, and developing means to assess uncertainties, could also increase the confidence of scientists and regulators in animal-free CRA. Finally, a systematic characterisation of the transportome (quantitative monitoring of transporter abundance, activity and maintenance over time) would reinforce confidence in the use of experimental transporter/barrier systems, as well as in the established cell-based toxicological assays currently used for CRA.
Subjects
Animal Testing Alternatives/methods, Environmental Pollutants/toxicity, Membrane Transport Proteins/metabolism, Risk Assessment/methods, Environmental Monitoring, Humans, Kinetics
ABSTRACT
The Threshold of Toxicological Concern (TTC) is an important risk assessment tool that establishes acceptable low-level exposure values for chemicals with limited toxicological data. One of the logical next steps in the continued evolution of the TTC is to develop the concept further so that it is representative of internal exposure (a TTC based on plasma concentration). An internal TTC (iTTC) would provide threshold values that could be used in exposure-based safety assessments. As part of a Cosmetics Europe (CosEu) research program, CosEu has initiated a project working towards the development of iTTCs that can be used for human safety assessment. Knowing that the development of an iTTC is an ambitious and broad-spanning topic, CosEu organized a Working Group comprising a balance of stakeholders (the cosmetics and chemical industries, the EPA, the JRC and academia) with relevant experience and expertise, and held a workshop to critically evaluate the requirements for establishing an iTTC. Outcomes from the workshop included an evaluation of the current state of the science for iTTC, the overall iTTC strategy, the selection of chemical databases, the capture and curation of chemical information, ADME and repeat dose data, expected challenges, as well as next steps and ongoing work.
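To give a feel for what an internal threshold means in practice, the sketch below relates an external exposure threshold to a steady-state plasma concentration using elementary kinetics (Css ≈ F × dose rate / CL). This is only one way such a conversion might be reasoned about and is not the Working Group's agreed methodology; the function name and all values are hypothetical placeholders.

# Relate an external threshold (ug/kg bw/day) to a steady-state plasma
# concentration via Css = F * dose_rate / CL. Illustrative values only.
def steady_state_plasma_conc(external_threshold_ug_per_kg_day, body_weight_kg,
                             oral_bioavailability, clearance_l_per_day):
    dose_rate_ug_per_day = external_threshold_ug_per_kg_day * body_weight_kg
    return oral_bioavailability * dose_rate_ug_per_day / clearance_l_per_day

# e.g. 1.5 ug/kg bw/day external threshold, 70 kg adult, F = 0.5, CL = 500 L/day
print(round(steady_state_plasma_conc(1.5, 70, 0.5, 500), 3), "ug/L plasma")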
Subjects
Cosmetics/toxicity, Animals, Cosmetics/adverse effects, Cosmetics/metabolism, Europe, Humans, Risk Assessment
ABSTRACT
Costs and scientific and ethical concerns related to animal tests for regulatory decision-making have stimulated the development of alternative methods. When applying alternative approaches, kinetics have been identified as a key element to consider. Membrane transporters affect the kinetic processes of absorption, distribution, metabolism and excretion (ADME) of various compounds, such as drugs or environmental chemicals. Pharmaceutical scientists have therefore intensively studied transporters that affect drug efficacy and safety. Besides pharmacokinetics, transporters are considered a major determinant of toxicokinetics, potentially representing an essential piece of information in chemical risk assessment. To capture the applicability of transporter data for kinetics-based risk assessment in non-pharmaceutical sectors, the EU Reference Laboratory for Alternatives to Animal Testing (EURL ECVAM) created a survey with a view to identifying the improvements needed when using in vitro and in silico methods. Seventy-three participants, from different sectors and with various kinds of expertise, completed the survey. The results revealed that transporters are investigated mainly during drug development, but also for the risk assessment of food and feed contaminants, industrial chemicals, cosmetics and nanomaterials, and in the context of environmental toxicology, by applying both in vitro and in silico tools. However, for chemical risk assessment to rely only on alternative methods, it is critical that the data generated by in vitro and in silico methods have scientific integrity, are reproducible and are of high quality, so that they are trusted by decision makers and used by industry. In line with this, the respondents identified various challenges related to the interpretation and use of transporter data from non-animal methods. Overall, it was determined that a combined, mechanistically anchored in vitro-in silico approach, validated against available human data, would increase confidence in using transporter data within an animal-free risk assessment paradigm. Finally, respondents involved primarily in fundamental research expressed lower confidence in non-animal studies to unravel complex transporter mechanisms.
Subjects
Animal Testing Alternatives, Biomedical Research, Risk Assessment, Animals, Cattle, Computer Simulation, Female, Humans, Lactation, Membrane Transport Proteins, Mice, Rats
ABSTRACT
Currently, the identification of chemicals with the potential to induce developmental neurotoxicity (DNT) is based on animal testing. Since systematic DNT testing is not a standard regulatory requirement within EU or US chemical safety legislation, it is only performed as higher-tier testing, triggered by chemical structure-activity relationships or by evidence of neurotoxicity in systemic acute or repeated dose toxicity studies. However, these triggers are rarely used and, in addition, do not always serve as reliable indicators of DNT, as they are generally based on observations in adult rodents. Therefore, there is a pressing need to develop alternative methodologies that can reliably support the identification of DNT triggers and more rapidly and cost-effectively support the identification and characterization of chemicals with DNT potential. We propose to incorporate mechanistic knowledge and data derived from in vitro studies to support various regulatory applications, including: (a) the identification of potential DNT triggers, (b) initial chemical screening and prioritization, (c) hazard identification and characterization, (d) chemical biological grouping, and (e) assessment of exposure to chemical mixtures. Ideally, currently available cellular neuronal/glial models derived from human induced pluripotent stem cells (hiPSCs) should be used, as they allow evaluation of chemical impacts on key neurodevelopmental processes by reproducing different windows of exposure during human brain development. A battery of DNT in vitro test methods based on hiPSCs could generate valuable mechanistic data, speeding up the evaluation of the thousands of compounds present in industrial, agricultural and consumer products that lack safety data on DNT potential.
Subjects
Nervous System/drug effects, Neurogenesis/drug effects, Neurons/drug effects, Neurotoxicity Syndromes/etiology, Toxicity Tests, Toxicology/methods, Animal Testing Alternatives, Animals, Cultured Cells, Dose-Response Relationship (Drug), Humans, Induced Pluripotent Stem Cells/drug effects, Induced Pluripotent Stem Cells/metabolism, Induced Pluripotent Stem Cells/pathology, Nervous System/embryology, Nervous System/metabolism, Neurons/metabolism, Neurons/pathology, Neurotoxicity Syndromes/embryology, Neurotoxicity Syndromes/metabolism, Policy Making, Quantitative Structure-Activity Relationship, Risk Assessment, Toxicology/legislation & jurisprudence
ABSTRACT
OBJECTIVE: This article aims to report the medium-term clinical outcome and to assess the persistence of enlargement of the lumbosacral lateral intervertebral neurovascular foramen, using computed tomography (CT) volumetric analysis, in dogs following lateral foraminotomy. MATERIALS: Six dogs that underwent lumbosacral lateral foraminotomy on one or both sides were evaluated with CT prior to surgery, immediately postoperatively (n = 2) and at 12 to 44 months of follow-up. Five of the six dogs had successful clinical outcomes, with alleviation of pain and increased levels of activity according to subjective assessment. Immediate postoperative CT volumetric analysis of the lateral intervertebral neurovascular foramina in two dogs indicated a 650 to 800% increase in volume in extension achieved by foraminotomy (four foramina). At subsequent follow-up, bone regrowth had occurred with a reduction in foraminal volume, though in both dogs foraminal volume remained higher than the preoperative values. Follow-up CT at a median of 24 months postoperatively indicated a mean 335% increase in the volume of the lumbosacral lateral intervertebral neurovascular foramina in extension compared with the preoperative foraminal volume. The follow-up volume was substantially greater than the presurgical volume in four of the six dogs. CLINICAL SIGNIFICANCE: In this limited case series, lateral foraminotomy achieved persistent enlargement of the lumbosacral lateral intervertebral neurovascular foramen in the medium term, but osseous regrowth at the site was demonstrated, which may limit the effectiveness of lateral foraminotomy in the longer term. One of two working dogs had recurrent clinical signs that necessitated further surgery.
Subjects
Dog Diseases/surgery, Foraminotomy/veterinary, Lumbosacral Region/pathology, Spinal Stenosis/veterinary, Animals, Dog Diseases/diagnostic imaging, Dog Diseases/pathology, Dogs, Female, Foraminotomy/methods, Lumbosacral Region/diagnostic imaging, Lumbosacral Region/surgery, Male, Spinal Stenosis/diagnostic imaging, Spinal Stenosis/pathology, Spinal Stenosis/surgery, X-Ray Computed Tomography/veterinary, Treatment Outcome
ABSTRACT
We describe and illustrate a workflow for chemical safety assessment that completely avoids animal testing. The workflow, which was developed within the SEURAT-1 initiative, is designed to be applicable to cosmetic ingredients as well as to other types of chemicals, e.g. active ingredients in plant protection products, biocides or pharmaceuticals. The aim of this work was to develop a workflow to assess chemical safety without relying on any animal testing, but instead by constructing a hypothesis based on existing data, in silico modelling and biokinetic considerations, followed by targeted non-animal testing. For illustrative purposes, we consider a hypothetical new ingredient x as a new component in a body lotion formulation. The workflow is divided into tiers in which points of departure are established through in vitro testing and in silico prediction, as the basis for estimating a safe external dose in a repeated use scenario. The workflow includes a series of possible exit (decision) points, with increasing levels of confidence, based on the sequential application of the Threshold of Toxicological Concern (TTC) approach and read-across, followed by an "ab initio" assessment, in which chemical safety is determined entirely by new in vitro testing and in vitro to in vivo extrapolation by means of mathematical modelling. We believe that this workflow could be applied as a tool to inform targeted and toxicologically relevant in vitro testing, where necessary, and to gain confidence in safety decision making without the need for animal testing.
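The "ab initio" exit of such a workflow rests on reverse dosimetry: an in vitro point of departure (a concentration) is converted into an external dose by scaling with the plasma concentration predicted for a unit oral dose. The sketch below illustrates only that conversion step, under an assumption of linear kinetics; the numbers are placeholders, not outputs of the SEURAT-1 case study.

# Reverse dosimetry (in vitro to in vivo extrapolation) sketch: the oral dose
# that would produce a steady-state plasma concentration equal to the in vitro
# point of departure, assuming Css scales linearly with dose.
def oral_equivalent_dose(in_vitro_pod_uM, css_uM_per_mg_per_kg_day):
    """Oral dose (mg/kg/day) predicted to match the in vitro POD at steady state."""
    return in_vitro_pod_uM / css_uM_per_mg_per_kg_day

# e.g. in vitro POD of 3 uM and a predicted Css of 1.2 uM per 1 mg/kg/day
print(round(oral_equivalent_dose(3.0, 1.2), 2), "mg/kg/day before assessment factors")

The resulting dose estimate is then compared with the predicted external exposure from the use scenario, with assessment factors applied to account for the uncertainties in the extrapolation.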
ABSTRACT
Physiologically based kinetic (PBK) models are used widely throughout a number of working sectors, including academia and industry, to provide insight into the dosimetry related to observed adverse health effects in humans and other species. Use of these models has increased over the last several decades, especially in conjunction with emerging alternatives to animal testing, such as in vitro studies and data-driven in silico quantitative structure-activity relationship (QSAR) predictions. Experimental information derived from these new approach methods can be used as input for model parameters and allows for increased confidence in models for chemicals that lack in vivo data for model calibration. Despite significant advancements in good modelling practice (GMP) for model development and evaluation, there remains some reluctance among regulatory agencies to use such models during the risk assessment process. Here, the results of a survey disseminated to the modelling community are presented, documenting the frequency of use and the applications of PBK models in science and in regulatory submissions. Additionally, the survey was designed to identify a network of investigators involved in PBK modelling and knowledgeable of GMP, so that they might be contacted in the future for peer review of PBK models, especially with regard to vetting the models to such a degree as to gain greater acceptance for regulatory purposes.
Subjects
Drug Industry/methods, Biological Models, Pharmacology/methods, Risk Assessment/methods, Animals, Dose-Response Relationship (Drug), Drug Industry/legislation & jurisprudence, Drug Industry/standards, Guidelines as Topic, Humans, In Vitro Techniques/methods, In Vitro Techniques/standards, Pharmacology/legislation & jurisprudence, Pharmacology/standards, Quantitative Structure-Activity Relationship, Risk Assessment/standards, Surveys and Questionnaires
ABSTRACT
Prevailing knowledge gaps in linking specific molecular changes to apical outcomes, and methodological uncertainties in the generation, storage, processing and interpretation of 'omics data, limit the application of 'omics technologies in regulatory toxicology. Against this background, the European Centre for Ecotoxicology and Toxicology of Chemicals (ECETOC) convened a workshop, Applying 'omics technologies in chemicals risk assessment, which is reported herein. Ahead of the workshop, multi-expert teams drafted frameworks on best practices for (i) a Good Laboratory Practice-like context for collecting, storing and curating 'omics data; (ii) the processing of 'omics data; and (iii) weight-of-evidence approaches for integrating 'omics data. The workshop participants confirmed the relevance of these Frameworks for facilitating the regulatory applicability and use of 'omics data, and the workshop discussions provided input for their further elaboration. Additionally, the key objective (iv), establishing approaches to connect 'omics perturbations to phenotypic alterations, was addressed. Generally, it was considered promising to strive to link gene expression changes and pathway perturbations to the phenotype by mapping them to specific adverse outcome pathways. While further work is necessary before gene expression changes can be used to establish safe levels of substance exposure, the ECETOC workshop provided important incentives towards achieving this goal.