Results 1 - 15 of 15
1.
Transgenic Res; 19(3): 425-36, 2010 Jun.
Article in English | MEDLINE | ID: mdl-19757133

ABSTRACT

Problem formulation is the first step in environmental risk assessment (ERA) where policy goals, scope, assessment endpoints, and methodology are distilled to an explicitly stated problem and approach for analysis. The consistency and utility of ERAs for genetically modified (GM) plants can be improved through rigorous problem formulation (PF), producing an analysis plan that describes relevant exposure scenarios and the potential consequences of these scenarios. A properly executed PF assures the relevance of ERA outcomes for decision-making. Adopting a harmonized approach to problem formulation should bring about greater uniformity in the ERA process for GM plants among regulatory regimes globally. This paper is the product of an international expert group convened by the International Life Sciences Institute (ILSI) Research Foundation.


Subject(s)
Environment , Plants, Genetically Modified/adverse effects , Research Design , Risk Assessment/methods , Expert Testimony , Government Regulation , Public Policy
2.
Crit Rev Food Sci Nutr; 49(8): 682-9, 2009 Sep.
Article in English | MEDLINE | ID: mdl-19690994

ABSTRACT

The ILSI Research Foundation convened a cross-disciplinary working group to examine current approaches for assessing dose-response and identifying safe levels of intake or exposure for four categories of bioactive agents-food allergens, nutrients, pathogenic microorganisms, and environmental chemicals. This effort generated a common analytical framework-the Key Events Dose-Response Framework (KEDRF)-for systematically examining key events that occur between the initial dose of a bioactive agent and the effect of concern. Individual key events are considered with regard to factors that influence the dose-response relationship and factors that underlie variability in that relationship. This approach illuminates the connection between the processes occurring at the level of fundamental biology and the outcomes observed at the individual and population levels. Thus, it promotes an evidence-based approach for using mechanistic data to reduce reliance on default assumptions, to quantify variability, and to better characterize biological thresholds. This paper provides an overview of the KEDRF and introduces a series of four companion papers that illustrate initial application of the approach to a range of bioactive agents.
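The abstract describes the KEDRF as a chain of intermediate events linking intake of a bioactive agent to the effect of concern. The sketch below is an illustrative reading of that idea, not the published framework: it assumes a hypothetical logistic dose-response for each key event and treats the apical effect as requiring every event in the chain; all function names and parameter values are invented for demonstration.

import math

def key_event_probability(dose, threshold, steepness):
    # Logistic dose-response for a single key event (hypothetical functional form).
    return 1.0 / (1.0 + math.exp(-steepness * (dose - threshold)))

def apical_effect_probability(dose, key_events):
    # The effect of concern requires every key event in the causal chain to occur.
    p = 1.0
    for threshold, steepness in key_events:
        p *= key_event_probability(dose, threshold, steepness)
    return p

# Three sequential key events with different thresholds and steepness values.
chain = [(5.0, 0.8), (12.0, 0.5), (20.0, 0.3)]
for dose in (1, 10, 25, 50):
    print(f"dose={dose:>3}  P(effect)={apical_effect_probability(dose, chain):.4f}")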


Subject(s)
Allergens/adverse effects , Environmental Pollutants/adverse effects , Food Microbiology , Food , Algorithms , Dose-Response Relationship, Drug , Food Hypersensitivity , Humans , Uncertainty
3.
Crit Rev Food Sci Nutr; 49(8): 690-707, 2009 Sep.
Article in English | MEDLINE | ID: mdl-19690995

ABSTRACT

The existence of thresholds for toxicants is a matter of debate in chemical risk assessment and regulation. Current risk assessment methods are based on the assumption that, in the absence of sufficient data, carcinogenesis does not have a threshold, while noncarcinogenic endpoints are assumed to be thresholded. Advances in our fundamental understanding of the events that underlie toxicity are providing opportunities to address these assumptions about thresholds. A key events dose-response analytic framework was used to evaluate three aspects of toxicity. The first section illustrates how a fundamental understanding of the mode of action for the hepatic toxicity and the hepatocarcinogenicity of chloroform in rodents can replace the assumption of low-dose linearity. The second section describes how advances in our understanding of the molecular aspects of carcinogenesis allow us to consider the critical steps in genotoxic carcinogenesis in a key events framework. The third section deals with the case of endocrine disrupters, where the most significant question regarding thresholds is the possible additivity to an endogenous background of hormonal activity. Each of the examples suggests that current assumptions about thresholds can be refined. Understanding inter-individual variability in the events involved in toxicological effects may enable a true population threshold(s) to be identified.
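For readers unfamiliar with the two default assumptions the abstract contrasts, the toy calculation below shows how a linear no-threshold extrapolation and a thresholded dose-response diverge at low doses. The slope, threshold, and dose values are arbitrary illustrative numbers, not figures from the paper.

def linear_no_threshold(dose, slope=1e-3):
    # Default assumption for carcinogens: added risk proportional to dose, no threshold.
    return slope * dose

def threshold_model(dose, threshold=10.0, slope=1e-3):
    # Default assumption for noncancer endpoints: no added risk below a biological threshold.
    return max(0.0, dose - threshold) * slope

for dose in (1.0, 5.0, 10.0, 50.0):
    print(f"dose={dose:5.1f}  LNT risk={linear_no_threshold(dose):.4f}  "
          f"threshold risk={threshold_model(dose):.4f}")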


Subject(s)
Carcinogens/pharmacokinetics , Carcinogens/toxicity , Food Contamination , Algorithms , DNA Damage , DNA Replication/drug effects , Dose-Response Relationship, Drug , Endocrine Disruptors/pharmacokinetics , Endocrine Disruptors/toxicity , Public Health , Risk Assessment , Socioeconomic Factors
4.
Food Chem Toxicol; 45(5): 759-96, 2007 May.
Article in English | MEDLINE | ID: mdl-17215066

ABSTRACT

One of the principal applications of toxicology data is to inform risk assessments and support risk management decisions that are protective of human health. Ideally, a risk assessor would have available all of the relevant information on (a) the toxicity profile of the agent of interest; (b) its interactions with living systems; and (c) the known or projected exposure scenarios: to whom, how much, by which route(s), and how often. In practice, however, complete information is seldom available. Nonetheless, decisions still must be made. Screening-level assays and tools can provide support for many aspects of the risk assessment process, as long as the limitations of the tools are understood and to the extent that the added uncertainty the tools introduce into the process can be characterized and managed. Use of these tools for decision-making may be an end in itself for risk assessment and decision-making or a preliminary step to more extensive data collection and evaluation before assessments are undertaken or completed and risk management decisions made. This paper describes a framework for the application of screening tools for human health decision-making, although with some modest modification, it could be made applicable to environmental settings as well. The framework consists of problem formulation, development of a screening strategy based on an assessment of critical data needs, and a data analysis phase that employs weight-of-evidence criteria and uncertainty analyses, and leads to context-based decisions. Criteria for determining the appropriate screening tool(s) have been identified. The choice and use of the tool(s) will depend on the question and the level of uncertainty that may be appropriate for the context in which the decision is being made. The framework is iterative, in that users may refine the question(s) as they proceed. Several case studies illustrate how the framework may be used effectively to address specific questions for any endpoint of toxicity.
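As a rough illustration of the iterative loop the abstract outlines (formulate the problem, apply screening tools, weigh the evidence with its uncertainty, then decide or refine), here is a minimal sketch. The tool weights, decision cutoff, and refinement step are invented placeholders, not criteria from the published framework.

def weigh_evidence(findings):
    # findings: list of (weight, supports_concern) pairs from screening tools.
    total = sum(weight for weight, _ in findings)
    concern = sum(weight for weight, supports in findings if supports)
    return concern / total if total else 0.0

def screen(question, findings, decide_at=0.7, max_rounds=3):
    for _ in range(max_rounds):
        score = weigh_evidence(findings)
        if score >= decide_at:
            return f"{question}: potential concern (WoE={score:.2f})"
        if score <= 1.0 - decide_at:
            return f"{question}: low concern (WoE={score:.2f})"
        # Too uncertain either way: refine the question / gather more data
        # (represented here by a placeholder additional finding).
        findings = findings + [(1.0, False)]
    return f"{question}: unresolved after screening (WoE={weigh_evidence(findings):.2f})"

print(screen("Endpoint X, chemical Y", [(2.0, True), (1.0, False)]))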


Subject(s)
Decision Making , Environmental Exposure/prevention & control , Environmental Health , Risk Assessment , Animals , Humans , Risk Management , United States
5.
Part Fibre Toxicol; 2: 8, 2005 Oct 06.
Article in English | MEDLINE | ID: mdl-16209704

ABSTRACT

The rapid proliferation of many different engineered nanomaterials (defined as materials designed and produced to have structural features with at least one dimension of 100 nanometers or less) presents a dilemma to regulators regarding hazard identification. The International Life Sciences Institute Research Foundation/Risk Science Institute convened an expert working group to develop a screening strategy for the hazard identification of engineered nanomaterials. The working group report presents the elements of a screening strategy rather than a detailed testing protocol. Based on an evaluation of the limited data currently available, the report presents a broad data gathering strategy applicable to this early stage in the development of a risk assessment process for nanomaterials. Oral, dermal, inhalation, and injection routes of exposure are included recognizing that, depending on use patterns, exposure to nanomaterials may occur by any of these routes. The three key elements of the toxicity screening strategy are: Physicochemical Characteristics, In Vitro Assays (cellular and non-cellular), and In Vivo Assays. There is a strong likelihood that biological activity of nanoparticles will depend on physicochemical parameters not routinely considered in toxicity screening studies. Physicochemical properties that may be important in understanding the toxic effects of test materials include particle size and size distribution, agglomeration state, shape, crystal structure, chemical composition, surface area, surface chemistry, surface charge, and porosity. In vitro techniques allow specific biological and mechanistic pathways to be isolated and tested under controlled conditions, in ways that are not feasible in in vivo tests. Tests are suggested for portal-of-entry toxicity for lungs, skin, and the mucosal membranes, and target organ toxicity for endothelium, blood, spleen, liver, nervous system, heart, and kidney. Non-cellular assessment of nanoparticle durability, protein interactions, complement activation, and pro-oxidant activity is also considered. Tier 1 in vivo assays are proposed for pulmonary, oral, skin and injection exposures, and Tier 2 evaluations for pulmonary exposures are also proposed. Tier 1 evaluations include markers of inflammation, oxidant stress, and cell proliferation in portal-of-entry and selected remote organs and tissues. Tier 2 evaluations for pulmonary exposures could include deposition, translocation, and toxicokinetics and biopersistence studies; effects of multiple exposures; potential effects on the reproductive system, placenta, and fetus; alternative animal models; and mechanistic studies.
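One way to picture the three elements of the screening strategy (physicochemical characterization, in vitro assays, tiered in vivo assays) is as a single record per test material. The data structure below is only an illustrative paraphrase of the abstract; field names and all example values are placeholders.

from dataclasses import dataclass, field

@dataclass
class NanomaterialScreen:
    name: str
    # Element 1: physicochemical characteristics
    size_nm: float
    surface_area_m2_g: float
    surface_charge_mv: float
    agglomeration_state: str
    # Element 2: in vitro assays (cellular and non-cellular), assay -> outcome
    in_vitro: dict = field(default_factory=dict)
    # Element 3: in vivo assays by tier (Tier 2 proposed for pulmonary exposures)
    tier1_routes: tuple = ("pulmonary", "oral", "skin", "injection")
    tier2_pulmonary: tuple = ("deposition", "translocation", "toxicokinetics",
                              "multiple exposures", "reproductive effects")

material = NanomaterialScreen(
    name="example nanoparticle",
    size_nm=25.0, surface_area_m2_g=50.0, surface_charge_mv=-30.0,
    agglomeration_state="agglomerated",
    in_vitro={"complement activation": "negative", "pro-oxidant activity": "positive"},
)
print(material.name, "->", ", ".join(material.tier1_routes))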

6.
Adv Exp Med Biol; 561: 117-25, 2005.
Article in English | MEDLINE | ID: mdl-16438294

ABSTRACT

A pharmacokinetic (PBPK) model has been developed for acrylamide (AMD) and its oxidative metabolite, glycidamide (GLY), in the rat based on available information. Despite gaps and limitations to the database, model parameters have been estimated to provide a relatively consistent description of the kinetics of acrylamide and glycidamide using a single set of values (with minor adjustments in some cases). Future kinetic and mechanistic studies will need to focus on the collection of key data for refining certain model parameters and for model validation, as well as for conducting studies that elucidate the mechanism of action. Development of a validated human AMD/GLY PBPK model capable of predicting target tissue doses at relevant dietary AMD exposures, in combination with expanding data on modes of action, should allow for a substantive improvement in the risk assessment of acrylamide in food.
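A full PBPK model resolves tissue compartments, blood flows, and partition coefficients; the fragment below is only a minimal one-compartment parent/metabolite sketch showing the kind of kinetic bookkeeping such a model formalizes. The rate constants, dose, and forward-Euler integration are invented for illustration and are not the published AMD/GLY model.

def simulate(dose_mg=1.0, k_ox=0.3, k_elim_amd=0.2, k_elim_gly=0.4,
             dt=0.01, hours=24.0):
    # Amounts of parent compound (AMD) and oxidative metabolite (GLY).
    amd, gly = dose_mg, 0.0
    t, history = 0.0, []
    while t <= hours:
        history.append((t, amd, gly))
        d_amd = -(k_ox + k_elim_amd) * amd          # oxidation + elimination
        d_gly = k_ox * amd - k_elim_gly * gly       # formation - elimination
        amd, gly, t = amd + d_amd * dt, gly + d_gly * dt, t + dt
    return history

for t, amd, gly in simulate()[::400]:               # print every 4 simulated hours
    print(f"t={t:5.1f} h  AMD={amd:.4f} mg  GLY={gly:.4f} mg")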


Subject(s)
Acrylamide/pharmacokinetics , Acrylamide/toxicity , Epoxy Compounds/pharmacokinetics , Risk Assessment/methods , Animals , Area Under Curve , DNA Adducts , Epoxy Compounds/chemistry , Humans , Kinetics , Models, Chemical , Models, Statistical , Oxygen/chemistry , Rats , Research Design , Tissue Distribution
7.
Environ Health Perspect; 111(12): 1524-6, 2003 Sep.
Article in English | MEDLINE | ID: mdl-12948894

ABSTRACT

Characterization of children's health risks from environmental exposures requires special consideration of life-stage-specific periods of unique susceptibility in relation to childhood activities, behaviors, and intakes. At a workshop in Stowe, Vermont, in mid-summer 2001, 54 experts developed a systematic conceptual framework for assessing the impact of these factors on children's risks. This meeting report provides a brief overview of the workshop.


Subject(s)
Child Welfare , Environmental Exposure , Environmental Pollutants/toxicity , Child , Child Behavior , Child, Preschool , Education , Environmental Monitoring , Humans , Infant , Infant, Newborn , Risk Assessment
8.
Environ Health Perspect; 112(2): 238-56, 2004 Feb.
Article in English | MEDLINE | ID: mdl-14754580

ABSTRACT

In recent years there has been an increasing focus in environmental risk assessment on children as a potentially susceptible population. There also has been growing recognition of the need for a systematic approach for organizing, evaluating, and incorporating the available data on children's susceptibilities in risk assessments. In this article we present a conceptual framework for assessing risks to children from environmental exposures. The proposed framework builds on the problem formulation → analysis → risk characterization paradigm, identifying at each phase the questions and issues of particular importance for characterizing risks to the developing organism (from conception through organ maturation). The framework is presented and discussed from the complementary perspectives of toxicokinetics and toxicodynamics.


Subject(s)
Child Welfare , Environmental Pollutants/pharmacokinetics , Environmental Pollutants/poisoning , Models, Theoretical , Child , Child Development , Child, Preschool , Humans , Infant , Infant, Newborn , Risk Assessment/methods
10.
Am J Health Syst Pharm; 68(9): 835-42, 2011 May 01.
Article in English | MEDLINE | ID: mdl-21515868

ABSTRACT

PURPOSE: The development, implementation, and evaluation of an i.v. interoperability program to advance medication safety at the bedside are described. SUMMARY: I.V. interoperability integrates intelligent infusion devices (IIDs), the bar-code-assisted medication administration system, and the electronic medication administration record system into a bar-code-driven workflow that populates provider-ordered, pharmacist-validated infusion parameters on IIDs. The purpose of this project was to improve medication safety through the integration of these technologies and decrease the potential for error during i.v. medication administration. Four key phases were essential to developing and implementing i.v. interoperability: (a) preparation, (b) i.v. interoperability pilot, (c) preliminary validation, and (d) expansion. The establishment of pharmacy involvement in i.v. interoperability resulted in two additional safety checks: pharmacist infusion rate oversight and nurse independent validation of the autoprogrammed rate. After instituting i.v. interoperability, monthly compliance to the telemetry drug library increased to a mean ± S.D. of 72.1% ± 2.1% from 56.5% ± 1.5%, and the medical-surgical nursing unit's drug library monthly compliance rate increased to 58.6% ± 2.9% from 34.1% ± 2.6% (p < 0.001 for both comparisons). The number of manual pump edits decreased with both telemetry and medical-surgical drug libraries, demonstrating a reduction from 56.9 ± 12.8 to 14.2 ± 3.9 and from 61.2 ± 15.4 to 14.7 ± 3.8, respectively (p < 0.001 for both comparisons). Through the integration and incorporation of pharmacist oversight for rate changes, the telemetry and medical-surgical patient care areas demonstrated a 32% reduction in reported monthly errors involving i.v. administration of heparin. CONCLUSION: By integrating two stand-alone technologies, i.v. interoperability was implemented to improve medication administration. Medication errors were reduced, nursing workflow was simplified, and pharmacists became involved in checking infusion rates of i.v. medications.
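The bar-code-driven safety checks described above (pharmacist-validated order, patient/medication bar-code match, independent nurse confirmation of the autoprogrammed rate) can be summarized in a short sketch. This is a hypothetical illustration only; the class, field, and function names are not the hospital system's or any vendor's actual API.

from dataclasses import dataclass

@dataclass
class ValidatedOrder:
    patient_id: str
    drug: str
    rate_ml_hr: float
    pharmacist_verified: bool

def autoprogram_pump(scanned_patient, scanned_drug, order, nurse_confirms):
    # Safety checks mirroring the described workflow, applied in sequence.
    if not order.pharmacist_verified:
        return "BLOCKED: order not pharmacist-validated"
    if (scanned_patient, scanned_drug) != (order.patient_id, order.drug):
        return "BLOCKED: bar-code mismatch"
    if not nurse_confirms(order.rate_ml_hr):
        return "BLOCKED: nurse did not confirm autoprogrammed rate"
    return f"Pump programmed: {order.drug} at {order.rate_ml_hr} mL/hr"

order = ValidatedOrder("P123", "heparin", 12.5, pharmacist_verified=True)
print(autoprogram_pump("P123", "heparin", order, nurse_confirms=lambda rate: True))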


Subject(s)
Electronic Data Processing , Medication Errors/prevention & control , Medication Systems, Hospital , Pharmaceutical Preparations/administration & dosage , Anticoagulants/administration & dosage , Anticoagulants/adverse effects , Heparin/administration & dosage , Heparin/adverse effects , Humans , Infusions, Intravenous , Nurse's Role , Pharmacists/organization & administration , Pilot Projects , Professional Role , Telemetry/methods , Workflow
11.
Food Chem Toxicol; 47(9): 2236-45, 2009 Sep.
Article in English | MEDLINE | ID: mdl-19531369

ABSTRACT

Due to ever-improving analytical capabilities, very low levels of unexpected chemicals can now be detected in foods. Although these may be toxicologically insignificant, such incidents often garner significant attention. The threshold of toxicological concern (TTC) methodology provides a scientifically defensible, transparent approach for putting low-level exposures in the context of potential risk, as a tool to facilitate prioritization of responses, including potential mitigation. The TTC method supports the establishment of tiered, health-protective exposure limits for chemicals lacking a full toxicity database, based on evaluation of the known toxicity of chemicals which share similar structural characteristics. The approach supports the view that prudent actions towards public health protection are based on evaluation of safety as opposed to detection chemistry. This paper builds on the existing TTC literature and recommends refinements that address two key areas. The first describes the inclusion of genotoxicity data as a way to refine the TTC limit for chemicals that have structural alerts for genotoxicity. The second area addresses duration of exposure. Whereas the existing TTC exposure limits assume a lifetime of exposure, human exposure to unintended chemicals in food is often only for a limited time. Recommendations are made to refine the approach for less-than-lifetime exposures.
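To make the tiered-limit idea concrete, the sketch below uses the per-day values commonly cited in the TTC literature for the Cramer structural classes and for chemicals with a genotoxicity structural alert; treat them as illustrative background, not values quoted from this paper. The duration scaling is a made-up placeholder standing in for the less-than-lifetime refinement the authors recommend, not their actual proposal.

TTC_UG_PER_DAY = {
    "genotoxic_alert": 0.15,   # structural alert for genotoxicity
    "cramer_III": 90.0,
    "cramer_II": 540.0,
    "cramer_I": 1800.0,
}

def ttc_limit(structural_class, exposure_days, lifetime_days=70 * 365):
    # Hypothetical less-than-lifetime adjustment: shorter exposures tolerate a
    # higher daily limit, capped at 10x to remain conservative.
    base = TTC_UG_PER_DAY[structural_class]
    factor = min(10.0, (lifetime_days / max(exposure_days, 1.0)) ** 0.5)
    return base * factor

print(f"Cramer III, 30-day exposure: {ttc_limit('cramer_III', 30):.0f} ug/day")
print(f"Genotoxic alert, lifetime:   {ttc_limit('genotoxic_alert', 70 * 365):.2f} ug/day")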


Subject(s)
Food Analysis/methods , Food Contamination/prevention & control , Food Supply/legislation & jurisprudence , Legislation, Food , Risk Assessment/methods , Xenobiotics/analysis , Dose-Response Relationship, Drug , Humans , Mutagens/chemistry , Mutagens/toxicity , No-Observed-Adverse-Effect Level , Structure-Activity Relationship , Xenobiotics/toxicity
12.
Crit Rev Toxicol; 37(9): 729-837, 2007.
Article in English | MEDLINE | ID: mdl-17957539

ABSTRACT

For more than three decades chronic studies in rodents have been the benchmark for assessing the potential long-term toxicity, and particularly the carcinogenicity, of chemicals. With doses typically administered for about 2 years (18 months to lifetime), the rodent bioassay has been an integral component of testing protocols for food additives, pesticides, pharmaceuticals, industrial chemicals, and all manner of byproducts and environmental contaminants. Over time, the data from these studies have been used to address an increasing diversity of questions related to the assessment of human health risks, adding complexity to study design and interpretation. An earlier ILSI RSI working group developed a set of principles for the selection of doses for chronic rodent studies (ILSI, 1997). The present report builds on that work, examining some of the issues that arise and offering new perspectives and approaches for putting the principles into practice. Dose selection is considered both from the prospective viewpoint of the choosing of dose levels for a study and from the retrospective interpretation of study results in light of the doses used. A main theme of this report is that the purposes and objectives of chronic rodent studies vary and should be clearly defined in advance. Dose placement, then, should be optimized to achieve study objectives. For practical reasons, most chronic studies today must be designed to address multiple objectives, often requiring trade-offs and innovative approaches in study design. A systematic approach to dose selection should begin with recognition that the design of chronic studies occurs in the context of a careful assessment of the accumulated scientific information on the test substance, the relevant risk management questions, priorities and mandates, and the practical limitations and constraints on available resources. A stepwise process is described. The aim is to increase insofar as possible the utility of an expensive and time-consuming experiment. The kinds of data that are most commonly needed for dose selection and for understanding the dose-related results of chronic rodent studies, particularly carcinogenicity studies, are discussed as "design/interpretation factors." They comprise both the inherent characteristics of the test substance and indicators of biological damage, perturbation or stress among the experimental animals. They may be primary toxicity endpoints, predictors or indicators of appropriate dose selection, or indicators of conditions to be avoided in dose selection. The application and interpretation of design/interpretation factors is conditioned by the study objectives-what is considered desirable will depend on the strategy for choice of doses that is being followed. The challenge is to select doses that accommodate all of the issues raised by the relevant design/interpretation factors. Three case studies are presented here that illustrate the interplay between study objectives and the design and selection of doses for chronic rodent studies. These examples also highlight issues associated with multiple plausible modes of action, multiple pathways for biotransformation of the chemical, extraneous high-dose effects, the use of modeling in dose selection, and the implications of human exposure levels. Finally, looking to the future, the report explores seven potential paradigm shifts for risk assessment that will significantly impact the design and interpretation of toxicity and carcinogenicity studies.


Subject(s)
Carcinogenicity Tests/methods , Dose-Response Relationship, Drug , Animals , Carcinogens/toxicity , Humans , Research Design , Rodentia
13.
Risk Anal; 26(1): 5-16, 2006 Feb.
Article in English | MEDLINE | ID: mdl-16492173

ABSTRACT

The article offers insights on the peer-review process as it relates to scientific and technical reports used to inform regulatory decisions. Used effectively, peer review is a powerful tool for advising organizational leaders whether the scientific foundations of their decisions can be expected to withstand scrutiny as rule-making products move through interagency reviews, public comment and stakeholder processes, congressional oversight, and judicial review. The emphasis is "heads up" rather than "how to." That is, without delving into myriad technical and administrative details, the discussion highlights nine fundamental "leadership responsibilities" that determine the nature and course of peer review.


Subject(s)
Peer Review, Research/legislation & jurisprudence , Conflict of Interest , Decision Making, Organizational , Federal Government , Leadership , Legislation, Medical , Publishing/legislation & jurisprudence , United States
14.
Regul Toxicol Pharmacol; 37(1): 105-32, 2003 Feb.
Article in English | MEDLINE | ID: mdl-12662914

ABSTRACT

The estimation and characterization of a cancer risk is grounded in the observation of tumors in humans and/or experimental animals. Increasingly, however, other kinds of data (non-tumor data) are finding application in cancer risk assessment. Metabolism and kinetics, adduct formation, genetic damage, mode of action, and biomarkers of exposure, susceptibility, and effects are examples. While these and other parameters have been studied for many important chemicals over the past 30-40 years, their use in risk assessments is more recent, and new insights and opportunities are continuing to unfold. To provide some perspective on this field, the ILSI Risk Science Institute asked a select working group to characterize the pertinent non-tumor data available for 1,3-butadiene, benzene, and vinyl chloride and to comment on the utility of these data in characterizing cancer risks. This paper presents the findings of that working group and concludes with 15 simple principles for the use of non-tumor data in cancer risk assessment.


Subject(s)
Benzene/toxicity , Butadienes/toxicity , Carcinogens/toxicity , Vinyl Chloride/toxicity , Animals , Benzene/metabolism , Benzene/pharmacokinetics , Biomarkers/analysis , Butadienes/metabolism , Butadienes/pharmacokinetics , Carcinogenicity Tests , Carcinogens/metabolism , Carcinogens/pharmacokinetics , DNA Adducts/metabolism , Humans , Mutagenicity Tests , Neoplasms/chemically induced , Risk Assessment/methods , Vinyl Chloride/metabolism , Vinyl Chloride/pharmacokinetics