ABSTRACT
Making research data findable, accessible, interoperable and reusable (FAIR) is typically hampered by a lack of skills in technical aspects of data management by data generators and a lack of resources. We developed a Template Wizard for researchers to easily create templates suitable for consistently capturing data and metadata from their experiments. The templates are easy to use and enable the compilation of machine-readable metadata to accompany data generation and align them to existing community standards and databases, such as eNanoMapper, streamlining the adoption of the FAIR principles. These templates are citable objects and are available as online tools. The Template Wizard is designed to be user friendly and facilitates using and reusing existing templates for new projects or project extensions. The wizard is accompanied by an online template validator, which allows self-evaluation of the template (to ensure mapping to the data schema and machine readability of the captured data) and transformation by an open-source parser into machine-readable formats, compliant with the FAIR principles. The templates are based on extensive collective experience in nanosafety data collection and include over 60 harmonized data entry templates for physicochemical characterization and hazard assessment (cell viability, genotoxicity, environmental organism dose-response tests, omics), as well as exposure and release studies. The templates are generalizable across fields and have already been extended and adapted for microplastics and advanced materials research. The harmonized templates improve the reliability of interlaboratory comparisons, data reuse and meta-analyses and can facilitate the safety evaluation and regulation process for (nano) materials.
Subject(s)
Nanostructures, Nanostructures/chemistry, Software, Metadata

ABSTRACT
The past few decades of managing the uncertain risks associated with nanomaterials have provided valuable insights (knowledge gaps, tools, methods, etc.) that are equally important for promoting the safe and sustainable development and use of advanced materials. Based on these insights, the current paper proposes several actions to optimize the risk and sustainability governance of advanced materials. We emphasise the importance of establishing a European approach for risk and sustainability governance of advanced materials as soon as possible, to keep up with the pace of innovation and to manage uncertainty among regulators, industry, SMEs and the public regarding the potential risks and impacts of advanced materials. Coordination of safe and sustainable advanced material research efforts, and data management according to the Findable, Accessible, Interoperable and Reusable (FAIR) principles, will enhance the generation of regulatory-relevant knowledge. This knowledge is crucial to identify whether current regulatory standardised and harmonised test methods are adequate to assess advanced materials. At the same time, there is an urgent need for responsible innovation beyond regulatory compliance; this can be promoted through the Safe and Sustainable Innovation Approach, which combines the Safe and Sustainable by Design concept with Regulatory Preparedness, supported by a trusted environment. We further recommend consolidating all efforts and networks related to the risk and sustainability governance of advanced materials in a single, easy-to-use digital portal. Given the anticipated complexity and tremendous effort required, we identified the need to establish an organisational structure dedicated to aligning the fast technological developments in advanced materials with proper risk and sustainability governance.
Involvement of multiple stakeholders in a trusted environment ensures a coordinated effort towards the safe and sustainable development, production, and use of advanced materials. The existing infrastructures and network of experts involved in the governance of nanomaterials would form a solid foundation for such an organisational structure.
Subject(s)
Nanostructures, Sustainable Development, Humans, Nanotechnology/legislation & jurisprudence, Europe

ABSTRACT
This work aimed to verify whether it is possible to extend the applicability domain (AD) of existing QSPR (Quantitative Structure-Property Relationship) models by employing a strategy involving additional quantum-chemical calculations. We selected two published QSPR models as case studies: for water solubility (logSW) and vapor pressure (logVP) of PFAS. We aimed to enlarge the set of compounds used to build the models by applying factorial planning to augment the set based on structural features (descriptors). Next, we used the COSMO-RS model to calculate logSW and logVP for the selected chemicals, which allowed us to fill gaps in the experimental data for further training of the QSPR models. We improved the published models by significantly extending the number of compounds for which theoretical predictions are reliable (i.e., extending the AD). Additionally, we performed an external validation that had not been carried out for the original models. To test the effectiveness of the AD extension, we screened 4519 PFAS from the NORMAN Database. The number of compounds outside the domain was reduced compared with the original model for both properties. Our work shows that combining physics-based methods with data-driven models can significantly improve the prediction of physicochemical properties relevant to chemical risk assessment.
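A common way to delimit a QSPR applicability domain is the leverage (Williams plot) approach, where a query compound is flagged when its leverage exceeds the warning threshold h* = 3(p + 1)/n. The sketch below illustrates this on synthetic descriptors; it is an assumption for illustration only, not necessarily the AD definition used in the cited models:

```python
import numpy as np

def leverages(X_train, X_query):
    """Leverage h_i = x_i^T (X^T X)^-1 x_i for each query compound."""
    XtX_inv = np.linalg.pinv(X_train.T @ X_train)
    return np.einsum("ij,jk,ik->i", X_query, XtX_inv, X_query)

def in_domain(X_train, X_query):
    """Williams-plot style AD check: leverage below h* = 3(p + 1)/n."""
    n, p = X_train.shape
    h_star = 3 * (p + 1) / n
    return leverages(X_train, X_query) < h_star

rng = np.random.default_rng(0)
X_train = rng.normal(size=(50, 3))                   # 50 training compounds, 3 descriptors
X_query = np.vstack([np.zeros(3), 10 * np.ones(3)])  # typical vs. structurally extreme compound
print(in_domain(X_train, X_query))                   # the extreme compound falls outside the AD
```

Extending the training set with reliable calculated values (as done here with COSMO-RS) raises n and widens the descriptor space, so more screened compounds fall below h*.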
Subject(s)
Asteraceae, Fluorocarbons, Vapor Pressure, Solubility, Water

ABSTRACT
The environmental impact on health is an inevitable by-product of human activity. Environmental health sciences is a multidisciplinary field addressing complex issues on how people are exposed to hazardous chemicals that can adversely affect the health of present and future generations. Exposure sciences and environmental epidemiology are becoming increasingly data-driven, and their efficiency and effectiveness can significantly improve by implementing the FAIR (findable, accessible, interoperable, reusable) principles for scientific data management and stewardship. This will enable data integration, interoperability and (re)use, while also facilitating the use of new and powerful analytical tools such as artificial intelligence and machine learning for the benefit of public health policy and research, development and innovation (RDI). Early research planning is critical to ensuring data are FAIR at the outset. This entails a well-informed and planned strategy concerning the identification of appropriate data and metadata to be gathered, along with established procedures for their collection, documentation, and management. Furthermore, suitable approaches must be implemented to evaluate and ensure the quality of the data. Therefore, the Europe Regional Chapter of the International Society of Exposure Science (ISES Europe) human biomonitoring working group (ISES Europe HBM WG) proposes the development of a FAIR Environment and Health Registry (FAIREHR). FAIREHR offers preregistration of studies on exposure sciences and environmental epidemiology using human biomonitoring (HBM, as a starting point) across all areas of environmental and occupational health globally. The registry is proposed to receive a dedicated web-based interface, to be electronically searchable and to be available to all relevant data providers, users and stakeholders.
Planned human biomonitoring (HBM) studies would ideally be registered before the formal recruitment of study participants. The resulting FAIREHR would contain public records of metadata such as study design, data management, an audit trail of major changes to planned methods, details of when the study will be completed, and links to resulting publications and data repositories when provided by the authors. FAIREHR would function as an integrated platform designed to cater to the needs of scientists, companies, publishers and policymakers by providing user-friendly features. The implementation of FAIREHR is expected to yield significant benefits by enabling more effective use of HBM data.
ABSTRACT
BACKGROUND: To ascertain the safe use of chemicals that are used in multiple consumer products, the aggregate human exposure arising from the combined use of multiple consumer products needs to be assessed. OBJECTIVE: In this work, the Probabilistic Aggregate Consumer Exposure Model (PACEM) is presented and discussed. PACEM is implemented in the publicly available web tool, PACEMweb, for aggregate consumer exposure assessment. METHODS: PACEM uses a person-oriented simulation method that is based on realistic product usage information obtained in surveys from several European countries. PACEM evaluates aggregate exposure in a population, considering individual use and co-use patterns as well as variation in product composition. Product usage data are included for personal care products (PCPs) and household cleaning products (HCPs). RESULTS: PACEM has been implemented in a web tool that supports broad use in research as well as regulatory risk assessment. PACEM has been evaluated in a number of applications that test and illustrate the advantages of the person-oriented modeling method. PACEM assessments have also been evaluated by comparing their results with biomonitoring information. SIGNIFICANCE: PACEM enables the assessment of realistic aggregate exposure to chemicals in consumer products. It provides detailed insight into the distribution of exposure in a population, as well as into the products that contribute the most to exposure. This allows for better-informed decision making in the risk management of chemicals. IMPACT: Realistic assessment of the total, aggregate exposure of consumers to chemicals in consumer products is necessary to guarantee the safe use of chemicals in these products. PACEMweb provides, for the first time, a publicly available tool to assist in the realistic aggregate exposure assessment of consumers to chemicals in consumer products.
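The person-oriented idea behind such a model can be sketched in a few lines: usage status and amounts stay tied to the same simulated individual, so co-use correlations are preserved in the aggregate. All product names, probabilities, amounts and concentrations below are hypothetical placeholders, not PACEM's survey data or its actual algorithm:

```python
import numpy as np

rng = np.random.default_rng(42)
n_persons = 10_000

# Hypothetical survey-derived inputs for two products (illustrative only)
products = {
    "body_lotion": {"use_prob": 0.6, "amount_g": (8.0, 0.4), "conc": 0.002},
    "deodorant":   {"use_prob": 0.8, "amount_g": (1.5, 0.3), "conc": 0.005},
}
body_weight = rng.normal(75, 12, n_persons).clip(40)  # kg

# Person-oriented simulation: each individual accumulates exposure
# across the products he or she actually uses.
aggregate = np.zeros(n_persons)
for p in products.values():
    uses = rng.random(n_persons) < p["use_prob"]
    mu, sigma = p["amount_g"]
    amount = rng.lognormal(np.log(mu), sigma, n_persons)
    aggregate += uses * amount * p["conc"] * 1000 / body_weight  # mg/kg bw/day

print(f"median: {np.median(aggregate):.3f}, "
      f"P95: {np.percentile(aggregate, 95):.3f} mg/kg bw/day")
```

The resulting population distribution, rather than a single worst-case point estimate, is what allows identifying the products that dominate aggregate exposure.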
ABSTRACT
(Quantitative) structure-activity relationship ([Q]SAR) methodologies are widely applied to predict the (eco)toxicological effects of chemicals, and their use is envisaged in different regulatory frameworks for filling data gaps of untested substances. However, their application to the risk assessment of nanomaterials is still limited, partly due to the scarcity of large and curated experimental datasets. Despite the great amount of nanosafety data produced over the last decade in international collaborative initiatives, their interpretation, integration and reuse have been hampered by several obstacles, such as poorly described (meta)data, non-standard terminology, and a lack of harmonized reporting formats and criteria. Recently, the FAIR (Findable, Accessible, Interoperable, and Reusable) principles were established to guide the scientific community in good data management and stewardship. The EU H2020 Gov4Nano project, together with other international projects and initiatives, is addressing the challenge of improving nanosafety data FAIRness to maximize their availability, understanding, exchange and, ultimately, their reuse. These efforts are largely supported by the creation of a common Nanosafety Data Interface, which connects a series of project-specific databases applying the eNanoMapper data model. A wide variety of experimental data relating to the characterization and effects of nanomaterials are stored in the database; however, the methods, protocols and parameters driving their generation are not fully mature. This article reports the progress of an ongoing case study in the Gov4Nano project on the reuse of in vitro Comet genotoxicity data, focusing on the issues and challenges encountered in their FAIRification through the eNanoMapper data model. The case study is part of an iterative process in which the FAIRification of data supports the understanding of the phenomena underlying their generation and, ultimately, improves their reusability.
ABSTRACT
Data sharing and reuse are crucial to enhance scientific progress and maximize return of investments in science. Although attitudes are increasingly favorable, data reuse remains difficult due to lack of infrastructures, standards, and policies. The FAIR (findable, accessible, interoperable, reusable) principles aim to provide recommendations to increase data reuse. Because of the broad interpretation of the FAIR principles, maturity indicators are necessary to determine the FAIRness of a dataset. In this work, we propose a reproducible computational workflow to assess data FAIRness in the life sciences. Our implementation follows principles and guidelines recommended by the maturity indicator authoring group and integrates concepts from the literature. In addition, we propose a FAIR balloon plot to summarize and compare dataset FAIRness. We evaluated the feasibility of our method on three real use cases where researchers looked for six datasets to answer their scientific questions. We retrieved information from repositories (ArrayExpress, Gene Expression Omnibus, eNanoMapper, caNanoLab, NanoCommons and ChEMBL), a registry of repositories, and a searchable resource (Google Dataset Search) via application program interfaces (API) wherever possible. With our analysis, we found that the six datasets met the majority of the criteria defined by the maturity indicators, and we showed areas where improvements can easily be reached. We suggest that use of standard schema for metadata and the presence of specific attributes in registries of repositories could increase FAIRness of datasets.
ABSTRACT
Advanced material development, including at the nanoscale, comprises costly and complex challenges coupled to ensuring human and environmental safety. Governmental agencies regulating safety have announced interest in accepting safety data generated under the collective term New Approach Methodologies (NAMs), as such technologies/approaches offer marked potential to progress the integration of safety testing measures during innovation, from idea to product launch of nanomaterials. Useful NAMs fall into eight main categories: searchable databases for grouping and read-across purposes; exposure assessment and modeling; in silico modeling of physicochemical structure and hazard data; in vitro high-throughput and high-content screening assays; dose-response assessments and modeling; analyses of biological processes and toxicity pathways; kinetics and dose extrapolation; and consideration of relevant exposure levels and biomarker endpoints. Their application generally agrees with articulated stakeholder needs for the improvement of safety testing procedures, and they are fit for inclusion, and add value, in nanomaterials risk assessment tools. Overall, 37 of the 50 evaluated NAMs and tiered workflows applying NAMs are recommended for consideration in safer-by-design innovation, including guidance on the selection of specific NAMs in the eight categories. An innovation funnel enriched with safety methods is ultimately proposed, with the central aim of promoting rigorous nanomaterials innovation.
Subject(s)
Materials Science, Nanostructures, Safety, Toxicity Tests, Computer Simulation, Humans, Materials Science/methods, Materials Science/trends, Nanostructures/standards, Risk Assessment

ABSTRACT
To facilitate the application of probabilistic risk assessment, the WHO released the APROBA tool. This tool applies lognormal uncertainty distributions to the different aspects of the hazard characterization, resulting in a probabilistic health-based guidance value. The current paper describes an extension, APROBA-Plus, which combines the output from the probabilistic hazard characterization with the probabilistic exposure to rapidly characterize risk and its uncertainty. The uncertainty in exposure is graphically compared with the uncertainty in the target human dose, i.e. the dose that complies with the specified protection goals. APROBA-Plus is applied to several case studies, resulting in distinct outcomes and illustrating that APROBA-Plus could serve as a standard extension of routine risk assessments. By visualizing the uncertainties, APROBA-Plus provides a more transparent and informative outcome than the usual deterministic approaches, so that risk managers can make better-informed decisions. For example, APROBA-Plus can help in deciding whether risk-reducing measures are warranted or whether a refined risk assessment is first needed. If the latter, the tool can be used to prioritize possible refinements. APROBA-Plus may also be used to rank substances into different risk categories based on potential health risks, without being compromised by the different levels of conservatism that may be associated with point estimates of risk.
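The core comparison described above, lognormal uncertainty in exposure versus lognormal uncertainty in the target human dose, can be sketched with a small Monte Carlo simulation. All distribution parameters below are illustrative assumptions, not values from the APROBA-Plus case studies:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 100_000

# Hypothetical lognormal uncertainty distributions (illustrative values only):
# target human dose (the dose meeting the protection goal) and exposure.
target_dose = rng.lognormal(mean=np.log(10.0), sigma=0.5, size=n)  # ug/kg bw/day
exposure    = rng.lognormal(mean=np.log(1.0),  sigma=0.8, size=n)  # ug/kg bw/day

# Probability that exposure exceeds the target dose, and the uncertainty
# distribution of the margin between the two quantities.
margin = target_dose / exposure
p_exceed = np.mean(exposure > target_dose)
print(f"P(exposure > target dose) = {p_exceed:.4f}")
print(f"margin, 5th percentile = {np.percentile(margin, 5):.2f}")
```

A margin whose lower percentile stays well above 1 suggests that refinement is a lower priority; a margin distribution straddling 1 flags the assessment for refinement or risk-reducing measures.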
Subject(s)
Food Contamination/analysis, Hazardous Substances/toxicity, Risk Assessment/methods, Animals, Hazardous Substances/analysis, Humans, Statistical Models

ABSTRACT
Two small-scale field studies were conducted to investigate the transfer of substances from products into dust due to direct and air-mediated transfer. The project focused on semivolatile organic compounds (SVOCs), which are frequently found in and re-emitted from dust. For the field studies, four artificial products containing deuterium-labeled SVOCs (eight phthalates and adipates) were installed in residential indoor environments. Two plastic products were installed vertically to investigate substance transfer due to evaporation into air. One plastic product and a carpet were installed horizontally to investigate the direct transfer from source to dust. A pyrethroid was intentionally released by spraying a commercial spray. Dust samples were collected from the floor, elevated surfaces in the room and the surfaces of the horizontally installed products. We observed that the dust concentrations of substances exclusively transferred via air were similar at different collection sites, but the concentrations of chemicals present in horizontal products were up to 3 orders of magnitude higher in dust deposited on the source. We conclude that direct transfer from source into dust substantially increases the final SVOC concentration in dust in contact with the source, regardless of the vapor pressure of investigated SVOCs, and may lead to larger human exposure.
Subject(s)
Indoor Air Pollution, Dust, Humans, Volatile Organic Compounds

ABSTRACT
Semivolatile organic compounds (SVOCs) can be released from products and distributed in the indoor environment, including air and dust. However, the mechanisms and the extent of substance transfer into air and dust are not well understood. Therefore, in a small-scale field study the transfer of nine SVOCs was investigated: Four artificial consumer products were doped with eight deuterium-labeled plasticizers (phthalates and adipates) and installed in five homes to investigate the emission processes of evaporation, abrasion, and direct transfer. Intentional release was studied with a commercial spray containing a pyrethroid. During the 12 week study, indoor air and settled dust samples were collected and analyzed. On the basis of our measurement results, we conclude that the octanol-air partitioning coefficient Koa is a major determinant for the substance transfer into either air or dust: A high Koa implies that the substance is more likely to be found in dust than in air. The emission process also plays a role: For spraying, we found higher dust and air concentrations than for evaporation. In contrast, apartment parameters like air exchange rate or temperature had just a minor influence. Another important mechanistic finding was that although transfer from product to dust currently is postulated to be mostly mediated by air, direct transport from product to dust on the product surface was also observed.
Subject(s)
Indoor Air Pollution/analysis, Dust/analysis, Volatile Organic Compounds/analysis, Deuterium/analysis, Deuterium/chemistry, Phthalic Acids/chemistry, Plasticizers/chemistry, Volatile Organic Compounds/chemistry

ABSTRACT
The Dutch cities Utrecht and Wijk bij Duurstede were founded by the Romans around 50 B.C., and the villages Fijnaart and Graft-De Rijp around 1600 A.D. The soils of these settlements are polluted with Pb (up to ~5000 mg/kg). Lead isotope ratios were used to trace the sources of Pb pollution in the urban soils. In ~75% of the urban soils, the source of the Pb pollution was a mixture of glazed potsherds, sherds of glazed roof tiles, building remnants (Pb sheets), metal slag, Pb-based paint flakes and coal ashes. These anthropogenic Pb sources most likely entered the urban soils through historical smelting activities, renovation and demolition of houses, disposal of coal ashes, and the raising and fertilization of land with city waste. Since many houses still contain Pb-based building materials, careless renovation or demolition can cause new or more extensive Pb pollution in urban soils. In ~25% of the studied urban topsoils, Pb isotope compositions suggest the Pb pollution was caused by incinerator ash and/or gasoline Pb, pointing to atmospheric deposition as the major source. The bioaccessible Pb fraction of 14 selected urban soils was determined with an in vitro test and varied from 16% to 82% of total Pb. The bioaccessibility appears related to the chemical composition and grain size of the primary Pb phases and to the pollution age. Risk assessment based on the in vitro test results implies that the risk to children may be underestimated at ~90% of the studied sample sites (13 out of 14).
Subject(s)
Cities, Environmental Exposure/adverse effects, Environmental Pollution/adverse effects, Industrial Waste/adverse effects, Lead/chemistry, Metallurgy/history, Soil Pollutants/analysis, Soil/chemistry, Cities/history, Dust, Environmental Exposure/prevention & control, Environmental Monitoring, Environmental Pollution/history, Ancient History, Humans, Incineration/history, Netherlands, Particulate Matter, Risk Assessment

ABSTRACT
Current practice of chemical risk assessment for consumer product ingredients still rarely aggregates exposure from multiple sources. However, focusing on a single dominant source/pathway combination may lead to a significant underestimation of the risk for substances present in numerous consumer products, which are often used simultaneously. Moreover, in most cases complex multi-route exposure scenarios also need to be accounted for. This paper introduces and evaluates the performance of the Probabilistic Aggregate Consumer Exposure Model (PACEM), applied in the context of a tiered approach to exposure assessment for ingredients in cosmetics and personal care products (C&PCPs), using decamethylcyclopentasiloxane (D5) as a worked example. It is demonstrated that PACEM predicts a more realistic, but still conservative, aggregate exposure within the Dutch adult population when compared with a deterministic point estimate obtained in a lower-tier screening assessment. An overall validation of PACEM is performed by quantitatively relating and comparing its estimates to currently available human biomonitoring and environmental sampling data. Moderate overestimation of exposure (by at most one order of magnitude) is observed, owing to a justified conservatism built into the model structure, making the tool suitable for risk assessment.
Subject(s)
Environmental Exposure/analysis, Environmental Monitoring/methods, Siloxanes/analysis, Adult, Aged, Cosmetics/chemistry, Environmental Exposure/adverse effects, Female, Household Products, Humans, Male, Middle Aged, Statistical Models, Risk Assessment/methods, Young Adult

ABSTRACT
Calixarene 0118 is a potent anti-angiogenic agent that effectively inhibited tumor growth in preclinical studies, and is currently being evaluated in a phase I clinical trial. We have designed two close mimetics of calixarene 0118 containing a terminal alkynyl-functional group, and developed an optimized semi-automated procedure for radiolabeling with 2-[18F]fluoroethylazide using click chemistry. Following semi-preparative HPLC purification and formulation, the lower-rim modified analog [18F]6 and the equatorially labeled [18F]13 were obtained in >97% radiochemical purity and overall decay-corrected isolated radiochemical yields of 18.7 ± 2.7% (n = 4) and 10.2 ± 5.0% (n = 4), respectively, in a total synthesis time of about 2 h. Preliminary in vivo studies in nude mice bearing human tumor xenografts revealed highest accumulation of both tracers in the liver, followed by spleen, kidney, lung and bone, with no substantial uptake in the tumor. Still, these first-in-class radiotracers are a valuable tool for pharmacokinetic profiling and improvement of calixarene-based anti-angiogenic therapeutics in the future, as similar radiolabeling strategies may be applied to other compounds in the calixarene series. The cold reference compounds of the radiotracers were characterized in terms of cytotoxicity and anti-proliferative effects on HUVEC cells and on MA148 human ovarian carcinoma cells, along with the respective precursors, a small series of 0118 analogs modified with short-chain linear alkyl substituents, and a PEG3-spaced calixarene dimer. While all of the new analogs proved at least equipotent to parent 0118, some of them inhibited HUVEC and MA148 cell growth almost 4- and 10-fold more effectively, rendering these analogs promising candidates for further evaluation in anti-angiogenic cancer therapy.
Subject(s)
Angiogenesis Inhibitors/chemical synthesis, Angiogenesis Inhibitors/therapeutic use, Azides/chemistry, Calixarenes/chemical synthesis, Calixarenes/therapeutic use, Fluorine Radioisotopes/chemistry, Angiogenesis Inhibitors/chemistry, Angiogenesis Inhibitors/pharmacology, Animals, Calixarenes/chemistry, Calixarenes/pharmacology, Tumor Cell Line, Cell Survival/drug effects, High-Pressure Liquid Chromatography, Click Chemistry, Cycloaddition Reaction, Drug Design, Female, Human Umbilical Vein Endothelial Cells, Humans, Inbred BALB C Mice, Nude Mice, Molecular Structure, Radiochemistry, Treatment Outcome, Xenograft Model Antitumor Assays

ABSTRACT
Current methods for cancer risk assessment result in single values, without any quantitative information on the uncertainties in these values. Therefore, single risk values could easily be overinterpreted. In this study, we discuss a full probabilistic cancer risk assessment approach in which all the generally recognized uncertainties in both exposure and hazard assessment are quantitatively characterized and probabilistically evaluated, resulting in a confidence interval for the final risk estimate. The methodology is applied to three example chemicals (aflatoxin, N-nitrosodimethylamine, and methyleugenol). These examples illustrate that the uncertainty in a cancer risk estimate may be huge, making single value estimates of cancer risk meaningless. Further, a risk based on linear extrapolation tends to be lower than the upper 95% confidence limit of a probabilistic risk estimate, and in that sense it is not conservative. Our conceptual analysis showed that there are two possible basic approaches for cancer risk assessment, depending on the interpretation of the dose-incidence data measured in animals. However, it remains unclear which of the two interpretations is the more adequate one, adding an additional uncertainty to the already huge confidence intervals for cancer risk estimates.
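A minimal Monte Carlo sketch of such a probabilistic cancer risk estimate is shown below: uncertainty in the animal benchmark dose, in animal-to-human extrapolation, and in human exposure is propagated to a risk distribution whose percentiles replace a single value. All distributions and parameter values are illustrative assumptions, not the uncertainty model used in the paper:

```python
import numpy as np

rng = np.random.default_rng(7)
n = 100_000

# Hypothetical lognormal uncertainty distributions (illustrative only)
bmd10 = rng.lognormal(np.log(1.0), 0.4, n)          # animal BMD10, mg/kg bw/day
interspecies = rng.lognormal(np.log(4.0), 0.3, n)   # animal-to-human scaling factor
exposure = rng.lognormal(np.log(1e-4), 0.5, n)      # human exposure, mg/kg bw/day

# Low-dose linear extrapolation applied within each Monte Carlo draw:
# extra risk ~ 0.10 * exposure / (human-equivalent BMD10)
risk = 0.10 * exposure / (bmd10 / interspecies)
print(f"median risk:   {np.median(risk):.2e}")
print(f"upper 95% CL:  {np.percentile(risk, 95):.2e}")
```

Even with these modest input uncertainties the upper 95% confidence limit is several-fold above the median, illustrating why a single point estimate of cancer risk conveys a false sense of precision.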
Subject(s)
Neoplasms/chemically induced, Risk Assessment/methods, Aflatoxins/administration & dosage, Aflatoxins/toxicity, Animals, Carcinogens/administration & dosage, Carcinogens/toxicity, Dimethylnitrosamine/administration & dosage, Dimethylnitrosamine/toxicity, Drug Dose-Response Relationship, Eugenol/administration & dosage, Eugenol/analogs & derivatives, Eugenol/toxicity, Female, Food Contamination/analysis, Humans, Incidence, Male, Statistical Models, Neoplasms/epidemiology, Risk Assessment/statistics & numerical data, Uncertainty

ABSTRACT
Bioaccessibility is a measurement of a substance's solubility in the human gastro-intestinal system, and is often used in the risk assessment of soils. The present study was designed to determine the variability among laboratories using different methods to measure the bioaccessibility of 24 inorganic contaminants in one standardized soil sample, the standard reference material NIST 2710. Fourteen laboratories used a total of 17 bioaccessibility extraction methods. The variability between methods was assessed by calculating the reproducibility relative standard deviations (RSDs), where reproducibility is the sum of within-laboratory and between-laboratory variability. Whereas within-laboratory repeatability was usually better than (<) 15% for most elements, reproducibility RSDs were much higher, indicating more variability, although for many elements they were comparable to typical uncertainties (e.g., 30% in commercial laboratories). For five trace elements of interest, reproducibility RSDs were: arsenic (As), 22-44%; cadmium (Cd), 11-41%; copper (Cu), 15-30%; lead (Pb), 45-83%; and zinc (Zn), 18-56%. Only one method variable, pH, was found to correlate significantly with bioaccessibility for aluminum (Al), Cd, Cu, manganese (Mn), Pb and Zn, but other method variables could not be examined systematically because of the study design. When bioaccessibility results were directly compared with bioavailability results for As (swine and mouse) and Pb (swine), four methods returned results within uncertainty ranges for both elements: two that were simpler (gastric phase only, limited chemicals) and two that were more complex (gastric + intestinal phases, with a mixture of chemicals).
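Reproducibility combining within- and between-laboratory variability is commonly computed with the one-way ANOVA decomposition of ISO 5725-2 (s_R² = s_L² + s_r²). The sketch below uses hypothetical replicate data, not the study's measurements, and assumes equal replicate counts per laboratory:

```python
import numpy as np

def reproducibility_rsd(results):
    """ISO 5725-style reproducibility RSD (%): s_R^2 = s_L^2 + s_r^2.
    `results` maps lab id -> replicate measurements (equal counts per lab)."""
    labs = [np.asarray(v, float) for v in results.values()]
    n_rep = len(labs[0])
    lab_means = np.array([lab.mean() for lab in labs])
    s_r2 = np.mean([lab.var(ddof=1) for lab in labs])   # repeatability (within-lab) variance
    s_L2 = max(lab_means.var(ddof=1) - s_r2 / n_rep, 0.0)  # between-lab variance
    s_R = np.sqrt(s_L2 + s_r2)
    return 100 * s_R / lab_means.mean()

# Hypothetical bioaccessibility (%) for one element, three labs, 3 replicates each
data = {"lab_A": [42, 45, 44], "lab_B": [55, 53, 57], "lab_C": [38, 40, 41]}
print(f"reproducibility RSD = {reproducibility_rsd(data):.1f}%")
```

In this toy example the within-lab spread is small but the labs disagree systematically, so the reproducibility RSD is dominated by the between-laboratory term, mirroring the pattern reported in the study.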
Subject(s)
Environmental Monitoring/methods, Environmental Monitoring/standards, Laboratories, Biological Models, Soil Pollutants, Gastrointestinal Tract/metabolism, Humans, Laboratories/standards, Reference Standards, Reproducibility of Results, Soil Pollutants/analysis, Soil Pollutants/pharmacokinetics, United States, United States Government Agencies

ABSTRACT
Complete information regarding the use of personal care products (PCPs) by consumers is limited, but such information is crucial for realistic consumer exposure assessment. To fill this gap, a database was created with person-oriented information regarding usage patterns and circumstances of use for 32 different PCPs. Out of 2700 potential participants from the Netherlands, 516 men and women completed a digital questionnaire. The prevalence of use varied by gender, age, level of education and skin type. A high frequency of use was observed for some products (e.g. lip care products), while toothpaste, deodorant and day cream were generally used once or twice a day. The frequency of use for other PCPs varied over a wide range. The amounts of use varied widely between and within different product groups. Body lotion, sunscreen and after-sun lotion were often applied on adjacent body parts. The majority of PCPs were applied in the morning, but some products, such as night cream and after-sun lotion, were predominantly applied in the evening or at night. As expected, the participants used several PCPs simultaneously. The database yields important personalized exposure factors which can be used in aggregate consumer exposure assessment for substances that are components of PCPs.
Subject(s)
Cosmetics, Environmental Exposure, Female, Humans, Male, Netherlands

ABSTRACT
The consumption of fish and nitrate-rich vegetables may lead to the formation of the genotoxic carcinogen N-nitrosodimethylamine (NDMA) in the stomach. To assess the human cancer risk associated with this formation, a dynamic in vitro gastrointestinal model was used to simulate NDMA formation in the stomach after a fish + vegetable meal. The experimental results were combined with statistical modeling of Dutch food consumption data, resulting in predicted exposures to endogenously formed NDMA in the population. The 95th percentile of the long-term exposure distribution was around 4 ng/kg-bw in young children and 0.4 ng/kg-bw in adults. By comparing this exposure with the benchmark dose lower confidence limit (BMDL10) for liver cancer in a chronic carcinogenicity study, chronic margins of exposure (MOEs) of 7000 and 73,000 were calculated for young children and adults, respectively. Furthermore, the long-term exposure distribution was combined with a dose-response analysis of the liver cancer incidence data to obtain a cancer risk distribution for the human population. The 95th percentile of that distribution was 6 × 10^-6 extra risk for 5-year-old children and 8 × 10^-7 for adults. The liver cancer data allowed for the analysis of the relationship between tumor incidence and time to tumor. For an extra risk of 10^-6, the decrease in time to tumor was conservatively estimated at 3.8 min in the rat, equivalent to 0.1 days in humans. We also combined acute exposure estimates with the BMDL10 from an acute carcinogenicity study for NDMA, resulting in an acute MOE of 110,000. We conclude that the combined consumption of fish and nitrate-rich vegetables appears to lead to only marginal increases in additional cancer risk.
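The MOE arithmetic is a simple ratio of a benchmark dose to an exposure estimate in the same units. In the sketch below, the exposures are the reported 95th-percentile long-term estimates, while the BMDL10 value is an assumption back-calculated from the reported MOEs purely for illustration:

```python
# Margin of exposure (MOE) = BMDL10 / estimated exposure (same dose units).
# The BMDL10 below is an illustrative assumption, back-calculated from the
# reported MOEs of roughly 7000 (children) and 73,000 (adults).
bmdl10 = 29_000  # ng/kg bw per day (assumed chronic BMDL10 for NDMA)

exposure_p95 = {"young children": 4.0, "adults": 0.4}  # ng/kg bw per day
moe = {group: bmdl10 / e for group, e in exposure_p95.items()}
for group, value in moe.items():
    print(f"{group}: MOE ≈ {value:,.0f}")
```

For genotoxic carcinogens, an MOE well above 10,000 is conventionally taken as a low priority for risk management, which is the sense in which the adult value here is reassuring while the children's value sits closer to the threshold.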
Subject(s)
Dimethylnitrosamine/toxicity, Food, Seafood, Vegetables, Carcinogenicity Tests, Drug Dose-Response Relationship, Environmental Exposure, Humans, Risk Assessment

ABSTRACT
Various models exist for estimating the usual intake distribution from dietary intake data. In this paper, we compare two of these models, the Iowa State University Foods (ISUF) model and the beta-binomial-normal (BBN) model, and apply them to three different datasets. Intake data are obtained by aggregating over multiple food products and are often non-normal. The ISUF and BBN models both address non-normality. While the two models have similar structures, they show some differences. The ISUF model includes an additional spline transformation for improving the normality of the intake amount distribution, while the BBN model includes the possibility of addressing covariates, such as age or sex. Our analyses showed that for two of the example datasets both models produced similar estimates of the higher percentiles of the usual intake distribution. However, for the third dataset, where the intake amount distribution appears to be multimodal, the two models produced different percentile estimates.
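Usual-intake models of this family typically combine a consumption-frequency part with an amount part. The sketch below simulates that two-part structure (a beta-binomial frequency over survey days plus a lognormal person-level amount) so the resulting percentiles can be inspected; the parameters are illustrative, and this is a forward simulation, not the ISUF or BBN estimation procedure itself:

```python
import numpy as np

rng = np.random.default_rng(3)
n_persons = 50_000

# Two-part usual-intake structure (illustrative parameters):
# frequency: beta-binomial over survey days; amount: lognormal per person.
n_days = 2                                    # survey days per person
p_consume = rng.beta(2, 5, n_persons)         # person-specific consumption probability
days_consumed = rng.binomial(n_days, p_consume)

mean_log_amount = rng.normal(3.0, 0.4, n_persons)  # person-level amount on log scale
usual_intake = (days_consumed / n_days) * np.exp(mean_log_amount)

for q in (50, 95, 99):
    print(f"P{q}: {np.percentile(usual_intake, q):.1f}")
```

Because many simulated persons never consume the food on the survey days, the lower percentiles sit at zero while the upper percentiles carry the information of interest, which is why model comparisons in this literature focus on the higher percentiles.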