Results 1 - 3 of 3
1.
J Environ Radioact; 278: 107472, 2024 Jun 20.
Article in English | MEDLINE | ID: mdl-38905881

ABSTRACT

Methods for determining the radiation dose received by exposed biota require major improvements to reduce uncertainties and increase precision. We share our experiences in attempting to quantify external dose rates to free-ranging wildlife using GPS-coupled dosimetry methods. The manuscript is a primer on fundamental concepts in wildlife dosimetry in which the complexities of quantifying dose rates are highlighted, and lessons learned are presented based on research with wild boar and snakes at Fukushima, wolves at Chornobyl, and reindeer in Norway. GPS-coupled dosimeters produced empirical data against which numerical simulations of external dose using computer software were compared. Our data did not support a standing paradigm in risk analyses: that using averaged soil contaminant levels to model external dose rates conservatively overestimates the dose to individuals within a population. Following this paradigm will likely lead to misguided recommendations for risk management. The GPS-dosimetry data also demonstrated how strongly modeled external dose rates are affected by the scale at which contaminants are mapped. When contaminant mapping scales were coarse, even detailed knowledge of each animal's home range was inadequate to accurately predict external dose rates. Importantly, modeled external dose rates based on a single measurement at a trap site did not correlate with actual dose rates measured on free-ranging animals. These findings provide empirical support for published concerns about inadequate dosimetry in much of the published Chernobyl and Fukushima dose-effect research. Our data indicate that a large portion of that literature should be challenged, and that improper dosimetry remains a significant source of controversy in radiation dose-effect research.
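[Editor's note] To make the contrast concrete, here is a minimal, hypothetical Python sketch of the two approaches the abstract compares: an external dose rate modeled from a single site-averaged contaminant value versus one weighted by an animal's GPS fixes. The grid values, track, and linear dose conversion coefficient (DCC) are invented for illustration; this is not the study's data or model.

    # Hypothetical sketch, not the authors' model. All numbers are invented.
    import random

    random.seed(1)
    DCC = 2.1e-3  # assumed dose conversion coefficient, (uGy/h) per (kBq/m^2)

    # Toy deposition map on a 10 x 10 grid: contamination rises along a
    # gradient, so one corner of the study area is much cleaner than the other.
    grid = {(x, y): 5.0 + 20.0 * (x + y) for x in range(10) for y in range(10)}
    site_average = sum(grid.values()) / len(grid)

    # 500 GPS fixes for an animal whose home range sits in the cleaner corner.
    track = [(random.randint(0, 2), random.randint(0, 2)) for _ in range(500)]

    dose_from_average = DCC * site_average                             # averaged-map paradigm
    dose_along_track = DCC * sum(grid[p] for p in track) / len(track)  # per-fix lookup

    print(f"averaged-map dose rate:   {dose_from_average:.3f} uGy/h")
    print(f"track-weighted dose rate: {dose_along_track:.3f} uGy/h")

With the deposition gradient above, the averaged-map estimate substantially overstates the dose rate for an animal ranging in the cleaner corner, which is the failure mode the abstract describes.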

2.
Front Plant Sci; 9: 307, 2018.
Article in English | MEDLINE | ID: mdl-29593765

ABSTRACT

Nearly half of the world's cereal production comes from soils low or marginal in plant-available zinc, leading to unsustainable, poor-quality grain production. We therefore investigated the effects of nitrogen (N) rate and application time on zinc (Zn) and iron (Fe) concentrations in wheat grain. Wheat (Triticum aestivum var. Krabat) was grown in a growth chamber with day and night periods of 8 and 16 h, respectively. The N rates were 29, 43, and 57 mg N kg-1 soil, equivalent to 80, 120, and 160 kg N ha-1. Zinc and Fe were applied at 10 mg kg-1 of growth medium. In one of the N treatments, additional Zn and Fe were applied as a foliar spray (6 mg of Zn or Fe in 10 ml of water per pot). Micro-analytical localization of Zn and Fe within the grain was performed using scanning macro-X-ray fluorescence (MA-XRF) and laser ablation-inductively coupled plasma-mass spectrometry (LA-ICP-MS). The following data were obtained: grain and straw yield per pot, 1000-grain weight, number of grains per pot, whole-grain protein content, and Zn and Fe concentrations in the grain. Grain yield increased only from 80 to 120 kg N ha-1 and decreased at 160 kg N ha-1. Higher protein content and grain Zn and Fe concentrations were recorded with the split N application of 160 kg N ha-1. Combined soil and foliar supply of Zn and Fe ((Zn + Fe)s+f), with a single application of 120 kg N ha-1 at sowing, increased the grain concentration of Zn by 46% and of Fe by 35%, compared with growth-medium application alone. Line scans of freshly cut areas of sliced grains showed co-localization of Zn and Fe within the germ, crease, and aleurone. We thus conclude that split application of 160 kg N ha-1 at sowing and stem elongation, combined with soil and foliar application of Zn and Fe, can be a good agricultural practice for enhancing the protein content and the Zn and Fe concentrations of grain.
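[Editor's note] As a point of reference for the pot-to-field equivalence above: 80 kg N ha-1 / 29 mg N kg-1 soil ≈ 2.76 × 10^6 kg soil ha-1, which corresponds to roughly a 20 cm plow layer at a bulk density of about 1.4 g cm-3. The depth and bulk density are our assumptions for illustration; the abstract does not state them.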

3.
Sci Rep; 6: 32977, 2016 Sep 6.
Article in English | MEDLINE | ID: mdl-27596356

ABSTRACT

Even today, 70 years after Hiroshima and accidents like those at Chernobyl and Fukushima, we still have limited knowledge of the health effects of low dose rate (LDR) radiation. Despite its relevance to occupational and accidental human exposure, only a few animal studies on the genotoxic effects of chronic LDR radiation have been performed. Selenium (Se) is involved in oxidative stress defence, protecting DNA and other biomolecules from reactive oxygen species (ROS). It is hypothesised that Se deficiency, as occurs in several parts of the world, may aggravate the harmful effects of ROS-inducing stressors such as ionising radiation. We performed a study in the newly established LDR facility Figaro on the combined effects of Se deprivation and LDR γ exposure in DNA repair knockout mice (Ogg1(-/-)) and control animals (Ogg1(+/-)). Genotoxic effects were seen after continuous irradiation (1.4 mGy/h) for 45 days. Chromosomal damage (micronuclei), phenotypic mutations (Pig-a gene mutation of RBC(CD24-)), and DNA lesions (single-strand breaks/alkali-labile sites) were significantly increased in blood cells of irradiated animals, covering three types of genotoxic activity. This study demonstrates that chronic LDR γ radiation is genotoxic in an exposure scenario realistic for humans, supporting the hypothesis that even LDR γ radiation may induce cancer.
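[Editor's note] For scale, the cumulative dose implied by this exposure regime works out to 1.4 mGy/h × 24 h/day × 45 days ≈ 1.5 Gy (our arithmetic, not a figure stated in the abstract).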


Subject(s)
Blood Cells/radiation effects, DNA Damage/radiation effects, DNA Glycosylases/physiology, DNA Repair/radiation effects, Gamma Rays/adverse effects, Animals, DNA Glycosylases/radiation effects, Dose-Response Relationship, Radiation, Humans, Male, Mice, Mice, Inbred C57BL, Mice, Knockout, Mutation, Oxidative Stress/radiation effects, Reactive Oxygen Species/metabolism, Selenium/deficiency