Results 1 - 20 of 83
1.
Proc Nutr Soc ; 82(1): 32-38, 2023 02.
Article in English | MEDLINE | ID: mdl-35983607

ABSTRACT

Selenium is found at the active centre of twenty-five selenoproteins, which have a variety of roles, including the well-characterised function of antioxidant defence, but it is also claimed to be involved in the immune system. However, owing to limited and conflicting data for different parameters of immune function, the intakes of selenium that influence immune function remain uncertain. This review covers the relationship between selenium and immune function in man, focusing on the highest level of evidence, namely that generated by randomised controlled trials (RCT), in which the effect of selective administration of selenium, in foods or as a supplement, on immune function was assessed. A total of nine RCT were identified from a systematic search of the literature. Some of these trials reported effects on T and natural killer cells that depended on the dose and form of selenium administered, but little effect of selenium on humoral immunity. There is clearly a need to undertake dose-response analysis of cellular immunity data in order to derive quantitative relationships between selenium intake and measures of immune function. Overall, limited effects on immunity emerged from experimental studies in human subjects, though additional investigation of the potential influence of selenium status on cellular immunity appears to be warranted.


Subject(s)
Selenium , Humans , Antioxidants , Dietary Supplements , Nutritional Status , Selenium/pharmacology , Selenoproteins
2.
Nutrients ; 14(9)2022 Apr 28.
Article in English | MEDLINE | ID: mdl-35565817

ABSTRACT

Research in both animals and humans shows that some nutrients are important in pregnancy and during the first years of life to support brain and cognitive development. Our aim was to evaluate the role of selenium (Se) in supporting brain and behavioral plasticity and maturation. Pregnant and lactating female rats and their offspring, up to postnatal day 40, were fed isocaloric diets differing in Se content (optimal, sub-optimal, and deficient), and neurodevelopmental, neuroinflammatory, and antioxidant markers were analyzed. We observed early adverse behavioral changes in juvenile rats only in offspring fed the sub-optimal diet. In addition, the sub-optimal, more than the deficient, supply reduced basal glial reactivity in a sex-dimorphic and brain area-specific fashion. In female offspring, the deficient and sub-optimal diets reduced the activity of the antioxidant enzyme glutathione peroxidase (GPx) in the cortex and in the liver, the latter being the key organ regulating Se metabolism and homeostasis. The finding that the sub-optimal diet was more detrimental than the deficient one may suggest that a maternal Se-deficient diet, by lowering the Se supply at earlier stages of fetal development, stimulated homeostatic mechanisms in the offspring that were not triggered by the sub-optimal supply. Our observations demonstrate that even moderate Se deficiency during early life may negatively affect, in a sex-specific manner, optimal brain development.


Subject(s)
Selenium , Animals , Antioxidants/pharmacology , Diet , Female , Glutathione Peroxidase/metabolism , Humans , Lactation , Liver/metabolism , Male , Maternal Nutritional Physiological Phenomena , Pregnancy , Rats
3.
J Trace Elem Med Biol ; 71: 126956, 2022 May.
Article in English | MEDLINE | ID: mdl-35217499

ABSTRACT

BACKGROUND AND AIM: The COVID-19 pandemic has severely affected the world's population in the last two years. Along with non-pharmacological public health interventions, major efforts have also been made to identify effective drugs or active substances for COVID-19 prevention and treatment. These include, among many others, the trace elements zinc and selenium, based on laboratory studies and some observational human studies. However, neither of these study designs is adequate to identify and approve treatments in human medicine, and experimental studies in the form of randomized controlled trials are needed to demonstrate the effectiveness and safety of any intervention. METHODS: We undertook a systematic review in which we searched for published and unpublished clinical trials using zinc or selenium supplementation to treat or prevent COVID-19 in the PubMed, Scopus and ClinicalTrials.gov databases up to 10 January 2022. RESULTS: Amongst the published studies, we did not find any trial with selenium, whereas we retrieved four eligible randomized clinical trials using zinc supplementation, only one of which was double-blind. One of these trials looked at the effect of the intervention on the rate of new SARS-CoV-2 infections, and three at the COVID-19 clinical outcome in already infected individuals. The study populations of the four trials were very heterogeneous, ranging from uninfected individuals to those hospitalized for COVID-19. Only two studies investigated zinc alone in the intervention arm, with no differences in the endpoints. The other two studies examined zinc in association with one or more drugs and supplements in the intervention arm, making it impossible to disentangle any specific effects of the element. In addition, we identified 22 unpublished ongoing clinical trials, 19 on zinc, one on selenium and two on both elements.
CONCLUSION: No trials investigated the effect of selenium supplementation on COVID-19, while the very few studies on the effects of zinc supplementation did not confirm its efficacy. Therefore, preventive or therapeutic interventions against COVID-19 based on zinc or selenium supplementation are currently unjustified, although our conclusions may change once the results of the ongoing studies are published.


Subject(s)
COVID-19 , Selenium , Humans , Selenium/therapeutic use , Zinc/therapeutic use , COVID-19/prevention & control , Pandemics/prevention & control , SARS-CoV-2 , Dietary Supplements , Randomized Controlled Trials as Topic
4.
Nutrients ; 13(8)2021 Aug 01.
Article in English | MEDLINE | ID: mdl-34444841

ABSTRACT

High sodium and low potassium intakes are associated with increased levels of blood pressure and risk of cardiovascular diseases. Assessment of habitual diet is helpful for evaluating the intake of these minerals and adherence to healthy dietary recommendations. In this study, we determined food-specific sodium and potassium content and intake in a Northern Italy community, focusing on the role and contribution of adherence to Mediterranean diet patterns. We collected a total of 908 food samples and measured sodium and potassium content using inductively coupled plasma mass spectrometry. Using a validated semi-quantitative food frequency questionnaire, we assessed the habitual dietary intake of 719 adult individuals of the Emilia-Romagna region. We then estimated daily sodium and potassium intake from each food based on its relative contribution to the overall diet, and its link to Mediterranean diet patterns. The estimated mean sodium intake was 2.15 g/day, while mean potassium intake was 3.37 g/day. The foods contributing most to sodium intake were cereals (33.2%), meat products (24.5%, especially processed meat), and dairy products (13.6%); for potassium they were meat (17.1%, especially red and white meat), fresh fruits (15.7%), and vegetables (15.1%). Adherence to a Mediterranean diet had little influence on sodium intake, whereas potassium intake was greatly increased in subjects with higher scores, resulting in a lower sodium/potassium ratio. Although we may have underestimated dietary sodium intake by not including discretionary salt use, and although there may be some degree of exposure misclassification as a result of changes in food sodium content and dietary habits over time, our study provides an overview of the contribution of a wide range of foods to sodium and potassium intake in a Northern Italy community, and of the impact of a Mediterranean diet on intake.
The mean sodium intake was above the dietary recommendations for adults of 1.5-2 g/day, whilst potassium intake was only slightly lower than the recommended 3.5 g/day. Our findings suggest that higher adherence to Mediterranean diet patterns has limited effect on restricting sodium intake, but may facilitate a higher potassium intake, thereby aiding the achievement of healthy dietary recommendations.
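The sodium/potassium ratio mentioned above can be computed directly from the reported mean intakes. The short sketch below is illustrative arithmetic, not taken from the paper: it converts the study's mean intakes (sodium 2.15 g/day, potassium 3.37 g/day) into mass and molar sodium-to-potassium ratios; the molar masses are standard values.

```python
# Illustrative calculation (not from the paper): derive mass and molar Na/K
# ratios from the reported mean daily intakes. A molar ratio near or below 1
# is often used as a marker of a more favourable dietary pattern.

NA_MOLAR_MASS = 22.99  # g/mol, sodium
K_MOLAR_MASS = 39.10   # g/mol, potassium

def na_k_ratios(na_g_per_day: float, k_g_per_day: float) -> tuple[float, float]:
    """Return (mass ratio, molar ratio) of sodium to potassium intake."""
    mass_ratio = na_g_per_day / k_g_per_day
    molar_ratio = (na_g_per_day / NA_MOLAR_MASS) / (k_g_per_day / K_MOLAR_MASS)
    return mass_ratio, molar_ratio

mass_r, molar_r = na_k_ratios(2.15, 3.37)  # study's mean intakes, g/day
print(f"Na/K mass ratio: {mass_r:.2f}, molar ratio: {molar_r:.2f}")
```

On the study's mean intakes this gives a mass ratio of about 0.64 and a molar ratio of about 1.09; higher Mediterranean diet scores, which raise potassium intake, would push both ratios down.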


Subject(s)
Diet, Healthy/statistics & numerical data , Diet, Mediterranean , Guideline Adherence/statistics & numerical data , Potassium, Dietary/analysis , Sodium, Dietary/analysis , Adult , Aged , Diet Surveys , Diet, Healthy/standards , Eating/physiology , Feeding Behavior/physiology , Female , Heart Disease Risk Factors , Humans , Italy , Male , Mass Spectrometry , Middle Aged , Nutrition Policy , Nutritional Status/physiology , Potassium, Dietary/blood , Sodium, Dietary/blood
5.
Food Res Int ; 137: 109370, 2020 11.
Article in English | MEDLINE | ID: mdl-33233072

ABSTRACT

BACKGROUND AND AIM: Lead is a highly toxic heavy metal released into the environment by natural and anthropogenic activities. Excluding populations in occupations with possible lead contamination, food is the major source of human exposure. In this study, we determined lead contamination in food and beverages consumed in a Northern Italy community and performed a health risk assessment. METHODS: We collected a total of 908 food samples and measured lead levels using inductively coupled plasma mass spectrometry. Using a validated food frequency questionnaire, we assessed dietary habits and estimated daily dietary lead intakes in a sample of 719 adult individuals. We performed risk assessment using a benchmark dose and margin of exposure approach, based on exposure levels associated with both increased systolic blood pressure and chronic kidney disease. RESULTS: Foods with the highest lead levels included non-chocolate confectionery (48.7 µg/kg), leafy vegetables (39.0 µg/kg) and other vegetables (42.2 µg/kg), and crustaceans and molluscs (39.0 µg/kg). The estimated mean lead intake was 0.155 µg/kg bw-day in all subjects, with slightly lower intakes in men (0.151 µg/kg bw-day) than in women (0.157 µg/kg bw-day). Top food contributors were vegetables, cereals, and beverages, particularly wine. In relation to risk assessment, the estimated dietary intake was lower than the levels associated with cardiovascular risk and nephrotoxicity. CONCLUSIONS: Our study provides an updated assessment of lead food contamination and dietary exposure in a Northern Italian community. The margin of exposure risk assessment approach suggests that the risk of detrimental effects due to dietary lead intake is low in the investigated population. Nonetheless, these exposure levels for adverse effects are not reference health standards, and no safety threshold value can be established for lead.
As a consequence, other and more subtle adverse effects may still occur in vulnerable and occupationally exposed individuals, particularly in relation to the nervous system.
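The margin of exposure (MOE) approach used in this study divides a benchmark dose lower bound (BMDL) by the estimated dietary exposure. The sketch below illustrates the arithmetic only; the BMDL reference points are assumed from EFSA's 2010 scientific opinion on lead and are not stated in the abstract, so treat them as placeholders.

```python
# Illustrative margin-of-exposure (MOE) calculation for dietary lead.
# Assumed reference points (from EFSA's 2010 lead opinion, not from this
# abstract): BMDL01 = 1.50 ug/kg bw/day for effects on systolic blood
# pressure, BMDL10 = 0.63 ug/kg bw/day for chronic kidney disease.

def margin_of_exposure(bmdl: float, exposure: float) -> float:
    """MOE = benchmark dose lower bound / estimated dietary exposure.
    Larger values mean exposure is further below the effect level."""
    return bmdl / exposure

MEAN_INTAKE = 0.155  # ug/kg bw/day, mean estimated in the study population

moe_sbp = margin_of_exposure(1.50, MEAN_INTAKE)
moe_ckd = margin_of_exposure(0.63, MEAN_INTAKE)
print(f"MOE (blood pressure): {moe_sbp:.1f}")
print(f"MOE (kidney disease): {moe_ckd:.1f}")
```

With these assumed reference points, the mean intake sits several-fold below both effect levels, consistent with the study's conclusion that population risk is low while no safe threshold can be set.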


Subject(s)
Lead , Vegetables , Adult , Eating , Female , Humans , Italy , Male , Risk Assessment
6.
Cochrane Database Syst Rev ; 3: CD005004, 2020 03 02.
Article in English | MEDLINE | ID: mdl-32118296

ABSTRACT

BACKGROUND: This review is an update of a previously published review in the Cochrane Database of Systematic Reviews (2009, Issue 3). Tea is one of the most commonly consumed beverages worldwide. Teas from the plant Camellia sinensis can be grouped into green, black and oolong tea, and drinking habits vary cross-culturally. C sinensis contains polyphenols, one subgroup being catechins. Catechins are powerful antioxidants, and laboratory studies have suggested that these compounds may inhibit cancer cell proliferation. Some experimental and nonexperimental epidemiological studies have suggested that green tea may have cancer-preventative effects. OBJECTIVES: To assess possible associations between green tea consumption and the risk of cancer incidence and mortality as primary outcomes, and safety data and quality of life as secondary outcomes. SEARCH METHODS: We searched for eligible studies up to January 2019 in CENTRAL, MEDLINE, Embase, ClinicalTrials.gov, and reference lists of previous reviews and included studies. SELECTION CRITERIA: We included all epidemiological studies, experimental (i.e. randomised controlled trials (RCTs)) and nonexperimental (non-randomised studies, i.e. observational studies with both cohort and case-control design), that investigated the association of green tea consumption with cancer risk or quality of life, or both. DATA COLLECTION AND ANALYSIS: Two or more review authors independently applied the study criteria, extracted data and assessed the methodological quality of studies. We summarised the results according to diagnosis of cancer type. MAIN RESULTS: In this review update, we included in total 142 completed studies (11 experimental and 131 nonexperimental) and two ongoing studies. This is an additional 10 experimental and 85 nonexperimental studies beyond those included in the previous version of the review.
Eleven experimental studies allocated a total of 1795 participants to either green tea extract or placebo, all demonstrating an overall high methodological quality based on 'Risk of bias' assessment. For incident prostate cancer, the summary risk ratio (RR) in the green tea-supplemented participants was 0.50 (95% confidence interval (CI) 0.18 to 1.36), based on three studies and involving 201 participants (low-certainty evidence). The summary RR for gynaecological cancer was 1.50 (95% CI 0.41 to 5.48; 2 studies, 1157 participants; low-certainty evidence). No evidence of an effect on non-melanoma skin cancer emerged (summary RR 1.00, 95% CI 0.06 to 15.92; 1 study, 1075 participants; low-certainty evidence). In addition, adverse effects of green tea extract intake were reported, including gastrointestinal disorders, elevation of liver enzymes, and, more rarely, insomnia, raised blood pressure and skin/subcutaneous reactions. Consumption of green tea extracts induced a slight improvement in quality of life, compared with placebo, based on three experimental studies. In nonexperimental studies, we included over 1,100,000 participants from 46 cohort studies and 85 case-control studies, which were on average of intermediate to high methodological quality based on Newcastle-Ottawa Scale 'Risk of bias' assessment. When comparing the highest intake of green tea with the lowest, we found a lower overall cancer incidence (summary RR 0.83, 95% CI 0.65 to 1.07), based on three studies, involving 52,479 participants (low-certainty evidence). Conversely, we found no association between green tea consumption and cancer-related mortality (summary RR 0.99, 95% CI 0.91 to 1.07), based on eight studies and 504,366 participants (low-certainty evidence). For most of the site-specific cancers we observed a decreased RR in the highest category of green tea consumption compared with the lowest one.
After stratifying the analysis according to study design, we found strongly conflicting results for some cancer sites: oesophageal, prostate and urinary tract cancer, and leukaemia showed an increased RR in cohort studies and a decreased RR or no difference in case-control studies. AUTHORS' CONCLUSIONS: Overall, findings from experimental and nonexperimental epidemiological studies yielded inconsistent results, thus providing limited evidence for a beneficial effect of green tea consumption on the overall risk of cancer or on specific cancer sites. Some evidence of a beneficial effect of green tea at some cancer sites emerged from the RCTs and from case-control studies, but their methodological limitations, such as the low number and size of the studies, and the inconsistencies with the results of cohort studies, limit the interpretability of the RR estimates. The studies also indicated the occurrence of several side effects associated with high intakes of green tea. In addition, the majority of included studies were carried out in Asian populations characterised by a high intake of green tea, thus limiting the generalisability of the findings to other populations. Well-conducted and adequately powered RCTs would be needed to draw conclusions on the possible beneficial effects of green tea consumption on cancer risk.


Subject(s)
Camellia sinensis , Neoplasms/prevention & control , Phytotherapy/methods , Plant Extracts/therapeutic use , Tea , Breast Neoplasms/prevention & control , Camellia sinensis/chemistry , Case-Control Studies , Female , Flavonoids/pharmacology , Gastrointestinal Neoplasms/epidemiology , Gastrointestinal Neoplasms/prevention & control , Humans , Incidence , Liver Neoplasms/epidemiology , Liver Neoplasms/prevention & control , Lung Neoplasms/epidemiology , Lung Neoplasms/prevention & control , Male , Mouth Neoplasms/epidemiology , Mouth Neoplasms/prevention & control , Neoplasms/epidemiology , Neoplasms/mortality , Phenols/pharmacology , Plant Extracts/adverse effects , Polyphenols , Randomized Controlled Trials as Topic , Skin Neoplasms/epidemiology , Skin Neoplasms/prevention & control , Tea/adverse effects , Urogenital Neoplasms/epidemiology , Urogenital Neoplasms/prevention & control
7.
Eur J Clin Nutr ; 74(4): 537-542, 2020 04.
Article in English | MEDLINE | ID: mdl-31996796

ABSTRACT

In this invited article for the Crystal Ball series, I have tried to briefly cover my undergraduate and post-graduate training and subsequent career in nutrition, and end with some thoughts about the future. It has not been possible to give a comprehensive account of my many years of nutrition research, so I have selected a few events that might amuse readers. Also, due to the lack of space, I have been unable to mention all the wonderful colleagues and friends with whom I have interacted, but, if they read this article, they know who they are. Unfortunately, a growing number are no longer with us and I would like to pay tribute to them and their important contribution to human nutrition.


Subject(s)
Nutritionists , Humans
8.
Am J Clin Nutr ; 111(1): 98-109, 2020 01 01.
Article in English | MEDLINE | ID: mdl-31559434

ABSTRACT

BACKGROUND: Mediterranean diets limit red meat consumption and increase intakes of high-phytate foods, a combination that could reduce iron status. Conversely, higher intakes of fish, a good source of selenium, could increase selenium status. OBJECTIVES: A 1-y randomized controlled trial [New Dietary Strategies Addressing the Specific Needs of the Elderly Population for Healthy Aging in Europe (NU-AGE)] was carried out in older Europeans to investigate the effects of consuming a Mediterranean-style diet on indices of inflammation and changes in nutritional status. METHODS: Selenium and iron intakes and status biomarkers were measured at baseline and after 1 y in 1294 people aged 65-79 y from 5 European countries (France, Italy, the Netherlands, Poland, and the United Kingdom) who had been randomly allocated either to a Mediterranean-style diet or to remain on their habitual, Western diet. RESULTS: Estimated selenium intakes increased significantly in the intervention group (P < 0.01), but were not accompanied by changes in serum selenium concentrations. Iron intakes also increased (P < 0.001), but there was no change in iron status. However, when stratified by study center, there were positive effects of the intervention on iron status for serum ferritin for participants in Italy (P = 0.04) and France (P = 0.04) and on soluble transferrin receptor (sTfR) for participants in Poland (P < 0.01). Meat intake decreased and fish intake increased to a greater degree in the intervention group, relative to the controls (P < 0.01 for both), but the overall effects of the intervention on meat and fish intakes were mainly driven by data from Poland and France. Changes in serum selenium in the intervention group were associated with greater changes in serum ferritin (P = 0.01) and body iron (P = 0.01), but not sTfR (P = 0.73); there were no study center × selenium status interactions for the iron biomarkers.
CONCLUSIONS: Consuming a Mediterranean-style diet for 1 y had no overall effect on iron or selenium status, although there were positive effects on biomarkers of iron status in some countries. The NU-AGE trial was registered at clinicaltrials.gov as NCT01754012.


Subject(s)
Diet, Mediterranean , Healthy Aging/metabolism , Iron/blood , Selenium/blood , Aged , Europe , Female , Healthy Aging/blood , Humans , Iron/metabolism , Male , Nutritional Status , Selenium/metabolism
9.
Am J Clin Nutr ; 108(3): 633-640, 2018 09 01.
Article in English | MEDLINE | ID: mdl-30007343

ABSTRACT

Background: The Mediterranean diet (MD) is widely recommended for the prevention of chronic disease, but evidence for a beneficial effect on bone health is lacking. Objective: The aim of this study was to examine the effect of a Mediterranean-like dietary pattern [NU-AGE (New Dietary Strategies Addressing the Specific Needs of the Elderly Population for Healthy Aging in Europe)] on indices of inflammation, with a number of secondary endpoints including bone mineral density (BMD) and biomarkers of bone and collagen degradation, in a 1-y multicenter randomized controlled trial (RCT; NU-AGE) in elderly Europeans. Design: An RCT was undertaken across 5 European centers. Subjects in the intervention group consumed the NU-AGE diet for 1 y by receiving individually tailored dietary advice, coupled with supplies of foods including whole-grain pasta, olive oil, and a vitamin D3 supplement (10 µg/d). Participants in the control group were provided with leaflets on healthy eating available in their country. Results: A total of 1294 participants (mean ± SD age: 70.9 ± 4.0 y; 44% male) were recruited to the study and 1142 completed the 1-y trial. The Mediterranean-like dietary pattern had no effect on BMD (site-specific or whole-body); the inclusion of compliance to the intervention in the statistical model did not change the findings. There was also no effect of the intervention on the urinary biomarkers free pyridinoline or free deoxypyridinoline. Serum 25-hydroxyvitamin D significantly increased and parathyroid hormone decreased (P < 0.001) in the MD group compared with the control group. Subgroup analysis of individuals with osteoporosis at baseline (site-specific BMD T-score ≤ -2.5 SDs) showed that the MD attenuated the expected decline in femoral neck BMD (n = 24 and 30 in MD and control groups, respectively; P = 0.04) but had no effect on lumbar spine or whole-body BMD.
Conclusions: A 1-y intervention of the Mediterranean-like diet together with vitamin D3 supplements (10 µg/d) had no effect on BMD in the normal age-related range, but it significantly reduced the rate of loss of bone at the femoral neck in individuals with osteoporosis. The NU-AGE trial is registered at clinicaltrials.gov as NCT01754012.


Subject(s)
Cholecalciferol/administration & dosage , Diet, Mediterranean , Osteoporosis/physiopathology , Aged , Amino Acids/urine , Biomarkers/blood , Biomarkers/urine , Bone Density , Bone and Bones/metabolism , Collagen/metabolism , Dietary Supplements , Europe , Female , Femur Neck , Humans , Male , Olive Oil , Osteoporosis/diet therapy , Osteoporosis/drug therapy , Parathyroid Hormone/blood , Vitamin D/analogs & derivatives , Vitamin D/blood , Whole Grains
10.
Proc Nutr Soc ; : 1-7, 2018 Jul 27.
Article in English | MEDLINE | ID: mdl-30049292

ABSTRACT

This review aims to describe approaches used to estimate bioavailability when deriving dietary reference values (DRV) for iron and zinc using the factorial approach. Various values have been applied by different expert bodies to convert absorbed iron or zinc into dietary intakes, and these are summarised in this review. The European Food Safety Authority (EFSA) derived zinc requirements from a trivariate saturation response model describing the relationship between zinc absorption and dietary zinc and phytate. The average requirement for men and women was determined as the intercept of the total absorbed zinc needed to meet physiological requirements, calculated according to body weight, with phytate intake levels of 300, 600, 900 and 1200 mg/d, which are representative of mean/median intakes observed in European populations. For iron, the method employed by EFSA was to use whole body iron losses, determined from radioisotope dilution studies, to calculate the quantity of absorbed iron required to maintain null balance. Absorption from the diet was estimated from a probability model based on measures of iron intake and status and physiological requirements for absorbed iron. Average dietary requirements were derived for men and pre- and post-menopausal women. Taking into consideration the complexity of deriving DRV for iron and zinc, mainly due to the limited knowledge on dietary bioavailability, it appears that EFSA has made maximum use of the most relevant up-to-date data to develop novel and transparent DRV for these nutrients.

11.
Mech Ageing Dev ; 175: 55-73, 2018 10.
Article in English | MEDLINE | ID: mdl-30040993

ABSTRACT

A comprehensive literature review of iron status in the elderly was undertaken in order to update a previous review (Fairweather-Tait et al., 2014); 138 summarised papers describe research on the magnitude of the problem, aetiology and age-related physiological changes that may affect iron status, novel strategies for assessing iron status with concurrent health conditions, hepcidin, lifestyle factors, iron supplements, and iron status and health outcomes (bone mineral density, frailty, inflammatory bowel disease, kidney failure, cancer, cardiovascular, and neurodegenerative diseases). Each section of this review concludes with key points from the relevant papers. The overall finding was that disturbed iron metabolism plays a major role in a large number of conditions associated with old age. Correction of iron deficiency/overload may improve disease prognosis, but diagnosis of iron deficiency requires appropriate cut-offs for biomarkers of iron status in elderly men and women to be agreed. Iron deficiency (with or without anemia), anemia of inflammation, and anemia of chronic disease are all widespread in the elderly and, once identified, should be investigated further as they are often indicative of underlying disease. Management options should be reviewed and updated, and novel therapies, which show potential for treating anemia of inflammation or chronic disease, should be considered.


Subject(s)
Aging/blood , Anemia, Iron-Deficiency/blood , Iron Overload/blood , Iron/blood , Adult , Age Factors , Aged , Aged, 80 and over , Anemia, Iron-Deficiency/diagnosis , Anemia, Iron-Deficiency/epidemiology , Anemia, Iron-Deficiency/therapy , Biomarkers/blood , Comorbidity , Female , Geriatric Assessment , Humans , Iron Overload/diagnosis , Iron Overload/epidemiology , Iron Overload/therapy , Life Style , Male , Middle Aged , Prognosis , Risk Factors
12.
Genes Nutr ; 12: 35, 2017.
Article in English | MEDLINE | ID: mdl-29270237

ABSTRACT

Nutrigenetic research examines the effects of inter-individual differences in genotype on responses to nutrients and other food components, in the context of health and of nutrient requirements. A practical application of nutrigenetics is the use of personal genetic information to guide recommendations for dietary choices that are more efficacious at the individual or genetic subgroup level relative to generic dietary advice. Nutrigenetics is unregulated, with no defined standards beyond some commercially adopted codes of practice. Only a few official nutrition-related professional bodies have embraced the subject, and, consequently, there is a lack of educational resources or guidance for implementation of the outcomes of nutrigenetic research. To avoid misuse and to protect the public, personalised nutrigenetic advice and information should be based on clear evidence of validity grounded in a careful and defensible interpretation of outcomes from nutrigenetic research studies. Evidence requirements are clearly stated and assessed within the context of state-of-the-art 'evidence-based nutrition'. We have developed and present here a draft framework that can be used to assess the strength of the evidence for the scientific validity of nutrigenetic knowledge and whether it is 'actionable'. In addition, we propose that this framework be used as the basis for developing transparent and scientifically sound advice to the public based on nutrigenetic tests. We feel that although this area is still in its infancy, minimal guidelines are required. Though these guidelines are based on semi-quantitative data, they should stimulate debate on their utility. This framework will be revised biennially, as knowledge on the subject increases.

13.
Nutrients ; 9(9)2017 Sep 12.
Article in English | MEDLINE | ID: mdl-28895913

ABSTRACT

Iron deficiency is a major public health concern and nutritional approaches are required to reduce its prevalence. The aim of this study was to examine the iron bioavailability of a novel home fortificant, the "Lucky Iron Fish™" (LIF) (www.luckyironfish.com/shop, Guelph, Canada) and the impact of dietary factors and a food matrix on iron uptake from LIF in Caco-2 cells. LIF released a substantial quantity of iron (about 1.2 mM) at pH 2 but this iron was only slightly soluble at pH 7 and not taken up by cells. The addition of ascorbic acid (AA) maintained the solubility of iron released from LIF (LIF-iron) at pH 7 and facilitated iron uptake by the cells in a concentration-dependent manner. In vitro digestion of LIF-iron in the presence of peas increased iron uptake 10-fold. However, the addition of tannic acid to the digestion reduced the cellular iron uptake 7.5-fold. Additionally, LIF-iron induced an overproduction of reactive oxygen species (ROS), similar to ferrous sulfate, but this effect was counteracted by the addition of AA. Overall, our data illustrate the major influence of dietary factors on iron solubility and bioavailability from LIF, and demonstrate that the addition of AA enhances iron uptake and reduces ROS in the intestinal lumen.


Subject(s)
Anemia, Iron-Deficiency/prevention & control , Iron/pharmacokinetics , Ascorbic Acid/pharmacology , Biological Availability , Biological Transport , Caco-2 Cells , Canada , Cell Survival/drug effects , Ferritins/metabolism , Ferrous Compounds/metabolism , Humans , Hydrogen-Ion Concentration , Reactive Oxygen Species/metabolism , Solubility , Tannins/pharmacology
14.
Am J Clin Nutr ; 105(6): 1408-1414, 2017 06.
Article in English | MEDLINE | ID: mdl-28381473

ABSTRACT

Background: Values for dietary iron bioavailability are required for setting dietary reference values. These are estimated from predictive algorithms, nonheme iron absorption from meals, and models of iron intake, serum ferritin concentration, and iron requirements. Objective: We developed a new interactive tool to predict dietary iron bioavailability. Design: Iron intake and serum ferritin, a quantitative marker of body iron stores, from 2 nationally representative studies of adults in the United Kingdom and Ireland and a trial in elderly people in Norfolk, United Kingdom, were used to develop a model to predict dietary iron absorption at different serum ferritin concentrations. Individuals who had raised inflammatory markers or were taking iron-containing supplements were excluded. Results: Mean iron intakes were 13.6, 10.3, and 10.9 mg/d and mean serum ferritin concentrations were 140.7, 49.4, and 96.7 µg/L in men, premenopausal women, and postmenopausal women, respectively. The model predicted that at serum ferritin concentrations of 15, 30, and 60 µg/L, mean dietary iron absorption would be 22.3%, 16.3%, and 11.6%, respectively, in men; 27.2%, 17.2%, and 10.6%, respectively, in premenopausal women; and 18.4%, 12.7%, and 10.5%, respectively, in postmenopausal women. Conclusions: An interactive program for calculating dietary iron absorption at any concentration of serum ferritin is presented. Differences in iron status are partly explained by age but also by diet, with meat being a key determinant. The effect of the diet is more marked at lower serum ferritin concentrations. The model can be applied to any adult population in whom representative, good-quality data on iron intake and iron status have been collected. Values for dietary iron bioavailability can be derived for any target concentration of serum ferritin, thereby giving risk managers and public health professionals a flexible and transparent basis for their dietary recommendations.
This trial was registered at clinicaltrials.gov as NCT01754012.


Subject(s)
Diet , Ferritins/blood , Intestinal Absorption , Iron, Dietary/blood , Iron/blood , Adult , Aged , Biological Availability , Biomarkers/blood , Female , Humans , Ireland , Iron/pharmacokinetics , Iron, Dietary/pharmacokinetics , Male , Meat , Middle Aged , United Kingdom
15.
Am J Clin Nutr ; 104(1): 121-31, 2016 Jul.
Article in English | MEDLINE | ID: mdl-27225436

ABSTRACT

BACKGROUND: Water-loss dehydration (hypertonic, hyperosmotic, or intracellular dehydration) is due to insufficient fluid intake and is distinct from hypovolemia due to excess fluid losses. Water-loss dehydration is associated with poor health outcomes such as disability and mortality in older people. Urine specific gravity (USG), urine color, and urine osmolality have been widely advocated for screening for dehydration in older adults. OBJECTIVE: We assessed the diagnostic accuracy of urinary measures to screen for water-loss dehydration in older people. DESIGN: This was a diagnostic accuracy study of people aged ≥65 y taking part in the DRIE (Dehydration Recognition In our Elders; living in long-term care) or NU-AGE (Dietary Strategies for Healthy Ageing in Europe; living in the community) studies. The reference standard was serum osmolality, and index tests included USG, urine color, urine osmolality, urine cloudiness, additional dipstick measures, ability to provide a urine sample, and the volume of a random urine sample. Minimum useful diagnostic accuracy was set at sensitivity and specificity ≥70% or a receiver operating characteristic plot area under the curve ≥0.70. RESULTS: DRIE participants (women: 67%; mean age: 86 y; n = 162) had more limited cognitive and functional abilities than did NU-AGE participants (women: 64%; mean age: 70 y; n = 151). Nineteen percent of DRIE participants and 22% of NU-AGE participants were dehydrated (serum osmolality >300 mOsm/kg). Neither USG nor any other potential urinary tests were usefully diagnostic for water-loss dehydration. CONCLUSIONS: Although USG, urine color, and urinary osmolality have been widely advocated for screening for dehydration in older adults, we show, in the largest study to date to our knowledge, that their diagnostic accuracy is too low to be useful, and these measures should not be used to indicate hydration status in older people (either alone or as part of a wider tranche of tests). 
There is a need to develop simple, inexpensive, and noninvasive tools for the assessment of dehydration in older people. The DRIE study was registered at www.researchregister.org.uk as 122273. The NU-AGE trial was registered at clinicaltrials.gov as NCT01754012.


Subject(s)
Dehydration/diagnosis , Urinalysis/methods , Water-Electrolyte Balance , Water , Aged , Aged, 80 and over , Area Under Curve , Biomarkers/urine , Color , Dehydration/urine , Female , Humans , Male , Osmolar Concentration , ROC Curve , Sensitivity and Specificity , Specific Gravity
16.
BMJ Open ; 5(10): e008846, 2015 Oct 21.
Article in English | MEDLINE | ID: mdl-26490100

ABSTRACT

OBJECTIVES: To assess which osmolarity equation best predicts directly measured serum/plasma osmolality and whether its use could add value to routine blood test results through screening for dehydration in older people. DESIGN: Diagnostic accuracy study. PARTICIPANTS: Older people (≥65 years) in 5 cohorts: Dietary Strategies for Healthy Ageing in Europe (NU-AGE, living in the community), Dehydration Recognition In our Elders (DRIE, living in residential care), Fortes (admitted to acute medical care), Sjöstrand (emergency room) or Pfortmueller cohorts (hospitalised with liver cirrhosis). REFERENCE STANDARD FOR HYDRATION STATUS: Directly measured serum/plasma osmolality: current dehydration (serum osmolality >300 mOsm/kg), impending/current dehydration (≥295 mOsm/kg). INDEX TESTS: 39 osmolarity equations calculated using serum indices from the same blood draw as directly measured osmolality. RESULTS: Across the 5 cohorts, 595 older people were included, of whom 19% were dehydrated (directly measured osmolality >300 mOsm/kg). Of 39 osmolarity equations, 5 showed reasonable agreement with directly measured osmolality and 3 had good predictive accuracy in subgroups with diabetes and poor renal function. Two equations were characterised by narrower limits of agreement, low levels of differential bias and good diagnostic accuracy in receiver operating characteristic plots (areas under the curve >0.8). The best equation was: osmolarity = 1.86 × (Na⁺ + K⁺) + 1.15 × glucose + urea + 14 (all concentrations in mmol/L). It appeared useful in people aged ≥65 years with and without diabetes, poor renal function, dehydration, in men and women, with a range of ages, health, cognitive and functional status. CONCLUSIONS: Some commonly used osmolarity equations work poorly, and should not be used.
Given the costs and prevalence of dehydration in older people, we suggest that pathology laboratories use the best formula, with a cutpoint of 295 mOsm/L (sensitivity 85%, specificity 59%), to report dehydration risk opportunistically whenever serum glucose, urea and electrolytes are measured for other reasons in older adults. TRIAL REGISTRATION NUMBERS: DRIE: Research Register for Social Care, 122273; NU-AGE: ClinicalTrials.gov NCT01754012.
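The best-performing equation and the suggested cutpoint lend themselves to a direct calculation. A minimal sketch follows; the function names are illustrative and the example inputs are invented values within typical adult reference ranges, not data from the study:

```python
def osmolarity_mmol_l(na, k, glucose, urea):
    """Estimated serum osmolarity (mOsm/L) from the best-performing
    equation in the study; all inputs in mmol/L."""
    return 1.86 * (na + k) + 1.15 * glucose + urea + 14


def dehydration_risk(osmolarity, cutpoint=295):
    """Flag impending/current dehydration at the suggested cutpoint of
    295 mOsm/L (reported sensitivity 85%, specificity 59%)."""
    return osmolarity >= cutpoint


# Illustrative inputs: Na 140, K 4.5, glucose 5.0, urea 6.0 mmol/L
osm = osmolarity_mmol_l(140, 4.5, 5.0, 6.0)  # ≈ 294.5, just below the cutpoint
at_risk = dehydration_risk(osm)
```

Because the equation needs only sodium, potassium, glucose and urea, it can be evaluated from a routine urea-and-electrolytes panel at no extra cost, which underpins the authors' suggestion of opportunistic reporting by pathology laboratories.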


Subject(s)
Dehydration/blood , Dehydration/diagnosis , Osmolar Concentration , Adult , Aged , Aged, 80 and over , Cohort Studies , Europe , Female , Humans , Male , Middle Aged , Multicenter Studies as Topic , Prognosis , ROC Curve , Randomized Controlled Trials as Topic , Sensitivity and Specificity
18.
PLoS One ; 9(11): e112144, 2014.
Article in English | MEDLINE | ID: mdl-25391138

ABSTRACT

UNLABELLED: Previous in vitro results indicated that alginate beads might be a useful vehicle for food iron fortification. A human study was undertaken to test the hypothesis that alginate enhances iron absorption. A randomised, single-blinded, cross-over trial was carried out in which iron absorption was measured from serum iron appearance after a test meal. Overnight-fasted volunteers (n = 15) were given a test meal of 200 g cola-flavoured jelly plus 21 mg iron as ferrous gluconate, either in alginate beads mixed into the jelly or in a capsule. Iron absorption was lower from the alginate beads than from the ferrous gluconate capsule (8.5% and 12.6% respectively, p = 0.003). Sub-group B (n = 9) consumed the test meals together with 600 mg calcium to determine whether alginate modified the inhibitory effect of calcium. Calcium reduced iron absorption from ferrous gluconate by 51%, from 11.5% to 5.6% (p = 0.014), and from alginate beads by 37%, from 8.3% to 5.2% (p = 0.009). In vitro studies using Caco-2 cells were designed to explore the reasons for the difference between the previous in vitro findings and the human study; these confirmed the inhibitory effect of alginate. Beads similar to those used in the human study were subjected to simulated gastrointestinal digestion, with and without cola jelly, and the digestate applied to Caco-2 cells. Both alginate and cola jelly significantly reduced iron uptake into the cells, by 34% (p = 0.009) and 35% (p = 0.003) respectively. The combination of cola jelly and calcium produced a very low ferritin response, 16.5% (p < 0.001) of that observed with ferrous gluconate alone. The results of these studies demonstrate that alginate beads are not a useful delivery system for soluble salts of iron for the purpose of food fortification. TRIAL REGISTRATION: ClinicalTrials.gov NCT01528644.


Subject(s)
Alginates/chemistry , Alginates/pharmacology , Ferrous Compounds/chemistry , Ion Transport/drug effects , Iron/blood , Iron/metabolism , Adolescent , Adult , Aged , Caco-2 Cells , Calcium/blood , Calcium/chemistry , Cross-Over Studies , Enzyme-Linked Immunosorbent Assay , Ferritins/analysis , Ferrous Compounds/administration & dosage , Ferrous Compounds/pharmacology , Glucuronic Acid/chemistry , Glucuronic Acid/pharmacology , Hexuronic Acids/chemistry , Hexuronic Acids/pharmacology , Humans , Male , Middle Aged , Young Adult
19.
PLoS One ; 9(10): e111824, 2014.
Article in English | MEDLINE | ID: mdl-25356629

ABSTRACT

Currently there are no satisfactory methods for estimating dietary iron absorption (bioavailability) at a population level, but this is essential for deriving dietary reference values using the factorial approach. The aim of this work was to develop a novel approach for estimating dietary iron absorption using a population sample from a sub-section of the UK National Diet and Nutrition Survey (NDNS). Data were analyzed in 873 subjects from the 2000-2001 adult cohort of the NDNS, for whom both dietary intake data and hematological measures (hemoglobin and serum ferritin (SF) concentrations) were available. There were 495 men aged 19-64 y (mean age 42.7±12.1 y) and 378 pre-menopausal women (mean age 35.7±8.2 y). Individual dietary iron requirements were estimated using the Institute of Medicine calculations. A full probability approach was then applied to estimate the prevalence of dietary intakes that were insufficient to meet the needs of the men and women separately, based on their estimated daily iron intake and a series of absorption values ranging from 1-40%. The prevalence of SF concentrations below selected cut-off values (indicating that absorption was not high enough to maintain iron stores) was derived from individual SF concentrations. An estimate of dietary iron absorption required to maintain specified SF values was then calculated by matching the observed prevalence of insufficiency with the prevalence predicted for the series of absorption estimates. Mean daily dietary iron intakes were 13.5 mg for men and 9.8 mg for women. Mean calculated dietary absorption was 8% in men (50th percentile for SF 85 µg/L) and 17% in women (50th percentile for SF 38 µg/L). At a ferritin level of 45 µg/L estimated absorption was similar in men (14%) and women (13%). This new method can be used to calculate dietary iron absorption at a population level using data describing total iron intake and SF concentration.
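The prevalence-matching step described above can be sketched in a few lines. This is a deliberately simplified illustration with point-value requirements and synthetic inputs; the study itself used the Institute of Medicine full-probability approach, in which individual requirements follow distributions rather than single values:

```python
import numpy as np

def estimated_absorption(intakes, requirements, observed_prevalence,
                         absorption_range=np.arange(0.01, 0.41, 0.01)):
    """For each trial absorption value (1-40%), compute the predicted
    prevalence of insufficiency (absorbed iron below the individual
    requirement), then return the absorption value whose prediction is
    closest to the observed prevalence of low serum ferritin."""
    intakes = np.asarray(intakes, dtype=float)            # mg/d, total iron
    requirements = np.asarray(requirements, dtype=float)  # mg/d, absorbed iron
    predicted = np.array([(intakes * a < requirements).mean()
                          for a in absorption_range])
    best = int(np.argmin(np.abs(predicted - observed_prevalence)))
    return absorption_range[best]

# Synthetic example: everyone eats 10 mg/d; requirements span 0.5-1.5 mg/d.
intakes = np.full(101, 10.0)
requirements = np.linspace(0.5, 1.5, 101)
absorption = estimated_absorption(intakes, requirements,
                                  observed_prevalence=0.495)
```

Given intake and requirement vectors for a population and the observed prevalence of serum ferritin below the chosen cutoff, the function returns the trial absorption value whose predicted prevalence of insufficiency best matches that observation, mirroring the matching step the abstract describes.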


Subject(s)
Food , Iron, Dietary/metabolism , Iron/metabolism , Adult , Biological Availability , Female , Ferritins/blood , Humans , Male , Middle Aged , Nutrition Surveys , Probability , United Kingdom , Young Adult
20.
J Agric Food Chem ; 62(42): 10320-5, 2014 Oct 22.
Article in English | MEDLINE | ID: mdl-25275535

ABSTRACT

Iron bioavailability in unleavened white and wholegrain bread made from two commercial wheat varieties was assessed by measuring ferritin production in Caco-2 cells. The breads were subjected to simulated gastrointestinal digestion and the digests applied to the Caco-2 cells. Although Riband grain contained a lower iron concentration than Rialto, iron bioavailability was higher. No iron was taken up by the cells from white bread made from Rialto flour or from wholegrain bread from either variety, but Riband white bread produced a small ferritin response. The results probably relate to differences in phytate content of the breads, although iron in soluble monoferric phytate was demonstrated to be bioavailable in the cell model. Nicotianamine, an iron chelator in plants involved in iron transport, was a more potent enhancer of iron uptake into Caco-2 cells than ascorbic acid or 2'-deoxymugineic acid, another metal chelator present in plants.


Subject(s)
Azetidinecarboxylic Acid/analogs & derivatives , Digestion , Iron/metabolism , Seeds/metabolism , Triticum/metabolism , Azetidinecarboxylic Acid/metabolism , Biological Availability , Bread/analysis , Caco-2 Cells , Flour/analysis , Humans , Models, Biological , Triticum/economics