Results 1 - 20 of 127
1.
Nutr Res Rev ; : 1-17, 2024 Feb 23.
Article in English | MEDLINE | ID: mdl-38389450

ABSTRACT

Energy-restricted (ER) diets promote weight loss and improve body composition and glycaemic control. Nut consumption also improves these parameters. However, less is known about the combined benefit of these two strategies. This scoping review implemented a systematic search of Medline, Embase and Scopus to identify randomised controlled trials evaluating the effect of ER diets with or without nuts on body mass, body composition and glycaemic control in adults. After reviewing titles and abstracts, twenty-nine full-text articles were screened, resulting in seven studies reported in eight papers that met the inclusion criteria. Energy restriction was achieved by prescribing a set energy target or reducing intake by 1000-4200 kJ from daily energy requirements. Interventions ranged from 4 to 52 weeks in duration and contained 42-84 g/d of almonds, peanuts, pistachios or walnuts. While all studies reported that energy restriction resulted in significant weight loss, adding nuts to ER diets produced significantly greater weight loss in only about half of the included studies (4/7). There was limited evidence to support additional benefits from nuts for body composition measures or glycaemic control. Although improvements in weight loss and glycaemia were not consistent when nuts were included in ER diets, no study revealed an adverse effect of nut consumption on health outcomes. Future studies could explore the effect of consuming different types and amounts of nuts, combined with various levels of energy restriction, on weight, body composition and glycaemic control.

2.
BMC Med Inform Decis Mak ; 24(1): 273, 2024 Sep 27.
Article in English | MEDLINE | ID: mdl-39334341

ABSTRACT

BACKGROUND: Decision thresholds play an important role in medical decision-making. Individual decision-making differences may be attributable to differences in subjective judgments or cognitive processes that are captured through the decision thresholds. This systematic scoping review sought to characterize the literature on non-expected utility decision thresholds in medical decision-making by identifying commonly used theoretical paradigms and contextual and subjective factors that inform decision thresholds. METHODS: A structured search designed around three concepts-individual decision-maker, decision threshold, and medical decision-was conducted in MEDLINE (Ovid) and Scopus databases from inception to July 2023. The ProQuest (Dissertations and Theses) database was searched to August 2023. The protocol, developed a priori, was registered on Open Science Framework, and PRISMA-ScR guidelines were followed for reporting on this study. Titles and abstracts of 1,618 articles and the full texts of 228 articles were reviewed by two independent reviewers. Ninety-five articles were included in the analysis. A single reviewer used a pilot-tested data collection tool to extract study and author characteristics, article type, objectives, theoretical paradigm, contextual or subjective factors, decision-maker, and type of medical decision. RESULTS: Of the 95 included articles, 68 identified a theoretical paradigm in their approach to decision thresholds. The most common paradigms included regret theory, hybrid theory, and dual processing theory. Contextual and subjective factors that influence decision thresholds were identified in 44 articles. CONCLUSIONS: Our scoping review is the first to systematically characterize the available literature on decision thresholds within medical decision-making. This study offers an important characterization of the literature through the identification of the theoretical paradigms for non-expected utility decision thresholds. Moreover, this study provides insight into the various contextual and subjective factors that have been documented within the literature to influence decision thresholds, as well as how these factors intersect with theoretical paradigms.


Subject(s)
Clinical Decision-Making , Humans , Decision Support Techniques
3.
Int J Health Plann Manage ; 39(3): 906-916, 2024 May.
Article in English | MEDLINE | ID: mdl-38369691

ABSTRACT

The global health workforce crisis, simmering for decades, was brought to a rolling boil by the COVID-19 pandemic in 2020. With scarce literature, evidence, or best practices to draw from, countries around the world moved to flex their workforces to meet acute challenges of the pandemic, facing demands related to patient volume, patient acuity, and worker vulnerability and absenteeism. One early hypothesis suggested that the acute, short-term pandemic phase would be followed by several waves of resource demands extending over the longer term. However, as the acute phase of the pandemic abated, temporary workforce policies expired and others were repealed with a view to returning to 'normal'. The workforce needs of subsequent phases of pandemic effects were largely ignored despite our new equilibrium resting nowhere near our pre-COVID baseline. In this paper, we describe Canada's early pandemic workforce response. We report the results of an environmental scan of the early workforce strategies adopted in Canada during the first wave of the COVID-19 pandemic. Within an expanded three-part conceptual framework for supporting a sustainable health workforce, we describe 470 strategies and policies that aimed to increase the numbers and flexibility of health workers in Canada, and to maximise their continued availability to work. These strategies targeted all types of health workers and roles, enabling changes to the places health work is done, the way in which care is delivered, and the mechanisms by which it is regulated. Telehealth strategies and virtual care were the most prevalent, followed by role expansion, licensure flexibility, mental health supports for workers, and return to practice of retirees. We explore the degree to which these short-term, acute response strategies might be adapted or extended to support the evolving workforce's long-term needs.


Subject(s)
COVID-19 , Health Workforce , Pandemics , COVID-19/epidemiology , Humans , Canada , Health Workforce/organization & administration , SARS-CoV-2 , Health Personnel
4.
Curr Atheroscler Rep ; 25(7): 373-380, 2023 07.
Article in English | MEDLINE | ID: mdl-37219706

ABSTRACT

PURPOSE OF REVIEW: This review summarizes recent evidence published since a previous review in 2018 on the association between egg consumption and risk of cardiovascular disease (CVD) mortality, CVD incidence, and CVD risk factors. RECENT FINDINGS: No recent randomized controlled trials were identified. Evidence from observational studies is mixed, with studies reporting either an increased risk or no association of highest egg consumption with CVD mortality, and a similar spread of increased risk, decreased risk, or no association between egg intake and total CVD incidence. Most studies reported a reduced risk or no association between egg consumption and CVD risk factors. Included studies defined low egg intake as 0-1.9 eggs/week and high intake as 2 to ≥14 eggs/week. Ethnicity may influence the risk of CVD with egg consumption, likely due to differences in how eggs are consumed in the diet rather than eggs themselves. Recent findings are inconsistent regarding the possible relationship between egg consumption and CVD mortality and morbidity. Dietary guidance should focus on improving the overall quality of the diet to promote cardiovascular health.


Subject(s)
Cardiovascular Diseases , Humans , Risk Factors , Cardiovascular Diseases/etiology , Diet
5.
Eur J Nutr ; 62(2): 857-866, 2023 Mar.
Article in English | MEDLINE | ID: mdl-36305961

ABSTRACT

PURPOSE: Early satiety has been identified as one of the mechanisms that may explain the beneficial effects of nuts for reducing obesity. This study compared postprandial changes in appetite-regulating hormones and self-reported appetite ratings after consuming almonds (AL, 15% of energy requirement) or an isocaloric carbohydrate-rich snack bar (SB). METHODS: This is a sub-analysis of baseline assessments of a larger parallel-arm randomised controlled trial in overweight and obese (Body Mass Index 27.5-34.9 kg/m2) adults (25-65 years). After an overnight fast, 140 participants consumed a randomly allocated snack (AL [n = 68] or SB [n = 72]). Appetite-regulating hormones and self-reported appetite sensations, measured using visual analogue scales, were assessed immediately before snack food consumption, and at 30, 60, 90 and 120 min following snack consumption. A sub-set of participants (AL, n = 49; SB, n = 48) then consumed an ad libitum buffet meal challenge to assess subsequent energy intake. An additional appetite rating assessment was administered post buffet at 150 min. RESULTS: Postprandial C-peptide area under the curve (AUC) response was 47% smaller with AL compared to SB (p < 0.001). Glucose-dependent insulinotropic polypeptide, glucagon and pancreatic polypeptide AUC responses were larger with AL compared to SB (18%, p = 0.005; 39%, p < 0.001; 45%, p < 0.001, respectively). Cholecystokinin, ghrelin, glucagon-like peptide-1, leptin and peptide YY AUCs were not different between groups. Self-reported appetite ratings and energy intake following the buffet did not differ between groups. CONCLUSION: More favourable appetite-regulating hormone responses to AL did not translate into better self-reported appetite or reduced short-term energy consumption. Future studies should investigate implications for longer term appetite regulation. ANZCTR REFERENCE NUMBER: ACTRN12618001861246 2018.
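Postprandial AUC responses like those reported above are conventionally computed with the trapezoidal rule over the sampling times (0, 30, 60, 90 and 120 min). A minimal sketch, using illustrative values rather than study data:

```python
def trapezoid_auc(times, values):
    """Total area under the curve by the trapezoidal rule.

    times  -- sampling times in minutes, ascending
    values -- hormone concentrations at each time point
    """
    if len(times) != len(values) or len(times) < 2:
        raise ValueError("need matched times/values with at least 2 points")
    auc = 0.0
    for i in range(1, len(times)):
        # area of one trapezoid between consecutive samples
        auc += (times[i] - times[i - 1]) * (values[i] + values[i - 1]) / 2.0
    return auc

# Hypothetical postprandial curves sampled at 0, 30, 60, 90 and 120 min
times = [0, 30, 60, 90, 120]
snack_a = [1.0, 2.0, 1.8, 1.4, 1.2]   # illustrative response to snack A
snack_b = [1.0, 3.4, 3.0, 2.2, 1.6]   # illustrative response to snack B

auc_a = trapezoid_auc(times, snack_a)
auc_b = trapezoid_auc(times, snack_b)
print(auc_a, auc_b)
```

Studies sometimes report incremental AUC (area above the fasting baseline) rather than total AUC; subtracting `values[0]` from every sample before integrating gives that variant.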


Subject(s)
Appetite , Prunus dulcis , Adult , Humans , Appetite/physiology , Snacks , Self Report , Insulin , Satiation/physiology , Ghrelin , Obesity , Energy Intake , Sensation , Carbohydrates , Postprandial Period
6.
Prehosp Emerg Care ; : 1-11, 2023 Aug 29.
Article in English | MEDLINE | ID: mdl-37594851

ABSTRACT

Objectives: Diet quality often changes as shift workers adjust to atypical work schedules; however, limited research exists examining the early effects of starting rotating shift work on diet and body composition. This study explored dietary behavior changes occurring in graduate paramedics during the first year of exposure to rotating shift work, and investigated dietary intake, diet quality and anthropometric changes over two years. Methods: Participants from a graduate paramedic cohort in Melbourne, Australia were approached after two years of shift work for study inclusion. Using a mixed method study approach, the qualitative component comprised individual in-depth interviews to explore perceived dietary behavior changes experienced over the first year of shift work. Interview transcripts were thematically analyzed, guided by the COM-B model (capability, opportunity, motivation, and behavior) and theoretical domains framework (TDF). Diet quality and dietary intake were quantitatively assessed by the Australian Eating Survey™ at baseline, one year, and two years, along with body weight, waist circumference, and body mass index (BMI) to monitor changes. Results: Eighteen participants were included in the study. From the interviews, participants reported: 1. food choices are driven by wanting to fit in with coworker food habits, 2. food choices and mealtimes are unpredictable and 3. paramedics try to make healthy food choices but give in to less healthy options.
While daily energy intake and diet quality scores did not differ in the first two years of shift work, daily energy from takeaway foods significantly increased (mean difference (MD): 2.96% EI; 95% CI: 0.44-5.48; p = 0.017), and increases in weight (MD: 2.96 kg; 95% CI: 0.89-5.04; p = 0.003), BMI (MD: 1.07 kg/m2; 95% CI: 0.26-1.87; p = 0.006) and waist circumference (MD: 5.07 cm; 95% CI: 1.25-8.89; p = 0.006) were also evident at two years. Conclusions: This study contributes new information on dietary changes and the early trajectory of unintentional weight gain and takeaway reliance occurring within a graduate paramedic cohort over two years of shift work. To reduce the unintended metabolic consequences commonly observed with rotating shift schedules, workplaces could improve access to healthier food options and enable behavioral support/education to address nutrition-related health risks.

7.
Curr Diab Rep ; 22(4): 147-155, 2022 04.
Article in English | MEDLINE | ID: mdl-35403984

ABSTRACT

PURPOSE OF REVIEW: The aim of this short review is to provide an updated commentary on the current literature examining the impact of meal timing on obesity and weight gain in adults. The potential mechanisms, including novel and emerging factors, behind timing of food intake across the 24-h period in the development of obesity, and dietary strategies manipulating meal timing to ameliorate weight gain are also explored. RECENT FINDINGS: Dietary patterns that feature meal timing outside of the regular daytime hours can contribute to circadian disruption, as food is metabolised in opposition to internal daily rhythms and can feed back on the timekeeping mechanisms setting these rhythms. Epidemiological evidence examining the impact of late meal timing patterns is beginning to suggest that eating at night increases the risk of weight gain over time. Mechanisms contributing to this include changes to the efficiency of metabolism across the day, and dysregulation of appetite hormones and the gut microbiota by mis-timed meals. The time of day at which meals are eaten is increasingly considered important when implementing dietary change to address the growing burden of obesity, although further research is required to determine optimal patterns.


Subject(s)
Meals , Weight Gain , Adult , Appetite , Circadian Rhythm , Energy Intake/physiology , Feeding Behavior , Humans , Obesity/epidemiology
8.
Br J Nutr ; 127(6): 872-884, 2022 03 28.
Article in English | MEDLINE | ID: mdl-33971995

ABSTRACT

Diet is a modifiable risk factor for chronic disease and a potential modulator of telomere length (TL). The study aim was to investigate associations between diet quality and TL in Australian adults after a 12-week dietary intervention with an almond-enriched diet (AED). Participants (overweight/obese, 50-80 years) were randomised to an AED (n 62) or isoenergetic nut-free diet (NFD, n 62) for 12 weeks. Diet quality was assessed using a Dietary Guideline Index (DGI), applied to weighed food records, that consists of ten components reflecting adequacy, variety and quality of core food components and discretionary choices within the diet. TL was measured by quantitative PCR in samples of lymphocytes, neutrophils, and whole blood. There were no significant associations between DGI scores and TL at baseline. Diet quality improved with AED and decreased with NFD after 12 weeks (change from baseline AED + 9·8 %, NFD - 14·3 %; P < 0·001). TL increased in neutrophils (+9·6 bp, P = 0·009) and decreased in whole blood, to a trivial extent (-12·1 bp, P = 0·001), and was unchanged in lymphocytes. Changes did not differ between intervention groups. There were no significant relationships between changes in diet quality scores and changes in lymphocyte, neutrophil or whole blood TL. The inclusion of almonds in the diet improved diet quality scores but had no impact on TL in mid-age to older Australian adults. Future studies should investigate the impact of more substantial dietary changes over longer periods of time.


Subject(s)
Overweight , Prunus dulcis , Adult , Australia , Humans , Obesity , Telomere
9.
Nutr Res Rev ; 35(1): 112-135, 2022 06.
Article in English | MEDLINE | ID: mdl-33988113

ABSTRACT

Circadian rhythms, metabolic processes and dietary intake are inextricably linked. Timing of food intake is a modifiable temporal cue for the circadian system and may be influenced by numerous factors, including individual chronotype - an indicator of an individual's circadian rhythm in relation to the light-dark cycle. This scoping review examines temporal patterns of eating across chronotypes and assesses tools that have been used to collect data on temporal patterns of eating and chronotype. A systematic search identified thirty-six studies in which aspects of temporal patterns of eating, including meal timings; meal skipping; energy distribution across the day; meal frequency; time interval between meals, or meals and wake/sleep times; midpoint of food/energy intake; meal regularity; and duration of eating window, were presented in relation to chronotype. Findings indicate that, compared with morning chronotypes, evening chronotypes tend to skip meals more frequently, have later mealtimes, and distribute greater energy intake towards later times of the day. More studies should explore the difference in meal regularity and duration of eating window amongst chronotypes. Currently, tools used in collecting data on chronotype and temporal patterns of eating are varied, limiting the direct comparison of findings between studies. Development of a standardised assessment tool will allow future studies to confidently compare findings to inform the development and assessment of guidelines that provide recommendations on temporal patterns of eating for optimal health.


Subject(s)
Feeding Behavior , Meals , Adult , Circadian Rhythm , Energy Intake , Humans , Sleep
10.
Int J Health Plann Manage ; 37(5): 2534-2541, 2022 Sep.
Article in English | MEDLINE | ID: mdl-35691008

ABSTRACT

Over the last 15 years, there has been a trend in Canada to centralise the provision of health services that were previously administratively and fiscally decentralised. Canadian policy rhetoric on centralisation often identifies improved innovation as an anticipated outcome. This paper challenges the assumed relationship between centralisation and innovation. We incorporate evidence from the management literature into the debate on the structure of health systems to explore the effects that centralisation is likely to have on innovation in health systems. The findings of this paper will be of interest to international policymakers, who are currently grappling with the prospect of maintaining a decentralised approach or adopting a more centralised health system structure in the future.


Subject(s)
Health Services , Canada , Organizational Innovation
11.
Hum Resour Health ; 19(1): 154, 2021 12 20.
Article in English | MEDLINE | ID: mdl-34930337

ABSTRACT

BACKGROUND: The early weeks of the COVID-19 pandemic brought multiple concurrent threats-high patient volume and acuity and, simultaneously, increased risk to health workers. Healthcare managers and decision-makers needed to identify strategies to mitigate these adverse conditions. This paper reports on the health workforce strategies implemented in relation to past large-scale emergencies (including natural disasters, extreme weather events, and infectious disease outbreaks). METHODS: We conducted a rapid scoping review of health workforce responses to natural disasters, extreme weather events, and infectious disease outbreaks reported in the literature between January 2000 and April 2020. The 3582 individual results were screened to include articles which described surge responses to past emergencies for which an evaluative component was included in the report. A total of 37 articles were included in our analysis. RESULTS: The reviewed literature describes challenges related to increased demand for health services and a simultaneous decrease in the availability of the workforce. Many articles also described impacts on infrastructure that hindered emergency response. These challenges aligned well with those faced during the early days of the COVID-19 pandemic. In the published literature, the workforce strategies that were described aimed either to increase the numbers of health workers in a given area, to increase the flexibility of the health workforce to meet needs in new ways, or to support and sustain health workers in practice. Workforce responses addressed all types and cadres of health workers and were executed in a wide range of settings. We additionally report on the barriers and facilitators of workforce strategies reported in the literature reviewed. 
The strategies that were reported in the literature aligned closely with our COVID-specific conceptual framework of workforce capacity levers, suggesting that our framework may have heuristic value across many types of health disasters. CONCLUSIONS: This research highlights a key deficiency with the existing literature on workforce responses to emergencies: most papers lack substantive evaluation of the strategies implemented. Future research on health workforce capacity interventions should include robust evaluation of impact and effectiveness.


Subject(s)
COVID-19 , Pandemics , Health Personnel , Health Workforce , Humans , SARS-CoV-2
12.
Eur J Cancer Care (Engl) ; 29(6): e13303, 2020 Nov.
Article in English | MEDLINE | ID: mdl-32875677

ABSTRACT

OBJECTIVE: To identify cancer survivors' perceptions of the role diet plays in their cognitive function, and how their cancer-related cognitive changes influence their diet. METHODS: Cancer survivors diagnosed with cancer in the past 5 years, not on active treatment, and with self-reported cognitive changes since diagnosis were recruited from the general population. Semi-structured interviews were conducted with 15 Australian breast (n = 13) and colorectal (n = 2) cancer survivors (mean time since diagnosis: 27.0 ± 16.8 months). Questions related to how their diet and cognitive changes influenced each other. Interviews were recorded, and transcripts were analysed using thematic analysis. RESULTS: Four themes related to how diet impacted cognition: (a) directly (e.g. healthy diet improves cognition); (b) indirectly (e.g. diet affects tiredness, which affects cognition); (c) no impact; and (d) potentially (e.g. poorer diet quality would worsen cognition). Three themes emerged for how cognitive changes were thought to impact survivors' diets: (a) planning meals is harder; (b) cooking is more difficult and complex; and (c) choosing healthy foods is more challenging. CONCLUSIONS: Many cancer survivors perceived a bidirectional influence between diet and cognition that has cognitive and behavioural consequences. Diet could be investigated as a modifiable lifestyle behaviour to improve cancer-related cognitive impairment and fatigue. Survivors may benefit from dietary guidance with meal planning and preparation.


Subject(s)
Breast Neoplasms , Cancer Survivors , Cognitive Dysfunction , Australia , Cognition , Cognitive Dysfunction/etiology , Diet , Female , Humans
13.
Asia Pac J Clin Nutr ; 28(1): 166-176, 2019.
Article in English | MEDLINE | ID: mdl-30896428

ABSTRACT

BACKGROUND AND OBJECTIVES: The need for updated competencies for nutrition scientists in Australia was identified. The aim of this paper is to describe the process of revising of these competencies for undergraduate nutrition science degrees in Australia. METHODS AND STUDY DESIGN: An iterative multiple methods approach comprising three stages was undertaken: 1. Scoping study of existing competencies; 2. Exploratory survey; and, 3. Modified Delphi process (2 rounds) involving 128 nutrition experts from industry, community, government and academia. A ≥70% consensus rule was applied to Rounds 1 and 2 of the Delphi process in order to arrive at a final list of competencies. RESULTS: Stage 1: Scoping study resulted in an initial list of 71 competency statements, categorised under six core areas. Stage 2: Exploratory survey-completed by 74 Nutrition Society of Australia (NSA) members; 76% agreed there was a need to update the current competencies. Standards were refined to six core areas and 36 statements. Stage 3: Modified Delphi process-revised competencies comprise five core competency areas, underpinned by fundamental knowledge, skills, attitudes and values: Nutrition Science; Food and the Food System; Nutrition Governance, Sociocultural and Behavioural Factors; Nutrition Research and Critical Analysis; and Communication and Professional Conduct; and three specialist competency areas: Food Science; Public Health Nutrition; and Animal Nutrition. CONCLUSIONS: The revised competencies provide an updated framework of nutrition science knowledge for graduates to effectively practice in Australia. They may be used to benchmark current and future nutrition science degrees and lead to improved employability skills of nutrition science graduates.
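The ≥70% consensus rule applied in the Delphi rounds reduces, per statement, to checking the proportion of endorsing experts against a threshold. A sketch under the assumption that each expert's rating is collapsed to a binary endorsement (the study's exact rating scale is not stated here):

```python
def consensus_reached(ratings, threshold=0.70):
    """True if the share of endorsing panellists meets the threshold.

    ratings -- iterable of booleans, True meaning the expert endorses
               the competency statement (an assumed binary coding)
    """
    ratings = list(ratings)
    if not ratings:
        raise ValueError("no ratings supplied")
    return sum(ratings) / len(ratings) >= threshold

# Hypothetical round: 93 of 128 experts endorse a statement (~72.7%)
print(consensus_reached([True] * 93 + [False] * 35))
```

Statements failing the threshold in Round 1 would be revised and re-rated in Round 2, which is the usual modified-Delphi workflow.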


Subject(s)
Curriculum , Nutritional Sciences/education , Nutritionists/education , Professional Competence/standards , Australia , Humans
14.
J Sleep Res ; 27(5): e12681, 2018 10.
Article in English | MEDLINE | ID: mdl-29582507

ABSTRACT

Caffeine is known for its capacity to mitigate performance decrements. Its metabolic side-effects are less well understood. This study examined the impact of cumulative caffeine doses on glucose metabolism, self-reported hunger and mood state during 50 hr of wakefulness. In a double-blind laboratory study, participants were assigned to caffeine (n = 9, 6M, age 21.3 ± 2.1 years; body mass index 21.9 ± 1.6 kg/m2) or placebo conditions (n = 8, 4M, age 23.0 ± 2.8 years; body mass index 21.8 ± 1.6 kg/m2). Following a baseline sleep (22:00 hours-08:00 hours), participants commenced 50 hr of sleep deprivation. Meal timing and composition were controlled throughout the study. Caffeine (200 mg) or placebo gum was chewed for 5 min at 01:00 hours, 03:00 hours, 05:00 hours and 07:00 hours during each night of sleep deprivation. Continuous glucose monitors captured interstitial glucose at 5-min intervals for 2 hr post-breakfast. Hunger and mood state were assessed at 10:00 hours, 16:30 hours, 22:30 hours and 04:30 hours. Caffeine did not affect glucose area under the curve (p = 0.680); however, the glucose response to breakfast significantly increased after 2 nights of extended wakefulness compared with baseline (p = 0.001). There was a significant main effect of day, with increased tiredness (p < 0.001), mental exhaustion (p < 0.001), irritability (p = 0.002) and stress (p < 0.001) on the second day of extended wake compared with day 1. Caffeine attenuated the rise in tiredness (p < 0.001), mental exhaustion (p = 0.044) and irritability (p = 0.018) on day 1 but not day 2. Self-reported hunger was not affected by sleep deprivation or caffeine. These data support the effectiveness of caffeine in improving performance under sleep deprivation by reducing feelings of tiredness, mental exhaustion and irritability, without impairing glucose metabolism or increasing hunger.


Subject(s)
Affect/physiology , Caffeine/adverse effects , Glucose/metabolism , Hunger/physiology , Adult , Double-Blind Method , Female , Humans , Male , Self Report , Time Factors , Wakefulness/physiology , Young Adult
15.
Nutr J ; 17(1): 62, 2018 06 15.
Article in English | MEDLINE | ID: mdl-29907153

ABSTRACT

BACKGROUND: Although higher-protein diets (HP) can assist with weight loss and glycemic control, their effect on psychological wellbeing has not been established. The objective of this study was to compare the effects of a HP and a higher-carbohydrate diet (HC), combined with regular exercise, on psychological wellbeing both during weight loss (WL) and weight maintenance phases (WM). METHODS: In a parallel RCT, 61 adults with T2D (mean ± SD: BMI 34.3 ± 5.1 kg/m2, aged 55 ± 8 years) consumed a HP diet (29% protein, 34% carbohydrate, 31% fat) or an isocaloric HC diet (21%:48%:24%), with moderate intensity exercise, for 12 weeks of WL and 12 weeks of WM. Secondary data evaluating psychological wellbeing were assessed using: Problem Areas in Diabetes (PAID); Diabetes-39 Quality of Life (D-39); Short Form Health Survey (SF-36); Perceived Stress Scale-10 (PSS-10) and the Leeds Sleep Evaluation Questionnaire (LSEQ) at Weeks 0, 12 and 24, and evaluated with mixed-models analysis. RESULTS: Independent of diet, improvements for PAID; D-39 diabetes control; D-39 severity of diabetes; SF-36 physical functioning and SF-36 general health were found following WL (d = 0.30 to 0.69, P ≤ 0.04 for all), which remained after 12 weeks of WM. SF-36 vitality improved more in the HP group (group x time interaction P = 0.03). Associations were seen between HbA1c and D-39 severity of diabetes rating (r = 0.30, P = 0.01) and SF-36 mental health (r = - 0.32, P = 0.003), and between weight loss and PAID (r = 0.30, P = 0.01). CONCLUSION: Several improvements in diabetes-related and general psychological wellbeing were seen similarly for both diets following weight loss and a reduction in HbA1c, with most of these improvements remaining when weight loss was sustained for 12 weeks. A HP diet may provide additional increases in vitality. TRIAL REGISTRATION: The trial was prospectively registered with the Australian New Zealand Clinical Trials Registry (ACTRN 12613000008729) on 4 January 2013.


Subject(s)
Diabetes Mellitus, Type 2/diet therapy , Diet, Fat-Restricted/psychology , Dietary Carbohydrates/administration & dosage , Dietary Proteins/administration & dosage , Obesity/diet therapy , Quality of Life/psychology , Aged , Exercise , Humans , Middle Aged , Surveys and Questionnaires , Weight Loss
17.
Int J Food Sci Nutr ; 69(4): 503-512, 2018 Jun.
Article in English | MEDLINE | ID: mdl-29041827

ABSTRACT

This study aimed to compare sugar intake in Australian children with current guidelines and determine if total sugar consumption as a percentage of energy (sugar %E) exacerbates the relationship between sleep and behaviour. A sample of 287 children aged 8-12 years (boys 48.8%, age: 10.7 ± 1.3 years), and their parents/guardians completed a battery of questionnaires. Children completed a food frequency questionnaire, and parents completed demographic, sleep, and behaviour questionnaires. Average sugar intake was 134.9 ± 71.7 g per day (sugar %E 26.0 ± 7.0%), and only 55 (19%) participants did not exceed the recommended sugar intake limit. Correlations and logistical regressions indicated that sugar %E was not associated with sleep or behavioural domains (r range = -0.07 to 0.08; p range = .173 to .979) nor contributed to the prediction of sleep behaviour problems (p range = .16 to .80). Whilst a high proportion of children consumed above the recommended amount of daily total sugar, total sugar consumption was not related to behavioural or sleep problems, nor affected the relationship between these variables.
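Sugar %E, as used above, expresses energy from total sugars as a share of total energy intake. A sketch assuming the common conversion factor of ~17 kJ per gram of carbohydrate (the study's exact factor is not stated; the values below are illustrative, not participant data):

```python
KJ_PER_G_SUGAR = 17.0  # assumed Atwater-style factor for carbohydrate


def sugar_percent_energy(sugar_g, total_energy_kj):
    """Percent of daily energy intake contributed by total sugars."""
    if total_energy_kj <= 0:
        raise ValueError("total energy must be positive")
    return 100.0 * sugar_g * KJ_PER_G_SUGAR / total_energy_kj


# Illustrative day: 134.9 g of sugars against ~8800 kJ total intake,
# in the range of the cohort averages reported above
print(round(sugar_percent_energy(134.9, 8800.0), 1))
```

Working backwards, the reported cohort means (134.9 g/day at 26 %E) imply average total energy intakes of roughly 8800 kJ/day under this factor, which is a plausible figure for 8- to 12-year-olds.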


Subject(s)
Child Behavior/drug effects , Dietary Carbohydrates/administration & dosage , Dietary Carbohydrates/pharmacology , Sleep/drug effects , Sugars/administration & dosage , Sugars/pharmacology , Australia , Child , Diet Records , Female , Humans , Male
18.
Nutr Neurosci ; 20(10): 555-562, 2017 Dec.
Article in English | MEDLINE | ID: mdl-27386745

ABSTRACT

OBJECTIVE: Peanuts contain bioactive nutrients beneficial for vascular function. This study investigated whether consumption of unsalted peanuts (with skins) would enhance cerebrovascular perfusion and cognitive performance. METHOD: In a randomized crossover trial, 61 volunteers (29 males/32 females, 65 ± 7 years, BMI 31 ± 4 kg/m2) consumed their habitual diet ± high-oleic peanuts (56-84 g/day), each for 12 weeks. Nutrient intakes, vascular and cognitive function were assessed at baseline and at the end of each 12-week phase. Differences between the ends of each phase were compared by general linear repeated measures ANOVA controlling for baseline. Pearson's correlation analyses determined relationships between differences in cerebrovascular reactivity (CVR) and cognitive function. RESULTS: Intakes of bioactive nutrients increased during the peanut phase. CVR was 5% greater in the left middle cerebral artery (MCA) and 7% greater in the right MCA. Small artery elasticity was 10% greater after peanut consumption; large artery elasticity and blood pressure did not differ between phases. Measures of short-term memory, verbal fluency, and processing speed were also higher following the peanut phase; other cognitive measures did not change. Differences in CVR in the left MCA correlated with differences in delayed memory and recognition. DISCUSSION: Regular peanut consumption improved cerebrovascular and cognitive function; increased intakes of bioactive nutrients may have mediated these improvements. This clinical trial was registered with the Australian Clinical Trials Registry (ACTRN 12612000192886).


Subject(s)
Arachis/chemistry , Brain/physiology , Cardiovascular Physiological Phenomena , Cognition , Overweight , Aged , Blood Pressure , Body Mass Index , Cross-Over Studies , Diet , Dietary Fiber/administration & dosage , Dietary Proteins/administration & dosage , Fatty Acids, Unsaturated/administration & dosage , Fatty Acids, Unsaturated/blood , Female , Humans , Male , Memory, Short-Term , Micronutrients/administration & dosage , Micronutrients/blood , Middle Aged
19.
Qual Life Res ; 26(11): 3119-3129, 2017 11.
Article in English | MEDLINE | ID: mdl-28674767

ABSTRACT

PURPOSE: To investigate associations between aspects of time use and health-related quality of life (HRQoL) in youth. METHODS: 239 obese and healthy-weight 10- to 13-year-old Australian children completed the Pediatric Quality of Life Inventory (PedsQL™) quantifying their health-related quality of life. Time use was evaluated over four days using the Multimedia Activity Recall for Children and Adolescents (MARCA), a validated 24 h recall tool. The average number of minutes/day spent in physical activity (divided into sport, active transport and play), screen time (divided into television, videogames and computer use), and sleep were calculated. Percent fat was measured using dual-energy X-ray absorptiometry, Tanner stage by self-report, and household income by parental report. Sex-stratified analysis was conducted using Partial Least Squares regression, with percent fat, Tanner stage, household income, and use-of-time as the independent variables, and PedsQL™ total, physical and psychosocial subscale scores as the dependent variables. RESULTS: For boys, the most important predictors of HRQoL were percent fat (negative), videogames (negative), sport (positive), and Tanner stage (negative). For girls, the significant predictors were percent fat (negative), television (negative), sport (positive), active transport (negative), and household income (positive). CONCLUSION: While body fat was the strongest correlate of HRQoL, sport was independently associated with better HRQoL, and television and videogames with poorer HRQoL. Thus, parents and clinicians should be mindful that not all physical activity and screen-based behaviours have equivalent relationships with children's HRQoL. Prospective research is needed to confirm causation and to inform current activity guidelines.


Subject(s)
Exercise/psychology , Sickness Impact Profile , Child , Female , Humans , Male , Prospective Studies , Time Factors
20.
J Assist Reprod Genet ; 34(1): 71-78, 2017 Jan.
Article in English | MEDLINE | ID: mdl-27853913

ABSTRACT

PURPOSE: This single-center retrospective analysis tested the hypothesis that live birth rates differ significantly between frozen-thawed embryo transfers performed with and without preimplantation genetic screening (PGS) in our patients using eggs from young, anonymous donors. The question therefore arises of whether PGS is an appropriate intervention for donor egg cycles. METHODS: Live birth rates per cycle and live birth rates per embryo transferred after 398 frozen embryo transfer (FET) cycles were examined in patients who elected to have PGS compared to those who did not. Blastocysts derived from donor eggs underwent trophectoderm biopsy and were tested for aneuploidy using array comparative genomic hybridization (aCGH) or next-generation sequencing (NGS), then vitrified for future use (test), or were vitrified untested (control). Embryos were subsequently warmed and transferred into a recipient or gestational carrier uterus. Data were analyzed separately for single embryo transfer (SET), double embryo transfer (DET), and for own recipient uterus and gestational carrier (GC) uterus recipients. RESULTS: Rates of implantation of embryos leading to a live birth were significantly higher in the PGS groups transferring two embryos (DET) compared to the no-PGS group (GC, 72 vs. 56 %; own uterus, 60 vs. 36 %). The live birth implantation rate for SET in the own uterus group was higher with PGS than in the control group (58 vs. 36 %), a difference that approached significance; the live birth implantation rate for the SET GC group was the same for tested and untested embryos. Live births per cycle were nominally higher in the PGS GC DET and own uterus SET and DET groups compared to the non-PGS embryo transfers, and these differences also approached significance. The live birth rate per cycle in the SET GC group was almost identical between groups.
CONCLUSIONS: Significant differences were noted only for DET; however, benefits need to be balanced against risks associated with multiple pregnancies. Results observed for SET need to be confirmed in larger series and with randomized cohorts.


Subject(s)
Blastocyst/cytology , Fertilization in Vitro , Preimplantation Diagnosis , Single Embryo Transfer/methods , Adult , Comparative Genomic Hybridization , Cryopreservation , Embryo Implantation , Female , High-Throughput Nucleotide Sequencing , Humans , Live Birth , Pregnancy , Pregnancy Outcome , Randomized Controlled Trials as Topic , Retrospective Studies , Vitrification