Results 1 - 20 of 1,665
1.
Proc Natl Acad Sci U S A ; 121(33): e2408731121, 2024 Aug 13.
Article in English | MEDLINE | ID: mdl-39106305

ABSTRACT

AI is now an integral part of everyday decision-making, assisting us in both routine and high-stakes choices. These AI models often learn from human behavior, assuming this training data is unbiased. However, we report five studies that show that people change their behavior to instill desired routines into AI, indicating this assumption is invalid. To show this behavioral shift, we recruited participants to play the ultimatum game, where they were asked to decide whether to accept proposals of monetary splits made by either other human participants or AI. Some participants were informed their choices would be used to train an AI proposer, while others did not receive this information. Across five experiments, we found that people modified their behavior to train AI to make fair proposals, regardless of whether they could directly benefit from the AI training. After completing this task once, participants were invited to complete this task again but were told their responses would not be used for AI training. People who had previously trained AI persisted with this behavioral shift, indicating that the new behavioral routine had become habitual. This work demonstrates that using human behavior as training data has more consequences than previously thought, since it can lead AI to perpetuate human biases and cause people to form habits that deviate from how they would normally act. Therefore, this work underscores a problem for AI algorithms that aim to learn unbiased representations of human preferences.


Subject(s)
Artificial Intelligence , Decision Making , Humans , Decision Making/physiology , Male , Female , Adult , Choice Behavior/physiology , Young Adult
2.
Proc Natl Acad Sci U S A ; 121(32): e2320603121, 2024 Aug 06.
Article in English | MEDLINE | ID: mdl-39074277

ABSTRACT

Distracted driving is responsible for nearly 1 million crashes each year in the United States alone, and a major source of driver distraction is handheld phone use. We conducted a randomized, controlled trial to compare the effectiveness of interventions designed to create sustained reductions in handheld use while driving (NCT04587609). Participants were 1,653 consenting Progressive® Snapshot® usage-based auto insurance customers ages 18 to 77 who averaged at least 2 min/h of handheld use while driving in the month prior to study invitation. They were randomly assigned to one of five arms for a 10-wk intervention period. Arm 1 (control) got education about the risks of handheld phone use, as did the other arms. Arm 2 got a free phone mount to facilitate hands-free use. Arm 3 got the mount plus a commitment exercise and tips for hands-free use. Arm 4 got the mount, commitment, and tips plus weekly goal gamification and social competition. Arm 5 was the same as Arm 4, plus offered behaviorally designed financial incentives. Postintervention, participants were monitored until the end of their insurance rating period, 25 to 65 d more. Outcome differences were measured using fractional logistic regression. Arm 4 participants, who received gamification and competition, reduced their handheld use by 20.5% relative to control (P < 0.001); Arm 5 participants, who additionally received financial incentives, reduced their use by 27.6% (P < 0.001). Both groups sustained these reductions through the end of their insurance rating period.
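The arm-versus-control contrasts reported above reduce to relative reductions in the fraction of drive time spent on handheld use. A minimal sketch of that computation, using invented fractions (not the study's data):

```python
def relative_reduction(control_frac: float, arm_frac: float) -> float:
    """Percent reduction in an arm's handheld-use fraction relative to control.

    control_frac: fraction of drive time handheld in the control arm
    arm_frac:     fraction of drive time handheld in the treatment arm
    Both inputs here are hypothetical, purely for illustration.
    """
    return 100 * (control_frac - arm_frac) / control_frac


# e.g., a drop from 10% of drive time to 7.95% is a 20.5% relative reduction
print(relative_reduction(0.10, 0.0795))
```

The study estimated these contrasts with fractional logistic regression, which is appropriate when the outcome is a proportion bounded between 0 and 1; the sketch above only illustrates how the headline percentages relate to the underlying fractions.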


Subject(s)
Distracted Driving , Humans , Female , Male , Adult , Middle Aged , Distracted Driving/prevention & control , Aged , Adolescent , Automobile Driving , Young Adult
3.
Proc Natl Acad Sci U S A ; 120(17): e2216115120, 2023 04 25.
Article in English | MEDLINE | ID: mdl-37068252

ABSTRACT

We apply a machine learning technique to characterize habit formation in two large panel data sets with objective measures of 1) gym attendance (over 12 million observations) and 2) hospital handwashing (over 40 million observations). Our Predicting Context Sensitivity (PCS) approach identifies context variables that best predict behavior for each individual. This approach also creates a time series of overall predictability for each individual. These time series predictability values are used to trace a habit formation curve for each individual, operationalizing the time of habit formation as the asymptotic limit of when behavior becomes highly predictable. Contrary to the popular belief in a "magic number" of days to develop a habit, we find that it typically takes months to form the habit of going to the gym but weeks to develop the habit of handwashing in the hospital. Furthermore, we find that gymgoers who are more predictable are less responsive to an intervention designed to promote more gym attendance, consistent with past experiments showing that habit formation generates insensitivity to reward devaluation.
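The PCS logic described above — predict each individual's behavior from context variables, track how predictable the behavior becomes over time, and operationalize habit formation as the approach to an asymptote of predictability — can be sketched with a toy predictor. The majority-vote predictor and all names below are stand-ins for illustration, not the authors' model:

```python
import numpy as np


def rolling_predictability(behavior, context, window=50):
    """For each time step, score how well a trivial context -> behavior
    predictor (per-context majority vote over the trailing window) matches
    observed behavior. A stand-in for the PCS predictability time series."""
    behavior = np.asarray(behavior)
    context = np.asarray(context)
    acc = np.full(len(behavior), np.nan)
    for t in range(window, len(behavior)):
        b, c = behavior[t - window:t], context[t - window:t]
        # predict each observation by the majority behavior seen in its context
        preds = np.array([
            np.bincount(b[c == cv]).argmax() if (c == cv).any() else 0
            for cv in c
        ])
        acc[t] = (preds == b).mean()
    return acc


def habit_onset(acc, frac=0.95):
    """Operationalize habit formation time as the first step where
    predictability reaches a fraction of its asymptotic (maximum) level."""
    asymptote = np.nanmax(acc)
    hits = np.where(acc >= frac * asymptote)[0]
    return int(hits[0]) if hits.size else None
```

On synthetic data where behavior becomes fully context-driven after some point, `habit_onset` recovers roughly when predictability plateaus; the paper's approach is far richer (many context variables, per-individual models), but the shape of the analysis is the same.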


Subject(s)
Exercise , Reward , Hygiene , Habits , Time Factors
4.
Eur J Neurosci ; 60(1): 3447-3465, 2024 Jul.
Article in English | MEDLINE | ID: mdl-38798086

ABSTRACT

As opposed to those requiring a single action for reward acquisition, tasks necessitating action sequences demand that animals learn action elements and their sequential order and sustain the behaviour until the sequence is completed. With repeated learning, animals not only exhibit precise execution of these sequences but also demonstrate enhanced smoothness and efficiency. Previous research has demonstrated that midbrain dopamine and its major projection target, the striatum, play crucial roles in these processes. Recent studies have shown that dopamine from the substantia nigra pars compacta (SNc) and the ventral tegmental area (VTA) serves distinct functions in action sequence learning. The distinct contributions of dopamine also depend on the striatal subregions, namely the ventral, dorsomedial and dorsolateral striatum. Here, we have reviewed recent findings on the role of striatal dopamine in action sequence learning, with a focus on recent rodent studies.


Subject(s)
Dopamine , Learning , Animals , Dopamine/metabolism , Learning/physiology , Ventral Tegmental Area/physiology , Corpus Striatum/physiology , Corpus Striatum/metabolism , Corpus Striatum/drug effects , Humans , Reward
5.
Eur J Neurosci ; 60(4): 4518-4535, 2024 Aug.
Article in English | MEDLINE | ID: mdl-38973167

ABSTRACT

The balance between goal-directed and habitual control has been proposed to determine the flexibility of instrumental behaviour, in both humans and animals. This view is supported by neuroscientific studies that have implicated dissociable neural pathways in the ability to flexibly adjust behaviour when outcome values change. A previous Diffusion Tensor Imaging study provided preliminary evidence that flexible instrumental performance depends on the strength of parallel cortico-striatal white-matter pathways previously implicated in goal-directed and habitual control. Specifically, estimated white-matter strength between caudate and ventromedial prefrontal cortex correlated positively with behavioural flexibility, and posterior putamen-premotor cortex connectivity correlated negatively, in line with the notion that these pathways compete for control. However, the sample size of the original study was limited, and so far, there have been no attempts to replicate these findings. In the present study, we aimed to conceptually replicate these findings by testing a large sample of 205 young adults to relate cortico-striatal connectivity to performance on the slips-of-action task. In short, we found only positive neural correlates of goal-directed performance, including striatal connectivity (caudate and anterior putamen) with the dorsolateral prefrontal cortex. However, we failed to provide converging evidence for the existence of a neural habit system that puts limits on the capacity for flexible, goal-directed action. We discuss the implications of our findings for dual-process theories of instrumental action.


Subject(s)
Corpus Striatum , Goals , Neural Pathways , White Matter , Humans , White Matter/physiology , White Matter/diagnostic imaging , White Matter/anatomy & histology , Male , Female , Adult , Corpus Striatum/physiology , Corpus Striatum/diagnostic imaging , Corpus Striatum/anatomy & histology , Young Adult , Neural Pathways/physiology , Adolescent , Cerebral Cortex/physiology , Cerebral Cortex/diagnostic imaging , Diffusion Tensor Imaging/methods
6.
BMC Plant Biol ; 24(1): 764, 2024 Aug 10.
Article in English | MEDLINE | ID: mdl-39123124

ABSTRACT

BACKGROUND: Leaf nitrogen (N) and phosphorus (P) resorption is a fundamental adaptation strategy for plant nutrient conservation. However, the relative roles that environmental factors and plant functional traits play in regulating N and P resorption remain largely unclear, and little is known about the underlying mechanism of plant functional traits affecting nutrient resorption. Here, we measured leaf N and P resorption and 13 plant functional traits of leaf, petiole, and twig for 101 representative broad-leaved tree species in our target subtropical transitional forests. We integrated these multiple functional traits into the plant economics spectrum (PES). We further explored whether and how elevation-related environmental factors and these functional traits collectively control leaf N and P resorption. RESULTS: We found that deciduous and evergreen trees exhibited highly diversified PES strategies, tending to be acquisitive and conservative, respectively. The effects of PES, rather than of environmental factors, dominated leaf N and P resorption patterns along the elevational gradient. Specifically, the photosynthesis and nutrient resource utilization axis positively affected N and P resorption for both deciduous and evergreen trees, whereas the structural and functional investment axis positively affected leaf N and P resorption for evergreen species only. Specific leaf area and green leaf nutrient concentrations were the most influential traits driving leaf N and P resorption. CONCLUSIONS: Our study simultaneously elucidated the relative contributions of environmental factors and plant functional traits to leaf N and P resorption by including more representative tree species than previous studies, expanding our understanding beyond the relatively well-studied tropical and temperate forests. 
We highlight that prioritizing the fundamental role of traits related to leaf resource capture and defense contributes to the monitoring and modeling of leaf nutrient resorption. Therefore, we need to integrate PES effects on leaf nutrient resorption into the current nutrient cycling model framework to better advance our general understanding of the consequences of shifting tree species composition for nutrient cycles across diverse forests.


Subject(s)
Forests , Nitrogen , Phosphorus , Plant Leaves , Trees , Nitrogen/metabolism , Phosphorus/metabolism , Plant Leaves/metabolism , Plant Leaves/physiology , Trees/metabolism , Trees/physiology , Tropical Climate , China , Photosynthesis
7.
Neurobiol Learn Mem ; 213: 107961, 2024 Sep.
Article in English | MEDLINE | ID: mdl-39025429

ABSTRACT

In an animal model of compulsive drug use, a subset of rats continues to self-administer cocaine despite footshock consequences and is considered punishment resistant. We recently found that punishment resistance is associated with habits that persist under conditions that typically encourage a transition to goal-directed control. Given that random ratio (RR) and random interval (RI) schedules of reinforcement influence whether responding is goal-directed or habitual, we investigated the influence of these schedules on punishment resistance for cocaine or food. Male and female Sprague Dawley rats were trained to self-administer either intravenous cocaine or food pellets on a seeking-taking chained schedule of reinforcement, with the seeking lever requiring completion of either an RR20 or RI60 schedule. Rats were then given four days of punishment testing with footshock administered at the completion of seeking on a random one-third of trials. For cocaine-trained rats, the RI60 schedule led to greater punishment resistance (i.e., more trials completed) than the RR20 schedule in males and females. For food-trained rats, the RI60 schedule led to greater punishment resistance (i.e., higher reward rates) than the RR20 schedule in female rats, although male rats showed punishment resistance on both RR20 and RI60 schedules. For both cocaine and food, we found that seeking responses were suppressed to a greater degree than reward rate with the RI60 schedule, whereas response rate and reward rate were equally suppressed with the RR20 schedule. This dissociation between punishment effects on reward rate and response rate with the RI60 schedule can be explained by the nonlinear relation between these variables on RI schedules, but it does not account for the enhanced resistance to punishment. 
Overall, the results show greater punishment resistance with the RI60 schedule as compared to the RR20 schedule, indicating that schedules of reinforcement influence resistance to negative consequences.


Subject(s)
Cocaine , Punishment , Rats, Sprague-Dawley , Reinforcement Schedule , Self Administration , Animals , Male , Female , Cocaine/administration & dosage , Cocaine/pharmacology , Rats , Conditioning, Operant/drug effects , Reinforcement, Psychology , Drug-Seeking Behavior/drug effects , Drug-Seeking Behavior/physiology
8.
Osteoporos Int ; 35(3): 523-531, 2024 Mar.
Article in English | MEDLINE | ID: mdl-37947843

ABSTRACT

Most studies investigating the association between physical activity and osteoporosis prevention have focused only on specific types of physical activity. This study provides evidence regarding the combined effects and interaction of sleep duration and physical activity. The findings emphasize the role of sleep duration and physical activity in association with osteoporosis. PURPOSE: The associations between physical activity, sleep duration, and prevalent osteoporosis in Taiwanese adults were studied in this cross-sectional study. METHODS: The Taiwan Biobank enrolled a community-based cohort of ~ 120,000 volunteers (as of April 30, 2020) between 30 and 76 years of age with no history of cancer. Among them, bone mineral density (BMD) measured by dual-energy X-ray absorptiometry (DXA) was available for 22,402 participants. After excluding individuals with incomplete data on BMI (n = 23), MET score (n = 207), T-score (n = 8,826), and sleep duration (n = 16), 13,330 subjects were included as the primary cohort. Univariate and multivariable regression analyses were performed to determine the associations between the presence of osteoporosis, physical activity level, sleep duration, and other variables. RESULTS: After adjustment, subjects with physical activity < 20 METs/week and ≥ 20 METs/week had aORs for osteoporosis of 1.017 and 0.767, respectively, relative to those with zero METs. The odds of osteoporosis were not significantly lower in subjects who slept ≥ 8 h/day (aOR = 0.934, p = 0.266). In addition, compared with short sleepers with no physical activity, adults with physical activity ≥ 20 METs/week and sleep ≥ 8 h/day had the lowest likelihood of osteoporosis (aOR = 0.702). Those with medium physical activity (< 20 METs/week) plus average sleep duration (6.5-8 h/day) did not have significantly higher odds of osteoporosis (aOR = 1.129, p = 0.151). 
CONCLUSION: The findings emphasize the joint role of sleep duration and physical activity in association with osteoporosis. Adults with high physical activity plus high sleep hours have the highest BMD and lowest risk of osteoporosis.


Subject(s)
Osteoporosis , Sleep Duration , Adult , Humans , Taiwan/epidemiology , Cross-Sectional Studies , Biological Specimen Banks , Osteoporosis/etiology , Osteoporosis/complications , Bone Density , Absorptiometry, Photon , Exercise
9.
Br J Psychiatry ; 224(5): 164-169, 2024 May.
Article in English | MEDLINE | ID: mdl-38652060

ABSTRACT

BACKGROUND: A significant proportion of people with clozapine-treated schizophrenia develop 'checking' compulsions, a phenomenon yet to be understood. AIMS: To use habit formation models developed in cognitive neuroscience to investigate the dynamic interplay between psychosis, clozapine dose and obsessive-compulsive symptoms (OCS). METHOD: Using the anonymised electronic records of a cohort of clozapine-treated patients, including longitudinal assessments of OCS and psychosis, we performed longitudinal multi-level mediation and multi-level moderation analyses to explore associations of psychosis with obsessiveness and excessive checking. Classic bivariate correlation tests were used to assess clozapine load and checking compulsions. The influence of specific genetic variants was tested in a subsample. RESULTS: A total of 196 clozapine-treated individuals and 459 face-to-face assessments were included. We found significant OCS to be common (37.9%), with checking being the most prevalent symptom. In mediation models, psychosis severity mediated checking behaviour indirectly by inducing obsessions (r = 0.07, 95% CI 0.04-0.09; P < 0.001). No direct effect of psychosis on checking was identified (r = -0.28, 95% CI -0.09 to 0.03; P = 0.340). After psychosis remission (n = 65), checking compulsions correlated with both clozapine plasma levels (r = 0.35; P = 0.004) and dose (r = 0.38; P = 0.002). None of the glutamatergic and serotonergic genetic variants tested (SLC6A4, SLC1A1 and HTR2C) moderated the effect of psychosis on obsessions and compulsions after correction for multiple comparisons. CONCLUSIONS: We elucidated different phases of the complex interplay of psychosis and compulsions, which may inform clinicians' therapeutic decisions.


Subject(s)
Antipsychotic Agents , Clozapine , Psychotic Disorders , Schizophrenia, Treatment-Resistant , Humans , Clozapine/adverse effects , Clozapine/therapeutic use , Male , Female , Adult , Antipsychotic Agents/adverse effects , Longitudinal Studies , Psychotic Disorders/drug therapy , Schizophrenia, Treatment-Resistant/drug therapy , Schizophrenia, Treatment-Resistant/genetics , Middle Aged , Compulsive Behavior/chemically induced , Obsessive-Compulsive Disorder/drug therapy , Obsessive-Compulsive Disorder/chemically induced , Schizophrenia/drug therapy
10.
Horm Behav ; 158: 105468, 2024 Feb.
Article in English | MEDLINE | ID: mdl-38101144

ABSTRACT

Hormonal contraceptives are utilized by millions of women worldwide. However, it remains unclear if these powerful endocrine modulators may alter cognitive function. Habit formation involves the progression of instrumental learning as it goes from being a conscious goal-directed process to a cue-driven automatic habitual motor response. Dysregulated goal-directed and/or habitual control is implicated in numerous psychopathologies, underscoring the relevance of examining the effect of hormonal contraceptives on goal-directed and habitual behavior. This study examined the effect of levonorgestrel (LNG), a widely used progestin-type contraceptive, on the development of habit in intact female rats. Rats were implanted with subcutaneous capsules that slowly released LNG over the course of the experiment or cholesterol-filled capsules. All female rats underwent operant training followed by reward devaluation to test for habit. One group of females was trained at a level that is sub-threshold to habit, while another group of females was trained to a level well over the habit threshold observed in intact females. The results reveal that all sub-threshold trained rats remained goal-directed irrespective of LNG treatment, suggesting LNG is not advancing habit formation in female rats at this level of reinforcement. However, in rats that were overtrained well above the threshold, cholesterol females showed habitual behavior, thus replicating a portion of our original studies. In contrast, LNG-treated habit-trained rats remained goal-directed, indicating that LNG impedes the development and/or expression of habit following this level of supra-threshold to habit training. Thus, LNG may offset habit formation by sustaining attentional or motivational processes during learning in intact female rats. These results may be clinically relevant to women using this type of hormonal contraceptive as well as in other progestin-based hormone therapies.


Subject(s)
Goals , Levonorgestrel , Humans , Rats , Female , Animals , Levonorgestrel/pharmacology , Progestins/pharmacology , Conditioning, Operant/physiology , Habits , Cholesterol/pharmacology , Contraceptive Agents/pharmacology
11.
Ann Bot ; 2024 May 08.
Article in English | MEDLINE | ID: mdl-38716780

ABSTRACT

BACKGROUND AND AIMS: There is ample theoretical and experimental evidence that angiosperms harbouring self-incompatibility (SI) systems are likely to respond to global changes in unique ways relative to taxa with other mating systems. In this paper, we present an updated database on the prevalence of SI systems across angiosperms and examine the relationship between the presence of SI and latitude, biomes, life-history traits and management conditions to evaluate the potential vulnerability of SI taxa to climate change and habitat disturbance. METHODS: We performed literature searches to identify studies that employed controlled crosses, microscopic analyses and/or genetic data to classify taxa as having SI, self-compatibility (SC), partial self-compatibility (PSC) or self-sterility (SS). Where described, the site of the SI reaction and the presence of dimorphic versus monomorphic flowers were also recorded. We then combined this database on the distribution of mating systems with information about the life span, growth habit, management conditions and geographic distribution of taxa. Information about the geographic distribution of taxa was obtained from a manually curated version of the Global Biodiversity Information Facility database, and from vegetation surveys encompassing 9 biomes. We employed multinomial logit regression to assess the relationship between mating system and life-history traits, management condition, latitude and latitude-squared using self-compatible taxa as the baseline. Additionally, we employed LOESS regression to examine the relationship between the probability of SI and latitude. Finally, by summarizing information at the family level, we plotted the distribution of SI systems across angiosperms including information about the presence of SI or dioecy, the inferred reaction site of the SI system when known, as well as the proportion of taxa in a family for which information is available. 
KEY RESULTS: We obtained information about the SI status of 5686 hermaphroditic taxa, of which 55% exhibited SC, and the remaining 45% harbour SI, self-sterility (SS), or PSC. Highlights of the multinomial logit regression include that taxa with PSC have greater odds of being short- (OR=1.3) or long- (OR=1.57) lived perennials relative to SC ones, and that SS/SI taxa (pooled) are less likely to be annuals (OR=0.64) and more likely to be long-lived perennials (OR=1.32). SS/SI taxa had greater odds of being succulent (OR=2.4) or a tree (OR=2.05), and were less likely to be weeds (OR=0.34). Further, we find a quadratic relationship between the probability of being SI and latitude: SI taxa were more common in the tropics, a finding that was further supported by the vegetation surveys, which showed fewer species with SS/SI in temperate and northern latitudes compared to Mediterranean and tropical biomes. CONCLUSIONS: We conclude that in the short term, habitat fragmentation, pollinator loss and temperature increases may negatively impact plants with SI systems, particularly long-lived perennial and woody species dominant in tropical forests. In the longer term, these and other global changes are likely to select for self-compatible or partially self-compatible taxa which, due to the apparent importance of SI as a driver of plant diversification across the angiosperm tree of life, may globally influence plant species richness.
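The odds ratios reported above compare each trait's odds in SS/SI (or PSC) taxa against the self-compatible baseline. For a single binary trait, the same quantity can be computed directly from a 2x2 table; the counts below are made up purely to illustrate the arithmetic:

```python
def odds_ratio(exposed_cases: int, exposed_noncases: int,
               unexposed_cases: int, unexposed_noncases: int) -> float:
    """Odds ratio from a 2x2 table: (a/b) / (c/d).

    'Exposed' might mean SS/SI taxa and 'case' might mean succulent habit;
    all counts here are hypothetical, not the study's data.
    """
    return (exposed_cases / exposed_noncases) / (unexposed_cases / unexposed_noncases)


# 24 of 34 SS/SI taxa succulent vs 10 of 20 SC taxa -> OR = 2.4
print(odds_ratio(24, 10, 10, 10))
```

The paper's multinomial logit extends this idea to several outcome categories and adjusts for covariates simultaneously, but each reported OR still has this basic interpretation relative to the SC baseline.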

12.
Malar J ; 23(1): 246, 2024 Aug 16.
Article in English | MEDLINE | ID: mdl-39152481

ABSTRACT

BACKGROUND: Early diagnosis and prompt treatment of malaria in young children are crucial for preventing the serious stages of the disease. If delayed treatment-seeking habits are observed in certain areas, targeted campaigns and interventions can be implemented to improve the situation. METHODS: This study applied diagnostics from a multivariate binary logistic regression model and a geospatial logistic model to identify traditional authorities in Malawi where caregivers have unusual health-seeking behaviour for childhood malaria. The data from the 2021 Malawi Malaria Indicator Survey were analysed using R software version 4.3.0 for regressions and STATA version 17 for data cleaning. RESULTS: Both models showed significant variability in treatment-seeking habits of caregivers between villages. The mixed-effects logit model residual identified Vuso Jere, Kampingo Sibande, Ngabu, and Dzoole as outliers in the model. Despite characteristics that promote late reporting of malaria at clinics, most mothers in these traditional authorities sought treatment within twenty-four hours of the onset of malaria symptoms in their children. On the other hand, the geospatial logit model showed that late seeking of malaria treatment was prevalent in most areas of the country, except a few traditional authorities such as Mwakaboko, Mwenemisuku, Mwabulambya, Mmbelwa, Mwadzama, Zulu, Amidu, Kasisi, and Mabuka. CONCLUSIONS: These findings suggest that using a combination of multivariate regression model residuals and geospatial statistics can help in identifying communities with distinct treatment-seeking patterns for childhood malaria within a population. Health policymakers could benefit from consulting traditional authorities who demonstrated early reporting for care in this study. This could help in understanding the best practices followed by mothers in those areas which can be replicated in regions where seeking care is delayed.
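Flagging traditional authorities with unusual treatment-seeking, as described above, amounts to inspecting model residuals for each group. A minimal sketch using grouped Pearson residuals (the group names, counts, and predicted probabilities below are invented; the study used mixed-effects logit residuals):

```python
import math


def pearson_residual(successes: int, n: int, p_hat: float) -> float:
    """Pearson residual for a group's observed count of prompt treatment
    versus the count predicted by a fitted probability p_hat."""
    expected = n * p_hat
    return (successes - expected) / math.sqrt(n * p_hat * (1 - p_hat))


def flag_unusual(groups: dict, threshold: float = 2.0) -> list:
    """Return names of groups whose |residual| exceeds the threshold,
    i.e., groups behaving very differently from the model's prediction."""
    return [name for name, (successes, n, p_hat) in groups.items()
            if abs(pearson_residual(successes, n, p_hat)) > threshold]


# hypothetical: authority "A" seeks care far more promptly than predicted
groups = {"A": (90, 100, 0.5), "B": (50, 100, 0.5)}
print(flag_unusual(groups))
```

A large positive residual marks an authority like those named in the abstract, where caregivers sought treatment much earlier than their covariates would predict.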


Subject(s)
Malaria , Patient Acceptance of Health Care , Malawi , Humans , Malaria/prevention & control , Malaria/epidemiology , Patient Acceptance of Health Care/statistics & numerical data , Child, Preschool , Logistic Models , Infant , Female , Male , Adult , Child , Young Adult , Adolescent
13.
Ann Behav Med ; 2024 Sep 03.
Article in English | MEDLINE | ID: mdl-39225981

ABSTRACT

BACKGROUND: Physical activity interventions using habit development may help people increase and then maintain physical activity increases over time. Enacting behavior in consistent contexts is a central component of habit development, yet its causal role in habit development in health behaviors has not been confirmed. PURPOSE: This study tests the causal role of consistent context in habit development in health behavior, using a randomized controlled trial of a planning intervention to develop a walking habit in 127 insufficiently active, working, midlife adults in a real-world setting. METHODS: We compare participants who plan walking in consistent contexts with controls who plan walking in varied contexts and with controls not required to plan, on change in average daily steps (measured using an accelerometer) and in habit automaticity during a 4-week intervention and at a 4-week follow-up. RESULTS: As expected, consistent and varied context planners increased walking during the intervention, but only consistent context planners developed (and maintained) habit automaticity. Counter to expectations, consistent context planners did not show walking maintenance. However, across conditions, participants who developed more habit automaticity during the intervention also maintained walking more (decreased less). Having a routine daily schedule moderated some effects. Notably, no-plan controls with greater routine developed more habit automaticity, mediated by walking in more consistent contexts. CONCLUSIONS: This study confirms the causal role of consistent contexts in developing a walking habit, in a real-world setting, with an important but challenging population for physical activity interventions and identifies a facilitating condition common for many: a routine schedule.


Developing an exercise habit may help people increase and then maintain physical activity. This study tests and confirms the role of exercising in consistent contexts as a cause of forming a daily walking habit. We use a randomized controlled trial of a 4-week planning intervention, with a follow-up 4 weeks after the intervention. Participants were 127 insufficiently active, working, midlife adults. We compared participants asked to plan their daily walks in consistent contexts from day-to-day, with participants asked to plan their walks in varied contexts and with participants not required to plan. As expected, consistent and varied context planners increased their daily walking steps (measured using an accelerometer) during the intervention compared to participants not required to plan. However, only consistent context planners developed (and then maintained) a daily walking habit, that is, where taking daily walks felt relatively automatic. Unexpectedly, consistent context planners did not show walking maintenance. However, across all participants, those who developed a stronger walking habit during the intervention maintained their walking more after the intervention ended. Lastly, having an existing routine daily schedule helped some participants. Those who were not asked to plan and had a more routine daily schedule also developed a daily walking habit.

14.
Prev Med ; 180: 107890, 2024 Mar.
Article in English | MEDLINE | ID: mdl-38336280

ABSTRACT

BACKGROUND: Long working hours are associated with cardiovascular and metabolic diseases. This study investigated the relationship between the working hours and dietary qualities and patterns in Korean workers. METHODS: Data from 24,523 workers were extracted from the Korea National Health and Nutrition Examination Survey, 2013-2021. The Korean Healthy Eating Index (KHEI), which ranges from 0 to 100, with a higher score indicating greater adherence to Korean dietary guidelines and superior dietary quality, was used for dietary assessment. We identified dietary patterns and classified workers using latent profile analysis. Logistic regressions were used to estimate odds ratios (ORs) and 95% confidence intervals (CIs). RESULTS: Five distinct dietary patterns emerged: healthy diet (24.8%), low-vegetable diet (14.0%), average diet (7.8%), low-fruit diet (31.4%), and poor diet (22.0%). The mean KHEI score was 60.8, with the highest score observed in the healthy diet pattern (71.3) and the lowest, in the poor diet pattern (50.0). Compared with working 35-40 h/week, working ≥55 h/week was negatively associated with KHEI scores (ß: -1.08; 95% CI: -1.67, -0.49). Those working ≥55 h/week were less likely to have a healthy diet pattern (OR: 0.81; 95% CI: 0.72, 0.91) and more likely to have a low-fruit diet (OR: 1.36; 95% CI: 1.20, 1.55) or poor diet pattern (OR: 1.23; 95% CI: 1.05, 1.43) compared with those working 35-40 h/week. CONCLUSION: Long working hours are associated with undesirable dietary quality and patterns. Policy interventions aimed at enhancing dietary quality are needed to alleviate the health burdens associated with long working hours.


Subject(s)
Diet, Healthy , Diet , Humans , Nutrition Surveys , Fruit , Republic of Korea
15.
Psychophysiology ; 61(7): e14571, 2024 Jul.
Article in English | MEDLINE | ID: mdl-38679809

ABSTRACT

Given experience in cluttered but stable visual environments, our eye movements form stereotyped routines that sample task-relevant locations without mixing up routines between similar task settings. Both dopamine signaling and mindfulness have been posited as factors that influence the formation of such routines, yet their impact has not been quantified in healthy humans. Over two sessions, participants searched through grids of doors to find hidden targets, using a gaze-contingent display. Within each session, door scenes appeared in one of two colors, with each color signaling a differing set of likely target locations. We derived measures of how well target locations were learned (target-accuracy), how routine sets of eye movements were (stereotypy), and the extent of interference between the two scenes (setting-accuracy). Participants completed two sessions, in which they were administered either levodopa (a dopamine precursor) or placebo (vitamin C), under double-blind counterbalanced conditions. Dopamine and trait mindfulness (assessed by questionnaire) interacted to influence both target-accuracy and stereotypy. Increasing dopamine improved accuracy and reduced stereotypy for high mindfulness scorers, but induced the opposite pattern for low mindfulness scorers. Dopamine also disrupted setting-accuracy irrespective of mindfulness. Our findings show that mindfulness modulates the impact of dopamine on the target-accuracy and stereotypy of eye-movement routines, whereas increasing dopamine promotes interference between task settings regardless of mindfulness. These findings provide a link between non-human and human models regarding the influence of dopamine on the formation of task-relevant eye-movement routines and provide novel insights into behavior-trait factors that modulate the use of experience when building adaptive repertoires.


Subject(s)
Dopamine , Mindfulness , Humans , Male , Female , Adult , Young Adult , Dopamine/metabolism , Levodopa/pharmacology , Levodopa/administration & dosage , Double-Blind Method , Eye Movements/physiology , Visual Perception/physiology , Dopamine Agents/pharmacology , Attention/physiology , Psychomotor Performance/physiology
16.
Br J Nutr ; 131(6): 1007-1014, 2024 03 28.
Article in English | MEDLINE | ID: mdl-37926898

ABSTRACT

This study aimed to investigate the causal effect of dietary habits on COVID-19 susceptibility, hospitalisation and severity. We used data from a large-scale diet dataset and the COVID-19 Host Genetics Initiative to estimate causal relationships using Mendelian randomisation. The inverse variance weighted (IVW) method was used as the main analysis. For COVID-19 susceptibility, IVW estimates indicated that milk (OR: 0·82; 95 % CI (0·68, 0·98); P = 0·032), unsalted peanut (OR: 0·53; 95 % CI (0·35, 0·82); P = 0·004), beef (OR: 0·59; 95 % CI (0·41, 0·84); P = 0·004), pork (OR: 0·63; 95 % CI (0·42, 0·93); P = 0·022) and processed meat (OR: 0·76; 95 % CI (0·63, 0·92); P = 0·005) were causally associated with reduced COVID-19 susceptibility, while coffee (OR: 1·23; 95 % CI (1·04, 1·45); P = 0·017) and tea (OR: 1·17; 95 % CI (1·05, 1·31); P = 0·006) were causally associated with increased risk. For COVID-19 hospitalisation, beef (OR: 0·51; 95 % CI (0·26, 0·98); P = 0·042) showed negative correlations, while tea (OR: 1·54; 95 % CI (1·16, 2·04); P = 0·003), dried fruit (OR: 2·08; 95 % CI (1·37, 3·15); P = 0·001) and red wine (OR: 2·35; 95 % CI (1·29, 4·27); P = 0·005) showed positive correlations. For COVID-19 severity, coffee (OR: 2·16; 95 % CI (1·25, 3·76); P = 0·006), dried fruit (OR: 1·98; 95 % CI (1·16, 3·37); P = 0·012) and red wine (OR: 2·84; 95 % CI (1·21, 6·68); P = 0·017) showed an increased risk. These findings were confirmed to be robust through sensitivity analyses. Our findings established a causal relationship between dietary habits and COVID-19 susceptibility, hospitalisation and severity.
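The abstract's main analysis is the inverse variance weighted (IVW) Mendelian-randomisation estimator. The underlying GWAS summary statistics are not in the abstract, but the fixed-effect IVW formula itself is standard: each variant contributes a Wald ratio (SNP-outcome effect divided by SNP-exposure effect), and ratios are combined weighted by inverse variance. A minimal sketch with invented per-SNP values:

```python
def ivw_estimate(betas, ses):
    """Fixed-effect inverse-variance weighted (IVW) estimate.
    betas: per-variant Wald ratios (outcome effect / exposure effect)
    ses:   their standard errors
    Each ratio is weighted by 1/se^2; the pooled SE is sqrt(1/sum(weights))."""
    weights = [1.0 / se ** 2 for se in ses]
    beta = sum(w * b for w, b in zip(weights, betas)) / sum(weights)
    se = (1.0 / sum(weights)) ** 0.5
    return beta, se

# Hypothetical summary statistics for two instruments (not from the study):
beta, se = ivw_estimate([0.2, 0.4], [0.1, 0.1])
```

Exponentiating the pooled log-odds estimate (and its CI bounds) yields ORs of the form reported in the abstract; the published analysis also ran sensitivity analyses (e.g. for pleiotropy) that this sketch omits.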


Subject(s)
COVID-19 , Feeding Behavior , Humans , Coffee , COVID-19/epidemiology , COVID-19/etiology , Genome-Wide Association Study , Hospitalization , Tea , Mendelian Randomization Analysis
17.
Eur J Nutr ; 63(2): 409-423, 2024 Mar.
Article in English | MEDLINE | ID: mdl-38006443

ABSTRACT

PURPOSE: Diet-related diseases are becoming the leading cause of death globally. As self-reporting of diet by patients can be associated with errors, stable isotopes of human tissues can be used to diagnose diseases, understand physiology, and detect changes in diet. This study investigates the effect of type and amount of food on the nitrogen and carbon concentration (Nconc and Cconc) and isotopic composition (δ15N and δ13C) of human scalp hair and fingernails. METHODS: A total of 100 residents participated in the study, of whom 74 provided a complete diet history. Sixty-six food items commonly available to them were also collected. The Nconc, Cconc, δ15N and δ13C values of the human hair, nails and food items were determined. RESULTS: The Nconc, Cconc, δ15N and δ13C values differed significantly (p < 0.05) between plant-sourced and animal-sourced food items, as well as between human hair and nail tissue. The δ15N value of human tissues differed by 0.9‰ between lacto-vegetarians and omnivores. The δ15N and δ13C values of human tissues increased by 0.4-0.5‰ with every 5% increase in the consumption of animal protein. CONCLUSIONS: The study helps to demarcate lacto-vegetarians from omnivores and to estimate the percentage of animal protein in the diet based on the dual isotope values of human tissues. It also acts as a reference to determine the isotopic composition of hair tissue provided the isotope value of nail tissue is known, and vice versa.
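The δ15N and δ13C values reported here follow the standard isotope delta notation: the heavy-to-light isotope ratio of the sample relative to an international reference standard, expressed in per mil (‰). A minimal sketch of that conversion (the example ratios below are invented, not values from the study):

```python
def delta_permil(r_sample, r_standard):
    """Isotope delta value in per mil (‰):
    delta = (R_sample / R_standard - 1) * 1000,
    where R is the heavy/light isotope ratio
    (e.g. 15N/14N against atmospheric N2, or 13C/12C against VPDB)."""
    return (r_sample / r_standard - 1.0) * 1000.0

# A sample ratio 0.1% above the standard gives a delta of about +1‰:
d = delta_permil(1.001, 1.0)
```

On this scale, the 0.9‰ lacto-vegetarian vs. omnivore difference and the 0.4-0.5‰ shift per 5% animal protein are small but analytically resolvable offsets in the sample/standard ratio.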


Subject(s)
Nails , Scalp , Animals , Humans , Scalp/chemistry , Nails/chemistry , Nitrogen Isotopes/analysis , Carbon Isotopes/analysis , Diet , Hair/chemistry , Animal Feed/analysis
18.
Kidney Blood Press Res ; 49(1): 472-479, 2024.
Article in English | MEDLINE | ID: mdl-38852587

ABSTRACT

INTRODUCTION: Breakfast-skipping habits are associated with adverse health outcomes including coronary heart disease, metabolic syndrome, and diabetes mellitus. However, it remains uncertain whether skipping breakfast affects chronic kidney disease (CKD) risk. This study aimed to examine the association between skipping breakfast and progression of CKD. METHODS: We retrospectively conducted a population-based cohort study using data from the Iki City Epidemiological Study of Atherosclerosis and Chronic Kidney Disease (ISSA-CKD). Between 2008 and 2019, we included 922 participants aged 30 years or older who had CKD (estimated glomerular filtration rate <60 mL/min/1.73 m2 and/or proteinuria) at baseline. Breakfast skippers were defined as participants who skipped breakfast more than 3 times per week. The outcome was CKD progression, defined as a decline of at least 30% in the estimated glomerular filtration rate (eGFR) from baseline. Cox proportional hazards models were used to calculate hazard ratios (HRs) and 95% confidence intervals (CIs) for CKD progression, adjusted for other CKD risk factors. RESULTS: During a follow-up period with a mean of 5.5 years, CKD progression occurred in 60 (6.5%) participants. The incidence rate (per 1,000 person-years) of CKD progression was 21.5 in the breakfast-skipping group and 10.7 in the breakfast-eating group (p = 0.029). The multivariable-adjusted HR for CKD progression was 2.60 (95% CI: 1.29-5.26; p = 0.028) for the breakfast-skipping group compared with the breakfast-eating group. There were no clear differences in the association of skipping breakfast with CKD progression in subgroup analyses by sex, age, obesity, hypertension, diabetes mellitus, baseline eGFR, and baseline proteinuria. CONCLUSION: Skipping breakfast was significantly associated with a higher risk of CKD progression in this general Japanese population.
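The abstract reports crude incidence rates per 1,000 person-years alongside the Cox-model HR. The rate itself is just events divided by accumulated person-time; a minimal sketch, using the study's overall figures under the simplifying assumption that total person-time ≈ 922 participants × 5.5 mean follow-up years (the per-group person-years are not given in the abstract):

```python
def incidence_rate_per_1000(events, person_years):
    """Crude incidence rate per 1,000 person-years of follow-up."""
    return events / person_years * 1000.0

# Back-of-envelope overall rate from the abstract's figures
# (assumes total person-time = 922 * 5.5; a rough approximation):
overall = incidence_rate_per_1000(60, 922 * 5.5)  # ≈ 11.8 per 1,000 py
```

This crude overall rate sits, as expected, between the reported group rates (10.7 for breakfast eaters, 21.5 for skippers); the adjusted HR of 2.60 comes from the Cox model, not from the ratio of these crude rates.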


Subject(s)
Breakfast , Disease Progression , Renal Insufficiency, Chronic , Humans , Renal Insufficiency, Chronic/epidemiology , Renal Insufficiency, Chronic/physiopathology , Male , Female , Middle Aged , Retrospective Studies , Japan/epidemiology , Aged , Glomerular Filtration Rate , Atherosclerosis/epidemiology , Atherosclerosis/etiology , Adult , Risk Factors , Feeding Behavior , Cohort Studies , East Asian People
19.
Alcohol Alcohol ; 59(3)2024 Mar 16.
Article in English | MEDLINE | ID: mdl-38725398

ABSTRACT

AIMS: This study aimed to compare reward, relief, and habit treatment-seeking individuals on recent drinking, alcohol use disorder (AUD) phenomenology, and mood. The second aim of the study was to evaluate the predictive validity of reward, relief, and habit profiles. METHOD: Treatment-seeking individuals with an AUD (n = 169) were recruited to participate in a medication trial for AUD (NCT03594435). Reward, relief, and habit drinking groups were assessed using the UCLA Reward Relief Habit Drinking Scale. Group differences at baseline were evaluated using univariate analyses of variance. A subset of participants were enrolled in a 12-week, double-blind, placebo-controlled medication trial (n = 102) and provided longitudinal drinking and phenomenology data. The predictive validity of group membership was assessed using linear regression analyses. RESULTS: At baseline, individuals who drink primarily for relief had higher craving and negative mood than those who drink for reward and habit. Prospectively, membership in the relief drinking group predicted greater alcohol use, greater heavy drinking, and fewer days abstinent compared to those in the reward drinking group. Membership in the relief drinking group also predicted greater alcohol craving, more alcohol-related consequences, and more anxiety symptoms over 12 weeks compared to those in the reward drinking group. CONCLUSIONS: This study provides support for reward and relief drinking motive profiles in treatment-seeking individuals with an AUD. Membership in the relief drinking motive group was predictive of poorer drinking outcomes and more negative symptomatology over 12 weeks, indicating that individuals who drink for relief may be a particularly vulnerable subpopulation of individuals with AUD.


Subject(s)
Alcohol Drinking , Alcoholism , Habits , Reward , Humans , Male , Female , Alcoholism/therapy , Alcoholism/psychology , Alcohol Drinking/psychology , Alcohol Drinking/therapy , Adult , Middle Aged , Double-Blind Method , Patient Acceptance of Health Care/psychology , Affect , Craving
20.
Addict Biol ; 29(8): e13435, 2024 Aug.
Article in English | MEDLINE | ID: mdl-39188063

ABSTRACT

Heinz et al. (2024) recently criticised the habit/compulsion theory of human addiction but nevertheless concluded that 'habit formation plays a significant role in drug addiction'. To challenge this causal claim, the current article develops four further methodological criticisms of publications supporting the habit/compulsion account of human addiction, namely that they: (1) under-report contradictory observations; (2) exaggerate the process purity of positive observations; (3) under-emphasise the low quality of epidemiological support for a causal hypothesis; (4) recapitulate the social injustice of the racial intelligence era by prematurely attributing lower task performance to drug user group membership (endophenotype) without having adequately tested social, psychological, economic and environmental inequalities. Methodological guidelines are recommended to address each concern, which should raise evidence standards, incorporate social justice, and improve the accuracy of estimating any specific effect of addiction history on task performance. Given that construing drug users as intellectually impaired could promote stigma and reduce their recovery potential, it is recommended that scientific discourse about habit/compulsive endophenotypes underpinning addiction be avoided until these higher evidence standards are met.


Subject(s)
Social Justice , Humans , Behavior, Addictive/psychology , Racism , Substance-Related Disorders/psychology , Motivation , Compulsive Behavior , Psychological Theory , Habits