1.
Obes Facts ; 2024 Feb 05.
Article in English | MEDLINE | ID: mdl-38316112

ABSTRACT

INTRODUCTION: School-based exercise interventions targeted at reducing obesity are often successful in the short term, but they are resource-heavy and do not always lead to long-lasting behaviour change. This study investigated the effect of reducing sedentary time, rather than increasing exercise, on physical activity (PA) behaviours and obesity in primary school children. METHODS: Thirty UK state primary schools participated in this cluster-controlled intervention study (IDACI score = 0.15 ± 0.07; free school meals = 26 ± 9%). Twenty-six intervention and four control schools (intervention = 3529, control = 308 children) completed the Physical Activity Questionnaire for Children (PAQ-C) in terms 1 and 3. Three intervention and three control schools (intervention = 219, control = 152 children) also measured waist-to-height ratio (WTHR). The Active Movement intervention is a school-based programme that integrates non-sedentary behaviours such as standing and walking into the classroom. Data were analysed via ANCOVAs and multiple linear regressions. RESULTS: WTHR was reduced by 8% in the intervention group only (F(2,285) = 11.387, p < .001), and sport participation increased by 10% in the intervention group only (F(1,232) = 6.982, p = .008). Other PAQ-C measures increased significantly in the intervention group, but there was no group × time interaction. Changes in PAQ-C did not predict reductions in WTHR. Instead, the amount of change in WTHR was predicted by intervention group and by the pupil's baseline WTHR, with children with higher baseline WTHR showing greater reductions (F(2, 365) = 77.21, p < .001, R2 = .30). Socio-economic status (SES), age and gender did not mediate any of the changes in PAQ-C or WTHR. CONCLUSION: Reducing sedentary behaviours during school time can be an effective obesity reduction strategy for primary school children who are overweight. The lack of demographic effects suggests that this approach can be effective regardless of the school's SES or the pupil's age or gender.
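The analysis reported here is a combination of ANCOVA (follow-up WTHR adjusted for baseline, by group) and multiple linear regression (change in WTHR predicted by group and baseline WTHR). The sketch below illustrates that modelling pattern on simulated data; the column names, effect sizes and data are hypothetical, not the study's code or dataset.

```python
# Illustrative sketch (not the study's code): an ANCOVA-style model of
# follow-up waist-to-height ratio (WTHR) adjusted for baseline, plus a
# regression predicting the change score from group and baseline WTHR.
# Column names and the simulated data are hypothetical.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 300
df = pd.DataFrame({
    "group": rng.choice(["control", "intervention"], size=n),
    "wthr_baseline": rng.normal(0.50, 0.05, size=n),
})
# Simulate a follow-up value in which the intervention lowers WTHR slightly
effect = np.where(df["group"] == "intervention", -0.02, 0.0)
df["wthr_followup"] = df["wthr_baseline"] + effect + rng.normal(0, 0.02, size=n)
df["wthr_change"] = df["wthr_followup"] - df["wthr_baseline"]

# ANCOVA: follow-up WTHR by group, adjusting for baseline WTHR
ancova = smf.ols("wthr_followup ~ C(group) + wthr_baseline", data=df).fit()
print(ancova.summary().tables[1])

# Multiple linear regression: change in WTHR predicted by group and baseline
change_model = smf.ols("wthr_change ~ C(group) + wthr_baseline", data=df).fit()
print(change_model.rsquared, change_model.f_pvalue)
```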

3.
Article in English | MEDLINE | ID: mdl-37998297

ABSTRACT

Harmful use of alcohol is a problem in the Northern Territory (NT), Australia. The aim of this study was to assess and compare alcohol-attributable deaths and the contribution of alcohol to the burden of disease and injury (BOD) among the Aboriginal and non-Aboriginal populations of the NT between 2014 and 2018. Alcohol-use data for NT adults aged 15+ years were taken from the 2016 National Drug Strategy Household Survey. BOD was measured in disability-adjusted life years (DALY) as part of the NT BOD study. Population-attributable fractions were derived to analyse deaths and BOD. Between 2014 and 2018, 673 Aboriginal and 392 non-Aboriginal people died from alcohol-attributable causes, accounting for 26.3% and 12.9% of total deaths in the Aboriginal and non-Aboriginal populations, respectively. Over the same period, alcohol caused 38,596 and 15,433 DALY (19.9% and 10.2% of the total) in the NT Aboriginal and non-Aboriginal populations, respectively. The alcohol-attributable DALY rate in the Aboriginal population was 10,444.6 per 100,000 persons, six times the non-Aboriginal rate. This study highlights the urgent need to reduce harmful alcohol use in the NT, which disproportionately affects Aboriginal peoples in rural and remote areas.
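The attributable deaths and DALY reported above rest on population-attributable fractions (PAFs). A minimal sketch of the standard Levin formula and how a PAF scales population totals is given below; the prevalence, relative risk and totals are illustrative assumptions, not the inputs used in the NT study.

```python
# Illustrative sketch: Levin's population-attributable fraction (PAF) and its
# application to deaths and DALY counts. Exposure prevalence, relative risk,
# and the totals below are hypothetical, not the NT study's inputs.
def population_attributable_fraction(prevalence: float, relative_risk: float) -> float:
    """PAF = p*(RR - 1) / (p*(RR - 1) + 1)."""
    excess = prevalence * (relative_risk - 1.0)
    return excess / (excess + 1.0)

paf = population_attributable_fraction(prevalence=0.35, relative_risk=2.5)
total_deaths = 2560          # hypothetical total deaths in the population
total_daly = 194_000         # hypothetical total DALY

print(f"PAF = {paf:.3f}")
print(f"Attributable deaths ≈ {paf * total_deaths:.0f}")
print(f"Attributable DALY   ≈ {paf * total_daly:.0f}")
```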


Subject(s)
Alcoholism , Australian Aboriginal and Torres Strait Islander Peoples , Adult , Humans , Adolescent , Alcohol Drinking/epidemiology , Alcoholism/epidemiology , Northern Territory/epidemiology , Cost of Illness
4.
Aust J Rural Health ; 31(5): 1017-1026, 2023 Oct.
Article in English | MEDLINE | ID: mdl-37706591

ABSTRACT

OBJECTIVE: To undertake an economic evaluation of community water fluoridation (CWF) in remote communities of the Northern Territory (NT). DESIGN: Dental caries experience was compared between CWF and non-CWF communities before and after the intervention. Costs and benefits of CWF were ascertained from the health sector perspective using water quality, accounting, oral health, dental care and hospitalisation datasets. SETTING AND PARTICIPANTS: Remote Aboriginal population in the NT between 1 January 2008 and 31 December 2020. INTERVENTION: CWF. MAIN OUTCOME MEASURES: Potential economic benefits were estimated from changes in caries scores valued at NT average dental service costs. RESULTS: Over the total 20-year life span of a fluoridation plant ($1.77 million), the net present benefit of introducing CWF in a typical community of 300-499 people was $3.79 million. For each $1 invested in CWF by government, the estimated long-term economic value of savings to health services ranged from $1.10 (population ≤300) to $16.00 (population ≥2000), due to reductions in treating dental caries and associated hospitalisations. The payback period ranged from 15 years (population ≤300) to 2.2 years (population ≥2000). CONCLUSIONS: The economic benefits of expanding CWF in remote Aboriginal communities of the NT outweigh the costs of installing, operating and maintaining fluoridation plants over the lifespan of CWF infrastructure for populations of 300 or more.
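The headline quantities in this evaluation, net present benefit, return per dollar invested and payback period, can be illustrated with a short discounting sketch. All cash flows, the discount rate and the plant costs below are hypothetical placeholders, not the study's data.

```python
# Illustrative sketch of the economic quantities reported in the abstract:
# net present benefit, benefit-cost ratio, and payback period for a community
# water fluoridation (CWF) plant. The cash flows, discount rate and costs
# below are hypothetical, not the study's data.
def npv(rate: float, cashflows: list[float]) -> float:
    """Net present value of yearly cash flows, year 0 first."""
    return sum(cf / (1.0 + rate) ** t for t, cf in enumerate(cashflows))

capital_cost = 1_200_000          # hypothetical up-front cost of the plant (year 0)
annual_operating_cost = 60_000    # hypothetical yearly running cost
annual_health_savings = 400_000   # hypothetical avoided dental/hospital costs
years = 20
rate = 0.05

cashflows = [-capital_cost] + [annual_health_savings - annual_operating_cost] * years
net_present_benefit = npv(rate, cashflows)

benefits = npv(rate, [0.0] + [annual_health_savings] * years)
costs = capital_cost + npv(rate, [0.0] + [annual_operating_cost] * years)
bcr = benefits / costs            # dollars returned per dollar invested

# Payback period: capital cost divided by undiscounted net savings per year
payback_years = capital_cost / (annual_health_savings - annual_operating_cost)

print(f"Net present benefit: ${net_present_benefit:,.0f}")
print(f"Benefit-cost ratio:  {bcr:.1f}")
print(f"Payback period:      {payback_years:.1f} years")
```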


Subject(s)
Dental Caries , Fluoridation , Humans , Cost-Benefit Analysis , Dental Caries/prevention & control , Northern Territory , Indigenous Peoples
5.
Aust Health Rev ; 47(5): 521-534, 2023 Oct.
Article in English | MEDLINE | ID: mdl-37696752

ABSTRACT

Objective This study aimed to externally validate the Commonwealth's Health Care Homes (HCH) algorithm for Aboriginal Australians living in the Northern Territory (NT). Methods A retrospective cohort design using linked primary health care (PHC) and hospital data was used to analyse the performance of the HCH algorithm in predicting the risk of hospitalisation for the NT study population. The study population consisted of Aboriginal Australians residing in the NT who had visited one of the 54 NT Government PHC clinics at least once between 1 January 2013 and 31 December 2017. Predictors of hospitalisation included demographics, patient observations, medications, diagnoses, pathology results and previous hospitalisation. Results There were a total of 3256 (28.5%) emergency attendances or preventable hospitalisations during the study period. The HCH algorithm had an area under the receiver operating characteristic curve (AUC) of 0.58 for the NT remote Aboriginal population, compared with 0.66 in the Victorian cohort. A refitted model including 'previous hospitalisation' had an AUC of 0.72, demonstrating better discrimination than the HCH algorithm. Calibration was also improved in the refitted model, with an intercept of 0.00 and a slope of 1.00, compared with an intercept of 1.29 and a slope of 0.55 for the HCH algorithm. Conclusion The HCH algorithm performed poorly on the NT cohort compared with the Victorian cohort, owing to differences in population demographics and burden of disease. A population-specific hospitalisation risk algorithm is required for the NT.
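The external validation above hinges on two standard metrics: discrimination (AUC) and calibration (intercept and slope). The sketch below shows how both are computed from predicted risks and observed outcomes using simulated data; it is not the HCH algorithm or the linked NT dataset.

```python
# Illustrative sketch of the validation metrics reported above: discrimination
# (area under the ROC curve) and calibration (intercept and slope from a
# logistic regression of outcomes on the log-odds of the predicted risks).
# The simulated risks and outcomes are hypothetical, not the linked NT data.
import numpy as np
from sklearn.metrics import roc_auc_score
import statsmodels.api as sm

rng = np.random.default_rng(1)
n = 5000
predicted_risk = np.clip(rng.beta(2, 5, size=n), 1e-6, 1 - 1e-6)
# Simulate outcomes only weakly related to the predictions, mimicking an
# algorithm that transports poorly to a new population.
outcome = rng.binomial(1, 0.6 * predicted_risk + 0.1)

auc = roc_auc_score(outcome, predicted_risk)

# Logistic calibration: outcome ~ logit(predicted risk)
logit = np.log(predicted_risk / (1 - predicted_risk))
calib = sm.Logit(outcome, sm.add_constant(logit)).fit(disp=0)
intercept, slope = calib.params

print(f"AUC = {auc:.2f}")
print(f"Calibration intercept = {intercept:.2f}, slope = {slope:.2f}")
```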


Subject(s)
Australian Aboriginal and Torres Strait Islander Peoples , Hospitalization , Humans , Delivery of Health Care , Hospitals , Northern Territory/epidemiology , Retrospective Studies , Risk Assessment
6.
J Sleep Res ; 32(5): e13862, 2023 10.
Article in English | MEDLINE | ID: mdl-36815627

ABSTRACT

The occupational demands of law enforcement increase the risk of poor-quality sleep, putting officers at risk of adverse physical and mental health. This cross-sectional study aimed to characterise sleep quality in day workers and in officers on 8 h and 12 h rotating shift patterns. One hundred and eighty-six officers volunteered for the study (37 female; age 41 ± 7 years). Sleep quality was assessed using the Pittsburgh sleep quality index, actigraphy and the Leeds sleep evaluation questionnaire. Maximal aerobic capacity (VO2max) was measured on a treadmill via breath-by-breath analysis. Overall, 70% of officers were poor sleepers based on Pittsburgh sleep quality index scores, with the highest prevalence among 8 h shift workers (92%, p = 0.029); there was no difference by age, gender or role. In contrast, 12 h shifts exhibited the poorest short-term measures, including awakening from sleep (p = 0.039) and behaviour following wakefulness (p = 0.033) from the subjective measures, and poorer total sleep time (p = 0.024) and sleep efficiency (p = 0.024) from actigraphy. Higher VO2max predicted poorer wake after sleep onset (R2 = 0.07, p = 0.05) and poorer sleep latency (p = 0.028). There was no relationship between Pittsburgh sleep quality index scores and any of the short-term measures. The prevalence of poor sleepers in this cohort was substantially higher than in the general population, regardless of shift pattern. The long- and short-term measures of sleep quality yielded opposing results: long-term perceptions favoured the 12 h pattern, whereas short-term subjective and objective measures both favoured the 8 h pattern.
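The VO2max findings are simple linear regressions in which R² expresses the share of variance explained (0.07 ≈ 7%). A minimal sketch on simulated data, not the officers' data, is shown below.

```python
# Illustrative sketch: a simple linear regression of an actigraphy outcome
# (wake after sleep onset, WASO) on VO2max, reporting R-squared as in the
# abstract. The simulated values are hypothetical, not the officers' data.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(2)
n = 186
vo2max = rng.normal(40, 6, size=n)                            # ml/kg/min
waso = 30 + 0.8 * (vo2max - 40) + rng.normal(0, 18, size=n)   # minutes

model = sm.OLS(waso, sm.add_constant(vo2max)).fit()
print(f"R-squared = {model.rsquared:.2f}, p = {model.pvalues[1]:.3f}")
```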


Subject(s)
Sleep Initiation and Maintenance Disorders , Sleep Wake Disorders , Humans , Female , Adult , Middle Aged , Sleep Quality , Police , Cross-Sectional Studies , Sleep , Wakefulness , Sleep Wake Disorders/epidemiology
7.
Sci Total Environ ; 861: 160618, 2023 Feb 25.
Article in English | MEDLINE | ID: mdl-36460106

ABSTRACT

The drive for farm businesses to move towards net zero greenhouse gas emissions means that there is a need to develop robust methods to quantify the amount of biomass carbon (C) on farms. Direct measurements can be destructive and time-consuming, and some prediction methods provide no assessment of uncertainty. This study describes the development, validation and use of an integrated spatial approach, including the use of lidar data and Bayesian Belief Networks (BBNs), to quantify total biomass carbon stocks (Ctotal) of i) land cover and ii) landscape features such as hedges and lone trees for five case study sites in lowland England. The results demonstrated that it was possible to develop and use a remote integrated approach to estimate biomass carbon at a farm scale. The highest prediction accuracy was attained from models using the variables AGBC, BGBC, DOMC, age, height, species and land cover, derived from measured information and from the literature. The two BBN models successfully predicted the test values of total biomass carbon with propagated error rates of 6.7% and 4.3% for land cover and landscape features, respectively. These error rates were lower than in other studies, indicating that the seven predictors are strong determinants of biomass carbon. The lidar data also enabled the spatial presentation and calculation of the variable C stocks along the length of hedges and within woodlands.
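The abstract does not give the structure of the trained Bayesian Belief Networks, so the sketch below only illustrates the general idea: discrete predictor states (here, a land-cover type and a lidar-derived height class, both hypothetical) index a conditional probability table over carbon-stock classes, from which an expected carbon stock and a simple uncertainty band are read off.

```python
# Illustrative sketch of the idea behind a BBN for biomass carbon: discrete
# nodes (land cover, height class) feed a conditional probability table over
# carbon-stock classes, giving an expected carbon value plus a spread. The
# states, probabilities and carbon values are hypothetical, not the study's
# trained network.
import numpy as np

# Representative carbon stock (t C per ha) for each discrete carbon class
carbon_class_values = {"low": 20.0, "medium": 60.0, "high": 120.0}

# P(carbon class | land cover, lidar height class) -- hypothetical CPT
cpt = {
    ("woodland", "tall"):  {"low": 0.05, "medium": 0.25, "high": 0.70},
    ("woodland", "short"): {"low": 0.20, "medium": 0.60, "high": 0.20},
    ("hedge", "tall"):     {"low": 0.30, "medium": 0.55, "high": 0.15},
    ("hedge", "short"):    {"low": 0.70, "medium": 0.25, "high": 0.05},
}

def expected_carbon(land_cover: str, height_class: str, area_ha: float):
    """Expected carbon stock and standard deviation for a mapped parcel."""
    probs = cpt[(land_cover, height_class)]
    values = np.array([carbon_class_values[c] for c in probs])
    p = np.array(list(probs.values()))
    mean = float(p @ values)
    sd = float(np.sqrt(p @ (values - mean) ** 2))
    return mean * area_ha, sd * area_ha

mean_c, sd_c = expected_carbon("woodland", "tall", area_ha=3.2)
print(f"Expected carbon: {mean_c:.0f} t C (±{sd_c:.0f})")
```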


Subject(s)
Carbon , Forests , Biomass , Farms , Bayes Theorem
8.
Blood ; 141(9): 996-1006, 2023 03 02.
Article in English | MEDLINE | ID: mdl-36108341

ABSTRACT

BRAF V600E is the key oncogenic driver mutation in hairy cell leukemia (HCL). We report the efficacy and safety of dabrafenib plus trametinib in patients with relapsed/refractory BRAF V600E mutation-positive HCL. This open-label, phase 2 study enrolled patients with BRAF V600E mutation-positive HCL refractory to first-line treatment with a purine analog or relapsed after ≥2 prior lines of treatment. Patients received dabrafenib 150 mg twice daily plus trametinib 2 mg once daily until disease progression, unacceptable toxicity, or death. The primary endpoint was investigator-assessed objective response rate (ORR) per criteria adapted from National Comprehensive Cancer Network-Consensus Resolution guidelines. Secondary endpoints included duration of response (DOR), progression-free survival (PFS), overall survival (OS), and safety. Fifty-five patients with BRAF V600E mutation-positive HCL were enrolled. The investigator-assessed ORR was 89.0% (95% confidence interval, 77.8%-95.9%); 65.5% of patients had a complete response (without minimal residual disease [MRD]: 9.1% [negative immunohistochemistry of bone marrow {BM} biopsy], 12.7% [negative BM aspirate flow cytometry {FC}], 16.4% [negative immunohistochemistry and/or FC results]; with MRD, 49.1%), and 23.6% had a partial response. The 24-month DOR was 97.7% with 24-month PFS and OS rates of 94.4% and 94.5%, respectively. The most common treatment-related adverse events were pyrexia (58.2%), chills (47.3%), and hyperglycemia (40.0%). Dabrafenib plus trametinib demonstrated durable responses with a manageable safety profile consistent with previous observations in other indications and should be considered as a rituximab-free therapeutic option for patients with relapsed/refractory BRAF V600E mutation-positive HCL. This trial is registered at www.clinicaltrials.gov as #NCT02034110.
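The primary endpoint is a proportion with an exact-style confidence interval. The sketch below computes a Clopper-Pearson interval for 49 responders out of 55 patients, which lands close to the reported 89.0% (77.8%-95.9%); the exact counts and the interval method used by the trial are assumptions, since the abstract does not state them.

```python
# Illustrative sketch: an objective response rate with an exact
# (Clopper-Pearson) 95% confidence interval. The counts below (49/55) and the
# interval method are assumptions chosen to approximate the reported figures,
# not details taken from the trial report.
from statsmodels.stats.proportion import proportion_confint

responders, n = 49, 55
orr = responders / n
low, high = proportion_confint(responders, n, alpha=0.05, method="beta")  # exact CI
print(f"ORR = {orr:.1%} (95% CI {low:.1%}-{high:.1%})")
```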


Subject(s)
Leukemia, Hairy Cell , Proto-Oncogene Proteins B-raf , Humans , Proto-Oncogene Proteins B-raf/genetics , Leukemia, Hairy Cell/drug therapy , Leukemia, Hairy Cell/genetics , Pyridones/adverse effects , Pyrimidinones/adverse effects , Oximes/adverse effects , Mutation , Antineoplastic Combined Chemotherapy Protocols/adverse effects
9.
Grass Forage Sci ; 78(1): 50-63, 2023 Mar.
Article in English | MEDLINE | ID: mdl-38516168

ABSTRACT

Each new generation of grassland managers could benefit from an improved understanding of how modifying nitrogen application and harvest dates in response to different weather and soil conditions will affect grass yields and quality. The purpose of this study was to develop a freely available grass yield simulation model, validated for England and Wales, and to examine its strengths and weaknesses as a teaching tool for improving grass management. The model, called LINGRA-N-Plus, was implemented in a Microsoft Excel spreadsheet and iteratively evaluated by students and practitioners (farmers, consultants, and researchers) in a series of workshops across the UK over two years. The iterative feedback led to the addition of new algorithms, an improved user interface, and the development of a teaching guide. The students and practitioners identified the ease of use and the capacity to understand, visualize and evaluate how decisions, such as variation of cutting intervals, affect grass yields as strengths of the model. We propose that an effective teaching tool must strike an appropriate balance between being sufficiently detailed to demonstrate the major relationships (e.g., the effect of nitrogen on grass yields) and not becoming so complex that the relationships become incomprehensible. We observed that improving the user interface allowed us to extend the scope of the model without reducing the level of comprehension. The students appeared to be interested in the explanatory nature of the model, whilst the practitioners were more interested in the application of a validated model to enhance their decision making.

10.
Neuroimage ; 258: 119392, 2022 09.
Article in English | MEDLINE | ID: mdl-35714887

ABSTRACT

Rostral PFC (area 10) activation is common during prospective memory (PM) tasks, but it is not clear what mental processes these activations index. Three candidate explanations from cognitive neuroscience theory are: (i) monitoring of the environment; (ii) spontaneous intention retrieval; (iii) a combination of the two. These explanations make different predictions about the temporal and spatial patterns of activation that would be seen in rostral PFC in naturalistic settings. Accordingly, to decide between these explanations, we plotted functional events in PFC using portable fNIRS while people carried out a PM task outside the lab and responded to cues when they encountered them. Nineteen people were asked to walk around a street in London, UK and perform various tasks while also remembering to respond to PM cues when they detected them. The PM cues could be either social (greeting a person) or non-social (interacting with a parking meter) in nature. There were also a number of contrast conditions that allowed us to determine activation specifically related to the PM components of the tasks. We found that maintaining both social and non-social intentions was associated with widespread activation within medial and right-hemisphere rostral prefrontal cortex (BA 10), in agreement with numerous previous lab-based fMRI studies of prospective memory. In addition, increased activation was found within lateral prefrontal cortex (BA 45 and 46) when people were maintaining a social intention compared with a non-social one. The data were then subjected to a GLM-based method for automatic identification of functional events (AIDE), and the participants' positions at the time of the activation events were located on a map of the physical space. The results showed that the spatial and temporal distribution of these events was not random, but aggregated around areas in which the participants appeared to retrieve their future intentions (i.e., where they saw intentional cues), as well as where they executed them. Functional events were detected most frequently in BA 10 during the PM conditions compared with other regions and tasks. Mobile fNIRS can be used to measure higher cognitive functions of the prefrontal cortex in "real world" situations outside the laboratory in freely ambulant individuals. The addition of a "brain-first" approach to the data permits the experimenter to determine not only when haemodynamic changes occur, but also where the participant was when they happened. This can be extremely valuable when trying to link brain and cognition.
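The GLM step that underlies event detection in fNIRS can be sketched as follows: an event regressor is convolved with a canonical double-gamma haemodynamic response function and fitted to a channel's time series, yielding a t-value for task-related activation. This is a generic illustration, not the AIDE method itself, and the simulated signal, sampling rate and cue onsets are hypothetical.

```python
# Illustrative sketch of a GLM fit to a single fNIRS channel: convolve event
# onsets with a canonical double-gamma HRF and test the regressor's beta.
# Generic sketch, not the AIDE method; all simulated values are hypothetical.
import numpy as np
from scipy.stats import gamma, t as t_dist

fs = 10.0                                   # sampling rate (Hz), hypothetical
n_samples = 3000
time = np.arange(n_samples) / fs

# Canonical double-gamma haemodynamic response function
hrf_t = np.arange(0, 30, 1 / fs)
hrf = gamma.pdf(hrf_t, 6) - (1 / 6) * gamma.pdf(hrf_t, 16)
hrf /= hrf.max()

# Hypothetical cue onsets (e.g., moments a PM cue was encountered)
onsets = np.zeros(n_samples)
onsets[[500, 1200, 2100]] = 1.0
regressor = np.convolve(onsets, hrf)[:n_samples]

# Simulated channel: task response plus noise
rng = np.random.default_rng(3)
signal = 0.5 * regressor + rng.normal(0, 0.3, n_samples)

# Ordinary least squares GLM with an intercept
X = np.column_stack([np.ones(n_samples), regressor])
beta, residuals, *_ = np.linalg.lstsq(X, signal, rcond=None)
dof = n_samples - X.shape[1]
sigma2 = residuals[0] / dof
se = np.sqrt(sigma2 * np.linalg.inv(X.T @ X)[1, 1])
t_value = beta[1] / se
p_value = 2 * t_dist.sf(abs(t_value), dof)
print(f"beta = {beta[1]:.3f}, t = {t_value:.1f}, p = {p_value:.2g}")
```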


Subject(s)
Memory, Episodic , Brain Mapping , Humans , Magnetic Resonance Imaging , Mental Recall/physiology , Prefrontal Cortex/physiology , Walking
11.
Aust Health Rev ; 46(3): 302-308, 2022 Jun.
Article in English | MEDLINE | ID: mdl-35508434

ABSTRACT

Objective To analyse Medicare expenditure by State/Territory, remoteness and Indigenous demography to assess funding equality in meeting the health needs of remote Indigenous populations in the Northern Territory. Methods Analytic descriptions of Medicare online reports on services and benefits by key demographic variables, linked with Australian Bureau of Statistics data on remoteness and Indigenous population proportion. The Northern Territory Indigenous and non-Indigenous populations were compared with the Australian average between the 2010/2011 and 2019/2020 fiscal years in terms of standardised rates of Medicare services and benefits. These were further analysed using ordinary least squares, simultaneous equations and multilevel models. Results In per capita terms, the Northern Territory receives around 30% less Medicare funding than the national average, even when additional Commonwealth funding for Aboriginal medical services is included. This funding shortfall amounts to approximately AU$80 million annually across the Medicare Benefits Schedule and Pharmaceutical Benefits Scheme combined. The multilevel models indicate that an Aboriginal and Torres Strait Islander person in a remote area attracts AU$531-AU$1041 less in Medicare Benefits Schedule benefits per annum than a non-Indigenous person in an urban area. Indigenous population proportion, together with remoteness, explained 51% of the funding variation. An age-sex based capitation funding model would correct about 87% of the Northern Territory primary care funding inequality. Conclusions The current Medicare funding scheme systematically disadvantages the Northern Territory. A needs-based funding model is required that does not penalise the Northern Territory population on the basis of the remote primary health care service model.
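The age-sex capitation argument can be reduced to simple arithmetic: expected funding is the sum over age-sex groups of the national average benefit per person multiplied by the local population, and the shortfall is expected minus actual. The sketch below uses entirely hypothetical benefit rates, population counts and totals.

```python
# Illustrative sketch of an age-sex capitation comparison: expected Medicare
# Benefits Schedule (MBS) funding is the population-weighted sum of national
# average benefits per person in each age-sex group; the shortfall is expected
# minus actual. All figures below are hypothetical, not the study's data.
national_benefit_per_person = {   # AU$ per person per year, by age-sex group
    ("0-14", "F"): 450, ("0-14", "M"): 430,
    ("15-44", "F"): 780, ("15-44", "M"): 520,
    ("45-64", "F"): 980, ("45-64", "M"): 870,
    ("65+", "F"): 1450, ("65+", "M"): 1380,
}
local_population = {              # hypothetical remote-community population
    ("0-14", "F"): 4200, ("0-14", "M"): 4400,
    ("15-44", "F"): 9100, ("15-44", "M"): 9500,
    ("45-64", "F"): 4800, ("45-64", "M"): 4600,
    ("65+", "F"): 1300, ("65+", "M"): 1100,
}
actual_benefits_paid = 18_500_000  # hypothetical AU$ actually received

expected = sum(national_benefit_per_person[g] * n for g, n in local_population.items())
shortfall = expected - actual_benefits_paid
print(f"Expected under age-sex capitation: ${expected:,.0f}")
print(f"Shortfall: ${shortfall:,.0f} ({shortfall / expected:.0%} of expected)")
```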


Subject(s)
Health Expenditures , Health Services, Indigenous , Aged , Delivery of Health Care , Humans , National Health Programs , Northern Territory , Primary Health Care/methods
12.
BMJ Open ; 12(5): e059716, 2022 05 15.
Article in English | MEDLINE | ID: mdl-35569825

ABSTRACT

OBJECTIVES: To assess the prevalence and incidence of diabetes among Aboriginal peoples in remote communities of the Northern Territory (NT), Australia. DESIGN: Retrospective cohort analysis of linked clinical and administrative data sets from 1 July 2012 to 30 June 2019. SETTING: Remote health centres using the NT Government Primary Care Information System (51 out of a total of 84 remote health centres in the NT). PARTICIPANTS: All Aboriginal clients residing in remote communities serviced by these health centres (N=21 267). PRIMARY OUTCOME MEASURES: Diabetes diagnoses were established using hospital and primary care coding, biochemistry and prescription data. RESULTS: Diabetes prevalence across all ages increased from 14.4% (95% CI: 13.9% to 14.9%) to 17.0% (95% CI: 16.5% to 17.5%) over 7 years. Among adults (≥20 years), the 2018/2019 diabetes prevalence was 28.6% (95% CI: 27.8% to 29.4%), being higher in Central Australia (39.5%, 95% CI: 37.8% to 41.1%) compared with the Top End region (24.2%, 95% CI: 23.3% to 25.1%, p<0.001). Between 2016/2017 and 2018/2019, diabetes incidence across all ages was 7.9 per 1000 person-years (95% CI: 7.3 to 8.7 per 1000 person-years). The adult incidence of diabetes was 12.6 per 1000 person-years (95% CI: 11.5 to 13.8 per 1000 person-years). CONCLUSIONS: The burden of diabetes in the remote Aboriginal population of the NT is among the highest in the world. Strengthened systems of care and public health prevention strategies, developed in partnership with Aboriginal communities, are needed.
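The two measures reported here are a prevalence (cases divided by cohort, with a binomial confidence interval) and an incidence rate per 1000 person-years. A minimal sketch with hypothetical counts, not the NT cohort data, follows.

```python
# Illustrative sketch: prevalence with a 95% confidence interval, and
# incidence per 1000 person-years. All counts below are hypothetical.
from statsmodels.stats.proportion import proportion_confint

# Prevalence: people with diagnosed diabetes / people in the cohort
cases, cohort = 3_600, 21_000
prevalence = cases / cohort
low, high = proportion_confint(cases, cohort, alpha=0.05, method="wilson")
print(f"Prevalence = {prevalence:.1%} (95% CI {low:.1%}-{high:.1%})")

# Incidence: new diagnoses divided by person-years at risk
new_cases = 420
person_years = 53_000
incidence_per_1000 = 1000 * new_cases / person_years
print(f"Incidence = {incidence_per_1000:.1f} per 1000 person-years")
```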


Subject(s)
Diabetes Mellitus , Native Hawaiian or Other Pacific Islander , Adult , Child , Diabetes Mellitus/epidemiology , Humans , Incidence , Northern Territory/epidemiology , Prevalence , Retrospective Studies
13.
Neurophotonics ; 9(2): 025001, 2022 Apr.
Article in English | MEDLINE | ID: mdl-35599691

ABSTRACT

Significance: There is a longstanding recommendation within the field of fNIRS to use both oxygenated (HbO2) and deoxygenated (HHb) hemoglobin when analyzing and interpreting results. Despite this, many fNIRS studies focus on HbO2 only. Previous work has shown that HbO2 on its own is susceptible to systemic interference, and results may mostly reflect that rather than functional activation. Studies that use both HbO2 and HHb to draw their conclusions do so with varying methods, which can lead to discrepancies between studies. Combining HbO2 and HHb has been recommended as a way to utilize both signals in analysis. Aim: We present the development of the hemodynamic phase correlation (HPC) signal, which combines HbO2 and HHb into a single measure. We use synthetic and experimental data to evaluate how the HPC compares with the signals currently used for fNIRS analysis. Approach: Eighteen synthetic datasets were formed using resting-state fNIRS data acquired from 16 channels over the frontal lobe. To simulate fNIRS data for a block-design task, we superimposed a synthetic task-related hemodynamic response on the resting-state data. These data were used to develop an HPC-general linear model (GLM) framework. Experiments were conducted to investigate the performance of each signal at different SNRs and to investigate the effect of false positives on the data. Performance was based on each signal's mean T-value across channels. Experimental data recorded from 128 participants across 134 channels during a finger-tapping task were used to investigate the performance of multiple signals (HbO2, HHb, HbT, HbD, correlation-based signal improvement (CBSI), and HPC) on real data. Signal performance was evaluated on its ability to localize activation to a specific region of interest. Results: The SNR analysis showed that the HPC signal had the highest performance at high SNRs, while the CBSI performed best at medium-low SNR. The analyses evaluating the effect of false positives showed that the HPC and CBSI signals reflect the effect of false positives on HbO2 and HHb. The analysis of real experimental data revealed that the HPC and HHb signals localized activation to the primary motor cortex with the highest accuracy. Conclusions: We developed a new hemodynamic signal (HPC) with the potential to overcome the current limitations of using HbO2 and HHb separately. Our results suggest that the HPC signal provides accuracy comparable to HHb in localizing functional activation while being more robust against false positives.
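The signals are compared via GLM t-values against a task regressor. Because the HPC's formula is not given in the abstract, the combined-signal example in the sketch below uses the CBSI-style combination (HbO2 minus HHb scaled by their standard-deviation ratio) that the abstract also benchmarks; treat that formula, and all simulated data, as assumptions rather than the paper's implementation.

```python
# Illustrative sketch: fitting HbO2 and HHb to a task regressor and forming a
# combined signal. The combined example uses a CBSI-style scaling as commonly
# described in the fNIRS literature -- an assumption, not the HPC definition.
# The simulated data are hypothetical.
import numpy as np

rng = np.random.default_rng(4)
n = 2000
task = (np.sin(np.linspace(0, 20 * np.pi, n)) > 0).astype(float)  # block design

# Simulated chromophores: HbO2 rises and HHb falls with the task, plus noise
hbo = 0.6 * task + rng.normal(0, 0.4, n)
hhb = -0.2 * task + rng.normal(0, 0.2, n)

def glm_t(signal: np.ndarray, regressor: np.ndarray) -> float:
    """t-value of the regressor in a one-regressor-plus-intercept GLM."""
    X = np.column_stack([np.ones_like(regressor), regressor])
    beta, res, *_ = np.linalg.lstsq(X, signal, rcond=None)
    dof = len(signal) - 2
    se = np.sqrt((res[0] / dof) * np.linalg.inv(X.T @ X)[1, 1])
    return beta[1] / se

# Combined signal (CBSI-style): HbO2 minus HHb scaled by their std ratio
alpha = np.std(hbo) / np.std(hhb)
combined = (hbo - alpha * hhb) / 2

for name, sig in [("HbO2", hbo), ("HHb", hhb), ("combined", combined)]:
    print(f"{name:8s} t = {glm_t(sig, task):.1f}")
```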

14.
Thorax ; 77(7): 717-720, 2022 07.
Article in English | MEDLINE | ID: mdl-35354642

ABSTRACT

Given the large numbers of people infected and high rates of ongoing morbidity, research is clearly required to address the needs of adult survivors of COVID-19 living with ongoing symptoms (long COVID). To help direct resource and research efforts, we completed a research prioritisation process incorporating views from adults with ongoing symptoms of COVID-19, carers, clinicians and clinical researchers. The final top 10 research questions were agreed at an independently mediated workshop and included: identifying underlying mechanisms of long COVID, establishing diagnostic tools, understanding trajectory of recovery and evaluating the role of interventions both during the acute and persistent phases of the illness.


Subject(s)
COVID-19 , Adult , COVID-19/complications , Caregivers , Disease Progression , Health Priorities , Humans , Research Personnel , Post-Acute COVID-19 Syndrome
16.
Sci Total Environ ; 827: 154164, 2022 Jun 25.
Article in English | MEDLINE | ID: mdl-35240180

ABSTRACT

Improved farm management of soil organic carbon (SOC) is critical if national governments and agricultural businesses are to achieve net-zero targets. There are opportunities for farmers to secure financial benefits from carbon trading, but field measurements to establish SOC baselines for each part of a farm can be prohibitively expensive. Hence there is a potential role for spatial modelling approaches that have the resolution, accuracy and uncertainty estimates needed to estimate the carbon currently stored in the soil. This study uses three spatial modelling approaches to estimate SOC stocks, which are compared with data measured to a 10 cm depth and then used to determine carbon payments. The three approaches used either fine-scale (100 m × 100 m) or field-scale input soil data to produce either fine- or field-scale outputs across nine geographically dispersed farms. Each spatial model accurately predicted SOC stocks (range: 26.7-44.8 t ha-1) for the five case study farms where measured SOC was lowest (range: 31.6-48.3 t ha-1). However, across the four case study farms with the highest measured SOC (range: 56.5-67.5 t ha-1), both models underestimated SOC, with the coarse-input model predicting lower values (range: 39.8-48.2 t ha-1) than those using fine inputs (range: 43.5-59.2 t ha-1). Hence the use of the spatial models to establish a baseline, from which payments for additional carbon sequestration are derived, favoured farms with already high SOC levels, and that benefit was greatest when the coarse input data were used. Developing a national approach for SOC sequestration payments to farmers is possible, but the economic impacts on individual businesses will depend on the approach and the accounting method.
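The payment implication follows from simple arithmetic: sequestration payments reward the increase above a baseline, so a model that underestimates a farm's baseline inflates the apparent gain. The sketch below makes that explicit with hypothetical stocks, areas and a hypothetical carbon price.

```python
# Illustrative sketch of why the choice of SOC baseline matters for carbon
# payments: payments reward the increase above the baseline, so an
# underestimated baseline inflates the apparent gain. The stocks, area and
# price below are hypothetical, not the study's figures.
def payment(baseline_t_ha: float, later_stock_t_ha: float,
            area_ha: float, price_per_t: float) -> float:
    """Payment for additional SOC above the baseline (never negative)."""
    gain = max(0.0, later_stock_t_ha - baseline_t_ha)
    return gain * area_ha * price_per_t

area_ha = 120.0
price_per_t = 50.0               # hypothetical price per tonne of carbon
soc_after_5_years = 62.0         # measured stock after management change (t/ha)

measured_baseline = 60.0         # field-measured baseline (t/ha)
modelled_baseline = 48.0         # coarse-input model underestimate (t/ha)

print(f"Payment vs measured baseline: {payment(measured_baseline, soc_after_5_years, area_ha, price_per_t):,.0f}")
print(f"Payment vs modelled baseline: {payment(modelled_baseline, soc_after_5_years, area_ha, price_per_t):,.0f}")
```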


Subject(s)
Carbon , Soil , Agriculture/methods , Carbon Sequestration , Farms
17.
Article in English | MEDLINE | ID: mdl-35144035

ABSTRACT

BACKGROUND: Conventional paradigms in clinical neuroscience tend to be constrained in terms of ecological validity, raising several challenges to studying the mechanisms mediating treatments and outcomes in clinical settings. Addressing these issues requires real-world neuroimaging techniques that are capable of continuously collecting data during free-flowing interpersonal interactions and that allow for experimental designs that are representative of the clinical situations in which they occur. METHODS: In this work, we developed a paradigm that fractionates the major components of human-to-human verbal interactions occurring in clinical situations and used functional near-infrared spectroscopy to assess the brain systems underlying clinician-client discourse (N = 30). RESULTS: Cross-brain neural coupling between people was significantly greater during clinical interactions compared with everyday life verbal communication, particularly between the prefrontal cortex (e.g., inferior frontal gyrus) and inferior parietal lobule (e.g., supramarginal gyrus). The clinical tasks revealed extensive increases in activity across the prefrontal cortex, especially in the rostral prefrontal cortex (area 10), during periods in which participants were required to silently reason about the dysfunctional cognitions of the other person. CONCLUSIONS: This work demonstrates a novel experimental approach to investigating the neural underpinnings of interpersonal interactions that typically occur in clinical settings, and its findings support the idea that particular prefrontal systems might be critical to cultivating mental health.
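Cross-brain (inter-brain) coupling is, in its simplest form, a measure of statistical dependence between homologous channels recorded simultaneously from two people. The abstract does not specify the coupling metric used (wavelet coherence is also common in hyperscanning), so the sketch below uses a windowed Pearson correlation on simulated, hypothetical data.

```python
# Illustrative sketch of cross-brain coupling in hyperscanning: correlate one
# person's channel with the homologous channel from the other person in
# sliding windows and average. Generic sketch, not the study's metric; the
# simulated data are hypothetical.
import numpy as np

rng = np.random.default_rng(5)
n = 4000
shared = rng.normal(0, 1, n)  # a common "conversation-driven" component

clinician = 0.6 * shared + rng.normal(0, 1, n)
client = 0.6 * shared + rng.normal(0, 1, n)

def sliding_coupling(a: np.ndarray, b: np.ndarray, window: int = 200) -> float:
    """Mean Pearson correlation across non-overlapping windows."""
    rs = []
    for start in range(0, len(a) - window + 1, window):
        wa, wb = a[start:start + window], b[start:start + window]
        rs.append(np.corrcoef(wa, wb)[0, 1])
    return float(np.mean(rs))

print(f"Mean cross-brain coupling: {sliding_coupling(clinician, client):.2f}")
```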


Subject(s)
Mental Health , Neuroimaging , Brain , Humans , Neuroimaging/methods , Parietal Lobe , Prefrontal Cortex/diagnostic imaging
18.
Article in English | MEDLINE | ID: mdl-36612678

ABSTRACT

Aboriginal and Torres Strait Islander peoples' (hereafter respectfully referred to as Indigenous Australians) experiences of health care are shaped by historical, social and cultural factors, with cultural security critical to effective care provision and engagement between services and community. Positive patient experiences are associated with better health outcomes. Consequently, it is an accreditation requirement that primary health care (PHC) services formally gather and respond to patient feedback. However, currently available patient feedback tools were not developed with Indigenous Australians, and do not reflect their values and world views. Existing tools do not capture important experiences of care of Indigenous Australians in PHC settings, nor do they return information that assists services to improve care. Consistent with the principles of Indigenous Data Sovereignty, we will co-design and validate an Indigenous-specific Patient Reported Experience Measure (PREM) that produces data by and for community, suitable for use in quality improvement in comprehensive PHC services. This paper presents the protocol of the study, outlining the rationale, methodologies and associated activities being applied in developing the PREM. Briefly, guided by an Aboriginal and Torres Strait Islander Advisory Group, our team of Indigenous and non-Indigenous researchers, service providers and policy makers will use a combination of Indigenous, participatory and traditional Western methodologies for scale development. We will engage PHC service staff and communities in eight selected sites across remote, regional and metropolitan communities in Australia for iterative cycles of data collection and feedback throughout the research process. Yarning Circles with community members will identify core concepts to develop an "Experience of Care Framework", which will be used to develop items for the PREM. Staff members will be interviewed regarding desirable characteristics and feasibility considerations for the PREM. The PREM will then undergo cognitive and psychometric testing.


Subject(s)
Australian Aboriginal and Torres Strait Islander Peoples , Health Services, Indigenous , Patient Reported Outcome Measures , Humans , Australia , Primary Health Care/methods
19.
Lancet Oncol ; 23(1): 53-64, 2022 01.
Article in English | MEDLINE | ID: mdl-34838156

ABSTRACT

BACKGROUND: Effective treatments are needed to improve outcomes for high-grade glioma and low-grade glioma. The activity and safety of dabrafenib plus trametinib were evaluated in adult patients with recurrent or progressive BRAFV600E mutation-positive high-grade glioma and low-grade glioma. METHODS: This study is part of an ongoing open-label, single-arm, phase 2 Rare Oncology Agnostic Research (ROAR) basket trial at 27 community and academic cancer centres in 13 countries (Austria, Belgium, Canada, France, Germany, Italy, Japan, the Netherlands, Norway, South Korea, Spain, Sweden, and the USA). The study enrolled patients aged 18 years or older with an Eastern Cooperative Oncology Group performance status of 0, 1, or 2. Patients with BRAFV600E mutation-positive high-grade glioma and low-grade glioma received dabrafenib 150 mg twice daily plus trametinib 2 mg once daily orally until unacceptable toxicity, disease progression, or death. In the high-grade glioma cohort, patients were required to have measurable disease at baseline using the Response Assessment in Neuro-Oncology high-grade glioma response criteria and have been treated previously with radiotherapy and first-line chemotherapy or concurrent chemoradiotherapy. Patients with low-grade glioma were required to have measurable non-enhancing disease (except pilocytic astrocytoma) at baseline using the Response Assessment in Neuro-Oncology low-grade glioma criteria. The primary endpoint, in the evaluable intention-to-treat population, was investigator-assessed objective response rate (complete response plus partial response for high-grade glioma and complete response plus partial response plus minor response for low-grade glioma). This trial is ongoing but closed to enrolment (NCT02034110). FINDINGS: Between April 17, 2014, and July 25, 2018, 45 patients (31 with glioblastoma) were enrolled into the high-grade glioma cohort and 13 patients were enrolled into the low-grade glioma cohort. The results presented here are based on interim analysis 16 (data cutoff Sept 14, 2020). In the high-grade glioma cohort, median follow-up was 12·7 months (IQR 5·4-32·3) and 15 (33%; 95% CI 20-49) of 45 patients had an objective response by investigator assessment, including three complete responses and 12 partial responses. In the low-grade glioma cohort, median follow-up was 32·2 months (IQR 25·1-47·8). Nine (69%; 95% CI 39-91) of 13 patients had an objective response by investigator assessment, including one complete response, six partial responses, and two minor responses. Grade 3 or worse adverse events were reported in 31 (53%) patients, the most common being fatigue (five [9%]), decreased neutrophil count (five [9%]), headache (three [5%]), and neutropenia (three [5%]). INTERPRETATION: Dabrafenib plus trametinib showed clinically meaningful activity in patients with BRAFV600E mutation-positive recurrent or refractory high-grade glioma and low-grade glioma, with a safety profile consistent with that in other indications. BRAFV600E testing could potentially be adopted in clinical practice for patients with glioma. FUNDING: Novartis.


Subject(s)
Antineoplastic Combined Chemotherapy Protocols/therapeutic use , Brain Neoplasms/drug therapy , Glioma/drug therapy , Mutation , Proto-Oncogene Proteins B-raf/genetics , Adolescent , Adult , Aged , Antineoplastic Combined Chemotherapy Protocols/adverse effects , Brain Neoplasms/genetics , Brain Neoplasms/mortality , Female , Glioma/genetics , Glioma/mortality , Humans , Imidazoles/administration & dosage , Isocitrate Dehydrogenase/genetics , Male , Middle Aged , Oximes/administration & dosage , Pyridones/administration & dosage , Pyrimidinones/administration & dosage , Young Adult
20.
Front Neurogenom ; 3: 806485, 2022.
Article in English | MEDLINE | ID: mdl-38235451

ABSTRACT

People with a depressed mood tend to perform poorly on executive function tasks, which depend heavily on the prefrontal cortex (PFC), an area of the brain that has also been shown to be hypo-active in this population. Recent research has suggested that these aspects of cognition might be improved through physical activity and cognitive training. However, whether the acute effects of exercise on PFC activation during executive function tasks vary with depressive symptoms remains unclear. To investigate these effects, 106 participants were given a cardiopulmonary exercise test (CPET) and were administered a set of executive function tests directly before and after the CPET assessment. The composite effects of exercise on the PFC (all experimental blocks) showed bilateral activation changes in dorsolateral (BA46/9) and ventrolateral (BA44/45) PFC, with the greatest changes occurring in rostral PFC (BA10). The effects observed in right ventrolateral PFC varied with the level of depressive symptoms (13% variance explained), with smaller activation changes at higher symptom levels. There was also a positive relationship between CPET scores (VO2peak) and right rostral PFC, in that greater activation changes in right BA10 were predictive of higher levels of aerobic fitness (9% variance explained). Since acute exercise ipsilaterally affected this PFC subregion and the inferior frontal gyrus during executive function tasks, physical activity might benefit the executive functions these subregions support. Moreover, because physical fitness and depressive symptoms explained some of the variation in cerebral upregulation of these subregions, physical activity might more specifically facilitate the engagement of executive functions that are typically associated with hypoactivation in depressed populations. Future research might investigate this possibility in clinical populations, particularly the neural effects of physical activity used in combination with mental health interventions.
