ABSTRACT
Percutaneous ablation is recommended in Barcelona Clinic Liver Cancer (BCLC) stage 0/A patients with HCC ≤3 cm as a curative treatment modality alongside surgical resection and liver transplantation. However, trans-arterial chemo-embolisation (TACE) is commonly used in the real world as an initial treatment in patients with single small HCC, in contrast to widely accepted clinical practice guidelines, which typically describe TACE as a treatment for intermediate-stage HCC. We performed this real-world propensity-matched multi-centre cohort study in patients with single HCC ≤3 cm to assess for differences in survival outcomes between those undergoing initial TACE and those receiving upfront ablation. Patients with a new diagnosis of BCLC 0/A HCC with a single tumour ≤3 cm first diagnosed between 1 January 2016 and 31 December 2020 who received initial TACE or ablation were included in the study. A total of 348 patients were included, with 147 patients receiving initial TACE and 201 patients undergoing upfront ablation. After propensity score matching using key covariates, 230 patients were available for analysis, with 115 in each group. There were no significant differences in overall survival (log-rank test p = 0.652) or liver-related survival (log-rank test p = 0.495) over a median follow-up of 43 months. While rates of complete response (CR) were superior after ablation compared to TACE as a first treatment (74% vs. 56%, p < 0.004), there was no significant difference in CR rates when allowing for further subsequent treatments (86% vs. 80%, p = 0.219). In those who achieved CR, recurrence-free survival and local recurrence-free survival were similar (log-rank test p = 0.355 and p = 0.390, respectively). Our study provides valuable real-world evidence that TACE, when offered with appropriate follow-up treatment, is a reasonable initial management strategy in very early/early-stage HCC, with survival outcomes similar to those managed with upfront ablation.
Further work is needed to better define the role for TACE in BCLC 0/A HCC.
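The survival comparisons above rest on the two-group log-rank test applied to the propensity-matched cohorts. As a rough sketch of what that statistic computes — run here on small synthetic follow-up times, not the study's data — the log-rank chi-square can be assembled by hand:

```python
import math

def logrank_chi2(times_a, events_a, times_b, events_b):
    """Two-group log-rank chi-square for right-censored survival data.
    times_*: follow-up times; events_*: 1 = event observed, 0 = censored."""
    data = [(t, e, 0) for t, e in zip(times_a, events_a)] + \
           [(t, e, 1) for t, e in zip(times_b, events_b)]
    event_times = sorted({t for t, e, _ in data if e == 1})
    obs_a = exp_a = var = 0.0
    for t in event_times:
        at_risk = [g for tt, _, g in data if tt >= t]
        n = len(at_risk)
        n_a = sum(1 for g in at_risk if g == 0)
        d = sum(1 for tt, e, _ in data if tt == t and e == 1)   # events at t
        d_a = sum(1 for tt, e, g in data if tt == t and e == 1 and g == 0)
        obs_a += d_a
        exp_a += d * n_a / n                  # expected events in group A
        if n > 1:                             # hypergeometric variance term
            var += d * (n_a / n) * (1 - n_a / n) * (n - d) / (n - 1)
    chi2 = (obs_a - exp_a) ** 2 / var
    p = math.erfc(math.sqrt(chi2 / 2))        # p-value, chi-square with 1 df
    return chi2, p

# illustrative synthetic follow-up in months; NOT the study's cohort
chi2, p = logrank_chi2([5, 8, 12, 20, 30, 43], [1, 0, 1, 0, 1, 0],
                       [6, 9, 11, 22, 28, 43], [1, 1, 0, 0, 1, 0])
```

In practice a survival library would be used; the point is that the test compares observed versus expected event counts at each event time, pooled across the matched groups.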
ABSTRACT
BACKGROUND AND PURPOSE: Antipsychotics such as olanzapine are associated with significant metabolic dysfunction, attributed to gut microbiome dysbiosis. A recent notion that most psychotropics are detrimental to the gut microbiome has arisen from consistent findings of metabolic adverse effects. However, unlike olanzapine, the metabolic effects of lurasidone are conflicting. Thus, this study investigates the contrasting effects of olanzapine and lurasidone on the gut microbiome to explore the hypothesis of 'gut neutrality' for lurasidone exposure. EXPERIMENTAL APPROACH: The effects of olanzapine and lurasidone on the gut microbiome were explored in Sprague-Dawley rats. Faecal and blood samples were collected weekly over a 21-day period to analyse changes to the gut microbiome and related metabolic markers. KEY RESULTS: Lurasidone triggered no significant weight gain or metabolic alterations, instead positively modulating the gut microbiome through increases in mean operational taxonomic units (OTUs) and alpha diversity. This novel finding suggests an underlying mechanism for lurasidone's metabolic inertia. In contrast, olanzapine triggered a statistically significant decrease in mean OTUs, substantial compositional variation and a depletion in short-chain fatty acid abundance. Microbiome depletion correlated with metabolic dysfunction, producing a 30% increase in weight gain, increased pro-inflammatory cytokine expression, and increased blood glycaemic and triglyceride levels. CONCLUSION AND IMPLICATIONS: Our results challenge the notion that all antipsychotics disrupt the gut microbiome similarly and highlight the potential benefits of gut-neutral antipsychotics, such as lurasidone, in managing metabolic side effects. Further research is warranted to validate these findings in humans and to guide personalised pharmacological treatment regimens for schizophrenia.
Subjects
Antipsychotic Agents, Gastrointestinal Microbiome, Lurasidone Hydrochloride, Olanzapine, Sprague-Dawley Rats, Animals, Lurasidone Hydrochloride/pharmacology, Olanzapine/pharmacology, Gastrointestinal Microbiome/drug effects, Antipsychotic Agents/pharmacology, Male, Rats, Feces/microbiology, Weight Gain/drug effects
ABSTRACT
BACKGROUND: Hepatocellular carcinoma (HCC) recurrence following surgical resection remains a significant clinical challenge, necessitating reliable predictive models to guide personalised interventions. In this study, we sought to harness the power of artificial intelligence (AI) to develop a robust predictive model for HCC recurrence using comprehensive clinical datasets. METHODS: Leveraging data from 958 patients across multiple centres in Australia and Hong Kong, we employed a multilayer perceptron (MLP) as the optimal classifier for model generation. RESULTS: Through rigorous internal cross-validation and testing on an external cohort from the Chinese University of Hong Kong (CUHK), our AI model successfully identified specific pre-surgical risk factors associated with HCC recurrence. These factors encompassed hepatic synthetic function, liver disease aetiology, ethnicity and modifiable metabolic risk factors, collectively contributing to the predictive synergy of our model. Notably, our model exhibited high accuracy during cross-validation (0.857 ± 0.023) and testing on the CUHK cohort (0.835), with a notable degree of confidence in predicting HCC recurrence within accurately classified patient cohorts. To facilitate clinical application, we developed an online AI digital tool capable of real-time prediction of HCC recurrence risk, demonstrating acceptable accuracy at the individual patient level. CONCLUSION: Our findings underscore the potential of AI-driven predictive models in facilitating personalised risk stratification and targeted interventions to mitigate HCC recurrence by identifying modifiable risk factors unique to each patient. This model aims to aid clinicians in devising strategies to disrupt the underlying carcinogenic network driving recurrence.
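As a hedged, minimal illustration of the modelling approach described — a multilayer perceptron scored by internal cross-validation — the sketch below trains an MLP on synthetic stand-in features. The variable names and data are hypothetical placeholders, not the study's clinical dataset or its actual model:

```python
import numpy as np
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
n = 300
# hypothetical stand-ins for pre-surgical covariates (e.g. synthetic
# function, aetiology, metabolic risk) -- simulated, not real patients
X = rng.normal(size=(n, 6))
logits = 1.5 * X[:, 0] - 1.0 * X[:, 1] + 0.5 * X[:, 2]
y = (logits + rng.normal(scale=0.5, size=n) > 0).astype(int)  # recurrence flag

clf = make_pipeline(StandardScaler(),
                    MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000,
                                  random_state=0))
scores = cross_val_score(clf, X, y, cv=5)   # 5-fold internal cross-validation
mean_acc = scores.mean()
```

Reporting the mean and spread of the fold accuracies mirrors the "accuracy ± SD" figure quoted in the abstract; an external hold-out cohort would then be scored once with the frozen model.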
Subjects
Artificial Intelligence, Hepatocellular Carcinoma, Liver Neoplasms, Local Neoplasm Recurrence, Humans, Hepatocellular Carcinoma/surgery, Liver Neoplasms/surgery, Risk Factors, Female, Male, Hong Kong, Middle Aged, Australia, Aged, Risk Assessment, Hepatectomy/adverse effects, Precision Medicine
ABSTRACT
The management of early-stage hepatocellular carcinoma (HCC) is complex, with multiple treatment strategies available. There is a paucity of literature regarding variations in the patterns of care and outcomes between transplant and non-transplant centres. We conducted this real-world multi-centre cohort study in two liver cancer referral centres with an integrated liver transplant program and an additional eight non-transplant HCC referral centres across Australia to identify variation in patterns of care and key survival outcomes. Patients with Barcelona Clinic Liver Cancer (BCLC) stage 0/A HCC, first diagnosed between 1 January 2016 and 31 December 2020, who were managed at a participating site, were included in the study. Patients were excluded if they had a history of prior HCC or if they received upfront liver transplantation. A total of 887 patients were included in the study, with 433 patients managed at a liver cancer centre with a transplant program (LTC) and 454 patients managed at a non-transplant centre (NTC). Management at an LTC did not significantly predict allocation to resection (adjusted OR 0.75, 95% CI 0.50 to 1.11, p = 0.148). However, in those not receiving resection, LTC and NTC patients were systematically managed differently, with LTC patients five times less likely to receive upfront ablation than NTC patients (adjusted OR 0.19, 95% CI 0.13 to 0.28, p < 0.001), even after adjusting for tumour burden, as well as for age, gender, liver disease aetiology, liver disease severity, and medical comorbidities. LTCs exhibited significantly higher proportions of patients undergoing TACE in every tumour burden category, including those with a single tumour measuring 2 cm or less (p < 0.001).
On multivariable Cox proportional hazards analysis, management at a transplant centre was associated with reduced all-cause mortality (adjusted HR 0.71, 95% CI 0.51 to 0.98, p = 0.036), and competing-risk regression, considering liver transplant as a competing event, demonstrated a similar reduction in risk (adjusted HR 0.70, 95% CI 0.50 to 0.99, p = 0.041), suggesting that the reduced risk of death is not fully explained by higher rates of transplantation. Our study highlights systematic differences in HCC care between large-volume liver transplant centres and other sites, which have not previously been well described. Further work is needed to better define the reasons for differences in treatment allocation and to minimise unwarranted treatment variation, so as to maximise patient outcomes across Australia.
ABSTRACT
[This corrects the article DOI: 10.1371/journal.pone.0278793.].
ABSTRACT
BACKGROUND: Eosinophilic oesophagitis (EOE) is a known cause of food bolus obstruction (FBO) with rising incidence and prevalence. AIMS: To assess the rates of EOE in adult cases presenting with an FBO via prospective biopsy collection during index endoscopy. METHODS: Oesophageal FBO cases requiring gastroscopy between February 2014 and January 2021 at a single institution with a unified policy to perform biopsies on FBO cases were analysed using medical records, endoscopy and histology. Statistical analysis was undertaken to compare those with and without EOE as their final diagnosis, including the timing of oesophageal biopsy and the season in which cases presented. RESULTS: One hundred and ninety FBO presentations were analysed; 15 patients presented twice and one patient presented four times within the 7-year study period. Men represented 72% of cases. A total of 78% of cases had biopsies collected at an index or scheduled follow-up endoscopy. EOE was the cause of the FBO in 28% (53/190) of presentations. FBO secondary to EOE was more likely to occur in the spring and summer months (Australian September to March), with 39% (19/49) of cases presenting in spring attributable to EOE. CONCLUSION: EOE affects a significant proportion of patients presenting with FBO (28%); a high biopsy rate of 78% in FBO cases provides an opportunity for prompt diagnosis and treatment.
Subjects
Eosinophilic Esophagitis, Humans, Eosinophilic Esophagitis/epidemiology, Eosinophilic Esophagitis/diagnosis, Eosinophilic Esophagitis/complications, Male, Female, Middle Aged, Adult, Biopsy, Aged, Gastroscopy, Deglutition Disorders/etiology, Deglutition Disorders/epidemiology, Prospective Studies, Esophagus/pathology, Food/adverse effects, Retrospective Studies, Seasons, Young Adult, Australia/epidemiology
ABSTRACT
OBJECTIVES: Feline idiopathic cystitis (FIC), with or without urethral obstruction (UO), is associated with environmental stress factors, and the influence of human movement restrictions on its incidence remains undetermined. The coronavirus disease 2019 (COVID-19) pandemic restricted human movement and working behaviours; it is unknown whether these restrictions increased the risk of FIC or UO in cats. METHODS: Total cat emergency accessions and transfers between 8 February 2019 and 8 February 2021 at two private hospitals were retrospectively reviewed. Cats were included in the FIC group if they presented with lower urinary tract signs and supporting urinalysis, and in the UO group if they presented with UO. Cats with a current urinary tract infection, or previous FIC or UO, were excluded. Groups were considered 'pre-COVID-19' between February 2019 and 2020 and 'COVID-19' between February 2020 and 2021. Cases of FIC and UO were compared between the COVID-19 and pre-COVID-19 periods using Fisher's exact test and relative risk (RR) calculations. RESULTS: The pre-COVID-19 incidence of FIC was 4.3% (63/1477, 95% confidence interval [CI] 0.033-0.053), non-obstructive FIC was 1.4% (20/1477, 95% CI 0.008-0.020) and UO was 2.9% (43/1477, 95% CI 0.020-0.038). One cat was excluded as obstruction occurred during hospitalisation. The COVID-19 incidence of FIC was 5.4% (113/2081, 95% CI 0.044-0.064), non-obstructive FIC was 2.1% (70/2081, 95% CI 0.014-0.027) and UO was 3.4% (70/2081, 95% CI 0.026-0.042). The risk of non-obstructive FIC (P = 0.122; RR 0.652, 95% CI 0.387-1.096), UO (P = 0.382; RR 0.839, 95% CI 0.577-1.22) or either (P = 0.098; RR 0.773, 95% CI 0.572-1.044) was not significantly higher in the COVID-19 period than in the pre-COVID-19 period.
CONCLUSIONS AND RELEVANCE: No clear association between COVID-19 movement restrictions and the incidence of UO or non-obstructive FIC was found within this retrospective population.
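The relative risks quoted above come from comparing two proportions between periods. A minimal sketch of a relative risk with a Wald-type 95% confidence interval — the counts below are hypothetical, chosen only to resemble the study's denominators, not its actual data:

```python
import math

def relative_risk(a, n1, b, n2, z=1.96):
    """Relative risk of group 1 vs group 2 with an approximate 95% CI.
    a/n1: cases/total in group 1; b/n2: cases/total in group 2."""
    rr = (a / n1) / (b / n2)
    # standard error of log(RR), Wald approximation
    se = math.sqrt(1 / a - 1 / n1 + 1 / b - 1 / n2)
    lo = math.exp(math.log(rr) - z * se)
    hi = math.exp(math.log(rr) + z * se)
    return rr, lo, hi

# hypothetical counts, NOT the study's data
rr, lo, hi = relative_risk(70, 2000, 43, 1500)
```

If the CI spans 1.0, the difference in risk between periods is not statistically significant at the 5% level, which is the pattern the abstract reports.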
Subjects
COVID-19, Cat Diseases, Cystitis, Urethral Obstruction, Urologic Diseases, Humans, Cats, Animals, Retrospective Studies, Queensland, Incidence, COVID-19/epidemiology, COVID-19/complications, COVID-19/veterinary, Urologic Diseases/veterinary, Australia, Cystitis/veterinary, Urethral Obstruction/veterinary, Cat Diseases/epidemiology
ABSTRACT
The optimal treatment approach in very early-stage and early-stage hepatocellular carcinoma (HCC) is not precisely defined, and there is ambiguity in the literature around the comparative efficacy of surgical resection versus ablation as curative therapies for limited disease. We performed this real-world propensity-matched, multi-centre cohort study to assess for differences in survival outcomes between those undergoing resection and those receiving ablation. Patients with Barcelona Clinic Liver Cancer (BCLC) stage 0/A HCC first diagnosed between 1 January 2016 and 31 December 2020 who received ablation or resection as initial treatment were included in the study. A total of 450 patients were included from 10 major liver centres, including two transplant centres. Following propensity score matching using key covariates, 156 patients were available for analysis, with 78 in each group. Patients who underwent resection had significantly improved overall survival (log-rank test p = 0.023) and local recurrence-free survival (log-rank test p = 0.027) compared to those who received ablation. Based on real-world data, our study supports the use of surgical resection in preference to ablation as first-line curative therapy in appropriately selected BCLC 0/A HCC patients.
ABSTRACT
Chronic exposure to the cyanobacterial biotoxin β-methylamino-L-alanine (BMAA) has been associated with the development of a sporadic neurodegenerative disease, amyotrophic lateral sclerosis/parkinsonism-dementia complex (ALS/PDC), as observed within certain Indigenous populations of Guam and Japan. Studies in primate models and cell culture have supported the association of BMAA with ALS/PDC, yet the pathological mechanisms at play remain incompletely characterized, effectively stalling the development of rationally designed therapeutics or the application of preventative measures for this disease. In this study we demonstrate for the first time that sub-excitotoxic doses of BMAA modulate the canonical Wnt signaling pathway to drive cellular defects in human neuroblastoma cells, suggesting a potential mechanism by which BMAA may promote neurological disease. Further, we demonstrate that the effects of BMAA can be reversed in cell culture by pharmacological modulators of the Wnt pathway, revealing the potential value of targeting this pathway therapeutically. Interestingly, our results suggest the existence of a distinct Wnt-independent mechanism activated by BMAA in glioblastoma cells, highlighting the likelihood that neurological disease may result from the cumulative effects of distinct cell-type-specific mechanisms of BMAA toxicity.
Subjects
Diamino Amino Acids, Amyotrophic Lateral Sclerosis, Glioblastoma, Neuroblastoma, Parkinsonian Disorders, Animals, Humans, Glioblastoma/chemically induced, Amyotrophic Lateral Sclerosis/pathology, Cyanobacteria Toxins, Diamino Amino Acids/toxicity, Diamino Amino Acids/metabolism, Neurotoxins/toxicity
ABSTRACT
BACKGROUND: Primary biliary cholangitis (PBC) is a chronic progressive liver disease of unknown aetiology characterised by immune-mediated destruction of small and medium-sized intrahepatic bile ducts. There are few well-established risk factors, and epidemiological studies are needed to further evaluate the pathogenesis of the disease. AIM: To evaluate the relationship of alcohol intake, smoking and marijuana use with PBC development. METHODS: We conducted a prevalent case-control study of 200 cases and 200 age-matched (within a five-year age band) and sex-matched controls, identified from the Victorian PBC prevalence study. We assessed lifetime alcohol intake and smoking behaviour (both tobacco and marijuana) prior to PBC onset and used conditional logistic regression for analyses. RESULTS: Alcohol intake consistently showed a dose-dependent inverse association with case status, and this was most substantial for 21-30 years and 31-40 years (P trend < 0.001). Smoking was associated with PBC, with a stronger association for a longer duration of smoking [e.g., adjusted OR 2.27 (95% CI: 1.12-4.62) for those who had smoked for 20-35 years]. There was no association between marijuana use and PBC. CONCLUSION: Alcohol appears to have an inverse relationship with PBC. Smoking was confirmed as an environmental risk factor for PBC. There was no association between marijuana use and PBC.
ABSTRACT
BACKGROUND: Pulmonary embolism (PE) is associated with significant morbidity and mortality. PE is a heterogeneous entity that causes a wide variety of clinical presentations, making it imperative to establish which clinical symptoms, signs and biomarkers can influence the pretest probability of PE, to aid clinicians and reduce over-testing. AIM: To analyse the clinical parameters used by clinicians to order a computed tomography pulmonary angiogram (CTPA) and establish which were associated with the presence of PE. METHODS: Medical records of patients who underwent CTPA from December 2015 to March 2016 were extracted. Patient demographics, clinical symptoms, and diagnostic and radiological results were analysed. RESULTS: The study included 150 CTPA studies. Of these, 25 were positive for PE and 125 were negative. There was no significant relationship between the presence or character of chest pain and a positive CTPA result (P = 0.216). A previous history of venous thromboembolism (VTE) (P < 0.0001), one or more risk factors for VTE, and a positive troponin (P < 0.002) were all predictive of PE. None of the patients with a negative D-dimer had a positive CTPA. CONCLUSION: This study supports the negative predictive value of the D-dimer for excluding PE and demonstrates that the strongest pretest predictors of PE in our population are a prior history of VTE, risk factors for VTE and elevated troponin. None of the parameters that often generate requests for CTPA, including vital signs and the presence of chest pain, was associated with the presence of PE in our study population.
Subjects
Computed Tomography Angiography/methods, Pulmonary Embolism/blood, Pulmonary Embolism/diagnostic imaging, Troponin/blood, Aged, Biomarkers/blood, Female, Humans, Male, Middle Aged, Predictive Value of Tests, X-Ray Computed Tomography/methods
ABSTRACT
In 2011-2012, approximately 26% of Australian children aged 5-17 years were reported to be overweight or obese. Furthermore, the increase in prevalence of overweight and obesity among US children parallels reported increases in energy intake and portion sizes of common foods, leading to the recognition that the availability of larger portion sizes contributes to the rise in overweight and obesity prevalence. Thus, the aim of this time-series analysis was to investigate whether selected food portion sizes in Australian children aged 2-16 years changed between 2007 and 2011-2012. Portion size data from 24-h recalls collected in Australian nutrition surveys were compared between 2007 and 2011-2012. Portion sizes changed significantly in 23% of items, with increases in 15% and decreases in 8%. Changes in portion sizes varied by age, sex and food group. Changes occurred for many meat-based items; energy-dense, nutrient-poor food items; breads; cereals; and some fruits and vegetables. Vegetable and fruit portion sizes were below the respective serving sizes of 75 g and 150 g in the Australian Guide to Healthy Eating, while portion sizes of some energy-dense, nutrient-poor foods increased. These findings suggest that approaches to increasing consumption of nutrient-dense core foods and reducing energy-dense, nutrient-poor food items in children are warranted.
ABSTRACT
Introduction: Simulation-based, multiprofessional team training (SBMPTT) is used widely in healthcare, with evidence that it can improve clinical outcomes and be associated with a positive safety culture. Our aim was to explore the impact of introducing this type of training to a gynaecological team. Methods: In this interrupted time-series study, 'Safety Attitudes Questionnaire' (SAQ) data were collected both before and after SBMPTT was introduced to a gynaecological team. Results: Low baseline SAQ scores coincided with difficulty in establishing the training, meaning that by the end of our study period only a small proportion of staff had actually attended a training session. Despite trends towards improvement in scores for safety climate, teamwork climate and job satisfaction, no statistically significant difference was observed. There was, however, an improved perception of the level of collaboration between nursing staff and doctors after the introduction of training. Conclusions and Discussion: In this paper we explore the hypothesis that low baseline SAQ scores may indicate that the multiprofessional teams most in need of training work in environments where it is most challenging to implement. There is evidence from other specialties that multiprofessional team training works; now we need to understand how to address the barriers to getting it started. We suggest how the SAQ could be used as a directive tool for improvement, using the detailed analysis of the local safety culture it provides both to inform future training design and to give management an objective marker of progress.
ABSTRACT
BACKGROUND: Diet quality tools provide researchers with brief methods to assess the nutrient adequacy of usual dietary intake. This study describes the development and validation of a pediatric diet quality index, the Australian Recommended Food Scores for Pre-schoolers (ARFS-P), for use with children aged two to five years. METHODS: The ARFS-P was derived from a 120-item food frequency questionnaire, with eight sub-scales, and was scored from zero to 73. Linear regressions were used to estimate the relationship between diet quality score and nutrient intakes in 142 children (mean age 4 years) in rural localities in New South Wales, Australia. RESULTS: Total ARFS-P and component scores were highly related to dietary intake of the majority of macronutrients and micronutrients, including protein, β-carotene, vitamin C and vitamin A. Total ARFS-P was also positively related to total consumption of nutrient-dense foods, such as fruits and vegetables, and negatively related to total consumption of discretionary choices, such as sugar-sweetened drinks and packaged snacks. CONCLUSION: The ARFS-P is a valid measure that can be used to characterise nutrient intakes for children aged two to five years. Further research could assess the utility of the ARFS-P for monitoring of usual dietary intake over time or as part of clinical management.
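The validation step described — linear regressions of nutrient intake on diet quality score — can be sketched with ordinary least squares on simulated data. The score range matches the ARFS-P (0-73) and the sample size matches the reported cohort, but the vitamin C relationship below is an illustrative assumption, not the study's result:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 142                                   # matches the reported sample size
score = rng.uniform(0, 73, size=n)        # simulated ARFS-P totals (0-73)
# simulated vitamin C intake (mg/day) with an assumed positive slope
vitc = 20 + 0.8 * score + rng.normal(scale=10, size=n)

# intercept and slope via ordinary least squares, as in the study's
# linear regressions of nutrient intake on diet quality score
X = np.column_stack([np.ones(n), score])
beta, *_ = np.linalg.lstsq(X, vitc, rcond=None)
intercept, slope = beta
```

A positive, precisely estimated slope is what "highly related to dietary intake" means operationally: each additional score point predicts a fixed increment in the nutrient intake.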
Subjects
Feeding Behavior, Food Quality, Recommended Dietary Allowances, Ascorbic Acid/administration & dosage, Child Nutritional Physiological Phenomena, Child Preschool, Cross-Sectional Studies, Diet, Diet Surveys, Dietary Proteins/administration & dosage, Energy Intake, Female, Organic Food, Fruit, Humans, Linear Models, Male, Micronutrients/administration & dosage, New South Wales, Nutrition Assessment, Randomized Controlled Trials as Topic, Socioeconomic Factors, Surveys and Questionnaires, Vegetables, Vitamin A/administration & dosage, beta Carotene/administration & dosage
ABSTRACT
BACKGROUND: The portion size of foods is reported to contribute to the rise in obesity prevalence. However, evidence of changes in portion size for commonly consumed foods in Australia is lacking. The aim was to evaluate whether Australian child and adolescent portion sizes of selected foods changed from 1995 to 2007. METHODS: Time-series study comparing dietary data from two nationally representative cross-sectional surveys of Australian households. The dietary data were from children aged 2-16 years who participated in the 1995 National Nutrition Survey (n = 2198) and the 2007 Australian National Children's Nutrition and Physical Activity Survey (n = 4799). RESULTS: Differences were found across survey years in the median portion size of common foods and beverages assessed by 24-hour recalls for age and sex categories. Of the 61 food items evaluated across the whole population sample, portion size increased in 18 items and decreased in 22, with no change in 20, although the magnitude of change varied by age and sex. Decreases in portion size were detected for most dairy products, breakfast cereal, some packaged snack foods and vegetables (p < 0.0001). Increases were detected for cooked chicken, mixed chicken dishes, bacon and ham (p < 0.0001), cooked meat (p < 0.05), fish (p < 0.01) and pizza (p < 0.0001). No significant changes were detected for many items, including white and wholemeal bread, mincemeat, chocolate and soft drink. CONCLUSIONS: Small changes in portion sizes were detected over 12 years in Australian children and adolescents, with the degree of change varying by sex, age and food group. Knowledge of usual portion sizes could inform programs targeting appropriate serving size selection in children and adolescents.
Subjects
Beverages/statistics & numerical data, Diet/trends, Portion Size/trends, Adolescent, Australia, Child, Child Preschool, Cross-Sectional Studies, Female, Food, Food Preferences, Humans, Male, Nutrition Surveys
ABSTRACT
BACKGROUND: e-learning is established in many medical schools. However, the effectiveness of e-learning has been difficult to quantify, and there have been concerns that such educational activities may be driven more by novelty than by pedagogical evidence. Where some domains may lend themselves well to e-learning, clinical skills has been considered a challenging area for online learning. AIMS: The aims of this study were to assess undergraduate medical students' perceived level of IT ability and accessibility, and their attitudes towards e-learning in basic clinical skills education, compared to other teaching methods. METHODS: A self-administered questionnaire was developed to capture undergraduate medical students': (i) demographic details; (ii) perceived level of IT ability and accessibility; and (iii) experiences and attitudes towards e-learning and clinical skills training. Responses were linked to students' performance in a clinical skills OSCE. RESULTS: The majority of students reported good access to computers and the internet, both on and off campus, and appeared confident using IT. Overall, students felt that e-learning had a positive impact on their learning of clinical skills and was comparable to other traditional forms of clinical skills teaching. Students who displayed deep learning traits when using e-learning performed better in clinical skills OSCEs. CONCLUSION: Undergraduate medical students value the use of e-learning in clinical skills education; however, they vary in their utilization of such learning environments. Students rate e-learning just as highly as other traditional methods of clinical skills teaching and acknowledge its integration in a blended approach. Developers of clinical skills curricula need to ensure e-learning environments utilize media that encourage deeper approaches to learning.
Subjects
Computer-Assisted Instruction/methods, Undergraduate Medical Education/organization & administration, Health Knowledge, Attitudes, Practice, Internet/statistics & numerical data, Medical Students/statistics & numerical data, Adult, Clinical Competence, Curriculum/standards, Educational Measurement, Female, Humans, Male, Educational Models, Problem-Based Learning/organization & administration, Medical Schools/organization & administration, Medical Students/psychology, Surveys and Questionnaires, United Kingdom, Young Adult
ABSTRACT
Treatment-resistant depression continues to pose a major medical challenge, as up to one-third of patients with major depressive disorder fail to have an adequate response to standard pharmacotherapies. An improved understanding of the complex circuitry underlying depressive disorders has fostered an explosion in the development of new, nonpharmacological approaches. Each of these treatments seeks to restore normal brain activity via electrical or magnetic stimulation. In this article, the authors discuss the ongoing evolution of neurostimulatory treatments for treatment-resistant depression, reviewing the methods, efficacy, and current research on electroconvulsive therapy, repetitive transcranial magnetic stimulation, magnetic seizure therapy, focal electrically administered seizure therapy, transcranial direct current stimulation, chronic epidural cortical stimulation, and vagus nerve stimulation. Special attention is given to deep brain stimulation, the most focally targeted approach; its history, purported mechanisms of action, and current research are outlined in detail. Although deep brain stimulation is the most invasive of the neurostimulatory treatments developed to date, it may hold significant promise in alleviating symptoms and improving quality of life for patients with the most severe and disabling mood disorders.