1.
Eur J Clin Pharmacol; 77(7): 1049-1055, 2021 Jul.
Article in English | MEDLINE | ID: mdl-33452584

ABSTRACT

PURPOSE: Thiazide diuretics are the most common cause of drug-induced hyponatremia. However, population-based studies on clinical outcomes are lacking. We therefore explored the time course and absolute risk of thiazide-associated hospitalization due to hyponatremia in Sweden. METHODS: Population-based case-control study including patients hospitalized with a principal diagnosis of hyponatremia (n = 11,213) compared with controls (n = 44,801). Linkage of registers was used to acquire data. Multivariable regression was applied to explore time-dependent associations between thiazide diuretics and hospitalization due to hyponatremia. Attributable risks were calculated to assess the disease burden attributable to thiazides. RESULTS: Individuals initiating thiazide treatment faced an immediate increase in the risk of hospitalization, with an adjusted odds ratio (aOR) (95% CI) of 48 (28-89). The association gradually declined, reaching an aOR of 2.9 (2.7-3.1) for individuals treated for longer than 13 weeks. The attributable risk of hyponatremia-associated hospitalization due to thiazides of any treatment length was 27% (3095/11,213). Among 806 patients initiating treatment < 90 days before hospitalization, hyponatremia could be attributed to thiazides in 754. Based on nationwide data, 616,678 individuals were initiated on thiazides during the 8-year study period, suggesting an absolute risk of 0.12% (754/616,678) for subsequent hospitalization with a main diagnosis of hyponatremia. CONCLUSIONS: Thiazide diuretics contributed to more than one in four hospitalizations due to hyponatremia. The risk increase was very pronounced during the first month of treatment and then gradually declined, without returning to normal. However, the absolute risk of developing hyponatremia requiring hospitalization may be modest for most individuals.
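
The headline figures in this abstract follow from simple ratios; below is a minimal arithmetic sketch (not the authors' code) reproducing the reported attributable and absolute risks.

```python
# Attributable risk among hospitalized cases: attributed cases / all cases
cases_attributable = 3095     # hospitalizations attributed to thiazides (any treatment length)
cases_total = 11213           # all hospitalizations with a principal diagnosis of hyponatremia
print(f"Attributable risk: {cases_attributable / cases_total:.1%}")   # ~27.6%, reported as 27%

# Absolute risk among new users: attributable new-user cases / all initiators
new_user_cases = 754          # initiators < 90 days before hospitalization, attributable to thiazides
initiators = 616678           # individuals initiated on thiazides over the 8-year period
print(f"Absolute risk among initiators: {new_user_cases / initiators:.2%}")  # ~0.12%
```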


Subject(s)
Hospitalization/statistics & numerical data , Hyponatremia/chemically induced , Sodium Chloride Symporter Inhibitors/adverse effects , Adolescent , Adult , Aged , Aged, 80 and over , Female , Humans , Male , Middle Aged , Retrospective Studies , Risk Factors , Sweden/epidemiology , Young Adult
2.
Pharmacoepidemiol Drug Saf; 29(1): 77-83, 2020 Jan.
Article in English | MEDLINE | ID: mdl-31730289

ABSTRACT

PURPOSE: In a patient with clinically significant hyponatremia without other clear causes, thiazide treatment should be replaced with another drug. Data describing the extent to which this is done are scarce. The aim of this study was to investigate sociodemographic and socioeconomic factors that may be of importance for the withdrawal of thiazide diuretics in patients hospitalized due to hyponatremia. METHODS: The study population was sampled from a case-control study investigating individuals hospitalized with a main diagnosis of hyponatremia. For every case, four matched controls were included. In the present study, cases (n = 5204) and controls (n = 7425) that had been dispensed a thiazide diuretic prior to the index date were identified and followed forward with respect to further dispensations. To investigate the influence of socioeconomic and sociodemographic factors, multiple logistic regression was used. RESULTS: The crude prevalence of thiazide withdrawal was 71.9% for cases and 10.8% for controls. Thiazide diuretics were more often withdrawn in medium-sized towns (adjusted OR, 1.52; 95% CI, 1.21-1.90) and rural areas (aOR, 1.81; 95% CI, 1.40-2.34) compared with metropolitan areas, and less often among divorced individuals (aOR, 0.72; 95% CI, 0.53-0.97). However, education, employment status, income, age, country of birth, and gender did not influence withdrawal of thiazides among patients with hyponatremia. CONCLUSIONS: Thiazide diuretics were discontinued in almost three out of four patients hospitalized due to hyponatremia. Education, income, gender, and most other sociodemographic and socioeconomic factors were not associated with withdrawal of thiazides.
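
A hedged sketch of the kind of multiple logistic regression the methods describe, modelling thiazide withdrawal against sociodemographic covariates; the file and column names are illustrative assumptions, not the study's actual variables.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("thiazide_withdrawal.csv")  # hypothetical analysis file, one row per individual

# 'withdrawn' = 1 if no further thiazide dispensation after the index date
model = smf.logit(
    "withdrawn ~ C(residence_area) + C(marital_status) + C(education) "
    "+ C(employment) + income + age + C(country_of_birth) + C(sex)",
    data=df,
).fit()

# Exponentiated coefficients correspond to the adjusted odds ratios reported
print(np.exp(model.params).round(2))
print(np.exp(model.conf_int()).round(2))
```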


Subject(s)
Hospitalization , Hypertension/drug therapy , Hyponatremia/epidemiology , Practice Patterns, Physicians'/statistics & numerical data , Sodium Chloride Symporter Inhibitors/adverse effects , Adult , Aged , Aged, 80 and over , Case-Control Studies , Female , Humans , Hypertension/blood , Hyponatremia/chemically induced , Male , Middle Aged , Pharmacoepidemiology , Registries , Risk Factors , Sweden/epidemiology , Young Adult
3.
Acta Paediatr; 108(11): 2001-2007, 2019 Nov.
Article in English | MEDLINE | ID: mdl-31140196

ABSTRACT

AIM: We investigated the association between low Apgar score, other perinatal characteristics and low stress resilience in adolescence. A within-siblings analysis was used to address unmeasured shared familial confounding. METHODS: We used a national cohort of 527 763 males born in Sweden between 1973 and 1992 who undertook military conscription assessments at a mean age of 18 years (range 17-20). Conscription examinations included a measure of stress resilience. Information on Apgar score and other perinatal characteristics was obtained through linkage with the Medical Birth Register. Analyses were conducted using ordinary least squares and fixed-effects linear regression models adjusted for potential confounding factors. RESULTS: Infants with a prolonged low Apgar score at five minutes had an increased risk of low stress resilience in adolescence compared with those with the highest scores at one minute, with an adjusted coefficient and 95% confidence interval of -0.26 (-0.39, -0.13). The associations were no longer statistically significant when using within-siblings models. However, the association between birthweight and stress resilience remained statistically significant in all analyses. CONCLUSION: The association with low Apgar score seems to be explained by confounding due to shared childhood circumstances among siblings from the same family, while low birthweight is independently associated with low stress resilience.
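
A minimal sketch, with assumed variable and file names, of the two model families the methods mention: ordinary least squares on the full cohort, and a within-siblings fixed-effects model in which family indicators absorb everything brothers share.

```python
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("conscripts.csv")  # hypothetical file: one row per conscript, with a family_id

# Ordinary least squares across the whole cohort, adjusted for measured confounders
ols = smf.ols(
    "stress_resilience ~ C(apgar5_category) + birthweight + gestational_age + maternal_age",
    data=df,
).fit()

# Within-siblings model: C(family_id) adds a fixed effect per family, so only
# differences between brothers identify the coefficients (simple but memory-hungry)
fe = smf.ols(
    "stress_resilience ~ C(apgar5_category) + birthweight + gestational_age "
    "+ maternal_age + C(family_id)",
    data=df,
).fit()

print(ols.params["birthweight"], fe.params["birthweight"])
```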


Subject(s)
Apgar Score , Resilience, Psychological , Stress, Psychological/genetics , Adolescent , Cohort Studies , Confounding Factors, Epidemiologic , Humans , Infant, Newborn , Male , Young Adult
4.
Stroke; 47(9): 2416-8, 2016 Sep.
Article in English | MEDLINE | ID: mdl-27491740

ABSTRACT

BACKGROUND AND PURPOSE: Physical and psychological characteristics in adolescence are associated with subsequent stroke risk. Our aim was to investigate their relevance to length of hospital stay and risk of second stroke. METHODS: Swedish men born between 1952 and 1956 (n=237 879) were followed from 1987 to 2010 using information from population-based national registers. Stress resilience, body mass index, cognitive function, physical fitness, and blood pressure were measured at compulsory military conscription examinations in late adolescence. Joint Cox proportional hazards models estimated the associations of these characteristics with long compared with short duration of stroke-related hospital stay and with second stroke compared with first. RESULTS: Some 3000 men were diagnosed with nonfatal stroke between ages 31 and 58 years. Low stress resilience, underweight, and higher systolic blood pressure (per 1-mm Hg increase) during adolescence were associated with longer hospital stay (compared with shorter) in ischemic stroke, with adjusted hazard ratios (and 95% confidence intervals) of 1.46 (1.08-1.89), 1.41 (1.04-1.91), and 1.01 (1.00-1.02), respectively. Elevated systolic and diastolic blood pressures during adolescence were associated with longer hospital stay in men with intracerebral hemorrhage: 1.01 (1.00-1.03) and 1.02 (1.00-1.04), respectively. Across both stroke types, obesity in adolescence conferred an increased risk of second stroke: 2.06 (1.21-3.45). CONCLUSIONS: Some characteristics relevant to length of stroke-related hospital stay and risk of second stroke are already present in adolescence. Early lifestyle influences matter not only for stroke risk by middle age but also for recurrence and use of healthcare resources among stroke survivors.
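
A hedged illustration (not the registry analysis itself) of fitting a Cox proportional hazards model for time to a second stroke with adolescent characteristics as covariates; the data file and column names are assumptions.

```python
import pandas as pd
from lifelines import CoxPHFitter

df = pd.read_csv("stroke_followup.csv")  # hypothetical: one row per man with a first stroke

cph = CoxPHFitter()
cph.fit(
    df[["years_to_event", "second_stroke", "obese", "low_stress_resilience",
        "systolic_bp", "cognitive_score", "fitness_score"]],
    duration_col="years_to_event",  # time from first stroke to second stroke or censoring
    event_col="second_stroke",      # 1 = second stroke observed, 0 = censored
)
cph.print_summary()  # the exp(coef) column corresponds to the reported hazard ratios
```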


Subject(s)
Blood Pressure/physiology , Length of Stay , Obesity/complications , Resilience, Psychological , Stroke/therapy , Thinness/complications , Adolescent , Adult , Humans , Male , Middle Aged , Registries , Risk Factors , Stroke/etiology , Sweden
5.
J Neurol Neurosurg Psychiatry; 85(12): 1331-6, 2014 Dec.
Article in English | MEDLINE | ID: mdl-24681701

ABSTRACT

OBJECTIVE: Exposure to psychosocial stress has been identified as a possible stroke risk factor, but the role of stress resilience, which may be relevant to chronic exposure, is uncertain. We investigated the association of stress resilience in adolescence with subsequent stroke risk. METHODS: Register-based cohort study. Some 237 879 males born between 1952 and 1956 were followed from 1987 to 2010 using information from Swedish registers. Cox regression estimated the association of stress resilience with stroke, after adjustment for established stroke risk factors. RESULTS: Some 3411 diagnoses of first stroke were identified. The lowest stress resilience (21.8%) compared with the highest (23.7%) was associated with increased stroke risk, producing an unadjusted HR (95% CI) of 1.54 (1.40 to 1.70). The association attenuated slightly to 1.48 (1.34 to 1.63) after adjustment for markers of socioeconomic circumstances in childhood, and to 1.30 (1.18 to 1.45) after further adjustment for markers of development and disease in adolescence (blood pressure, cognitive function and pre-existing cardiovascular disease). The greatest reduction, to 1.16 (1.04 to 1.29), followed further adjustment for markers of physical fitness (BMI and physical working capacity) in adolescence. The results were consistent when stroke was subdivided into fatal, ischaemic and haemorrhagic, with stronger associations for fatal than for non-fatal stroke, and for haemorrhagic than for ischaemic stroke. CONCLUSIONS: Stress susceptibility and, therefore, psychosocial stress may be implicated in the aetiology of stroke. This association may be explained, in part, by poorer physical fitness. Effective prevention might focus on behaviour/lifestyle and psychosocial stress.


Subject(s)
Resilience, Psychological , Stress, Psychological/complications , Stroke/etiology , Adolescent , Adult , Cohort Studies , Humans , Male , Middle Aged , Proportional Hazards Models , Registries , Risk Factors , Socioeconomic Factors , Stroke/epidemiology , Sweden/epidemiology
6.
JMIR Serious Games; 11: e44348, 2023 Aug 10.
Article in English | MEDLINE | ID: mdl-37561558

ABSTRACT

BACKGROUND: Eating disorders and obesity are serious health problems with poor treatment outcomes and high relapse rates despite well-established treatments. Several studies have suggested that virtual reality technology could enhance current treatment outcomes and could be used as an adjunctive tool in their treatment. OBJECTIVE: This study aims to investigate the differences between eating virtual and real-life meals and to test the hypothesis that eating a virtual meal can reduce hunger among healthy women. METHODS: The study included 20 healthy women and used a randomized crossover design. The participants were asked to eat 1 introduction meal, 2 real meals, and 2 virtual meals, all containing real or virtual meatballs and potatoes. The real meals were eaten from a plate placed on a scale that communicated with analytical software on a computer. The virtual meals were eaten in a room where participants were seated on a real chair in front of a real table and fitted with the virtual reality equipment. Eating behavior during both the real and virtual meals was filmed. Hunger was measured before and after the meals using questionnaires. RESULTS: There was a significant difference in hunger from baseline to after the real meal (mean difference=61.8, P<.001) but no significant change in hunger from before to after the virtual meal (mean difference=6.9, P=.10). There was no significant difference in food intake between the virtual and real meals (mean difference=36.8, P=.07). Meal duration was significantly shorter for the virtual meal (mean difference=-5.4, P<.001), which led to a higher eating rate (mean difference=82.9, P<.001). Some participants took bites and chewed during the virtual meal, but the number of bites and chews was lower than in the real meal. Meal duration decreased from the first virtual meal to the second, but no significant difference was observed between the 2 real meals. CONCLUSIONS: Eating a virtual meal does not appear to significantly reduce hunger in healthy individuals. Nor does the methodology reproduce real-life eating behavior exactly, although it does evoke chewing and biting in some individuals. TRIAL REGISTRATION: ClinicalTrials.gov NCT05734209, https://clinicaltrials.gov/ct2/show/NCT05734209.
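
A minimal sketch, under assumed column names, of the within-subject comparisons implied by the crossover design: paired tests of hunger before versus after each meal type, and of meal duration between conditions.

```python
import pandas as pd
from scipy import stats

df = pd.read_csv("vr_meals.csv")  # hypothetical: one row per participant, wide format

# Hunger change within each condition (paired across participants)
real = stats.ttest_rel(df["hunger_pre_real"], df["hunger_post_real"])
virtual = stats.ttest_rel(df["hunger_pre_virtual"], df["hunger_post_virtual"])
print("real meal p =", real.pvalue, "| virtual meal p =", virtual.pvalue)

# Meal duration, virtual vs real (paired)
duration = stats.ttest_rel(df["duration_virtual_min"], df["duration_real_min"])
print("duration difference p =", duration.pvalue)
```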

7.
Vaccines (Basel); 11(9), 2023 Aug 25.
Article in English | MEDLINE | ID: mdl-37766096

ABSTRACT

Influenza vaccines are designed to mimic natural influenza virus exposure and stimulate a long-lasting immune response to future infections. The evolving nature of the influenza virus makes vaccination an important and efficacious strategy to reduce healthcare-related complications of influenza. Several lines of evidence indicate that influenza vaccination may induce nonspecific effects, also referred to as heterologous or pleiotropic effects, that go beyond protection against infection. Different explanations are proposed, including the upregulation and downregulation of cytokines and epigenetic reprogramming in monocytes and natural killer cells, imprinting an immunological memory in the innate immune system, a phenomenon termed "trained immunity". Also, cross-reactivity between related stimuli and bystander activation, which entails activation of B and T lymphocytes without specific recognition of antigens, may play a role. In this review, we will discuss the possible nonspecific effects of influenza vaccination in cardiovascular disease, type 1 diabetes, cancer, and Alzheimer's disease, future research questions, and potential implications. A discussion of the potential effects on infections by other pathogens is beyond the scope of this review.

8.
BMC Public Health; 12: 351, 2012 May 14.
Article in English | MEDLINE | ID: mdl-22583917

ABSTRACT

BACKGROUND: Speed of eating, an important aspect of eating behaviour, has recently been related to loss of control of food intake and obesity. Very little time is allocated for lunch at school, so children may eat more quickly, which may in turn affect food intake. Study 1 measured the time spent eating lunch in a large group of students eating school meals together. Study 2 measured the speed of eating and the amount of food eaten by individual school children during normal school lunches and then examined the effect of experimentally increasing or decreasing the speed of eating on total food intake. METHODS: The time spent eating lunch was measured with a stopwatch in 100 children in secondary school. A more detailed study of eating behaviour was then undertaken in 30 secondary school children (18 girls). The amount of food eaten at lunch was recorded by a hidden scale when the children ate amongst their peers and by a scale connected to a computer when they ate individually. When eating individually, feedback on how quickly to eat was visible on the computer screen. The speed of eating could therefore be increased or decreased experimentally using this visual feedback, and the total amount of food eaten was measured. RESULTS: In general, the children spent very little time eating their lunch. The 100 children in Study 1 spent on average (SD) just 7 (0.8) minutes eating lunch. The girls in Study 2 consumed their lunch in 5.6 (1.2) minutes and the boys ate theirs in only 6.8 (1.3) minutes. Eating with peers markedly distorted the amount of food eaten for lunch; only two girls and one boy maintained their food intake at the level observed when the children ate individually without external influences (258 (38) g in girls and 289 (73) g in boys). Nine girls ate on average 33% less food and seven girls ate 23% more food, whilst the remaining boys ate 26% more food. The average speed of eating during school lunches amongst groups increased to 183 (53)% in the girls and to 166 (47)% in the boys compared with the speed of eating in the unrestricted condition. These apparent changes in food intake during school lunches could be replicated by experimentally increasing the speed of eating when the children were eating individually. CONCLUSIONS: If insufficient time is allocated for consuming school lunches, a compensatory increase in the speed of eating puts children at risk of losing control over food intake and, in many cases, of over-eating. Public health initiatives to increase the time available for school meals might prove a relatively easy way to reduce excess food intake at school and enable children to eat more healthily.


Subject(s)
Eating , Feeding Behavior , Lunch , Schools , Students/psychology , Adolescent , Child , Female , Humans , Male , Obesity/epidemiology , Peer Group , Social Behavior , Time Factors
9.
J Vis Exp; (183), 2022 May 10.
Article in English | MEDLINE | ID: mdl-35635472

ABSTRACT

Eating disorders (anorexia nervosa, bulimia nervosa, binge-eating disorder, and other specified feeding or eating disorders) have a combined prevalence of 13% and are associated with severe physical and psychosocial problems. Early diagnosis, which is important for effective treatment and the prevention of undesirable long-term health consequences, is difficult for non-specialist clinicians unfamiliar with these patients, such as those working in primary care. Early, accurate diagnosis, particularly in primary care, allows expert intervention early enough in the disorder to facilitate positive treatment outcomes. Computer-assisted diagnostic procedures offer a possible solution to this problem by providing expertise via an algorithm developed from a large number of cases diagnosed in person by expert diagnosticians and caregivers. A web-based system for determining an accurate diagnosis for patients suspected of suffering from an eating disorder was developed based on these data. The process is automated using an algorithm that estimates the respondent's probability of having an eating disorder and the type of eating disorder the individual has. The system provides a report that serves as an aid for clinicians during the diagnostic process and as an educational tool for new clinicians.
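
The abstract does not specify the underlying algorithm, so the following is only a schematic illustration of the general idea: a probabilistic classifier trained on expert-diagnosed cases that returns both the probability of an eating disorder and the most likely subtype for a new respondent. All names and the choice of model are hypothetical.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Hypothetical expert-labelled training data: questionnaire features and diagnoses
# coded as 0 = no eating disorder, 1 = AN, 2 = BN, 3 = BED, 4 = OSFED
X_train = np.load("questionnaire_features.npy")
y_train = np.load("expert_diagnoses.npy")

clf = LogisticRegression(max_iter=1000)
clf.fit(X_train, y_train)

respondent = np.load("new_respondent.npy").reshape(1, -1)
probs = clf.predict_proba(respondent)[0]          # columns follow clf.classes_ (sorted: 0..4)
print("P(any eating disorder):", 1 - probs[0])
print("most likely diagnosis code:", clf.classes_[probs.argmax()])
```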


Subject(s)
Anorexia Nervosa , Binge-Eating Disorder , Bulimia Nervosa , Feeding and Eating Disorders , Anorexia Nervosa/diagnosis , Anorexia Nervosa/psychology , Anorexia Nervosa/therapy , Binge-Eating Disorder/diagnosis , Binge-Eating Disorder/psychology , Binge-Eating Disorder/therapy , Bulimia Nervosa/diagnosis , Bulimia Nervosa/psychology , Bulimia Nervosa/therapy , Computers , Feeding and Eating Disorders/diagnosis , Humans
10.
Nutrients; 14(19), 2022 Sep 27.
Article in English | MEDLINE | ID: mdl-36235651

ABSTRACT

Probiotic and omega-3 supplements have been shown to reduce inflammation, and dual supplementation may have synergistic health effects. We investigated whether the novel combination of a multi-strain probiotic (containing B. lactis Bi-07, L. paracasei Lpc-37, L. acidophilus NCFM, and B. lactis Bl-04) alongside omega-3 supplements reduces low-grade inflammation, as measured by high-sensitivity C-reactive protein (hs-CRP), in elderly participants in a proof-of-concept, randomized, placebo-controlled, parallel study (NCT04126330). In total, 76 community-dwelling elderly participants (median: 71.0 years; IQR: 68.0-73.8) underwent an intervention with the dual supplement (n = 37) or placebo (n = 39) for eight weeks. In addition to hs-CRP, cytokine levels and intestinal permeability were assessed at baseline and after the eight-week intervention. No significant difference in hs-CRP was seen between the dual supplement group and placebo. Interestingly, however, supplementation did result in a significant increase in the level of the anti-inflammatory marker IL-10. In addition, dual supplementation increased levels of valeric acid, further suggesting the potential of the supplements to reduce inflammation and confer health benefits. Together, the results suggest that probiotic and omega-3 dual supplementation exerts modest effects on inflammation and may have potential use as a non-pharmacological treatment for low-grade inflammation in the elderly.


Subject(s)
Fatty Acids, Omega-3 , Probiotics , Aged , C-Reactive Protein/metabolism , Dietary Supplements , Double-Blind Method , Humans , Inflammation/drug therapy , Interleukin-10
11.
Microorganisms; 9(6), 2021 Jun 21.
Article in English | MEDLINE | ID: mdl-34205818

ABSTRACT

Increasing evidence suggests that probiotic supplementation may be efficacious in counteracting age-related shifts in gut microbiota composition and diversity, thereby impacting health outcomes and promoting healthy aging. However, randomized controlled trials (RCTs) with probiotics in healthy older adults have utilized a wide variety of strains and focused on several different outcomes, with conflicting results. Therefore, a systematic review was conducted to determine which outcomes have been investigated in randomized controlled trials of probiotic supplementation in healthy older adults and what the effects of these interventions have been. For inclusion, studies reporting on randomized controlled trials with probiotic and synbiotic supplements in healthy older adults (defined as a minimum age of 60 years) were considered. Studies reporting clinical trials in specific patient groups or unhealthy participants were excluded. In addition to assessment of eligibility and data extraction, each study was examined for risk of bias, and quality assessment was performed by two independent reviewers. Due to the heterogeneity of outcomes, strains, study design, duration, and methodology, we did not perform any meta-analyses and instead provide a narrative overview of the outcomes examined. Of 1997 potentially eligible publications, 17 studies were included in this review. The risk of bias was low, although several studies failed to adequately describe random sequence generation, allocation concealment, and blinding. The overall study quality was high; however, many studies did not include sample size calculations, and the majority of studies had a small sample size. The main outcomes examined in the trials included microbiota composition, immune-related measurements, digestive health, general well-being, cognitive function, and lipid and other biomarkers. The most commonly assessed outcome, with the most consistent effect, was microbiota composition; all but one study with this outcome showed significant effects on gut microbiota composition in healthy older adults. Overall, probiotic supplementation had modest effects on markers of humoral immunity and on immune cell population levels and activity, as well as on the incidence and duration of the common cold and other infections, with some conflicting results. Digestive health, general well-being, cognitive function, and lipid and other biomarkers were investigated in a very small number of studies; therefore, the impact on these outcomes remains inconclusive. Probiotics appear to be efficacious in modifying gut microbiota composition in healthy older adults and have moderate effects on immune function. However, the effect of probiotic supplementation on other health outcomes remains inconclusive, highlighting the need for more well-designed, sufficiently powered studies to investigate whether, and by which mechanisms, probiotics impact healthy aging.

12.
PLoS One; 16(11): e0260077, 2021.
Article in English | MEDLINE | ID: mdl-34784383

ABSTRACT

BACKGROUND: Individuals with anorexia nervosa are often described as restless, hyperactive and having disturbed sleep. The reproducibility and generalisability of previous results are low due to the use of unreliable methods and of varying measurement methods and outcome measures. A reliable method to measure both physical activity and sleep is accelerometry. The main purpose of the study was to quantify the physical activity and sleeping behaviour of patients with anorexia nervosa. A secondary purpose was to increase the reproducibility and generalisability of the results. MATERIAL AND METHODS: Accelerometer data were collected during the first week of inpatient treatment of anorexia nervosa. Raw data from the Axivity AX3 accelerometer were analysed with the open-source package GGIR in the free statistical software R. Accelerometer measurements were transformed into the Euclidean norm minus one, with negative values rounded to zero (ENMO). Physical activity measurements of interest were 24-h average ENMO, daytime average ENMO, inactivity, light activity, moderate activity, and vigorous activity. Sleep parameters of interest were sleep duration, sleep efficiency, awakenings, and wake after sleep onset. The sleep duration of different age groups was compared with recommendations by the National Sleep Foundation using Fisher's exact test. RESULTS: Of 67 patients, 58 (93% female) were included in the analysis after exclusions for data quality. The average age of participants was 17.8 (±6.9) years and body mass index was 15.5 (±1.9) kg/m2. Daytime average ENMO was 17.4 (±5.1) mg. Participants spent 862.6 (±66.2) min per day inactive, 88.4 (±22.6) min in light activities, 25.8 (±16.7) min in moderate activities and 0.5 (±1.8) min in vigorous activities. Participants slept for 461.0 (±68.4) min, waking up 1.45 (±1.25) times per night for 54.6 (±35.8) min, with an average sleep efficiency of 0.88 (±0.10). Only 31% of participants met sleep recommendations, with significantly more 6-13-year-old patients than 14-25-year-old patients failing to reach the recommendations. CONCLUSION: The patient group spent most of their time inactive at the beginning of treatment. Most patients failed to reach sleep recommendations. The use of raw data and open-source software should ensure result reproducibility and enable comparison across points in treatment and with healthy individuals.
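
A minimal sketch of the ENMO metric described in the methods: the Euclidean norm of the three acceleration axes minus 1 g, with negative values rounded to zero (the definition used by the GGIR package); the example data are made up.

```python
import numpy as np

def enmo(x, y, z):
    """x, y, z: raw accelerations in g-units (arrays of equal length)."""
    norm = np.sqrt(x**2 + y**2 + z**2)
    return np.maximum(norm - 1.0, 0.0)

# A still device reads ~1 g total, so ENMO is ~0; movement pushes the norm above
# 1 g. Here one second of 100 Hz data with a slight excess gives ~20 mg.
x, y, z = np.zeros(100), np.zeros(100), np.full(100, 1.02)
print(enmo(x, y, z).mean() * 1000, "mg")   # ~20 mg
```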


Subject(s)
Anorexia Nervosa/physiopathology , Exercise/physiology , Sleep/physiology , Accelerometry , Adolescent , Adult , Anorexia Nervosa/psychology , Anorexia Nervosa/therapy , Child , Exercise/psychology , Female , Humans , Inpatients , Male , Reproducibility of Results , Sleep Quality , Young Adult
14.
JMIR Serious Games; 9(2): e24998, 2021 Apr 13.
Article in English | MEDLINE | ID: mdl-33847593

ABSTRACT

BACKGROUND: Anorexia nervosa is one of the more severe eating disorders, characterized by reduced food intake leading to emaciation and psychological maladjustment. Treatment outcomes are often discouraging, with most interventions displaying a recovery rate below 50%, a dropout rate of 20% to 50%, and a high risk of relapse. Patients with anorexia nervosa often display anxiety and aversive behaviors toward food. Virtual reality has been successful in treating vertigo, anxiety disorder, and posttraumatic stress disorder, and could potentially be used as an aid in treating eating disorders. OBJECTIVE: The aim of this study was to evaluate the feasibility and usability of an immersive virtual reality technology administered through an app for use by patients with eating disorders. METHODS: Twenty-six participants, including 19 eating disorder clinic personnel and 5 information technology personnel, were recruited through emails and personal invitations. Participants handled virtual food and utensils in an app using immersive virtual reality technology comprising a headset and two hand controllers. In the app, the participants learned about the available actions through a tutorial and were introduced to a food challenge. The challenge consisted of a meal type (meatballs, potatoes, sauce, and lingonberries) that is typically difficult for patients with anorexia nervosa to eat in real life. Participants were instructed, via visual feedback from the app, to eat at a healthy rate, which is also a challenge for patients. Participants rated the feasibility and usability of the app by responding to the mHealth Evidence Reporting and Assessment checklist, the 10-item System Usability Scale, and the 20-point heuristic evaluation questionnaire. A cognitive walkthrough was performed using video recordings of participant interactions in the virtual environment. RESULTS: The mean age of participants was 37.9 (SD 9.7) years. Half of the participants had previous experience with virtual reality. Answers to the mHealth Evidence Reporting and Assessment checklist suggested that implementation of the app would face minor infrastructural, technological, interoperability, financial, and adoption problems. There was some disagreement on intervention delivery, specifically regarding frequency of use; however, most of the participants agreed that the app should be used at least once per week. On the System Usability Scale, the app received a mean score of 73.4 (range 55-90), earning an overall "good" rating. The mean score of single items on the heuristic evaluation questionnaire was 3.6 out of 5, with the lowest score (2.6) given to the "accuracy" item. During the cognitive walkthrough, 32% of the participants displayed difficulty in understanding what to do at the initial selection screen. However, after passing the selection screen, all participants understood how to progress through the tasks. CONCLUSIONS: Participants found the app usable, and eating disorder personnel were positive regarding its fit with current treatment methods. Beyond the food item challenges in the current app, participants considered that the app should be improved to offer environmental and social challenges (eg, eating in a crowded room vs eating alone).
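
A small sketch of the conventional scoring formula for the 10-item System Usability Scale used in the study (this is the standard SUS calculation, not code from the paper): odd-numbered items contribute (response - 1), even-numbered items contribute (5 - response), and the sum is multiplied by 2.5 to give a 0-100 score.

```python
def sus_score(responses):
    """responses: the 10 SUS answers on a 1-5 scale, item 1 first."""
    assert len(responses) == 10
    total = sum((r - 1) if i % 2 == 1 else (5 - r)
                for i, r in enumerate(responses, start=1))
    return total * 2.5

# Example respondent: agrees with the positively worded (odd) items and
# disagrees with the negatively worded (even) items, yielding a "good" score
print(sus_score([4, 2, 4, 2, 4, 2, 4, 1, 4, 2]))  # 77.5
```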

15.
Trials; 22(1): 338, 2021 May 10.
Article in English | MEDLINE | ID: mdl-33971938

ABSTRACT

BACKGROUND: Bilberries from Sweden, rich in polyphenols, have shown cholesterol-lowering effects in small studies, and the cholesterol-lowering properties of oats, with abundant beta-glucans and potentially bioactive phytochemicals, are well established. Both may provide cardiometabolic benefits following acute myocardial infarction (AMI), but large studies of adequate statistical power and appropriate duration are needed to confirm clinically relevant treatment effects. No previous study has evaluated the potential additive or synergistic effects of bilberry combined with oats on cardiometabolic risk factors. Our primary objective is to assess the cardioprotective effects of diet supplementation with dried bilberry or with bioprocessed oat bran, with a secondary explorative objective of assessing their combination, compared with a neutral isocaloric reference supplement, initiated within 5 days following percutaneous coronary intervention (PCI) for AMI. METHODS: The effects of Bilberry and Oat intake on lipids, inflammation and exercise capacity after Acute Myocardial Infarction (BIOAMI) trial is a double-blind, randomized, placebo-controlled clinical trial. A total of 900 patients will be randomized post-PCI to one of four dietary intervention arms. After randomization, subjects will receive beverages with bilberry powder (active), beverages with high-fiber bioprocessed oat bran (active), beverages with bilberry and oats combined (active), or reference beverages containing no active bilberry or active oats, for consumption twice daily during a 3-month intervention. The primary endpoint is the difference in LDL cholesterol change between the intervention groups after 3 months. The major secondary endpoint is exercise capacity at 3 months. Other secondary endpoints include plasma concentrations of biochemical markers of inflammation, metabolomics, and gut microbiota composition after 3 months. DISCUSSION: Controlling hyperlipidemia and inflammation is critical to preventing new cardiovascular events, but novel pharmacological treatments for these conditions are expensive and associated with negative side effects. If bilberry and/or oat, in addition to standard medical therapy, can lower LDL cholesterol and inflammation more than standard therapy alone, this could be a cost-effective and safe dietary strategy for secondary prevention after AMI. TRIAL REGISTRATION: ClinicalTrials.gov NCT03620266. Registered on August 8, 2018.


Subject(s)
Myocardial Infarction , Percutaneous Coronary Intervention , Vaccinium myrtillus , Avena , Double-Blind Method , Exercise Tolerance , Humans , Inflammation/diagnosis , Inflammation/prevention & control , Lipids , Myocardial Infarction/diagnosis , Randomized Controlled Trials as Topic , Sweden
16.
Mol Nutr Food Res; 64(20): e2000108, 2020 Oct.
Article in English | MEDLINE | ID: mdl-32846041

ABSTRACT

SCOPE: A diet rich in bilberries is considered cardioprotective, but the mechanisms of action are poorly understood. Cardiovascular disease is characterized by an increased proatherogenic status and high levels of circulating microvesicles (MVs). In an open-label study, patients with myocardial infarction received an 8-week dietary supplementation with bilberry extract (BE). The effect of BE on patient MV levels and its influence on endothelial vesiculation in vitro are investigated. METHODS AND RESULTS: MVs are captured with acoustic trapping, and platelet-derived MVs (PMVs) as well as endothelial-derived MVs (EMVs) are quantified with flow cytometry. The in vitro effect of BE on endothelial extracellular vesicle (EV) release is examined using endothelial cells and calcein staining. The mechanisms of BE influence on vesiculation pathways are studied by Western blot and qRT-PCR. Supplementation with BE decreased both PMVs and EMVs. Furthermore, BE reduced endothelial EV release, Akt phosphorylation, and vesiculation-related gene transcription. It also protected the cells from P2X7-induced EV release and increases in vesiculation-related gene expression. CONCLUSION: BE supplementation improves the MV profile in patient blood and reduces endothelial vesiculation through several molecular mechanisms related to the P2X7 receptor. The findings provide new insight into the cardioprotective effects of bilberries.


Subject(s)
Dietary Supplements , Extracellular Vesicles , Myocardial Infarction/blood , Myocardial Infarction/diet therapy , Vaccinium myrtillus , Aged , Blood Platelets/cytology , Blood Proteins/metabolism , Cell-Derived Microparticles/drug effects , Endothelial Cells/drug effects , Endothelial Cells/metabolism , Female , Gene Expression , Hematologic Tests/methods , Human Umbilical Vein Endothelial Cells , Humans , Male , Myocardial Infarction/physiopathology , Nanoparticles , Phosphorylation/drug effects , Receptors, Purinergic P2X7/genetics
17.
Comput Methods Programs Biomed; 194: 105485, 2020 Oct.
Article in English | MEDLINE | ID: mdl-32464588

ABSTRACT

BACKGROUND & OBJECTIVE: The study of eating behavior has made significant progress towards understanding the association of specific eating behavioral patterns with medical problems, such as obesity and eating disorders. Smartphones have shown promise in monitoring and modifying unhealthy eating behavior patterns, often with the help of sensors for behavior data recording. However, when it comes to semi-controlled deployment settings, smartphone apps that facilitate eating behavior data collection are missing. To fill this gap, the present work introduces ASApp, one of the first smartphone apps to support researchers in the collection of heterogeneous objective (sensor-acquired) and subjective (self-reported) eating behavior data in an integrated manner from large-scale, naturalistic human subject research (HSR) studies. METHODS: This work presents the overarching and deployment-specific requirements that have driven the design of ASApp, followed by the heterogeneous eating behavior dataset that is collected and the employed data collection protocol. The collected dataset combines objective and subjective behavior information, namely (a) dietary self-assessment information, (b) the food weight timeseries throughout an entire meal (using a portable weight scale connected wirelessly), (c) a photograph of the meal, and (d) a series of quantitative eating behavior indicators, mainly calculated from the food weight timeseries. The designed data collection protocol is quick, straightforward, robust and capable of satisfying the requirements of semi-controlled HSR deployment. RESULTS: The implemented functionalities of ASApp for research assistants and study participants are presented in detail along with the corresponding user interfaces. ASApp has been successfully deployed for data collection in an in-house testing study and in the SPLENDID study, i.e., a real-life semi-controlled HSR study conducted in the cafeteria of a Swedish high school in the context of an EC-funded research project. The two deployment studies are described, and the promising results from the evaluation of the app with respect to attractiveness, usability, and technical soundness are discussed. Access details for ASApp are also provided. CONCLUSIONS: This work presents the requirements elicitation, design, implementation and evaluation of a novel smartphone application that supports researchers in the integrated collection of a concise yet rich set of heterogeneous eating behavior data for semi-controlled HSR.
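
A hedged sketch of the kind of quantitative eating behavior indicators the abstract says are derived from the food weight timeseries (total intake, meal duration, average eating rate). The data format and names are assumptions for illustration, not the ASApp implementation.

```python
import numpy as np

def meal_indicators(t_seconds, plate_weight_g):
    """t_seconds: sample times; plate_weight_g: plate weight from the connected scale."""
    intake_g = plate_weight_g[0] - plate_weight_g[-1]        # grams eaten over the meal
    duration_min = (t_seconds[-1] - t_seconds[0]) / 60.0     # meal duration in minutes
    rate_g_per_min = intake_g / duration_min                 # average eating rate
    return {"intake_g": intake_g, "duration_min": duration_min, "rate_g_per_min": rate_g_per_min}

# A made-up 10-minute meal sampled every 10 s, plate weight falling from 400 g to 120 g
t = np.arange(0, 610, 10)
w = np.linspace(400, 120, t.size)
print(meal_indicators(t, w))   # intake 280 g over 10 min, 28 g/min
```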


Subject(s)
Feeding and Eating Disorders , Mobile Applications , Feeding Behavior , Humans , Obesity , Smartphone
18.
PLoS One; 15(8): e0236866, 2020.
Article in English | MEDLINE | ID: mdl-32760080

ABSTRACT

INTRODUCTION: Influenza may precipitate cardiovascular disease, but influenza typically peaks in winter, coinciding with other triggers of myocardial infarction (MI) such as low air temperature, high wind velocity, low atmospheric pressure, and short sunshine duration. OBJECTIVE: We aimed to determine the relationship between week-to-week variation in influenza cases and acute MI, controlling for meteorological factors, in a nationwide population. METHODS: Weekly laboratory-confirmed influenza case reports were obtained from the Public Health Agency of Sweden from 2009 to 2016 and merged with the nationwide SWEDEHEART MI registry. Weekly incidence of MI was studied with regard to the number of influenza cases stratified into tertiles of 0-16, 17-163, and ≥164 cases/week. Incidence rate ratios (IRR) were calculated using a count regression model for each category and compared with a non-influenza period as reference, controlling for air temperature, atmospheric pressure, wind velocity, and sunshine duration. RESULTS: A total of 133,562 MI events were reported to the registry during the study period. In unadjusted analyses, weeks with influenza cases were associated with a higher incidence of overall MI, ST-elevation MI and non-ST-elevation MI than weeks without. During the influenza season, weeks with 0-16 reported cases/week were not associated with MI incidence after adjusting for weather parameters; the adjusted IRR for MI was 1.03 (95% CI 1.00-1.06, P = 0.09). However, weeks with more reported cases were associated with MI incidence: for 17-163 reported cases/week, the adjusted IRR was 1.05 (95% CI 1.02-1.08, P = 0.003), and for ≥164 cases/week, the IRR was 1.06 (95% CI 1.02-1.09, P = 0.002). Results were consistent across a large range of subgroups. CONCLUSIONS: In this nationwide observational study, we found an association between the incidence of MI and the incidence of influenza cases beyond what could be explained by meteorological factors.
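
A minimal sketch, with assumed variable names, of the count regression described in the methods: weekly MI counts regressed on influenza-case categories with meteorological covariates, where exponentiated coefficients give the incidence rate ratios.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

weeks = pd.read_csv("weekly_mi_influenza.csv")  # hypothetical: one row per calendar week

model = smf.poisson(
    "mi_count ~ C(influenza_category, Treatment(reference='non_influenza')) "
    "+ air_temperature + atmospheric_pressure + wind_velocity + sunshine_duration",
    data=weeks,
).fit()

irr = np.exp(model.params)        # incidence rate ratios vs the non-influenza reference
ci = np.exp(model.conf_int())     # 95% confidence intervals
print(pd.concat([irr, ci], axis=1).round(3))
```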


Subject(s)
Influenza, Human/diagnosis , Myocardial Infarction/diagnosis , Acute Disease , Aged , Aged, 80 and over , Databases, Factual , Female , Humans , Incidence , Influenza, Human/complications , Influenza, Human/epidemiology , Male , Middle Aged , Myocardial Infarction/complications , Myocardial Infarction/epidemiology , Registries , ST Elevation Myocardial Infarction/complications , ST Elevation Myocardial Infarction/diagnosis , ST Elevation Myocardial Infarction/epidemiology , Sweden/epidemiology
19.
Physiol Behav; 96(2): 270-5, 2009 Feb 16.
Article in English | MEDLINE | ID: mdl-18992760

ABSTRACT

Women were divided into those eating at a decelerated or a linear rate. Eating rate was then experimentally increased or decreased by asking the women to adapt their rate of eating to curves presented on a computer screen, and the effect on food intake and satiety was studied. Decelerated eaters were unable to eat at an increased rate, but ate the same amount of food when eating at a decreased rate as during the control condition. Linear eaters ate more food when eating at an increased rate, but less food when eating at a decreased rate. Decelerated eaters rated their level of satiety lower when eating at an increased rate, but similar to the control condition when eating at a decreased rate. Linear eaters rated their level of satiety similar to the control level despite eating more food at an increased rate, and higher despite eating less food at a decreased rate. The cumulative satiety curve fitted a sigmoid curve in both decelerated and linear eaters under all conditions. Linear eaters rated their desire to eat and estimated their prospective intake lower than decelerated eaters and scored higher on a scale for restrained eating. It is suggested that linear eaters have difficulty maintaining their intake when eating rate is dissociated from its baseline level and that this puts them at risk of developing disordered eating. It is also suggested that feedback on eating rate can be used as an intervention to treat eating disorders.
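
One common way to separate decelerated from linear eaters, sketched below under assumptions (this is not necessarily the authors' exact procedure), is to fit the cumulative intake curve to a quadratic y = k*t^2 + l*t and classify by the curvature k: a clearly negative k means intake decelerates over the meal, while k near zero means a roughly constant rate.

```python
import numpy as np

def classify_eater(t_min, cumulative_g, k_threshold=-0.5):
    """t_min: minutes since meal start; cumulative_g: grams eaten so far."""
    # Least-squares fit of cumulative intake to k*t^2 + l*t (no intercept)
    A = np.column_stack([t_min**2, t_min])
    (k, l), *_ = np.linalg.lstsq(A, cumulative_g, rcond=None)
    label = "decelerated" if k < k_threshold else "linear"
    return label, round(k, 2), round(l, 2)

t = np.linspace(0, 12, 25)
decelerating = -0.9 * t**2 + 30 * t     # intake that slows toward the end of the meal
constant_rate = 18 * t                  # constant eating rate
print(classify_eater(t, decelerating))  # ('decelerated', -0.9, 30.0)
print(classify_eater(t, constant_rate)) # ('linear', 0.0, 18.0)
```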


Subject(s)
Eating/psychology , Feeding Behavior , Motivation , Satiation/physiology , Affect , Analysis of Variance , Female , Food Deprivation , Humans , Pain Measurement , Reproducibility of Results , Statistics as Topic , Young Adult
20.
Physiol Behav; 96(4-5): 518-21, 2009 Mar 23.
Article in English | MEDLINE | ID: mdl-19087882

ABSTRACT

It has been suggested that restrained eating is a cognitive strategy that an individual uses to control food intake. If control is lost, the restrained eater enters a state of disinhibition and is therefore thought to be at risk of developing eating disorders and obesity. Restrained eaters eat at a constant rate and can therefore also be referred to as linear eaters. Here, we tested the hypothesis that restrained eating is a state that can be modified by teaching linear eaters to eat at a decelerated rate. Seventeen female linear eaters scored high on a scale for restrained eating. When challenged to eat at an increased rate, a test of disinhibition, the women overate by 16% on average. The women then practiced eating at a decelerated rate by use of feedback from a training curve displayed on a computer screen during their meals. The training occurred three times each week and lasted eight weeks. When re-tested in the absence of feedback, the women ate at a decelerated rate, did not overeat in the test of disinhibition, and scored lower on the scale for restrained eating. It is suggested that restrained eating is a state that can be reduced by training.


Subject(s)
Eating/psychology , Feeding Behavior/psychology , Feeding and Eating Disorders/prevention & control , Inhibition, Psychological , Analysis of Variance , Feeding and Eating Disorders/psychology , Female , Humans , Risk Assessment , Time Factors , Young Adult