ABSTRACT
Context: Ghrelin circulates in acylated (AG) and deacylated (DAG) forms, which are known to affect appetite. Although acute exercise has been shown to modulate ghrelin levels, data on the impact of exercise intensity on AG and DAG levels and their effects on appetite are sparse and primarily limited to males. Objective: To investigate the effect of exercise intensity and sex on ghrelin levels and appetite in untrained humans. Methods: Eight males (age: 43.1 ± 10.9 years; body mass index [BMI]: 22.2 ± 1.7 kg/m2; peak oxygen consumption [VO2peak]: 36.3 ± 6.4 mL/kg/min) and 6 females (age: 32.2 ± 11.1 years; BMI: 22.7 ± 1.0 kg/m2; VO2peak: 29.2 ± 4.0 mL/kg/min) completed a maximal graded cycle ergometer lactate threshold (LT)/VO2peak test. These data were used to determine the exercise intensity for 3 subsequent randomized control or calorically matched cycle exercise bouts: (1) CON, no exercise; (2) MOD, the power output at LT; (3) HIGH, the power output associated with 75% of the difference between LT and VO2peak. Perception of appetite was analyzed using visual analog scales. Results: Females had higher levels of total ghrelin (TG) (P = .03) and DAG (P = .01) at baseline than males. Both groups exhibited reduced DAG levels in HIGH compared with MOD and CON (P < .0001-.004); however, only females had significantly reduced AG in HIGH (P < .0001). Hunger scores were higher in MOD than in CON (P < .01). Conclusion: High-intensity exercise may be superior to moderate-intensity exercise for reducing ghrelin levels and modifying hunger, and sex may impact this response.
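The HIGH bout above is defined arithmetically: the power output at LT plus 75% of the LT-to-peak difference. A minimal sketch of that calculation, with hypothetical wattage values (not data from the study):

```python
def high_intensity_power(lt_power_w: float, peak_power_w: float, fraction: float = 0.75) -> float:
    """Target power for the HIGH bout: power at LT plus a fraction
    (75% in this design) of the LT-to-peak power difference."""
    return lt_power_w + fraction * (peak_power_w - lt_power_w)

# Hypothetical illustration: LT at 150 W, peak power at 250 W.
print(high_intensity_power(150, 250))  # 225.0
```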
ABSTRACT
Anthropogenic climate warming affects plant communities by changing community structure and function. Studies on climate warming have primarily focused on individual effects of warming, but the interactive effects of warming with biotic factors could be at least as important in community responses to climate change. In addition, climate change experiments spanning multiple years are necessary to capture interannual variability and detect the influence of these effects within ecological communities. Our study explores the individual and interactive effects of warming and insect herbivory on plant traits and community responses within a 7-year warming and herbivory manipulation experiment in two early successional plant communities in Michigan, USA. We find stronger support for the individual effects of both warming and herbivory on multiple plant morphological and phenological traits; only the timing of plant green-up and seed set demonstrated an interactive effect between warming and herbivory. With herbivory, warming advanced green-up, but with reduced herbivory, there was no significant effect of warming. In contrast, warming increased plant biomass, but the effect of warming on biomass did not depend upon the level of insect herbivores. We found that these treatments had stronger effects in some years than others, highlighting the need for multiyear experiments. This study demonstrates that warming and herbivory can have strong direct effects on plant communities, but that their interactive effects are limited in these early successional systems. Because the strength and direction of these effects can vary by ecological context, it is still advisable to include levels of biotic interactions, multiple traits and years, and community type when studying climate change effects on plants and their communities.
ABSTRACT
OBJECTIVE: Computer-based auditory training (CBAT) has been shown to improve outcomes in adult cochlear implant (CI) users. This study evaluates in new CI users whether starting CBAT within 3 months of activation or later impacts CI outcomes. STUDY DESIGN: Prospective natural experiment. SETTING: Tertiary academic medical center. PATIENTS: Sixty-five new adult CI users. INTERVENTIONS: CBAT use over the first year postactivation. MAIN OUTCOME MEASURES: Speech recognition scores and CIQOL-35 Profile score improvements between CI recipients who started CBAT resources early (<3 mo) and late (3-12 mo) postactivation. RESULTS: A total of 43 CI recipients started using CBAT within 3 months postactivation (early) and 22 after 3 months (late). Patients who used CBAT within 3 months postactivation showed significantly greater improvement in consonant-nucleus-consonant words (CNCw) (48.3 ± 24.2% vs 27.8 ± 24.9%; d = 0.84), AzBio Sentences in quiet (55.1 ± 28.0% vs 35.7 ± 36.5%; d = 0.62), and CIQOL-35 listening domain scores (18.2 ± 16.3 vs 6.9 ± 12.9, d = 0.73 [0.023, 1.43]) at 3 months postactivation, compared to those who had not yet initiated CBAT. However, by 12 months postactivation, after which all CI recipients had started CBAT, there were no differences observed between patients who started CBAT early or late in speech recognition scores (CNCw: d = 0.26 [-0.35, 0.88]; AzBio: d = 0.37 [-0.23, 0.97]) or in any CIQOL global or domain score (d-range = 0.014-0.47). CONCLUSIONS: Auditory training with self-directed computer software (CBAT) may yield speech recognition and quality-of-life benefits for new adult CI recipients. While early users showed greater improvement in outcomes at 3 months postactivation than users who started later, both groups achieved similar benefits by 12 months postactivation.
Subject(s)
Cochlear Implantation , Quality of Life , Speech Perception , Humans , Male , Speech Perception/physiology , Female , Middle Aged , Cochlear Implantation/methods , Aged , Prospective Studies , Adult , Cochlear Implants , Treatment Outcome , Computer-Assisted Therapy/methods , Time Factors
ABSTRACT
Leading successful change efforts first requires assessment of the "before change" environment and culture. At our institution, the radiation oncology (RO) residents follow a longitudinal didactic learning program consisting of weekly 1-h lectures, case conferences, and journal clubs. The resident didactic education series format has not changed since its inception over 10 years ago. We evaluated the perceptions of current residents and faculty about the effectiveness of the curriculum in its present form. Two parallel surveys were designed, one each for residents and attendings, to assess current attitudes regarding the effectiveness and need for change in the RO residency curriculum, specifically the traditional didactic lectures, the journal club sessions, and the case conferences. We also investigated perceived levels of engagement among residents and faculty, whether self-assessments would be useful to increase material retention, and how often the content of didactic lectures is updated. Surveys were distributed individually to each resident (N = 10) and attending (N = 24) either in-person or via Zoom. Following completion of the survey, respondents were informally interviewed about their perspectives on the curriculum's strengths and weaknesses. Compared to 46% of attendings, 80% of RO residents believed that the curriculum should be changed. Twenty percent of residents felt that the traditional didactic lectures were effective in preparing them to manage patients in the clinic, compared to 74% of attendings. Similarly, 10% of residents felt that the journal club sessions were effective vs. 42% of attendings. Finally, 40% of residents felt that the case conferences were effective vs. 67% of attendings. Overall, most respondents (56%) favored change in the curriculum. Our results suggest that the perceptions of the residents did not align with those of the attending physicians with respect to the effectiveness of the curriculum and the need for change. 
The discrepancies between resident and faculty views highlight the importance of a dedicated change management effort to mitigate this gap. Based on this project, we plan to propose recommended changes in structure to the residency program directors. Main changes would be to increase the interactive nature of the course material, incorporate more ways to increase faculty engagement, and consider self-assessment questions to promote retention. Once we get approval from the residency program leadership, we will follow Kotter's "Eight steps to transforming your organization" to ensure the highest potential for faculty to accept the expectations of a new curriculum.
ABSTRACT
Empirical studies on peer review bias are primarily conducted by people from privileged groups and with affiliations with the journals studied. Data access is one major barrier to conducting peer review research. Accordingly, we propose pathways to broaden access to peer review data to people from more diverse backgrounds.
Subject(s)
Periodicals as Topic , Humans , Peer Review , Peer Review, Research
ABSTRACT
INTRODUCTION: Optimal cochlear implant (CI) outcomes are due, at least in part, to appropriate device programming. Objective measures, such as electrically evoked stapedial reflex thresholds (ESRTs), can be used to more accurately set programming levels. However, underlying factors that contribute to ESRT levels are not well understood. The objective of the current study was to analyze how the demographic variables of patient sex and age, along with CI electrode location, influence ESRTs in adult CI recipients. METHODS: A single-institution retrospective review was performed. Electronic medical records, CI programming records, and a clinic database of postoperative computed tomography were reviewed to gather information regarding patient demographics, ESRTs, and electrode array metrics, including medial-lateral distance and scalar location. Linear mixed models were constructed to determine how demographic variables and electrode position influence ESRTs recorded in 138 adult CI recipients. RESULTS: ESRTs were significantly affected by recipient age, with older listeners demonstrating higher ESRT levels. On average, males had higher ESRT levels when compared to females. In a subset of the study sample, ESRT levels increased with increasing medial-lateral distance; however, there was not a statistically significant effect of electrode type (lateral/straight arrays compared to perimodiolar arrays). ESRTs were not affected by scalar location. DISCUSSION/CONCLUSIONS: The results suggest that key demographic and electrode position characteristics influence the level of ESRTs in adult CI recipients. While ESRTs are widely used to assist with CI programming, underlying factors are not well understood. The significant factors of aging and sex could be due to middle ear mechanics or neural health differences. However, further data are needed to better understand these associations.
Subject(s)
Auditory Threshold , Cochlear Implantation , Cochlear Implants , Humans , Male , Female , Retrospective Studies , Middle Aged , Aged , Adult , Auditory Threshold/physiology , Aged, 80 and over , Acoustic Reflex/physiology , Young Adult , Sex Factors , Age Factors
ABSTRACT
OBJECTIVE: The process of adapting to communicate with a cochlear implant (CI) is complex. The use of auditory training after cochlear implantation may help to facilitate improvements in postoperative speech recognition and quality-of-life outcomes in new adult CI recipients. However, the effectiveness of auditory training remains uncertain, and long-term effects have not been examined in a large sample of new adult CI users. As such, the objective of this study was to examine the influence of common forms of auditory training on speech recognition and CI-related quality-of-life (CI-related QOL) outcomes at 1 year after cochlear implantation. We hypothesized that patients who reported use of computer-based auditory training (CBAT) would show improved speech and CIQOL-35 Profile scores at 1 year after activation of their implant, compared with their peers. DESIGN: This was a prospective study undertaken at a tertiary academic CI center. Participants included 114 adults undergoing cochlear implantation for bilateral hearing loss. Patients serially self-reported use of the following types of post-CI auditory training over their first year postactivation: (1) face-to-face training (e.g., speech-language pathologist), (2) passive home-based training (e.g., listening to audiobooks), and (3) CBAT (e.g., self-directed software). Outcome measures for this study included change in Consonant-Nucleus-Consonant phoneme (CNCp), CNC word (CNCw), AzBio sentences in quiet, and CIQOL-35 Profile global and domain scores from pre-CI to 12-mo post-CI. RESULTS: Of 114 patients, 94 (82.5%) used one or more auditory training resources. Of these, 19.3% used face-to-face training, 67.5% passive home-based training, and 46.5% CBAT. Of 114 patients, 73 had complete CIQOL data. At 12 mo, only CBAT use was associated with significantly greater improvements in global and all domain-specific CIQOL scores (d-range = 0.72-0.87), compared with those not using CBAT.
Controlling for demographics and use of multiple training resources, CBAT remained the strongest positive predictor of CIQOL improvement, with significant associations with global score (β = 12.019 [4.127, 19.9]) and all domain scores at 12-mo post-CI: communication (β = 11.937 [2.456, 21.318]), emotional (β = 12.293 [1.827, 22.759]), entertainment (β = 17.014 [5.434, 28.774]), environment (β = 13.771 [1.814, 25.727]), listening effort (β = 12.523 [2.798, 22.248]), and social (β = 18.114 [7.403, 28.826]). No significant associations were observed between use of CBAT, or any other form of auditory training, and speech recognition scores at 12-mo post-CI (d-range = -0.12 to 0.22). CONCLUSIONS: Auditory training with CBAT was associated with improved CI-related QOL outcomes at 12-mo post-CI. Given its availability and low cost, this study provides evidence to support using CBAT to improve real-world functional abilities in new adult CI recipients.
Subject(s)
Cochlear Implantation , Cochlear Implants , Quality of Life , Speech Perception , Humans , Male , Female , Middle Aged , Aged , Adult , Prospective Studies , Bilateral Hearing Loss/rehabilitation , Aged, 80 and over , Computer-Assisted Therapy
ABSTRACT
OBJECTIVE: To retrospectively compare frequency-place mismatch among adult cochlear implant (CI) recipients with lateral wall (LW) and perimodiolar/Mid Scala (PM/MS) arrays, and to quantify the impact of these factors on early post-activation (3 months) speech recognition abilities and CI-specific quality of life. METHODS: One hundred and twenty-six adult participants were separated into two groups: (1) 83 participants who underwent CI with a PM/MS array and (2) 43 participants who underwent CI with a LW array. All participants completed the Cochlear Implant Quality of Life Profile (CIQOL-35 Profile) instrument. Angular insertion depth and semitone mismatch, which contribute to frequency-place mismatch, were assessed using post-operative CT scans. Word and sentence recognition in quiet were determined using the Consonant-Nucleus-Consonant (CNC) and the AzBio tests, respectively (n = 82 patients). RESULTS: LW arrays were more deeply inserted and exhibited less semitone mismatch compared to PM/MS arrays. No significant relationship was found between semitone mismatch and early post-operative speech perception scores for either PM/MS or LW arrays. However, greater degrees of semitone mismatch were associated with lower CIQOL-35 Profile scores for PM/MS arrays. CONCLUSIONS AND RELEVANCE: The results of this study indicate that both the degree of frequency-place mismatch, and its impact on CI-specific quality of life, vary by CI array design. LEVEL OF EVIDENCE: 4 Laryngoscope, 134:2898-2905, 2024.
Subject(s)
Cochlear Implantation , Cochlear Implants , Quality of Life , Speech Perception , Humans , Speech Perception/physiology , Female , Male , Middle Aged , Retrospective Studies , Aged , Adult
ABSTRACT
Objective: To review evidence on the efficacy of auditory training in adult cochlear implant recipients. Data Sources: PRISMA guidelines for a systematic review of the literature were followed. PubMed, Scopus, and CINAHL databases were queried on 29 June 2023 for terms involving cochlear implantation and auditory training. Studies were limited to the English language and adult patient populations. Study Selection: Three authors independently reviewed publications for inclusion in the review based on a priori inclusion and exclusion criteria. Inclusion criteria encompassed adult cochlear implant populations, an analysis of clinician- or patient-directed auditory training, and an analysis of one or more measures of speech recognition and/or patient-reported outcome. Exclusion criteria included studies with only pediatric implant populations, music or localization training in isolation, and single-sample case studies. Data Extraction: Data were collected regarding study design, patient population, auditory training modality, auditory training timing, speech outcomes, and the durability of outcomes. A quality assessment of the literature was performed using a quality metric adapted from the Grading of Recommendations Assessment, Development, and Evaluation (GRADE) Working Group guidelines. Data Synthesis and Meta-Analysis: Data were qualitatively summarized for 23 studies. All but four studies demonstrated significant improvement in at least one measured or patient-reported outcome measure with training. For 11 studies with sufficient data reporting, pre-intervention and post-intervention pooled means of different outcome measures were compared for 132 patients using meta-analysis. Patient-directed training was associated with significant improvement in vowel-phoneme recognition and speech recognition in noise (p < 0.05 and p < 0.001, respectively), and clinician-directed training showed significant improvement in sentence recognition in noise (p < 0.001).
Conclusions: The literature on auditory training for adult cochlear implant recipients is limited and heterogeneous, including a small number of studies with limited levels of evidence and external validity. However, the current evidence suggests that auditory training can improve speech recognition in adult cochlear implant recipients.
ABSTRACT
OBJECTIVE: Determine associations between expected and actual cochlear implant (CI) outcomes, decisional regret, and satisfaction in experienced adult CI users. STUDY DESIGN: Cross-sectional cohort study. SETTING: Tertiary medical center. PATIENTS: Thirty-nine adult CI users meeting traditional bilateral hearing loss indications with ≥12 months CI experience. INTERVENTIONS/MAIN OUTCOME MEASURES: Patients completed the validated Satisfaction with Amplification in Daily Living and Decisional Regret instruments. Pre- and post-CI outcomes (CI Quality of Life [CIQOL]-Expectations; CIQOL-35 Profile; CNC words, AzBio Sentences) were obtained from a prospectively maintained clinical database. RESULTS: Using established cutoff scores, 29% of patients reported a substantial degree of post-CI decisional regret. For each CIQOL domain, patients without decisional regret obtained post-CI outcome scores closer to pre-CI expectations compared with patients with decisional regret (d = 0.34-0.91); similar results were observed with higher CI user satisfaction (d = 0.17-0.83). Notably, the degree of pre- to post-CI improvement in CNC or AzBio scores did not differ between patients with and without decisional regret or with lower and higher satisfaction. Finally, greater pre-/post-CI improvement in CIQOL-35 Profile domain scores demonstrated far stronger associations with lower decisional regret and higher satisfaction than changes in speech recognition scores. CONCLUSIONS: Patients with better alignment of their pre-CI expectations and post-CI outcomes and greater pre-/post-CIQOL improvement had lower decisional regret and higher satisfaction. This emphasizes the importance of evidence-based pre-CI counseling regarding real-world CI benefits and cautions against assuming that improvements in speech recognition are related to patient satisfaction.
Subject(s)
Cochlear Implantation , Cochlear Implants , Speech Perception , Humans , Adult , Quality of Life , Cross-Sectional Studies , Emotions , Treatment Outcome
ABSTRACT
Importance: Several ophthalmic diseases disproportionately affect racial and ethnic minority patients, yet most clinical trials struggle to enroll cohorts that are demographically representative of disease burden; barriers to recruitment include time and transportation, language and cultural differences, and fear and mistrust of research due to historical abuses. Incorporating diversity within the research team has been proposed as a method to increase trust and improve engagement among potential study participants. Objective: To examine how demographic factors of potential research participants and personnel may be associated with patient consent rates to participate in prospective ophthalmic clinical studies. Design, Setting, and Participants: This retrospective cohort study included patients from an urban, academic hospital who were approached for consent to participate in prospective ophthalmic clinical studies conducted between January 2015 and December 2021. Main Outcomes and Measures: Multivariable logistic regression was used to assess associations between patient and research personnel demographics and rates of affirmative consent to participate. Results: In total, 1380 patients (mean [SD] age, 58.6 [14.9] years; 50.3% male) who were approached for consent to participate in 10 prospective ophthalmic clinical studies were included. Of prospective patients, 566 (43.5%) were Black; 327 (25.1%), Hispanic or Latino; 373 (28.6%), White; 36 (2.8%), other race and ethnicity; and 78 (5.8%) declined to answer. Black patients (odds ratio [OR], 0.32; 95% CI, 0.24-0.44; P < .001) and Hispanic or Latino patients (OR, 0.31; 95% CI, 0.20-0.47; P < .001) were less likely to consent compared with White patients. Patients with lower socioeconomic status were less likely to consent than patients with higher socioeconomic status (OR, 0.43; 95% CI, 0.33-0.53; P < .001).
Concordance between patient and research staff race and ethnicity was associated with increased odds of affirmative consent (OR, 2.72; 95% CI, 1.99-3.73; P < .001). Conclusions and Relevance: In this cohort study, patients from underrepresented racial and ethnic groups and those with lower socioeconomic status were less likely to participate in ophthalmic clinical studies. Concordance of race and ethnicity between patients and research staff was associated with improved participant enrollment. These findings underscore the importance of increasing diversity in clinical research teams to improve racial and ethnic representation in clinical studies.
Subject(s)
Ethnicity , Minority Groups , Humans , Male , Middle Aged , Female , Cohort Studies , Prospective Studies , Retrospective Studies
ABSTRACT
Objective: Although the purpose of community eye screening programs is to reduce health care disparities, the effectiveness of these programs is limited by the follow-up adherence of their participants. The aim of this review is to investigate factors that may promote or hinder attendance at follow-up ophthalmological exams after community eye screenings and to identify interventions that increase follow-up rates. Methods: For the literature review, PubMed, Web of Science, Embase, ProQuest/Global Health Library, and Google Scholar databases were searched to identify studies of community eye screenings published between January 2000 and May 2023. Data from these articles were analyzed to identify barriers and facilitators of follow-up adherence after community eye screenings in the United States and to examine strategies used to increase follow-up rates. Only published manuscripts were included. We excluded studies of school screenings and clinic-based screenings. Results: A total of 28 articles were included. Follow-up rates ranged from 12.5% to 89%. Nineteen articles reviewed facilitators and barriers to follow-up. Eighteen articles were noninterventional, and seven articles described interventions that were tested to improve follow-up rates after screening (see Tables 1 and 2, respectively). Interventions included prescheduled appointments, transportation assistance, patient education, and patient navigators. Conclusion: Several interventions show promise for increasing follow-up adherence in community eye screenings, but more evidence is needed. Future research should focus on randomized trials of isolated interventions to improve follow-up adherence of disadvantaged populations, although this may be limited given ethical considerations and the documented lack of follow-up after screening.
ABSTRACT
PURPOSE: To explore the geographic variability of the epidemiology of pediatric uveitis, which, although rare in children, carries a significant risk of morbidity. METHODS: This was a retrospective review conducted at two tertiary referral centers in Buenos Aires, Argentina. Demographic and clinical data of patients younger than 16 years diagnosed as having uveitis between January 1, 2006 and October 1, 2014 were collected. RESULTS: A total of 257 patients (380 eyes) were included in the study. Cases tended to be unilateral (134, 52.1%), granulomatous (146, 56.8%), and localized to the posterior segment (121, 47.1%). Toxoplasmosis was the most common etiology (98, 38.1%). DISCUSSION: The spectrum of pediatric uveitis in Buenos Aires most closely resembles that of Colombia. Understanding these geographic variations is important to aid providers who are caring for children in an increasingly globalized world. [J Pediatr Ophthalmol Strabismus. 20XX;X(X):XX-XX.].
ABSTRACT
Importance: Most ovarian cancers originate in the fimbriated end of the fallopian tube. This has led to the hypothesis that surgical resection of the fallopian tubes at the time of gynecologic and nongynecologic surgical procedures-referred to as an opportunistic salpingectomy-may prevent the development of epithelial ovarian cancer for women at an average risk of developing the disease. Objective: To compile a comprehensive, state-of-the-science review examining the current landscape of performing bilateral salpingectomy for ovarian cancer prevention. Evidence Review: A systematic review of the literature was performed on March 4, 2022, to identify studies examining salpingectomy for ovarian cancer prevention. This review was performed according to the Preferred Reporting Items for Systematic Reviews and Meta-analyses (PRISMA) 2020 statement. Four databases were selected: PubMed via the National Library of Medicine's PubMed.gov, Embase via Elsevier's Embase.com, Cochrane Central Register of Controlled Trials (CENTRAL) via Wiley's Cochrane Library, and Northern Light Life Sciences Conference Abstracts via Ovid. A total of 20 gray literature sources, including 1 database, 2 registers, 1 repository, 1 index, 1 archive, 1 preprint server, 1 agency, and 12 organizations, were also searched. Findings: The initial search produced 1089 results; a total of 158 publications were included in the final review. Salpingectomy has been associated with ovarian cancer risk reduction of approximately 80%. Studies have demonstrated that salpingectomy was safe, cost-effective, and was not associated with an earlier age of menopause onset. With widespread implementation, salpingectomy has the potential to reduce ovarian cancer mortality in the US by an estimated 15%. 
Both physician and patient awareness regarding the adnexa as the origin for most ovarian cancers, as well as the existence of salpingectomy and its potential benefits in reducing ovarian cancer risk, has increased during the past decade. Raising awareness and developing effective implementation strategies are essential. Conclusions and Relevance: The results of this systematic review suggest that bilateral salpingectomy for ovarian cancer prevention was safe and feasible and has the potential to be a cost-effective and cost-saving strategy across the population. Prospective studies to demonstrate long-term survival outcomes and feasibility in nongynecologic surgical procedures are warranted.
Subject(s)
Ovarian Neoplasms , Female , Humans , Prospective Studies , Ovarian Neoplasms/prevention & control , Ovarian Neoplasms/surgery , Salpingectomy/methods , Hysterectomy/methods , Primary Prevention
ABSTRACT
Lacticaseibacillus paracasei Lpc-37 (Lpc-37) has previously been shown to reduce perceived stress in healthy adults. The ChillEx study investigated whether Lpc-37 reduces stress in a model of chronic examination stress in healthy students. One hundred ninety university students (18-40 y) were randomized to take 1.56 × 10^10 colony-forming units of Lpc-37 or placebo (1:1) each day for 10 weeks, in a triple-blind, parallel, multicenter clinical trial consisting of six visits: two screening visits, a baseline visit, and visits at 4, 8, and 10 weeks after baseline. The primary objective was to demonstrate that Lpc-37 reduces stress, as measured by the change in state anxiety from baseline to just before the first examination, after 8 weeks using the State Trait Anxiety Inventory (STAI-state). Secondary objectives aimed to demonstrate that Lpc-37 modulates psychological stress-induced symptoms and biomarkers related to mood and sleep. An exploratory analysis of fecal microbiota composition was also conducted. There was no difference between Lpc-37 and placebo groups in the change of STAI-state score (estimate 1.03; 95% confidence interval [CI]: -1.62, 3.67; p = 0.446). None of the secondary outcomes was significant when corrected for multiplicity, but exploratory results were notable. Results showed an improvement in sleep-disturbance scores (odds ratio 0.30; 95% CI: 0.11, 0.82; p = 0.020) and a reduction in duration of sleep (odds ratio 3.52; 95% CI: 1.46, 8.54; p = 0.005) on the Pittsburgh Sleep Quality Index questionnaire after 8 weeks in the Lpc-37 group compared to placebo. A reduction in Bond-Lader VAS-alertness was also demonstrated in the Lpc-37 group compared to placebo (estimate -3.97; 95% CI: -7.78, -0.15; p = 0.042) just prior to the examination. Analysis of fecal microbiota found no differences between study groups for alpha and beta diversity or microbiota abundance. Adverse events were similar between groups.
Vital signs, safety-related laboratory measures, and gastrointestinal parameters were stable during the trial. In conclusion, probiotic Lpc-37 was safe but had no effect on stress, mood, or anxiety in healthy university students in this model of chronic academic stress. ClinicalTrials.gov: NCT04125810.
ABSTRACT
BACKGROUND: Inorganic nitrate (NO3-) supplementation is purported to benefit short-term exercise performance, but it is unclear whether NO3- improves longer-term exercise training responses (such as improvements in VO2peak or time to exhaustion (TTE)) versus exercise training alone. The purpose of this systematic review and meta-analysis was to determine the effects of NO3- supplementation combined with exercise training on VO2peak and TTE, and to identify potential factors that may impact outcomes. METHODS: Electronic databases (PubMed, Medscape, and Web of Science) were searched for articles published through June 2022, with article inclusion determined a priori as: (1) randomized placebo-controlled trials, (2) exercise training lasted at least three weeks, (3) treatment groups received identical exercise training, (4) treatment groups had matched VO2peak at baseline. Study quality was assessed using the Cochrane Risk-of-Bias 2 tool. Standardized mean differences (SMD) with 95% confidence intervals (CI) were calculated using restricted maximum likelihood estimation between pre- and post-training differences in outcomes. Moderator subgroup and meta-regression analyses were completed to determine whether the overall effect was influenced by age, sex, NO3- dosage, baseline VO2peak, health status, NO3- administration route, and training conditions. RESULTS: Nine studies consisting of eleven trials were included: n = 228 (72 females); age = 37.7 ± 21 years; VO2peak: 40 ± 18 ml/kg/min. NO3- supplementation did not enhance exercise training with respect to VO2peak (SMD: 0.18; 95% CI: -0.09, 0.44; p = 0.19) or TTE (SMD: 0.08; 95% CI: -0.21, 0.37; p = 0.58). No significant moderators were revealed for either outcome. Subset analysis on healthy participants who consumed beetroot juice (BRJ) revealed stronger trends for NO3- improving VO2peak (p = 0.08) compared with TTE (p = 0.19), with no significant moderators.
A sunset funnel plot revealed low statistical power in all trials. CONCLUSIONS: NO3- supplementation combined with exercise training may not enhance exercise outcomes such as VO2peak or TTE, although a trend toward greater improvement in VO2peak in healthy participants supplemented with BRJ may exist (p = 0.08). Overall, future studies in this area need larger sample sizes, more unified methodologies, longer training interventions, and examination of sex as a biological variable to strengthen conclusions.
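For readers unfamiliar with the effect-size arithmetic behind the pooled estimates above, a minimal sketch of a standardized mean difference with a normal-approximation 95% confidence interval follows. The function name and the worked numbers are hypothetical; the review itself pooled SMDs with restricted maximum likelihood estimation, which this single-comparison sketch does not attempt.

```python
import math

def smd_with_ci(mean_tx, sd_tx, n_tx, mean_ctl, sd_ctl, n_ctl):
    """Cohen's d between two groups' change scores, with an approximate 95% CI.

    Illustrative sketch only: a real meta-analysis would additionally apply a
    small-sample (Hedges' g) correction and REML-weighted pooling across trials.
    """
    # Pooled standard deviation across the two groups
    sd_pooled = math.sqrt(
        ((n_tx - 1) * sd_tx**2 + (n_ctl - 1) * sd_ctl**2) / (n_tx + n_ctl - 2)
    )
    d = (mean_tx - mean_ctl) / sd_pooled
    # Standard error of d (normal approximation)
    se = math.sqrt((n_tx + n_ctl) / (n_tx * n_ctl) + d**2 / (2 * (n_tx + n_ctl)))
    return d, (d - 1.96 * se, d + 1.96 * se)

# Hypothetical trial: supplemented arm improved VO2peak by 2.0 mL/kg/min,
# placebo arm by 1.0 mL/kg/min, both with SD 2.0 and n = 20 per arm.
d, ci = smd_with_ci(2.0, 2.0, 20, 1.0, 2.0, 20)  # d = 0.5
```

By the usual convention an SMD near 0.2 is small and 0.5 moderate, which is why pooled estimates of 0.18 and 0.08 with confidence intervals crossing zero do not support a training benefit.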
RESUMEN
BACKGROUND: Ghrelin is an orexigenic hormone primarily released by the stomach that circulates in 2 isoforms, acylated ghrelin (AG) and de-acylated ghrelin (DAG), which appear to have different functions in humans. OBJECTIVES: To perform a systematic review and meta-analysis of the association between plasma concentrations of total ghrelin (TG), AG, and DAG and perceptions of hunger in healthy adults. METHODS: The following criteria were used for inclusion: 1) the sample contained adults ≥18 y of age, 2) body mass index (BMI) was ≥18.5 kg/m2, 3) ghrelin was sampled through blood, 4) subjective hunger was measured on a validated scale, 5) the study reported a Pearson product-moment correlation for ghrelin or had relevant figure(s) for data extraction, 6) participants were healthy with no overt disease, 7) protocols contained no physical activity or weight loss medication that suppressed appetite, and 8) interventions were conducted without environmental manipulations. Moderators assessed were age, BMI, percentage of body fat (%BF), macronutrient content of test meals, energy intake (kcal), sex, and ghrelin isoform (AG, DAG, or TG). RESULTS: The analysis included 47 studies (110 trials, n = 1799, age: 31.4 ± 12.0 y, BMI: 26.0 ± 4.75 kg/m2) that measured AG (n = 47 trials), DAG (n = 12 trials), and TG (n = 51 trials). The overall model indicated that ghrelin concentrations and perceptions of hunger were moderately correlated (r = 0.43, P < 0.001), and ghrelin isoform significantly moderated this relationship (AG: r = 0.60, P < 0.001; TG: r = 0.215, P = 0.01; DAG: r = 0.53, P = 0.695). Other significant moderators included age (b = -0.02, P = 0.01), BMI (b = -0.03, P = 0.05), %BF (b = -0.03, P = 0.05), energy intake (b = 0.0003, P = 0.04), and the percentage of carbohydrates in test meals (b = 0.008, P = 0.05).
CONCLUSIONS: Ghrelin is associated with perceptions of hunger in humans, and this relationship is strengthened when AG is isolated; thus, AG may have a large impact on hunger signals in various populations. Future research should attempt to understand the role of DAG in hunger sensations.
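Meta-analyses of correlations like the one summarized above typically pool Pearson r values on Fisher's z scale, where the sampling variance is approximately 1/(n - 3), and then back-transform the weighted mean. A minimal fixed-effect sketch follows; the function name is hypothetical, and the actual review may have used a random-effects model with additional moderator terms.

```python
import math

def pool_correlations(studies):
    """Fixed-effect pooling of Pearson correlations via Fisher's z transform.

    studies: iterable of (r, n) pairs. Each z = atanh(r) is weighted by
    n - 3, the inverse of its approximate sampling variance.
    """
    weighted = [(math.atanh(r), n - 3) for r, n in studies]
    z_bar = sum(z * w for z, w in weighted) / sum(w for _, w in weighted)
    return math.tanh(z_bar)  # back-transform the pooled z to r

# Two hypothetical trials reporting the same correlation pool back to it exactly
r_pooled = pool_correlations([(0.5, 50), (0.5, 100)])  # 0.5
```

Pooling on the z scale rather than averaging raw r values stabilizes the variance and keeps the back-transformed estimate inside the valid range [-1, 1].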
Subject(s)
Ghrelin, Hunger, Adult, Humans, Young Adult, Preschool Child, Energy Intake, Body Mass Index, Perception, Appetite
RESUMEN
Importance: Polymicrogyria is the most commonly diagnosed cortical malformation and is associated with neurodevelopmental sequelae including epilepsy, motor abnormalities, and cognitive deficits. Polymicrogyria frequently co-occurs with other brain malformations or as part of syndromic diseases. Past studies of polymicrogyria have defined heterogeneous genetic and nongenetic causes but have explained only a small fraction of cases. Objective: To survey germline genetic causes of polymicrogyria in a large cohort and to consider novel polymicrogyria gene associations. Design, Setting, and Participants: This genetic association study analyzed panel sequencing and exome sequencing of accrued DNA samples from a retrospective cohort of families with members with polymicrogyria. Samples were accrued over more than 20 years (1994 to 2020), and sequencing occurred in 2 stages: panel sequencing (June 2015 to January 2016) and whole-exome sequencing (September 2019 to March 2020). Individuals seen at multiple clinical sites for neurological complaints who were found to have polymicrogyria on neuroimaging, and who were then referred to the research team by the evaluating clinicians, were included in the study. Targeted next-generation sequencing and/or exome sequencing were performed on probands (and available parents and siblings) from 284 families with individuals who had isolated polymicrogyria or polymicrogyria as part of a clinical syndrome and no genetic diagnosis at the time of referral, with sequencing from 275 families passing quality control. Main Outcomes and Measures: The primary outcome was the number of families in whom genetic sequencing yielded a molecular diagnosis that explained the polymicrogyria in the family. Secondarily, the relative frequency of different genetic causes of polymicrogyria and whether specific genetic causes were associated with co-occurring head size changes were analyzed.
Results: In 32.7% (90 of 275) of polymicrogyria-affected families, genetic variants were identified that provided satisfactory molecular explanations. Known genes most frequently implicated by polymicrogyria-associated variants in this cohort were PIK3R2, TUBB2B, COL4A1, and SCN3A. Six candidate novel polymicrogyria genes were identified or confirmed: de novo missense variants in PANX1, QRICH1, and SCN2A and compound heterozygous variants in TMEM161B, KIF26A, and MAN2C1, each with consistent genotype-phenotype relationships in multiple families. Conclusions and Relevance: This study's findings reveal a higher than previously recognized rate of identifiable genetic causes, specifically of channelopathies, in individuals with polymicrogyria and support the utility of exome sequencing for families affected with polymicrogyria.
Subject(s)
Polymicrogyria, Humans, Polymicrogyria/diagnostic imaging, Polymicrogyria/genetics, Exome Sequencing, Retrospective Studies, Missense Mutation, Siblings, Nerve Tissue Proteins/genetics, Connexins/genetics
RESUMEN
OBJECTIVES: To analyse the effects of auditory environments on receptive and expressive language outcomes in children with a cochlear implant (CI). DESIGN: A single-institution retrospective review was performed. The auditory environments included Speech-Noise, Speech-Quiet, Quiet, Music, and Noise. The Hearing Hour Percentage (HHP) and the percentage of total hours were calculated for each environment. Generalised linear mixed model (GLMM) analyses were used to study the effects of auditory environments on Preschool Language Scale (PLS) Receptive and Expressive scores. STUDY SAMPLE: Thirty-nine children with CIs. RESULTS: On GLMM analysis, increases in Quiet HHP and Quiet percent total hours were positively associated with PLS Receptive scores. Speech-Quiet, Quiet, and Music HHP were positively associated with PLS Expressive scores, with only Quiet being significant for percent total hours. In contrast, percent total hours of Speech-Noise and Noise had a significant negative association with PLS Expressive scores. CONCLUSIONS: This study suggests that more time spent in a quiet auditory environment positively influences PLS Receptive and Expressive scores, and that more time listening to speech in quiet and to music positively influences PLS Expressive scores. Time spent in environments recognised as Speech-Noise and Noise might negatively impact a child's expressive language outcomes with a CI. Future research is needed to better understand this association.