Results 1 - 20 of 4,324
1.
J Leukoc Biol ; 2024 Oct 01.
Article in English | MEDLINE | ID: mdl-39351765

ABSTRACT

Treatment with the toll-like receptor (TLR) 4 agonist monophosphoryl lipid A (MPLA) conditions innate immunocytes to respond robustly to subsequent infection, a phenotype termed innate immune memory. Our published studies show that metabolic reprogramming of macrophages is a prominent feature of the memory phenotype. We undertook studies to define the functional contributions of tricarboxylic acid (TCA) cycle reprogramming to innate immune memory. We observed that priming of wild-type (WT) mice with MPLA potently facilitated accumulation of the TCA cycle metabolite itaconate at sites of infection and enhanced microbial clearance. Augmentation of itaconate accumulation and microbial clearance was ablated in immune-responsive gene 1 (Irg1)-deficient mice. We further observed that MPLA potently induces expression of Irg1 and accumulation of itaconate in macrophages. The ability of Irg1-deficient macrophages to kill Pseudomonas aeruginosa was impaired compared with that of WT macrophages. We further observed that itaconate is directly antimicrobial against P. aeruginosa at pH 5, which is characteristic of the phagolysosome, and that this killing is facilitated by reactive oxygen species. MPLA-induced augmentation of glycolysis, oxidative phosphorylation, and accumulation of the TCA cycle metabolites succinate and malate was decreased in Irg1 KO macrophages compared with WT controls. RNA sequencing revealed suppressed transcription of genes associated with phagolysosome function and increased expression of genes associated with cytokine production and chemotaxis in Irg1-deficient macrophages. This study identifies a contribution of itaconate to MPLA-induced augmentation of innate antimicrobial immunity via facilitation of microbial killing as well as effects on metabolic and transcriptional adaptations.

2.
J Adolesc ; 2024 Oct 02.
Article in English | MEDLINE | ID: mdl-39358971

ABSTRACT

INTRODUCTION: Attaining social success is a significant concern during early adolescence. The characteristics that youth believe will bring social success are known to change over time and vary across contexts, especially over the transition to middle school. METHODS: The analytic sample included 614 students (52% girls, 48% boys; 53% Black, 47% White) from the Midwestern United States. At yearly intervals during grades 6-8, participants completed self-report surveys assessing their endorsement of five characteristics (sincerity, academic responsibility, dominance, disingenuity, athleticism/attractiveness) that described peers in their grade who have lots of friends and get along well with others (i.e., social success). The sample included students who attended the same school from kindergarten-eighth grade (K8) and students who made a transition from an elementary to a middle school after 6th grade (ESMS). RESULTS: Multigroup longitudinal growth models revealed some concerning trends over time. For both ESMS and K8 students, their endorsement of sincerity decreased, their endorsement of disingenuity increased, and their endorsement of athleticism/attractiveness was high and stable. ESMS students' endorsement of academic responsibility decreased over time and their endorsement of dominance showed increasing trends. K8 students' endorsements of academic responsibility and dominance were stable. However, across contexts, compared to the other characteristics, sincerity was most often ranked the highest. CONCLUSIONS: The findings highlight that some changes in students' beliefs about social success may be unique to students who experience a school transition whereas others may be developmentally normative. Implications for the education of young adolescent students are discussed.

3.
Neurologist ; 2024 Oct 01.
Article in English | MEDLINE | ID: mdl-39353868

ABSTRACT

OBJECTIVE: To report a case of hemicrania continua (HC) and persistent visual aura without infarction in a patient with previous episodic migraine with visual aura, whose persistent aura symptoms improved only after treatment with divalproex sodium. BACKGROUND: Once regarded as highly specific for migraine, visual aura has been associated with trigeminal autonomic cephalalgias, including HC. In previous descriptions of HC and episodes of typical visual aura, the aura occurred exclusively with severe headache exacerbations and, like the pain, resolved with indomethacin. METHODS: Case report and literature review. RESULTS: A 54-year-old man with a history of episodic migraine with visual aura reported a gradual onset of HC with persistent visual aura of 15 months' duration. General medical and neurological examinations were normal, as were imaging studies. The HC headache responded to indomethacin, while the visual aura was recalcitrant, improving only with oral divalproex sodium treatment. CONCLUSION: As our patient experienced HC, which evolved from episodic migraine, we hypothesize that migraine and HC may share a common pathophysiology. However, the persistence of the visual aura, despite the abolition of pain and autonomic features with a therapeutic dose of indomethacin, and the subsequent successful treatment of the aura with divalproex sodium, suggest that aura and HC headache arise from distinct and dissociable mechanisms.

4.
Birth Defects Res ; 116(10): e2406, 2024 Oct.
Article in English | MEDLINE | ID: mdl-39382014

ABSTRACT

BACKGROUND: Infants with Trisomy 21 are known to have an increased incidence of congenital anomalies, including congenital heart diseases (CHD) and congenital gastrointestinal anomalies. It is not known if there are patterns of coexistence. OBJECTIVES: To examine the coexistence of CHD with various gastrointestinal anomalies in infants with Trisomy 21. METHODS: We assessed a sample of infants with Trisomy 21 from the National Inpatient Sample (NIS) and its Kids' Inpatient Database (KID) counterpart, produced by the Healthcare Cost and Utilization Project for 2003-2015. We identified CHD using International Classification of Diseases, Ninth Revision (ICD-9) codes and categorized them into four groups: left-sided lesions, right-sided lesions, conotruncal lesions, and shunt lesions. We identified small intestinal atresia and Hirschsprung disease with ICD-9 codes. RESULTS: The sample included 81,561 newborn infants diagnosed with Trisomy 21; 45% of them had CHD, 4.7% had small intestinal atresia, and 1.6% had Hirschsprung disease. All subcategories of CHD were associated with an increased incidence of both small intestinal atresia and Hirschsprung disease (p < 0.05) compared to infants with Trisomy 21 who did not have CHD. CONCLUSIONS: Among infants with Trisomy 21, the presence of CHD increased the odds of a concomitant congenital GI anomaly.
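
As a rough illustration of the association tests underlying the comparisons above, a 2×2 odds-ratio calculation can be sketched as follows; the counts are hypothetical placeholders, not the NIS/KID data.

# Illustrative 2x2 association test of the kind underlying the reported odds.
# Counts are hypothetical placeholders, NOT the NIS/KID data.
import numpy as np
from scipy import stats

# Rows: CHD present / absent; columns: small intestinal atresia present / absent
table = np.array([[2500, 34200],
                  [1300, 43561]])

odds_ratio, p_value = stats.fisher_exact(table)
print(f"OR = {odds_ratio:.2f}, p = {p_value:.3g}")  # OR > 1 with p < 0.05 mirrors the reported pattern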


Subject(s)
Down Syndrome , Gastrointestinal Tract , Heart Defects, Congenital , Humans , Down Syndrome/complications , Female , Male , Infant, Newborn , Gastrointestinal Tract/abnormalities , Infant , Incidence , Hirschsprung Disease/complications , Intestinal Atresia
5.
Psychiatry Res ; 342: 116214, 2024 Sep 23.
Article in English | MEDLINE | ID: mdl-39368239

ABSTRACT

INTRODUCTION: Research has established that adverse childhood experiences (ACEs) confer risk for psychiatric diagnoses, and that protective factors moderate this association. Investigation into the effect of protective factors on the relationship between ACEs and internalizing disorders (e.g., depression, anxiety) is limited in high-risk groups. The present study investigated the relationship between ACEs and risk for internalizing disorders in youth at clinical high risk for psychosis (CHR-P) and tested the hypothesis that protective factors moderate this relationship. METHODS: 688 participants aged 12-30 (M = 18; SD = 4.05) meeting criteria for CHR-P were administered measures of child adversity, protective factors (SAVRY), and diagnostic assessment (SCID-5). Logistic regression tested whether ACEs predicted internalizing disorders. Moderation regression analyses determined whether these associations were weaker in the presence of protective factors. RESULTS & CONCLUSIONS: Higher levels of ACEs predicted history of depressive disorder (β = 0.26(1.30), p < .001), self-harm/suicide attempts (β = 0.34(1.40), p < .001), and substance use (β = 0.14(1.15), p = .04). Childhood sexual abuse (β = 0.77(2.15), p = .001), emotional neglect (β = 0.38(1.46), p = .05), and psychological abuse (β = 0.42(1.52), p = .04) predicted self-harm/suicide attempts. Sexual abuse (β = 1.00(2.72), p = .001) and emotional neglect (β = 0.53(1.71), p = .011) were also linked to depressive disorder. There was no association between ACEs and anxiety disorder, and no moderating effect of protective factors on the relationship between ACEs and psychiatric outcomes. These findings add nuance to a growing literature linking ACEs to psychopathology and highlight the importance of investigating the mechanisms that may buffer this relationship.
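
A minimal sketch of the regression-plus-moderation analysis described above, under stated assumptions: column names, synthetic data, and the placeholder effect size are illustrative, not the study's data or code.

# Minimal sketch of the moderation analysis described above.
# Synthetic data and column names are placeholders, NOT the study's data.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 688  # matching the reported sample size
df = pd.DataFrame({
    "aces": rng.integers(0, 10, n),          # adversity score
    "protective": rng.integers(0, 6, n),     # SAVRY-style protective-factor count
})
logit_p = -1.5 + 0.26 * df["aces"]           # placeholder effect loosely echoing beta = 0.26
df["depression"] = rng.binomial(1, 1 / (1 + np.exp(-logit_p)))

main = smf.logit("depression ~ aces", data=df).fit(disp=False)              # ACEs -> outcome
mod = smf.logit("depression ~ aces * protective", data=df).fit(disp=False)  # moderation model
print(mod.params)  # the 'aces:protective' coefficient tests moderation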

6.
Res Pract Thromb Haemost ; 8(6): 102549, 2024 Aug.
Article in English | MEDLINE | ID: mdl-39403187

ABSTRACT

Background: Venous thromboembolism (VTE) is the third leading cause of preventable hospital-associated (HA) death. Most HA-VTE, including fatal pulmonary emboli, occur among medically ill patients. The rate of symptomatic VTE more than doubles over the first 21 days after hospital discharge. Trials have demonstrated that the burden of HA-VTE may be reduced with postdischarge thromboprophylaxis; however, few patients receive this therapy. We previously validated the ability of eVTE (a risk assessment tool comprising 2 calculations, one predicting 90-day VTE and the other 30-day major bleeding, derived solely from elements of the complete blood count, the basic metabolic panel, and age) to identify medical patients being discharged with both an elevated risk of VTE and a low risk of bleeding. Objectives: To implement a cluster-randomized, stepped wedge, type II hybrid implementation/effectiveness trial generating an alert among select at-risk patients upon discharge for implementation of thrombosis chemoprophylaxis in a 23-hospital not-for-profit healthcare system. Methods: We use the Reach, Effectiveness, Adoption, Implementation, and Maintenance framework to guide implementation and outcomes reporting. Results: The primary outcome for aim 1 (implementation) is the prescription of rivaroxaban 10 mg daily for 30 days as postdischarge thromboprophylaxis among at-risk patients. The primary efficacy and safety outcomes (effectiveness) are the 90-day composite of symptomatic VTE, myocardial infarction, nonhemorrhagic stroke, and all-cause mortality, and 30-day major bleeding. Conclusion: The eVTE trial will provide high-quality, real-world evidence on the effectiveness and safety of a pragmatic intervention to implement targeted postdischarge thromboprophylaxis using decision support embedded in the electronic health record.

7.
Sci Data ; 11(1): 1121, 2024 Oct 11.
Article in English | MEDLINE | ID: mdl-39394238

ABSTRACT

As renewable energy continues its rapid expansion in the United States, multi-decadal hourly datasets of electricity production are needed to assess the reliability and resource adequacy of power grids. Recent years have seen the release of grid-cell-level simulated meteorological variables; however, these are not extended to the power domain, are not developed from a dynamically consistent numerical weather model, and only cover a historical baseline of less than a decade. To fill this gap, this work provides a dataset of 43 years of coincident plant-level wind and solar power production data. The dataset is designed to be aggregated to appropriate scales of interest for bulk system studies, such as Balancing Authorities (BAs), states, and nodes of a production cost model. The dataset covers every plant in the contiguous U.S. that is reported in the U.S. Energy Information Administration (EIA) Form 860 as of 2020. When compared with the EIA-923 monthly generation, we find minimal bias (less than 5%). When compared with BA-reported hourly generation, we find low bias in solar (less than 7%) and slight underdispersion in wind. This coincident multi-decadal historical dataset provides a documented and evaluated multi-resource baseline for studies on reliability, resource adequacy, climate change impacts, and the characterization of emergent climate threats to renewable resources.
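
A hedged sketch of the monthly-bias validation described above: aggregate the hourly plant-level generation to monthly totals and compute percent bias against EIA-923. The column names ('plant_id', 'timestamp', 'mwh', 'month') are assumptions, not the dataset's documented schema.

# Percent bias of simulated generation vs. EIA-923 monthly totals (sketch).
import pandas as pd

def monthly_percent_bias(hourly: pd.DataFrame, eia923: pd.DataFrame) -> pd.Series:
    """hourly: ['plant_id', 'timestamp', 'mwh']; eia923: ['plant_id', 'month' (pandas Period), 'mwh']."""
    sim = (hourly
           .assign(month=hourly["timestamp"].dt.to_period("M"))
           .groupby(["plant_id", "month"])["mwh"].sum()
           .rename("sim_mwh"))
    obs = eia923.set_index(["plant_id", "month"])["mwh"].rename("obs_mwh")
    merged = pd.concat([sim, obs], axis=1).dropna()  # align plant-months present in both
    return 100 * (merged["sim_mwh"] - merged["obs_mwh"]) / merged["obs_mwh"]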

8.
ATS Sch ; 5(3): 392-407, 2024 Sep 30.
Article in English | MEDLINE | ID: mdl-39371235

ABSTRACT

Background: End-of-life communication skills are vital to high-quality critical care. Patients and families often report deficiencies in end-of-life communication by providers. However, formalized training is difficult to implement and study on a large scale. Furthermore, curricula are often designed with early-stage clinical trainees in mind and are not tailored to advanced clinician learners. Objective: The goal of this pilot study was to explore educational and practical implications of using Multiple Goals Theory (MGT), Communication Quality Analysis (CQA), and communication logs as a three-pronged, reflective communication curriculum for advanced trainees. Methods: We describe the design and qualitative evaluation of a novel, pilot, longitudinal curricular intervention for pulmonary and critical care fellows and program directors at a tertiary academic medical center. The 2-year longitudinal communication curriculum incorporates 1) a theoretical framework from communication science (MGT), with 2) a novel training modality of analyzing audio-recorded intensive care unit family meetings (CQA), and 3) written communication logs after an intensive care unit family meeting. Results: The sample included 13 pulmonary and critical care medicine fellows and two program directors. Qualitative thematic analysis was conducted on seven fellow interviews and on the 23 completed communication logs. Four themes emerged from interviews: 1) fellows incorporated the skills into real-life practice and found the curriculum useful and valuable; 2) a key takeaway from MGT was the deemphasis of task goals; 3) CQA was an engaging opportunity for self-reflection and learning; and 4) written communication logs were perceived as helpful in theory but too burdensome in practice. Analysis of the communication logs found that most fellows' writing was brief and lacked substantial reflection. Conclusion: Many scholars have argued that communication theory can impact practice, but few have recognized the potential of theory and methods, such as MGT and CQA, as educational tools. Our findings demonstrate that MGT is a feasible and useful theoretical framework for improving communication skills among advanced trainees, and that CQA fosters meaningful self-reflection about practice. Communication logs were not feasible or useful training tools in this context, but CQA workshops helped fulfill the goals of narrative reflection. Next steps are to implement this curriculum in more programs and measure changes in behavior acquisition and clinical care.

9.
Article in English | MEDLINE | ID: mdl-39400905

ABSTRACT

PURPOSE: This study measured associations between ON and OFF functional indicators and structural optical coherence tomography (OCT) and OCT angiography (OCTA) markers in diabetic retinal disease. METHODS: Fifty-four participants with type 1 or type 2 diabetes (mean age = 34.1 years; range 18-60) and 48 age-matched controls (mean age = 35.4 years; range 18-59) underwent visual psychophysical testing and OCT and OCTA retinal imaging. Psychophysical tasks measuring (A) contrast increment and decrement sensitivity and (B) response times to increment and decrement targets served as surrogate measures of ON and OFF retinal ganglion cell function. RESULTS: The group with diabetes had worse foveal contrast increment and decrement thresholds (p = 0.04) and was slower to search for increment and decrement targets relative to controls (p = 0.009). Individuals with diabetes had a less circular foveal avascular zone (FAZ) (p < 0.001) but did not differ from controls in foveal vessel density and FAZ area. Functional and structural outcome measures related to the peripheral retina were also comparable between those with and without diabetes. Functional responses to increments and decrements were not significantly correlated with FAZ circularity or vessel density in individuals with diabetes. CONCLUSIONS: Diabetic retinal disease results in impaired performance on measures of inferred ON and OFF pathway function in addition to vascular deficits measurable with OCTA. Future longitudinal studies may determine the temporal relationship between these deficits and whether they predict future diabetic retinopathy.

10.
Invest Ophthalmol Vis Sci ; 65(12): 23, 2024 Oct 01.
Article in English | MEDLINE | ID: mdl-39412817

ABSTRACT

Purpose: Palinopsia (persistent afterimages and/or trailing) is a common but poorly understood symptom of the neurological condition visual snow syndrome. This study aimed to collect a phenotypical description of palinopsia in visual snow syndrome and probe for abnormalities in temporal visual processing, hypothesizing that palinopsia could arise from increased visibility of normal afterimage signals or prolonged visible persistence. Methods: Thirty controls and 31 participants with visual snow syndrome (18 with migraine) took part. Participants completed a palinopsia symptom questionnaire. Contrast detection thresholds were measured before and after brief exposure to a spatial grating because deficient contrast adaptation could increase afterimage visibility. Temporal integration and segregation were assessed using missing-element and odd-element tasks, respectively, because prolonged persistence would promote integration at wide temporal offsets. To distinguish the effects of visual snow syndrome from comorbid migraine, 25 people with migraine alone participated in an additional experiment. Results: Palinopsia was common in visual snow syndrome, typically presenting as unformed images that were frequently noticed. Contrary to our hypotheses, we found neither reduced contrast adaptation (F(3.22, 190.21) = 0.71, P = 0.56) nor significantly prolonged temporal integration thresholds (F(1, 59) = 2.35, P = 0.13) in visual snow syndrome. Instead, participants with visual snow syndrome could segregate stimuli in closer succession than controls (F(1, 59) = 4.62, P = 0.04, ηp² = 0.073) regardless of co-occurring migraine (F(2, 53) = 1.22, P = 0.30). In contrast, individuals with migraine alone exhibited impaired integration (F(2, 53) = 4.44, P = 0.017, ηp² = 0.14). Conclusions: Although neither deficient contrast adaptation nor prolonged visible persistence explains palinopsia, temporal resolution of spatial cues is enhanced and potentially more flexible in visual snow syndrome.
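
As a consistency check on the effect sizes above, partial eta squared can be recovered from an F statistic and its degrees of freedom via the standard conversion (stated here as an assumption about the reported ANOVA):

\eta_p^2 = \frac{F \cdot df_1}{F \cdot df_1 + df_2} = \frac{4.62 \times 1}{4.62 \times 1 + 59} \approx 0.073

which matches the reported segregation effect; the migraine-alone integration effect likewise gives (4.44 × 2) / (4.44 × 2 + 53) ≈ 0.14.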


Subject(s)
Contrast Sensitivity , Phenotype , Humans , Female , Male , Adult , Young Adult , Contrast Sensitivity/physiology , Afterimage/physiology , Vision Disorders/physiopathology , Middle Aged , Syndrome , Sensory Thresholds/physiology , Surveys and Questionnaires , Adolescent , Visual Acuity/physiology , Migraine Disorders/physiopathology , Migraine Disorders/complications , Perceptual Disorders
11.
Br J Anaesth ; 2024 Oct 09.
Article in English | MEDLINE | ID: mdl-39389834

ABSTRACT

BACKGROUND: The accurate diagnosis of heart failure (HF) before major noncardiac surgery is frequently challenging. The impact of diagnostic accuracy for HF on intraoperative practice patterns and clinical outcomes remains unknown. METHODS: We performed an observational study of adult patients undergoing major noncardiac surgery at an academic hospital from 2015 to 2019. A preoperative clinical diagnosis of HF was defined by keywords in the preoperative assessment or a diagnosis code. Medical records of patients with and without HF clinical diagnoses were reviewed by a multispecialty panel of physician experts to develop an adjudicated HF reference standard. The exposure of interest was an adjudicated diagnosis of heart failure. The primary outcome was the volume of intraoperative fluid administered. The secondary outcome was postoperative acute kidney injury (AKI). RESULTS: From 40,659 surgeries, a stratified subsample of 1018 patients was reviewed by a physician panel. Among patients with adjudicated diagnoses of HF, those without a clinical diagnosis (false negatives) more commonly had preserved left ventricular ejection fractions and fewer comorbidities. Compared with false negatives, an accurate diagnosis of HF (true positives) was associated with 470 ml (95% confidence interval: 120-830; P=0.009) lower intraoperative fluid administration and a lower risk of AKI (adjusted odds ratio: 0.39; 95% confidence interval: 0.18-0.89). For patients without an adjudicated diagnosis of HF, diagnostic accuracy was not associated with differences in either fluid administered or AKI. CONCLUSIONS: An accurate preoperative diagnosis of heart failure before noncardiac surgery is associated with reduced intraoperative fluid administration and less acute kidney injury. Targeted efforts to improve preoperative diagnostic accuracy for heart failure may improve perioperative outcomes.

12.
Front Plant Sci ; 15: 1440885, 2024.
Article in English | MEDLINE | ID: mdl-39328792

ABSTRACT

Plant cell walls (PCWs) are intricate structures with complex polysaccharides delivered by distinct trafficking routes. Unravelling the intricate trafficking pathways of polysaccharides and proteins involved in PCW biosynthesis is a crucial first step towards understanding the complexities of plant growth and development. This study investigated the feasibility of employing a multi-modal approach that combines transmission electron microscopy (TEM) with molecular-genetic tagging and antibody labelling techniques to differentiate these pathways at the nanoscale. The genetically encoded electron microscopy (EM) tag APEX2 was fused to Arabidopsis thaliana cellulose synthase 6 (AtCESA6) and Nicotiana alata ARABINAN DEFICIENT LIKE 1 (NaARADL1), and these were transiently expressed in Nicotiana benthamiana leaves. APEX2 localization was then combined with immunolabeling using pectin-specific antibodies (JIM5 and JIM7). Our results demonstrate distinct trafficking patterns for AtCESA6 and NaARADL1, with AtCESA6 localized primarily to the plasma membrane and vesicles, while NaARADL1 was found in the trans-Golgi network and cytoplasmic vesicles. Pectin epitopes were observed near the plasma membrane, in Golgi-associated vesicles, and in secretory vesicle clusters (SVCs) with both APEX2 constructs. Notably, JIM7 labelling was found in vesicles adjacent to APEX2-AtCESA6 vesicles, suggesting potential co-trafficking. This integrative approach offers a powerful tool for elucidating the dynamic interactions between PCW components at the nanoscale level. The methodology presented here facilitates the precise mapping of protein and polysaccharide trafficking pathways, advancing our understanding of PCW biosynthesis and providing avenues for future research aimed at engineering plant cell walls for various applications.

13.
Photoacoustics ; 40: 100649, 2024 Dec.
Article in English | MEDLINE | ID: mdl-39347465

ABSTRACT

In this study, we demonstrate the potential of the bornite crystal structure (Cu5FeS4) of copper iron sulfide as a second near-infrared (NIR-II) photoacoustic (PA) contrast agent. Bornite exhibits comparable dose-dependent biocompatibility to copper sulfide nanoparticles in a cell viability study with HepG2 cells, while exhibiting a 10-fold increase in PA amplitude. In comparison to other benchmark contrast agents at similar mass concentrations, bornite demonstrated a 10× increase in PA amplitude compared to indocyanine green (ICG) and a 5× increase compared to gold nanorods (AuNRs). PA signal was detectable with a light pathlength greater than 5 cm in porcine tissue phantoms at bornite concentrations where in vitro cell viability was maintained. In vivo imaging of mice vasculature resulted in a 2× increase in PA amplitude compared to AuNRs. In summary, bornite is a promising NIR-II contrast agent for deep tissue PA imaging.

14.
PLoS One ; 19(9): e0308964, 2024.
Article in English | MEDLINE | ID: mdl-39331590

ABSTRACT

Understanding the association between initial experimentation with a tobacco product and subsequent patterns of tobacco use among youth is important to informing prevention activities for youth in the US. We conducted an online survey from August to October 2017 among youth aged 13-18 years. The current analysis focused on respondents reporting initial experimentation with any tobacco product (n = 2,022). Using multinomial logistic regression, we examined the association between first tobacco product tried (cigarettes; cigars including cigarillos, little cigars, and bidis; electronic nicotine delivery systems (ENDS); smokeless and chewing tobacco; or hookah) with subsequent patterns of tobacco use while adjusting for covariates. Of the youth who experimented, 56.8% were non-current tobacco users. Of current tobacco users (n = 934), 13% were exclusive ENDS users, 5.3% exclusive combustible mono-users, 13.4% ENDS plus combustible poly-users, 3.3% combustible product only poly-users, and 8.2% other tobacco poly-users. The most common type of first tobacco product tried was ENDS (44.7%), followed by cigarettes (35.0%) and cigars (8.6%). Those who experimented with combustible tobacco products were less likely to be exclusive ENDS users [Relative Risk Ratio (RRR) = 0.46; 95% CI = 0.28, 0.73 for cigarettes; RRR = 0.32; 95% CI = 0.13, 0.81 for cigars; and RRR = 0.33; 95% CI = 0.14, 0.79 for hookah] when compared to non-current tobacco users (reference group). Tobacco product choices for initial experimentation appear to play a role in subsequent tobacco use patterns among youth. Understanding the reasons behind initial product choice may inform our understanding regarding the reasons for subsequent current tobacco product use, thus informing youth prevention efforts.
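
A minimal sketch of the multinomial logistic regression behind the reported relative risk ratios, with non-current use as the reference outcome and ENDS as the reference first product. The synthetic data and column names are placeholders, not the survey's variables; exponentiated coefficients give the RRRs.

# Rough sketch of the multinomial logistic regression behind the reported RRRs.
# Synthetic data and column names are placeholders, NOT the survey's variables.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n = 2022  # size of the analytic sample described above
df = pd.DataFrame({
    "use_pattern": rng.choice(["non-current", "ENDS-only", "combustible-only"], n),
    "first_product": rng.choice(["ENDS", "cigarettes", "cigars", "hookah"], n),
})
# Code the outcome so 'non-current' (code 0) is the reference category
df["y"] = pd.Categorical(df["use_pattern"],
                         categories=["non-current", "ENDS-only", "combustible-only"]).codes

fit = smf.mnlogit("y ~ C(first_product, Treatment('ENDS'))", data=df).fit(disp=False)
print(np.exp(fit.params))  # exponentiated coefficients = relative risk ratios (RRRs)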


Subject(s)
Tobacco Products , Tobacco Use , Humans , Adolescent , Male , Female , United States/epidemiology , Tobacco Use/epidemiology , Tobacco Products/statistics & numerical data , Electronic Nicotine Delivery Systems/statistics & numerical data , Surveys and Questionnaires , Tobacco, Smokeless/statistics & numerical data
15.
Nicotine Tob Res ; 2024 Sep 05.
Article in English | MEDLINE | ID: mdl-39234858

ABSTRACT

INTRODUCTION: The use of cigars for blunts (i.e., cannabis rolled in cigar paper) is well-documented; prevalence of cigar and blunt use and associated characteristics are less studied. METHODS: Pooled data from the 2015-2019 National Survey on Drug Use and Health (NSDUH) were analyzed in 2023. Respondents aged 12+ who reported past 30-day cigar use were categorized into three mutually exclusive use categories: (1) exclusively cigars, (2) exclusively blunts, and (3) both cigars and blunts. We examined associations between cigar-blunt use categories and sociodemographic characteristics. RESULTS: Among respondents aged 12+ who reported past 30-day cigar use, 48.6% (95% CI=47.6-49.6) reported exclusive cigar use; 44.3% (95% CI=43.3-45.3) reported exclusive blunt use; and 7.2% (95% CI=6.8-7.6) reported cigars and blunts. The prevalence differed by age, with exclusive blunt use most prevalent among youth (72.5% [95% CI=70.7-74.3]) and young adults (62.4% [95% CI=61.4-63.5]), and exclusive cigar use most prevalent among adults 26+ (61.2% [95% CI=59.8-62.5]). Exclusive blunt users smoked more days in the past month (17.5; 95% CI=16.8-18.2), compared to 13.8 days (95% CI=13.2-14.4) for cigar and blunt users, and 7.7 days (95% CI=7.5-8.0) for exclusive cigar users. There were significant differences in sociodemographic characteristics, with female (41.6%; 95% CI=40.3-42.9) and Hispanic (18.2%; 95% CI=17.3-19.2) participants more likely to report exclusive blunt use. CONCLUSIONS: Exclusive blunt use was the most prevalent pattern of past 30-day cigar use among youth and young adults. Those who use cigars as blunts smoked more cigars per month, suggesting this may be an important group for education and policy efforts. IMPLICATIONS: Studies that aggregate cigars and blunts into one group may limit potentially meaningful subgroup risk profiles. Additionally, when assessing cigar use, particularly among youth and young adults, it is important to consider blunt use to avoid missing youth who exclusively use cigars for blunts and may not consider blunts as cigar products. Accurate measurement may better inform tobacco and cannabis regulatory actions. Finally, given the high prevalence of blunt use among youth and young adults identified in the present study, additional education efforts may be warranted for this population to reduce long-term risks.
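
A minimal sketch of a survey-weighted prevalence estimate of the kind reported above, assuming hypothetical columns 'category' and 'analytic_weight'. The confidence intervals in the abstract additionally require design-based variance estimation using the survey's strata and PSU variables, which is not shown here.

# Survey-weighted prevalence by use category (sketch; hypothetical columns, NOT NSDUH variables).
import pandas as pd

def weighted_prevalence(df: pd.DataFrame) -> pd.Series:
    """Percent of the weighted population in each mutually exclusive use category."""
    w = df.groupby("category")["analytic_weight"].sum()
    return 100 * w / w.sum()

demo = pd.DataFrame({"category": ["cigar-only", "blunt-only", "both", "blunt-only"],
                     "analytic_weight": [1200.0, 950.0, 150.0, 800.0]})
print(weighted_prevalence(demo))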

16.
Invest Ophthalmol Vis Sci ; 65(11): 44, 2024 Sep 03.
Article in English | MEDLINE | ID: mdl-39330986

ABSTRACT

Purpose: During the non-attack period, people with migraine may show retinal dysfunction. This study builds on previous work by exploring the possibility of foveal and non-foveal visual field and electroretinographic deficits and determining the overlap in eccentricity of such localized visual deficits in people with migraine. Methods: Visual fields and multifocal electroretinography (mf-ERG) were tested in 27 people with migraine (aged 19-45 years) and 18 non-headache controls (aged 20-46 years). Data were averaged according to 5 concentric rings at < 1.5 degrees (foveal) and 5 degrees, 10 degrees, 15.5 degrees, and 22 degrees eccentricities (non-foveal). Linear mixed effects modelling was used to predict mf-ERG amplitude, mf-ERG peak time, and visual field sensitivity with fixed effects of eye, group, and eccentricity. Results: Foveal mf-ERG responses and visual field sensitivity across all eccentricities (foveal and non-foveal) were similar between the migraine and control groups (P > 0.05). In contrast, the non-foveal mf-ERG was reduced in amplitude in people with migraine relative to controls (P < 0.001), and this group difference depended on eccentricity (P < 0.001), most prominently in the parafoveal region (ring 2, P = 0.001). Conclusions: Retinal electrophysiological deficits were observed in people with migraine in the parafoveal region (between 1.5 degrees and 5 degrees eccentricity), without corresponding visual field deficits. This suggests a spatially localized area of retinal neuronal dysfunction in people with migraine that is insufficient to manifest as a visual field sensitivity loss using standard perimetric methods. Our study highlights the added confound of migraine when conducting standard clinical retinal electrophysiological tests for conditions such as glaucoma, particularly non-foveally.
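
A minimal sketch of the linear mixed effects model described above, with fixed effects of eye, group, and eccentricity (ring) and a random intercept per subject. The synthetic data and column names are placeholders, not the study's data.

# Linear mixed effects model sketch; synthetic placeholder data, NOT the study's data.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(2)
rows = []
for subj in range(45):                       # 27 migraine + 18 control participants
    grp = "migraine" if subj < 27 else "control"
    for eye in ("L", "R"):
        for ring in range(1, 6):             # 5 concentric eccentricity rings
            amp = 10 - ring - (1.0 if (grp == "migraine" and ring == 2) else 0.0)
            rows.append((subj, grp, eye, ring, amp + rng.normal(0, 0.5)))
df = pd.DataFrame(rows, columns=["subject", "group", "eye", "ring", "amplitude"])

# Fixed effects of eye, group, and eccentricity (ring); random intercept per subject
fit = smf.mixedlm("amplitude ~ group * C(ring) + eye", df, groups=df["subject"]).fit()
print(fit.summary())  # group x ring interaction terms capture eccentricity-dependent deficits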


Subject(s)
Electroretinography , Migraine Disorders , Retina , Visual Fields , Humans , Adult , Electroretinography/methods , Migraine Disorders/physiopathology , Visual Fields/physiology , Male , Middle Aged , Female , Young Adult , Retina/physiopathology , Visual Field Tests/methods , Retinal Diseases/physiopathology , Retinal Diseases/diagnosis , Vision Disorders/physiopathology
17.
Genes (Basel) ; 15(9)2024 Sep 10.
Article in English | MEDLINE | ID: mdl-39336779

ABSTRACT

BACKGROUND/OBJECTIVES: α-1 antitrypsin (AAT) deficiency is an inherited, genetic condition characterized by reduced serum levels of AAT and increased risk of developing emphysema and liver disease. AAT is normally synthesized primarily in the liver, but muscle-targeting with a recombinant adeno-associated virus (rAAV) vector for α-1 antitrypsin (AAT) gene therapy has been used to minimize liver exposure to the virus and hepatotoxicity. Clinical trials of direct intramuscular (IM) administration of rAAV1-hAAT have demonstrated its overall safety and transgene expression for 5 years. However, the failure to reach the therapeutic target level after 100 large-volume (1.5 mL) IM injections of maximally concentrated vector led us to pursue a muscle-targeting approach using isolated limb perfusion. This targets the rAAV to a greater muscle mass and allows for a higher total volume (and thereby a higher dose) than is tolerable by multiple direct IM injections. Limb perfusion has been shown to be feasible in non-human primates using the rAAV1 serotype and a ubiquitous promoter expressing an epitope-tagged AAT matched to the host species. METHODS: In this study, we performed a biodistribution and preclinical safety study in non-human primates with a clinical candidate rAAV1-human AAT (hAAT) vector at doses ranging from 3.0 × 10¹² to 1.3 × 10¹³ vg/kg, bracketing those used in our clinical trials. RESULTS: We found that limb perfusion delivery of rAAV1-hAAT was safe and showed a biodistribution pattern similar to previous studies. However, serum levels of AAT obtained with high-dose limb perfusion still reached only ~50% of the target serum levels. CONCLUSIONS: Our results suggest that clinically effective AAT gene therapy may ultimately require delivery at doses between 3.5 × 10¹³ and 1 × 10¹⁴ vg/kg, which is within the dose range used for approved rAAV gene therapies. Muscle-targeting strategies could be incorporated when delivering systemic administration of high-dose rAAV gene therapies to increase transduction of muscle tissues and reduce the burden on the liver, especially in diseases that can present with hepatotoxicity such as AAT deficiency.


Subject(s)
Dependovirus , Genetic Therapy , Genetic Vectors , alpha 1-Antitrypsin Deficiency , alpha 1-Antitrypsin , Animals , alpha 1-Antitrypsin/genetics , alpha 1-Antitrypsin/administration & dosage , Dependovirus/genetics , Genetic Vectors/administration & dosage , Genetic Vectors/genetics , Genetic Therapy/methods , alpha 1-Antitrypsin Deficiency/therapy , alpha 1-Antitrypsin Deficiency/genetics , Humans , Male , Muscle, Skeletal/metabolism
18.
bioRxiv ; 2024 Sep 21.
Article in English | MEDLINE | ID: mdl-39345544

ABSTRACT

α/β T cells are key players in adaptive immunity. The specificity of T cells is determined by the sequences of the hypervariable T cell receptor (TCR) α and β chains. Although bulk TCR sequencing offers a cost-effective approach for in-depth TCR repertoire profiling, it does not provide chain pairings, which are essential for determining T cell specificity. In contrast, single-cell TCR sequencing technologies produce paired chain data, but are limited in throughput to thousands of cells and are cost-prohibitive for cohort-scale studies. Here, we present TIRTL-seq (Throughput-Intensive Rapid TCR Library sequencing), a novel approach that generates ready-to-sequence TCR libraries from live cells in less than 7 hours. The protocol is optimized for use with non-contact liquid handlers in an automation-friendly 384-well plate format. Reaction volume miniaturization reduces library preparation costs to <$0.50 per well. The core principle of TIRTL-seq is the parallel generation of hundreds of libraries providing multiple biological replicates from a single sample that allows precise inference of both frequencies of individual clones and TCR chain pairings from well-occurrence patterns. We demonstrate scalability of our approach up to 1 million unique paired αβ TCR clonotypes corresponding to over 30 million T cells per sample at a cost of less than $2000. For a sample of 10 million cells the cost is ~$200. We benchmarked TIRTL-seq against state-of-the-art 5'RACE bulk TCR-seq and 10x Genomics Chromium technologies on longitudinal samples. We show that TIRTL-seq is able to quantitatively identify expanding and contracting clonotypes between timepoints while providing accurate TCR chain pairings, including distinct temporal dynamics of SARS-CoV-2-specific and EBV-specific CD8+ T cell responses after infection. While clonal expansion was followed by sharp contraction for SARS-CoV-2-specific TCRs, EBV-specific TCRs remained stable once established. The sequences of both α and β TCR chains are essential for determining T cell specificity. As the field moves towards greater applications in diagnostics and immunotherapy that rely on TCR specificity, we anticipate that our scalable paired TCR sequencing methodology will be instrumental for collecting large paired-chain datasets and ultimately extracting therapeutically relevant information from the TCR repertoire.
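
A conceptual sketch of pairing α and β chains from well-occurrence patterns, the core principle described above. This is an illustration only: the actual TIRTL-seq inference scores occurrence-pattern matches statistically across the 384 wells rather than requiring exact set equality, and all names here are hypothetical.

# Pair chains whose presence/absence patterns across wells coincide (illustration only).
from collections import defaultdict

def pair_by_occurrence(alpha_wells: dict, beta_wells: dict, min_wells: int = 3) -> list:
    """Each dict maps a chain sequence to the frozenset of well IDs where it was detected."""
    by_pattern = defaultdict(list)
    for a, wells in alpha_wells.items():
        if len(wells) >= min_wells:          # very rare chains carry too little pairing signal
            by_pattern[wells].append(a)
    pairs = []
    for b, wells in beta_wells.items():
        for a in by_pattern.get(wells, []):  # identical occurrence pattern suggests a pair
            pairs.append((a, b))
    return pairs

demo_alpha = {"CAVRDNYGQNFVF": frozenset({1, 5, 9})}   # hypothetical alpha CDR3
demo_beta = {"CASSLGQAYEQYF": frozenset({1, 5, 9})}    # hypothetical beta CDR3
print(pair_by_occurrence(demo_alpha, demo_beta))        # [('CAVRDNYGQNFVF', 'CASSLGQAYEQYF')]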

19.
Microbiol Spectr ; : e0424623, 2024 Sep 30.
Article in English | MEDLINE | ID: mdl-39345232

ABSTRACT

Wastewater discharge is a global threat to freshwater resources. Streams, in particular, are receiving waterbodies that are directly impacted chemically and biologically by effluent discharge. However, it is largely unknown how wastewater serves as a subsidy or a stressor to aquatic biodiversity, particularly microbiota, over space. Nutrient-diffusing substrata (NDS), which release nutrients through diffusion into the water column, were deployed in a wastewater-dependent stream across three reaches. We used N, P, and N + P treatments to measure single-nutrient and co-nutrient limitation, along with a no-nutrient control. Both algal and total biofilm biomass were measured, and targeted amplicon sequencing of 16S ribosomal RNA genes was used to assess bacterial/archaeal community diversity. Data indicated that total organic matter in biofilms differs spatially, with the greatest organic matter (OM) concentrations in the confluence downstream of wastewater inputs. Biofilm OM concentrations were greatest in P and N + P treatments in the confluence site relative to control or N-only treatments. This indicates that heterotrophic microbial communities, likely the bacteria that dominate stream biofilms, are P-limited in this ecosystem even with upstream wastewater inputs. In conjunction, bacterial/archaeal communities differed more among nutrient treatments than across space and had several indicator taxa belonging to Flavobacterium spp. in N treatments relative to controls. Collectively with historical water quality data, we conclude that this wastewater-fed stream is primarily N-enriched but potentially P-limited, which results in significant shifts in biofilm bacterial communities and likely their overall biomass in this urban watershed. IMPORTANCE: Streams in arid and semi-arid biomes often depend on flow from municipal sources, such as wastewater effluent. However, wastewater has been shown to contain high concentrations of nutrients and chemical pollutants that can potentially harm aquatic ecosystems and their biota. Understanding whether, and which, microorganisms respond to pollution sources, specifically effluent from wastewater treatment facilities, in regions where flow is predominantly from treatment facilities, is critical for developing a predictive monitoring approach for eutrophication or other states of ecological degradation in freshwaters.

20.
medRxiv ; 2024 Sep 12.
Article in English | MEDLINE | ID: mdl-39314951

ABSTRACT

Objective: Antidepressants are commonly prescribed medications in the United States; however, factors underlying response are poorly understood. Electronic health records (EHRs) provide a cost-effective way to create and test response algorithms on large, longitudinal cohorts. We describe a new antidepressant response algorithm, its validation in two independent EHR databases, and genetic associations with antidepressant response. Method: We deployed the algorithm in EHRs at Vanderbilt University Medical Center (VUMC), the All of Us Research Program, and the Mass General Brigham Healthcare System (MGB) and validated response outcomes with patient health questionnaire (PHQ) scores. In a meta-analysis across all sites, worse antidepressant response was associated with higher PHQ-8 scores (beta = 0.20, p-value = 1.09 × 10⁻¹⁸). Results: We used polygenic scores to investigate the relationship between genetic liability for psychiatric disorders and response to the first antidepressant trial across VUMC and MGB. After controlling for depression diagnosis, higher polygenic scores for depression, schizophrenia, bipolar disorder, and cross-disorders were associated with poorer response to the first antidepressant trial (depression: p-value = 2.84 × 10⁻⁸, OR = 1.07; schizophrenia: p-value = 5.93 × 10⁻⁴, OR = 1.05; bipolar disorder: p-value = 1.99 × 10⁻³, OR = 1.04; cross-disorders: p-value = 1.03 × 10⁻³, OR = 1.05). Conclusions: Overall, we demonstrate that our antidepressant response algorithm can be deployed across multiple EHR systems to increase the sample size of genetic and epidemiologic studies of antidepressant response.
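
A hedged sketch of the polygenic score association described above: logistic regression of poor response on a standardized polygenic score, controlling for depression diagnosis. Columns, synthetic data, and effect sizes are placeholders, not the authors' pipeline.

# Polygenic score association sketch; placeholder data, NOT the authors' pipeline.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(3)
n = 5000
df = pd.DataFrame({"prs_depression": rng.normal(size=n),   # standardized polygenic score
                   "dep_dx": rng.binomial(1, 0.6, n)})     # depression diagnosis covariate
p = 1 / (1 + np.exp(-(-0.5 + 0.07 * df["prs_depression"] + 0.2 * df["dep_dx"])))
df["poor_response"] = rng.binomial(1, p)

fit = smf.logit("poor_response ~ prs_depression + dep_dx", data=df).fit(disp=False)
print(np.exp(fit.params["prs_depression"]))  # OR per SD of polygenic score (cf. OR = 1.07 above)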
