Results 1 - 20 of 1,011
1.
Circ Heart Fail ; : e011741, 2024 Aug 01.
Article in English | MEDLINE | ID: mdl-39087365

ABSTRACT

BACKGROUND: More women of childbearing age are surviving after heart transplantation (HT), many of whom have a desire to become pregnant. Limited data exist evaluating patients' perspectives, receipt of counseling, and knowledge surrounding contraception, pregnancy, breastfeeding, and medication safety after HT. METHODS: We conducted a voluntary, confidential, web-based cross-sectional survey of women who were of childbearing age (defined as 18-45 years) at the time of HT. Transplants occurred between January 2005 and January 2020. Surveys were conducted across 6 high-volume HT centers in the United States. RESULTS: There were 64 responses from women who were of childbearing age at the time of HT. Twenty-five women (39.1%) were pregnant before HT, and 6 women (9.4%) reported at least 1 pregnancy post-transplant. Fifty-three percent (n=34) reported that they did not receive enough information on post-HT pregnancy before listing for HT, and 26% (n=16) did not discuss their ability to become pregnant with their care team before proceeding with HT. Following HT, 44% (n=28) still felt that they had not received enough information regarding pregnancy. The majority of women (n=49, 77%) had discussed contraception to prevent unplanned pregnancy with their transplant team. Twenty percent (n=13) reported that, based on the information they had received from their transplant providers, pregnancy was never safe after transplantation. CONCLUSIONS: Many women feel they are not receiving adequate counseling with regard to posttransplant reproductive health. This survey highlights an opportunity to improve both provider education and patient communication to better support women with HT who desire posttransplant pregnancy.

2.
Indian J Microbiol ; 64(2): 593-602, 2024 Jun.
Article in English | MEDLINE | ID: mdl-39011007

ABSTRACT

Seaweed, a valuable marine resource widely cultivated worldwide, can be vulnerable to stress and microbiome alterations, resulting in the decay of seaweeds and substantial economic losses. To investigate the seaweed-microbiome interaction, our study aimed to isolate marine bacteria and fungi that can cause Ice-Ice disease and to evaluate their enzymatic characteristics for potential application in bioethanol production from seaweed biomass. Three red seaweed species (Gracilaria edulis, Kappaphycus alvarezii, and Eucheuma cottonii) were obtained for our study and placed in separate culture tanks. Among the 18 isolated marine microbial species, 12 tested positive for agarase or carrageenase activity: six exhibited both activities, three displayed only agarase activity, and three only carrageenase activity. DNA sequencing of the positive microbes identified ten bacteria and two yeast species. The 3,5-dinitrosalicylic acid (DNSA) assay results revealed that the identified bacterium Caldibacillus kokeshiiformis strain FJAT-47861 exhibited the highest carrageenase activity (0.76 units/ml), while the yeast Pichia fermentans strain PM79 demonstrated the highest agarase activity (0.52 units/ml). Notably, Pichia fermentans strain PM79 exhibited the highest combined agarase and carrageenase activity, averaging 0.63 units/ml. The average carrageenase activity of the six dual-positive microbes was 1.5 times higher than their agarase activity. These findings suggest that the 12 isolated microbes hold potential for bioethanol production from macroalgae: the same agarase and carrageenase activities that underlie Ice-Ice disease indicate an ability to break down seaweed cell wall carbohydrates. Moreover, these results provide exciting prospects for harnessing the bioconversion capabilities of these microbes, paving the way for sustainable and efficient bioethanol production from seaweed resources.
Supplementary Information: The online version contains supplementary material available at 10.1007/s12088-024-01205-w.

3.
Article in English | MEDLINE | ID: mdl-38950666

ABSTRACT

BACKGROUND: Prior studies have shown reduced development of cardiac allograft vasculopathy (CAV) in multiorgan transplant recipients. The aim of this study was to compare the incidence of CAV between isolated heart transplants and simultaneous multiorgan heart transplants in the contemporary era. METHODS: We utilized the Scientific Registry of Transplant Recipients to perform a retrospective analysis of first-time adult heart transplant recipients between January 1, 2010 and December 31, 2019 in the United States. The primary end-point was the development of angiographic CAV within 5 years of follow-up. RESULTS: Among 20,591 patients included in the analysis, 1,279 (6%) underwent multiorgan heart transplantation (70% heart-kidney, 16% heart-liver, 13% heart-lung, and 1% triple-organ), and 19,312 (94%) were isolated heart transplant recipients. The average age was 53 years, and 74% were male. There were no significant between-group differences in cold ischemic time. The incidence of acute rejection during the first year after transplant was significantly lower in the multiorgan group (18% vs 33%, p < 0.01). The 5-year incidence of CAV was 33% in the isolated heart group and 27% in the multiorgan group (p < 0.0001); differences in CAV incidence were seen as early as 1 year after transplant and persisted over time. In multivariable analysis, multiorgan heart transplant recipients had a significantly lower likelihood of CAV at 5 years (hazard ratio = 0.76, 95% confidence interval: 0.66-0.88, p < 0.01). CONCLUSIONS: Simultaneous multiorgan heart transplantation is associated with a significantly lower long-term risk of angiographic CAV compared with isolated heart transplantation in the contemporary era.

4.
Cureus ; 16(6): e62109, 2024 Jun.
Article in English | MEDLINE | ID: mdl-38993438

ABSTRACT

Introduction The basic principle of a sensory adaptive dental environment is that an individual's sensory experiences have a significant impact on their emotional and psychological well-being. Taste, smell, touch, hearing, and sight are the five basic senses that affect our perception of and responses to the environment. The study aimed to assess the effectiveness of a Sensory-Adaptive Dental Environment (SADE) compared with a Regular Dental Environment (RDE) in reducing anxiety, improving behavior, and providing a smooth experience for children undergoing dental treatment. Materials and methods This parallel-arm pilot study was conducted at the outpatient Department of Pediatric and Preventive Dentistry, Saveetha Dental College and Hospitals, Chennai, from January 2024 to March 2024. A total of 148 children who met the inclusion criteria were divided into two groups: Group I (intervention group) received the SADE or MSE (Multi-Sensory Environment) intervention, while Group II (control group) underwent dental treatments in a Regular Dental Environment (RDE). Patient behavior was assessed using Frankl's behavior rating scale, and anxiety levels were measured using Ayesha's Oddbodd anxiety scale. Additionally, heart rate and oxygen saturation (SpO2) were evaluated using a pulse oximeter. Statistical analysis was conducted using IBM SPSS Statistics for Windows, Version 26.0 (IBM Corp., Armonk, NY), with significance set at a p-value less than 0.05. Results Before the procedure, there were no notable differences in behavior or anxiety levels. After the procedure, however, children treated under SADE showed markedly improved behavior and notably lower anxiety levels, the latter also reflected in lower heart rates and higher oxygen saturation levels. Conclusion The study concluded that there were notable differences in patient experiences between SADE and RDE.
After their dental procedures, participants in the SADE group were found to behave better and feel less nervous. In the conventional setting, by contrast, only improved behavior was noted, with no significant difference in anxiety levels. Overall, our study suggests that dental offices can significantly enhance patient experiences by providing a sensory-friendly setting that helps children feel more at ease and less nervous during their visits, improving patient outcomes.

5.
Alzheimers Dement ; 2024 Jul 05.
Article in English | MEDLINE | ID: mdl-38967283

ABSTRACT

INTRODUCTION: Microtubule (MT) stability is crucial for proper neuronal function. Understanding MT dysregulation is critical for connecting amyloid beta (Aβ) and tau-based degenerative events and early changes in presymptomatic Alzheimer's disease (AD). Herein we present positron emission tomography (PET) imaging properties of our MT-PET radiotracer, [11C]MPC-6827, in multiple established AD mouse models. METHODS: Longitudinal PET, biodistribution, autoradiography, immunohistochemistry, and behavioral studies were conducted at multiple time points in APPswe/PSEN1dE9 (APP/PS1), P301S-PS19 (P301S), 5xFAD, and age-matched control mice. RESULTS: Longitudinal [11C]MPC-6827 brain imaging showed significant increases in APP/PS1, P301S, and 5xFAD mice compared to controls. Longitudinal MT-PET correlated positively with biodistribution, autoradiography, and immunohistochemistry results and negatively with behavior data. DISCUSSION: Our study demonstrated significant longitudinal [11C]MPC-6827 PET increases in multiple AD mouse models for the first time. Strong correlations between PET and biomarker data underscored the interplay of MT destabilization, amyloid, and tau pathology in AD. These results suggest [11C]MPC-6827 PET as a promising tool for monitoring MT dysregulation early in AD progression. HIGHLIGHTS: Longitudinal positron emission tomography (PET) imaging studies using [11C]MPC-6827 in multiple established Alzheimer's disease (AD) mouse models revealed an early onset of microtubule dysregulation, with significant changes in brain radiotracer uptake evident from 2 to 4 months of age. Intra-group analysis showed a progressive increase in microtubule dysregulation with increasing AD burden, supported by significant correlations between PET imaging data and biodistribution, autoradiography, and molecular pathological markers.
[11C]MPC-6827 PET imaging demonstrated its efficacy in detecting early microtubule alterations preceding observable behavioral changes in AD mouse models, suggesting its potential for early AD imaging. The inclusion of the 5xFAD mouse model further elucidated the impact of amyloid beta (Aβ) toxicity on inducing tau hyperphosphorylation-mediated microtubule dysregulation, highlighting the versatility of [11C]MPC-6827 in delineating various aspects of AD pathology. Our study provides immediate clarity on high uptake of the microtubule-based radiotracer in AD brains in a longitudinal setting, which directly informs clinical utility in Aβ/tau-based studies.

6.
Bioorg Med Chem Lett ; 111: 129906, 2024 Jul 25.
Article in English | MEDLINE | ID: mdl-39059565

ABSTRACT

Despite recent advancements in imaging (amyloid-PET and tau-PET) and fluid (Aβ42/Aβ40 and Aβ42/p-tau) biomarkers, which represent the current standard for in vivo assessment, the diagnosis and prediction of Alzheimer's disease (AD) remain challenging. We demonstrated in nonhuman primates (NHP) that increased plasma and cerebrospinal fluid (CSF) glucose correlated with decreased CSF Aβ42 and CSF Aβ40, a hallmark of plaque-promoting pathogenesis. Together, our findings demonstrate that altered glucose homeostasis and insulin resistance are associated with Aβ and amyloid in rodent and NHP models. This warranted further exploration into the dynamics of altered brain metabolism in the NHP model of T2D, cross-referenced with CSF and blood-based AD markers. Preliminary dual PET ([11C]acetoacetate ([11C]AcAc) and [18F]fluorodeoxyglucose ([18F]FDG)) imaging studies were conducted in an aged cohort of NHPs classified as T2D (n = 5) or pre-diabetic (n = 1), along with corresponding plasma and CSF samples for metabolite analysis. [11C]AcAc and [18F]FDG PET brain standard uptake values (SUV) were strongly positively associated (r = 0.88, p = 0.02) in the T2D and pre-diabetic NHPs. Age was not significantly associated with brain SUV (age range 16.5-23.5 years). Metabolic measures were positively correlated with brain [18F]FDG uptake, and CSF Aβ42:40 was positively correlated with fasting glucose values. Although our findings suggest moderate correlations, this study further elucidates how peripheral insulin resistance and poor glycemic control alter AD-related pathology, illustrating how T2D is a risk factor for AD.
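The tracer association reported above is an ordinary Pearson correlation between per-animal SUVs; a minimal pure-Python sketch (the per-animal values below are entirely hypothetical, since individual measurements are not given in the abstract):

```python
import math

def pearson_r(x, y):
    """Pearson correlation coefficient between two equal-length sequences."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# Hypothetical per-animal brain SUVs for the two tracers (n = 6 NHPs)
acac_suv = [1.1, 0.9, 1.4, 1.2, 1.0, 1.3]
fdg_suv = [2.0, 1.7, 2.6, 2.2, 1.9, 2.4]
r = pearson_r(acac_suv, fdg_suv)
```

The p-value would normally come from a t-distribution on n - 2 degrees of freedom; only the coefficient itself is sketched here.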

7.
Sci Rep ; 14(1): 17148, 2024 Jul 26.
Article in English | MEDLINE | ID: mdl-39060369

ABSTRACT

The Internet of Things (IoT) permeates various sectors, including healthcare, smart cities, and agriculture, alongside critical infrastructure management. However, its susceptibility to malware due to limited processing power and security protocols poses significant challenges. Traditional antimalware solutions fall short in combating evolving threats. To address this, this work developed a feature-selection-based classification model. In the first stage, preprocessing enhances dataset quality through data smoothing and consistency improvement. Feature selection via the Zebra Optimization Algorithm (ZOA) reduces dimensionality, while the classification phase integrates a Graph Attention Network (GAN), specifically the Dual-channel GAN (DGAN). DGAN incorporates Node Attention Networks and Semantic Attention Networks to capture intricate IoT device interactions and detect anomalous behaviors like botnet activity. The model's accuracy is further boosted by leveraging both structural and semantic data, with the Sooty Tern Optimization Algorithm (STOA) used for hyperparameter tuning. The proposed STOA-DGAN model achieves an impressive 99.87% accuracy in botnet activity classification, showcasing robustness and reliability compared with existing approaches.
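The abstract names metaheuristic components (ZOA, STOA) and a dual-channel graph attention classifier whose internals are not given here. Purely as a structural sketch of the wrapper-style "select features, then classify" pipeline it describes, the following stands in plain random search for the metaheuristic and a nearest-centroid classifier for DGAN; all data are hypothetical:

```python
import random

def centroid_accuracy(X, y, mask):
    """Accuracy of a nearest-centroid classifier restricted to the selected features."""
    feats = [i for i, keep in enumerate(mask) if keep]
    if not feats:
        return 0.0
    classes = sorted(set(y))
    cents = {}
    for c in classes:
        rows = [x for x, lab in zip(X, y) if lab == c]
        cents[c] = [sum(r[i] for r in rows) / len(rows) for i in feats]
    correct = 0
    for x, lab in zip(X, y):
        pred = min(classes, key=lambda c: sum((x[i] - m) ** 2 for i, m in zip(feats, cents[c])))
        correct += pred == lab
    return correct / len(y)

def select_features(X, y, n_iters=200, seed=0):
    """Wrapper feature selection: search binary masks, keep the best-scoring one."""
    rng = random.Random(seed)
    n = len(X[0])
    best_mask = [True] * n
    best_acc = centroid_accuracy(X, y, best_mask)
    for _ in range(n_iters):
        mask = [rng.random() < 0.5 for _ in range(n)]
        acc = centroid_accuracy(X, y, mask)
        if acc > best_acc:
            best_mask, best_acc = mask, acc
    return best_mask, best_acc
```

A metaheuristic like ZOA would replace the random mask proposal with population-based updates, but the evaluate-and-keep-the-best loop has the same shape.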

8.
Clin Transplant ; 38(7): e15416, 2024 Jul.
Article in English | MEDLINE | ID: mdl-39058520

ABSTRACT

Cardiac allograft vasculopathy (CAV) is a leading cause of death after heart transplantation (HT). We evaluated donor-derived cell-free DNA (dd-cfDNA) as a noninvasive biomarker of CAV development after HT. The INSPIRE registry at the Intermountain Medical Center was queried for stored plasma samples from HT patients with and without CAV. At Stanford University, HT patients with CAV (cases) and without CAV (controls) were enrolled prospectively, and blood samples were collected. All the samples were analyzed for dd-cfDNA using the AlloSure assay (CareDx, Inc.). CAV was defined per the ISHLT 2010 standardized classification system. Univariate associations between patient demographics and clinical characteristics and their CAV grade were tested using chi-square and Wilcoxon rank sum tests. Associations between dd-cfDNA levels and CAV grades were examined using a nonparametric Kruskal-Wallis test. A total of 69 patients were included, and 101 samples were analyzed for dd-cfDNA. The mean age at sample collection was 58.6 ± 13.7 years; 66.7% of the patients were male, and 81% were White. CAV grades 0, 1, 2, and 3 were present in 37.6%, 22.8%, 22.8%, and 16.8% of included samples, respectively. The overall median dd-cfDNA level was 0.13% (IQR: 0.06%-0.33%). Median dd-cfDNA levels were not significantly different between CAV (-) and CAV (+) samples, at 0.09% (0.05%-0.32%) and 0.15% (0.07%-0.33%), respectively (p = 0.25), with similar results across all CAV grades. In our study, dd-cfDNA levels did not correlate with the presence of CAV and did not differ across CAV grades. As such, dd-cfDNA does not appear to be a reliable noninvasive biomarker for CAV surveillance.
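The across-grade comparison above uses the Kruskal-Wallis rank test. A self-contained sketch of the H statistic (average ranks for ties, standard tie correction), applied to hypothetical dd-cfDNA values grouped by CAV grade:

```python
def kruskal_h(groups):
    """Kruskal-Wallis H statistic over a list of samples, with tie correction."""
    pooled = sorted((v, gi) for gi, g in enumerate(groups) for v in g)
    n = len(pooled)
    ranks = [0.0] * n
    i = 0
    while i < n:  # assign average (1-based) ranks to tied runs
        j = i
        while j < n and pooled[j][0] == pooled[i][0]:
            j += 1
        for k in range(i, j):
            ranks[k] = (i + j + 1) / 2
        i = j
    rank_sums = [0.0] * len(groups)
    for (v, gi), r in zip(pooled, ranks):
        rank_sums[gi] += r
    h = 12 / (n * (n + 1)) * sum(rs ** 2 / len(g) for rs, g in zip(rank_sums, groups)) - 3 * (n + 1)
    t = 0  # tie correction term: sum of (t^3 - t) over tied runs
    i = 0
    while i < n:
        j = i
        while j < n and pooled[j][0] == pooled[i][0]:
            j += 1
        t += (j - i) ** 3 - (j - i)
        i = j
    return h / (1 - t / (n ** 3 - n)) if t else h

# Hypothetical dd-cfDNA % values by CAV grade (not the study's raw data)
h = kruskal_h([[0.09, 0.05, 0.32], [0.15, 0.07, 0.33], [0.20, 0.10, 0.40]])
```

The p-value is then obtained from a chi-square distribution with (number of groups - 1) degrees of freedom; in practice `scipy.stats.kruskal` does all of this in one call.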


Subject(s)
Biomarkers , Cell-Free Nucleic Acids , Heart Transplantation , Tissue Donors , Humans , Male , Female , Heart Transplantation/adverse effects , Cell-Free Nucleic Acids/blood , Middle Aged , Follow-Up Studies , Prospective Studies , Biomarkers/blood , Prognosis , Risk Factors , Allografts , Postoperative Complications/blood , Postoperative Complications/diagnosis , Postoperative Complications/etiology , Case-Control Studies , Graft Rejection/etiology , Graft Rejection/diagnosis , Graft Rejection/blood , Graft Survival , Vascular Diseases/etiology , Vascular Diseases/blood , Adult
11.
Transplant Direct ; 10(7): e1669, 2024 Jul.
Article in English | MEDLINE | ID: mdl-38953039

ABSTRACT

Background: A prior single-center, retrospective cohort study identified baseline lung allograft dysfunction (BLAD) as a risk factor for death in bilateral lung transplant recipients. In this multicenter prospective cohort study, we test the association of BLAD with death in bilateral lung transplant recipients, identify clinical risk factors for BLAD, and assess its association with allograft injury at the molecular level. Methods: This multicenter, prospective cohort study included 173 bilateral lung transplant recipients who underwent serial pulmonary function testing and plasma collection for donor-derived cell-free DNA at prespecified time points. BLAD was defined as failure to achieve ≥80% predicted for both forced expiratory volume in 1 s and forced vital capacity after lung transplant, on 2 consecutive measurements at least 3 mo apart. Results: BLAD was associated with an increased risk of death (hazard ratio, 1.97; 95% confidence interval [CI], 1.05-3.69; P = 0.03), whereas chronic lung allograft dysfunction alone was not (hazard ratio, 1.60; 95% CI, 0.87-2.95; P = 0.13). Recipient obesity (odds ratio, 1.69; 95% CI, 1.15-2.80; P = 0.04) and donor age (odds ratio, 1.03; 95% CI, 1.02-1.05; P = 0.004) increased the risk of developing BLAD. Patients with BLAD did not demonstrate higher log10(donor-derived cell-free DNA) levels compared with patients without BLAD (slope [SE]: -0.0095 [0.0007] versus -0.0109 [0.0007]; P = 0.15). Conclusions: BLAD is associated with an increased risk of death following lung transplantation, representing an important posttransplant outcome with valuable prognostic significance; however, early allograft-specific injury at the molecular level does not increase the risk of BLAD, supporting the need for further mechanistic insight into disease pathophysiology.
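The BLAD definition above is directly computable from serial spirometry. A sketch of one reading of it (assumptions: the 3-month gap is approximated as 90 days, and "failure to achieve ≥80% for both" is interpreted as either measure falling below the threshold):

```python
def has_blad(pfts, threshold=80.0, min_gap_days=90):
    """Classify BLAD from a time-ordered series of PFT results.

    pfts: list of (days_post_transplant, fev1_pct_predicted, fvc_pct_predicted).
    BLAD requires 2 consecutive failing measurements at least min_gap_days apart.
    """
    def fails(measurement):
        _, fev1, fvc = measurement
        # fails to achieve >= threshold %-predicted for BOTH FEV1 and FVC
        return fev1 < threshold or fvc < threshold

    for prev, cur in zip(pfts, pfts[1:]):
        if fails(prev) and fails(cur) and cur[0] - prev[0] >= min_gap_days:
            return True
    return False
```

For example, two sub-threshold measurements 105 days apart would classify as BLAD, while the same values 25 days apart would not yet meet the definition.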

12.
AJNR Am J Neuroradiol ; 45(8): 1116-1123, 2024 Aug 09.
Article in English | MEDLINE | ID: mdl-39054293

ABSTRACT

BACKGROUND AND PURPOSE: During a season of high school football, adolescents with actively developing brains experience a considerable number of head impacts. Our aim was to determine whether repetitive head impacts in the absence of a clinically diagnosed concussion during a season of high school football produce changes in cognitive performance or functional connectivity of the salience network and its central hub, the dorsal anterior cingulate cortex. MATERIALS AND METHODS: Football players were instrumented with the Head Impact Telemetry System during all practices and games, and the helmet sensor data were used to compute a risk-weighted exposure metric (RWEcp), accounting for the cumulative risk during the season. Participants underwent MRI and a cognitive battery (ImPACT) before and shortly after the football season. A control group of noncontact/limited-contact-sport athletes was formed from 2 cohorts: one from the same school and protocol and another from a separate, nearly identical study. RESULTS: Sixty-three football players and 34 control athletes were included in the cognitive performance analysis. Preseason, the control group scored significantly higher on the ImPACT Visual Motor (P = .04) and Reaction Time composites (P = .006). These differences increased postseason (P = .003, P < .001, respectively). Additionally, the control group had significantly higher postseason scores on the Visual Memory composite (P = .001). Compared with controls, football players showed significantly less improvement in the Verbal (P = .04) and Visual Memory composites (P = .01). A significantly greater percentage of contact athletes had lower-than-expected scores on the Verbal Memory (27% versus 6%), Visual Motor (21% versus 3%), and Reaction Time composites (24% versus 6%). Among football players, a higher RWEcp was significantly associated with greater increments in ImPACT Reaction Time (P = .03) and Total Symptom Scores postseason (P = .006). 
Fifty-seven football players and 13 control athletes were included in the imaging analyses. Postseason, football players showed significant decreases in interhemispheric connectivity of the dorsal anterior cingulate cortex (P = .026) and within-network connectivity of the salience network (P = .018). These decreases in dorsal anterior cingulate cortex interhemispheric connectivity and within-network connectivity of the salience network were significantly correlated with deteriorating ImPACT Total Symptom (P = .03) and Verbal Memory scores (P = .04). CONCLUSIONS: Head impact exposure during a single season of high school football is negatively associated with cognitive performance and brain network connectivity. Future studies should further characterize these short-term effects and examine their relationship with long-term sequelae.


Subject(s)
Brain Concussion , Football , Magnetic Resonance Imaging , Humans , Adolescent , Male , Football/injuries , Brain Concussion/diagnostic imaging , Brain Concussion/physiopathology , Gyrus Cinguli/diagnostic imaging , Gyrus Cinguli/physiopathology , Cognition/physiology , Head Protective Devices , Athletic Injuries/diagnostic imaging , Athletic Injuries/physiopathology
13.
BJS Open ; 8(3)2024 May 08.
Article in English | MEDLINE | ID: mdl-38758563

ABSTRACT

BACKGROUND: Breast-conserving surgery with adjuvant radiotherapy and mastectomy are currently offered as equivalent surgical options for early-stage breast cancer based on RCTs from the 1970s and 1980s. However, the treatment of breast cancer has evolved and recent observational studies suggest a survival advantage for breast-conserving surgery with adjuvant radiotherapy. A systematic review and meta-analysis was undertaken to summarize the contemporary evidence regarding survival after breast-conserving surgery with adjuvant radiotherapy versus mastectomy for women with early-stage breast cancer. METHODS: A systematic search of MEDLINE, the Cochrane Central Register of Controlled Trials (CENTRAL), and Embase that identified studies published between 1 January 2000 and 18 December 2023 comparing overall survival after breast-conserving surgery with adjuvant radiotherapy versus mastectomy for patients with unilateral stage 1-3 breast cancer was undertaken. The main exclusion criteria were studies evaluating neoadjuvant chemotherapy, rare breast cancer subtypes, and specific breast cancer populations. The ROBINS-I tool was used to assess risk of bias, with the overall certainty of evidence assessed using the Grading of Recommendations, Assessment, Development, and Evaluation (GRADE) tool. Studies without critical risk of bias were included in a quantitative meta-analysis. RESULTS: From 11 750 abstracts, 108 eligible articles were identified, with one article including two studies; 29 studies were excluded from the meta-analysis due to an overall critical risk of bias, 42 studies were excluded due to overlapping study populations, and three studies were excluded due to reporting incompatible results. A total of 35 observational studies reported survival outcomes for 909 077 patients (362 390 patients undergoing mastectomy and 546 687 patients undergoing breast-conserving surgery with adjuvant radiotherapy). The pooled HR was 0.72 (95% c.i. 
0.68 to 0.75, P < 0.001), demonstrating improved overall survival for patients undergoing breast-conserving surgery with adjuvant radiotherapy. The overall certainty of the evidence was very low. CONCLUSION: This meta-analysis provides evidence suggesting a survival advantage for women undergoing breast-conserving surgery with adjuvant radiotherapy for early-stage breast cancer compared with mastectomy. Although these results should be interpreted with caution, they should be shared with patients to support informed surgical decision-making.
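The pooled HR reported above comes from a meta-analytic combination of study-level estimates. As an illustration of the mechanics only, here is fixed-effect inverse-variance pooling of log hazard ratios (the review may well have used a random-effects model, and the inputs below are hypothetical):

```python
import math

def pooled_hr(study_estimates, z=1.96):
    """Fixed-effect inverse-variance pooling of hazard ratios.

    study_estimates: list of (hr, ci_low, ci_high) tuples with 95% CIs.
    Returns (pooled_hr, pooled_ci_low, pooled_ci_high).
    """
    wsum = wx = 0.0
    for hr, lo, hi in study_estimates:
        se = (math.log(hi) - math.log(lo)) / (2 * z)  # SE recovered from CI width
        w = 1 / se ** 2                               # inverse-variance weight
        wsum += w
        wx += w * math.log(hr)
    m = wx / wsum
    se_pooled = math.sqrt(1 / wsum)
    return (math.exp(m), math.exp(m - z * se_pooled), math.exp(m + z * se_pooled))
```

Pooling happens on the log scale because log HRs are approximately normal; a random-effects model would additionally add a between-study variance term to each weight.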


Subject(s)
Breast Neoplasms , Mastectomy, Segmental , Humans , Radiotherapy, Adjuvant , Female , Breast Neoplasms/radiotherapy , Breast Neoplasms/mortality , Breast Neoplasms/surgery , Breast Neoplasms/pathology , Neoplasm Staging , Mastectomy
15.
Alzheimers Dement ; 20(6): 4159-4173, 2024 06.
Article in English | MEDLINE | ID: mdl-38747525

ABSTRACT

INTRODUCTION: We evaluated associations between plasma and neuroimaging-derived biomarkers of Alzheimer's disease and related dementias and the impact of health-related comorbidities. METHODS: We examined plasma biomarkers (neurofilament light chain, glial fibrillary acidic protein, amyloid beta [Aβ] 42/40, phosphorylated tau 181) and neuroimaging measures of amyloid deposition (Aβ-positron emission tomography [PET]), total brain volume, white matter hyperintensity volume, diffusion-weighted fractional anisotropy, and neurite orientation dispersion and density imaging free water. Participants were adjudicated as cognitively unimpaired (CU; N = 299), mild cognitive impairment (MCI; N = 192), or dementia (DEM; N = 65). Biomarkers were compared across groups stratified by diagnosis, sex, race, and APOE ε4 carrier status. General linear models examined plasma-imaging associations before and after adjusting for demographics (age, sex, race, education), APOE ε4 status, medications, diagnosis, and other factors (estimated glomerular filtration rate [eGFR], body mass index [BMI]). RESULTS: Plasma biomarkers differed across diagnostic groups (DEM > MCI > CU), were altered in Aβ-PET-positive individuals, and were associated with poorer brain health and kidney function. DISCUSSION: eGFR and BMI did not substantially impact associations between plasma and neuroimaging biomarkers. HIGHLIGHTS: Plasma biomarkers differ across diagnostic groups (DEM > MCI > CU) and are altered in Aβ-PET-positive individuals. Altered plasma biomarker levels are associated with poorer brain health and kidney function. Plasma and neuroimaging biomarker associations are largely independent of comorbidities.


Subject(s)
Alzheimer Disease , Amyloid beta-Peptides , Biomarkers , Magnetic Resonance Imaging , Positron-Emission Tomography , Humans , Male , Female , Biomarkers/blood , Aged , Alzheimer Disease/blood , Alzheimer Disease/diagnostic imaging , Amyloid beta-Peptides/blood , Comorbidity , Brain/diagnostic imaging , Brain/pathology , Dementia/blood , Dementia/diagnostic imaging , tau Proteins/blood , Cohort Studies , Independent Living , Cognitive Dysfunction/blood , Cognitive Dysfunction/diagnostic imaging , Middle Aged , Neuroimaging
17.
Nature ; 629(8014): 1082-1090, 2024 May.
Article in English | MEDLINE | ID: mdl-38750354

ABSTRACT

Cell types with specialized functions fundamentally regulate animal behaviour, and yet the genetic mechanisms that underlie the emergence of novel cell types and their consequences for behaviour are not well understood1. Here we show that the monogamous oldfield mouse (Peromyscus polionotus) has recently evolved a novel cell type in the adrenal gland that expresses the enzyme AKR1C18, which converts progesterone into 20α-hydroxyprogesterone. We then demonstrate that 20α-hydroxyprogesterone is more abundant in oldfield mice, where it induces monogamous-typical parental behaviours, than in the closely related promiscuous deer mice (Peromyscus maniculatus). Using quantitative trait locus mapping in a cross between these species, we ultimately find interspecific genetic variation that drives expression of the nuclear protein GADD45A and the glycoprotein tenascin N, which contribute to the emergence and function of this cell type in oldfield mice. Our results provide an example by which the recent evolution of a new cell type in a gland outside the brain contributes to the evolution of social behaviour.


Subject(s)
Adrenal Glands , Biological Evolution , Paternal Behavior , Peromyscus , Animals , Female , Male , 20-alpha-Dihydroprogesterone/metabolism , Adrenal Glands/cytology , Adrenal Glands/enzymology , Adrenal Glands/metabolism , Estradiol Dehydrogenases/genetics , Estradiol Dehydrogenases/metabolism , GADD45 Proteins/genetics , Genetic Variation , Hybridization, Genetic , Peromyscus/classification , Peromyscus/genetics , Peromyscus/physiology , Progesterone/metabolism , Quantitative Trait Loci , Social Behavior , Tenascin/genetics
18.
J Heart Lung Transplant ; 43(9): 1374-1382, 2024 Sep.
Article in English | MEDLINE | ID: mdl-38705500

ABSTRACT

BACKGROUND: Lung transplant recipients are traditionally monitored with pulmonary function testing (PFT) and lung biopsy to detect post-transplant complications and guide treatment. Plasma donor-derived cell-free DNA (dd-cfDNA) is a novel molecular approach to assessing allograft injury, including subclinical allograft dysfunction. The aim of this study was to determine whether episodes of extreme molecular injury (EMI) in lung transplant recipients increase the risk of chronic lung allograft dysfunction (CLAD) or death. METHODS: This multicenter prospective cohort study included 238 lung transplant recipients. Serial plasma samples were collected for dd-cfDNA measurement by shotgun sequencing. EMI was defined as a dd-cfDNA level above the third quartile of levels observed for acute rejection (dd-cfDNA ≥5% occurring after 45 days post-transplant). EMI was categorized as Secondary if associated with co-existing acute rejection, infection, or PFT decline, and as Primary if not associated with these conditions. RESULTS: EMI developed in 16% of patients at a median of 343.5 (IQR: 177.3-535.5) days post-transplant. Over 50% of EMI episodes were classified as Primary. EMI was associated with an increased risk of severe CLAD or death (HR: 2.78, 95% CI: 1.26-6.22, p = 0.012). The risk remained consistent for the Primary EMI subgroup (HR: 2.34, 95% CI: 1.18-4.85, p = 0.015). Time to first EMI episode was a significant predictor of the likelihood of developing CLAD or death (AUC = 0.856, 95% CI = 0.805-0.908, p < 0.001). CONCLUSIONS: Episodes of EMI in lung transplant recipients are often isolated and may not be detectable with traditional clinical monitoring approaches. EMI is associated with an increased risk of severe CLAD or death, independent of concomitant transplant complications.
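The EMI definition and its Primary/Secondary split above reduce to a simple rule per sample. A sketch (parameter names are illustrative, not from the paper):

```python
def classify_emi(sample_day, dd_cfdna_pct, rejection=False, infection=False,
                 pft_decline=False, threshold=5.0, min_day=45):
    """Classify one dd-cfDNA sample per the study's EMI definition.

    EMI: dd-cfDNA >= 5% occurring after post-transplant day 45.
    Secondary EMI co-occurs with rejection, infection, or PFT decline;
    otherwise the episode is Primary.
    """
    if sample_day <= min_day or dd_cfdna_pct < threshold:
        return "no EMI"
    return "secondary EMI" if (rejection or infection or pft_decline) else "primary EMI"
```

For example, a 6.2% sample at day 343 with no co-existing complication would be a Primary EMI episode, while the same value with concurrent infection would be Secondary.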


Subject(s)
Cell-Free Nucleic Acids , Graft Rejection , Lung Transplantation , Tissue Donors , Humans , Lung Transplantation/adverse effects , Male , Female , Prospective Studies , Middle Aged , Cell-Free Nucleic Acids/blood , Allografts , Chronic Disease , Adult , Primary Graft Dysfunction/blood , Primary Graft Dysfunction/diagnosis , Primary Graft Dysfunction/etiology , Primary Graft Dysfunction/epidemiology , Risk Factors , Follow-Up Studies , Risk Assessment/methods
19.
IEEE J Transl Eng Health Med ; 12: 448-456, 2024.
Article in English | MEDLINE | ID: mdl-38765887

ABSTRACT

OBJECTIVE: Sleep monitoring has extensively utilized electroencephalogram (EEG) data collected from the scalp, yielding very large data repositories and well-trained analysis models. Yet, this wealth of data is lacking for emerging, less intrusive modalities, such as ear-EEG. METHODS AND PROCEDURES: The current study seeks to harness the abundance of open-source scalp EEG datasets by applying models pre-trained on scalp data, either directly or with minimal fine-tuning, to sleep analysis of ear-EEG data that was recorded using a single in-ear electrode, referenced to the ipsilateral mastoid, and developed in-house as described in our previous work. Unlike previous studies, our research uniquely focuses on an older cohort (17 subjects aged 65-83, mean age 71.8 years, some with health conditions), and employs LightGBM for transfer learning, diverging from previous deep learning approaches. RESULTS: Results show that the initial accuracy of the pre-trained model on ear-EEG was 70.1%, but fine-tuning the model with ear-EEG data improved its classification accuracy to 73.7%. The fine-tuned model exhibited a statistically significant improvement (p < 0.05, dependent t-test) for 10 out of the 13 participants, as reflected by an enhanced average Cohen's kappa score (a statistical measure of inter-rater agreement for categorical items) of 0.639, indicating stronger agreement between automated and expert classifications of sleep stages. Comparative SHAP value analysis revealed a shift in feature importance for the N3 sleep stage, underscoring the effectiveness of the fine-tuning process. CONCLUSION: Our findings underscore the potential of fine-tuning pre-trained scalp EEG models on ear-EEG data to enhance classification accuracy, particularly within an older population and using feature-based methods for transfer learning.
This approach presents a promising avenue for ear-EEG analysis in sleep studies, offering new insights into the applicability of transfer learning across different populations and computational techniques. CLINICAL IMPACT: An enhanced ear-EEG method could be pivotal in remote monitoring settings, allowing for continuous, non-invasive sleep quality assessment in elderly patients with conditions like dementia or sleep apnea.


Subject(s)
Electroencephalography , Scalp , Humans , Electroencephalography/methods , Aged , Scalp/physiology , Aged, 80 and over , Male , Female , Sleep/physiology , Signal Processing, Computer-Assisted , Ear/physiology , Machine Learning , Polysomnography/methods
20.
JCO Precis Oncol ; 8: e2300531, 2024 May.
Article in English | MEDLINE | ID: mdl-38723230

ABSTRACT

PURPOSE: Conventional surveillance methods are poorly sensitive for monitoring appendiceal cancers (AC). This study investigated the utility of circulating tumor DNA (ctDNA) in evaluating systemic therapy response and recurrence after surgery for AC. METHODS: Patients from two specialized centers who underwent tumor-informed ctDNA testing (Signatera) were evaluated to determine the association between systemic therapy and ctDNA detection. In addition, the accuracy of ctDNA detection during surveillance for the diagnosis of recurrence after complete cytoreductive surgery (CRS) for grade 2-3 ACs with peritoneal metastases (PM) was investigated. RESULTS: In this cohort of 94 patients with AC, most had grade 2-3 tumors (84.0%) and PM (84.0%). Fifty patients completed the assay in the presence of identifiable disease, among whom ctDNA was detected in 4 of 7 (57.1%), 10 of 16 (62.5%), and 19 of 27 (70.4%) patients with grade 1, 2, and 3 disease, respectively. Patients who had recently received systemic chemotherapy had ctDNA detected less frequently (7 of 16 [43.8%] v 26 of 34 [76.5%]; odds ratio, 0.22 [95% CI, 0.06 to 0.82]; P = .02). Among 36 patients with complete CRS for grade 2-3 AC-PM, 16 (44.4%) developed recurrence (median follow-up, 19.6 months). ctDNA detection was associated with shorter recurrence-free survival (median 11.3 months v not reached; hazard ratio, 14.1 [95% CI, 1.7 to 113.8]; P = .01) and showed high accuracy for the detection of recurrence (sensitivity 93.8%, specificity 85.0%). ctDNA was more sensitive than carcinoembryonic antigen (62.5%), CA19-9 (25.0%), and CA125 (18.8%) and was the only elevated biomarker in four (25%) patients with recurrence. CONCLUSION: This study revealed a reduced ctDNA detection frequency after systemic therapy and accurate recurrence assessment after CRS. These findings underscore the role of ctDNA as a predictive and prognostic biomarker for grade 2-3 AC-PM management.
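The reported accuracy figures can be checked arithmetically. With 36 patients after complete CRS, 16 of whom recurred, the stated sensitivity (93.8%) and specificity (85.0%) imply the confusion counts below; note these counts are back-calculated from the percentages, not taken directly from the paper.

```python
# Back-calculated confusion counts implied by the abstract's figures
# (36 patients after complete CRS; 16 recurrences, 20 without recurrence).
recurrence, no_recurrence = 16, 20
true_pos = 15   # ctDNA detected and recurrence occurred -> 15/16 = 93.8%
true_neg = 17   # ctDNA negative and no recurrence       -> 17/20 = 85.0%

sensitivity = true_pos / recurrence       # TP / (TP + FN)
specificity = true_neg / no_recurrence    # TN / (TN + FP)
```

The same denominator of 16 recurrences also yields the biomarker comparison: 62.5% for CEA is 10/16, and the four patients in whom ctDNA was the only elevated marker are 4/16 = 25%.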


Subject(s)
Appendiceal Neoplasms , Circulating Tumor DNA , Humans , Circulating Tumor DNA/blood , Circulating Tumor DNA/genetics , Male , Female , Appendiceal Neoplasms/genetics , Appendiceal Neoplasms/blood , Appendiceal Neoplasms/pathology , Appendiceal Neoplasms/therapy , Appendiceal Neoplasms/drug therapy , Middle Aged , Aged , Adult , Neoplasm Recurrence, Local/blood , Aged, 80 and over