Results 1 - 20 of 348
1.
Curr Med Chem ; 2024 Oct 24.
Article in English | MEDLINE | ID: mdl-39449339

ABSTRACT

Signs and symptoms that persist or worsen beyond the "acute COVID-19" stage are referred to as long-COVID. Patients with long-COVID are more likely to suffer from multiple organ failure, readmission, and mortality. According to recent hypotheses, long-lasting COVID-19 symptoms may be driven by mechanisms such as abnormal autonomic nervous system (ANS) activity, hypovolemia, brain stem involvement, and autoimmune reactions. Furthermore, COVID-19 can also impair fertility in women, which may likewise be related to inflammation and immune responses. Currently, few treatments are available for long-COVID symptoms. This article reviews the major effects of COVID-19 on the nervous system and female fertility and offers potential treatment approaches.

2.
Biomed Phys Eng Express ; 10(6)2024 Sep 20.
Article in English | MEDLINE | ID: mdl-39111323

ABSTRACT

Periodic discharges (PDs) are pathologic patterns of epileptiform discharges repeating at regular intervals, commonly detected in electroencephalogram (EEG) recordings of critically ill patients. The frequency and spatial extent of PDs are associated with their tendency to cause brain injury, yet existing automated algorithms do not quantify these properties. The present study presents an algorithm for quantifying the frequency and spatial extent of PDs. The algorithm quantifies the evolution of these parameters within a short (10-14 second) window, with a focus on lateralized and generalized periodic discharges. We test our algorithm on 300 'easy', 300 'medium', and 240 'hard' examples (840 total epochs) of periodic discharges, graded by interrater consensus among human experts analyzing the same EEG epochs. We observe 95.0% agreement (95% confidence interval [CI]: 94.9%-95.1%) between algorithm outputs and reviewer clinical judgment for easy examples, 92.0% agreement (95% CI: 91.9%-92.2%) for medium examples, and 90.4% agreement (95% CI: 90.3%-90.6%) for hard examples. The algorithm is also computationally efficient, running in 0.385 ± 0.038 seconds per epoch with our provided implementation. The results demonstrate the algorithm's effectiveness in quantifying these discharges and provide a standardized and efficient alternative to existing manual approaches for PD quantification.
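
As an illustration only (not the published algorithm), the sketch below shows how PD frequency and spatial extent might be summarized within a short window, assuming per-channel discharge onset times are already available from an upstream detector; the function and variable names are hypothetical.

```python
import numpy as np

def pd_frequency_hz(onsets):
    """Median repetition rate (Hz) from successive discharge onsets on one channel."""
    onsets = np.sort(np.asarray(onsets, dtype=float))
    if onsets.size < 2:
        return 0.0
    return 1.0 / np.median(np.diff(onsets))  # inter-discharge intervals (s) -> Hz

def pd_spatial_extent(channel_onsets, n_channels):
    """Fraction of channels with at least one detected discharge in the window."""
    return sum(1 for ch in channel_onsets if len(ch) > 0) / n_channels

# Toy 10-second window: 4 of 19 channels show discharges repeating roughly every 0.5 s
window = [[0.1, 0.6, 1.1, 1.6], [0.12, 0.61, 1.12], [0.6, 1.1], [0.59, 1.12, 1.6]] + [[]] * 15
rates = [pd_frequency_hz(ch) for ch in window if len(ch) >= 2]
print(f"PD frequency ~{np.median(rates):.1f} Hz, "
      f"spatial extent {pd_spatial_extent(window, 19):.0%} of channels")
```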


Subject(s)
Algorithms, Electroencephalography, Computer-Assisted Signal Processing, Humans, Electroencephalography/methods, Epilepsy/physiopathology, Brain, Automation, Reproducibility of Results
3.
Platelets ; 35(1): 2364748, 2024 Dec.
Article in English | MEDLINE | ID: mdl-39115322

ABSTRACT

Currently, the standard treatment for patients who have undergone percutaneous coronary intervention (PCI) following acute myocardial infarction (MI) is dual antiplatelet therapy (DAPT) combining aspirin with a potent P2Y12 receptor inhibitor. However, the potential benefits of aspirin are partially constrained by intolerance in some patients. The safety and efficacy of indobufen, an alternative antiplatelet agent to aspirin, in patients with AMI after PCI have yet to be thoroughly investigated. This retrospective, single-center study used propensity score matching. Enrollment spanned January 2019 to June 2022 and included patients with AMI after PCI. Participants were categorized into two groups based on their discharge prescriptions: the aspirin DAPT group and the indobufen DAPT group. The primary endpoint was the net adverse clinical event (NACE), a composite of cardiac death, recurrence of MI, definite or probable stent thrombosis (ST), target lesion revascularization (TLR), ischemic stroke, and Bleeding Academic Research Consortium (BARC) type 2, 3, or 5 bleeding. All patients underwent a one-year follow-up. A total of 1451 patients were enrolled, with 258 assigned to the indobufen DAPT group and 1193 to the aspirin DAPT group. After 1:1 propensity score matching, 224 patients were retained in each group. In the indobufen DAPT group, 58 individuals (25.9%) experienced the primary endpoint within one year, compared with 52 individuals (23.2%) in the aspirin DAPT group (HR 1.128, 95% CI 0.776-1.639, p = .527). Specifically, no significant differences were observed in either the efficacy endpoint (MACCE, 20.1% vs. 14.7%, HR 1.392, 95% CI 0.893-2.170, p = .146) or the safety endpoint (BARC 2, 3, or 5, 8.04% vs. 10.30%, HR 0.779, p = .427). These findings remained consistent at 1, 3, and 6 months. Additionally, the incidence of gastrointestinal symptoms was significantly lower in the indobufen DAPT group than in the aspirin DAPT group (7.1% vs. 14.3%, p = .022). Our findings indicate that the efficacy and safety of indobufen are comparable to those of aspirin in Chinese patients with AMI following PCI. Given the potential advantage of indobufen in alleviating gastrointestinal symptoms, we propose it as a viable alternative for individuals intolerant to aspirin.
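
For readers unfamiliar with the matching step, here is a minimal, hedged sketch of 1:1 nearest-neighbour propensity-score matching of the kind described (synthetic data, invented covariate names; matching is done with replacement and without a caliper for brevity); it is not the study's code.

```python
import numpy as np
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.neighbors import NearestNeighbors

rng = np.random.default_rng(0)
n = 400
df = pd.DataFrame({
    "age": rng.normal(63, 10, n),
    "diabetes": rng.integers(0, 2, n),
    "indobufen": rng.integers(0, 2, n),   # 1 = indobufen DAPT, 0 = aspirin DAPT
})

# Propensity score: probability of receiving indobufen given covariates
covariates = ["age", "diabetes"]
df["ps"] = LogisticRegression().fit(df[covariates], df["indobufen"]).predict_proba(df[covariates])[:, 1]

treated = df[df["indobufen"] == 1]
control = df[df["indobufen"] == 0]
nn = NearestNeighbors(n_neighbors=1).fit(control[["ps"]])
dist, idx = nn.kneighbors(treated[["ps"]])   # closest control for each treated patient
matched_control = control.iloc[idx.ravel()]
print("matched pairs:", len(treated), "| mean |PS difference|:", round(float(dist.mean()), 4))
```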


What is the context? Currently, the standard treatment for patients who have undergone percutaneous coronary intervention following acute myocardial infarction involves dual antiplatelet therapy with a combination of aspirin and a potent P2Y12 receptor inhibitor. However, the potential benefits of aspirin are partially constrained by intolerance in some patients. The safety and efficacy of indobufen, an alternative antiplatelet agent to aspirin, in patients with AMI after PCI have yet to be thoroughly investigated. What is new? While both American and European clinical guidelines recommend indobufen as an alternative treatment for patients who cannot tolerate aspirin, there is a limited body of research on this subject. Our research is the first to address this gap by comparing the efficacy and safety of indobufen and aspirin in patients with AMI. It reveals that the efficacy and safety of indobufen are comparable to those of aspirin in Chinese patients with AMI following PCI. Given the potential advantage of indobufen in alleviating gastrointestinal symptoms, we propose it as a viable alternative for individuals intolerant to aspirin. What is the impact? These findings might pave the way for further exploration of alternatives to aspirin in patients with AMI.


Subject(s)
Aspirin, Clopidogrel, Myocardial Infarction, Percutaneous Coronary Intervention, Humans, Percutaneous Coronary Intervention/methods, Aspirin/therapeutic use, Male, Female, Clopidogrel/therapeutic use, Middle Aged, Retrospective Studies, Platelet Aggregation Inhibitors/therapeutic use, Platelet Aggregation Inhibitors/pharmacology, Aged, Treatment Outcome, Combination Drug Therapy/methods
4.
Article in English | MEDLINE | ID: mdl-38992486

ABSTRACT

BACKGROUND: Morphological awareness (MA) deficit is strongly associated with Chinese developmental dyslexia (DD). However, little is known about the white matter substrates underlying the MA deficit in Chinese children with DD. METHODS: In the current study, 34 Chinese children with DD and 42 typically developing (TD) children were recruited to complete a diffusion magnetic resonance imaging scan and cognitive tests of MA. We conducted linear regression to test the correlation between MA and DTI metrics, the structural abnormalities of the tracts related to MA, and the interaction effect of DTI metrics by group on MA. RESULTS: First, MA was significantly related to the right inferior occipito-frontal fascicle (IFO) and inferior longitudinal fasciculus (ILF), the bilateral thalamo-occipital tract (T_OCC), and the left arcuate fasciculus (AF). Second, compared with TD children, Chinese children with DD had lower axial diffusivity (AD) in the right IFO and T_OCC. Third, there were significant group interactions between right-IFO metrics (fractional anisotropy (FA) and radial diffusivity (RD)) and MA: the FA and RD of the right IFO were significantly associated with MA in children with DD but not in TD children. CONCLUSION: Compared with TD children, Chinese children with DD showed axonal degeneration not only in a ventral tract (the right IFO) but also in a visuospatial tract (the right T_OCC), both of which were associated with their MA deficit. Moreover, Chinese MA involves not only the ventral tracts but also the visuospatial pathway and dorsal tracts.
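
A hedged sketch of the group-interaction test described above (DTI metric × group on MA), using synthetic data and made-up column names rather than the study's dataset:

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n = 76
df = pd.DataFrame({
    "group": rng.choice(["DD", "TD"], n),          # dyslexic vs. typically developing
    "fa_right_ifo": rng.normal(0.45, 0.05, n),     # fractional anisotropy, right IFO
    "age": rng.normal(10, 1.2, n),
})
df["ma_score"] = 50 + 20 * df["fa_right_ifo"] + rng.normal(0, 3, n)

# The interaction term tests whether the FA-MA slope differs between DD and TD
model = smf.ols("ma_score ~ fa_right_ifo * group + age", data=df).fit()
print(model.summary().tables[1])
```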


Subject(s)
Diffusion Tensor Imaging, Dyslexia, White Matter, Humans, Dyslexia/diagnostic imaging, Dyslexia/pathology, Male, Female, White Matter/diagnostic imaging, White Matter/pathology, Child, Awareness, China, Diffusion Magnetic Resonance Imaging, Neuropsychological Tests, Anisotropy, East Asian People
5.
Life (Basel) ; 14(7)2024 Jul 08.
Article in English | MEDLINE | ID: mdl-39063609

ABSTRACT

BACKGROUND: Sleep disturbances are prevalent among elderly individuals. While polysomnography (PSG) serves as the gold standard for sleep monitoring, its extensive setup and data analysis procedures impose significant costs and time constraints, restricting its long-term use in the general public. Our laboratory introduced an innovative biomarker that applies artificial intelligence algorithms to PSG data to estimate brain age (BA), a metric validated in cohorts with cognitive impairments. Nevertheless, whether exercise, a recognized means of enhancing sleep quality in middle-aged and older adults, can reduce BA remains undetermined. METHODS: We conducted an exploratory study to evaluate whether 12 weeks of moderate-intensity exercise can improve cognitive function, sleep quality, and the brain age index (BAI), a biomarker computed from the overnight sleep electroencephalogram (EEG), in physically inactive middle-aged and older adults. Home wearable devices were used to monitor heart rate and overnight sleep EEG over this period. The NIH Toolbox Cognition Battery, in-lab overnight polysomnography, cardiopulmonary exercise testing, and a multiplex cytokine assay were employed to compare pre- and post-exercise brain health, exercise capacity, and plasma proteins. RESULTS: In total, 26 participants completed the initial assessment and exercise program, and 24 completed all procedures. Data are presented as mean [lower 95% CI of mean, upper 95% CI of mean]. Participants significantly increased maximal oxygen consumption (Pre: 21.11 [18.98, 23.23], Post: 22.39 [20.09, 24.68], mL/kg/min; effect size: -0.33) and decreased resting heart rate (Pre: 66.66 [63.62, 67.38], Post: 65.13 [64.25, 66.93], bpm; effect size: -0.02) and sleeping heart rate (Pre: 64.55 [61.87, 667.23], Post: 62.93 [60.78, 65.09], bpm; effect size: -0.15). Total cognitive performance (Pre: 111.1 [107.6, 114.6], Post: 115.2 [111.9, 118.5]; effect size: 0.49) was significantly improved. No significant differences were seen in BAI or measures of sleep macro- and micro-architecture. Plasma IL-4 (Pre: 0.24 [0.18, 0.3], Post: 0.33 [0.24, 0.42], pg/mL; effect size: 0.49) was elevated, while IL-8 (Pre: 5.5 [4.45, 6.55], Post: 4.3 [3.66, 5], pg/mL; effect size: -0.57) was reduced. CONCLUSIONS: A 12-week moderate-intensity exercise program improved cognitive function, aerobic fitness (VO2max), and plasma cytokine profiles in physically inactive middle-aged and older adults. However, we found no measurable effects on sleep architecture or BAI. It remains to be seen whether a study with a larger sample size and more intensive or more prolonged exercise exposure can demonstrate a beneficial effect on sleep quality and brain age.
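
The sketch below (assumed, not taken from the paper) illustrates how pre/post means, 95% CIs of the mean, and a paired effect size like those reported can be computed; the paper's exact effect-size definition may differ.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
pre = rng.normal(21.1, 3.5, 24)                   # e.g. VO2max (mL/kg/min), n = 24
post = pre + rng.normal(1.3, 1.5, 24)             # simulated post-exercise values

def mean_ci(x, level=0.95):
    """Mean with a t-based confidence interval of the mean."""
    m, se = np.mean(x), stats.sem(x)
    lo, hi = stats.t.interval(level, len(x) - 1, loc=m, scale=se)
    return m, lo, hi

diff = post - pre
d = np.mean(diff) / np.std(diff, ddof=1)          # paired Cohen's d
t, p = stats.ttest_rel(post, pre)
print("pre  %.2f [%.2f, %.2f]" % mean_ci(pre))
print("post %.2f [%.2f, %.2f]" % mean_ci(post))
print(f"paired d = {d:.2f}, p = {p:.3f}")
```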

6.
Int J Ophthalmol ; 17(6): 1007-1017, 2024.
Article in English | MEDLINE | ID: mdl-38895685

ABSTRACT

AIM: To identify genetic defects in a Chinese family with congenital posterior polar cataract and assess their pathogenicity. METHODS: A four-generation Chinese family affected with autosomal dominant congenital cataract was recruited. Nineteen individuals took part in this study, including 5 affected and 14 unaffected individuals. Sanger sequencing targeted hot-spot regions of 27 congenital cataract-causing genes for variant discovery. The pathogenicity of the variant was evaluated according to the American College of Medical Genetics guidelines and with InterVar software. Confocal microscopy was applied to detect the subcellular localization of fluorescence-labeled ephrin type-A receptor 2 (EPHA2). A co-immunoprecipitation assay was used to assess the interaction between EPHA2 and other lens membrane proteins. mRNA and protein expression were analyzed by quantitative reverse transcription-polymerase chain reaction (qRT-PCR) and Western blotting, respectively. Cell migration was analyzed by wound healing assay. A zebrafish model was generated by ectopic expression of the human EPHA2/p.R957P mutant to test whether the mutant could cause lens opacity in vivo. RESULTS: A novel missense, pathogenic variant, c.2870G>C, was identified in the sterile alpha motif (SAM) domain of EPHA2. Functional studies demonstrated the variant's impact: reduced EPHA2 protein expression, altered subcellular localization, and disrupted interactions with other lens membrane proteins. The mutant notably enhanced human lens epithelial cell migration and induced a central cloudy region and surface roughness, visible under differential interference contrast (DIC) optics, in zebrafish lenses ectopically expressing the mutant. CONCLUSION: The novel pathogenic c.2870G>C variant of EPHA2 identified in a Chinese congenital cataract family contributes to disease pathogenesis.

7.
Clin Appl Thromb Hemost ; 30: 10760296241262789, 2024.
Article in English | MEDLINE | ID: mdl-38870349

ABSTRACT

BACKGROUND: Aspirin is a widely used antiplatelet medication that prevents blood clots and reduces the risk of cardiovascular events. Healthcare providers need to be mindful of the risk of aspirin-induced bleeding and carefully balance its benefits against potential risks. The objective of this study was to create a practical nomogram for predicting bleeding risk in patients with a history of myocardial infarction treated with aspirin. METHODS: A total of 2099 myocardial infarction patients treated with aspirin were enrolled. The patients were randomly divided into two groups, in a 7:3 ratio, for model development and internal validation. Boruta analysis was used to identify clinically significant features associated with bleeding. A logistic regression model based on independent bleeding risk factors was constructed and presented as a nomogram. Model performance was assessed in three respects: discrimination, calibration, and clinical utility. RESULTS: Boruta analysis identified eight clinical features from 25, and further multivariate logistic regression analysis selected four independent risk factors: hemoglobin, platelet count, previous bleeding, and sex. A visual nomogram was created based on these variables. The model achieved an area under the curve of 0.888 (95% CI: 0.845-0.931) in the training dataset and 0.888 (95% CI: 0.808-0.968) in the test dataset. Calibration curve analysis showed close approximation to the ideal curve. Decision curve analysis demonstrated favorable clinical net benefit for the model. CONCLUSIONS: We created and validated a model to evaluate bleeding risk in patients with a history of myocardial infarction treated with aspirin, which demonstrated strong performance in discrimination, calibration, and net clinical benefit.
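
As a rough illustration of the modelling step (synthetic data, invented predictor scales; the Boruta screening is omitted), a logistic bleeding-risk model on the four reported predictors with a 7:3 split and AUC check might look like this:

```python
import numpy as np
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(3)
n = 2099
X = pd.DataFrame({
    "hemoglobin": rng.normal(13.5, 1.8, n),
    "platelets": rng.normal(230, 60, n),
    "prior_bleeding": rng.integers(0, 2, n),
    "sex_male": rng.integers(0, 2, n),
})
# Simulated outcome: lower hemoglobin and prior bleeding raise bleeding risk
logit = -4 - 0.3 * (X["hemoglobin"] - 13.5) + 1.5 * X["prior_bleeding"]
y = rng.random(n) < 1 / (1 + np.exp(-logit))

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)  # 7:3 split
model = LogisticRegression(max_iter=1000).fit(X_tr, y_tr)
print("test AUC:", round(roc_auc_score(y_te, model.predict_proba(X_te)[:, 1]), 3))
```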


Subject(s)
Aspirin, Hemorrhage, Myocardial Infarction, Nomograms, Humans, Aspirin/adverse effects, Aspirin/therapeutic use, Hemorrhage/chemically induced, Female, Male, Middle Aged, Aged, Risk Factors, Platelet Aggregation Inhibitors/adverse effects, Platelet Aggregation Inhibitors/therapeutic use, Risk Assessment/methods
8.
NEJM AI ; 1(6)2024 Jun.
Article in English | MEDLINE | ID: mdl-38872809

ABSTRACT

BACKGROUND: In intensive care units (ICUs), critically ill patients are monitored with electroencephalography (EEG) to prevent serious brain injury. EEG monitoring is constrained by clinician availability, and EEG interpretation can be subjective and prone to interobserver variability. Automated deep-learning systems for EEG could reduce human bias and accelerate the diagnostic process. However, existing uninterpretable (black-box) deep-learning models are untrustworthy, difficult to troubleshoot, and lack accountability in real-world applications, leading to a lack of both trust and adoption by clinicians. METHODS: We developed an interpretable deep-learning system that accurately classifies six patterns of potentially harmful EEG activity - seizure, lateralized periodic discharges (LPDs), generalized periodic discharges (GPDs), lateralized rhythmic delta activity (LRDA), generalized rhythmic delta activity (GRDA), and other patterns - while providing faithful case-based explanations of its predictions. The model was trained on 50,697 total 50-second continuous EEG samples collected from 2711 patients in the ICU between July 2006 and March 2020 at Massachusetts General Hospital. EEG samples were labeled as one of the six EEG patterns by 124 domain experts and trained annotators. To evaluate the model, we asked eight medical professionals with relevant backgrounds to classify 100 EEG samples into the six pattern categories - once with and once without artificial intelligence (AI) assistance - and we assessed the assistive power of this interpretable system by comparing the diagnostic accuracy of the two methods. The model's discriminatory performance was evaluated with area under the receiver-operating characteristic curve (AUROC) and area under the precision-recall curve. The model's interpretability was measured with task-specific neighborhood agreement statistics that interrogated the similarities of samples and features. In a separate analysis, the latent space of the neural network was visualized by using dimension reduction techniques to examine whether the ictal-interictal injury continuum hypothesis, which asserts that seizures and seizure-like patterns of brain activity lie along a spectrum, is supported by data. RESULTS: The performance of all users significantly improved when provided with AI assistance. Mean user diagnostic accuracy improved from 47 to 71% (P<0.04). The model achieved AUROCs of 0.87, 0.93, 0.96, 0.92, 0.93, and 0.80 for the classes seizure, LPD, GPD, LRDA, GRDA, and other patterns, respectively. This performance was significantly higher than that of a corresponding uninterpretable black-box model (with P<0.0001). Videos traversing the ictal-interictal injury manifold from dimension reduction (a two-dimensional representation of the original high-dimensional feature space) give insight into the layout of EEG patterns within the network's latent space and illuminate relationships between EEG patterns that were previously hypothesized but had not yet been shown explicitly. These results indicate that the ictal-interictal injury continuum hypothesis is supported by data. CONCLUSIONS: Users showed significant pattern classification accuracy improvement with the assistance of this interpretable deep-learning model. The interpretable design facilitates effective human-AI collaboration; this system may improve diagnosis and patient care in clinical settings. 
The model may also provide a better understanding of how EEG patterns relate to each other along the ictal-interictal injury continuum. (Funded by the National Science Foundation, National Institutes of Health, and others.).
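
A tiny sketch, not the published model, of the one-vs-rest AUROC evaluation described above for a six-class EEG pattern classifier, using random stand-in scores:

```python
import numpy as np
from sklearn.metrics import roc_auc_score
from sklearn.preprocessing import label_binarize

classes = ["seizure", "LPD", "GPD", "LRDA", "GRDA", "other"]
rng = np.random.default_rng(4)
y_true = rng.choice(classes, size=500)                    # stand-in expert labels
scores = rng.dirichlet(np.ones(len(classes)), size=500)   # stand-in softmax outputs

Y = label_binarize(y_true, classes=classes)               # one-vs-rest binary targets
for i, name in enumerate(classes):
    print(name, round(roc_auc_score(Y[:, i], scores[:, i]), 2))
```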

9.
Epileptic Disord ; 26(4): 444-459, 2024 Aug.
Article in English | MEDLINE | ID: mdl-38669007

ABSTRACT

OBJECTIVE: To assess the effectiveness of an educational program leveraging technology-enhanced learning and retrieval practice to teach trainees how to correctly identify interictal epileptiform discharges (IEDs). METHODS: This was a bi-institutional prospective randomized controlled educational trial involving junior neurology residents. The intervention consisted of three video tutorials focused on the six IFCN criteria for IED identification and rating 500 candidate IEDs with instant feedback either on a web browser (intervention 1) or an iOS app (intervention 2). The control group underwent no educational intervention ("inactive control"). All residents completed a survey and a test at the onset and offset of the study. Performance metrics were calculated for each participant. RESULTS: Twenty-one residents completed the study: control (n = 8); intervention 1 (n = 6); intervention 2 (n = 7). All but two had no prior EEG experience. Intervention 1 residents improved from baseline (mean) in multiple metrics including AUC (.74; .85; p < .05), sensitivity (.53; .75; p < .05), and level of confidence (LOC) in identifying IEDs/committing patients to therapy (1.33; 2.33; p < .05). Intervention 2 residents improved in multiple metrics including AUC (.81; .86; p < .05) and LOC in identifying IEDs (2.00; 3.14; p < .05) and spike-wave discharges (2.00; 3.14; p < .05). Controls had no significant improvements in any measure. SIGNIFICANCE: This program led to significant subjective and objective improvements in IED identification. Rating candidate IEDs with instant feedback on a web browser (intervention 1) generated greater objective improvement in comparison to rating candidate IEDs on an iOS app (intervention 2). This program can complement trainee education concerning IED identification.


Subject(s)
Electroencephalography, Internship and Residency, Neurology, Humans, Pilot Projects, Neurology/education, Electroencephalography/methods, Epilepsy/physiopathology, Epilepsy/diagnosis, Prospective Studies, Clinical Competence, Adult, Male, Female
10.
Front Psychiatry ; 15: 1363406, 2024.
Article in English | MEDLINE | ID: mdl-38596639

ABSTRACT

Background: Motor coordination difficulties could contribute to social communication deficits in autistic children. However, exploration of the mechanisms behind these claims has been limited by failure to account for potential confounders such as executive function (EF). Methods: We investigated the role that EF plays in the relationship between motor coordination and social communication in a school-aged autistic population via a structural model in a statistically robust manner. Results of questionnaires, including the Developmental Coordination Disorder Questionnaire, the Behavior Rating Inventory of Executive Function, and the Social Responsiveness Scale, were collected to measure motor coordination, EF, and social communication deficits. Results: A total of 182 autistic children (7.61±1.31 years, 87.9% boys) were included in the final analysis. In the model with EF as a mediator, the total effect (ß=-0.599, P<0.001) and the direct effect (ß=-0.331, P=0.003) of motor coordination on social communication were both significant among autistic children without intellectual disability (ID), as was the indirect effect through EF (ß=-0.268, P<0.001). Conclusion: EF partially mediates the association between motor coordination and social communication among autistic children. We suggest that motor coordination should be included in the routine evaluation of autism surveillance and rehabilitation procedures.
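
A hedged sketch (synthetic data) of the mediation logic described above, where the indirect effect through EF is the product of the motor→EF and EF→social coefficients and the total effect is direct plus indirect:

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(5)
n = 182
motor = rng.normal(0, 1, n)                 # motor coordination (standardized)
ef = -0.5 * motor + rng.normal(0, 1, n)     # mediator: EF difficulties
social = -0.3 * motor + 0.5 * ef + rng.normal(0, 1, n)

# Path a: motor -> EF
a = sm.OLS(ef, sm.add_constant(motor)).fit().params[1]
# Paths c' (direct) and b: social ~ motor + EF
model_b = sm.OLS(social, sm.add_constant(np.column_stack([motor, ef]))).fit()
direct, b = model_b.params[1], model_b.params[2]
print(f"indirect = {a * b:.3f}, direct = {direct:.3f}, total = {a * b + direct:.3f}")
```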

12.
Ying Yong Sheng Tai Xue Bao ; 35(2): 330-338, 2024 Feb.
Article in English | MEDLINE | ID: mdl-38523089

ABSTRACT

Soil aggregates are important for the storage and availability of phosphorus in the soil. However, how forest regeneration types affect the phosphorus fractions of soil aggregates remains unclear. In this study, we examined the composition of aggregate particle sizes, phosphorus fractions, the phosphorus sorption capacity index (PSOR), the legacy phosphorus index (PLGC), and the degree of phosphorus saturation by Mehlich 3 (DPSM3) in bulk soils and soil aggregates of a Castanopsis carlesii secondary forest (slight disturbance), a C. carlesii human-assisted regeneration forest (moderate disturbance), and a Cunninghamia lanceolata plantation (severe disturbance), aiming to explore the impact of forest regeneration types on phosphorus availability and supply potential in bulk soils and soil aggregates. The results showed that forest regeneration types significantly influenced the composition of soil aggregates. The proportion of coarse macroaggregates (>2 mm) in the soil of the C. carlesii secondary forest and human-assisted regeneration forest was significantly higher than that in the C. lanceolata plantation, while the proportion of the silt and clay fraction (<0.053 mm) showed the opposite trend. The composition of soil aggregates significantly affected the contents of the different phosphorus fractions. The contents of soil labile phosphorus fractions (PSOL and PM3) decreased as aggregate particle size decreased. The contents of soil total phosphorus (TP), total organic phosphorus (Po), moderately labile phosphorus fractions (PiOH and PoOH), and occluded phosphorus (POCL), as well as PSOR and PLGC, first decreased and then increased as particle size decreased. The contents of TP, Po, and PiOH in coarse macroaggregates and the silt and clay fraction were significantly higher than those in fine macroaggregates (0.25-2 mm) and microaggregates (0.053-0.25 mm). Forest regeneration types significantly influenced the contents of phosphorus fractions in bulk soils and soil aggregates. The contents of TP, Po, PSOL, and PM3 in the soil of the C. carlesii secondary forest were significantly higher than those in the C. carlesii human-assisted regeneration forest and the C. lanceolata plantation. The contents of PSOL and PM3 in different-sized aggregates of the C. carlesii secondary forest were significantly higher than those in the C. lanceolata plantation. Forest regeneration types significantly influenced the composition and supply potential of phosphorus fractions in soil aggregates. The proportions of PSOL and PM3 to TP in different-sized soil aggregates were significantly lower in the C. carlesii human-assisted regeneration forest than in the C. carlesii secondary forest. PSOR and DPSM3 in different-sized soil aggregates were significantly lower in the C. lanceolata plantation than in the C. carlesii secondary forest. Overall, our results indicate that natural regeneration is more favorable for maintaining soil phosphorus availability, and that forest regeneration affects soil phosphorus availability and its supply potential by altering the composition of soil aggregates.


Subject(s)
Fagaceae, Soil, Humans, Phosphorus, Forests, Clay, China, Carbon/analysis
13.
Front Public Health ; 12: 1321046, 2024.
Article in English | MEDLINE | ID: mdl-38299071

ABSTRACT

Objective: To investigate the relationship between maternal folic acid (FA) supplementation during the pre-conceptional and prenatal periods and the subsequent risk of autism spectrum disorder (ASD) in offspring. Methods: A total of 6,049 toddlers aged 16-30 months were recruited from August 2016 to March 2017 for this cross-sectional study conducted in China. The parents of the enrolled toddlers provided information on maternal FA supplementation, socio-demographic characteristics, and related covariates. Standard diagnostic procedures were implemented to identify toddlers with ASD. Results: Among the 6,049 children included in the study (3,364 boys; mean age 22.7 ± 4.1 months), a total of 71 children (1.2%) were diagnosed with ASD. Mothers who did not take FA supplements during the prenatal period had a significantly increased risk of having offspring with ASD compared with those who took FA supplements (odds ratio [OR] = 2.47). We did not find a similar association for the pre-conceptional period. Compared with mothers who consistently used FA supplements from pre-conception through the prenatal period, never using FA supplements was significantly associated with a higher risk of ASD in offspring (OR = 2.88). Conclusion: This study indicates that continuous maternal FA supplementation during the pre-conceptional and prenatal periods may decrease the risk of ASD in offspring, and that the prenatal period is the most crucial time for intervention.
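
For illustration only, with invented counts (the study's OR = 2.47 is presumably adjusted), a crude odds ratio from a 2×2 exposure-by-outcome table is computed as follows:

```python
import numpy as np

#                          ASD   no ASD
no_supplement = np.array([30, 1500])     # mothers without prenatal FA supplementation (invented)
supplement    = np.array([41, 4478])     # mothers with prenatal FA supplementation (invented)

odds_exposed   = no_supplement[0] / no_supplement[1]
odds_unexposed = supplement[0] / supplement[1]
print(f"crude OR = {odds_exposed / odds_unexposed:.2f}")
```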


Subject(s)
Autism Spectrum Disorder, Folic Acid, Male, Pregnancy, Female, Humans, Infant, Child Preschool, Folic Acid/adverse effects, Autism Spectrum Disorder/epidemiology, Autism Spectrum Disorder/etiology, Cross-Sectional Studies, Dietary Supplements/adverse effects, Vitamins, China/epidemiology
14.
Nutr J ; 23(1): 27, 2024 Feb 28.
Article in English | MEDLINE | ID: mdl-38419087

ABSTRACT

BACKGROUND: Dietary and gastrointestinal (GI) problems have been frequently reported in autism spectrum disorder (ASD). However, the relative contributions of autism-linked traits to dietary and GI problems in children with ASD are poorly understood. This study first compared dietary intake and GI symptoms between children with ASD and typically developing children (TDC), and then quantified the relative contributions of autism-linked traits to dietary intake, and of autism-linked traits and dietary intake to GI symptoms, within the ASD group. METHODS: A sample of 121 children with ASD and 121 age-matched TDC were eligible for this study. The dietary intake indicators included food-group intakes, food variety, and diet quality. The autism-linked traits included ASD symptom severity, restricted repetitive behaviors (RRBs), sensory profiles, mealtime behaviors, and their subtypes. Linear mixed-effects models and mixed-effects logistic regression models were used to estimate the relative contributions. RESULTS: Children with ASD had poorer diets, with fewer vegetables/fruits, less variety of food, a higher degree of inadequate/unbalanced dietary intake, and more severe constipation/total GI symptoms than age-matched TDC. Within the ASD group, compulsive behavior (a subtype of RRBs) and taste/smell sensitivity were the only traits associated with lower vegetable and fruit consumption, respectively. Self-injurious behavior (a subtype of RRBs) was the only trait contributing to less variety of food. Limited variety (a subtype of mealtime behavior problems) and ASD symptom severity were the primary and secondary contributors to inadequate dietary intake, respectively. ASD symptom severity and limited variety were the primary and secondary contributors to unbalanced dietary intake, respectively. Notably, unbalanced dietary intake was a significant independent factor associated with constipation/total GI symptoms, whereas autism-linked traits made no contribution. CONCLUSIONS: ASD symptom severity and unbalanced diets were the most important contributors to unbalanced dietary intake and GI symptoms, respectively. Our findings suggest that targeting ASD symptom severity and unbalanced diets for early detection and treatment could provide the largest benefits for the dietary and GI problems of ASD.
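
A minimal sketch, under the assumption of a random intercept per matched pair and with invented variable names, of a linear mixed-effects model like those described above:

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(7)
n_pairs = 121
df = pd.DataFrame({
    "pair": np.repeat(np.arange(n_pairs), 2),        # matched ASD-TDC pairs
    "asd": np.tile([1, 0], n_pairs),                 # 1 = ASD child, 0 = matched TDC
    "severity": rng.normal(0, 1, 2 * n_pairs),       # standardized symptom severity
})
df["diet_quality"] = 60 - 5 * df["asd"] - 2 * df["severity"] + rng.normal(0, 5, len(df))

# Random intercept for each matched pair; fixed effects for group and severity
model = smf.mixedlm("diet_quality ~ asd + severity", df, groups=df["pair"]).fit()
print(model.summary())
```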


Subject(s)
Autism Spectrum Disorder, Autistic Disorder, Gastrointestinal Diseases, Child, Humans, Autism Spectrum Disorder/epidemiology, Autism Spectrum Disorder/complications, Autistic Disorder/complications, Gastrointestinal Diseases/epidemiology, Constipation/epidemiology, Fruit, Vegetables, Eating
15.
Biosens Bioelectron ; 251: 116101, 2024 May 01.
Article in English | MEDLINE | ID: mdl-38324971

ABSTRACT

Abnormal levels of uric acid (UA) in urine serve as warning signs for gout and metabolic cardiovascular diseases, making the monitoring of UA levels important for early prevention. However, current analytical methods are limited by their poor suitability for home-based applications and the need for convenient, non-invasive procedures. In this approach, creatinine, a metabolite with a constant excretion rate, was incorporated as an endogenous internal standard (e-IS) for calibration, providing a rapid, pretreatment-free, and accurate strategy for quantitative determination of UA concentrations. By using urinary creatinine as an internal reference to calibrate fluctuation of the surface-enhanced Raman spectroscopy (SERS) signal of UA, quantitative accuracy can be significantly improved without the need for an external internal standard. Owing to the influence of the medium, UA, which carries a negative charge, is selectively adsorbed by Au@Ag nanoparticles functionalized with hexadecyltrimethylammonium chloride (CTAC) in this system. Furthermore, a highly convenient detection method was developed that eliminates the need for pre-processing and minimizes matrix interference through simple dilution. The method was applied to urine samples from different volunteers, and the results were highly consistent with those obtained using a UA colorimetric kit (UACK). The SERS detection time was only 30 s, 50 times faster than the UACK. This quantitative strategy of using urinary creatinine as an internal standard to correct the SERS intensity of uric acid is also expected to extend to the quantitative detection of other biomarkers in urine.
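
A minimal sketch (assumed peak intensities and an invented calibration line) of the internal-standard idea: dividing the UA SERS peak intensity by the creatinine peak intensity cancels sample-to-sample signal fluctuation before reading concentration off a calibration curve.

```python
import numpy as np

ua_peak = np.array([1200.0, 950.0, 1430.0])   # UA SERS peak intensity per sample (assumed)
cr_peak = np.array([800.0, 610.0, 930.0])     # creatinine peak intensity, the e-IS (assumed)
ratio = ua_peak / cr_peak                     # fluctuation-corrected signal

# Hypothetical linear calibration: ratio = slope * [UA] + intercept
slope, intercept = 0.004, 0.10
ua_conc = (ratio - intercept) / slope         # estimated UA concentration (arbitrary units)
print(np.round(ua_conc, 1))
```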


Subject(s)
Biosensing Techniques, Metal Nanoparticles, Humans, Uric Acid/urine, Creatinine/urine, Raman Spectroscopy/methods, Metal Nanoparticles/chemistry, Silver/chemistry
16.
Int Urol Nephrol ; 56(2): 483-497, 2024 Feb.
Article in English | MEDLINE | ID: mdl-37740848

ABSTRACT

BACKGROUND: Bladder cancer, which predominantly affects men, is a prevalent malignancy of the urinary system. Although platinum-based chemotherapy has demonstrated some improvement in overall survival compared with surgery alone, treatment efficacy is impeded by the unfavorable side effects of conventional chemotherapy drugs. Immunotherapy, however, shows potential in the treatment of bladder cancer. METHODS: To create an immune-associated prognostic signature for bladder cancer, bioinformatics analyses were performed using The Cancer Genome Atlas (TCGA) database. By identifying differentially expressed genes between the high-risk and low-risk groups, a potential therapeutic drug was predicted using the Connectivity Map database. The impact of this drug on the growth of T24 cells was then validated by MTT assay and 3D cell culture. RESULTS: The signature included 1 immune-associated lncRNA (NR2F1-AS1) and 16 immune-associated mRNAs (DEFB133, RBP7, PDGFRA, CGB3, PDGFD, SCG2, ADCYAP1R1, OPRL1, PGR, PSMD1, TANK, PRDX1, ADIPOR2, S100A8, AHNAK, EGFR). Based on their risk scores, patients were classified into low-risk and high-risk cohorts. The low-risk cohort demonstrated a considerably higher likelihood of survival than the high-risk cohort. Furthermore, differences in immune infiltration were noted between the two groups. Cephaeline, a candidate drug, was identified from the gene-expression differences and showed promise in suppressing the viability and growth of T24 bladder cancer cells. CONCLUSION: The novel predictive signature allows efficient stratification of patients with bladder cancer, enabling focused and rigorous treatment for those expected to have a worse prognosis. The identification of a candidate therapeutic drug establishes a basis for future immunotherapy trials in bladder cancer.
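
Not the authors' code: a minimal sketch of computing a prognostic risk score as a weighted sum of gene-expression values (hypothetical coefficients, a subset of the signature genes) and splitting patients at the median.

```python
import numpy as np
import pandas as pd

rng = np.random.default_rng(8)
genes = ["RBP7", "PDGFRA", "S100A8", "EGFR"]            # subset of the signature genes
coefs = pd.Series({"RBP7": 0.21, "PDGFRA": 0.15,        # hypothetical Cox coefficients
                   "S100A8": -0.10, "EGFR": 0.08})
expr = pd.DataFrame(rng.normal(0, 1, (100, 4)), columns=genes)   # simulated expression

risk_score = expr.mul(coefs, axis=1).sum(axis=1)        # weighted sum per patient
group = np.where(risk_score > risk_score.median(), "high-risk", "low-risk")
print(pd.Series(group).value_counts())
```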


Subject(s)
Urinary Bladder Neoplasms, Male, Humans, Prognosis, Urinary Bladder Neoplasms/drug therapy, Urinary Bladder Neoplasms/genetics, Urinary Bladder, Immunotherapy, Cell Culture Techniques
17.
Sci Total Environ ; 913: 169649, 2024 Feb 25.
Article in English | MEDLINE | ID: mdl-38159763

ABSTRACT

BACKGROUND: Secondhand smoke (SHS) exposure is harmful to brain development. However, the association between SHS exposure and the diagnosis of neurodevelopmental disorders (NDDs) is unclear. OBJECTIVES: To evaluate associations between SHS exposure and NDD diagnosis, identify critical time windows, and summarize the strength of the evidence. METHODS: To investigate the associations between SHS exposure and the development of NDDs, we searched Ovid, EMBASE, Web of Science, Cochrane Library, and PubMed for all relevant studies up to 31 March 2023. We pooled risk estimates and standardized mean differences (SMDs) for individuals with any NDD who were exposed to SHS compared with those who were unexposed or had low exposure. RESULTS: A total of 31,098 citations were identified, of which 54 studies were included. We identified significant associations between SHS exposure and the risk of NDDs, including specific types of NDDs such as attention deficit hyperactivity disorder (ADHD) and learning disabilities (LD), despite the observed heterogeneity for NDDs and ADHD. We also observed a significant association between cotinine exposure and ADHD. However, inconsistent ratings between the two quality-of-evidence methods for all the meta-analyses indicate that the current evidence for these associations and the potential exposure windows remains inconclusive. DISCUSSION: Our findings suggest that SHS exposure is associated with a higher risk of developing ADHD and LD, with inconclusive quality of evidence. In addition, period-specific associations remain unclear based on current evidence.
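
A toy sketch of inverse-variance pooling of standardized mean differences, with a DerSimonian-Laird random-effects step included for illustration; the numbers are invented and this is not the authors' exact procedure.

```python
import numpy as np

smd = np.array([0.30, 0.15, 0.45, 0.22])      # per-study SMDs (invented)
se = np.array([0.10, 0.08, 0.15, 0.12])       # per-study standard errors (invented)

w = 1 / se**2                                 # fixed-effect inverse-variance weights
fixed = np.sum(w * smd) / np.sum(w)

# DerSimonian-Laird between-study variance, then random-effects re-weighting
q = np.sum(w * (smd - fixed) ** 2)
tau2 = max(0.0, (q - (len(smd) - 1)) / (np.sum(w) - np.sum(w**2) / np.sum(w)))
w_re = 1 / (se**2 + tau2)
pooled = np.sum(w_re * smd) / np.sum(w_re)
print(f"pooled SMD = {pooled:.2f} (tau^2 = {tau2:.3f})")
```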


Subject(s)
Attention Deficit Disorder with Hyperactivity, Tobacco Smoke Pollution, Humans, Tobacco Smoke Pollution/adverse effects, Tobacco Smoke Pollution/analysis, Cotinine, Risk Factors
18.
Aging Clin Exp Res ; 35(12): 3023-3031, 2023 Dec.
Article in English | MEDLINE | ID: mdl-37923935

ABSTRACT

BACKGROUND: Observational studies have suggested an association between white blood cells (WBCs) and frailty, but given the susceptibility of such studies to reverse causality and confounding, the causal direction and magnitude of this association remain ambiguous. Our aim was to investigate the causal effect of WBCs on frailty by means of a Mendelian randomization (MR) analysis. METHODS: Based on genome-wide association study (GWAS) summary statistics provided by the European Bioinformatics Institute (EBI), we carried out a two-sample MR study. Genetically predicted counts of each WBC type from GWAS were used as exposure data. The Rockwood Frailty Index (FI) was used as the outcome measure, derived from a GWAS meta-analysis of UK Biobank European-ancestry participants and Swedish TwinGene participants. We applied inverse variance weighted (IVW), weighted median, MR-Egger, and MR-PRESSO outlier test methods to explore relationships between the various WBC types and frailty. RESULTS: The two-sample MR analysis suggested a possible causal relationship between eosinophil levels and frailty. Eosinophils were associated with FI (beta: 0.0609; 95% CI 0.0382, 0.0836; P = 1.38E-07), indicating that as eosinophil levels increase, so does the risk of frailty. No meaningful causal relationship between neutrophils, lymphocytes, monocytes, or basophils and FI was found in the MR results (P > 0.05). CONCLUSIONS: According to this MR study, higher eosinophil counts are related to an increased risk of frailty. Future studies are warranted to validate these findings and investigate the mechanisms underlying this connection.
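
A toy sketch of the inverse-variance-weighted (IVW) MR estimator named above, with invented per-SNP effect sizes; this is not the study's analysis code.

```python
import numpy as np

beta_x = np.array([0.12, 0.08, 0.15, 0.10])       # SNP effects on eosinophil count (invented)
beta_y = np.array([0.010, 0.006, 0.012, 0.008])   # SNP effects on frailty index (invented)
se_y = np.array([0.004, 0.003, 0.005, 0.004])     # standard errors of outcome effects

w = 1 / se_y**2                                   # inverse-variance weights
ivw_beta = np.sum(w * beta_x * beta_y) / np.sum(w * beta_x**2)
ivw_se = np.sqrt(1 / np.sum(w * beta_x**2))
print(f"IVW causal estimate: {ivw_beta:.4f} (SE {ivw_se:.4f})")
```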


Subject(s)
Frailty, Humans, Frailty/genetics, Genome-Wide Association Study, Leukocytes, Monocytes, Genetic Predisposition to Disease
20.
Gut Microbes ; 15(2): 2281350, 2023 Dec.
Article in English | MEDLINE | ID: mdl-38010793

ABSTRACT

Our previous work revealed that unbalanced dietary intake was an important independent factor associated with constipation and gastrointestinal (GI) symptoms in children with autism spectrum disorder (ASD). Growing evidence has shown alterations in the gut microbiota and gut microbiota-derived metabolites in ASD. However, how the altered microbiota might affect the associations between unbalanced diets and GI symptoms in ASD remains unknown. We analyzed microbiome and metabolomics data from 90 ASD and 90 typically developing (TD) children based on 16S rRNA sequencing and untargeted metabolomics, together with dietary intake and GI symptom assessment. We found 11 altered gut microbial taxa (FDR-corrected P-value <0.05) and 397 altered metabolites (P-value <0.05) in children with ASD compared with TD children. Among the 11 altered taxa, Turicibacter, Coprococcus 1, and the Lachnospiraceae FCS020 group were positively correlated with constipation (FDR-corrected P-value <0.25), and Eggerthellaceae was positively correlated with total GI symptoms (FDR-corrected P-value <0.25). More importantly, three increased taxa, Turicibacter, Coprococcus 1, and Eggerthellaceae, positively modulated the associations of unbalanced dietary intake with constipation and total GI symptoms, and the decreased Clostridium sp. BR31 negatively modulated these associations in children with ASD (P-value <0.05). Together, the altered microbiota strengthens the relationship between unbalanced dietary intake and GI symptoms. Among the altered metabolites, ten metabolites derived from these microbiota (Turicibacter, Coprococcus 1, Eggerthellaceae, and Clostridium sp. BR31) were screened out, were enriched in eight metabolic pathways, and correlated with constipation and total GI symptoms in children with ASD (FDR-corrected P-value <0.25). These metabolomics findings further support the modulating role of the gut microbiota in the associations of unbalanced dietary intake with GI symptoms. Collectively, our research provides insights into the relationship between diet, the gut microbiota, and GI symptoms in children with ASD.
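
A minimal sketch (random data, taxon names taken from the abstract) of a correlation-plus-FDR step like the one described: correlating taxon abundances with a GI symptom score and applying Benjamini-Hochberg correction.

```python
import numpy as np
from scipy.stats import spearmanr
from statsmodels.stats.multitest import multipletests

rng = np.random.default_rng(6)
taxa = {"Turicibacter": rng.random(90), "Coprococcus_1": rng.random(90),
        "Eggerthellaceae": rng.random(90)}          # simulated relative abundances
constipation = rng.random(90)                       # simulated symptom scores

names, pvals = [], []
for name, abundance in taxa.items():
    rho, p = spearmanr(abundance, constipation)
    names.append(name)
    pvals.append(p)

# Benjamini-Hochberg FDR correction at the 0.25 threshold used in the abstract
reject, p_fdr, _, _ = multipletests(pvals, alpha=0.25, method="fdr_bh")
for name, p_adj, sig in zip(names, p_fdr, reject):
    print(name, round(p_adj, 3), "significant" if sig else "ns")
```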


Subject(s)
Autism Spectrum Disorder, Gastrointestinal Diseases, Gastrointestinal Microbiome, Humans, Child, Autism Spectrum Disorder/metabolism, 16S Ribosomal RNA/genetics, Multiomics, Constipation/complications, Eating