Results 1 - 20 of 77
1.
Heliyon ; 10(7): e28555, 2024 Apr 15.
Article in English | MEDLINE | ID: mdl-38623248

ABSTRACT

Introduction: Previous studies have reported a correlation between high-grade CMV infection and an unfavorable prognosis in glioblastoma (GB). Conversely, epilepsy has been associated with a more favorable outcome in GB patients. Although epilepsy and CMV share similar molecular mechanisms in the GB tumoral microenvironment, the correlation between tumor-related epilepsy (TRE) and CMV infection remains unexplored. The aim of our study was to examine the correlation between the degree of CMV infection and seizure types on the survival of TRE adult-type diffuse glioma. To that end, we also conducted a comprehensive literature review to contextualize our results with respect to previous publications. Methods: We conducted a retrospective observational study of TRE adult-type diffuse gliomas treated at a single center in Mexico from 2010 to 2018. Tumor tissue and cDNA were analyzed by immunohistochemistry (IHC) for CMV (IE and LA antigens) at the Karolinska Institute in Sweden, and by RT-PCR for CMV-gB in Torreón, Mexico, respectively. Bivariate analysis (chi-square test) was performed to evaluate the association between subtypes of adult-type diffuse glioma (IDH-mut grade 4 astrocytoma vs. IDH-wt glioblastoma) and the following variables: type of hemispheric involvement (mesial vs. neocortical), degree of CMV infection (<25% vs. >25% infected tumor cells), and seizure type (focal aware, focal impaired awareness, and focal-to-bilateral tonic-clonic seizures [FBTCS]). Kaplan-Meier and Cox analyses were performed to estimate risk; p < 0.05 was considered statistically significant. Results: Sixty patients with TRE adult-type diffuse gliomas were included (80% IDH-wt glioblastomas and 20% IDH-mut grade 4 astrocytomas). The mean age was 61.5 years (SD ± 18.4), and 57% were male. Fifty percent of the patients presented with mesial hemispheric involvement. Seizure types included focal aware (15%), focal impaired awareness (43.3%), and FBTCS (41.7%).
Ninety percent of cases were treated with levetiracetam, and 33.3% achieved Engel IA postoperative seizure control. More than 90% of samples were positive for CMV by immunohistochemistry (IHC); however, all cDNA analyzed by RT-PCR returned negative results. The median overall survival (OS) was 15 months. High-grade CMV-IE infection (14 vs. 25 months, p < 0.001), mesial involvement (12 vs. 18 months, p < 0.001), and FBTCS (9 vs. 18 months for non-FBTCS) were associated with worse OS. Multivariate analysis demonstrated that high-grade CMV infection (HR = 3.689, p = 0.002) and FBTCS (HR = 7.007, p < 0.001) were independent unfavorable survival factors. Conclusions: CMV induces a proinflammatory tumoral microenvironment that contributes to the development of epilepsy. Tumor progression could be associated not only with a higher degree of CMV infection but also with epileptogenesis, resulting in a seizure phenotype characterized by FBTCS and poor survival outcomes. This study represents the first survival analysis in Latin America to include a representative sample of TRE adult-type diffuse gliomas, considering the degree of CMV infection and distinguishing features (such as FBTCS) that might have clinical relevance in this group of patients. Further prospective studies are required to validate these results.
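The Kaplan-Meier comparison reported above can be illustrated with a minimal product-limit estimator. This is a sketch only: the follow-up times below are hypothetical stand-ins, not the study's patient data.

```python
def kaplan_meier(times, events):
    """Kaplan-Meier survival curve as a list of (time, S(t)) pairs,
    where S(t) = product over event times t_i <= t of (1 - d_i / n_i).

    times  : follow-up in months
    events : 1 = death observed, 0 = censored
    """
    data = sorted(zip(times, events))
    at_risk = len(data)
    surv = 1.0
    curve = []
    i = 0
    while i < len(data):
        t = data[i][0]
        deaths = censored = 0
        # group all subjects sharing the same follow-up time
        while i < len(data) and data[i][0] == t:
            deaths += data[i][1]
            censored += 1 - data[i][1]
            i += 1
        if deaths:
            surv *= 1 - deaths / at_risk
            curve.append((t, surv))
        at_risk -= deaths + censored
    return curve

# Hypothetical follow-up (months) for the two seizure phenotypes
fbtcs = kaplan_meier([3, 6, 9, 9, 12, 14], [1, 1, 1, 1, 1, 0])
non_fbtcs = kaplan_meier([10, 15, 18, 20, 24, 30], [1, 0, 1, 1, 0, 1])
print(fbtcs)  # survival drops to ~0.33 by month 9 in this toy cohort
```

In practice the group comparison p-values would come from a log-rank test and the adjusted HRs from a Cox model, as the abstract describes.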

2.
Arch. argent. pediatr ; 122(2): e202310050, Apr. 2024.
Article in English, Spanish | LILACS, BINACIS | ID: biblio-1537591

ABSTRACT


Introduction. The availability of data on the consumption of ultra-processed foods among children is important for planning public policies. Objectives. To describe the prevalence of consumption of ultra-processed foods in children under 2 years of age and identify associated factors. To describe the proportion that ultra-processed foods represent out of the total number of foods consumed in a day. Methods. Secondary analysis of data from children aged 6-23 months with at least one 24-hour recall of food consumption based on the Second National Survey on Nutrition and Health of Argentina (2018). The following primary variables were studied: "consumption of ultra-processed foods" (according to the NOVA system) categorized into yes/no and "proportion of ultra-processed out of total foods consumed." The following associated factors were studied: breastfeeding, sex, age, and number of non-ultra-processed foods consumed. A multivariate logistic regression model was developed, and an expansion factor was applied to weight the data. Results. A total of 4224 children were included (weighted: 908,104). The prevalence of ultra-processed food consumption was 90.8% (95% CI: 89.5-92) and was associated with an older age (OR: 3.21, 95% CI: 2.28-4.52) and the number of non-ultra-processed foods consumed (OR: 1.17, 95% CI: 1.13-1.23). Ultra-processed foods accounted for a median of 20% (IQR: 12.5-28.6%) of all foods consumed in a day. Conclusions. This study highlights the high penetration of ultra-processed foods in complementary feeding.
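The ORs reported here come from a multivariate logistic regression on weighted data; as a simpler illustration of the quantity being estimated, a crude odds ratio with a Wald 95% CI can be computed directly from a 2×2 table. The counts below are hypothetical, not the survey's data.

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Crude odds ratio and Wald 95% CI from a 2x2 table:
       a = exposed with outcome,   b = exposed without outcome,
       c = unexposed with outcome, d = unexposed without outcome."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # SE of log(OR)
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi

# Hypothetical counts: older vs. younger infants consuming UPF
print(odds_ratio_ci(300, 40, 250, 90))
```

Unlike this crude version, the study's model adjusts for covariates and applies the survey expansion factor as observation weights.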


Subject(s)
Humans , Infant , Diet , Food, Processed , Argentina , Fast Foods , Food Handling
3.
Sci Rep ; 14(1): 1137, 2024 01 11.
Article in English | MEDLINE | ID: mdl-38212416

ABSTRACT

The study of specific T-cell responses against SARS-CoV-2 is important for understanding long-term immunity and infection management. The aim of this study was to assess dual IFN-γ and IL-2 detection, using a SARS-CoV-2-specific fluorescence ELISPOT, in patients during acute disease, during convalescence, and after vaccination. We also evaluated the humoral response and compared it with the T-cell response, with the aim of correlating both types of responses and increasing the number of specific responses detected. Blood samples were drawn from acute COVID-19 patients and convalescent individuals classified according to disease severity, and from unvaccinated and vaccinated uninfected individuals. IgGs against spike and nucleocapsid, IgMs against nucleocapsid, and neutralizing antibodies were also analyzed. Our results show that IFN-γ in combination with IL-2 increases response detection in acute and convalescent individuals (p = 0.023). In addition, IFN-γ detection can be a useful biomarker for monitoring severe acute patients, as our results indicate that individuals with a poor outcome have lower levels of this cytokine. In some cases, the lack of cellular immunity is compensated by antibodies, confirming the role of both types of immune response in infection and that their dual detection can increase the number of specific responses detected. In summary, dual IFN-γ/IL-2 detection is promising for characterizing and assessing immunization status and aiding patient management.
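The gain from dual detection follows from a union rule: a sample counts as a responder when either cytokine passes its positivity cutoff, so the combined readout can only match or exceed single-marker detection. The spot counts and cutoffs below are illustrative assumptions, not the assay's actual thresholds.

```python
def is_responder(ifn_spots, il2_spots, cut_ifn=20, cut_il2=20):
    """Dual-detection rule: positive if either cytokine passes its
    ELISPOT spot-count cutoff (cutoffs here are hypothetical)."""
    return ifn_spots >= cut_ifn or il2_spots >= cut_il2

# Hypothetical spot counts per sample: (IFN-γ, IL-2)
samples = [(35, 5), (10, 40), (8, 6)]
ifn_only = sum(s[0] >= 20 for s in samples)    # 1 responder by IFN-γ alone
dual = sum(is_responder(*s) for s in samples)  # 2 responders by the union
print(ifn_only, dual)
```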


Subject(s)
COVID-19 , SARS-CoV-2 , Humans , Interleukin-2 , Immunity, Cellular , Antibodies, Neutralizing , Antibodies, Viral , Immunity, Humoral
4.
Arch Argent Pediatr ; 122(2): e202310050, 2024 04 01.
Article in English, Spanish | MEDLINE | ID: mdl-37870979

ABSTRACT

Introduction. The availability of data on the consumption of ultra-processed foods among children is important for planning public policies. Objectives. To describe the prevalence of consumption of ultra-processed foods in children under 2 years of age and identify associated factors. To describe the proportion that ultra-processed foods represent out of the total number of foods consumed in a day. Methods. Secondary analysis of data from children aged 6-23 months with at least one 24-hour recall of food consumption based on the Second National Survey on Nutrition and Health of Argentina (2018). The following primary variables were studied: "consumption of ultra-processed foods" (according to the NOVA system) categorized into yes/no and "proportion of ultra-processed out of total foods consumed." The following associated factors were studied: breastfeeding, sex, age, and number of non-ultra-processed foods consumed. A multivariate logistic regression model was developed, and an expansion factor was applied to weight the data. Results. A total of 4224 children were included (weighted: 908,104). The prevalence of ultra-processed food consumption was 90.8% (95% CI: 89.5-92) and was associated with an older age (OR: 3.21, 95% CI: 2.28-4.52) and the number of non-ultra-processed foods consumed (OR: 1.17, 95% CI: 1.13-1.23). Ultra-processed foods accounted for a median of 20% (IQR: 12.5-28.6%) of all foods consumed in a day. Conclusions. This study highlights the high penetration of ultra-processed foods in complementary feeding.



Subject(s)
Diet , Food, Processed , Child , Female , Humans , Infant , Argentina , Fast Foods , Food Handling
5.
Biomed Pharmacother ; 168: 115720, 2023 Dec.
Article in English | MEDLINE | ID: mdl-37839110

ABSTRACT

The aggressive and incurable diffuse gliomas constitute 80% of malignant brain tumors, and patients face recurrent surgeries and drug resistance. Epidemiological research indicates that substantial consumption of fruits and vegetables diminishes the risk of developing this tumor type. Broccoli consumption has shown beneficial effects in both cancer and neurodegenerative diseases. These effects are partially attributed to the isothiocyanate sulforaphane (SFN), which can regulate the Keap1/Nrf2/ARE signaling pathway, stimulate detoxifying enzymes, and activate cellular antioxidant defense processes. This study employs a C6 rat glioma model to assess the chemoprotective potential of aqueous extracts from broccoli seeds, sprouts, and inflorescences, all rich in SFN, with pure SFN as a positive control. The findings reveal that administering a dose of 100 mg/kg of broccoli sprout aqueous extract or 0.1 mg/kg of SFN to animals for 30 days before introducing 1 × 10⁴ cells effectively halts tumor growth and progression. This study underscores the significance of exploring foods abundant in bioactive compounds, such as broccoli derivatives, for potential preventive integration into daily diets. Using broccoli sprouts as a natural defense against cancer development might seem idealistic, yet this investigation establishes that administering this extract is a valuable approach in designing strategies for glioma prevention. Although the findings stem from a rat glioma model, they offer promising insights for subsequent preclinical and clinical research.


Subject(s)
Brassica , Glioma , Humans , Rats , Animals , Kelch-Like ECH-Associated Protein 1/metabolism , Plant Extracts/pharmacology , Plant Extracts/therapeutic use , NF-E2-Related Factor 2/metabolism , Isothiocyanates/pharmacology , Isothiocyanates/therapeutic use , Glioma/prevention & control
6.
J Clin Med ; 12(17)2023 Aug 22.
Article in English | MEDLINE | ID: mdl-37685503

ABSTRACT

Uterine leiomyomas, or uterine fibroids, are the most common benign soft tissue tumor in reproductive-aged women. Fumarate hydratase-deficient (FH-d) uterine fibroids are a rare subtype that is diagnosed only on pathologic evaluation. FH-d uterine fibroids may be the first indicator of hereditary leiomyomatosis and renal cell cancer (HLRCC) syndrome. Therefore, identifying and understanding the clinical implications and diagnosis of FH-d uterine fibroids is critical for early diagnosis of HLRCC. This case series investigates the uncommon yet significant condition of FH-d uterine fibroids. We examined the clinical manifestations, diagnostic imaging, and histopathological characteristics of FH-d uterine fibroids in five cases identified at our institution over the last ten years. All diagnoses were confirmed by pathologic evaluation after surgical treatment. Gynecologists and pathologists play a critical role in the early diagnosis of FH-d uterine fibroids and must recognize the relevant clinical and pathologic findings that raise suspicion for this diagnosis. The detection of these cases depends largely on the pathologist's ability to recognize its unique histopathologic features; once these characteristics are identified, they should prompt referral to a gynecologist to consider germline genetic testing. The management of FH-d uterine fibroids necessitates a multidisciplinary approach, including proper genetic screening and regular surveillance, especially for renal tumors.

8.
Gene ; 877: 147565, 2023 Aug 15.
Article in English | MEDLINE | ID: mdl-37315635

ABSTRACT

BACKGROUND: The use of novel and accurate techniques to identify genetic variants (with or without a record in the National Center for Biotechnology Information (NCBI) database) improves diagnosis, prognosis, and therapeutics for patients with epilepsy, especially in populations for whom such techniques exist. The aim of this study was to find a genetic profile in Mexican pediatric epilepsy patients by focusing on ten genes associated with drug-resistant epilepsy (DRE). METHODS: This was a prospective, analytical, cross-sectional study of pediatric patients with epilepsy. Informed consent was granted by the patients' guardians or parents. Genomic DNA from the patients was sequenced using next-generation sequencing (NGS). For statistical analysis, Fisher's exact, chi-square, or Mann-Whitney U tests and ORs (95% CI) were used, with significance set at p < 0.05. RESULTS: Fifty-five patients met the inclusion criteria (58.2% female, ages 1-16 years); 32 patients had controlled epilepsy (CTR), and 23 had DRE. Four hundred twenty-two genetic variants were identified (71.3% with a known SNP registered in the NCBI database). A dominant genetic profile consisting of four haplotypes of the SCN1A, CYP2C9, and CYP2C19 genes was identified in most of the patients studied. When comparing patients with DRE and CTR, the prevalence of polymorphisms in the SCN1A (rs10497275, rs10198801, and rs67636132), CYP2D6 (rs1065852), and CYP3A4 (rs2242480) genes showed statistical significance (p = 0.021). Finally, the number of missense genetic variants in patients in the nonstructural subgroup was significantly higher in DRE than in CTR (1 [0-2] vs. 3 [2-4]; p = 0.014). CONCLUSIONS: The Mexican pediatric epilepsy patients included in this cohort presented a characteristic genetic profile infrequent in the Mexican population. SNP rs1065852 (CYP2D6*10) is associated with DRE, especially with nonstructural damage.
The presence of three genetic alterations affecting the CYP2B6, CYP2C9, and CYP2D6 cytochrome genes is associated with nonstructural DRE.
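Fisher's exact test, used above for the small-count comparisons, can be sketched in pure Python via the hypergeometric distribution. This one-sided version sums the upper tail; a two-sided test (as statistical packages such as SciPy's `fisher_exact` compute) also accumulates tables that are as extreme in the other direction. The counts below are hypothetical, not the study's tables.

```python
from math import comb

def fisher_exact_one_sided(a, b, c, d):
    """One-sided Fisher's exact p-value for the 2x2 table [[a, b], [c, d]]:
    probability, under the hypergeometric null with fixed margins,
    of observing a count of `a` or larger in the top-left cell."""
    row1, col1, n = a + b, a + c, a + b + c + d
    p = 0.0
    for x in range(a, min(row1, col1) + 1):
        p += comb(col1, x) * comb(n - col1, row1 - x) / comb(n, row1)
    return p

# Hypothetical carrier counts: DRE vs. CTR patients with/without a variant
print(fisher_exact_one_sided(15, 8, 5, 27))
```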


Subject(s)
Drug Resistant Epilepsy , Epilepsy , Humans , Child , Female , Cytochrome P-450 CYP2D6/genetics , Cytochrome P-450 CYP2C9/genetics , Clinical Relevance , Cross-Sectional Studies , Prospective Studies , Epilepsy/genetics
9.
Ann N Y Acad Sci ; 1524(1): 97-104, 2023 06.
Article in English | MEDLINE | ID: mdl-37026582

ABSTRACT

The risk of inadequate calcium intake is a worldwide problem. We performed a simulation exercise on the impact, effectiveness, and safety of increasing calcium levels in drinking water using the 2019 Health and Nutrition National Survey of Argentina, which provides water intake and water sources data at the individual level. We simulated the distribution of calcium intake assuming a calcium concentration of 100 mg of calcium per liter of tap water and 400 mg of calcium per liter of bottled water. After the simulation, all population groups had a slightly improved calcium intake. Higher impacts were observed in adults, as reported water intake was higher in adults 19-51 years old. In young adult women, the estimated calcium intake inadequacy decreased from 91.0% to 79.7% when calcium was increased in tap water and to 72.2% when calcium was increased in tap and bottled water. The impact was lower in adolescents and older adults who have higher calcium recommendations and reported lower water intake. Increased calcium concentration of water could improve calcium intake in Argentina, especially in adults as their reported water intake is higher. Combining more than one strategy to improve calcium intake might be required for countries like Argentina with low calcium intake.
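At the individual level, the simulation described above amounts to shifting each respondent's calcium intake by the fortified concentration times their reported water volume. The sketch below uses the concentrations stated in the abstract (100 mg/L tap, 400 mg/L bottled); the baseline intake and water volumes are hypothetical, not survey records.

```python
def simulate_fortification(baseline_mg, tap_l, bottled_l,
                           tap_mg_per_l=100, bottled_mg_per_l=0):
    """Daily calcium intake (mg) after fortifying drinking water.
    Defaults model the tap-only scenario; pass bottled_mg_per_l=400
    for the combined tap-and-bottled scenario."""
    return baseline_mg + tap_l * tap_mg_per_l + bottled_l * bottled_mg_per_l

# Hypothetical young adult woman: 450 mg/day baseline, 1.5 L tap, 0.5 L bottled
print(simulate_fortification(450, 1.5, 0.5))                        # → 600.0
print(simulate_fortification(450, 1.5, 0.5, bottled_mg_per_l=400))  # → 800.0
```

Applying this shift across the survey's weighted intake distribution and re-counting individuals below the age- and sex-specific requirement yields the inadequacy estimates reported above.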


Subject(s)
Drinking Water , Young Adult , Adolescent , Humans , Female , Aged , Adult , Middle Aged , Calcium , Drinking , Water Supply , Nutrition Surveys , Calcium, Dietary
10.
JMIR Form Res ; 6(10): e34055, 2022 Oct 17.
Article in English | MEDLINE | ID: mdl-36251350

ABSTRACT

BACKGROUND: Genetic testing uptake is low, despite the well-established connection between pathogenic variants in certain cancer-linked susceptibility genes and ovarian cancer risk. Given that most major insurers cover genetic testing for those with a family history suggestive of hereditary cancer, the issue may lie in access to genetic testing. Remotely accessible web-based communication systems may improve awareness, and uptake, of genetic testing services. OBJECTIVE: This study presents the development and formative evaluation of the multistep web-based communication system required to support the implementation of, and access to, genetic testing. METHODS: While designing the multistep web-based communication system, we considered various barriers and facilitators to genetic testing, guided by dimensions of accessibility. In addition to conducting usability testing, we performed ongoing assessments focusing on the function of the web-based system and participant response rates, with the goal of continuing to make modifications to the web-based communication system while it is in use. RESULTS: The combined approach of usability testing and expert user experience consultation resulted in several modifications to the multistep web-based communication system, including changes related to imagery and content, web accessibility, and the general organization of the web-based system. All recommendations were made with the goal of improving the overall accessibility of the web-based communication system. CONCLUSIONS: A multistep web-based communication system appears to be an effective way to address many potential barriers to access that may otherwise make it difficult for at-risk individuals to participate in genetic testing. Importantly, some dimensions of access were easy to assess before study recruitment, but other aspects of the communication system required ongoing assessment during the implementation of the Making Genetic Testing Accessible study.

11.
JMIR Form Res ; 6(9): e35035, 2022 Sep 26.
Article in English | MEDLINE | ID: mdl-36155347

ABSTRACT

BACKGROUND: Strong participant recruitment practices are critical to public health research but are difficult to achieve. Traditional recruitment practices are often time-consuming and costly and fail to adequately target difficult-to-reach populations. Social media platforms such as Facebook are well positioned to address this area of need, enabling researchers to leverage existing social networks and deliver targeted information. The MAGENTA (Making Genetic Testing Accessible) study aimed to improve the availability of genetic testing for hereditary cancer susceptibility in at-risk individuals through a web-based communication system, along with social media advertisements to improve reach. OBJECTIVE: This paper aims to evaluate the effectiveness of Facebook as an outreach tool for recruiting women aged ≥30 years into the MAGENTA study. METHODS: We designed and implemented paid and unpaid social media posts, with ongoing assessment, as a primary means of research participant recruitment in collaboration with patient advocates. Facebook analytics were used to assess the effectiveness of paid and unpaid outreach efforts. RESULTS: Over the reported recruitment period, Facebook materials reached 407,769 people and generated 57,248 instances of engagement, indicating that approximately 14.04% of people who saw information about the study on Facebook engaged with the content. Paid advertisements had a total reach of 373,682; among those reached, just under 15% (54,117/373,682; 14.48%) engaged with the page content. Unpaid posts published on the MAGENTA Facebook page had a total reach of 34,087 and 3131 instances of engagement, indicating that around 9.19% (3131/34,087) of people who saw unpaid posts engaged. Women aged ≥65 years had the best response rate, with approximately 43.95% (15,124/34,410) of reaches translating to engagement.
Among the participants who completed the eligibility questionnaire, 27.44% (3837/13,983) had heard about the study through social media or another webpage. CONCLUSIONS: Facebook is a useful way of enhancing clinical trial recruitment of women aged ≥30 years who have a potentially increased risk for ovarian cancer by promoting news stories over social media, collaborating with patient advocacy groups, and running paid and unpaid campaigns. TRIAL REGISTRATION: ClinicalTrials.gov NCT02993068; https://clinicaltrials.gov/ct2/show/NCT02993068.
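The engagement percentages in this abstract are simply engagement counts normalized by reach; the figures below reproduce the abstract's own reported numbers.

```python
def engagement_rate(engaged, reached):
    """Percentage of people reached who engaged with the content."""
    return 100 * engaged / reached

# Figures reported in the abstract
assert round(engagement_rate(57_248, 407_769), 2) == 14.04  # all materials
assert round(engagement_rate(54_117, 373_682), 2) == 14.48  # paid ads
assert round(engagement_rate(3_131, 34_087), 2) == 9.19     # unpaid posts
assert round(engagement_rate(15_124, 34_410), 2) == 43.95   # women ≥65
```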

12.
Front Microbiol ; 13: 885312, 2022.
Article in English | MEDLINE | ID: mdl-35935194

ABSTRACT

Background: Current blood-based diagnostic tools for TB are insufficient to properly characterize its distinct stages, from latent infection (LTBI) to active disease (aTB), nor can they assess treatment efficacy. Several immune cell biomarkers have been proposed as candidates for the development of improved diagnostic tools. Objective: To compare the capacity of the CD27, HLA-DR, CD38, and Ki-67 markers to characterize LTBI, active TB, and patients who completed treatment and resolved TB. Methods: Blood was collected from 45 patients defined according to clinical and microbiological criteria as LTBI, aTB with less than 1 month of treatment, or aTB after completing treatment. Peripheral blood mononuclear cells were stimulated with ESAT-6/CFP-10 or PPD antigens and acquired for flow cytometry after labelling with conjugated antibodies against CD3, CD4, CD8, CD27, IFN-γ, TNF-α, CD38, HLA-DR, and Ki-67. Conventional and multiparametric analyses were done with FlowJo and OMIQ, respectively. Results: The expression of the CD27, CD38, HLA-DR, and Ki-67 markers was analyzed in CD4+ T-cells producing IFN-γ and/or TNF-α after ESAT-6/CFP-10 or PPD stimulation. Within antigen-responsive CD4+ T-cells, CD27- and CD38+ (ESAT-6/CFP-10-specific), and HLA-DR+ and Ki-67+ (PPD- and ESAT-6/CFP-10-specific) populations were significantly increased in aTB compared to LTBI. Ki-67 demonstrated the best discriminative performance as evaluated by ROC analyses (AUC > 0.9 after PPD stimulation). The data also point to a significant change in the expression of CD38 (ESAT-6/CFP-10-specific) and Ki-67 (PPD- and ESAT-6/CFP-10-specific) after ending the anti-TB treatment regimen. Furthermore, a ratio of the CD27 median fluorescence intensity in total CD4+ T-cells over Mtb-specific CD4+ T-cells showed a positive association with aTB over LTBI (ESAT-6/CFP-10-specific).
Additionally, multiparametric FlowSOM analyses revealed an increase in CD27 cell clusters and a decrease in HLA-DR cell clusters within Mtb-specific populations after the end of treatment. Conclusion: Our study independently confirms that CD27-, CD38+, HLA-DR+ and Ki-67+ populations on Mtb-specific CD4+ T-cells are increased during active TB disease. Multiparametric analyses unbiasedly identify clusters based on CD27 or HLA-DR whose abundance can be related to treatment efficacy. Further studies are necessary to pinpoint the convergence between conventional and multiparametric approaches.
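The ROC AUC used to rank the markers above equals the probability that a randomly chosen case from the positive group scores higher than one from the negative group (the Mann-Whitney U statistic scaled to [0, 1]). The marker values below are hypothetical, not the study's measurements.

```python
def auc(scores_pos, scores_neg):
    """Empirical ROC AUC: fraction of (positive, negative) pairs where
    the positive case scores higher; ties count half."""
    wins = 0.0
    for p in scores_pos:
        for n in scores_neg:
            if p > n:
                wins += 1
            elif p == n:
                wins += 0.5
    return wins / (len(scores_pos) * len(scores_neg))

# Hypothetical %Ki-67+ among PPD-responsive CD4+ T-cells: aTB vs. LTBI
print(auc([4.1, 3.5, 5.0, 2.8], [0.9, 1.6, 2.5, 3.0]))  # → 0.9375
```

An AUC above 0.9, as reported for Ki-67, means a positive case outranks a negative one in over 90% of such pairings.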

13.
Curr Pharm Des ; 28(28): 2283-2297, 2022.
Article in English | MEDLINE | ID: mdl-35713147

ABSTRACT

Epilepsy is the most common chronic neurological disease, affecting approximately 65 million people worldwide, with mesial temporal lobe epilepsy (mTLE) being the most common type, characterized by focal seizures that begin in the hippocampus and subsequently generalize to structures such as the cerebral cortex. It is estimated that approximately 40% of patients with mTLE develop drug resistance (DR), whose pathophysiological mechanisms remain unclear. The neuronal network hypothesis is one attempt to understand the mechanisms underlying resistance to antiepileptic drugs (AEDs): recurrent seizure activity generates excitotoxic damage and activates neuronal death and survival pathways that, in turn, promote the formation of aberrant neuronal networks. This review addresses the mechanisms that are activated, perhaps as compensatory responses to the neurological damage caused by epileptic seizures, but that favor the formation of aberrant connections allowing the establishment of inappropriate circuits. Glia also seem to have a relevant role in post-seizure plasticity, further supporting the neuronal network hypothesis of drug-resistant epilepsy that has been proposed for mTLE.


Subject(s)
Epilepsy, Temporal Lobe , Epilepsy , Anticonvulsants/therapeutic use , Epilepsy, Temporal Lobe/drug therapy , Hippocampus , Humans , Neuroglia , Seizures/drug therapy
14.
Epilepsia Open ; 7 Suppl 1: S68-S80, 2022 08.
Article in English | MEDLINE | ID: mdl-35247028

ABSTRACT

More than one-third of people with epilepsy develop drug-resistant epilepsy (DRE). Different hypotheses have been proposed to explain the origin of DRE. Accumulating evidence suggests the contribution of neuroinflammation, modifications in the integrity of the blood-brain barrier (BBB), and altered immune responses to the pathophysiology of DRE. The inflammatory response is mainly due to the increase of cytokines and related molecules; these molecules have neuromodulatory effects that contribute to hyperexcitability in the neural networks that generate seizures. Some patients with DRE display autoantibodies in the serum and, mainly, the cerebrospinal fluid. These patients are refractory to treatment with standard antiseizure medications (ASMs) and may respond well to immunomodulatory therapies. This observation emphasizes that the etiopathogenesis of DRE involves immune responses, associated long-term events, and chronic inflammation. Furthermore, multiple studies have shown that functional polymorphisms act as risk factors involved in inflammation. Several relevant polymorphisms could be considered risk factors in inflammation-related DRE, such as those in the genes for the receptor for advanced glycation end products (RAGE) and interleukin-1β (IL-1β). All this evidence supports the hypothesis that chronic inflammation is associated with DRE. However, its effect should be investigated in further clinical studies to promote the development of novel therapeutics for the treatment of DRE.


Subject(s)
Drug Resistant Epilepsy , Epilepsy , Blood-Brain Barrier , Epilepsy/drug therapy , Humans , Neuroinflammatory Diseases , Receptor for Advanced Glycation End Products/therapeutic use
15.
Med Sci Educ ; 31(2): 599-606, 2021 Apr.
Article in English | MEDLINE | ID: mdl-34457914

ABSTRACT

PURPOSE: To assess obstetrician-gynecologist (Ob/Gyn) resident experiences with and preferences for lesbian, gay, bisexual, transgender, and queer (LGBTQ) healthcare training. METHODS: A cross-sectional, web-based survey was deployed to residents from accredited Illinois Ob/Gyn training programs. The survey included 32 questions on resident demographics, LGBTQ training, and self-perceived preparedness in providing LGBTQ patient care. RESULTS: Of 257 eligible Ob/Gyn residents, 105 (41%) responded. Fifty percent of residents felt unprepared to care for lesbian or bisexual patients, and 76% felt unprepared to care for transgender patients. Feeling prepared to provide care for lesbian or bisexual patients was associated with attending a university-based program, working in a hospital without religious affiliation, and year of training. Feeling prepared to provide healthcare for transgender patients correlated with grand rounds focused on LGBTQ health and supervised clinical involvement. Regarding training, 62% and 63% of participants stated their programs dedicate 1-5 h per year to lesbian/bisexual healthcare and transgender healthcare training, respectively. Concurrently, 92% desired more education on how to provide healthcare to LGBTQ patients. Perceived barriers to receiving training in LGBTQ healthcare included curricular crowding (85%) and lack of experienced faculty (91%). CONCLUSION: Our assessment indicates that Illinois Ob/Gyn residents feel inadequately prepared to address the healthcare needs of LGBTQ patients. Although barriers exist, residents desire more education and training in providing healthcare to the LGBTQ community. Future work is needed to address this gap through curricular development to ensure that Ob/Gyn residency graduates are prepared to care for LGBTQ patients.

16.
Cancer Epidemiol Biomarkers Prev ; 30(10): 1826-1833, 2021 10.
Article in English | MEDLINE | ID: mdl-34272263

ABSTRACT

BACKGROUND: The influence of prenatal diethylstilbestrol (DES) exposure on cancer incidence among middle-aged men has not been well characterized. We investigated whether exposure to DES before birth affects overall cancer risk and the risk of site-specific cancers. METHODS: Men (mean age in 2016 = 62.0 years) who were or were not prenatally exposed to DES were identified between 1953 and 1994 and followed for cancer, primarily via questionnaire, approximately every 5 years between 1994 and 2016. The overall and site-specific cancer rates of the two groups were compared using Poisson regression and proportional hazards modeling with adjustment for age. RESULTS: DES exposure was not associated with either overall cancer [hazard ratio (HR), 0.94; 95% confidence interval (CI), 0.77-1.15] or total prostate cancer rates (HR, 0.95; 95% CI, 0.68-1.33), but was inversely associated with urinary tract cancer incidence (HR, 0.48; 95% CI, 0.23-1.00). CONCLUSIONS: There was no increase in either overall or prostate cancer rates among men prenatally exposed to DES relative to those unexposed. An unexpected risk reduction was observed for urinary system cancers among the exposed relative to those unexposed. These findings suggest that prenatal DES exposure is unlikely to be an important contributor to cancer development in middle-aged men. IMPACT: The results of this study could lend reassurance to middle-aged men who were prenatally exposed to DES that their exposure does not adversely influence their overall cancer risk.
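The rate comparison underlying results like these can be illustrated with a crude incidence-rate-ratio calculation (the study itself used age-adjusted Poisson regression and proportional hazards models; the function name and the example counts below are hypothetical, shown only as a simplified sketch):

```python
import math

def rate_ratio_ci(cases_exp, py_exp, cases_unexp, py_unexp, z=1.96):
    """Crude incidence rate ratio with a 95% CI (log-normal approximation).

    cases_*  : number of incident cancers in each group
    py_*     : person-years of follow-up in each group
    """
    rr = (cases_exp / py_exp) / (cases_unexp / py_unexp)
    se = math.sqrt(1 / cases_exp + 1 / cases_unexp)  # SE of log(RR)
    lower = math.exp(math.log(rr) - z * se)
    upper = math.exp(math.log(rr) + z * se)
    return rr, lower, upper
```

A ratio near 1 with a CI spanning 1 (as for overall cancer above) indicates no detectable difference between the exposed and unexposed groups.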


Subject(s)
Neoplasms , Prenatal Exposure Delayed Effects , Diethylstilbestrol/adverse effects , Female , Humans , Incidence , Male , Middle Aged , Neoplasms/chemically induced , Neoplasms/epidemiology , Pregnancy , Prenatal Exposure Delayed Effects/chemically induced , Prenatal Exposure Delayed Effects/epidemiology , Risk
17.
Nutrients ; 13(2)2021 Jan 22.
Article in English | MEDLINE | ID: mdl-33499250

ABSTRACT

Calcium supplementation and fortification are strategies widely used to prevent adverse outcomes in populations with low calcium intake, which is highly frequent in low-income settings. We aimed to determine the effectiveness and cost-effectiveness of calcium-fortified foods on calcium intake and related health or economic outcomes. We performed a systematic review and meta-analysis involving participants of any age or gender drawn from the general population. We searched PubMed, Agricola, EMBASE, CINAHL, Global Health, EconLit, the FAO website and Google until June 2019, without language restrictions. Pairs of reviewers independently selected studies, extracted data and assessed the risk of bias of included studies using Covidence software. Disagreements were resolved by consensus. We performed meta-analyses using RevMan 5.4 and subgroup analyses by study design, age group, and fortification level. We included 20 studies, of which 15 were randomized controlled trials (RCTs), three were non-randomised studies and two were economic evaluations. Most RCTs had a high risk of bias on randomization or blinding. The most represented groups were women and children from 1 to 72 months; the most common intervention vehicles were milk and bakery products, with fortification levels between 96 and 1200 mg per 100 g of food. Calcium intake increased in the intervention groups by between 460 mg (children) and 1200 mg (postmenopausal women). The most marked effects were seen in children. Compared to controls, height increased 0.83 cm (95% CI 0.00; 1.65), plasma parathyroid hormone decreased -1.51 pmol/L (-2.37; -0.65), urine calcium:creatinine ratio decreased -0.05 (-0.07; -0.03), and femoral neck and hip bone mineral density increased 0.02 g/cm2 (0.01; 0.04) and 0.03 g/cm2 (0.00; 0.06), respectively. The largest cost savings (43%) reported from calcium fortification programs came from prevented hip fractures in older women in Germany.
Our study highlights that calcium fortification leads to higher calcium intake and small benefits in children's height and bone health, but also reveals important evidence gaps for other outcomes and populations that could be addressed with high-quality experimental or quasi-experimental studies in relevant groups, especially as some evidence on calcium supplementation shows controversial results on bone health benefits in older adults.
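The pooled effect estimates reported above (mean difference with 95% CI) are produced by inverse-variance weighting of the individual studies. A minimal fixed-effect sketch, assuming each study is supplied as a (mean difference, CI lower, CI upper) tuple (the abstract does not specify the model; RevMan also supports random-effects pooling):

```python
import math

def pooled_mean_difference(estimates, z=1.96):
    """Fixed-effect inverse-variance pooling of (md, ci_low, ci_high) tuples.

    Each study's SE is back-calculated from its 95% CI; studies are then
    weighted by 1/SE^2, so more precise studies contribute more.
    """
    weights, weighted = [], []
    for md, lo, hi in estimates:
        se = (hi - lo) / (2 * z)       # half-width of the CI divided by z
        w = 1.0 / se ** 2
        weights.append(w)
        weighted.append(w * md)
    pooled = sum(weighted) / sum(weights)
    se_pooled = math.sqrt(1.0 / sum(weights))
    return pooled, pooled - z * se_pooled, pooled + z * se_pooled
```

With a single study, the function simply returns that study's estimate and CI, which is a convenient sanity check.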


Subject(s)
Calcium, Dietary , Calcium/administration & dosage , Food, Fortified , Aged , Bone Density , Calcium/blood , Calcium/deficiency , Calcium/urine , Child , Child, Preschool , Female , Hip Fractures/prevention & control , Humans , Infant , Male
18.
Ann N Y Acad Sci ; 1493(1): 59-74, 2021 06.
Article in English | MEDLINE | ID: mdl-33432622

ABSTRACT

Calcium intake is low in many countries, especially in low-income countries. Our objective was to perform a simulation exercise on the impact, effectiveness, and safety of a flour fortification strategy using the Intake Modelling, Assessment, and Planning Program. Modeling of calcium fortification scenarios was performed with available dietary intake databases from Argentina, Bangladesh, Italy, the Lao People's Democratic Republic (Lao PDR), Uganda, Zambia, and the United States. This theoretical exercise showed that simulating a fortification with 156 mg of calcium per 100 g of flour would decrease the prevalence of low calcium intake, and less than 2% of the individuals would exceed the recommended calcium upper limit (UL) in Argentina, Italy, Uganda, and Zambia. Bangladesh and the Lao PDR showed little impact, as flour intake is uncommon. By contrast, in the United States, this strategy would lead to some population groups exceeding the UL. This exercise should be replicated and adapted to each country, taking into account the updated prevalence of calcium inadequacy, flour consumption, and technical compatibility between calcium and the flour-type candidate for fortification. A fortification plan should consider the impact on all age groups to avoid the risk of exceeding the upper levels of calcium intake.
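The simulation logic described above (shift each person's usual intake by the calcium carried in their flour consumption, then recompute the share below the requirement and above the UL) can be sketched as follows. All numbers here are hypothetical placeholders, not values from the study's intake databases:

```python
import random

random.seed(0)
# Hypothetical per-person data: usual calcium intake (mg/day), flour intake (g/day)
population = [(random.gauss(600, 200), random.gauss(150, 60)) for _ in range(10_000)]

EAR, UL = 800, 2500          # illustrative reference values (mg/day), not the study's
FORTIFICATION = 156          # mg calcium per 100 g flour, as in the modeled scenario

def prevalence(pop, fortified):
    """Fraction of the population below the EAR and above the UL."""
    low = high = 0
    for calcium, flour in pop:
        extra = FORTIFICATION * max(flour, 0) / 100 if fortified else 0
        total = calcium + extra
        low += total < EAR
        high += total > UL
    n = len(pop)
    return low / n, high / n

low_before, _ = prevalence(population, fortified=False)
low_after, over_ul = prevalence(population, fortified=True)
```

This captures the trade-off the abstract describes: fortification lowers the prevalence of inadequate intake, but groups with high flour consumption can be pushed past the UL, which is why each country's flour-intake distribution must be modeled separately.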


Subject(s)
Calcium, Dietary/administration & dosage , Flour , Food, Fortified , Adolescent , Adult , Aged , Child , Child, Preschool , Computer Simulation , Databases, Factual , Developing Countries , Diet Records , Eating , Female , Flour/analysis , Food, Fortified/analysis , Humans , Infant , Infant, Newborn , Male , Middle Aged , Poverty , Pregnancy , Recommended Dietary Allowances , Young Adult
19.
Gates Open Res ; 5: 151, 2021.
Article in English | MEDLINE | ID: mdl-35071994

ABSTRACT

Background: Food fortification is an effective strategy that has been recommended for improving inadequate population calcium intakes. Increasing the calcium concentration of water has been proposed as a possible strategy to improve calcium intake. The objective of this study was to determine the sensory threshold of different calcium salts added to drinking water using survival analysis. Methods: We performed the triangle test methodology on samples of water with added calcium using three different calcium salts: calcium chloride, calcium gluconate and calcium lactate. For each salt, a panel of 54 consumers tested seven batches of three water samples. Data were adjusted for chance, and the sensory threshold was estimated using survival methodology and a discrimination level of 50%. Results: The estimated threshold for calcium gluconate was 587 ± 131 mg/L of water, corresponding to 25% discrimination; for calcium lactate, 676 ± 186 mg/L, corresponding to 50% discrimination; and for calcium chloride, 291 ± 73 mg/L, corresponding to 50% discrimination. Conclusions: These results show that adding calcium to water, as different salts and up to a concentration of 500 mg of calcium per liter of water, is feasible. The calcium salt allowing the highest calcium concentration with the lowest perceived change in taste was calcium gluconate. Future studies need to explore stability and acceptability over longer periods of time.
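The chance correction used in triangle tests follows from the 1/3 guessing probability: if a proportion p_c of panelists answer correctly, the proportion of true discriminators is (p_c - 1/3)/(2/3). A minimal sketch, with linear interpolation standing in for the survival model the study actually fitted (all concentrations and counts below are hypothetical):

```python
def discrimination(correct, n):
    """Chance-corrected proportion of discriminators in a triangle test.

    With three samples the guessing probability is 1/3, so
    p_d = (p_c - 1/3) / (1 - 1/3), floored at zero.
    """
    p_c = correct / n
    return max(0.0, (p_c - 1 / 3) / (2 / 3))

def threshold(concs, p_ds, target=0.5):
    """Concentration at the target discrimination level, by linear
    interpolation between tested concentrations (the study itself used
    survival analysis; this is a simpler stand-in)."""
    points = list(zip(concs, p_ds))
    for (c0, p0), (c1, p1) in zip(points, points[1:]):
        if p0 <= target <= p1:
            return c0 + (c1 - c0) * (target - p0) / (p1 - p0)
    return None  # target not bracketed by the tested range
```

For example, if 36 of 54 panelists answer correctly, p_c = 2/3 and the corrected discrimination is exactly 50%, the criterion used for the chloride and lactate thresholds above.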

20.
Gene ; 769: 145255, 2021 Feb 15.
Article in English | MEDLINE | ID: mdl-33098938

ABSTRACT

INTRODUCTION: In the central nervous system (CNS), tibolone actions are mainly modulated through its interaction with estrogen, progesterone, and androgen receptors. Several studies have reported the expression of sex hormone receptors in the CNS using the RT-PCR endpoint technique. Although some studies have validated reference genes for rat brain tissue in different experimental conditions, no suitable reference genes have been reported in brain tissue from ovariectomized rats treated with tibolone. OBJECTIVE: The aim of this investigation was to evaluate the expression of different housekeeping genes in several brain regions in ovariectomized rats treated with tibolone to determine the stability of a single housekeeping gene and a combination of two housekeeping genes under these experimental conditions. METHODS: Adult female Sprague-Dawley rats were ovariectomized. Seven days after the surgery, animals were administered a single dose of vehicle (water) or tibolone (10 mg/kg body weight). Twenty-four hours later, animals were sacrificed, and the hypothalamus, hippocampus, prefrontal cortex, and cerebellum were dissected. Total RNA was extracted from these tissues, and RT-qPCR was performed to amplify Ppia, Hprt1, Rpl32, and Gapdh housekeeping genes. RESULTS: Ppia was the most stable gene in the hypothalamus and cerebellum, whereas Hprt1 was the most stable gene in the prefrontal cortex. For the analysis of the combination of two genes, the most stable combination was Ppia and Hprt1 for the prefrontal cortex and Ppia and Rpl32 for the cerebellum. CONCLUSION: In ovariectomized rats treated with tibolone, Hprt1 and Ppia genes showed high stability as housekeeping genes for qPCR analysis.
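The core of a reference-gene stability comparison is ranking candidates by how little their quantification-cycle (Cq) values vary across samples. A minimal sketch of that idea, with entirely hypothetical Cq values (the study used dedicated stability algorithms, which this simple standard-deviation ranking only approximates):

```python
import statistics

# Hypothetical Cq values per candidate reference gene across four samples
cq = {
    "Ppia":  [18.2, 18.4, 18.3, 18.5],
    "Hprt1": [22.1, 22.9, 22.3, 22.6],
    "Rpl32": [16.0, 17.2, 16.5, 16.9],
    "Gapdh": [19.5, 21.0, 20.2, 19.1],
}

def most_stable(cq_values):
    """Rank candidate genes by the sample standard deviation of their Cq
    values across conditions; the smallest spread is the most stable
    (a simplification of tools such as geNorm or NormFinder)."""
    return min(cq_values, key=lambda g: statistics.stdev(cq_values[g]))
```

A gene whose Cq barely moves between treated and control samples is a safe normalizer, because any change measured in a target gene relative to it reflects the target, not the reference.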


Subject(s)
Brain/drug effects , Estrogen Receptor Modulators/pharmacology , Genes, Essential , Norpregnenes/pharmacology , Ovariectomy , Animals , Brain/metabolism , Female , Rats , Rats, Sprague-Dawley