1.
Conserv Biol ; : e14316, 2024 Jul 01.
Article in English | MEDLINE | ID: mdl-38946355

ABSTRACT

Assessing the extinction risk of species based on the International Union for Conservation of Nature (IUCN) Red List (RL) is key to guiding conservation policies and reducing biodiversity loss. This process is resource demanding, however, and requires continuous updating, which becomes increasingly difficult as new species are added to the RL. Automatic methods, such as comparative analyses used to predict species' RL categories, can be an efficient alternative for keeping assessments up to date. Using amphibians as a study group, we predicted which species are more likely to change their RL category and thus should be prioritized for reassessment. We used species' biological traits, environmental variables, and proxies of climate and land-use change as predictors of RL category. We produced an ensemble prediction of IUCN RL category for each species by combining 4 different model algorithms: cumulative link models, phylogenetic generalized least squares, random forests, and neural networks. By comparing RL categories with the ensemble prediction and accounting for uncertainty among model algorithms, we identified species that should be prioritized for future reassessment based on the mismatch between predicted and observed values. The most important predictor variables across models were species' range size and spatial configuration of the range, biological traits, climate change, and land-use change. We compared our proposed prioritization index and the predicted RL changes with independent IUCN RL reassessments and found high performance of both the prioritization and the predicted directionality of changes in RL categories. Ensemble modeling of RL category is a promising tool for prioritizing species for reassessment while accounting for model uncertainty. 
This approach is broadly applicable to all taxa on the IUCN RL and to regional and national assessments and may improve allocation of the limited human and economic resources available to maintain an up-to-date IUCN RL.
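The core of the approach described above — combining ordinal category predictions from several models and ranking species by the mismatch with their current category, down-weighted by inter-model disagreement — can be sketched as follows. This is an illustrative reconstruction, not the study's code: the category scale, model labels, and the simple disagreement penalty are assumptions.

```python
# Sketch: ensemble Red List (RL) category prediction with a crude
# reassessment-priority score. Categories are treated as an ordered scale.
from statistics import mean, stdev

CATEGORIES = ["LC", "NT", "VU", "EN", "CR"]  # ordered by extinction risk

def ensemble_priority(current: str, model_preds: dict) -> dict:
    """Average the ordinal index predicted by each model; the mismatch with
    the observed category, down-weighted when models disagree, serves as a
    simple priority score for reassessment."""
    idx = [CATEGORIES.index(p) for p in model_preds.values()]
    predicted = mean(idx)
    disagreement = stdev(idx) if len(idx) > 1 else 0.0
    mismatch = predicted - CATEGORIES.index(current)
    priority = abs(mismatch) / (1.0 + disagreement)
    direction = "uplist" if mismatch > 0 else "downlist" if mismatch < 0 else "none"
    return {"predicted_index": predicted, "direction": direction, "priority": priority}

# Hypothetical species currently Least Concern; the four model labels mirror
# the algorithm families named in the abstract.
species = ensemble_priority("LC", {"CLM": "NT", "PGLS": "VU", "RF": "NT", "NN": "VU"})
print(species["direction"], round(species["priority"], 2))
```

In the study itself, model uncertainty was handled more formally; this sketch only conveys the mismatch-plus-disagreement idea.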


Use of comparative extinction risk analysis to prioritize the reassessment of amphibians on the IUCN Red List

2.
Food Chem ; 457: 140073, 2024 Jun 11.
Article in English | MEDLINE | ID: mdl-38909456

ABSTRACT

The phytochemical composition, physicochemical attributes, and colloidal and interfacial properties of polyphenol-enriched protein particle ingredients produced with pulse proteins (e.g., chickpea protein, pea protein, and a chickpea-pea protein blend) and polyphenols recovered from wild blueberry pomace were investigated. Anthocyanins were the major polyphenol fraction (27.74-36.47 mg C3G/g) of these polyphenol-rich particles (44.95-62.08 mg GAE/g). Dispersions of pea protein-polyphenol particles showed superior phase stability before and after heat treatment compared with the chickpea-pea protein-polyphenol system. This observation was independent of the amount of NaCl added to the dispersion. In general, at quasi-equilibrium, pulse protein-polyphenol particles and the parental pulse protein ingredients showed similar oil-water interfacial tension. However, pea protein-polyphenol particles demonstrated a reduced diffusion-driven oil-water interfacial adsorption rate constant compared with the parental pea protein ingredient. Overall, the results suggest the application potential of pea protein-polyphenol particles as a functional food/beverage ingredient.

3.
Article in English | MEDLINE | ID: mdl-38842428

ABSTRACT

In a previous study characterizing Campylobacter strains deficient in selenium metabolism, 50 strains were found to be similar to, but distinct from, the selenonegative species Campylobacter lanienae. Initial characterization based on multilocus sequence typing and the phylogeny of a set of 20 core genes determined that these strains form three putative taxa within the selenonegative cluster. A polyphasic study was undertaken here to further clarify their taxonomic position within the genus. The 50 selenonegative strains underwent phylogenetic analyses based on the sequences of the 16S rRNA gene and an expanded set of 330 core genes. Standard phenotypic testing was also performed. All strains were microaerobic and anaerobic, Gram-negative, spiral or curved cells with some displaying coccoid morphologies. Strains were motile, oxidase, catalase, and alkaline phosphatase positive, urease negative, and reduced nitrate. Strains within each clade had unique phenotypic profiles that distinguished them from other members of the genus. Core genome phylogeny clearly placed the 50 strains into three clades. Pairwise average nucleotide identity and digital DNA-DNA hybridization values were all below the recommended cut-offs for species delineation with respect to C. lanienae and other related Campylobacter species. The data presented here clearly show that these strains represent three novel species within the genus, for which the names Campylobacter devanensis sp. nov. (type strain RM3662T=LMG 33097T=NCTC 15074T), Campylobacter porcelli sp. nov. (type strain RM6137T=LMG 33098T=CCUG 77054T=NCTC 15075T) and Campylobacter vicugnae sp. nov. (type strain RM12175T=LMG 33099T=CCUG 77055T=NCTC 15076T) are proposed.


Subject(s)
Bacterial Typing Techniques , Campylobacter , DNA, Bacterial , Multilocus Sequence Typing , Nucleic Acid Hybridization , Phylogeny , RNA, Ribosomal, 16S , Sequence Analysis, DNA , RNA, Ribosomal, 16S/genetics , Campylobacter/genetics , Campylobacter/classification , Campylobacter/isolation & purification , Animals , DNA, Bacterial/genetics , Swine , Ruminants/microbiology
4.
Spat Spatiotemporal Epidemiol ; 49: 100659, 2024 Jun.
Article in English | MEDLINE | ID: mdl-38876558

ABSTRACT

Spatial cluster analyses are commonly used in epidemiologic studies of case-control data to detect whether certain areas in a study region have an excess of disease risk. Case-control studies are susceptible to potential biases including selection bias, which can result from non-participation of eligible subjects in the study. However, there has been no systematic evaluation of the effects of non-participation on the findings of spatial cluster analyses. In this paper, we perform a simulation study assessing the effect of non-participation on spatial cluster analysis using the local spatial scan statistic under a variety of scenarios that vary the location and rates of study non-participation and the presence and intensity of a zone of elevated risk for disease for simulated case-control studies. We find that geographic areas of lower participation among controls than cases can greatly inflate false-positive rates for identification of artificial spatial clusters. Additionally, we find that even modest non-participation outside of a true zone of elevated risk can decrease spatial power to identify the true zone. We propose a spatial algorithm to correct for potentially spatially structured non-participation that compares the spatial distributions of the observed sample and underlying population. We demonstrate its ability to markedly decrease false positive rates in the absence of elevated risk and resist decreasing spatial sensitivity to detect true zones of elevated risk. We apply our method to a case-control study of non-Hodgkin lymphoma. Our findings suggest that greater attention should be paid to the potential effects of non-participation in spatial cluster studies.
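The inflation mechanism described above can be illustrated with a toy calculation (not the authors' simulation, and not the full spatial scan statistic): if controls inside a candidate window participate less often than cases, the inside-versus-outside case fraction rises even when true disease risk is uniform.

```python
# Toy illustration of how spatially structured non-participation among
# controls mimics a disease cluster. All counts are invented.
def local_relative_risk(cases_in, controls_in, cases_out, controls_out):
    """Ratio of the case fraction inside a candidate cluster window to the
    case fraction outside it -- a crude ingredient of scan-type statistics."""
    inside = cases_in / (cases_in + controls_in)
    outside = cases_out / (cases_out + controls_out)
    return inside / outside

# True risk is uniform: 100 cases and 100 controls both inside and outside.
baseline = local_relative_risk(100, 100, 100, 100)

# Same uniform risk, but only 60% of eligible controls inside the window
# participate: the window now spuriously looks like an area of excess risk.
biased = local_relative_risk(100, 60, 100, 100)
print(baseline, round(biased, 3))
```

The correction proposed in the paper works in the opposite direction, comparing the spatial distribution of the observed sample against the underlying population to detect exactly this kind of imbalance.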


Subject(s)
Spatial Analysis , Humans , Cluster Analysis , Case-Control Studies , Selection Bias , Computer Simulation , Algorithms , Lymphoma, Non-Hodgkin/epidemiology
5.
Cureus ; 16(5): e61146, 2024 May.
Article in English | MEDLINE | ID: mdl-38933631

ABSTRACT

INTRODUCTION: Phytotherapeutics derived from medicinal plants are used to treat various illnesses, including viral infections such as SARS, MERS, and SARS-CoV-2, as well as bacterial and fungal diseases. Research into the chemical composition of plant components continues to inform the development of new drugs, with a particular emphasis on cytotoxic agents for anticancer drugs. Traditional extraction methods have limitations, prompting the exploration of environmentally friendly technologies such as ultrasound-assisted, supercritical fluid, microwave-assisted, and accelerated solvent extraction. This study aimed to optimize the extraction conditions of bioactive compounds from Urtica dioica in Kurdistan, comparing conventional and non-conventional extraction methods, solvents, and extraction times. MATERIALS AND METHODS: Between June 2022 and August 2022, fresh leaves and stems of U. dioica were collected and sequentially subjected to four extraction methods (maceration, Soxhlet, ultrasound-assisted extraction (UAE), and microwave-assisted extraction (MAE)) using petroleum ether, chloroform, ethanol, and distilled water as solvents. RESULTS: The results highlighted significant variations in the yields of bioactive compounds depending on the extraction method, solvent type, and duration. Among conventional methods, Soxhlet was the most powerful and gave the highest extraction yields, while maceration gave the lowest. The modern techniques surpassed the conventional methods by producing high extraction yields within a shorter time (a few minutes) and with less solvent; consequently, UAE and MAE emerged as the most efficient techniques. Of these, MAE produced the highest extraction yields and is considered the preferred technique. 
The choice of solvent significantly influenced extraction yields, with ethanol consistently emerging as an effective solvent across extraction methods. In contrast, petroleum ether demonstrated the lowest efficacy as a solvent. The results also revealed the impact of extraction time on yields, indicating a correlation between longer extraction time and higher yield in certain cases. CONCLUSION: Extraction is a critical step in the study of medicinal plants. The amount of extracted compound is significantly affected by the extraction method, solvent, and time. Ethanol stands out as the most effective solvent, producing the highest yields of bioactive compounds, while petroleum ether yields the least. Extraction yield also increases with extraction time. Among conventional methods, Soxhlet is the most powerful, while maceration yields the least. Modern techniques, particularly UAE and MAE, surpass conventional methods by achieving high yields in shorter times with less solvent. MAE in particular offers shortened extraction time, increased efficiency, reduced labor, and enhanced selectivity, making it the preferred method for extracting bioactive compounds from the aerial parts of U. dioica.

6.
Clin Neuropsychol ; : 1-17, 2024 May 13.
Article in English | MEDLINE | ID: mdl-38741352

ABSTRACT

Objective: Our study aimed to explore whether physical condition might affect the association between genetic predisposition for Alzheimer's disease (AD) and AD incidence. Methods: The sample consisted of 561 community-dwelling adults over 64 years old without baseline dementia (508 cognitively normal and 53 with mild cognitive impairment), drawn from HELIAD, an ongoing longitudinal study with follow-up evaluations every 3 years. Physical condition was assessed at baseline through walking time (WT), while a polygenic risk score for late-onset AD (PRS-AD) was used to estimate genetic predisposition. The association of WT and PRS-AD with AD incidence was evaluated with Cox proportional hazards models adjusted for age, sex, education years, global cognition score, and APOE ε4 genotype. Then, the association between WT and AD incidence was investigated after stratifying participants by low and high PRS-AD. Finally, we examined the association between PRS-AD and AD incidence after stratifying participants by WT. Results: Both WT and PRS-AD were associated with increased AD incidence (p < 0.05) after adjustments. In stratified analyses, participants in the slow WT group with greater genetic risk had a 2.5-fold higher risk of developing AD compared with participants with lower genetic risk (p = 0.047). No association was observed in the fast WT group or when participants were stratified based on PRS-AD. Conclusions: Genetic predisposition for AD is more closely related to AD incidence in the group of older adults with slow WT. Hence, physical condition might be a modifier in the relationship between genetic predisposition and AD incidence.

7.
Int J Cancer ; 2024 May 16.
Article in English | MEDLINE | ID: mdl-38757245

ABSTRACT

Dietary folate intake has been identified as a potentially modifiable factor of gastric cancer (GC) risk, although the evidence is still inconsistent. We evaluated the association between dietary folate intake and the risk of GC as well as the potential modification effect of alcohol consumption. We pooled data for 2829 histologically confirmed GC cases and 8141 controls from 11 case-control studies from the international Stomach Cancer Pooling Consortium. Dietary folate intake was estimated using food frequency questionnaires. We used linear mixed models with random intercepts for each study to calculate adjusted odds ratios (OR) and 95% confidence intervals (CI). Higher folate intake was associated with a lower risk of GC, although this association was not observed among participants who consumed >2.0 alcoholic drinks/day. The OR for the highest quartile of folate intake, compared with the lowest quartile, was 0.78 (95% CI, 0.67-0.90; P-trend = 0.0002). The OR per quartile increment was 0.92 (95% CI, 0.87-0.96) and, per every 100 µg/day of folate intake, was 0.89 (95% CI, 0.84-0.95). There was a significant interaction between folate intake and alcohol consumption (P-interaction = 0.02). The lower risk of GC associated with higher folate intake was not observed in participants who consumed >2.0 drinks/day: the OR for the highest versus lowest quartile was 1.15 (95% CI, 0.85-1.56) and the OR per 100 µg/day was 1.02 (95% CI, 0.92-1.15). Our study supports a beneficial effect of folate intake on GC risk, although the consumption of >2.0 alcoholic drinks/day counteracts this beneficial effect.
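For readers unfamiliar with the quoted quantities, a quartile odds ratio and its 95% confidence interval can be computed from a 2x2 table with the standard Woolf (logit) method. The counts below are invented for illustration and are not the consortium's data.

```python
# Odds ratio with Woolf (logit) 95% CI from a 2x2 table; counts are made up.
import math

def odds_ratio_ci(a, b, c, d):
    """a/b = exposed cases/controls, c/d = unexposed cases/controls."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1/a + 1/b + 1/c + 1/d)       # SE of log(OR)
    lo = math.exp(math.log(or_) - 1.96 * se)
    hi = math.exp(math.log(or_) + 1.96 * se)
    return or_, lo, hi

# Hypothetical highest-vs.-lowest folate quartile table:
#   600 cases / 2100 controls in Q4, 800 cases / 2050 controls in Q1.
or_, lo, hi = odds_ratio_ci(600, 2100, 800, 2050)
print(round(or_, 2), round(lo, 2), round(hi, 2))
```

The study's pooled estimates additionally adjust for confounders via mixed-effects models, which a raw 2x2 calculation cannot do.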

8.
JHEP Rep ; 6(5): 101050, 2024 May.
Article in English | MEDLINE | ID: mdl-38699531

ABSTRACT

Background & Aims: Peripartum prophylaxis (PP) with tenofovir disoproxil fumarate (TDF) is the standard of care to prevent mother-to-child transmission of chronic hepatitis B (CHB) infection in mothers who are highly viremic. We investigated the maternal and infant outcomes in a large Chinese cohort of TDF-treated pregnant participants with CHB. Methods: In this prospective study, treatment-naive mothers with CHB who were highly viremic (HBV DNA ≥200,000 IU/ml) but without cirrhosis were treated with TDF at 24-28 weeks of pregnancy. In accordance with Chinese CHB guidelines, TDF was stopped at delivery or ≥4 weeks postpartum. Serum HBV DNA and alanine aminotransferase were monitored every 6-8 weeks to determine virological relapse (VR). Infants received standard neonatal immunization, and HBV serology was checked at 7-12 months of age. Results: Among 330 participants recruited (median age 30, 82.7% HBeAg+, median HBV DNA 7.82 log IU/ml), TDF was stopped at delivery in 66.4% and at ≥4 weeks postpartum in 33.6%. VR was observed in 98.3%, of whom 11.6% were retreated with TDF. Timing of TDF cessation did not alter the risk of VR (99.0 vs. 96.9%), clinical relapse (19.5 vs. 14.3%), or retreatment (12.6 vs. 10.1%) (all p > 0.05). Similar proportions of patients in the early and late withdrawal groups developed alanine aminotransferase flares exceeding five times (1.1 vs. 2.1%; p = 0.464) and 10 times (0.5 vs. 0%; p = 0.669) the upper limit of normal. No infants developed HBsAg positivity. Conclusions: PP-TDF and neonatal immunization were highly effective in preventing mother-to-child transmission of HBV in mothers who are highly viremic. Timing of cessation of PP-TDF did not affect the risk of VR or retreatment. 
Impact and Implications: In pregnant mothers with chronic hepatitis B infection who are started on peripartum tenofovir to prevent mother-to-child-transmission (MTCT), the optimal timing for antiviral withdrawal during the postpartum period remains unknown. This prospective study demonstrates that stopping tenofovir immediately at delivery, compared with longer treatment duration of tenofovir, did not lead to an increased risk of virological relapse, retreatment, or transmission of the virus to the baby. Shortening the duration of peripartum antiviral prophylaxis from 12 weeks to immediately after delivery can be considered. The immediate withdrawal of peripartum tenofovir, combined with standard neonatal immunization schemes, is 100% effective in preventing MTCT among pregnant mothers with CHB who are highly viremic, with a high rate of vaccine response in infants.

9.
Infection ; 2024 May 11.
Article in English | MEDLINE | ID: mdl-38733459

ABSTRACT

PURPOSE: It is unclear whether common maternal infections during pregnancy are risk factors for adverse birth outcomes. We assessed the association between self-reported infections during pregnancy with preterm birth and small-for-gestational-age (SGA) in an international cohort consortium. METHODS: Data on 120,507 pregnant women were obtained from six population-based birth cohorts in Australia, Denmark, Israel, Norway, the UK and the USA. Self-reported common infections during pregnancy included influenza-like illness, common cold, any respiratory tract infection, vaginal thrush, vaginal infections, cystitis, urinary tract infection, and the symptoms fever and diarrhoea. Birth outcomes included preterm birth, low birth weight and SGA. Associations between maternal infections and birth outcomes were first assessed using Poisson regression in each cohort and then pooled using random-effect meta-analysis. Risk ratios (RR) and 95% confidence intervals (CI) were calculated, adjusted for potential confounders. RESULTS: Vaginal infections (pooled RR, 1.10; 95% CI, 1.02-1.20) and urinary tract infections (pooled RR, 1.17; 95% CI, 1.09-1.26) during pregnancy were associated with higher risk of preterm birth. Similar associations with low birth weight were also observed for these two infections. Fever during pregnancy was associated with higher risk of SGA (pooled RR, 1.07; 95% CI, 1.02-1.12). No other significant associations were observed between maternal infections/symptoms and birth outcomes. CONCLUSION: Vaginal infections and urinary infections during pregnancy were associated with a small increased risk of preterm birth and low birth weight, whereas fever was associated with SGA. These findings require confirmation in future studies with laboratory-confirmed infection diagnosis.
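The two-stage design described above — per-cohort regression estimates pooled with a random-effects meta-analysis — is commonly implemented with the DerSimonian-Laird estimator. A minimal sketch, using invented cohort estimates rather than the study's data:

```python
# DerSimonian-Laird random-effects pooling of per-cohort risk ratios.
# The three cohort estimates and standard errors below are invented.
import math

def pool_random_effects(rrs, ses):
    """rrs: per-cohort risk ratios; ses: standard errors of log(RR).
    Returns pooled RR with a 95% CI."""
    y = [math.log(r) for r in rrs]
    w = [1 / s**2 for s in ses]                       # fixed-effect weights
    y_fe = sum(wi * yi for wi, yi in zip(w, y)) / sum(w)
    q = sum(wi * (yi - y_fe)**2 for wi, yi in zip(w, y))   # Cochran's Q
    c = sum(w) - sum(wi**2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - (len(y) - 1)) / c)           # between-cohort variance
    w_re = [1 / (s**2 + tau2) for s in ses]           # random-effects weights
    y_re = sum(wi * yi for wi, yi in zip(w_re, y)) / sum(w_re)
    se_re = math.sqrt(1 / sum(w_re))
    return (math.exp(y_re),
            math.exp(y_re - 1.96 * se_re),
            math.exp(y_re + 1.96 * se_re))

rr, lo, hi = pool_random_effects([1.05, 1.20, 1.15], [0.06, 0.08, 0.05])
print(round(rr, 2), round(lo, 2), round(hi, 2))
```

When the estimated between-cohort variance tau2 is zero (as when I2 is small), the random-effects result collapses to the fixed-effect pooled estimate.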

10.
Biomedicines ; 12(5)2024 May 10.
Article in English | MEDLINE | ID: mdl-38791015

ABSTRACT

The possible relationship between Subjective Cognitive Decline (SCD) and dementia needs further investigation. In the present study, we explored the association between specific biomarkers of Alzheimer's Disease (AD), amyloid-beta 42 (Aβ42) and Tau, with the odds of SCD using data from two ongoing studies. In total, 849 cognitively normal (CN) individuals were included in our analyses. Among the participants, 107 had available results regarding cerebrospinal fluid (CSF) Aβ42 and Tau, while 742 had available genetic data to construct polygenic risk scores (PRSs) reflecting their genetic predisposition for CSF Aβ42 and plasma total Tau levels. The associations between AD biomarkers and SCD were tested using logistic regression models adjusted for possible confounders such as age, sex, education, depression, and baseline cognitive test scores. Abnormal values of CSF Aβ42 were related to 2.5-fold higher odds of SCD, while higher polygenic loading for Aβ42 was associated with 1.6-fold higher odds of SCD. CSF Tau, as well as polygenic loading for total Tau, were not associated with SCD. Thus, only cerebral amyloidosis appears to be related to SCD status, either in the form of polygenic risk or actual CSF measurements. The temporal sequence of amyloidosis being followed by tauopathy may partially explain our findings.
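As background, a polygenic risk score of the kind used in this and the preceding study is simply a weighted sum of risk-allele counts across variants. A minimal sketch with placeholder variant IDs and effect weights (not from these studies):

```python
# Minimal polygenic risk score (PRS): weighted sum of risk-allele counts.
# The SNP IDs and per-allele weights below are placeholders.
def polygenic_risk_score(genotypes, weights):
    """genotypes: variant -> risk-allele count (0, 1, or 2);
    weights: variant -> per-allele effect size (e.g., a GWAS beta)."""
    return sum(weights[snp] * count for snp, count in genotypes.items())

weights = {"rs0001": 0.12, "rs0002": -0.05, "rs0003": 0.30}
person = {"rs0001": 2, "rs0002": 1, "rs0003": 0}
score = polygenic_risk_score(person, weights)
print(round(score, 2))
```

Real PRS pipelines additionally handle strand alignment, linkage-disequilibrium pruning, and score standardization, which this sketch omits.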

11.
J Affect Disord ; 359: 373-381, 2024 Aug 15.
Article in English | MEDLINE | ID: mdl-38788860

ABSTRACT

BACKGROUND: Emerging observational evidence supports a role for higher fruit and vegetable intake in protecting against the development of depression. However, there is a scarcity of research in older adults or in low- to middle-income countries (LMICs). METHODS: Participants were 7801 community-based adults (mean age 68.6 ± 8.0 years, 55.8 % female) without depression, from 10 diverse cohorts, including four cohorts from LMICs. Fruit and vegetable intake was self-reported via comprehensive food frequency questionnaire, short food questionnaire or diet history. Depressive symptoms were assessed using validated measures, and depression defined applying validated cut-offs. The associations between baseline fruit and vegetable intakes and incident depression over a follow-up period of three to nine years were examined using Cox regression. Analyses were performed by cohort with results meta-analysed. RESULTS: There were 1630 cases of incident depression (21 % of participants) over 40,258 person-years of follow-up. Higher intake of fruit was associated with a lower risk of incident depression (HR 0.87, 95%CI [0.77, 0.99], I2 = 4 %). No association was found between vegetable intake and incident depression (HR 0.93, 95%CI [0.84, 1.04], I2 = 0 %). LIMITATIONS: Diverse measures used across the different cohorts and the modest sample size of our study compared with prior studies may have prevented an association being detected for vegetable intake. CONCLUSIONS: Our study supports a role for fruit, but not vegetable intake in protecting against depression. Research investigating different types of fruits and vegetables using standardised measures in larger cohorts of older adults from low- and middle-income countries is warranted.


Subject(s)
Depression , Diet , Fruit , Vegetables , Humans , Female , Male , Aged , Middle Aged , Depression/epidemiology , Longitudinal Studies , Diet/statistics & numerical data , Incidence
12.
Neuropharmacology ; 255: 110019, 2024 Sep 01.
Article in English | MEDLINE | ID: mdl-38810926

ABSTRACT

The endogenous opioid system has been implicated in alcohol consumption and preference in both humans and animals. The mu opioid receptor (MOR) is expressed on multiple cell types in the striatum; however, little is known about the contributions of specific MOR populations to alcohol drinking behaviors. The current study used mice with a genetic deletion of MOR in cholinergic cells (ChAT-Cre/Oprm1fl/fl) to examine the role of MORs expressed in cholinergic interneurons (CINs) in home cage self-administration paradigms. Male and female ChAT-Cre/Oprm1fl/fl mice were generated, and heterozygous Cre+ (knockout) and Cre- (control) mice were tested for alcohol consumption in two drinking paradigms: limited-access "Drinking in the Dark" (DID) and intermittent access. Quinine was added to the drinking bottles in the DID experiment to test aversion-resistant, "compulsive" drinking. Nicotine and sucrose drinking were also assessed so comparisons could be made with other rewarding substances. Cholinergic MOR deletion did not influence consumption of or preference for ethanol (EtOH) in either drinking task. Differences in aversion resistance were observed in males, with Cre+ mice tolerating lower concentrations of quinine than Cre-. In contrast to EtOH, preference for nicotine was reduced following cholinergic MOR deletion, while sucrose consumption and preference were increased in Cre+ (vs. Cre-) females. Locomotor activity was also greater in females following the deletion. These results suggest that cholinergic MORs participate in preference for rewarding substances. Further, while they are not required for consumption of alcohol alone, cholinergic MORs may influence the tendency to drink despite negative consequences.


Subject(s)
Alcohol Drinking , Mice, Knockout , Quinine , Receptors, Opioid, mu , Reward , Animals , Receptors, Opioid, mu/genetics , Receptors, Opioid, mu/metabolism , Male , Female , Mice , Quinine/pharmacology , Quinine/administration & dosage , Alcohol Drinking/genetics , Alcohol Drinking/psychology , Nicotine/pharmacology , Ethanol/pharmacology , Ethanol/administration & dosage , Cholinergic Neurons/drug effects , Cholinergic Neurons/physiology , Cholinergic Neurons/metabolism , Self Administration , Sucrose/administration & dosage , Avoidance Learning/drug effects , Avoidance Learning/physiology , Interneurons/drug effects , Interneurons/physiology , Interneurons/metabolism
13.
Environ Int ; 188: 108767, 2024 Jun.
Article in English | MEDLINE | ID: mdl-38795658

ABSTRACT

BACKGROUND: Polychlorinated dibenzo-p-dioxins and dibenzofurans (PCDD/Fs) are persistent organic pollutants emitted from industrial sources. Residential proximity to these emissions has been associated with risk of non-Hodgkin lymphoma (NHL) in a limited number of studies. METHODS: We evaluated associations between residential proximity to PCDD/F-emitting facilities and NHL in the NIH-AARP Diet and Health Study (N = 451,410), a prospective cohort enrolled in 1995-1996 in 6 states and 2 U.S. cities. We linked enrollment addresses with a U.S. Environmental Protection Agency database of 4,478 historical PCDD/F sources with estimated toxic equivalency quotient (TEQ) emissions. We evaluated associations between NHL and exposures during a historical period prior to enrollment (1980-1995) using an average emissions index, weighted by toxicity, distance, and wind direction (AEI-W [g TEQ/km2]), within 3, 5, and 10 km of residences. We also evaluated proximity-only metrics indicating the presence/absence of one or more facilities within each distance, and metrics calculated separately for each facility type. We used Cox regression to estimate associations (hazard ratio, HR; 95% confidence interval, 95% CI) with NHL and major subtypes, adjusting for demographic, lifestyle, and dietary factors. RESULTS: A total of 6,467 incident cases of NHL were diagnosed through 2011. Participants with an AEI-W at or above the 95th percentile had elevated risk of NHL compared with those unexposed at 3 km (HR = 1.16; 95% CI = 0.89-1.52; p-trend = 0.24), 5 km (HR = 1.20; 95% CI = 0.99-1.46; p-trend = 0.05), and 10 km (HR = 1.15; 95% CI = 0.99-1.34; p-trend = 0.04). We found a positive association at 5 km with follicular lymphoma (HR for ≥95th percentile vs. unexposed = 1.62; 95% CI = 0.98-2.67; p-trend = 0.05) and a suggestive association for diffuse large B-cell lymphoma (HR = 1.40; 95% CI = 0.91-2.14; p-trend = 0.11). 
NHL risk was also associated with high emissions from coal-fired power plants within 10 km (HR for ≥95th percentile vs. unexposed = 1.42; 95% CI = 1.09-1.84; p-trend = 0.05). CONCLUSIONS: Residential proximity to relatively high dioxin emissions from industrial sources may increase the risk of NHL and specific subtypes.


Subject(s)
Lymphoma, Non-Hodgkin , Humans , Lymphoma, Non-Hodgkin/epidemiology , Lymphoma, Non-Hodgkin/chemically induced , Middle Aged , United States/epidemiology , Male , Female , Dioxins/analysis , Aged , Environmental Exposure/statistics & numerical data , Prospective Studies , Air Pollutants/analysis
14.
Int J Epidemiol ; 53(3)2024 Apr 11.
Article in English | MEDLINE | ID: mdl-38670544

ABSTRACT

BACKGROUND: Evidence on the potential association between dietary copper intake and gastric cancer (GC) is lacking. Thus, we aimed to evaluate this association within the Stomach cancer Pooling (StoP) Project-an international consortium of epidemiological studies on GC. METHODS: Data from five case-control studies within the StoP Project were included (2448 cases, 4350 controls). We estimated adjusted odds ratios (ORs) and 95% CIs for the association between dietary copper intake and GC using multivariable mixed-effects logistic regression models. We also modelled the dose-response relationship between copper intake and GC using a logistic mixed-effects model with fractional polynomial. RESULTS: The OR for the highest quartile of copper intake compared with the lowest one was 0.78 (95% CI: 0.63-0.95; P for trend = 0.013). Results were similar for non-cardia-type (OR: 0.72; 95% CI: 0.57-0.91), intestinal-type (OR: 0.75; 95% CI: 0.56-0.99) and other histological-type GC (OR: 0.65; 95% CI: 0.44-0.96). The dose-response analysis showed a steep decrease in ORs for modest intakes (<1 mg/day), which were subsequently steady for ≤3 mg/day (OR: 0.09; 95% CI: 0.02-0.41) and slowly increased for higher intakes. CONCLUSIONS: The findings of our large study suggest that copper intake might be inversely associated with GC, although their confirmation by prospective studies is required.


Subject(s)
Copper , Diet , Stomach Neoplasms , Humans , Stomach Neoplasms/epidemiology , Copper/administration & dosage , Female , Male , Middle Aged , Case-Control Studies , Aged , Logistic Models , Adult , Odds Ratio , Risk Factors
15.
J Eval Clin Pract ; 30(4): 693-702, 2024 Jun.
Article in English | MEDLINE | ID: mdl-38652541

ABSTRACT

RATIONALE: The shift toward virtual academic detailing (AD) was accelerated by the COVID-19 pandemic. AIMS AND OBJECTIVES: We aimed to examine the role of external, contextual, and intrinsic programme-specific factors in virtual engagement of healthcare providers (HCPs) and delivery of AD. METHODS: AD groups throughout North America were contacted to participate in semistructured interviews. An interview guide was constructed by adapting the Consolidated Framework for Implementation Research (CFIR). A point of emphasis included strategies AD groups employed for provider engagement while implementing virtual AD programmes. Independent coders conducted qualitative analysis using the framework method. RESULTS: Fifteen AD groups from Canada (n = 3) and the United States (n = 12) participated. Technological issues and training detailers and HCPs were challenges during the transition to virtual AD visits. Restrictions on in-person activities during the pandemic created difficulties engaging HCPs and fewer AD visits. Continuing education was one strategy to incentivize participation, but credits were often not claimed by HCPs. Groups with established networks and prior experience with virtual AD leveraged connections to mitigate disruptions and continue AD visits. Other facilitators included emphasizing contemporary topics, including opioid education beyond fundamental guidelines. Virtual AD had the additional benefit of expanding geographic reach and flexible scheduling with providers. CONCLUSIONS: AD groups across North America have shifted to virtual outreach and delivery strategies. This trend toward virtual AD may aid outreach to vulnerable rural communities, improving health equity. More research is needed on the effectiveness of virtual AD and its future implications.


Subject(s)
COVID-19 , Qualitative Research , Humans , COVID-19/epidemiology , Canada , Health Personnel/education , North America , SARS-CoV-2 , United States , Telemedicine/organization & administration , Pandemics
16.
Life (Basel) ; 14(4), 2024 Mar 24.
Article in English | MEDLINE | ID: mdl-38672702

ABSTRACT

Background: Restless legs syndrome/Willis-Ekbom disease (RLS/WED) has occasionally, but not consistently, been associated with cognitive impairment, most notably in language and executive function. The present study was conducted to investigate the cognitive trajectories of older individuals with RLS/WED. Methods: Participants were drawn from the randomly selected, older (>64 years), population-based HELIAD cohort. Individuals without dementia and with available neuropsychological evaluations at baseline and follow-up were considered for potential eligibility. A comprehensive assessment examining five principal components of cognition (memory, visuo-spatial ability, attention, executive function, and language) was administered to the participants. Generalized estimating equation analyses were used to examine the unadjusted and adjusted (for critical factors and covariates) effects of RLS/WED on cognition over time. Results: A total of 1003 predominantly female (59.5%), older (72.9 ± 4.9 years) participants with follow-up evaluations after a mean of 3.09 ± 0.85 years and without dementia at baseline or follow-up were included in the present study. Among them, 81 were diagnosed with RLS/WED at baseline. Global cognition, memory, attention, and executive and visuo-spatial skills did not differ between those with and without RLS/WED. However, the RLS/WED group performed worse on language at baseline by 0.249 standard deviations, while demonstrating an attenuated language decline over time, by 0.063 standard deviations. The unadjusted models yielded similar results. Conclusions: Our findings indicate a baseline language disadvantage among older individuals with RLS/WED, but this initial discrepancy tends to diminish over time.

17.
Eur J Nutr ; 2024 Apr 30.
Article in English | MEDLINE | ID: mdl-38687390

ABSTRACT

PURPOSE: Gastric cancer (GC) is among the leading causes of cancer mortality worldwide. The objective of this study was to investigate the association between dietary fiber intake and GC. METHODS: We pooled data from 11 population- or hospital-based case-control studies included in the Stomach Cancer Pooling (StoP) Project, for a total of 4865 histologically confirmed cases and 10,626 controls. Intake of dietary fiber and other dietary factors was collected using food frequency questionnaires. We calculated odds ratios (OR) and 95% confidence intervals (CI) for the association between dietary fiber intake and GC using a multivariable logistic regression model adjusted for study site, sex, age, caloric intake, smoking, fruit and vegetable intake, and socioeconomic status. We conducted analyses stratified by these factors, as well as by GC anatomical site and histological type. RESULTS: The OR of GC for an increase of one quartile of fiber intake was 0.91 (95% CI: 0.85, 0.97); the OR for the highest compared with the lowest quartile of dietary fiber intake was 0.72 (95% CI: 0.59, 0.88). Results were similar irrespective of anatomical site and histological type. CONCLUSION: Our analysis supports the hypothesis that dietary fiber intake may exert a protective effect on GC.
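The odds ratios above come from a multivariable logistic regression, but for a single binary exposure the estimate reduces to the classic 2×2-table calculation with a Wald confidence interval. A minimal sketch (the counts below are hypothetical illustrations, not the StoP data, and the real model additionally adjusts for covariates):

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio and Wald 95% CI from a 2x2 table.

    a = exposed cases,   b = exposed controls,
    c = unexposed cases, d = unexposed controls.
    """
    or_ = (a * d) / (b * c)
    # Standard error of log(OR) via the Woolf formula
    se_log_or = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    lo = math.exp(math.log(or_) - z * se_log_or)
    hi = math.exp(math.log(or_) + z * se_log_or)
    return or_, lo, hi

# Hypothetical counts: highest vs. lowest fiber quartile (NOT the StoP data)
or_, lo, hi = odds_ratio_ci(300, 900, 450, 950)
```

With covariates in the model, the adjusted OR is obtained instead by exponentiating the fitted logistic-regression coefficient for the exposure term.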

18.
Science ; 384(6694): 453-458, 2024 Apr 26.
Article in English | MEDLINE | ID: mdl-38662833

ABSTRACT

Governments recently adopted new global targets to halt and reverse the loss of biodiversity. It is therefore crucial to understand the outcomes of conservation actions. We conducted a global meta-analysis of 186 studies (including 665 trials) that measured biodiversity over time and compared outcomes under conservation action with a suitable counterfactual of no action. We find that in two-thirds of cases, conservation either improved the state of biodiversity or at least slowed declines. Specifically, we find that interventions targeted at species and ecosystems, such as invasive species control, habitat loss reduction and restoration, protected areas, and sustainable management, are highly effective and have large effect sizes. This provides the strongest evidence to date that conservation actions are successful but require transformational scaling up to meet global targets.
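Pooled estimates in a meta-analysis like the one above are commonly obtained by inverse-variance weighting of per-study effect sizes. A minimal fixed-effect sketch (illustrative numbers only; the study itself may well use random-effects or hierarchical models):

```python
import math

def pooled_effect(effects, ses, z=1.96):
    """Fixed-effect (inverse-variance) pooling of per-study effect sizes.

    effects: study effect estimates; ses: their standard errors.
    Returns the pooled estimate and its Wald 95% CI.
    """
    weights = [1 / se ** 2 for se in ses]  # more precise studies weigh more
    est = sum(w * e for w, e in zip(weights, effects)) / sum(weights)
    se = math.sqrt(1 / sum(weights))
    return est, est - z * se, est + z * se

# Illustrative per-trial effect sizes and standard errors (NOT the study's data)
est, lo, hi = pooled_effect([0.2, 0.5, 0.35], [0.1, 0.2, 0.15])
```

A random-effects variant would additionally estimate between-study heterogeneity (e.g., the DerSimonian-Laird tau-squared) and inflate each study's variance accordingly.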


Subject(s)
Biodiversity , Conservation of Natural Resources , Extinction, Biological , Introduced Species , Animals , Ecosystem
19.
Environ Int ; 187: 108644, 2024 May.
Article in English | MEDLINE | ID: mdl-38636272

ABSTRACT

Glyphosate is the most widely applied herbicide worldwide. Glyphosate biomonitoring data are limited for agricultural settings. We measured urinary glyphosate concentrations and assessed exposure determinants in the Biomarkers of Exposure and Effect in Agriculture (BEEA) study. We selected four groups of BEEA participants based on self-reported pesticide exposure: recently exposed farmers with occupational glyphosate use in the last 7 days (n = 98), farmers with high lifetime glyphosate use (>80th percentile) but no use in the last 7 days (n = 70), farming controls with minimal lifetime use (n = 100), and nonfarming controls with no occupational pesticide exposures and no recent home/garden glyphosate use (n = 100). Glyphosate was quantified in first morning void urine using ion chromatography isotope-dilution tandem mass spectrometry. We estimated associations between urinary glyphosate concentrations and potential determinants using multivariable linear regression. Glyphosate was detected (≥0.2 µg/L) in the urine of most farmers with recent (91 %) and high lifetime (93 %) use, as well as farming (88 %) and nonfarming (81 %) controls; geometric mean concentrations were 0.89, 0.59, 0.46, and 0.39 µg/L (0.79, 0.51, 0.42, and 0.37 µg/g creatinine), respectively. Compared with both control groups, urinary glyphosate concentrations were significantly elevated among recently exposed farmers (P < 0.0001), particularly those who used glyphosate in the previous day [vs. nonfarming controls; geometric mean ratio (GMR) = 5.46; 95 % confidence interval (CI): 3.75, 7.93]. Concentrations among high lifetime exposed farmers were also elevated (P < 0.01 vs. nonfarming controls).
Among recently exposed farmers, glyphosate concentrations were higher among those not wearing gloves when applying glyphosate (GMR = 1.91; 95 % CI: 1.17, 3.11), not wearing long-sleeved shirts when mixing/loading glyphosate (GMR = 2.00; 95 % CI: 1.04, 3.86), applying glyphosate exclusively using broadcast/boom sprayers (vs. hand sprayer only; GMR = 1.70; 95 % CI: 1.00, 2.92), and applying glyphosate to crops (vs. non-crop; GMR = 1.72; 95 % CI: 1.04, 2.84). Both farmers and nonfarmers are exposed to glyphosate, with recency of occupational glyphosate use being the strongest determinant of urinary glyphosate concentrations. Continued biomonitoring of glyphosate in various settings is warranted.
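The geometric mean ratios (GMRs) reported above correspond to exponentiated coefficients from a linear regression on log-transformed urinary concentrations; for an unadjusted two-group comparison this is simply the ratio of geometric means. A minimal sketch with hypothetical concentrations (NOT the BEEA data, which also adjust for covariates):

```python
import math
from statistics import fmean

def geometric_mean(xs):
    """exp of the mean of the log values."""
    return math.exp(fmean(math.log(x) for x in xs))

def geometric_mean_ratio(exposed, reference):
    """Ratio of geometric means; equals the exponentiated coefficient
    of a group indicator in a linear model on log concentrations."""
    return geometric_mean(exposed) / geometric_mean(reference)

# Hypothetical urinary glyphosate concentrations in ug/L (NOT the BEEA data)
farmers = [2.1, 3.5, 1.8, 4.2]
controls = [0.4, 0.5, 0.3, 0.6]
gmr = geometric_mean_ratio(farmers, controls)
```

Working on the log scale is why a GMR, rather than a difference in arithmetic means, is the natural effect measure for right-skewed biomonitoring concentrations.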


Subject(s)
Agriculture , Biological Monitoring , Biomarkers , Farmers , Glycine , Glyphosate , Herbicides , Occupational Exposure , Humans , Glycine/analogs & derivatives , Glycine/urine , Male , Occupational Exposure/analysis , Herbicides/urine , Middle Aged , Adult , Biomarkers/urine , Aged , Environmental Monitoring/methods
20.
PLOS Glob Public Health ; 4(3): e0002744, 2024.
Article in English | MEDLINE | ID: mdl-38446807

ABSTRACT

Aedes aegypti control has been fraught with challenges in Puerto Rico. The government has implemented commonly used vector control methods, but arboviral epidemics still occur. It is necessary to explore new Ae. aegypti control methods. This study aimed to understand the perceptions of community members in Ponce, Puerto Rico about emergent and traditional Ae. aegypti vector control methods and determine their acceptability and support for these methods. We identified the type of information needed to increase support for emergent vector control methods, and the preferred strategies to disseminate this information. Four group discussions were conducted with a total of 32 participants representing eight of the 14 clusters participating in the Communities Organized for the Prevention of Arboviruses (COPA), a project designed to mobilize communities in Ponce, Puerto Rico to prevent diseases transmitted by mosquitoes. Group discussions began with an overview of different methods used for controlling Ae. aegypti mosquitoes. These overviews facilitated participant understanding of the mosquito control methods presented. Use of source reduction, autocidal gravid ovitraps (AGO), and manual application of larvicide for arboviral mosquito control received support from almost all participants. Vector control methods that use more familiar techniques in Puerto Rico such as truck-mounted larvicide spraying (TMLS) and insecticide residual spraying received support from most participants. More than half of participants supported the use of emergent mosquito control methods including Wolbachia suppression, Wolbachia replacement, or genetically modified mosquitoes (GMM). Participants preferred to receive vector control information through house-to-house visits with the distribution of written materials, followed by dissemination of information through traditional (i.e., radio, television) and social media. 
The detailed information resulting from this study was used to develop messages for a communications campaign to garner future community support. Community acceptance and support are critical for the success of vector control programs using emergent mosquito control methods.
