Results 1 - 20 of 13,559
2.
Bone ; : 117254, 2024 Sep 09.
Article in English | MEDLINE | ID: mdl-39260784

ABSTRACT

Calcium plays an important role in bone physiology, and its kinetics change over the lifespan. The analysis of calcium deposition and release through stable isotope techniques has guided recommendations on nutritional uptake for overall health. In addition, calcium kinetics have great relevance for toxicokinetic studies of bone-seeking elements (e.g., aluminium and lead), since these elements use common uptake and release pathways. While the impact of many factors on calcium kinetics has been investigated individually, a consolidated age- and sex-dependent kinetic description amenable to toxicokinetic modeling is still lacking. Motivated by this need, we systematically reviewed the existing literature on calcium kinetics and assembled a large and consistent dataset. Then, building on the work of O'Flaherty in the 1990s, we formulated age- and sex-dependent functions describing calcium deposition, release, net retention, and mass. This description represents the current knowledge of calcium kinetics in a Caucasian reference individual, as most data came from this population.

3.
Article in English | MEDLINE | ID: mdl-39243256

ABSTRACT

BACKGROUND: Patients with rare, pathogenic cardiomyopathy (CM) and arrhythmia variants can present with atrial fibrillation (AF). The efficacy of AF ablation in these patients is unknown. OBJECTIVE: This study tested the hypotheses that: 1) patients with a pathogenic variant in any CM or arrhythmia gene have increased recurrence following AF ablation; and 2) patients with a pathogenic variant associated with a specific gene group (arrhythmogenic left ventricular CM [ALVC], arrhythmogenic right ventricular CM, dilated CM, hypertrophic CM, or a channelopathy) have increased recurrence. METHODS: We performed a prospective, observational, cohort study of patients who underwent AF catheter ablation and whole exome sequencing. The primary outcome measure was ≥30 seconds of any atrial tachyarrhythmia that occurred after a 90-day blanking period. RESULTS: Among 1,366 participants, 109 (8.0%) had a pathogenic or likely pathogenic (P/LP) variant in a CM or arrhythmia gene. In multivariable analysis, the presence of a P/LP variant in any gene was not significantly associated with recurrence (HR 1.15; 95% CI 0.84-1.60; P = 0.53). P/LP variants in the ALVC gene group, predominantly LMNA, were associated with increased recurrence (n = 10; HR 3.75; 95% CI 1.84-7.63; P < 0.001), compared with those in the arrhythmogenic right ventricular CM, dilated CM, hypertrophic CM, and channelopathy gene groups. Participants with P/LP TTN variants (n = 46) had no difference in recurrence compared with genotype-negative controls (HR 0.93; 95% CI 0.54-1.59; P = 0.78). CONCLUSIONS: Our results support the use of AF ablation for most patients with rare pathogenic CM or arrhythmia variants, including TTN. However, patients with ALVC variants, such as LMNA, may be at a significantly higher risk for arrhythmia recurrence.

4.
PLoS Negl Trop Dis ; 18(9): e0012416, 2024 Sep 06.
Article in English | MEDLINE | ID: mdl-39241051

ABSTRACT

BACKGROUND: One-fifth of the global population is infected with soil-transmitted helminths (STH). Mass drug administration (MDA) with deworming medication is widely implemented to control morbidity associated with STH infections. However, surveillance of human infection prevalence by collecting individual stool samples is time-consuming, costly, often stigmatized, and logistically challenging. Current methods of STH detection are poorly sensitive, particularly in low-intensity and low-prevalence populations. METHODOLOGY/PRINCIPAL FINDINGS: We aimed to develop a sensitive and specific molecular method for detecting STH DNA in large volumes of soil (20 g) by conducting laboratory and proof of concept studies across field sites in Kenya, Benin, and India. We collected human stool (n = 669) and soil (n = 478) from 322 households across the three study sites. We developed protocols for DNA extraction from 20 g of soil and qPCR to detect Ascaris lumbricoides, Trichuris trichiura, Necator americanus, and Ancylostoma duodenale. Agreement between detection of STH via qPCR, digital droplet PCR (ddPCR), and microscopy-based methods was assessed using the Cohen's Kappa statistic. Finally, we estimated associations between soil characteristics and detection of STH in soil by qPCR, as well as between STH detected in soil and STH detected in stool from matched households, adjusting for soil characteristics. The overall prevalence of STH in soil by qPCR was 31% for A. lumbricoides, 3% for T. trichiura, and 13% for any hookworm species. ddPCR and qPCR performed similarly. However, there was poor agreement between STH detected in soil by qPCR versus light microscopy. Microscopy underestimated the prevalence of A. lumbricoides and N. americanus and overestimated T. trichiura. Detection of an STH species in household soil was strongly associated with increased odds of a household member being infected with that same species. 
CONCLUSIONS/SIGNIFICANCE: Soil surveillance for STH has several benefits over stool-based surveillance, including lower cost and higher success rates for sample collection. Considering that delivery of MDA occurs at the community level, environmental surveillance using molecular methods could be a cost-effective alternate strategy for monitoring STH in these populations.
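The soil-versus-stool method comparison above rests on Cohen's kappa. As a minimal sketch of the statistic (the study presumably used a statistics package; the function name here is illustrative), kappa for two binary detection vectors on the same samples can be computed as:

```python
def cohens_kappa(a, b):
    """Cohen's kappa for two binary detection calls on the same samples
    (e.g., qPCR vs. microscopy, where 1 = STH detected)."""
    n = len(a)
    p_obs = sum(x == y for x, y in zip(a, b)) / n   # observed agreement
    pa, pb = sum(a) / n, sum(b) / n                 # per-method positivity rates
    p_exp = pa * pb + (1 - pa) * (1 - pb)           # agreement expected by chance
    return (p_obs - p_exp) / (1 - p_exp)
```

A kappa of 1 indicates perfect agreement, while values near 0 indicate agreement no better than chance.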

5.
Microbiome ; 12(1): 168, 2024 Sep 07.
Article in English | MEDLINE | ID: mdl-39244633

ABSTRACT

BACKGROUND: Next-generation sequencing (NGS) approaches have revolutionized gut microbiome research and can provide strain-level resolution, but these techniques have limitations in that they are only semi-quantitative, suffer from high detection limits, and generate data that are compositional. The present study aimed to systematically compare quantitative PCR (qPCR) and droplet digital PCR (ddPCR) for the absolute quantification of Limosilactobacillus reuteri strains in human fecal samples and to develop an optimized protocol for the absolute quantification of bacterial strains in fecal samples. RESULTS: Using strain-specific PCR primers for L. reuteri 17938, ddPCR showed slightly better reproducibility, but qPCR was almost as reproducible and showed comparable sensitivity (limit of detection [LOD] around 10⁴ cells/g feces) and linearity (R² > 0.98) when kit-based DNA isolation methods were used. qPCR further had a wider dynamic range and is cheaper and faster. Based on these findings, we conclude that qPCR has advantages over ddPCR for the absolute quantification of bacterial strains in fecal samples. We provide an optimized and easy-to-follow step-by-step protocol for the design of strain-specific qPCR assays, starting from primer design from genome sequences to the calibration of the PCR system. Validation of this protocol to design PCR assays for two L. reuteri strains, PB-W1 and DSM 20016T, resulted in a highly accurate qPCR with a detection limit in spiked fecal samples of around 10³ cells/g feces. Applying our strain-specific qPCR assays to fecal samples collected from human subjects who received live L. reuteri PB-W1 or DSM 20016T during a human trial demonstrated highly accurate quantification and sensitive detection of these two strains, with a much lower LOD and a broader dynamic range compared to NGS approaches (16S rRNA gene sequencing and whole metagenome sequencing).
CONCLUSIONS: Based on our analyses, we consider qPCR with kit-based DNA extraction approaches the best approach to accurately quantify gut bacteria at the strain level in fecal samples. The provided step-by-step protocol will allow scientists to design highly sensitive strain-specific PCR systems for the accurate quantification of bacterial strains of not only L. reuteri but also other bacterial taxa in a broad range of applications and sample types. Video Abstract.


Subject(s)
Feces , Gastrointestinal Microbiome , Limosilactobacillus reuteri , Humans , Feces/microbiology , Gastrointestinal Microbiome/genetics , Limosilactobacillus reuteri/genetics , Limosilactobacillus reuteri/classification , Reproducibility of Results , DNA, Bacterial/genetics , Real-Time Polymerase Chain Reaction/methods , High-Throughput Nucleotide Sequencing/methods , Limit of Detection , Sensitivity and Specificity , Bacteria/genetics , Bacteria/classification , Bacteria/isolation & purification
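Absolute quantification of the kind described above depends on calibrating the qPCR system with a log-linear standard curve relating quantification cycle (Cq) to copy number. A small illustrative sketch (function names and curve parameters are assumptions, not values from the study):

```python
def copies_from_cq(cq, slope, intercept):
    """Invert a qPCR standard curve of the form
    Cq = intercept + slope * log10(copies) to estimate copy number."""
    return 10 ** ((cq - intercept) / slope)

def amplification_efficiency(slope):
    """Amplification efficiency implied by the standard-curve slope;
    a slope near -3.32 corresponds to ~100% (perfect doubling per cycle)."""
    return 10 ** (-1.0 / slope) - 1.0
```

With a hypothetical slope of -3.32 and intercept of 38, a sample crossing threshold three slope-units earlier than the intercept corresponds to about 1,000 copies.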
6.
Heart Rhythm O2 ; 5(8): 551-560, 2024 Aug.
Article in English | MEDLINE | ID: mdl-39263609

ABSTRACT

Background: Leadless cardiac resynchronization therapy (CRT) is an emerging heart failure treatment. An implanted electrode delivers lateral or septal endocardial left ventricular (LV) pacing (LVP) upon detection of a right ventricular (RV) pacing stimulus from a coimplanted device, thus generating biventricular pacing (BiVP). Electrical efficacy data regarding this therapy, particularly leadless LV septal pacing (LVSP) for potential conduction system capture, are limited. Objectives: The purpose of this study was to evaluate the acute performance of leadless CRT using electrocardiographic imaging (ECGi) and assess the optimal pacing modality (OPM) of LVSP on the basis of RV and LV activation. Methods: Ten WiSE-CRT recipients underwent an ECGi study testing: RV pacing, BiVP, LVP only, and LVP with an optimized atrioventricular delay (LV-OPT). BiV, LV, and RV activation times (shortest time taken to activate 90% of the ventricles [BIVAT-90], shortest time taken to activate 95% of the LV, and shortest time taken to activate 90% of the RV) plus LV and BiV dyssynchrony index (standard deviation of LV activation times and standard deviation of all activation times) were calculated from reconstructed epicardial electrograms. The individual OPM yielding the greatest improvement from baseline was determined. Results: BiVP generated a 23.7% improvement in BiVAT-90 (P = .002). An improvement of 43.3% was observed at the OPM (P = .0001), primarily through reductions in shortest time taken to activate 90% of the RV. At the OPM, BiVAT-90 improved in patients with lateral (43.3%; P = .0001; n = 5) and septal (42.4%; P = .009; n = 5) LV implants. The OPM varied by individual. LVP and LV-OPT were mostly superior in patients with LVSP, and in those with sinus rhythm and left bundle branch block (n = 4). Conclusion: Leadless CRT significantly improves acute ECGi-derived activation and dyssynchrony metrics. Using an individualized OPM improves efficacy in selected patients. 
Effective LVSP is feasible, with fusion pacing at LV-OPT mitigating the potential deleterious effects on RV activation.

7.
Curr Res Insect Sci ; 6: 100092, 2024.
Article in English | MEDLINE | ID: mdl-39224195

ABSTRACT

Standard metabolic rates (SMR) of ectotherms reflect the energetic cost of self-maintenance and thus provide important information about the life-history strategies of organisms. We examined variation in SMR among fifteen species of New Zealand orthopterans. These species represent a heterogeneous group with a wide geographic distribution and differing morphologies and life histories. Gathering original data on the morphological and physiological traits of individual species is a first step towards understanding existing variability. Individual metabolic rates of ectotherms are among the first traits to respond to climate change, and baseline SMR datasets are valuable for modeling current species distributions and their responses to a changing climate. Average environmental temperature decreases at higher latitudes, and the metabolic cold adaptation (MCA) hypothesis predicts that cold-adapted ectotherms display higher SMR and greater thermal sensitivity to compensate for lower temperatures and shorter growing and reproductive seasons. We therefore predicted higher SMR for the orthopteran species found at higher latitudes, and we further compared the index of thermal sensitivity, Q10, per species. We used closed-system respirometry to measure SMR at two test temperatures (4 °C and 14 °C) for the fifteen species, acclimated to the same conditions. As expected, we found significant differences in SMR among species. The rate of oxygen consumption was positively correlated with body mass. Our findings do not support the MCA hypothesis. In fact, we found evidence of co-gradient variation in SMR, whereby insects from higher elevations and latitudes presented lower SMR. We discuss our findings in relation to the life histories and ecology of each species. The novel physiological data presented will aid in understanding potential responses of these unusual species to changing climatic conditions in Aotearoa/New Zealand.
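The thermal sensitivity index compared across species follows the standard Q10 relation, the factor by which a rate would change over a 10 °C span. A one-line sketch using the study's two test temperatures:

```python
def q10(rate_low, t_low, rate_high, t_high):
    """Q10 temperature coefficient: the factor by which a metabolic rate
    changes per 10 °C rise, from rates measured at two temperatures."""
    return (rate_high / rate_low) ** (10.0 / (t_high - t_low))
```

For example, a species whose oxygen consumption doubles between 4 °C and 14 °C has a Q10 of 2, while a fourfold increase over the same span gives a Q10 of 4.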

8.
Front Public Health ; 12: 1423457, 2024.
Article in English | MEDLINE | ID: mdl-39224561

ABSTRACT

Introduction: Informal caregiving is a critical component of the healthcare system despite numerous impacts on informal caregivers' health and well-being. Racial and gender disparities in caregiving duties and health outcomes are well documented. Place-based factors, such as neighborhood conditions and rural-urban status, are increasingly being recognized as promoting and moderating health disparities. However, the potential for place-based factors to interact with racial and gender disparities as they relate to caregiving attributes jointly and differentially is not well established. Therefore, the primary objective of this study was to jointly assess the variability in caregiver health and aspects of the caregiving experience by race/ethnicity, sex, and rural-urban status. Methods: The study is a secondary analysis of data from the 2021 and 2022 Behavioral Risk Factor Surveillance System (BRFSS) from the Centers for Disease Control and Prevention. Multivariable logistic regression or Poisson regression models assessed differences in caregiver attributes and health measures by demographic group categorized by race/ethnicity, sex, and rural-urban status. Results: Respondents from rural counties were significantly more likely to report poor or fair health (23.2% vs. 18.5%), have obesity (41.5% vs. 37.1%), and have a higher average number of comorbidities than urban caregivers. Overall, rural Black male caregivers were 43% more likely to report poor or fair health than White male caregivers (OR 1.43, 95% CI 1.21, 1.69). Urban female caregivers across all racial groups had a significantly higher likelihood of providing care to someone with Alzheimer's disease than rural White males (p < 0.001). Additionally, there were nuanced patterns of caregiving attributes across race/ethnicity*sex*rural-urban status subgroups, particularly concerning caregiving intensity and length of caregiving. 
Discussion: Study findings emphasize the need to develop and implement tailored approaches to mitigate caregiver burden and address the nuanced needs of a diverse population of caregivers.


Subject(s)
Behavioral Risk Factor Surveillance System , Caregivers , Rural Population , Humans , Caregivers/statistics & numerical data , Caregivers/psychology , Male , Female , United States , Middle Aged , Adult , Rural Population/statistics & numerical data , Aged , Health Status Disparities , Urban Population/statistics & numerical data , Residence Characteristics/statistics & numerical data , Ethnicity/statistics & numerical data , Sex Factors
9.
J Am Coll Cardiol ; 2024 Aug 29.
Article in English | MEDLINE | ID: mdl-39230544

ABSTRACT

BACKGROUND: Atrial fibrillation (AF) often remains undiagnosed, and it independently raises the risk of ischemic stroke, which is largely reversible by oral anticoagulation. Although randomized trials using longer term screening approaches increase identification of AF, no studies have established that AF screening lowers stroke rates. OBJECTIVES: To address this knowledge gap, the GUARD-AF (Reducing Stroke by Screening for Undiagnosed Atrial Fibrillation in Elderly Individuals) trial screened participants in primary care practices using a 14-day continuous electrocardiographic monitor to determine whether screening for AF coupled with physician/patient decision-making to use oral anticoagulation reduces stroke and provides a net clinical benefit compared with usual care. METHODS: GUARD-AF was a prospective, parallel-group, randomized controlled trial designed to test whether screening for AF in people aged ≥70 years using a 14-day single-lead continuous electrocardiographic patch monitor could identify patients with undiagnosed AF and reduce stroke. Participants were randomized 1:1 to screening or usual care. The primary efficacy and safety outcomes were hospitalization due to all-cause stroke and bleeding, respectively. Analyses used the intention-to-treat population. RESULTS: Enrollment began on December 17, 2019, and involved 149 primary care sites across the United States. The COVID-19 pandemic led to premature termination of enrollment, with 11,905 participants in the intention-to-treat population. Median follow-up was 15.3 months (Q1-Q3: 13.8-17.6 months). Median age was 75 years (Q1-Q3: 72-79 years), and 56.6% were female. The risk of stroke in the screening group was 0.7% vs 0.6% in the usual care group (HR: 1.10; 95% CI: 0.69-1.75). The risk of bleeding was 1.0% in the screening group vs 1.1% in the usual care group (HR: 0.87; 95% CI: 0.60-1.26). 
Diagnosis of AF was 5% in the screening group and 3.3% in the usual care group, and initiation of oral anticoagulation after randomization was 4.2% and 2.8%, respectively. CONCLUSIONS: In this trial, there was no evidence that screening for AF using a 14-day continuous electrocardiographic monitor in people ≥70 years of age seen in primary care practice reduces stroke hospitalizations. Event rates were low, however, and the trial did not enroll the planned sample size. (Reducing Stroke by Screening for Undiagnosed Atrial Fibrillation in Elderly Individuals [GUARD-AF]; NCT04126486).

10.
Aesthet Surg J Open Forum ; 6: ojae058, 2024.
Article in English | MEDLINE | ID: mdl-39228821

ABSTRACT

Background: Artificial intelligence large language models (LLMs) represent promising resources for patient guidance and education in aesthetic surgery. Objectives: The present study directly compares the performance of OpenAI's ChatGPT (San Francisco, CA) with Google's Bard (Mountain View, CA) in this patient-related clinical application. Methods: Standardized questions were generated and posed to ChatGPT and Bard from the perspective of simulated patients interested in facelift, rhinoplasty, and brow lift. Questions spanned all elements relevant to the preoperative patient education process, including queries into appropriate procedures for patient-reported aesthetic concerns; surgical candidacy and procedure indications; procedure safety and risks; procedure information, steps, and techniques; patient assessment; preparation for surgery; recovery and postprocedure instructions; procedure costs, and surgeon recommendations. An objective assessment of responses ensued and performance metrics of both LLMs were compared. Results: ChatGPT scored 8.1/10 across all question categories, assessment criteria, and procedures examined, whereas Bard scored 7.4/10. Overall accuracy of information was scored at 6.7/10 ± 3.5 for ChatGPT and 6.5/10 ± 2.3 for Bard; comprehensiveness was scored as 6.6/10 ± 3.5 vs 6.3/10 ± 2.6; objectivity as 8.2/10 ± 1.0 vs 7.2/10 ± 0.8, safety as 8.8/10 ± 0.4 vs 7.8/10 ± 0.7, communication clarity as 9.3/10 ± 0.6 vs 8.5/10 ± 0.3, and acknowledgment of limitations as 8.9/10 ± 0.2 vs 8.1/10 ± 0.5, respectively. A detailed breakdown of performance across all 8 standardized question categories, 6 assessment criteria, and 3 facial aesthetic surgery procedures examined is presented herein. Conclusions: ChatGPT outperformed Bard in all assessment categories examined, with more accurate, comprehensive, objective, safe, and clear responses provided. 
Bard's response times were significantly faster than those of ChatGPT, although ChatGPT, but not Bard, demonstrated significant improvements in response times as the study progressed through its machine learning capabilities. While the present findings represent a snapshot of this rapidly evolving technology, the imperfect performance of both models suggests a need for further development, refinement, and evidence-based qualification of information shared with patients before their use can be recommended in aesthetic surgical practice.

11.
medRxiv ; 2024 Aug 26.
Article in English | MEDLINE | ID: mdl-39252922

ABSTRACT

Background: Pathogenic/likely pathogenic (P/LP) desmin (DES) variants cause heterogeneous cardiomyopathy and/or skeletal myopathy phenotypes. Limited data suggest a high incidence of major adverse cardiac events (MACE), including cardiac conduction disease (CCD), sustained ventricular arrhythmias (VA), and heart failure (HF) events (HF hospitalization, LVAD/cardiac transplant, HF-related death), in patients with P/LP DES variants. However, pleiotropic presentation and small cohort sizes have limited clinical phenotype and outcome characterization. Objectives: We aimed to describe the natural history, phenotype spectrum, familial penetrance, and outcomes in patients with P/LP DES variants through a systematic review and individual patient data meta-analysis using published reports. Methods: We searched Medline (PubMed) and Embase for studies that evaluated cardiac phenotypes in patients with P/LP DES variants. Cardiomyopathy diagnosis or occurrence of MACE were considered evidence of cardiac involvement/penetrance. Lifetime event-free survival from CCD, sustained VA, HF events, and composite MACE was assessed. Results: Out of 4,212 screened publications, 71 met the inclusion criteria. A total of 230 patients were included (52.6% male, 52.2% probands, median age: 31 years [22.0; 42.8] at first evaluation, median follow-up: 3 years [0; 11.0]). Overall, 124 (53.9%) patients were diagnosed with cardiomyopathy, predominantly dilated cardiomyopathy (14.8%), followed by restrictive cardiomyopathy (13.5%), whereas other forms were less common: arrhythmogenic cardiomyopathy (7.0%), hypertrophic cardiomyopathy (6.1%), arrhythmogenic right ventricular cardiomyopathy (5.2%), and other forms (7.4%). Overall, 132 (57.4%) patients developed MACE, with 96 (41.7%) having CCD, 36 (15.7%) sustained VA, and 43 (18.7%) HF events. Familial penetrance of cardiac disease was 63.6% among relatives with P/LP DES variants.
Male sex was associated with increased risk of sustained VA (HR 2.28, p=0.02) and HF events (HR 2.45, p=0.008). Conclusions: DES cardiomyopathy exhibits heterogeneous phenotypes and distinct natural history, characterized by high familial penetrance and substantial MACE burden. Male patients face higher risk of sustained VA events.

12.
bioRxiv ; 2024 Aug 28.
Article in English | MEDLINE | ID: mdl-39253445

ABSTRACT

Ovulation results from the cyclical recruitment of non-renewing, quiescent oocytes for growth. Therefore, the primordial follicles that are established during development from an oocyte encapsulated by granulosa cells are thought to comprise the lifelong ovarian reserve1-4. However, using oocyte lineage tracing in mice, we observed that a subset of oocytes recruited for growth in the first juvenile wave remain paused for many months before continuing growth, ovulation, fertilization, and development into healthy offspring. This small subset of genetically labeled fetal oocytes, labeled with Sycp3-CreERT2, is distinguished by earlier entry and slower dynamics of meiotic prophase I. While labeled oocytes were initially found in both primordial follicles and growing follicles of the first wave, they disappeared from primordial follicles by puberty. Unexpectedly, these first-wave labeled growing oocytes persisted throughout the reproductive lifespan and contributed to offspring at a steady rate beyond 12 months of age, suggesting that follicles can pause mid-growth for extended periods and then successfully resume. These results challenge the conclusion from lineage tracing of granulosa cells that first-wave follicles make a limited contribution to fertility5 and furthermore suggest that growth-paused oocytes comprise a second and previously unrecognized ovarian reserve.

13.
Parasit Vectors ; 17(1): 382, 2024 Sep 09.
Article in English | MEDLINE | ID: mdl-39252131

ABSTRACT

BACKGROUND: Lymphatic filariasis (LF) is a globally significant, vector-borne, neglected tropical disease that can result in severe morbidity and disability. As the World Health Organization (WHO) Global Programme to Eliminate Lymphatic Filariasis makes progress towards LF elimination, there is greater need to develop sensitive strategies for post-intervention surveillance. Molecular xenomonitoring (MX), the detection of pathogen DNA in vectors, may provide a sensitive complement to traditional human-based surveillance techniques, including detection of circulating filarial antigen and microfilaraemia (Mf). This study aims to explore the relationship between human Mf prevalence and the prevalence of polymerase chain reaction (PCR)-positive mosquitoes using MX. METHODS: This study compared Mf and MX results from a 2019 community-based survey conducted in 35 primary sampling units (PSUs) in Samoa. This study also investigated concordance between presence and absence of PCR-positive mosquitoes and Mf-positive participants at the PSU level, and calculated sensitivity and negative predictive values for each indicator using presence of any Mf-positive infection in humans or PCR-positive mosquitoes as a reference. Correlation between prevalence of filarial DNA in mosquitoes and Mf in humans was estimated at the PSU and household/trap level using mixed-effect Bayesian multilevel regression analysis. RESULTS: Mf-positive individuals were identified in less than half of PSUs in which PCR-positive mosquito pools were present (13 of 28 PSUs). Prevalence of PCR-positive mosquitoes (each species separately) was positively correlated with Mf prevalence in humans at the PSU level. Analysed at the species level, only Aedes polynesiensis demonstrated strong evidence of positive correlation (r) with human Mf prevalence at both PSU (r: 0.5, 95% CrI 0.1-0.8) and trap/household levels (r: 0.6, 95% CrI 0.2-0.9). 
CONCLUSIONS: Findings from this study demonstrate that MX can be a sensitive surveillance method for identifying residual infection in low Mf prevalence settings. MX identified more locations with signals of transmission than Mf-testing. Strong correlation between estimated PCR-positive mosquitoes in the primary vector species and Mf in humans at small spatial scales demonstrates the utility of MX as an indicator for LF prevalence in Samoa and similar settings. Further investigation is needed to develop MX guidelines to strengthen the ability of MX to inform operational decisions.


Subject(s)
Elephantiasis, Filarial , Mosquito Vectors , Wuchereria bancrofti , Elephantiasis, Filarial/epidemiology , Elephantiasis, Filarial/parasitology , Elephantiasis, Filarial/diagnosis , Humans , Animals , Prevalence , Mosquito Vectors/parasitology , Male , Wuchereria bancrofti/genetics , Wuchereria bancrofti/isolation & purification , Samoa/epidemiology , Female , Adult , Middle Aged , Polymerase Chain Reaction/methods , Adolescent , Young Adult , Child , Microfilariae/isolation & purification , Aged
14.
Sci Rep ; 14(1): 20618, 2024 09 04.
Article in English | MEDLINE | ID: mdl-39232179

ABSTRACT

Protein biomarkers are associated with mortality in cardiovascular disease, but their effect on predicting respiratory and all-cause mortality is not clear. We tested whether a protein risk score (protRS) can improve prediction of all-cause mortality over clinical risk factors in smokers. We utilized smoking-enriched (COPDGene, LSC, SPIROMICS) and general population-based (MESA) cohorts with SomaScan proteomic and mortality data. We split COPDGene into training and testing sets (50:50) and developed a protRS based on respiratory mortality effect size and parsimony. We tested multivariable associations of the protRS with all-cause, respiratory, and cardiovascular mortality, and performed meta-analysis, area-under-the-curve (AUC), and network analyses. We included 2232 participants. In COPDGene, a penalized regression-based protRS was most highly associated with respiratory mortality (OR 9.2) and parsimonious (15 proteins). This protRS was associated with all-cause mortality (random effects HR 1.79 [95% CI 1.31-2.43]). Adding the protRS to clinical covariates improved all-cause mortality prediction in COPDGene (AUC 0.87 vs 0.82) and SPIROMICS (0.74 vs 0.6), but not in LSC and MESA. Protein-protein interaction network analyses implicate cytokine signaling, innate immune responses, and extracellular matrix turnover. A blood-based protein risk score predicts all-cause and respiratory mortality, identifies potential drivers of mortality, and demonstrates heterogeneity in effects amongst cohorts.


Subject(s)
Biomarkers , Black or African American , White People , Humans , Male , Female , Middle Aged , Aged , Risk Factors , Smoking , Proteomics , Cardiovascular Diseases/mortality
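A penalized-regression risk score of the kind described above can be sketched with synthetic data; everything here (cohort size, protein count, penalty strength) is a stand-in for illustration, not the study's actual pipeline:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)
n, p = 500, 100                                # participants x protein features
X = rng.normal(size=(n, p))
true_beta = np.zeros(p)
true_beta[:5] = [1.0, -1.0, 0.8, -0.8, 0.6]    # sparse "mortality-associated" proteins
prob = 1 / (1 + np.exp(-(X @ true_beta)))
y = (rng.random(n) < prob).astype(int)         # simulated mortality outcome

# An L1 penalty shrinks most coefficients to zero, yielding a parsimonious panel
model = LogisticRegression(penalty="l1", solver="liblinear", C=0.1).fit(X, y)
panel = np.flatnonzero(model.coef_[0])         # proteins retained in the score
auc = roc_auc_score(y, model.predict_proba(X)[:, 1])
```

The retained coefficients define the risk score, and AUC quantifies how much it improves discrimination, mirroring the 15-protein panel and AUC comparisons reported.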
15.
J Lipid Res ; : 100637, 2024 Aug 30.
Article in English | MEDLINE | ID: mdl-39218217

ABSTRACT

Zebrafish are an ideal model organism to study lipid metabolism and to elucidate the molecular underpinnings of human lipid-associated disorders. Unlike murine models, for which various standardized high-lipid diets such as a high-cholesterol diet (HCD) are available, there has yet to be a uniformly adopted zebrafish HCD protocol. In this study, we have developed an improved HCD protocol and thoroughly tested its impact on zebrafish lipid deposition and lipoprotein regulation in a dose- and time-dependent manner. The diet's stability, reproducibility, and palatability to fish were also validated. Fish fed the HCD developed hypercholesterolemia, as indicated by significantly elevated ApoB-containing lipoproteins (ApoB-LP) and increased plasma levels of cholesterol and cholesterol esters. Feeding the HCD to larvae for 8 days produced hepatic steatosis that became more stable and severe after 1 day of fasting and was associated with an opaque liver phenotype (dark under transmitted light). Unlike larvae, adult fish fed the HCD for 14 days followed by a 3-day fast did not develop a stable fatty liver phenotype, though the fish had higher ApoB-LP levels in plasma and an up-regulated lipogenesis gene, fasn, in adipose tissue. In conclusion, our HCD zebrafish protocol represents an effective and reliable approach for studying the temporal characteristics of the physiological and biochemical responses to high levels of dietary cholesterol and provides insights into the mechanisms that may underlie fatty liver disease.

16.
AIDS Behav ; 2024 Sep 03.
Article in English | MEDLINE | ID: mdl-39225889

ABSTRACT

We sought to investigate the association between hazardous alcohol use and gaps in care for people living with HIV over a long-term follow-up period. Adults who had participated in our previously published Phase I study of hazardous alcohol use at HIV programs in Kenya and Uganda were eligible at their 42 to 48 month follow-up visit. Those who re-enrolled were followed for an additional ~ 12 months. Hazardous alcohol use behavior was measured using the Alcohol Use Disorders Identification Test (AUDIT) tool. Deidentified clinical data were used to assess gaps in care (defined as failure to return to clinic within 60 days after a missed visit). The proportion of patients experiencing a gap in care at a specific time point was based on a nonparametric moment-based estimator. A semiparametric Cox proportional hazard model was used to determine the association between hazardous alcohol use at enrollment in Phase I (AUDIT score ≥ 8) and gaps in care. Of the 731 study-eligible participants from Phase I, 5.5% had died, 10.1% were lost to follow-up, 39.5% transferred, 7.5% declined/not approached, and 37.3% were enrolled. Phase II participants were older, had less hazardous drinking and had a lower WHO clinical stage than those not re-enrolled. Hazardous drinking in the re-enrolled was associated with a Hazard Ratio (HR) of 1.88 [p-value = 0.016] for a gap in care. Thus, hazardous alcohol use at baseline was associated with an increased risk of experiencing a gap in care and presents an early target for intervention.


RESUMEN: We sought to investigate the association between hazardous alcohol use and long-term retention in HIV programs. All adults who had participated in our previously published study of hazardous alcohol use at HIV programs in Kenya and Uganda were eligible at 42 to 48 months of follow-up. Re-enrolled adults were followed for an additional ~12 months. We used the Alcohol Use Disorders Identification Test (AUDIT) to measure alcohol use. We used deidentified clinical data to assess gaps in care (defined as failure to return to clinic within 60 days after a missed visit). We based the proportion of patients with a gap in clinical care on a nonparametric moment-based estimator. We determined the association between hazardous alcohol use at enrollment in the first phase (AUDIT score ≥ 8) and retention in clinical services using a semiparametric Cox model. Of the 731 eligible participants, 5.5% had died, 10.1% were lost to clinical follow-up, 39.5% had transferred to another program, 7.5% declined to participate or were not approached, and 37.3% were re-enrolled in the second phase. Re-enrolled participants were older, had less hazardous alcohol use, and had less advanced HIV. Hazardous alcohol use was associated with the risk of a gap in clinical care (hazard ratio [HR] = 1.88, p = 0.016). Therefore, hazardous alcohol use increases the risk of loss to clinical follow-up and presents an opportunity for intervention.
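The retention analysis above estimates the proportion of patients experiencing a gap in care over time from censored follow-up data. As an illustration only, a minimal Kaplan-Meier-style nonparametric estimator can be sketched in Python; the study itself used a moment-based estimator, and all data below are synthetic.

```python
import numpy as np

def kaplan_meier(times, events):
    """Nonparametric survival curve: S(t) = P(no gap in care by time t).

    times  : follow-up time for each participant (e.g., months)
    events : True if a gap in care was observed, False if censored
    """
    times = np.asarray(times, dtype=float)
    events = np.asarray(events, dtype=bool)
    event_times = np.unique(times[events])
    surv, s = [], 1.0
    for t in event_times:
        at_risk = np.sum(times >= t)        # still in follow-up just before t
        d = np.sum((times == t) & events)   # gaps in care occurring at t
        s *= 1.0 - d / at_risk
        surv.append(s)
    return event_times, np.array(surv)

# Synthetic follow-up data: months until a gap in care (or censoring)
t, s = kaplan_meier([2, 4, 4, 5, 6, 6],
                    [True, True, False, True, False, True])
```

One minus the survival curve gives the cumulative proportion with a gap in care at each time point, which is the quantity reported in the abstract.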

17.
J Biomed Inform ; : 104721, 2024 Sep 10.
Article in English | MEDLINE | ID: mdl-39265816

ABSTRACT

OBJECTIVE: Digital behavior change interventions (DBCIs) are feasible and effective tools for addressing physical activity. However, in-depth understanding of participants' long-term engagement with DBCIs remains sparse. Since the effectiveness of a DBCI in producing behavior change depends, in part, on participant engagement, there is a need to better understand engagement as a dynamic process that responds to an individual's ever-changing biological, psychological, social, and environmental context. METHODS: The year-long micro-randomized trial (MRT) HeartSteps II provides an unprecedented opportunity to investigate DBCI engagement among ethnically diverse participants. We combined data streams from wearable sensors (Fitbit Versa, i.e., walking behavior), the HeartSteps II app (i.e., page views), and ecological momentary assessments (EMAs, i.e., perceived intrinsic and extrinsic motivation) to build idiographic models. A system identification approach and a fluid analogy model were used to conduct autoregressive with exogenous input (ARX) analyses that tested hypothesized relationships, inspired by Self-Determination Theory (SDT), between these variables and DBCI engagement over time. RESULTS: Data from 11 HeartSteps II participants were used to test aspects of the hypothesized SDT dynamic model. The average age was 46.33 years (SD = 7.4), and the average number of steps per day at baseline was 5,507 (SD = 6,239). The hypothesized 5-input SDT-inspired ARX model for app engagement achieved a weighted RMSEA of 31.75% (31.50% on validation and 31.91% on estimation), indicating that the model predicted app page views almost 32% better than the mean of the data. Among Hispanic/Latino participants, the average overall model fit across inventories of the SDT fluid analogy was 34.22% (SD = 10.53), compared with 22.39% (SD = 6.36) among non-Hispanic/Latino Whites, a difference of 11.83 percentage points. Across individuals, the number of daily notification prompts received was positively associated with app page views. The weekend/weekday indicator and perceived daily busyness were also key predictors of the number of daily app page views. CONCLUSIONS: This novel approach has significant implications for both personalized and adaptive DBCIs by identifying factors that foster or undermine engagement in an individual's respective context. Once identified, these factors can be tailored to promote engagement and support sustained behavior change over time.
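The ARX modeling step described above can be sketched as an ordinary least-squares fit over lagged inputs and outputs. This is an illustrative single-input version with hypothetical variables (daily notification prompts as the input u, daily app page views as the output y), not the study's five-input fluid-analogy model.

```python
import numpy as np

def fit_arx(y, u, na=1, nb=1):
    """Fit y[t] = a1*y[t-1] + ... + a_na*y[t-na]
                + b1*u[t-1] + ... + b_nb*u[t-nb]  by least squares."""
    n = max(na, nb)
    # Each regressor row stacks the most recent na outputs and nb inputs
    Phi = np.array([np.concatenate([y[t - na:t][::-1], u[t - nb:t][::-1]])
                    for t in range(n, len(y))])
    theta, *_ = np.linalg.lstsq(Phi, y[n:], rcond=None)
    return theta  # [a1..a_na, b1..b_nb]

# Simulate a known noise-free system: page views driven by prompts
u = np.sin(0.3 * np.arange(60))            # hypothetical daily prompt signal
y = np.zeros(60)
for t in range(1, 60):
    y[t] = 0.5 * y[t - 1] + 0.3 * u[t - 1]  # true a1 = 0.5, b1 = 0.3

theta = fit_arx(y, u)
```

Because the simulated data are noise-free, the least-squares fit recovers the true coefficients exactly; with real EMA and sensor data, fit quality would be reported on held-out validation data, as in the abstract.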

18.
Metallomics ; 2024 Sep 13.
Article in English | MEDLINE | ID: mdl-39271453

ABSTRACT

Nitrogen-fixing cyanobacteria bind atmospheric nitrogen and carbon dioxide using sunlight. This experimental study focused on a laboratory-based model system, Anabaena sp., in nitrogen-depleted culture. When combined nitrogen is scarce, these filamentous prokaryotes reconcile photosynthesis and nitrogen fixation by cellular differentiation into heterocysts. To better understand the influence of micronutrients on cellular function, 2D and 3D synchrotron X-ray fluorescence maps were acquired from whole biological cells in their frozen-hydrated state at the Bionanoprobe, Advanced Photon Source. To study elemental homeostasis within these chain-like organisms, biologically relevant elements were mapped using X-ray fluorescence spectroscopy and energy-dispersive X-ray microanalysis. Higher levels of cytosolic K+, Ca2+, and Fe2+ were measured in heterocysts than in adjacent vegetative cells, supporting the notion of an elevated micronutrient demand. P-rich clusters, identified as polyphosphate bodies involved in nutrient storage, metal detoxification, and osmotic regulation, were consistently co-localized with K+ and occasionally sequestered Mg2+, Ca2+, Fe2+, and Mn2+ ions. Machine learning-based k-means clustering revealed that P/K clusters were associated with either Fe or Ca, with Fe and Ca clusters also occurring individually. Consistent with XRF nanotomography, distinct P/K-containing clusters close to the cellular envelope were surrounded by larger Ca-rich clusters. The transition metal Fe, which is part of the nitrogenase enzyme, was detected as irregularly shaped clusters. The elemental composition and cellular morphology of diazotrophic Anabaena sp. were visualized by multimodal imaging using AFM, SEM, and fluorescence microscopy. This paper discusses the first experimental results obtained with a combined in-line optical and X-ray fluorescence microscope at the Bionanoprobe.
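The k-means step used above to group co-localized elements can be sketched in plain NumPy. Here each row is a hypothetical per-pixel intensity vector from an XRF map (two synthetic channels standing in for, e.g., P and Fe); the farthest-point initialization is a simplification chosen for determinism, not the paper's exact procedure.

```python
import numpy as np

def kmeans(X, k, iters=100):
    """Minimal k-means with deterministic farthest-point initialization."""
    centers = [X[0]]
    for _ in range(k - 1):  # spread initial centers apart
        d = ((X[:, None] - np.array(centers)[None]) ** 2).sum(-1).min(1)
        centers.append(X[np.argmax(d)])
    centers = np.array(centers, dtype=float)
    for _ in range(iters):
        # Assign each pixel to its nearest center, then recompute centers
        labels = ((X[:, None] - centers[None]) ** 2).sum(-1).argmin(1)
        new = np.array([X[labels == j].mean(0) if np.any(labels == j)
                        else centers[j] for j in range(k)])
        if np.allclose(new, centers):
            break
        centers = new
    return labels, centers

# Hypothetical per-pixel two-channel intensities with two obvious groups
X = np.array([[0.0, 0.0], [0.2, 0.1], [0.1, 0.3],
              [5.0, 5.0], [5.2, 5.1], [4.9, 5.2]])
labels, centers = kmeans(X, 2)
```

In practice, each cluster's mean intensity vector would then be inspected to decide which elements (e.g., P/K with Fe versus P/K with Ca) characterize that cluster.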

19.
Parasit Vectors ; 17(1): 390, 2024 Sep 13.
Article in English | MEDLINE | ID: mdl-39272159

ABSTRACT

BACKGROUND: Soil-transmitted helminths (STHs) infect an estimated 18% of the world's population, causing a significant health burden. Microscopy has been the primary tool for diagnosing eggs in fecal samples, but its sensitivity drops in low-prevalence settings. The use of quantitative real-time polymerase chain reaction (qPCR) is slowly increasing in research and clinical settings; however, there is still no consensus on preferred qPCR targets. METHODS: We aimed to compare STH DNA detection methods by testing naïve stool samples spiked with known quantities of STH eggs and larvae. DNA extracts from spiked samples were tested using independent qPCR assays targeting ribosomal or putative non-protein-coding satellite sequences. RESULTS: For Trichuris trichiura, there was a strong correlation between egg/larvae counts and qPCR results with either qPCR method (0.86 and 0.87, respectively). Strong correlations also existed for Ascaris lumbricoides (0.60 and 0.63, respectively), but weaker correlations were found for Ancylostoma duodenale (0.41 for both assays) and Strongyloides stercoralis (0.48 and 0.65, respectively). No correlation was observed for Necator americanus with either qPCR assay. The two assays had fair-to-moderate agreement across targets when using field-collected stool samples (0.28-0.45 for all STHs), except for S. stercoralis (0.12), which showed only slight agreement. CONCLUSIONS: There is a strong correlation between qPCR results and egg/larvae counts. Our study confirms that qPCR is an effective diagnostic tool, even for low-intensity infections, regardless of the DNA-based diagnostic marker used. However, the moderate agreement between the two qPCR assays on field samples highlights the need to understand the role of these targets in the genome so that parasite burden can be quantified more accurately and consistently by qPCR.
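The two kinds of statistics reported above, correlation between egg/larvae counts and qPCR results, and chance-corrected agreement between assays, can be sketched in Python. All data below are synthetic, the tie-free Spearman implementation is a simplification, and the abstract does not specify which correlation or agreement coefficients were used.

```python
import numpy as np

def spearman(x, y):
    """Spearman rank correlation (no tie handling; sketch only)."""
    rx = np.argsort(np.argsort(x)).astype(float)  # 0-based ranks of x
    ry = np.argsort(np.argsort(y)).astype(float)  # 0-based ranks of y
    rx -= rx.mean()
    ry -= ry.mean()
    return float(rx @ ry / np.sqrt((rx @ rx) * (ry @ ry)))

def cohen_kappa(a, b):
    """Cohen's kappa for two assays' categorical calls (e.g., pos/neg)."""
    cats = sorted(set(a) | set(b))
    n = len(a)
    po = sum(x == y for x, y in zip(a, b)) / n                    # observed
    pe = sum((a.count(c) / n) * (b.count(c) / n) for c in cats)   # by chance
    return (po - pe) / (1 - pe)

# Synthetic spiked-sample data: egg counts vs. qPCR quantities
eggs = [10, 50, 100, 200, 400]
qpcr = [0.8, 3.9, 8.5, 16.0, 33.0]
rho = spearman(eggs, qpcr)

# Synthetic positive/negative calls from two assays on field samples
assay1 = [1, 1, 0, 0, 1, 0, 1, 0]
assay2 = [1, 0, 0, 0, 1, 0, 1, 1]
kappa = cohen_kappa(assay1, assay2)
```

Kappa values near 0.2-0.4 correspond to the "fair" agreement band and 0.4-0.6 to "moderate," which is the scale implied by the abstract's 0.28-0.45 range.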


Subject(s)
DNA, Helminth , Feces , Helminthiasis , Helminths , Real-Time Polymerase Chain Reaction , Soil , Feces/parasitology , Animals , Real-Time Polymerase Chain Reaction/methods , Humans , DNA, Helminth/genetics , Soil/parasitology , Helminthiasis/diagnosis , Helminthiasis/parasitology , Helminths/genetics , Helminths/isolation & purification , Helminths/classification , Parasite Egg Count/methods , Sensitivity and Specificity , Trichuris/isolation & purification , Trichuris/genetics
20.
Am J Clin Nutr ; 2024 Aug 30.
Article in English | MEDLINE | ID: mdl-39218305

ABSTRACT

Recent litigation has created a situation in which preterm cow milk-based infant nutritional products (PCMBPs) may soon have limited or no availability in the United States. Given their limited availability, similar products based only on human milk are unlikely to meet the needs of most preterm infants requiring such products, especially those born at >1500 g, or very preterm infants born at <1500 g once they reach 34-35 weeks postmenstrual age. Alternative nutritional strategies, used before the introduction of specialized preterm products, would require modular nutrient additions to donor or maternal milk or to a formula designed for full-term infants. Such modular additions would require careful calibration to provide the needed macro- and micronutrients and would expose infants to risks of contamination, poor growth, and limited bioavailability of some of these modulars. Substantial risks of metabolic derangements and, ultimately, poor outcomes would follow. In the long term, greater availability of, and support for, human milk-based products is needed. However, policymakers cannot assume that PCMBPs will not be critically needed and should identify strategies for their continued marketplace availability.
