Results 1 - 20 of 4,635
1.
Nat Commun ; 15(1): 4795, 2024 Jun 11.
Article in English | MEDLINE | ID: mdl-38862487

ABSTRACT

Microgravity is associated with immunological dysfunction, though the mechanisms are poorly understood. Here, using single-cell analysis of human peripheral blood mononuclear cells (PBMCs) exposed to short-term (25 hours) simulated microgravity, we characterize altered genes and pathways at basal and stimulated states with a Toll-like Receptor-7/8 agonist. We validate single-cell analysis by RNA sequencing and super-resolution microscopy, and against data from the Inspiration-4 (I4) mission, JAXA (Cell-Free Epigenome) mission, Twins study, and spleens from mice on the International Space Station. Overall, microgravity alters specific pathways for optimal immunity, including the cytoskeleton, interferon signaling, pyroptosis, temperature-shock, innate inflammation (e.g., Coronavirus pathogenesis pathway and IL-6 signaling), nuclear receptors, and sirtuin signaling. Microgravity directs monocyte inflammatory parameters, and impairs T cell and NK cell functionality. Using machine learning, we identify numerous compounds linking microgravity to immune cell transcription, and demonstrate that the flavonol quercetin can reverse most abnormal pathways. These results define immune cell alterations in microgravity, and provide opportunities for countermeasures to maintain normal immunity in space.


Subject(s)
Leukocytes, Mononuclear , Single-Cell Analysis , Space Flight , Weightlessness Simulation , Animals , Female , Humans , Male , Mice , Immunity, Innate , Inflammation/immunology , Killer Cells, Natural/immunology , Leukocytes, Mononuclear/immunology , Leukocytes, Mononuclear/metabolism , Machine Learning , Mice, Inbred C57BL , Quercetin/pharmacology , Signal Transduction , T-Lymphocytes/immunology , Weightlessness
2.
NPJ Digit Med ; 7(1): 151, 2024 Jun 11.
Article in English | MEDLINE | ID: mdl-38862589

ABSTRACT

The objective of this study is to use statistical techniques for the identification of transition points along the life course, aiming to identify fundamental changes in patient multimorbidity burden across phases of clinical care. This retrospective cohort analysis utilized 5.2 million patient encounters from 2013 to 2022, collected from a large academic institution and its affiliated hospitals. Structured information was systematically gathered for each encounter and three methodologies - clustering analysis, False Nearest Neighbor (FNN) analysis, and transitivity analysis - were employed to pinpoint transitions in patients' clinical phase. Clustering analysis identified transition points at ages 2, 17, 41, and 66; FNN at 4.27, 5.83, 5.85, 14.12, 20.62, 24.30, 25.10, 29.08, 33.12, 35.7, 38.69, 55.66, 70.03; and transitivity analysis at 7.27, 23.58, 29.04, 35.00, 61.29, 67.03, 77.11. Clustering analysis identified transition points that align with the current clinical gestalt of pediatric, adult, and geriatric phases of care. Notably, over half of the transition points identified by FNN and transitivity analysis were between ages 20 and 40, a population that is traditionally considered to be clinically homogeneous. Few transition points were identified between ages 3 and 17. Despite large social and developmental transitions at those ages, the burden of multimorbidities may be consistent across the age range. Transition points derived through unsupervised machine learning approaches identify changes in the clinical phase that align with true differences in underlying multimorbidity burden. These transitions may be different from conventional pediatric and geriatric phases, which are often influenced by policy rather than clinical changes.
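The clustering step can be illustrated with a toy sketch: a pure-Python 1-D k-means over patient ages, where midpoints between converged cluster centers act as candidate transition points. The ages, initial centers, and the reduction of each encounter to a single age value are all hypothetical simplifications; the study clustered richer encounter-level multimorbidity data.

```python
def kmeans_1d(ages, centers, iters=100):
    """Toy Lloyd's algorithm on a 1-D age axis."""
    for _ in range(iters):
        # assign each age to its nearest center
        groups = [[] for _ in centers]
        for a in ages:
            i = min(range(len(centers)), key=lambda j: abs(a - centers[j]))
            groups[i].append(a)
        # move each center to the mean of its assigned ages
        centers = [sum(g) / len(g) if g else c
                   for g, c in zip(groups, centers)]
    centers = sorted(centers)
    # midpoints between adjacent centers serve as candidate transition ages
    transitions = [(a + b) / 2 for a, b in zip(centers, centers[1:])]
    return centers, transitions

# hypothetical ages concentrated in three clinical phases
ages = [1, 2, 3, 20, 21, 22, 60, 61, 62]
centers, transitions = kmeans_1d(ages, centers=[0.0, 30.0, 70.0])
```

Here the centers converge to 2, 21, and 61, giving transition candidates at 11.5 and 41; on real encounter data the cluster structure, not the initialization, should drive the result.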

3.
bioRxiv ; 2024 Jun 02.
Article in English | MEDLINE | ID: mdl-38854012

ABSTRACT

Regular exercise yields a multitude of systemic benefits, many of which may be mediated through the gut microbiome. Here, we report that cecal microbial transplants (CMTs) from exercise-trained vs. sedentary mice have modest benefits in reducing skeletal muscle atrophy using a mouse model of unilateral hindlimb immobilization. Direct administration of top microbial-derived exerkines from an exercise-trained gut microbiome preserved muscle function and prevented skeletal muscle atrophy.

4.
J Steroid Biochem Mol Biol ; : 106571, 2024 Jun 21.
Article in English | MEDLINE | ID: mdl-38909866

ABSTRACT

Prostate cancer is primarily hormone-dependent, and medical treatments have focused on inhibiting androgen biosynthesis or signaling through various approaches. Despite significant advances with the introduction of androgen receptor signalling inhibitors (ARSIs), patients continue to progress to castration-resistant prostate cancer (CRPC), highlighting the need for targeted therapies that extend beyond hormonal blockade. Chimeric Antigen Receptor (CAR) T cells and other engineered immune cells represent a new generation of adoptive cellular therapies. While these therapies have significantly enhanced outcomes for patients with hematological malignancies, ongoing research is exploring the broader use of CAR T therapy in solid tumors, including advanced prostate cancer. In general, CAR T cell therapies are less effective against solid cancers, with the immunosuppressive tumor microenvironment hindering T cell infiltration, activation, and cytotoxicity following antigen recognition. In addition, inherent tumor heterogeneity exists in patients with advanced prostate cancer that may prevent durable therapeutic responses using single-target agents. These barriers must be overcome to inform clinical trial design and improve treatment efficacy. In this review, we discuss the innovative and rationally designed strategies under investigation to improve the clinical translation of cellular immunotherapy in prostate cancer and maximise therapeutic outcomes for these patients.

5.
ACS ES T Eng ; 4(6): 1492-1506, 2024 Jun 14.
Article in English | MEDLINE | ID: mdl-38899163

ABSTRACT

As water treatment technology has improved, the amount of available process data has substantially increased, making real-time, data-driven fault detection a reality. One shortcoming of the fault detection literature is that methods are usually evaluated by comparing their performance on hand-picked, short-term case studies, which yields no insight into long-term performance. In this work, we first evaluate multiple statistical and machine learning approaches for detrending process data. Then, we evaluate the performance of a PCA-based fault detection approach, applied to the detrended data, to monitor influent water quality, filtrate quality, and membrane fouling of an ultrafiltration membrane system for indirect potable reuse. Based on two short case studies, the adaptive lasso detrending method is selected, and the performance of the multivariate approach is evaluated over more than a year. The method is tested for different sets of three critical tuning parameters, and we find that for long-term, autonomous monitoring to be successful, these parameters should be carefully evaluated. However, in comparison with industry standards of simpler, univariate monitoring or daily pressure decay tests, multivariate monitoring produces substantial benefits in long-term testing.
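The multivariate monitoring step can be sketched as PCA plus Hotelling's T², a standard statistic in PCA-based fault detection; the paper's exact statistic, alarm limits, and adaptive-lasso detrending are not reproduced here, and the data below are synthetic.

```python
import numpy as np

def fit_pca_monitor(X, n_components=2):
    """Fit a PCA model on (already detrended) normal-operation data."""
    mu = X.mean(axis=0)
    # principal directions and score variances from an SVD of centered data
    _, s, Vt = np.linalg.svd(X - mu, full_matrices=False)
    P = Vt[:n_components].T                          # loadings (d x k)
    var = s[:n_components] ** 2 / (X.shape[0] - 1)   # per-score variance
    return mu, P, var

def hotelling_t2(x, mu, P, var):
    """Hotelling's T^2 for one sample; large values flag a potential fault."""
    t = (x - mu) @ P                                 # project onto the PCs
    return float(np.sum(t ** 2 / var))

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 4))          # synthetic normal-operation data
mu, P, var = fit_pca_monitor(X)

t2_normal = hotelling_t2(X[0], mu, P, var)
t2_fault = hotelling_t2(X[0] + 10.0, mu, P, var)   # simulated step fault
```

In practice the T² alarm limit is set from an F-distribution or empirically from training data, and a residual (Q/SPE) statistic is usually monitored alongside T².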

6.
Ecol Appl ; 34(5): e3003, 2024 Jul.
Article in English | MEDLINE | ID: mdl-38890813

ABSTRACT

Large terrestrial mammals increasingly rely on human-modified landscapes as anthropogenic footprints expand. Land management activities such as timber harvest, agriculture, and roads can influence prey population dynamics by altering forage resources and predation risk via changes in habitat, but these effects are not well understood in regions with diverse and changing predator guilds. In northeastern Washington state, USA, white-tailed deer (Odocoileus virginianus) are vulnerable to multiple carnivores, including recently returned gray wolves (Canis lupus), within a highly human-modified landscape. To understand the factors governing predator-prey dynamics in a human context, we radio-collared 280 white-tailed deer, 33 bobcats (Lynx rufus), 50 cougars (Puma concolor), 28 coyotes (C. latrans), and 14 wolves between 2016 and 2021. We first estimated deer vital rates and used a stage-structured matrix model to estimate their population growth rate. During the study, we observed a stable to declining deer population (lambda = 0.97, 95% confidence interval: 0.88, 1.05), with 74% of Monte Carlo simulations indicating population decrease and 26% of simulations indicating population increase. We then fit Cox proportional hazard models to evaluate how predator exposure, use of human-modified landscapes, and winter severity influenced deer survival and used these relationships to evaluate impacts on overall population growth. We found that the population growth rate was dually influenced by a negative direct effect of apex predators and a positive effect of timber harvest and agricultural areas. Cougars had a stronger effect on deer population dynamics than wolves, and mesopredators had little influence on the deer population growth rate. Areas of recent timber harvest had 55% more forage biomass than older forests, but horizontal visibility did not differ, suggesting that timber harvest did not influence predation risk. 
Although proximity to roads did not affect the overall population growth rate, vehicle collisions caused a substantial proportion of deer mortalities, and reducing these collisions could be a win-win for deer and humans. The influence of apex predators and forage indicates a dual limitation by top-down and bottom-up factors in this highly human-modified system, suggesting that a reduction in apex predators would intensify density-dependent regulation of the deer population owing to limited forage availability.
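The stage-structured matrix model above estimates the population growth rate (lambda) as the dominant eigenvalue of the projection matrix. A minimal power-iteration sketch with a made-up two-stage (fawn/adult) matrix; the study's actual stages and vital rates are not shown here.

```python
def growth_rate(A, iters=500):
    """Dominant eigenvalue (lambda) of a nonnegative projection matrix
    via power iteration: lambda > 1 means growth, lambda < 1 decline."""
    n = len(A)
    v = [1.0 / n] * n               # uniform starting stage distribution
    lam = 0.0
    for _ in range(iters):
        w = [sum(A[i][j] * v[j] for j in range(n)) for i in range(n)]
        lam = sum(w)                # v sums to 1, so sum(A v) estimates lambda
        v = [x / lam for x in w]    # renormalize to sum 1
    return lam

# hypothetical matrix: row 0 = stage fecundities, row 1 = survival rates
A = [[0.0, 1.5],
     [0.5, 0.25]]
lam = growth_rate(A)                # converges to 1.0 for this toy matrix
```

An estimate like the study's lambda = 0.97 with Monte Carlo intervals would come from resampling the vital rates that feed such a matrix.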


Subject(s)
Deer , Population Dynamics , Wolves , Animals , Deer/physiology , Wolves/physiology , Humans , Predatory Behavior , Washington , Human Activities , Coyotes/physiology , Puma/physiology , Food Chain , Ecosystem , Lynx/physiology
7.
Urology ; 2024 Jun 20.
Article in English | MEDLINE | ID: mdl-38908562

ABSTRACT

OBJECTIVE: To evaluate the effect of disposable cystoscopes on the rate of symptomatic urinary tract infections (UTIs) following post-renal transplant cystoscopic stent removal. METHODS: We performed a retrospective study of post-renal transplant cystoscopic stent removals in our outpatient clinic from March 2019 to March 2022. Our clinic converted to disposable cystoscopes in October 2021. All outpatient, phone, and portal encounters were reviewed for 30 days following the procedure. The primary outcome was the number of post-procedural symptomatic UTIs within 30 days of the procedure. Symptomatic UTI was defined as fever, dysuria, or hematuria accompanied by a positive urine culture. RESULTS: A total of 323 patients had post-transplant stent removals, including 123 with reusable scopes and 200 with disposable scopes. Around 1.6% (2/123) of patients with a reusable cystoscope experienced symptomatic UTIs; their positive urine cultures grew Escherichia coli and Klebsiella. 2.0% (4/200) of patients with a disposable cystoscope had a symptomatic UTI; the 3 organisms cultured were E. coli, Klebsiella, and Enterococcus. CONCLUSION: The conversion from reusable to disposable cystoscopes did not decrease symptomatic UTIs following renal transplant stent removal.

9.
Antioxidants (Basel) ; 13(6)2024 May 31.
Article in English | MEDLINE | ID: mdl-38929116

ABSTRACT

Imbalances in the redox state of the liver arise during metabolic processes, inflammatory injuries, and proliferative liver disorders. Acute exposure to intracellular reactive oxygen species (ROS) results from high levels of oxidative stress (OxS) that occur in response to hepatic ischemia/reperfusion injury (IRI) and metabolic diseases of the liver. Antisense oligonucleotides (ASOs) are an emerging class of gene expression modulators that target RNA molecules by Watson-Crick binding specificity, leading to RNA degradation, splicing modulation, and/or translation interference. Here, we review ASO inhibitor/activator strategies to modulate transcription and translation that control the expression of enzymes, transcription factors, and intracellular sensors of DNA damage. Several small-interfering RNA (siRNA) drugs with N-acetyl galactosamine moieties for the liver have recently been approved. Preclinical studies using short-activating RNAs (saRNAs), phosphorodiamidate morpholino oligomers (PMOs), and locked nucleic acids (LNAs) are at the forefront of proof-of-concept therapeutics. Future research targeting intracellular OxS-related pathways in the liver may help realize the promise of precision medicine, revolutionizing the customary approach to caring for and treating individuals afflicted with liver-specific conditions.

10.
Expert Rev Cardiovasc Ther ; : 1-12, 2024 Jun 26.
Article in English | MEDLINE | ID: mdl-38913423

ABSTRACT

INTRODUCTION: Stroke is a significant public health challenge as it is the second most common cause of death and the third leading cause of disability globally. Additionally, stroke incidence and the number of stroke deaths have been rising. Efforts to prevent stroke have been made, including high-risk approaches where patients are screened for cardiovascular risk factors, and population-based approaches which attempt to reduce stroke rates by improving overall population health. AREAS COVERED: We summarize studies of population-based approaches to stroke prevention involving more than 1,000 participants identified through a PubMed database search. Based on these programs, challenges of population-based stroke prevention programs are discussed and potential keys to success are highlighted. EXPERT OPINION: Population-based stroke prevention programs face challenges, including cost and sustaining the interest of the public and certain stakeholders. Additionally, secular trends for improvement in risk factors and catastrophic adverse environmental circumstances add to the complexity of analyzing program success. Factors leading to successful programs include validated digital solutions for self-monitoring of risks, backing by global policy and legislation, flexibility to the needs of the population, intersectoral programs, community engagement, information dissemination back to the populations, and high-risk screening to develop a complementary combination approach to stroke prevention.

12.
PLoS One ; 19(5): e0301013, 2024.
Article in English | MEDLINE | ID: mdl-38758942

ABSTRACT

The use of the Sequential Organ Failure Assessment (SOFA) score, originally developed to describe disease morbidity, is commonly used to predict in-hospital mortality. During the COVID-19 pandemic, many protocols for crisis standards of care used the SOFA score to select patients to be deprioritized due to a low likelihood of survival. A prior study found that age outperformed the SOFA score for mortality prediction in patients with COVID-19, but was limited to a small cohort of intensive care unit (ICU) patients and did not address whether their findings were unique to patients with COVID-19. Moreover, it is not known how well these measures perform across races. In this retrospective study, we compare the performance of age and SOFA score in predicting in-hospital mortality across two cohorts: a cohort of 2,648 consecutive adult patients diagnosed with COVID-19 who were admitted to a large academic health system in the northeastern United States over a 4-month period in 2020 and a cohort of 75,601 patients admitted to one of 335 ICUs in the eICU database between 2014 and 2015. We used age and the maximum SOFA score as predictor variables in separate univariate logistic regression models for in-hospital mortality and calculated area under the receiver operator characteristic curves (AU-ROCs) and area under precision-recall curves (AU-PRCs) for each predictor in both cohorts. Among the COVID-19 cohort, age (AU-ROC 0.795, 95% CI 0.762, 0.828) showed significantly better discrimination than SOFA score (AU-ROC 0.679, 95% CI 0.638, 0.721) for mortality prediction. Conversely, age (AU-ROC 0.628, 95% CI 0.608, 0.628) underperformed compared to SOFA score (AU-ROC 0.735, 95% CI 0.726, 0.745) in non-COVID-19 ICU patients in the eICU database. There was no difference between Black and White COVID-19 patients in performance of either age or SOFA score. Our findings bring into question the utility of SOFA score-based resource allocation in COVID-19 crisis standards of care.
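AU-ROC for a univariate predictor can be computed directly from its rank-statistic (Mann-Whitney) definition. The toy cohort below is invented; it only illustrates the kind of head-to-head discrimination comparison the study reports.

```python
def auroc(scores, labels):
    """AU-ROC as the Mann-Whitney statistic: the probability that a
    randomly chosen positive case outranks a randomly chosen negative
    case (ties count half)."""
    pos = [s for s, y in zip(scores, labels) if y == 1]
    neg = [s for s, y in zip(scores, labels) if y == 0]
    wins = sum(1.0 if p > n else 0.5 if p == n else 0.0
               for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

# hypothetical patients: 1 = died in hospital, 0 = survived
died = [1, 1, 1, 0, 0, 0, 0, 0]
age  = [88, 72, 80, 75, 55, 64, 49, 58]
sofa = [9, 4, 6, 8, 2, 5, 3, 7]

auc_age, auc_sofa = auroc(age, died), auroc(sofa, died)
```

This is equivalent to scoring a univariate logistic regression by AU-ROC, since a monotone transform of the predictor leaves the ROC curve unchanged.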


Subject(s)
COVID-19 , Hospital Mortality , Intensive Care Units , Organ Dysfunction Scores , Humans , COVID-19/mortality , COVID-19/epidemiology , Male , Middle Aged , Female , Aged , Retrospective Studies , Age Factors , Intensive Care Units/statistics & numerical data , Adult , SARS-CoV-2/isolation & purification , ROC Curve , Aged, 80 and over
13.
Blood Adv ; 8(13): 3507-3518, 2024 Jul 09.
Article in English | MEDLINE | ID: mdl-38739715

ABSTRACT

Little is known about risk factors for central nervous system (CNS) relapse in mature T-cell and natural killer cell neoplasms (MTNKNs). We aimed to describe the clinical epidemiology of CNS relapse in patients with MTNKN and developed the CNS relapse In T-cell lymphoma Index (CITI) to predict patients at the highest risk of CNS relapse. We reviewed data from 135 patients with MTNKN and CNS relapse from 19 North American institutions. After exclusion of leukemic and most cutaneous forms of MTNKNs, patients were pooled with non-CNS relapse control patients from a single institution to create a CNS relapse-enriched training set. Using a complete case analysis (n = 182), including 91 with CNS relapse, we applied a least absolute shrinkage and selection operator Cox regression model to select weighted clinicopathologic variables for the CITI score, which we validated in an external cohort from the Swedish Lymphoma Registry (n = 566). CNS relapse was most frequently observed in patients with peripheral T-cell lymphoma, not otherwise specified (25%). Median time to CNS relapse and median overall survival after CNS relapse were 8.0 and 4.7 months, respectively. We calculated unique CITI risk scores for individual training set patients and stratified them into risk terciles. Validation set patients with low-risk (n = 158) and high-risk (n = 188) CITI scores had a 10-year cumulative risk of CNS relapse of 2.2% and 13.4%, respectively (hazard ratio, 5.24; 95% confidence interval, 1.50-18.26; P = .018). We developed an open-access web-based CITI calculator (https://redcap.link/citicalc) to provide an easy tool for clinical practice. The CITI score is a validated model to predict patients with MTNKN at the highest risk of developing CNS relapse.


Subject(s)
Central Nervous System Neoplasms , Humans , Central Nervous System Neoplasms/diagnosis , Central Nervous System Neoplasms/secondary , Central Nervous System Neoplasms/pathology , Central Nervous System Neoplasms/mortality , Male , Female , Middle Aged , Aged , Adult , Lymphoma, T-Cell/pathology , Lymphoma, T-Cell/diagnosis , Lymphoma, T-Cell/mortality , Prognosis , Aged, 80 and over , Neoplasm Recurrence, Local , Lymphoma, Extranodal NK-T-Cell/diagnosis , Lymphoma, Extranodal NK-T-Cell/mortality , Lymphoma, Extranodal NK-T-Cell/therapy , Risk Factors , Recurrence , Killer Cells, Natural , Young Adult
14.
JAMA Netw Open ; 7(5): e2414213, 2024 May 01.
Article in English | MEDLINE | ID: mdl-38819823

ABSTRACT

Importance: Emergency department (ED) visits by older adults with life-limiting illnesses are a critical opportunity to establish patient care end-of-life preferences, but little is known about the optimal screening criteria for resource-constrained EDs. Objectives: To externally validate the Geriatric End-of-Life Screening Tool (GEST) in an independent population and compare it with commonly used serious illness diagnostic criteria. Design, Setting, and Participants: This prognostic study assessed a cohort of patients aged 65 years and older who were treated in a tertiary care ED in Boston, Massachusetts, from 2017 to 2021. Patients arriving in cardiac arrest or who died within 1 day of ED arrival were excluded. Data analysis was performed from August 1, 2023, to March 27, 2024. Exposure: GEST, a logistic regression algorithm that uses commonly available electronic health record (EHR) datapoints and was developed and validated across 9 EDs, was compared with serious illness diagnoses as documented in the EHR. Serious illnesses included stroke/transient ischemic attack, liver disease, cancer, lung disease, and age greater than 80 years, among others. Main Outcomes and Measures: The primary outcome was 6-month mortality following an ED encounter. Statistical analyses included area under the receiver operating characteristic curve, calibration analyses, Kaplan-Meier survival curves, and decision curves. Results: This external validation included 82 371 ED encounters by 40 505 unique individuals (mean [SD] age, 76.8 [8.4] years; 54.3% women, 13.8% 6-month mortality rate). GEST had an external validation area under the receiver operating characteristic curve of 0.79 (95% CI, 0.78-0.79) that was stable across years and demographic subgroups. Of included encounters, 53.4% had a serious illness, with a sensitivity of 77.4% (95% CI, 76.6%-78.2%) and specificity of 50.5% (95% CI, 50.1%-50.8%). 
Varying GEST cutoffs from 5% to 30% increased specificity (5%: 49.1% [95% CI, 48.7%-49.5%]; 30%: 92.2% [95% CI, 92.0%-92.4%]) at the cost of sensitivity (5%: 89.3% [95% CI, 88.8%-89.9%]; 30%: 36.2% [95% CI, 35.3%-37.1%]). In a decision curve analysis, GEST outperformed serious illness criteria across all tested thresholds. When comparing patients referred to intervention by GEST with serious illness criteria, GEST reclassified 45.1% of patients with serious illness as having low risk of mortality (observed mortality rate, 8.1%) and 2.6% of patients without serious illness as having high mortality risk (observed mortality rate, 34.3%), for a total reclassification rate of 25.3%. Conclusions and Relevance: The findings of this study suggest that both serious illness criteria and GEST identified older ED patients at risk for 6-month mortality, but GEST offered more useful screening characteristics. Future trials of serious illness interventions for high mortality risk in older adults may consider transitioning from diagnosis code criteria to GEST, an automatable EHR-based algorithm.
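The cutoff tradeoff reported above (raising the screening threshold buys specificity at the cost of sensitivity) can be sketched with a small confusion-matrix helper; the risks and outcomes below are hypothetical.

```python
def sens_spec(risks, outcomes, cutoff):
    """Sensitivity and specificity when flagging risk >= cutoff."""
    tp = sum(1 for r, y in zip(risks, outcomes) if r >= cutoff and y)
    fn = sum(1 for r, y in zip(risks, outcomes) if r < cutoff and y)
    tn = sum(1 for r, y in zip(risks, outcomes) if r < cutoff and not y)
    fp = sum(1 for r, y in zip(risks, outcomes) if r >= cutoff and not y)
    return tp / (tp + fn), tn / (tn + fp)

# hypothetical predicted 6-month mortality risks and observed deaths
risks    = [0.05, 0.10, 0.20, 0.40, 0.60, 0.90]
outcomes = [0,    0,    1,    0,    1,    1]

low_cut  = sens_spec(risks, outcomes, cutoff=0.15)  # sensitive screen
high_cut = sens_spec(risks, outcomes, cutoff=0.50)  # specific screen
```

Sweeping the cutoff over a grid and plotting net benefit at each threshold is the essence of the decision curve analysis the study describes.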


Subject(s)
Emergency Service, Hospital , Terminal Care , Humans , Aged , Female , Male , Aged, 80 and over , Terminal Care/statistics & numerical data , Emergency Service, Hospital/statistics & numerical data , Geriatric Assessment/methods , Geriatric Assessment/statistics & numerical data , Boston/epidemiology , Prognosis , Mortality
15.
J Neurol ; 2024 May 17.
Article in English | MEDLINE | ID: mdl-38758279

ABSTRACT

BACKGROUND: A subgroup of people with multiple sclerosis (pwMS) will develop severe disability. The pathophysiology underlying severe MS is unknown. The comprehensive assessment of severely affected MS (CASA-MS) was a case-control study that compared severely disabled in skilled nursing (SD/SN) (EDSS ≥ 7.0) to less-disabled (EDSS 3.0-6.5) community dwelling (CD) progressive pwMS, matched on age, sex, and disease duration (DDM). OBJECTIVES: To identify neuroimaging and molecular biomarker characteristics that distinguish SD/SN from DDM-CD progressive pwMS. METHODS: This study was carried out at an SN facility and at a tertiary MS center. The study collected clinical, molecular (serum neurofilament light chain, sNfL, and glial fibrillary acidic protein, sGFAP), and MRI quantitative lesion-, brain volume-, and tissue integrity-derived measures. Statistical analyses were controlled for multiple comparisons. RESULTS: 42 SD/SN and 42 DDM-CD pwMS were enrolled. SD/SN pwMS showed significantly lower cortical volume (CV) (p < 0.001, d = 1.375) and thalamic volume (p < 0.001, d = 0.972) compared to DDM-CD pwMS. In a logistic stepwise regression model, the SD/SN pwMS were best differentiated from the DDM-CD pwMS by lower CV (p < 0.001) as the only significant predictor, with an accuracy of 82.3%. No significant differences between the two groups were observed for medulla oblongata volume (a proxy for spinal cord atrophy) or white matter lesion burden, while there was a statistical trend toward numerically higher sGFAP in SD/SN pwMS. CONCLUSIONS: The CASA-MS study showed significantly more gray matter atrophy in severe compared to less-severe progressive MS.

16.
Orthop Traumatol Surg Res ; : 103903, 2024 May 22.
Article in English | MEDLINE | ID: mdl-38789001

ABSTRACT

BACKGROUND: The role of tendon transfer and ideal insertion sites to improve axial rotation in reverse total shoulder arthroplasty (RTSA) is debated. We systematically reviewed the available biomechanical evidence to elucidate the ideal tendon transfer and insertion sites for restoration of external rotation (ER) and internal rotation (IR) in the setting of RTSA, and the influence of implant lateralization. PATIENTS AND METHODS: We queried the PubMed/MEDLINE, Embase, Web of Science, and Cochrane databases to identify biomechanical studies examining the application of tendon transfer to augment shoulder external or internal rotation range of motion in the setting of concomitant RTSA. A descriptive synthesis of six included articles was conducted to elucidate trends in the literature. RESULTS: The biomechanics literature demonstrates that increasing humeral-sided lateralization optimized tendon transfers performed for both ER and IR. The optimal latissimus dorsi (LD) transfer site for ER is posterior to the greater tuberosity (adjacent to the teres minor insertion); however, LD transfer to this site results in greater tendon excursion compared to a posterodistal insertion site. In a small series with nearly 7-year mean follow-up, the LD transfer demonstrated longevity, with all 10 shoulders having >50% ER strength compared to the contralateral native shoulder and a negative Hornblower's sign at latest follow-up; however, reduced electromyography activity of the transferred LD compared to the native contralateral side was noted. One study found that transfer of the pectoralis major has the greatest potential to restore IR in the setting of a lateralized humerus RTSA. CONCLUSION: To restore ER, LD transfer posterior on the greater tuberosity provides optimal biomechanics with functional longevity. The pectoralis major has the greatest potential to restore IR.
Future clinical investigation applying the biomechanical principles summarized herein is needed to substantiate the role of tendon transfer in the modern era of lateralized RTSA. LEVEL OF EVIDENCE: IV; systematic review.

17.
J Am Board Fam Med ; 37(2): 251-260, 2024.
Article in English | MEDLINE | ID: mdl-38740476

ABSTRACT

INTRODUCTION: Multimorbidity is increasing in prevalence across age ranges and increasing in diagnostic importance within and outside the family medicine clinic. Here we aim to describe the course of multimorbidity across the lifespan. METHODS: This was a retrospective cohort study across 211,953 patients from a large northeastern health care system. Past medical histories were collected in the form of ICD-10 diagnostic codes. Rates of multimorbidity were calculated from comorbid diagnoses defined from the ICD-10 codes identified in the past medical histories. RESULTS: We identify 4 main age groups of diagnosis and multimorbidity. Ages 0 to 10 contain diagnoses that are infectious or respiratory, whereas ages 10 to 40 are related to mental health. From ages 40 to 70 there is an emergence of alcohol use disorders and cardiometabolic disorders. Ages 70 to 90 are predominantly long-term sequelae of the most common cardiometabolic disorders. The mortality of the whole population over the study period was 5.7%, whereas the multimorbidity with the highest mortality across the study period was Circulatory Disorders-Circulatory Disorders at 23.1%. CONCLUSION: The results from this study provide a comparison for the presence of multimorbidity within age cohorts longitudinally across the population. These patterns of comorbidity can assist in the allocation of practice resources that will best support the common conditions that patients need assistance with, especially as patients transition between pediatric, adult, and geriatric care. Future work examining and comparing multimorbidity indices is warranted.


Subject(s)
Family Practice , Multimorbidity , Humans , Retrospective Studies , Aged , Adult , Middle Aged , Adolescent , Aged, 80 and over , Family Practice/statistics & numerical data , Male , Female , Young Adult , Child , Child, Preschool , Infant , Infant, Newborn , Age Factors , Prevalence , New England/epidemiology
18.
Mult Scler Relat Disord ; 87: 105630, 2024 Jul.
Article in English | MEDLINE | ID: mdl-38678969

ABSTRACT

BACKGROUND: The Expanded Disability Status Scale (EDSS) is limited when utilized in highly disabled people with multiple sclerosis (pwMS). OBJECTIVE: To explore the relationship between disability measures and MRI outcomes in severely affected pwMS. METHODS: PwMS recruited from The Boston Home (TBH), a specialized residential facility for severely affected pwMS, and the University at Buffalo (UB) MS Center were assessed using EDSS, MS Severity Scale, age-related MSS, Scripps Neurological Rating Scale (SNRS), and Combinatorial Weight-Adjusted Disability Score (CombiWISE). In all scores except SNRS, a higher score indicates greater disability. MRI measures of T1- and T2-lesion volume (LV), whole brain, gray matter, medulla oblongata, and thalamic volumes (WBV, GMV, MOV, TV), and thalamic dysconnectivity were obtained. RESULTS: The greatest disability differences between the TBH and UB pwMS were in SNRS (24.4 vs 71.9, p < 0.001, Cohen's d = 4.05) and CombiWISE (82.3 vs. 38.9, p < 0.001, Cohen's d = 4.02). In a combined analysis of all pwMS, worse SNRS scores were correlated with worse MRI pathology in 8 out of 9 outcomes; EDSS correlated with only 3 measures (GMV, MOV, and TV). In severely affected pwMS, SNRS was associated with T1-LV, T2-LV, and WBV (not surviving false discovery rate [FDR] correction for multiple comparisons), whereas EDSS was not. CONCLUSION: Granular and dynamic disability measures may bridge the clinico-radiological gap present in severely affected pwMS.


Subject(s)
Disability Evaluation , Magnetic Resonance Imaging , Multiple Sclerosis , Severity of Illness Index , Humans , Female , Male , Adult , Multiple Sclerosis/diagnostic imaging , Multiple Sclerosis/physiopathology , Multiple Sclerosis/pathology , Middle Aged , Brain/diagnostic imaging , Brain/pathology , Brain/physiopathology
20.
Pain Med ; 2024 Apr 30.
Article in English | MEDLINE | ID: mdl-38688587

ABSTRACT

BACKGROUND: Given the high prevalence of chronic shoulder pain and encouraging early results of terminal sensory articular branch (TSAB) radiofrequency ablation to treat shoulder pain, research is warranted to refine the procedural technique based on updated neuroanatomical knowledge with the goal of further improving patient outcomes. OBJECTIVE: We describe an updated radiofrequency ablation protocol that accounts for varied locations of the TSABs of suprascapular, axillary, subscapular and lateral pectoral nerves within individual patients. DESIGN: Technical note. METHODS: Cadaveric studies delineating the sensory innervation of the shoulder joint were reviewed, and a more comprehensive radiofrequency ablation (RFA) protocol is proposed relative to historical descriptions. CONCLUSIONS: Based on neuroanatomical dissections of the shoulder joint, the proposed RFA protocol will provide a safe means of more complete sensory denervation and potentially improve clinical outcomes compared to historical descriptions, which must be confirmed in prospective studies.
