Results 1 - 20 of 119
1.
Hepatology ; 79(5): 1033-1047, 2024 May 01.
Article in English | MEDLINE | ID: mdl-38090880

ABSTRACT

BACKGROUND AND AIMS: In liver transplantation, cold preservation induces ischemia, resulting in significant reperfusion injury. Hypothermic oxygenated machine perfusion (HMP-O2) has shown benefits compared to static cold storage (SCS) by limiting ischemia-reperfusion injury. This study reports outcomes using a novel portable HMP-O2 device in the first US randomized control trial. APPROACH AND RESULTS: The PILOT trial (NCT03484455) was a multicenter, randomized, open-label, noninferiority trial, with participants randomized to HMP-O2 or SCS. HMP-O2 livers were preserved using the Lifeport Liver Transporter and Vasosol perfusion solution. The primary outcome was early allograft dysfunction. The noninferiority margin was 7.5%. From April 3, 2019, to July 12, 2022, 179 patients were randomized to HMP-O2 (n=90) or SCS (n=89). The per-protocol cohort included 63 HMP-O2 and 73 SCS. Early allograft dysfunction occurred in 11.1% HMP-O2 (n=7) and 16.4% SCS (n=12). The risk difference between HMP-O2 and SCS was -5.33% (one-sided 95% upper confidence limit of 5.81%), establishing noninferiority. The risk of graft failure as predicted by the Liver Graft Assessment Following Transplant score at seven days (L-GrAFT7) was lower with HMP-O2 [median (IQR) 3.4% (2.4-6.5) vs. 4.5% (2.9-9.4), p=0.024]. Primary nonfunction occurred in 2.2% of all SCS (n=3, p=0.10). Biliary strictures occurred in 16.4% SCS (n=12) and 6.3% (n=4) HMP-O2 (p=0.18). Nonanastomotic biliary strictures occurred only in SCS (n=4). CONCLUSIONS: HMP-O2 demonstrates safety and noninferior efficacy for liver graft preservation in comparison to SCS. The predicted risk of early allograft failure by L-GrAFT7 was lower in HMP-O2, suggesting improved early clinical function. Recipients of HMP-O2 livers also demonstrated a lower incidence of primary nonfunction and biliary strictures, although this difference did not reach significance.
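To make the noninferiority arithmetic above concrete, the sketch below recomputes the risk difference from the abstract's per-protocol counts and checks a one-sided upper confidence limit against the 7.5% margin. It uses a simple Wald-type interval, so the bound it prints is illustrative and will not exactly match the 5.81% limit reported by the trial, which may have used a different interval method.

```python
from math import sqrt
from scipy.stats import norm

# Early allograft dysfunction counts from the abstract (per-protocol cohort)
x_hmp, n_hmp = 7, 63      # HMP-O2 arm
x_scs, n_scs = 12, 73     # SCS arm
margin = 0.075            # prespecified noninferiority margin (7.5%)

p_hmp, p_scs = x_hmp / n_hmp, x_scs / n_scs
risk_diff = p_hmp - p_scs                     # about -0.053 (-5.3%)

# Wald standard error of the risk difference
se = sqrt(p_hmp * (1 - p_hmp) / n_hmp + p_scs * (1 - p_scs) / n_scs)

# One-sided 95% upper confidence limit
upper = risk_diff + norm.ppf(0.95) * se

print(f"risk difference = {risk_diff:.3f}, one-sided 95% UCL = {upper:.3f}")
print("noninferior" if upper < margin else "not noninferior")
```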


Subject(s)
Liver Transplantation , Reperfusion Injury , Humans , Liver Transplantation/methods , Organ Preservation/methods , Constriction, Pathologic , Liver , Perfusion/methods , Reperfusion Injury/etiology , Reperfusion Injury/prevention & control
2.
Article in English | MEDLINE | ID: mdl-38507607

ABSTRACT

RATIONALE: Individuals with COPD have airflow obstruction and maldistribution of ventilation. For those living at high altitude, any gas exchange abnormality is compounded by reduced partial pressures of inspired oxygen. OBJECTIVES: Does residence at higher altitude affect COPD outcomes, including lung function, imaging characteristics, symptoms, health status, functional exercise capacity, exacerbations, or mortality? METHODS: From the SPIROMICS cohort, we identified individuals with COPD living below 1,000 ft (305 m) elevation (n = 1,367) versus above 4,000 ft (1,219 m) elevation (n = 288). Multivariable regression models were used to evaluate associations of high-altitude exposure with COPD-related outcomes. MEASUREMENTS AND MAIN RESULTS: Living at higher altitude was associated with reduced functional exercise capacity as defined by 6-minute walk distance (6MWD) (-32.3 m [-55.7 to -28.6]). There were no differences in patient-reported outcomes as defined by symptoms (CAT, mMRC) or health status (SGRQ). Higher altitude was not associated with a different rate of FEV1 decline. Higher altitude was associated with a lower rate of severe exacerbations (IRR 0.65 [0.46 to 0.90]). There were no differences in small airway disease, air trapping, or emphysema. In longitudinal analyses, higher altitude was associated with increased mortality (HR 1.25 [1.0 to 1.55]); however, this association was no longer significant when accounting for air pollution. CONCLUSIONS: Chronic altitude exposure is associated with reduced functional exercise capacity in individuals with COPD, but this did not translate into differences in symptoms or health status. Additionally, chronic high-altitude exposure did not affect progression of disease as defined by longitudinal changes in spirometry.

3.
J Obstet Gynaecol Can ; 46(5): 102404, 2024 May.
Article in English | MEDLINE | ID: mdl-38336006

ABSTRACT

OBJECTIVES: To examine whether preoperative antibiotics in class I/clean abdominal gynaecologic surgery decrease the incidence of surgical site infections (SSI). METHODS: Retrospective cohort study at an academic safety-net hospital of patients undergoing class I laparoscopic or open gynaecologic surgery between November 2013 and September 2017. A performance improvement initiative administered preoperative antibiotics to all surgical patients starting July 2016. The primary outcome was the incidence of SSI. RESULTS: In total, 510 patients were included: 283 in the antibiotic group and 227 in the no-antibiotic group. Baseline characteristics were similar between groups once balanced by the propensity score method. In unweighted analysis, the incidence of SSI decreased from 9.3% (21/227) in the no-antibiotics group to 4.9% (14/283) in the antibiotics group, but this was not statistically significant (odds ratio [OR] 0.51, CI 0.25-1.03, P = 0.0598). Following inverse probability of treatment weighting (IPTW) adjustment, the incidence of SSI was significantly lower in patients who received antibiotics compared to patients who did not, across entry types (4.6% vs. 9.8%, OR 0.45; CI 0.22-0.90, P = 0.023). In the weighted analysis of the exploratory laparotomy group, patients who received antibiotics had a lower incidence of SSI compared to patients who did not (5.1% vs. 18.7%, OR 0.23; CI 0.08-0.68, P = 0.008). In the laparoscopy group, there was no difference between groups (4.4% vs. 5.4%, OR 0.81; CI 0.3-2.16, P = 0.675). CONCLUSIONS: There is limited literature on SSI prevention and preoperative antibiotic use in class I gynaecologic surgeries. This study demonstrates that antibiotics in class I procedures decrease SSI rates, specifically in open procedures, whereas no benefit was demonstrated in laparoscopy.
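The weighted analysis above relies on inverse probability of treatment weighting (IPTW). The sketch below illustrates that general workflow on synthetic data; the column names, covariates, and effect sizes are placeholders, and the study's actual covariate set and variance estimation are not reproduced.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

# Synthetic stand-in for the study data: one row per patient with a binary
# treatment (preoperative antibiotics), a binary outcome (SSI), and covariates.
rng = np.random.default_rng(0)
n = 500
df = pd.DataFrame({
    "age": rng.normal(45, 12, n),
    "bmi": rng.normal(30, 6, n),
})
df["antibiotics"] = rng.binomial(1, 0.55, n)
df["ssi"] = rng.binomial(1, np.where(df["antibiotics"] == 1, 0.05, 0.09))

covariates = ["age", "bmi"]   # placeholder covariate list

# 1) Propensity score: probability of receiving antibiotics given covariates
ps_model = sm.Logit(df["antibiotics"], sm.add_constant(df[covariates])).fit(disp=0)
ps = ps_model.predict(sm.add_constant(df[covariates]))

# 2) Inverse probability of treatment weights
w = df["antibiotics"] / ps + (1 - df["antibiotics"]) / (1 - ps)

# 3) Weighted logistic model for SSI (a full analysis would add robust
#    sandwich variance estimation; omitted here for brevity)
res = sm.GLM(df["ssi"], sm.add_constant(df["antibiotics"]),
             family=sm.families.Binomial(), freq_weights=w).fit()
print(np.exp(res.params))   # IPTW-weighted odds ratio for antibiotics
```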


Subject(s)
Anti-Bacterial Agents , Antibiotic Prophylaxis , Gynecologic Surgical Procedures , Surgical Wound Infection , Humans , Surgical Wound Infection/prevention & control , Surgical Wound Infection/epidemiology , Female , Retrospective Studies , Gynecologic Surgical Procedures/adverse effects , Middle Aged , Anti-Bacterial Agents/therapeutic use , Anti-Bacterial Agents/administration & dosage , Adult , Incidence , Safety-net Providers , Laparoscopy , Preoperative Care/methods
4.
Stroke ; 54(5): 1320-1329, 2023 05.
Article in English | MEDLINE | ID: mdl-37021564

ABSTRACT

BACKGROUND: Patients with stroke in the United States can be transferred for higher level of care. Little is known about possible inequities in interhospital transfers (IHTs) for acute ischemic stroke. We hypothesized that historically marginalized populations would have lower odds of IHT. METHODS: A cross-sectional analysis was done for adults with a primary diagnosis of acute ischemic stroke in 2010 to 2017; n=747,982 were identified in the National Inpatient Sample. Yearly rates for IHT were assessed and adjusted odds ratios (aORs) of IHT in 2014 to 2017 were compared with those of 2010 to 2013. Multinomial logistic regression was used to determine the aOR of IHT, adjusting for sociodemographic variables (model 1), sociodemographic and medical variables such as comorbidity and mortality risk (model 2), and sociodemographic, medical, and hospital variables (model 3). RESULTS: After adjusting for sociodemographic, medical, and hospital characteristics, there were no significant temporal differences in IHT from 2010 to 2017. Overall, women were less likely than men to be transferred in all models (model 3: aOR, 0.89 [0.86-0.92]). Compared with those who were White, individuals who were Black (aOR, 0.93 [0.88-0.99]), Hispanic (aOR, 0.90 [0.83-0.97]), of other race and ethnicity (aOR, 0.90 [0.82-0.99]), or of unknown race and ethnicity (aOR, 0.89 [0.80-1.00]) were less likely to be transferred (model 2), but these differences dissipated when further adjusting for hospital-level characteristics (model 3). Compared with those with private insurance, those with Medicaid (aOR, 0.86 [0.80-0.91]), self-pay (aOR, 0.64 [0.59-0.70]), and no charge (aOR, 0.64 [0.46-0.88]) were less likely to be transferred (model 3). Individuals with lower income were less likely to be transferred compared with those with higher income (model 3: aOR, 0.85 [0.80-0.90], third versus fourth quartile). CONCLUSIONS: Adjusted odds of IHT for acute ischemic stroke remained stable from 2010 to 2017. There are numerous inequities in the rates of IHT by race and ethnicity, sex, insurance, and income. Further studies are needed to understand these inequities and develop policies and interventions to mitigate them.


Subject(s)
Ischemic Stroke , Stroke , Male , Adult , Humans , Female , United States , Cross-Sectional Studies , Stroke/diagnosis , Ethnicity , Income , Retrospective Studies
5.
Ann Surg ; 278(3): 441-451, 2023 09 01.
Article in English | MEDLINE | ID: mdl-37389564

ABSTRACT

OBJECTIVE: To examine liver retransplantation (ReLT) over 35 years at a single center. BACKGROUND: Despite the durability of liver transplantation (LT), graft failure affects up to 40% of LT recipients. METHODS: All adult ReLTs from 1984 to 2021 were analyzed. Comparisons were made between ReLTs in the pre- versus post-Model for End-Stage Liver Disease (MELD) eras and between ReLTs and primary LTs in the modern era. Multivariate analysis was used for prognostic modeling. RESULTS: Six hundred fifty-four ReLTs were performed in 590 recipients. There were 372 pre-MELD ReLTs and 282 post-MELD ReLTs. Of the ReLT recipients, 89% had one previous LT, whereas 11% had ≥2. Primary nonfunction was the most common indication in the pre-MELD era (33%) versus recurrent disease (24%) in the post-MELD era. Post-MELD ReLT recipients were older (53 vs 48, P = 0.001), had higher MELD scores (35 vs 31, P = 0.01), and had more comorbidities. However, post-MELD ReLT patients had superior 1-, 5-, and 10-year survival compared with pre-MELD ReLT (75%, 60%, and 43% vs 53%, 43%, and 35%, respectively, P < 0.001) and lower in-hospital mortality and rejection rates. Notably, in the post-MELD era, the MELD score did not affect survival. We identified the following risk factors for early mortality (≤12 months after ReLT): coronary artery disease, obesity, ventilatory support, older recipient age, and longer pre-ReLT hospital stay. CONCLUSIONS: This represents the largest single-center ReLT report to date. Despite the increased acuity and complexity of ReLT patients, post-MELD era outcomes have improved. With careful patient selection, these results support the efficacy and survival benefit of ReLT in an acuity-based allocation environment.


Subject(s)
End Stage Liver Disease , Liver Transplantation , Adult , Humans , Retrospective Studies , Severity of Illness Index , Graft Survival
6.
Liver Transpl ; 29(1): 34-47, 2023 01 01.
Article in English | MEDLINE | ID: mdl-36630156

ABSTRACT

NAFLD will soon be the most common indication for liver transplantation (LT). In NAFLD, HCC may occur at earlier stages of fibrosis and present with more advanced tumor stage, raising concern for aggressive disease. Thus, adult LT recipients with HCC from 20 US centers transplanted between 2002 and 2013 were analyzed to determine whether NAFLD impacts recurrence-free post-LT survival. Five hundred and thirty-eight (10.8%) of 4981 total patients had NAFLD. Patients with NAFLD were significantly older (63 vs. 58, p<0.001), had higher body mass index (30.5 vs. 27.4, p<0.001), and were more likely to have diabetes (57.3% vs. 28.8%, p<0.001). Patients with NAFLD were less likely to receive pre-LT locoregional therapy (63.6% vs. 72.9%, p<0.001), had higher median lab MELD (15 vs. 13, p<0.001) and neutrophil-lymphocyte ratio (3.8 vs. 2.9, p<0.001), and were more likely to have their maximum pre-LT alpha-fetoprotein at the time of LT (44.1% vs. 36.1%, p<0.001). NAFLD patients were more likely to have an incidental HCC on explant (19.4% vs. 10.4%, p<0.001); however, explant characteristics including tumor differentiation and vascular invasion were not different between groups. Comparing NAFLD and non-NAFLD patients, the 1-, 3-, and 5-year cumulative incidence of recurrence (3.1%, 9.1%, 11.5% vs. 4.9%, 10.1%, 12.6%, p=0.36) and recurrence-free survival rates (87%, 76%, and 67% vs. 87%, 75%, and 67%, p=0.97) were not different. In competing risks analysis, NAFLD did not significantly impact recurrence in either univariable (HR: 0.88, p=0.36) or adjusted analysis (HR: 0.91, p=0.49). With NAFLD among the most common causes of HCC and poised to become the leading indication for LT, a better understanding of disease-specific models to predict recurrence is needed. In this NAFLD cohort, incidental HCCs were common, raising concerns about early detection. However, despite less locoregional therapy and a higher neutrophil-lymphocyte ratio, explant tumor characteristics and post-transplant recurrence-free survival were not different compared to non-NAFLD patients.


Subject(s)
Carcinoma, Hepatocellular , Liver Neoplasms , Liver Transplantation , Non-alcoholic Fatty Liver Disease , Adult , Humans , Carcinoma, Hepatocellular/epidemiology , Carcinoma, Hepatocellular/surgery , Carcinoma, Hepatocellular/pathology , Liver Neoplasms/epidemiology , Liver Neoplasms/surgery , Liver Neoplasms/pathology , Non-alcoholic Fatty Liver Disease/complications , Non-alcoholic Fatty Liver Disease/epidemiology , Non-alcoholic Fatty Liver Disease/surgery , Liver Transplantation/adverse effects , Retrospective Studies , Neoplasm Recurrence, Local/pathology , Risk Factors
7.
Liver Transpl ; 29(7): 683-697, 2023 07 01.
Article in English | MEDLINE | ID: mdl-37029083

ABSTRACT

HCC recurrence following liver transplantation (LT) is highly morbid and occurs despite strict patient selection criteria. Individualized prediction of post-LT HCC recurrence risk remains an important need. Clinico-radiologic and pathologic data of 4981 patients with HCC undergoing LT from the US Multicenter HCC Transplant Consortium (UMHTC) were analyzed to develop a REcurrent Liver cAncer Prediction ScorE (RELAPSE). Multivariable Fine and Gray competing risk analysis and machine learning algorithms (Random Survival Forest and Classification and Regression Tree models) identified variables to model HCC recurrence. RELAPSE was externally validated in 1160 HCC LT recipients from the European Hepatocellular Cancer Liver Transplant study group. Of 4981 UMHTC patients with HCC undergoing LT, 71.9% were within Milan criteria, 16.1% were initially beyond Milan criteria with 9.4% downstaged before LT, and 12.0% had incidental HCC on explant pathology. Overall and recurrence-free survival at 1, 3, and 5 years was 89.7%, 78.6%, and 69.8% and 86.8%, 74.9%, and 66.7%, respectively, with a 5-year incidence of HCC recurrence of 12.5% (median 16 months) and non-HCC mortality of 20.8%. A multivariable model identified maximum alpha-fetoprotein (HR = 1.35 per log-SD, 95% CI, 1.22-1.50, p < 0.001), neutrophil-lymphocyte ratio (HR = 1.16 per log-SD, 95% CI, 1.04-1.28, p < 0.006), pathologic maximum tumor diameter (HR = 1.53 per log-SD, 95% CI, 1.35-1.73, p < 0.001), microvascular (HR = 2.37, 95% CI, 1.87-2.99, p < 0.001) and macrovascular (HR = 3.38, 95% CI, 2.41-4.75, p < 0.001) invasion, and tumor differentiation (moderate HR = 1.75, 95% CI, 1.29-2.37, p < 0.001; poor HR = 2.62, 95% CI, 1.54-3.32, p < 0.001) as independent variables predicting post-LT HCC recurrence (C-statistic = 0.78). Machine learning algorithms incorporating additional covariates improved prediction of recurrence (Random Survival Forest C-statistic = 0.81). Despite significant differences in European Hepatocellular Cancer Liver Transplant recipient radiologic, treatment, and pathologic characteristics, external validation of RELAPSE demonstrated consistent 2- and 5-year recurrence risk discrimination (AUCs 0.77 and 0.75, respectively). We developed and externally validated a RELAPSE score that accurately discriminates post-LT HCC recurrence risk and may allow for individualized post-LT surveillance, immunosuppression modification, and selection of high-risk patients for adjuvant therapies.
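For readers unfamiliar with the Random Survival Forest step mentioned above, the sketch below shows the general pattern with the scikit-survival package on synthetic data. It is not the RELAPSE model; the features, sample size, and hyperparameters are placeholders.

```python
import numpy as np
from sksurv.ensemble import RandomSurvivalForest
from sksurv.util import Surv

# Synthetic stand-in for recipient-level covariates (e.g., AFP, NLR,
# tumor diameter); the real RELAPSE covariates are not reproduced here.
rng = np.random.default_rng(0)
n = 300
X = rng.normal(size=(n, 3))
time = rng.exponential(scale=60, size=n)            # months to recurrence/censoring
event = rng.binomial(1, 0.2, size=n).astype(bool)   # True = HCC recurrence observed

y = Surv.from_arrays(event=event, time=time)        # structured survival outcome

rsf = RandomSurvivalForest(n_estimators=200, min_samples_leaf=10, random_state=0)
rsf.fit(X, y)

# Harrell's concordance index on the training data (an external cohort,
# as in the RELAPSE validation, would be used in practice)
print("C-index:", rsf.score(X, y))
```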


Subject(s)
Carcinoma, Hepatocellular , Liver Neoplasms , Liver Transplantation , Humans , Liver Transplantation/adverse effects , Risk Factors , Neoplasm Recurrence, Local/pathology , Retrospective Studies , Recurrence
8.
Reprod Biomed Online ; 46(1): 123-128, 2023 01.
Article in English | MEDLINE | ID: mdl-36396533

ABSTRACT

RESEARCH QUESTION: Does luteal phase support with vaginal progesterone improve clinical pregnancy rates in patients undergoing ovarian stimulation with letrozole? DESIGN: This was a retrospective cohort study of patients undergoing ovarian stimulation with letrozole paired with intrauterine insemination (IUI) or timed intercourse (TIC) from January 2018 to October 2021. The primary outcome of clinical pregnancy rate (CPR) was calculated for cycles with and without luteal phase progesterone support. Univariate logistic regressions were done to evaluate predictor variables for CPR. Clinically important covariates including age, body mass index, anti-Müllerian hormone concentration, diagnosis of ovulatory dysfunction and multifollicular development were included in a multivariate analysis evaluating the relationship between luteal progesterone use and odds of clinical pregnancy. Secondary outcomes including spontaneous abortion, biochemical pregnancy and ectopic pregnancy were calculated. Live birth rates were calculated for cycles in a secondary analysis. RESULTS: A total of 492 letrozole ovarian stimulation cycles in 273 patients were included. Of these cycles, 387 (78.7%) used vaginal progesterone for luteal support and 105 (21.3%) did not. The unadjusted CPR per cycle was 11.6% (45/387) with progesterone and 13.3% (14/105) without progesterone (P = 0.645). After adjusting for significant covariates including age, BMI, diagnosis of ovulatory dysfunction and multifollicular development, the odds for clinical pregnancy were not significantly improved in cycles with exogenous progesterone (odds ratio [OR] 1.15, 95% confidence interval [CI] 0.48-2.75, P = 0.762). A follow-up analysis demonstrated that live birth rate was 10.7% (41/384) with and 12.5% (13/104) without luteal progesterone, respectively (P = 0.599). CONCLUSIONS: Luteal support with vaginal progesterone does not significantly improve CPR in ovarian stimulation cycles using letrozole.


Subject(s)
Luteal Phase , Progesterone , Pregnancy , Female , Humans , Pregnancy Rate , Letrozole/therapeutic use , Luteal Phase/physiology , Retrospective Studies , Ovulation Induction
9.
Clin Transplant ; 37(4): e14919, 2023 04.
Article in English | MEDLINE | ID: mdl-36716121

ABSTRACT

PURPOSE: To determine hepatocellular carcinoma (HCC) magnetic resonance imaging (MRI) biomarkers that enable the prediction of delisting from tumor progression versus successful transplantation in patients listed for orthotopic liver transplantation (OLT). METHODS: With IRB approval and HIPAA compliance, patients with HCC awaiting OLT who were delisted due to HCC progression from 2006 to 2015 were identified. Patients with adequate MR images for review were subsequently matched with a cohort of patients successfully bridged to OLT in the same time period. Matching considered the tumor stage and the dominant treatment strategy adopted to bridge the patient to OLT. Potential MRI features were evaluated by univariable and multivariable analysis using a conditional logistic model. RESULTS: There were 53 patients included in each cohort. On univariable analysis, significant unfavorable MR imaging features included T2 hyperintensity (odds ratio [OR], 19.0), infiltrative border (OR, 7.50), lobulated shape (OR, 4.5), T1 hypointensity (OR, 3.0), heterogeneous arterial enhancement (OR, 7.0), and corona venous enhancement (OR, 4.0). A significant favorable MR imaging feature was the presence of intralesional fat (OR = 0.36). The best multivariable logistic prediction model derived from the above notable features included only T1 and T2 signal intensity, border definition, and absence of intralesional fat as significant variables, with an area under the receiver operating characteristic curve (AUC) of 0.86 in the prediction of delisting. CONCLUSION: Select MR imaging features of HCC at presentation before any treatment are significantly associated with the risk of tumor progression regardless of tumor stage and treatment strategy in patients awaiting liver transplantation.


Subject(s)
Carcinoma, Hepatocellular , Liver Neoplasms , Liver Transplantation , Humans , Carcinoma, Hepatocellular/diagnostic imaging , Carcinoma, Hepatocellular/surgery , Carcinoma, Hepatocellular/pathology , Liver Neoplasms/diagnostic imaging , Liver Neoplasms/surgery , Liver Neoplasms/pathology , Gadolinium DTPA , Magnetic Resonance Imaging/methods , Biomarkers , Retrospective Studies , Contrast Media
10.
Crit Care ; 27(1): 486, 2023 12 08.
Article in English | MEDLINE | ID: mdl-38066613

ABSTRACT

BACKGROUND: Sepsis is a highly heterogeneous syndrome, which has hindered the development of effective therapies. This has prompted investigators to develop a precision medicine approach aimed at identifying biologically homogenous subgroups of patients with septic shock and critical illnesses. Transcriptomic analysis can identify subclasses derived from differences in underlying pathophysiological processes that may provide the basis for new targeted therapies. The goal of this study was to elucidate pathophysiological pathways and identify pediatric septic shock subclasses based on whole blood RNA expression profiles. METHODS: The subjects were critically ill children with cardiopulmonary failure who were part of a prospective randomized insulin titration trial to treat hyperglycemia. Genome-wide expression profiling was conducted using RNA sequencing from whole blood samples obtained from 46 children with septic shock and 52 mechanically ventilated noninfected controls without shock. Patients with septic shock were allocated to subclasses based on hierarchical clustering of gene expression profiles, and we then compared clinical characteristics, plasma inflammatory markers, cell compositions (using GEDIT), and immune repertoires (using Imrep) between the two subclasses. RESULTS: Patients with septic shock exhibited alterations in innate and adaptive immune pathways. Among patients with septic shock, we identified two subtypes based on gene expression patterns. Compared with Subclass 2, Subclass 1 was characterized by upregulation of innate immunity pathways and downregulation of adaptive immunity pathways. Subclass 1 had significantly worse clinical outcomes despite the two classes having similar illness severity on initial clinical presentation. Subclass 1 had elevated levels of plasma inflammatory cytokines and endothelial injury biomarkers and demonstrated decreased percentages of CD4 T cells and B cells and less diverse T cell receptor repertoires. CONCLUSIONS: Two subclasses of pediatric septic shock patients were discovered through genome-wide expression profiling based on whole blood RNA sequencing, with major biological and clinical differences. TRIAL REGISTRATION: This is a secondary analysis of data generated as part of the observational CAF-PINT ancillary of the HALF-PINT study (NCT01565941), registered March 29, 2012.
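The subclass assignment above rests on hierarchical clustering of whole-blood expression profiles. A generic sketch of that step with SciPy is shown below on a synthetic expression matrix; the study's normalization, gene filtering, and linkage choices are not reproduced.

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster

# Synthetic expression matrix: rows = patients, columns = genes
rng = np.random.default_rng(0)
expr = rng.normal(size=(46, 2000))     # 46 septic-shock patients, as in the study

# Ward linkage on patient-to-patient distances, then cut the tree into 2 subclasses
Z = linkage(expr, method="ward")
subclass = fcluster(Z, t=2, criterion="maxclust")

print(np.bincount(subclass)[1:])       # number of patients in each subclass
```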


Subject(s)
Sepsis , Shock, Septic , Child , Humans , Gene Expression Profiling , Prospective Studies , Sepsis/genetics , Shock, Septic/therapy , Transcriptome , Randomized Controlled Trials as Topic , Observational Studies as Topic
11.
J Stroke Cerebrovasc Dis ; 32(11): 107340, 2023 Nov.
Article in English | MEDLINE | ID: mdl-37683528

ABSTRACT

OBJECTIVES: Left ventricular assist devices are known to extend survival in patients with advanced heart failure; however, their association with intracranial hemorrhage is also well known. We aimed to explore the risk trend and predictors of intracranial hemorrhage in patients with left ventricular assist devices. MATERIAL AND METHODS: We included patients aged 18 years or older with left ventricular assist devices hospitalized in the US from 2005 to 2014 using the National Inpatient Sample. We computed the survey-weighted percentages of hospitalizations with intracranial hemorrhage across the 10-year study period and assessed whether the proportions changed over time. Predictors of intracranial hemorrhage were evaluated using a multivariable logistic regression model. RESULTS: Of 33,246 hospitalizations, 568 (1.7%) had intracranial hemorrhage. The number of left ventricular assist device placements increased from 873 in 2005 to 5175 in 2014. However, the risk of intracranial hemorrhage remained largely unchanged (1.7% to 2.3%; linear trend, P = 0.604). The adjusted odds of intracranial hemorrhage were higher in the presence of any of the following variables: female sex (odds ratio [OR], 1.58; 95% CI, 1.03-2.43), history of ischemic stroke (OR, 3.13; 95% CI, 1.86-5.28), or Charlson Comorbidity Index score of 3 or more (OR, 77.40; 95% CI, 10.03-597.60). CONCLUSIONS: Over the last decade, the risk of intracranial hemorrhage has remained relatively unchanged despite an increase in the use of left ventricular assist devices in patients with advanced heart failure. Female sex, higher Charlson Comorbidity Index scores, and a history of ischemic stroke were associated with higher odds of intracranial hemorrhage in patients with left ventricular assist devices.

12.
Stroke ; 53(11): 3369-3374, 2022 11.
Article in English | MEDLINE | ID: mdl-35862233

ABSTRACT

BACKGROUND: Food insecurity (FI), a lack of consistent access to food due to poor financial resources, limits the ability to eat a healthy diet, which is essential for secondary stroke prevention. Yet, little is known about FI in stroke survivors. METHODS: Using data from the US National Health and Nutrition Examination Survey (NHANES) from 1999 to 2015, we analyzed the prevalence, predictors, and temporal trends in FI among adults with and without self-reported prior stroke in this cross-sectional study. Age-standardized prevalence estimates were computed by self-reported history of stroke over survey waves. Multivariable logistic regression models were performed for NHANES participants who had a prior stroke to identify independent predictors of FI. RESULTS: Among 48,242 adults ≥20 years of age, 1877 self-reported a history of stroke. FI was more prevalent among people with prior stroke (17%) versus those without prior stroke (12%; P<0.001). Prevalence of FI increased over time from 7.8% in 1999 to 42.1% in 2015 among stroke survivors and from 8% to 17% among individuals without prior stroke (P<0.001). The age-standardized prevalence of FI over the entire time was 24% among stroke survivors versus 11% among individuals without prior stroke (P<0.001). In the adjusted model, younger age (adjusted odds ratio [aOR], 0.96 [0.95-0.97]; P<0.01), Hispanic ethnicity (aOR, 2.12 [1.36-3.31]; P<0.01), lower education (aOR, 1.67 [1.17-2.38]; P<0.01), nonmarried status (aOR, 1.49 [1.01-2.19]; P=0.04), and poverty income ratio <130% (aOR, 3.78 [2.55-5.59]; P<0.01) were associated with FI in those with prior stroke. CONCLUSIONS: One in 3 stroke survivors reported FI in 2015, nearly double the prevalence in those without stroke. Addressing the fundamental drivers of FI and targeting vulnerable demographic groups may have a profound influence on stroke prevalence.
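The prevalence estimates above are age-standardized. The sketch below illustrates direct age standardization with made-up stratum prevalences and standard-population weights; the NHANES survey weights and age strata used in the study are not reproduced.

```python
import numpy as np

# Hypothetical crude food-insecurity prevalence in three age strata (20-39, 40-59, 60+)
stratum_prevalence = np.array([0.30, 0.22, 0.14])

# Hypothetical standard-population weights for the same strata (must sum to 1)
standard_weights = np.array([0.40, 0.35, 0.25])

# Direct standardization: weighted average of stratum-specific prevalences
age_standardized = float(np.dot(stratum_prevalence, standard_weights))
print(f"age-standardized prevalence = {age_standardized:.1%}")
```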


Subject(s)
Food Supply , Stroke , Adult , United States/epidemiology , Humans , Nutrition Surveys , Prevalence , Cross-Sectional Studies , Survivors , Stroke/epidemiology , Food Insecurity
13.
Clin Transplant ; 36(1): e14503, 2022 01.
Article in English | MEDLINE | ID: mdl-34634157

ABSTRACT

BACKGROUND: Sarcopenia has gained momentum as a potential risk-stratification tool in liver transplantation (LT). While LT recipients recently have more advanced end-stage liver disease, the impact of sarcopenia in high-acuity recipients with a high model for end-stage liver disease (MELD) score remains unclear. METHODS: We retrospectively assessed sarcopenia by calculating the skeletal muscle index (SMI), the cross-sectional muscle area at the third lumbar vertebra (cm²) divided by height squared (m²), in 296 patients with a CT obtained ≤30 days prior to LT. Sex-specific SMI cut-offs were developed, and their impact was assessed in patients with MELD ≥ 35. RESULTS: In patients with MELD ≥ 35 (n = 217), men with an SMI < 30 cm²/m² had significantly higher rates of bacteremia (P = .021) and a longer hospital stay (P < .001). Women with an SMI < 34 cm²/m² had a longer hospital stay (P = .032). There were no relationships between SMI and survival in men and women with MELD ≥ 35. CONCLUSIONS: This series examined sarcopenia with a focus on high-MELD patients. Although decreased SMI contributed to a longer post-LT hospital stay, it did not impact patient survival, suggesting that while SMI alone may not aid in patient selection for LT, it certainly may guide perioperative care-planning in this challenging patient population.
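As a small worked example of the index defined above (L3 muscle cross-sectional area in cm² divided by height squared in m²), using made-up values:

```python
# Hypothetical measurements for one recipient
l3_muscle_area_cm2 = 95.0   # cross-sectional muscle area at L3 from CT
height_m = 1.75

smi = l3_muscle_area_cm2 / height_m ** 2   # skeletal muscle index, cm^2/m^2
print(f"SMI = {smi:.1f} cm^2/m^2")
# Compared against the study's sex-specific cut-offs (<30 for men, <34 for women)
print("below male cut-off" if smi < 30 else "at/above male cut-off")
```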


Subject(s)
End Stage Liver Disease , Liver Transplantation , Sarcopenia , End Stage Liver Disease/surgery , Female , Humans , Male , Prognosis , Retrospective Studies , Sarcopenia/etiology , Severity of Illness Index
14.
Am J Transplant ; 21(2): 614-625, 2021 02.
Article in English | MEDLINE | ID: mdl-32713098

ABSTRACT

Ischemia-reperfusion injury (IRI) is believed to contribute to graft dysfunction after liver transplantation (LT). However, studies on IRI and the impact of early allograft dysfunction (EAD) in IRI grafts are limited. Histological IRI was graded in 506 grafts from patients who had undergone LT and classified based on IRI severity (no, minimal, mild, moderate, and severe). Of the 506 grafts, 87.4% had IRI (no: 12.6%, minimal: 38.1%, mild: 35.4%, moderate: 13.0%, and severe: 0.8%). IRI severity correlated with the incidence of EAD and graft survival at 6 months. Longer cold/warm ischemia time, recipient/donor hypertension, and having a male donor were identified as independent risk factors for moderate to severe IRI. Among 70 grafts with moderate to severe IRI, 42.9% of grafts developed EAD, and grafts with EAD had significantly inferior survival compared to grafts without EAD. Longer cold ischemia time and large droplet macrovesicular steatosis (≥20%) were identified as independent risk factors for EAD. Our study demonstrated that increased IRI severity was correlated with inferior short-term graft outcomes. Careful consideration of IRI risk factors during donor-recipient matching may assist in optimizing graft utilization and LT outcomes. Furthermore, identification of risk factors of IRI-associated EAD may guide patient management and possible timely graft replacement.


Subject(s)
Liver Transplantation , Reperfusion Injury , Allografts , Cold Ischemia/adverse effects , Graft Survival , Humans , Liver Transplantation/adverse effects , Male , Reperfusion Injury/etiology , Risk Factors
15.
J Hepatol ; 74(4): 881-892, 2021 04.
Article in English | MEDLINE | ID: mdl-32976864

ABSTRACT

BACKGROUND & AIMS: Early allograft dysfunction (EAD) following liver transplantation (LT) negatively impacts graft and patient outcomes. Previously we reported that the liver graft assessment following transplantation (L-GrAFT7) risk score was superior to binary EAD or the model for early allograft function (MEAF) score for estimating 3-month graft failure-free survival in a single-center derivation cohort. Herein, we sought to externally validate L-GrAFT7, and compare its prognostic performance to EAD and MEAF. METHODS: Accuracies of L-GrAFT7, EAD, and MEAF were compared in a 3-center US validation cohort (n = 3,201), and a Consortium for Organ Preservation in Europe (COPE) normothermic machine perfusion (NMP) trial cohort (n = 222); characteristics were compared to assess generalizability. RESULTS: Compared to the derivation cohort, patients in the validation and NMP trial cohort had lower recipient median MELD scores; were less likely to require pretransplant hospitalization, renal replacement therapy or mechanical ventilation; and had superior 1-year overall (90% and 95% vs. 84%) and graft failure-free (88% and 93% vs. 81%) survival, with a lower incidence of 3-month graft failure (7.4% and 4.0% vs. 11.1%; p <0.001 for all comparisons). Despite significant differences in cohort characteristics, L-GrAFT7 maintained an excellent validation AUROC of 0.78, significantly superior to binary EAD (AUROC 0.68, p = 0.001) and MEAF scores (AUROC 0.72, p <0.001). In post hoc analysis of the COPE NMP trial, the highest tertile of L-GrAFT7 was significantly associated with time to liver allograft (hazard ratio [HR] 2.17, p = 0.016), Clavien ≥IIIB (HR 2.60, p = 0.034) and ≥IVa (HR 4.99, p = 0.011) complications; post-LT length of hospitalization (p = 0.002); and renal replacement therapy (odds ratio 3.62, p = 0.016). CONCLUSIONS: We have validated the L-GrAFT7 risk score as a generalizable, highly accurate, individualized risk assessment of 3-month liver allograft failure that is superior to existing scores. L-GrAFT7 may standardize grading of early hepatic allograft function and serve as a clinical endpoint in translational studies (www.lgraft.com). LAY SUMMARY: Early allograft dysfunction negatively affects outcomes following liver transplantation. In independent multicenter US and European cohorts totaling 3,423 patients undergoing liver transplantation, the liver graft assessment following transplantation (L-GrAFT) risk score is validated as a superior measure of early allograft function that accurately discriminates 3-month graft failure-free survival and post-liver transplantation complications.
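Discrimination in the validation analysis above is summarized by AUROC for each score. The sketch below shows a generic AUROC comparison with scikit-learn on synthetic data; it does not include the paired (DeLong-type) test that a formal comparison of correlated AUROCs would require, and the scores here are placeholders, not L-GrAFT7, MEAF, or EAD themselves.

```python
import numpy as np
from sklearn.metrics import roc_auc_score

# Synthetic stand-ins: 1 = 3-month graft failure, plus three candidate predictors
rng = np.random.default_rng(0)
n = 1000
failure = rng.binomial(1, 0.08, n)
score_a = rng.normal(loc=failure * 1.2, scale=1.0)   # continuous risk score
score_b = rng.normal(loc=failure * 0.8, scale=1.0)   # continuous risk score
binary_c = (rng.normal(loc=failure * 0.7, scale=1.0) > 0.8).astype(int)  # binary flag

for name, score in [("score A", score_a), ("score B", score_b), ("binary C", binary_c)]:
    print(f"{name}: AUROC = {roc_auc_score(failure, score):.2f}")
```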


Subject(s)
Liver Transplantation , Primary Graft Dysfunction , Risk Assessment , Europe/epidemiology , Female , Graft Survival , Humans , Liver Transplantation/adverse effects , Liver Transplantation/methods , Liver Transplantation/statistics & numerical data , Male , Middle Aged , Outcome and Process Assessment, Health Care/statistics & numerical data , Primary Graft Dysfunction/diagnosis , Primary Graft Dysfunction/epidemiology , Primary Graft Dysfunction/therapy , Prognosis , Reperfusion Injury/diagnosis , Reperfusion Injury/epidemiology , Reperfusion Injury/therapy , Reproducibility of Results , Risk Assessment/methods , Risk Assessment/standards , Risk Factors , Survival Analysis , United States/epidemiology
16.
J Urol ; 205(2): 444-451, 2021 02.
Article in English | MEDLINE | ID: mdl-33026934

ABSTRACT

PURPOSE: Oncologic efficacy of focal therapies in prostate cancer depends heavily on accurate tumor size estimation. We aim to evaluate the agreement between radiologic tumor size and pathological tumor size, and to identify predictors of pathological tumor size. MATERIALS AND METHODS: This single-arm study cohort included all consecutive patients with biopsy-proven prostate cancer and a corresponding PI-RADS®v2 3 or greater index tumor on multiparametric magnetic resonance imaging who subsequently underwent radical prostatectomy. Radiologic tumor size was defined as the maximum tumor diameter on multiparametric magnetic resonance imaging and compared to whole-mount histopathology tumor correlates. The difference between radiologic tumor size and pathological tumor size was assessed, and clinical, pathological and radiographic predictors of pathological tumor size were examined. RESULTS: A total of 461 consecutive lesions in 441 men were included for statistical analysis. Mean radiologic tumor size and pathological tumor size were 1.57 and 2.37 cm, respectively (p <0.001). Radiologic tumor size consistently underestimated pathological tumor size regardless of the preoperative covariates, and the degree of underestimation increased with smaller radiologic tumor size and lower PI-RADSv2 scores. Pathological tumor size was significantly larger for biopsy Gleason Grade Group (GG) 5 compared to GG 1 (mean change 0.37 cm, p=0.014), for PI-RADSv2 5 lesions compared to PI-RADSv2 4 (mean change 0.26 cm, p=0.006), and with higher prostate specific antigen density. The correlations between radiologic tumor size and pathological tumor size according to biopsy GG and radiologic covariates were generally low, with correlation coefficients ranging between 0.1 and 0.65. CONCLUSIONS: Multiparametric magnetic resonance imaging frequently underestimates pathological tumor size, and the degree of underestimation increases with smaller radiologic tumor size and lower PI-RADSv2 scores. Therefore, a larger ablation margin may be required for smaller tumors and lesions with lower PI-RADSv2 scores. These variables must be considered when estimating treatment margins in focal therapy.
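The agreement analysis above compares radiologic and pathological tumor size. The sketch below computes the two basic quantities reported (mean underestimation and a Pearson correlation) on synthetic paired measurements; the study's covariate-specific models are not reproduced.

```python
import numpy as np
from scipy.stats import pearsonr, ttest_rel

# Synthetic paired measurements (cm): MRI tends to underestimate pathology
rng = np.random.default_rng(0)
path_size = rng.gamma(shape=4.0, scale=0.6, size=200)                 # pathological size
mri_size = np.clip(path_size - rng.normal(0.8, 0.3, size=200), 0.2, None)  # radiologic size

r, _ = pearsonr(mri_size, path_size)
t, p = ttest_rel(mri_size, path_size)
print(f"mean underestimation = {np.mean(path_size - mri_size):.2f} cm")
print(f"Pearson r = {r:.2f}, paired t-test p = {p:.3g}")
```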


Subject(s)
Multiparametric Magnetic Resonance Imaging , Prostatic Neoplasms/diagnostic imaging , Prostatic Neoplasms/pathology , Adult , Aged , Aged, 80 and over , Humans , Male , Middle Aged , Predictive Value of Tests , Preoperative Period , Prostatectomy , Prostatic Neoplasms/surgery , Retrospective Studies , Tumor Burden
17.
Hepatology ; 72(6): 2014-2028, 2020 12.
Article in English | MEDLINE | ID: mdl-32124453

ABSTRACT

BACKGROUND AND AIMS: The Organ Procurement and Transplantation Network recently approved liver transplant (LT) prioritization for patients with hepatocellular carcinoma (HCC) beyond Milan Criteria (MC) who are down-staged (DS) with locoregional therapy (LRT). We evaluated post-LT outcomes, predictors of down-staging, and the impact of LRT in patients with beyond-MC HCC from the U.S. Multicenter HCC Transplant Consortium (20 centers, 2002-2013). APPROACH AND RESULTS: Clinicopathologic characteristics, overall survival (OS), recurrence-free survival (RFS), and HCC recurrence (HCC-R) were compared between patients within MC (n = 3,570) and beyond MC (n = 789) who were down-staged (DS, n = 465), treated with LRT and not down-staged (LRT-NoDS, n = 242), or untreated (NoLRT-NoDS, n = 82). Five-year post-LT OS and RFS was higher in MC (71.3% and 68.2%) compared with DS (64.3% and 59.5%) and was lowest in NoDS (n = 324; 60.2% and 53.8%; overall P < 0.001). DS patients had superior RFS (60% vs. 54%, P = 0.043) and lower 5-year HCC-R (18% vs. 32%, P < 0.001) compared with NoDS, with further stratification by maximum radiologic tumor diameter (5-year HCC-R of 15.5% in DS/<5 cm and 39.1% in NoDS/>5 cm, P < 0.001). Multivariate predictors of down-staging included alpha-fetoprotein response to LRT, pathologic tumor number and size, and wait time >12 months. LRT-NoDS had greater HCC-R compared with NoLRT-NoDS (34.1% vs. 26.1%, P < 0.001), even after controlling for clinicopathologic variables (hazard ratio [HR] = 2.33, P < 0.001) and inverse probability of treatment-weighted propensity matching (HR = 1.82, P < 0.001). CONCLUSIONS: In LT recipients with HCC presenting beyond MC, successful down-staging is predicted by wait time, alpha-fetoprotein response to LRT, and tumor burden and results in excellent post-LT outcomes, justifying expansion of LT criteria. In LRT-NoDS patients, higher HCC-R compared with NoLRT-NoDS cannot be explained by clinicopathologic differences, suggesting a potentially aggravating role of LRT in patients with poor tumor biology that warrants further investigation.


Subject(s)
Ablation Techniques/methods , Carcinoma, Hepatocellular/therapy , End Stage Liver Disease/therapy , Liver Neoplasms/therapy , Liver Transplantation/statistics & numerical data , Neoplasm Recurrence, Local/epidemiology , Ablation Techniques/statistics & numerical data , Carcinoma, Hepatocellular/diagnosis , Carcinoma, Hepatocellular/mortality , Carcinoma, Hepatocellular/pathology , Disease-Free Survival , End Stage Liver Disease/diagnosis , End Stage Liver Disease/mortality , End Stage Liver Disease/pathology , Female , Follow-Up Studies , Humans , Liver/diagnostic imaging , Liver/pathology , Liver/radiation effects , Liver/surgery , Liver Neoplasms/diagnosis , Liver Neoplasms/mortality , Liver Neoplasms/pathology , Liver Transplantation/standards , Male , Middle Aged , Neoadjuvant Therapy/methods , Neoplasm Recurrence, Local/prevention & control , Neoplasm Staging , Radiotherapy, Adjuvant/methods , Radiotherapy, Adjuvant/statistics & numerical data , Retrospective Studies , Severity of Illness Index , Tissue and Organ Procurement/standards , Tumor Burden/radiation effects , United States/epidemiology , Waiting Lists/mortality
18.
Acta Neuropathol ; 142(3): 495-511, 2021 09.
Article in English | MEDLINE | ID: mdl-33991233

ABSTRACT

The diagnosis of Parkinson's disease (PD) and atypical parkinsonian syndromes is difficult due to the lack of reliable, easily accessible biomarkers. Multiple system atrophy (MSA) is a synucleinopathy whose symptoms often overlap with PD. Exosomes isolated from blood by immunoprecipitation using CNS markers provide a window into the brain's biochemistry and may assist in distinguishing between PD and MSA. Thus, we asked whether α-synuclein (α-syn) in such exosomes could distinguish among healthy individuals, patients with PD, and patients with MSA. We isolated exosomes from the serum or plasma of these three groups by immunoprecipitation using neuronal and oligodendroglial markers in two independent cohorts and measured α-syn in these exosomes using an electrochemiluminescence ELISA. In both cohorts, α-syn concentrations were significantly lower in the control group and significantly higher in the MSA group compared to the PD group. The ratio between α-syn concentrations in putative oligodendroglial exosomes compared to putative neuronal exosomes was a particularly sensitive biomarker for distinguishing between PD and MSA. Combining this ratio with the α-syn concentration itself and the total exosome concentration, a multinomial logistic model trained on the discovery cohort separated PD from MSA with an AUC = 0.902, corresponding to 89.8% sensitivity and 86.0% specificity when applied to the independent validation cohort. The data demonstrate that a minimally invasive blood test measuring α-syn in blood exosomes immunoprecipitated using CNS markers can distinguish between patients with PD and patients with MSA with high sensitivity and specificity. Future optimization and validation of the data by other groups would allow this strategy to become a viable diagnostic test for synucleinopathies.
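The classifier described above combines the oligodendroglial-to-neuronal α-syn ratio with the α-syn and total exosome concentrations in a multinomial logistic model. The sketch below shows a simplified two-class (PD vs. MSA) version of that idea with scikit-learn on synthetic features; the real assay values, the three-group model, and the discovery/validation split are not reproduced.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

# Synthetic stand-ins for the three exosome features in two diagnostic groups
rng = np.random.default_rng(0)
n = 120
is_msa = rng.binomial(1, 0.5, n)                   # 1 = MSA, 0 = PD
X = np.column_stack([
    rng.normal(1.0 + 0.6 * is_msa, 0.4),           # oligodendroglial/neuronal ratio
    rng.normal(10 + 4 * is_msa, 3),                # exosomal alpha-synuclein level
    rng.normal(5, 1, n),                           # total exosome concentration
])

clf = LogisticRegression(max_iter=1000).fit(X, is_msa)
prob_msa = clf.predict_proba(X)[:, 1]
print(f"PD vs. MSA AUC (same synthetic data): {roc_auc_score(is_msa, prob_msa):.2f}")
# The study trained on a discovery cohort and reported AUC, sensitivity, and
# specificity on an independent validation cohort; here everything is
# evaluated on the training data for brevity.
```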


Subject(s)
Exosomes/immunology , Multiple System Atrophy/diagnosis , Neurons/metabolism , Oligodendroglia/metabolism , Parkinson Disease/diagnosis , alpha-Synuclein/immunology , Adult , Aged , Aged, 80 and over , Area Under Curve , Biomarkers , Cohort Studies , Diagnosis, Differential , Enzyme-Linked Immunosorbent Assay , Female , Healthy Volunteers , Humans , Immunoprecipitation , Male , Middle Aged , Multiple System Atrophy/blood , Parkinson Disease/blood , Reproducibility of Results , Sensitivity and Specificity
19.
Clin Transplant ; 35(4): e14215, 2021 04.
Article in English | MEDLINE | ID: mdl-33406299

ABSTRACT

INTRODUCTION: The increasing societal prevalence of marijuana use continues to challenge liver transplant (LT) programs. This study aimed to examine the potential effects of marijuana use on post-LT outcomes. METHODS: This retrospective study included recipients who underwent LT between 1/2012 and 6/2018. According to pre-LT marijuana use, patients were classified as recent users (use within 6 months of LT), former users (chronic use, but not within 6 months of LT), or non-users. Additionally, the impact of post-LT marijuana use on survival was assessed. RESULTS: Of 926 eligible patients, 184 were pre-LT marijuana users (42 recent; 142 former) (median follow-up: 30.3 months). Pre-LT users were more likely to be male, White, and have histories of tobacco, alcohol, and illicit drug use. Additionally, recent users were of higher acuity, with higher MELD scores, and more frequently required ICU admission. Patient survival at 1 year was 89% in non-users, 94% (HR: 0.494, 95% CI: 0.239-1.022 vs. non-users) in former users, and 83% (HR: 1.516, 95% CI: 0.701-3.282) in recent users. Neither post-operative complications in pre-LT users nor the survival analysis for post-LT marijuana users versus non-users showed significant differences. CONCLUSIONS: Our results demonstrated that marijuana use did not have an adverse impact on post-LT outcomes; however, further studies utilizing larger cohorts are warranted.


Subject(s)
Liver Transplantation , Marijuana Use , Substance-Related Disorders , Female , Humans , Liver Transplantation/adverse effects , Male , Marijuana Use/epidemiology , Postoperative Complications/epidemiology , Retrospective Studies , Transplant Recipients
20.
Acta Neurol Scand ; 144(5): 478-485, 2021 Nov.
Article in English | MEDLINE | ID: mdl-34268731

ABSTRACT

OBJECTIVES: Since 2000, medical treatment for epilepsy and cardiovascular risk-reduction strategies have advanced significantly in the United States (US). However, seizure-free rates remain unchanged, and people with epilepsy are at higher risk than the general population for heart disease and stroke. The purpose of this study was to determine how cardiovascular, epilepsy-related, and other causes of death are changing in epilepsy in comparison with the US population. MATERIALS & METHODS: Changes in the 15 underlying causes of death in epilepsy (ICD-10 G40-G40.9) and the US population were analyzed and compared from 2000 to 2018. The CDC multiple cause-of-death database was utilized as the primary data source. Changes in the relative proportions of each cause of death over time were evaluated using logistic regression. RESULTS: The proportion of deaths in epilepsy due to heart disease declined 34.4% (p < .001), a rate similar to the general population (39.9%). Epilepsy-related deaths declined 25% as a percentage of all epilepsy deaths (p < .001). The proportions of deaths due to stroke and neoplasms increased significantly in epilepsy versus the US population (p < .001 for linear trend). CONCLUSIONS: The reduction in ischemic heart disease deaths in epilepsy is a novel and highly significant finding, which reflects widespread implementation of cardiovascular risk-factor reduction and treatment in the United States. Reductions in epilepsy-related deaths are an exciting development that requires further investigation into causality. The increase in deaths due to neoplasms and stroke relative to the US population is concerning, warranting vigilance and increased efforts at recognition, prevention, and treatment.


Subject(s)
Epilepsy , Heart Diseases , Stroke , Cause of Death , Epilepsy/epidemiology , Humans , United States/epidemiology