1.
Ann Gastroenterol Surg ; 8(3): 383-393, 2024 May.
Article in English | MEDLINE | ID: mdl-38707230

ABSTRACT

Background: We aimed to identify the characteristics of new-onset diabetes after liver transplantation (LT) (NODAT) and investigate its impact on post-transplant outcomes. Methods: Adult LT patients between 2014 and 2020 who used tacrolimus as initial immunosuppression and survived at least 3 months were evaluated. Patients who developed NODAT within 3 months after LT were classified into the NODAT group. The remaining patients were classified into history of diabetes before LT (PHDBT) and non-diabetes (ND) groups. Patient characteristics, post-LT outcomes, and cardiovascular and/or pulmonary complications were compared. Results: A total of 83, 225, and 263 patients were classified into the NODAT, PHDBT, and ND groups, respectively. The proportions of cholestatic liver disease and of rejection within 90 days were higher in the NODAT group. Mean serum tacrolimus trough concentrations in the first week after LT were 7.12, 6.12, and 6.12 ng/mL (p < 0.001). Duration of corticosteroid use was significantly longer in NODAT than in PHDBT or ND (416, 289, and 228 days; p < 0.001). Three-year graft and patient survival were significantly worse in NODAT than in ND (80.5% vs. 95.0%, p < 0.001; 82.0% vs. 95.4%, p < 0.001) but similar to PHDBT. Adjusted risks of 3-year graft loss and patient death in Cox regression analysis were significantly higher in NODAT than in ND (adjusted hazard ratio [aHR] 3.41, p = 0.004; aHR 3.61, p = 0.004). Incidence rates of cardiovascular or pulmonary complications after LT in NODAT were significantly higher than in ND but similar to PHDBT. Conclusion: Higher initial tacrolimus concentration and early rejection might be risk factors for NODAT. NODAT was associated with worse post-transplant outcomes.

2.
Transplantation ; 2024 Feb 27.
Article in English | MEDLINE | ID: mdl-38409687

ABSTRACT

BACKGROUND: Liver transplant (LT) using organs donated after circulatory death (DCD) has been increasing in the United States. We investigated whether transplant centers' receptiveness to use of DCD organs impacted patient outcomes. METHODS: Transplant centers were classified as very receptive (group 1), receptive (2), or less receptive (3) based on the DCD acceptance rate and DCD transplant percentage. Using Organ Procurement and Transplantation Network/UNOS registry data for 20 435 patients listed for LT from January 2020 to June 2022, we compared rates of 1-y transplant probability and waitlist mortality between groups, broken down by Model for End-Stage Liver Disease-sodium (MELD-Na) categories. RESULTS: In adjusted analyses, patients in group 1 centers with MELD-Na scores 6 to 29 were significantly more likely to undergo transplant than those in group 3 (adjusted hazard ratio [aHR] range 1.51-2.11, P < 0.001). Results were similar in comparisons between groups 1 and 2 (aHR range 1.41-1.81, P < 0.001) and between groups 2 and 3 with MELD-Na 15-24 (aHR 1.19-1.20, P < 0.007). Likewise, patients with MELD-Na scores 20 to 29 in group 1 centers had lower waitlist mortality than those in group 3 (scores 20-24: aHR 0.71, P = 0.03; scores 25-29: aHR 0.51, P < 0.001); those in group 1 also had lower waitlist mortality compared with group 2 (scores 20-24: aHR 0.69, P = 0.02; scores 25-29: aHR 0.63, P = 0.03). One-year posttransplant survival of DCD LT recipients did not differ significantly from that of recipients of donation after brain death organs. CONCLUSIONS: We conclude that transplant centers' use of DCD livers can improve waitlist outcomes, particularly among mid-MELD-Na patients.

3.
Clin Transplant ; 38(1): e15190, 2024 01.
Article in English | MEDLINE | ID: mdl-37964683

ABSTRACT

BACKGROUND: After implementation of the Acuity Circles (AC) allocation policy, use of donation after circulatory death (DCD) liver grafts has increased in the United States. METHODS: We evaluated the impact of AC on rates of DCD liver transplants (LT), their outcomes, and medical costs in a single practice. Adult LT patients were classified into three eras: Era 1 (pre-AC, 1/01/2015-12/31/2017); Era 2 (late pre-AC, 1/01/2018-02/03/2020); and Era 3 (AC, 05/10/2020-09/30/2021). RESULTS: A total of 520 eligible LTs were performed; 87 were DCD and 433 were donation after brain death (DBD). With each successive era, the proportion of DCD increased (Era 1: 11%; Era 2: 20%; Era 3: 24%; p < .001). DCD recipients had longer ICU stays, higher re-admission/re-operation rates, and a higher incidence of ischemic cholangiopathy than DBD recipients. Direct, surgical, and ICU costs during the first admission were higher with DCD than with DBD (+8.0%, p < .001; +4.2%, p < .001; and +33.3%, p = .001). DCD-related costs increased after Era 1 (direct: +4.9% [Era 2 vs. 1] and +12.4% [Era 3 vs. 1], p = .04; surgical: +17.7% and +21.7%, p < .001). In the AC era, there was a significantly higher proportion of donors ≥50 years and more national organ sharing. Compared to DCD from donors <50 years, DCD from donors ≥50 years was associated with significantly higher total direct, surgical, and ICU costs (+12.6%, p = .01; +9.5%, p = .01; +84.6%, p = .03). CONCLUSIONS: The proportion of DCD-LT, especially from older donors, has increased since implementation of the AC policy. These changes are likely to be associated with higher costs in the AC era.


Subject(s)
Cardiovascular System , Liver Transplantation , Tissue and Organ Procurement , Adult , Humans , Financial Stress , Graft Survival , Living Donors , Tissue Donors , Retrospective Studies , Death , Brain Death
4.
Clin Transplant ; 37(6): e14977, 2023 06.
Article in English | MEDLINE | ID: mdl-36951511

ABSTRACT

BACKGROUND: Acuity circle (AC) policy implementation improved waitlist outcomes for certain liver transplant (LT) candidates. The impact of the policy implementation on liver retransplant (reLT) candidates is unknown. METHODS: Using Organ Procurement and Transplantation Network/United Network for Organ Sharing (OPTN/UNOS) data from January 2018 to September 2021, we investigated the effect of the AC policy on waitlist and post-LT outcomes among patients who had previously received an LT. Patients were categorized by relisting date: pre-AC (Era 1: January 1, 2018-February 3, 2020; n = 750) and post-AC (Era 2: February 4, 2020-June 30, 2021; n = 556). Patient and donor characteristics, as well as on-waitlist and post-reLT outcomes, were compared across eras. RESULTS: In Era 2, the probability of transplant within 90 days, both overall and among patients relisted >14 days from initial transplant (late relisting), was significantly higher than in Era 1 (subdistribution hazard ratio [sHR] 1.40, 95% CI 1.18-1.64, p < .001; sHR 1.52, 95% CI 1.23-1.88, p = .001, respectively). However, there was no difference by era among patients relisted ≤14 days from initial transplant (early relisting; sHR 1.21, 95% CI .93-1.57, p = .15). Moreover, among early relisting patients, risks of 180-day graft loss and mortality were significantly higher in Era 2 versus Era 1 (adjusted hazard ratio [aHR] 5.77, 95% CI 1.71-19.51, p = .004; and aHR 8.22, 95% CI 1.85-36.59, p = .005, respectively); for late relisting patients, risks for these outcomes were similar across eras. CONCLUSION: Our results show that implementation of the AC policy has improved transplant rates and reduced waiting time for reLT candidates relisted >14 days from initial transplant. However, the impact on early relisting patients may be mixed.


Subject(s)
End Stage Liver Disease , Liver Transplantation , Humans , Waiting Lists , End Stage Liver Disease/surgery , Policy
5.
Transplant Direct ; 8(10): e1356, 2022 Oct.
Article in English | MEDLINE | ID: mdl-36176726

ABSTRACT

Liver allocation in the United States was updated on February 4, 2020, with the introduction of the acuity circle (AC)-based model. This study evaluated the early effects of AC-based allocation on waitlist outcomes. Methods: Adult liver transplant (LT) candidates listed between January 1, 2019, and September 30, 2021, were assessed. Two periods were defined according to listing date (pre- and post-AC), and 90-d waitlist outcomes were compared. The median transplant Model for End-stage Liver Disease (MELD) score of each transplant center was calculated, and centers were categorized as low- (<25th percentile), mid- (25th-75th percentile), and high-MELD (>75th percentile) centers. Results: A total of 12 421 and 17 078 LT candidates were identified in the pre- and post-AC eras, respectively. Overall, the post-AC era was associated with higher cause-specific 90-d hazards of transplant (cause-specific hazard ratio [csHR], 1.32; 95% confidence interval [CI], 1.27-1.38; P < 0.001) and waitlist mortality (csHR, 1.20; 95% CI, 1.09-1.32; P < 0.001). The latter effect was primarily driven by high-MELD centers. Low-MELD centers used a higher proportion of donation after circulatory death (DCD) grafts. Compared with low-MELD centers, mid- and high-MELD centers had significantly lower cause-specific hazards of DCD-LT in both eras (mid-MELD: csHR, 0.47; 95% CI, 0.38-0.59 pre-AC and csHR, 0.56; 95% CI, 0.46-0.67 post-AC; high-MELD: csHR, 0.11; 95% CI, 0.07-0.17 pre-AC and csHR, 0.14; 95% CI, 0.10-0.20 post-AC; all P < 0.001). Using a structural Bayesian time-series model, the AC policy was associated with an increase in actual monthly DCD-LTs in low-, mid-, and high-MELD centers (actual/predicted: low-MELD: 19/16; mid-MELD: 21/14; high-MELD: 4/3), whereas an increase in monthly donation after brain death-LTs was present only in mid- and high-MELD centers.
Conclusions: Although AC-based allocation may improve waitlist outcomes, regional variation exists in the drivers of such outcomes between centers.
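The center categorization described in the Methods (each center's median transplant MELD score binned at the 25th and 75th percentiles) can be sketched as follows. The center scores below are synthetic stand-ins, not study data, and the number of centers is illustrative:

```python
# Sketch of percentile-based center categorization: centers fall into
# low- (<25th percentile), mid- (25th-75th), or high-MELD (>75th percentile)
# groups based on their median transplant MELD score. Scores are synthetic.
import numpy as np

rng = np.random.default_rng(1)
center_median_meld = rng.normal(28, 4, size=120)  # one median score per center (synthetic)

q25, q75 = np.percentile(center_median_meld, [25, 75])

def categorize(score):
    if score < q25:
        return "low-MELD"
    if score <= q75:
        return "mid-MELD"
    return "high-MELD"

labels = [categorize(s) for s in center_median_meld]
# With 120 distinct scores this yields 30 low, 60 mid, and 30 high centers.
print({c: labels.count(c) for c in ("low-MELD", "mid-MELD", "high-MELD")})
```

By construction roughly half of the centers land in the mid-MELD group, which is why era effects in that stratum dominate pooled comparisons.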

6.
Transpl Int ; 35: 10489, 2022.
Article in English | MEDLINE | ID: mdl-36090776

ABSTRACT

Advanced donor age is a risk factor for graft loss after liver transplant (LT). We sought to identify recipient characteristics associated with negative post-LT outcomes in the context of elderly donors. Using 2014-2019 OPTN/UNOS data, LT recipients were classified by donor age: ≥70, 40-69, and <40 years. We identified recipient risk factors for one-year graft loss, created a risk stratification system from them, and validated it using the 2020 OPTN/UNOS data set. At transplant, significant recipient risk factors for one-year graft loss were: previous liver transplant (adjusted hazard ratio [aHR] 4.37, 95%CI 1.98-9.65); mechanical ventilation (aHR 4.28, 95%CI 1.95-9.43); portal thrombus (aHR 1.87, 95%CI 1.26-2.77); serum sodium <125 mEq/L (aHR 2.88, 95%CI 1.34-6.20); and Karnofsky score 10-30% (aHR 2.03, 95%CI 1.13-3.65) or 40-60% (aHR 1.65, 95%CI 1.08-2.51). Using these risk factors and multiplying their HRs, recipients were divided into low-risk (n = 931) and high-risk (n = 294) groups. The adjusted risk of one-year graft loss in the low-risk recipient group was similar to that of patients with younger donors; results were consistent in the validation data set. Our results show that careful recipient selection can reduce the risk of graft loss associated with older donor age.
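The multiplicative risk score described above can be sketched in a few lines. The aHR values are taken from the abstract; the example recipient and the absence of any cutoff value are deliberate, since the abstract does not report the threshold separating low- from high-risk:

```python
# Sketch of the risk score built by multiplying adjusted hazard ratios (aHRs):
# a recipient's score is the product of the aHRs of the risk factors present.
# aHR values come from the abstract; the example recipient is hypothetical.

AHR = {
    "previous_lt": 4.37,       # previous liver transplant
    "mech_vent": 4.28,         # mechanical ventilation at transplant
    "portal_thrombus": 1.87,
    "sodium_lt_125": 2.88,     # serum sodium <125 mEq/L
    "karnofsky_10_30": 2.03,   # Karnofsky score 10-30%
    "karnofsky_40_60": 1.65,   # Karnofsky score 40-60%
}

def risk_score(factors):
    """Multiply the aHRs of the risk factors a recipient carries (1.0 if none)."""
    score = 1.0
    for f in factors:
        score *= AHR[f]
    return score

# A hypothetical recipient on mechanical ventilation with portal thrombus:
print(risk_score(["mech_vent", "portal_thrombus"]))  # 4.28 * 1.87 = 8.0036
```

A recipient with no listed factors scores 1.0; each additional factor scales the score by its aHR, so the product acts as a combined relative hazard under an assumption of independent effects.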


Subject(s)
Kidney Transplantation , Liver Transplantation , Transplants , Adult , Aged , Graft Survival , Humans , Kidney Transplantation/adverse effects , Liver Transplantation/adverse effects , Tissue Donors
7.
Ann Transplant ; 27: e934850, 2022 Feb 18.
Article in English | MEDLINE | ID: mdl-35177580

ABSTRACT

BACKGROUND The new simultaneous liver-kidney transplantation (SLK) listing criteria in the United States were implemented in 2017. We aimed to investigate how the policy's changes to candidates' medical eligibility for SLK affected waitlist and post-transplantation outcomes. MATERIAL AND METHODS We analyzed adult primary SLK candidates registered between January 2015 and March 2019 using the Organ Procurement and Transplant Network/United Network for Organ Sharing (OPTN/UNOS) registry. We compared waitlist practice, post-transplantation outcomes, and final transplant graft type in SLK candidates before and after the policy. RESULTS A total of 4641 patients were eligible, with 2975 and 1666 registered before and after the 2017 policy, respectively. The daily number of SLK candidates was lower after the 2017 policy (3.25 vs 2.89, P=0.01); 1956 received SLK and 95 received liver transplant alone (LTA). The proportion of patients who eventually received LTA was higher after the 2017 policy (7.9% vs 3.0%; P<0.001). The 1-year graft survival rate was worse in patients with LTA than in those with SLK (80.5% vs 90.4%; P=0.003). Among SLK candidates, the adjusted risk of 1-year graft failure was higher with LTA than with SLK (hazard ratio 2.01, 95% confidence interval 1.13-3.58, P=0.01). CONCLUSIONS Although the number of registrations for SLK increased, the number of SLK transplants decreased and the number of liver-alone transplants increased. LTA in this patient cohort was associated with worse post-transplantation outcomes.


Subject(s)
Kidney Transplantation , Liver Transplantation , Tissue and Organ Procurement , Adult , Humans , Kidney Transplantation/adverse effects , Liver , Liver Transplantation/adverse effects , Policy , Risk Factors , United States
8.
Transpl Infect Dis ; 24(2): e13808, 2022 Apr.
Article in English | MEDLINE | ID: mdl-35157334

ABSTRACT

BACKGROUND: In recipients with HCV/HIV coinfection, the impact of the wider use of direct-acting antivirals (DAAs) on post-liver transplant (LT) outcomes has not been evaluated. We investigated the impact of the introduction of DAAs on post-LT outcomes in patients with HCV/HIV coinfection. METHODS: Using Organ Procurement and Transplant Network/United Network for Organ Sharing data, we compared post-LT outcomes in patients with HCV and/or HIV before and after the introduction of DAAs, categorizing patients into two eras: pre-DAA (2008-2012) and post-DAA (2014-2019). To study the impact of the introduction of DAAs, inverse probability of treatment weighting was used to adjust for patient characteristics. RESULTS: A total of 17 215 LT recipients were eligible for this study (HCV/HIV coinfection [n = 160]; HIV mono-infection [n = 188]; HCV mono-infection [n = 16 867]). Patients with HCV/HIV coinfection and HCV mono-infection had significantly lower hazards of 1- and 3-year graft loss post-DAA than pre-DAA (1-year: adjusted hazard ratio [aHR] 0.29, 95% confidence interval [CI] 0.16-0.53 in HCV/HIV and aHR 0.58, 95% CI 0.54-0.63 in HCV mono-infection; 3-year: aHR 0.30, 95% CI 0.14-0.61 and aHR 0.64, 95% CI 0.58-0.70, respectively). The hazards of 1- and 3-year graft loss post-DAA in HIV mono-infection were comparable to those pre-DAA. HCV/HIV coinfection had significantly lower patient mortality post-DAA than pre-DAA (1-year: aHR 0.30, 95% CI 0.17-0.55; 3-year: aHR 0.31, 95% CI 0.15-0.63). CONCLUSIONS: Post-LT outcomes in patients with coinfection improved significantly and became comparable to those with HCV mono-infection after the introduction of DAA therapy. These findings support the use of LT in the setting of HCV/HIV coinfection.


Subject(s)
Coinfection , HIV Infections , Hepatitis C, Chronic , Hepatitis C , Liver Transplantation , Antiviral Agents/therapeutic use , Coinfection/drug therapy , HIV Infections/complications , HIV Infections/drug therapy , Hepacivirus , Hepatitis C/complications , Hepatitis C/drug therapy , Hepatitis C, Chronic/complications , Hepatitis C, Chronic/drug therapy , Humans , Retrospective Studies
9.
Transplant Direct ; 8(2): e1283, 2022 Feb.
Article in English | MEDLINE | ID: mdl-35187210

ABSTRACT

Use of higher-risk grafts in liver transplantation for patients with acute-on-chronic liver failure (ACLF) has been associated with poor outcomes. This study analyzes trends in liver transplantation outcomes for ACLF over time based on the donor risk index (DRI). METHODS: Using the Organ Procurement and Transplantation Network and the United Network for Organ Sharing registry, 17 300 ACLF patients who underwent liver transplantation between 2002 and 2019 were evaluated. Based on DRI, adjusted hazard ratios for 1-y patient death were analyzed in 3 eras: Era 1 (2002-2007, n = 4032), Era 2 (2008-2013, n = 6130), and Era 3 (2014-2019, n = 7138). DRI groups were defined by DRI <1.2, 1.2-1.6, 1.6-2.0, and >2.0. RESULTS: ACLF patients had significantly lower risks of patient death within 1 y in Era 2 (adjusted hazard ratio, 0.69; 95% confidence interval, 0.61-0.78; P < 0.001) and Era 3 (adjusted hazard ratio, 0.48; 95% confidence interval, 0.42-0.55; P < 0.001) than in Era 1. All DRI groups showed lower hazards in Era 3 than in Era 1. Improvements in posttransplant outcomes were found in both ACLF-1/2 and ACLF-3 patients. In ACLF-1/2, DRI 1.2-1.6 and >2.0 had lower adjusted risks in Era 3 than in Era 1. In ACLF-3, DRI 1.2-2.0 had lower risk in Era 3. In the overall ACLF cohort, the 2 categories with DRI >1.6 had significantly higher adjusted risks of 1-y patient death than DRI <1.2. When analyzing hazards within each era, DRI >2.0 carried significantly higher adjusted risks in Eras 1 and 3, whereas DRI 1.2-2.0 had similar adjusted risks throughout the eras. A similar tendency was found in ACLF-1/2. In the non-ACLF cohort, steady improvement of posttransplant outcomes was seen in all DRI categories. Similar results were obtained when only hepatitis C virus-uninfected ACLF patients were evaluated. CONCLUSIONS: In ACLF patients, posttransplant outcomes have improved significantly, and outcomes with higher-risk organs have improved across all ACLF grades.
These results might encourage the use of higher-risk donors in ACLF patients and provide improved access to transplant.

10.
Liver Transpl ; 28(7): 1133-1143, 2022 07.
Article in English | MEDLINE | ID: mdl-35224855

ABSTRACT

Current liver transplantation (LT) organ allocation relies on Model for End-Stage Liver Disease-sodium scores to predict mortality in patients awaiting LT. This study aims to develop neural network (NN) models that more accurately predict LT waitlist mortality. The study evaluates patients listed for LT between February 27, 2002, and June 30, 2021, using the Organ Procurement and Transplantation Network/United Network for Organ Sharing registry. We excluded patients listed with Model for End-Stage Liver Disease (MELD) exception scores and those listed for multiorgan transplant, except for liver-kidney transplant. A subset of the waitlist data comprising 105,140 patients was used to create a model predicting mortality at 90 days after listing. A total of 28 variables were selected for model creation. The data were split by random sampling into training, validation, and test data sets in a 60:20:20 ratio. Model performance was assessed using the area under the receiver operating characteristic curve (AUC-ROC) and the area under the precision-recall curve (AUC-PR). AUC-ROC for 90-day mortality was 0.936 (95% confidence interval [CI], 0.934-0.937), and AUC-PR was 0.758 (95% CI, 0.754-0.762). The NN 90-day mortality model outperformed MELD-based models on both AUC-ROC and AUC-PR. The 90-day mortality model also identified more waitlist deaths, with a higher recall (sensitivity) of 0.807 (95% CI, 0.803-0.811) versus 0.413 (95% CI, 0.409-0.418; p < 0.001). Performance metrics were compared by breaking the test data set into patient subsets by ethnicity, gender, region, age, diagnosis group, and year of listing; the NN 90-day mortality model outperformed MELD-based models across all subsets. In conclusion, organ allocation based on NN modeling has the potential to decrease waitlist mortality and lead to more equitable allocation systems in LT.
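The two metrics reported above can be illustrated on synthetic, imbalanced outcome data (none of it from the study). AUC-PR is the more demanding of the two when the positive outcome, such as 90-day waitlist death, is rare, which is why the abstract reports both:

```python
# Illustration of AUC-ROC vs. AUC-PR on a synthetic, imbalanced outcome.
# The data and score model here are invented stand-ins, not study data.
import numpy as np
from sklearn.metrics import average_precision_score, roc_auc_score

rng = np.random.default_rng(0)
n = 10_000
y = rng.random(n) < 0.05                     # ~5% positive rate (rare outcome)
score = y * 1.0 + rng.normal(0, 0.8, n)      # noisy score, higher for positives

auc_roc = roc_auc_score(y, score)
auc_pr = average_precision_score(y, score)   # AUC-PR penalizes false positives on rare classes
print(f"AUC-ROC={auc_roc:.3f}  AUC-PR={auc_pr:.3f}")
```

On imbalanced data like this, AUC-PR comes out well below AUC-ROC because its baseline is the positive prevalence (here ~0.05) rather than 0.5, mirroring the gap between the reported 0.936 and 0.758.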


Subject(s)
End Stage Liver Disease , Liver Transplantation , End Stage Liver Disease/diagnosis , End Stage Liver Disease/surgery , Humans , Liver Transplantation/adverse effects , Neural Networks, Computer , Severity of Illness Index , Waiting Lists
11.
Am J Transplant ; 22(3): 909-926, 2022 03.
Article in English | MEDLINE | ID: mdl-34780106

ABSTRACT

To extend previous molecular analyses of rejection in liver transplant biopsies in the INTERLIVER study (ClinicalTrials.gov #NCT03193151), the present study aimed to define gene expression selective for parenchymal injury, fibrosis, and steatohepatitis. We analyzed genome-wide microarray measurements from 337 liver transplant biopsies from 13 centers. We examined expression of genes previously annotated as increased in injury and fibrosis using principal component analysis (PCA). PC1 reflected parenchymal injury and related inflammation in the early posttransplant period, slowly regressing over many months. PC2 separated early injury from late fibrosis. Positive PC3 identified a distinct, mildly inflamed state correlating with histologic steatohepatitis. The injury PCs correlated with liver function and histologic abnormalities. A classifier trained on histologic steatohepatitis predicted histologic steatohepatitis with a cross-validated AUC of 0.83 and was associated with pathways reflecting metabolic abnormalities distinct from fibrosis. PC2 predicted histologic fibrosis (AUC = 0.80), as did a molecular fibrosis classifier (AUC = 0.74). The fibrosis classifier correlated with matrix-remodeling pathways with minimal overlap with those selective for steatohepatitis, although some biopsies had both. Genome-wide assessment of liver transplant biopsies can detect not only molecular changes induced by rejection but also those correlating with parenchymal injury, steatohepatitis, and fibrosis, offering potential insights into the mechanisms of primary diseases.
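The analysis pattern described above, PCA on a biopsy-by-gene expression matrix followed by a cross-validated classifier, can be sketched as below. The data are random stand-ins matching only the study's biopsy count (337), so the resulting AUC sits near chance rather than the reported 0.83:

```python
# Sketch of the PCA + cross-validated classifier pattern on synthetic data.
# X mimics a biopsies-by-genes matrix; y is a random binary histology label.
# Nothing here reproduces the INTERLIVER data or results.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(42)
X = rng.normal(size=(337, 1000))       # 337 biopsies x 1000 genes (synthetic)
y = rng.integers(0, 2, size=337)       # histologic steatohepatitis label (synthetic)

pcs = PCA(n_components=3).fit_transform(X)   # PC1-PC3, as interpreted in the abstract

# Cross-validated AUC of a classifier built on the leading components;
# fitting PCA inside the pipeline keeps each fold's test data unseen.
clf = make_pipeline(PCA(n_components=20), LogisticRegression(max_iter=1000))
auc = cross_val_score(clf, X, y, cv=5, scoring="roc_auc").mean()
print(f"cross-validated AUC = {auc:.2f}")
```

Placing the PCA step inside the pipeline, rather than transforming the full matrix first, is what makes the reported AUC genuinely cross-validated.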


Subject(s)
Liver Transplantation , Liver , Biopsy , Fatty Liver , Fibrosis , Graft Rejection , Humans , Liver/pathology , Liver Transplantation/adverse effects , Phenotype
12.
Transpl Int ; 34(12): 2856-2868, 2021 12.
Article in English | MEDLINE | ID: mdl-34580929

ABSTRACT

The impact of hyponatremia on waitlist and post-transplant outcomes following the implementation of MELD-Na-based liver allocation remains unclear. We investigated waitlist and post-liver transplant (LT) outcomes in patients with hyponatremia before and after implementation of MELD-Na-based allocation. Adult patients registered for a primary LT between 2009 and 2021 were identified in the OPTN/UNOS database. Two eras were defined: pre-MELD-Na and post-MELD-Na. Extreme hyponatremia was defined as a serum sodium concentration ≤120 mEq/l. Ninety-day waitlist outcomes and post-LT survival were compared using Fine-Gray proportional hazards and mixed-effects Cox proportional hazards models. A total of 118 487 patients were eligible (n = 64 940, pre-MELD-Na; n = 53 547, post-MELD-Na). In the pre-MELD-Na era, extreme hyponatremia at listing was associated with an increased risk of 90-day waitlist mortality (HR: 3.80; 95% CI: 2.97-4.87; P < 0.001) and a higher transplant probability (HR: 1.67; 95% CI: 1.38-2.01; P < 0.001) relative to patients with normal serum sodium levels (reference: 135-145 mEq/l). In the post-MELD-Na era, relative to the same reference group, the excess waitlist mortality associated with extreme hyponatremia was attenuated (HR: 2.27; 95% CI: 1.60-3.23; P < 0.001) and the transplant probability advantage was larger (HR: 2.12; 95% CI: 1.76-2.55; P < 0.001). Extreme hyponatremia remained associated with a higher risk of 90-, 180-, and 365-day post-LT mortality compared to patients with normal serum sodium levels. With the introduction of MELD-Na-based allocation, waitlist outcomes have improved for patients with extreme hyponatremia, but they continue to have worse short-term post-LT survival.


Subject(s)
Hyponatremia , Liver Transplantation , Adult , Humans , Hyponatremia/etiology , Risk Factors , Sodium , Waiting Lists
14.
Oncotarget ; 12(3): 185-198, 2021 Feb 02.
Article in English | MEDLINE | ID: mdl-33613846

ABSTRACT

Hepatocellular carcinoma (HCC) is the most common primary liver tumor worldwide. Current medical therapy for HCC has limited efficacy. The present study tests the hypothesis that human cerebral endothelial cell-derived exosomes carrying elevated miR-214 (hCEC-Exo-214) can amplify the efficacy of anti-cancer drugs on HCC cells. Treatment of HepG2 and Hep3B cells with hCEC-Exo-214 in combination with the anti-cancer agents oxaliplatin or sorafenib significantly reduced cancer cell viability and invasion compared with monotherapy with either drug. Additionally, the therapeutic effect of the combination therapy was detected in primary tumor cells derived from patients with HCC. The ability of hCEC-Exo-214 to sensitize HCC cells to anti-cancer drugs was specific, in that combination therapy did not affect the viability and invasion of human liver epithelial cells or non-cancer primary cells. Furthermore, compared to monotherapy with oxaliplatin or sorafenib, hCEC-Exo-214 in combination with either drug substantially reduced protein levels of P-glycoprotein (P-gp) and splicing factor 3B subunit 3 (SF3B3) in HCC cells. P-gp and SF3B3 are among the target genes of miR-214 and are known to mediate drug resistance and cancer cell proliferation, respectively. In conclusion, the present in vitro study provides evidence that hCEC-Exo-214 significantly enhances the anti-tumor efficacy of oxaliplatin and sorafenib on HCC cells.

15.
Transplantation ; 105(12): 2571-2578, 2021 12 01.
Article in English | MEDLINE | ID: mdl-33449608

ABSTRACT

BACKGROUND: Graft-versus-host disease (GVHD) after liver transplantation (LT) is a rare but serious complication. The aim of this study was to identify risk factors, including immunosuppressive regimens, for mortality due to GVHD (fatal GVHD). METHODS: Using data from the Organ Procurement and Transplantation Network and United Network for Organ Sharing registry, 77 416 adult patients who underwent LT between 2003 and 2018 were assessed. Risk factors for fatal GVHD were analyzed with a focus on induction and maintenance immunosuppression regimens. RESULTS: The incidence of fatal GVHD was 0.2% (121 of 77 416); 105 (87%) of these patients died within 180 d and 13 (11%) died between 181 d and 1 y. Median survival after LT was 68.0 (49.5-125.5) d. Recipient age minus donor age >20 y (hazard ratio [HR], 2.57; P < 0.001) and basiliximab induction (HR, 1.69; P = 0.018) were independent risk factors for fatal GVHD. Maintenance therapy with mycophenolate mofetil (MMF) was associated with a decrease in fatal GVHD (HR, 0.51; P = 0.001). In the increased-risk cohort of patients with a recipient-donor age discrepancy >20 y, MMF use was associated with a 50% decline in fatal GVHD (HR, 0.50; P < 0.001). CONCLUSIONS: Recipient age minus donor age >20 y remains a significant risk factor for fatal GVHD. The risk of fatal GVHD significantly increases with basiliximab induction and decreases with MMF maintenance. These associations were more pronounced in patients with recipient minus donor age >20 y. These results emphasize the influence of donor age and of individualized immunosuppression regimens on the risk of fatal GVHD.


Subject(s)
Graft vs Host Disease , Hematopoietic Stem Cell Transplantation , Liver Transplantation , Adult , Graft vs Host Disease/diagnosis , Graft vs Host Disease/epidemiology , Graft vs Host Disease/etiology , Hematopoietic Stem Cell Transplantation/adverse effects , Humans , Incidence , Liver Transplantation/adverse effects , Mycophenolic Acid , Risk Factors
16.
Liver Transpl ; 27(7): 971-983, 2021 07.
Article in English | MEDLINE | ID: mdl-33492764

ABSTRACT

Although recent studies have reported favorable outcomes in living donor liver transplantation (LDLT), it remains unclear which populations benefit most from LDLT. This study aims to evaluate LDLT outcomes compared with deceased donor LT (DDLT) according to Model for End-Stage Liver Disease (MELD) score categories. Using data from the United Network for Organ Sharing registry, outcomes were compared between 1486 LDLTs, 13,568 donation after brain death (DBD)-DDLTs, and 1171 donation after circulatory death (DCD)-DDLTs performed between 2009 and 2018. Because LDLT for patients with MELD scores >30 was rare, all patients with scores >30 were excluded to equalize the LDLT and DDLT cohorts. Risk factors for 1-year graft loss (GL) were determined separately for LDLT and DDLT. Compared with LDLT, DBD-DDLT had a lower risk of 30-day (adjusted hazard ratio [aHR], 0.60; P < 0.001) and 1-year GL (aHR, 0.57; P < 0.001). The lower risk of GL was more prominent in the mid-MELD score category (score 15-29). Compared with LDLT, DCD-DDLT had a lower risk of 30-day GL but a comparable risk of 1-year GL, regardless of MELD score category. In LDLT, significant ascites was an independent risk factor for GL in patients with mid-MELD scores (aHR, 1.68; P = 0.02), but not in the lower-MELD score group. The risk of 1-year GL in LDLT patients with ascites who received a left liver was higher than in either those who received a right liver or those without ascites who received a left liver. In LDLT, the combination of a MELD score of 15 to 29, moderate/severe ascites, and use of a left liver is associated with worse outcomes. These findings help calibrate appropriate patient and graft selection in LDLT.


Subject(s)
End Stage Liver Disease , Liver Transplantation , End Stage Liver Disease/diagnosis , End Stage Liver Disease/surgery , Graft Survival , Humans , Liver Transplantation/adverse effects , Living Donors , Retrospective Studies , Severity of Illness Index , Treatment Outcome
17.
Transpl Int ; 34(3): 499-513, 2021 03.
Article in English | MEDLINE | ID: mdl-33423330

ABSTRACT

This study aimed to evaluate possible discrepancies in waitlist outcomes between liver diseases, including alcohol-related liver disease (ALD), nonalcoholic steatohepatitis (NASH), hepatitis C virus infection (HCV), primary biliary cirrhosis (PBC), and primary sclerosing cholangitis (PSC). Patients registered for liver transplantation from January 11, 2016, to June 30, 2018, were evaluated using the OPTN/UNOS registry. Waitlist outcomes were compared between the five disease groups. Patients were categorized by initial MELD-Na score (6-20, 21-29, and ≥30) to identify outcome variations. The prognostic impact of transplantation was assessed according to final MELD-Na scores using Cox regression analysis modeling transplantation as a time-dependent covariate. A total of 6053 patients with ALD, 3814 with NASH, 1558 with HCV, 602 with PBC, and 819 with PSC were eligible. Compared to ALD with comparable MELD-Na scores, NASH with lower (adjusted hazard ratio [aHR] = 1.30, P = 0.042) and mid-range scores (aHR = 1.35, P = 0.008) showed a significantly higher risk of 1-year waitlist mortality, and PBC with higher scores showed a significantly higher risk of 90-day (aHR = 1.69, P = 0.03) and 1-year waitlist mortality (aHR = 1.69, P = 0.02). A positive prognostic impact of transplantation was not seen until a score of 24-27 in ALD, 18-20 in HCV, 15-17 in NASH, and 24-27 in PBC and PSC. There are significant differences in waitlist outcomes among etiologies, which may alter the optimal timing of transplantation.


Subject(s)
Cholangitis, Sclerosing , Liver Cirrhosis, Biliary , Liver Transplantation , Humans , Liver Cirrhosis, Biliary/surgery , Retrospective Studies , Waiting Lists
18.
J Fam Pract ; 70(10): 474-481, 2021 12.
Article in English | MEDLINE | ID: mdl-35119986

ABSTRACT

Direct biomarkers detect alcohol even in small amounts shortly after ingestion. But which one is nearly 100% specific for alcohol use?


Subject(s)
Alcohol Drinking , Alcohol Drinking/adverse effects , Biomarkers , Humans
19.
Am J Transplant ; 21(3): 1100-1112, 2021 03.
Article in English | MEDLINE | ID: mdl-32794649

ABSTRACT

The success of direct-acting antiviral (DAA) therapy has led to near-universal cure for patients chronically infected with hepatitis C virus (HCV) and improved post-liver transplant (LT) outcomes. We investigated the trends and outcomes of retransplantation in HCV and non-HCV patients before and after the introduction of DAAs. Adult patients who underwent re-LT were identified in the Organ Procurement and Transplantation Network/United Network for Organ Sharing database. Multiorgan transplants and patients with >2 total LTs were excluded. Two eras were defined: pre-DAA (2009-2012) and post-DAA (2014-2017). A total of 2112 re-LT patients were eligible (HCV: n = 499 pre-DAA and n = 322 post-DAA; non-HCV: n = 547 pre-DAA and n = 744 post-DAA). HCV patients had improved graft and patient survival after re-LT in the post-DAA era. One-year graft survival was 69.8% pre-DAA and 83.8% post-DAA (P < .001). One-year patient survival was 73.1% pre-DAA and 86.2% post-DAA (P < .001). Graft and patient survival were similar between eras for non-HCV patients. When adjusted, the post-DAA era was an independent positive predictive factor for graft and patient survival (hazard ratio [HR]: 0.67, P = .005; and HR: 0.65, P = .004, respectively) only in HCV patients. This positive post-DAA era effect was observed only in HCV patients whose first graft was lost to disease recurrence (graft: HR 0.31, P = .002; patient: HR 0.32, P = .003). Among HCV patients, receiving a re-LT in the post-DAA era was associated with improved patient and graft survival.


Subject(s)
Hepatitis C, Chronic , Hepatitis C , Adult , Antiviral Agents/therapeutic use , Hepacivirus , Hepatitis C/drug therapy , Hepatitis C/surgery , Hepatitis C, Chronic/drug therapy , Hepatitis C, Chronic/surgery , Humans , Reoperation , Retrospective Studies , United States/epidemiology