Results 1 - 20 of 67
1.
Ann Surg ; 278(3): 441-451, 2023 09 01.
Article in English | MEDLINE | ID: mdl-37389564

ABSTRACT

OBJECTIVE: To examine liver retransplantation (ReLT) over 35 years at a single center. BACKGROUND: Despite the durability of liver transplantation (LT), graft failure affects up to 40% of LT recipients. METHODS: All adult ReLTs from 1984 to 2021 were analyzed. Comparisons were made between ReLTs in the pre- versus post-model for end-stage liver disease (MELD) eras and between ReLTs and primary LTs in the modern era. Multivariate analysis was used for prognostic modeling. RESULTS: Six hundred fifty-four ReLTs were performed in 590 recipients. There were 372 pre-MELD ReLTs and 282 post-MELD ReLTs. Of the ReLT recipients, 89% had one previous LT, whereas 11% had ≥2. Primary nonfunction was the most common indication in the pre-MELD era (33%) versus recurrent disease (24%) in the post-MELD era. Post-MELD ReLT recipients were older (53 vs 48, P = 0.001), had higher MELD scores (35 vs 31, P = 0.01), and had more comorbidities. However, post-MELD ReLT patients had superior 1-, 5-, and 10-year survival compared with pre-MELD ReLT patients (75%, 60%, and 43% vs 53%, 43%, and 35%, respectively, P < 0.001) and lower in-hospital mortality and rejection rates. Notably, in the post-MELD era, the MELD score did not affect survival. We identified the following risk factors for early mortality (≤12 months after ReLT): coronary artery disease, obesity, ventilatory support, older recipient age, and longer pre-ReLT hospital stay. CONCLUSIONS: This represents the largest single-center ReLT report to date. Despite the increased acuity and complexity of ReLT patients, post-MELD era outcomes have improved. With careful patient selection, these results support the efficacy and survival benefit of ReLT in an acuity-based allocation environment.


Subject(s)
End Stage Liver Disease , Liver Transplantation , Adult , Humans , Retrospective Studies , Severity of Illness Index , Graft Survival
2.
Ann Surg ; 278(5): e912-e921, 2023 Nov 01.
Article in English | MEDLINE | ID: mdl-37389552

ABSTRACT

OBJECTIVE: To compare conventional low-temperature storage of transplant donor livers [static cold storage (SCS)] with storage of the organs at physiological body temperature [normothermic machine perfusion (NMP)]. BACKGROUND: The high success rate of liver transplantation is constrained by the shortage of transplantable organs (eg, waiting list mortality >20% in many centers). NMP maintains the liver in a functioning state to improve preservation quality and enable testing of the organ before transplantation. This is of greatest potential value with organs from brain-dead donors (DBD) with risk factors (age and comorbidities), and with those from donors declared dead by cardiovascular criteria (donation after circulatory death). METHODS: Three hundred eighty-three donor organs were randomized by 15 US liver transplant centers to undergo NMP (n = 192) or SCS (n = 191). Two hundred sixty-six donor livers proceeded to transplantation (NMP: n = 136; SCS: n = 130). The primary endpoint of the study was "early allograft dysfunction" (EAD), a marker of early posttransplant liver injury and function. RESULTS: The difference in the incidence of EAD did not achieve significance, with 20.6% (NMP) versus 23.7% (SCS). Using exploratory, "as-treated" rather than "intent-to-treat," subgroup analyses, there was a greater effect size in donation after circulatory death donor livers (22.8% NMP vs 44.6% SCS) and in organs in the highest risk quartile by donor risk (19.2% NMP vs 33.3% SCS). The incidence of acute cardiovascular decompensation at organ reperfusion, "postreperfusion syndrome," as a secondary outcome was reduced in the NMP arm (5.9% vs 14.6%). CONCLUSIONS: NMP did not lower EAD, perhaps related to the inclusion of lower-risk liver donors, as higher-risk donor livers seemed to benefit more. The technology is safe in standard organ recovery and seems to have the greatest benefit for marginal donors.

3.
Liver Transpl ; 28(3): 386-396, 2022 03.
Article in English | MEDLINE | ID: mdl-34482610

ABSTRACT

Liver transplantation (LT) for cholangiocarcinoma (CCA) remains limited to a small number of centers. Although the role of neoadjuvant therapy (NAT) has been explored over time, an in-depth analysis of NAT strategies remains limited. Furthermore, controversy exists regarding acceptable tumor size during patient selection for LT. This study explores the impact of era, tumor size, and NAT strategy on LT outcomes for CCA. We conducted a retrospective review of 53 patients with CCA treated with LT from 1985 to 2019; 19 hilar CCA (hCCA) and 30 intrahepatic CCA (iCCA) were included. The relative contributions of varying NAT (neoadjuvant chemotherapy [NAC], neoadjuvant local therapy [NALT], and combined NAC and NALT [NACLT]) as well as the implications of tumor size and era were analyzed. The primary endpoint was overall survival (OS). Compared with the old era (1985-2007), 5-year OS in patients who underwent LT in the recent era (2008-2019) showed a superior trend. The 5-year OS rates from initial treatment in patients receiving NACLT were 88% for hCCA and 100% for iCCA, versus 9% and 41%, respectively, in patients without it (P = 0.01 for hCCA; P = 0.02 for iCCA), whereas NAC or NALT alone did not show significant differences in OS versus no NAT (P > 0.05). Although 33 patients had large-size tumors (hCCA ≥ 30 mm, n = 12, or iCCA ≥ 50 mm, n = 21), tumor size had no impact on survival outcomes. Outcomes of LT for CCA seem to have improved over time. Multimodal NAT is associated with improved survival in LT for both iCCA and hCCA regardless of tumor size.


Subject(s)
Bile Duct Neoplasms , Cholangiocarcinoma , Liver Transplantation , Bile Duct Neoplasms/surgery , Bile Ducts, Intrahepatic/pathology , Cholangiocarcinoma/surgery , Humans , Liver Transplantation/adverse effects , Neoadjuvant Therapy , Treatment Outcome
4.
Am J Transplant ; 21(2): 614-625, 2021 02.
Article in English | MEDLINE | ID: mdl-32713098

ABSTRACT

Ischemia-reperfusion injury (IRI) is believed to contribute to graft dysfunction after liver transplantation (LT). However, studies on IRI and the impact of early allograft dysfunction (EAD) in IRI grafts are limited. Histological IRI was graded in 506 grafts from patients who had undergone LT and classified based on IRI severity (no, minimal, mild, moderate, and severe). Of the 506 grafts, 87.4% had IRI (no: 12.6%, minimal: 38.1%, mild: 35.4%, moderate: 13.0%, and severe: 0.8%). IRI severity correlated with the incidence of EAD and graft survival at 6 months. Longer cold/warm ischemia time, recipient/donor hypertension, and having a male donor were identified as independent risk factors for moderate to severe IRI. Among 70 grafts with moderate to severe IRI, 42.9% of grafts developed EAD, and grafts with EAD had significantly inferior survival compared to grafts without EAD. Longer cold ischemia time and large droplet macrovesicular steatosis (≥20%) were identified as independent risk factors for EAD. Our study demonstrated that increased IRI severity was correlated with inferior short-term graft outcomes. Careful consideration of IRI risk factors during donor-recipient matching may assist in optimizing graft utilization and LT outcomes. Furthermore, identification of risk factors of IRI-associated EAD may guide patient management and possible timely graft replacement.


Subject(s)
Liver Transplantation , Reperfusion Injury , Allografts , Cold Ischemia/adverse effects , Graft Survival , Humans , Liver Transplantation/adverse effects , Male , Reperfusion Injury/etiology , Risk Factors
5.
J Hepatol ; 74(4): 881-892, 2021 04.
Article in English | MEDLINE | ID: mdl-32976864

ABSTRACT

BACKGROUND & AIMS: Early allograft dysfunction (EAD) following liver transplantation (LT) negatively impacts graft and patient outcomes. Previously we reported that the liver graft assessment following transplantation (L-GrAFT7) risk score was superior to binary EAD or the model for early allograft function (MEAF) score for estimating 3-month graft failure-free survival in a single-center derivation cohort. Herein, we sought to externally validate L-GrAFT7, and compare its prognostic performance to EAD and MEAF. METHODS: Accuracies of L-GrAFT7, EAD, and MEAF were compared in a 3-center US validation cohort (n = 3,201), and a Consortium for Organ Preservation in Europe (COPE) normothermic machine perfusion (NMP) trial cohort (n = 222); characteristics were compared to assess generalizability. RESULTS: Compared to the derivation cohort, patients in the validation and NMP trial cohort had lower recipient median MELD scores; were less likely to require pretransplant hospitalization, renal replacement therapy or mechanical ventilation; and had superior 1-year overall (90% and 95% vs. 84%) and graft failure-free (88% and 93% vs. 81%) survival, with a lower incidence of 3-month graft failure (7.4% and 4.0% vs. 11.1%; p <0.001 for all comparisons). Despite significant differences in cohort characteristics, L-GrAFT7 maintained an excellent validation AUROC of 0.78, significantly superior to binary EAD (AUROC 0.68, p = 0.001) and MEAF scores (AUROC 0.72, p <0.001). In post hoc analysis of the COPE NMP trial, the highest tertile of L-GrAFT7 was significantly associated with time to liver allograft (hazard ratio [HR] 2.17, p = 0.016), Clavien ≥IIIB (HR 2.60, p = 0.034) and ≥IVa (HR 4.99, p = 0.011) complications; post-LT length of hospitalization (p = 0.002); and renal replacement therapy (odds ratio 3.62, p = 0.016). 
CONCLUSIONS: We have validated the L-GrAFT7 risk score as a generalizable, highly accurate, individualized risk assessment of 3-month liver allograft failure that is superior to existing scores. L-GrAFT7 may standardize grading of early hepatic allograft function and serve as a clinical endpoint in translational studies (www.lgraft.com). LAY SUMMARY: Early allograft dysfunction negatively affects outcomes following liver transplantation. In independent multicenter US and European cohorts totaling 3,423 patients undergoing liver transplantation, the liver graft assessment following transplantation (L-GrAFT) risk score is validated as a superior measure of early allograft function that accurately discriminates 3-month graft failure-free survival and post-liver transplantation complications.


Subject(s)
Liver Transplantation , Primary Graft Dysfunction , Risk Assessment , Europe/epidemiology , Female , Graft Survival , Humans , Liver Transplantation/adverse effects , Liver Transplantation/methods , Liver Transplantation/statistics & numerical data , Male , Middle Aged , Outcome and Process Assessment, Health Care/statistics & numerical data , Primary Graft Dysfunction/diagnosis , Primary Graft Dysfunction/epidemiology , Primary Graft Dysfunction/therapy , Prognosis , Reperfusion Injury/diagnosis , Reperfusion Injury/epidemiology , Reperfusion Injury/therapy , Reproducibility of Results , Risk Assessment/methods , Risk Assessment/standards , Risk Factors , Survival Analysis , United States/epidemiology
6.
Liver Transpl ; 25(12): 1778-1789, 2019 12.
Article in English | MEDLINE | ID: mdl-31509643

ABSTRACT

Intestinal microbiota is thought to play an important role in hepatic ischemia/reperfusion injury (IRI) after liver transplantation (LT). Rifaximin, a nonabsorbable antibiotic used to treat encephalopathy, exhibits antibacterial activity within the gut. We report the first study examining the impact of pre-LT rifaximin use on reducing hepatic IRI and inflammatory cell infiltration after LT. This retrospective single-center study included adult LT recipients from January 2013 through June 2016. Patients were divided into 2 groups based on duration of rifaximin use before LT: rifaximin group (≥28 days) and control group (none or <28 days). Patients receiving other antibiotics within 28 days of LT and re-LTs were excluded. Outcomes and messenger RNA (mRNA) expression in the graft were compared by 1:1 propensity score-matching and multivariate analyses. On 1:1 matching (n = 39/group), rifaximin patients had lower postoperative serum transaminase levels and a lower incidence of early allograft dysfunction (EAD; 10.3% versus 33.3%; P = 0.014). Of the matched patients, 8 patients (n = 4/group) had postreperfusion liver biopsies (approximately 2 hours after reperfusion) available for mRNA analysis. Hepatic expression of CD86 (macrophage marker) and cathepsin G (neutrophil marker) was significantly lower in rifaximin patients than controls (P < 0.05). The multivariate analysis included 458 patients. Rifaximin treatment <28 days was identified as an independent risk factor for EAD in all patients and in those with a high Model for End-Stage Liver Disease (MELD) score (MELD ≥35; n = 230). In conclusion, the propensity score-matched and multivariate analyses suggest a therapeutic role of rifaximin in reducing EAD. Pre-LT rifaximin administration exerted a protective function against early liver injury, potentially by suppressing inflammatory cell activation in the graft.


Subject(s)
Antibiotic Prophylaxis/methods , Gastrointestinal Microbiome/drug effects , Graft Rejection/epidemiology , Liver Transplantation/adverse effects , Postoperative Complications/epidemiology , Reperfusion Injury/epidemiology , Rifaximin/administration & dosage , Adult , Aged , Allografts/blood supply , Allografts/pathology , Antibiotic Prophylaxis/statistics & numerical data , Biomarkers/analysis , Biopsy , Drug Administration Schedule , Female , Graft Rejection/diagnosis , Graft Rejection/etiology , Graft Rejection/prevention & control , Graft Survival , Humans , Liver/blood supply , Liver/pathology , Liver Function Tests , Male , Middle Aged , Postoperative Complications/diagnosis , Postoperative Complications/etiology , Postoperative Complications/prevention & control , Preoperative Care/methods , Preoperative Care/statistics & numerical data , Propensity Score , Reperfusion/adverse effects , Reperfusion Injury/diagnosis , Reperfusion Injury/etiology , Reperfusion Injury/prevention & control , Retrospective Studies , Risk Factors , Survival Rate , Time Factors , Young Adult
7.
Clin Transplant ; 33(6): e13569, 2019 06.
Article in English | MEDLINE | ID: mdl-31006141

ABSTRACT

BACKGROUND: Kidney delayed graft function (kDGF) remains a challenging problem following simultaneous liver and kidney transplantation (SLKT) with a reported incidence up to 40%. Given the scarcity of renal allografts, it is crucial to minimize the development of kDGF among SLKT recipients to improve patient and graft outcomes. We sought to assess the role of preoperative recipient and donor/graft factors on developing kDGF among recipients of SLKT. METHODS: A retrospective review of 194 patients who received SLKT in the period from January 2004 to March 2017 in a single center was performed to assess the effect of preoperative factors on the development of kDGF. RESULTS: Kidney delayed graft function was observed in 95 patients (49%). Multivariate analysis revealed that donor history of hypertension, cold static preservation of kidney grafts [versus using hypothermic pulsatile machine perfusion (HPMP)], donor final creatinine, physiologic MELD, and duration of delay of kidney transplantation after liver transplantation were significant independent predictors for kDGF. kDGF is associated with worse graft function and patient and graft survival. CONCLUSIONS: Kidney delayed graft function has detrimental effects on graft function and graft survival. Understanding the risks and combining careful perioperative patient management, proper recipient selection and donor matching, and graft preservation using HPMP would decrease kDGF among SLKT recipients.


Subject(s)
Cold Temperature , Delayed Graft Function/epidemiology , Graft Survival , Kidney Transplantation/methods , Liver Transplantation/methods , Organ Preservation/methods , Risk Assessment/methods , Adult , Delayed Graft Function/physiopathology , Female , Follow-Up Studies , Humans , Incidence , Male , Middle Aged , Perfusion , Predictive Value of Tests , Prospective Studies , Retrospective Studies , Young Adult
8.
Ann Surg ; 262(3): 536-45; discussion 543-5, 2015 Sep.
Article in English | MEDLINE | ID: mdl-26258323

ABSTRACT

OBJECTIVES: To evaluate the rate, effect, and predictive factors of a complete pathologic response (cPR) in patients with hepatocellular carcinoma (HCC) undergoing locoregional therapy (LRT) before liver transplantation (LT). BACKGROUND: Eligible patients with HCC receive equal model for end-stage liver disease prioritization, despite variable risks of tumor progression, waitlist dropout, and posttransplant recurrence. Pretransplant LRT mitigates these risks by inducing tumor necrosis. METHODS: Comparisons were made between HCC recipients with cPR (n = 126) and without cPR (n = 375) receiving pre-LT LRT (1994-2013). Multivariable predictors of cPR were identified. RESULTS: Of 501 patients, 272, 148, and 81 received 1, 2, and ≥3 LRT treatments, respectively. The overall, recurrence-free, and disease-specific survival rates at 1, 3, and 5 years were 86%, 71%, 63%; 84%, 67%, 60%; and 97%, 90%, 87%. Compared with recipients without cPR, cPR patients had significantly lower laboratory model for end-stage liver disease scores, pretransplant alpha fetoprotein, and cumulative tumor diameters; were more likely to have 1 lesion, tumors within Milan/University of California, San Francisco (UCSF) criteria, LRT that included ablation, and a favorable tumor response to LRT; and had superior 1-, 3-, and 5-year recurrence-free survival (92%, 79%, and 73% vs 81%, 63%, and 56%; P = 0.006) and disease-specific survival (100%, 100%, and 99% vs 96%, 89%, and 86%; P < 0.001) with only 1 cancer-specific death and fewer recurrences (2.4% vs 15.2%; P < 0.001). Multivariate predictors of cPR included a favorable post-LRT radiologic/alpha fetoprotein tumor response, longer time interval from LRT to LT, and lower model for end-stage liver disease score and maximum tumor diameter (C-statistic 0.75). CONCLUSIONS: Achieving cPR in patients with HCC receiving LRT strongly predicts tumor-free survival. 
Factors predicting cPR are identified, allowing for differential prioritization of HCC recipients based on their variable risks of post-LT recurrence. Improving LRT strategies to maximize cPR would enhance posttransplant cancer outcomes.


Subject(s)
Carcinoma, Hepatocellular/pathology , Carcinoma, Hepatocellular/surgery , Liver Neoplasms/pathology , Liver Neoplasms/surgery , Liver Transplantation/methods , Neoadjuvant Therapy/methods , Adult , Aged , Biopsy, Needle , Carcinoma, Hepatocellular/mortality , Carcinoma, Hepatocellular/therapy , Databases, Factual , Disease-Free Survival , Female , Follow-Up Studies , Graft Rejection , Graft Survival , Humans , Immunohistochemistry , Liver Neoplasms/mortality , Liver Neoplasms/therapy , Liver Transplantation/mortality , Logistic Models , Male , Middle Aged , Multivariate Analysis , Predictive Value of Tests , Preoperative Care/methods , Retrospective Studies , Statistics, Nonparametric , Survival Analysis , Time Factors , Treatment Outcome
9.
J Surg Res ; 197(1): 183-90, 2015 Jul.
Article in English | MEDLINE | ID: mdl-25940156

ABSTRACT

BACKGROUND: Clinical and laboratory criteria are not reliable predictors of deceased donor liver graft quality. Intraoperative assessment of experienced surgeons is the gold standard. Standardizing and quantifying this assessment is especially needed now that regional sharing is the rule. We prospectively evaluated a novel, simple, rapid, noninvasive, quantitative measure of liver function performed before graft procurement. MATERIALS AND METHODS: Using a portable, finger-probe-based device, indocyanine green plasma disappearance rates (ICG-PDR) were measured in adult brain-dead donors in the local donor service area before organ procurement. Results were compared with graft function and outcomes. Both donor and recipient teams were blinded to ICG-PDR measurements. RESULTS: Measurements were performed on 53 consecutive donors. Eleven liver grafts were declined by all centers because of quality; the other 42 grafts were transplanted. Logistic regression analysis showed ICG-PDR to be the only donor variable to be significantly associated with 7-d graft survival. Donor risk index, donor age, and transaminase levels at peak or procurement were not significantly associated with 7-d graft survival. CONCLUSIONS: We report the successful use of a portable quantitative means of measuring liver function and its association with graft survival. These data warrant further exploration in a variety of settings to evaluate acceptable values for donated liver grafts.


Subject(s)
Donor Selection/methods , Fluorescent Dyes , Graft Survival , Indocyanine Green , Liver Transplantation , Liver/physiology , Adult , Aged , Aged, 80 and over , Double-Blind Method , Female , Humans , Liver Function Tests , Logistic Models , Male , Middle Aged , Outcome Assessment, Health Care , Prospective Studies , Reproducibility of Results
10.
Curr Opin Organ Transplant ; 20(2): 121-6, 2015 Apr.
Article in English | MEDLINE | ID: mdl-25856175

ABSTRACT

PURPOSE OF REVIEW: The purpose of this article is to identify the unique aspects of combined multiorgan and vascularized composite allograft (VCA) procurement from deceased donors and outline the steps essential for success. RECENT FINDINGS: Transplantation of nonsolid organ composite tissues is becoming a viable option for reconstruction of massive tissue defects. With the United Network for Organ Sharing designation of VCAs as organs, placing them under the domain of the Organ Procurement and Transplantation Network, a systematic method for combined solid organ and VCA procurement is required. Several centers have reported experience with successful procurement strategies including sequential and simultaneous retrievals. The published literature describing donor screening, sequence of procurement with relation to solid organs and allocation is reviewed. SUMMARY: With the 2013 classification of VCAs as organs, the Organ Procurement and Transplantation Network and United Network for Organ Sharing are better suited to aligning procurement and allocation policies. As VCA transplantation becomes more commonplace, protocol guidelines will ensure smooth integration with existing procurement infrastructure.


Subject(s)
Tissue and Organ Procurement , Abdominal Wall , Animals , Humans , Kidney Failure, Chronic/surgery , Organ Transplantation , Tissue Donors , Tissue Transplantation
11.
Gastroenterol Clin North Am ; 53(3): 453-459, 2024 Sep.
Article in English | MEDLINE | ID: mdl-39068006

ABSTRACT

The history of intestinal transplantation can be traced back to the turn of the twentieth century. Although advancements have been made, the intestine still presents a greater challenge to transplantation than do other solid organs, experiencing higher rates of graft rejection and lower long-term survival. Increasingly, intestinal re-transplantation (re-ITx) is seen as a viable option and is now the fourth most common indication for ITx. Changes to immunosuppression protocols, technical modifications, and infectious disease monitoring have contributed to improved outcomes. The authors review the literature on re-ITx with regard to its history, management considerations, and future directions.


Subject(s)
Graft Rejection , Intestines , Reoperation , Humans , Intestines/transplantation , Graft Rejection/prevention & control , Immunosuppressive Agents/therapeutic use , Organ Transplantation/methods , Graft Survival
12.
Ann Surg ; 258(3): 409-21, 2013 Sep.
Article in English | MEDLINE | ID: mdl-24022434

ABSTRACT

OBJECTIVE: To analyze a 28-year single-center experience with orthotopic liver transplantation (OLT) for patients with irreversible liver failure. BACKGROUND: The implementation of the model for end-stage liver disease (MELD) in 2002 represented a fundamental shift in liver donor allocation to recipients with the highest acuity, raising concerns about posttransplant outcome and morbidity. METHODS: Outcomes and factors affecting survival were analyzed in 5347 consecutive OLTs performed in 3752 adults and 822 children between 1984 and 2012, including comparisons of recipient and donor characteristics, graft and patient outcomes, and postoperative morbidity before (n = 3218) and after (n = 2129) implementation of the MELD allocation system. Independent predictors of survival were identified. RESULTS: Overall, 1-, 5-, 10-, and 20-year patient and graft survival estimates were 82%, 70%, 63%, 52%, and 73%, 61%, 54%, 43%, respectively. Recipient survival was best in children with biliary atresia and worst in adults with malignancy. Post-MELD era recipients were older (54 vs 49, P < 0.001), more likely to be hospitalized (50% vs 47%, P = 0.026) and receiving pretransplant renal replacement therapy (34% vs 12%, P < 0.001), and had significantly greater laboratory MELD scores (28 vs 19, P < 0.001), longer wait-list times (270 days vs 186 days, P < 0.001), and pretransplant hospital stays (10 days vs 8 days, P < 0.001). Despite increased acuity, post-MELD era recipients achieved superior 1-, 5-, and 10-year patient survival (82%, 70%, and 65% vs 77%, 66%, and 58%, P < 0.001) and graft survival (78%, 66%, and 61% vs 69%, 58%, and 51%, P < 0.001) compared with pre-MELD recipients. Of 17 recipient and donor variables, era of transplantation, etiology of liver disease, recipient and donor age, prior transplantation, MELD score, hospitalization at time of OLT, and cold and warm ischemia time were independent predictors of survival. 
CONCLUSIONS: We present the world's largest reported single-institution experience with OLT. Despite increasing acuity in post-MELD era recipients, patient and graft survival continues to improve, justifying the "sickest first" allocation approach.


Subject(s)
End Stage Liver Disease/surgery , Liver Transplantation , Severity of Illness Index , Adolescent , Adult , Aged , Aged, 80 and over , Anti-Infective Agents/therapeutic use , Antibiotic Prophylaxis/methods , Antibiotic Prophylaxis/trends , Child , Child, Preschool , Drug Therapy, Combination , End Stage Liver Disease/mortality , Female , Follow-Up Studies , Graft Survival , Humans , Immunosuppression Therapy/methods , Immunosuppression Therapy/trends , Infant , Infant, Newborn , Liver Transplantation/mortality , Liver Transplantation/trends , Male , Middle Aged , Perioperative Care/methods , Perioperative Care/trends , Postoperative Complications/epidemiology , Postoperative Complications/prevention & control , Reoperation/statistics & numerical data , Reoperation/trends , Retrospective Studies , Survival Analysis , Treatment Outcome , Young Adult
13.
Liver Transpl ; 19(4): 437-49, 2013 Apr.
Article in English | MEDLINE | ID: mdl-23408461

ABSTRACT

An accurate clinical assessment of hepatic steatosis before transplantation is critical for successful outcomes after liver transplantation, especially if a pathologist is not available at the time of procurement. This prospective study investigated the surgeon's accuracy in predicting hepatic steatosis and organ quality in 201 adult donor livers. A steatosis assessment by a blinded expert pathologist served as the reference gold standard. The surgeon's steatosis estimate correlated more strongly with large-droplet macrovesicular steatosis [ld-MaS; nonparametric Spearman correlation coefficient (rS) = 0.504] versus small-droplet macrovesicular steatosis (sd-MaS; rS = 0.398). True microvesicular steatosis was present in only 2 donors (1%). Liver texture criteria (yellowness, absence of scratch marks, and round edges) were mainly associated with ld-MaS (variance = 0.619) and were less associated with sd-MaS (variance = 0.264). The prediction of ≥30% ld-MaS versus <30% ld-MaS was excellent when liver texture criteria were used (accuracy = 86.2%), but it was less accurate when the surgeon's direct estimation of the steatosis percentage was used (accuracy = 75.5%). The surgeon's quality grading correlated with the degree of ld-MaS and the surgeon's steatosis estimate as well as the incidence of poor initial function and primary nonfunction. In conclusion, the precise estimation of steatosis remains challenging even in experienced hands. Liver texture characteristics are more helpful in identifying macrosteatotic organs than the surgeon's actual perception of steatosis. These findings are especially important when histological assessment is not available at the donor's hospital.


Subject(s)
Donor Selection , Fatty Liver/pathology , Liver Transplantation , Pathology, Surgical/methods , Tissue Donors , Adolescent , Adult , Aged , Biopsy , Decision Support Techniques , Double-Blind Method , Female , Humans , Male , Middle Aged , Multivariate Analysis , Observer Variation , Predictive Value of Tests , Prospective Studies , Reproducibility of Results , Severity of Illness Index , Young Adult
14.
Ann Surg ; 256(4): 624-33, 2012 Oct.
Article in English | MEDLINE | ID: mdl-22964732

ABSTRACT

OBJECTIVE: To analyze incidence, outcomes, and utilization of health care resources in liver transplantation (LT) for nonalcoholic steatohepatitis (NASH). SUMMARY OF BACKGROUND DATA: With the epidemic of obesity and metabolic syndrome in nearly 33% of the US population, NASH is projected to become the leading indication for LT in the next several years. Data on predictors of outcome and utilization of health care resources after LT in NASH is limited. METHODS: We conducted an analysis from our prospective database of 144 adult NASH patients who underwent LT between December 1993 and August 2011. Outcomes and resource utilization were compared with other common indications for LT. Independent predictors of graft and patient survival were identified. RESULTS: The average Model for End-Stage Liver Disease score was 33. The frequency of NASH as the primary indication for LT increased from 3% in 2002 to 19% in 2011 to become the second most common indication for LT at our center behind hepatitis C. NASH patients had significantly longer operative times (402 vs 322 minutes; P < 0.001), operative blood loss (18 vs 14 packed red blood cell units; P = 0.001), and posttransplant length of stay (35 vs 29 days; P = 0.032), but 1-, 3-, and 5-year graft (81%, 71%, 63%) and patient (84%, 75%, 70%) survival were comparable with other diagnoses. Age greater than 55 years, pretransplant intubation, dialysis, hospitalization, presence of hepatocellular carcinoma on explant, donor age greater than 55 years, and cold ischemia time greater than 550 minutes were significant independent predictors of survival for all patients, whereas body mass index greater than 35 was a predictor in NASH patients only. CONCLUSIONS: We report the largest single institution experience of LT for NASH. Over a 10-year period, the frequency of LT for NASH has increased 5-fold. 
Although outcomes are comparable with LT for other indications, health care resources are stressed significantly by this new and increasing group of transplant candidates.


Subject(s)
Fatty Liver/surgery , Liver Transplantation , Adult , Erythrocyte Transfusion/statistics & numerical data , Fatty Liver/epidemiology , Fatty Liver/mortality , Female , Follow-Up Studies , Graft Survival , Humans , Incidence , Length of Stay/statistics & numerical data , Linear Models , Liver Transplantation/mortality , Liver Transplantation/statistics & numerical data , Los Angeles/epidemiology , Male , Middle Aged , Multivariate Analysis , Non-alcoholic Fatty Liver Disease , Operative Time , Retrospective Studies , Survival Analysis , Treatment Outcome
15.
Curr Opin Organ Transplant ; 16(3): 269-73, 2011 Jun.
Article in English | MEDLINE | ID: mdl-21467935

ABSTRACT

PURPOSE OF REVIEW: Results of surgical innovations using partial liver grafts from deceased donors have improved the availability of transplantable organs. However, current data on outcomes after split liver transplantation (SLT) are conflicting. This article reviews the current state of SLT, focusing on long-term outcomes and predictors of patient and graft survival after SLT. RECENT FINDINGS: Conventional SLT has proven to be a durable, life-saving procedure. Early results for full left-right SLT for two adults are promising, but this technique has not yet shown efficacy for wide application. Predictors of diminished patient survival after SLT included the use of split grafts in critically ill recipients (model for end-stage liver disease score >30), retransplant patients, cold ischemia time more than 10 h, and the performance of SLT in low-volume liver transplant centers. SUMMARY: Conventional SLT performed in specialized centers resulted in long-term survival outcomes comparable with whole-organ liver transplantation. Full left-right SLT for two adults remains experimental. Splitting of the liver is an effective approach to expand the donor pool and remains an untapped resource for patients in need of liver transplantation. Split graft-to-recipient pairing is crucial for optimal organ allocation and survival outcomes after liver transplantation.


Subject(s)
Liver Transplantation/methods , Tissue Donors/supply & distribution , Graft Survival , Humans , Liver Transplantation/adverse effects , Liver Transplantation/mortality , Risk Assessment , Risk Factors , Survival Rate , Time Factors , Treatment Outcome
16.
Ann Surg ; 250(3): 484-93, 2009 Sep.
Article in English | MEDLINE | ID: mdl-19730179

ABSTRACT

OBJECTIVE(S): Death occurs in half of all children with fulminant hepatic failure (FHF). Although liver transplantation (LT) is potentially life-saving, there are only a few published series with limited experience. The aim was to examine predictors of survival after LT for FHF. METHODS: Between 1984 and 2008, all LTs for FHF performed in recipients less than or equal to 18 years of age were analyzed from a prospectively maintained database using 35 demographic, laboratory, and operative variables. Unique calculated variables included creatinine clearance (cCrCl) and Pediatric End-Stage Liver Disease score (PELD). Study end-points were patient and death-censored graft survival. Median follow-up was 98 months. Statistical analysis involved the log-rank test and Cox proportional hazards model. RESULTS: A total of 122 children underwent 159 LTs. Cryptogenic disease was the primary etiology (70%), and the median age was 53 months. The significant (P < 0.05) univariate predictors of worse graft survival were: recipient age <24 months, cCrCl <60 mL/min/1.73 m², PELD >25 points, and warm ischemia time >60 minutes. The significant (P < 0.05) univariate predictors of worse patient survival were: recipient African-American and Asian race, recipient age <24 months, cCrCl <60 mL/min/1.73 m², and time from onset of jaundice to encephalopathy <7 days. On multivariate analysis, survival was significantly impacted by 4 variables: cCrCl <60 mL/min/1.73 m² (GRAFT and PATIENT), PELD >25 points (GRAFT), recipient age <24 months (GRAFT), and time from onset of jaundice to encephalopathy <7 days (PATIENT). While overall 5- and 10-year survival was 73% and 72% (GRAFT) and 77% and 73% (PATIENT), these were significantly worse when a combination of multivariate risk factors was present. CONCLUSIONS: These data from a large, single-center experience demonstrate that LT is the treatment of choice for FHF and results in durable survival. Analysis revealed 4 novel outcome predictors. 
Young children with rapid-onset acute liver failure are a high-risk subpopulation. Unique to this study, cCrCl and PELD accurately predicted the end-points. This analysis identifies patient subpopulations requiring early aggressive intervention with LT.
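The 5- and 10-year survival figures reported in abstracts like the one above are typically Kaplan-Meier estimates. A minimal pure-Python sketch of the product-limit estimator, for illustration only (real analyses would use a dedicated statistics package):

```python
from collections import Counter

def kaplan_meier(times, events):
    """Kaplan-Meier product-limit survival estimate.

    times:  follow-up time for each subject
    events: 1 if the endpoint (e.g., graft loss) occurred, 0 if censored
    Returns a list of (time, survival probability) at each event time.
    """
    deaths = Counter(t for t, e in zip(times, events) if e == 1)
    curve, surv = [], 1.0
    for t in sorted(deaths):
        n_at_risk = sum(1 for ti in times if ti >= t)  # still under observation at t
        surv *= 1.0 - deaths[t] / n_at_risk            # step down at each event time
        curve.append((t, surv))
    return curve

# Three subjects: events at months 1 and 2, one censored at month 3.
# kaplan_meier([1, 2, 3], [1, 1, 0]) -> [(1, 2/3), (2, 1/3)]
```

The log-rank test mentioned in the methods then compares two such curves by contrasting observed versus expected events at each event time.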


Subject(s)
Liver Failure, Acute/surgery , Liver Transplantation , Adolescent , Child , Female , Graft Survival , Humans , Liver Failure, Acute/mortality , Liver Function Tests , Liver Transplantation/mortality , Male , Proportional Hazards Models , Prospective Studies , Risk Factors , Survival Rate , Treatment Outcome
17.
Liver Transpl ; 15(11): 1525-34, 2009 Nov.
Article in English | MEDLINE | ID: mdl-19877207

ABSTRACT

Hepatitis B virus (HBV) reinfection and recurrence of hepatocellular carcinoma (HCC) after orthotopic liver transplantation (OLT) are associated with increased graft failure and reduced patient survival. We evaluated the effects of both HCC recurrence and HBV reinfection on the long-term survival of these patients after OLT. One hundred seventy-five patients underwent OLT for HBV-related liver diseases and were the subjects of this retrospective study. We assessed risk factors for HBV reinfection, HCC recurrence, and survival post-OLT using univariate and multivariate analyses. During a mean follow-up of 43.0 ± 42.0 months, 88 of 175 (50.3%) patients transplanted for HBV-related liver disease had HCC prior to OLT. Thirteen (14.8%) of these patients had HCC recurrence after OLT. The mean time to recurrence of HCC was 26.1 ± 31.9 months. Twelve of 175 (6.9%) patients developed HBV reinfection after liver transplantation. The mean time to HBV reinfection was 28.7 ± 26.4 months. Ten of these 12 (83.3%) patients had HCC prior to OLT, and 5 (50%) developed recurrence of HCC. On multivariate analyses, pre-OLT HCC and recurrence of HCC post-OLT were significantly associated with HBV reinfection after transplantation (P = 0.031 and P < 0.001, respectively). HCC recurrence after OLT was associated with lymphovascular invasion (P < 0.001) and post-OLT chemotherapy (P ≤ 0.001). The 3- and 5-year survival rates were significantly decreased in patients with HBV reinfection (P = 0.007) and in patients with HCC recurrence after OLT (P = 0.03). In conclusion, pre-OLT HCC and HCC recurrence after transplantation were associated with HBV reinfection and with decreased patient survival. Hepatitis B immunoglobulin and antiviral therapy were only partially effective in preventing HBV reinfection in patients with HCC recurrence.


Subject(s)
Carcinoma, Hepatocellular/mortality , Hepatitis B, Chronic/mortality , Liver Neoplasms/mortality , Liver Transplantation/mortality , Neoplasm Recurrence, Local/mortality , Postoperative Complications/mortality , Adult , Antibodies, Viral/blood , Antiviral Agents/therapeutic use , Carcinoma, Hepatocellular/surgery , Databases, Factual , Disease-Free Survival , Female , Follow-Up Studies , Hepatitis B Surface Antigens/immunology , Hepatitis B, Chronic/diagnosis , Hepatitis B, Chronic/drug therapy , Humans , Liver Neoplasms/surgery , Liver Transplantation/statistics & numerical data , Male , Middle Aged , Multivariate Analysis , Recurrence , Retrospective Studies , Survival Analysis
18.
JAMA Surg ; 154(5): 431-439, 2019 05 01.
Article in English | MEDLINE | ID: mdl-30758485

ABSTRACT

Importance: Anastomotic biliary complications (ABCs) constitute the most common technical complications in liver transplant (LT). Given the ever-increasing acuity of LT, identification of factors contributing to ABCs is essential to minimize morbidity and optimize outcomes. A detailed analysis in a patient population undergoing high-acuity LT is lacking. Objective: To evaluate the rate of, risk factors for, and outcomes of ABCs and acuity level in LT recipients. Design, Setting, and Participants: This retrospective cohort study included adult LT recipients from January 1, 2013, through June 30, 2016, at a single large urban transplant center. Patients were followed up for at least 12 months after LT until June 30, 2017. Of 520 consecutive adult patients undergoing LT, 509 LTs in 503 patients were included. Data were analyzed from May 1 through September 13, 2017. Exposure: Liver transplant. Main Outcomes and Measures: Any complications occurring at the level of the biliary reconstruction. Results: Among the 503 transplant recipients undergoing 509 LTs included in the analysis (62.3% male; median age, 58 years [interquartile range {IQR}, 50-63 years]), median follow-up was 24 months (IQR, 16-34 months). Overall patient and graft survival at 1 year were 91.1% and 90.3%, respectively. The median Model for End-stage Liver Disease (MELD) score was 35 (IQR, 15-40) for the entire cohort. T tubes were used in 199 LTs (39.1%) during initial bile duct reconstruction. The overall incidence of ABCs was 20.2% (103 LTs). Anastomotic leak occurred in 25 LTs (4.9%) and stricture in 77 (15.1%). Exit-site leak in T tubes occurred in 36 (7.1%) and T tube obstruction in 16 (3.1%). Seventeen patients with ABCs required surgical revision of bile duct reconstruction. 
Multivariate analysis revealed the following 7 independent risk factors for ABCs: recipient hepatic artery thrombosis (odds ratio [OR], 12.41; 95% CI, 2.37-64.87; P = .003), second LT (OR, 4.05; 95% CI, 1.13-14.50; P = .03), recipient hepatic artery stenosis (OR, 3.81; 95% CI, 1.30-11.17; P = .02), donor hypertension (OR, 2.79; 95% CI, 1.27-6.11; P = .01), recipients with hepatocellular carcinoma (OR, 2.66; 95% CI, 1.23-5.74; P = .01), donor death due to anoxia (OR, 2.61; 95% CI, 1.13-6.03; P = .03), and use of nonabsorbable suture material for biliary reconstruction (OR, 2.45; 95% CI, 1.09-5.54; P = .03). Conclusions and Relevance: This large, single-center series identified physiologic and anatomical independent risk factors contributing to ABCs after high-acuity LT. Careful consideration of these factors could guide perioperative management and mitigate potentially preventable ABCs.
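The odds ratios and 95% CIs above come from a multivariate model, but the underlying computation for a single risk factor from a 2×2 table can be sketched in a few lines. This is an unadjusted Woolf (log-normal approximation) interval with hypothetical counts, shown only to illustrate how such estimates are formed:

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Unadjusted odds ratio and 95% CI from a 2x2 table:
        a = exposed with complication,    b = exposed without
        c = unexposed with complication,  d = unexposed without
    Uses the Woolf (log-normal) approximation for the interval.
    """
    or_ = (a * d) / (b * c)
    se_log_or = math.sqrt(1/a + 1/b + 1/c + 1/d)  # SE of log(OR)
    lo = math.exp(math.log(or_) - z * se_log_or)
    hi = math.exp(math.log(or_) + z * se_log_or)
    return or_, lo, hi

# Hypothetical counts: 10/15 exposed vs 2/10 unexposed with the complication.
# odds_ratio_ci(10, 5, 2, 8) -> OR = 8.0, 95% CI approximately (1.2, 52.7)
```

Multivariate logistic regression, as used in the study, additionally adjusts each OR for the other covariates, which a simple 2×2 calculation cannot do.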


Subject(s)
Bile Ducts/surgery , Biliary Tract Surgical Procedures/adverse effects , Liver Transplantation/adverse effects , Postoperative Complications/epidemiology , Anastomosis, Surgical/adverse effects , Egypt/epidemiology , Female , Follow-Up Studies , Graft Survival , Humans , Incidence , Liver Failure/surgery , Male , Middle Aged , Retrospective Studies , Risk Factors , Survival Rate/trends
19.
Liver Int ; 28(8): 1087-94, 2008 Sep.
Article in English | MEDLINE | ID: mdl-18662279

ABSTRACT

OBJECTIVE: With an increasing number of liver transplant recipients surviving long term, understanding quality-of-life issues is essential. Our goal was to identify pretransplant variables associated with post-transplant quality of life in liver transplant recipients. METHODS: Three hundred and eight liver transplant recipients were administered the Short Form 36 and a basic demographic questionnaire. Variables associated with post-transplant quality of life were studied in a multivariate regression analysis. Interaction terms were used to examine effect modification. RESULTS: Male gender, longer pretransplant work hours, and the interaction term between work hours and male gender were independently associated with Physical Functioning. Work hours positively correlated with Role-Physical, while viral hepatitis and ascites were negatively associated with Role-Physical. Ascites and viral hepatitis were independently negatively associated with Bodily Pain. Encephalopathy, hepatocellular carcinoma, and viral hepatitis were independently associated with General Health. Ascites was also negatively associated with Social Functioning, Role-Emotional, Bodily Pain, General Health, and Vitality. Viral hepatitis was negatively correlated with Vitality and Mental Functioning. CONCLUSIONS: Pretransplant variables such as ascites, encephalopathy, hepatocellular carcinoma, viral hepatitis, work hours, time unable to work, and gender were significantly associated with post-transplant quality of life in liver transplant recipients. Interventions addressing these issues may be initiated to improve post-transplant quality of life.


Subject(s)
Liver Transplantation , Quality of Life , Adult , Cross-Sectional Studies , Female , Humans , Male , Middle Aged , Multivariate Analysis , Regression Analysis , Surveys and Questionnaires
20.
JAMA Surg ; 153(5): 436-444, 2018 05 01.
Article in English | MEDLINE | ID: mdl-29261831

ABSTRACT

Importance: Early allograft dysfunction (EAD) following a liver transplant (LT) unequivocally portends adverse graft and patient outcomes, but a widely accepted classification or grading system is lacking. Objective: To develop a model for individualized risk estimation of graft failure after LT and then compare the model's prognostic performance with the existing binary EAD definition (bilirubin level of ≥10 mg/dL on postoperative day 7, international normalized ratio of ≥1.6 on postoperative day 7, or aspartate aminotransferase or alanine aminotransferase level of >2000 U/L within the first 7 days) and the Model for Early Allograft Function (MEAF) score. Design, Setting, and Participants: This retrospective single-center analysis used a transplant database to identify all adult patients who underwent a primary LT and had data on 10 days of post-LT laboratory variables at the Dumont-UCLA Transplant Center of the David Geffen School of Medicine at UCLA between February 1, 2002, and June 30, 2015. Data collection took place from January 4, 2016, to June 30, 2016. Data analysis was conducted from July 1, 2016, to August 30, 2017. Main Outcomes and Measures: Three-month graft failure-free survival. Results: Of 2021 patients who underwent primary LT over the study period, 2008 (99.4%) had available perioperative data and were included in the analysis. The median (interquartile range [IQR]) age of recipients was 56 (49-62) years, and 1294 recipients (64.4%) were men. Overall survival and graft failure-free survival rates were 83% and 81% at year 1, 74% and 71% at year 3, and 69% and 65% at year 5, with an 11.1% (222 recipients) incidence of 3-month graft failure or death. 
Multivariate factors associated with 3-month graft failure-free survival included post-LT aspartate aminotransferase level, international normalized ratio, bilirubin level, and platelet count, measures of which were used to calculate the Liver Graft Assessment Following Transplantation (L-GrAFT) risk score. The L-GrAFT model had an excellent C statistic of 0.85, with a significantly superior discrimination of 3-month graft failure-free survival compared with the existing EAD definition (C statistic, 0.68; P < .001) and the MEAF score (C statistic, 0.70; P < .001). Compared with patients with lower L-GrAFT risk, LT recipients in the highest 10th percentile of L-GrAFT scores had higher Model for End-Stage Liver Disease scores (median [IQR], 34 [26-40] vs 31 [25-38]; P = .005); greater need for pretransplant hospitalization (56.8% vs 44.8%; P = .003), renal replacement therapy (42.9% vs 30.5%; P < .001), mechanical ventilation (35.8% vs 18.1%; P < .001), and vasopressors (22.9% vs 11.0%; P < .001); longer cold ischemia times (median [IQR], 436 [311-539] vs 401 [302-506] minutes; P = .04); greater intraoperative blood transfusions (median [IQR], 17 [10-26] vs 10 [6-17] units of packed red blood cells; P < .001); and older donors (median [IQR] age, 47 [28-56] vs 41 [25-52] years; P < .001). Conclusions and Relevance: The L-GrAFT risk score allows a highly accurate, individualized risk estimation of 3-month graft failure following LT that is more accurate than existing EAD and MEAF scores. Multicenter validation may allow for the adoption of the L-GrAFT as a tool for evaluating the need for a retransplant, for establishing standardized grading of early allograft function across transplant centers, and as a highly accurate clinical end point in translational studies aiming to mitigate ischemia or reperfusion injury by modulating donor quality and recipient factors.
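The C statistics compared above (0.85 vs 0.68 vs 0.70) measure discrimination: the probability that a randomly chosen patient who reached the endpoint received a higher risk score than one who did not. For a binary endpoint this reduces to the area under the ROC curve, which can be sketched in pure Python. This is a generic illustration, not the L-GrAFT implementation:

```python
def c_statistic(scores, outcomes):
    """Concordance for a binary endpoint: among all (event, non-event)
    pairs, the fraction in which the event case has the higher risk
    score, counting ties as 1/2. Equivalent to the area under the ROC
    curve (a C statistic of 0.5 is chance; 1.0 is perfect ranking).
    """
    events = [s for s, y in zip(scores, outcomes) if y == 1]
    nonevents = [s for s, y in zip(scores, outcomes) if y == 0]
    concordant = sum((e > n) + 0.5 * (e == n)
                     for e in events for n in nonevents)
    return concordant / (len(events) * len(nonevents))

# Perfectly ranked scores give C = 1.0:
# c_statistic([0.9, 0.8, 0.3, 0.2], [1, 1, 0, 0]) -> 1.0
```

Comparisons between C statistics of competing models on the same cohort, as in the abstract, require a paired test such as the DeLong method rather than a simple difference.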


Subject(s)
Aspartate Aminotransferases/blood , Bilirubin/blood , Liver Transplantation/adverse effects , Primary Graft Dysfunction/diagnosis , Allografts , Biomarkers/blood , Female , Follow-Up Studies , Graft Survival , Humans , Incidence , Liver Function Tests , Male , Middle Aged , Primary Graft Dysfunction/blood , Primary Graft Dysfunction/epidemiology , Prognosis , Retrospective Studies , Risk Factors , Survival Rate/trends , Time Factors , United States/epidemiology