1.
Hepatobiliary Surg Nutr ; 13(4): 662-668, 2024 Aug 01.
Article in English | MEDLINE | ID: mdl-39175744

ABSTRACT

The Meso-Rex bypass (MRB) is recognized as an effective treatment for portal hypertension secondary to extrahepatic portal vein occlusion (EHPVO) in both the pediatric and adult populations, within or outside the context of liver transplantation. It is the preferred surgical treatment in most centers because it not only addresses the portal hypertension but also restores physiologic portal hepatopetal flow. However, the Rex recess, the landmark for this technique, may not be safely accessible in some patients. We present a 22-year-old male who underwent living donor liver transplant (LDLT) for neonatal hepatitis. He presented with variceal bleeding due to EHPVO 13 years after transplant. Various endoscopic, radiologic, and surgical interventions were employed to address the recurrent gastrointestinal bleeding, but results were unsatisfactory. We performed a meso-intrahepatic portal vein bypass (MIPVB), an innovative alternative to the MRB, in this patient with extensive post-operative adhesions, perihilar collaterals, and cavernous transformation. MIPVB creation in patients in whom the Rex recess is inaccessible is technically challenging, but with a multidisciplinary team approach, meticulous preoperative planning, and close follow-up, the authors have demonstrated that it is a safe and feasible option for patients with late-onset EHPVO after liver transplantation.

3.
Pediatr Transplant ; 28(6): e14838, 2024 Sep.
Article in English | MEDLINE | ID: mdl-39158111

ABSTRACT

BACKGROUND: Although the outcomes of living donor liver transplantation (LDLT) for pediatric acute liver failure (PALF) have improved, patient survival remains lower than in patients with chronic liver disease. We investigated whether the poor outcomes of LDLT for PALF persisted in the contemporary transplant era. METHODS: We analyzed 193 patients who underwent LDLT between December 2000 and December 2020. The outcomes of patients managed in 2000-2010 (era 1) and 2011-2020 (era 2) were compared. RESULTS: The median age at the time of LDLT was 1.2 years in both eras. An unknown etiology was the major cause in both groups. Patients in era 1 were more likely to have surgical complications, including hepatic artery and biliary complications (p = 0.001 and p = 0.013, respectively). Era had no impact on the rates of infection after LDLT (cytomegalovirus, Epstein-Barr virus, and sepsis). Patient and graft mortality rates in era 1 were significantly higher (p = 0.03 and p = 0.047, respectively). The 1- and 5-year survival rates were 76.4% and 70.9%, respectively, in era 1, versus 88.3% and 81.9% in era 2 (p = 0.042). Rejection was the most common cause of graft loss in both groups. In the multivariate analysis, sepsis during the 30 days after LDLT was independently associated with graft loss (p = 0.002). CONCLUSIONS: The survival of patients with PALF has improved in the contemporary transplant era. Early detection and proper management of rejection, while remaining vigilant for sepsis, should be recommended to improve outcomes further.


Subject(s)
Liver Failure, Acute; Liver Transplantation; Living Donors; Postoperative Complications; Humans; Male; Female; Retrospective Studies; Infant; Child, Preschool; Liver Failure, Acute/surgery; Child; Postoperative Complications/epidemiology; Treatment Outcome; Graft Survival; Survival Rate; Adolescent
4.
Article in English | MEDLINE | ID: mdl-38944568

ABSTRACT

BACKGROUND: Liver transplantation (LT) is a pivotal treatment for end-stage liver disease. However, bloodstream infections (BSI) in the post-operative period present a significant threat to patient survival. This study aims to identify risk factors for post-LT BSI and crucial prognostic indicators for mortality among affected patients. METHODS: We conducted a retrospective study of adults diagnosed with end-stage liver disease who underwent their initial LT between 2010 and 2021. Those who developed BSI post-LT during the same hospital admission were classified into the BSI group. RESULTS: In this cohort of 1049 patients, 89 (8.4%) developed BSI post-LT, while 960 (91.5%) did not develop any infection. Among the BSI cases, 17 (19.1%) patients died. The average time to BSI onset was 48 days, with 46% occurring within the first month post-LT. Of the 123 isolated microorganisms, 97 (78.8%) were gram-negative bacteria. BSI patients had significantly longer intensive care unit and hospital stays than non-infected patients. The 90-day and in-hospital mortality rates for recipients with BSI were significantly higher than for those without infections. Multivariate analysis indicated heightened BSI risk in patients with blood loss >3000 mL during LT (odds ratio [OR] 2.128), re-operation within 30 days (OR 2.341), post-LT bile leakage (OR 3.536), and graft rejection (OR 2.194). Additionally, chronic kidney disease (OR 6.288) and each 1000-mL increase in intraoperative blood loss (OR 1.147) significantly raised mortality risk in BSI patients, whereas each 0.1 g/dL increase in albumin level correlated with a lower risk of death from BSI (OR 0.810). CONCLUSIONS: This study underscores the need for careful monitoring and management in the post-LT period, especially for patients at higher risk of BSI. It also suggests that serum albumin levels could serve as a valuable prognostic indicator for outcomes in LT recipients experiencing BSI.

5.
Am J Transplant ; 2024 Jun 22.
Article in English | MEDLINE | ID: mdl-38914281

ABSTRACT

Decreasing the graft size in living donor liver transplantation (LDLT) increases the risk of early allograft dysfunction. A graft-to-recipient weight ratio (GRWR) of 0.8 is considered the threshold. There is evidence that smaller-volume grafts may also provide equally good outcomes, but the cut-off remains unknown. In this retrospective multicenter study, 92 adult LDLTs with a final GRWR ≤0.6 performed at 12 international liver transplant centers over a 3-year period were included. Perioperative data, including preoperative status, portal flow hemodynamics (PFH) and portal flow modulation, development of small-for-size syndrome (SFSS), morbidity, and mortality, were collated and analyzed. Thirty-two (36.7%) patients developed SFSS, which was associated with increased 30-day, 90-day, and 1-year mortality. The preoperative Model for End-Stage Liver Disease score and inpatient status were independent predictors of SFSS (P < .05). Pre-liver transplant renal dysfunction was an independent predictor of survival (hazard ratio 3.1; 95% confidence interval 1.1-8.9, P = .035). Neither PFH nor portal flow modulation was predictive of SFSS or survival. We report the largest multicenter study to date of LDLT outcomes using ultralow-GRWR grafts and, for the first time, validate the International Liver Transplantation Society-International Living Donor Liver Transplantation Study Group-Liver Transplantation Society of India consensus definition and grading of SFSS. Preoperative recipient condition, rather than GRWR or PFH, independently predicted SFSS. Algorithms to predict SFSS and LT outcomes should incorporate recipient factors along with GRWR.
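As context for the GRWR thresholds above: the ratio is conventionally computed as graft weight in grams divided by recipient body weight, expressed as a percentage. A minimal sketch of that arithmetic, assuming the standard definition (the function name and example values are illustrative, not taken from the study):

    def grwr_percent(graft_weight_g: float, recipient_weight_kg: float) -> float:
        """Graft-to-recipient weight ratio (%), per the standard definition."""
        return graft_weight_g / (recipient_weight_kg * 1000.0) * 100.0

    # Example: a 420 g graft in a 70 kg recipient yields GRWR = 0.6,
    # the ultralow threshold studied in this cohort.
    assert round(grwr_percent(420.0, 70.0), 2) == 0.6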

6.
Int J Surg ; 2024 Jun 13.
Article in English | MEDLINE | ID: mdl-38870007

ABSTRACT

BACKGROUND: Active vaccination has been utilized to prevent de novo hepatitis B virus infection (DNHB) in recipients of anti-HBc (+) grafts after liver transplantation (LT). However, the long-term efficacy of active vaccination and the graft/patient outcomes of anti-HBc (+) grafts have yet to be comprehensively investigated. MATERIALS AND METHODS: Among 204 pediatric patients enrolled in the study, 82 recipients received anti-HBc (+) grafts. For DNHB prevention, active vaccination was repeatedly administered prior to transplant. Antiviral therapy was given for 2 years to patients with pre-transplant anti-HBs <1000 IU/mL (non-robust response) and discontinued once post-transplant patients achieved anti-HBs >1000 IU/mL; antiviral therapy was not given to patients with an anti-HBs titer over 1000 IU/mL. The primary outcome was the long-term efficacy of active vaccination; the secondary outcomes were graft and patient survival rates. RESULTS: Among the 82 anti-HBc (+) transplant patients, 68% of recipients achieved a robust immune response and thus did not require antiviral therapy. Two patients (2.4%) developed DNHB infection, one of which was due to an escape mutant. With a median follow-up of 150 months, the overall 10-year patient and graft survival rates were significantly worse in recipients of anti-HBc (+) grafts than in recipients of anti-HBc (-) grafts (85.2% vs 93.4%, P=0.026; 85.1% vs 93.4%, P=0.034, respectively). The 10-year patient and graft outcomes of anti-HBc (+) graft recipients remained significantly worse after excluding early mortality and non-graft-related mortality (90.8% vs 96.6%, P=0.036; 93.0% vs 98.3%, P=0.011, respectively). CONCLUSION: Our long-term follow-up study demonstrates that active vaccination is a simple, cost-effective strategy against DNHB infection in anti-HBc (+) graft recipients that removes the need for life-long antiviral therapy. Notably, both anti-HBc (+) grafts and their recipients exhibited inferior long-term survival, although the exact mechanisms remain unclear.

7.
Eur J Radiol ; 177: 111551, 2024 Aug.
Article in English | MEDLINE | ID: mdl-38875747

ABSTRACT

BACKGROUND: Liver transplantation is an effective treatment for preventing hepatocellular carcinoma (HCC) recurrence. This retrospective study aimed to quantitatively evaluate tumor attenuation in Hounsfield units (HU) on contrast-enhanced computed tomography (CECT) as a prognostic factor for HCC recurrence after liver transplantation, to optimize its predictive ability for early tumor recurrence, and to compare it with positron emission tomography (PET). METHODS: Of 618 patients who underwent living donor liver transplantation (LDLT) for HCC, only 131 with measurable viable HCC on both preoperative CECT and preoperative PET were included, with a minimum follow-up period of 6 years. Cox regression models were developed to identify predictors of postoperative recurrence. Performance metrics for both CT and PET were assessed, and the correlation between the two imaging modalities was evaluated. Survival analyses were conducted using time-dependent receiver operating characteristic (ROC) curve analysis and the area under the curve (AUC) to assess accuracy and determine optimized cut-off points. RESULTS: Univariate and multivariate analyses revealed that both preoperative arterial-phase tumor attenuation (HU) and PET status were independent prognostic factors for recurrence-free survival. On 0.5-year time-dependent ROC analysis, both lower arterial tumor enhancement on CECT (cut-off value = 59.2 HU, AUC 0.88) and PET positivity (AUC 0.89) increased the risk of early tumor recurrence. The composite of HU < 59.2 and a positive PET result exhibited significantly higher diagnostic accuracy in detecting early tumor recurrence (AUC = 0.96). CONCLUSION: Relatively low arterial tumor enhancement values on CECT effectively predict early HCC recurrence after LDLT. The integration of CT and PET imaging may serve as an imaging marker of early tumor recurrence in HCC patients after LDLT.


Subject(s)
Carcinoma, Hepatocellular; Liver Neoplasms; Liver Transplantation; Living Donors; Neoplasm Recurrence, Local; Tomography, X-Ray Computed; Humans; Carcinoma, Hepatocellular/diagnostic imaging; Carcinoma, Hepatocellular/surgery; Liver Neoplasms/diagnostic imaging; Liver Neoplasms/surgery; Male; Female; Neoplasm Recurrence, Local/diagnostic imaging; Middle Aged; Tomography, X-Ray Computed/methods; Retrospective Studies; Positron-Emission Tomography/methods; Contrast Media; Adult; Aged; Survival Analysis; Predictive Value of Tests; Prognosis
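As context for the optimized cut-off points reported above: a threshold such as HU < 59.2 is typically derived from a ROC curve by maximizing Youden's J (sensitivity + specificity - 1). The sketch below illustrates that generic procedure on invented toy data; it is not the authors' analysis, which used time-dependent ROC curves:

    import numpy as np
    from sklearn.metrics import roc_curve, roc_auc_score

    def optimal_cutoff(event: np.ndarray, hu: np.ndarray) -> tuple[float, float]:
        """Cut-off (via Youden's J) and AUC for a marker where LOWER values mean higher risk."""
        fpr, tpr, thresholds = roc_curve(event, -hu)  # negate so higher score = higher risk
        best = np.argmax(tpr - fpr)                   # maximize Youden's J
        return -thresholds[best], roc_auc_score(event, -hu)

    # Hypothetical toy data: event = 1 marks early recurrence.
    event = np.array([1, 1, 1, 0, 0, 0, 0, 1])
    hu = np.array([45.0, 52.1, 58.0, 72.3, 66.5, 80.1, 61.0, 50.4])
    cutoff, auc = optimal_cutoff(event, hu)
    print(f"cut-off ~ {cutoff:.1f} HU, AUC = {auc:.2f}")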
8.
Hepatobiliary Surg Nutr ; 13(3): 425-443, 2024 Jun 01.
Article in English | MEDLINE | ID: mdl-38911194

ABSTRACT

Background: Liver retransplantation is the only option to save a patient with liver graft failure. However, it is controversial because of its poor survival outcomes compared with primary transplantation. Insufficient deceased organ donation in Taiwan leads to high waitlist mortality, so living-donor grafts offer a valuable alternative for retransplantation. This study analyzes a single center's outcomes in living donor liver retransplantation (re-LDLT) and deceased donor liver retransplantation (re-DDLT), as well as survival-related confounding risk factors. Methods: This single-center retrospective study included 32 adults who underwent liver retransplantation (re-LT) from June 2002 to April 2020. The cohort was divided into a re-LDLT group and a re-DDLT group, and survival outcomes were analyzed. Patient outcomes over different periods, the effect of retransplantation timing on survival, and multivariate analysis of risk factors were also reported. Results: Of the 32 retransplantations, the re-LDLT group (n=11) received grafts from younger donors (31.3 vs. 43.75 years, P=0.016), with lower graft weights (688 vs. 1,457.2 g, P<0.001) and shorter cold ischemia time (CIT) (45 vs. 313 min, P<0.001). The 5-year survival was significantly better in the re-LDLT group than in the re-DDLT group (100% vs. 70.8%, P=0.02). This difference was no longer significant when only retransplantations performed after 2010 were analyzed. Further analysis showed that the timing of retransplantation (early vs. late) did not affect patient survival. Multivariate analysis revealed that prolonged warm ischemia time (WIT) and intraoperative blood transfusion were related to poor long-term survival. Conclusions: Retransplantation with living donor grafts demonstrated good long-term outcomes with acceptable complications for both recipient and donor, and may serve as an option in areas lacking deceased donors. The timing of retransplantation did not affect long-term survival. Further effort should be made to reduce WIT and massive blood transfusion, as they contributed to poor survival after retransplantation.

9.
Cell Calcium ; 121: 102895, 2024 Jul.
Article in English | MEDLINE | ID: mdl-38703416

ABSTRACT

Liver fibrosis is characterized by excessive deposition of extracellular matrix (ECM) as a wound-healing process. Activated hepatic stellate cells (HpSCs) are the major producers of the ECM and play a central role in liver fibrogenesis. It is widely accepted that eliminating activated HpSCs or reverting them to a quiescent state is a feasible strategy for resolving the disease, highlighting the urgent need for novel therapeutic targets. Calreticulin (CRT) is a molecular chaperone that normally resides in the endoplasmic reticulum (ER) and is important in protein folding and trafficking through the secretory pathway. CRT also plays a critical role in calcium (Ca2+) homeostasis through its Ca2+ storage capacity. In the current study, we aimed to demonstrate its function in directing HpSC activation. In a mouse liver injury model, CRT was up-regulated in HpSCs. In cellular experiments, we further showed that this activation acted through modulation of canonical TGF-β signaling. As down-regulation of CRT in HpSCs elevated intracellular Ca2+ levels through a form of Ca2+ influx named store-operated Ca2+ entry (SOCE), we examined whether moderating SOCE affected TGF-β signaling. Interestingly, blocking SOCE had little effect on TGF-β-induced gene expression. In contrast, inhibition of ER Ca2+ release using the inositol trisphosphate receptor inhibitor 2-APB increased TGF-β signaling. Treatment with 2-APB did not alter SOCE but decreased intracellular Ca2+ at the basal level. Indeed, lowering Ca2+ concentrations with EGTA or BAPTA-AM chelation further enhanced TGF-β-induced signaling. Our results suggest a crucial role for CRT in the liver fibrogenic process through modulating Ca2+ concentrations and TGF-β signaling in HpSCs, which may inform ongoing efforts toward therapies for liver fibrosis.


Subject(s)
Calreticulin; Hepatic Stellate Cells; Signal Transduction; Smad Proteins; Transforming Growth Factor beta; Hepatic Stellate Cells/metabolism; Hepatic Stellate Cells/drug effects; Calreticulin/metabolism; Animals; Transforming Growth Factor beta/metabolism; Signal Transduction/drug effects; Smad Proteins/metabolism; Mice; Humans; Calcium/metabolism; Liver Cirrhosis/metabolism; Liver Cirrhosis/pathology; Male; Calcium Signaling/drug effects; Mice, Inbred C57BL
10.
Am J Transplant ; 2024 Apr 29.
Article in English | MEDLINE | ID: mdl-38692411

ABSTRACT

Liver transplantation is often the only lifesaving option for acute liver failure (ALF); however, the predictors of short-term mortality (death within one year) after living donor liver transplantation (LDLT) for ALF have yet to be defined. We retrospectively collected data on patients ≥18 years of age who underwent LDLT for ALF between 2010 and 2020 at 35 centers in Asia. Univariate and multivariate logistic regression analyses were conducted to identify clinical variables related to short-term mortality and to establish a novel scoring system. The Kaplan-Meier method was used to explore the association between the score and overall survival. Of the 339 recipients, 46 (13.6%) died within 1 year after LDLT. Multivariate analyses revealed 4 independent risk factors for death: use of vasopressors, mechanical ventilation, a higher Model for End-Stage Liver Disease score, and a lower graft-to-recipient weight ratio. The internally validated c-statistic of the short-term mortality after transplant (SMT) score derived from these 4 variables was 0.80 (95% confidence interval: 0.74-0.87). The SMT score successfully stratified recipients into low-, intermediate-, and high-risk groups with 1-year overall survival rates of 96%, 80%, and 50%, respectively. In conclusion, our novel SMT score based on 4 predictors will guide ALF recipient and living donor selection.
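As context for how a score like SMT is typically constructed: a logistic model is fit to the candidate predictors, the linear predictor serves as the score, and discrimination is summarized by the c-statistic. The sketch below shows that generic workflow on simulated placeholder data; the variable names and fitted coefficients are illustrative and are not the published SMT score:

    import numpy as np
    import pandas as pd
    from sklearn.linear_model import LogisticRegression
    from sklearn.metrics import roc_auc_score

    rng = np.random.default_rng(0)
    n = 339  # cohort size from the abstract; all values below are simulated
    X = pd.DataFrame({
        "vasopressors": rng.integers(0, 2, n),
        "ventilation": rng.integers(0, 2, n),
        "meld": rng.normal(30.0, 8.0, n),
        "grwr": rng.normal(0.9, 0.2, n),
    })
    died_1yr = rng.integers(0, 2, n)  # placeholder outcome

    model = LogisticRegression().fit(X, died_1yr)
    score = model.decision_function(X)       # the risk score (linear predictor)
    c_stat = roc_auc_score(died_1yr, score)  # apparent c-statistic
    # Split into low/intermediate/high risk groups, as the SMT score does.
    risk_group = pd.qcut(score, 3, labels=["low", "intermediate", "high"])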

11.
Diagnostics (Basel) ; 14(8)2024 Apr 11.
Article in English | MEDLINE | ID: mdl-38667453

ABSTRACT

Acute cellular rejection (ACR) is a significant immune complication among recipients after liver transplantation. Although diffusion-weighted magnetic resonance imaging (DWI) is widely used for diagnosing liver disease, it has not yet been utilized for monitoring ACR after liver transplantation. The aim of this study was therefore to evaluate the efficacy of DWI in monitoring treatment response among recipients with ACR. This study enrolled 25 recipients with highly suspected ACR, and all subjects underwent both biochemistry testing and DWI scans before and after treatment. A pathological biopsy was performed 4 to 24 h after the first MRI examination to confirm ACR and the degree of rejection. All patients were followed up and underwent a repeat MRI scan when their liver function returned to the normal range. After data acquisition, the DWI data were post-processed to obtain the apparent diffusion coefficient (ADC) map on a voxel-by-voxel basis. Five regions of interest were placed on the liver parenchyma to measure the mean ADC values for each patient. Finally, the mean ADC values and biochemical markers were statistically compared between the ACR and non-ACR groups. A receiver operating characteristic (ROC) curve was constructed to evaluate the performance of the ADC and biochemical data in detecting ACR, and correlation analysis was used to examine the relationship among the ADC values, biochemical markers, and the degree of rejection. The histopathologic results revealed that 20 recipients had ACR, including 10 with mild, 9 with moderate, and 1 with severe rejection. ACR patients had significantly lower hepatic ADC values than patients without ACR. After treatment, the hepatic ADC values in ACR patients significantly increased to levels similar to those in non-ACR patients. The ROC analysis showed a sensitivity and specificity for detecting ACR of 80% and 95%, respectively. Furthermore, the correlation analysis revealed that the mean ADC value and the alanine aminotransferase level had strong and moderate negative correlations with the degree of rejection, respectively (r = -0.72 and -0.47). ADC values were useful for detecting hepatic ACR and monitoring treatment response after immunosuppressive therapy.
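As context for the ADC post-processing step: with a standard two-b-value, mono-exponential diffusion model, the ADC map is computed voxel-wise from the b=0 image S0 and the diffusion-weighted image Sb as ADC = ln(S0/Sb)/b. A minimal sketch under that assumption (the b-value and signal values are illustrative, not taken from the paper):

    import numpy as np

    def adc_map(s0: np.ndarray, sb: np.ndarray, b: float = 800.0) -> np.ndarray:
        """Apparent diffusion coefficient (mm^2/s), computed voxel by voxel."""
        eps = 1e-6  # guard against log(0) in background voxels
        return np.log(np.maximum(s0, eps) / np.maximum(sb, eps)) / b

    # Hypothetical 2x2 patch: stronger signal retention on DWI -> lower ADC.
    s0 = np.array([[900.0, 850.0], [880.0, 910.0]])
    sb = np.array([[350.0, 420.0], [300.0, 500.0]])
    print(adc_map(s0, sb))  # ~1.0e-3 mm^2/s, in the range typical of liver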

12.
Transplant Proc ; 56(3): 625-633, 2024 Apr.
Article in English | MEDLINE | ID: mdl-38519269

ABSTRACT

BACKGROUND: Advancements in surgical techniques, immunosuppression regimens, and peri-operative and postoperative care have resulted in marked improvement in outcomes after pediatric living donor liver transplantation (PLDLT). Despite these developments, infectious complications remain a major cause of morbidity and mortality. METHODS: This is a retrospective cohort analysis of pediatric recipients from January 2004 to December 2018. Patients were classified into infected and non-infected groups based on the occurrence of bacterial infection during the first 3 months after transplant. Perioperative risk factors for early post-transplant bacterial infections and postoperative outcomes were investigated. RESULTS: Seventy-two of 221 children developed early bacterial infection (32.6%). First episodes of bacterial infection occurred most frequently in the second week after transplant (37.5%). In multivariate analysis, active infection before transplant and complications of Clavien-Dindo grade >3 were the only independent risk factors. Early bacterial infections were independently associated with longer intensive care unit stays, longer hospital stays, and a higher incidence of readmission for bacterial infection during the first year after transplant. Additionally, the overall patient survival rate was significantly higher in the non-infected group (P = .001). Other candidate risk factors, such as age, weight, disease severity, ABO incompatibility, and operative factors, were not identified as independent risk factors. CONCLUSION: We have demonstrated that there are similarities and disparities between centers in the epidemiology of and risk factors for early bacterial infection after transplant. Identification and better characterization of these predisposing factors are essential for modifying current preventive strategies and treatment protocols to improve outcomes in this highly vulnerable group.


Subject(s)
Bacterial Infections; Liver Transplantation; Living Donors; Humans; Liver Transplantation/adverse effects; Risk Factors; Retrospective Studies; Male; Female; Bacterial Infections/epidemiology; Child; Child, Preschool; Infant; Postoperative Complications/epidemiology; Adolescent; Length of Stay
13.
Am J Transplant ; 24(7): 1233-1246, 2024 Jul.
Article in English | MEDLINE | ID: mdl-38428639

ABSTRACT

In living-donor liver transplantation, biliary complications, including bile leaks and biliary anastomotic strictures, remain significant challenges, with incidences varying across centers. This multicenter retrospective study (2016-2020) included 3633 adult patients from 18 centers and aimed to identify risk factors for these biliary complications and their impact on patient survival. The incidences of bile leaks and biliary strictures were 11.4% and 20.6%, respectively. Key risk factors for bile leaks included multiple bile duct anastomoses (odds ratio [OR], 1.8), Roux-en-Y hepaticojejunostomy (OR, 1.4), and a history of major abdominal surgery (OR, 1.4). For biliary anastomotic strictures, risk factors were ABO incompatibility (OR, 1.4), blood loss >1 L (OR, 1.4), and previous abdominal surgery (OR, 1.7). Patients experiencing biliary complications had extended hospital stays, increased incidence of major complications, and higher comprehensive complication index scores. The impact on graft survival became evident after accounting for immortal time bias using time-dependent covariate survival analysis: bile leaks and biliary anastomotic strictures were associated with adjusted hazard ratios for graft loss of 1.7 and 1.8, respectively. The study underscores the importance of minimizing these risks through careful donor selection and preoperative planning, as biliary complications significantly affect graft survival despite the availability of effective treatments.


Subject(s)
Graft Survival; Liver Transplantation; Living Donors; Postoperative Complications; Humans; Liver Transplantation/adverse effects; Male; Female; Retrospective Studies; Middle Aged; Adult; Risk Factors; Postoperative Complications/etiology; Follow-Up Studies; Prognosis; Anastomotic Leak/etiology; Biliary Tract Diseases/etiology; Incidence; Survival Rate
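As context for the immortal time bias adjustment described above: a post-transplant complication must be modeled as a time-dependent covariate so that a patient contributes "exposed" follow-up time only after the complication occurs. A hedged sketch of that setup using lifelines' CoxTimeVaryingFitter on long-format (start, stop] data; the column names and rows are invented, not the study's dataset:

    import pandas as pd
    from lifelines import CoxTimeVaryingFitter

    # One row per patient-interval; bile_leak flips from 0 to 1 at onset,
    # so pre-complication time is never counted as exposed.
    long_df = pd.DataFrame({
        "id":         [1, 1, 2, 3, 3, 4, 5],
        "start":      [0, 90, 0, 0, 30, 0, 0],
        "stop":       [90, 400, 365, 30, 200, 150, 500],
        "bile_leak":  [0, 1, 0, 0, 1, 0, 0],
        "graft_loss": [0, 1, 0, 0, 0, 1, 0],  # event indicator at interval end
    })

    ctv = CoxTimeVaryingFitter()
    ctv.fit(long_df, id_col="id", event_col="graft_loss",
            start_col="start", stop_col="stop")
    ctv.print_summary()  # hazard ratio for bile_leak, free of immortal time bias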
14.
Transplant Proc ; 56(3): 596-601, 2024 Apr.
Article in English | MEDLINE | ID: mdl-38472083

ABSTRACT

AIM: To compare the effectiveness of drug-eluting bead transarterial chemoembolization (DEB-TACE) with different particle sizes for bridging and downstaging pretransplant hepatocellular carcinoma patients, and to assess recurrence and survival rates after living donor liver transplantation (LDLT). METHODS: Retrospective review of 580 patients who underwent TACE using DEB from August 2012 to June 2020 at Kaohsiung Chang Gung Memorial Hospital, Taiwan. Pre- and post-TACE computed tomography images of the liver were reviewed, and treatment responses were assessed using the modified Response Evaluation Criteria in Solid Tumors. Patients were divided into those who met (n = 342) or were beyond (n = 238) the University of California San Francisco (UCSF) criteria to evaluate successful bridging and downstaging rates. Each group was further divided into subgroups according to DEB particle size (group A: <100 µm; group B: 100-300 µm; group C: 300-500 µm; group D: 500-700 µm) to compare objective response rates and post-LDLT survival rates. RESULTS: The overall successful bridging rate was 97.1% in patients within the UCSF criteria (n = 332), and the overall successful downstaging rate was 58.4% in patients beyond the criteria (n = 139). Group B (100-300 µm) had a higher successful bridging rate (99.5%, P = .003) and downstaging rate (63.8%, P = .443). This subgroup also demonstrated higher objective response rates in single tumors (93.2%, P = .038), multiple tumors (83.3%, P = .001), and tumors smaller than 5 cm (93.9%, P = .005). There were no significant differences in post-LDLT overall survival between particle sizes. CONCLUSION: TACE with 100-300 µm DEB particles was associated with a better chance of bridging and downstaging hepatocellular carcinoma patients to LDLT.


Subject(s)
Carcinoma, Hepatocellular; Chemoembolization, Therapeutic; Liver Neoplasms; Liver Transplantation; Living Donors; Particle Size; Humans; Carcinoma, Hepatocellular/therapy; Carcinoma, Hepatocellular/pathology; Carcinoma, Hepatocellular/mortality; Liver Neoplasms/therapy; Liver Neoplasms/pathology; Liver Neoplasms/mortality; Retrospective Studies; Male; Female; Middle Aged; Treatment Outcome; Adult; Neoplasm Staging; Microspheres; Aged
15.
Transplant Proc ; 56(3): 573-580, 2024 Apr.
Article in English | MEDLINE | ID: mdl-38326205

ABSTRACT

PURPOSE: Despite technological and immunologic innovations, some living-donor liver transplant (LDLT) recipients still experience poor liver regeneration. Sarcopenia is often recognized as a biomarker of poor outcomes in surgical patients. This study aimed to evaluate associations between sarcopenia and liver regeneration in LDLT recipients. MATERIALS AND METHODS: This retrospective review included consecutive patients who received LDLT at Chang Gung Memorial Hospital between 2005 and 2017. Sarcopenia was assessed using the psoas muscle index (PMI) on cross-sectional images. Receiver operating characteristic (ROC) curve analysis was used to determine the ability of PMI to predict relatively poor survival. Correlations between liver regeneration and sarcopenia were evaluated using regression analysis. RESULTS: A total of 109 LDLT recipients were included. The 1-, 3-, 5-, 10-, and 15-year survival rates were 93.7%, 84.8%, 79.7%, 74.7%, and 73.3% in males and 93.3%, 83.3%, 83.3%, 71.4%, and 71.4% in females. In male patients, PMI differed significantly by 10- and 15-year overall survival (P = .001 and P < .001). ROC curve analysis identified a PMI cut-off of 6.7 cm²/m² (sensitivity = 48.3%, specificity = 81%, area under the ROC curve [AUC] = 0.685) based on 10-year survival. Linear regression analysis revealed that PMI was significantly associated with liver regeneration in males (P = .013). CONCLUSIONS: Sarcopenia and low PMI are associated with poor liver regeneration and worse long-term survival after LDLT in male patients. Further studies combining sarcopenia with conventional scores may help to more reliably predict liver regeneration and mortality among LDLT patients with hepatocellular carcinoma.


Subject(s)
Carcinoma, Hepatocellular; Liver Neoplasms; Liver Regeneration; Liver Transplantation; Living Donors; Sarcopenia; Humans; Sarcopenia/mortality; Male; Female; Retrospective Studies; Middle Aged; Liver Neoplasms/surgery; Liver Neoplasms/mortality; Carcinoma, Hepatocellular/surgery; Carcinoma, Hepatocellular/mortality; Adult; Survival Rate
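As context for the PMI values above: the psoas muscle index normalizes the psoas cross-sectional area measured on a single CT slice (commonly at the L3 level) to the square of the patient's height, which is why it is reported in cm²/m². A minimal sketch assuming that standard definition (names and example values are illustrative):

    def psoas_muscle_index(psoas_area_cm2: float, height_m: float) -> float:
        """PMI in cm^2/m^2; lower values suggest sarcopenia."""
        return psoas_area_cm2 / (height_m ** 2)

    # Example: 18.5 cm^2 of total psoas area in a 1.70 m patient gives
    # PMI ~ 6.4, below the 6.7 cm^2/m^2 cut-off reported in this study.
    print(round(psoas_muscle_index(18.5, 1.70), 1))  # 6.4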
17.
Hepatobiliary Surg Nutr ; 12(6): 898-908, 2023 Dec 01.
Article in English | MEDLINE | ID: mdl-38115943

ABSTRACT

Background: Extracorporeal membrane oxygenation (ECMO) is a potential rescue therapy for patients with acute cardiopulmonary dysfunction refractory to conventional treatment. In this study, we describe the clinical profiles and outcomes of adult and pediatric living donor liver transplantation (LDLT) patients who received ECMO support during the peri-operative period. Methods: From June 1994 to December 2020, eleven of the 1,812 LDLTs performed at Kaohsiung Chang Gung Memorial Hospital required ECMO support: six for respiratory failure, three for cardiogenic shock, and two for refractory septic shock. The survivor and non-survivor groups were compared. Results: The survival rate for liver transplantation (LT) patients on ECMO support was 36.4-40% in adults and 33.3% in pediatric patients, while the survival rate by indication was as follows: acute respiratory distress syndrome (ARDS), 50%; cardiogenic shock, 33.3%; and sepsis, 0%. Shorter durations of LT-to-ECMO time and pre-ECMO mechanical ventilation were observed in the survivor group. In contrast, non-survivors had persistently elevated total bilirubin levels, while none of the survivors had aspartate aminotransferase (AST)/alanine aminotransferase (ALT) levels >1,000 U/L. A higher proportion of non-survivors were on concurrent continuous renal replacement therapy (CRRT). Conclusions: Our experience demonstrates ECMO's utility during the peri-operative period for both adult and pediatric LDLT patients, particularly for indications other than septic shock. Further studies are needed to better understand the factors leading to poor outcomes and to identify patients who are most likely to benefit from ECMO.
