ABSTRACT
BACKGROUND & AIMS: Continuous risk stratification of candidates and urgency-based prioritization have been utilized for liver transplantation (LT) in patients with non-hepatocellular carcinoma (HCC) diseases in the United States. In contrast, for patients with HCC, a dichotomous criterion with exception points is still used. This study evaluated the utility of the hazard associated with LT for HCC (HALT-HCC), a continuous oncological risk score, to stratify waitlist dropout and post-LT outcomes. METHODS: A competing risk model was developed and validated using the UNOS database (2012-2021), spanning multiple policy changes. The primary aim was to assess discrimination for waitlist dropout and post-LT outcomes, comparing the HALT-HCC score with other established HCC risk scores. RESULTS: Among 23,858 candidates, 14,646 (59.9%) underwent LT and 5196 (21.8%) dropped out of the waitlist. Higher HALT-HCC scores correlated with increased dropout incidence and lower predicted 5-year overall survival after LT. HALT-HCC demonstrated the highest area under the curve (AUC) values for predicting dropout at various intervals post-listing (0.68 at 6 months, 0.66 at 1 year), with excellent calibration (R2 = 0.95 at 6 months, 0.88 at 1 year). Its accuracy remained stable across policy periods and locoregional therapy applications. CONCLUSIONS: This study highlights the capability of a continuous oncological risk score to forecast waitlist dropout and post-LT outcomes in patients with HCC, independent of policy changes, and advocates integrating continuous scoring systems such as HALT-HCC into liver allocation decisions, balancing urgency, organ utility, and survival benefit.
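The competing-risk framework behind the dropout analysis can be illustrated with a minimal Aalen-Johansen cumulative-incidence estimator, where transplant competes with dropout. This is a toy sketch, not the study's code: the records below are invented, and the study's actual model (a Fine-Gray-style competing risk model fitted to UNOS data) is far richer.

```python
# Aalen-Johansen cumulative incidence of waitlist dropout, treating
# transplant as a competing event (toy sketch; data are invented).
# Each record: (time_on_list, event) with event 0=censored, 1=dropout, 2=LT.

def cumulative_incidence(records, cause=1):
    """Return [(time, CIF)] for the given cause via the Aalen-Johansen estimator."""
    records = sorted(records)
    surv = 1.0      # all-cause event-free survival just before t
    cif = 0.0       # cumulative incidence of `cause`
    at_risk = len(records)
    out = []
    i = 0
    while i < len(records):
        t = records[i][0]
        d_cause = d_any = removed = 0
        while i < len(records) and records[i][0] == t:
            ev = records[i][1]
            if ev == cause:
                d_cause += 1
            if ev != 0:
                d_any += 1
            removed += 1
            i += 1
        cif += surv * d_cause / at_risk   # cause-specific hazard times prior survival
        surv *= 1.0 - d_any / at_risk     # update all-cause survival
        at_risk -= removed
        out.append((t, cif))
    return out

# Toy data: months on the waitlist.
data = [(3, 1), (5, 2), (6, 0), (8, 1), (10, 2), (12, 0)]
curve = cumulative_incidence(data, cause=1)
```

Because transplanted candidates can no longer drop out, the naive one-minus-Kaplan-Meier approach would overstate dropout risk; the estimator above avoids that bias.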
Subjects
Carcinoma, Hepatocellular; Liver Neoplasms; Liver Transplantation; Waiting Lists; Humans; Carcinoma, Hepatocellular/surgery; Liver Neoplasms/surgery; Male; Female; Middle Aged; Risk Assessment/methods; United States/epidemiology; Aged; Adult
ABSTRACT
With the increasing prevalence of metabolic dysfunction-associated steatotic liver disease, the use of steatotic grafts in liver transplantation (LT) and their impact on postoperative graft survival (GS) need further exploration. Analyzing adult LT recipient data (2002-2022) from the United Network for Organ Sharing database, outcomes of LT using steatotic (≥30% macrosteatosis) and nonsteatotic donor livers, donors after circulatory death, and standard-risk older donors (age 45-50) were compared. GS predictors were evaluated using Kaplan-Meier and Cox regression analyses. Of the 35,345 donor livers, 8.9% (3,155) were steatotic. The initial 30-day postoperative period revealed significant challenges, with steatotic livers demonstrating inferior GS. However, the GS discrepancy between steatotic and nonsteatotic livers subsided over time (p = 0.10 at 5 y). Long-term GS was comparable or even superior for steatotic livers relative to nonsteatotic livers, conditional on surviving the initial 90 postoperative days (p = 0.90 at 1 y) or 1 year (p = 0.03 at 5 y). In multivariable Cox regression analysis, a high body surface area (BSA) ratio (≥1.1) (HR 1.42, p = 0.02), calculated as donor BSA divided by recipient BSA, long cold ischemic time (≥6.5 h) (HR 1.72, p < 0.01), and recipient medical condition (intensive care unit hospitalization) (HR 2.53, p < 0.01) emerged as significant adverse prognostic factors. Among young (<40 y) steatotic donors, a high BSA ratio, diabetes, and intensive care unit hospitalization were significant indicators of a worse prognosis (p < 0.01). Our study emphasizes the initial 30-day postoperative survival challenge in LT using steatotic livers.
However, with careful donor-recipient matching (for example, avoiding steatotic donors with long cold ischemic times or high BSA ratios for recipients in the intensive care unit), immediate GS can be enhanced, and over the longer term, outcomes comparable to those of nonsteatotic livers, donation after circulatory death livers, or standard-risk older donors can be anticipated. These novel insights into decision-making criteria for steatotic liver use provide valuable guidance for clinicians.
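The BSA ratio used above is defined in the abstract as donor BSA divided by recipient BSA, with ≥1.1 treated as adverse. A minimal sketch follows; the Du Bois BSA formula and the example heights/weights are my assumptions, since the abstract does not state how BSA itself was computed.

```python
# Donor/recipient body-surface-area (BSA) ratio, as defined in the abstract
# (donor BSA divided by recipient BSA). The Du Bois formula below is an
# assumption -- the study does not state which BSA formula was used.

def bsa_du_bois(height_cm: float, weight_kg: float) -> float:
    """Du Bois & Du Bois (1916): BSA [m^2] = 0.007184 * H^0.725 * W^0.425."""
    return 0.007184 * (height_cm ** 0.725) * (weight_kg ** 0.425)

def bsa_ratio(donor_h, donor_w, recip_h, recip_w) -> float:
    """Ratio of donor to recipient BSA; >= 1.1 flagged as high in the study."""
    return bsa_du_bois(donor_h, donor_w) / bsa_du_bois(recip_h, recip_w)

# Hypothetical pairing: a large donor matched to a small recipient.
ratio = bsa_ratio(185, 95, 160, 55)
high_risk = ratio >= 1.1
```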
Subjects
Fatty Liver; Liver Transplantation; Humans; Middle Aged; Liver Transplantation/methods; Prognosis; Fatty Liver/etiology; Liver/metabolism; Tissue Donors; Graft Survival
ABSTRACT
With the Acuity Circles (AC) policy aiming to reduce disparities in liver transplantation (LT) access, the allocation of high-quality grafts has shifted, potentially affecting the use and outcomes of split LT. Data from the United Network for Organ Sharing (UNOS) database (February 4, 2016, to February 3, 2024) were analyzed, including 1,470 candidates who underwent deceased donor split LT, with 681 adult and 789 pediatric cases. The study periods were divided into pre-AC (February 4, 2016, to February 3, 2020) and post-AC (February 4, 2020, to February 3, 2024). The study assessed changes in split LT volumes and examined the impact of center practices. Both adult and pediatric split LTs decreased in the initial three years post-policy change, followed by an increase in the final year, with an overall 11.9% and 13.9% decrease between the eras. Adult female split LT cases remained consistent, ensuring access for smaller recipients. High-quality "splittable" livers were increasingly allocated to high MELD patients (MELD-Na ≥30). Despite the overall decrease in case volume, adult split LT volume increased in newly active LDLT centers, with six centers increasing LDLT volume by over 50.0%. Pediatric split LT volumes decreased despite additional priorities for pediatric candidates. The number of split LTs decreased in the initial period after the AC policy introduction, but there was a consistent need for small female candidates. In the adult population, LDLT and split LT demonstrated a synergistic effect in boosting center transplant volumes, potentially improving access for female candidates who need small grafts.
ABSTRACT
The use of older donors after circulatory death (DCD) for liver transplantation (LT) has increased over the past decade. This study examined whether outcomes of LT using older DCD donors (≥50 y) have improved with advancements in surgical/perioperative care and normothermic machine perfusion (NMP) technology. A total of 7602 DCD LT cases from the United Network for Organ Sharing database (2003-2022) were reviewed. The impact of older DCD donors on graft survival was assessed using Kaplan-Meier and hazard ratio (HR) analyses. In all, 1447 LT cases (19.0%) involved older DCD donors. Although their use decreased from 2003 to 2014, a resurgence was noted after 2015, reaching 21.9% of all DCD LTs in the last 4 years (2019-2022). Initially, 90-day and 1-year graft survival for older DCD grafts was worse than for younger DCD grafts, but this difference narrowed over time, with no statistical difference after 2015. Similarly, HRs for graft loss with older DCD grafts have recently become nonsignificant. In older DCD LT, NMP usage has increased recently, especially in cases with extended donor-recipient distances, while the median time from asystole to aortic cross-clamp has decreased. Multivariable Cox regression analyses revealed that in the early phase, asystole-to-cross-clamp time had the highest HR for graft loss in older DCD LT without NMP, whereas in the later phases, cold ischemic time (>5.5 h) was a significant predictor. LT outcomes using older DCD donors have become comparable to those using younger DCD donors, with recent HRs for graft loss becoming nonsignificant. A strategic approach could mitigate risks, including managing cold ischemic time (≤5.5 h), reducing asystole-to-cross-clamp time, and adopting NMP for longer distances. Optimal use of older DCD donors may alleviate the donor shortage.
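Graft-survival comparisons like the one above rest on the Kaplan-Meier estimator. A minimal pure-Python version is sketched below with invented toy data, not UNOS records:

```python
# Minimal Kaplan-Meier estimator (illustration of the survival comparison
# described in the abstract; the records below are invented).
# Each record: (time, observed) with observed=1 for graft loss, 0 for censoring.

def kaplan_meier(records):
    """Return [(time, S(t))], stepping down at each observed event time."""
    records = sorted(records)
    n_at_risk = len(records)
    s = 1.0
    curve = []
    i = 0
    while i < len(records):
        t = records[i][0]
        deaths = removed = 0
        while i < len(records) and records[i][0] == t:
            deaths += records[i][1]
            removed += 1
            i += 1
        if deaths:
            s *= 1.0 - deaths / n_at_risk   # multiply by conditional survival at t
            curve.append((t, s))
        n_at_risk -= removed                # events and censorings leave the risk set
    return curve

toy = [(2, 1), (4, 0), (5, 1), (7, 1), (9, 0), (11, 1)]
km = kaplan_meier(toy)
```

Censored records shrink the risk set without forcing a step down, which is exactly what lets waitlist-style data with incomplete follow-up be compared fairly between donor groups.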
ABSTRACT
BACKGROUND: The incidence of graft failure following liver transplantation (LTx) has remained consistent over time. While traditional risk scores for LTx have limited accuracy, the potential of machine learning (ML) in this area remains uncertain, despite its promise in other transplant domains. This study aims to determine ML's predictive limitations in LTx by replicating methods used in previous heart transplant research. METHODS: This study utilized the UNOS STAR database, selecting 64,384 adult patients who underwent LTx between 2010 and 2020. Gradient boosting models (XGBoost and LightGBM) were used to predict 14-, 30-, and 90-day graft failure and were compared with a conventional logistic regression model. Models were evaluated using both shuffled and rolling cross-validation (CV) methodologies, and performance was assessed using the AUC across validation iterations. RESULTS: In comparing predictive models for 14-day, 30-day, and 90-day graft survival, LightGBM consistently outperformed the other models, achieving the highest AUCs of 0.740, 0.722, and 0.700 under shuffled CV. However, under rolling CV, accuracy declined for every ML algorithm. The analysis revealed influential factors for graft survival prediction across all models, including total bilirubin, medical condition, recipient age, and donor AST, among others. Several features, such as donor age and recipient diabetes history, were important in two of the three models. CONCLUSIONS: LightGBM enhances short-term graft survival predictions post-LTx. However, due to changing medical practices and selection criteria, continuous model evaluation is essential. Future studies should focus on temporal variations, clinical implications, and model transparency for broader medical utility.
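The contrast between shuffled and rolling CV is the methodological crux of this abstract: shuffled folds let a model train on patients listed after its test cases, while rolling folds respect chronology. A small sketch follows; the helper names and the tiny year list are mine, not from the study.

```python
# Shuffled vs. rolling cross-validation splits (sketch of the evaluation
# design described in the abstract; the toy index lists are illustrative).
import random

def shuffled_folds(n, k, seed=0):
    """k random folds: each test set mixes all eras, so the model sees the future."""
    idx = list(range(n))
    random.Random(seed).shuffle(idx)
    return [idx[f::k] for f in range(k)]

def rolling_folds(years):
    """Train only on cases listed strictly before each test year -- no look-ahead."""
    folds = []
    for y in sorted(set(years))[1:]:
        train = [i for i, yr in enumerate(years) if yr < y]
        test = [i for i, yr in enumerate(years) if yr == y]
        folds.append((train, test))
    return folds

years = [2010, 2010, 2011, 2011, 2012, 2012]
folds = rolling_folds(years)
# The first rolling fold trains only on 2010 cases and tests on 2011 cases.
```

The drop in AUC under rolling CV reported above is the expected signature of practice drift: a model tuned on one policy era transfers imperfectly to the next.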
Subjects
Liver Transplantation; Adult; Humans; Liver Transplantation/adverse effects; Research Design; Algorithms; Bilirubin; Machine Learning
ABSTRACT
BACKGROUND: Donors with hyperbilirubinemia are often not utilized for liver transplantation (LT) due to concerns about potential liver dysfunction and graft survival. The potential to mitigate organ shortages using such donors remains unclear. METHODS: This study analyzed adult deceased donor data from the United Network for Organ Sharing database (2002-2022). Hyperbilirubinemia was categorized as high total bilirubin (3.0-5.0 mg/dL) and very high bilirubin (≥5.0 mg/dL) in brain-dead donors. We assessed the impact of donor hyperbilirubinemia on 3-month and 3-year graft survival, comparing these outcomes to donors after circulatory death (DCD). RESULTS: Of 138 622 donors, 3452 (2.5%) had high bilirubin and 1999 (1.4%) had very high bilirubin levels. Utilization rates for normal, high, and very high bilirubin groups were 73.5%, 56.4%, and 29.2%, respectively. No significant differences were found in 3-month and 3-year graft survival between groups. Donors with high bilirubin had superior 3-year graft survival compared to DCD (hazard ratio .83, p = .02). Factors associated with inferior short-term graft survival included recipient medical condition in intensive care unit (ICU) and longer cold ischemic time; factors associated with inferior long-term graft survival included older donor age, recipient medical condition in ICU, older recipient age, and longer cold ischemic time. Donors with ≥10% macrosteatosis in the very high bilirubin group were also associated with worse 3-year graft survival (p = .04). DISCUSSION: The study suggests that despite many grafts with hyperbilirubinemia being non-utilized, acceptable post-LT outcomes can be achieved using donors with hyperbilirubinemia. Careful selection may increase utilization and expand the donor pool without negatively affecting graft outcome.
Subjects
Liver; Tissue and Organ Procurement; Adult; Humans; Prognosis; Tissue Donors; Graft Survival; Hyperbilirubinemia/etiology; Bilirubin; Retrospective Studies
ABSTRACT
Older compatible living donor kidney transplant (CLDKT) recipients have higher mortality and death-censored graft failure (DCGF) compared to younger recipients. These risks may be amplified in older incompatible living donor kidney transplant (ILDKT) recipients who undergo desensitization and intense immunosuppression. In a 25-center cohort of ILDKT recipients transplanted between September 24, 1997, and December 15, 2016, we compared mortality, DCGF, delayed graft function (DGF), acute rejection (AR), and length of stay (LOS) between 234 older (age ≥60 years) and 1172 younger (age 18-59 years) recipients. To investigate whether the impact of age differed for ILDKT recipients compared to 17,542 CLDKT recipients, we used an interaction term to determine whether the relationship between posttransplant outcomes and transplant type (ILDKT vs CLDKT) was modified by age. Overall, older recipients had higher mortality (hazard ratio 2.07, 95% CI 1.63-2.65, P < .001), lower DCGF (hazard ratio 0.53, 95% CI 0.36-0.77, P = .001) and AR (odds ratio 0.54, 95% CI 0.39-0.74, P < .001), and similar DGF (odds ratio 1.03, 95% CI 0.46-2.33, P = .9) and LOS (incidence rate ratio 0.98, 95% CI 0.88-1.10, P = .8) compared to younger recipients. The impact of age on mortality (interaction P = .052), DCGF (interaction P = .7), AR (interaction P = .2), DGF (interaction P = .9), and LOS (interaction P = .5) was similar in ILDKT and CLDKT recipients. Age alone should not preclude eligibility for ILDKT.
Subjects
Kidney Transplantation; Humans; Aged; Middle Aged; Adolescent; Young Adult; Adult; Kidney Transplantation/adverse effects; Living Donors; Graft Survival; Graft Rejection/etiology; HLA Antigens; Risk Factors
ABSTRACT
The current liver allocation system may disadvantage younger adult recipients because it does not incorporate the donor-recipient age difference. Given the longer life expectancy of younger recipients, the influence of older donor grafts on their long-term prognosis should be elucidated. This study sought to reveal the long-term prognostic influence of the donor-recipient age difference in young adult recipients. Adult patients who received initial liver transplants from deceased donors between 2002 and 2021 were identified from the UNOS database. Young recipients (patients 45 years old or younger) were categorized into 4 groups: donor age younger than the recipient (group 1), 0-9 years older (group 2), 10-19 years older (group 3), or 20 or more years older (group 4). Older recipients were defined as patients 65 years old or older. To examine the influence of the age difference in long-term survivors, conditional graft survival analysis was conducted on both younger and older recipients. Among 91,952 transplant recipients, 15,170 (16.5%) were 45 years old or younger; of these, 6,114 (40.3%), 3,315 (21.9%), 2,970 (19.6%), and 2,771 (18.3%) fell into groups 1-4, respectively. Group 1 demonstrated the highest probability of survival, followed by groups 2, 3, and 4, in both the actual and conditional graft survival analyses. In younger recipients who survived at least 5 years post-transplant, inferior long-term survival was observed when the age difference was 10 years or more (86.9% vs. 80.6%, log-rank p < 0.01), whereas there was no difference in older recipients (72.6% vs. 74.2%, log-rank p = 0.89). In younger patients who are not in emergent need of a transplant, preferential allocation of younger donor offers would optimize organ utility by increasing postoperative graft survival time.
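Conditional graft survival, used above for the long-term survivor analysis, is the probability of reaching time t given that the graft has already survived to an earlier time s, i.e. S(t | s) = S(t) / S(s). A small illustration with an invented step curve:

```python
# Conditional graft survival from a Kaplan-Meier step curve (sketch of the
# concept used in the abstract; the curve values below are invented).

def conditional_survival(curve, s, t):
    """curve: sorted list of (time, S(time)) steps; returns S(t)/S(s)."""
    def s_at(x):
        prob = 1.0
        for time, surv in curve:
            if time <= x:
                prob = surv   # last step at or before x
            else:
                break
        return prob
    return s_at(t) / s_at(s)

# Invented step curve: 5-year survival 0.80, 10-year survival 0.64.
curve = [(1, 0.95), (5, 0.80), (10, 0.64)]
ten_given_five = conditional_survival(curve, s=5, t=10)  # 0.64 / 0.80 = 0.80
```

Conditioning on 5-year survival strips out early post-transplant mortality, which is why it isolates the late effect of donor-recipient age difference.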
Subjects
Kidney Transplantation; Liver Transplantation; Humans; Young Adult; Aged; Infant, Newborn; Infant; Child, Preschool; Child; Middle Aged; Liver Transplantation/adverse effects; Living Donors; Time Factors; Tissue Donors; Survival Analysis; Graft Survival; Age Factors; Retrospective Studies
ABSTRACT
NAFLD will soon be the most common indication for liver transplantation (LT). In NAFLD, HCC may occur at earlier stages of fibrosis and present at a more advanced tumor stage, raising concern for aggressive disease. Thus, adult LT recipients with HCC from 20 US centers transplanted between 2002 and 2013 were analyzed to determine whether NAFLD impacts recurrence-free post-LT survival. Five hundred thirty-eight (10.8%) of 4981 total patients had NAFLD. Patients with NAFLD were significantly older (63 vs. 58 years, p<0.001), had a higher body mass index (30.5 vs. 27.4, p<0.001), and were more likely to have diabetes (57.3% vs. 28.8%, p<0.001). Patients with NAFLD were less likely to receive pre-LT locoregional therapy (63.6% vs. 72.9%, p<0.001), had a higher median laboratory MELD (15 vs. 13, p<0.001) and neutrophil-lymphocyte ratio (3.8 vs. 2.9, p<0.001), and were more likely to have their maximum pre-LT alpha-fetoprotein at the time of LT (44.1% vs. 36.1%, p<0.001). NAFLD patients were more likely to have an incidental HCC on explant (19.4% vs. 10.4%, p<0.001); however, explant characteristics, including tumor differentiation and vascular invasion, did not differ between groups. Comparing NAFLD and non-NAFLD patients, the 1-, 3-, and 5-year cumulative incidence of recurrence (3.1%, 9.1%, 11.5% vs. 4.9%, 10.1%, 12.6%, p=0.36) and recurrence-free survival rates (87%, 76%, and 67% vs. 87%, 75%, and 67%, p=0.97) were not different. In competing risks analysis, NAFLD did not significantly impact recurrence in either univariable (HR: 0.88, p=0.36) or adjusted (HR: 0.91, p=0.49) analysis. With NAFLD among the most common causes of HCC and poised to become the leading indication for LT, a better understanding of disease-specific models to predict recurrence is needed. In this NAFLD cohort, incidental HCCs were common, raising concerns about early detection.
However, despite receiving less locoregional therapy and having higher neutrophil-lymphocyte ratios, explant tumor characteristics and post-transplant recurrence-free survival did not differ from those of non-NAFLD patients.
Subjects
Carcinoma, Hepatocellular; Liver Neoplasms; Liver Transplantation; Non-alcoholic Fatty Liver Disease; Adult; Humans; Carcinoma, Hepatocellular/epidemiology; Carcinoma, Hepatocellular/surgery; Carcinoma, Hepatocellular/pathology; Liver Neoplasms/epidemiology; Liver Neoplasms/surgery; Liver Neoplasms/pathology; Non-alcoholic Fatty Liver Disease/complications; Non-alcoholic Fatty Liver Disease/epidemiology; Non-alcoholic Fatty Liver Disease/surgery; Liver Transplantation/adverse effects; Retrospective Studies; Neoplasm Recurrence, Local/pathology; Risk Factors
ABSTRACT
HCC recurrence following liver transplantation (LT) is highly morbid and occurs despite strict patient selection criteria. Individualized prediction of post-LT HCC recurrence risk remains an important need. Clinico-radiologic and pathologic data of 4981 patients with HCC undergoing LT from the US Multicenter HCC Transplant Consortium (UMHTC) were analyzed to develop a REcurrent Liver cAncer Prediction ScorE (RELAPSE). Multivariable Fine and Gray competing risk analysis and machine learning algorithms (Random Survival Forest and Classification and Regression Tree models) identified variables to model HCC recurrence. RELAPSE was externally validated in 1160 HCC LT recipients from the European Hepatocellular Cancer Liver Transplant study group. Of 4981 UMHTC patients with HCC undergoing LT, 71.9% were within Milan criteria, 16.1% were initially beyond Milan criteria with 9.4% downstaged before LT, and 12.0% had incidental HCC on explant pathology. Overall and recurrence-free survival at 1, 3, and 5 years was 89.7%, 78.6%, and 69.8% and 86.8%, 74.9%, and 66.7%, respectively, with a 5-year incidence of HCC recurrence of 12.5% (median 16 months) and non-HCC mortality of 20.8%. A multivariable model identified maximum alpha-fetoprotein (HR = 1.35 per log-SD, 95% CI 1.22-1.50, p < 0.001), neutrophil-lymphocyte ratio (HR = 1.16 per log-SD, 95% CI 1.04-1.28, p < 0.006), pathologic maximum tumor diameter (HR = 1.53 per log-SD, 95% CI 1.35-1.73, p < 0.001), microvascular (HR = 2.37, 95% CI 1.87-2.99, p < 0.001) and macrovascular (HR = 3.38, 95% CI 2.41-4.75, p < 0.001) invasion, and tumor differentiation (moderate HR = 1.75, 95% CI 1.29-2.37, p < 0.001; poor HR = 2.62, 95% CI 1.54-3.32, p < 0.001) as independent variables predicting post-LT HCC recurrence (C-statistic = 0.78). Machine learning algorithms incorporating additional covariates improved prediction of recurrence (Random Survival Forest C-statistic = 0.81).
Despite significant differences in European Hepatocellular Cancer Liver Transplant recipient radiologic, treatment, and pathologic characteristics, external validation of RELAPSE demonstrated consistent 2- and 5-year recurrence risk discrimination (AUCs 0.77 and 0.75, respectively). We developed and externally validated a RELAPSE score that accurately discriminates post-LT HCC recurrence risk and may allow for individualized post-LT surveillance, immunosuppression modification, and selection of high-risk patients for adjuvant therapies.
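The arithmetic behind a multivariable Cox score such as RELAPSE can be sketched by combining reported hazard ratios into a linear predictor, where each covariate contributes log(HR) × value. The HRs below are the ones quoted in the abstract, but the per-log-SD standardization and baseline hazard are simplified away, so this illustrates the mechanics only, not the published score:

```python
# Linear predictor of a Cox-style risk score built from published hazard
# ratios (illustrative only; standardization of the continuous covariates
# is omitted, so this is not the actual RELAPSE score).
import math

HAZARD_RATIOS = {
    "afp_log_sd": 1.35,       # maximum alpha-fetoprotein, per log-SD
    "nlr_log_sd": 1.16,       # neutrophil-lymphocyte ratio, per log-SD
    "diam_log_sd": 1.53,      # pathologic max tumor diameter, per log-SD
    "microvascular": 2.37,    # 0/1 indicator
    "macrovascular": 3.38,    # 0/1 indicator
    "grade_moderate": 1.75,   # 0/1 indicator
    "grade_poor": 2.62,       # 0/1 indicator
}

def linear_predictor(covariates: dict) -> float:
    """Sum of log(HR) * covariate value over the supplied covariates."""
    return sum(math.log(HAZARD_RATIOS[k]) * v for k, v in covariates.items())

def relative_hazard(covariates: dict) -> float:
    """Hazard relative to a reference patient with all covariates at 0."""
    return math.exp(linear_predictor(covariates))

# Hypothetical patient: AFP one log-SD above reference, microvascular
# invasion present, poorly differentiated tumor.
rh = relative_hazard({"afp_log_sd": 1.0, "microvascular": 1, "grade_poor": 1})
```

Because the predictor is additive in log-hazard, the relative hazard of this hypothetical patient is simply the product 1.35 × 2.37 × 2.62 of the individual HRs.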
Subjects
Carcinoma, Hepatocellular; Liver Neoplasms; Liver Transplantation; Humans; Liver Transplantation/adverse effects; Risk Factors; Neoplasm Recurrence, Local/pathology; Retrospective Studies; Recurrence
ABSTRACT
BACKGROUND: Current success in transplant oncology for select liver tumors, such as hepatocellular carcinoma, has ignited international interest in liver transplantation (LT) as a therapeutic option for nonresectable colorectal liver metastases (CRLM). In the United States, the CRLM LT experience is limited to reports from a handful of centers. This study was designed to summarize donor, recipient, and transplant center characteristics and posttransplant outcomes for the indication of CRLM. METHODS: Adult, primary LT patients listed between December 2017 and March 2022 were identified using the United Network for Organ Sharing (UNOS) database. LT for CRLM was identified from the variables "DIAG_OSTXT," "DGN_OSTXT_TCR," "DGN2_OSTXT_TCR," and "MALIG_TY_OSTXT." RESULTS: During the study period, 64 patients were listed and 46 received LT for CRLM at 15 centers. Of the 46 patients, 26 (56.5%) underwent living donor LT (LDLT) and 20 (43.5%) underwent deceased donor LT (DDLT). The median laboratory MELD-Na score was statistically similar between the LDLT and DDLT groups at the time of listing (8 vs. 9, P = 0.14) and at the time of LT (8 vs. 12, P = 0.06). The 1-, 2-, and 3-year disease-free survival rates were 75.1%, 53.7%, and 53.7%, and the corresponding overall survival rates were 89.0%, 60.4%, and 60.4%, respectively. CONCLUSIONS: This first comprehensive U.S. analysis of LT for CRLM suggests burgeoning interest at high-volume U.S. transplant centers. Strategies to optimize patient selection are limited by the scarce oncologic history provided in UNOS data, warranting a separate registry to study LT for CRLM.
Subjects
Colorectal Neoplasms; Liver Neoplasms; Liver Transplantation; Adult; Humans; United States; Retrospective Studies; Liver Neoplasms/surgery; Living Donors; Colorectal Neoplasms/surgery; Receptors, Antigen, T-Cell; Treatment Outcome
ABSTRACT
INTRODUCTION: Liver transplantation is a highly successful treatment for liver failure and disease. However, demand continues to outstrip our ability to provide transplantation as a treatment. Many livers initially considered for transplantation are not used because of concerns about their viability or logistical issues. Recent clinical trials have shown that discarded livers may be viable if they undergo machine perfusion, which allows a more objective assessment of liver quality. METHODS: Using the Scientific Registry of Transplant Recipients dataset, we examined discarded and unretrieved organs to determine their eligibility for perfusion. We then used a Markov decision-analytic model to perform a cost-effectiveness analysis of two competing transplant strategies: static cold storage (SCS) alone versus SCS plus normothermic machine perfusion (NMP) of discarded organs. RESULTS: The average predicted number of successful transplants after perfusion was 385, representing a 5.8% increase in the annual yield of liver transplants. Our cost-effectiveness analysis found that the SCS strategy generated 4.64 quality-adjusted life years (QALYs) and cost $479,226. The combined SCS + NMP strategy generated 4.72 QALYs and cost $481,885, giving an incremental cost-effectiveness ratio of $33,575 per additional QALY over the 10-year study horizon. CONCLUSIONS: Machine perfusion of livers currently not considered viable for transplant could increase the number of transplantable grafts by approximately 5% per year and is cost-effective compared with static cold storage alone.
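The incremental cost-effectiveness ratio (ICER) reported above is simply the extra cost of the SCS + NMP strategy divided by the extra QALYs it generates. Recomputing it from the rounded figures in the abstract:

```python
# ICER = (cost_new - cost_old) / (QALY_new - QALY_old), using the abstract's
# figures for SCS alone vs. SCS + NMP of discarded organs.

def icer(cost_new: float, qaly_new: float, cost_old: float, qaly_old: float) -> float:
    """Incremental cost per additional quality-adjusted life year."""
    return (cost_new - cost_old) / (qaly_new - qaly_old)

value = icer(481_885, 4.72, 479_226, 4.64)
# About $33,240 per QALY from the rounded published figures; the paper
# reports $33,575, the small gap reflecting rounding of costs and QALYs.
```

At either figure, the strategy falls well under the commonly cited $50,000-$100,000-per-QALY willingness-to-pay thresholds, which is what supports the abstract's cost-effectiveness conclusion.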
Subjects
Liver Transplantation; Organ Preservation; Humans; Liver; Tissue Donors; Perfusion
ABSTRACT
BACKGROUND: The liver allocation system adopted acuity circles (AC) beginning in 2020. In this study, we sought to evaluate the effect of the AC policy on center transplant volumes from geographic and center-practice perspectives. METHODS: Using US national registry data between 2018 and 2022, adult liver transplantations (LTs) were separated into two eras: before and after AC. RESULTS: The number of LTs for Model for End-Stage Liver Disease (MELD) scores ≥29 increased significantly, by 10%, and waitlist times for those patients were significantly shorter after AC. These benefits were not found in patients with MELD scores <29. The geographic distribution of transplant centers reveals that the majority of centers that increased their transplant volume (18 of 25) are located in high-population states, while seven are in non-high-population states. The centers in non-high-population states utilized more marginal donation after brain death (DBD) and donation after circulatory death (DCD) donors, by 27% and 155%, respectively. MELD scores were significantly lower in the non-high-population states than in the high-population states (p < .01). CONCLUSION: AC improved LT access for patients with MELD scores ≥29, which benefited the high-population states. However, aggressive center practices utilizing marginal DBD and DCD donors were able to increase transplant volume and lower median allocation MELD scores.
Subjects
End Stage Liver Disease; Liver Transplantation; Tissue and Organ Procurement; Adult; Humans; End Stage Liver Disease/surgery; Severity of Illness Index; Tissue Donors; Brain Death; Retrospective Studies
ABSTRACT
BACKGROUND: Despite advancements in liver transplantation (LT) over the past two decades, liver re-transplantation (re-LT) presents challenges. This study aimed to assess improvements in re-LT outcomes and contributing factors. METHODS: Data from the United Network for Organ Sharing database (2002-2021) were analyzed, with recipients categorized into 4-year intervals. Trends in re-LT characteristics and postoperative outcomes were evaluated. RESULTS: Of 128,462 LT patients, 7254 received re-LT. Graft survival (GS) for re-LT improved (91.3%, 82.1%, and 70.8% at 30 days, 1 year, and 3 years post-LT from 2018 to 2021). However, hazard ratios (HRs) for graft loss remained elevated compared with marginal donors, including donors after circulatory death (DCD), although the difference in HRs narrowed for long-term GS. Changes in the causes of re-LT included a reduction in hepatitis C recurrence and an increase in graft failure after primary LT involving DCD. Recent trends included decreased cold ischemic time (CIT) and increased distance from the donor hospital in the re-LT group, whereas the DCD cohort exhibited a smaller increase in distance and a more marked decrease in CIT. The shortest CIT was recorded in the urgent re-LT group. The highest Model for End-Stage Liver Disease score was observed in the urgent re-LT group, and the lowest in the DCD group. Analysis revealed that a shorter interval between the previous LT and relisting led to worse outcomes and that the cause of primary graft failure influenced overall survival after re-LT. DISCUSSION: While short-term re-LT outcomes have improved, challenges persist compared with DCD. Further enhancements are required, with ongoing research focusing on optimizing risk stratification models and allocation systems for better LT outcomes.
Subjects
End Stage Liver Disease; Liver Transplantation; Tissue and Organ Procurement; Humans; End Stage Liver Disease/surgery; Severity of Illness Index; Tissue Donors; Graft Survival; Retrospective Studies
ABSTRACT
BACKGROUND: During the donor hepatectomy time (dHT), defined as the time from the start of cold perfusion to the end of the hepatectomy, liver grafts are at a suboptimal temperature. The aim of this study was to analyze the impact of prolonged dHT on outcomes in donation after circulatory death (DCD) liver transplantation (LT). METHODS: Using US national registry data between 2012 and 2020, DCD LT patients were separated into two groups based on their dHT: standard dHT (<42 min) and prolonged dHT (≥42 min). RESULTS: There were 3810 DCD LTs during the study period, with a median dHT of 32 min (interquartile range 25-41 min). Kaplan-Meier graft survival curves demonstrated inferior 1-year outcomes in the prolonged dHT group compared with the standard dHT group (85.3% vs. 89.9%; P < .01). Multivariable Cox proportional hazards models for 1-year graft survival identified prolonged dHT [hazard ratio (HR) 1.46, 95% confidence interval (CI) 1.19-1.79], recipient age ≥64 years (HR 1.40, 95% CI 1.14-1.72), and MELD score ≥24 (HR 1.43, 95% CI 1.16-1.76) as significant predictors of 1-year graft loss. Spline analysis showed that the effect of dHT on the risk of 1-year graft loss increased in slope beyond the median dHT of 32 min. CONCLUSION: Prolonged dHTs significantly reduced graft and patient survival after DCD LT. Because dHT is a modifiable factor, donor surgeons should proceed with caution, targeting a dHT of <32 min.
Subjects
Liver Transplantation; Tissue and Organ Procurement; Hepatectomy; Humans; Liver; Middle Aged; Registries; Retrospective Studies
ABSTRACT
BACKGROUND: Donor livers undergo subjective pathologist review of steatosis before transplantation to mitigate the risk of early allograft dysfunction (EAD). We developed an objective computer vision artificial intelligence (CVAI) platform to score donor liver steatosis and compared its capability for predicting EAD against pathologist steatosis scores. METHODS: Two pathologists scored digitized donor liver biopsy slides from 2014 to 2019. We trained four CVAI platforms with a 1:99 training:prediction split. Mean intersection-over-union (IU) characterized CVAI model accuracy. We defined EAD using liver function tests within 1 week of transplantation. We calculated separate EAD logistic regression models for CVAI and pathologist steatosis and compared the models' discrimination and internal calibration. RESULTS: From 90 liver biopsies, 25,494 images were used to train the CVAI models, yielding a peak mean IU of 0.80. CVAI steatosis scores were lower than pathologist scores (median 3% vs 20%, P < 0.001). Among 41 transplanted grafts, 46% developed EAD. The median CVAI steatosis score was higher for those with EAD (2.9% vs 1.9%, P = 0.02). CVAI steatosis was independently associated with EAD after adjusting for donor age, donor diabetes, and MELD score (aOR = 1.34, 95% CI = 1.03-1.75, P = 0.03). CONCLUSION: The CVAI steatosis EAD model demonstrated slightly better calibration than pathologist steatosis, meriting further investigation into which modality most accurately and reliably predicts post-transplantation outcomes.
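Mean intersection-over-union (IU), the segmentation accuracy metric quoted above, is the per-class ratio of overlapping to combined pixels, averaged over classes. A toy sketch with invented 8-pixel masks (the study's slide images are not reproduced here):

```python
# Mean intersection-over-union (IU) for segmentation masks (toy sketch of
# the accuracy metric named in the abstract; masks below are invented).

def mean_iou(pred, truth, classes):
    """Per-class IoU averaged over classes; pred/truth are equal-length label lists."""
    ious = []
    for c in classes:
        inter = sum(1 for p, t in zip(pred, truth) if p == c and t == c)
        union = sum(1 for p, t in zip(pred, truth) if p == c or t == c)
        if union:                      # skip classes absent from both masks
            ious.append(inter / union)
    return sum(ious) / len(ious)

# Toy 8-pixel masks: class 0 = background, class 1 = steatotic droplet.
pred  = [0, 0, 1, 1, 1, 0, 0, 1]
truth = [0, 0, 1, 1, 0, 0, 1, 1]
score = mean_iou(pred, truth, classes=[0, 1])
```

An IU of 1.0 means pixel-perfect overlap, so the reported peak mean IU of 0.80 indicates substantial but imperfect agreement with the annotated droplets.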
Subjects
Fatty Liver; Liver Transplantation; Allografts; Artificial Intelligence; Fatty Liver/diagnosis; Fatty Liver/pathology; Graft Survival; Humans; Liver/pathology; Liver Transplantation/adverse effects; Liver Transplantation/methods; Living Donors; Risk Factors
ABSTRACT
Incompatible living donor kidney transplant recipients (ILDKTr) have pre-existing donor-specific antibody (DSA) that, despite desensitization, may persist or reappear with resulting consequences, including delayed graft function (DGF) and acute rejection (AR). To quantify the risk of DGF and AR in ILDKT and downstream effects, we compared 1406 ILDKTr to 17,542 compatible LDKT recipients (CLDKTr) using a 25-center cohort with novel SRTR linkage. We characterized DSA strength as positive Luminex, negative flow crossmatch (PLNF); positive flow, negative cytotoxic crossmatch (PFNC); or positive cytotoxic crossmatch (PCC). DGF occurred in 3.1% of CLDKT, 3.5% of PLNF, 5.7% of PFNC, and 7.6% of PCC recipients, which translated to higher DGF for PCC recipients (aOR = 1.68, 95% CI 1.03-2.72). However, the impact of DGF on mortality and DCGF risk was no higher for ILDKT than CLDKT (p interaction > .1). AR developed in 8.4% of CLDKT, 18.2% of PLNF, 21.3% of PFNC, and 21.7% of PCC recipients, which translated to higher AR (aOR: PLNF = 2.09, 95% CI 1.45-3.02; PFNC = 2.40, 95% CI 1.67-3.46; PCC = 2.24, 95% CI 1.48-3.37). Although the impact of AR on mortality was no higher for ILDKT than CLDKT (p interaction = .1), its impact on DCGF risk was less consequential for ILDKT (aHR = 1.62, 95% CI 1.34-1.95) than CLDKT (aHR = 2.29, 95% CI 1.96-2.67) (p interaction = .004). Providers should consider these risks during preoperative counseling, and strategies to mitigate them should be considered.
Subjects
Kidney Transplantation , Delayed Graft Function/etiology , Graft Rejection/etiology , Graft Survival , Humans , Kidney Transplantation/adverse effects , Living Donors , Retrospective Studies , Risk Factors
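The adjusted odds ratios above come with 95% confidence intervals (e.g. aOR = 1.68, 95% CI 1.03-2.72 for DGF in PCC recipients). As a sketch of the unadjusted analogue of such an estimate, here is an odds ratio with a Wald-type CI computed from a 2x2 table; the counts below are hypothetical, not the study's.

```python
import math

# Sketch: unadjusted odds ratio with a Wald 95% CI from event /
# non-event counts in exposed (a, b) and unexposed (c, d) groups.
# Hypothetical counts; the paper's aORs were covariate-adjusted.

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Return (OR, lower, upper) for a 2x2 table."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1/a + 1/b + 1/c + 1/d)   # SE of log(OR)
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi

# Hypothetical DGF counts in PCC vs compatible recipients.
or_, lo, hi = odds_ratio_ci(20, 243, 543, 16999)
print(f"OR = {or_:.2f} (95% CI {lo:.2f}-{hi:.2f})")
```

Adjustment for confounders (as in the reported aORs) would replace this table arithmetic with a multivariable logistic regression.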
The incidence of hepatocellular carcinoma (HCC) is growing in the United States, especially among the elderly. Older patients are increasingly undergoing transplantation for HCC, but the impact of advancing age on long-term posttransplant outcomes is not clear. To study this, we used data on 4980 patients from the US Multicenter HCC Transplant Consortium. We divided the patients into 4 groups by age at transplantation: 18 to 64 years (n = 4001), 65 to 69 years (n = 683), 70 to 74 years (n = 252), and ≥75 years (n = 44). There were no differences in HCC tumor stage, type of bridging locoregional therapy, or explant residual tumor between the groups. Older age was confirmed to be an independent and significant predictor of overall survival even after adjusting for demographic, etiologic, and cancer-related factors on multivariable analysis. A dose-response effect of age on survival was observed, with every 5-year increase in age beyond 50 years resulting in an absolute increase of 8.3% in the mortality rate. Competing risk analysis revealed that older patients experienced higher rates of non-HCC-related mortality (P = 0.004), but not HCC-related death (P = 0.24). To delineate the precise cause of death, we further analyzed a single-center cohort of patients transplanted for HCC (n = 302). Patients older than 65 years had a higher incidence of de novo cancer (18.1% versus 7.6%; P = 0.006) after transplantation and higher overall cancer-related mortality (14.3% versus 6.6%; P = 0.03). Even carefully selected elderly patients with HCC have significantly worse posttransplant survival rates, which are mostly driven by non-HCC-related causes. Minimizing immunosuppression and closer surveillance for de novo cancers can potentially improve outcomes in elderly patients transplanted for HCC.
Subjects
Carcinoma, Hepatocellular , Liver Neoplasms , Liver Transplantation , Aged , Carcinoma, Hepatocellular/epidemiology , Carcinoma, Hepatocellular/surgery , Humans , Liver Neoplasms/epidemiology , Liver Neoplasms/surgery , Liver Transplantation/adverse effects , Middle Aged , Retrospective Studies , Risk Assessment , Survival Rate , United States/epidemiology
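The competing risk analysis above separates HCC-related from non-HCC-related death rather than treating all deaths alike. A minimal sketch of the standard estimator for that setting, the Aalen-Johansen cumulative incidence function, follows; cause labels and data are illustrative (cause 0 = censored), not the consortium's.

```python
# Sketch: Aalen-Johansen cumulative incidence under competing risks,
# e.g. cause 1 = non-HCC-related death, cause 2 = HCC-related death,
# cause 0 = censored. Pure Python; toy data only.

def cumulative_incidence(times, causes, cause):
    """Return (time, CIF) steps for one cause of interest."""
    data = sorted(zip(times, causes))
    n_at_risk = len(data)
    surv = 1.0   # overall event-free survival just before each time
    cif = 0.0
    steps = []
    i = 0
    while i < len(data):
        t = data[i][0]
        d_cause = d_any = n_cens = 0
        while i < len(data) and data[i][0] == t:   # group ties at t
            if data[i][1] == 0:
                n_cens += 1
            else:
                d_any += 1
                if data[i][1] == cause:
                    d_cause += 1
            i += 1
        if d_any:
            cif += surv * d_cause / n_at_risk     # AJ increment
            surv *= 1 - d_any / n_at_risk
            steps.append((t, cif))
        n_at_risk -= d_any + n_cens
    return steps
```

Without censoring this reduces to the simple proportion of subjects failing from that cause; with censoring, the survival-weighted increments keep cause-specific incidences summing correctly, which is why it is preferred over cause-specific Kaplan-Meier curves.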
BACKGROUND AND AIMS: The Organ Procurement and Transplantation Network recently approved liver transplant (LT) prioritization for patients with hepatocellular carcinoma (HCC) beyond Milan Criteria (MC) who are down-staged (DS) with locoregional therapy (LRT). We evaluated post-LT outcomes, predictors of down-staging, and the impact of LRT in patients with beyond-MC HCC from the U.S. Multicenter HCC Transplant Consortium (20 centers, 2002-2013). APPROACH AND RESULTS: Clinicopathologic characteristics, overall survival (OS), recurrence-free survival (RFS), and HCC recurrence (HCC-R) were compared between patients within MC (n = 3,570) and beyond MC (n = 789) who were down-staged (DS, n = 465), treated with LRT and not down-staged (LRT-NoDS, n = 242), or untreated (NoLRT-NoDS, n = 82). Five-year post-LT OS and RFS were higher in MC patients (71.3% and 68.2%) than in DS patients (64.3% and 59.5%) and were lowest in NoDS patients (n = 324; 60.2% and 53.8%; overall P < 0.001). DS patients had superior RFS (60% vs. 54%, P = 0.043) and lower 5-year HCC-R (18% vs. 32%, P < 0.001) compared with NoDS patients, with further stratification by maximum radiologic tumor diameter (5-year HCC-R of 15.5% in DS/<5 cm and 39.1% in NoDS/>5 cm, P < 0.001). Multivariate predictors of down-staging included alpha-fetoprotein response to LRT, pathologic tumor number and size, and wait time >12 months. LRT-NoDS patients had greater HCC-R than NoLRT-NoDS patients (34.1% vs. 26.1%, P < 0.001), even after controlling for clinicopathologic variables (hazard ratio [HR] = 2.33, P < 0.001) and inverse probability of treatment-weighted propensity matching (HR = 1.82, P < 0.001). CONCLUSIONS: In LT recipients with HCC presenting beyond MC, successful down-staging is predicted by wait time, alpha-fetoprotein response to LRT, and tumor burden, and results in excellent post-LT outcomes, justifying expansion of LT criteria. In LRT-NoDS patients, the higher HCC-R compared with NoLRT-NoDS patients cannot be explained by clinicopathologic differences, suggesting a potentially aggravating role of LRT in patients with poor tumor biology that warrants further investigation.
Subjects
Ablation Techniques/methods , Carcinoma, Hepatocellular/therapy , End Stage Liver Disease/therapy , Liver Neoplasms/therapy , Liver Transplantation/statistics & numerical data , Neoplasm Recurrence, Local/epidemiology , Ablation Techniques/statistics & numerical data , Carcinoma, Hepatocellular/diagnosis , Carcinoma, Hepatocellular/mortality , Carcinoma, Hepatocellular/pathology , Disease-Free Survival , End Stage Liver Disease/diagnosis , End Stage Liver Disease/mortality , End Stage Liver Disease/pathology , Female , Follow-Up Studies , Humans , Liver/diagnostic imaging , Liver/pathology , Liver/radiation effects , Liver/surgery , Liver Neoplasms/diagnosis , Liver Neoplasms/mortality , Liver Neoplasms/pathology , Liver Transplantation/standards , Male , Middle Aged , Neoadjuvant Therapy/methods , Neoplasm Recurrence, Local/prevention & control , Neoplasm Staging , Radiotherapy, Adjuvant/methods , Radiotherapy, Adjuvant/statistics & numerical data , Retrospective Studies , Severity of Illness Index , Tissue and Organ Procurement/standards , Tumor Burden/radiation effects , United States/epidemiology , Waiting Lists/mortality
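The down-staging abstract above adjusts the LRT-NoDS vs. NoLRT-NoDS comparison with inverse probability of treatment weighting (IPTW). As a sketch of the weighting step only, the function below turns assumed propensity scores (e.g. from a logistic model of LRT receipt on clinicopathologic covariates) into per-patient weights; it is not the consortium's implementation.

```python
# Sketch: IPTW weights from treatment indicators and assumed propensity
# scores P(T=1 | X). Stabilized weights multiply by the marginal
# treatment probability to reduce variance. Illustrative only.

def iptw_weights(treated, propensity, stabilized=True):
    """Weight = P(T=t) / P(T=t | X) if stabilized, else 1 / P(T=t | X)."""
    p_treat = sum(treated) / len(treated)
    weights = []
    for t, ps in zip(treated, propensity):
        num = p_treat if t else 1 - p_treat
        weights.append((num if stabilized else 1.0) / (ps if t else 1 - ps))
    return weights
```

Fitting a Cox model on the reweighted pseudo-population would then yield a weighted hazard ratio analogous to the HR = 1.82 reported above.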
Segmenting cell nuclei within microscopy images is a ubiquitous task in biological research and clinical applications. Unfortunately, segmenting low-contrast overlapping objects that may be tightly packed is a major bottleneck in standard deep learning-based models. We report a Nuclear Segmentation Tool (NuSeT) based on deep learning that accurately segments nuclei across multiple types of fluorescence imaging data. Using a hybrid network consisting of U-Net and Region Proposal Networks (RPN), followed by a watershed step, we have achieved superior performance in detecting and delineating nuclear boundaries in 2D and 3D images of varying complexities. By using foreground normalization and additional training on synthetic images containing non-cellular artifacts, NuSeT improves nuclear detection and reduces false positives. NuSeT addresses common challenges in nuclear segmentation such as variability in nuclear signal and shape, limited training sample size, and sample preparation artifacts. Compared to other segmentation models, NuSeT consistently fares better in generating accurate segmentation masks and assigning boundaries for touching nuclei.
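NuSeT's accuracy is summarized above as mean intersection-over-union between predicted and ground-truth masks. A minimal per-class mean IoU on flat label masks can be sketched as follows; the 1-D toy masks are illustrative, not the paper's data.

```python
# Sketch: mean intersection-over-union (IoU / IU) over classes,
# computed on flattened label masks (0 = background, 1 = nucleus).
# Toy masks only; real evaluation would use 2D/3D image arrays.

def mean_iou(truth, pred, classes):
    ious = []
    for c in classes:
        inter = sum(t == c and p == c for t, p in zip(truth, pred))
        union = sum(t == c or p == c for t, p in zip(truth, pred))
        if union:                       # skip classes absent from both
            ious.append(inter / union)
    return sum(ious) / len(ious)

truth = [0, 0, 1, 1, 1, 0]
pred  = [0, 1, 1, 1, 0, 0]
print(mean_iou(truth, pred, [0, 1]))  # → 0.5
```

Instance-level benchmarks for touching nuclei additionally require matching predicted to ground-truth objects before computing per-object IoU, which is where the watershed step described above matters.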