ABSTRACT
The medical literature highlights differences in liver transplantation (LT) waitlist experiences among ABO blood types: type AB candidates reportedly have higher LT rates and lower mortality. Despite current liver offer guidelines, these ABO disparities persist. This study examines disparities in LT access among blood types, focusing on type AB, and seeks equitable allocation strategies. Using the United Network for Organ Sharing database (2003-2022), 170,276 waitlist candidates were retrospectively analyzed. Dual predictive analyses (an LT opportunity study and a survival study) evaluated 1-year recipient pool survival, accounting for both waitlist and post-LT survival, alongside the anticipated allocation value per recipient under 6 scenarios. Of the cohort, 97,670 patients (57.2%) underwent LT. Type AB recipients had the highest LT rate (73.7% vs 55.2% for O), the shortest median waiting time (90 vs 198 days for A), and the lowest waitlist mortality (12.9% vs 23.9% for O), with the lowest median model for end-stage liver disease-sodium (MELD-Na) score (20 vs 25 for A/O). The LT opportunity study revealed that reallocating type A (or A and O) donors originally directed to AB recipients back to A recipients yielded the greatest reduction in the disparity in anticipated value per recipient, from 0.19 (before modification) to 0.08. Meanwhile, the survival study showed that ABO-identical LT reduced disparity the most (3.5% to 2.8%). Sensitivity analysis confirmed that these findings were specific to the MELD-Na score < 30 population, indicating that current LT allocation may favor certain blood types. Prioritizing ABO-identical LT for recipients with a MELD-Na score < 30 could ensure uniform survival outcomes and mitigate disparities.
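The abstract does not give the exact formula behind the "anticipated allocation value per recipient," so the following is only a minimal sketch of how one reallocation scenario could be scored: the disparity metric is taken here to be the spread in expected transplants per waitlisted candidate across blood groups, and the scenario moves type A donors that would have gone to AB recipients back to the A pool. All counts and the reallocated fraction are hypothetical, not figures from the study.

```python
import pandas as pd

# Hypothetical waitlist sizes and annual donor offers per ABO group (not the UNOS figures).
pool = pd.DataFrame(
    {"candidates": [56_000, 6_000, 18_000, 70_000],
     "offers":     [33_000, 4_500, 11_500, 39_000]},
    index=["A", "AB", "B", "O"],
)

def anticipated_value(offers, candidates):
    """Expected transplants per waitlisted candidate in each blood group."""
    return offers / candidates

def disparity(values):
    """Spread between the best- and worst-served blood groups."""
    return values.max() - values.min()

baseline = anticipated_value(pool["offers"], pool["candidates"])
print("baseline disparity:", round(disparity(baseline), 3))

# Scenario: type A donors previously allocated to AB recipients go back to A candidates.
# The 0.3 share of AB offers assumed to come from A donors is purely illustrative.
shifted = pool["offers"].copy()
moved = 0.3 * shifted["AB"]
shifted["AB"] -= moved
shifted["A"] += moved
scenario = anticipated_value(shifted, pool["candidates"])
print("scenario disparity:", round(disparity(scenario), 3))
```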
ABSTRACT
OBJECTIVE: To evaluate long-term oncologic outcomes of patients after living donor liver transplantation (LDLT) within and outside standard transplantation selection criteria, and the added value of incorporating the New York-California (NYCA) score. BACKGROUND: LDLT offers an opportunity to decrease the liver transplantation waitlist, reduce waitlist mortality, and expand selection criteria for patients with hepatocellular carcinoma (HCC). METHODS: Primary adult LDLT recipients between October 1999 and August 2019 were identified from a multicenter cohort of 12 North American centers. Posttransplantation and recurrence-free survival were evaluated using the Kaplan-Meier method. RESULTS: Three hundred sixty LDLTs were identified. Patients within the Milan criteria (MC) at transplantation had 1-, 5-, and 10-year posttransplantation survival of 90.9%, 78.5%, and 64.1%, versus 90.4%, 68.6%, and 57.7% outside MC (P = 0.20), respectively. For patients within the University of California San Francisco (UCSF) criteria, respective posttransplantation survival was 90.6%, 77.8%, and 65.0%, versus 92.1%, 63.8%, and 45.8% outside UCSF (P = 0.08). Fifty-three (83%) patients classified as outside MC at transplantation would have been classified as either low or acceptable risk with the NYCA score; these patients had a 5-year overall survival of 72.2%. Similarly, 28 (80%) patients classified as outside UCSF at transplantation would have been classified as low or acceptable risk, with a 5-year overall survival of 65.3%. CONCLUSIONS: Long-term survival is excellent for patients with HCC undergoing LDLT within and outside selection criteria, exceeding the minimum recommended 5-year rate of 60% proposed by consensus guidelines. The NYCA categorization offers insight into identifying a substantial proportion of patients with HCC outside the MC and the UCSF criteria who still achieve post-LDLT outcomes similar to those of patients within the criteria.
Subjects
Carcinoma, Hepatocellular; Liver Neoplasms; Liver Transplantation; Adult; Humans; Liver Transplantation/methods; Living Donors; Neoplasm Recurrence, Local/etiology; Patient Selection; North America; Retrospective Studies; Treatment Outcome
ABSTRACT
BACKGROUND & AIMS: Continuous risk-stratification of candidates and urgency-based prioritization have been utilized for liver transplantation (LT) in patients with non-hepatocellular carcinoma (HCC) in the United States. Instead, for patients with HCC, a dichotomous criterion with exception points is still used. This study evaluated the utility of the hazard associated with LT for HCC (HALT-HCC), an oncological continuous risk score, to stratify waitlist dropout and post-LT outcomes. METHODS: A competing risk model was developed and validated using the UNOS database (2012-2021) through multiple policy changes. The primary outcome was to assess the discrimination ability of waitlist dropouts and LT outcomes. The study focused on the HALT-HCC score, compared with other HCC risk scores. RESULTS: Among 23,858 candidates, 14,646 (59.9%) underwent LT and 5196 (21.8%) dropped out of the waitlist. Higher HALT-HCC scores correlated with increased dropout incidence and lower predicted 5-year overall survival after LT. HALT-HCC demonstrated the highest area under the curve (AUC) values for predicting dropout at various intervals post-listing (0.68 at 6 months, 0.66 at 1 year), with excellent calibration (R2 = 0.95 at 6 months, 0.88 at 1 year). Its accuracy remained stable across policy periods and locoregional therapy applications. CONCLUSIONS: This study highlights the predictive capability of the continuous oncological risk score to forecast waitlist dropout and post-LT outcomes in patients with HCC, independent of policy changes. The study advocates integrating continuous scoring systems like HALT-HCC in liver allocation decisions, balancing urgency, organ utility, and survival benefit.
Subjects
Carcinoma, Hepatocellular; Liver Neoplasms; Liver Transplantation; Waiting Lists; Humans; Carcinoma, Hepatocellular/surgery; Liver Neoplasms/surgery; Male; Female; Middle Aged; Risk Assessment/methods; United States/epidemiology; Aged; Adult
ABSTRACT
There is no recent update on the clinical course of retransplantation (re-LT) after living donor liver transplantation (LDLT) in the US using recent national data. The UNOS database (2002-2023) was used to explore patient characteristics at initial LT, comparing deceased donor liver transplantation (DDLT) and LDLT for graft survival (GS), reasons for graft failure, and GS after re-LT. Waitlist dropout and the likelihood of re-LT were also assessed, with the re-LT cohort categorized as acute or chronic based on time to re-listing (≤1 mo or >1 mo). Of 132,323 DDLT and 5955 LDLT initial transplants, 3848 DDLT and 302 LDLT recipients underwent re-LT. Of the 302 re-LTs following LDLT, 156 were acute and 146 chronic. Primary nonfunction (PNF) was more common in DDLT, although the difference was not statistically significant (17.4% vs. 14.8% for LDLT; p = 0.52). Vascular complications were significantly more frequent in LDLT (12.5% vs. 8.3% for DDLT; p < 0.01). In acute re-LT, the difference in PNF between DDLT and LDLT was larger (49.7% vs. 32.0%; p < 0.01). Status 1 patients were more common in DDLT (51.3% vs. 34.0% in LDLT; p < 0.01). In the acute cohort, Kaplan-Meier curves indicated superior GS after re-LT for initial LDLT recipients in both the short term and the long term (p = 0.02 and p < 0.01, respectively), with no significant difference in the chronic cohort. No significant differences in waitlist dropout were observed, but the initial LDLT group had a higher likelihood of re-LT in the acute cohort (sHR 1.40, p < 0.01). A sensitivity analysis focusing on the most recent 10-year cohort revealed trends consistent with the overall findings. LDLT recipients had better GS after re-LT than DDLT recipients. Despite a higher severity of illness, the DDLT cohort was less likely to undergo re-LT.
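A minimal sketch of the acute-versus-chronic categorization and the Kaplan-Meier/log-rank comparison described above, using the lifelines package. The dataframe columns and the toy values are assumptions for illustration, not UNOS variable names.

```python
import pandas as pd
from lifelines import KaplanMeierFitter
from lifelines.statistics import logrank_test

# Hypothetical re-LT records; column names are illustrative assumptions.
relt = pd.DataFrame({
    "days_to_relisting":   [10, 400, 25, 15, 900, 5, 60, 200],
    "initial_graft":       ["LDLT", "DDLT", "DDLT", "LDLT", "LDLT", "DDLT", "DDLT", "LDLT"],
    "graft_survival_days": [300, 1200, 150, 800, 2000, 90, 600, 1500],
    "graft_loss":          [1, 0, 1, 0, 0, 1, 1, 0],
})

# Acute vs. chronic re-LT by time to re-listing (<=30 days vs. >30 days).
relt["cohort"] = (relt["days_to_relisting"] <= 30).map({True: "acute", False: "chronic"})

acute = relt[relt["cohort"] == "acute"]
ldlt = acute[acute["initial_graft"] == "LDLT"]
ddlt = acute[acute["initial_graft"] == "DDLT"]

kmf = KaplanMeierFitter()
for name, grp in (("initial LDLT", ldlt), ("initial DDLT", ddlt)):
    kmf.fit(grp["graft_survival_days"], grp["graft_loss"], label=name)
    print(name, "median GS (days):", kmf.median_survival_time_)

res = logrank_test(ldlt["graft_survival_days"], ddlt["graft_survival_days"],
                   ldlt["graft_loss"], ddlt["graft_loss"])
print("log-rank p =", round(res.p_value, 3))
```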
Subjects
Databases, Factual; Graft Survival; Liver Transplantation; Living Donors; Reoperation; Waiting Lists; Humans; Liver Transplantation/statistics & numerical data; Liver Transplantation/adverse effects; Liver Transplantation/methods; Living Donors/statistics & numerical data; Female; Male; United States/epidemiology; Reoperation/statistics & numerical data; Middle Aged; Adult; Databases, Factual/statistics & numerical data; Waiting Lists/mortality; Treatment Outcome; Time Factors; Aged; Graft Rejection/epidemiology; Graft Rejection/etiology; Graft Rejection/prevention & control; Risk Factors
ABSTRACT
With the rising prevalence of metabolic dysfunction-associated steatotic liver disease, the use of steatotic grafts in liver transplantation (LT) and their impact on postoperative graft survival (GS) need further exploration. Analyzing adult LT recipient data (2002-2022) from the United Network for Organ Sharing database, outcomes of LT using steatotic (≥30% macrosteatosis) and nonsteatotic donor livers, donors after circulatory death, and standard-risk older donors (age 45-50) were compared. GS predictors were evaluated using Kaplan-Meier and Cox regression analyses. Of the 35,345 LT donors, 8.9% (3,155) were steatotic livers. The initial 30-day postoperative period revealed significant challenges with steatotic livers, which demonstrated inferior GS. However, the GS discrepancy between steatotic and nonsteatotic livers subsided over time (p = 0.10 at 5 y). Long-term GS was comparable or even superior for steatotic livers relative to nonsteatotic livers, conditional on surviving the initial 90 postoperative days (p = 0.90 at 1 y) or the first year (p = 0.03 at 5 y). In the multivariable Cox regression analysis, a high body surface area (BSA) ratio (≥1.1; HR 1.42, p = 0.02), calculated as donor BSA divided by recipient BSA, a long cold ischemic time (≥6.5 h; HR 1.72, p < 0.01), and recipient medical condition (intensive care unit hospitalization; HR 2.53, p < 0.01) emerged as significant adverse prognostic factors. Among young (<40 y) steatotic donors, a high BSA ratio, diabetes, and intensive care unit hospitalization were significant indicators of a worse prognosis (p < 0.01). Our study emphasizes the 30-day postoperative survival challenge of LT using steatotic livers. However, with careful donor-recipient matching, for example, avoiding steatotic donors with long cold ischemic times and high BSA ratios for recipients in the intensive care unit, immediate GS can be improved, and in the longer term, outcomes comparable to those using nonsteatotic livers, donation after circulatory death livers, or standard-risk older donors can be anticipated. These novel insights into decision-making criteria for steatotic liver use provide valuable guidance for clinicians.
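The donor/recipient BSA ratio and risk-factor thresholds named above can be operationalized as a simple matching check. The sketch below uses the Mosteller formula for BSA purely as an illustration (the abstract does not state which BSA formula underlies the UNOS-derived values); the function name and inputs are hypothetical.

```python
import math

def bsa_mosteller(height_cm: float, weight_kg: float) -> float:
    """Body surface area (m^2); the Mosteller formula is used here for
    illustration -- the abstract does not specify the BSA formula."""
    return math.sqrt(height_cm * weight_kg / 3600.0)

def steatotic_graft_flags(donor_h, donor_w, recip_h, recip_w,
                          cold_ischemia_h, recipient_in_icu):
    """Return the donor/recipient BSA ratio and the adverse-prognosis flags
    reported for steatotic grafts (ratio >= 1.1, CIT >= 6.5 h, ICU recipient)."""
    ratio = bsa_mosteller(donor_h, donor_w) / bsa_mosteller(recip_h, recip_w)
    flags = {
        "high_bsa_ratio": ratio >= 1.1,
        "long_cold_ischemia": cold_ischemia_h >= 6.5,
        "recipient_icu": recipient_in_icu,
    }
    return ratio, flags

ratio, flags = steatotic_graft_flags(185, 110, 165, 60, 7.2, True)
print(f"BSA ratio = {ratio:.2f}", flags)
```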
Subjects
Fatty Liver; Liver Transplantation; Humans; Middle Aged; Liver Transplantation/methods; Prognosis; Fatty Liver/etiology; Liver/metabolism; Tissue Donors; Graft Survival
ABSTRACT
BACKGROUND: The impact of transjugular intrahepatic portosystemic shunt (TIPS) on waitlist mortality and liver transplantation (LT) urgency in Budd-Chiari syndrome (BCS) patients remains unclear. METHODS: We analyzed BCS patients listed for LT in the UNOS database (2002-2024) to assess the impact of TIPS on waitlist mortality and LT access via competing-risk analysis. We compared trends across two phases: Phase 1 (2002-2011) and Phase 2 (2012-2024). RESULTS: Of 815 BCS patients, 263 (32.3%) had received TIPS at listing. The TIPS group had lower MELD-Na scores (20 vs 22, p < 0.01), milder ascites (p = 0.01), and fewer Status 1 patients (those at risk of imminent death while awaiting LT) (2.7% vs 8.3%, p < 0.01) at listing compared with those without TIPS. TIPS patients had lower LT rates (43.3% vs 56.5%, p < 0.01) and longer waitlist times (350 vs 113 d, p < 0.01). TIPS use increased in Phase 2 (64.3% vs 35.7%, p < 0.01). Of 426 transplanted patients, 134 (31.5%) had received TIPS; they showed lower MELD-Na scores (24 vs 27, p < 0.01) and better medical condition (intensive care unit: 14.9% vs 21.9%, p < 0.01) at LT. Status 1 patients were fewer (3.7% vs 12.3%, p < 0.01), with longer waiting times (97 vs 26 d, p < 0.01), in the TIPS group. TIPS use at listing increased from Phase 1 (25.6%) to Phase 2 (37.7%). From Phase 1 to Phase 2, ascites severity improved, re-LT cases decreased (Phase 1: 9.8% vs Phase 2: 2.2%, p < 0.01), and cold ischemic time slightly decreased (Phase 1: 7.0 vs Phase 2: 6.4 hours, p = 0.14). Median donor body mass index increased significantly. No significant differences were identified in patient or graft survival at 1, 5, or 10 years between phases or between TIPS and non-TIPS patients. While 90-day waitlist mortality showed no significant difference (p = 0.11), TIPS trended toward lower mortality (subhazard ratio [sHR]: 0.70 [0.45-1.08]). Multivariable analysis indicated that TIPS was a significant factor in decreasing mortality (sHR: 0.45 [0.27-0.77], p < 0.01). The TIPS group also showed significantly lower LT access (sHR: 0.65 [0.53-0.81], p < 0.01), and multivariable analysis confirmed TIPS as a significant factor in decreasing access to LT (sHR: 0.60 [0.46-0.77], p < 0.01). Subgroup analyses excluding Status 1 or HCC patients showed similar trends. CONCLUSION: In BCS patients listed for LT, TIPS reduces waitlist mortality and LT access, supporting its bridging role.
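The competing-risk framing used here (waitlist death with transplantation as a competing event) can be illustrated with a nonparametric Aalen-Johansen cumulative incidence calculation; the published sub-hazard ratios come from regression models beyond this sketch. The event codes and toy waitlist data below are assumptions for illustration only.

```python
import numpy as np

def cumulative_incidence(times, events, event_of_interest, horizon):
    """Aalen-Johansen cumulative incidence of `event_of_interest` by `horizon`.
    `events`: 0 = censored, 1 = waitlist death, 2 = transplanted (competing event)."""
    times = np.asarray(times, float)
    events = np.asarray(events, int)
    order = np.argsort(times)
    times, events = times[order], events[order]
    n, surv, cif = len(times), 1.0, 0.0
    for i, (t, e) in enumerate(zip(times, events)):
        if t > horizon:
            break
        at_risk = n - i
        if e == event_of_interest:
            cif += surv / at_risk          # increment by S(t-) * dN / n at risk
        if e != 0:
            surv *= 1.0 - 1.0 / at_risk    # update overall event-free survival
    return cif

# Hypothetical listing data (days on the waitlist and outcome codes).
tips_times,    tips_events    = [30, 90, 200, 350, 400, 500], [0, 2, 1, 2, 0, 2]
no_tips_times, no_tips_events = [20, 45, 60, 113, 150, 300], [1, 2, 1, 2, 2, 0]

for label, t, e in [("TIPS", tips_times, tips_events),
                    ("no TIPS", no_tips_times, no_tips_events)]:
    print(label, "90-day waitlist mortality:",
          round(cumulative_incidence(t, e, event_of_interest=1, horizon=90), 3))
```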
ABSTRACT
With the Acuity Circles (AC) policy aiming to reduce disparities in liver transplantation (LT) access, the allocation of high-quality grafts has shifted, potentially affecting the use and outcomes of split LT. Data from the United Network for Organ Sharing (UNOS) database (February 4, 2016, to February 3, 2024) were analyzed, including 1,470 recipients of deceased donor split LT, with 681 adult and 789 pediatric cases. The study periods were divided into pre-AC (February 4, 2016, to February 3, 2020) and post-AC (February 4, 2020, to February 3, 2024). The study assessed changes in split LT volumes and examined the impact of center practices. Both adult and pediatric split LTs decreased in the initial three years after the policy change, followed by an increase in the final year, with overall decreases of 11.9% and 13.9%, respectively, between the eras. Adult female split LT cases remained consistent, preserving access for smaller recipients. High-quality "splittable" livers were increasingly allocated to high-MELD patients (MELD-Na ≥30). Despite the overall decrease in case volume, adult split LT volume increased in newly active LDLT centers, with six centers increasing LDLT volume by over 50.0%. Pediatric split LT volumes decreased despite additional priorities for pediatric candidates. In summary, the number of split LTs decreased in the initial period after AC policy introduction, but demand from small female candidates persisted. In the adult population, LDLT and split LT demonstrated a synergistic effect in boosting center transplant volumes, potentially improving access for female candidates who need small grafts.
ABSTRACT
The use of older donors after circulatory death (DCD) for liver transplantation (LT) has increased over the past decade. This study examined whether outcomes of LT using older DCD donors (≥50 y) have improved with advancements in surgical and perioperative care and normothermic machine perfusion (NMP) technology. A total of 7602 DCD LT cases from the United Network for Organ Sharing database (2003-2022) were reviewed. The impact of older DCD donors on graft survival was assessed using Kaplan-Meier and hazard ratio (HR) analyses. In all, 1447 LT cases (19.0%) involved older DCD donors. Although their use decreased from 2003 to 2014, a resurgence was noted after 2015, reaching 21.9% of all LTs in the last 4 years (2019-2022). Initially, 90-day and 1-year graft survival was worse for older DCD grafts than for younger DCD grafts, but this difference narrowed over time, with no statistical difference after 2015. Similarly, HRs for graft loss with older DCD donors have recently become nonsignificant. In older DCD LT, NMP usage has increased recently, especially in cases with extended donor-recipient distances, while the median time from asystole to aortic cross-clamp has decreased. Multivariable Cox regression analyses revealed that in the early phase, asystole-to-cross-clamp time had the highest HR for graft loss in older DCD LT without NMP, whereas in the later phases, cold ischemic time (>5.5 h) was a significant predictor. LT outcomes using older DCD donors have become comparable to those using younger DCD donors, with recent HRs for graft loss becoming nonsignificant. A strategic approach in the recent period could mitigate risks, including managing cold ischemic time (≤5.5 h), reducing asystole-to-cross-clamp time, and adopting NMP for longer distances. Optimal use of older DCD donors may alleviate the donor shortage.
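A minimal sketch of the kind of multivariable Cox model described above, fitted with lifelines on simulated older-DCD grafts. The covariate names mirror those reported in the abstract, but the data, coding, and effect sizes are fabricated for illustration.

```python
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(1)
n = 500
# Simulated older-DCD grafts; covariates echo the abstract but values are fabricated.
df = pd.DataFrame({
    "cold_ischemia_gt_5_5h": rng.integers(0, 2, n),
    "asystole_to_clamp_min": rng.normal(20, 6, n).clip(5),
    "nmp_used":              rng.integers(0, 2, n),
})
hazard = 0.001 * np.exp(0.4 * df["cold_ischemia_gt_5_5h"]
                        + 0.03 * (df["asystole_to_clamp_min"] - 20))
df["graft_days"] = rng.exponential(1 / hazard)
df["graft_loss"] = (df["graft_days"] < 1825).astype(int)   # 5-year follow-up
df["graft_days"] = df["graft_days"].clip(upper=1825)

cph = CoxPHFitter()
cph.fit(df, duration_col="graft_days", event_col="graft_loss")
print(cph.summary[["exp(coef)", "p"]])
```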
ABSTRACT
BACKGROUND: The current liver transplantation (LT) allocation policy focuses on Model for End-Stage Liver Disease (MELD) scores, often overlooking factors such as blood type and survival benefit. Understanding the impact of blood type on survival benefit is crucial for optimizing the MELD 3.0 classification. METHODS: This study used the United Network for Organ Sharing national registry database (2003-2020) to identify LT characteristics for each ABO blood type and to determine the optimal MELD 3.0 scores for transplantation in each blood type, based on survival benefit. RESULTS: The study included LT candidates aged 18 years or older listed for LT (total N = 150,815; A: 56,546, AB: 5,841, B: 18,500, O: 69,928). Among these, 87,409 individuals (58.0%) underwent LT (A: 32,156, AB: 4,362, B: 11,786, O: 39,105). Higher transplantation rates were observed in the AB and B groups, with lower median MELD 3.0 scores at transplantation (AB: 21, B: 24 vs. A/O: 26, p < 0.01) and shorter waiting times (AB: 101 days, B: 172 days vs. A: 211 days, O: 201 days, p < 0.01). A preference for donation after cardiac death (DCD) grafts was seen in A and O recipients. Survival benefit analysis indicated that B blood type required higher MELD 3.0 scores to benefit from transplantation than A and O (donation after brain death transplantation: ≥15 in B vs. ≥11 in A/O; DCD transplantation: ≥21 in B vs. ≥11 in A, ≥15 in O). CONCLUSION: The study suggests revising the allocation policy to consider blood type for improved post-LT survival. This calls for personalized LT policies, recommending higher MELD 3.0 thresholds, particularly for individuals with type B blood.
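The abstract does not detail how the blood-type-specific MELD 3.0 thresholds were derived, so the following is only a toy illustration of the survival-benefit idea: the threshold is taken as the smallest score at which predicted survival with transplantation exceeds predicted survival while waiting. The survival functions and the blood-type "penalty" are entirely hypothetical.

```python
import numpy as np

meld = np.arange(6, 41)

def waitlist_survival_1y(meld_score):
    """Hypothetical 1-year survival without transplant, falling with MELD 3.0."""
    return np.clip(0.98 - 0.022 * (meld_score - 6), 0.05, None)

def post_lt_survival_1y(meld_score, penalty=0.0):
    """Hypothetical 1-year post-LT survival; `penalty` stands in for a
    blood-type-specific decrement (an assumption, not an estimate from the study)."""
    return np.clip(0.92 - 0.002 * (meld_score - 6) - penalty, 0.05, None)

def benefit_threshold(penalty):
    """Smallest MELD 3.0 score at which transplant survival exceeds waitlist survival."""
    gain = post_lt_survival_1y(meld, penalty) - waitlist_survival_1y(meld)
    return int(meld[np.argmax(gain > 0)])

for blood_type, penalty in {"A": 0.00, "O": 0.01, "B": 0.05}.items():
    print(blood_type, "benefit from MELD 3.0 >=", benefit_threshold(penalty))
```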
ABSTRACT
Post-liver transplant (LT) immunosuppression is necessary to prevent rejection; however, a major consequence is tumor recurrence. Although recurrence is a concern after LT for patients with HCC, the oncologically optimal tacrolimus (FK) regimen is still unknown. This retrospective study included 1406 patients with HCC who underwent LT (2002-2019) at 4 US institutions using variable post-LT immunosuppression regimens. Receiver operating characteristic analyses were performed to investigate the influence of the post-LT time-weighted average FK (TWA-FK) level on HCC recurrence. A competing risk analysis was employed to evaluate the prognostic influence of TWA-FK while adjusting for patient and tumor characteristics. The AUC for TWA-FK was greatest at 2 weeks (0.68), followed by 1 week (0.64) after LT. Importantly, this was consistently observed across institutions despite variability in immunosuppression regimens. In addition, the TWA-FK at 2 weeks was not associated with rejection within 6 months of LT. A competing risk regression analysis showed that TWA-FK at 2 weeks after LT was significantly associated with recurrence (HR: 1.31, 95% CI: 1.21-1.41, p < 0.001). The effect of TWA-FK on recurrence varied depending on the exposure level and the individual's risk of recurrence, including vascular invasion and tumor morphology. Although previous studies have explored the influence of FK levels at 1-3 months after LT on HCC recurrence, the current study suggests that earlier time points and exposure levels must be evaluated. Each patient's oncological risk must also be considered when developing an individualized immunosuppression regimen.
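One plausible reading of a time-weighted average exposure is an area-under-the-curve of trough levels divided by the observation window; the abstract does not spell out the exact TWA-FK formula, so the trapezoidal version below is an assumption, and the trough values are fabricated.

```python
import numpy as np

def time_weighted_average(days, levels, window_days):
    """Time-weighted average trough level over the first `window_days` after LT,
    computed by trapezoidal integration (one plausible reading of TWA-FK)."""
    days = np.asarray(days, float)
    levels = np.asarray(levels, float)
    mask = days <= window_days
    d, lv = days[mask], levels[mask]
    if d.size < 2:
        return float(lv.mean()) if lv.size else float("nan")
    return float(np.trapz(lv, d) / (d[-1] - d[0]))

# Hypothetical tacrolimus troughs (ng/mL) on post-LT days 1-14.
days = [1, 3, 5, 7, 10, 14]
levels = [4.2, 6.8, 8.5, 9.1, 10.3, 9.7]
print("TWA-FK, first 2 weeks:", round(time_weighted_average(days, levels, 14), 2))
```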
ABSTRACT
Colorectal cancer is the second most common cause of cancer-related death worldwide, and half of patients present with colorectal liver metastasis (CRLM). Liver transplant (LT) has emerged as a treatment modality for otherwise unresectable CRLM. Since the publication of the Lebeck-Lee systematic review in 2022, additional evidence has come to light supporting LT for CRLM in highly selected patients. This includes reports of >10-year follow-up with over 80% survival rates in low-risk patients. As these updated reports have significantly changed our collective knowledge, this article is intended to serve as an update to the 2022 systematic review to include the most up-to-date evidence on the subject.
Subjects
Colorectal Neoplasms; Liver Neoplasms; Liver Transplantation; Humans; Antineoplastic Combined Chemotherapy Protocols; Colorectal Neoplasms/pathology; Hepatectomy; Liver Neoplasms/secondary; Systematic Reviews as Topic
ABSTRACT
INTRODUCTION: Data on the clinical characteristics and disease-specific prognosis of patients with early-onset intrahepatic cholangiocarcinoma (ICC) are currently limited. METHODS: Patients undergoing hepatectomy for ICC between 2000 and 2020 were identified using a multi-institutional database. The association of early-onset (≤50 years) versus typical-onset (>50 years) ICC with recurrence-free survival (RFS) and disease-specific survival (DSS) was assessed in the multi-institutional database and validated in an external cohort. The genomic and transcriptomic profiles of early- versus typical-onset ICC were analyzed using The Cancer Genome Atlas (TCGA) and Memorial Sloan Kettering Cancer Center databases. RESULTS: Among 971 patients undergoing resection for ICC, 22.7% (n = 220) had early-onset ICC. Patients with early-onset ICC had worse 5-year RFS (24.1% vs. 29.7%, p < 0.05) and DSS (36.5% vs. 48.9%, p = 0.03) compared with patients with typical-onset ICC, despite having earlier T-stage tumors and lower rates of microvascular invasion. In the validation cohort, patients with early-onset ICC had worse 5-year RFS (7.4% vs. 20.5%, p = 0.002) than individuals with typical-onset ICC. Using the TCGA cohort, 652 genes were found to be upregulated (including ATP8A2) and 266 downregulated (including UTY and KDM5D) in early- versus typical-onset ICC. Genes frequently implicated as oncogenic drivers, including CDKN2A, IDH1, BRAF, and FGFR2, were infrequently mutated in early-onset ICC patients. CONCLUSIONS: Early-onset ICC has distinct clinical and genomic/transcriptomic features. Morphologic and clinicopathologic characteristics could not fully explain the differences in outcomes between early- and typical-onset ICC patients. The current study offers a preliminary landscape of the molecular features of early-onset ICC.
Subjects
Bile Duct Neoplasms; Cholangiocarcinoma; Humans; Bile Duct Neoplasms/genetics; Bile Duct Neoplasms/surgery; Cholangiocarcinoma/genetics; Cholangiocarcinoma/surgery; Prognosis; Gene Expression Profiling; Hepatectomy; Genomics; Bile Ducts, Intrahepatic/pathology; Minor Histocompatibility Antigens; Histone Demethylases
ABSTRACT
BACKGROUND: Recent studies have suggested that certain combinations of KRAS or BRAF biomarkers with clinical factors are associated with poor outcomes and may indicate that surgery could be "biologically" futile in otherwise technically resectable colorectal liver metastasis (CRLM). However, these combinations have yet to be validated through external studies. PATIENTS AND METHODS: We conducted a systematic search to identify these studies. The overall survival (OS) of patients with these combinations was evaluated in a cohort of patients treated at 11 tertiary centers. Additionally, the study investigated whether using high-risk KRAS point mutations in these combinations could be associated with particularly poor outcomes. RESULTS: The recommendations of four studies were validated in 1661 patients. The first three studies utilized KRAS, and their validation showed the following median and 5-year OS: (1) 30 months and 16.9%, (2) 24.3 months and 21.6%, and (3) 46.8 months and 44.4%, respectively. When analyzing only patients with high-risk KRAS mutations, median and 5-year OS decreased to: (1) 26.2 months and 0%, (2) 22.3 months and 15.1%, and (3) not reached and 44.9%, respectively. The fourth study utilized BRAF, and its validation showed a median OS of 10.4 months, with no survivors beyond 21 months. CONCLUSION: The combinations of biomarkers and clinical factors proposed to render surgery for CRLM futile, as presented in studies 1 (KRAS high-risk mutations) and 4, appear justified. In these studies, there were no long-term survivors, and survival was similar to that of historic cohorts with similar mutational profiles that received systemic therapies alone for unresectable disease.
ABSTRACT
BACKGROUND: To date, only two studies have compared the outcomes of patients with liver-limited BRAF V600E-mutated colorectal liver metastases (CRLMs) managed with resection versus systemic therapy alone, and these have reported contradictory findings. METHODS: In this observational, international, multicentre study, patients with liver-limited BRAF V600E-mutated CRLMs treated with resection or systemic therapy alone were identified from institutional databases. Patterns of recurrence/progression and overall survival (OS) were compared using multivariable analyses of the entire cohort and a propensity score-matched cohort. RESULTS: Of 170 patients included, 119 underwent hepatectomy and 51 received systemic treatment. Surgically treated patients had a more favourable pattern of recurrence, with most recurrences limited to a single site, whereas diffuse progression was more common among patients who received systemic treatment (19 versus 44%; P = 0.002). Surgically treated patients had longer median OS (35 versus 20 months; P < 0.001), and hepatectomy was independently associated with better OS than systemic treatment alone (HR 0.37, 95% c.i. 0.21 to 0.65). In the propensity score-matched cohort, surgically treated patients had longer median OS (28 versus 20 months; P < 0.001), and hepatectomy remained independently associated with better OS (HR 0.47, 0.25 to 0.88). CONCLUSION: BRAF V600E mutation should not be considered a contraindication to surgery for patients with resectable, liver-only CRLMs.
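A minimal sketch of propensity score matching of the kind used above: a logistic model for the probability of resection given covariates, followed by greedy 1:1 nearest-neighbor matching without replacement. The covariates, matching algorithm, and data are assumptions for illustration; the study's actual matching specification is not given in the abstract.

```python
import numpy as np
import pandas as pd
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(2)
n = 170
# Simulated liver-limited BRAF V600E CRLM cohort; covariates and values are illustrative.
df = pd.DataFrame({
    "age": rng.normal(62, 10, n),
    "n_lesions": rng.poisson(3, n) + 1,
    "node_positive_primary": rng.integers(0, 2, n),
})
df["hepatectomy"] = (rng.random(n) < 0.7).astype(int)

# 1. Propensity score: probability of undergoing resection given covariates.
X = df[["age", "n_lesions", "node_positive_primary"]]
df["ps"] = LogisticRegression(max_iter=1000).fit(X, df["hepatectomy"]).predict_proba(X)[:, 1]

# 2. Greedy 1:1 nearest-neighbor matching on the propensity score, without replacement.
treated = df[df["hepatectomy"] == 1].sort_values("ps")
controls = df[df["hepatectomy"] == 0].copy()
pairs = []
for idx, row in treated.iterrows():
    if controls.empty:
        break
    j = (controls["ps"] - row["ps"]).abs().idxmin()
    pairs.append((idx, j))
    controls = controls.drop(index=j)

matched = df.loc[[i for pair in pairs for i in pair]]
print(f"{len(pairs)} matched pairs; "
      f"mean PS treated {matched[matched.hepatectomy == 1].ps.mean():.3f}, "
      f"controls {matched[matched.hepatectomy == 0].ps.mean():.3f}")
```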
Subjects
Colorectal Neoplasms; Hepatectomy; Liver Neoplasms; Proto-Oncogene Proteins B-raf; Humans; Liver Neoplasms/secondary; Liver Neoplasms/genetics; Liver Neoplasms/surgery; Liver Neoplasms/mortality; Hepatectomy/methods; Colorectal Neoplasms/genetics; Colorectal Neoplasms/pathology; Proto-Oncogene Proteins B-raf/genetics; Male; Female; Retrospective Studies; Middle Aged; Aged; Mutation; Propensity Score; Neoplasm Recurrence, Local/genetics; Adult; Treatment Outcome
ABSTRACT
INTRODUCTION: Given the importance of understanding the incidence and acceptance of COVID-19-positive donors, we characterize chronological and geographic variations in COVID-19 incidence relative to COVID-19-positive donor acceptance. METHODS: Data on deceased donors and recipients of liver and kidney transplants were obtained from the UNOS database between 2020 and 2023. Hierarchical cluster analysis was used to assess trends in COVID-19-positive donor incidence. Posttransplant graft and patient survival were assessed using Kaplan-Meier curves. RESULTS: Among 38,429 deceased donors, 1517 were COVID-19 positive. Fewer kidneys (72.4% vs. 76.5%, p < 0.001) and livers (56.4% vs. 62.0%, p < 0.001) were used from COVID-19-positive donors than from COVID-19-negative donors. Areas characterized by steadily increasing COVID-19 donor incidence exhibited the highest transplantation acceptance rates (92.33%), followed by intermediate (84.62%) and rapidly increasing (80.00%) COVID-19 incidence areas (p = 0.016). Posttransplant graft and patient survival were comparable among recipients, irrespective of donor COVID-19 status. CONCLUSIONS: Regions experiencing heightened rates of COVID-19-positive donors showed lower acceptance of livers and kidneys for transplantation. Similar graft and patient survival was noted among recipients, irrespective of donor COVID-19 status. These findings emphasize the need for adaptive practices and unified medical consensus in navigating a dynamic pandemic.
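Hierarchical clustering of incidence trajectories, as mentioned in the methods, can be sketched with SciPy's Ward linkage. The regional counts below are fabricated, and the three-group cut simply mirrors the steady/intermediate/rapid grouping described in the results rather than reproducing the study's actual clustering inputs.

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster

rng = np.random.default_rng(3)
quarters = 12
# Fabricated quarterly cumulative counts of COVID-19-positive donors for 60 regions:
# one third with a steady rise, one third intermediate, one third rapid.
steady = np.cumsum(rng.poisson(1, (20, quarters)), axis=1)
intermediate = np.cumsum(rng.poisson(2, (20, quarters)), axis=1)
rapid = np.cumsum(rng.poisson(4, (20, quarters)), axis=1)
trajectories = np.vstack([steady, intermediate, rapid]).astype(float)

# Ward hierarchical clustering of the incidence trajectories, cut into 3 groups.
labels = fcluster(linkage(trajectories, method="ward"), t=3, criterion="maxclust")
for k in np.unique(labels):
    members = trajectories[labels == k]
    print(f"cluster {k}: n = {len(members)}, "
          f"mean final-quarter cumulative count = {members[:, -1].mean():.1f}")
```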
Subjects
COVID-19; Graft Survival; Kidney Transplantation; Liver Transplantation; SARS-CoV-2; Tissue Donors; Humans; COVID-19/epidemiology; Incidence; Male; Female; Tissue Donors/supply & distribution; Tissue Donors/statistics & numerical data; Middle Aged; Adult; Tissue and Organ Procurement/statistics & numerical data; Aged; Survival Rate; Transplant Recipients/statistics & numerical data; United States/epidemiology
ABSTRACT
BACKGROUND: Introducing new liver transplantation (LT) practices, such as the use of unconventional donors, incurs higher costs, making evaluation of their prognostic justification crucial. This study examined the spread pattern of new LT practices across the United States and the associated outcomes. METHODS: The study investigated the spread pattern of new practices using the UNOS database (2014-2023). Practices included LT for hepatitis B/C (HBV/HCV) nonviremic recipients with viremic donors, LT for COVID-19-positive recipients, and LT using onsite machine perfusion (OMP). One-year post-LT patient and graft survival were also evaluated. RESULTS: LTs using HBV/HCV donors were common in the East, while LTs for COVID-19-positive recipients and those using OMP started predominantly in California, Arizona, Texas, and the Northeast. K-means cluster analysis identified three adoption groups: facilities with rapid, slow, and minimal adoption rates. Rapid adoption occurred mainly in high-volume centers, followed by a gradual increase in middle-volume centers, with little increase in low-volume centers. The current spread patterns did not significantly affect patient survival. Specifically, for LTs with HCV-viremic donors or COVID-19-positive recipients, patient and graft survival in the rapid-increasing group were comparable to the other groups. For LTs involving OMP, the rapid- and slow-increasing groups tended to have better patient survival (p = 0.05) and significantly better graft survival (p = 0.02). Facilities adopting new practices often overlapped across the different practices. DISCUSSION: Our analysis revealed three distinct adoption groups across all practices, with adoption aggressiveness correlating with center LT volume. Aggressive adoption of new practices did not compromise patient or graft survival, supporting the current strategy. Understanding historical trends could help predict the rise in future LT cases with new practices, aiding resource distribution.
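The K-means grouping of centers into rapid, slow, and minimal adopters can be illustrated by clustering per-center annual counts of a new practice. The trajectories below are fabricated; the feature construction (raw yearly counts) is an assumption, not the study's actual preprocessing.

```python
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(4)
years = 10
# Fabricated per-center annual counts of a new practice (e.g., LTs with onsite
# machine perfusion): rapid adopters, slow adopters, and minimal adopters.
rapid = rng.poisson(np.linspace(1, 25, years), (30, years))
slow = rng.poisson(np.linspace(0.5, 8, years), (60, years))
minimal = rng.poisson(0.5, (110, years))
counts = np.vstack([rapid, slow, minimal]).astype(float)

km = KMeans(n_clusters=3, n_init=10, random_state=0).fit(counts)
for k in range(3):
    members = counts[km.labels_ == k]
    print(f"group {k}: centers = {len(members)}, "
          f"mean cases in final year = {members[:, -1].mean():.1f}")
```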
Subjects
COVID-19; Graft Survival; Liver Transplantation; SARS-CoV-2; Humans; Liver Transplantation/statistics & numerical data; United States/epidemiology; COVID-19/epidemiology; Female; Male; Middle Aged; Tissue and Organ Procurement/statistics & numerical data; Tissue Donors/supply & distribution; Tissue Donors/statistics & numerical data; Adult; Survival Rate; Prognosis; Practice Patterns, Physicians'/statistics & numerical data
ABSTRACT
BACKGROUND: Outcomes of intestinal transplantation with a colon allograft (ICTx) remain controversial. We aimed to assess the outcomes of ICTx in comparison with intestinal transplantation without colon (ITx) using the UNOS/OPTN registry database. METHODS: We retrospectively reviewed 2612 patients who received primary intestinal transplants from 1998 to 2020. The rates of acute rejection (AR) within 6 months after transplant were compared between ICTx and ITx. Risk factors for 6-month AR were examined using a logistic regression model by era. Furthermore, conditional graft survival was analyzed to determine the long-term outcomes of ICTx. RESULTS: Of 2612 recipients, 506 (19.4%) received ICTx. Graft and patient survival in ICTx recipients were comparable to those in ITx recipients. While ICTx recipients had a higher incidence of AR within 6 months compared with ITx over the entire study period (p = .002), colonic inclusion did not increase the risk of 6-month AR in the past decade. ICTx recipients who experienced 6-month AR had worse graft and patient survival compared with those who did not (p < .001 and p = .004, respectively). Among patients who did not develop 6-month AR, Cox proportional hazards model analysis revealed that colonic inclusion was independently associated with improved conditional graft survival. CONCLUSIONS: In the recent transplant era, colonic inclusion is no longer associated with a heightened risk of 6-month AR and may provide better long-term survival than ITx when AR is absent. Risk adjustment for rejection and proper immunosuppressive therapy are crucial to maximize the benefits of colonic inclusion.
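Conditional graft survival, as used above, is the probability of reaching a later time point given survival to an earlier one, S(t)/S(s) from a Kaplan-Meier fit. The sketch below illustrates the calculation with lifelines on fabricated data; the time points (5 years conditional on 6 months) are chosen for illustration, not taken from the study.

```python
import numpy as np
from lifelines import KaplanMeierFitter

rng = np.random.default_rng(5)
n = 400
# Simulated graft survival times (days) for colon-inclusive intestinal transplants;
# the data are fabricated to illustrate the conditional-survival calculation.
durations = rng.exponential(2500, n)
observed = durations < 3650
durations = np.minimum(durations, 3650)

kmf = KaplanMeierFitter().fit(durations, observed)

def conditional_survival(kmf, t, s):
    """P(graft survives to day t | graft survived to day s) = S(t) / S(s)."""
    return float(kmf.predict(t) / kmf.predict(s))

# 5-year graft survival conditional on being event-free at 6 months.
print("S(5y | 6mo):", round(conditional_survival(kmf, 1825, 182), 3))
```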
Subjects
Kidney Transplantation; Humans; Retrospective Studies; Graft Rejection/etiology; Transplantation, Homologous; Graft Survival; Allografts
ABSTRACT
BACKGROUND: The incidence of graft failure following liver transplantation (LTx) has remained consistent over time. While traditional risk scores for LTx have limited accuracy, the potential of machine learning (ML) in this area remains uncertain, despite its promise in other transplant domains. This study aims to determine the predictive limitations of ML in LTx by replicating methods used in previous heart transplant research. METHODS: This study utilized the UNOS STAR database, selecting 64,384 adult patients who underwent LTx between 2010 and 2020. Gradient boosting models (XGBoost and LightGBM) were used to predict 14-, 30-, and 90-day graft failure and were compared with a conventional logistic regression model. Models were evaluated using both shuffled and rolling cross-validation (CV) methodologies, and performance was assessed using the AUC across validation iterations. RESULTS: In predicting 14-, 30-, and 90-day graft survival, LightGBM consistently outperformed the other models, achieving the highest AUCs of 0.740, 0.722, and 0.700 under shuffled CV. However, under rolling CV the accuracy declined for every ML algorithm. The analysis revealed influential factors for graft survival prediction across all models, including total bilirubin, medical condition, recipient age, and donor AST, among others. Several features, such as donor age and recipient diabetes history, were important in two out of three models. CONCLUSIONS: LightGBM enhances short-term graft survival predictions post-LTx. However, due to changing medical practices and selection criteria, continuous model evaluation is essential. Future studies should focus on temporal variations, clinical implications, and model transparency to ensure broader medical utility.
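The contrast between shuffled and rolling CV is the key methodological point here: shuffled folds mix listing years, while rolling folds always validate on later patients than they train on. The sketch below compares the two with LightGBM on simulated data; TimeSeriesSplit is used as a stand-in for the paper's rolling scheme, and the features and outcome are fabricated to loosely mirror those named in the abstract.

```python
import numpy as np
from lightgbm import LGBMClassifier
from sklearn.model_selection import KFold, TimeSeriesSplit
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(6)
n = 5_000
# Simulated recipients ordered by listing date; feature values are fabricated.
X = np.column_stack([
    rng.lognormal(1.0, 0.8, n),      # total bilirubin
    rng.normal(55, 10, n),           # recipient age
    rng.normal(45, 15, n),           # donor age
    rng.lognormal(3.5, 0.6, n),      # donor AST
])
logit = -3.0 + 0.15 * X[:, 0] + 0.02 * (X[:, 1] - 55)
y = (rng.random(n) < 1 / (1 + np.exp(-logit))).astype(int)  # 90-day graft failure

def cv_auc(splitter):
    aucs = []
    for train, test in splitter.split(X):
        model = LGBMClassifier(n_estimators=200, learning_rate=0.05)
        model.fit(X[train], y[train])
        aucs.append(roc_auc_score(y[test], model.predict_proba(X[test])[:, 1]))
    return np.mean(aucs)

print("shuffled CV AUC:", round(cv_auc(KFold(n_splits=5, shuffle=True, random_state=0)), 3))
print("rolling  CV AUC:", round(cv_auc(TimeSeriesSplit(n_splits=5)), 3))
```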
Subjects
Liver Transplantation; Adult; Humans; Liver Transplantation/adverse effects; Research Design; Algorithms; Bilirubin; Machine Learning
ABSTRACT
BACKGROUND: Donors with hyperbilirubinemia are often not utilized for liver transplantation (LT) due to concerns about potential liver dysfunction and graft survival. The potential of such donors to mitigate organ shortages remains unclear. METHODS: This study analyzed adult deceased donor data from the United Network for Organ Sharing database (2002-2022). Hyperbilirubinemia in brain-dead donors was categorized as high total bilirubin (3.0-5.0 mg/dL) or very high total bilirubin (≥5.0 mg/dL). We assessed the impact of donor hyperbilirubinemia on 3-month and 3-year graft survival, comparing these outcomes with those of donation after circulatory death (DCD) grafts. RESULTS: Of 138,622 donors, 3452 (2.5%) had high bilirubin and 1999 (1.4%) had very high bilirubin levels. Utilization rates for the normal, high, and very high bilirubin groups were 73.5%, 56.4%, and 29.2%, respectively. No significant differences were found in 3-month or 3-year graft survival between groups. Donors with high bilirubin had superior 3-year graft survival compared with DCD donors (hazard ratio .83, p = .02). Factors associated with inferior short-term graft survival included recipient medical condition in the intensive care unit (ICU) and longer cold ischemic time; factors associated with inferior long-term graft survival included older donor age, recipient medical condition in the ICU, older recipient age, and longer cold ischemic time. Donors with ≥10% macrosteatosis in the very high bilirubin group were also associated with worse 3-year graft survival (p = .04). DISCUSSION: The study suggests that, although many grafts with hyperbilirubinemia go unused, acceptable post-LT outcomes can be achieved with donors with hyperbilirubinemia. Careful selection may increase utilization and expand the donor pool without negatively affecting graft outcomes.
Subjects
Liver; Tissue and Organ Procurement; Adult; Humans; Prognosis; Tissue Donors; Graft Survival; Hyperbilirubinemia/etiology; Bilirubin; Retrospective Studies
ABSTRACT
BACKGROUND: Over the last decade there has been a surge in overdose deaths due to the opioid crisis. We sought to characterize the temporal change in overdose donor (OD) use in liver transplantation (LT), as well as associated post-LT outcomes, relative to the COVID-19 era. METHODS: LT candidates and donors listed between January 2016 and September 2022 were identified from the Scientific Registry of Transplant Recipients database. Trends in LT donors and changes related to OD were assessed pre- versus post-COVID-19 (February 2020). RESULTS: Between 2016 and 2022, most counties in the United States experienced an increase in overdose-related deaths (n = 1284, 92.3%), with many counties (n = 458, 32.9%) having more than a doubling in drug overdose deaths. Concurrently, there was an 11.2% increase in overall donors, including a 41.7% increase in the number of donors who died from drug overdose. In the pre-COVID-19 era, overdose was the fourth most common mechanism of donor death, while in the post-COVID-19 era it was the second most common. ODs were younger (OD: 35 yrs, IQR 29-43 vs. non-OD: 43 yrs, IQR 31-56), less often had a high body mass index (≥35 kg/m2, OD: 31.2% vs. non-OD: 33.5%), and were more likely to be HCV+ (OD: 28.9% vs. non-OD: 5.4%) with lower total bilirubin (≥1.1 mg/dL, OD: 12.9% vs. non-OD: 20.1%) (all p < .001). Receipt of an OD graft was not associated with worse graft survival (HR .94, 95% CI .88-1.01, p = .09). CONCLUSIONS: Opioid deaths markedly increased following the COVID-19 pandemic, substantially altering the LT donor pool in the United States.