ABSTRACT
BACKGROUND: In the United States, hepatitis C virus-associated hepatocellular carcinoma incidence and mortality are highest among minorities. Socioeconomic constraints play a major role in inequitable treatment. We evaluated the association between race/ethnicity and outcomes in a population that overcame treatment barriers. METHODS: We report a retrospective cohort study of 666 patients across 20 institutions in the United States Hepatocellular Carcinoma Liver Transplantation Consortium from 2015 to 2019 with hepatitis C virus-associated hepatocellular carcinoma who completed direct-acting antiviral therapy and underwent liver transplantation. Patients were excluded if they had a prior liver transplantation, hepatocellular carcinoma recurrence, no prior liver-directed therapy, or if race/ethnicity data were unavailable. Patients were stratified by race/ethnicity. Primary outcomes were recurrence-free survival and overall survival, and secondary outcome was major postoperative complication. RESULTS: Race/ethnicity was not associated with differences in 5-year recurrence-free survival (White 90%, Black 88%, Hispanic 92%, Other 87%; p = 0.85), overall survival (White 85%, Black 84%, Hispanic 84%, Other 93%; p = 0.70), or major postoperative complication. CONCLUSIONS: Race/ethnicity was not associated with worse oncologic or postoperative outcomes among those who completed direct-acting antiviral therapy and underwent liver transplantation, suggesting that overcoming socioeconomic constraints equalizes outcomes across racial/ethnic groups. Eliminating barriers that prohibit care access among minorities must be a priority.
ABSTRACT
OBJECTIVE: Nutcracker syndrome is a rare condition that involves mechanical compression of the left renal vein, leading to chronic and debilitating left flank pain. The etiology of the pain is frequently misdiagnosed, and patients usually require long-term opioid use to manage their pain. Multiple therapeutic options for nutcracker syndrome have been described in the literature, but the reports are limited by small numbers of patients and by the lack of convincing data demonstrating consistently improved outcomes. Here we report the largest series to date of patients undergoing renal autotransplantation for the treatment of nutcracker syndrome. METHODS: We performed a multicenter retrospective cohort review of 105 patients with nutcracker syndrome who underwent renal autotransplantation as a primary or salvage therapy. RESULTS: During the overall study period, 93.1% of patients treated with autotransplantation had durable, complete flank pain relief at 12 months with both open and robotic surgical approaches. After autotransplantation, a statistically significant decrease in the percentage of patients using opioids, from 48.6% to 17.0%, was demonstrated at 12 months. In patients using opioids before autotransplantation, a statistically significant decrease in opioid dose was demonstrated, from 68.9 ± 15.0 to 25.0 ± 11.02 morphine milligram equivalents per day. CONCLUSIONS: Our findings suggest that renal autotransplantation, as a primary or salvage treatment, provides durable pain relief and a marked decrease in chronic opioid use in patients with nutcracker syndrome, regardless of surgical approach.
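Opioid use above is quantified in morphine milligram equivalents (MME). As a hedged illustration of how a daily MME figure of this kind is computed, a minimal sketch follows; the conversion factors are commonly published CDC values and are an assumption for illustration, not data from this study.

```python
# Sketch: daily morphine milligram equivalent (MME) calculation.
# Conversion factors follow commonly published CDC values; they are an
# assumption for illustration, not data from the study above.
MME_FACTORS = {
    "morphine": 1.0,
    "oxycodone": 1.5,
    "hydrocodone": 1.0,
    "hydromorphone": 4.0,
    "tramadol": 0.1,
}

def daily_mme(regimen):
    """regimen: iterable of (drug, mg_per_dose, doses_per_day) tuples."""
    return sum(mg * doses * MME_FACTORS[drug] for drug, mg, doses in regimen)

# Hypothetical regimen: oxycodone 10 mg four times daily
print(daily_mme([("oxycodone", 10, 4)]))  # 60.0
```

Summing per-drug doses after conversion to a common morphine scale is what allows regimens of different opioids to be compared before and after autotransplantation.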
ABSTRACT
BACKGROUND: Liver transplantation (LT) is the treatment of choice for end-stage liver disease and certain malignancies such as hepatocellular carcinoma (HCC). Data on the surgical management of de novo or recurrent tumors that develop in the transplanted allograft are limited. This study aimed to investigate the perioperative and long-term outcomes of patients undergoing hepatic resection for de novo or recurrent tumors after liver transplantation. METHODS: The study enrolled adult and pediatric patients from 12 centers across North America who underwent hepatic resection for the treatment of a solid tumor after LT. Perioperative outcomes were assessed, as were recurrence-free survival (RFS) and overall survival (OS) for those undergoing resection for HCC. RESULTS: Between 2003 and 2023, 54 patients underwent hepatic resection of solid tumors after LT. For 50 patients (92.6%), resection of malignant lesions was performed. The most common lesion was HCC (n = 35, 64.8%), followed by cholangiocarcinoma (n = 6, 11.1%) and colorectal liver metastases (n = 6, 11.1%). The majority of the 35 patients who underwent resection of HCC did not receive any preoperative therapy (82.9%) or adjuvant therapy (71.4%), with resection as their only treatment for HCC. During a median follow-up period of 50.7 months, the median RFS was 21.5 months and the median OS was 49.6 months. CONCLUSION: Hepatic resection following LT is safe and associated with morbidity and mortality rates comparable to those reported for patients undergoing resection in native livers. Hepatic resection as the primary and often only treatment modality for HCC following LT is associated with acceptable RFS and OS and should be considered in well-selected patients.
ABSTRACT
With improved medical treatments, the prognosis for many malignancies has improved, and more patients are presenting for transplant evaluation with a history of treated cancer. Solid organ transplant (SOT) recipients with a prior malignancy are at higher risk of posttransplant recurrence or de novo malignancy, and they may require a cancer surveillance program that is individualized to their specific needs. There is a dearth of literature on optimal surveillance strategies specific to SOT recipients. A working group of transplant physicians and cancer-specific specialists met to provide expert opinion recommendations on optimal cancer surveillance after transplantation for patients with a history of malignancy. Surveillance strategies provided are mainly based on general population recurrence risk data, immunosuppression effects, and limited transplant-specific data and should be considered expert opinion based on current knowledge. Prospective studies of cancer-specific surveillance models in SOT recipients should be supported to inform posttransplant management of this high-risk population.
ABSTRACT
BACKGROUND: Solid organ transplantation is a risk predictor for virally mediated anal squamous intraepithelial lesions and cancer (anal disease). Precancerous squamous intraepithelial lesions can be detected by screening, and treatment may prevent progression to cancer. Screening recommendations are not well defined. We aim to define the prevalence of, and describe risk predictors for, anal disease in a large population of solid organ transplant recipients. METHODS: This retrospective single-center cohort analysis included solid organ transplant recipients cared for between 2001 and 2022 (N = 15,362). The cohort of recipients who developed anal disease was compared with those who did not. Greedy propensity score matching was performed for organ-specific recipients, and time-to-event analysis for the development of anal disease was performed in those with genitourinary human papillomavirus (HPV) disease versus those without. RESULTS: The prevalence of anal disease was 0.6% (cancer 0.2%). The average time from transplant to the diagnosis of anal disease was 11.67 years. Anal disease was more common in women (68.5% versus 31.5%, P < 0.001), in patients who had other HPV-related genitourinary diseases (40.4% versus 0.6%, P < 0.001), in those of younger age at transplant (39.62 versus 46.58 years, P < 0.001), and in those with more years from transplant (17.06 versus 12.57, P < 0.001). In multivariate analysis, the odds of anal disease increased by 4% with each year posttransplant. History of genitourinary HPV disease (odds ratio 69.63) and female sex (odds ratio 1.96) were the most significant risk predictors for anal disease. CONCLUSIONS: The prevalence of anal cancer among solid organ transplant recipients was equal to that in the general population (0.2%).
Due to the low prevalence of overall disease, these data suggest that anal screenings in transplant recipients should be targeted to higher-risk subsets: female recipients farther out from transplant and patients with genitourinary HPV-related diseases.
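The risk predictors above are reported as odds ratios. As a minimal sketch of how an unadjusted odds ratio and its Wald confidence interval are derived from a 2×2 table (the counts below are hypothetical, not this study's data; the ratios reported above came from multivariate models):

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Unadjusted odds ratio with a Wald 95% CI from a 2x2 table:
    a = exposed with disease, b = exposed without,
    c = unexposed with disease, d = unexposed without."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # SE of log(OR)
    lower = math.exp(math.log(or_) - z * se)
    upper = math.exp(math.log(or_) + z * se)
    return or_, (lower, upper)

# Hypothetical counts for illustration only
ratio, (lower, upper) = odds_ratio_ci(40, 60, 10, 90)
print(round(ratio, 2))  # 6.0
```

The confidence interval is computed on the log scale and exponentiated back, which is why odds-ratio intervals are asymmetric around the point estimate.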
ABSTRACT
Background: Kidney transplant outcomes have improved dramatically since the first successful transplant in 1954. In its early years, kidney transplantation was viewed more skeptically; today it is considered the treatment of choice for patients with end-stage kidney disease. Methods: Our program performed its first kidney transplant in 1966 and recently performed its 12,000th kidney transplant. Here, we review and describe our experience with these 12,000 transplants. Transplant recipients were analyzed by era of transplant date: 1966-1975, 1976-1985, 1986-1995, 1996-2005, 2006-2015, and 2016-2022. Death-censored graft failure and mortality were the outcomes of interest. Results: Of the 12,000 kidneys, 247 were transplanted from 1966 to 1975, 1147 from 1976 to 1985, 2194 from 1986 to 1995, 3147 from 1996 to 2005, 3046 from 2006 to 2015, and 2219 from 2016 to 2022. Compared with 1966-1975, there were statistically significant and progressively lower risks of death-censored graft failure at 1 y, 5 y, and last follow-up in all subsequent eras. Although mortality at 1 y was lower in all eras after 1986-1995, there was no difference in mortality at 5 y or last follow-up between eras. Conclusions: In this large single-center cohort of 12,000 kidney transplants, we observed significant improvement in outcomes over time. Kidney transplantation remains a robust, ever-growing, and improving field.
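Death-censored graft survival, as compared across eras above, is conventionally estimated with the Kaplan-Meier method, treating death with a functioning graft as censoring. A minimal sketch under that assumption, run on synthetic data rather than the 12,000-kidney cohort:

```python
def kaplan_meier(times, events):
    """Kaplan-Meier survival estimate.
    times: follow-up times; events: 1 = graft failure, 0 = censored
    (e.g., death with a functioning graft, for death-censored analysis).
    Returns the survival curve as a list of (time, S(t)) steps."""
    survival = 1.0
    curve = []
    for t in sorted(set(times)):
        failures = sum(1 for ti, ei in zip(times, events) if ti == t and ei == 1)
        at_risk = sum(1 for ti in times if ti >= t)  # still under observation
        if failures:
            survival *= 1 - failures / at_risk
            curve.append((t, survival))
    return curve

# Synthetic data: failures at years 1 and 3, censoring at years 2 and 4
print(kaplan_meier([1, 2, 3, 4], [1, 0, 1, 0]))  # [(1, 0.75), (3, 0.375)]
```

Censored subjects leave the risk set without triggering a step down, which is what distinguishes death-censored graft failure from all-cause graft loss.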
ABSTRACT
Existing literature offers conflicting conclusions about whether early acute cellular rejection influences long-term outcomes in liver transplantation. We retrospectively collected donor and recipient data on all adult, first-time liver transplants performed at a single center between 2008 and 2020. We divided this population into two cohorts based on the presence of early biopsy-proven acute cellular rejection (EBPR) within the first 90 days post-transplant and compared outcomes between the groups. There were 896 liver transplants that met inclusion criteria with 112 cases (12.5%) of EBPR. Recipients who developed EBPR had higher biochemical Model for End-Stage Liver Disease scores (28 vs. 24, p < .01), but other donor and recipient characteristics were similar. Recipients with EBPR had similar overall survival compared to patients without EBPR (p = .09) but had decreased graft survival (p < .05). EBPR was also associated with decreased time to first episode of late (> 90 days post-transplant) rejection (p < .0001) and increased vulnerability to bacterial and viral infection (p < .05). In subgroup analysis of recipients with autoimmune indications for liver transplantation, EBPR had a more pronounced association with patient death (hazard ratio [HR] 3.9, p < .05) and graft loss (HR 4.0, p < .01). EBPR after liver transplant is associated with inferior graft survival, increased susceptibility to late rejections, and increased vulnerability to infection.
ABSTRACT
Despite the increased use of livers from donation after circulatory death (DCD) donors in the last decade, many patients who need a liver transplant remain on the waitlist. Recent efforts have focused on maximizing the utilization and outcomes of these allografts using advances in machine perfusion technology and other perioperative strategies such as normothermic regional perfusion (NRP). In addition to the standard donor and recipient matching required with DCD donation, new data regarding the impact of graft steatosis, the extensive European experience with NRP, and the increasing use of normothermic and hypothermic machine perfusion have shown immense potential for increasing overall DCD organ utilization and improving outcomes. These techniques, along with viability testing of extended criteria donors, have generated promising early data supporting the use of higher-risk donor organs and more widespread adoption of these techniques in the United States. This review explores the most recent international literature on strategies to optimize the utilization and outcomes of DCD liver allografts, including donor-recipient matching, perioperative strategies including NRP versus rapid controlled DCD recovery, viability assessment of discarded livers, and postoperative strategies including machine perfusion versus pharmacologic interventions.
ABSTRACT
INTRODUCTION: Failure to achieve planned same-day discharge (SDD) after primary total joint arthroplasty (TJA) occurs in as many as 7% to 49% of patients in the United States. This study evaluated the association between 43 perioperative risk factors and SDD failure rates. METHODS: A retrospective analysis of prospectively collected data from 466 primary TJAs with planned SDD to home was performed. Surgeries were performed at an academic tertiary care center comprising a hospital facility and a stand-alone ambulatory surgery center (ASC) on the same campus. Factors associated with failed SDD were identified using multivariable analysis. RESULTS: Only one of 316 (0.3%) patients who underwent surgery in the ASC failed planned SDD (P < 0.001), compared with 33.3% of 150 patients who underwent surgery in the hospital. The single ASC failure was due to pain that interfered with physical therapy. Sixty-two percent (n = 31) of hospital failures were attributed to medical complications, 24% (n = 12) to physical therapy clearance, 8% (n = 4) to not being seen by internal medicine or therapy on the day of surgery, and 6% (n = 3) to unknown causes. Failure was increased in patients with preoperative anemia (P = 0.003), non-White patients (P = 0.002), patients taking depression/anxiety medication (P = 0.015), and for every 10-morphine-milligram-equivalent increase in opioids consumed per hour in the postacute care unit (P = 0.030). DISCUSSION: Risk stratification methods used to allocate patients to ASC versus hospital outpatient TJA surgery predicted SDD success. Most failures were secondary to medical causes. The findings of this study may be used to improve perioperative protocols, enabling the safe planning and selection of patients for SDD pathways.
ABSTRACT
STUDY DESIGN: Prospective randomized controlled trial. OBJECTIVE: To compare range of motion (ROM) and adjacent segment degeneration (ASD) following cervical disc arthroplasty (CDA) versus anterior cervical discectomy and fusion (ACDF) at 20-year follow-up. SUMMARY OF BACKGROUND DATA: Anterior cervical discectomy and fusion is the standard treatment for single-level cervical disc degeneration causing radiculopathy. CDA is claimed to reduce shear strain, and adjacent-level ROM changes are hypothesized to hasten ASD with ACDF. MATERIALS AND METHODS: This study collected data on 47 patients randomized to ACDF or CDA. Lateral cervical spine radiographs were evaluated preoperatively, postoperatively, and at 20 years for alignment, ROM, ASD, and heterotopic ossification. RESULTS: Eighty-two percent (18/22) of CDA patients and 84% (21/25) of ACDF patients followed up at 20 years. At 20 years, total cervical (C2-C7) ROM was statistically different between the CDA and fusion groups (47.8° vs. 33.4°, P = 0.005). Total cervical ROM was not significantly different between the preoperative and 20-year time points following CDA (45.6° vs. 47.4°, P = 0.772) or ACDF (40.6° vs. 33.0°, P = 0.192). Differences in postoperative and 20-year index-level ROM following CDA were not significant (10.1° vs. 10.2°, P = 0.952). Final ASD grading was statistically lower following CDA versus ACDF at both adjacent levels (P < 0.005). Twenty-year adjacent-level ossification development was increased following ACDF versus CDA (P < 0.001). Mean polyethylene thickness decreased from 9.4 mm immediately postoperatively to 9.1 mm at 20-year follow-up (P = 0.013). Differences in adjacent-level ROM from preoperative to 20-year follow-up in both the ACDF and CDA groups did not reach statistical significance (P > 0.05). CONCLUSIONS: Cervical disc arthroplasty maintains index-level and total cervical ROM at very long-term follow-up. Total cervical ROM was higher at 20 years with CDA relative to ACDF.
CDA results in lower rates of ASD and adjacent-level ossification development than ACDF.
ABSTRACT
STUDY DESIGN: Prospective, randomized, controlled trial. OBJECTIVE: To compare clinical outcomes of anterior cervical discectomy and fusion (ACDF) and cervical disk arthroplasty (CDA) at 20 years. SUMMARY OF BACKGROUND DATA: Concern for adjacent-level disease after ACDF prompted the development of CDA. MATERIALS AND METHODS: Forty-seven patients with single-level cervical radiculopathy were randomized to either BRYAN CDA or ACDF for a Food and Drug Administration Investigational Device Exemption trial. At 20 years, patient-reported outcomes, including visual analog scales (VAS) for neck and arm pain, neck disability index (NDI), and reoperation rates, were analyzed. RESULTS: The follow-up rate was 91.3%. Both groups showed significantly better NDI, VAS arm pain, and VAS neck pain scores at 20 years versus preoperative scores. Comparing CDA versus ACDF, there was no difference at 20 years in mean scores for NDI [11.1 (SD 14.1) vs. 19.9 (SD 17.2), P = 0.087], mean VAS arm pain [0.9 (SD 2.4) vs. 2.3 (SD 2.8), P = 0.095], or mean VAS neck pain [1.2 (SD 2.5) vs. 2.9 (SD 3.3), P = 0.073]. There was a significant difference between the CDA and ACDF groups in the change in VAS neck pain score between 10 and 20 years [respectively, -0.4 (SD 2.5) vs. 1.5 (SD 2.5), P = 0.030]. Reoperations were reported in 41.7% of ACDF patients and 10.0% of CDA patients (P = 0.039). CONCLUSIONS: Both CDA and ACDF are effective in treating cervical radiculopathy, with sustained improvement in NDI, VAS neck pain, and VAS arm pain at 20 years. CDA demonstrated a lower reoperation rate than ACDF, and there were no failures of the arthroplasty device requiring reoperation at the index level. The symptomatic nonunion rate of ACDF was 4.2% at 20 years. Despite the higher reoperation rate in the ACDF group versus the CDA group, there was no difference in 20-year NDI, VAS neck, or VAS arm pain scores.
ABSTRACT
Here we test the hypothesis that, like CD81-associated "latent" IL35, the transforming growth factor (TGF)β:latency-associated peptide (LAP)/glycoprotein A repetitions predominant (GARP) complex is also tethered to small extracellular vesicles (sEVs), also known as exosomes, produced by lymphocytes from allo-tolerized mice. We also test whether, once these sEVs are taken up by conventional T cells, TGFβ can be activated to suppress the local immune response. Methods: C57BL/6 mice were tolerized by i.p. injection of CBA/J splenocytes followed by anti-CD40L/CD154 antibody treatment on days 0, 2, and 4. On day 35, spleens and lymph nodes were extracted, and isolated lymphocytes were restimulated overnight with sonicates of CBA splenocytes. sEVs were extracted from culture supernatants by ultracentrifugation (100,000×g) and assayed for (a) the presence of TGFβ:LAP associated with the tetraspanins CD81, CD63, and CD9 by enzyme-linked immunosorbent assay; (b) GARP, which is critical to membrane association of TGFβ:LAP and to its activation from the latent form, as well as various TGFβ receptors; and (c) TGFβ-dependent function in 1° and 2° immunosuppression of tetanus toxoid-immunized B6 splenocytes using the trans-vivo delayed-type hypersensitivity assay. Results: After tolerization, CBA-restimulated lymphocytes secreted GARP/TGFβ:LAP-coated extracellular vesicles. Like IL35 subunits, but unlike IL10, which was absent from ultracentrifuge pellets, GARP/TGFβ:LAP was mainly associated with CD81+ exosomes. sEV-bound GARP/TGFβ:LAP became active in both 1° and 2° immunosuppression, the latter requiring sEV uptake by "bystander" T cells and reexpression on the cell surface.
Conclusions: Like other immune-suppressive components of the Treg exosome, which are produced in a latent form, exosomal GARP/TGFβ:LAP produced by allo-specific regulatory T cells undergoes either immediate activation (1° suppression) or internalization by naive T cells followed by surface reexpression and subsequent activation (2° suppression) to become suppressive. Our results imply a membrane-associated form of TGFβ:LAP that, like exosomal IL35, can target "bystander" lymphocytes. This new finding implicates exosomal TGFβ:LAP, along with Treg-derived GARP, as part of the infectious tolerance network.
ABSTRACT
BACKGROUND: Our research showed that patients with alcohol-associated liver disease (ALD) had more severe liver disease than those without a diagnosis of ALD, yet were less likely to be selected for transplant listing due to their increased psychosocial vulnerability. This study aims to answer whether this vulnerability translates to worse short-term outcomes after transplant listing. METHODS: A total of 187 patients were approved for liver transplant listing and are included in the present retrospective study. We collected dates of transplantation, retransplantation, and death, and pathologic data for evidence of rejection, and reviewed alcohol biomarkers and documentation for evidence of alcohol use. RESULTS: The ALD cohort had higher Stanford Integrated Psychosocial Assessment for Transplant (SIPAT) scores (39.4 vs. 22.5, p < 0.001) and Model for End-Stage Liver Disease (MELD)-Na scores (25.0 vs. 18.5, p < 0.001) compared with the non-ALD cohort. Forty-nine (59.7%) subjects with ALD and 60 (57.1%, p = 0.71) subjects without ALD subsequently received a liver transplant. Overall mortality was similar between the 2 groups (20.7% ALD vs. 21.0% non-ALD, p = 0.97). Neither the SIPAT score (HR: 0.98, 95% CI: 0.96-1.00, p = 0.11) nor the MELD-Na score (HR: 0.99, 95% CI: 0.95-1.02, p = 0.40) was associated with mortality. Patients with ALD were more likely to have alcohol biomarkers tested both before (84.1% vs. 24.8% non-ALD, p < 0.001) and after liver transplantation (74.0% vs. 16.7% non-ALD, p < 0.001). SIPAT score was associated with alcohol use after listing (OR: 1.03, 95% CI: 1.0-1.07, p = 0.04), although a return to alcohol use was not associated with mortality (HR: 1.60, 95% CI: 0.63-4.10, p = 0.33). CONCLUSION: Patients with ALD had higher psychosocial risk compared with patients without a diagnosis of ALD who were placed on the waitlist, but had similar short-term outcomes, including mortality, transplantation, and rejection.
Although a high SIPAT score was predictive of alcohol use, in the short-term, alcohol use after transplant listing was not associated with mortality.
ABSTRACT
BACKGROUND: Ischemia-reperfusion (IR) of a kidney transplant (KTx) upregulates TNFα production, which amplifies allograft inflammation and may negatively affect transplant outcomes. METHODS: We tested the effects of blocking TNF peri-KTx via a randomized, double-blind, placebo-controlled, 15-center, phase 2 clinical trial. A total of 225 primary transplant recipients of deceased-donor kidneys (38.2% Black/African American, 44% White) were randomized to receive intravenous infliximab (IFX) 3 mg/kg or saline placebo (PLBO) initiated before kidney reperfusion. All patients received rabbit anti-thymocyte globulin induction and maintenance immunosuppression (IS) with tacrolimus, mycophenolate mofetil, and prednisone. The primary end point was the difference between groups in mean 24-month eGFR. RESULTS: There was no difference in the primary end point of 24-month eGFR between IFX (52.45 mL/min per 1.73 m²; 95% CI, 48.38 to 56.52) and PLBO (57.35 mL/min per 1.73 m²; 95% CI, 53.18 to 61.52; P = 0.1). There were no significant differences between groups in rates of delayed graft function, biopsy-proven acute rejection, development of de novo donor-specific antibodies, or graft loss/death. Immunosuppression did not differ, and day 7 post-KTx plasma analyses showed approximately ten-fold lower TNF (P < 0.001) in IFX versus PLBO. BK viremia requiring IS change occurred more frequently in IFX (28.9%) versus PLBO (13.4%; P = 0.004), with a strong trend toward higher rates of BK virus nephropathy in IFX (13.3%) versus PLBO (4.9%; P = 0.06). CONCLUSIONS: IFX induction therapy does not benefit recipients of kidney transplants from deceased donors on this IS regimen. Because the intervention unexpectedly increased rates of BK virus infection, our findings underscore the complexities of targeting peritransplant inflammation as a strategy to improve KTx outcomes. Clinical trial registration: clinicaltrials.gov (NCT02495077).
ABSTRACT
OBJECTIVE: To define benchmark values for liver transplantation (LT) in patients with perihilar cholangiocarcinoma (PHC), enabling unbiased comparisons. BACKGROUND: Transplantation for PHC is used with reluctance in many centers and is even contraindicated in several countries. Although benchmark values for LT are available, there is a lack of specific data on LT performed for PHC. METHODS: PHC patients considered for LT after a Mayo-like protocol were analyzed in 17 reference centers on 2 continents over the recent 5-year period (2014-2018). The minimum follow-up was 1 year. Benchmark patients were defined as those operated on at high-volume centers (≥50 overall LT/year) after neoadjuvant chemoradiotherapy, with a tumor diameter <3 cm, negative lymph nodes, and an absence of relevant comorbidities. Benchmark cutoff values were derived from the 75th or 25th percentiles of the median values of all benchmark centers. RESULTS: One hundred thirty-four consecutive patients underwent LT after completion of neoadjuvant treatment. Of those, 89.6% qualified as benchmark cases. Benchmark cutoffs were 90-day mortality ≤5.2%, comprehensive complication index at 1 year ≤33.7, and grade ≥3 complication rate ≤66.7%. These values were better than the benchmark values for other LT indications. Five-year disease-free survival was largely superior compared with a matched group of node-negative patients undergoing curative liver resection (n = 106) (62% vs. 32%, P < 0.001). CONCLUSION: This multicenter benchmark study demonstrates that LT offers excellent outcomes with superior oncological results in early-stage PHC patients, even in candidates for surgery. This provocative observation should lead to a change in the available therapeutic algorithms for PHC.
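The benchmark methodology above derives cutoffs from percentiles of the median values across benchmark centers. A minimal sketch of that two-step computation follows; the center-level data are synthetic, and the choice of the 75th versus 25th percentile depends on whether lower or higher values are better for the metric.

```python
import statistics

def benchmark_cutoff(center_values, lower_is_better=True):
    """Two-step benchmark cutoff: (1) take the median of each center's
    patient-level values; (2) take a quartile across those center medians
    (75th percentile when lower values are better, else the 25th)."""
    medians = [statistics.median(v) for v in center_values.values()]
    q1, _, q3 = statistics.quantiles(medians, n=4)  # exclusive method
    return q3 if lower_is_better else q1

# Synthetic 90-day mortality (%) for four hypothetical centers
centers = {"A": [0, 5, 5], "B": [5, 5, 10], "C": [0, 0, 5], "D": [5, 10, 10]}
print(benchmark_cutoff(centers))  # 8.75
```

Aggregating to center medians first keeps a single very large center from dominating the cutoff, which is the rationale behind benchmarking on center-level rather than pooled patient-level values.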