ABSTRACT
BACKGROUND: Existing research on predictors of success on American Board of Surgery (ABS) exams has focused on either resident or residency program characteristics; few studies examine both. This study examines relationships between both resident and program characteristics and ABS Qualifying Exam (QE) and Certifying Exam (CE) outcomes. STUDY DESIGN: Multilevel logistic regression was used to analyze the relationship between resident and program characteristics and ABS QE and CE 1st-attempt pass and eventual certification. Resident characteristics were gender, international medical graduate (IMG) status, and prior performance, measured by 1st-attempt USMLE Step 2 CK and Step 3 scaled scores. Program characteristics were size, %female, %IMG, and program type. The sample included surgeons with QE and CE data from 2007-2019 and matched USMLE scores. RESULTS: Controlling for other variables, prior medical performance was positively related to all ABS exam outcomes. The relationships between USMLE scores and success on ABS exams varied but were generally strong. Other resident characteristics that predicted ABS exam outcomes were gender and IMG status (QE 1st-attempt pass). The only program characteristic that significantly predicted ABS outcomes was %IMG (QE and CE 1st-attempt pass). Despite statistical significance, gender, IMG status, and %IMG translated to small differences in predicted probabilities of ABS exam success. CONCLUSION: This study highlights resident and program characteristics that predict success on ABS exams. USMLE scores were consistently and strongly related to ABS exam success, providing evidence that USMLE scores relate to future high-stakes outcomes such as board certification. After controlling for prior performance, gender, IMG status, and program %IMG were significantly related to ABS exam success, but effects were small.
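A minimal sketch of the kind of multilevel (random-intercept) logistic model described above, using statsmodels; the data file and column names (pass_qe, step2, program_id, and so on) are hypothetical placeholders rather than the study's actual variables, and the variational Bayes fit is only one of several ways to estimate such a model.

    import pandas as pd
    from statsmodels.genmod.bayes_mixed_glm import BinomialBayesMixedGLM

    # Hypothetical examinee-level data: one row per resident, nested within programs.
    df = pd.read_csv("abs_outcomes.csv")  # columns: pass_qe, female, img, step2, step3,
                                          # program_size, pct_female, pct_img, program_type, program_id

    # Fixed effects for resident and program characteristics,
    # plus a program-level random intercept (the variance component).
    model = BinomialBayesMixedGLM.from_formula(
        "pass_qe ~ female + img + step2 + step3 + program_size + pct_female + pct_img + C(program_type)",
        vc_formulas={"program": "0 + C(program_id)"},
        data=df,
    )
    result = model.fit_vb()  # variational Bayes approximation to the posterior
    print(result.summary())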
ABSTRACT
Importance: A new liver allocation policy was implemented by the United Network for Organ Sharing (UNOS) in February 2020 with the stated intent of improving access to liver transplant (LT). There are growing national concerns about the implications this new system may have for LT costs, as well as for access to LT, which have not been captured at a multicenter level. Objective: To characterize LT volume and cost changes across the US and within specific center groups and demographics after the policy implementation. Design, Setting, and Participants: This cross-sectional study collected and reviewed LT volume and cost data from multiple centers across the US, with attention to 8 specific center demographics. Two separate 12-month eras were compared, before and after the new UNOS allocation policy: March 4, 2019, to March 4, 2020, and March 5, 2020, to March 5, 2021. Data analysis was performed from May to December 2022. Main Outcomes and Measures: Center volume and changes in cost. Results: A total of 22 of 68 centers responded, comparing 1948 LTs before the policy change and 1837 LTs postpolicy, a 6% volume decrease. Transplants using local donations after brain death decreased 54% (P < .001), while imported donations after brain death increased 133% (P = .003). Imported fly-outs and dry runs increased 163% (median, 19 [range, 1-75] vs 50 [range, 2-91]; P = .009) and 33% (median, 3 [range, 0-16] vs 7 [range, 0-24]; P = .02). Overall hospital costs increased 10.9% to a total of $46,360,176 (P = .94) for participating centers. There was a 77% fly-out cost increase postpolicy ($10,600,234; P = .03). On subanalysis, centers with decreased LT volume postpolicy observed higher overall hospital costs ($41,720,365; P = .048) and, specifically, a 122% cost increase for liver imports ($6,508,480; P = .002). Transplant centers from low-income states showed a significant increase in hospital (12%) and import (94%) costs. Centers serving populations with larger proportions of racial and ethnic minority candidates, and specifically Black candidates, significantly increased costs by more than 90% for imported livers, fly-outs, and dry runs despite lower LT volume. Similarly, costs increased significantly (>100%) for fly-outs and dry runs in centers from worse-performing health systems. Conclusions and Relevance: Based on this large multicenter effort, and contrary to current assumptions, the new liver distribution system appears to place a disproportionate burden on segments of the current LT community that already experience disparities in health care. The continuous allocation policies being promoted by UNOS could exacerbate these disparities.
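The pre/post era comparison above (per-center counts summarized as percentage changes with nonparametric p-values) could be sketched as follows; the file and column names are hypothetical, and the Wilcoxon signed-rank test is used here as a generic paired test, not necessarily the authors' exact method.

    import pandas as pd
    from scipy.stats import wilcoxon

    # Hypothetical per-center fly-out counts, paired by center across the two eras.
    df = pd.read_csv("center_flyouts.csv")  # columns: center_id, flyouts_pre, flyouts_post

    pct_change = 100 * (df["flyouts_post"].sum() - df["flyouts_pre"].sum()) / df["flyouts_pre"].sum()
    stat, p = wilcoxon(df["flyouts_pre"], df["flyouts_post"])  # paired, nonparametric test
    print(f"Fly-outs changed {pct_change:+.0f}% (Wilcoxon signed-rank p = {p:.3f})")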
Subjects
Liver Transplantation, Tissue and Organ Procurement, Liver Transplantation/economics, Humans, Cross-Sectional Studies, United States, Tissue and Organ Procurement/economics, Tissue and Organ Procurement/legislation & jurisprudence, Health Policy, Male, Female, Waiting Lists
ABSTRACT
Background: We aimed to identify the characteristics of new-onset diabetes after liver transplantation (LT) (NODAT) and to investigate its impact on post-transplant outcomes. Methods: Adult LT patients between 2014 and 2020 who received tacrolimus as initial immunosuppression and survived at least 3 months were evaluated. Patients who developed NODAT within 3 months after LT were classified as the NODAT group; the remaining patients were classified into history of diabetes before LT (PHDBT) and non-diabetes (ND) groups. Patient characteristics, post-LT outcomes, and cardiovascular and/or pulmonary complications were compared. Results: A total of 83, 225, and 263 patients were classified into the NODAT, PHDBT, and ND groups, respectively. The proportions of cholestatic liver disease and rejection within 90 days were higher in the NODAT group. Mean serum tacrolimus trough concentration in the first week after LT was 7.12, 6.12, and 6.12 ng/mL, respectively (p < 0.001). Duration of corticosteroid use was significantly longer in NODAT than in PHDBT or ND (416, 289, and 228 days; p < 0.001). Three-year graft and patient survival were significantly worse in NODAT than in ND (80.5% vs. 95.0%, p < 0.001; 82.0% vs. 95.4%, p < 0.001) but similar to PHDBT. Adjusted risks of 3-year graft loss and patient death in Cox regression analysis were significantly higher in NODAT than in ND (adjusted hazard ratio [aHR] 3.41, p = 0.004; aHR 3.61, p = 0.004). Incidence rates of cardiovascular or pulmonary complications after LT in NODAT were significantly higher than in ND but similar to PHDBT. Conclusion: Higher initial tacrolimus concentration and early rejection may be risk factors for NODAT. NODAT was associated with worse post-transplant outcomes.
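The adjusted 3-year graft-loss comparison above relies on Cox regression; a minimal sketch with the lifelines package is shown below, with hypothetical file, column, and covariate names standing in for the study's actual variables.

    import pandas as pd
    from lifelines import CoxPHFitter

    # Hypothetical recipient-level data with 3-year follow-up for graft loss.
    cols = ["time_days", "graft_loss", "nodat", "phdbt", "age", "meld"]
    df = pd.read_csv("post_lt_cohort.csv")[cols]

    cph = CoxPHFitter()
    cph.fit(df, duration_col="time_days", event_col="graft_loss")
    cph.print_summary()  # exp(coef) gives adjusted hazard ratios, e.g., for NODAT vs. the ND reference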
ABSTRACT
BACKGROUND: In 2019, the Organ Procurement and Transplantation Network/United Network for Organ Sharing changed the exception policy for liver allocation to the median Model for End-stage Liver Disease at transplantation (MMaT). This study evaluated the effects of this change on waitlist outcomes of simultaneous liver-kidney transplantation (SLKT) for patients with polycystic liver-kidney disease (PLKD). METHODS: Using the Organ Procurement and Transplantation Network/United Network for Organ Sharing registry, 317 patients with PLKD listed for SLKT between January 2016 and December 2021 were evaluated. Waitlist outcomes were compared between the prepolicy (Era 1) and postpolicy (Era 2) eras. RESULTS: One-year transplant probability was significantly higher in Era 2 than in Era 1 (55.7% versus 37.9%; P = 0.001), and the positive effect of Era 2 on transplant probability remained significant after risk adjustment (adjusted hazard ratio, 1.76; 95% confidence interval, 1.22-2.54; P = 0.002 [ref. Era 1]), whereas waitlist mortality was comparable. Transplant centers were separated into high and low MMaT groups at a score of 29 (median MMaT), and transplant probability in each group was compared between eras. In the high MMaT centers, the 1-y transplant probability was significantly higher in Era 2 (27.5% versus 52.4%; P = 0.003). The positive effect remained significant in the high MMaT center group (adjusted hazard ratio, 2.79; 95% confidence interval, 1.43-5.46; P = 0.003 [ref. Era 1]) but not in the low MMaT center group. Although there was a difference between center groups in Era 1 (P = 0.006), it became comparable in Era 2 (P = 0.54). CONCLUSIONS: The new policy increased 1-y SLKT probability in patients with PLKD and successfully reduced disparities based on center location.
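One-year transplant probability in the presence of competing waitlist mortality can be expressed as a cumulative incidence function; the sketch below uses the Aalen-Johansen estimator from lifelines as a stand-in for the study's competing-risks analysis, with hypothetical file and column names.

    import pandas as pd
    from lifelines import AalenJohansenFitter

    # Hypothetical waitlist records: event_code 1 = transplanted, 2 = died/removed, 0 = censored.
    df = pd.read_csv("plkd_waitlist.csv")  # columns: days_on_list, event_code, era

    ajf = AalenJohansenFitter()
    for era, grp in df.groupby("era"):
        # Cumulative incidence of transplant, treating waitlist death/removal as a competing risk.
        ajf.fit(grp["days_on_list"], grp["event_code"], event_of_interest=1)
        one_year = ajf.cumulative_density_.loc[:365].iloc[-1, 0]
        print(f"{era}: 1-year transplant probability ~ {one_year:.1%}")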
Subjects
Kidney Transplantation, Liver Transplantation, Registries, Waiting Lists, Humans, Liver Transplantation/mortality, Liver Transplantation/adverse effects, Male, Female, Waiting Lists/mortality, Middle Aged, Kidney Transplantation/adverse effects, Kidney Transplantation/mortality, Adult, United States/epidemiology, Tissue and Organ Procurement, Polycystic Kidney Diseases/surgery, Polycystic Kidney Diseases/mortality, Treatment Outcome, Retrospective Studies, End Stage Liver Disease/surgery, End Stage Liver Disease/mortality, End Stage Liver Disease/diagnosis, Time Factors, Risk Factors, Probability, Risk Assessment, Cysts, Liver Diseases
ABSTRACT
BACKGROUND: Liver transplant (LT) using organs donated after circulatory death (DCD) has been increasing in the United States. We investigated whether transplant centers' receptiveness to use of DCD organs impacted patient outcomes. METHODS: Transplant centers were classified as very receptive (group 1), receptive (group 2), or less receptive (group 3) based on the DCD acceptance rate and DCD transplant percentage. Using Organ Procurement and Transplantation Network/UNOS registry data for 20,435 patients listed for LT from January 2020 to June 2022, we compared rates of 1-y transplant probability and waitlist mortality between groups, broken down by Model for End-stage Liver Disease-sodium (MELD-Na) categories. RESULTS: In adjusted analyses, patients in group 1 centers with MELD-Na scores 6 to 29 were significantly more likely to undergo transplant than those in group 3 (aHR range, 1.51-2.11; P < 0.001). Results were similar in comparisons between groups 1 and 2 (aHR range, 1.41-1.81; P < 0.001) and between groups 2 and 3 for MELD-Na 15-24 (aHR, 1.19-1.20; P < 0.007). Likewise, patients with MELD-Na scores 20 to 29 in group 1 centers had lower waitlist mortality than those in group 3 (scores 20-24: aHR, 0.71; P = 0.03; scores 25-29: aHR, 0.51; P < 0.001); those in group 1 also had lower waitlist mortality compared with group 2 (scores 20-24: aHR, 0.69; P = 0.02; scores 25-29: aHR, 0.63; P = 0.03). One-year posttransplant survival of DCD LT recipients did not vary significantly compared with donation after brain death. CONCLUSIONS: We conclude that transplant centers' use of DCD livers can improve waitlist outcomes, particularly among mid-MELD-Na patients.
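The grouping of centers by DCD receptiveness could be reproduced along the following lines; the combined ranking score and tertile rule shown here are illustrative assumptions, since the abstract does not give the exact cutoffs, and the file and column names are hypothetical.

    import pandas as pd

    # Hypothetical center-level DCD metrics.
    centers = pd.read_csv("center_dcd_metrics.csv")  # columns: center_id, dcd_offers,
                                                     # dcd_accepted, total_lt, dcd_lt

    centers["acceptance_rate"] = centers["dcd_accepted"] / centers["dcd_offers"]
    centers["dcd_pct"] = centers["dcd_lt"] / centers["total_lt"]

    # Illustrative rule: rank centers on both metrics and split the combined rank into tertiles,
    # group 1 = very receptive, group 3 = less receptive.
    score = centers["acceptance_rate"].rank(pct=True) + centers["dcd_pct"].rank(pct=True)
    centers["group"] = pd.qcut(score, 3, labels=[3, 2, 1]).astype(int)
    print(centers.groupby("group")[["acceptance_rate", "dcd_pct"]].median())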
Subjects
End Stage Liver Disease, Liver Transplantation, Tissue Donors, Tissue and Organ Procurement, Waiting Lists, Humans, Liver Transplantation/mortality, Waiting Lists/mortality, Male, Female, Middle Aged, United States, Tissue and Organ Procurement/methods, End Stage Liver Disease/surgery, End Stage Liver Disease/mortality, End Stage Liver Disease/diagnosis, Tissue Donors/supply & distribution, Registries, Adult, Treatment Outcome, Severity of Illness Index, Aged, Time Factors
ABSTRACT
OBJECTIVE: To evaluate long-term oncologic outcomes of patients after living donor liver transplantation (LDLT) within and outside standard transplantation selection criteria, and the added value of incorporating the New York-California (NYCA) score. BACKGROUND: LDLT offers an opportunity to decrease the liver transplantation waitlist, reduce waitlist mortality, and expand selection criteria for patients with hepatocellular carcinoma (HCC). METHODS: Primary adult LDLT recipients between October 1999 and August 2019 were identified from a multicenter cohort of 12 North American centers. Posttransplantation and recurrence-free survival were evaluated using the Kaplan-Meier method. RESULTS: Three hundred sixty LDLTs were identified. Patients within Milan criteria (MC) at transplantation had 1-, 5-, and 10-year posttransplantation survival of 90.9%, 78.5%, and 64.1%, versus 90.4%, 68.6%, and 57.7% for those outside MC (P = 0.20). For patients within the University of California San Francisco (UCSF) criteria, posttransplantation survival was 90.6%, 77.8%, and 65.0%, versus 92.1%, 63.8%, and 45.8% for those outside UCSF (P = 0.08). Fifty-three (83%) patients classified as outside MC at transplantation would have been classified as either low or acceptable risk with the NYCA score; these patients had a 5-year overall survival of 72.2%. Similarly, 28 (80%) patients classified as outside UCSF at transplantation would have been classified as low or acceptable risk, with a 5-year overall survival of 65.3%. CONCLUSIONS: Long-term survival is excellent for patients with HCC undergoing LDLT both within and outside selection criteria, exceeding the minimum recommended 5-year rate of 60% proposed by consensus guidelines. The NYCA categorization identifies a substantial proportion of patients with HCC outside the MC and UCSF criteria who still achieve post-LDLT outcomes similar to patients within those criteria.
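Kaplan-Meier survival by selection-criteria status, as used above, can be sketched with lifelines; the file and column names (within_milan, years_followup, death) are hypothetical.

    import pandas as pd
    from lifelines import KaplanMeierFitter
    from lifelines.statistics import logrank_test

    # Hypothetical LDLT recipient data.
    df = pd.read_csv("ldlt_hcc.csv")  # columns: years_followup, death, within_milan (0/1)

    kmf = KaplanMeierFitter()
    for label, grp in df.groupby("within_milan"):
        kmf.fit(grp["years_followup"], grp["death"], label=f"within_milan={label}")
        print(label, kmf.survival_function_at_times([1, 5, 10]).values)  # 1-, 5-, 10-year survival

    inside, outside = df[df["within_milan"] == 1], df[df["within_milan"] == 0]
    res = logrank_test(inside["years_followup"], outside["years_followup"],
                       inside["death"], outside["death"])
    print("log-rank p =", res.p_value)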
Subjects
Hepatocellular Carcinoma, Liver Neoplasms, Liver Transplantation, Adult, Humans, Liver Transplantation/methods, Living Donors, Local Neoplasm Recurrence/etiology, Patient Selection, North America, Retrospective Studies, Treatment Outcome
ABSTRACT
BACKGROUND: Although transfusion management has improved during the last decade, orthotopic liver transplantation (OLT) is associated with considerable blood transfusion requirements, which pose challenges for securing blood bank inventories. Defining the predictors of massive blood transfusion before surgery allows the blood bank to better manage patients' needs without delays. We evaluated the predictors of intraoperative massive transfusion in OLT. STUDY DESIGN AND METHODS: Data were collected on patients who underwent OLT between 2007 and 2017; repeat OLTs were excluded. Analyzed variables included recipients' demographic and pretransplant laboratory variables, donor data, and intraoperative variables. Massive transfusion was defined as intraoperative transfusion of ≥10 units of packed red blood cells (RBCs). Statistical analysis was performed using SPSS version 17.0. RESULTS: The study included 970 OLT patients. The median age was 57 years (range: 16-74); 609 (62.7%) were male. RBCs, thawed plasma, and platelets were transfused intraoperatively to 782 (80.6%), 831 (85.7%), and 422 (43.5%) patients, respectively. Massive transfusion was documented in 119 (12.3%) patients. In multivariate analysis, previous right abdominal surgery, the recipient's hemoglobin, Model for End Stage Liver Disease (MELD) score, cold ischemia time, warm ischemia time, and operation time were predictive of massive transfusion. There was a direct, significant correlation between the number of RBC units transfused and the number of plasma units (Pearson correlation coefficient r = 0.794) and platelet units (r = 0.65). DISCUSSION: Previous abdominal surgery, the recipient's hemoglobin, MELD score, cold ischemia time, warm ischemia time, and operation time were predictive of intraoperative massive transfusion in OLT.
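A multivariable logistic model for massive transfusion plus the reported Pearson correlations could be sketched as follows; file and column names are hypothetical placeholders for the study's variables.

    import pandas as pd
    import statsmodels.formula.api as smf
    from scipy.stats import pearsonr

    # Hypothetical intraoperative transfusion data; massive_tx = 1 if >=10 RBC units were given.
    df = pd.read_csv("olt_transfusion.csv")  # columns: massive_tx, prior_abd_surgery, hgb, meld,
                                             # cold_ischemia_hr, warm_ischemia_min, op_time_hr,
                                             # rbc_units, plasma_units, platelet_units

    model = smf.logit(
        "massive_tx ~ prior_abd_surgery + hgb + meld + cold_ischemia_hr + warm_ischemia_min + op_time_hr",
        data=df,
    ).fit()
    print(model.summary())

    r_plasma, _ = pearsonr(df["rbc_units"], df["plasma_units"])
    r_plt, _ = pearsonr(df["rbc_units"], df["platelet_units"])
    print(f"RBC vs plasma r = {r_plasma:.3f}; RBC vs platelets r = {r_plt:.2f}")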
Subjects
End Stage Liver Disease, Liver Transplantation, Humans, Male, Adolescent, Young Adult, Adult, Middle Aged, Aged, Female, End Stage Liver Disease/surgery, Surgical Blood Loss, Retrospective Studies, Severity of Illness Index, Blood Transfusion, Hemoglobins/analysis
ABSTRACT
BACKGROUND: After implementation of the Acuity Circles (AC) allocation policy, use of donation after circulatory death (DCD) liver grafts has increased in the United States. METHODS: We evaluated the impact of AC on rates of DCD liver transplants (LT), their outcomes, and medical costs in a single practice. Adult LT patients were classified into three eras: Era 1 (pre-AC, 1/01/2015-12/31/2017); Era 2 (late pre-AC era, 1/01/2018-02/03/2020); and Era 3 (AC era, 05/10/2020-09/30/2021). RESULTS: A total of 520 eligible LTs were performed; 87 were DCD and 433 were donation after brain death (DBD). With each successive era, the proportion of DCD increased (Era 1: 11%; Era 2: 20%; Era 3: 24%; p < .001). DCD recipients had longer ICU stays, higher re-admission/re-operation rates, and a higher incidence of ischemic cholangiopathy compared with DBD recipients. Direct, surgical, and ICU costs during the first admission were higher with DCD than with DBD (+8.0%, p < .001; +4.2%, p < .001; and +33.3%, p = .001). DCD-related costs increased after Era 1 (direct: +4.9% [Era 2 vs. 1] and +12.4% [Era 3 vs. 1], p = .04; surgical: +17.7% and +21.7%, p < .001). In the AC era, there was a significantly higher proportion of donors ≥50 years and more national organ sharing. Compared with DCD from donors <50 years, DCD from donors ≥50 years was associated with significantly higher total direct, surgical, and ICU costs (+12.6%, p = .01; +9.5%, p = .01; +84.6%, p = .03). CONCLUSIONS: The proportion of DCD LT, especially from older donors, has increased since the implementation of AC policies. These changes are likely to be associated with higher costs in the AC era.
Subjects
Cardiovascular System, Liver Transplantation, Tissue and Organ Procurement, Adult, Humans, Financial Stress, Graft Survival, Living Donors, Tissue Donors, Retrospective Studies, Death, Brain Death
ABSTRACT
It has been reported that patients hospitalized outside regular working hours have worse outcomes. This study aims to compare outcomes following liver transplantation (LT) performed during public holidays and nonholidays. Methods: We analyzed United Network for Organ Sharing registry data for 55,200 adult patients who underwent LT between 2010 and 2019. Patients were grouped according to LT receipt during public holidays ±3 d (n = 7350) or nonholiday periods (n = 47,850). The overall post-LT mortality hazard was analyzed using multivariable Cox regression models. Results: LT recipient characteristics were similar between public holidays and nonholidays. Compared with nonholidays, deceased donors during public holidays had a lower donor risk index (median [interquartile range]: holidays, 1.52 [1.29-1.83] versus nonholidays, 1.54 [1.31-1.85]; P = 0.001) and shorter cold ischemia time (median [interquartile range]: holidays, 5.82 h [4.52-7.22] versus nonholidays, 5.91 h [4.62-7.38]; P < 0.001). Propensity score matching 4-to-1 was done to adjust for donor and recipient confounders (n = 33,505); LT receipt during public holidays (n = 6701) was associated with a lower risk of overall mortality (hazard ratio 0.94 [95% confidence interval, 0.86-0.99]; P = 0.046). The proportion of livers that were not recovered for transplant was higher during public holidays than nonholidays (15.4% versus 14.5%; P = 0.03). Conclusions: Although LT performed during public holidays was associated with improved overall patient survival, liver discard rates were higher during public holidays compared with nonholidays.
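The 4-to-1 propensity score matching described above could be approximated as below; the covariate list, file, and column names are hypothetical, and this greedy nearest-neighbor matching (with replacement) is only one of several matching algorithms the authors might have used.

    import pandas as pd
    from sklearn.linear_model import LogisticRegression
    from sklearn.neighbors import NearestNeighbors

    # Hypothetical recipient/donor data; holiday = 1 if the LT fell within +/-3 days of a public holiday.
    df = pd.read_csv("unos_lt.csv")  # columns: holiday, age, meld, donor_risk_index, cold_ischemia_hr

    covs = ["age", "meld", "donor_risk_index", "cold_ischemia_hr"]
    df["ps"] = LogisticRegression(max_iter=1000).fit(df[covs], df["holiday"]).predict_proba(df[covs])[:, 1]

    treated = df[df["holiday"] == 1]
    controls = df[df["holiday"] == 0]

    # 4:1 nearest-neighbor matching on the propensity score (greedy, with replacement).
    nn = NearestNeighbors(n_neighbors=4).fit(controls[["ps"]])
    _, idx = nn.kneighbors(treated[["ps"]])
    matched_controls = controls.iloc[idx.ravel()]
    matched = pd.concat([treated, matched_controls])
    print(len(treated), "holiday cases matched to", len(matched_controls), "control records")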
ABSTRACT
The practice of living donor liver transplantation (LDLT) currently delivers limited impact in western transplant centers. The American Society of Transplantation organized a virtual consensus conference in October 2021 to identify barriers and gaps to LDLT growth and to provide evidence-based recommendations to foster safe expansion of LDLT in the United States. This article reports the findings and recommendations regarding innovations and advances in approaches to donor-recipient matching challenges, the technical aspects of the donor and recipient operations, and surgical training. Among these themes, the barriers deemed most influential/detrimental to LDLT expansion in the United States included: (1) prohibitive issues related to donor age, graft size, insufficient donor remnant, and ABO incompatibility; (2) lack of acknowledgment and awareness of the excellent outcomes and benefits of LDLT; (3) ambiguous messaging regarding LDLT to patients and hospital leadership; and (4) a limited number of proficient LDLT surgeons across the United States. Donor-recipient mismatching may be circumvented by liver paired exchange, and the creation of a national registry to generate granular data on donor-recipient matching will guide that practice. The surgical challenges to LDLT are addressed herein, with a focus on developing robust training pathways that result in proficiency in donor and recipient surgery. Strong mentorship/collaboration programs and novel training practices, under the auspices of established training and certification bodies, will add to the breadth and depth of training.
Subjects
Liver Transplantation, Humans, Blood Group Incompatibility, Liver Transplantation/methods, Living Donors
ABSTRACT
BACKGROUND: Acuity circle (AC) policy implementation improved waitlist outcomes for certain liver transplant (LT) candidates. The impact of the policy implementation on liver retransplant (reLT) candidates is unknown. METHODS: Using Organ Procurement and Transplantation Network/United Network for Organ Sharing (OPTN/UNOS) data from January 2018 to September 2021, we investigated the effect of the AC policy on waitlist and post-LT outcomes among patients who had previously received an LT. Patients were categorized by relisting date: pre-AC (Era 1: January 1, 2018-February 3, 2020; n = 750) and post-AC (Era 2: February 4, 2020-June 30, 2021; n = 556). Patient and donor characteristics, as well as on-waitlist and post-reLT outcomes, were compared across eras. RESULTS: In Era 2, the probability of transplant within 90 days, both overall and among patients relisted >14 days after initial transplant (late relisting), was significantly higher than in Era 1 (subdistribution hazard ratio [sHR] 1.40, 95% CI 1.18-1.64, p < .001; sHR 1.52, 95% CI 1.23-1.88, p = .001, respectively). However, there was no difference by era among patients relisted ≤14 days after initial transplant (early relisting; sHR 1.21, 95% CI .93-1.57, p = .15). Moreover, among early relisting patients, risks of 180-day graft loss and mortality were significantly higher in Era 2 than in Era 1 (adjusted hazard ratio [aHR] 5.77, 95% CI 1.71-19.51, p = .004; and aHR 8.22, 95% CI 1.85-36.59, p = .005, respectively); for late relisting patients, risks of these outcomes were similar across eras. CONCLUSION: Our results show that implementation of the AC policy has improved transplant rates and reduced waiting time for reLT candidates relisted >14 days after initial transplant. However, the impact on early relisting patients may be mixed.
Subjects
End Stage Liver Disease, Liver Transplantation, Humans, Waiting Lists, End Stage Liver Disease/surgery, Policies
ABSTRACT
Absolute lymphocyte count (ALC) is considered a surrogate marker for nutritional status and immunocompetence. We investigated the association between ALC and post-liver transplant outcomes in patients who received a deceased donor liver transplant (DDLT). Patients were categorized by ALC at liver transplant: low (<500/µL), mid (500-1000/µL), and high (>1000/µL). Our main analysis used retrospective data (2013-2018) for DDLT recipients from Henry Ford Hospital (United States); the results were further validated using data from the Toronto General Hospital (Canada). Among 449 DDLT recipients, the low ALC group demonstrated lower 180-day survival than the mid and high ALC groups (83.1% vs 95.8% and 97.4%, respectively; low vs mid: P = .001; low vs high: P < .001). A larger proportion of patients with low ALC died of sepsis compared with the combined mid/high groups (9.1% vs 0.8%; P < .001). In multivariable analysis, pretransplant ALC was associated with 180-day mortality (hazard ratio, 0.20; P = .004). Patients with low ALC had higher rates of bacteremia (22.7% vs 8.1%; P < .001) and cytomegaloviremia (15.2% vs 6.8%; P = .03) than patients with mid/high ALC. Low ALC from pretransplant through postoperative day 30 was associated with 180-day mortality among patients who received rabbit antithymocyte globulin induction (P = .001). Pretransplant lymphopenia is associated with short-term mortality and a higher incidence of posttransplant infections in DDLT patients.
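The ALC grouping and 180-day survival comparison above can be sketched with pandas and lifelines; the file and column names are hypothetical, and the three-group log-rank test stands in for the pairwise comparisons reported in the abstract.

    import pandas as pd
    from lifelines import KaplanMeierFitter
    from lifelines.statistics import multivariate_logrank_test

    # Hypothetical DDLT recipient data.
    df = pd.read_csv("ddlt_alc.csv")  # columns: alc_per_ul, days_followup, death

    df["alc_group"] = pd.cut(df["alc_per_ul"], bins=[0, 500, 1000, float("inf")],
                             labels=["low", "mid", "high"])

    kmf = KaplanMeierFitter()
    for grp, sub in df.groupby("alc_group", observed=True):
        kmf.fit(sub["days_followup"], sub["death"])
        print(grp, "180-day survival:", float(kmf.survival_function_at_times(180).iloc[0]))

    res = multivariate_logrank_test(df["days_followup"], df["alc_group"], df["death"])
    print("three-group log-rank p =", res.p_value)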
Subjects
Liver Transplantation, Lymphopenia, United States, Humans, Liver Transplantation/adverse effects, Retrospective Studies, Living Donors, Lymphopenia/etiology, Lymphocyte Count
ABSTRACT
BACKGROUND: Liver transplant (LT) candidates with hepatocellular carcinoma (HCC) often receive cancer treatment before transplant. We investigated the impact of pre-transplant treatment for HCC on the risk of posttransplant recurrence. METHODS: Adult HCC patients who underwent LT at our institution between 2013 and 2020 were included. The impact of pre-LT cancer treatments on cumulative recurrence was evaluated using the Gray and Fine-Gray methods adjusted for confounding factors. Outcomes were considered in two ways: 1) by pathologic complete response (pCR) status among patients who received pre-LT treatment; and 2) among patients without pCR, grouped by pre-LT treatment as A) none, B) one treatment, or C) multiple treatments. RESULTS: The sample included 179 patients, of whom 151 (84%) received pretreatment and 42 (28% of treated) demonstrated pCR. Overall, 22 (12%) patients experienced recurrence. The 5-year cumulative post-LT recurrence rate was significantly lower in patients with pCR than in those without pCR (4.8% vs. 19.2%, P = 0.03). In bivariable analyses, pCR significantly decreased the risk of recurrence. Among the 137 patients without pCR (viable HCC in the explant), 28 (20%) had no pretreatment (A), 70 (52%) had one treatment (B), and 39 (28%) had multiple treatments (C). Patients in group C had higher 5-year recurrence rates than those in groups A or B (39.6% vs. 8.2% and 6.5%; P = 0.004 and P < 0.001, respectively). In bivariable analyses, multiple treatments were significantly associated with recurrence. CONCLUSIONS: pCR is a favorable prognostic factor after LT. When pCR is not achieved by pre-LT treatment, the number of treatments may be associated with post-LT oncological prognosis.
Subjects
Hepatocellular Carcinoma, Liver Neoplasms, Adult, Humans, Hepatocellular Carcinoma/surgery, Hepatocellular Carcinoma/pathology, Liver Neoplasms/surgery, Liver Neoplasms/pathology, Local Neoplasm Recurrence/surgery, Retrospective Studies, Prognosis
ABSTRACT
BACKGROUND: Cold climate is known to affect the frequency and attributable mortality of various illnesses. This study aimed to evaluate the effect of regional climate on liver transplant (LT) outcomes. METHODS: We analyzed data from the United Network for Organ Sharing registry for 98,517 adult patients (aged ≥18 years) who were listed for LT between 2010 and 2019. During this period, 51,571 patients underwent single-organ, deceased donor LT. States were categorized based on their mean winter temperature: warm states (45°F-70°F), intermediate states (30°F-45°F), and cold states (0°F-30°F). Post-LT outcomes at 1 month, 1 year, and 3 years were compared using Cox proportional hazards models. Ninety-day and 1-year waitlist outcomes were compared among climate regions using the Fine-Gray hazard regression model. RESULTS: After adjusting for recipient and donor characteristics, LT candidates in cold states had significantly higher waitlist mortality (90-day: subdistribution hazard ratio [HR] 1.46; 1-year: subdistribution HR 1.41; P < .001) and posttransplant mortality (30-day: subdistribution HR 1.23; P = .009; 1-year: subdistribution HR 1.16; P = .001; 3-year: subdistribution HR 1.08; P = .007). LT recipients in cold states had a higher proportion of deaths due to infections than those in warm states (cold states: 2.3%; intermediate states: 2.1%; and warm states: 1.7%; P < .001). CONCLUSIONS: Cold climate was associated with worse waitlist and posttransplant outcomes; potential reasons include weather-related changes in the behavioral and physiological parameters of patients.
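The climate categorization and the comparison of infection-related death proportions could be sketched as follows; file and column names are hypothetical, and the chi-square test is assumed as the test behind the reported proportions.

    import pandas as pd
    from scipy.stats import chi2_contingency

    # Hypothetical recipient-level data: state mean winter temperature and cause-of-death flag.
    df = pd.read_csv("lt_recipients.csv")  # columns: winter_temp_f, died_of_infection (0/1)

    df["climate"] = pd.cut(df["winter_temp_f"], bins=[0, 30, 45, 70],
                           labels=["cold", "intermediate", "warm"])

    table = pd.crosstab(df["climate"], df["died_of_infection"])
    chi2, p, dof, _ = chi2_contingency(table)
    print(table)
    print(f"chi-square p = {p:.4f}")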
Subjects
Liver Transplantation, Adult, Humans, United States, Liver Transplantation/adverse effects, Retrospective Studies, Waiting Lists, Registries, Weather
ABSTRACT
Advanced donor age is a risk factor for graft loss after liver transplant. We sought to identify recipient characteristics associated with negative post-liver transplant (LT) outcomes in the context of elderly donors. Using 2014-2019 OPTN/UNOS data, LT recipients were classified by donor age: ≥70, 40-69, and <40 years. Recipient risk factors for one-year graft loss were identified and used to create a risk stratification system, which was validated using the 2020 OPTN/UNOS data set. At transplant, significant recipient risk factors for one-year graft loss were: previous liver transplant (adjusted hazard ratio [aHR] 4.37, 95%CI 1.98-9.65); mechanical ventilation (aHR 4.28, 95%CI 1.95-9.43); portal vein thrombosis (aHR 1.87, 95%CI 1.26-2.77); serum sodium <125 mEq/L (aHR 2.88, 95%CI 1.34-6.20); and Karnofsky score 10-30% (aHR 2.03, 95%CI 1.13-3.65) or 40-60% (aHR 1.65, 95%CI 1.08-2.51). Using these risk factors and multiplying their HRs, recipients were divided into low-risk (n = 931) and high-risk (n = 294) groups. The adjusted risk of one-year graft loss in the low-risk recipient group was similar to that of patients with younger donors; results were consistent in the validation data set. Our results show that careful recipient selection can reduce the risks of graft loss associated with older donor age.
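The risk score built by multiplying hazard ratios for the risk factors present in a given recipient can be sketched as below; the input file, column names, and the low/high cutoff are hypothetical, since the abstract does not state the exact threshold used.

    import numpy as np
    import pandas as pd

    # Adjusted hazard ratios reported above for recipients of grafts from elderly donors.
    HR = {"prior_lt": 4.37, "ventilator": 4.28, "portal_thrombosis": 1.87,
          "sodium_lt_125": 2.88, "karnofsky_10_30": 2.03, "karnofsky_40_60": 1.65}

    # Hypothetical recipient records with 0/1 indicators matching the keys above.
    df = pd.read_csv("elderly_donor_recipients.csv")

    # Multiply the HRs of the risk factors present in each recipient (absent factors contribute 1.0).
    df["risk_score"] = np.prod([np.where(df[k] == 1, hr, 1.0) for k, hr in HR.items()], axis=0)

    # Illustrative cutoff only; the study's exact threshold is not given in the abstract.
    df["risk_group"] = np.where(df["risk_score"] >= 2.0, "high", "low")
    print(df["risk_group"].value_counts())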
Subjects
Kidney Transplantation, Liver Transplantation, Transplants, Adult, Aged, Graft Survival, Humans, Kidney Transplantation/adverse effects, Liver Transplantation/adverse effects, Tissue Donors
ABSTRACT
Liver allocation in the United States was updated on February 4, 2020, with the introduction of the acuity circle (AC)-based model. This study evaluated the early effects of AC-based allocation on waitlist outcomes. Methods: Adult liver transplant (LT) candidates listed between January 1, 2019, and September 30, 2021, were assessed. Two periods were defined according to listing date (pre- and post-AC), and 90-d waitlist outcomes were compared. The median transplant Model for End-stage Liver Disease (MELD) score of each transplant center was calculated, and centers were categorized as low- (<25th percentile), mid- (25th-75th percentile), and high-MELD (>75th percentile) centers. Results: A total of 12,421 and 17,078 LT candidates were identified in the pre- and post-AC eras, respectively. Overall, the post-AC era was associated with higher cause-specific 90-d hazards of transplant (cause-specific hazard ratio [csHR], 1.32; 95% confidence interval [CI], 1.27-1.38; P < 0.001) and waitlist mortality (csHR, 1.20; 95% CI, 1.09-1.32; P < 0.001). The latter effect was primarily driven by high-MELD centers. Low-MELD centers used a higher proportion of donation after circulatory death (DCD) grafts. Compared with low-MELD centers, mid- and high-MELD centers had significantly lower cause-specific hazards of DCD-LT in both eras (mid-MELD: csHR, 0.47; 95% CI, 0.38-0.59 in pre-AC and csHR, 0.56; 95% CI, 0.46-0.67 in post-AC; high-MELD: csHR, 0.11; 95% CI, 0.07-0.17 in pre-AC and csHR, 0.14; 95% CI, 0.10-0.20 in post-AC; all P < 0.001). Using a structural Bayesian time-series model, the AC policy was associated with an increase in actual monthly DCD-LTs in low-, mid-, and high-MELD centers (actual/predicted: low-MELD: 19/16; mid-MELD: 21/14; high-MELD: 4/3), whereas the increase in monthly donation after brain death LTs was present only in mid- and high-MELD centers. Conclusions: Although AC-based allocation may improve waitlist outcomes, regional variation exists in the drivers of such outcomes between centers.
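The counterfactual "actual versus predicted" monthly DCD-LT comparison above comes from a structural Bayesian time-series model; the sketch below uses statsmodels' (non-Bayesian) local-linear-trend structural model as an approximate stand-in, with hypothetical file and column names.

    import pandas as pd
    from statsmodels.tsa.statespace.structural import UnobservedComponents

    # Hypothetical monthly DCD-LT counts for one center group, indexed by month string (e.g., "2019-07").
    y = pd.read_csv("monthly_dcd_lt.csv", index_col="month")["dcd_lt"]
    pre, post = y.loc[:"2020-01"], y.loc["2020-02":]  # AC policy took effect February 2020

    # Fit a local-linear-trend structural model to the pre-policy series ...
    fit = UnobservedComponents(pre, level="local linear trend").fit(disp=False)

    # ... and forecast the post-policy months to obtain a "predicted without AC" counterfactual.
    predicted = fit.get_forecast(steps=len(post)).predicted_mean
    print(pd.DataFrame({"actual": post.values, "predicted": predicted.values}, index=post.index))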
ABSTRACT
Introduction: Patients who undergo solid organ transplants have a higher risk of developing malignancies and subsequent recurrences. Clinical outcomes in transplant recipients with primary mucosal head and neck squamous cell carcinoma (HNSCC) are not well described in the published literature. We therefore retrospectively studied outcomes in this group of patients. Methods: This Institutional Review Board (IRB)-approved analysis included patients who had previously undergone solid organ transplantation and were subsequently diagnosed with primary mucosal HNSCC between 2006 and 2021. Our institutional database of solid organ transplant recipients was cross-referenced with our head and neck cancer database to identify the patients included in this cohort. Kaplan-Meier analyses were performed to calculate overall and disease-free survival. Results: Of 1,221 patients, 20 met the inclusion criteria. The median time from organ transplant to HNSCC diagnosis was 5.9 years (range: 0.5-18.5 years). A total of 11 (55.0%) and 9 (45.0%) patients presented with localized and locally advanced disease, respectively. Two-year overall and disease-free survival were 59.1% and 73.5%, respectively. After initial treatment, six (30.0%) patients experienced a recurrence; all of them died within the follow-up period. The median time to death after recurrence was 11.5 months (range: 2-22 months). Conclusion: This series highlights a high mortality rate following recurrence among patients with primary mucosal HNSCC and a history of solid organ transplantation. A better understanding of how solid organ transplant history adversely impacts the course of HNSCC could help guide treatment, follow-up, and survivorship decisions.
ABSTRACT
Combined liver and lung transplantation (CLLT) is indicated in patients with both end-stage liver and end-stage lung disease. Ex-situ normothermic machine perfusion (NMP) has previously been used for extended normothermic lung preservation in CLLT. We describe our single-center experience using ex-situ NMP for extended normothermic liver preservation in CLLT. Four CLLTs were performed from 2019 to 2020, with the lungs transplanted first in all patients. Median ex-situ pump time for the liver was 413 min (IQR 400-424). Over a median follow-up of 15 months (IQR 14-19), all patients were alive and doing well. Extended normothermic liver preservation is a safe method of prolonging preservation time during CLLT, avoiding the need for prolonged cold ischemia.