1.
Pediatr Transplant ; 28(4): e14771, 2024 Jun.
Article En | MEDLINE | ID: mdl-38702924

BACKGROUND: We examined the combined effects of donor age and graft type on pediatric liver transplantation outcomes, with the aim of offering insights into the strategic utilization of these donor and graft options. METHODS: A retrospective analysis was conducted using a national database on 0-2-year-old (N = 2714) and 3-17-year-old (N = 2263) pediatric recipients. These recipients were categorized based on donor age (≥40 vs <40 years) and graft type. Survival outcomes were analyzed using Kaplan-Meier and Cox proportional hazards models, followed by an intention-to-treat (ITT) analysis to examine overall patient survival. RESULTS: Living and younger donors generally resulted in better outcomes compared to deceased and older donors, respectively. This difference was more pronounced among younger recipients (0-2 years compared to 3-17 years). Despite this finding, ITT survival analysis showed that donor age and graft type did not impact survival, with the exception of 0-2-year-old recipients, who had improved survival with a younger living donor graft. CONCLUSIONS: Timely transplantation has the largest impact on survival in pediatric recipients. Reducing waitlist mortality requires uniform surgical expertise at many transplant centers to provide technical variant graft (TVG) options and shed the conservative mindset of seeking only the "best" graft for pediatric recipients.
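The survival comparisons in this abstract rely on Kaplan-Meier estimation. As a generic illustration (not the study's code, and with hypothetical follow-up times), a minimal product-limit estimator can be written as:

```python
def kaplan_meier(times, events):
    """Kaplan-Meier (product-limit) survival estimates.

    times  -- follow-up time for each subject
    events -- 1 if the event (e.g., graft loss) occurred, 0 if censored
    Returns a list of (time, survival_probability) at each event time.
    """
    data = sorted(zip(times, events))  # order subjects by follow-up time
    n_at_risk = len(data)
    survival = 1.0
    curve = []
    i = 0
    while i < len(data):
        t = data[i][0]
        # count events and total subjects tied at this time
        deaths = sum(1 for tt, e in data[i:] if tt == t and e == 1)
        total = sum(1 for tt, e in data[i:] if tt == t)
        if deaths > 0:
            survival *= (n_at_risk - deaths) / n_at_risk
            curve.append((t, survival))
        n_at_risk -= total
        i += total
    return curve
```

Group curves (e.g., younger versus older donors) would then be compared with a log-rank test or a Cox model, as the study describes.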


Graft Survival; Kaplan-Meier Estimate; Liver Transplantation; Tissue Donors; Humans; Child, Preschool; Retrospective Studies; Child; Adolescent; Male; Female; Infant; Age Factors; Infant, Newborn; Proportional Hazards Models; Adult; Treatment Outcome; Living Donors
2.
Transplant Direct ; 10(6): e1630, 2024 Jun.
Article En | MEDLINE | ID: mdl-38769984

Background: Small stature and female sex correlate with decreased deceased donor liver transplant (DDLT) access and higher waitlist mortality. However, efforts are being made to improve access and equity of allocation under the new continuous distribution (CD) system. Liver anteroposterior diameter (APD) is a method used by many centers to determine size compatibility for DDLT, but it is not recorded systematically, so it cannot be used for allocation algorithms. We therefore sought to correlate body surface area (BSA) and height to APD in donors and recipients and compare waitlist outcomes by these factors to support their use in the CD system. Methods: APD was measured from single-center DDLT recipients and donors with cross-sectional imaging. Linear, Pearson, and PhiK correlation coefficients were used to correlate BSA and height to APD. Competing risk analysis of waitlist outcomes was performed using United Network for Organ Sharing data. Results: For 143 pairs, donor BSA correlated better with APD than height did (PhiK = 0.63 versus 0.20). For recipients overall, neither BSA nor height was a good correlate of APD, except in recipients without ascites, where BSA correlated well (PhiK = 0.63) but height did not. However, among female recipients, BSA, but not height, strongly correlated to APD regardless of ascites status (PhiK = 0.80 without, PhiK = 0.70 with). Among male recipients, BSA correlated to APD only in those without ascites (PhiK = 0.74). In multivariable models, both BSA and height were predictive of waitlist outcomes, with higher values associated with increased access, decreased delisting for death/clinical deterioration, and decreased living donor transplant (model concordance 0.748 and 0.747, respectively). Conclusions: Taken together, BSA is a good surrogate for APD and can therefore be used in allocation decision making in the upcoming CD era to offset size- and gender-based disparities among certain candidate populations.
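BSA itself is a derived measure; the abstract does not state which formula the center used, so the Mosteller formula below is an assumption, and plain Pearson correlation stands in for the PhiK coefficient reported in the study. A minimal sketch:

```python
import math

def bsa_mosteller(height_cm, weight_kg):
    """Body surface area in m^2 (Mosteller formula -- one common choice)."""
    return math.sqrt(height_cm * weight_kg / 3600.0)

def pearson(xs, ys):
    """Pearson correlation coefficient of two equal-length sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)
```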

3.
Liver Transpl ; 2024 Jan 31.
Article En | MEDLINE | ID: mdl-38289266

The Area Deprivation Index is a granular measure of neighborhood socioeconomic deprivation. The relationship between neighborhood socioeconomic deprivation and recipient survival following liver transplantation (LT) is unclear. To investigate this, the authors performed a retrospective cohort study of adults who underwent LT at the University of Washington Medical Center from January 1, 2004, to December 31, 2020. The primary exposure was the degree of neighborhood socioeconomic deprivation as determined by the Area Deprivation Index score. The primary outcome was posttransplant recipient mortality. In a multivariable Cox proportional hazards analysis, LT recipients from high-deprivation areas had a higher risk of mortality than those from low-deprivation areas (HR: 1.81; 95% CI: 1.03-3.18, p = 0.04). Notably, the difference in mortality between area deprivation groups did not become statistically significant until 6 years after transplantation. In summary, LT recipients experiencing high socioeconomic deprivation tended to have worse posttransplant survival. Further research is needed to elucidate the extent to which neighborhood socioeconomic deprivation contributes to mortality risk and to identify effective measures to improve survival in more socioeconomically disadvantaged LT recipients.

4.
Transplant Proc ; 56(1): 58-67, 2024.
Article En | MEDLINE | ID: mdl-38195283

BACKGROUND: The prevalence of obesity in older patients undergoing kidney transplantation is increasing. Older age and obesity are associated with higher risks of complications and mortality post-transplantation. The optimal management of this group of patients remains undefined. METHODS: We retrospectively analyzed the United Network for Organ Sharing database of adults ≥70 years of age undergoing primary kidney transplant from January 1, 2014, to December 31, 2022. We examined patient and graft survival stratified by body mass index (BMI) in 3 categories: <30 kg/m2, 30 to 35 kg/m2, and >35 kg/m2. We also analyzed other risk factors that impacted survival. RESULTS: A total of 14,786 patients ≥70 years old underwent kidney transplantation. Of those, 9,731 patients had a BMI <30 kg/m2, 3,726 had a BMI of 30 to 35 kg/m2, and 1,036 had a BMI >35 kg/m2. During the study period, there was a significant increase in kidney transplants in patients ≥70 years old across all BMI groups. Overall, patient survival, death-censored graft survival, and all-cause graft survival were lower in obese patients compared with nonobese patients. Multivariable analysis showed worse patient survival and graft survival in patients with a BMI of 30 to 35 kg/m2, a BMI >35 kg/m2, a longer duration of dialysis, diabetes mellitus, and poor functional status. CONCLUSION: Adults ≥70 years old should be considered for kidney transplantation. Obesity with a BMI of 30 to 35 kg/m2 or >35 kg/m2, a longer duration of dialysis, diabetes, and poor functional status are associated with worse outcomes. Optimization of these risk factors is essential when considering these patients for transplantation.


Diabetes Mellitus; Kidney Transplantation; Humans; Aged; Kidney Transplantation/adverse effects; Retrospective Studies; Renal Dialysis; Treatment Outcome; Obesity/epidemiology; Risk Factors; Graft Survival; Diabetes Mellitus/etiology; Body Mass Index
5.
Clin Transplant ; 38(1): e15170, 2024 01.
Article En | MEDLINE | ID: mdl-37943592

BACKGROUND: An increasing number of older patients are undergoing kidney transplant. Because of finite allograft longevity, more patients will be faced with failing allografts. At present, there is limited understanding of the benefits and risks associated with kidney retransplantation in this challenging population. METHODS: We performed a retrospective analysis of the Organ Procurement and Transplantation Network database of all adults ≥70 years old undergoing kidney retransplant from January 1, 2014, to December 31, 2022. We examined patient and graft survival of retransplanted patients compared to first-time transplants. We also analyzed the risk factors that impacted survival. RESULTS: During the study period there was a significant rise in the number of retransplants performed, with 631 patients undergoing the procedure. Although the differences were not clinically significant, overall graft and patient survival rates were slightly lower in the retransplant group compared to the primary transplant group. With retransplant, patient survival was 91.3%, 75.6%, and 56.9% compared to 93.4%, 81.4%, and 64.4% with primary transplant at 1, 3, and 5 years, respectively. With retransplant, graft survival was 89.5%, 73.5%, and 57.4% compared to 91.5%, 79.0%, and 63.6% in the primary transplant group at 1, 3, and 5 years, respectively. Multivariable analysis showed that factors predicting poor survival included longer time on dialysis before retransplantation and decreased functional capacity. No survival difference was noted between recipients of deceased versus living donor kidneys. Patients who underwent retransplantation before initiating dialysis had better patient and graft survival. CONCLUSION: Patients aged ≥70 achieve satisfactory outcomes following kidney retransplantation, highlighting that chronologic age should not preclude this medically complex population from this life-saving procedure. Improvement in functional status and timely retransplantation are the key factors for a successful outcome.


Kidney Transplantation; Adult; Humans; Aged; Aged, 80 and over; Retrospective Studies; Reoperation; Risk Factors; Graft Survival; Kidney
6.
Am J Transplant ; 24(1): 37-45, 2024 Jan.
Article En | MEDLINE | ID: mdl-37595842

IgA nephropathy (IgAN) is associated with a risk for posttransplant recurrence. Data are limited regarding graft loss attributable to recurrence of IgAN among pediatric and young adult kidney transplant (KT) recipients. This was a retrospective cohort study of patients aged 0 to 25 years from the Scientific Registry of Transplant Recipients who received a primary KT for IgAN. Patients with a history of KT attributable to renal dysplasia were comparators. Outcomes included the incidence of graft loss attributable to IgAN recurrence, association with donor type, and posttransplant corticosteroid use. In total, 5475 transplant recipients were included, with 1915 patients with IgAN and 3560 patients with renal dysplasia. In a multivariable Cox proportional hazards model, IgAN was associated with a higher risk of graft loss (adjusted hazard ratio [aHR], 1.35; 95% CI, 1.21-1.50; P < .001) compared with dysplasia. Graft loss was attributed to recurrent disease in 5.4% of patients with IgAN. In a multivariable competing risks analysis, patients with IgAN receiving a parental living-donor kidney were more likely to experience graft loss from recurrent disease compared with patients with a nonparental living donor (nonparental vs parental: aHR, 0.52; 95% CI, 0.31-0.91; P = .02). Posttransplant prednisone use was not associated with improved graft survival (P = .2). These data challenge existing paradigms in posttransplant management of patients with IgAN.


Glomerulonephritis, IGA; Kidney Transplantation; Humans; Young Adult; Child; Glomerulonephritis, IGA/complications; Glomerulonephritis, IGA/surgery; Kidney Transplantation/adverse effects; Retrospective Studies; Transplant Recipients; Kidney; Chronic Disease; Graft Survival; Recurrence
7.
Front Immunol ; 14: 1194338, 2023.
Article En | MEDLINE | ID: mdl-37457719

Objective: There is an unmet need for optimizing hepatic allograft allocation from nondirected living liver donors (ND-LLD). Materials and methods: Using OPTN living donor liver transplant (LDLT) data (1/1/2000-12/31/2019), we identified 6328 LDLTs (4621 right, 644 left, 1063 left-lateral grafts). Random forest survival models were constructed to predict 10-year graft survival for each of the 3 graft types. Results: Donor-to-recipient body surface area ratio was an important predictor in all 3 models. Other predictors in all 3 models were malignant diagnosis, medical location at LDLT (inpatient/ICU), and moderate ascites. Biliary atresia was important in the left and left-lateral graft models. Re-transplant was important in the right graft models. C-indices for 10-year graft survival predictions for the 3 models were 0.70 (left-lateral), 0.63 (left), and 0.61 (right). Similar C-indices were found for 1-, 3-, and 5-year graft survival. Comparison of model predictions to actual 10-year graft survival demonstrated that the predicted upper quartile survival group in each model had significantly better actual 10-year graft survival compared to the lower quartiles (p<0.005). Conclusion: When applied in clinical context, our models assist with the identification and stratification of potential recipients for hepatic grafts from ND-LLD based on predicted graft survival, while accounting for complex donor-recipient interactions. These analyses highlight the unmet need for granular data collection and machine learning modeling to identify potential recipients who have the best predicted transplant outcomes with ND-LLD grafts.
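The C-index quoted for each model is Harrell's concordance: among comparable pairs, the fraction in which the subject with the higher predicted risk fails first. A generic pairwise implementation (illustrative, not the authors' code):

```python
def concordance_index(times, events, risk_scores):
    """Harrell's C-index for right-censored survival data.

    times       -- observed follow-up times
    events      -- 1 = event observed, 0 = censored
    risk_scores -- higher score = higher predicted risk
    """
    concordant, tied, comparable = 0, 0, 0
    n = len(times)
    for i in range(n):
        for j in range(n):
            # a pair is comparable if subject i had the event before time j
            if events[i] == 1 and times[i] < times[j]:
                comparable += 1
                if risk_scores[i] > risk_scores[j]:
                    concordant += 1
                elif risk_scores[i] == risk_scores[j]:
                    tied += 1
    return (concordant + 0.5 * tied) / comparable
```

A C-index of 0.5 is chance-level ranking; the 0.61-0.70 values reported indicate modest discriminative ability.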


Liver Failure; Liver Transplantation; Humans; Liver Transplantation/adverse effects; Living Donors; Retrospective Studies
8.
Transplantation ; 107(12): 2510-2525, 2023 Dec 01.
Article En | MEDLINE | ID: mdl-37322588

BACKGROUND: The US population is aging, and the number of patients treated for end-stage renal disease is on the rise. In the United States, 38% of people over 65 y old have chronic kidney disease. Clinicians remain reluctant to consider older candidates for transplant, including referring them early. METHODS: We conducted a retrospective analysis of the Organ Procurement and Transplantation Network database of all adults ≥70 y old undergoing kidney transplants from December 1, 2014, to June 30, 2021. We compared patient and graft survival in candidates who were transplanted while on hemodialysis versus preemptively, with a living versus deceased donor kidney. RESULTS: In 2021, only 43% of the candidates listed for transplant were preemptive. In an intention-to-treat analysis from the time of listing, candidate survival was significantly improved for those transplanted preemptively versus on dialysis (hazard ratio 0.59; confidence interval, 0.56-0.63). All donor types (donation after circulatory death, donation after brain death, and living donor) conferred a significant decrease in death compared with remaining on the waiting list. Patients who were on dialysis or transplanted preemptively with a living donor kidney had significantly better survival than those receiving a deceased donor kidney. However, receiving a deceased donor kidney still significantly decreased the chance of death compared with remaining on the waiting list. CONCLUSIONS: Patients ≥70 y old who are transplanted preemptively, whether with a deceased donor or a living donor kidney, have significantly better survival than those who are transplanted after initiating dialysis. Emphasis on timely referral for a kidney transplant should be placed in this population.


Kidney Failure, Chronic; Kidney Transplantation; Tissue and Organ Procurement; Adult; Humans; United States; Aged; Aged, 80 and over; Kidney Transplantation/adverse effects; Retrospective Studies; Intention to Treat Analysis; Tissue Donors; Kidney Failure, Chronic/diagnosis; Kidney Failure, Chronic/surgery; Kidney; Living Donors; Graft Survival; Waiting Lists
9.
JAMA Surg ; 158(6): 610-616, 2023 06 01.
Article En | MEDLINE | ID: mdl-36988928

Importance: Small waitlist candidates are significantly less likely than larger candidates to receive a liver transplant. Objective: To investigate the magnitude of the size disparity and test potential policy solutions. Design, Setting, and Participants: A decision analytical model was generated to match liver transplant donors to waitlist candidates based on predefined body surface area (BSA) ratio limits (donor BSA divided by recipient BSA). Participants included adult deceased liver transplant donors and waitlist candidates in the Organ Procurement and Transplantation Network database from June 18, 2013, to March 20, 2020. Data were analyzed from January 2021 to September 2021. Exposures: Candidates were categorized into 6 groups according to BSA from smallest (group 1) to largest (group 6). Waitlist outcomes were examined. A match run was created for each donor under the current acuity circle liver allocation policy, and the proportion of candidates eligible for a liver based on BSA ratio was calculated. Novel allocation models were then tested. Main Outcomes and Measures: Time on the waitlist, assigned Model for End-Stage Liver Disease (MELD) score, and proportion of patients undergoing a transplant were compared by BSA group. Modeling under the current allocation policies was used to determine baseline access to transplant by group. Simulation of novel allocation policies was performed to examine change in access. Results: There were 41 341 donors (24 842 [60.1%] male and 16 499 [39.9%] female) and 84 201 waitlist candidates (53 724 [63.8%] male and 30 477 [36.2%] female) in the study. The median age of the donors was 42 years (IQR, 28-55) and waitlist candidates, 57 years (IQR, 50-63). Females were overrepresented in the 2 smallest BSA groups (7100 [84.0%] and 7922 [61.1%] in groups 1 and 2, respectively). 
For each increase in group number, waitlist time decreased (234 days [IQR, 48-700] for group 1 vs 179 days [IQR, 26-503] for group 6; P < .001) and the proportion of the group undergoing transplant likewise improved (3890 [46%] in group 1 vs 4932 [57%] in group 6; P < .001). The smallest 2 groups of candidates were disadvantaged under the current acuity circle allocation model, with 37% and 7.4% fewer livers allocated relative to their proportional representation on the waitlist. Allocation of the smallest 10% of donors (by BSA) to the smallest 15% of candidates overcame this disparity, as did performing split liver transplants. Conclusions and Relevance: In this study, liver waitlist candidates with the smallest BSAs had a disadvantage due to size. Prioritizing allocation of smaller liver donors to smaller candidates may help overcome this disparity.
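The match run described above screens each donor against the ordered waitlist using a BSA ratio. A hedged sketch of that eligibility filter (the 0.5-1.5 window below is a made-up example; the study tested several predefined BSA-ratio limits):

```python
def eligible_candidates(donor_bsa, waitlist, min_ratio=0.5, max_ratio=1.5):
    """Return candidates size-eligible for a donor, keeping waitlist order.

    waitlist -- list of (candidate_id, candidate_bsa), already sorted by
                allocation priority (e.g., MELD within acuity circles)
    """
    out = []
    for cand_id, cand_bsa in waitlist:
        ratio = donor_bsa / cand_bsa  # donor BSA divided by recipient BSA
        if min_ratio <= ratio <= max_ratio:
            out.append(cand_id)
    return out
```

Tightening the window for the smallest donors, or splitting large livers, is how the simulated policies route small grafts toward small candidates.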


End Stage Liver Disease; Liver Transplantation; Tissue and Organ Procurement; Adult; Humans; Male; Female; Middle Aged; End Stage Liver Disease/surgery; Body Surface Area; Severity of Illness Index; Living Donors; Tissue Donors; Waiting Lists
10.
Am J Transplant ; 22(12): 3087-3092, 2022 12.
Article En | MEDLINE | ID: mdl-36088649

The kidney donor risk index (KDRI) and its percentile conversion, the kidney donor profile index (KDPI), provide a continuous measure of donor quality. Kidneys with a KDPI >85% are referred to as "high KDPI." The KDRI value corresponding to the KDPI 85% cutoff (KDRI85) changes every year, impacting which kidneys are labeled as high KDPI. We examine kidney utilization around the KDPI 85% cutoff and explore the "high KDPI" labeling effect. KDRI-to-KDPI mapping tables from 2012 to 2020 were used to determine the yearly KDRI85 value. Organ Procurement and Transplantation Network data were used to calculate discard rates and model organ use. KDRI85 varied between 1.768 and 1.888. In a multivariable analysis, kidney utilization was lower for KDPI 86% kidneys compared with KDPI 85% kidneys (p = .046). Kidneys with a KDRI between 1.785 and 1.849 were classified as high KDPI in 2015-2017 and as low KDPI in 2018-2020. The discard rate for these kidneys was 44.9% when labeled high KDPI and 39.1% when labeled low KDPI (p < .01). For kidneys with the same KDRI, the high KDPI label is associated with increased discard. We should reconsider the appropriateness of the "high KDPI" label.
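The KDRI-to-KDPI conversion is a percentile lookup against a reference donor population, which is why the KDRI value at the 85% cutoff drifts from year to year. An illustrative (not OPTN's actual) mapping:

```python
import bisect

def kdri_to_kdpi(kdri, reference_kdris):
    """Map a KDRI value to a KDPI percentile against a reference cohort.

    reference_kdris -- KDRI values of a reference donor population
                       (e.g., the prior calendar year's donors)
    Returns the percentage of reference donors with KDRI <= the given value.
    """
    ref = sorted(reference_kdris)
    rank = bisect.bisect_right(ref, kdri)
    return round(100.0 * rank / len(ref))
```

With a new reference cohort swapped in annually, the same KDRI can fall on either side of the 85th percentile, which is exactly the labeling effect the study examines.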


Kidney Transplantation; Tissue and Organ Procurement; Humans; Donor Selection; Graft Survival; Risk Factors; Tissue Donors; Kidney; Retrospective Studies
12.
J Surg Educ ; 79(5): 1132-1139, 2022.
Article En | MEDLINE | ID: mdl-35660307

OBJECTIVE: General surgery remains a male-dominated specialty. Women constitute 54% of medical students at the University of Washington, but only 3.4% of full professors within the Department of Surgery. Many believe surgical attrition and "the leaky pipeline" start during medical school clerkships, but the exact deterrents remain undefined. This study examined the impact of gender on grading during the third-year surgical clerkship. DESIGN: A retrospective analysis of confidential final clerkship grades, examination scores, and subjective clerkship grades was conducted. These were compared by gender, time period, and type of clerkship site. Chi-square analyses were performed. SETTING: Clerkship sites across multiple academic (n = 6) and nonacademic (n = 14) locations. PARTICIPANTS: All third-year medical students undergoing a core surgical clerkship over 2 time periods, 2007 to 2010 (period 1) and 2016 to 2019 (period 2), were included. RESULTS: There were 539 medical students in period 1 and 792 in period 2. The percentage of women was stable over time (52.0% vs 54.2%, p = 0.43). Final clerkship grades of Honors increased significantly from period 1 to period 2 (22.3% vs 44.3%, p < 0.0001) and were similarly distributed by gender (women: 21.4% vs 48.0%, p < 0.0001; men: 23.2% vs 39.9%, p < 0.0001). Honors on examinations remained stable over time and did not differ by gender. Women earned more final clerkship honors than men at academic sites in period 2 (48.4% vs 30.9%, p < 0.001). This finding was not identified in period 1, nor at nonacademic sites. CONCLUSION: There was a significant increase in surgical clerkship honors over the past decade, independent of gender.
As time progressed, women attained more clinical and final clerkship honors than men, with similar examination scores. This suggests that gender bias in the subjective grading of women at this institution does not directly contribute to the loss of talented women as they progress from medical student to faculty; the gender imbalance within the department does not appear to be related to clerkship evaluations.
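The chi-square analyses referenced in the design section compare honors proportions across groups. For a 2x2 table (e.g., gender by honors/no honors), the Pearson statistic without continuity correction is, as a generic sketch:

```python
def chi_square_2x2(table):
    """Pearson chi-square statistic for a 2x2 contingency table.

    table = [[a, b], [c, d]], e.g. rows = gender, cols = honors yes/no.
    """
    row = [sum(r) for r in table]
    col = [table[0][j] + table[1][j] for j in range(2)]
    n = row[0] + row[1]
    stat = 0.0
    for i in range(2):
        for j in range(2):
            expected = row[i] * col[j] / n  # expected count under independence
            stat += (table[i][j] - expected) ** 2 / expected
    return stat
```

The statistic is compared against a chi-square distribution with 1 degree of freedom to obtain p values like those reported.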


Clinical Clerkship; Students, Medical; Clinical Competence; Educational Measurement; Female; Humans; Male; Retrospective Studies; Sexism
13.
Transplantation ; 106(11): 2217-2223, 2022 11 01.
Article En | MEDLINE | ID: mdl-35675439

BACKGROUND: Because of the continued demand in kidney transplantation, organs from donors with risk criteria for blood-borne viruses, high Kidney Donor Profile Index (KDPI) kidneys, and hepatitis C virus (HCV)-positive kidneys are being considered. There continues to be reluctance on the part of providers and candidates to accept HCV-positive kidneys. METHODS: We conducted a retrospective analysis of the Organ Procurement and Transplantation Network database of all adult (≥18 y old) recipients undergoing kidney transplant from May 10, 2013, to June 30, 2021. We compared patient and graft survival in candidates who received HCV-positive kidneys versus non-HCV, high-KDPI kidneys by estimated posttransplant survival (EPTS) group. RESULTS: HCV-viremic kidneys were transplanted in 5.6% of patients in the EPTS >61% group compared with 5.1% of patients in the 21%-60% EPTS group and 1.9% of the 0%-20% EPTS group (P < 0.001). Of all transplants performed in the EPTS 61%-100% group, 11.9% were KDPI >85% compared with 5.2% in the EPTS 21%-60% group and 0.5% in the EPTS 0%-20% group. Patient survival was significantly longer at 1, 3, and 5 y in the EPTS >61% group who received HCV-viremic or -nonviremic allografts compared with non-HCV kidneys with KDPI >85%. As for listing, only 25% of candidates in the EPTS >61% group were listed for HCV nucleic acid testing-positive kidneys in 2021. CONCLUSIONS: Our findings could be used for counseling candidates on the types of kidneys they should consider for transplantation. Also, listing practices for HCV-viremic kidneys need continued re-evaluation.


Hepatitis C; Nucleic Acids; Tissue and Organ Procurement; Adult; Humans; Aged; Hepacivirus/genetics; Retrospective Studies; Tissue Donors; Graft Survival; Hepatitis C/complications; Hepatitis C/diagnosis; Kidney; Viremia/diagnosis
14.
Am J Surg ; 224(1 Pt B): 612-616, 2022 07.
Article En | MEDLINE | ID: mdl-35361472

BACKGROUND: Due to the COVID-19 pandemic, medical schools were forced to adapt clinical curricula. The University of Washington School of Medicine created a hybrid in-person and virtual general surgery clerkship. METHODS: The third-year general surgery clerkship was modified to a 4-week in-person and 2-week virtual clerkship to accommodate the same number of learners in less time. All students completed a survey to assess the impact of the virtual clerkship. RESULTS: The students preferred faculty lectures over national modules in the virtual clerkship. Of the students, 58.6% indicated they would prefer the virtual component before the in-person experience. There was no change from previous years in final grades or clerkship exam scores after this hybrid curriculum. CONCLUSIONS: If the need for a virtual general surgery curriculum arises again in the future, learners value this experience at the beginning of the clerkship and prefer faculty lectures over national modules.


COVID-19; Clinical Clerkship; Education, Medical, Undergraduate; General Surgery; Students, Medical; COVID-19/epidemiology; Curriculum; General Surgery/education; Humans; Pandemics
15.
Transplant Direct ; 8(2): e1282, 2022 Feb.
Article En | MEDLINE | ID: mdl-35047664

BACKGROUND: The current Model for End-Stage Liver Disease-based liver allocation system in the United States prioritizes the sickest patients first at the expense of long-term graft survival. In a continuous distribution model, a measure of posttransplant survival will also be included. We aimed to use mathematical optimization to match donors and recipients based on quality to examine the potential impact of an allocation system designed to maximize long-term graft survival. METHODS: Cox proportional hazards models using Organ Procurement and Transplantation Network data from 2008 to 2012 were used to place donors and waitlist candidates into 5 groups of increasing risk for graft loss (1 = lowest to 5 = highest). A mixed integer programming optimization model was then used to generate allocation rules that maximized graft survival at 5 and 8 y. RESULTS: Allocation based on mathematical optimization improved 5-y survival by 7.5% (78.2% versus 70.7% in the historic cohort), avoiding 2271 graft losses, and 8-y survival by 9% (71.8% versus 62.8%), avoiding 2725 graft losses. Long-term graft survival for recipients within a quality group is highly dependent on donor quality. All candidates in groups 1 and 2 and 43% of those in group 3 were transplanted, whereas none of the candidates in groups 4 and 5 were transplanted. CONCLUSIONS: Long-term graft survival can be improved using a model that allocates livers based on both donor and recipient quality, and the interaction between donor and recipient quality is an important predictor of graft survival. Considerations for incorporation into a continuous distribution model are discussed.
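The study's allocation rules come from a mixed integer program; for intuition, here is a brute-force stand-in that maximizes total predicted graft survival over all donor-candidate matchings (feasible only for toy instances, and the survival values are hypothetical):

```python
from itertools import permutations

def best_assignment(survival):
    """Exhaustively match donors to candidates to maximize total
    predicted graft survival.

    survival[i][j] -- predicted graft survival (years) if donor i's
                      organ goes to candidate j
    Returns (best_total, assignment) where assignment[i] is the
    candidate index matched to donor i.
    """
    n = len(survival)
    best_total, best_perm = -1.0, None
    for perm in permutations(range(n)):
        total = sum(survival[i][perm[i]] for i in range(n))
        if total > best_total:
            best_total, best_perm = total, perm
    return best_total, best_perm
```

A real solver replaces the factorial search and can encode group-based rules such as "group 1 donors are offered to group 1-2 candidates first."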

16.
Liver Transpl ; 28(3): 407-421, 2022 03.
Article En | MEDLINE | ID: mdl-34587357

Acute graft-versus-host disease (GVHD) is a rare complication after orthotopic liver transplantation (OLT) that carries high mortality. We hypothesized that machine-learning algorithms to predict rare events would identify patients at high risk for developing GVHD. To develop a predictive model, we retrospectively evaluated the clinical features of 1938 donor-recipient pairs at the time they underwent OLT at our center; 19 (1.0%) of these recipients developed GVHD. This population was divided into training (70%) and test (30%) sets. A total of 7 machine-learning classification algorithms were built based on the training data set to identify patients at high risk for GVHD. The C5.0, heterogeneous ensemble, and generalized gradient boosting machine (GGBM) algorithms predicted that 21% to 28% of the recipients in the test data set were at high risk for developing GVHD, with an area under the receiver operating characteristic curve (AUROC) of 0.83 to 0.86. The 7 algorithms were then evaluated in a validation data set of 75 more recent donor-recipient pairs who underwent OLT at our center; 2 of these recipients developed GVHD. The logistic regression, heterogeneous ensemble, and GGBM algorithms predicted that 9% to 11% of the validation recipients were at high risk for developing GVHD, with an AUROC of 0.93 to 0.96 that included the 2 recipients who developed GVHD. In conclusion, we present a practical model that can identify patients at high risk for GVHD who may warrant additional monitoring with peripheral blood chimerism testing.
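The AUROC values reported can be computed without any curve plotting via the rank-sum identity: the probability that a randomly chosen GVHD case is scored higher than a randomly chosen non-case, with ties counting half. A minimal sketch (not the authors' pipeline):

```python
def auroc(labels, scores):
    """Area under the ROC curve by pairwise comparison.

    labels -- 1 for positives (e.g., developed GVHD), 0 for negatives
    scores -- model risk scores, higher = more likely positive
    """
    pos = [s for y, s in zip(labels, scores) if y == 1]
    neg = [s for y, s in zip(labels, scores) if y == 0]
    wins = 0.0
    for p in pos:
        for q in neg:
            if p > q:
                wins += 1.0
            elif p == q:
                wins += 0.5
    return wins / (len(pos) * len(neg))
```

With only 2 positives in the 75-pair validation set, as here, the AUROC rests on very few comparisons, which is worth keeping in mind when reading the 0.93-0.96 range.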


Graft vs Host Disease; Liver Transplantation; Area Under Curve; Graft vs Host Disease/diagnosis; Graft vs Host Disease/etiology; Humans; Liver Transplantation/adverse effects; Machine Learning; Retrospective Studies
17.
Exp Clin Transplant ; 19(12): 1303-1312, 2021 12.
Article En | MEDLINE | ID: mdl-34951349

OBJECTIVES: Simultaneous liver-kidney transplant is a treatment option for patients with end-stage liver disease and concomitant irreversible kidney injury. We developed a decision tool to aid transplant programs in advising their candidates for simultaneous liver-kidney transplant on accepting high-risk grafts versus waiting for lower-risk grafts. MATERIALS AND METHODS: To find the critical decision factors, we used the prescriptive analytic technique of microsimulation. All probabilities used in the simulation model were calculated from Organ Procurement and Transplantation Network data collected from February 27, 2002, to June 30, 2018. RESULTS: The simulated patient population results revealed, on average, that high-risk candidates for simultaneous liver-kidney transplant who accept high-risk organs have 254.8 ± 225.4 weeks of life compared with 285.6 ± 232.4 weeks if they wait for better organs. However, critical decision factors included the specific organ offer rates within individual transplant programs and the rank of the candidate on each program's waitlist. Thus, for programs with lower organ offer rates, or for candidates with a rare blood type, a high-risk simultaneous liver-kidney transplant candidate might accept a high-risk organ for longer survival. CONCLUSIONS: Our model can be utilized to determine when acceptance of high-risk organs for patients being considered for simultaneous liver-kidney transplant would lead to a survival benefit, based on probabilities specific to their program.
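Microsimulation here means simulating many individual patient trajectories under each choice and comparing mean outcomes. The sketch below is a toy version with exponential waiting and survival times; all parameter values are hypothetical, not the OPTN-derived probabilities the study used:

```python
import random

def simulate_accept_vs_wait(offer_rate_per_week, survival_on_list_weeks,
                            high_risk_survival_weeks, standard_survival_weeks,
                            n_sim=10000, seed=0):
    """Toy microsimulation of 'accept a high-risk organ now' versus
    'wait for a standard organ'. Returns mean survival (weeks) under
    each strategy.
    """
    rng = random.Random(seed)
    accept_total = wait_total = 0.0
    for _ in range(n_sim):
        # Strategy 1: accept the high-risk organ immediately
        accept_total += rng.expovariate(1.0 / high_risk_survival_weeks)
        # Strategy 2: wait for a standard-criteria offer, risking
        # death on the waitlist first
        death_week = rng.expovariate(1.0 / survival_on_list_weeks)
        offer_week = rng.expovariate(offer_rate_per_week)
        if offer_week < death_week:
            wait_total += offer_week + rng.expovariate(1.0 / standard_survival_weeks)
        else:
            wait_total += death_week
    return accept_total / n_sim, wait_total / n_sim
```

Consistent with the abstract, the better strategy flips with the program-specific offer rate: frequent offers favor waiting, rare offers favor accepting the high-risk organ.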


Kidney Transplantation; Tissue and Organ Procurement; Humans; Kidney; Kidney Transplantation/methods; Liver; Tissue Donors; Treatment Outcome; Waiting Lists
18.
Transplant Direct ; 7(8): e733, 2021 Aug.
Article En | MEDLINE | ID: mdl-34291155

BACKGROUND: As the rate of early postoperative complications declines after transplant with pediatric donation after circulatory death (DCD) kidneys, attention has shifted to the long-term consequences of donor-recipient (D-R) size disparity given the pernicious systemic effects of inadequate functional nephron mass. METHODS: We conducted a retrospective cohort study using Organ Procurement and Transplantation Network data for all adult (aged ≥18 y) recipients of pediatric (aged 0-17 y) DCD kidneys in the United States from January 1, 2004, to March 10, 2020. RESULTS: DCD pediatric allografts transplanted between D-R pairs with a body surface area (BSA) ratio of 0.10-0.70 carried an increased risk of all-cause graft failure (relative risk [RR], 1.36; 95% confidence interval [CI], 1.10-1.69) and patient death (RR, 1.32; 95% CI, 1.01-1.73) when compared with pairings with a ratio of >0.91. Conversely, similar graft and patient survivals were demonstrated among the >0.70-0.91 and >0.91 cohorts. Furthermore, we found no difference in death-censored graft survival between all groups. Survival analysis revealed improved 10-y patient survival in recipients of en bloc allografts (P = 0.02) compared with recipients of single kidneys with D-R BSA ratios of 0.10-0.70. A similar survival advantage was demonstrated in recipients of solitary allografts with D-R BSA ratios >0.70 compared with the 0.10-0.70 cohort (P = 0.02). CONCLUSIONS: Inferior patient survival is likely associated with systemic sequelae of insufficient renal functional capacity in size-disparate DCD kidney recipients, which can be overcome by appropriate BSA matching or en bloc transplantation. We therefore suggest that in DCD kidney transplantation, D-R BSA ratios of 0.10-0.70 serve as criteria for en bloc allocation or alternative recipient selection to optimize the D-R BSA ratio to >0.70.

19.
Exp Clin Transplant ; 19(3): 250-258, 2021 03.
Article En | MEDLINE | ID: mdl-33605200

OBJECTIVES: Despite data showing equivalent outcomes between grafts from marginal versus standard criteria deceased liver donors, elevated donor transaminases constitute a frequent reason to decline potential livers. We assessed the effect of donor transaminase levels and other characteristics on graft survival. MATERIALS AND METHODS: We performed a retrospective cohort analysis of adult first deceased donor liver transplant recipients with available transaminase levels registered in the Organ Procurement and Transplantation Network database (2008-2018). We used Cox proportional hazards regression to determine the effects of donor characteristics on graft survival. RESULTS: Of 53 913 liver transplants, 52 158 were allografts from donors with low transaminases (≤ 500 U/L; group A) and 1755 were from donors with elevated transaminases (> 500 U/L; group B). Group A recipients were more likely to be hospitalized (P = .01) or in intensive care (P < .001) or to have mechanical assistance (P < .001), portal vein thrombosis (P = .01), diabetes mellitus (P = .003), or dialysis the week before liver transplant (P = .004). Multivariable analysis (controlling for recipient characteristics) showed that donor risk factors for graft failure included diabetes mellitus (P < .001), donation after cardiac death (P < .001), total bilirubin > 3.5 mg/dL (P < .001), serum creatinine > 1.5 mg/dL (P = .01), and cold ischemia time > 6 hours (P < .001). Regional organ sharing showed lower risk of graft failure (P = .02). Donor transaminases > 500 U/L were not associated with graft failure (relative risk, 1.02; 95% CI, 0.91-1.14; P = .74). CONCLUSIONS: Donor transaminases > 500 U/L should not preclude the use of liver grafts. Instead, donor total bilirubin > 3.5 mg/dL and serum creatinine > 1.5 mg/dL appear to be associated with higher likelihood of graft failure after liver transplant.
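The donor risk factors this multivariable analysis associated with graft failure can be sketched as a simple screening helper. This is an illustration only, not the study's model: the dictionary field names are assumptions, and the thresholds are copied from the abstract.

```python
def donor_graft_failure_flags(donor: dict) -> list[str]:
    """Flag donor characteristics the abstract associates with graft failure.
    Field names (e.g. 'total_bilirubin_mg_dl') are hypothetical."""
    flags = []
    if donor.get("diabetes"):
        flags.append("diabetes mellitus")
    if donor.get("dcd"):
        flags.append("donation after cardiac death")
    if donor.get("total_bilirubin_mg_dl", 0) > 3.5:
        flags.append("total bilirubin > 3.5 mg/dL")
    if donor.get("creatinine_mg_dl", 0) > 1.5:
        flags.append("serum creatinine > 1.5 mg/dL")
    if donor.get("cold_ischemia_h", 0) > 6:
        flags.append("cold ischemia time > 6 h")
    # Per the study's key finding, transaminases > 500 U/L are deliberately
    # NOT flagged: they were not associated with graft failure (RR 1.02).
    return flags
```

Note that a donor with elevated transaminases but none of the above factors returns no flags, mirroring the abstract's conclusion that transaminases > 500 U/L alone should not preclude graft use.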


Graft Survival , Liver Transplantation , Living Donors , Tissue and Organ Procurement , Transaminases/blood , Bilirubin/blood , Creatinine/blood , Diabetes Mellitus , Humans , Liver Transplantation/adverse effects , Retrospective Studies , Risk Factors
20.
Exp Clin Transplant ; 19(1): 8-13, 2021 01.
Article En | MEDLINE | ID: mdl-32133939

OBJECTIVES: Kidney transplant is the optimal treatment for patients with end-stage renal disease. The effects of using machine perfusion for donor kidneys with varying Kidney Donor Profile Index scores are unknown. We sought to assess the impact of machine perfusion on the incidence of delayed graft function in different score groups of kidney grafts classified with the Kidney Donor Profile Index. MATERIALS AND METHODS: We conducted a retrospective analysis from January 2008 through September 2017 of adult recipients (≥ 18 years old) undergoing kidney-only transplant from deceased donors. All transplant recipients were followed until December 2017. Recipients who received multiorgan transplants or kidneys from living donors were excluded from our analyses. Recipients were divided according to 5 donor categories of Kidney Donor Profile Index scores (0-20, 21-40, 41-60, 61-80, and 81-100). Logistic regression analysis was performed for each score group to determine the effects of machine perfusion on development of delayed graft function within each score group. RESULTS: Our study included 101222 recipients who met the inclusion criteria. Multivariate analysis revealed that machine perfusion was associated with significantly decreased development of delayed graft function only in donors with high-risk profiles: the 61 to 80 score group (odds ratio = 0.83; confidence interval, 0.78-0.89) and the 81 to 100 score group (odds ratio = 0.72; confidence interval, 0.67-0.78). CONCLUSIONS: Machine perfusion is beneficial in reducing delayed graft function only in donor kidneys with a higher risk profile.
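The five KDPI score bins used in this study, and the two bins in which machine perfusion showed a benefit, can be sketched as follows. This is an illustration of the grouping described in the abstract, not the authors' analysis code; the function and variable names are assumptions.

```python
def kdpi_group(kdpi: int) -> str:
    """Assign a KDPI score (0-100) to one of the five bins used in the study."""
    bins = [(0, 20), (21, 40), (41, 60), (61, 80), (81, 100)]
    for lo, hi in bins:
        if lo <= kdpi <= hi:
            return f"{lo}-{hi}"
    raise ValueError("KDPI must be between 0 and 100")

# Per the abstract, machine perfusion was associated with reduced delayed
# graft function only in the two highest-risk bins (odds ratios shown).
PERFUSION_BENEFIT_OR = {"61-80": 0.83, "81-100": 0.72}
```

So a donor kidney with KDPI 75 falls in the 61-80 bin, where machine perfusion was associated with a 17% reduction in the odds of delayed graft function, whereas a KDPI of 35 falls in a bin where no benefit was demonstrated.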


Delayed Graft Function , Kidney Transplantation , Adult , Delayed Graft Function/etiology , Delayed Graft Function/prevention & control , Humans , Kidney Transplantation/adverse effects , Perfusion , Retrospective Studies