Results 1 - 20 of 57
1.
Transpl Int ; 37: 13218, 2024.
Article in English | MEDLINE | ID: mdl-39100754

ABSTRACT

Delayed graft function (DGF) after kidney transplantation heralds a worse prognosis. In patients with hyperoxaluria, the incidence of DGF is high. Oxalic acid is a waste product that accumulates when kidney function decreases. We hypothesized that residual diuresis and accumulated waste products influence the incidence of DGF. Patients transplanted between 2018 and 2022 participated in this prospective cohort study. Pre-transplant concentrations of oxalic acid and its precursors were determined. Data on residual diuresis and other recipient-, donor- or transplant-related variables were collected. 496 patients were included, of whom 154 were not on dialysis. Oxalic acid and glyoxylic acid were above the upper normal concentration in 98.8% and 100% of patients, respectively. Residual diuresis was ≤150 mL/min in 24% of patients. DGF occurred in 157 patients. Multivariable binary logistic regression analysis demonstrated a significant influence of dialysis type, recipient BMI, donor type, age, and serum creatinine on the risk of DGF. Residual diuresis and glycolic acid concentration were inversely related to this risk, and glyoxylic acid directly related. Analysis restricted to the dialysis population showed the same results, except that glyoxylic acid lacked significance. In conclusion, low residual diuresis is associated with an increased incidence of DGF. Accumulated waste products may also play a role. Pre-emptive transplantation may decrease the incidence of DGF.


Subject(s)
Delayed Graft Function , Diuresis , Glyoxylates , Kidney Transplantation , Oxalic Acid , Humans , Kidney Transplantation/adverse effects , Female , Male , Middle Aged , Delayed Graft Function/etiology , Delayed Graft Function/epidemiology , Adult , Prospective Studies , Aged , Renal Dialysis , Glycolates , Hyperoxaluria/etiology , Risk Factors , Incidence
2.
Transpl Int ; 36: 11112, 2023.
Article in English | MEDLINE | ID: mdl-37342179

ABSTRACT

Computerized integration of alternative transplantation programs (CIAT) is a kidney-exchange program that allows ABO- and/or HLA-incompatible allocation to difficult-to-match patients, thereby increasing their chances. Altruistic donors make this option available to waiting-list patients as well. Strict criteria were defined for selected highly immunized (sHI) and long-waiting (LW) candidates. For LW patients, ABO-incompatible (ABOi) allocation was allowed. sHI patients were given priority, and ABOi and/or CDC cross-match-negative HLA-incompatible (HLAi) allocations were allowed. A local pilot ran between 2017 and 2022. CIAT results were assessed against all other available transplant programs. In the period studied there were 131 incompatible couples; CIAT transplanted the highest number of couples (35%) compared with the other programs. There were 55 sHI patients; CIAT transplanted as many sHI patients as the Acceptable Mismatch program (18%); other programs contributed less. There were 69 LW patients; 53% received deceased-donor transplantations, and 20% were transplanted via CIAT. In total, 72 CIAT transplants were performed: 66 compatible, 5 ABOi, and 1 both ABOi and HLAi. CIAT increased opportunities for difficult-to-match patients, not by increasing pool size, but through prioritization and by allowing ABOi and "low-risk" HLAi allocation. CIAT is a powerful addition to the limited number of programs available for difficult-to-match patients.


Subject(s)
Kidney Transplantation , Tissue and Organ Procurement , Humans , Living Donors , Kidney
3.
Transpl Int ; 36: 10647, 2023.
Article in English | MEDLINE | ID: mdl-36756277

ABSTRACT

Aorto-iliac calcification (AIC) is a well-studied risk factor for post-transplant cardiovascular events and mortality. Its effect on graft function remains unknown. The primary aim of this prospective cohort study was to assess the association between AIC and estimated glomerular filtration rate (eGFR) in the first year post-transplant. Eligibility criteria were: ≥50 years of age, or ≥30 years with at least one risk factor for vascular disease. A non-contrast-enhanced CT scan was performed, with quantification of AIC using the modified Agatston score. The association between AIC and eGFR was investigated with a linear mixed model adjusted for predefined variables. One hundred forty patients were included, with a median of 31 (interquartile range 26-39) eGFR measurements per patient. No direct association between AIC and eGFR was found. We observed a significant interaction between follow-up time and ipsilateral AIC, indicating that patients with higher AIC scores had a lower eGFR trajectory over time starting 100 days after transplantation (p = 0.014). To conclude, severe AIC is not directly associated with lower post-transplant eGFR. The significant interaction indicates that patients with more severe AIC have a lower eGFR trajectory after the first 100 days of the first year post-transplant.


Subject(s)
Kidney Transplantation , Humans , Adult , Kidney Transplantation/adverse effects , Glomerular Filtration Rate , Prospective Studies , Risk Factors
4.
Surg Obes Relat Dis ; 19(5): 501-509, 2023 05.
Article in English | MEDLINE | ID: mdl-36572583

ABSTRACT

BACKGROUND: Obesity is becoming more prevalent in the end-stage renal disease population. Bariatric surgery (BS) is increasingly considered as an approach to become eligible for kidney transplant (KT) or to reduce obesity-related morbidity. OBJECTIVES: To assess the short- and long-term outcomes of patients who underwent both BS and KT, and to determine the optimal timing of BS. METHODS: Patients who underwent both KT and BS between January 2000 and December 2020 were included and stratified according to the sequence of the 2 operations. The primary outcomes were patient and graft survival. The secondary outcomes were postoperative complications and efficacy of weight loss. RESULTS: Twenty-two patients were included in the KT-first group and 34 in the BS-first group. Death-uncensored graft survival in the KT-first group was significantly higher than in the BS-first group (90.9% versus 71.4%, P = .009), without a significant difference in patient survival and death-censored graft survival (100% versus 90.5%, P = .082; 90.9% versus 81.0%, P = .058). There was no significant difference in 1-year total weight loss (1-yr TWL: median [interquartile range, IQR] 36.0 [28.0-42.0] kg versus 29.6 [21.5-40.6] kg, P = .424), 1-year percentage of excess weight loss (1-yr %EWL: median [IQR] 74.9 [54.1-99.0] versus 57.9 [47.5-79.4], P = .155), or the incidence of postoperative complications (36.4% versus 50.0%, P = .316) between the KT-first and BS-first groups. CONCLUSION: Both pre- and post-transplant BS are effective and safe. The circumstances of each transplant candidate should be considered individually to determine the optimal timing of BS.


Subject(s)
Bariatric Surgery , Kidney Transplantation , Obesity, Morbid , Humans , Obesity, Morbid/complications , Kidney Transplantation/adverse effects , Propensity Score , Bariatric Surgery/adverse effects , Obesity/complications , Weight Loss , Postoperative Complications/epidemiology , Retrospective Studies
5.
Transpl Int ; 36: 11751, 2023.
Article in English | MEDLINE | ID: mdl-38188697

ABSTRACT

It is not known whether antibody-mediated rejection (ABMR) is age-related, whether it plateaus late after transplantation, and to what extent it contributes to graft loss in older recipients. Patients transplanted between 2010 and 2015 (n = 1,054) in a single center had regular follow-up until January 2023. Recipients were divided into age groups at transplantation: 18-39 years ("young"), 40-55 years ("middle-aged"), and >55 years ("elderly"). Ten years after transplantation, the cumulative percentage of recipients with ABMR was 17% in young, 15% in middle-aged, and 12% in elderly recipients (p < 0.001). The cumulative incidence of ABMR increased over time and plateaued 8-10 years after transplantation. In the elderly, with a median follow-up of 7.5 years, on average 30% of the recipients with ABMR died with a functioning graft, and ABMR contributed only 4% to overall graft loss in this group. These results were cross-validated in a cohort of recipients with >15 years of follow-up. Multivariable Cox regression analysis showed that increasing recipient age was independently associated with a decreasing risk of ABMR. In conclusion, the cumulative risk of ABMR is age-dependent, plateaus late after transplantation, and contributes little to overall graft loss in older recipients.


Subject(s)
Kidney Transplantation , Aged , Middle Aged , Humans , Adolescent , Young Adult , Adult , Incidence , Kidney Transplantation/adverse effects , Antibodies , Death , Multivariate Analysis
6.
PLoS One ; 17(7): e0270827, 2022.
Article in English | MEDLINE | ID: mdl-35797358

ABSTRACT

BACKGROUND: Most transplant centers in the Netherlands use estimated glomerular filtration rate (eGFR) for evaluation of potential living kidney donors. Whereas eGFR often underestimates GFR, especially in healthy donors, measured GFR (mGFR) allows more precise kidney function assessment, and therefore holds potential to increase the living donor pool. We hypothesized that mGFR-based donor screening leads to acceptance of donors with lower pre-donation eGFR than eGFR-based screening. METHODS: In this longitudinal cohort study, we compared eGFR (CKD-EPI) before donation in one center using mGFR-based screening (mGFR-cohort, n = 250) with two centers using eGFR-based screening (eGFR-cohort1, n = 466 and eGFR-cohort2, n = 160). We also compared differences in eGFR at five years after donation. RESULTS: Donor age was similar among the cohorts (mean±standard deviation (SD) mGFR-cohort 53±10 years, eGFR-cohort1 52±13 years, P = 0.16 vs. mGFR-cohort, and eGFR-cohort2 53±9 years, P = 0.61 vs. mGFR-cohort). Estimated GFR underestimated mGFR by 10±12 mL/min/1.73m2 (mean±SD), with more underestimation in younger donors. In the overall cohorts, mean±SD pre-donation eGFR was lower in the mGFR-cohort (91±13 mL/min/1.73m2) than in eGFR-cohort1 (93±15 mL/min/1.73m2, P<0.05) and eGFR-cohort2 (94±12 mL/min/1.73m2, P<0.05). However, these differences disappeared when focusing on more recent years, which can be explained by acceptance of more older donors with lower pre-donation eGFR over time in both eGFR-cohorts. Five years post-donation, mean±SD eGFR was similar among the centers (mGFR-cohort 62±12 mL/min/1.73m2, eGFR-cohort1 61±14 mL/min/1.73m2, eGFR-cohort2 62±11 mL/min/1.73m2, P = 0.76 and 0.95 vs. mGFR-cohort respectively). In the mGFR-cohort, 38 (22%) donors were excluded from donation due to insufficient mGFR with mean±SD mGFR of 71±9 mL/min/1.73m2. 
CONCLUSIONS: Despite the known underestimation of mGFR by eGFR, we could not show that the routine use of mGFR in donor screening leads to the inclusion of donors with a lower pre-donation eGFR. Therefore, eGFR-based screening appears sufficient for the majority of donors. Future studies should investigate whether there is a subgroup (e.g., young donors with insufficient eGFR) that might benefit from confirmatory mGFR testing.
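The eGFR values compared above come from the CKD-EPI creatinine equation. As a hedged illustration, a minimal sketch of the published 2009 CKD-EPI equation (coefficients reproduced from the original publication; the optional race coefficient is omitted, and this is not code from the study):

```python
def ckd_epi_2009(scr_mg_dl: float, age: float, female: bool) -> float:
    """eGFR (mL/min/1.73 m2) from the 2009 CKD-EPI creatinine equation.

    scr_mg_dl is serum creatinine in mg/dL. Sketch only: the optional
    race coefficient of the original equation is omitted here.
    """
    kappa = 0.7 if female else 0.9        # sex-specific creatinine threshold
    alpha = -0.329 if female else -0.411  # exponent below the threshold
    ratio = scr_mg_dl / kappa
    egfr = (141.0
            * min(ratio, 1.0) ** alpha
            * max(ratio, 1.0) ** -1.209
            * 0.993 ** age)
    return egfr * 1.018 if female else egfr

# A donor-like example: 53-year-old male with creatinine 0.9 mg/dL
egfr = ckd_epi_2009(0.9, 53, female=False)
```

This makes visible why eGFR is only an estimate: it maps creatinine, age, and sex to GFR through fixed population coefficients, whereas mGFR is measured directly.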


Subject(s)
Kidney Transplantation , Living Donors , Adult , Glomerular Filtration Rate , Humans , Kidney , Longitudinal Studies , Middle Aged
7.
Kidney Int ; 101(6): 1251-1259, 2022 06.
Article in English | MEDLINE | ID: mdl-35227691

ABSTRACT

Single-kidney glomerular filtration rate (GFR) increases after living kidney donation due to compensatory hyperfiltration and structural changes. The implications of inter-individual variability in this increase in single-kidney GFR are unknown. Here, we aimed to identify determinants of the increase in single-kidney GFR at three-month postdonation, and to investigate its relationship with long-term kidney function. In a cohort study in 1024 donors, we found considerable inter-individual variability of the early increase in remaining single-kidney estimated GFR (eGFR) (median [25th-75th percentile]) 12 [8-18] mL/min/1.73m2. Predonation eGFR, age, and cortical kidney volume measured by CT were the main determinants of the early postdonation increase in single-kidney eGFR. Individuals with a stronger early increase in single-kidney eGFR had a significantly higher five-year postdonation eGFR, independent of predonation eGFR and age. Addition of the postdonation increase in single-kidney eGFR to a model including predonation eGFR and age significantly improved prediction of a five-year postdonation eGFR under 50 mL/min/1.73m2. Results at ten-year follow-up were comparable, while accounting for left-right differences in kidney volume did not materially change the results. Internal validation using 125I-iothalamate-based measured GFR in 529 donors and external validation using eGFR data in 647 donors yielded highly similar results. Thus, individuals with a more pronounced increase in single-kidney GFR had better long-term kidney function, independent of predonation GFR and age. Hence, the early postdonation increase in single-kidney GFR, considered indicative for kidney reserve capacity, may have additional value to eGFR and age to personalize follow-up intensity after living kidney donation.


Subject(s)
Kidney Transplantation , Living Donors , Cohort Studies , Glomerular Filtration Rate , Humans , Kidney , Kidney Transplantation/adverse effects , Kidney Transplantation/methods , Nephrectomy/adverse effects
8.
Transplantation ; 106(9): 1777-1786, 2022 09 01.
Article in English | MEDLINE | ID: mdl-35283452

ABSTRACT

BACKGROUND: Donor-derived cell-free DNA (ddcfDNA) is a promising minimally invasive biomarker for acute rejection (AR) in kidney transplant recipients. To assess the diagnostic value of ddcfDNA as a marker for AR, ddcfDNA was quantified at multiple time points after kidney transplantation with a novel high-throughput droplet digital PCR indel method that allowed for the absolute quantification of ddcfDNA. METHODS: In this study, ddcfDNA in plasma samples from 223 consecutive kidney transplant recipients was analyzed pretransplantation; at 3, 7, and 180 d after transplantation; and at the time of for-cause biopsies obtained within the first 180 d after transplantation. RESULTS: The median (interquartile range) ddcfDNA concentration was significantly higher on day 3 (58.3 [17.7-258.3] copies/mL) and day 7 (25.0 [10.4-70.8] copies/mL) than on day 180 after transplantation (4.2 [0.0-8.3] copies/mL; P < 0.001 and P < 0.001, respectively). At the time of biopsy-proven AR (BPAR) between day 11 and day 180 after transplantation, the ddcfDNA concentration was significantly higher (50.0 [25.0-108.3] copies/mL) than when biopsies showed non-AR (0.0 [0.0-15.6] copies/mL; P < 0.05). Within the first 10 d after transplantation, ddcfDNA concentrations showed no significant difference between recipients with BPAR and those with non-AR in their biopsy, nor between recipients with and without BPAR at the day 3 and day 7 measurements. CONCLUSIONS: ddcfDNA concentration is not a good biomarker for detecting AR within the first 10 d after transplantation; however, BPAR occurring more than 10 d after transplantation can be detected in kidney transplant recipients by ddcfDNA using this novel high-throughput droplet digital PCR indel method.


Subject(s)
Cell-Free Nucleic Acids , Kidney Transplantation , Biomarkers , Graft Rejection/diagnosis , Graft Rejection/genetics , Kidney Transplantation/adverse effects , Polymerase Chain Reaction
9.
Clin Transplant ; 36(1): e14515, 2022 01.
Article in English | MEDLINE | ID: mdl-34674329

ABSTRACT

Prediction of the risk of cardiovascular events (CVEs) is important to optimize outcomes after kidney transplantation. Aortoiliac stenosis is frequently observed during pre-transplant screening. We hypothesized that these patients are at higher risk of post-transplant CVEs due to the shared underlying atherosclerotic disease. Therefore, we aimed to assess whether aortoiliac stenosis was associated with post-transplant CVEs. This retrospective, single-center cohort study included adult kidney transplant recipients, transplanted between 2000 and 2016, with contrast-enhanced imaging available. Aortoiliac stenosis was classified according to the Trans-Atlantic Inter-Society Consensus (TASC) II classification and was defined as significant in case of ≥50% lumen narrowing. The primary outcome was CVE-free survival. Eighty-nine of 367 patients had significant aortoiliac stenosis and were found to have worse CVE-free survival (median CVE-free survival: stenosis 4.5 years (95% confidence interval (CI) 2.8-6.2), controls 8.9 years (95% CI 6.8-11.0); log-rank test P < .001). TASC II C and D lesions were independent risk factors for a post-transplant CVE, with hazard ratios of 2.15 (95% CI 1.05-4.38) and 6.56 (95% CI 2.74-15.70), respectively. Thus, kidney transplant recipients with TASC II C and D aortoiliac stenosis require extensive cardiovascular risk management pre-, peri-, and post-transplantation.


Subject(s)
Cardiovascular Diseases , Kidney Transplantation , Adult , Cardiovascular Diseases/etiology , Cohort Studies , Constriction, Pathologic , Humans , Kidney Transplantation/adverse effects , Retrospective Studies , Risk Factors , Transplant Recipients , Treatment Outcome
10.
Transpl Int ; 34(11): 2371-2381, 2021 Nov.
Article in English | MEDLINE | ID: mdl-34416037

ABSTRACT

Screening for aorto-iliac stenosis is important in kidney transplant candidates, as its presence affects pre-transplantation decisions regarding the side of implantation and the need for an additional vascular procedure. Reliable imaging techniques to identify this condition require contrast fluid, which can be harmful in these patients. To guide patient selection for these imaging techniques, we aimed to develop a prediction model for the presence of aorto-iliac stenosis. Patients with contrast-enhanced imaging available in the pre-transplant screening between January 1st, 2000 and December 31st, 2018 were included. A prediction model was developed using multivariable logistic regression analysis and internally validated using bootstrap resampling. Model performance was assessed with the concordance index and calibration slope. Three hundred and seventy-three patients were included; 90 (24.1%) had imaging-proven aorto-iliac stenosis. Our final model included age, smoking, peripheral arterial disease, coronary artery disease, a previous transplant, intermittent claudication, and the presence of a femoral artery murmur. The model yielded excellent discrimination (optimism-corrected concordance index: 0.83) and calibration (optimism-corrected calibration slope: 0.91). In conclusion, this prediction model can guide the development of standardized protocols for deciding which patients should receive vascular screening to identify aorto-iliac stenosis. External validation is needed before the model can be implemented in patient care.


Subject(s)
Kidney Transplantation , Aorta , Constriction, Pathologic , Femoral Artery , Humans , Iliac Artery/diagnostic imaging , Iliac Artery/surgery
11.
Transplant Proc ; 53(7): 2206-2211, 2021 Sep.
Article in English | MEDLINE | ID: mdl-34376313

ABSTRACT

Whether the anti-CD52 monoclonal antibody alemtuzumab can be an effective treatment option for late antibody-mediated rejection (ABMR) is not known. In a single-center pilot study, 12 patients with late ABMR were given 30 mg of subcutaneous alemtuzumab. Median time from transplantation to biopsy was 22 months, with 10 of 12 recipients fulfilling the histologic criteria for chronic-active ABMR. The estimated glomerular filtration rate (eGFR) loss before diagnosis was 1.2 mL/min/month, with graft loss (eGFR <15 mL/min) expected to occur within 2 years in 11 of 12 cases. All recipients showed no or an inadequate response to initial treatment with steroids and intravenous immunoglobulin. eGFR at the time of alemtuzumab administration was 35 mL/min/1.73 m2 (IQR, 30-42) and stabilized or improved in 10 of 12 recipients within 12 months. Proteinuria was stable in the year after alemtuzumab. At 3-year follow-up, death-censored graft survival was 68% (uncensored graft survival was 58%). Five of the 10 cases that could be evaluated at 3-year follow-up had stable eGFR (on average 44 mL/min at 12 months and 42 mL/min at 36 months). Alemtuzumab was generally well tolerated, and only 2 cases of opportunistic infection were noted: 1 case of symptomatic parvovirus B19 infection and 1 case of BK virus infection, both of which cleared at follow-up. In conclusion, alemtuzumab may be of value as a second-line treatment for late ABMR with rapid loss of eGFR.


Subject(s)
Graft Rejection , Kidney Transplantation , Alemtuzumab , Graft Rejection/drug therapy , Graft Survival , Humans , Immunosuppressive Agents , Kidney , Kidney Transplantation/adverse effects , Pilot Projects
12.
Front Immunol ; 12: 645718, 2021.
Article in English | MEDLINE | ID: mdl-33815403

ABSTRACT

Background: Studies on herpes zoster (HZ) incidence in solid organ transplant (SOT) recipients report widely varying numbers. We investigated HZ incidence, severity, and risk factors in recipients of four different SOTs, with a follow-up time of 6-14 years. Methods: Records of 1,033 transplant recipients after a first heart (HTx: n = 211), lung (LuTx: n = 121), liver (LiTx: n = 258) or kidney (KTx: n = 443) transplantation between 2000 and 2014 were analyzed for VZV-PCR, clinical signs of HZ, and complications. Results: HZ was diagnosed in 108 of 1,033 patients (10.5%): 36 HTx, 17 LuTx, 15 LiTx, and 40 KTx recipients. The overall HZ incidence rate after HTx (30.7 cases/1,000 person-years (PY)), LuTx (38.8 cases/1,000 PY), LiTx (22.7 cases/1,000 PY), and KTx (14.5 cases/1,000 PY) was significantly higher than in the general population aged 50-70 years. Multivariable analysis demonstrated age ≥50 years at transplantation (p = 0.038, RR 1.536), type of organ transplant (overall p = 0.002; LuTx p = 0.393, RR 1.314; LiTx p = 0.011, RR 0.444; KTx p = 0.034, RR 0.575), CMV prophylaxis (p = 0.043, RR 0.631), and type of anti-rejection therapy (overall p = 0.020; methylprednisolone p = 0.008, RR 0.475; r-ATG p = 0.64, RR 1.194) as significant risk factors. Complications occurred in 33 of 108 (31%) patients (39% of HTx, 47% of LuTx, 20% of LiTx, 20% of KTx): post-herpetic neuralgia, disseminated disease, and cranial nerve involvement. Conclusion: HZ incidence and severity in SOT recipients are most pronounced after heart and lung transplantation, in older patients, and when CMV prophylaxis is lacking.
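The incidence rates quoted above are case counts divided by total follow-up time, scaled to 1,000 person-years. A minimal sketch of that arithmetic (the person-years figure below is back-calculated from the reported HTx rate for illustration, not taken from the paper):

```python
def incidence_per_1000_py(cases: int, person_years: float) -> float:
    """Incidence rate expressed per 1,000 person-years of follow-up."""
    return 1000.0 * cases / person_years

# Illustration: 36 HZ cases among HTx recipients over an assumed
# ~1,173 person-years reproduces a rate near the reported 30.7/1,000 PY.
htx_rate = incidence_per_1000_py(36, 1173)
```

Person-year rates, unlike raw percentages, let cohorts with different follow-up durations (here 6-14 years across four organ groups) be compared on one scale.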


Subject(s)
Herpes Zoster/epidemiology , Organ Transplantation/adverse effects , Adolescent , Adult , Aged , Cytomegalovirus Infections/prevention & control , Female , Herpes Zoster/etiology , Humans , Incidence , Male , Middle Aged , Risk Factors , Seroepidemiologic Studies , Severity of Illness Index , Young Adult
13.
Pharmacol Res ; 167: 105565, 2021 05.
Article in English | MEDLINE | ID: mdl-33744428

ABSTRACT

Breakthrough cytomegalovirus (CMV) disease during valganciclovir prophylaxis is rare but may cause significant morbidity and even mortality. In order to identify patients at increased risk, the incidence of CMV disease was studied in a large population of renal transplant recipients who underwent kidney transplantation at the Radboud University Medical Center between 2004 and 2015 (n = 1300). CMV disease occurred in 31/1300 patients. Multivariable binary logistic regression analysis showed that delayed graft function (DGF) (p = 0.018) and rejection (p = 0.001) significantly and independently increased the risk of CMV disease, whereas CMV serostatus did not. Valganciclovir prophylaxis was prescribed to 281/1300 (21.6%) high-risk patients (defined as CMV IgG-seronegative recipients receiving a kidney from a CMV IgG-seropositive donor (D+/R-)). Of these 281 patients, 51 suffered from DGF (18%). The incidence of breakthrough CMV disease in D+/R- patients with DGF was much higher than in those with immediate function (6/51 (11.8%) vs 2/230 (0.9%), p = 0.0006, Fisher's exact test), despite valganciclovir prophylaxis. This higher incidence of CMV disease could not be explained by a higher incidence of rejection (and associated anti-rejection treatment) in patients with DGF. D+/R- patients with DGF are at increased risk of developing CMV disease despite valganciclovir prophylaxis. These findings suggest that underexposure to ganciclovir occurs in patients with DGF. Prospective studies evaluating the added value of therapeutic drug monitoring to achieve target ganciclovir concentrations in patients with DGF are needed.
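The 6/51 vs 2/230 comparison above is tested with Fisher's exact test. A self-contained sketch of the two-sided test on a 2x2 table (an illustration of the standard method, not the study's own code):

```python
from math import comb

def fisher_two_sided(a: int, b: int, c: int, d: int) -> float:
    """Two-sided Fisher's exact test for the 2x2 table [[a, b], [c, d]].

    Sums the hypergeometric probabilities of every table with the same
    margins whose probability does not exceed that of the observed table.
    """
    row1, row2, col1 = a + b, c + d, a + c
    denom = comb(row1 + row2, col1)

    def p_table(k: int) -> float:
        # probability of k events in the first group, margins fixed
        return comb(row1, k) * comb(row2, col1 - k) / denom

    p_obs = p_table(a)
    lo, hi = max(0, col1 - row2), min(col1, row1)
    # small slack guards against floating-point ties
    return sum(p_table(k) for k in range(lo, hi + 1)
               if p_table(k) <= p_obs * (1 + 1e-9))

# CMV disease: 6 of 51 DGF patients vs 2 of 230 with immediate function
p = fisher_two_sided(6, 45, 2, 228)
```

With cell counts this small (2 and 6 events), the exact test is preferred over a chi-squared approximation, which is why the abstract reports a Fisher's exact p-value.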


Subject(s)
Cytomegalovirus Infections/etiology , Cytomegalovirus/isolation & purification , Delayed Graft Function/complications , Graft Rejection/complications , Kidney Transplantation/adverse effects , Adult , Humans , Middle Aged , Risk Factors
14.
Transplantation ; 105(1): 240-248, 2021 01 01.
Article in English | MEDLINE | ID: mdl-32101984

ABSTRACT

BACKGROUND: Most transplantation centers recognize a small population of difficult-to-match patients who unsuccessfully participate for many years in all available transplantation programs, both living and deceased donor. This population consists of highly immunized and/or ABO blood group O or B patients. METHODS: To improve their chances, Computerized Integration of Alternative Transplantation programs (CIAT) was developed to integrate kidney paired donation, altruistic/unspecified donation, and ABO and HLA desensitization. To compare CIAT with reality, a simulation was performed including all patients, donors, and pairs who participated in our programs in 2015-2016. Criteria for inclusion as a difficult-to-match, selected highly immunized (sHI) patient were a virtual panel reactive antibody >85% and 2 years of participation in the Eurotransplant Acceptable Mismatch program. sHI patients were given priority, and ABO blood group incompatible (ABOi) and/or HLA incompatible (HLAi) matching with a donor-specific antibody mean fluorescence intensity (MFI) <8000 was allowed. For long-waiting blood group O or B patients, ABOi matches were allowed. RESULTS: In reality, 90 alternative program transplantations were carried out: 73 compatible, 16 ABOi, and 1 combination of both ABOi and HLAi. Simulation with CIAT resulted in 95 hypothetical transplantations: 83 compatible (including 1 sHI) and 5 ABOi combinations. Eight sHI patients were matched: 1 compatible, 6 HLAi with donor-specific antibody MFI <8000 (1 also ABOi), and 1 ABOi match. Six of eight combinations for sHI patients were complement-dependent cytotoxicity cross-match negative. CONCLUSIONS: CIAT led to 8 times more matches for difficult-to-match sHI patients. This offers them better chances because of a more favorable MFI profile against the new donor. In addition, more ABO-compatible matches were found for ABOi couples, while the total number of transplantations was not reduced. Prioritizing difficult-to-match patients improves their chances without affecting the chances of regular patients.


Subject(s)
ABO Blood-Group System/immunology , Blood Group Incompatibility/immunology , Decision Support Techniques , Donor Selection , HLA Antigens/immunology , Histocompatibility , Kidney Transplantation , Tissue and Organ Procurement , Adult , Blood Group Incompatibility/complications , Blood Group Incompatibility/diagnosis , Blood Grouping and Crossmatching , Clinical Decision-Making , Female , Humans , Kidney Transplantation/adverse effects , Male , Middle Aged , Predictive Value of Tests , Risk Assessment , Risk Factors , Treatment Outcome
15.
Clin Transplant ; 35(3): e14208, 2021 03.
Article in English | MEDLINE | ID: mdl-33368652

ABSTRACT

Patients with class II and III obesity and end-stage renal disease are often ineligible for kidney transplantation (KTx) due to increased postoperative complications and technically challenging surgery. Bariatric surgery (BS) can be an effective solution for KTx candidates who are otherwise considered inoperable. The aim of this study was to evaluate outcomes of KTx after BS and to compare them with outcomes in obese recipients (BMI ≥ 35 kg/m2) without BS. This retrospective, single-center study included patients who received KTx after BS between January 1994 and December 2018. The primary outcome was postoperative complications. The secondary outcomes were graft and patient survival. In total, 156 patients were included, of whom 23 underwent BS prior to KTx. There were no significant differences in postoperative complications. After a median follow-up of 5.1 years, death-censored graft survival, uncensored graft survival, and patient survival were similar to controls (log-rank test p = .845, .659, and .704, respectively). Dialysis pre-transplantation (hazard ratio (HR) 2.55; 95% CI 1.03-6.34, p = .043) and diabetes (HR 2.41; 95% CI 1.11-5.22, p = .027) were independent risk factors for all-cause mortality. A kidney from a deceased donor was an independent risk factor for death-censored graft loss (HR 1.98; 95% CI 1.04-3.79, p = .038). Patients who received a KTx after BS have outcomes similar to those of obese transplant recipients.


Subject(s)
Bariatric Surgery , Kidney Transplantation , Bariatric Surgery/adverse effects , Graft Survival , Humans , Kidney Transplantation/adverse effects , Retrospective Studies , Risk Factors , Treatment Outcome
16.
Front Immunol ; 11: 1332, 2020.
Article in English | MEDLINE | ID: mdl-32719676

ABSTRACT

Rabbit anti-thymocyte globulin (rATG) is currently the treatment of choice for glucocorticoid-resistant, recurrent, or severe acute allograft rejection (AR). However, rATG is associated with severe infusion-related side effects. Alemtuzumab is occasionally given to kidney transplant recipients as treatment for AR. In the current study, the outcomes of patients treated with alemtuzumab for AR were compared with those of patients treated with rATG for AR. The patient-, allograft-, and infection-free survival and adverse events of 116 alemtuzumab-treated patients were compared with those of 108 patients treated with rATG for AR. Propensity scores were used to control for differences between the two groups. Patient and allograft survival of patients treated with either alemtuzumab or rATG were not different [hazard ratio (HR) 1.14, 95% confidence interval (CI) 0.48-2.69, p = 0.77, and HR 0.82, 95% CI 0.45-1.50, p = 0.52, respectively]. Infection-free survival after alemtuzumab treatment was superior to that of rATG-treated patients (HR 0.41, 95% CI 0.25-0.68, p < 0.002). Infusion-related adverse events occurred less frequently after alemtuzumab treatment. Alemtuzumab may therefore be an alternative therapy for glucocorticoid-resistant, recurrent, or severe acute kidney transplant rejection.


Subject(s)
Alemtuzumab/therapeutic use , Antilymphocyte Serum/therapeutic use , Graft Rejection/drug therapy , Immunosuppressive Agents/therapeutic use , Kidney Transplantation/adverse effects , Adult , Allografts , Female , Graft Rejection/mortality , Humans , Kidney Transplantation/mortality , Male , Middle Aged , Retrospective Studies
17.
Transpl Int ; 33(5): 483-496, 2020 05.
Article in English | MEDLINE | ID: mdl-32034811

ABSTRACT

The prognosis of kidney transplant recipients (KTR) with vascular calcification (VC) in the aorto-iliac arteries is unclear. We performed a systematic review and meta-analysis to investigate their survival outcomes. Studies from January 1st, 2000 until March 5th, 2019 were included. Outcomes for meta-analysis were patient survival, (death-censored) graft survival, and delayed graft function (DGF). Twenty-one studies were identified, of which eight provided data for meta-analysis. KTR with VC had a significantly increased mortality risk [1-year: risk ratio (RR) 2.19 (1.39-3.44); 5-year: RR 2.28 (1.86-2.79)]. The risk of 1-year graft loss was three times higher in recipients with VC [RR 3.15 (1.30-7.64)]. The risk of graft loss censored for death [1-year: RR 2.26 (0.58-2.73); 3-year: RR 2.19 (0.49-9.82)] and the risk of DGF (RR 1.24, 95% CI 0.98-1.58) were not statistically different. The quality of the evidence was rated as very low. To conclude, the presence of VC was associated with an increased risk of mortality and graft loss. Given the small sample size, no statistically significant association between VC and DGF or death-censored graft loss could be demonstrated. When interpreting these outcomes, the quality and sample size of the evidence should be taken into consideration.
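The risk ratios above are reported with 95% confidence intervals computed on the log scale, the standard approach for ratio measures. A minimal sketch of that calculation for a single 2x2 comparison (the counts below are invented for illustration, not taken from any included study):

```python
from math import exp, log, sqrt

def risk_ratio_ci(events_exp: int, n_exp: int,
                  events_ctrl: int, n_ctrl: int,
                  z: float = 1.96) -> tuple[float, float, float]:
    """Risk ratio with a normal-approximation 95% CI on the log scale."""
    rr = (events_exp / n_exp) / (events_ctrl / n_ctrl)
    # standard error of log(RR) from the four cell counts
    se = sqrt(1 / events_exp - 1 / n_exp + 1 / events_ctrl - 1 / n_ctrl)
    return rr, exp(log(rr) - z * se), exp(log(rr) + z * se)

# Hypothetical counts: 20/100 deaths with VC vs 10/100 without
rr, lo, hi = risk_ratio_ci(20, 100, 10, 100)
```

This also shows how to read the abstract's intervals: an RR whose CI crosses 1 (as for DGF, 0.98-1.58) is not statistically significant, while one whose CI excludes 1 (as for mortality) is.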


Subject(s)
Kidney Transplantation , Delayed Graft Function/etiology , Graft Rejection , Graft Survival , Humans , Kidney Transplantation/adverse effects , Prognosis , Risk Factors , Transplant Recipients
18.
Transplant Direct ; 5(10): e496, 2019 Oct.
Article in English | MEDLINE | ID: mdl-31723590

ABSTRACT

Age criteria for kidney transplantation have been liberalized over the years, resulting in more waitlisted elderly patients. What are the prospects of elderly patients on the waiting list? METHODS: Between 2000 and 2013, 2622 patients had been waitlisted. Waiting time was defined as the period between dialysis onset and being delisted. Patients were categorized according to age upon listing: <25; 25-44; 45-54; 55-64; and >64 years. Furthermore, the influence of ABO blood type and panel reactive antibodies on outflow patterns was studied. RESULTS: At the end of observation (November 2017), 1957 (75%) patients had been transplanted, 333 (13%) had been delisted without a transplantation, 271 (10%) had died, and 61 (2%) were still waiting. When comparing the age categories, outflow patterns were completely different. The percentage of patients transplanted decreased with increasing age, while the percentage of patients that had been delisted or had died increased with increasing age, especially in the population without a living donor. Within 6 years, 93% of the population <25 years had received a (primarily living) donor kidney. In the populations >55 years, 39% received a living donor kidney, while >50% of patients without a living donor had been delisted or had died. Multivariable analysis showed that the influence of age, ABO blood type, and panel reactive antibodies on outflow patterns was significant, but the magnitude of the influence of the latter two was only modest compared with that of age. CONCLUSIONS: "Elderly" patients (not only >64 y but even those 55-64 y) less often received a living donor kidney transplantation. Moreover, they often cannot endure the waiting time for a deceased donor kidney, resulting in delisting without a transplant in more than half of the patients without a living donor. Promoting living donor kidney transplantation is the only modification that improves transplantation rates and decreases delisting/death on the waiting list in this population.

19.
Eur Surg Res ; 60(3-4): 97-105, 2019.
Article in English | MEDLINE | ID: mdl-31480061

ABSTRACT

BACKGROUND: Short-term kidney graft dysfunction is correlated with complications and associated with decreased long-term survival; therefore, a scoring system to predict short-term renal transplant outcomes is warranted. AIM: The aim of this study is to quantify the impression of the organ procurement surgeon in correlation with the following kidney transplant outcomes: immediate graft function (IGF), delayed graft function (DGF), and primary nonfunction (PNF). Results are compared to factors associated with the 1-year outcome. METHODS: A regional prospective pilot study was performed using deceased-donor organ assessment forms to be filled out by procurement surgeons after procurement. Data were gathered on kidney temperature, perfusion, anatomy, atherosclerosis, and overall quality. RESULTS: Included were 90 donors who donated 178 kidneys, 166 of which were transplanted. Variables that were significantly more prevalent in the DGF-or-PNF group (n = 65) are: large kidney size (length, p = 0.008; width, p = 0.036), poor perfusion quality (p = 0.037), lower diuresis (p = 0.039), fewer hypotensive episodes (p = 0.003), and donation-after-circulatory-death donors (p = 0.017). Multivariable analysis showed that perfusion quality and kidney width significantly predicted the short-term outcome. However, multivariable analysis of long-term outcomes showed that the first measured donor creatinine, kidney donor risk index, IGF vs. DGF+PNF, and kidney length predicted outcomes. CONCLUSIONS: Results show that short-term graft function and 1-year graft function are indeed influenced by different variables. DGF and PNF occur more frequently in kidneys with poor perfusion and in larger kidneys. A plausible explanation is that these kidneys might be insufficiently washed out, or even congested, which may predispose to DGF. These kidneys would probably benefit most from reconditioning strategies, such as machine perfusion. A scoring system including these variables might aid in decision-making towards allocation and potential reconditioning strategies.


Subject(s)
Delayed Graft Function , Kidney Transplantation , Kidney , Transplants , Adolescent , Adult , Aged , Checklist , Child , Female , Humans , Male , Middle Aged , Pilot Projects , Prospective Studies , Tissue and Organ Procurement , Young Adult
20.
PLoS One ; 14(4): e0214940, 2019.
Article in English | MEDLINE | ID: mdl-30990835

ABSTRACT

BACKGROUND: Calcium oxalate (CaOx) deposition in the kidney may lead to loss of native renal function, but little is known about the prevalence and role of CaOx deposition in transplanted kidneys. METHODS: In patients transplanted in 2014 and 2015, all for-cause renal allograft biopsies obtained within 3 months post-transplantation were retrospectively investigated for CaOx deposition. Additionally, all preimplantation renal biopsies obtained in 2000 and 2001 were studied. RESULTS: In 2014 and 2015, 388 patients were transplanted, of whom 149 had at least one for-cause renal biopsy. Twenty-six (17%) patients had CaOx deposition. In the population with CaOx deposition, patients had significantly more often been treated with dialysis before transplantation (89 vs. 64%; p = 0.011), delayed graft function occurred more frequently (42 vs. 23%; p = 0.038), and eGFR at the time of first biopsy was significantly worse (21 vs. 29 ml/min/1.73m2; p = 0.037). In a multivariate logistic regression analysis, eGFR at the time of first biopsy (OR 0.958, 95%-CI: 0.924-0.993, p = 0.019), dialysis before transplantation (OR 4.868, 95%-CI: 1.128-21.003, p = 0.034) and the time of first biopsy after transplantation (OR 1.037, 95%-CI: 1.013-1.062, p = 0.002) were independently associated with CaOx deposition. Graft survival censored for death was significantly worse in patients with CaOx deposition (p = 0.018). CaOx deposition was found in only 1 of 106 preimplantation biopsies (0.94%). CONCLUSION: CaOx deposition appears to be primarily recipient-derived and is frequently observed in for-cause renal allograft biopsies obtained within 3 months post-transplantation. It is associated with inferior renal function at the time of biopsy and worse graft survival.
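The per-unit odds ratios reported above come from a logistic regression model, in which OR = exp(beta) for a one-unit change in the covariate. The sketch below uses the reported eGFR odds ratio of 0.958 per ml/min/1.73m2; the 10-unit extrapolation is an illustration of the arithmetic only, not a result from the study:

```python
import math

# A reported OR of 0.958 per unit of eGFR corresponds to a logistic
# regression coefficient beta = ln(0.958).
beta_egfr = math.log(0.958)

def or_over_range(beta, delta):
    """Odds ratio implied by a delta-unit change in the covariate:
    exp(beta * delta)."""
    return math.exp(beta * delta)

# A 10-unit-lower eGFR (delta = -10) multiplies the odds of CaOx
# deposition by 0.958**-10, i.e. roughly 1.5-fold higher odds.
or_10_lower = or_over_range(beta_egfr, -10)
```

This log-linearity is why per-unit odds ratios close to 1 can still correspond to clinically meaningful effects across a realistic covariate range.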


Subject(s)
Calcium Oxalate/metabolism , Delayed Graft Function , Glomerular Filtration Rate , Graft Survival , Kidney Transplantation , Kidney , Renal Dialysis , Adult , Aged , Biopsy , Delayed Graft Function/metabolism , Delayed Graft Function/pathology , Delayed Graft Function/physiopathology , Delayed Graft Function/therapy , Female , Humans , Kidney/metabolism , Kidney/pathology , Male , Middle Aged , Retrospective Studies , Time Factors , Transplantation, Homologous