ABSTRACT
BACKGROUND: Mucosal antibodies play a critical role in preventing SARS-CoV-2 infections or reinfections by blocking the interaction of the receptor-binding domain (RBD) with the angiotensin-converting enzyme 2 (ACE2) receptor on the cell surface. In this study, we investigated how the mucosal antibody response differs after primary infection versus vaccination. METHODS: We assessed longitudinal changes in the quantity and capacity of nasal antibodies to neutralize the interaction of RBD with the ACE2 receptor, using the spike protein and RBD from ancestral SARS-CoV-2 (Wuhan-Hu-1) as well as the RBD from the Delta and Omicron variants. RESULTS: Significantly higher mucosal IgA concentrations were detected postinfection versus postvaccination, while vaccination induced higher IgG concentrations. However, ACE2-inhibiting activity did not differ between the cohorts. Infection-induced binding inhibition was driven by both isotypes, whereas postvaccination binding inhibition was driven mainly by IgG. CONCLUSIONS: Our study provides new insights into the relationship between antibody isotypes and neutralization by using a sensitive, high-throughput ACE2 binding inhibition assay. Key differences between vaccination and infection at the mucosal level are highlighted: despite differences in response quantity, postinfection and postvaccination ACE2 binding inhibition capacity did not differ.
Subject(s)
COVID-19 , SARS-CoV-2 , Humans , Angiotensin-Converting Enzyme 2 , COVID-19/prevention & control , Vaccination , Immunoglobulin A , Immunoglobulin G , Spike Glycoprotein, Coronavirus , Protein Binding
ABSTRACT
Kidneys donated after circulatory death (DCD) perform similarly to kidneys donated after brain death (DBD). However, the respective incidences of delayed graft function (DGF) differ, raising the question of whether the impact of early graft function on long-term outcomes is donor type-specific. Using competing risk and Cox regression analysis, we compared death-censored graft loss between types of early graft function: DGF (temporary dialysis dependency started within 7 days after transplantation), slow graft function (a plasma creatinine decline of less than 10% per day over the first 3 days), and immediate graft function. In 1061 DBD and 1605 DCD graft recipients (January 2014 to January 2023), graft survival was similar. DGF was associated with death-censored graft loss in both DBD and DCD (adjusted hazard ratios: DGF in DBD: 1.79 [1.04-2.91], P = .027; DGF in DCD: 1.84 [1.18-2.87], P = .008; reference: no DGF). Slow graft function was associated with death-censored graft loss in DBD, but not significantly in DCD (adjusted hazard ratios: DBD: 2.82 [1.34-5.93], P = .007; DCD: 1.54 [0.72-3.35], P = .262; reference: immediate graft function). Early graft dysfunction thus has a differential impact on graft outcome in DBD and DCD, and these differences should be accounted for in research and in the clinic.
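As a concrete illustration of these category definitions, the following minimal sketch classifies a recipient's early graft function from routine postoperative data. The function name, the use of the mean relative creatinine decline over the first 3 days, and the example values are illustrative assumptions, not the study's actual algorithm.

```python
import numpy as np

def classify_early_graft_function(creatinine, dialysis_within_7d):
    """Classify early graft function per the definitions above.

    creatinine: plasma creatinine (e.g., mg/dL) on postoperative days 0-3
    dialysis_within_7d: True if dialysis was (re)started within 7 days
    Returns 'DGF', 'SGF' (slow), or 'IGF' (immediate).
    """
    if dialysis_within_7d:
        return "DGF"  # delayed graft function: dialysis within 7 days
    c = np.asarray(creatinine, dtype=float)
    # Assumed operationalization: mean of the relative day-to-day declines
    daily_decline = (c[:-1] - c[1:]) / c[:-1]
    if daily_decline.mean() < 0.10:
        return "SGF"  # slow graft function: creatinine falls <10% per day
    return "IGF"  # immediate graft function

print(classify_early_graft_function([8.0, 7.6, 7.3, 7.0], False))  # SGF
print(classify_early_graft_function([8.0, 6.0, 4.5, 3.4], False))  # IGF
```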
ABSTRACT
The innate immune system plays an essential role in regulating the immune responses to kidney transplantation, but the mechanisms through which innate immune cells influence long-term graft survival are unclear. The current study highlights the vital role of trained immunity in kidney allograft survival. Trained immunity describes the epigenetic and metabolic changes that innate immune cells undergo following an initial stimulus, allowing them to mount a stronger inflammatory response to subsequent stimuli. We stimulated healthy peripheral blood mononuclear cells with pretransplant and posttransplant serum from kidney transplant patients, as well as with immunosuppressive drugs, in an in vitro trained immunity assay and measured tumor necrosis factor and interleukin 6 cytokine levels in the supernatant as a readout for trained immunity. We show that the serum of kidney transplant recipients collected 1 week after transplantation can suppress trained immunity. Importantly, we found that kidney transplant recipients whose serum most strongly suppressed trained immunity rarely experienced graft loss. This suppressive effect of posttransplant serum is likely mediated by previously unreported effects of immunosuppressive drugs. Our findings provide mechanistic insights into the role of innate immunity in kidney allograft survival, uncovering trained immunity as a potential therapeutic target for improving graft survival.
ABSTRACT
Previously, we established a prediction model for graft intolerance syndrome requiring graft nephrectomy in patients with late kidney graft failure. The aim of this study was to determine the generalizability of this model in an independent cohort. The validation cohort included patients with late kidney graft failure between 2008 and 2018. The primary outcome was the prognostic performance of our model, expressed as the area under the receiver operating characteristic curve (ROC-AUC), in the validation cohort. In 63 of 580 patients (10.9%), a graft nephrectomy was performed because of graft intolerance. The original model, which included donor age, graft survival, and number of acute rejections, performed poorly in the validation cohort (ROC-AUC 0.61). After retraining the model using recipient age at graft failure instead of donor age, it had an average ROC-AUC of 0.70 in the original cohort and of 0.69 in the validation cohort. Our original model did not accurately predict graft intolerance syndrome in the validation cohort. However, a retrained model including recipient age at graft failure instead of donor age performed moderately well in both the development and validation cohorts, enabling identification of the patients at highest and lowest risk of graft intolerance syndrome.
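For readers unfamiliar with external validation, the sketch below shows how a prognostic model's discrimination is quantified as a ROC-AUC in an independent cohort, here using scikit-learn. The predictors mirror those named in the abstract, but the data are synthetic and the logistic model is an assumption; the sketch illustrates the validation workflow, not the published model.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)

# Synthetic development cohort with the retrained model's predictors:
# recipient age at graft failure, graft survival (years), acute rejections.
n_dev = 500
X_dev = np.column_stack([
    rng.normal(55, 12, n_dev),     # recipient age at graft failure
    rng.exponential(8, n_dev),     # graft survival in years
    rng.poisson(0.5, n_dev),       # number of acute rejections
])
y_dev = rng.binomial(1, 0.11, n_dev)  # graft intolerance requiring nephrectomy

model = LogisticRegression(max_iter=1000).fit(X_dev, y_dev)

# Independent validation cohort: same predictors, different patients.
n_val = 580
X_val = np.column_stack([
    rng.normal(53, 13, n_val),
    rng.exponential(7, n_val),
    rng.poisson(0.6, n_val),
])
y_val = rng.binomial(1, 0.109, n_val)

auc = roc_auc_score(y_val, model.predict_proba(X_val)[:, 1])
print(f"validation ROC-AUC: {auc:.2f}")
```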
Subject(s)
Kidney Transplantation , Humans , Kidney Transplantation/adverse effects , Risk Assessment , Retrospective Studies , Tissue Donors , Prognosis , ROC Curve , Syndrome
ABSTRACT
Breakthrough cytomegalovirus (CMV) disease during valganciclovir prophylaxis is rare but may cause significant morbidity and even mortality. In order to identify patients at increased risk, the incidence of CMV disease was studied in a large population of renal transplant recipients who underwent kidney transplantation in the Radboud University Medical Center between 2004 and 2015 (n = 1300). CMV disease occurred in 31/1300 patients. Multivariate binary regression analysis showed that delayed graft function (DGF) (p = 0.018) and rejection (p = 0.001) significantly and independently increased the risk of CMV disease, whereas CMV status did not. Valganciclovir prophylaxis was prescribed to 281/1300 (21.6%) high-risk patients, defined as CMV IgG-seronegative recipients receiving a kidney from a CMV IgG-seropositive donor (D+/R-). Of these 281 patients, 51 (18%) suffered from DGF. Despite valganciclovir prophylaxis, the incidence of breakthrough CMV disease in D+/R- patients with DGF was much higher than in those with immediate function (6/51 [11.8%] vs 2/230 [0.9%], p = 0.0006, Fisher's exact test). This higher incidence of CMV disease could not be explained by a higher incidence of rejection (and associated anti-rejection treatment) in patients with DGF. D+/R- patients with DGF are thus at increased risk of developing CMV disease despite valganciclovir prophylaxis. These findings suggest that underexposure to ganciclovir occurs in patients with DGF. Prospective studies evaluating the added value of therapeutic drug monitoring to achieve target ganciclovir concentrations in patients with DGF are needed.
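The key comparison here is a standard 2x2 contingency analysis; a minimal sketch with SciPy is shown below, using the counts reported in the abstract. It should closely reproduce the reported p = 0.0006.

```python
from scipy.stats import fisher_exact

# 2x2 table from the abstract: breakthrough CMV disease among D+/R-
# recipients on prophylaxis, by delayed vs immediate graft function.
#                 CMV disease   no CMV disease
# DGF                  6              45
# immediate            2             228
table = [[6, 45], [2, 228]]

odds_ratio, p_value = fisher_exact(table, alternative="two-sided")
print(f"odds ratio {odds_ratio:.1f}, p = {p_value:.4f}")  # p ~ 0.0006
```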
Subject(s)
Cytomegalovirus Infections/etiology , Cytomegalovirus/isolation & purification , Delayed Graft Function/complications , Graft Rejection/complications , Kidney Transplantation/adverse effects , Adult , Humans , Middle Aged , Risk Factors
ABSTRACT
Kidney transplant candidates are blood group incompatible with roughly one out of three potential living donors. We compared outcomes after ABO-incompatible (ABOi) kidney transplantation with those after matched ABO-compatible (ABOc) living and deceased donor transplantation and analyzed different induction regimens. We performed a retrospective study with propensity matching and compared patient and death-censored graft survival after ABOi versus ABOc living donor and deceased donor kidney transplantation in a nationwide registry from 2006 to 2019. A total of 296 ABOi recipients were compared with 1184 center- and propensity-matched ABOc living donor recipients and 1184 deceased donor recipients (matching: recipient age, sex, blood group, and PRA). Patient survival was better than in deceased donor recipients [hazard ratio (HR) for death 0.69 (0.49-0.96)] and not significantly different from that of ABOc living donor recipients [HR 1.28 (0.90-1.81)]. The rate of graft failure was higher than after ABOc living donor transplantation [HR 2.63 (1.72-4.01)]. Rejection occurred in 47% of 140 rituximab-treated, 22% of 50 rituximab/basiliximab-treated, and 4% of 92 alemtuzumab-treated recipients (P < 0.001). ABOi kidney transplantation is superior to deceased donor transplantation. Rejection rate and graft failure are higher compared with matched ABOc living donor transplantation, underscoring the need for further studies into risk stratification and induction therapy [NTR7587, www.trialregister.nl].
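To illustrate the matching design, the sketch below pairs each ABOi recipient with four propensity-matched ABOc controls, echoing the study's 1:4 ratio. It is a simplified sketch under stated assumptions: the data are synthetic, matching is nearest-neighbor with replacement on the propensity score alone, and the study's exact algorithm (including matching on center and blood group) may differ.

```python
import numpy as np
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.neighbors import NearestNeighbors

rng = np.random.default_rng(1)
n = 300
# Synthetic recipients: treatment flag (ABOi) and matching covariates.
df = pd.DataFrame({
    "aboi": rng.binomial(1, 0.2, n),
    "age": rng.normal(52, 13, n),
    "sex": rng.binomial(1, 0.4, n),
    "pra": rng.uniform(0, 100, n),   # panel-reactive antibody (%)
})

# Propensity score: probability of being ABOi given the covariates.
features = ["age", "sex", "pra"]
ps_model = LogisticRegression(max_iter=1000).fit(df[features], df["aboi"])
df["ps"] = ps_model.predict_proba(df[features])[:, 1]

# 1:4 nearest-neighbor matching on the propensity score (with replacement).
treated = df[df["aboi"] == 1]
controls = df[df["aboi"] == 0]
nn = NearestNeighbors(n_neighbors=4).fit(controls[["ps"]])
_, idx = nn.kneighbors(treated[["ps"]])
matched_controls = controls.iloc[idx.ravel()]
print(len(treated), "ABOi recipients matched to", len(matched_controls), "controls")
```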
Subject(s)
Kidney Transplantation , ABO Blood-Group System , Blood Group Incompatibility , Graft Rejection , Graft Survival , Humans , Living Donors , Retrospective Studies
ABSTRACT
Early graft loss (EGL) is a feared outcome of kidney transplantation. Consequently, kidneys with an anticipated risk of EGL are declined for transplantation. In the most favorable scenario, with optimal use of available donor kidneys, the donor pool size is balanced against the risk of EGL, with a tradeoff dictated by the consequences of EGL. To gauge these consequences, we systematically evaluated the impact of EGL in an observational study that included all 10,307 deceased-donor kidney transplantations performed in The Netherlands between 1990 and 2018. The incidence of EGL, defined as graft loss within 90 days, in primary transplantation was 8.2% (699/8,511). The main causes were graft rejection (30%), primary nonfunction (25%), and thrombosis or infarction (20%). EGL profoundly impacted short- and long-term patient survival (adjusted hazard ratios 8.2 [95% confidence interval 5.1-13.2] and 1.7 [1.3-2.1], respectively). Of the EGL recipients who survived 90 days after transplantation (617/699), only 440 were relisted for re-transplantation. Of those relisted, only 298 were ultimately re-transplanted, yielding an actual re-transplantation rate of 43%. Notably, re-transplantation was associated with a doubled incidence of EGL, but with similar long-term graft survival (adjusted hazard ratio 1.1 [0.6-1.8]). Thus, EGL after kidney transplantation is a medical catastrophe with high mortality rates, low relisting rates, and an increased risk of recurrent EGL following re-transplantation, implying that detrimental outcomes also involve a convergence of risk factors in recipients with EGL. At the population level, however, the 8.2% incidence of EGL had minimal impact on mortality, indicating that this incidence is acceptable.
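The adjusted hazard ratios quoted above come from Cox proportional hazards regression; below is a minimal sketch using the lifelines library. The toy data frame, the variable names, and the single adjustment covariate are illustrative assumptions, not the study's dataset or covariate set.

```python
import pandas as pd
from lifelines import CoxPHFitter

# Toy recipient-level data: follow-up (years), death, early graft loss
# within 90 days (EGL), and one adjustment covariate.
df = pd.DataFrame({
    "time":  [0.3, 5.1, 12.0, 0.4, 8.3, 3.7, 15.2, 1.1, 2.5, 9.8],
    "death": [1,   0,   1,    1,   1,   0,   0,    1,   0,   1],
    "egl":   [1,   0,   0,    1,   0,   0,   1,    1,   1,   0],
    "recipient_age": [61, 48, 39, 70, 55, 44, 33, 66, 59, 50],
})

cph = CoxPHFitter()
cph.fit(df, duration_col="time", event_col="death")
# Hazard ratios with 95% confidence intervals for each covariate.
print(cph.summary[["exp(coef)", "exp(coef) lower 95%", "exp(coef) upper 95%"]])
```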
Subject(s)
Kidney Transplantation , Graft Rejection/epidemiology , Graft Survival , Humans , Kidney , Kidney Transplantation/adverse effects , Netherlands/epidemiology , Retrospective Studies , Tissue Donors , Treatment Outcome
ABSTRACT
In several deceased donor kidney allocation systems, organs from elderly donors are allocated primarily to elderly recipients. The Eurotransplant Senior Program (ESP) was implemented in 1999, and since then, especially in Europe, the use of organs from elderly donors has steadily increased. The proportion of ≥60-year-old donors reported to the Collaborative Transplant Study (CTS) by European centers has doubled, from 21% in 2000-2001 to 42% in 2016-2017. Therefore, in the era of organ shortage, it is a matter of debate whether kidneys from elderly donors should be allocated only to elderly recipients or whether recipients younger than 65 years can also benefit from these organs, which are generally categorized as "marginal". To discuss this issue, a European Consensus Meeting was organized by the CTS on April 12, 2018, in Heidelberg, in which 36 experts participated. Based on the available evidence, it was unanimously concluded that kidneys from 65- to 74-year-old donors can also be allocated to 55- to 64-year-old recipients, especially if these organs come from donors with no history of hypertension, no increased creatinine, no cerebrovascular cause of death, and no other reason to be classified as marginal, such as diabetes or cancer.
Subject(s)
Kidney Transplantation , Tissue and Organ Procurement , Age Factors , Aged , Allografts , Europe , Graft Survival , Humans , Kidney , Middle Aged , Tissue Donors
ABSTRACT
The clinical significance of non-HLA antibodies for renal allograft survival is a matter of debate, owing to differences in reported results and a lack of large-scale studies analyzing multiple non-HLA antibodies simultaneously. We developed a multiplex non-HLA antibody assay against 14 proteins highly expressed in the kidney. In this study, the presence of pretransplant non-HLA antibodies was correlated with renal allograft survival in a nationwide cohort of 4770 recipients transplanted between 1995 and 2006. Autoantibodies against Rho GDP-dissociation inhibitor 2 (ARHGDIB) were significantly associated with graft loss in recipients transplanted with a deceased-donor kidney (N = 3276) but not in recipients of a living-donor kidney (N = 1496). At 10 years after deceased-donor transplantation, recipients with anti-ARHGDIB antibodies (94/3276 = 2.9%) had a 13% lower death-censored, covariate-adjusted graft survival than the anti-ARHGDIB-negative population (3182/3276 = 97.1%) (hazard ratio 1.82; 95% confidence interval 1.32-2.53; P = .0003). These antibodies occur independently of donor-specific anti-HLA antibodies (DSA) or the other non-HLA antibodies investigated. No significant relation with graft loss was found for the other 13 non-HLA antibodies. We suggest that pretransplant risk assessment can be improved by measuring anti-ARHGDIB antibodies in all patients awaiting deceased-donor transplantation.
Subject(s)
Autoantibodies/immunology , Graft Rejection/mortality , Graft Survival/immunology , HLA Antigens/immunology , Kidney Transplantation/adverse effects , Postoperative Complications/mortality , rho Guanine Nucleotide Dissociation Inhibitor beta/immunology , Adult , Female , Follow-Up Studies , Graft Rejection/diagnosis , Graft Rejection/etiology , Humans , Isoantibodies/immunology , Kidney Failure, Chronic/immunology , Kidney Failure, Chronic/mortality , Kidney Failure, Chronic/surgery , Living Donors/statistics & numerical data , Male , Middle Aged , Postoperative Complications/diagnosis , Postoperative Complications/etiology , Prognosis , Retrospective Studies , Risk Factors
ABSTRACT
Whereas regular allocation avoids unacceptable mismatches on the donor organ, allocation to highly sensitized patients within the Eurotransplant Acceptable Mismatch (AM) program is based on the patient's HLA phenotype plus acceptable antigens. These are HLA antigens to which the patient never made antibodies, as determined by extensive laboratory testing. AM patients have superior long-term graft survival compared with highly sensitized patients in regular allocation. Here, we questioned whether the AM program also results in lower rejection rates. From the PROCARE cohort, consisting of all Dutch kidney transplants in 1995-2005, we selected deceased donor single transplants with a minimum of 1 HLA mismatch and determined the cumulative 6-month rejection incidence for patients in AM or regular allocation. Additionally, we determined the effect of minimal matching criteria of 1 HLA-B plus 1 HLA-DR, or 2 HLA-DR antigens on rejection incidence. AM patients showed significantly lower rejection rates than highly immunized patients in regular allocation, comparable to nonsensitized patients, independent of other risk factors for rejection. In contrast to highly sensitized patients in regular allocation, minimal matching criteria did not affect rejection rates in AM patients. Allocation based on acceptable antigens leads to relatively low-risk transplants for highly sensitized patients with rejection rates similar to those of nonimmunized individuals.
Subject(s)
Graft Rejection/diagnosis , HLA Antigens/immunology , Histocompatibility/immunology , Immunization/methods , Kidney Failure, Chronic/immunology , Kidney Transplantation/adverse effects , Patient Selection , Tissue Donors/supply & distribution , Female , Follow-Up Studies , Graft Rejection/etiology , Graft Rejection/pathology , Graft Survival/immunology , HLA Antigens/chemistry , Histocompatibility Testing , Humans , Isoantibodies/adverse effects , Kidney Failure, Chronic/surgery , Kidney Transplantation/statistics & numerical data , Male , Middle Aged , Prognosis , Risk Factors , Tissue and Organ Procurement/methods , Transplantation Immunology
ABSTRACT
BACKGROUND: Pre-transplant donor-specific anti-human leucocyte antigen (HLA) antibodies (DSAs) are associated with impaired kidney graft survival, while the clinical relevance of non-donor-specific anti-HLA antibodies (nDSAs) is more controversial. The aim of the present paired kidney graft study was to compare the clinical relevance of DSAs and nDSAs. METHODS: To eliminate donor- and era-dependent factors, a post hoc paired kidney graft analysis was performed as part of a Dutch multicentre study evaluating all transplantations between 1995 and 2005 with available pre-transplant serum samples. Anti-HLA antibodies were detected with a Luminex single-antigen bead assay. RESULTS: Among 3237 deceased donor transplantations, we identified 115 recipient pairs receiving a kidney from the same donor with one recipient DSA positive and the other without anti-HLA antibodies. Patients with pre-transplant DSAs had a significantly lower 10-year death-censored graft survival (55% versus 82%, P = 0.0001). We identified 192 pairs with one recipient nDSA positive (against Class I and/or II) and the other without anti-HLA antibodies. For patients with nDSAs against either Class I or II, graft survival did not differ significantly from that of patients without anti-HLA antibodies (74% versus 77%, P = 0.79). Only in patients with nDSAs against both Class I and II was there a trend towards lower graft survival (58%, P = 0.06). Lastly, in a small group of 42 recipient pairs, 10-year graft survival was 49% in recipients with DSAs compared with 68% in recipients with nDSAs (P = 0.11). CONCLUSION: This paired kidney analysis confirms that the presence of pre-transplant DSAs in deceased donor transplantations is a risk marker for graft loss, whereas nDSAs in general are not associated with lower graft survival. Subgroup analysis indicated that only in broadly sensitized patients with nDSAs against both Class I and II may nDSAs be a risk marker for long-term graft loss.
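Survival comparisons of this kind are typically made with Kaplan-Meier estimates and a log-rank test; a minimal sketch using lifelines follows. The paired survival times are invented for illustration, with administrative censoring at 10 years.

```python
from lifelines import KaplanMeierFitter
from lifelines.statistics import logrank_test

# Toy death-censored graft survival (years): for each donor, one
# DSA-positive recipient and one recipient without anti-HLA antibodies.
t_dsa = [1.2, 3.5, 10.0, 0.8, 10.0, 6.4, 2.1, 10.0]
e_dsa = [1,   1,   0,    1,   0,    1,   1,   0]   # 1 = graft loss
t_neg = [10.0, 8.9, 10.0, 10.0, 4.2, 10.0, 10.0, 7.5]
e_neg = [0,    1,   0,    0,    1,   0,    0,    1]

km = KaplanMeierFitter()
km.fit(t_dsa, e_dsa, label="DSA positive")
print(km.survival_function_at_times(10.0))  # 10-year graft survival

result = logrank_test(t_dsa, t_neg, event_observed_A=e_dsa, event_observed_B=e_neg)
print(f"log-rank p = {result.p_value:.3f}")
```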
Subject(s)
Graft Rejection/immunology , Graft Survival/immunology , HLA Antigens/immunology , Isoantibodies/blood , Adult , Female , Histocompatibility Antigens Class I , Humans , Kidney Transplantation/mortality , Male , Middle Aged , Netherlands , Risk , Tissue Donors , Young Adult
ABSTRACT
BACKGROUND: Few studies have evaluated the effect of different immunosuppressive strategies on long-term kidney transplant outcomes. Moreover, as they were usually based on historical data, it was not possible to account for the presence of pretransplant donor-specific human leukocyte antigen antibodies (DSA), a currently recognized risk marker for impaired graft survival. The aim of this study was to evaluate to what extent frequently used initial immunosuppressive therapies improve graft survival in immunologically low-risk patients. METHODS: We performed an analysis of the PROCARE cohort, a Dutch multicentre study including all transplantations performed in the Netherlands between 1995 and 2005 with available pretransplant serum (n = 4724). All sera were assessed for the presence of DSA by a Luminex single-antigen bead assay. Patients with a previous kidney transplantation, pretransplant DSA, or receiving induction therapy were excluded from the analysis. RESULTS: Three regimens were each used in more than 200 patients: cyclosporine (CsA)/prednisolone (Pred) (n = 542), CsA/mycophenolate mofetil (MMF)/Pred (n = 857), and tacrolimus (TAC)/MMF/Pred (n = 811). Covariate-adjusted analysis revealed no significant differences in 10-year death-censored graft survival between patients on TAC/MMF/Pred therapy (79%) and patients on CsA/MMF/Pred (82%, P = 0.88) or CsA/Pred (79%, P = 0.21). However, 1-year rejection-free survival censored for death and failure unrelated to rejection was significantly higher for TAC/MMF/Pred (81%) than for CsA/MMF/Pred (67%, P < 0.0001) and CsA/Pred (64%, P < 0.0001). CONCLUSION: These results suggest that in immunologically low-risk patients, excellent long-term kidney graft survival can be achieved irrespective of the type of initial immunosuppressive therapy (CsA or TAC; with or without MMF), despite differences in 1-year rejection-free survival.
Subject(s)
Cyclosporine/therapeutic use , Graft Rejection , Immunosuppression Therapy/methods , Kidney Transplantation , Mycophenolic Acid/therapeutic use , Tacrolimus/therapeutic use , Adult , Cohort Studies , Disease-Free Survival , Female , Graft Survival/immunology , HLA Antigens/immunology , Humans , Immunosuppression Therapy/adverse effects , Immunosuppressive Agents/therapeutic use , Kidney/immunology , Male , Middle Aged , Netherlands/epidemiology , Prednisolone
ABSTRACT
BACKGROUND: Complement-fixing antibodies against donor HLA are considered a contraindication for kidney transplant. A modification of the IgG single-antigen bead (SAB) assay allows detection of anti-HLA antibodies that bind C3d. Because early humoral graft rejection is considered to be complement mediated, this SAB-based technique may provide a valuable tool in the pretransplant risk stratification of kidney transplant recipients. METHODS: Previously, we established that pretransplant donor-specific anti-HLA antibodies (DSAs) are associated with increased risk for long-term graft failure in complement-dependent cytotoxicity crossmatch-negative transplants. In this study, we further characterized the DSA-positive serum samples using the C3d SAB assay. RESULTS: Among 567 pretransplant DSA-positive serum samples, 97 (17%) contained at least one C3d-fixing DSA, whereas 470 (83%) had non-C3d-fixing DSA. At 10 years after transplant, patients with C3d-fixing antibodies had a death-censored, covariate-adjusted graft survival of 60%, whereas patients with non-C3d-fixing DSA had a graft survival of 64% (hazard ratio, 1.02; 95% confidence interval, 0.70 to 1.48 for C3d-fixing DSA compared with non-C3d-fixing DSA; P = 0.93). Patients without DSA had a 10-year graft survival of 78%. CONCLUSIONS: The C3d-fixing ability of pretransplant DSA is not associated with increased risk for graft failure.
Subject(s)
Antibodies, Anti-Idiotypic/immunology , Complement C3d/immunology , Graft Rejection/immunology , HLA Antigens/immunology , Kidney Transplantation/adverse effects , Registries , Adult , Age Distribution , Antilymphocyte Serum/immunology , Cohort Studies , Female , Follow-Up Studies , Graft Rejection/epidemiology , Humans , Incidence , Kidney Transplantation/methods , Male , Middle Aged , Preoperative Care/methods , Retrospective Studies , Risk Assessment , Sex Distribution , Tissue Donors , Transplant Recipients/statistics & numerical data , Transplantation Immunology
ABSTRACT
BACKGROUND: Kidney recipients maintaining prolonged allograft survival in the absence of immunosuppressive drugs and without evidence of rejection are thought to be exceptional. The ERA-EDTA DESCARTES working group, together with Nantes University, launched a European-wide survey to identify new patients, describe them, and estimate their frequency for the first time. METHODS: Seventeen coordinators distributed a questionnaire to 256 transplant centres in 28 countries in order to report as many 'operationally tolerant' patients (TOL; defined as having a serum creatinine <1.7 mg/dL and proteinuria <1 g/day or g/g creatinine despite at least 1 year without any immunosuppressive drug) and 'almost tolerant' patients (minimally immunosuppressed (MIS) patients receiving low-dose steroids) as possible. We recorded their number and the total number of kidney transplants performed at each centre to calculate their frequency. RESULTS: One hundred and forty-seven questionnaires were returned, and we identified 66 TOL (61 with complete data) and 34 MIS patients. Of the 61 TOL patients, 26 had previously been described by the Nantes group and 35 new patients are presented here. Most of them were noncompliant patients. At data collection, 31/35 patients were alive and 22/31 were still operationally tolerant. Of the remaining 9/31, 2 were restarted on immunosuppressive drugs and 7 had rising creatinine, of whom 3 resumed dialysis. Considering all patients, 10-year death-censored graft survival post-immunosuppression weaning reached 85% in TOL patients and 100% in MIS patients. With 218,913 kidney recipients surveyed, the cumulative incidences of operational tolerance and almost tolerance were estimated at 3 and 1.5 per 10,000 kidney recipients, respectively. CONCLUSIONS: In kidney transplantation, operational tolerance and almost tolerance are infrequent findings associated with excellent long-term death-censored graft survival.
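The frequency estimates follow directly from the survey counts; the small sketch below reproduces the arithmetic (66 TOL and 34 MIS patients among 218,913 surveyed recipients).

```python
# Cumulative incidence per 10,000 surveyed kidney recipients
tol, mis, surveyed = 66, 34, 218_913
print(f"operational tolerance: {10_000 * tol / surveyed:.2f} per 10,000")  # ~3.0
print(f"almost tolerance:      {10_000 * mis / surveyed:.2f} per 10,000")  # ~1.55
```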
Subject(s)
Graft Rejection/epidemiology , Graft Survival/immunology , Immune Tolerance/immunology , Immunosuppression Therapy/methods , Kidney Transplantation , Transplant Recipients , Adult , Europe/epidemiology , Female , Graft Rejection/immunology , Graft Rejection/prevention & control , Humans , Incidence , Male , Surveys and Questionnaires , Survival Rate/trends , Transplantation, Homologous
ABSTRACT
Cell therapy and the use of mAbs that interfere with T cell effector functions constitute promising approaches for the control of allograft rejection. In the current study, we investigated a novel approach combining administration of autologous tolerogenic dendritic cells with short-term treatment with CD3-specific Abs. Permanent acceptance of pancreatic islet allografts was achieved in mice treated with the combination therapy the day before transplantation but not in recipients treated with either therapy alone. The combination treatment induced a marked decrease in T cells infiltrating the allografts and a sustained reduction of antidonor responses. Importantly, CD4(+)Foxp3(+) regulatory T cells appeared to play a crucial role in the long-term graft acceptance. Their frequency increased significantly in the spleen, draining lymph nodes, and transplanted islets and remained elevated over the long term; they exhibited increased donor-specific suppressive functions; and their removal at the time of transplantation abrogated the therapeutic effect of the combined therapy. These results support the therapeutic potential of protocols combining autologous tolerogenic dendritic cells and low-dose CD3-specific Abs, both currently in clinical development, which act in synergy to control allogeneic immune responses and favor graft survival in a fully mismatched setting.
Subject(s)
Antibodies, Monoclonal/pharmacology , CD3 Complex/metabolism , Dendritic Cells/immunology , Dendritic Cells/transplantation , Islets of Langerhans Transplantation , T-Lymphocytes, Regulatory/drug effects , T-Lymphocytes, Regulatory/immunology , Allografts , Animals , Epitopes/immunology , Graft Survival/drug effects , Graft Survival/immunology , Immunomodulation/drug effects , Immunomodulation/immunology , Islets of Langerhans Transplantation/methods , Mice , Models, Animal , T-Lymphocyte Subsets/immunology , T-Lymphocyte Subsets/metabolism , Transplantation Tolerance/drug effects , Transplantation Tolerance/immunology , Transplantation, Autologous
ABSTRACT
The use of inhibitors of the mammalian target of rapamycin (mTORi) in renal transplantation is associated with many side effects, of which interstitial pneumonitis is potentially the most severe. Several papers have reported on sirolimus-induced pneumonitis, but less has been published on everolimus-induced pneumonitis (EIP), and data on risk factors for contracting EIP are scarcer still. In the present case-cohort study in renal transplant recipients (RTR), we aimed to assess the incidence of and risk factors for EIP after renal transplantation. This study is a retrospective substudy of a multicenter randomized controlled trial. All patients included in the original trial and treated with prednisolone/everolimus were included in this substudy. RTR who developed EIP were identified as cases; RTR without pulmonary symptoms served as controls. Thirteen of 102 patients (12.7%) developed EIP. We did not find any predisposing factors; in particular, there was no correlation with everolimus concentration. On pulmonary CT scan, EIP presented with an organizing pneumonia-like pattern, a nonspecific interstitial pneumonitis-like pattern, or both. Median time (range) to the development of EIP after the start of everolimus was 162 (38-407) days. In conclusion, EIP is common in RTR, presenting with an organizing pneumonia-like pattern, a nonspecific interstitial pneumonitis-like pattern, or both, and no predisposing factors could be identified (Trial registration number: NTR567 (www.trialregister.nl), ISRCTN69188731).
Subject(s)
Immunosuppressive Agents/adverse effects , Kidney Transplantation , Lung Diseases, Interstitial/chemically induced , Sirolimus/analogs & derivatives , Adult , Aged , Cohort Studies , Everolimus , Female , Follow-Up Studies , Humans , Lung/diagnostic imaging , Lung Diseases, Interstitial/diagnostic imaging , Male , Middle Aged , Risk , Sirolimus/adverse effects , Tomography, X-Ray Computed
ABSTRACT
In kidney transplantation, antibodies against donor HLA are a risk factor for graft loss. The accessibility of donor eplets to HLA antibodies is predicted by the ElliPro score, but the clinical usefulness of these scores in relation to transplant outcome is unknown. In a large Dutch kidney transplant cohort, ElliPro scores of pretransplant donor antibodies that could be assigned to known eplets (donor epitope-specific HLA antibodies [DESAs]) were compared between early graft failures and long-surviving deceased donor transplants. We observed neither a significant difference in ElliPro scores between the two cohorts nor significant differences in graft survival between transplants with DESAs having high versus low total ElliPro scores. We conclude that ElliPro scores cannot be used to identify DESAs associated with early versus late kidney graft loss in deceased donor transplants.
Subject(s)
Kidney Diseases , Kidney Transplantation , Humans , Graft Survival , Alleles , Antibodies , Kidney , Epitopes , Graft Rejection , HLA Antigens , Tissue Donors
ABSTRACT
In kidney transplantation, survival rates are still partly impaired by the deleterious effects of donor-specific HLA antibodies (DSA). However, not all Luminex-defined DSA appear to be clinically relevant. Further analysis of DSA recognizing polymorphic amino acid configurations, called eplets or functional epitopes, might improve the discrimination between clinically relevant and irrelevant HLA antibodies. To evaluate which donor epitope-specific HLA antibodies (DESAs) are clinically important for kidney graft survival, relevant and irrelevant DESAs were discerned in a Dutch cohort of 4690 patients using Kaplan-Meier analysis and tested in a Cox proportional hazards (CPH) model including nonimmunological variables. Pre-transplant DESAs were detected in 439 patients (9.4%). The presence of certain clinically relevant DESAs was significantly associated with an increased risk of graft loss in deceased donor transplantations (p < 0.0001). The antibodies recognized six epitopes of HLA Class I, three of HLA-DR, and one of HLA-DQ, and most were directed against HLA-B (47%). Fifty-three patients (69.7%) had DESA against one donor epitope (range 1-5). The long-term graft survival rate in patients with clinically relevant DESA was 32%, rendering DESA a superior parameter to classical DSA (60%). In the CPH model, the hazard ratio (95% CI) of clinically relevant DESAs was 2.45 (1.84-3.25) in deceased donation and 2.22 (1.25-3.95) in living donation. In conclusion, the developed model shows the deleterious effect of clinically relevant DESAs on graft outcome, outperforming traditional DSA-based risk analysis at the antigen level.
Subject(s)
Kidney Transplantation , Humans , Kidney Transplantation/adverse effects , Epitopes , HLA Antigens/genetics , Clinical Relevance , Isoantibodies , Alleles , Tissue Donors , Graft Rejection
ABSTRACT
Inhibitors of the mammalian target of rapamycin (mTOR) have been associated with proteinuria. We studied the development of proteinuria in renal transplant recipients (RTR) treated with the mTOR inhibitor everolimus in comparison with a calcineurin inhibitor, and related the presence of proteinuria to histopathological glomerular findings in two-year protocol biopsies. In a single-center study, nested in a multicenter randomized controlled trial, we determined eGFR, proteinuria, and renal biopsy data (light and electron microscopy) of RTR receiving prednisolone/everolimus (P/EVL) (n = 16) in comparison with patients treated with prednisolone/cyclosporine A (P/CsA) (n = 7). All patients had been on the above-described maintenance immunosuppression for 18 months. Renal function at two years after transplantation did not differ between patients receiving P/EVL or P/CsA (eGFR 45.5 vs. 45.7 mL/min/1.73 m(2)). Proteinuria was slightly increased in the P/EVL versus the P/CsA group (0.29 vs. 0.14 g/24 h, p = 0.06). There were no differences in light or electron microscopic findings, and we could not demonstrate increased podocyte effacement or changes in glomerular basement membrane (GBM) thickness in P/EVL-treated patients. In conclusion, long-term treatment with everolimus leaves the GBM and podocytes unaffected.
Subject(s)
Cyclosporine/therapeutic use , Graft Rejection/drug therapy , Immunosuppressive Agents/therapeutic use , Kidney Failure, Chronic/surgery , Kidney Glomerulus/drug effects , Kidney Transplantation/adverse effects , Sirolimus/analogs & derivatives , Adult , Aged , Everolimus , Female , Follow-Up Studies , Glomerular Filtration Rate , Graft Rejection/etiology , Humans , Kidney Failure, Chronic/complications , Kidney Function Tests , Male , Middle Aged , Prognosis , Prospective Studies , Sirolimus/therapeutic use , Young Adult
ABSTRACT
A high intrapatient variability (IPV) in tacrolimus exposure is a risk factor for poor long-term outcomes after kidney transplantation. The main objective of this trial was to investigate whether the tacrolimus IPV decreases after switching patients from immediate-release tacrolimus (IR-tacrolimus) to either extended-release tacrolimus (ER-tacrolimus) or LifeCyclePharma tacrolimus (LCP-tacrolimus). In this randomized, prospective, open-label, cross-over trial, adult kidney transplant recipients on a stable immunosuppressive regimen including IR-tacrolimus were randomized for conversion to ER-tacrolimus or LCP-tacrolimus, and for the order in which IR-tacrolimus and the once-daily formulations were taken. Patients were followed for 6 months on each formulation, with monthly tacrolimus predose concentration assessments to calculate the IPV, defined as the coefficient of variation (%) of dose-corrected predose concentrations. Ninety-two patients were included in the analysis of the primary outcome. No significant difference was observed between the IPV of IR-tacrolimus (16.6%) and that of the combined once-daily formulations (18.3%) (difference +1.7%, 95% confidence interval [CI] -1.1% to +4.5%, p = 0.24). The IPV of LCP-tacrolimus (20.1%) was not significantly different from that of ER-tacrolimus (16.5%; difference +3.6%, 95% CI -0.1% to +7.3%, p = 0.06). In conclusion, the IPV did not decrease after switching from IR-tacrolimus to either ER-tacrolimus or LCP-tacrolimus. These results provide no argument for switching kidney transplant recipients from twice-daily (IR) tacrolimus formulations to once-daily (modified-release) tacrolimus formulations when the aim is to lower the IPV.
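The IPV definition translates directly into code: divide each predose concentration by the concurrent daily dose, then take the coefficient of variation of the results. In this minimal sketch, the function name and example values are illustrative, and the use of the sample standard deviation (ddof=1) is an assumption, as the abstract does not specify it.

```python
import numpy as np

def tacrolimus_ipv(predose_ng_ml, daily_dose_mg):
    """IPV as the coefficient of variation (%) of dose-corrected
    predose concentrations."""
    c = np.asarray(predose_ng_ml, float) / np.asarray(daily_dose_mg, float)
    return 100.0 * c.std(ddof=1) / c.mean()

# Six monthly predose levels (ng/mL) and the daily dose (mg) at each visit
levels = [6.8, 5.9, 7.4, 6.1, 8.0, 6.5]
doses  = [4.0, 4.0, 5.0, 4.0, 5.0, 4.0]
print(f"IPV = {tacrolimus_ipv(levels, doses):.1f}%")
```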