ABSTRACT
BACKGROUND: Prognostic models are becoming increasingly relevant in clinical trials as potential surrogate endpoints, and for patient management as clinical decision support tools. However, the impact of competing risks on model performance remains poorly investigated. We aimed to carefully assess the performance of competing risk and noncompeting risk models in the context of kidney transplantation, where allograft failure and death with a functioning graft are two competing outcomes. METHODS: We included 11,046 kidney transplant recipients enrolled in 10 countries. We developed prediction models for long-term kidney graft failure, both without accounting for (i.e., censoring) and accounting for the competing risk of death with a functioning graft, using Cox, Fine-Gray, and cause-specific Cox regression models. To this end, we followed a detailed and transparent analytical framework for competing and noncompeting risk modelling, and carefully assessed the models' development, stability, discrimination, calibration, overall fit, clinical utility, and generalizability in external validation cohorts and subpopulations. More than 15 metrics were used to provide an exhaustive assessment of model performance. RESULTS: Among 11,046 recipients in the derivation and validation cohorts, 1,497 (14%) lost their graft and 1,003 (9%) died with a functioning graft after a median follow-up post-risk evaluation of 4.7 years (IQR 2.7-7.0). The cumulative incidence of graft loss was similarly estimated by the Kaplan-Meier and Aalen-Johansen methods (17% versus 16% in the derivation cohort). Cox and competing risk models showed similar and stable risk estimates for predicting long-term graft failure (average mean absolute prediction error of 0.0140, 0.0138, and 0.0135 for the Cox, Fine-Gray, and cause-specific Cox models, respectively). Discrimination and overall fit were comparable in the validation cohorts, with concordance indices ranging from 0.76 to 0.87.
Across various subpopulations and clinical scenarios, the models performed well and similarly, although in some high-risk groups (such as donors over 65 years old), the findings suggest a trend towards moderately improved calibration when using a competing risk approach. CONCLUSIONS: Competing and noncompeting risk models performed similarly in predicting long-term kidney graft failure.
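The contrast the abstract draws between censoring competing events (1 minus Kaplan-Meier) and modelling them (Aalen-Johansen) can be made concrete with a small sketch. This is illustrative pure Python on toy data, not the study's code; it shows why the naive estimate can only overstate the cumulative incidence when a competing event is present:

```python
def cumulative_incidence(times, causes, t):
    """Cumulative incidence of cause-1 events by time t under competing risks.
    causes: 0 = censored, 1 = event of interest, 2 = competing event.
    Returns (naive, aalen_johansen): the 1 - Kaplan-Meier estimate obtained by
    censoring competing events, and the Aalen-Johansen estimate."""
    data = sorted(zip(times, causes))
    at_risk = len(data)
    s_all = 1.0     # overall event-free survival (feeds the Aalen-Johansen sum)
    s_naive = 1.0   # KM survival treating competing events as censoring
    cif_aj = 0.0
    i = 0
    while i < len(data) and data[i][0] <= t:
        j, d1, d2 = i, 0, 0
        while j < len(data) and data[j][0] == data[i][0]:  # group tied times
            if data[j][1] == 1:
                d1 += 1
            elif data[j][1] == 2:
                d2 += 1
            j += 1
        cif_aj += s_all * d1 / at_risk       # AJ increment weighted by overall survival
        s_all *= 1 - (d1 + d2) / at_risk     # overall KM step (any event)
        s_naive *= 1 - d1 / at_risk          # naive KM step (cause 2 censored)
        at_risk -= j - i
        i = j
    return 1 - s_naive, cif_aj

# toy cohort: cause 1 = graft failure, cause 2 = death with functioning graft
naive, aj = cumulative_incidence([1, 2, 3, 4, 5, 6], [1, 2, 1, 0, 2, 1], t=3)
```

With any competing events present, `naive >= aj` always holds, which mirrors the small 17% versus 16% gap reported in the derivation cohort.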
ABSTRACT
Living kidney donation and living liver donation significantly increase organ supply, making lifesaving transplants possible and offering survival benefits to recipients and cost savings to society. Of all living donors, 40% are women of childbearing age. However, limited data exist regarding the effect of donation on future pregnancies and of pregnancy-related complications on postdonation outcomes. In February 2023, the American Society of Transplantation Women's Health Community of Practice held a virtual Controversies Conference on reproductive health, contraception, and pregnancy after transplantation and living donation. Experts in the field presented the available data. Smaller breakout sessions were created to discuss findings, identify knowledge gaps, and develop recommendations. We present the conference findings related to living donation. The evidence reviewed shows that gestational hypertension and gestational diabetes mellitus before kidney donation have been associated with an increased risk of developing postdonation hypertension and diabetes mellitus, respectively, without increasing the risk of developing an eGFR <45 ml/min after donation. The risk of preeclampsia in living kidney donors increases to 4%-10%, and low-dose aspirin may help reduce that risk. Little is known about the financial burden for living donors who become pregnant, their risk of postpartum depression, or the optimal time between donation and conception. The data on living liver donors are even scarcer. The creation of a registry of donor candidates may help answer many of these questions and, in turn, educate prospective donors so that they can make an informed choice.
ABSTRACT
BACKGROUND: Chronic immunosuppression following pancreas transplantation carries significant risk, including posttransplant lymphoproliferative disease (PTLD). We sought to define the incidence, risk factors, and long-term outcomes of PTLD following pancreas transplantation at a single center. METHODS: All adult pancreas transplants between February 1, 1983 and December 31, 2023 at the University of Minnesota were reviewed, including pancreas transplant alone (PTA), simultaneous pancreas-kidney transplants (SPK), and pancreas after kidney transplants (PAK). RESULTS: Among 2353 transplants, 110 cases of PTLD were identified, with an overall incidence of 4.8%. Of these, 17.3% were diagnosed within 1 year of transplant, 32.7% within 5 years, and 74 (67.3%) after 5 years. The overall 30-year incidence of PTLD did not differ by transplant type: 7.4% for PTA, 14.2% for SPK, and 19.4% for PAK (p = 0.3). In multivariable analyses, older age and Epstein-Barr virus (EBV) seronegativity were risk factors for PTLD, and PTLD was a risk factor for patient death. PTLD-specific mortality was 32.7%, although recipients with PTLD had similar median posttransplant survival compared to those without PTLD (14.9 years vs. 15.6 years, p = 0.9). CONCLUSIONS: PTLD following pancreas transplantation is associated with significant mortality. Although the incidence of PTLD has decreased over time, a high index of suspicion for PTLD should remain in EBV-negative recipients.
Subjects
Graft Survival , Lymphoproliferative Disorders , Pancreas Transplantation , Postoperative Complications , Humans , Pancreas Transplantation/adverse effects , Male , Lymphoproliferative Disorders/etiology , Lymphoproliferative Disorders/epidemiology , Female , Adult , Postoperative Complications/epidemiology , Postoperative Complications/etiology , Follow-Up Studies , Risk Factors , Prognosis , Middle Aged , Incidence , Survival Rate , Retrospective Studies , Graft Rejection/etiology , Graft Rejection/mortality , Kidney Transplantation/adverse effects , Young Adult
ABSTRACT
African American (AA) kidney transplant recipients (KTRs) have poor outcomes, which may be due in part to suboptimal tacrolimus (TAC) immunosuppression. We previously identified CYP3A5 *3, *6, and *7 as the common genetic regulators of TAC pharmacokinetics in AAs. To identify low-frequency variants that impact TAC pharmacokinetics, we used extreme phenotype sampling and compared individuals with extreme high (n = 58) and low (n = 60) TAC troughs (N = 515 AA KTRs). Targeted next-generation sequencing was conducted in these two groups. Median TAC troughs in the high group were 7.7 ng/ml compared with 6.3 ng/ml in the low group, despite lower daily doses of 5 versus 12 mg, respectively. Of 34,542 identified variants across 99 genes, 1,406 variants were suggestively associated with TAC troughs in univariate models (p-value < 0.05); however, none were significant after multiple testing correction. We suggest future studies investigate additional sources of TAC pharmacokinetic variability, such as drug-drug-gene interactions and the pharmacomicrobiome.
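The extreme-phenotype contrast above is usually expressed as a dose-normalized trough (concentration achieved per mg of daily dose). A minimal sketch using only the figures reported in the abstract:

```python
def dose_normalized_trough(trough_ng_ml, daily_dose_mg):
    """Trough concentration per mg of daily dose (ng/mL per mg)."""
    return trough_ng_ml / daily_dose_mg

# Figures from the abstract: high group 7.7 ng/ml on 5 mg/day,
# low group 6.3 ng/ml on 12 mg/day.
high = dose_normalized_trough(7.7, 5)   # ~1.54 ng/mL per mg
low = dose_normalized_trough(6.3, 12)   # ~0.53 ng/mL per mg
```

The high group reaches roughly three times the exposure per mg, which is what makes the two tails informative for variant discovery.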
Subjects
Black or African American , Immunosuppressive Agents , Kidney Transplantation , Tacrolimus , Adult , Female , Humans , Male , Middle Aged , Black or African American/genetics , Cytochrome P-450 CYP3A/genetics , Genetic Variation , High-Throughput Nucleotide Sequencing , Immunosuppressive Agents/pharmacokinetics , Pharmacogenomic Variants , Phenotype , Tacrolimus/pharmacokinetics , Tacrolimus/therapeutic use , Transplant Recipients
ABSTRACT
AIMS: Tacrolimus, metabolized by the CYP3A4 and CYP3A5 enzymes, is susceptible to drug-drug interactions (DDI). Steroids induce CYP3A genes and thereby increase tacrolimus clearance, but the effect is variable. We hypothesized that the extent of the steroid-tacrolimus DDI differs by CYP3A4/5 genotype. METHODS: Kidney transplant recipients (n = 2462) were classified by the number of loss-of-function (LOF) alleles (CYP3A5*3, *6 and *7 and CYP3A4*22) and steroid use at each tacrolimus trough in the first 6 months post-transplant. A population pharmacokinetic analysis was performed by nonlinear mixed-effect modelling (NONMEM), with stepwise covariate modelling to identify significant covariates affecting tacrolimus clearance. A stochastic simulation was performed and translated into a Shiny application with the mrgsolve and Shiny packages in R. RESULTS: Steroids were associated with modestly higher (3%-11.8%) tacrolimus clearance. Patients with 0 LOF alleles receiving steroids showed the greatest increase (11.8%) in clearance compared to no steroids, whereas those with 2 LOF alleles had a negligible increase (2.6%) in the presence of steroids. Steroid use increased tacrolimus clearance by 5% and 10.3% in patients with 1 and 3/4 LOF alleles, respectively. CONCLUSIONS: Steroids increase the clearance of tacrolimus, but the magnitude of the effect varies by CYP3A genotype. This is important for individuals of African ancestry, who are more likely to carry no LOF alleles, may more commonly receive steroid treatment, and will need higher tacrolimus doses.
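The genotype-specific steroid effect reported above can be sketched as a simple clearance multiplier. The fractional increases are taken directly from the abstract; the base clearance value and the function itself are illustrative assumptions, not the published NONMEM model:

```python
# Steroid-associated fractional increase in tacrolimus clearance by number of
# CYP3A loss-of-function (LOF) alleles, as reported in the abstract
# (0-LOF: 11.8%, 1-LOF: 5%, 2-LOF: 2.6%, 3/4-LOF: 10.3%).
STEROID_CL_INCREASE = {0: 0.118, 1: 0.050, 2: 0.026, 3: 0.103, 4: 0.103}

def tacrolimus_clearance(base_cl_l_per_h, n_lof_alleles, on_steroids):
    """Scale a (hypothetical) base clearance by the genotype-specific steroid
    effect; without steroids the base clearance is returned unchanged."""
    if not on_steroids:
        return base_cl_l_per_h
    return base_cl_l_per_h * (1 + STEROID_CL_INCREASE[n_lof_alleles])
```

Higher clearance implies lower troughs at the same dose, which is why the 0-LOF group may need the largest dose adjustment when steroids are added.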
ABSTRACT
BACKGROUND: Therapeutic drug monitoring for mycophenolic acid (MPA) is challenging due to difficulties in measuring the area under the curve (AUC). Limited sampling strategies (LSSs) have been developed for MPA therapeutic drug monitoring but come with a risk of unacceptable performance. The authors hypothesized that the poor predictive performance of LSSs was due to variability in MPA enterohepatic recirculation (EHR). This study is the first to evaluate LSS model performance in the context of EHR. METHODS: Adult kidney transplant recipients (n = 84) receiving oral mycophenolate mofetil underwent intensive MPA pharmacokinetic sampling. MPA AUC0-12hr and EHR were determined. Published MPA LSSs in kidney transplant recipients receiving tacrolimus were evaluated for their predictive performance in estimating AUC0-12hr in our full cohort and separately in individuals with high and low EHR. RESULTS: None of the evaluated LSS models (n = 12) showed good precision or accuracy in predicting MPA AUC0-12hr in the full cohort. In the high-EHR group, models with late timepoints had better accuracy but low precision, except for 1 model with late timepoints at 6 and 10 hours postdose, which had marginally acceptable precision. For all models, the proportion of good guesses of predicted AUC0-12hr (within ±15% of observed AUC0-12hr) was highly variable (range, full cohort = 19%-61.9%; high EHR = 4.5%-65.9%; low EHR = 27.5%-62.5%). CONCLUSIONS: The predictive performance of the LSS models varied according to EHR status. Timepoints ≥5 hours postdose are essential in LSS models to capture EHR. Models and strategies that incorporate EHR during development are required to accurately ascertain MPA exposure.
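Two of the building blocks above are easy to make explicit: the observed AUC from intensive sampling (here sketched with the linear trapezoidal rule, one common convention) and the abstract's "good guess" criterion of ±15% of the observed AUC. This is an illustrative sketch, not any of the 12 published LSS models:

```python
def auc_trapezoid(times_h, concs):
    """Area under the concentration-time curve by the linear trapezoidal rule."""
    return sum((t2 - t1) * (c1 + c2) / 2
               for (t1, c1), (t2, c2) in zip(zip(times_h, concs),
                                             zip(times_h[1:], concs[1:])))

def good_guess(predicted_auc, observed_auc, tol=0.15):
    """The abstract's 'good guess' criterion: prediction within +/-15% of
    the observed AUC."""
    return abs(predicted_auc - observed_auc) <= tol * observed_auc
```

An LSS replaces the full profile in `auc_trapezoid` with a regression on a few timepoints; the paper's point is that without timepoints at or after 5 hours, the secondary EHR peak is invisible to that regression.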
ABSTRACT
In a previous study, we observed decreased 1,25-dihydroxyvitamin D levels, secondary hyperparathyroidism, and increased bone turnover markers in living kidney donors (LKDs) at 3 months and 36 months after kidney donation. In our recent survey-based study, we found no increased risk of fractures of all types but observed significantly more vertebral fractures in LKDs compared with matched controls. To elucidate the long-term effects of kidney donation on bone health, we recruited 139 LKDs and 139 age- and sex-matched controls from the survey-based participants for further mechanistic analyses. Specifically, we assessed whether LKDs had persistent abnormalities in calcium- and phosphorus-regulating hormones and related factors, in bone formation and resorption markers, and in bone density and microstructure compared with controls. We measured serum markers, bone mineral density (BMD), bone microstructure and strength (via high-resolution peripheral quantitative computed tomography [HRpQCT] and micro-finite element analysis), and advanced glycation end-products in donors and controls. LKDs had decreased 1,25-dihydroxyvitamin D concentrations (donors mean 33.89 pg/mL vs. controls 38.79 pg/mL; percent difference = -12.6%; P < .001), increases in both parathyroid hormone (when corrected for ionized calcium; donors mean 52.98 pg/mL vs. controls 46.89 pg/mL; percent difference = 13%; P = .03) and ionized calcium levels (donors mean 5.13 mg/dL vs. controls 5.04 mg/dL; P < .001), and increases in several bone resorption and formation markers versus controls. LKDs and controls had similar measures of BMD; however, HRpQCT suggested that LKDs have a statistically nonsignificant tendency toward thinner cortical bone and lower failure loads as measured by micro-finite element analysis.
Our findings suggest that changes in the hormonal milieu after kidney donation, and the long-term cumulative effects of these changes on bone health, persist for decades after donation and may explain the increased rates of vertebral fractures seen in later life.
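As a quick arithmetic check, the percent differences quoted above can be reproduced directly from the reported group means (a minimal sketch; all values are from the abstract):

```python
def percent_difference(donor_mean, control_mean):
    """Percent difference of the donor mean relative to the control mean."""
    return (donor_mean - control_mean) / control_mean * 100

# 1,25-dihydroxyvitamin D: donors 33.89 pg/mL vs. controls 38.79 pg/mL
vit_d_diff = percent_difference(33.89, 38.79)   # ~ -12.6%
# Parathyroid hormone: donors 52.98 pg/mL vs. controls 46.89 pg/mL
pth_diff = percent_difference(52.98, 46.89)     # ~ +13%
```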
ABSTRACT
In the general population, decreases in glomerular filtration rate (GFR) are associated with subsequent development of chronic kidney disease (CKD), cardiovascular disease (CVD), and death. It is unknown whether low estimated GFR (eGFR) before or early after kidney donation is also associated with these risks. One thousand six hundred ninety-nine living donors who had both predonation and early (4-10 weeks) postdonation eGFR were included. We studied the relationships between eGFR, age at donation, and the time to sustained eGFR <45 mL/min/1.73m2 (CKD stage 3b) and <30 mL/min/1.73m2 (CKD stage 4), hypertension, diabetes mellitus (DM), CVD, and death. Median follow-up was 12 (interquartile range, 6-21) years. Twenty-year event rates were 5.8% for eGFR <45 mL/min/1.73m2; 1.2% for eGFR <30 mL/min/1.73m2; 29.0% for hypertension; 7.8% for DM; 8.0% for CVD; and 5.2% for death. The median time to eGFR <45 mL/min/1.73m2 (N = 79) was 17 years, and to eGFR <30 mL/min/1.73m2 (N = 22) was 25 years. Both low predonation and early postdonation eGFR were associated with eGFR <45 mL/min/1.73m2 (P < .0001) and eGFR <30 mL/min/1.73m2 (P < .006); however, the primary driver of risk at all ages was low postdonation (rather than predonation) eGFR. Predonation and postdonation eGFR were not associated with hypertension, DM, CVD, or death. Low predonation and early postdonation eGFR are risk factors for developing eGFR <45 mL/min/1.73m2 (CKD stage 3b) and <30 mL/min/1.73m2 (CKD stage 4), but not CVD, hypertension, DM, or death.
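The two eGFR endpoints used throughout the abstract map onto CKD stages; a minimal helper encoding only the thresholds the study tracks (the label for the remaining range is our own shorthand, not a CKD staging system):

```python
def ckd_endpoint_from_egfr(egfr_ml_min_173m2):
    """Map eGFR (mL/min/1.73 m2) to the two endpoint categories from the
    abstract: <30 = CKD stage 4, <45 = CKD stage 3b."""
    if egfr_ml_min_173m2 < 30:
        return "CKD stage 4"
    if egfr_ml_min_173m2 < 45:
        return "CKD stage 3b"
    return "above study thresholds"
```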
Subjects
Glomerular Filtration Rate , Kidney Transplantation , Living Donors , Humans , Male , Female , Middle Aged , Kidney Transplantation/adverse effects , Adult , Follow-Up Studies , Risk Factors , Prognosis , Nephrectomy/adverse effects , Kidney Function Tests , Renal Insufficiency, Chronic/physiopathology , Renal Insufficiency, Chronic/etiology , Tissue and Organ Harvesting/adverse effects , Cardiovascular Diseases/etiology
ABSTRACT
Living kidney donors make a significant contribution to alleviating the organ shortage. The aim of this article is to provide an overview of mid- and long-term (≥12 mo) living donor psychosocial outcomes and highlight areas that have been understudied and should be immediately addressed in both research and clinical practice. We conducted a narrative review by searching 3 databases; a total of 206 articles were included. Living donors can be divided into those who donate to an emotionally or genetically related person, the so-called directed donors, and those who donate to an emotionally or genetically unrelated recipient, the so-called nondirected donors. The most commonly investigated (bio)psychosocial outcome after living donation was health-related quality of life. Other generic (bio)psychological outcomes include specific aspects of mental health such as depression, fatigue, and pain. Social outcomes include financial and employment burdens and problems with insurance. Donation-specific psychosocial outcomes include regret, satisfaction, feelings of abandonment and unmet needs, and benefits of living kidney donation. The experience of living donation is complex and multifaceted, reflected in the co-occurrence of both benefits and burden after donation. Notably, no interventions have been developed to improve mid- or long-term psychosocial outcomes among living donors. We highlight areas for methodological improvement and identify 3 areas requiring immediate attention from the transplant community in both research and clinical care: (1) recognizing and providing care for the minority of donors who have poorer long-term psychosocial outcomes after donation, (2) minimizing donation-related financial burden, and (3) studying interventions to minimize long-term psychosocial problems.
ABSTRACT
OBJECTIVE: To describe the evolution of pancreas transplantation over the past 5 decades, including improved outcomes and the factors associated with those improvements. BACKGROUND: The world's first successful pancreas transplant was performed in December 1966 at the University of Minnesota. As new modalities for diabetes treatment mature, we must carefully assess the current state of pancreas transplantation to determine its ongoing role in patient care. METHODS: A single-center retrospective review of 2500 pancreas transplants spanning >50 years was performed using bivariate and multivariable models. Transplants were divided into 6 eras; outcomes are presented for the entire cohort and by era. RESULTS: All measures of patient and graft survival improved progressively through the 6 transplant eras. The overall death-censored pancreas graft half-lives were >35 years for simultaneous pancreas and kidney (SPK) transplants, 7.1 years for pancreas after kidney (PAK) transplants, and 3.3 years for pancreas transplant alone (PTA). The 10-year death-censored pancreas graft survival rate in the most recent era was 86.9% for SPK recipients, 58.2% for PAK recipients, and 47.6% for PTA recipients. Overall, graft loss was most influenced by patient survival in SPK transplants, whereas graft loss in PAK and PTA recipients was more often due to graft failure. Predictors of improved pancreas graft survival were primary transplants, bladder drainage of exocrine secretions, younger donor age, and shorter preservation time. CONCLUSIONS: Pancreas outcomes have significantly improved over time through sequential, but overlapping, advances in surgical technique, immunosuppressive protocols, reduced preservation time, and the more recent reduction of immune-mediated graft loss.
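Graft half-life is often summarized under a constant-hazard (exponential survival) approximation. The sketch below back-of-envelope checks that the reported 10-year SPK survival of 86.9% is consistent with a half-life well above 35 years; the exponential assumption is ours for illustration, not the paper's method:

```python
from math import log

def exponential_half_life(surv_frac, t_years):
    """Half-life implied by a survival fraction `surv_frac` observed at
    `t_years`, assuming a constant hazard: S(t) = exp(-lambda * t)."""
    hazard = -log(surv_frac) / t_years
    return log(2) / hazard

# 10-year death-censored SPK graft survival of 86.9% implies a half-life of
# roughly 49 years under this approximation, consistent with "> 35 years".
spk_half_life = exponential_half_life(0.869, 10)
```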
Subjects
Graft Survival , Pancreas Transplantation , Pancreas Transplantation/methods , Humans , Retrospective Studies , Adult , Male , Female , Middle Aged , Kidney Transplantation , Treatment Outcome , Adolescent , Child , Young Adult , Survival Rate
ABSTRACT
BACKGROUND: Acute rejection (AR) after kidney transplantation is an important allograft complication. To reduce the risk of post-transplant AR, determination of kidney transplant donor-recipient mismatching focuses on blood type and human leukocyte antigens (HLA), while it remains unclear whether non-HLA genetic mismatching is related to post-transplant complications. METHODS: We carried out a genome-wide scan (HLA and non-HLA regions) for AR in a large kidney transplant cohort of 784 living donor-recipient pairs of European ancestry. An AR polygenic risk score (PRS) was constructed from the non-HLA single nucleotide polymorphisms (SNPs) filtered by independence (r2 < 0.2) and P-value (< 1×10^-3) criteria. The PRS was validated in an independent cohort of 352 living donor-recipient pairs. RESULTS: In the genome-wide scan, we identified one significant SNP, rs6749137 (HR = 2.49, P = 2.15×10^-8). A total of 1,307 non-HLA SNPs passed the clumping-plus-thresholding filter, and the resulting PRS exhibited a significant association with AR in the validation cohort (HR = 1.54, 95% CI = (1.07, 2.22), p = 0.019). Further pathway analysis grouped the PRS genes into 13 categories, and the over-representation test identified 42 significant biological processes, the most significant of which was cell morphogenesis (GO:0000902), with 4.08-fold enrichment relative to the Homo sapiens reference and an FDR-adjusted P-value of 8.6×10^-4. CONCLUSIONS: Our results show the importance of donor-recipient mismatching in non-HLA regions. Additional work will be needed to understand the role of the SNPs included in the PRS and to further improve donor-recipient genetic matching algorithms. Trial registry: Deterioration of Kidney Allograft Function Genomics (NCT00270712) and Genomics of Kidney Transplantation (NCT01714440) are registered on ClinicalTrials.gov.
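The clumping-plus-thresholding construction described above (keep SNPs with P < 1×10^-3, drop any in LD at r2 ≥ 0.2 with a more significant kept SNP, then sum dosage-weighted effects) can be sketched as follows. This is a generic illustration of the technique with toy inputs, not the study's pipeline:

```python
def clump_and_threshold(snps, r2, p_thresh=1e-3, r2_thresh=0.2):
    """Greedy clumping plus thresholding.
    snps: list of (snp_id, beta, p); r2: dict mapping frozenset({id1, id2})
    to pairwise LD r2 (missing pairs are treated as independent)."""
    kept = []
    for snp_id, beta, p in sorted(snps, key=lambda s: s[2]):  # most significant first
        if p >= p_thresh:
            break  # sorted by p, so everything after also fails the threshold
        if all(r2.get(frozenset({snp_id, k}), 0.0) < r2_thresh
               for k, _, _ in kept):
            kept.append((snp_id, beta, p))
    return kept

def polygenic_risk_score(kept, dosages):
    """PRS = sum of effect sizes weighted by risk-allele dosage (0-2)."""
    return sum(beta * dosages[snp_id] for snp_id, beta, _ in kept)

# toy example: "b" is in LD with the stronger hit "a" and is clumped away
snps = [("a", 0.5, 1e-5), ("b", 0.3, 1e-4), ("c", 0.2, 5e-4), ("d", 0.1, 0.01)]
ld = {frozenset({"a", "b"}): 0.5}
kept = clump_and_threshold(snps, ld)
```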
Subjects
Genome-Wide Association Study , Genotype , Graft Rejection , Kidney Transplantation , Polymorphism, Single Nucleotide , Humans , Graft Rejection/genetics , Graft Rejection/immunology , Female , Male , Middle Aged , Adult , HLA Antigens/genetics , Multifactorial Inheritance , Risk Factors , Living Donors , Cohort Studies , Genetic Risk Stratification
ABSTRACT
BACKGROUND: Valganciclovir (valG), a cytomegalovirus (CMV) prophylactic agent, has dose-limiting side effects. The tolerability and effectiveness of valacyclovir (valA) as CMV prophylaxis are unknown. METHODS: We conducted a randomized, open-label, single-center trial of valA versus valG for posttransplant CMV prophylaxis in adult and pediatric kidney recipients. Participants were randomly assigned to receive valA or valG. Primary endpoints were the incidence of CMV viremia and side-effect-related drug reduction, with secondary assessment of the incidence of EBV viremia. RESULTS: Of the 137 sequential kidney transplant recipients enrolled, 26% were CMV antibody donor-positive/recipient-negative. The incidence of CMV viremia (4 of 71 [6%] vs. 8 of 67 [12%], P = 0.23), time to viremia (P = 0.16), and area under the CMV viral load-time curve (P = 0.19) were not significantly different. ValG participants were significantly more likely to require side-effect-related dose reduction (15/71 [21%] versus 1/66 [2%], P = 0.0003). Leukopenia was the most common reason for valG dose reduction, and granulocyte colony-stimulating factor was utilized for leukopenia recovery more frequently (25% in valG vs 5% in valA; P = 0.0007). The incidence of EBV viremia was not significantly different. CONCLUSIONS: ValA has significantly fewer dose-limiting side effects than valG. In our study population of adults and children after kidney transplant, no significant increase in CMV viremia was observed compared with valG. TRIAL REGISTRATION NUMBER: NCT01329185.
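The dose-reduction comparison above (15/71 versus 1/66) is a sparse 2x2 table; the abstract does not state which test produced P = 0.0003, but Fisher's exact test is one standard choice for such data, sketched here with only the standard library:

```python
from math import comb

def fisher_exact_p(a, b, c, d):
    """Two-sided Fisher exact p-value for the 2x2 table [[a, b], [c, d]]:
    sum the probabilities of all tables with the same margins that are no
    more likely than the observed table."""
    n, r1, c1 = a + b + c + d, a + b, a + c
    def p_table(x):  # hypergeometric probability of upper-left cell = x
        return comb(r1, x) * comb(n - r1, c1 - x) / comb(n, c1)
    p_obs = p_table(a)
    lo, hi = max(0, c1 - (n - r1)), min(r1, c1)
    return sum(p for p in (p_table(x) for x in range(lo, hi + 1))
               if p <= p_obs * (1 + 1e-9))

# Dose reduction: valG 15 of 71, valA 1 of 66 (figures from the abstract)
p = fisher_exact_p(15, 71 - 15, 1, 66 - 1)
```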
Subjects
Antiviral Agents , Cytomegalovirus Infections , Ganciclovir , Kidney Transplantation , Transplant Recipients , Valacyclovir , Valganciclovir , Humans , Valacyclovir/therapeutic use , Cytomegalovirus Infections/prevention & control , Valganciclovir/therapeutic use , Valganciclovir/administration & dosage , Kidney Transplantation/adverse effects , Antiviral Agents/therapeutic use , Antiviral Agents/administration & dosage , Antiviral Agents/adverse effects , Male , Female , Adult , Child , Middle Aged , Adolescent , Ganciclovir/analogs & derivatives , Ganciclovir/therapeutic use , Ganciclovir/administration & dosage , Ganciclovir/adverse effects , Viremia/prevention & control , Viral Load , Young Adult , Valine/analogs & derivatives , Valine/therapeutic use , Valine/administration & dosage , Cytomegalovirus/immunology , Cytomegalovirus/drug effects , Child, Preschool , Acyclovir/therapeutic use , Acyclovir/analogs & derivatives , Acyclovir/administration & dosage , Acyclovir/adverse effects , Aged , Treatment Outcome , Incidence
ABSTRACT
The human microbiome is associated with human health and disease. Exogenous compounds, including pharmaceutical products, are also known to be affected by the microbiome, and this discovery has led to the field of pharmacomicrobiomics. The microbiome can also alter drug pharmacokinetics and pharmacodynamics, possibly resulting in side effects, toxicities, and unanticipated disease response. These microbiome-mediated effects are referred to as drug-microbiome interactions (DMI). Rapid advances in the field of pharmacomicrobiomics have been driven by the availability of efficient bacterial genome sequencing methods and new computational and bioinformatics tools. The success of fecal microbiota transplantation for recurrent Clostridioides difficile infection has fueled enthusiasm and research in the field. This review focuses on the pharmacomicrobiome in transplantation. Alterations in the microbiome in transplant recipients are well documented, largely because of prophylactic antibiotic use, and the potential for DMI is high. There is evidence that the gut microbiome may alter the pharmacokinetic disposition of tacrolimus and result in microbiome-specific tacrolimus metabolites. The gut microbiome also impacts the enterohepatic recirculation of mycophenolate, resulting in substantial changes in pharmacokinetic disposition and systemic exposure. The mechanisms of these DMI and the specific bacteria or bacterial communities involved are under investigation. There are few or no human DMI data for cyclosporine A, corticosteroids, and sirolimus. The available evidence in transplantation is limited and driven by small studies of heterogeneous designs. Larger clinical studies are needed, but the potential for future clinical application of the pharmacomicrobiome in avoiding poor outcomes is high.
Subjects
Gastrointestinal Microbiome , Immunosuppressive Agents , Organ Transplantation , Humans , Immunosuppressive Agents/pharmacokinetics , Immunosuppressive Agents/adverse effects , Gastrointestinal Microbiome/drug effects , Organ Transplantation/adverse effects , Graft Rejection/prevention & control , Graft Rejection/immunology , Graft Rejection/microbiology , Animals
ABSTRACT
Importance: Living kidney donors may have an increased risk of fractures due to reductions in kidney mass, lower concentrations of serum 1,25-dihydroxyvitamin D, and secondary increases in serum parathyroid hormone. Objective: To compare the overall and site-specific risk of fractures among living kidney donors with strictly matched controls from the general population who would have been eligible to donate a kidney but did not do so. Design, Setting, and Participants: This survey study was conducted between December 1, 2021, and July 31, 2023. A total of 5065 living kidney donors from 3 large transplant centers in Minnesota were invited to complete a survey about their bone health and history of fractures, and 16,156 population-based nondonor controls without a history of comorbidities that would have precluded kidney donation were identified from the Rochester Epidemiology Project and invited to complete the same survey. A total of 2132 living kidney donors and 2014 nondonor controls responded. Statistical analyses were performed from May to August 2023. Exposure: Living kidney donation. Main Outcomes and Measures: The rates of overall and site-specific fractures were compared between living kidney donors and controls using standardized incidence ratios (SIRs). Results: At the time of the survey, the 2132 living kidney donors had a mean (SD) age of 67.1 (8.9) years and included 1245 women (58.4%), and the 2014 controls had a mean (SD) age of 68.6 (7.9) years and included 1140 women (56.6%). The mean (SD) time between donation or index date and survey date was 24.2 (10.4) years for donors and 27.6 (10.7) years for controls. The overall rate of fractures among living kidney donors was significantly lower than among controls (SIR, 0.89; 95% CI, 0.81-0.97). However, there were significantly more vertebral fractures among living kidney donors than among controls (SIR, 1.42; 95% CI, 1.05-1.83).
Conclusions and Relevance: This survey study found a reduced rate of overall fractures but an excess of vertebral fractures among living kidney donors compared with controls after a mean follow-up of 25 years. Treatment with dietary supplements such as vitamin D3 may reduce the number of vertebral fractures and the associated patient morbidity.
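The standardized incidence ratio used above is simply observed over expected events; a common approximate confidence interval works on the log scale with SE(log SIR) of about 1/sqrt(observed). A minimal sketch (illustrative, not the study's exact method):

```python
from math import exp, sqrt

def sir_with_ci(observed, expected, z=1.96):
    """Standardized incidence ratio O/E with an approximate 95% CI computed
    on the log scale, using SE(log SIR) ~ 1/sqrt(observed)."""
    sir = observed / expected
    half_width = z / sqrt(observed)
    return sir, sir * exp(-half_width), sir * exp(half_width)

# e.g., an SIR of 0.89 arises when 89 events are observed where 100 were
# expected from matched-control rates (hypothetical counts for illustration)
sir, lower, upper = sir_with_ci(89, 100)
```

Note how the interval tightens as the observed count grows, which is why the overall-fracture CI (0.81-0.97) is narrower than the vertebral-fracture CI (1.05-1.83).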
Subjects
Fractures, Bone , Kidney Transplantation , Spinal Fractures , Humans , Female , Aged , Living Donors , Cholecalciferol
ABSTRACT
BACKGROUND: Early in the history of kidney transplantation, short-term graft survival was low, yet some recipients have had excellent long-term survival. Herein, we describe characteristics of pediatric recipients with > 40 years of graft survival who are currently alive with a functioning first graft. METHODS: We reviewed all pediatric (age < 18 years) kidney transplants performed at the University of Minnesota between January 1, 1970, and December 31, 1979 (n = 148), to identify all recipients currently alive with a functioning first graft. Data are presented as medians with interquartile ranges (IQR) and proportions. RESULTS: We identified 10 recipients with > 40-year graft survival (median follow-up: 45.0 years (IQR: 43.1, 48.1)). The median age at transplant was 13.8 years (IQR: 5.1, 16.3). All recipients were white; half were male. Of the 10, 4 had glomerulonephritis, 2 had congenital anomalies of the kidney and urinary tract, 2 had congenital nephrotic syndrome, 1 had Alport syndrome, and 1 had cystic kidney disease as the cause of kidney failure. Nine patients received a living-related donor transplant, and 1 patient received a deceased-donor transplant. The median estimated glomerular filtration rate was 79.9 ml/min/1.73 m2 (IQR: 72.3, 98.4) at 20 years post-transplant; 67.7 (IQR: 63.2, 91.8) at 30 years; and 80.3 (IQR: 73.7, 86.0) at 40 years. None developed rejection, 5 developed hypertension, 2 developed dyslipidemia, 1 developed diabetes, and 7 developed malignancy (4 skin cancer, 2 breast cancer, and 1 posttransplant lymphoproliferative disease). CONCLUSION: Pediatric kidney transplant recipients may achieve > 4 decades of graft survival. Cancer is a common complication warranting vigilant screening.
Subjects
Kidney Transplantation , Adolescent , Child , Female , Humans , Male , Graft Rejection/epidemiology , Graft Rejection/prevention & control , Graft Survival , Kidney , Kidney Transplantation/adverse effects , Living Donors , Retrospective Studies , Transplant Recipients , Treatment Outcome , Child, Preschool
ABSTRACT
Virtually all clinicians agree that living donor renal transplantation is the optimal treatment for permanent loss of kidney function. Yet living donor kidney transplantation has not grown in the United States for more than 2 decades. A virtual symposium gathered experts to examine this shortcoming and to stimulate and clarify issues salient to improving living donation. The ethical principles of rewarding kidney donors and the limits of altruism as the exclusive compelling stimulus for donation were emphasized. Estimates that donor incentives could save up to 40,000 lives annually and considerable taxpayer dollars were examined, and survey data confirmed voter support for donor compensation. Objections to rewarding donors were also presented. Living donor kidney exchanges and the limited number of deceased donor kidneys were reviewed. Discussants reached consensus that attempts to increase living donation should include removing artificial barriers in donor evaluation, expanding living donor chains, affirming the safety of live kidney donation, and ensuring that donors incur no expense. If the current legal and practice standards persist, living kidney donation will fail to achieve its true potential to save lives.
Subjects
Kidney Transplantation, Tissue and Organ Procurement, Humans, United States, Living Donors, Kidney, Surveys and Questionnaires
ABSTRACT
A difficult decision for patients in need of a kidney-pancreas transplant is whether to seek a living kidney donor or wait to receive both organs from one deceased donor. The framework of dynamic treatment regimes (DTRs) can inform this choice, but a patient-relevant strategy such as "wait for deceased-donor transplant" is ill-defined because there are multiple versions of treatment (i.e., wait times, organ qualities). Existing DTR methods average over the distribution of treatment versions in the data, estimating survival under a "representative intervention." This is undesirable when transporting inferences to a target population such as patients today, who experience shorter wait times thanks to evolutions in allocation policy. We therefore propose the concept of a generalized representative intervention (GRI): a random DTR that assigns treatment version by drawing from the distribution among strategy compliers in the target population (e.g., patients today). We describe an inverse-probability-weighted product-limit estimator of survival under a GRI that performs well in simulations and can be implemented in standard statistical software. For continuous treatments (e.g., organ quality), weights are reformulated to depend on probabilities only, not densities. We apply our method to a national database of kidney-pancreas transplant candidates from 2001 to 2020 to illustrate that variability in transplant rate across years and centers results in qualitative differences in the optimal strategy for patient survival.
Subjects
Kidney Transplantation, Pancreas Transplantation, Humans, Pancreas Transplantation/methods, Causality, Kidney
ABSTRACT
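The abstract above refers to an inverse-probability-weighted product-limit estimator of survival. As a rough illustration of that estimator family only (not the authors' implementation; the construction of weights from strategy-compliance probabilities in the GRI framework is omitted here), a minimal weighted Kaplan-Meier sketch might look like:

```python
import numpy as np

def ipw_product_limit(time, event, weight):
    """Weighted product-limit (Kaplan-Meier) survival estimate.

    time   : follow-up times
    event  : 1 = event observed, 0 = censored
    weight : inverse-probability weights (all 1.0 gives ordinary KM)
    Returns (event_times, survival) arrays.
    """
    time = np.asarray(time, dtype=float)
    event = np.asarray(event)
    weight = np.asarray(weight, dtype=float)

    event_times = np.unique(time[event == 1])
    surv, s = [], 1.0
    for t in event_times:
        at_risk = weight[time >= t].sum()                   # weighted risk set
        events = weight[(time == t) & (event == 1)].sum()   # weighted events at t
        s *= 1.0 - events / at_risk                         # product-limit step
        surv.append(s)
    return event_times, np.array(surv)
```

With unit weights this reduces to the standard Kaplan-Meier estimator; in the GRI setting the weights would instead come from a fitted model of the probability of receiving the observed treatment version under the strategy of interest.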
BACKGROUND: There is uncertainty about the long-term risks of living kidney donation. Well-designed studies with controls well-matched on risk factors for kidney disease are needed to understand the attributable risks of kidney donation. METHODS: The goal of the Minnesota Attributable Risk of Kidney Donation (MARKD) study is to compare the long-term (> 50 years) outcomes of living donors (LDs) to contemporary and geographically similar controls that are well-matched on health status. University of Minnesota (n = 4022; 1st transplant: 1963) and Mayo Clinic LDs (n = 3035; 1st transplant: 1963) will be matched to Rochester Epidemiology Project (REP) controls (approximately 4 controls to 1 donor) on the basis of age, sex, and race/ethnicity. The REP controls are a well-defined population, with detailed medical record data linked between all providers in Olmsted and surrounding counties, who come from the same geographic region and era (early 1960s to present) as the donors. Controls will be carefully selected to have health status acceptable for donation on the index date (the date their matched donor donated). Further refinement of the control group will include confirmed kidney health (e.g., normal serum creatinine and/or no proteinuria) and matching (on the index date) of body mass index, smoking history, family history of chronic kidney disease, and blood pressure. Outcomes will be ascertained from national registries (National Death Index and United States Renal Data System) and a new survey administered to both donors and controls; the data will be supplemented by prior surveys and medical record review of donors and REP controls. The outcomes to be compared are all-cause mortality, end-stage kidney disease, cardiovascular disease and mortality, estimated glomerular filtration rate (eGFR) trajectory and chronic kidney disease, pregnancy risks, and development of diseases that frequently lead to chronic kidney disease (e.g., hypertension, diabetes, and obesity).
We will additionally evaluate whether the risk of donation differs based on baseline characteristics. DISCUSSION: Our study will provide a comprehensive assessment of long-term living donor risk to inform candidate living donors and to guide the follow-up and care of current living donors.