1 - 20 of 187
1.
Transplant Proc ; 56(3): 505-510, 2024 Apr.
Article En | MEDLINE | ID: mdl-38448249

BACKGROUND: Postoperative delirium after organ transplantation can lead to increased length of hospital stay and mortality. Because pain is an important risk factor for delirium, perioperative analgesia with intrathecal morphine (ITM) may mitigate postoperative delirium development. We evaluated whether ITM reduces postoperative delirium incidence in living donor kidney transplant (LDKT) recipients. METHODS: Two hundred ninety-six patients who received LDKT between 2014 and 2018 at our hospital were retrospectively analyzed. Recipients who received preoperative ITM (ITM group) were compared with those who did not (control group). The primary outcome was postoperative delirium based on the Confusion Assessment Method for the Intensive Care Unit results during the first 4 postoperative days. RESULTS: Delirium occurred in 2.6% (4/154) and 7.0% (10/142) of the ITM and control groups, respectively. Multivariable analysis showed that age (odds ratio [OR]: 1.07, 95% CI: 1.01-1.14; P = .031), recent smoking (OR: 7.87, 95% CI: 1.43-43.31; P = .018), and preoperative psychotropic use (OR: 23.01, 95% CI: 3.22-164.66; P = .002) were risk factors, whereas ITM was a protective factor (OR: 0.23, 95% CI: 0.06-0.89; P = .033). CONCLUSIONS: Preoperative ITM showed an independent association with reduced post-LDKT delirium. Further studies and the development of regional analgesia for delirium prevention may enhance the postoperative recovery of transplant recipients.
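The unadjusted group rates reported above (delirium in 4/154 ITM recipients vs. 10/142 controls) imply a crude odds ratio that can be computed with a Wald 95% CI. The stdlib Python sketch below is purely illustrative and is separate from the paper's adjusted multivariable model:

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Crude odds ratio and Wald 95% CI from a 2x2 table:
    a = exposed with event, b = exposed without event,
    c = unexposed with event, d = unexposed without event."""
    or_ = (a * d) / (b * c)
    se_log_or = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    lo = math.exp(math.log(or_) - z * se_log_or)
    hi = math.exp(math.log(or_) + z * se_log_or)
    return or_, lo, hi

# ITM group: 4 delirium / 150 without; control group: 10 delirium / 132 without
or_, lo, hi = odds_ratio_ci(4, 150, 10, 132)  # crude OR = 0.352
```

Note the crude OR differs from the reported adjusted OR of 0.23, which comes from the multivariable model.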


Analgesics, Opioid; Delirium; Injections, Spinal; Kidney Transplantation; Living Donors; Morphine; Pain, Postoperative; Humans; Kidney Transplantation/adverse effects; Morphine/administration & dosage; Male; Female; Pain, Postoperative/prevention & control; Pain, Postoperative/etiology; Middle Aged; Retrospective Studies; Delirium/prevention & control; Delirium/etiology; Delirium/epidemiology; Analgesics, Opioid/administration & dosage; Adult; Risk Factors; Psychomotor Agitation/prevention & control; Psychomotor Agitation/etiology; Postoperative Complications/prevention & control; Preoperative Care
2.
Biochem Biophys Rep ; 38: 101658, 2024 Jul.
Article En | MEDLINE | ID: mdl-38362049

Islet transplantation is the most effective treatment strategy for type 1 diabetes. Long-term storage at ultralow temperatures can be used to prepare sufficient islets of good quality for transplantation. For freezing islets, dimethyl sulfoxide (DMSO) is a commonly used penetrating cryoprotective agent (CPA). However, the toxicity of DMSO is a major obstacle to cell cryopreservation. Hydroxyethyl starch (HES) has been proposed as an alternative CPA. To investigate the effects of two types of nonpermeating CPA, we compared 4% HES 130 and HES 200 to 10% DMSO in terms of mouse islet yield, viability, and glucose-stimulated insulin secretion (GSIS). After one day of culture, islets were cryopreserved in each solution. After three days of cryopreservation, islet recovery was significantly higher in the HES 130 and HES 200 groups than in the DMSO group. Islet viability in the HES 200 group was also significantly higher than that in the DMSO group on Day 1 and Day 3. Stimulation indices determined by GSIS were higher in the HES 130 and 200 groups than in the DMSO group on Day 3. After three days of cryopreservation, HES 130 and HES 200 both reduced the expression of apoptosis- and necrosis-associated proteins and promoted the survival of islets. In conclusion, the use of HES as a CPA improved the survival and insulin secretion of cryopreserved islets compared with the use of a conventional CPA.

3.
Transplant Proc ; 56(3): 686-691, 2024 Apr.
Article En | MEDLINE | ID: mdl-38378341

BACKGROUND: Xenotransplantation, particularly when involving pig donors, presents challenges related to the transmission of porcine cytomegalovirus (pCMV) and its potential impact on recipient outcomes. This study aimed to investigate the relationship between pCMV positivity in both donors and recipients and the survival time of cynomolgus monkey recipients after xenogeneic kidney transplantation. METHODS: We conducted 20 cynomolgus xenotransplants using 18 transgenic pigs. On the day of surgery, donor pig blood was sampled, and DNA was extracted from serum and peripheral blood mononuclear cells. Recipient DNA was extracted with the same protocol from pre-transplantation to post-transplantation. Porcine cytomegalovirus was detected by real-time polymerase chain reaction (real-time PCR) with the ViroReal kit, achieving a sensitivity of 50 copies/reaction. A Ct value of 37.0 was the pCMV positivity threshold. RESULTS: Among the 20 cynomolgus recipients, in 9 cases both the donor and the recipient tested negative for pCMV; in 4 cases, the recipient tested positive despite a pCMV-negative donor; and in all 5 cases with pCMV-positive donors, the recipients tested positive. Detection of donor pCMV correlated with shorter recipient survival. Continuous recipient positivity during observation correlated with shorter survival, whereas transient detection showed no significant change in survival rates. However, donor pig phenotypes and transplantation protocols did not significantly impact survival. CONCLUSION: The detection of pCMV in both donors and recipients plays a crucial role in xenotransplantation outcomes. These findings suggest the importance of monitoring and managing pCMV in xenotransplantation to enhance long-term outcomes.
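The positivity rule described in the METHODS (real-time PCR with a Ct threshold of 37.0) amounts to a simple cycle-threshold comparison. The helper below is only an illustrative sketch of that rule, not the study's analysis code:

```python
PCMV_CT_THRESHOLD = 37.0  # cycle-threshold cutoff stated in the abstract

def is_pcmv_positive(ct_value):
    """Call a real-time PCR result pCMV-positive when amplification
    crosses threshold at or before cycle 37.0; no amplification
    (ct_value is None) is treated as negative. Illustrative only."""
    if ct_value is None:
        return False
    return ct_value <= PCMV_CT_THRESHOLD
```

Lower Ct values indicate earlier amplification (more template), which is why the rule is "at or below" the threshold.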


Cytomegalovirus Infections; Cytomegalovirus; Kidney Transplantation; Macaca fascicularis; Transplantation, Heterologous; Animals; Transplantation, Heterologous/adverse effects; Swine; Cytomegalovirus/genetics; Cytomegalovirus Infections/mortality; Cytomegalovirus Infections/virology; Kidney Transplantation/adverse effects; Graft Survival; Tissue Donors; Animals, Genetically Modified
4.
Transplant Proc ; 56(3): 705-711, 2024 Apr.
Article En | MEDLINE | ID: mdl-38395660

BACKGROUND: Although non-human primates are the animal models closest to humans for simulating physiological and metabolic responses, there is a paucity of standardized, reproducible primate hemorrhagic shock models. Herein, we describe a model that is a clinical replica of extreme class IV hemorrhagic shock, with a step-by-step description of the procedure in cynomolgus macaques. METHODS: The physiological changes that occurred during the process were evaluated using hemodynamic parameters, echocardiography, and laboratory values. Five female monkeys were subjected to trauma laparotomy, followed by cannulation of the abdominal aorta to achieve graded hemorrhage. A central line was placed in the right internal jugular vein, which was subsequently used for laboratory sampling and volume resuscitation. The withdrawal of blood was stopped when a predefined cardiac endpoint of cardiac arrhythmia or bradycardia was reached. The animals were then immediately resuscitated with transfusion. The primary cardiac endpoint was consistently reached in all 5 animals during the fourth hemorrhage, when more than 70% of the estimated total blood volume was lost. RESULTS: No mortality occurred during the process. The blood pressure, cardiac output measured by echocardiography, and hemoglobin correlated well with increasing loss of circulating volume, whereas the pulse pressure variation did not. Echocardiography was also a useful predictor of the need for urgent volume replacement. CONCLUSION: This model offers a safe and reproducible surgical hemorrhagic shock model in non-human primates that simulates clinical practice. It could provide a useful platform for further studies to address unanswered questions in trauma management.


Disease Models, Animal; Hemodynamics; Macaca fascicularis; Shock, Hemorrhagic; Animals; Shock, Hemorrhagic/physiopathology; Shock, Hemorrhagic/therapy; Female; Reproducibility of Results; Blood Pressure; Resuscitation/methods; Echocardiography
5.
Sci Rep ; 14(1): 2002, 2024 01 23.
Article En | MEDLINE | ID: mdl-38263253

Cardiovascular disease remains a leading cause of morbidity and mortality after kidney transplantation (KT). Although statins reduce cardiovascular risk and have renal benefits in the general population, their effects on KT recipients are not well-established. We studied the effects of early statin use (within 1 year post-transplantation) on long-term outcomes in 714 KT recipients from the KoreaN Cohort Study for Outcome in Patients With Kidney Transplantation. Compared with the control group, statin group recipients were significantly older, had a higher body mass index, and had a higher prevalence of diabetes mellitus. During a median follow-up of 85 months, 74 graft losses occurred (54 death-censored graft losses and 20 deaths). Early statin use was independently associated with lower mortality (hazard ratio, 0.280; 95% confidence interval 0.111-0.703) and lower death-censored graft loss (hazard ratio, 0.350; 95% confidence interval 0.198-0.616). Statin therapy significantly reduced low-density lipoprotein cholesterol levels but did not decrease the risk of major adverse cardiovascular events. Biopsy-proven rejection and graft renal function were not significantly different between the statin and control groups. Our findings suggest that early statin use is an effective strategy for reducing low-density lipoprotein cholesterol and improving patient and graft survival after KT.
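The patient- and graft-survival outcomes in this record are right-censored time-to-event data of the kind summarized by Kaplan-Meier curves and Cox models. A minimal, illustrative Kaplan-Meier estimator in stdlib Python (not the study's code; tie-handling follows the usual events-before-censorings convention):

```python
def kaplan_meier(times, events):
    """Kaplan-Meier survival steps for right-censored follow-up data.
    times: follow-up durations; events: True = event (e.g. graft loss),
    False = censored. Returns a list of (time, survival) step points."""
    # Sort by time, processing events before censorings at tied times.
    order = sorted(zip(times, events), key=lambda te: (te[0], not te[1]))
    at_risk = len(order)
    surv = 1.0
    curve = []
    for t, happened in order:
        if happened:
            surv *= (at_risk - 1) / at_risk  # multiply by P(survive past t)
            curve.append((t, surv))
        at_risk -= 1  # both events and censorings leave the risk set
    return curve
```

For example, `kaplan_meier([1, 2, 3, 4], [True, False, True, False])` steps down at times 1 and 3 only, with the censoring at time 2 shrinking the risk set before the second event.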


Hydroxymethylglutaryl-CoA Reductase Inhibitors; Kidney Transplantation; Humans; Cohort Studies; Kidney; Cholesterol, LDL
6.
Xenotransplantation ; 31(1): e12838, 2024.
Article En | MEDLINE | ID: mdl-38112053

BACKGROUND: αGal-deficient xenografts are protected from hyperacute rejection during xenotransplantation but are still rejected more rapidly than allografts. Despite studies showing the roles of non-Gal antibodies and αβ T cells in xenograft rejection, the involvement of γδ T cells has received little investigation. METHODS: Six male cynomolgus monkeys were transplanted with porcine vessel xenografts from wild-type (n = 3) or GGTA1-knockout (n = 3) pigs. We measured the proportions and T cell receptor (TCR) repertoires of blood γδ T cells before and after xenotransplantation. Grafted porcine vessel-infiltrating immune cells were visualized at the end of the experiments. RESULTS: Blood γδ T cells expanded and infiltrated the graft vessel adventitia following xenotransplantation of αGal-deficient pig blood vessels. Pre- and post-transplant analysis of the γδ TCR repertoire revealed a shift in δ chain usage after transplantation, with the expansion of several clonotypes of δ1, δ3, or δ7 chains. Furthermore, the distinctions between pre- and post-transplant δ chain usage were more prominent than those observed for γ chain usage. CONCLUSION: The γδ TCR repertoire was significantly altered by xenotransplantation, suggesting a role for γδ T cells in sustained xenoreactive immune responses.


Primates; T-Lymphocyte Subsets; Animals; Male; Heterografts; Receptors, Antigen, T-Cell; Swine; Transplantation, Heterologous; Macaca fascicularis
7.
Sci Rep ; 13(1): 22387, 2023 12 16.
Article En | MEDLINE | ID: mdl-38104210

Protocol biopsy is a reliable method for assessing allograft status after kidney transplantation (KT). However, due to the risk of complications, it is necessary to establish indications and selectively perform protocol biopsies by identifying the group at high risk of early subclinical rejection (SCR). Therefore, the purpose of this study was to analyze the incidence and risk factors of early SCR (within 2 weeks) and develop a prediction model using machine learning. Patients who underwent KT at Samsung Medical Center from January 2005 to December 2020 were investigated. The incidence of SCR was determined and risk factors were analyzed. For the development of the prediction model, machine learning methods (random forest, elastic net, and extreme gradient boosting [XGB]) and logistic regression were used, and performance was compared between the models. A cohort of 987 patients was reviewed and analyzed. The incidence of SCR was 14.6%. Borderline cellular rejection (BCR) was the most common type of rejection, accounting for 61.8% of cases. In the multivariate analysis of risk factors, recipient age (OR 0.98, p = 0.03), donor BMI (OR 1.07, p = 0.02), ABO incompatibility (OR 0.15, p < 0.001), HLA II mismatch (two mismatches: OR 6.44, p < 0.001), and ATG induction (OR 0.41, p < 0.001) were associated with SCR. The logistic regression prediction model (average AUC = 0.717) and the elastic net model (average AUC = 0.712) demonstrated good performance. HLA II mismatch and induction type were consistently identified as important variables in all models. The odds ratio analysis of the logistic prediction model revealed that HLA II mismatch (OR 6.77) was a risk factor for SCR, whereas ATG induction (OR 0.37) was a protective factor. Early SCR was associated with HLA II mismatch and induction agent, and the machine learning prediction models demonstrate the potential to predict SCR.
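The AUC values quoted for the prediction models measure how often a randomly chosen SCR case is scored above a randomly chosen non-case. A small illustrative implementation of that rank-based definition (not the study's code):

```python
def roc_auc(labels, scores):
    """AUC as the probability that a random positive is scored above a
    random negative (ties count half) -- the Mann-Whitney formulation."""
    pos = [s for y, s in zip(labels, scores) if y == 1]
    neg = [s for y, s in zip(labels, scores) if y == 0]
    wins = sum(1.0 if p > n else 0.5 if p == n else 0.0
               for p in pos for n in neg)
    return wins / (len(pos) * len(neg))
```

An AUC of 0.717, as reported for the logistic model, means a true SCR case outranks a non-case about 72% of the time.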


Kidney Transplantation; Humans; Kidney Transplantation/adverse effects; Graft Rejection/etiology; Risk Factors; Blood Group Incompatibility; Machine Learning; Retrospective Studies
8.
Sci Rep ; 13(1): 19640, 2023 11 10.
Article En | MEDLINE | ID: mdl-37949967

Vitamin D3 (25[OH]D3) insufficiency and fibroblast growth factor 23 (FGF23) elevation are usually attenuated after kidney transplantation (KT). However, elevated FGF23 may be associated with poor graft outcomes and vitamin D insufficiency after KT. This study investigated the effect of pretransplant FGF23 levels on post-KT 25(OH)D3 status and graft outcomes. Serum FGF23 levels from 400 participants of the KoreaN Cohort Study for Outcome in Patients With Kidney Transplantation were measured. Annual serum 25(OH)D3 levels, all-cause mortality, cardiovascular events, and graft survival were assessed according to baseline FGF23 levels. Serum 25(OH)D3 levels initially increased 1 year after KT (12.6 ± 7.4 vs. 22.6 ± 6.4 ng/mL). However, the prevalence of vitamin D deficiency increased again after 3 years post-KT (79.1% at baseline, 30.8% and 37.8% at 3 and 6 years, respectively). Serum FGF23 levels had decreased by 3 years post-KT. When participants were categorized into tertiles according to baseline FGF23 level (low, middle, high), the 25(OH)D3 level in the low FGF23 group was persistently low over a median follow-up of 8.3 years. Furthermore, a high baseline FGF23 level was a risk factor for poor graft survival (HR 5.882, 95% CI 1.443-23.976, P = 0.013). Elevated FGF23 levels are associated with persistently low post-transplant vitamin D levels and poor graft survival.


Kidney Transplantation; Vitamin D Deficiency; Humans; Cohort Studies; Fibroblast Growth Factors; Graft Survival; Vitamin D; Vitamins
9.
Article En | MEDLINE | ID: mdl-37919893

Background: Immunosenescence gradually deteriorates the function of the immune system, making elderly patients susceptible to infection while reducing rejection of organ transplants. Therefore, age-adaptive immunosuppression is necessary in the elderly. We evaluated clinical outcomes such as rejection and infection rates when using basiliximab and rabbit anti-thymocyte globulin (r-ATG) as induction agents in elderly and young organ transplant recipients. Methods: We retrospectively reviewed patients who underwent kidney transplantation (KT) between June 2011 and April 2019. We enrolled 704 adult KT patients and classified them into groups according to age. We compared the outcomes of infection and biopsy-proven acute rejection (BPAR) according to the type of induction agent (basiliximab or r-ATG [4.5 mg/kg]). Results: The cohort included 520 recipients (74.6%) in the younger group and 179 recipients (25.4%) in the older group. When r-ATG was used as an induction agent, BPAR within 6 months occurred less frequently (p = 0.03); however, infections within 6 months were more common in older recipients. Deaths due to infection were also more common in older recipients (p = 0.003). Conclusion: It may be necessary to use less intensive induction therapy in older recipients, one option being dose reduction of r-ATG.

10.
J Nephrol ; 36(7): 2091-2109, 2023 09.
Article En | MEDLINE | ID: mdl-37751127

BACKGROUND: The impact of circulating sclerostin levels on vascular calcification has shown conflicting results depending on the target population and vascular anatomy. This study investigated the associations of sclerostin levels with vascular outcomes in kidney transplant patients. METHODS: In a prospective observational study of the Korean Cohort Study for Outcome in Patients with Kidney Transplantation, 591 patients with serum sclerostin level data prior to transplantation were analyzed. The main predictor was the pre-transplant sclerostin level. Vascular outcomes were the abdominal aortic calcification score and brachial-ankle pulse wave velocity measured at pre-transplant screening and three and five years after kidney transplantation. RESULTS: In linear regression analysis, sclerostin level positively correlated with changes in abdominal aortic calcification score between baseline and five years after kidney transplantation (coefficient of 0.73 [95% CI, 0.11-1.35] and 0.74 [95% CI, 0.06-1.42] for second and third tertiles, respectively, vs the first tertile). In a longitudinal analysis over five years, using generalized estimating equations, the coefficient of the interaction (sclerostin × time) was significant with a positive value, indicating that higher sclerostin levels were associated with faster increase in post-transplant abdominal aortic calcification score. Linear regression analysis revealed a positive association between pre-transplant sclerostin levels and changes in brachial-ankle pulse wave velocity (coefficient of 126.7 [95% CI, 35.6-217.8], third vs first tertile). Moreover, a significant interaction was identified between sclerostin levels and brachial-ankle pulse wave velocity at five years. CONCLUSIONS: Elevated pre-transplant sclerostin levels are associated with the progression of post-transplant aortic calcifications and arterial stiffness.


Kidney Transplantation; Vascular Calcification; Vascular Stiffness; Humans; Cohort Studies; Ankle Brachial Index; Kidney Transplantation/adverse effects; Genetic Markers; Pulse Wave Analysis/methods
11.
BMC Anesthesiol ; 23(1): 263, 2023 08 05.
Article En | MEDLINE | ID: mdl-37543574

BACKGROUND: International guidelines recommend preemptive kidney transplantation (KT), i.e., transplantation before the initiation of dialysis, as the preferred approach. This approach is advantageous for graft and patient survival because it avoids dialysis-related complications. However, recipients of preemptive KT may undergo anesthesia without the opportunity to optimize volume status or correct the metabolic disturbances associated with end-stage renal disease. In this regard, we aimed to investigate the anesthetic events that occur more frequently during preemptive KT than during nonpreemptive KT. METHODS: This is a single-center retrospective study. Of the 672 patients who underwent living donor KT (LDKT), 388 of the 519 nonpreemptive KT recipients were matched to all 153 preemptive KT recipients using a propensity score based on preoperative covariates. The primary outcome was intraoperative hypotension, defined as the area under the threshold (AUT) with the threshold set at a mean arterial blood pressure below 70 mmHg. The secondary outcomes were intraoperative metabolic acidosis estimated by base excess and serum bicarbonate, electrolyte imbalance, the use of inotropes or vasopressors, intraoperative transfusion, immediate graft function evaluated by the nadir creatinine, and re-operation due to bleeding. RESULTS: After propensity score matching, we analyzed 388 and 153 patients in the nonpreemptive and preemptive groups, respectively. The multivariable analysis revealed the AUT of the preemptive group to be significantly greater than that of the nonpreemptive group (mean ± standard deviation, 29.7 ± 61.5 and 14.5 ± 37.7, respectively; P = 0.007). Metabolic acidosis was more severe in the preemptive group than in the nonpreemptive group. The differences in nadir creatinine values and times to nadir creatinine were statistically significant but clinically insignificant. CONCLUSION: Intraoperative hypotension and metabolic acidosis occurred more frequently in the preemptive group during LDKT. These findings highlight the need for anesthesiologists to be prepared and vigilant in managing these events during surgery.
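Propensity score matching of the kind described (matching nonpreemptive controls to preemptive recipients on preoperative covariates) is often implemented as greedy nearest-neighbour search within a caliper. The sketch below is a simplified illustration with hypothetical IDs and parameters, not the study's matching procedure:

```python
def greedy_match(treated, controls, caliper=0.05, k=2):
    """Greedy nearest-neighbour propensity-score matching without
    replacement: for each treated subject, take up to k controls whose
    scores fall within the caliper. treated/controls map id -> score."""
    pool = dict(controls)
    matches = {}
    for t_id, t_score in sorted(treated.items(), key=lambda kv: kv[1]):
        picked = []
        for _ in range(k):
            if not pool:
                break
            c_id = min(pool, key=lambda c: abs(pool[c] - t_score))
            if abs(pool[c_id] - t_score) > caliper:
                break  # nearest remaining control is outside the caliper
            picked.append(c_id)
            del pool[c_id]
        matches[t_id] = picked
    return matches
```

Matching without replacement explains why only a subset of controls (here, 388 of 519) ends up in the analysis set.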


Anesthesia; Kidney Failure, Chronic; Kidney Transplantation; Humans; Retrospective Studies; Creatinine; Propensity Score; Graft Survival; Living Donors; Kidney Failure, Chronic/surgery; Anesthesia/adverse effects
12.
Xenotransplantation ; 30(5): e12814, 2023.
Article En | MEDLINE | ID: mdl-37493436

Xenotransplantation using pig livers offers a potential alternative to overcome the worldwide donor shortage or, more importantly, to serve as a bridge to allotransplantation. However, it has been challenged by profound thrombocytopenia and fatal coagulopathy in non-human primate models. Here we suggest that a left auxiliary technique can be a useful method to achieve extended survival of the xenograft. Fifteen consecutive liver xenotransplants were carried out in a pig-to-cynomolgus model. A right auxiliary technique was implemented in two cases, orthotopic transplantation in eight cases, and a left auxiliary technique in five cases. None of the right auxiliary recipients survived surgery, owing to hemorrhage during the complex dissection between the primate's right lobe and the inferior vena cava. Orthotopic recipients survived less than 7 days secondary to profound thrombocytopenia and coagulopathy. Two of the five left auxiliary xenotransplants survived more than 3 weeks without uncontrolled thrombocytopenia or anemia, with one of them surviving 34 days, the longest graft survival reported to date. Left auxiliary xenotransplantation is a feasible approach in non-human primate experiments, and the feared risks of thrombocytopenia and coagulopathy can be minimized. This may allow for longer evaluation of the xenograft and help better understand the histopathological and immunological changes that occur following liver xenotransplantation.


Blood Coagulation Disorders; Liver Transplantation; Thrombocytopenia; Animals; Humans; Swine; Transplantation, Heterologous/methods; Liver Transplantation/methods; Graft Rejection; Animals, Genetically Modified; Primates; Liver/surgery; Thrombocytopenia/surgery; Macaca fascicularis
13.
Front Surg ; 10: 1209698, 2023.
Article En | MEDLINE | ID: mdl-37377670

Background: A high rate of locoregional recurrence is one of the major difficulties in the successful treatment of retroperitoneal sarcoma (RPS). Although pre-operative radiation therapy (RT) is considered a potential way to reduce local recurrence, concerns about the associated treatment toxicity and risk of peri-operative complications need to be addressed. Hence, this study investigates the safety of pre-operative RT (preRTx) for RPS. Methods: A cohort of 198 patients with RPS who had undergone both surgery and RT was analyzed for peri-operative complications. They were divided into three groups according to the RT scheme: (1) preRTx group, (2) post-operative RT without tissue expander, and (3) post-operative RT with tissue expander. Results: The preRTx was overall well tolerated and did not affect the R2 resection rate, operative time, or rate of severe post-operative complications. However, the preRTx group had a higher incidence of post-operative transfusion and admission to the intensive care unit (p = 0.013 and p = 0.036, respectively), although preRTx was an independent risk factor only for post-operative transfusion (p = 0.009) in the multivariate analysis. The median radiation dose was highest in the preRTx group, although no significant difference was demonstrated in overall survival or local recurrence rate. Conclusion: This study suggests that preRTx does not add significant post-operative morbidity for patients with RPS. In addition, radiation dose elevation is achievable with pre-operative RT. However, meticulous intra-operative bleeding control is recommended in these patients, and further high-quality trials are warranted to evaluate the long-term oncological outcomes.

15.
Front Immunol ; 14: 1190576, 2023.
Article En | MEDLINE | ID: mdl-37228607

Introduction: Acute rejection (AR) continues to be a significant obstacle to short- and long-term graft survival in kidney transplant recipients. Herein, we aimed to examine urinary exosomal microRNAs with the objective of identifying novel biomarkers of AR. Materials and methods: Candidate microRNAs were selected using NanoString-based urinary exosomal microRNA profiling, meta-analysis of web-based public microRNA databases, and a literature review. The expression levels of these selected microRNAs were measured in the urinary exosomes of 108 recipients of the discovery cohort using quantitative real-time polymerase chain reaction (qPCR). Based on the differential microRNA expression, AR signatures were generated, and their diagnostic power was determined by assessing the urinary exosomes of 260 recipients in an independent validation cohort. Results: We identified 29 urinary exosomal microRNAs as candidate biomarkers of AR, of which 7 were differentially expressed in recipients with AR, as confirmed by qPCR analysis. A three-microRNA AR signature, composed of hsa-miR-21-5p, hsa-miR-31-5p, and hsa-miR-4532, discriminated recipients with AR from those maintaining stable graft function (area under the curve [AUC] = 0.85). This signature exhibited fair discriminative power for the identification of AR in the validation cohort (AUC = 0.77). Conclusion: We have demonstrated that urinary exosomal microRNA signatures may form potential biomarkers for the diagnosis of AR in kidney transplant recipients.


Kidney Transplantation; MicroRNAs; Humans; Kidney Transplantation/adverse effects; MicroRNAs/genetics; Biomarkers; Real-Time Polymerase Chain Reaction
16.
Transplant Proc ; 55(4): 769-776, 2023 May.
Article En | MEDLINE | ID: mdl-37062613

Subclinical rejection (SCR) is associated with chronic allograft nephropathy. Therefore, early detection and treatment of SCR through a protocol biopsy (PB) may reduce the incidence of pathologic changes. This study evaluates the impact of early detection and treatment of SCR using a routine PB 2 weeks after kidney transplantation (KT) by examining histologic outcomes 1 year later. We reviewed 624 KT recipients at the Samsung Medical Center between August 2012 and December 2018. Protocol biopsies were planned 2 weeks and 1 year after transplantation, and we compared the histologic changes between the 2 biopsies. After a propensity score matching analysis, we divided the patients into 2 groups according to the PB taken 2 weeks post-transplant: the biopsy-proven normal group (n = 256) and the rejection group (n = 96). The rejection group showed no significant difference from the normal group in the trajectory of graft function or in the Kaplan-Meier curve for graft survival. In the histologic outcomes, the pathologic differences between the groups improved significantly between the 2 time points. Treating SCR identified by a PB 2 weeks after KT can contribute to the maintenance of graft function and improve histologic changes 1 year after KT.


Glomerulosclerosis, Focal Segmental; Kidney Transplantation; Humans; Kidney Transplantation/adverse effects; Graft Rejection/epidemiology; Biopsy; Graft Survival; Glomerulosclerosis, Focal Segmental/pathology; Kidney/pathology
17.
J Cardiovasc Imaging ; 31(2): 98-104, 2023 Apr.
Article En | MEDLINE | ID: mdl-37096675

BACKGROUND: We aimed to investigate left ventricular (LV) global longitudinal strain (GLS) in end-stage renal disease patients and its change after kidney transplantation (KT). METHODS: We retrospectively reviewed patients who underwent KT between 2007 and 2018 at two tertiary centers. We analyzed 488 patients (median age, 53 years; 58% male) who had undergone echocardiography both before and within 3 years after KT. Conventional echocardiography and LV GLS assessed by two-dimensional speckle-tracking echocardiography were comprehensively analyzed. Patients were classified into three groups according to the absolute value of pre-KT LV GLS (|LV GLS|). We compared longitudinal changes in cardiac structure and function according to pre-KT |LV GLS|. RESULTS: The correlation between pre-KT LV EF and |LV GLS| was statistically significant but weak (r = 0.292, p < 0.001). |LV GLS| was widely distributed at any given LV EF, especially when the LV EF was > 50%. Patients with severely impaired pre-KT |LV GLS| had a significantly larger LV dimension, LV mass index, left atrial volume index, and E/e', and a lower LV EF, compared to those with mildly or moderately reduced pre-KT |LV GLS|. After KT, the LV EF, LV mass index, and |LV GLS| improved significantly in all three groups. Patients with severely impaired pre-KT |LV GLS| showed the most prominent improvement in LV EF and |LV GLS| after KT. CONCLUSIONS: Improvements in LV structure and function after KT were observed in patients throughout the full spectrum of pre-KT |LV GLS|.

18.
Clin Microbiol Infect ; 29(7): 911-917, 2023 Jul.
Article En | MEDLINE | ID: mdl-36868356

OBJECTIVES: Kidney transplant (KT) recipients have an increased risk of herpes zoster (HZ) and its complications. Although the recombinant zoster vaccine is favoured over zoster vaccine live (ZVL), ZVL is also recommended to prevent HZ in KT candidates. We aimed to evaluate the clinical effectiveness of ZVL in KT recipients immunized before transplantation. METHODS: Adult patients who received a kidney transplant from January 2014 to December 2018 were enrolled. Patients were observed until HZ occurrence, death, loss of allograft, loss to follow-up, or 5 years after transplantation. An inverse probability of treatment-weighted Cox proportional hazards model was used to compare the incidence of HZ after transplantation between vaccinated and unvaccinated patients. RESULTS: A total of 84 vaccinated and 340 unvaccinated patients were included. The median age was higher in the vaccinated group (57 vs. 54 years, p = 0.003). Grafts from deceased donors were more frequently transplanted in the unvaccinated group (16.7% vs. 51.8%, p < 0.001). The 5-year cumulative HZ incidence was 11.9%, which translated to 26.27 (95% CI, 19.33-34.95) cases per 1000 person-years. The incidence in the vaccinated and unvaccinated groups was 3.9% and 13.7%, respectively. After adjustment, vaccination showed significant protective effectiveness against HZ (adjusted hazard ratio, 0.18; 95% CI, 0.05-0.60). In addition, all four cases of disseminated zoster occurred in the unvaccinated group. DISCUSSION: Our study, the first on the clinical effectiveness of zoster vaccines in KT recipients, suggests that ZVL administered before transplantation effectively prevents HZ.


Herpes Zoster Vaccine; Herpes Zoster; Kidney Transplantation; Adult; Humans; Herpes Zoster Vaccine/adverse effects; Cohort Studies; Retrospective Studies; Kidney Transplantation/adverse effects; Herpes Zoster/epidemiology; Herpes Zoster/prevention & control; Herpesvirus 3, Human; Vaccination; Treatment Outcome
19.
Transplant Proc ; 55(4): 756-768, 2023 May.
Article En | MEDLINE | ID: mdl-36990887

Many studies have reported that protocol biopsy (PB) may help preserve kidney function in kidney transplant recipients. Early detection and treatment of subclinical rejection may reduce the incidence of chronic antibody-mediated rejection and graft failure. However, no consensus has been reached regarding PB effectiveness, timing, and policy. This study aimed to evaluate the protective role of routine PB performed 2 weeks and 1 year after kidney transplantation. We reviewed 854 kidney transplant recipients at the Samsung Medical Center between July 2007 and August 2017, with PBs planned at 2 weeks and 1 year after transplantation. We compared the trends in graft function, chronic kidney disease (CKD) progression, new-onset CKD, infection, and patient and graft survival between the 504 patients who underwent PB and the 350 who did not. The PB group was further divided into the single PB group (n = 207) and the double PB group (n = 297). The PB group differed significantly from the no-PB group in the trends in graft function (estimated glomerular filtration rate). The Kaplan-Meier curves showed that PB did not significantly improve graft or overall patient survival. However, in the multivariate Cox analysis, the double PB group had advantages in graft survival, CKD progression, and new-onset CKD. PB can play a protective role in the maintenance of kidney grafts in kidney transplant recipients.


Kidney Transplantation; Renal Insufficiency, Chronic; Humans; Kidney Transplantation/methods; Graft Rejection/epidemiology; Kidney; Biopsy; Graft Survival; Renal Insufficiency, Chronic/diagnosis; Renal Insufficiency, Chronic/surgery; Renal Insufficiency, Chronic/pathology; Allografts; Retrospective Studies; Review Literature as Topic
20.
Ultrasonography ; 42(2): 238-248, 2023 Apr.
Article En | MEDLINE | ID: mdl-36935601

PURPOSE: This study evaluated the role of donor kidney ultrasonography (US) in predicting functional kidney volume and identifying ideal kidney grafts in deceased donor kidney transplantation. METHODS: In total, 272 patients who underwent deceased donor kidney transplantation from 2000 to 2020 at Samsung Medical Center were enrolled. Donor kidney information (i.e., right or left) was provided to the radiologist who performed the US image re-analysis. To binarize each kidney's ultrasound parameters, an optimal cutoff value for an estimated glomerular filtration rate (eGFR) of less than 30 mL/min/1.73 m2 within 1 year after kidney transplantation was selected using the receiver operating characteristic curve, requiring a specificity >60%. Cox regression analysis was performed for an eGFR less than 30 mL/min/1.73 m2 within 1 year after kidney transplantation and for graft failure within 2 years after kidney transplantation. RESULTS: The product of renal length and cortical thickness was a statistically significant predictor of graft function. The odds ratio for an eGFR less than 30 mL/min/1.73 m2 within a year after kidney transplantation and the hazard ratio for graft failure within 2 years after kidney transplantation were 5.91 (P=0.003) and 5.76 (P=0.022), respectively. CONCLUSION: Preoperative US of the donor kidney can be used to evaluate donor kidney function and can predict short-term graft survival. An imaging modality such as US should be included in the donor selection criteria as an additional recommendation. However, the purpose of this study was not to narrow the expanded criteria but to avoid catastrophic consequences by identifying ideal donor kidneys using preoperative US.
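Selecting a cutoff from the ROC curve under a specificity constraint, as described in the METHODS, can be illustrated by scanning candidate cutoffs and keeping the most sensitive one whose specificity exceeds the floor. The values and the "low measurement predicts poor outcome" direction below are hypothetical, not the study's data:

```python
def pick_cutoff(values, outcomes, min_specificity=0.60):
    """Scan candidate cutoffs (flagging a kidney when its US measurement
    falls below the cutoff) and return (cutoff, sensitivity, specificity)
    for the most sensitive cutoff whose specificity exceeds the floor.
    outcomes: True = poor graft outcome (e.g. eGFR < 30)."""
    best = None
    for cut in sorted(set(values)):
        flagged = [v < cut for v in values]
        tp = sum(f and o for f, o in zip(flagged, outcomes))
        fn = sum(not f and o for f, o in zip(flagged, outcomes))
        tn = sum(not f and not o for f, o in zip(flagged, outcomes))
        fp = sum(f and not o for f, o in zip(flagged, outcomes))
        sens = tp / (tp + fn) if (tp + fn) else 0.0
        spec = tn / (tn + fp) if (tn + fp) else 0.0
        if spec > min_specificity and (best is None or sens > best[1]):
            best = (cut, sens, spec)
    return best
```

The specificity floor trades some sensitivity for fewer false alarms, which matters when a "low" flag could exclude an otherwise usable donor kidney.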

...