Results 1 - 20 of 1,799
1.
Medicina (Kaunas) ; 60(6)2024 Jun 20.
Article in English | MEDLINE | ID: mdl-38929633

ABSTRACT

Background and Objectives: Endobiogeny is a global systems approach to human biology based on the concept that the endocrine system manages metabolism. Biology of function (BoF) indices are diagnostic tools in endobiogenic medicine that reflect the action of the endocrine system on the cells and the metabolic activity of an organism. Kidney transplant recipients are a very specific patient population due to their constant use of immunosuppressive agents such as steroids and their history of chronic kidney disease. The aim of this study was to assess the tendencies of endobiogenic BoF indices in a kidney transplant recipient population and to determine the relationship between BoF index values and histology-proven kidney transplant rejection. Materials and Methods: A total of 117 kidney transplant recipients undergoing surveillance or indication allograft biopsy were included in this study. Endobiogenic BoF indices were calculated from complete blood count tests taken before the kidney biopsy. Histology samples were evaluated by an experienced pathologist according to the Banff classification system. Clinical and follow-up data were collected from an electronic patient medical record system. Results: Overall, <35% of the patients had BoF index values assumed to be normal according to general population data. Additionally, >50% of the patients had lower-than-normal adaptation, leucocyte mobilization, genital, and adjusted genital ratio indices, while the Cata-Ana, genito-thyroid ratio, adrenal gland, and cortisol indices were increased in >50% of the transplant recipients. The adaptation index was significantly higher in patients with biopsy-proven transplant rejection and demonstrated an AUC of 0.649 (95% CI 0.540-0.759) for discriminating rejectors from patients without transplant rejection. Conclusions: Most of the kidney transplant recipients had abnormal BoF index values, reflecting increased corticotropic effects on their cells.
The adaptation index distinguished patients with biopsy-proven transplant rejection from those without it.


Subject(s)
Kidney Transplantation , Humans , Kidney Transplantation/adverse effects , Male , Female , Middle Aged , Adult , Cohort Studies , Transplant Recipients/statistics & numerical data , Graft Rejection/physiopathology , Aged , Biopsy/methods
2.
Int J Artif Organs ; 47(3): 173-180, 2024 Mar.
Article in English | MEDLINE | ID: mdl-38372215

ABSTRACT

AIM: Use of microaxial mechanical circulatory support (MCS) has been reported for severe graft rejection or dysfunction after heart transplantation (HTx). We aimed to assess utilization patterns of microaxial MCS after HTx in adolescents (ages 18 and younger) and adults (ages 19 and older). METHODS: An electronic search was performed to identify all relevant studies on post-HTx use of microaxial support in adults and adolescents. A total of 18 studies were selected, and patient-level data were extracted for statistical analysis. RESULTS: All patients (n=23), including adults (n=15) and adolescents (n=8), underwent Impella (Abiomed, Danvers, MA) microaxial MCS after HTx. Median age was 36 [IQR 18-56] years (adults, 52 [37-59]; adolescents, 16 [15-17]). Primary right ventricular graft dysfunction was an indication seen exclusively in adults (40%; 6/15), while acute graft rejection was present in 46.7% (7/15) of adults. Median time after transplant was 9 [0-32] months (adults, 4 [0-32]; adolescents, 11 [4.5-45]). Duration of Impella support was comparable between adults and adolescents (5 [2.5-8] vs 6 [5-8] days, p = 0.38). Overall improvement was observed both in median LV ejection fraction (23.5% [11.3-28] to 42% [37.8-47.3], p < 0.01) and cardiac index (1.8 [1.2-2.6] to 3 [2.5-3.1], p < 0.01). Retransplantation was required in four adolescents (50%, 4/8). Survival to discharge was achieved by 60.0% (9/15) of adults and 87.5% (7/8) of adolescents, respectively (p = 0.37). CONCLUSION: Indications for microaxial MCS appear to vary between adult and adolescent patients. Overall improvement in LVEF and cardiac index was observed; however, survival to discharge was suboptimal.


Subject(s)
Heart Transplantation , Heart-Assist Devices , Humans , Heart Transplantation/adverse effects , Adult , Adolescent , Male , Female , Middle Aged , Young Adult , Graft Rejection/physiopathology , Treatment Outcome , Heart Failure/surgery , Heart Failure/physiopathology , Heart Failure/therapy , Age Factors , Time Factors
3.
Hematology ; 27(1): 293-299, 2022 Dec.
Article in English | MEDLINE | ID: mdl-35192779

ABSTRACT

OBJECTIVES: Graft failure (GF) is an intractable complication of transplantation that can severely compromise graft efficacy; however, the characteristics, incidence, and risk factors of primary GF have not been well described. This study aimed to analyze the risk factors and outcomes of primary GF in order to swiftly identify patients at high risk of GF. METHODS: We performed a case-control study with a 1:4 case-control ratio among 869 patients who underwent allogeneic hematopoietic stem cell transplantation (allo-HSCT) between January 2015 and December 2019 at our center. RESULTS: Nineteen (2.19%) patients experienced primary poor graft function (PGF), while eleven (1.27%) patients developed primary graft rejection (GR). Univariate and multivariate logistic analyses identified two independent risk factors for primary PGF: splenomegaly [P = 0.030; odds ratio (OR), 3.486; 95% confidence interval (CI), 1.139 to 13.109] and donor type [non-matched sibling donor (non-MSD)] (P = 0.018; OR, 4.475; 95% CI, 1.289 to 15.537). However, only donor type (non-MSD) was statistically significant (P = 0.020; OR, 19.432; 95% CI, 1.595 to 236.691) for primary GR. Overall survival was significantly lower in the primary PGF (P = 0.001) and primary GR (P < 0.001) groups, respectively, compared to the control group. CONCLUSION: Despite its considerably low incidence, GF can significantly affect the overall survival of patients who undergo allo-HSCT. A human leukocyte antigen-matched sibling donor should be the first choice for patients undergoing allo-HSCT for the prevention of GF. Moreover, splenomegaly is an independent risk factor for PGF, and caution must be exercised while treating such patients.


Subject(s)
Graft Rejection/physiopathology , Hematopoietic Stem Cell Transplantation/adverse effects , Transplantation Conditioning/adverse effects , Transplantation, Homologous/adverse effects , Adolescent , Adult , Case-Control Studies , Female , Hematopoietic Stem Cell Transplantation/methods , Humans , Male , Middle Aged , Risk Factors , Transplantation Conditioning/methods , Transplantation, Homologous/methods , Treatment Outcome , Young Adult
4.
J Thorac Cardiovasc Surg ; 163(2): 712-720.e6, 2022 Feb.
Article in English | MEDLINE | ID: mdl-32798029

ABSTRACT

OBJECTIVES: To evaluate outcomes after heart retransplantation. METHODS: From January 6, 1968, to June 2019, 123 patients (112 adult and 11 pediatric patients) underwent heart retransplantation, and 2092 received primary transplantation at our institution. Propensity-score matching was used to account for baseline differences between the retransplantation and the primary transplantation-only groups. Kaplan-Meier survival analyses were performed. The primary end point was all-cause mortality, and secondary end points were postoperative complications. RESULTS: Retransplantation recipient age was 39.6 ± 16.4 years, and donor age was 26.4 ± 11.2 years. Ninety-two recipients (74.8%) were male. Compared with recipients who only underwent primary heart transplantation, retransplantation recipients were more likely to have hypertension (44 [73.3%] vs 774 [53.3%], P = .0022) and hyperlipidemia (40 [66.7%] vs 447 [30.7%], P < .0001), and to require dialysis (7 [11.7%] vs 42 [2.9%], P = .0025). The indications for heart retransplantation were cardiac allograft vasculopathy (32 [80%]), primary graft dysfunction (6 [15%]), and refractory acute rejection (2 [5%]). After matching, postoperative outcomes such as hospital length of stay, severe primary graft dysfunction requiring intra-aortic balloon pump or extracorporeal membrane oxygenation, cerebral vascular accident, respiratory failure, renal failure requiring dialysis, and infection were similar between the 2 groups. Matched median survival after retransplantation was 4.6 years compared with 6.5 years after primary heart transplantation (log-rank P = .36, stratified log-rank P = .0063). CONCLUSIONS: In this single-center cohort, the unadjusted long-term survival after heart retransplantation was inferior to that after primary heart transplantation, and the short-term survival difference persisted after propensity-score matching. Heart retransplantation should be considered for select patients to optimize donor organ use.


Subject(s)
Coronary Artery Disease/surgery , Graft Rejection/surgery , Heart Failure/surgery , Heart Transplantation , Primary Graft Dysfunction/surgery , Adolescent , Adult , California , Coronary Artery Disease/etiology , Coronary Artery Disease/mortality , Coronary Artery Disease/physiopathology , Female , Graft Rejection/etiology , Graft Rejection/mortality , Graft Rejection/physiopathology , Heart Failure/mortality , Heart Failure/physiopathology , Heart Transplantation/adverse effects , Heart Transplantation/mortality , Humans , Male , Middle Aged , Primary Graft Dysfunction/etiology , Primary Graft Dysfunction/mortality , Primary Graft Dysfunction/physiopathology , Recovery of Function , Reoperation , Retrospective Studies , Risk Assessment , Risk Factors , Time Factors , Treatment Outcome , Young Adult
5.
J Am Coll Cardiol ; 78(24): 2425-2435, 2021 12 14.
Article in English | MEDLINE | ID: mdl-34886963

ABSTRACT

BACKGROUND: Single-center data suggest that the index of microcirculatory resistance (IMR) measured early after heart transplantation predicts subsequent acute rejection. OBJECTIVES: The goal of this study was to validate whether IMR measured early after transplantation can predict subsequent acute rejection and long-term outcomes in a large multicenter cohort. METHODS: From 5 international cohorts, 237 patients who underwent IMR measurement early after transplantation were enrolled. The primary outcome was acute allograft rejection (AAR) within 1 year after transplantation. A key secondary outcome was major adverse cardiac events (MACE) (the composite of death, re-transplantation, myocardial infarction, stroke, graft dysfunction, and readmission) at 10 years. RESULTS: IMR was measured at a median of 7 weeks (interquartile range: 3-10 weeks) post-transplantation. At 1 year, the incidence of AAR was 14.4%. IMR was proportionally associated with the risk of AAR (per 1-U increase in IMR; adjusted hazard ratio [aHR]: 1.04; 95% confidence interval [CI]: 1.02-1.06; P < 0.001). The incidence of AAR in patients with an IMR ≥18 was 23.8%, whereas the incidence in those with an IMR <18 was 6.3% (aHR: 3.93; 95% CI: 1.77-8.73; P = 0.001). At 10 years, MACE occurred in 86 (36.3%) patients. IMR was significantly associated with the risk of MACE (per 1-U increase in IMR; aHR: 1.02; 95% CI: 1.01-1.04; P = 0.005). CONCLUSIONS: IMR measured early after heart transplantation is associated with subsequent AAR at 1 year and clinical events at 10 years. Early IMR measurement after transplantation identifies patients at higher risk and may guide personalized posttransplantation management.


Subject(s)
Coronary Circulation/physiology , Graft Rejection/physiopathology , Heart Transplantation/adverse effects , Microcirculation/physiology , Vascular Resistance/physiology , Allografts , Coronary Angiography , Female , Follow-Up Studies , Graft Rejection/diagnosis , Humans , Male , Middle Aged , Predictive Value of Tests , Prospective Studies , Time Factors
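The dichotomized IMR figures in the abstract of record 5 can be sanity-checked in a few lines of Python. The crude risk ratio and the compounding of the per-unit hazard ratio below are illustrative back-of-the-envelope calculations using values quoted in the abstract, not the study's adjusted analysis:

```python
# Illustrative check of the IMR figures in record 5 (incidences taken
# directly from the abstract; not the study's own analysis code).
aar_high = 0.238  # AAR incidence with IMR >= 18
aar_low = 0.063   # AAR incidence with IMR < 18

crude_rr = aar_high / aar_low  # unadjusted risk ratio at the 18-U cutoff
print(round(crude_rr, 2))  # 3.78

# The per-unit hazard ratio of 1.04 compounds multiplicatively, so a
# 10-U higher IMR corresponds to roughly:
hr_10u = 1.04 ** 10
print(round(hr_10u, 2))  # 1.48
```

The crude ratio of about 3.78 is consistent with the adjusted HR of 3.93 reported in the abstract; the gap reflects covariate adjustment in the Cox model.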
6.
Nutrients ; 13(9)2021 Aug 27.
Article in English | MEDLINE | ID: mdl-34578871

ABSTRACT

BACKGROUND: Hyponatremia is one of the most common electrolyte disorders observed in hospitalized and ambulatory patients. Hyponatremia is associated with increased falls, fractures, prolonged hospitalization, and mortality. Because the clinical importance of hyponatremia in the renal transplant field is not well established, the aim of this study was to determine the relationship between hyponatremia and mortality as the main outcome, and renal function decline and graft loss as secondary outcomes, in a prospective cohort of renal transplant recipients. METHODS: This prospective cohort study included 1315 patients between 1 May 2008 and 31 December 2014. Hyponatremia was defined as a sodium concentration below 136 mmol/L at 6 months after transplantation. The main endpoint was mortality. A secondary composite endpoint was also defined as rapid decline in renal function (≥5 mL/min/1.73 m2 drop in eGFR/year), graft loss, or mortality. RESULTS: Mean sodium was 140 ± 3.08 mmol/L. Ninety-seven patients displayed hyponatremia, with a mean of 132.9 ± 3.05 mmol/L. Hyponatremia at 6 months after transplantation was associated neither with mortality (HR: 1.02; p = 0.97, 95% CI: 0.47-2.19) nor with the composite outcome of rapid decline in renal function, graft loss, or mortality (log-rank test p = 0.9). CONCLUSIONS: Hyponatremia 6 months after transplantation is not associated with mortality in kidney allograft patients.


Subject(s)
Graft Rejection/complications , Hyponatremia/complications , Kidney Transplantation , Transplant Recipients/statistics & numerical data , Adult , Cohort Studies , Female , Graft Rejection/physiopathology , Humans , Hyponatremia/physiopathology , Kidney/physiopathology , Male , Middle Aged , Prospective Studies , Survival Analysis , Switzerland
7.
Aging Cell ; 20(10): e13461, 2021 10.
Article in English | MEDLINE | ID: mdl-34499402

ABSTRACT

Bone marrow-derived mesenchymal stem cell (BMSC)-derived small extracellular vesicles (sEVs) are potent candidates for the suppression of acute rejection after renal allografting and have been reported to halt dendritic cell (DC) maturation. However, whether BMSC-derived sEVs mitigate acute rejection post-renal allograft by targeting DCs is still unclear. In this study, donor BMSC-derived sEVs relieved the inflammatory response and suppressed mature DC (mDC) localization in kidney grafts, and increased the regulatory T (Treg) cell population in the spleens of rats that underwent kidney allografting. In lipopolysaccharide (LPS)-stimulated immature DCs (imDCs), sEVs suppressed the maturation and migration of DCs and inactivated toll-like receptor 4 (TLR4) signaling. Compared with LPS-treated imDCs, imDCs treated with LPS plus sEVs promoted CD4+ T cell differentiation toward Treg cells. Subsequently, we found that Loc108349490, a long non-coding RNA (lncRNA) abundant in sEVs, mediated the inhibitory effect of sEVs on DC maturation and migration by promoting TLR4 ubiquitination. In rats that underwent an allograft, Loc108349490 deficiency weakened the therapeutic effect of sEVs on acute rejection. This study is the first to find that sEVs alleviate acute rejection post-renal allograft by transferring lncRNA to DCs, and it identifies Loc108349490 as the functional lncRNA loaded in sEVs.


Subject(s)
Allografts/metabolism , Dendritic Cells/metabolism , Extracellular Vesicles/metabolism , Graft Rejection/physiopathology , Mesenchymal Stem Cells/metabolism , Acute Disease , Animals , Cell Differentiation , Humans , Male , Mice , Rats , Rats, Sprague-Dawley , Rats, Wistar
8.
Toxins (Basel) ; 13(8)2021 08 16.
Article in English | MEDLINE | ID: mdl-34437442

ABSTRACT

Acute kidney injury (AKI) is a significant risk factor for developing chronic kidney disease and progression to end-stage renal disease in elderly patients. AKI is also a relatively common complication after kidney transplantation (KTx) associated with graft failure. Since the lifespan of a transplanted kidney is limited, the risk of loss/deterioration of graft function (DoGF) should be estimated so that preventive treatment can be applied. The collection of saliva and urine is more convenient than collecting blood and can be performed at home. The study aimed to verify whether non-invasive biomarkers, determined in saliva and urine, may be useful in the prediction of DoGF in kidney transplant recipients (KTRs) (n = 92). Salivary and serum toxin (p-cresol sulfate, pCS; indoxyl sulfate, IS) concentrations were determined using liquid chromatography-tandem mass spectrometry (LC-MS/MS). Urinary proteins, hemoglobin, and glucose were measured using a semi-quantitative strip test. Salivary IS (odds ratio (OR) = 1.19) and proteinuria (OR = 3.69) were demonstrated to be independent predictors of DoGF. Satisfactory discriminatory power (area under the receiver operating characteristic curve (AUC) = 0.71 ± 0.07) and calibration of the model were obtained. The model showed that categories of increasing predicted probability of DoGF were associated with decreased graft survival. These non-invasive biomarkers are a useful screening tool to identify patients at high risk of DoGF.


Subject(s)
Cresols/analysis , Graft Rejection/diagnosis , Indican/analysis , Kidney Failure, Chronic/diagnosis , Kidney Failure, Chronic/physiopathology , Kidney Transplantation/adverse effects , Saliva/chemistry , Adult , Biomarkers/analysis , Biomarkers/urine , Chromatography, Liquid/methods , Female , Graft Rejection/physiopathology , Humans , Male , Middle Aged , Poland , Predictive Value of Tests , Proteinuria/physiopathology , Toxins, Biological/analysis , Toxins, Biological/urine
9.
Sci Rep ; 11(1): 16095, 2021 08 09.
Article in English | MEDLINE | ID: mdl-34373479

ABSTRACT

The aim of this study was to investigate whether delayed graft function (DGF) and pre-transplant sensitization have synergistic adverse effects on allograft outcome after deceased donor kidney transplantation (DDKT), using the Korean Organ Transplantation Registry (KOTRY) database, a nationwide prospective cohort. The study included 1359 cases between May 2014 and June 2019. The cases were divided into 4 subgroups according to pre-sensitization and the development of DGF post-transplant [non-pre-sensitized-DGF(-) (n = 1097), non-pre-sensitized-DGF(+) (n = 127), pre-sensitized-DGF(-) (n = 116), and pre-sensitized-DGF(+) (n = 19)]. We compared the incidence of biopsy-proven allograft rejection (BPAR), time-related change in allograft function, allograft and patient survival, and post-transplant complications across the 4 subgroups. The incidence of acute antibody-mediated rejection (ABMR) was significantly higher in the pre-sensitized-DGF(+) subgroup than in the other 3 subgroups. In addition, multivariable Cox regression analysis demonstrated that pre-sensitization combined with DGF is an independent risk factor for the development of acute ABMR (hazard ratio 4.855, 95% confidence interval 1.499-15.727). Moreover, DGF and pre-sensitization showed a significant interaction (p-value for interaction = 0.008). Pre-sensitization combined with DGF did not show a significant impact on allograft function or on allograft or patient survival. In conclusion, the combination of pre-sensitization and DGF showed a significant synergistic interaction on the development of allograft rejection after DDKT.


Subject(s)
Allografts/physiopathology , Delayed Graft Function/physiopathology , Graft Rejection/physiopathology , Graft Survival/physiology , Kidney Transplantation/adverse effects , Biopsy/methods , Female , Humans , Incidence , Kidney/physiopathology , Kidney/surgery , Male , Middle Aged , Proportional Hazards Models , Prospective Studies , Risk Factors , Time Factors , Tissue Donors , Transplantation, Homologous/adverse effects
10.
Int J Mol Sci ; 22(14)2021 Jul 20.
Article in English | MEDLINE | ID: mdl-34299344

ABSTRACT

Bone damage leading to bone loss can arise from a wide range of causes, including those intrinsic to individuals, such as infections or diseases with metabolic (diabetes), genetic (osteogenesis imperfecta), and/or age-related (osteoporosis) etiology, and extrinsic ones coming from external insults such as trauma or surgery. Although bone tissue has an intrinsic capacity for self-repair, large bone defects often require anabolic treatments targeting the bone formation process and/or bone grafts aiming to restore lost bone. The current bone surrogates used for clinical purposes are autologous, allogeneic, or xenogeneic bone grafts, which, although effective, carry a number of limitations: the need to remove bone from another location in the case of autologous transplants, and the possibility of immune rejection when using allogeneic or xenogeneic grafts. To overcome these limitations, cutting-edge therapies for skeletal regeneration of bone defects are currently under extensive research with promising results, such as those boosting endogenous bone regeneration through stimulation of host cells, or those driven exogenously with scaffolds, biomolecules, and mesenchymal stem cells as key players in the bone healing process.


Subject(s)
Bone Regeneration/physiology , Bone and Bones/physiology , Animals , Graft Rejection/physiopathology , Humans , Mesenchymal Stem Cells/physiology , Osteogenesis/physiology , Tissue Scaffolds/chemistry , Wound Healing/physiology
11.
J Pharm Pharm Sci ; 24: 292-307, 2021.
Article in English | MEDLINE | ID: mdl-34107240

ABSTRACT

PURPOSE: To evaluate the effect of hyperuricemia on clinical outcomes of renal transplant recipients (RTRs). METHODS: A literature search of PubMed, Cochrane, and Embase was conducted up to March 20, 2020. The primary outcome was the estimated glomerular filtration rate (eGFR). The secondary outcomes were the risk of graft loss, death, and cardiovascular events, and triglyceride levels. The following search terms were utilized: ((Hyperuricemic group) OR (Hyperuricaemia) OR (Hyperuric) OR (Urea acid) OR (Uric acid) OR (Acid urate) OR (Urate) OR (Gout)) and ((Transplantation) OR (Transplantations) OR (Transplant) OR (Transplants) OR (Graft)). RESULTS: Twenty-eight studies with 18,224 patients were eligible for inclusion. There was no significant difference in eGFR (<12 months, p = 0.07), the risk of graft loss (<60 months, p = 0.07), or death (<60 months, p = 0.19) between the hyperuricemic and normouricemic groups in the early post-transplantation period. However, increased uric acid levels contributed to a long-term decline in eGFR and an increased risk of graft loss and death after transplantation. Hyperuricemia increased the risk of cardiovascular events, with no significant difference in triglyceride levels between the two groups. CONCLUSIONS: Increased uric acid levels contributed to a long-term decline in eGFR and an increased risk of graft loss and death after transplantation. Although there was no significant effect on triglycerides, hyperuricemia increased the risk of cardiovascular events.


Subject(s)
Graft Rejection/epidemiology , Hyperuricemia/epidemiology , Kidney Transplantation , Glomerular Filtration Rate , Graft Rejection/physiopathology , Humans , Hyperuricemia/physiopathology , Risk Factors , Transplant Recipients , Treatment Outcome
12.
Cornea ; 40(4): 506-508, 2021 Apr.
Article in English | MEDLINE | ID: mdl-33881812

ABSTRACT

PURPOSE: To present a case of primary graft failure after penetrating keratoplasty found to have epithelial ingrowth into the host stroma on histopathologic analysis. METHODS: This is a single observational case report. RESULTS: We describe the clinical course of a case of primary graft failure after penetrating keratoplasty. The corneal button was sent for histopathologic analysis, which revealed circumferential full-thickness epithelial invasion of the wound and epithelial invasion into the corneal stroma. CONCLUSIONS: Based on the histopathologic analysis and this patient's presentation, the stromal ingrowth followed recipient epithelial invasion of the wound and stromal invasion through clefts in the donor corneal edges. Cases of primary graft failure should be assessed for histopathologic evidence of epithelial stromal ingrowth, despite its rarity. To our knowledge, epithelial ingrowth into the corneal donor stroma after penetrating keratoplasty has not been previously reported.


Subject(s)
Anterior Chamber/pathology , Corneal Diseases/etiology , Corneal Stroma/pathology , Epithelium, Corneal/pathology , Graft Rejection/etiology , Keratoplasty, Penetrating/adverse effects , Biomarkers/metabolism , Corneal Diseases/metabolism , Corneal Diseases/physiopathology , Female , Graft Rejection/metabolism , Graft Rejection/physiopathology , Humans , Middle Aged , Visual Acuity/physiology
13.
Am J Ophthalmol ; 226: 13-21, 2021 06.
Article in English | MEDLINE | ID: mdl-33529592

ABSTRACT

PURPOSE: To compare the outcomes of penetrating keratoplasty (PK) and deep anterior lamellar keratoplasty (DALK) for pediatric keratoconus. DESIGN: Retrospective comparative interventional case series. METHODS: This study included consecutive pediatric keratoconus patients (≤18 years of age) who received PK (n=45) or DALK (n=54) in 2 different time periods. Postoperative best spectacle-corrected visual acuity (BSCVA), refraction, and complications were compared between the study groups. RESULTS: The mean follow-up was 83.3±46.1 and 63.3±45.6 months in the PK and DALK groups, respectively (P = .10). Postoperatively, BSCVA was 0.20±0.19 logMAR in the PK group and 0.26±0.19 logMAR in the DALK group (P = .11), with a BSCVA of ≥20/40 in 91.1% and 83.3% of eyes, respectively (P = .25). The two groups were comparable regarding postoperative refractive outcomes. Graft epitheliopathy and suture-associated complications were more commonly encountered after DALK, which was attributable to the effect of low-quality grafts on the clinical outcomes of DALK. Ten PK eyes (22.2%) and 9 DALK eyes (16.7%) experienced at least 1 episode of graft rejection within 5 years of corneal transplantation (P = .49). Rejection was reversible in 93.1% and 100% of episodes in the PK and DALK groups, respectively (P = .63). At postoperative year 5, 95.6% of grafts in the PK group and 98.2% in the DALK group remained clear (P = .45). CONCLUSION: No significant difference was observed in the outcomes between PK and DALK in pediatric keratoconus. Low-quality donor tissues in DALK increased the incidence of graft epithelial problems and suture-related complications as compared to PK.


Subject(s)
Corneal Transplantation/methods , Keratoconus/surgery , Keratoplasty, Penetrating/methods , Adolescent , Child , Corneal Topography , Female , Follow-Up Studies , Graft Rejection/physiopathology , Humans , Keratoconus/diagnosis , Keratoconus/physiopathology , Male , Postoperative Complications , Refraction, Ocular/physiology , Retrospective Studies , Slit Lamp Microscopy , Tissue Donors , Treatment Outcome , Visual Acuity/physiology
14.
Am J Ophthalmol ; 226: 68-75, 2021 06.
Article in English | MEDLINE | ID: mdl-33577788

ABSTRACT

PURPOSE: To examine pretransplant findings and outcomes of corneal transplants for keratoconus in children. DESIGN: Retrospective cohort (national registry) study. METHODS: Data on all patients aged 16 or younger (n = 170) who had a first transplant for keratoconus between 2003 and 2018 in all corneal transplant centers in the UK were compared with data on adult patients aged 17 and older (n = 7,191). The influence of demographic variables, pretransplant corneal findings, and transplant type on 2-year visual, rejection-free, and transplant survival outcomes was examined. RESULTS: Children had poorer pretransplant visual acuity and higher rates of corneal vascularization and ocular surface disease than adults. However, 2-year post-transplant corrected visual acuity reached 20/20 or better in 35% of children compared to 28% of adults (P = .1). Transplant rejection and failure rates were 11% (P = .79) and 3% (P = .31), respectively, for children, which were comparable to rates for adults. Endothelial rejection was reported following penetrating keratoplasty (PK) in 13% of children (10% in adults). Irreversible rejection was not recorded for any transplant in a child. Despite a lack of difference in transplant outcomes, there was a significant age effect in the Cox regression model for transplant rejection, such that for every 5-year increase in age there was a 6% reduction in the hazard of rejection. Transplant survival following anterior lamellar keratoplasty and PK in children was similar. CONCLUSIONS: Young keratoconus patients have excellent transplant outcomes and visual results comparable to adults. Overall, the hazard of rejection was found to decrease with advancing age. However, in this large cohort of young patients with keratoconus and poor vision, there is no evidence of outcome advantage in delaying transplant until adult years.


Subject(s)
Keratoconus/surgery , Keratoplasty, Penetrating/methods , Adolescent , Adult , Child , Demography , Female , Graft Rejection/physiopathology , Graft Survival/physiology , Humans , Keratoconus/diagnosis , Keratoconus/physiopathology , Male , Postoperative Complications , Registries , Retrospective Studies , Treatment Outcome , Visual Acuity/physiology
15.
Transplantation ; 105(3): 648-659, 2021 03 01.
Article in English | MEDLINE | ID: mdl-33617203

ABSTRACT

BACKGROUND: There are challenges in designing adequate, well-controlled studies of patients with active antibody-mediated rejection (AMR) after kidney transplantation (KTx). METHODS: We assessed the functional relationship between change in estimated glomerular filtration rate (eGFR) following the diagnosis of AMR and the risk of subsequent death-censored graft failure using a joint modeling framework. We included recipients of solitary KTx between 1995 and 2013 at 4 transplant centers who were diagnosed with biopsy-proven active AMR at least 1 year post-KTx and had a minimum of 3 years of follow-up. RESULTS: A total of 91 patients across the participating centers were included in the analysis. Of these, 54 patients (59%) met the death-censored graft failure endpoint and 62 patients (68%) met the all-cause graft failure composite endpoint. Kaplan-Meier death-censored graft survival rates at 12, 36, and 60 months postdiagnosis of AMR, pooled across centers, were 88.9%, 58.9%, and 36.4%, respectively. Spaghetti plots indicated a linear trend in the change in eGFR, especially in the first 12 months postdiagnosis of active AMR. A significant change in eGFR was observed within the first 12 months postdiagnosis of active AMR, worsening by 0.757 mL/min/1.73 m2 per month during the 12-month analysis period (a delta of -9.084 mL/min/1.73 m2 at 1 year). Notably, an extrapolated 30% improvement in the slope of eGFR in the first 12 months was associated with a 10% improvement in death-censored graft failure at 5 years. CONCLUSIONS: If prospectively validated, this study may inform the design of pivotal clinical trials of therapies for late AMR.


Subject(s)
Glomerular Filtration Rate/physiology , Graft Rejection/diagnosis , Graft Survival , Kidney Transplantation/adverse effects , Kidney/physiopathology , Acute Disease , Adult , Allografts , Biopsy , Female , Follow-Up Studies , Graft Rejection/physiopathology , Humans , Kidney/pathology , Male , Retrospective Studies , Time Factors
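The eGFR slope arithmetic in the abstract of record 15 is easy to verify. This short Python sketch reproduces the reported 1-year delta and annualizes the extrapolated 30% slope improvement; the values come straight from the abstract, and the calculation is illustrative, not the study's joint-model code:

```python
# Sanity check of the eGFR slope arithmetic in record 15
# (slope taken from the abstract; illustrative only).
slope_per_month = -0.757  # mL/min/1.73 m2 per month in the first year
delta_1y = slope_per_month * 12
print(round(delta_1y, 3))  # -9.084, matching the reported 1-year delta

# Annualized delta under the 30% slope improvement the authors extrapolate:
improved_slope = slope_per_month * (1 - 0.30)
print(round(improved_slope * 12, 2))  # -6.36
```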
16.
J Am Soc Nephrol ; 32(3): 708-722, 2021 03.
Article in English | MEDLINE | ID: mdl-33443079

ABSTRACT

BACKGROUND: Late antibody-mediated rejection (ABMR) is a leading cause of transplant failure. Blocking IL-6 has been proposed as a promising therapeutic strategy. METHODS: We performed a phase 2 randomized pilot trial to evaluate the safety (primary endpoint) and efficacy (secondary endpoint analysis) of the anti-IL-6 antibody clazakizumab in late ABMR. The trial included 20 kidney transplant recipients with donor-specific antibody-positive ABMR ≥365 days post-transplantation. Patients were randomized 1:1 to receive 25 mg clazakizumab or placebo (4-weekly subcutaneous injections) for 12 weeks (part A), followed by a 40-week open-label extension (part B), during which all participants received clazakizumab. RESULTS: Five (25%) patients under active treatment developed serious infectious events, and two (10%) developed diverticular disease complications, leading to trial withdrawal. Patients receiving clazakizumab displayed significantly decreased donor-specific antibodies and, on prolonged treatment, modulated rejection-related gene-expression patterns. In 18 patients, allograft biopsies after 51 weeks revealed a negative molecular ABMR score in seven (38.9%), disappearance of capillary C4d deposits in five (27.8%), and resolution of morphologic ABMR activity in four (22.2%). Although proteinuria remained stable, the mean eGFR decline during part A was slower with clazakizumab than with placebo (-0.96; 95% confidence interval [95% CI], -1.96 to 0.03 versus -2.43; 95% CI, -3.40 to -1.46 mL/min per 1.73 m2 per month, respectively; P=0.04). During part B, the slope of eGFR decline for patients switched from placebo to clazakizumab improved and no longer differed significantly from that of patients initially allocated to clazakizumab. CONCLUSIONS: Although the safety data indicate the need for careful patient selection and monitoring, our preliminary efficacy results suggest a potentially beneficial effect of clazakizumab on ABMR activity and progression.


Subject(s)
Antibodies, Monoclonal, Humanized/therapeutic use; Graft Rejection/therapy; Interleukin-6/antagonists & inhibitors; Kidney Transplantation/adverse effects; Adult; Allografts; Antibodies, Monoclonal, Humanized/adverse effects; Double-Blind Method; Female; Glomerular Filtration Rate; Graft Rejection/immunology; Graft Rejection/physiopathology; Humans; Infections/etiology; Interleukin-6/immunology; Isoantibodies/blood; Male; Middle Aged; Tissue Donors; Treatment Outcome; Young Adult
17.
Circulation ; 143(12): 1184-1197, 2021 03 23.
Article in English | MEDLINE | ID: mdl-33435695

ABSTRACT

BACKGROUND: After heart transplantation, endomyocardial biopsy (EMBx) is used to monitor for acute rejection (AR). Unfortunately, EMBx is invasive, and its conventional histological interpretation has limitations. This is a validation study to assess the performance of a sensitive blood biomarker, percent donor-derived cell-free DNA (%ddcfDNA), for detection of AR in cardiac transplant recipients. METHODS: This multicenter, prospective cohort study recruited heart transplant subjects and collected plasma samples contemporaneously with EMBx for %ddcfDNA measurement by shotgun sequencing. Histopathology data were collected to define AR, its 2 phenotypes (acute cellular rejection [ACR] and antibody-mediated rejection [AMR]), and controls without rejection. The primary analysis compared %ddcfDNA levels (median and interquartile range [IQR]) for AR, AMR, and ACR with controls and determined %ddcfDNA test characteristics using receiver operating characteristic (ROC) analysis. RESULTS: The study included 171 subjects with a median posttransplant follow-up of 17.7 months (IQR, 12.1-23.6), with 1392 EMBx and 1834 %ddcfDNA measures available for analysis. Median %ddcfDNA levels decayed after surgery to 0.13% (IQR, 0.03%-0.21%) by 28 days. %ddcfDNA rose again with AR compared with control values (0.38% [IQR, 0.31%-0.83%] versus 0.03% [IQR, 0.01%-0.14%]; P<0.001). The rise was detected 0.5 months before histopathologic diagnosis of ACR and 3.2 months before that of AMR. The area under the ROC curve for AR was 0.92. A 0.25% ddcfDNA threshold had a negative predictive value for AR of 99% and would have safely eliminated 81% of EMBx.
In addition, %ddcfDNA showed distinctive characteristics when comparing AMR with ACR, including 5-fold higher levels (AMR grade ≥2, 1.68% [IQR, 0.49%-2.79%] versus ACR grade ≥2R, 0.34% [IQR, 0.28%-0.72%]), a higher area under the ROC curve (0.95 versus 0.85), higher guanine-cytosine content, and a higher percentage of short ddcfDNA fragments. CONCLUSIONS: We found that %ddcfDNA detected AR with a high area under the ROC curve and a high negative predictive value. Monitoring with ddcfDNA demonstrated excellent performance characteristics for both ACR and AMR and led to earlier detection than EMBx-based monitoring. This study supports the use of %ddcfDNA to monitor for AR in heart transplant patients and paves the way for a clinical utility study. Registration: URL: https://www.clinicaltrials.gov; Unique identifier: NCT02423070.
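The 99% negative predictive value at the 0.25% threshold is the proportion of below-threshold (test-negative) samples that truly had no rejection. A minimal sketch of that calculation, using invented toy values rather than the study's measurements:

```python
def npv_at_threshold(values, labels, threshold):
    """Negative predictive value: fraction of samples below the threshold
    (test-negative) whose label is 0 (no rejection on biopsy)."""
    negatives = [l for v, l in zip(values, labels) if v < threshold]
    true_neg = sum(1 for l in negatives if l == 0)
    return true_neg / len(negatives)

# Toy %ddcfDNA values and rejection labels (1 = biopsy-proven AR)
values = [0.05, 0.10, 0.40, 0.02, 0.30, 0.12]
labels = [0, 0, 1, 0, 1, 1]
print(npv_at_threshold(values, labels, 0.25))  # 0.75 on this toy data
```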


Subject(s)
Allografts/transplantation; Cell-Free Nucleic Acids/genetics; Graft Rejection/physiopathology; Adult; Aged; Cohort Studies; Female; Humans; Male; Middle Aged; Prospective Studies; Young Adult
18.
Cardiovasc Ultrasound ; 19(1): 6, 2021 Jan 09.
Article in English | MEDLINE | ID: mdl-33422079

ABSTRACT

BACKGROUND: Acute cellular rejection (ACR) is a major complication after heart transplantation. Endomyocardial biopsy (EMB) remains the gold standard for its diagnosis, but it carries concerning complications. We evaluated the usefulness of speckle tracking echocardiography (STE) and biomarkers for detecting ACR after heart transplantation. METHODS: We prospectively studied 60 transplant patients with normal left and right ventricular systolic function who underwent EMB for surveillance 6 months after transplantation. Sixty age- and sex-matched healthy individuals constituted the control group. Conventional echocardiographic parameters, left ventricular global longitudinal, radial, and circumferential strain (LV-GLS, LV-GRS, and LV-GCS, respectively), left ventricular systolic twist (LV-twist), and right ventricular free wall longitudinal strain (RV-FWLS) were analyzed just before the procedure. Biomarkers were measured at the same time. RESULTS: Among the 60 studied patients, 17 (28%) had severe ACR (grade ≥2R), and 43 (72%) had no significant ACR (grade 0-1R). The absolute values of LV-GLS, LV-twist, and RV-FWLS were lower in transplant patients with ACR grade ≥2R than in those without ACR (12.5% ± 2.9% vs 14.8% ± 2.3%, p=0.002; 13.9° ± 4.8° vs 17.1° ± 3.2°, p=0.048; 16.6% ± 2.9% vs 21.4% ± 3.2%, p<0.001, respectively), while no differences were observed in LV-GRS or LV-GCS. All of these parameters were lower in the transplant group without ACR than in the nontransplant control group, except for LV-twist. Cardiac troponin I levels were significantly higher in patients with significant ACR than in those without [0.19 ng/mL (0.09-1.31) vs 0.05 ng/mL (0.01-0.18), p=0.007]. The combination of troponin with LV-GLS, RV-FWLS, and LV-twist yielded areas under the curve for the detection of ACR of 0.80 (0.68-0.92), 0.89 (0.81-0.93), and 0.79 (0.66-0.92), respectively.
CONCLUSION: Heart transplant patients have altered left ventricular dynamics compared with control individuals. The combination of troponin with strain parameters detected ACR more accurately than the individual variables, and this approach might select patients at higher risk for ACR who would benefit from an EMB procedure in the first year after heart transplantation.


Subject(s)
Echocardiography/methods; Graft Rejection/diagnosis; Heart Transplantation; Heart Ventricles/diagnostic imaging; Natriuretic Peptide, Brain/blood; Stroke Volume/physiology; Troponin I/blood; Acute Disease; Adult; Biomarkers/blood; Biopsy; Female; Follow-Up Studies; Graft Rejection/metabolism; Graft Rejection/physiopathology; Heart Ventricles/physiopathology; Humans; Male; Myocardium/metabolism; Myocardium/pathology; Postoperative Period; Prognosis; Prospective Studies; Reproducibility of Results; Systole
19.
Ther Apher Dial ; 25(3): 341-349, 2021 Jun.
Article in English | MEDLINE | ID: mdl-32666667

ABSTRACT

A retrospective cohort study was conducted to evaluate the association between the plasma volume treated by double filtration plasmapheresis and allograft outcomes in the treatment of acute antibody-mediated rejection in kidney transplant recipients. Patients were divided into two groups: group 1, treated plasma volume between 1 and <1.3 total plasma volumes, and group 2, treated plasma volume ≥1.3 total plasma volumes. The primary outcome was a ≥50% reduction, at 1 month, of the rise in serum creatinine from baseline. A total of 32 courses (146 sessions) of double filtration plasmapheresis were performed: 17 courses in group 1 and 15 in group 2. The primary outcome occurred in 41% of group 1 and 40% of group 2 (adjusted risk ratio 1.15 [95% CI, 0.48-2.76]). Graft loss at 1 year did not differ between the two groups (adjusted hazard ratio 0.65 [95% CI, 0.23-1.87]). A trend toward more infections was observed in group 2 (40% vs 18%, P = .243).
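For context, an unadjusted risk ratio with a Wald 95% CI can be computed from event counts. The sketch below uses approximate counts (7/17 and 6/15) reconstructed from the reported 41% and 40% proportions; the paper's 1.15 is a covariate-adjusted estimate, so the values are expected to differ:

```python
import math

def risk_ratio_ci(a, n1, b, n2, z=1.96):
    """Unadjusted risk ratio of group 1 vs group 2, with a Wald CI
    computed on the log scale (standard epidemiological formula)."""
    rr = (a / n1) / (b / n2)
    se = math.sqrt(1 / a - 1 / n1 + 1 / b - 1 / n2)
    lo = math.exp(math.log(rr) - z * se)
    hi = math.exp(math.log(rr) + z * se)
    return rr, lo, hi

# Approximate counts reconstructed from the reported percentages (see lead-in)
rr, lo, hi = risk_ratio_ci(7, 17, 6, 15)
print(round(rr, 2), round(lo, 2), round(hi, 2))  # 1.03 0.44 2.39
```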


Subject(s)
Graft Rejection/prevention & control; Immunoglobulins, Intravenous/therapeutic use; Immunologic Factors/therapeutic use; Kidney Transplantation; Plasma Volume/physiology; Plasmapheresis/methods; Adult; Cohort Studies; Female; Graft Rejection/immunology; Graft Rejection/physiopathology; Humans; Male; Retrospective Studies
20.
Cornea ; 40(6): 710-714, 2021 Jun 01.
Article in English | MEDLINE | ID: mdl-32947404

ABSTRACT

PURPOSE: To examine tissue loss rates, processing time, and primary graft failure (PGF) of "prestripped-only" Descemet membrane endothelial keratoplasty (DMEK) grafts at a single eye bank and how these parameters changed after the introduction of steps to preload tissue among experienced processors. METHODS: Tissue loss and processing time during DMEK graft preparation, as well as PGF, were analyzed retrospectively at a single eye bank between 2012 and 2018. Outcomes were assessed in consecutive grafts before and after the introduction of preloading to the eye bank's standard operating procedure. RESULTS: A total of 1326 grafts were analyzed: the first 663 preloaded DMEK grafts and, for comparison, the 663 DMEK grafts processed immediately before the preloaded service began. Mean processing time increased from 17.0 ± 3.9 minutes to 26.0 ± 5.4 minutes with the advent of preloading (P < 0.01). Initially, processing times rose sharply, peaking at 51 minutes, before settling back toward the mean. No significant difference in the rate of tissue wastage was observed before versus after the implementation of preloaded DMEK (1.2% vs. 1.7%, P = 0.48). PGF occurred in 7 grafts before the preloaded service and 10 grafts after starting the service (1.6% vs. 2.3%, P = 0.47). CONCLUSIONS: Preloading does not affect tissue wastage for experienced technicians or the PGF rate, but it increases processing time. Eye banks considering adding preloading to their standard operating procedure may need to account for longer processing times in their daily operations.


Subject(s)
Corneal Dystrophies, Hereditary/surgery; Descemet Stripping Endothelial Keratoplasty; Endothelium, Corneal; Eye Banks/methods; Graft Rejection/physiopathology; Tissue and Organ Harvesting/methods; Aged; Corneal Dystrophies, Hereditary/physiopathology; Corneal Endothelial Cell Loss/physiopathology; Female; Humans; Male; Middle Aged; Retrospective Studies; Time Factors; Tissue Donors; Tissue and Organ Procurement; Treatment Outcome