Results 1 - 20 of 58
1.
Article in English | MEDLINE | ID: mdl-38705500

ABSTRACT

BACKGROUND: Lung transplant recipients are traditionally monitored with pulmonary function testing (PFT) and lung biopsy to detect post-transplant complications and guide treatment. Plasma donor-derived cell-free DNA (dd-cfDNA) is a novel molecular approach to assessing allograft injury, including subclinical allograft dysfunction. The aim of this study was to determine whether episodes of extreme molecular injury (EMI) in lung transplant recipients increase the risk of chronic lung allograft dysfunction (CLAD) or death. METHODS: This multicenter prospective cohort study included 238 lung transplant recipients. Serial plasma samples were collected for dd-cfDNA measurement by shotgun sequencing. EMI was defined as a dd-cfDNA level above the third quartile of levels observed for acute rejection (dd-cfDNA ≥ 5% occurring after 45 days post-transplant). EMI was categorized as Secondary if associated with co-existing acute rejection, infection, or PFT decline, or Primary if not associated with these conditions. RESULTS: EMI developed in 16% of patients at a median of 343.5 (IQR: 177.3-535.5) days post-transplant. Over 50% of EMI episodes were classified as Primary. EMI was associated with an increased risk of severe CLAD or death (HR: 2.52, 95% CI: 1.10-3.82, p = 0.024). The risk remained consistent for Primary EMI (HR: 2.34, 95% CI: 1.18-4.85, p = 0.015). Time to first EMI episode was a significant predictor of the likelihood of developing CLAD or death (AUC = 0.856, 95% CI: 0.805-0.908, p < 0.001). CONCLUSIONS: Episodes of EMI in lung transplant recipients are often isolated and not detectable with traditional clinical monitoring approaches. EMI is associated with an increased risk of severe CLAD or death, independent of concomitant transplant complications.
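
As an illustrative sketch only (not taken from the study), the EMI rule above (dd-cfDNA ≥ 5% measured more than 45 days post-transplant) can be applied to serial samples to derive each patient's time to first EMI episode. The column names and pandas workflow here are assumptions.

```python
import pandas as pd

# Hypothetical serial dd-cfDNA measurements: one row per plasma sample.
samples = pd.DataFrame({
    "patient_id":   [1, 1, 1, 2, 2, 3],
    "days_post_tx": [30, 120, 400, 60, 200, 90],
    "dd_cfdna_pct": [6.2, 1.1, 5.8, 0.4, 0.7, 2.3],
})

# EMI per the definition above: dd-cfDNA >= 5% occurring after day 45 post-transplant.
samples["emi"] = (samples["dd_cfdna_pct"] >= 5.0) & (samples["days_post_tx"] > 45)

# Time to first EMI episode for each patient (patients with no EMI are absent).
first_emi = (
    samples[samples["emi"]]
    .groupby("patient_id")["days_post_tx"]
    .min()
    .rename("days_to_first_emi")
)
print(first_emi)
```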

2.
Article in English | MEDLINE | ID: mdl-38670297

ABSTRACT

BACKGROUND: Cardiac allograft vasculopathy (CAV) remains the leading cause of long-term graft failure and mortality after heart transplantation. Effective preventive and treatment options are not available to date, largely because underlying mechanisms remain poorly understood. We studied the potential role of leukotriene B4 (LTB4), an inflammatory lipid mediator, in the development of CAV. METHODS: We used an established preclinical rat CAV model to study the role of LTB4 in CAV. We performed syngeneic and allogeneic orthotopic aortic transplantation, after which neointimal proliferation was quantified. Animals were then treated with Bestatin, an inhibitor of LTB4 synthesis, or vehicle control for 30 days post-transplant, and evidence of graft CAV was determined by histology. We also measured serial LTB4 levels in a cohort of 28 human heart transplant recipients with CAV, 17 matched transplant controls without CAV, and 20 healthy nontransplant controls. RESULTS: We showed that infiltration of the arterial wall with macrophages leads to neointimal thickening and a rise in serum LTB4 levels in our rat model of CAV. Inhibition of LTB4 production with the drug Bestatin prevented the development of neointimal hyperplasia, suggesting that Bestatin may be an effective therapy for CAV prevention. In a parallel study of heart transplant recipients, we found nonsignificantly elevated plasma LTB4 levels in patients with CAV, compared to patients without CAV and healthy, nontransplant controls. CONCLUSIONS: This study provides key evidence supporting the role of the inflammatory lipid mediator LTB4 as an important mediator of CAV development and provides preliminary data suggesting the clinical benefit of Bestatin for CAV prevention.

3.
JACC Heart Fail ; 12(4): 722-736, 2024 Apr.
Article in English | MEDLINE | ID: mdl-38244008

ABSTRACT

BACKGROUND: Potential organ donors often exhibit abnormalities on electrocardiograms (ECGs) after brain death, but the physiological and prognostic significance of such abnormalities is unknown. OBJECTIVES: This study sought to characterize the prevalence of ECG abnormalities in a nationwide cohort of potential cardiac donors and their associations with cardiac dysfunction, use for heart transplantation (HT), and recipient outcomes. METHODS: The Donor Heart Study enrolled 4,333 potential cardiac organ donors at 8 organ procurement organizations across the United States from 2015 to 2020. A blinded expert reviewer interpreted all ECGs, which were obtained once hemodynamic stability was achieved after brain death and were repeated 24 ± 6 hours later. ECG findings were summarized, and their associations with other cardiac diagnostic findings, use for HT, and graft survival were assessed using univariable and multivariable regression. RESULTS: Initial ECGs were interpretable for 4,136 potential donors. Overall, 64% of ECGs were deemed clinically abnormal, most commonly as a result of a nonspecific ST-T-wave abnormality (39%), T-wave inversion (19%), and/or QTc interval >500 ms (17%). Conduction abnormalities, ectopy, pathologic Q waves, and ST-segment elevations were less common (each present in ≤5% of donors) and resolved on repeat ECGs in most cases. Only pathologic Q waves were significant predictors of donor heart nonuse (adjusted OR: 0.39; 95% CI: 0.29-0.53), and none were associated with graft survival at 1 year post-HT. CONCLUSIONS: ECG abnormalities are common in potential heart donors but often resolve on serial testing. Pathologic Q waves are associated with a lower likelihood of use for HT, but they do not portend worse graft survival.


Subject(s)
Heart Diseases , Heart Failure , Heart Transplantation , Tissue and Organ Procurement , Humans , Tissue Donors , Brain Death , Electrocardiography , Arrhythmias, Cardiac
4.
J Heart Lung Transplant ; 43(3): 387-393, 2024 Mar.
Article in English | MEDLINE | ID: mdl-37802261

ABSTRACT

BACKGROUND: Primary graft dysfunction (PGD) is a leading cause of early morbidity and mortality following heart transplantation (HT). We sought to determine the association between pretransplant human leukocyte antigen (HLA) sensitization, as measured using the calculated panel reactive antibody (cPRA) value, and the risk of PGD. METHODS: Consecutive adult HT recipients (n = 596) from 1/2015 to 12/2019 at 2 US centers were included. Severity of PGD was based on the 2014 International Society for Heart and Lung Transplantation consensus statement. For each recipient, unacceptable HLA antigens were obtained and locus-specific cPRA (cPRA-LS) and pre-HT donor-specific antibodies (DSA) were assessed. RESULTS: Univariable logistic modeling showed that peak cPRA-LS for all loci and HLA-A was associated with increased severity of PGD as an ordinal variable (all loci: OR 1.78, 95% CI: 1.01-1.14, p = 0.025; HLA-A: OR 1.14, 95% CI: 1.03-1.26, p = 0.011). Multivariable analysis showed peak cPRA-LS for HLA-A, recipient beta-blocker use, total ischemic time, donor age, prior cardiac surgery, and United Network for Organ Sharing status 1 or 2 were associated with increased severity of PGD. The presence of DSA to HLA-B was associated with a trend toward increased risk of mild-to-moderate PGD (OR 2.56, 95% CI: 0.99-6.63, p = 0.053), but DSA to other HLA loci was not associated with PGD. CONCLUSIONS: Sensitization for all HLA loci, and specifically HLA-A, is associated with an increased severity of PGD. These factors should be included in pre-HT risk stratification to minimize the risk of PGD.


Subject(s)
Heart Transplantation , Primary Graft Dysfunction , Adult , Humans , Primary Graft Dysfunction/epidemiology , Primary Graft Dysfunction/etiology , Heart Transplantation/adverse effects , HLA Antigens , Tissue Donors , Antibodies , HLA-A Antigens , Retrospective Studies
5.
Circulation ; 148(10): 822-833, 2023 09 05.
Article in English | MEDLINE | ID: mdl-37465972

ABSTRACT

BACKGROUND: Left ventricular dysfunction in potential donors meeting brain death criteria often results in nonuse of donor hearts for transplantation, yet little is known about its incidence or pathophysiology. Resolving these unknowns was a primary aim of the DHS (Donor Heart Study), a multisite prospective cohort study. METHODS: The DHS enrolled potential donors by neurologic determination of death (n=4333) at 8 organ procurement organizations across the United States between February 2015 and May 2020. Data included medications administered, serial diagnostic tests, and transthoracic echocardiograms (TTEs) performed: (1) within 48 hours after brain death was formally diagnosed; and (2) 24±6 hours later if left ventricular (LV) dysfunction was initially present. LV dysfunction was defined as an LV ejection fraction <50% and was considered reversible if LV ejection fraction was >50% on the second TTE. TTEs were also examined for presence of LV regional wall motion abnormalities and their reversibility. We assessed associations between LV dysfunction, donor heart acceptance for transplantation, and recipient 1-year survival. RESULTS: An initial TTE was interpreted for 3794 of the 4333 potential donors by neurologic determination of death. A total of 493 (13%) of these TTEs showed LV dysfunction. Among those donors with an initial TTE, LV dysfunction was associated with younger age, underweight, and higher NT-proBNP (N-terminal pro-B-type natriuretic peptide) and troponin levels. A second TTE was performed within 24±6 hours for a subset of donors (n=224) with initial LV dysfunction; within this subset, 130 (58%) demonstrated reversibility. Sixty percent of donor hearts with normal LV function were accepted for transplant compared with 56% of hearts with reversible LV dysfunction and 24% of hearts with nonreversible LV dysfunction. Donor LV dysfunction, whether reversible or not, was not associated with recipient 1-year survival. CONCLUSIONS: LV dysfunction associated with brain death occurs in many potential heart donors and is sometimes reversible. These findings can inform decisions made during donor evaluation and help guide donor heart acceptance for transplantation.


Subject(s)
Heart Transplantation , Ventricular Dysfunction, Left , Humans , Tissue Donors , Heart Transplantation/methods , Prospective Studies , Brain Death , Ventricular Function, Left
6.
Int J Cardiol ; 379: 24-32, 2023 05 15.
Article in English | MEDLINE | ID: mdl-36893856

ABSTRACT

OBJECTIVES: This study aimed to explore the impact of myocardial bridging (MB) on early development of cardiac allograft vasculopathy and long-term graft survival after heart transplantation. BACKGROUND: MB has been reported to be associated with acceleration of proximal plaque development and endothelial dysfunction in native coronary atherosclerosis. However, its clinical significance in heart transplantation remains unclear. METHODS: In 103 heart-transplant recipients, serial (baseline and 1-year post-transplant) volumetric intravascular ultrasound (IVUS) analyses were performed in the first 50 mm of the left anterior descending (LAD) artery. Standard IVUS indices were evaluated in 3 equally divided LAD segments (proximal, middle, and distal segments). MB was defined by IVUS as an echolucent muscular band lying on top of the artery. The primary endpoint was death or re-transplantation, assessed for up to 12.2 years (median follow-up: 4.7 years). RESULTS: IVUS identified MB in 62% of the study population. At baseline, MB patients had smaller intimal volume in the distal LAD than non-MB patients (p = 0.002). During the first year, vessel volume decreased diffusely irrespective of the presence of MB. Intimal growth was diffusely distributed in non-MB patients, whereas MB patients demonstrated significantly augmented intimal formation in the proximal LAD. Kaplan-Meier analysis revealed significantly lower event-free survival in patients with versus without MB (log-rank p = 0.02). In multivariate analysis, the presence of MB was independently associated with late adverse events [hazard ratio 5.1 (1.6-22.2)]. CONCLUSION: MB appears to be associated with accelerated proximal intimal growth and reduced long-term survival in heart-transplant recipients.


Subject(s)
Coronary Artery Disease , Heart Transplantation , Myocardial Bridging , Humans , Coronary Angiography , Coronary Vessels/diagnostic imaging , Coronary Vessels/surgery , Ultrasonography, Interventional , Coronary Artery Disease/diagnostic imaging , Coronary Artery Disease/surgery , Coronary Artery Disease/complications , Heart Transplantation/adverse effects
7.
Transplantation ; 107(7): 1624-1629, 2023 07 01.
Article in English | MEDLINE | ID: mdl-36801852

ABSTRACT

BACKGROUND: We investigated associations between primary graft dysfunction (PGD) and development of acute cellular rejection (ACR), de novo donor-specific antibodies (DSAs), and cardiac allograft vasculopathy (CAV) after heart transplantation (HT). METHODS: A total of 381 consecutive adult HT patients from January 2015 to July 2020 at a single center were retrospectively analyzed. The primary outcome was incidence of treated ACR (International Society for Heart and Lung Transplantation grade 2R or 3R) and de novo DSA (mean fluorescence intensity >500) within 1 y post-HT. Secondary outcomes included median gene expression profiling score and donor-derived cell-free DNA level within 1 y and incidence of CAV within 3 y post-HT. RESULTS: When adjusted for death as a competing risk, the estimated cumulative incidence of ACR (PGD 0.13 versus no PGD 0.21; P = 0.28), median gene expression profiling score (30 [interquartile range, 25-32] versus 30 [interquartile range, 25-33]; P = 0.34), and median donor-derived cell-free DNA levels were similar in patients with and without PGD. After adjusting for death as a competing risk, estimated cumulative incidence of de novo DSA within 1 y post-HT in patients with PGD was similar to those without PGD (0.29 versus 0.26; P = 0.10) with a similar DSA profile based on HLA loci. There was increased incidence of CAV in patients with PGD compared with patients without PGD (52.6% versus 24.8%; P = 0.01) within the first 3 y post-HT. CONCLUSIONS: During the first year after HT, patients with PGD had a similar incidence of ACR and development of de novo DSA, but a higher incidence of CAV when compared with patients without PGD.
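
The analysis above estimates cumulative incidence while treating death as a competing risk. A minimal sketch of that idea, using the Aalen-Johansen estimator from the lifelines library with invented event coding and data, is shown below.

```python
import pandas as pd
from lifelines import AalenJohansenFitter

# Hypothetical follow-up data: event 1 = treated ACR, event 2 = death (competing risk),
# 0 = censored at end of follow-up.
df = pd.DataFrame({
    "months": [3, 6, 12, 12, 9, 12, 4, 12],
    "event":  [1, 2, 0, 0, 1, 0, 2, 0],
})

# Aalen-Johansen cumulative incidence of ACR, treating death as a competing event
# rather than censoring it (simple censoring would overestimate ACR incidence).
ajf = AalenJohansenFitter()
ajf.fit(durations=df["months"], event_observed=df["event"], event_of_interest=1)
print(ajf.cumulative_density_)
```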


Subject(s)
Heart Diseases , Heart Transplantation , Primary Graft Dysfunction , Adult , Humans , Retrospective Studies , Primary Graft Dysfunction/diagnosis , Primary Graft Dysfunction/epidemiology , Primary Graft Dysfunction/etiology , HLA Antigens , Heart Transplantation/adverse effects , Graft Rejection/diagnosis , Graft Rejection/epidemiology , Allografts
8.
Am J Transplant ; 23(4): 559-564, 2023 04.
Article in English | MEDLINE | ID: mdl-36732088

ABSTRACT

The development of donor-specific antibodies after lung transplantation is associated with downstream acute cellular rejection, antibody-mediated rejection (AMR), chronic lung allograft dysfunction (CLAD), or death. It is unknown whether preemptive (early) treatment of de novo donor-specific antibodies (dnDSAs), in the absence of clinical signs and symptoms of allograft dysfunction, reduces the risk of subsequent CLAD or death. We performed a multicenter, retrospective cohort study to determine if early treatment of dnDSAs in lung transplant patients reduces the risk of the composite endpoint of CLAD or death. In the cohort of 445 patients, 145 patients developed dnDSAs posttransplant. Thirty patients received early targeted treatment for dnDSAs in the absence of clinical signs and symptoms of AMR. Early treatment of dnDSAs was associated with a decreased risk of CLAD or death (hazard ratio, 0.36; 95% confidence interval, 0.17-0.76; P < .01). Deferring treatment until the development of clinical AMR was associated with an increased risk of CLAD or death (hazard ratio, 3.00; 95% confidence interval, 1.46-6.18; P < .01). This study suggests that early, preemptive treatment of donor-specific antibodies in lung transplant patients may reduce the subsequent risk of CLAD or death.


Subject(s)
Lung Transplantation , Lung , Humans , Retrospective Studies , Antibodies , Lung Transplantation/adverse effects , Allografts , Graft Rejection/etiology , Graft Rejection/prevention & control , Graft Rejection/diagnosis
9.
J Heart Lung Transplant ; 42(2): 226-235, 2023 02.
Article in English | MEDLINE | ID: mdl-36319530

ABSTRACT

BACKGROUND: Pulmonary antibody-mediated rejection (AMR) consensus criteria categorize AMR by diagnostic certainty. This study aims to define the clinical features and associated outcomes of these recently defined AMR categories. METHODS: Adjudication committees reviewed clinical data of 335 lung transplant recipients to define clinical or subclinical AMR based on the presence or absence of allograft dysfunction. The primary endpoint was time from transplant to allograft failure, a composite of chronic lung allograft dysfunction and/or death. Clinical AMR was subcategorized based on diagnostic certainty as definite, probable, or possible AMR if 4, 3, or 2 characteristic features were present, respectively. Allograft injury was assessed via plasma donor-derived cell-free DNA (ddcfDNA). Risk of allograft failure and allograft injury was compared across AMR categories using regression models. RESULTS: Over the 38.5-month follow-up, 28.7% of subjects developed clinical AMR (n = 96), 18.5% developed subclinical AMR (n = 62), and 58.3% had no AMR (n = 177). Clinical AMR was associated with a higher risk of allograft failure and higher ddcfDNA levels compared with subclinical or no AMR. Clinical AMR included definite/probable (n = 21) or possible AMR (n = 75). These subcategories showed similar clinical characteristics, ddcfDNA levels, and risk of allograft failure. However, definite/probable AMR showed greater measures of AMR severity, including degree of allograft dysfunction and risk of death, compared to possible AMR. CONCLUSIONS: Clinical AMR showed greater risk of allograft failure than subclinical AMR or no AMR. Subcategorization of clinical AMR based on diagnostic certainty correlated with AMR severity and risk of death, but not with the risk of allograft failure.
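
As a toy illustration of the certainty grading described above (definite, probable, or possible clinical AMR when 4, 3, or 2 characteristic features are present), here is a small helper. The example feature count is hypothetical; this is not a restatement of the consensus criteria themselves.

```python
def amr_certainty(n_features: int) -> str:
    """Map the number of characteristic AMR features to a certainty category,
    following the 4/3/2 scheme described in the abstract."""
    if n_features >= 4:
        return "definite"
    if n_features == 3:
        return "probable"
    if n_features == 2:
        return "possible"
    return "not classifiable as clinical AMR"

# Example: a recipient with 3 characteristic features present.
print(amr_certainty(3))  # -> "probable"
```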


Subject(s)
Antibodies , Lung Transplantation , Humans , Transplantation, Homologous , Lung , Allografts , Graft Rejection/diagnosis
10.
Clin Transplant ; 37(3): e14699, 2023 03.
Article in English | MEDLINE | ID: mdl-35559582

ABSTRACT

BACKGROUND: Donor-derived cell-free DNA (dd-cfDNA) and gene expression profiling (GEP) offer noninvasive alternatives to rejection surveillance after heart transplantation; however, there is little evidence on the paired use of GEP and dd-cfDNA for rejection surveillance. METHODS: A single-center, retrospective analysis of adult heart transplant recipients. A GEP cohort, transplanted from January 1, 2015, through December 31, 2017, and eligible for rejection surveillance with GEP, was compared to a paired testing cohort, transplanted from July 1, 2018, through June 30, 2020, with surveillance using both dd-cfDNA and GEP. The primary outcomes were survival and rejection-free survival at 1 year post-transplant. RESULTS: In total, 159 patients were included: 95 in the GEP group and 64 in the paired testing group. There were no differences in baseline characteristics, except for less use of induction in the paired testing group (65.6%) compared to the GEP group (98.9%), P < .01. At 1 year, there were no differences between the paired testing and GEP groups in survival (98.4% vs. 94.7%, P = .23) or rejection-free survival (81.3% vs. 73.7%, P = .28). CONCLUSIONS: Compared to post-transplant rejection surveillance with GEP alone, pairing dd-cfDNA and GEP testing was associated with similar survival and rejection-free survival at 1 year while requiring significantly fewer biopsies.


Subject(s)
Cell-Free Nucleic Acids , Heart Transplantation , Adult , Humans , Retrospective Studies , Cell-Free Nucleic Acids/genetics , Heart Transplantation/adverse effects , Gene Expression Profiling , Tissue Donors
11.
Am J Transplant ; 22(7): 1760-1765, 2022 07.
Article in English | MEDLINE | ID: mdl-35373509

ABSTRACT

Solid organ transplantation continues to be constrained by a lack of suitable donor organs. Advances in donor management and evaluation are needed to address this shortage, but the performance of research studies in deceased donors is fraught with challenges. Here we discuss several of the major obstacles we faced in the conduct of the Donor Heart Study-a prospective, multi-site, observational study of donor management, evaluation, and acceptance for heart transplantation. These included recruitment and engagement of participating organ procurement organizations, ambiguities related to study oversight, obtaining authorization for donor research, logistical challenges encountered during donor management, sustaining study momentum, and challenges related to study data management. By highlighting these obstacles encountered, as well as the solutions implemented, we hope to stimulate further discussion and actions that will facilitate the design and execution of future donor research studies.


Subject(s)
Heart Transplantation , Organ Transplantation , Tissue and Organ Procurement , Humans , Prospective Studies , Tissue Donors
12.
Am J Transplant ; 22(10): 2451-2457, 2022 10.
Article in English | MEDLINE | ID: mdl-35322546

ABSTRACT

Plasma donor-derived cell-free DNA (dd-cfDNA) is a sensitive biomarker for the diagnosis of acute rejection in lung transplant recipients; however, differences in dd-cfDNA levels between single and double lung transplant remain unknown. We performed an observational analysis that included 221 patients from two prospective cohort studies who had serial measurements of plasma dd-cfDNA at the time of bronchoscopy and pulmonary function testing, and compared dd-cfDNA between single and double lung transplant recipients across a range of disease states. Levels of dd-cfDNA were lower for single vs. double lung transplant in stable controls (median [IQR]: 0.15% [0.07, 0.44] vs. 0.46% [0.23, 0.74], p < .01) and acute rejection (1.06% [0.75, 2.32] vs. 1.78% [1.18, 5.73], p = .05). Doubling dd-cfDNA for single lung transplant to account for differences in lung mass eliminated this difference. The area under the receiver operating characteristic curve (AUC) for the detection of acute rejection was 0.89 and 0.86 for single and double lung transplant, respectively. The optimal dd-cfDNA threshold for the detection of acute rejection was 0.54% in single lung and 1.1% in double lung transplant. In conclusion, accounting for differences in dd-cfDNA in single versus double lung transplant is key for the interpretation of dd-cfDNA testing in research and clinical settings.
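
A minimal sketch of the adjustment described above: doubling the measured dd-cfDNA fraction for single-lung recipients to account for roughly half the transplanted lung mass, and comparing values against the transplant-type-specific thresholds reported (0.54% single, 1.1% double). The functions and the example values are illustrative assumptions, not a validated clinical rule.

```python
def flag_possible_rejection(dd_cfdna_pct: float, transplant_type: str) -> bool:
    """Return True if dd-cfDNA meets or exceeds the type-specific threshold
    reported in the abstract. transplant_type: "single" or "double"."""
    thresholds = {"single": 0.54, "double": 1.1}
    return dd_cfdna_pct >= thresholds[transplant_type]

def mass_adjusted(dd_cfdna_pct: float, transplant_type: str) -> float:
    """Double the raw fraction for single-lung recipients to approximate
    what the same degree of injury would yield in a double-lung recipient."""
    return dd_cfdna_pct * 2 if transplant_type == "single" else dd_cfdna_pct

print(flag_possible_rejection(0.7, "single"))  # True: above the 0.54% single-lung threshold
print(mass_adjusted(0.7, "single"))            # 1.4: comparable on the double-lung scale
```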


Subject(s)
Cell-Free Nucleic Acids , Biomarkers , Graft Rejection/diagnosis , Graft Rejection/etiology , Humans , Lung , Prospective Studies , Tissue Donors , Transplant Recipients
13.
J Am Coll Cardiol ; 78(24): 2425-2435, 2021 12 14.
Article in English | MEDLINE | ID: mdl-34886963

ABSTRACT

BACKGROUND: Single-center data suggest that the index of microcirculatory resistance (IMR) measured early after heart transplantation predicts subsequent acute rejection. OBJECTIVES: The goal of this study was to validate whether IMR measured early after transplantation can predict subsequent acute rejection and long-term outcome in a large multicenter cohort. METHODS: From 5 international cohorts, 237 patients who underwent IMR measurement early after transplantation were enrolled. The primary outcome was acute allograft rejection (AAR) within 1 year after transplantation. A key secondary outcome was major adverse cardiac events (MACE) (the composite of death, re-transplantation, myocardial infarction, stroke, graft dysfunction, and readmission) at 10 years. RESULTS: IMR was measured at a median of 7 weeks (interquartile range: 3-10 weeks) post-transplantation. At 1 year, the incidence of AAR was 14.4%. IMR was associated proportionally with the risk of AAR (per 1-U increase in IMR; adjusted hazard ratio [aHR]: 1.04; 95% confidence interval [CI]: 1.02-1.06; p < 0.001). The incidence of AAR in patients with an IMR ≥18 was 23.8%, whereas the incidence of AAR in those with an IMR <18 was 6.3% (aHR: 3.93; 95% CI: 1.77-8.73; p = 0.001). At 10 years, MACE occurred in 86 (36.3%) patients. IMR was significantly associated with the risk of MACE (per 1-U increase in IMR; aHR: 1.02; 95% CI: 1.01-1.04; p = 0.005). CONCLUSIONS: IMR measured early after heart transplantation is associated with subsequent AAR at 1 year and clinical events at 10 years. Early IMR measurement after transplantation identifies patients at higher risk and may guide personalized posttransplantation management.
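
A sketch of the kind of time-to-event model reported above (hazard of acute allograft rejection per 1-U increase in IMR), using the lifelines CoxPHFitter. The data frame is hypothetical, and a real analysis would adjust for clinical covariates as the study did.

```python
import pandas as pd
from lifelines import CoxPHFitter

# Hypothetical per-patient data: early post-transplant IMR, follow-up time (days),
# and whether acute allograft rejection (AAR) occurred within the first year.
df = pd.DataFrame({
    "imr":  [10, 14, 19, 25, 8, 30, 12, 22],
    "days": [365, 200, 365, 90, 365, 60, 365, 150],
    "aar":  [0, 1, 0, 1, 0, 1, 0, 1],
})

# Fit a Cox proportional hazards model; the exponentiated coefficient for "imr"
# is the hazard ratio per 1-unit increase in IMR (analogous to the aHR of 1.04 above).
cph = CoxPHFitter()
cph.fit(df, duration_col="days", event_col="aar")
cph.print_summary()
```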


Subject(s)
Coronary Circulation/physiology , Graft Rejection/physiopathology , Heart Transplantation/adverse effects , Microcirculation/physiology , Vascular Resistance/physiology , Allografts , Coronary Angiography , Female , Follow-Up Studies , Graft Rejection/diagnosis , Humans , Male , Middle Aged , Predictive Value of Tests , Prospective Studies , Time Factors
14.
Eur Heart J ; 42(48): 4918-4929, 2021 12 21.
Article in English | MEDLINE | ID: mdl-34665224

ABSTRACT

AIMS: We evaluated the long-term prognostic value of invasively assessing coronary physiology after heart transplantation in a large multicentre registry. METHODS AND RESULTS: Comprehensive intracoronary physiology assessment measuring fractional flow reserve (FFR), the index of microcirculatory resistance (IMR), and coronary flow reserve (CFR) was performed in 254 patients at baseline (a median of 7.2 weeks post-transplant) and in 240 patients at 1 year after transplantation (199 patients had both baseline and 1-year measurements). Patients were classified into those with normal physiology, reduced FFR (FFR ≤ 0.80), and microvascular dysfunction (either IMR ≥ 25 or CFR ≤ 2.0 with FFR > 0.80). The primary outcome was the composite of death or re-transplantation at 10 years. At baseline, 5.5% had reduced FFR; 36.6% had microvascular dysfunction. Baseline reduced FFR [adjusted hazard ratio (aHR) 2.33, 95% confidence interval (CI) 0.88-6.15; P = 0.088] and microvascular dysfunction (aHR 0.88, 95% CI 0.44-1.79; P = 0.73) were not predictors of death or re-transplantation at 10 years. At 1 year, 5.0% had reduced FFR; 23.8% had microvascular dysfunction. One-year reduced FFR (aHR 2.98, 95% CI 1.13-7.87; P = 0.028) and microvascular dysfunction (aHR 2.33, 95% CI 1.19-4.59; P = 0.015) were associated with significantly increased risk of death or re-transplantation at 10 years. Invasive measures of coronary physiology improved the prognostic performance of clinical variables (χ2 improvement: 7.41, P = 0.006). However, intravascular ultrasound-derived changes in maximal intimal thickness were not predictive of outcomes. CONCLUSION: Abnormal coronary physiology 1 year after heart transplantation was common and was a significant predictor of death or re-transplantation at 10 years.
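
The classification used above (reduced FFR, microvascular dysfunction, or normal physiology) reduces to a simple rule; a sketch follows, with the thresholds taken from the abstract and the example values assumed.

```python
def classify_physiology(ffr: float, imr: float, cfr: float) -> str:
    """Classify post-transplant coronary physiology per the thresholds in the abstract:
    reduced FFR if FFR <= 0.80; otherwise microvascular dysfunction if IMR >= 25
    or CFR <= 2.0; otherwise normal physiology."""
    if ffr <= 0.80:
        return "reduced FFR"
    if imr >= 25 or cfr <= 2.0:
        return "microvascular dysfunction"
    return "normal physiology"

print(classify_physiology(ffr=0.92, imr=28, cfr=2.6))  # -> "microvascular dysfunction"
```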


Subject(s)
Coronary Stenosis , Fractional Flow Reserve, Myocardial , Heart Transplantation , Cardiac Catheterization , Coronary Angiography , Coronary Vessels/diagnostic imaging , Coronary Vessels/surgery , Humans , Microcirculation , Predictive Value of Tests , Prognosis
15.
J Heart Lung Transplant ; 40(6): 488-493, 2021 06.
Article in English | MEDLINE | ID: mdl-33814284

ABSTRACT

BACKGROUND: Primary graft dysfunction (PGD) is a risk factor for chronic lung allograft dysfunction (CLAD). However, the association between PGD and degree of allograft injury remains poorly defined. In this study, we leverage a novel biomarker for allograft injury, percentage donor-derived cell-free DNA (%ddcfDNA), to study the association between PGD, degree of allograft injury, and the development of CLAD. METHODS: This prospective cohort study recruited 99 lung transplant recipients and collected plasma samples on days 1, 3, and 7 for %ddcfDNA measurements. Clinical data on day 3 were used to adjudicate for PGD. %ddcfDNA levels were compared between PGD grades. In PGD patients, %ddcfDNA was compared between those who developed CLAD and those who did not. RESULTS: On posttransplant day 3, %ddcfDNA was higher in PGD than in non-PGD patients (median [IQR]: 12.2% [8.2, 22.0] vs 8.5% [5.6, 13.2], p = 0.01). %ddcfDNA correlated with the severity grade of PGD (r = 0.24, p = 0.02). Within the PGD group, higher levels of %ddcfDNA correlated with increased risk of developing CLAD (log OR [SE]: 1.38 [0.53], p = 0.009). PGD patients who developed CLAD showed ∼2-fold higher %ddcfDNA levels than patients who did not develop CLAD (median [IQR]: 22.4% [11.8, 27.6] vs 9.9% [6.7, 14.9], p = 0.007). CONCLUSION: PGD patients demonstrated increased early posttransplant allograft injury, as measured by %ddcfDNA, in comparison to non-PGD patients, and these high %ddcfDNA levels were associated with subsequent development of CLAD. This study suggests that %ddcfDNA identifies PGD patients at greater risk of CLAD than does PGD status alone.
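
To illustrate the reported correlation between %ddcfDNA and PGD severity grade (r = 0.24), a rank-correlation sketch is shown below. The abstract does not state which correlation coefficient was used, so Spearman is an assumption, as are the example data.

```python
from scipy.stats import spearmanr

# Hypothetical day-3 values: PGD grade (0-3) and %ddcfDNA for a handful of recipients.
pgd_grade = [0, 0, 1, 1, 2, 2, 3, 3, 0, 2]
ddcfdna_pct = [6.1, 9.0, 8.3, 12.5, 10.9, 15.4, 22.1, 18.7, 7.2, 13.8]

rho, p_value = spearmanr(pgd_grade, ddcfdna_pct)
print(f"Spearman rho = {rho:.2f}, p = {p_value:.3f}")
```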


Subject(s)
Cell-Free Nucleic Acids/blood , Graft Rejection/blood , Lung Transplantation/adverse effects , Primary Graft Dysfunction/blood , Tissue Donors , Transplant Recipients , Adult , Allografts , Biomarkers/blood , Female , Follow-Up Studies , Graft Rejection/etiology , Humans , Male , Middle Aged , Primary Graft Dysfunction/complications , Prospective Studies , Time Factors
16.
ERJ Open Res ; 7(1)2021 Jan.
Article in English | MEDLINE | ID: mdl-33532456

ABSTRACT

Surveillance after lung transplantation is critical to the detection of acute cellular rejection (ACR) and prevention of chronic lung allograft dysfunction (CLAD). Therefore, we measured donor-derived cell-free DNA (dd-cfDNA) using a clinical-grade, next-generation targeted sequencing assay in 107 plasma samples from 38 unique lung transplantation recipients, with diagnostic cohorts classified as: (1) biopsy-confirmed or treated ACR, (2) antibody-mediated rejection (AMR), (3) obstructive CLAD, (4) allograft infection (INFXN) and (5) stable healthy allografts (STABLE). Our principal findings are as follows: (1) dd-cfDNA level was elevated in ACR (median 0.91%; interquartile range (IQR): 0.39-2.07%), CLAD (2.06%; IQR: 0.57-3.67%) and an aggregated cohort of rejection encompassing allograft injury (1.06%; IQR: 0.38-2.51%), compared with the STABLE cohort (0.38%; IQR: 0.23-0.87%) (p=0.02); (2) dd-cfDNA level with AMR was elevated (1.34%; IQR: 0.34-2.40%) compared to STABLE, although it did not reach statistical significance (p=0.07) due to limitations in sample size; (3) there was no difference in dd-cfDNA for allograft INFXN (0.39%; IQR: 0.18-0.67%) versus STABLE, which may relate to differences in "tissue injury" across the spectrum of bronchial colonisation versus invasive infection; (4) there was no difference in dd-cfDNA for unilateral versus bilateral lung transplantation; (5) the "optimal threshold" for dd-cfDNA for aggregated rejection events representing allograft injury was determined as 0.85%, with sensitivity=55.6%, specificity=75.8%, positive predictive value (PPV)=43.3% and negative predictive value (NPV)=83.6%. Measurement of plasma dd-cfDNA may be a clinically useful tool for the assessment of lung allograft health and surveillance for "tissue injury" across a spectrum of rejection.
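
A minimal sketch of how an "optimal threshold" such as the 0.85% reported above can be derived from labeled dd-cfDNA values (here, by maximizing Youden's J on an ROC curve) and how sensitivity, specificity, PPV, and NPV follow from the resulting 2x2 table. The selection rule and data are assumptions for illustration, not the assay's validated cut-off or the study's method.

```python
import numpy as np
from sklearn.metrics import roc_curve

# Hypothetical dd-cfDNA (%) with labels: 1 = rejection/allograft injury, 0 = stable.
dd_cfdna = np.array([0.2, 0.3, 0.4, 0.5, 0.6, 0.9, 1.1, 1.5, 2.1, 3.0])
injury   = np.array([0,   0,   0,   0,   1,   0,   1,   1,   0,   1])

# ROC curve and Youden's J (tpr - fpr) to pick a cut-off.
fpr, tpr, thresholds = roc_curve(injury, dd_cfdna)
cutoff = thresholds[np.argmax(tpr - fpr)]

# 2x2 table at the chosen cut-off, then the four test characteristics.
pred = dd_cfdna >= cutoff
tp = np.sum(pred & (injury == 1)); fp = np.sum(pred & (injury == 0))
fn = np.sum(~pred & (injury == 1)); tn = np.sum(~pred & (injury == 0))

print(f"cut-off = {cutoff:.2f}%")
print(f"sensitivity = {tp / (tp + fn):.2f}, specificity = {tn / (tn + fp):.2f}")
print(f"PPV = {tp / (tp + fp):.2f}, NPV = {tn / (tn + fn):.2f}")
```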

17.
Transplantation ; 104(10): e284-e294, 2020 10.
Article in English | MEDLINE | ID: mdl-32413012

ABSTRACT

BACKGROUND: Heart transplantation is a life-saving procedure that has seen improvements in transplant and patient outcomes due to advances in immunosuppression and prevention of posttransplantation infectious episodes (IEps). This study systematically evaluates IEps in the modern era of heart transplantation at Stanford University Medical Center. METHODS: This is a single-center retrospective review that includes 279 consecutive adult heart transplantation recipients from January 2008 to September 2017. Baseline demographic, clinical, serological, and outcome information was collected. The Kaplan-Meier estimator was used to assess survival stratified by IEp occurrence within the first year. RESULTS: A total of 600 IEps occurred in 279 patients (2.15 IEps per patient) during a median follow-up period of 3 years. Overall survival was 83.3% (95% confidence interval [CI], 76.2-88.4) at 1 year posttransplantation for those with any IEp compared with 93.0% (95% CI, 87.2-96.4) in those without IEp (P = 0.07). Bacterial IEps were the most common (n = 375; 62.5%), followed by viral (n = 180; 30.0%), fungal (n = 40; 6.7%), and parasitic (n = 5; 0.8%). IEps by Gram-negative bacteria (n = 210) outnumbered those by Gram-positive bacteria (n = 142). Compared with prior studies from our center, there was a decreased proportion of viral (including cytomegalovirus), fungal (including Aspergillus spp. and non-Aspergillus spp. molds), and Nocardia infections. There were no IEps due to Mycobacterium tuberculosis, Pneumocystis jirovecii, or Toxoplasma gondii. CONCLUSIONS: A significant reduction in viral, fungal, and Nocardia IEps after heart transplantation was observed, most likely due to advancements in immunosuppression and preventive strategies, including pretransplant infectious diseases screening and antimicrobial prophylaxis.


Subject(s)
Bacterial Infections/epidemiology , Heart Transplantation/adverse effects , Mycoses/epidemiology , Opportunistic Infections/epidemiology , Virus Diseases/epidemiology , Adult , Anti-Bacterial Agents/administration & dosage , Antibiotic Prophylaxis , Antifungal Agents/administration & dosage , Antiviral Agents/administration & dosage , Bacterial Infections/mortality , Bacterial Infections/prevention & control , California/epidemiology , Female , Heart Transplantation/mortality , Humans , Immunocompromised Host , Immunosuppressive Agents/adverse effects , Male , Middle Aged , Mycoses/mortality , Mycoses/prevention & control , Opportunistic Infections/mortality , Opportunistic Infections/prevention & control , Protective Factors , Retrospective Studies , Risk Assessment , Risk Factors , Time Factors , Treatment Outcome , Virus Diseases/mortality , Virus Diseases/prevention & control
18.
Am Heart J ; 222: 30-37, 2020 04.
Article in English | MEDLINE | ID: mdl-32007823

ABSTRACT

BACKGROUND: The safety and efficacy of angiotensin converting enzyme inhibition (ACEI) after heart transplantation (HT) is unknown. This study examined long-term clinical outcomes after ACEI in HT recipients. METHODS: The ACEI after HT study was a prospective, randomized trial that tested the efficacy of ACEI with ramipril after HT. In this study, long-term clinical outcomes were assessed in 91 patients randomized to either ramipril or placebo (median follow-up, 5.8 years). The primary endpoint was a composite of death, retransplantation, hospitalization for rejection or heart failure, and coronary revascularization. RESULTS: The primary endpoint occurred in 10 of 45 patients (22.2%) in the ramipril group and in 14 of 46 patients (30.4%) in the placebo group (hazard ratio [HR], 0.68; 95% CI, 0.29-1.51; P = .34). When the analysis was restricted to comparing patients who remained on a renin-angiotensin system inhibitor beyond 1 year with those who did not, there was a trend to improved outcomes (HR, 0.54; 95% CI, 0.22-1.28; P = .16). There was no significant difference in creatinine, blood urea nitrogen, and potassium at 3 years after randomization. The cumulative incidence of the primary endpoint was significantly higher in patients in whom the index of microcirculatory resistance increased from baseline to 1 year compared with those in whom it did not (39.1 vs 17.4%; HR, 3.36; 95% CI, 1.07-12.7; P = .037). CONCLUSION: The use of ramipril after HT safely lowers blood pressure and is associated with favorable long-term clinical outcomes. Clinical Trial Registration-URL: https://www.clinicaltrials.gov. Unique identifier: NCT01078363.


Subject(s)
Graft Rejection/prevention & control , Heart Failure/surgery , Heart Transplantation/adverse effects , Ramipril/therapeutic use , Angiotensin-Converting Enzyme Inhibitors/therapeutic use , Blood Pressure/drug effects , Double-Blind Method , Female , Follow-Up Studies , Graft Rejection/physiopathology , Humans , Male , Microcirculation/drug effects , Middle Aged , Postoperative Period , Prospective Studies , Time Factors , Treatment Outcome
19.
Int J Cardiol ; 290: 27-32, 2019 09 01.
Article in English | MEDLINE | ID: mdl-30987835

ABSTRACT

BACKGROUND: Acute allograft rejection (AAR) plays an important role in patient and graft survival; therefore, more emphasis should be placed on its prediction. This study aimed to investigate baseline clinical and diagnostic variables associated with subsequent AAR during the first year post-transplant, especially focusing on early physiologic and anatomic measures. METHODS: This study enrolled 88 heart transplant patients who underwent fractional flow reserve (FFR), coronary flow reserve (CFR), the index of microcirculatory resistance (IMR) and intravascular ultrasound (IVUS) in the left anterior descending artery at baseline (within 8 weeks post-transplant). Cardiac index (CI), pulmonary capillary wedge pressure (PCWP), mean pulmonary artery pressure (mPAP), right atrial pressure and left ventricular ejection fraction were also evaluated. AAR was defined as acute cellular rejection of grade ≥2R and/or pathological antibody-mediated rejection of grade ≥pAMR2. RESULTS: During the first year post-transplant, 25.0% of patients experienced AAR. Compared with those without AAR, patients with AAR during the first year showed higher rates of recipient obesity, lower rates of recipient-donor sex mismatch and of rATG and tacrolimus use, higher PCWP, mPAP, and IMR, and lower CFR at baseline. In the multivariate analysis, only baseline IMR ≥ 16.0 was independently associated with AAR during the first year, demonstrating a high negative predictive value (96.7%). CONCLUSIONS: Invasively assessed microvascular resistance (baseline IMR ≥ 16.0) in the early post-transplant period was an independent determinant of subsequent acute allograft rejection during the first year post-transplant, suggesting that early assessment of IMR may enhance patient risk stratification and target medical therapies to improve patient outcome.


Subject(s)
Coronary Angiography/methods , Coronary Circulation/physiology , Graft Rejection/diagnostic imaging , Heart Transplantation/trends , Microcirculation/physiology , Adult , Aged , Early Diagnosis , Female , Follow-Up Studies , Graft Rejection/drug therapy , Graft Rejection/physiopathology , Humans , Immunosuppressive Agents/therapeutic use , Male , Middle Aged , Predictive Value of Tests , Prospective Studies , Retrospective Studies , Stroke Volume/physiology
20.
EBioMedicine ; 40: 541-553, 2019 Feb.
Article in English | MEDLINE | ID: mdl-30692045

ABSTRACT

BACKGROUND: Allograft failure is common in lung-transplant recipients and leads to poor outcomes including early death. No reliable clinical tools exist to identify patients at high risk for allograft failure. This study tested the use of donor-derived cell-free DNA (%ddcfDNA) as a sensitive marker of early graft injury to predict impending allograft failure. METHODS: This multicenter, prospective cohort study enrolled 106 subjects who underwent lung transplantation and monitored them after transplantation for the development of allograft failure (defined as severe chronic lung allograft dysfunction [CLAD], retransplantation, and/or death from respiratory failure). Plasma samples were collected serially in the first three months following transplantation and assayed for %ddcfDNA by shotgun sequencing. We computed the average levels of ddcfDNA over three months for each patient (avddDNA) and determined its relationship to allograft failure using Cox-regression analysis. FINDINGS: avddDNA was highly variable among subjects: median values were 3·6%, 1·6% and 0·7% for the upper, middle, and low tertiles, respectively (range 0·1%-9·9%). Compared to subjects in the low and middle tertiles, those with avddDNA in the upper tertile had a 6·6-fold higher risk of developing allograft failure (95% confidence interval 1·6-19·9, p = 0·007), lower peak FEV1 values, and more frequent %ddcfDNA elevations that were not clinically detectable. INTERPRETATION: Lung transplant patients with early unresolving allograft injury measured via %ddcfDNA are at risk of subsequent allograft injury, which is often clinically silent, and progresses to allograft failure. FUND: National Institutes of Health.
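
A sketch of the avddDNA derivation described above: averaging each patient's %ddcfDNA over the first three months of samples and then splitting patients into tertiles. The column names and the pandas calls are assumptions, not the study's code.

```python
import pandas as pd

# Hypothetical serial %ddcfDNA samples collected during the first 3 months post-transplant.
samples = pd.DataFrame({
    "patient_id":  [1, 1, 1, 2, 2, 3, 3, 4, 4, 5, 6],
    "ddcfdna_pct": [4.0, 3.5, 3.2, 1.8, 1.4, 0.6, 0.9, 2.0, 1.1, 0.3, 5.5],
})

# avddDNA: each patient's mean %ddcfDNA over the early post-transplant samples.
avdddna = samples.groupby("patient_id")["ddcfdna_pct"].mean().rename("avddDNA")

# Split patients into low / middle / upper tertiles of avddDNA, as in the abstract.
tertile = pd.qcut(avdddna, q=3, labels=["low", "middle", "upper"])
print(pd.concat([avdddna, tertile.rename("tertile")], axis=1))
```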


Subject(s)
Biomarkers , Cell-Free Nucleic Acids , Graft Rejection , Lung Transplantation/adverse effects , Lung Transplantation/mortality , Tissue Donors , Aged , Allografts , Comorbidity , Female , Graft Rejection/immunology , Humans , Kaplan-Meier Estimate , Male , Middle Aged , Prognosis , Proportional Hazards Models , Prospective Studies , Risk Factors , Sequence Analysis, DNA , Time Factors , Transplantation, Homologous