Results 1 - 20 of 182
1.
Article in English | MEDLINE | ID: mdl-38950666

ABSTRACT

BACKGROUND: Prior studies have shown reduced development of cardiac allograft vasculopathy (CAV) in multi-organ transplant recipients. The aim of this study was to compare the incidence of CAV between isolated heart transplants and simultaneous multi-organ heart transplants in the contemporary era. METHODS: We utilized the Scientific Registry of Transplant Recipients to perform a retrospective analysis of first-time adult heart transplant recipients between January 1, 2010 and December 31, 2019 in the United States. The primary endpoint was the development of angiographic CAV within 5 years of follow-up. RESULTS: Among 20,591 patients included in the analysis, 1,279 (6%) underwent multi-organ heart transplantation (70% heart-kidney, 16% heart-liver, 13% heart-lung, and 1% triple-organ) and 19,312 (94%) were isolated heart transplant recipients. The average age was 53 years and 74% were male. There were no significant differences in cold ischemic time between the groups. The incidence of acute rejection during the first year after transplant was significantly lower in the multi-organ group (18% vs. 33%, p<0.01). The 5-year incidence of CAV was 33% in the isolated heart group and 27% in the multi-organ group (p<0.0001); differences in CAV incidence were seen as early as 1 year after transplant and persisted over time. In multivariable analysis, multi-organ heart transplant recipients had a significantly lower likelihood of CAV at 5 years (hazard ratio=0.76, 95% confidence interval: 0.66-0.88, p<0.01). CONCLUSIONS: Simultaneous multi-organ heart transplantation is associated with a significantly lower long-term risk of angiographic CAV compared with isolated heart transplantation in the contemporary era.
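
A multivariable Cox model of the kind reported here can be sketched with the lifelines library. This is a minimal illustration only: the column names, covariates, and toy data below are assumptions, not the registry's actual schema or results.

    import pandas as pd
    from lifelines import CoxPHFitter

    # Hypothetical analysis frame: one row per recipient (illustrative values).
    df = pd.DataFrame({
        "years_to_cav_or_censor": [5.0, 2.1, 3.4, 1.3, 5.0, 4.2],
        "cav_event":              [0,   1,   1,   1,   0,   0],
        "multi_organ":            [1,   0,   1,   0,   0,   1],
        "recipient_age":          [57,  49,  61,  44,  52,  58],
    })

    cph = CoxPHFitter()
    cph.fit(df, duration_col="years_to_cav_or_censor", event_col="cav_event")
    print(cph.hazard_ratios_)  # e.g., HR for multi_organ vs isolated heart transplant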

2.
Transplant Direct ; 10(7): e1669, 2024 Jul.
Article in English | MEDLINE | ID: mdl-38953039

ABSTRACT

Background: A prior single-center, retrospective cohort study identified baseline lung allograft dysfunction (BLAD) as a risk factor for death in bilateral lung transplant recipients. In this multicenter prospective cohort study, we tested the association of BLAD with death in bilateral lung transplant recipients, identified clinical risk factors for BLAD, and assessed its association with allograft injury on the molecular level. Methods: This multicenter, prospective cohort study included 173 bilateral lung transplant recipients who underwent serial pulmonary function testing and plasma collection for donor-derived cell-free DNA at prespecified time points. BLAD was defined as failure to achieve ≥80% predicted for both forced expiratory volume in 1 s and forced vital capacity after lung transplant, on 2 consecutive measurements at least 3 mo apart. Results: BLAD was associated with increased risk of death (hazard ratio, 1.97; 95% confidence interval [CI], 1.05-3.69; P = 0.03) but not with chronic lung allograft dysfunction alone (hazard ratio, 1.60; 95% CI, 0.87-2.95; P = 0.13). Recipient obesity (odds ratio, 1.69; 95% CI, 1.15-2.80; P = 0.04) and donor age (odds ratio, 1.03; 95% CI, 1.02-1.05; P = 0.004) increased the risk of developing BLAD. Patients with BLAD did not demonstrate higher log10(donor-derived cell-free DNA) levels compared with patients without BLAD (slope [SE]: -0.0095 [0.0007] versus -0.0109 [0.0007]; P = 0.15). Conclusions: BLAD is associated with an increased risk of death following lung transplantation, representing an important posttransplant outcome with valuable prognostic significance; however, early allograft-specific injury on the molecular level does not increase the risk of BLAD, warranting further mechanistic investigation into disease pathophysiology.
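
The BLAD criterion above is a simple rule over serial spirometry. The Python sketch below shows one plausible reading of that rule; the data layout, variable names, and 90-day gap are illustrative assumptions, not the study protocol.

    def is_blad(measurements, threshold=80.0, min_gap_days=90):
        """measurements: chronological (days_post_transplant, fev1_pct_pred, fvc_pct_pred).
        Returns True if two consecutive tests >= min_gap_days apart both fail to
        achieve >= threshold % predicted for BOTH FEV1 and FVC."""
        def fails(fev1, fvc):
            return not (fev1 >= threshold and fvc >= threshold)
        for (d1, fev1_a, fvc_a), (d2, fev1_b, fvc_b) in zip(measurements, measurements[1:]):
            if fails(fev1_a, fvc_a) and fails(fev1_b, fvc_b) and (d2 - d1) >= min_gap_days:
                return True
        return False

    # Example: a recipient who never clears 80% predicted on tests ~4 months apart
    print(is_blad([(90, 72.0, 75.0), (210, 74.0, 76.0)]))  # True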

3.
Transpl Int ; 37: 12445, 2024.
Article in English | MEDLINE | ID: mdl-38962472

ABSTRACT

While allograft rejection (AR) continues to threaten the success of cardiothoracic transplantation, the lack of accurate and repeatable surveillance tools to diagnose AR is a major unmet need in the clinical management of cardiothoracic transplant recipients. Endomyocardial biopsy (EMB) and transbronchial biopsy (TBBx) have been the cornerstone of rejection monitoring since the field's inception, but both suffer from significant limitations, including poor concordance of biopsy interpretation among pathologists. In recent years, novel molecular tools for AR monitoring have emerged, and their performance characteristics have been evaluated in multiple studies. An international working group convened by the European Society for Organ Transplantation (ESOT) has reviewed the existing literature and provides a series of recommendations to guide the use of these biomarkers in clinical practice. While acknowledging some caveats, the group recognized that gene-expression profiling and donor-derived cell-free DNA (dd-cfDNA) may be used to rule out rejection in heart transplant recipients, but they are not recommended for cardiac allograft vasculopathy screening. Other traditional biomarkers (NT-proBNP, BNP, or troponin) do not have sufficient evidence to support their use to diagnose AR. Regarding lung transplantation, dd-cfDNA could be used to rule out clinical rejection and infection, but its use to monitor treatment response is not recommended.


Subject(s)
Biomarkers , Graft Rejection , Heart Transplantation , Lung Transplantation , Humans , Biomarkers/blood , Biopsy , Cell-Free Nucleic Acids/blood , Consensus , Europe , Gene Expression Profiling , Graft Rejection/diagnosis , Lung Transplantation/adverse effects , Societies, Medical
6.
Article in English | MEDLINE | ID: mdl-38759766

ABSTRACT

BACKGROUND: Molecular testing with gene-expression profiling (GEP) and donor-derived cell-free DNA (dd-cfDNA) is increasingly used in the surveillance for acute cellular rejection (ACR) after heart transplant. However, the performance of dual testing over each test individually has not been established. Further, the impact of dual noninvasive surveillance on clinical decision-making has not been widely investigated. METHODS: We evaluated 2,077 subjects from the Surveillance HeartCare Outcomes Registry who were enrolled between 2018 and 2021, had verified biopsy data, and were categorized as dual negative, GEP positive/dd-cfDNA negative, GEP negative/dd-cfDNA positive, or dual positive. The incidence of ACR and follow-up testing rates for each group were evaluated. Positive likelihood ratios (LRs+) were calculated, and biopsy rates over time were analyzed. RESULTS: The incidence of ACR was 1.5% for the dual-negative, 1.9% for the GEP-positive/dd-cfDNA-negative, 4.3% for the GEP-negative/dd-cfDNA-positive, and 9.2% for the dual-positive groups. Follow-up biopsies were performed after 8.8% of dual-negative, 14.2% of GEP-positive/dd-cfDNA-negative, 22.8% of GEP-negative/dd-cfDNA-positive, and 35.4% of dual-positive results. The LR+ for ACR was 1.37, 2.91, and 3.90 for GEP-positive, dd-cfDNA-positive, and dual-positive testing, respectively. From 2018 to 2021, biopsies performed between 2 and 12 months post-transplant declined from 5.9 to 5.3 biopsies/patient, and second-year biopsy rates declined from 1.5 to 0.9 biopsies/patient. At 2 years, survival was 94.9%, and only 2.7% of patients had graft dysfunction. CONCLUSIONS: Dual molecular testing demonstrated improved performance for ACR surveillance compared with single molecular testing. The use of dual noninvasive testing was associated with lower biopsy rates over time, excellent survival, and a low incidence of graft dysfunction.
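
For readers unfamiliar with the metric, a positive likelihood ratio is sensitivity / (1 - specificity). The sketch below computes it from a 2x2 confusion table; the counts are hypothetical, chosen only so the result lands near the dual-positive LR+ reported above, and are not the registry's data.

    def positive_lr(tp, fp, fn, tn):
        """LR+ = sensitivity / (1 - specificity) for a binary test vs biopsy."""
        sensitivity = tp / (tp + fn)
        specificity = tn / (tn + fp)
        return sensitivity / (1.0 - specificity)

    # Hypothetical counts for dual-positive testing against biopsy-proven ACR
    print(round(positive_lr(tp=35, fp=90, fn=65, tn=913), 2))  # ~3.90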

7.
Article in English | MEDLINE | ID: mdl-38705500

ABSTRACT

BACKGROUND: Lung transplant recipients are traditionally monitored with pulmonary function testing (PFT) and lung biopsy to detect post-transplant complications and guide treatment. Plasma donor-derived cell-free DNA (dd-cfDNA) is a novel molecular approach for assessing allograft injury, including subclinical allograft dysfunction. The aim of this study was to determine whether episodes of extreme molecular injury (EMI) in lung transplant recipients increase the risk of chronic lung allograft dysfunction (CLAD) or death. METHODS: This multicenter prospective cohort study included 238 lung transplant recipients. Serial plasma samples were collected for dd-cfDNA measurement by shotgun sequencing. EMI was defined as a dd-cfDNA level above the third quartile of levels observed for acute rejection (dd-cfDNA ≥5% occurring after 45 days post-transplant). EMI was categorized as Secondary if associated with co-existing acute rejection, infection, or PFT decline, or as Primary if not associated with these conditions. RESULTS: EMI developed in 16% of patients at a median of 343.5 (IQR: 177.3-535.5) days post-transplant. Over 50% of EMI episodes were classified as Primary. EMI was associated with an increased risk of severe CLAD or death (HR: 2.78, 95% CI: 1.26-6.22, p = 0.012). The risk remained consistent for the Primary EMI subgroup (HR: 2.34, 95% CI: 1.18-4.85, p = 0.015). Time to first EMI episode was a significant predictor of the likelihood of developing CLAD or death (AUC = 0.856, 95% CI = 0.805-0.908, p < 0.001). CONCLUSIONS: Episodes of EMI in lung transplant recipients are often isolated and may not be detectable with traditional clinical monitoring approaches. EMI is associated with an increased risk of severe CLAD or death, independent of concomitant transplant complications.
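
The EMI definition above reduces to a simple filter over serial dd-cfDNA measurements. A minimal Python sketch, with an illustrative data structure not taken from the study, might look like this.

    def emi_episodes(samples, threshold=5.0, min_day=45):
        """Return (day, dd_cfdna_pct) pairs meeting the EMI definition:
        dd-cfDNA >= 5% measured after 45 days post-transplant."""
        return [(day, pct) for day, pct in samples if day > min_day and pct >= threshold]

    # Hypothetical serial measurements for one recipient: the early spike is
    # excluded by the 45-day rule, the late spike qualifies as EMI.
    print(emi_episodes([(30, 6.1), (120, 0.4), (350, 5.6)]))  # [(350, 5.6)]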

8.
Article in English | MEDLINE | ID: mdl-38670297

ABSTRACT

BACKGROUND: Cardiac allograft vasculopathy (CAV) remains the leading cause of long-term graft failure and mortality after heart transplantation. Effective preventive and treatment options are not available to date, largely because the underlying mechanisms remain poorly understood. We studied the potential role of leukotriene B4 (LTB4), an inflammatory lipid mediator, in the development of CAV. METHODS: We used an established preclinical rat CAV model to study the role of LTB4 in CAV. We performed syngeneic and allogeneic orthotopic aortic transplantation, after which neointimal proliferation was quantified. Animals were treated with Bestatin, an inhibitor of LTB4 synthesis, or vehicle control for 30 days post-transplant, and evidence of graft CAV was determined by histology. We also measured serial LTB4 levels in a cohort of 28 human heart transplant recipients with CAV, 17 matched transplant controls without CAV, and 20 healthy nontransplant controls. RESULTS: We showed that infiltration of the arterial wall with macrophages leads to neointimal thickening and a rise in serum LTB4 levels in our rat model of CAV. Inhibition of LTB4 production with Bestatin prevented the development of neointimal hyperplasia, suggesting that Bestatin may be an effective therapy for CAV prevention. In a parallel study of heart transplant recipients, we found nonsignificantly elevated plasma LTB4 levels in patients with CAV compared with patients without CAV and healthy, nontransplant controls. CONCLUSIONS: This study provides key evidence supporting LTB4, an inflammatory lipid mediator, as an important driver of CAV development and provides preliminary data suggesting a potential clinical benefit of Bestatin for CAV prevention.

9.
Transplantation ; 2024 Apr 19.
Article in English | MEDLINE | ID: mdl-38637919

ABSTRACT

In controlled organ donation after circulatory determination of death (cDCDD), accurate and timely death determination is critical, yet knowledge gaps persist. Further research to improve the science of defining and determining death by circulatory criteria is therefore warranted. In a workshop sponsored by the National Heart, Lung, and Blood Institute, experts identified research opportunities pertaining to the scientific, conceptual, and ethical understandings of cDCDD and associated technologies. This article identifies a research strategy to inform the biomedical definition of death, the criteria for its determination, and circulatory death determination in cDCDD. Highlighting knowledge gaps, we propose that further research is needed to inform the observation period following cessation of circulation in pediatric and neonatal populations, the temporal relationship between the cessation of brain and circulatory function after the withdrawal of life-sustaining measures in all patient populations, and the minimal pulse pressures that sustain brain blood flow, perfusion, activity, and function. Additionally, accurate predictive tools to estimate time to asystole following the withdrawal of treatment and alternative monitoring modalities to establish the cessation of circulatory, brainstem, and brain function are needed. The physiologic and conceptual implications of postmortem interventions that resume circulation in cDCDD donors likewise demand attention to inform organ recovery practices. Finally, because jurisdictionally variable definitions of death and the criteria for its determination may impede collaborative research efforts, further work is required to achieve consensus on the physiologic and conceptual rationale for defining and determining death after circulatory arrest.

10.
J Heart Lung Transplant ; 43(7): 1135-1141, 2024 Jul.
Article in English | MEDLINE | ID: mdl-38460620

ABSTRACT

BACKGROUND: Noninvasive methods for surveillance of acute rejection are increasingly used in heart transplantation (HT), including donor-derived cell-free DNA (dd-cfDNA). As other cardiac biomarkers differ by sex, we hypothesized that there may be sex-specific differences in the performance of dd-cfDNA for the detection of acute rejection. The purpose of the current study was to examine patterns of dd-cfDNA seen in quiescence and acute rejection in male and female transplant recipients. METHODS: Patients enrolled in the Genomic Research Alliance for Transplantation who were ≥18 years of age at the time of HT were included. Rejection was defined by endomyocardial biopsy with acute cellular rejection (ACR) grade ≥2R and/or antibody-mediated rejection (AMR) ≥ pAMR 1. dd-cfDNA was quantitated using shotgun sequencing. Median dd-cfDNA levels were compared between sexes during quiescence and rejection. The performance of dd-cfDNA by sex was assessed using the area under the receiver operating characteristic (AUROC) curve. Allograft injury was defined as dd-cfDNA ≥0.25%. RESULTS: One hundred fifty-one unique patients (49 female, 32%) were included in the analysis, with 1,119 available dd-cfDNA measurements. Baseline characteristics, including demographics and comorbidities, were not significantly different between sexes. During quiescence, there were no significant sex differences in median dd-cfDNA level (0.04% [IQR 0.00, 0.16] in females vs 0.03% [IQR 0.00, 0.12] in males, p = 0.22). There were no significant sex differences in median dd-cfDNA for ACR (0.33% [0.21, 0.36] in females vs 0.32% [0.21, 1.10] in males, p = 0.57). Overall, median dd-cfDNA levels were higher in AMR than in ACR but did not differ significantly by sex (0.50% [IQR 0.18, 0.82] in females vs 0.63% [IQR 0.32, 1.95] in males, p = 0.51). Elevated dd-cfDNA detected ACR/AMR with an AUROC of 0.83 in females and 0.89 in males (p for comparison = 0.16). CONCLUSIONS: There were no significant sex differences in dd-cfDNA levels during quiescence or rejection. Performance characteristics were similar, suggesting that similar diagnostic thresholds can be used in men and women for rejection surveillance.
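
Computing the sex-stratified AUROC described above is a one-liner per subgroup with scikit-learn. The sketch below is illustrative only; the array names, toy data, and the scikit-learn dependency are assumptions, not the study's code.

    import numpy as np
    from sklearn.metrics import roc_auc_score

    def auroc_by_sex(dd_cfdna_pct, rejection, is_female):
        """AUROC of dd-cfDNA level for biopsy-proven rejection, per sex."""
        dd, y, f = (np.asarray(a) for a in (dd_cfdna_pct, rejection, is_female))
        return {"female": roc_auc_score(y[f], dd[f]),
                "male": roc_auc_score(y[~f], dd[~f])}

    # Tiny synthetic example
    print(auroc_by_sex([0.1, 0.5, 0.02, 0.9], [0, 1, 0, 1],
                       [True, True, False, False]))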


Subject(s)
Cell-Free Nucleic Acids , Graft Rejection , Heart Transplantation , Tissue Donors , Humans , Graft Rejection/diagnosis , Graft Rejection/blood , Graft Rejection/immunology , Male , Female , Middle Aged , Cell-Free Nucleic Acids/blood , Sex Factors , Adult , Biomarkers/blood , Genomics/methods
11.
Am J Transplant ; 24(6): 918-927, 2024 Jun.
Article in English | MEDLINE | ID: mdl-38514013

ABSTRACT

Xenotransplantation offers the potential to meet the critical need for heart and lung transplantation presently constrained by the human donor organ supply. Much has been learned over the past decades regarding gene editing to prevent the immune activation and inflammation that cause early organ injury, and regarding strategies for maintenance immunosuppression to promote longer-term xenograft survival. However, many scientific questions remain regarding further requirements for genetic modification of donor organs, appropriate contexts for xenotransplantation research (including nonhuman primates, recently deceased humans, and living human recipients), and the risk of xenozoonotic disease transmission. Related ethical questions include the appropriate selection of clinical trial participants, challenges with obtaining informed consent, animal rights and welfare considerations, and cost. Research involving recently deceased humans has also emerged as a potentially novel way to understand how xeno-organs will affect the human body. Clinical xenotransplantation and research involving decedents also raise ethical questions and will require consensus regarding regulatory oversight and protocol review. These considerations and the related opportunities for xenotransplantation research were discussed in a workshop sponsored by the National Heart, Lung, and Blood Institute and are summarized in this meeting report.


Subject(s)
Heart Transplantation , Lung Transplantation , Transplantation, Heterologous , Transplantation, Heterologous/ethics , Humans , Lung Transplantation/ethics , Animals , United States , Heart Transplantation/ethics , National Heart, Lung, and Blood Institute (U.S.) , Biomedical Research/ethics , Tissue Donors/supply & distribution , Tissue Donors/ethics
12.
J Heart Lung Transplant ; 43(6): 1021-1029, 2024 Jun.
Article in English | MEDLINE | ID: mdl-38432523

ABSTRACT

In a workshop sponsored by the U.S. National Heart, Lung, and Blood Institute, experts identified current knowledge gaps and research opportunities in the scientific, conceptual, and ethical understanding of organ donation after the circulatory determination of death and its technologies. To minimize organ injury from warm ischemia and produce better recipient outcomes, innovative techniques to perfuse and oxygenate organs postmortem in situ, such as thoracoabdominal normothermic regional perfusion (TA-NRP), are being implemented in several medical centers in the US and elsewhere. These technologies have improved organ outcomes but have raised ethical and legal questions. Re-establishing donor circulation postmortem can be viewed as invalidating the condition of permanent cessation of circulation on which the earlier death determination was made, and clamping the arch vessels to exclude brain circulation can be viewed as inducing brain death. Alternatively, TA-NRP can be viewed as localized in-situ organ perfusion, not whole-body resuscitation, that does not invalidate the death determination. Further scientific, conceptual, and ethical studies, such as those identified in this workshop, can inform and help resolve the controversies raised by this practice.


Subject(s)
Death , Tissue and Organ Procurement , Humans , Tissue and Organ Procurement/methods , Tissue and Organ Procurement/ethics , United States , National Heart, Lung, and Blood Institute (U.S.) , Lung Transplantation , Tissue Donors , Organ Preservation/methods , Heart Transplantation
14.
J Heart Lung Transplant ; 43(6): 954-962, 2024 Jun.
Article in English | MEDLINE | ID: mdl-38423416

ABSTRACT

BACKGROUND: Since 2019, the annual transplantation rate of hearts donated following circulatory death (DCD) has increased significantly in the United States. The 2 major heart procurement techniques following circulatory death are direct procurement and perfusion (DPP) and normothermic regional perfusion (NRP). Post-transplant survival for heart recipients has not been compared between these 2 techniques. METHODS: This observational study used data on adult heart transplants from donors after circulatory death from January 1, 2019 to December 31, 2021 in the Scientific Registry of Transplant Recipients. We identified comparable transplant cases across procurement types using propensity-score matching and measured the association between procurement technique and 1-year post-transplant survival using Kaplan-Meier analysis and a Cox proportional hazards model stratified by matched pairs. RESULTS: Among 318 DCD heart transplants, 216 (68%) were procured via DPP and 102 (32%) via NRP. Among the 22 transplant centers that accepted circulatory-death donors, 3 used NRP exclusively and 5 used both procurement techniques. After propensity-score matching on recipient and donor factors, there was no significant difference in 1-year post-transplant survival between procurement techniques (93.1% for NRP vs 91.1% for DPP, p = 0.79). CONCLUSIONS: NRP and DPP procurements are associated with similar 1-year post-transplant survival. If NRP is ethically permissible and improves outcomes for abdominal organs, it should be the preferred procurement technique for DCD hearts.
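
As a concrete illustration of the matching step, here is a minimal propensity-score sketch in Python: fit a logistic model for the probability of NRP procurement, then greedily pair each NRP case with the nearest unmatched DPP case within a caliper. The variable names, caliper value, and scikit-learn dependency are assumptions for illustration, not the authors' code.

    import numpy as np
    from sklearn.linear_model import LogisticRegression

    def propensity_match(X, treated, caliper=0.05):
        """Greedy 1:1 nearest-neighbour matching on the propensity score.
        X: covariate matrix (n x p); treated: 1 for NRP, 0 for DPP (length n)."""
        treated = np.asarray(treated)
        ps = LogisticRegression(max_iter=1000).fit(X, treated).predict_proba(X)[:, 1]
        t_idx = np.flatnonzero(treated == 1)
        c_idx = np.flatnonzero(treated == 0)
        available = np.ones(len(c_idx), dtype=bool)
        pairs = []
        for i in t_idx:
            dist = np.where(available, np.abs(ps[c_idx] - ps[i]), np.inf)
            j = int(np.argmin(dist))
            if dist[j] <= caliper:
                pairs.append((int(i), int(c_idx[j])))
                available[j] = False
        return pairs  # matched (NRP, DPP) row-index pairs for the pair-stratified Cox model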


Subject(s)
Graft Survival , Heart Transplantation , Organ Preservation , Perfusion , Tissue and Organ Procurement , Humans , Male , Female , Middle Aged , Tissue and Organ Procurement/methods , Perfusion/methods , Graft Survival/physiology , Organ Preservation/methods , Adult , Retrospective Studies , Tissue Donors , United States/epidemiology , Survival Rate/trends , Death , Follow-Up Studies , Registries
15.
JAMA ; 331(6): 500-509, 2024 02 13.
Article in English | MEDLINE | ID: mdl-38349372

ABSTRACT

Importance: The US heart allocation system prioritizes medically urgent candidates with a high risk of dying without transplant. The current therapy-based 6-status system is susceptible to manipulation and has limited rank-ordering ability. Objective: To develop and validate a candidate risk score that incorporates current clinical, laboratory, and hemodynamic data. Design, Setting, and Participants: A registry-based observational study of adult heart transplant candidates (aged ≥18 years) in the US heart allocation system listed between January 1, 2019, and December 31, 2022, split by center into training (70%) and test (30%) datasets. Main Outcomes and Measures: A US candidate risk score (US-CRS) model was developed by adding a predefined set of predictors to the current French Candidate Risk Score (French-CRS) model. Sensitivity analyses were performed that included intra-aortic balloon pumps (IABP) and percutaneous ventricular assist devices (VAD) in the definition of short-term mechanical circulatory support (MCS) for the US-CRS. Performance of the US-CRS, French-CRS, and 6-status models in the test dataset was evaluated by time-dependent area under the receiver operating characteristic curve (AUC) for death without transplant within 6 weeks and by overall survival concordance (c-index) with integrated AUC. Results: A total of 16,905 adult heart transplant candidates were listed (mean [SD] age, 53 [13] years; 73% male; 58% White); 796 patients (4.7%) died without a transplant. The final US-CRS contained time-varying short-term MCS (ventricular assist-extracorporeal membrane oxygenation or temporary surgical VAD), the log of bilirubin, estimated glomerular filtration rate, the log of B-type natriuretic peptide, albumin, sodium, and durable left ventricular assist device. In the test dataset, the AUC for death within 6 weeks of listing was 0.79 (95% CI, 0.75-0.83) for the US-CRS model, 0.72 (95% CI, 0.67-0.76) for the French-CRS model, and 0.68 (95% CI, 0.62-0.73) for the 6-status model. The overall c-index was 0.76 (95% CI, 0.73-0.80) for the US-CRS model, 0.69 (95% CI, 0.65-0.73) for the French-CRS model, and 0.67 (95% CI, 0.63-0.71) for the 6-status model. Classifying IABP and percutaneous VAD as short-term MCS reduced the effect size by 54%. Conclusions and Relevance: In this registry-based study of US heart transplant candidates, a continuous multivariable allocation score outperformed the 6-status system in rank-ordering heart transplant candidates by medical urgency and may be useful for the medical urgency component of heart allocation.
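
To make the score's structure concrete, the sketch below assembles a US-CRS-style linear predictor from the listed covariates. The coefficients here are placeholders invented for illustration; the published US-CRS weights are not given in this abstract.

    import math

    # Placeholder coefficients -- NOT the published US-CRS weights.
    COEFS = {
        "short_term_mcs": 2.0,   # time-varying ECMO or temporary surgical VAD
        "log_bilirubin": 0.8,
        "egfr": -0.02,
        "log_bnp": 0.5,
        "albumin": -0.6,
        "sodium": -0.05,
        "durable_lvad": -0.3,
    }

    def linear_predictor(candidate):
        """Sum of coefficient * covariate, as in a Cox/logistic risk score."""
        return sum(coef * candidate[name] for name, coef in COEFS.items())

    candidate = {"short_term_mcs": 1, "log_bilirubin": math.log(2.1),
                 "egfr": 45.0, "log_bnp": math.log(1800.0),
                 "albumin": 3.2, "sodium": 133.0, "durable_lvad": 0}
    print(round(linear_predictor(candidate), 2))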


Subject(s)
Heart Failure , Heart Transplantation , Tissue and Organ Procurement , Adult , Female , Humans , Male , Middle Aged , Bilirubin , Clinical Laboratory Services , Heart , Risk Factors , Risk Assessment , Heart Failure/mortality , Heart Failure/surgery , United States , Health Care Rationing/methods , Predictive Value of Tests , Tissue and Organ Procurement/methods , Tissue and Organ Procurement/organization & administration
16.
JACC Heart Fail ; 12(4): 722-736, 2024 Apr.
Article in English | MEDLINE | ID: mdl-38244008

ABSTRACT

BACKGROUND: Potential organ donors often exhibit abnormalities on electrocardiograms (ECGs) after brain death, but the physiological and prognostic significance of such abnormalities is unknown. OBJECTIVES: This study sought to characterize the prevalence of ECG abnormalities in a nationwide cohort of potential cardiac donors and their associations with cardiac dysfunction, use for heart transplantation (HT), and recipient outcomes. METHODS: The Donor Heart Study enrolled 4,333 potential cardiac organ donors at 8 organ procurement organizations across the United States from 2015 to 2020. A blinded expert reviewer interpreted all ECGs, which were obtained once hemodynamic stability was achieved after brain death and were repeated 24 ± 6 hours later. ECG findings were summarized, and their associations with other cardiac diagnostic findings, use for HT, and graft survival were assessed using univariable and multivariable regression. RESULTS: Initial ECGs were interpretable for 4,136 potential donors. Overall, 64% of ECGs were deemed clinically abnormal, most commonly as a result of a nonspecific ST-T-wave abnormality (39%), T-wave inversion (19%), and/or QTc interval >500 ms (17%). Conduction abnormalities, ectopy, pathologic Q waves, and ST-segment elevations were less common (each present in ≤5% of donors) and resolved on repeat ECGs in most cases. Only pathologic Q waves significantly predicted nonuse of the donor heart for HT (adjusted OR for use: 0.39; 95% CI: 0.29-0.53), and no ECG finding was associated with graft survival at 1 year post-HT. CONCLUSIONS: ECG abnormalities are common in potential heart donors but often resolve on serial testing. Pathologic Q waves are associated with a lower likelihood of use for HT, but they do not portend worse graft survival.


Subject(s)
Heart Diseases , Heart Failure , Heart Transplantation , Tissue and Organ Procurement , Humans , Tissue Donors , Brain Death , Electrocardiography , Arrhythmias, Cardiac
17.
J Heart Lung Transplant ; 43(3): 387-393, 2024 Mar.
Article in English | MEDLINE | ID: mdl-37802261

ABSTRACT

BACKGROUND: Primary graft dysfunction (PGD) is a leading cause of early morbidity and mortality following heart transplantation (HT). We sought to determine the association between pretransplant human leukocyte antigen (HLA) sensitization, as measured using the calculated panel reactive antibody (cPRA) value, and the risk of PGD. METHODS: Consecutive adult HT recipients (n = 596) from 1/2015 to 12/2019 at 2 US centers were included. The severity of PGD was graded according to the 2014 International Society for Heart and Lung Transplantation consensus statement. For each recipient, unacceptable HLA antigens were obtained, and locus-specific cPRA (cPRA-LS) and pre-HT donor-specific antibodies (DSA) were assessed. RESULTS: Univariable logistic modeling showed that peak cPRA-LS for all loci and for HLA-A was associated with increased severity of PGD as an ordinal variable (all loci: OR 1.78, 95% CI: 1.01-1.14, p = 0.025; HLA-A: OR 1.14, 95% CI: 1.03-1.26, p = 0.011). Multivariable analysis showed that peak cPRA-LS for HLA-A, recipient beta-blocker use, total ischemic time, donor age, prior cardiac surgery, and United Network for Organ Sharing status 1 or 2 were associated with increased severity of PGD. The presence of DSA to HLA-B was associated with a trend toward increased risk of mild-to-moderate PGD (OR 2.56, 95% CI: 0.99-6.63, p = 0.053), but DSA to other HLA loci were not associated with PGD. CONCLUSIONS: Sensitization across all HLA loci, and specifically to HLA-A, is associated with increased severity of PGD. These factors should be included in pre-HT risk stratification to minimize the risk of PGD.


Subject(s)
Heart Transplantation , Primary Graft Dysfunction , Adult , Humans , Primary Graft Dysfunction/epidemiology , Primary Graft Dysfunction/etiology , Heart Transplantation/adverse effects , HLA Antigens , Tissue Donors , Antibodies , HLA-A Antigens , Retrospective Studies
19.
Curr Heart Fail Rep ; 20(6): 493-503, 2023 Dec.
Article in English | MEDLINE | ID: mdl-37966542

ABSTRACT

PURPOSE OF REVIEW: Heart transplantation (HT) remains the optimal therapy for patients living with end-stage heart disease. Despite recent improvements in peri-transplant management, the median survival after HT has remained relatively static, and complications of HT, including infection, rejection, and allograft dysfunction, continue to impact quality of life and long-term survival. RECENT FINDINGS: Omics technologies are becoming increasingly accessible and can identify novel biomarkers for, and reveal the underlying biology of, several disease states. While some technologies, such as gene expression profiling (GEP) and donor-derived cell-free DNA (dd-cfDNA), are routinely used in the clinical care of HT recipients, a number of emerging platforms, including pharmacogenomics, proteomics, and metabolomics, hold great potential for identifying biomarkers to aid in the diagnosis and management of post-transplant complications. Omics-based assays can improve patient and allograft longevity by facilitating a personalized and precision approach to post-HT care. The following article is a contemporary review of the current and future opportunities to leverage omics technologies, including genomics, transcriptomics, proteomics, and metabolomics in the field of HT.


Subject(s)
Heart Failure , Heart Transplantation , Humans , Allografts , Biomarkers , Graft Rejection , Heart Failure/genetics , Heart Failure/surgery , Quality of Life
20.
Eur Heart J ; 44(44): 4665-4674, 2023 Nov 21.
Article in English | MEDLINE | ID: mdl-37936176

ABSTRACT

BACKGROUND AND AIMS: Given limited evidence and a lack of consensus on donor acceptance for heart transplant (HT), selection practices vary widely across HT centres in the USA. Similar variation likely exists on a broader scale, across countries and HT systems, but remains largely unexplored. This study characterized differences in heart donor populations and selection practices between the USA and Eurotransplant (a consortium of eight European countries) and their implications for system-wide outcomes. METHODS: Characteristics of adult reported heart donors and their utilization (the percentage of reported donors accepted for HT) were compared between Eurotransplant (n = 8714) and the USA (n = 60 882) from 2010 to 2020. Predictors of donor acceptance were identified using multivariable logistic regression. Additional analyses estimated the impact of achieving Eurotransplant-level utilization in the USA amongst donors of matched quality, using the probability of acceptance as a marker of quality. RESULTS: Eurotransplant reported donors were older, with more cardiovascular risk factors, but with higher utilization than in the USA (70% vs. 44%). Donor age, smoking history, and diabetes mellitus predicted non-acceptance in the USA and, by a lesser magnitude, in Eurotransplant; donor obesity and hypertension predicted non-acceptance in the USA only. Achieving Eurotransplant-level utilization amongst the top 30%-50% of US donors (by quality) would produce an additional 506-930 US HTs annually. CONCLUSIONS: Eurotransplant countries exhibit more liberal donor heart acceptance practices than the USA. Adopting similar acceptance practices could help alleviate the scarcity of donor hearts and reduce waitlist morbidity in the USA.
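
The headline counterfactual is simple arithmetic over the donor pool. A back-of-the-envelope sketch follows; all inputs are hypothetical round numbers for illustration, not the study's modeled estimates.

    def additional_transplants(us_reported_donors_per_year, top_fraction, us_util, et_util):
        """Extra heart transplants per year if the top `top_fraction` of US donors
        (ranked by predicted acceptance probability) were accepted at the
        Eurotransplant utilization rate instead of the US rate."""
        pool = us_reported_donors_per_year * top_fraction
        return pool * (et_util - us_util)

    # Illustrative inputs only: ~6,000 reported donors/year, top 30% by quality,
    # 44% vs 70% utilization as reported above.
    print(round(additional_transplants(6000, 0.30, 0.44, 0.70)))  # ~468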


Subject(s)
Heart Transplantation , Tissue Donors , Adult , Humans , Europe/epidemiology , Logistic Models , Obesity/epidemiology