ABSTRACT
BACKGROUND AND AIMS: Early identification of cardiac structural abnormalities indicative of heart failure is crucial to improving patient outcomes. Chest X-rays (CXRs) are routinely conducted on a broad population of patients, presenting an opportunity to build scalable screening tools for structural abnormalities indicative of Stage B or worse heart failure with deep learning methods. In this study, a model was developed to identify severe left ventricular hypertrophy (SLVH) and dilated left ventricle (DLV) using CXRs. METHODS: A total of 71 589 unique CXRs from 24 689 different patients completed within 1 year of echocardiograms were identified. Labels for SLVH, DLV, and a composite label indicating the presence of either were extracted from echocardiograms. A deep learning model was developed and evaluated using area under the receiver operating characteristic curve (AUROC). Performance was additionally validated on 8003 CXRs from an external site and compared against visual assessment by 15 board-certified radiologists. RESULTS: The model yielded an AUROC of 0.79 (0.76-0.81) for SLVH, 0.80 (0.77-0.84) for DLV, and 0.80 (0.78-0.83) for the composite label, with similar performance on an external data set. The model outperformed all 15 individual radiologists for predicting the composite label and achieved a sensitivity of 71% vs. 66% against the consensus vote across all radiologists at a fixed specificity of 73%. CONCLUSIONS: Deep learning analysis of CXRs can accurately detect the presence of certain structural abnormalities and may be useful in early identification of patients with LV hypertrophy and dilation. As a resource to promote further innovation, 71 589 CXRs with adjoining echocardiographic labels have been made publicly available.
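The headline metrics here are AUROCs with bootstrap confidence intervals. As a minimal illustration of how such an estimate can be computed (not the authors' code; input names are illustrative placeholders), one might use scikit-learn with a percentile bootstrap:

```python
# Minimal sketch (not the study's pipeline): AUROC with a percentile-
# bootstrap 95% CI. `labels` and `scores` are illustrative inputs.
import numpy as np
from sklearn.metrics import roc_auc_score

def auroc_with_ci(labels, scores, n_boot=2000, seed=0):
    labels, scores = np.asarray(labels), np.asarray(scores)
    rng = np.random.default_rng(seed)
    point = roc_auc_score(labels, scores)
    boots = []
    while len(boots) < n_boot:
        idx = rng.integers(0, len(labels), len(labels))  # resample with replacement
        if labels[idx].min() == labels[idx].max():       # need both classes present
            continue
        boots.append(roc_auc_score(labels[idx], scores[idx]))
    lo, hi = np.percentile(boots, [2.5, 97.5])
    return point, lo, hi
```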
Subject(s)
Deep Learning , Hypertrophy, Left Ventricular , Radiography, Thoracic , Humans , Hypertrophy, Left Ventricular/diagnostic imaging , Radiography, Thoracic/methods , Female , Male , Middle Aged , Echocardiography/methods , Aged , Heart Failure/diagnostic imaging , Heart Ventricles/diagnostic imaging , ROC Curve
ABSTRACT
BACKGROUND: Aortic regurgitation (AR) is a common complication following left ventricular assist device (LVAD) implantation. We evaluated the hemodynamic implications of AR in patients with HeartMate 3 (HM3) LVADs at baseline and in response to speed changes. METHODS AND RESULTS: Clinically stable outpatients supported by HM3 who underwent a routine hemodynamic ramp test were retrospectively enrolled in this analysis. Patients were stratified based on the presence of at least mild AR at baseline speed. Hemodynamic and echocardiographic parameters were compared between the AR and non-AR groups. Sixty-two patients were identified. At the baseline LVAD speed, 29 patients (47%) had AR, while 33 patients (53%) did not. Patients with AR were older and had been supported on HM3 for a longer duration. At baseline speed, all hemodynamic parameters were similar between the groups, including central venous pressure, pulmonary capillary wedge pressure, pulmonary arterial pressures, cardiac output and index, and pulmonary artery pulsatility index (p > 0.05 for all). During the subacute assessment, AR worsened in some, but not all, patients with increases in LVAD speed. There were no significant differences in 1-year mortality or hospitalization rates between the groups; however, at 1 year, ≥ moderate AR and right ventricular failure (RVF) were detected at higher rates in the AR group compared with the non-AR group (45% vs. 0%; p < 0.01, and 75% vs. 36.8%; p = 0.02, respectively). CONCLUSIONS: In a cohort of stable outpatients supported with HM3 who underwent a routine hemodynamic ramp test, the presence of mild or greater AR did not impact the ability of HM3 LVADs to effectively unload the left ventricle during early subacute assessment. Although the presence of AR did not affect mortality and hospitalization rates, it resulted in higher rates of late hemodynamic-related events in the form of progressive AR and RVF.
Subject(s)
Aortic Valve Insufficiency , Heart Failure , Heart-Assist Devices , Humans , Retrospective Studies , Heart Failure/diagnosis , Heart Failure/surgery , Heart-Assist Devices/adverse effects , Aortic Valve Insufficiency/diagnosis , Aortic Valve Insufficiency/etiology , Hemodynamics/physiology
ABSTRACT
BACKGROUND: Among heart transplant (HT) recipients who develop advanced graft dysfunction, cardiac re-transplantation may be considered. A smaller subset of patients will experience failure of their second allograft and undergo repeat re-transplantation. Outcomes among these individuals are not well-described. METHODS: Adult and pediatric patients in the United Network for Organ Sharing (UNOS) registry who received HT between January 1, 1990 and December 31, 2020 were included. RESULTS: Between 1990 and 2020, 90 individuals received a third HT and three underwent a fourth HT. Recipients were younger than those undergoing primary HT (mean age 32 years). Third HT was associated with significantly higher unadjusted rates of 1-year mortality (18% for third HT vs. 13% for second HT vs. 9% for primary HT, p < .001) and 10-year mortality (59% for third HT vs. 42% for second HT vs. 37% for primary HT, p < .001). Mortality was highest amongst recipients aged >60 years and those re-transplanted for acute graft failure. Long-term rates of cardiac allograft vasculopathy (CAV), rejection, chronic dialysis, and hospitalization for infection were also higher. CONCLUSIONS: Third HT is associated with higher morbidity and mortality than primary HT. Further consensus is needed regarding appropriate organ stewardship for this unique subgroup.
Subject(s)
Heart Transplantation , Adult , Humans , Child , Risk Factors , Survival Rate , Transplantation, Homologous , Graft Rejection/etiology , Retrospective Studies
ABSTRACT
BACKGROUND: Donor-derived cell-free DNA (dd-cfDNA) has emerged as a reliable, noninvasive method for the surveillance of allograft rejection in heart transplantation (HT) patients, but its utility in multi-organ transplants (MOT) is unknown. We describe our experience using dd-cfDNA in simultaneous MOT recipients. METHODS: We conducted a single-center retrospective review of all HT recipients between 2018 and 2022 who had at least one dd-cfDNA measurement collected. Patients who had simultaneous MOT were identified and included in this study. Levels of dd-cfDNA were paired with endomyocardial biopsies (EMB) performed within 1 month of blood testing if available. Acute cellular rejection (ACR) was defined as ISHLT (International Society for Heart and Lung Transplantation) grade ≥ 2R, and antibody-mediated rejection (AMR) was defined as pAMR grade > 0. The within-patient variability score of dd-cfDNA was calculated as the variance divided by the average. RESULTS: The study included 25 multiorgan transplant recipients: 13 heart-kidney (H-K), 8 heart-liver (H-Li), and 4 heart-lung (H-Lu). The median age was 55 years, 44% were female; the median time from HT until the first dd-cfDNA measurement was 4.5 months (IQR 2, 10.5). The median dd-cfDNA level was 0.18% (IQR 0.15%, 0.27%) for H-K, 1.15% (IQR 0.77%, 2.33%) for H-Li, and 0.69% (IQR 0.62%, 1.07%) for H-Lu patients (p < 0.001). The prevalence of positive dd-cfDNA tests (threshold of 0.20%) was 42.2%, 97.3%, and 92.3% in the H-K, H-Li, and H-Lu groups, respectively. The within-patient variability score was highest in the H-Li group (median of 0.45 [IQR 0.29, 0.94]) and lowest in the H-K group (median of 0.09 [IQR 0.06, 0.12]); p = 0.002. No evidence of cardiac ACR or AMR was found. Three patients experienced renal allograft ACR and/or AMR, two patients experienced rejection of the liver allograft, and one patient experienced an episode of AMR-mediated lung rejection. One person in the H-K group experienced an episode of cardiac allograft dysfunction that was not associated with biopsy-confirmed rejection. CONCLUSION: Dd-cfDNA is chronically elevated in most MOT recipients. There is a high degree of within-patient variability in levels (particularly for H-Li and H-Lu recipients), which may limit the utility of this assay in monitoring MOT recipients.
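The within-patient variability score defined above is the variance of each patient's dd-cfDNA levels divided by their mean (an index of dispersion). A minimal pandas sketch, with illustrative column names:

```python
# Per-patient variance/mean of dd-cfDNA levels, as defined in the
# abstract. Column names are illustrative, not the study's schema.
import pandas as pd

def variability_scores(df: pd.DataFrame) -> pd.Series:
    grouped = df.groupby("patient_id")["dd_cfdna_pct"]
    return grouped.var() / grouped.mean()  # sample variance / mean, per patient
```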
Subject(s)
Cell-Free Nucleic Acids , Graft Rejection , Heart Transplantation , Tissue Donors , Humans , Female , Cell-Free Nucleic Acids/blood , Male , Retrospective Studies , Middle Aged , Heart Transplantation/adverse effects , Graft Rejection/diagnosis , Graft Rejection/etiology , Graft Rejection/blood , Follow-Up Studies , Prognosis , Organ Transplantation/adverse effects , Graft Survival , Biomarkers/blood , Transplant Recipients , Risk Factors , Adult
ABSTRACT
BACKGROUND: Belatacept (BTC), a fusion protein, selectively inhibits T-cell co-stimulation by binding to the CD80 and CD86 receptors on antigen-presenting cells (APCs) and has been used as immunosuppression in adult renal transplant recipients. However, data regarding its use in heart transplant (HT) recipients are limited. This retrospective cohort study aimed to delineate BTC's application in HT, focusing on efficacy, safety, and associated complications at a high-volume HT center. METHODS: A retrospective cohort study was conducted of patients who underwent HT between January 2017 and December 2021 and subsequently received BTC as part of their immunosuppressive regimen. Twenty-one HT recipients were identified. Baseline characteristics, history of rejection, and indication for BTC use were collected. Outcomes included renal function, graft function, allograft rejection, and mortality. Follow-up data were collected through December 2023. RESULTS: Among 776 patients monitored from January 2017 to December 2021, 21 (2.7%) received BTC treatment. Average age at transplantation was 53 ± 12 years, and 38% were women. BTC administration began a median of 689 [IQR 483, 1830] days post-HT. The primary indications for BTC were elevated pre-formed donor-specific antibodies in highly sensitized patients (66.6%) and renal sparing (23.8%), in conjunction with reduced calcineurin inhibitor (CNI) dosage. Only one (4.8%) patient encountered rejection within a year of starting BTC. Graft function by echocardiography remained stable at 6 and 12 months posttreatment. An improvement was observed in serum creatinine levels (76.2% of patients), decreasing from a median of 1.58 to 1.45 (IQR [1.0-2.1] to [1.1-1.9]) over 12 months (p = .054). eGFR improved at 3 and 6 months compared with levels 3 months pre-BTC; however, this was not statistically significant (p = .24). Treatment discontinuation occurred in seven patients (33.3%), of whom four (19%) were switched back to full-dose CNI. Infections occurred in 11 patients (52.4%), leading to BTC discontinuation in 4 patients (19%). CONCLUSION: In this cohort, BTC therapy was used as alternative immunosuppression for management of highly sensitized patients or for renal sparing. BTC therapy combined with CNI dose reduction resulted in stabilization of renal function as measured through renal surrogate markers, although this did not reach statistical significance. Patients on BTC maintained a low rejection rate and preserved graft function. Infections were common during BTC therapy and were associated with medication pause/discontinuation in 19% of patients. Further randomized studies are needed to assess the efficacy and safety of BTC in HT recipients.
Subject(s)
Heart Transplantation , Kidney Transplantation , Adult , Humans , Female , Middle Aged , Male , Abatacept , Retrospective Studies , Kidney Transplantation/adverse effects , Immunosuppressive Agents , Calcineurin Inhibitors/therapeutic use , T-Lymphocytes , Graft Rejection/drug therapy , Graft Rejection/etiology , Transplant Recipients , Graft Survival
ABSTRACT
This review provides a comprehensive overview of the past 25+ years of research into the development of left ventricular assist devices (LVADs) to improve clinical outcomes in patients with severe end-stage heart failure, along with basic insights into the biology of heart failure gleaned from studies of hearts and myocardium of patients undergoing LVAD support. Clinical aspects of contemporary LVAD therapy, including evolving device technology, overall mortality, and complications, are reviewed. We explain the hemodynamic effects of LVAD support and how these lead to ventricular unloading. This includes a detailed review of the structural, cellular, and molecular aspects of LVAD-associated reverse remodeling. Synergisms between LVAD support and medical therapies for heart failure related to reverse remodeling, remission, and recovery are discussed within the context of both clinical outcomes and fundamental effects on myocardial biology. The incidence, clinical implications, and factors most likely to be associated with improved ventricular function and remission of heart failure are reviewed. Finally, we discuss recognized impediments to achieving myocardial recovery in the vast majority of LVAD-supported hearts and their implications for future research aimed at improving the overall rates of recovery.
Subject(s)
Heart Failure/therapy , Heart-Assist Devices , Ventricular Function, Left/physiology , Ventricular Remodeling , Animals , Calcium/metabolism , Cell Death/physiology , Cytokines/metabolism , Cytoskeleton/physiology , Disease Models, Animal , Endothelial Cells/physiology , Extracellular Matrix/physiology , Heart-Assist Devices/adverse effects , Heart-Assist Devices/trends , Hemodynamics , Humans , Macrophages/cytology , Mitochondria, Heart/physiology , Myocardial Contraction , Myocytes, Cardiac/physiology , Transcriptome
ABSTRACT
BACKGROUND: Leukopenia in the early period following heart transplantation (HT) is not well-studied. The aim of this study was to evaluate risk factors for the development of post-transplant leukopenia and its consequences for HT recipients. METHODS: Adult patients at a large-volume transplant center who received HT between January 1, 2010 and December 31, 2020 were included. The incidence of leukopenia (WBC ≤3 × 10³/µL) in the first 90 days following HT, individual risk factors, and its effect on 1-year outcomes were evaluated. RESULTS: Of 506 HT recipients, 184 (36%) developed leukopenia within 90 days. Median duration of the first leukopenia episode was 15.5 days (IQR 8-42.5 days). Individuals who developed leukopenia had lower pre-transplant WBC counts compared to those who did not (6.1 × 10³/µL vs. 6.9 × 10³/µL, p = .02). Initial immunosuppressive and infectious chemoprophylactic regimens were not significantly different between groups. Early leukopenia was associated with higher mortality at 1 year (6.6% vs. 2.1%, p = .008; adjusted HR 3.0) and an increased risk of recurrent episodes. Rates of infection and rejection were not significantly different between the two groups. CONCLUSIONS: Leukopenia in the early period following HT is common and associated with an increased risk of mortality. Further study is needed to identify individuals at highest risk for leukopenia prior to transplant and optimize immunosuppressive and infectious chemoprophylactic regimens for this subgroup.
Subject(s)
Heart Transplantation , Kidney Transplantation , Leukopenia , Adult , Humans , Kidney Transplantation/adverse effects , Leukopenia/epidemiology , Leukopenia/etiology , Immunosuppressive Agents/adverse effects , Risk Factors , Heart Transplantation/adverse effects , Transplant Recipients , Graft Rejection/epidemiology , Graft Rejection/etiology , Graft Rejection/prevention & control , Retrospective Studies
ABSTRACT
OBJECTIVE: Our study examines the long-term outcomes of patients discharged from the hospital without heart replacement therapy (HRT) after recovery from cardiogenic shock using venoarterial extracorporeal life support (VA-ECLS). METHODS: We retrospectively reviewed 615 cardiogenic shock patients who recovered from VA-ECLS at our institution between January 2015 and July 2021. Of those, 166 patients (27.0%) who recovered from VA-ECLS without HRT were included in this study. Baseline characteristics, discharge labs, vitals, electrocardiograms and echocardiograms were assessed. Patients were contacted to determine vital status. The primary outcome was post-discharge mortality. RESULTS: Of 166 patients, 158 patients (95.2%) had post-discharge follow-up, with a median time of follow-up of 2 years (IQR: [1 year, 4 years]). At discharge, the median ejection fraction (EF) was 52.5% (IQR: [32.5, 57.5]). At discharge, 92 patients (56%) were prescribed β-blockers, 28 (17%) were prescribed an ACE inhibitor, ARB, or ARNI, and 50 (30%) were prescribed loop diuretics. Kaplan-Meier analysis showed a 1-year survival rate of 85.6% (95% CI: [80.1%, 91.2%]) and a 5-year survival rate of 60.6% (95% CI: [49.9%, 71.3%]). A Cox regression model demonstrated that a history of congestive heart failure (CHF) was strongly predictive of increased mortality hazard (HR = 1.929; p = 0.036), while neither discharge EF nor etiology of VA-ECLS were associated with increased post-discharge mortality. CONCLUSIONS: Patients discharged from the hospital after full myocardial recovery from VA-ECLS support without HRT should have close outpatient follow-up due to the risk of recurrent heart failure and increased mortality in these patients.
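The survival analyses reported here (Kaplan-Meier estimates and a Cox model for post-discharge mortality) follow a standard pattern; a minimal lifelines sketch with a toy stand-in dataset (illustrative values, not study data):

```python
# Kaplan-Meier survival at fixed horizons plus a Cox model for a
# CHF-history covariate, mirroring the analyses described above.
import pandas as pd
from lifelines import KaplanMeierFitter, CoxPHFitter

# Toy stand-in for the study dataset (illustrative values only).
df = pd.DataFrame({
    "years":       [0.5, 1.2, 3.0, 4.5, 5.0, 2.2, 0.9, 6.1],
    "death":       [1,   0,   1,   0,   0,   1,   0,   1],
    "history_chf": [1,   0,   1,   0,   1,   1,   0,   0],
})

kmf = KaplanMeierFitter().fit(df["years"], event_observed=df["death"])
print(kmf.survival_function_at_times([1, 5]))  # 1- and 5-year survival

cph = CoxPHFitter().fit(df, duration_col="years", event_col="death")
cph.print_summary()  # hazard ratio for CHF history, as in the abstract
```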
ABSTRACT
BACKGROUND: Historically, women have had less access to advanced heart failure therapies, including temporary and permanent mechanical circulatory support and heart transplantation (HT), with worse waitlist and post-transplant survival compared with men. This study evaluated for improvement in sex differences across all phases of HT in the 2018 allocation system. METHODS AND RESULTS: The United Network for Organ Sharing registry was queried to identify adult patients (≥18 years) listed for HT from October 18, 2016, to October 17, 2018 (old allocation), and from October 18, 2018, to October 18, 2020 (new allocation). The outcomes of interest included waitlist survival, pretransplant use of temporary and durable mechanical circulatory support, rates of HT, and post-transplant survival. There were 15,629 patients who were listed for HT and included in this analysis; 7745 (2039 women, 26.3%) in the new and 7875 patients (2074 women, 26.3%) in the old allocation system. When compared with men in the new allocation system, women were more likely to have lower priority United Network for Organ Sharing status at time of transplant, and less likely to be supported by an intra-aortic balloon pump (27.1% vs 32.2%, P < .001), with no difference in the use of venoarterial extracorporeal membrane oxygenation (5.5% vs 6.3%, Pâ¯=â¯.28). Despite these findings, when transplantation was viewed in the context of risk for death or delisting, the cumulative incidence of transplant within 6 months of listing was higher in women than men in the new allocation system (62.4% vs 54.9%, P < .001) with no differences in post-transplant survival. When comparing women in the old with the new allocation system, the distance traveled for organ procurement was 187.5 ± 207.0 miles vs 272.8 ± 233.7 miles (P < .001). CONCLUSIONS: Although the use of temporary mechanical circulatory support in women remains lower than in men in the new allocation system, more women are being transplanted with comparable waitlist and post-transplant outcomes as men. Broader sharing may be making its greatest impact on improving transplant opportunities for women.
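Viewing "transplant within 6 months of listing" in the context of the competing risk of death or delisting corresponds to a cumulative-incidence (Aalen-Johansen) analysis rather than a naive Kaplan-Meier. A hedged sketch with an assumed event coding and toy data:

```python
# Cumulative incidence of transplant with death/delisting as a
# competing risk. Event coding is an assumption for illustration:
# 0 = censored, 1 = transplanted, 2 = died or delisted.
import pandas as pd
from lifelines import AalenJohansenFitter

df = pd.DataFrame({
    "days_on_list": [30, 75, 120, 200, 45, 300, 90, 180],  # toy values
    "event":        [1,  1,  2,   0,   1,  2,   1,  0],
})

ajf = AalenJohansenFitter()
ajf.fit(df["days_on_list"], df["event"], event_of_interest=1)
print(ajf.cumulative_density_)  # cumulative incidence of transplant over time
```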
Subject(s)
Heart Failure , Heart Transplantation , Heart-Assist Devices , Adult , Female , Heart Failure/therapy , Heart Transplantation/methods , Humans , Intra-Aortic Balloon Pumping , Male , Retrospective Studies , Waiting Lists
ABSTRACT
OBJECTIVES: To describe the hemodynamic efficacy and clinical outcomes of the Impella percutaneous left ventricular assist device (pLVAD) in patients with cardiogenic shock (CS). BACKGROUND: Percutaneous LVADs are increasingly used in CS management. However, device-related outcomes and optimal utilization remain active areas of investigation. METHODS: All CS patients receiving pLVAD as mechanical circulatory support (MCS) between 2011 and 2017 were identified. Clinical characteristics and outcomes were analyzed. A multivariable logistic regression model was created to predict MCS escalation despite pLVAD. Outcomes were compared between early and late implantation. RESULTS: A total of 115 CS patients (mean age 63.6 ± 13.8 years; 69.6% male) receiving pLVAD as MCS were identified, the majority with CS secondary to acute myocardial infarction (AMI; 67.0%). Patients experienced significant cardiac output improvement (median 3.39 L/min to 3.90 L/min, p = .002) and pharmacological support reduction (median vasoactive-inotropic score [VIS] 25.4 to 16.4, p = .049). Placement of extracorporeal membrane oxygenation (ECMO) occurred in 48 patients (41.7%). Higher pre-pLVAD VIS was associated with subsequent MCS escalation in the entire cohort and the AMI subgroup (OR 1.27 [95% CI 1.02-1.58], p = .034 and OR 1.72 [95% CI 1.04-2.86], p = .035, respectively). Complications were predominantly access site related (bleeding [9.6%], vascular injury [5.2%], and limb ischemia [2.6%]). In-hospital mortality was 57.4%; numerically greater survival was noted with earlier device implantation. CONCLUSIONS: Treatment with pLVAD for CS improved hemodynamic status but did not uniformly obviate MCS escalation. Mortality in CS remains high, though earlier device placement for appropriately selected patients may be beneficial.
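The vasoactive-inotropic score (VIS) cited above is conventionally computed from weighted drug doses; the abstract does not restate its exact definition, so the sketch below follows the commonly used Gaies formula as an assumption:

```python
# VIS per the commonly used weighting (assumed; not restated in the
# abstract). Doses in µg/kg/min, except vasopressin in U/kg/min.
def vis(dopamine=0.0, dobutamine=0.0, epinephrine=0.0,
        milrinone=0.0, vasopressin=0.0, norepinephrine=0.0):
    return (dopamine + dobutamine
            + 100 * epinephrine
            + 10 * milrinone
            + 10_000 * vasopressin
            + 100 * norepinephrine)

# e.g., vis(epinephrine=0.05, norepinephrine=0.1, milrinone=0.375) -> 18.75
```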
Subject(s)
Heart-Assist Devices , Shock, Cardiogenic , Academic Medical Centers , Aged , Female , Humans , Male , Middle Aged , Retrospective Studies , Shock, Cardiogenic/diagnosis , Shock, Cardiogenic/etiology , Shock, Cardiogenic/therapy , Treatment Outcome
ABSTRACT
BACKGROUND: Significant weight loss due to cardiac cachexia is an independent predictor of mortality in many heart failure (HF) clinical trials. The impact of significant weight loss while on the waitlist for heart transplant (HT) has yet to be studied with respect to post-transplant survival. METHODS: Adult HT recipients from 2010 to 2021 were identified in the UNOS registry. Patients who experienced an absolute weight change from the time of listing to transplant were included and classified into two groups by percent weight loss from time of listing to time of transplant using a cut-off of 10%. The primary endpoint was 1-year survival following HT. RESULTS: 5951 patients were included in the analysis, of whom 763 (13%) experienced ≥10% weight loss from the time of listing to transplant. Weight loss ≥ 10% was associated with reduced 1-year post-transplant survival (86.9% vs. 91.0%, log-rank p = .0003). Additionally, weight loss ≥ 10% was an independent predictor of 1-year mortality in a multivariable model adjusting for significant risk factors (adjusted HR 1.23, 95% CI 1.04-1.46). In secondary analyses, weight loss ≥ 10% was associated with reduced 1-year survival independent of hospitalized status at the time of transplant as well as obesity status at listing (i.e., body mass index [BMI] < 30 kg/m² and BMI ≥ 30 kg/m²). CONCLUSIONS: Preoperative weight loss ≥ 10% is associated with reduced survival in patients listed for HT. Nutrition interventions prior to transplant may prove beneficial in this population.
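The exposure variable here is straightforward to reconstruct: percent weight loss from listing to transplant, dichotomized at 10%. A minimal sketch with placeholder column names standing in for the UNOS fields:

```python
# Dichotomize percent weight loss (listing -> transplant) at 10%.
# Column names are illustrative placeholders, not UNOS field names.
import pandas as pd

def weight_loss_ge_10pct(df: pd.DataFrame) -> pd.Series:
    pct_loss = 100 * (df["weight_at_listing"] - df["weight_at_transplant"]) \
               / df["weight_at_listing"]
    return pct_loss >= 10  # True = >=10% weight loss group
```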
Subject(s)
Heart Failure , Heart Transplantation , Adult , Humans , Retrospective Studies , Obesity/epidemiology , Weight Loss , Waiting Lists
ABSTRACT
INTRODUCTION: For patients with advanced heart failure, socioeconomic deprivation may impede referral for heart transplantation (HT). We examined the association of socioeconomic deprivation with listing among patients evaluated at our institution and compared this against the backdrop of our local community. METHODS: We conducted a retrospective cohort study of patients evaluated for HT between January 2017 and December 2020. Patient demographics and clinical characteristics were recorded. Block group-level area deprivation index (ADI) decile was obtained for each patient's home address, and socioeconomic status (SES) index was determined by patient zip code. RESULTS: In total, 400 evaluations were initiated; one international patient was excluded. Among this population, 111 (27.8%) were women, 219 (54.9%) were White, 94 (23.6%) Black, and 59 (14.8%) Hispanic. 248 (62.2%) patients were listed for transplant. Listed patients had a significantly higher SES index and lower ADI compared to those who were not listed. However, after adjustment for clinical factors, ADI and SES index were not predictive of listing. Similarly, patient sex, race, and insurance did not influence the likelihood of listing for HT. Notably, the distribution of the referral cohort based on ADI deciles was not reflective of our center's catchment area, indicating opportunities for improving access to transplant for disadvantaged populations. CONCLUSIONS: Although socioeconomic deprivation did not predict listing in our analysis, we recognize the need for broader outreach to combat upstream bias that prevents patients from being referred for HT.
Subject(s)
Heart Failure , Heart Transplantation , Academic Medical Centers , Female , Heart Failure/surgery , Humans , Male , Retrospective Studies , Social Class , Socioeconomic Factors
ABSTRACT
PURPOSE OF REVIEW: Due to the growing mismatch between donor supply and demand as well as unacceptably high transplant waitlist mortality, the heart organ allocation system was revised in October 2018. This review gives an overview of the changes in the new heart organ allocation system and its impact on heart transplant practice and outcomes in the United States. RECENT FINDINGS: The 2018 heart allocation system offers a 6-tiered policy and therefore prioritizes the sickest patients on the transplant waitlist. Patients supported with temporary mechanical circulatory support devices are prioritized as Status 1 or Status 2, resulting in increased utilization of this strategy. Patients supported with durable left ventricular assist devices have been prioritized as Status 3 or 4, which has resulted in decreased utilization of this strategy. Broader geographic sharing in the new heart allocation system has resulted in prolonged donor ischemic times. Overall, the new heart allocation system has resulted in significantly lower candidate waitlist mortality, shorter waitlist times, and higher incidence of transplantation. SUMMARY: The new United Network for Organ Sharing allocation policy confers significant advantages over the prior algorithm, allowing for decreased waitlist times and improved waitlist mortality without major impact on posttransplant survival.
Subject(s)
Heart Transplantation , Heart-Assist Devices , Tissue and Organ Procurement , Humans , Policy , Ships , Tissue Donors , United States
ABSTRACT
OBJECTIVE: The objective of this study was to describe the profiles and outcomes of a cohort of advanced heart failure patients on ambulatory inotropic therapy (AIT). BACKGROUND: With the growing burden of patients with end-stage heart failure, AIT is an increasingly common short- or long-term option, for use as bridge to heart transplant (BTT), bridge to ventricular assist device (BTVAD), bridge to decision regarding advanced therapies (BTD), or as palliative care. AIT may be preferred by some patients and physicians to facilitate hospital discharge. However, counseling patients on risks and benefits is critically important in the modern era of defibrillators, durable mechanical support, and palliative care. METHODS: We retrospectively studied a cohort of 241 patients on AIT. End points included transplant, VAD implantation, weaning of inotropes, or death. The primary outcomes were survival on AIT and ability to reach the intended goal if planned as BTT or BTVAD. We also evaluated recurrent heart failure hospitalizations, incidence of ventricular arrhythmias (VT/VF), and indwelling line infections. Unintended consequences of AIT, such as reaching an unintended end point (e.g., VAD implantation in a BTT patient) or a worse-than-expected outcome after LVAD or HT, were recorded. RESULTS: Mean age of the cohort was 60.7 ± 13.2 years, 71% male, with Class III-IV heart failure (56% non-ischemic). Average ejection fraction was 19.4 ± 10.2%, pre-AIT cardiac index was 1.5 ± 0.4 L/min/m², and 24% had prior ventricular arrhythmias. Overall on-AIT 1-year survival was 83%. Hospitalizations occurred in 125 patients (51.9%), a total of 174 times, for worsening heart failure, line complications, or ventricular arrhythmia. In the BTT cohort, only 42% were transplanted by the end of follow-up, with a 14.8% risk of death or delisting for clinical deterioration. For the patients who were transplanted, 1-year post-HT survival was 96.7%. In the BTVAD cohort, 1-year survival after LVAD was 90%, but with 61.7% of patients undergoing LVAD as INTERMACS 1-2. In the palliative care cohort, only 24.5% of patients had a formal palliative care consult prior to AIT. CONCLUSIONS: AIT is a strategy to discharge advanced heart failure patients from the hospital. It may be useful as a bridge to transplant or ventricular assist device, but may be limited by complications such as hospitalizations, infections, and ventricular arrhythmias. Of particular note, it appears more challenging to bridge to transplant on AIT in the new allocation system. It is important to clarify the goals of AIT therapy upfront and continue to counsel patients on the risks and benefits of the therapy itself and potential unintended consequences. Formalized, multi-disciplinary care planning is essential to clearly define individualized patient goals as well as programmatic goals of AIT.
Subject(s)
Ambulatory Care , Cardiotonic Agents , Heart Failure , Tachycardia, Ventricular , Ambulatory Care/methods , Ambulatory Care/statistics & numerical data , Assisted Circulation/instrumentation , Assisted Circulation/methods , Cardiotonic Agents/administration & dosage , Cardiotonic Agents/adverse effects , Cardiotonic Agents/classification , Female , Heart Failure/diagnosis , Heart Failure/drug therapy , Heart Failure/mortality , Heart Failure/physiopathology , Heart Transplantation/methods , Hospitalization/statistics & numerical data , Humans , Intention to Treat Analysis , Male , Middle Aged , Palliative Care/methods , Patient Acuity , Patient Discharge , Risk Assessment , Severity of Illness Index , Stroke Volume , Survival Analysis , Tachycardia, Ventricular/etiology , Tachycardia, Ventricular/prevention & control , United States/epidemiology
ABSTRACT
BACKGROUND: In the general population, increased aortic stiffness is associated with an increased risk of cardiovascular events. Previous studies have demonstrated an increase in aortic stiffness in patients with a continuous flow left ventricular assist device (CF-LVAD). However, the association between aortic stiffness and common adverse events is unknown. METHODS AND RESULTS: Forty patients with a HeartMate II (HMII) (51 ± 11 years; 20% female; 25% ischemic) implanted between January 2011 and September 2017 were included. Two-dimensional transthoracic echocardiograms of the ascending aorta, obtained before HMII placement and early after heart transplant, were analyzed to calculate the aortic stiffness index (AO-SI). The study cohort was divided into patients who had an increased vs decreased AO-SI after LVAD support. A composite outcome of gastrointestinal bleeding, stroke, and pump thrombosis was defined as the primary end point and compared between the groups. While median AO-SI increased significantly after HMII support (AO-SI 4.4-6.5, P = .012), 16 patients had a lower AO-SI. Patients with an increased AO-SI (n = 24) had a significantly higher rate of the composite end point (58% vs 12%, odds ratio 9.8, P < .01). Similarly, those with an increased AO-SI tended to be on LVAD support for a longer duration, had higher LVAD speed, and reduced use of angiotensin-converting enzyme inhibitors or angiotensin II receptor blockers. CONCLUSIONS: Increased aortic stiffness in patients with an HMII is associated with significantly higher rates of adverse events. Further studies are warranted to determine the causality between aortic stiffness and adverse events, as well as the effect of neurohormonal modulation on the conduit vasculature in patients with a CF-LVAD.
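The abstract does not spell out its AO-SI formula; echocardiographic aortic stiffness is commonly expressed as the stiffness index β = ln(SBP/DBP) / ((Ds − Dd)/Dd), using systolic and diastolic aortic diameters and brachial pressures. A sketch under that assumption (not necessarily the study's exact definition):

```python
# Assumed standard stiffness-index formula (beta); the abstract does
# not restate the study's exact AO-SI definition.
import math

def aortic_stiffness_index(sbp, dbp, d_sys, d_dia):
    """beta = ln(SBP/DBP) / ((Ds - Dd) / Dd); pressures in mmHg,
    aortic diameters in any consistent unit."""
    return math.log(sbp / dbp) / ((d_sys - d_dia) / d_dia)

# e.g., aortic_stiffness_index(120, 80, 3.4, 3.1) -> ~4.2
```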
Subject(s)
Heart Failure , Heart-Assist Devices , Stroke , Thrombosis , Vascular Stiffness , Female , Gastrointestinal Hemorrhage/diagnosis , Gastrointestinal Hemorrhage/epidemiology , Gastrointestinal Hemorrhage/etiology , Heart Failure/epidemiology , Heart-Assist Devices/adverse effects , Humans , Male , Retrospective Studies , Stroke/epidemiology , Stroke/etiology , Thrombosis/diagnostic imaging , Thrombosis/epidemiology , Thrombosis/etiology
ABSTRACT
BACKGROUND: Heart failure predisposes to intracardiac thrombus (ICT) formation. There are limited data on the prevalence and impact of preexisting ICT on postoperative outcomes in left ventricular assist device patients. We examined the risk for stroke and death in this patient population. METHODS AND RESULTS: We retrospectively studied patients who were implanted with a HeartMate (HM) II or HM3 between February 2009 and March 2019. Preoperative transthoracic echocardiograms, intraoperative transesophageal echocardiograms, and operative reports were reviewed to identify ICT. There were 525 patients with a left ventricular assist device (median age 60.6 years, 81.8% male, 372 HMII and 151 HM3) included in this analysis. An ICT was identified in 44 patients (8.4%). During the follow-up, 43 patients experienced a stroke and 55 died. After multivariable adjustment, the presence of ICT increased the risk for the composite of stroke or death at 6 months (hazard ratio [HR] 1.82, 95% confidence interval [CI] 1.00-3.33, P = .049). Patients with ICT were also at higher risk for stroke (HR 2.45, 95% CI 1.14-5.28, P = .021) and death (HR 2.36, 95% CI 1.17-4.79, P = .016) at 6 months of follow-up. CONCLUSIONS: The presence of ICT is an independent predictor of stroke and death at 6 months after left ventricular assist device implantation. Additional studies are needed to help risk stratify and optimize the perioperative management of this patient population.
Subject(s)
Heart Failure , Heart-Assist Devices , Stroke , Thrombosis , Female , Heart Failure/epidemiology , Humans , Male , Middle Aged , Retrospective Studies , Stroke/epidemiology , Stroke/etiology , Thrombosis/diagnostic imaging , Thrombosis/epidemiology , Treatment Outcome
ABSTRACT
BACKGROUND: Conditional survival (CS) is a dynamic method of survival analysis that provides an estimate of how an individual's future survival probability changes based on time post-transplant, individual characteristics, and post-transplant events. This study sought to provide post-transplant CS probabilities for heart transplant recipients based on different prognostic variables and provide a discussion tool for the providers and the patients. METHODS: Adult heart transplant recipients from January 1, 2004, through October 18, 2018, were identified in the UNOS registry. CS probabilities were calculated using data from Kaplan-Meier survival estimates. RESULTS: CS probability exceeded actuarial survival probability at all times post-transplant. Women had similar short-term, but greater long-term CS than men at all times post-transplant (10-year CS 1.8-11.5% greater [95% CI 1.2-12.9]). Patients with ECMO or a surgical BiVAD had decreased survival at the time of transplant, but their CS was indistinguishable from all others by 1-year post-transplant. Rejection and infection requiring hospitalization during the first year were associated with a persistently decreased CS probability. CONCLUSIONS: In this study, we report differential conditional survival outcomes based on time, patient characteristics, and clinical events post-transplant, providing a dynamic assessment of survival. The survival probabilities will better inform patients and clinicians of future outcomes.
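Conditional survival has a simple closed form given a survival curve: CS(t | s) = S(s + t) / S(s), the probability of surviving an additional t years given survival to time s, which is why it always exceeds the unconditional (actuarial) estimate. A minimal sketch against a fitted Kaplan-Meier estimator (lifelines assumed; `kmf` is an illustrative input):

```python
# CS(t | s) = S(s + t) / S(s), computed from a fitted lifelines
# KaplanMeierFitter (`kmf` assumed already fit to registry data).
def conditional_survival(kmf, s, t):
    surv = kmf.survival_function_at_times([s, s + t]).to_numpy()
    return surv[1] / surv[0]

# e.g., conditional_survival(kmf, s=1, t=9)
#   -> P(surviving to 10 years | alive at 1 year post-transplant)
```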
Subject(s)
Heart Transplantation , Tissue and Organ Procurement , Adult , Female , Graft Rejection/etiology , Graft Survival , Humans , Kaplan-Meier Estimate , Male , Registries , Retrospective Studies , Survival Analysis , Treatment Outcome
ABSTRACT
BACKGROUND: Bridge to transplantation (BTT) with left ventricular assist devices (LVADs) is a mainstay of therapy for heart failure in patients awaiting heart transplantation (HT). Criteria for HT listing do not differ between patients medically managed and those mechanically bridged to HT. The objectives of the present study were to evaluate the impact of BTT with LVAD on posttransplantation survival, to describe differences in causes of 1-year mortality in medically and mechanically bridged patients, and to evaluate differences in risk factors for 1-year mortality between those with and those without LVAD at the time of HT. METHODS: Using the United Network for Organ Sharing database, we identified 5486 adult, single-organ HT recipients transplanted between 2008 and 2015. Patients were propensity matched for likelihood of LVAD at the time of HT. Kaplan-Meier survival estimates were used to assess the impact of BTT on 1- and 5-year mortality. Logistic regression analysis was used to evaluate the odds ratio of 1-year mortality for patients BTT with LVAD compared with those with medical management across clinically significant variables at various thresholds. RESULTS: Early mortality was higher in mechanically bridged patients: 9.5% versus 7.2% mortality at 1 year (P<0.001). BTT patients incurred an increased risk of 1-year mortality with an estimated glomerular filtration rate of 40 to 60 mL·min⁻¹·1.73 m⁻² (odds ratio, 1.69; P=0.003) and <40 mL·min⁻¹·1.73 m⁻² (odds ratio, 2.16; P=0.005). A similar trend was seen in patients with a body mass index of 25 to 30 kg/m² (odds ratio, 1.88; P=0.024) and >30 kg/m² (odds ratio, 2.11; P<0.001). When patients were stratified by BTT status and the presence of risk factors, including age >60 years, estimated glomerular filtration rate <40 mL·min⁻¹·1.73 m⁻², and body mass index >30 kg/m², there were significant differences in 1-year mortality between medium- and high-risk medically and mechanically bridged patients, with 1-year mortality in high-risk BTT patients at 17.6% compared with 10.4% in high-risk medically managed patients. CONCLUSIONS: Bridge to HT with LVAD, although necessary because of organ scarcity and capable of improving wait list survival, confers a significantly higher risk of early posttransplantation mortality. Patients bridged with mechanical support may require more careful consideration for transplant eligibility after LVAD placement.
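Propensity matching on the likelihood of LVAD at transplant can be sketched as logistic-regression scores followed by 1:1 nearest-neighbor matching; the covariates and matching details below are illustrative, not the study's actual model:

```python
# 1:1 greedy nearest-neighbor propensity matching (with replacement),
# a generic sketch of the approach described above.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.neighbors import NearestNeighbors

def match_controls(X, treated):
    """X: covariate matrix (n x p); treated: boolean array, True where
    the recipient had an LVAD at HT. Returns one matched control row
    index per treated row."""
    ps = LogisticRegression(max_iter=1000).fit(X, treated).predict_proba(X)[:, 1]
    controls = np.flatnonzero(~treated)
    nn = NearestNeighbors(n_neighbors=1).fit(ps[controls].reshape(-1, 1))
    _, idx = nn.kneighbors(ps[treated].reshape(-1, 1))
    return controls[idx.ravel()]
```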
Subject(s)
Heart Failure/therapy , Heart Transplantation/mortality , Heart-Assist Devices , Postoperative Complications/mortality , Cardiovascular Agents/therapeutic use , Comorbidity , Female , Follow-Up Studies , Glomerular Filtration Rate , Heart Failure/drug therapy , Heart Failure/surgery , Humans , Kaplan-Meier Estimate , Logistic Models , Male , Middle Aged , Obesity/epidemiology , Renal Insufficiency, Chronic/epidemiology , Renal Insufficiency, Chronic/physiopathology , Risk Factors , Treatment Outcome , Waiting Lists
ABSTRACT
BACKGROUND: Obesity remains a relative contraindication for heart transplantation, and hence, obese patients with advanced heart failure receive ventricular assist devices (VADs) either as a destination or "bridge to weight loss" strategy. However, the impact of obesity on clinical outcomes after VAD implantation is largely unknown. We sought to determine the clinical outcomes of obese patients with a body mass index (BMI) ≥ 35 kg/m² following contemporary VAD implantation. METHODS: The Interagency Registry for Mechanically Assisted Circulatory Support (INTERMACS) registry was queried for patients who underwent VAD implantation. Patients were categorized into BMI groups based on the World Health Organization classification. RESULTS: Of 17,095 patients, 2620 (15%) had a BMI ≥ 35 kg/m². Obese patients were likely to be young, non-white females with dilated cardiomyopathy and to undergo device implantation as destination therapy. Survival was similar amongst BMI groups (P = .058). Obese patients had significantly higher risk for infection (hazard ratio [HR]: 1.215; P = .001), device malfunction or thrombosis (HR: 1.323; P ≤ .001), cardiac arrhythmia (HR: 1.188; P = .001), and hospital readmissions (HR: 1.073; P = .022), but lower risk of bleeding (HR: 0.906; P = .018). Significant weight loss (≥10%) during VAD support was achieved by only a small proportion (18.6%) of patients with BMI ≥ 35 kg/m². Significant weight loss rates observed in obese patients with VAD implantation as destination and bridge to transplant strategies were comparable. Obese patients with significant weight loss were more likely to undergo cardiac transplantation. Weight loss worsened bleeding risk without altering risk for infection, cardiac arrhythmia, and device complications. CONCLUSIONS: Obesity alone should not be considered a contraindication for VAD therapy in the contemporary era. Given the durability of heart transplantation, strategies should be developed to promote weight loss, which occurs infrequently in obese patients. The impact of weight loss on clinical outcomes of obese patients warrants further investigation.
Subject(s)
Heart Failure , Heart Transplantation , Heart-Assist Devices , Female , Heart Failure/complications , Heart Failure/epidemiology , Heart Failure/therapy , Humans , Obesity/complications , Obesity/epidemiology , Registries , Retrospective Studies , Treatment Outcome
ABSTRACT
BACKGROUND: Gastrointestinal bleeding (GIB) is a common complication of left ventricular assist device (LVAD) therapy, accounting for frequent hospitalizations and high resource utilization. METHODS: We previously developed an endoscopic algorithm emphasizing upfront evaluation of the small bowel and minimizing low-yield procedures in LVAD recipients with GIB. We compared the diagnostic and therapeutic yield of endoscopy, health-care costs, and re-bleeding rates between conventional GIB management and our algorithm using chi-square, Fisher's exact test, Wilcoxon-Mann-Whitney, and Kaplan-Meier analysis. RESULTS: We identified 33 LVAD patients with GIB. Presentation was consistent with upper GIB in 20 (61%), lower GIB in 5 (15%), and occult GIB in 8 (24%) patients. Forty-one endoscopies localized a source in 23 (56%), resulting in 14 (34%) interventions. Algorithm implementation compared with our conventional cohort was associated with a 68% increase in endoscopic diagnostic yield (P < .01), a 113% increase in therapeutic yield (P = .01), a 27% reduction in the number of procedures per patient (P < .01), a 33% decrease in length of stay (P < .01), and an 18% reduction in estimated costs (P < .01). The same median number of red blood cell transfusions were used in the 2 cohorts, with no increase in re-bleeding events in the algorithm cohort (33.3%) compared with our conventional cohort (43.7%). CONCLUSIONS: Our endoscopic management algorithm for GIB in LVAD patients proved effective in reducing low-yield procedures, improving the diagnostic and therapeutic yield of endoscopy, and decreasing health-care resource utilization and costs, while not increasing the risk of a re-bleeding event.
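The between-cohort comparisons of categorical outcomes here (e.g., diagnostic yield) rest on 2x2 contingency tables tested with chi-square or Fisher's exact test; a minimal SciPy sketch with placeholder counts, not the study's data:

```python
# Fisher's exact test on a 2x2 table (cohort x diagnostic endoscopy).
# Counts are hypothetical placeholders for illustration only.
from scipy.stats import fisher_exact

#        diagnostic  non-diagnostic
table = [[23, 18],   # algorithm cohort    (hypothetical)
         [14, 27]]   # conventional cohort (hypothetical)

odds_ratio, p_value = fisher_exact(table)
print(odds_ratio, p_value)
```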