ABSTRACT
BACKGROUND AND AIMS: Early identification of cardiac structural abnormalities indicative of heart failure is crucial to improving patient outcomes. Chest X-rays (CXRs) are routinely performed in a broad population of patients, presenting an opportunity to build scalable screening tools for structural abnormalities indicative of Stage B or worse heart failure with deep learning methods. In this study, a model was developed to identify severe left ventricular hypertrophy (SLVH) and dilated left ventricle (DLV) using CXRs. METHODS: A total of 71 589 unique CXRs from 24 689 different patients, completed within 1 year of echocardiograms, were identified. Labels for SLVH, DLV, and a composite label indicating the presence of either were extracted from echocardiograms. A deep learning model was developed and evaluated using the area under the receiver operating characteristic curve (AUROC). Performance was additionally validated on 8003 CXRs from an external site and compared against visual assessment by 15 board-certified radiologists. RESULTS: The model yielded an AUROC of 0.79 (0.76-0.81) for SLVH, 0.80 (0.77-0.84) for DLV, and 0.80 (0.78-0.83) for the composite label, with similar performance on an external data set. The model outperformed all 15 individual radiologists in predicting the composite label and achieved a sensitivity of 71% vs. 66% against the consensus vote across all radiologists at a fixed specificity of 73%. CONCLUSIONS: Deep learning analysis of CXRs can accurately detect the presence of certain structural abnormalities and may be useful for early identification of patients with LV hypertrophy and dilation. As a resource to promote further innovation, 71 589 CXRs with paired echocardiographic labels have been made publicly available.
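The evaluation metric used throughout this abstract (AUROC) can be computed directly as a rank statistic, without plotting an ROC curve. The sketch below is generic, with made-up labels and scores for illustration; it is not the study's model or data.

```python
def auroc(labels, scores):
    """AUROC via the Mann-Whitney U statistic: the probability that a
    randomly chosen positive case is scored above a randomly chosen
    negative case (ties count as half)."""
    pos = [s for y, s in zip(labels, scores) if y == 1]
    neg = [s for y, s in zip(labels, scores) if y == 0]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

# Illustrative labels/scores only (not study data)
print(auroc([1, 1, 0, 0], [0.9, 0.4, 0.6, 0.2]))  # 0.75
```

An AUROC of 0.80, as reported for the composite label, means a randomly chosen abnormal CXR receives a higher model score than a randomly chosen normal one 80% of the time.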
Subject(s)
Deep Learning , Hypertrophy, Left Ventricular , Radiography, Thoracic , Humans , Hypertrophy, Left Ventricular/diagnostic imaging , Radiography, Thoracic/methods , Female , Male , Middle Aged , Echocardiography/methods , Aged , Heart Failure/diagnostic imaging , Heart Ventricles/diagnostic imaging , ROC Curve
ABSTRACT
BACKGROUND: Sarcopenia is underappreciated in advanced heart failure and is not routinely assessed. In patients receiving a left ventricular assist device, preoperative sarcopenia, defined using the computed tomography (CT)-derived pectoralis muscle-area index (muscle area indexed to body-surface area), is an independent predictor of postoperative mortality. The association between preoperative sarcopenia and outcomes after heart transplant (HT) is unknown. OBJECTIVES: The primary aim of this study was to determine whether preoperative sarcopenia, diagnosed using the pectoralis muscle-area index, is an independent predictor of days alive and out of the hospital (DAOHs) post-transplant. METHODS: Patients who underwent HT between January 2018 and June 2022 with available preoperative chest CT scans were included. Sarcopenia was diagnosed as a pectoralis muscle-area index in the lowest sex-specific tertile. The primary endpoint was DAOHs at 1 year post-transplant. RESULTS: The study included 169 patients. Patients with sarcopenia (n = 55) had fewer DAOHs compared to those without sarcopenia, with a median difference of 17 days (320 vs 337 days; P = 0.004). Patients with sarcopenia had longer index hospitalizations and were also more likely to be discharged to a facility other than home. In a Poisson regression model, sarcopenia was a significant univariable and the strongest multivariable predictor of DAOHs at 1 year (parameter estimate = -0.17, 95% CI -0.19 to -0.14; P < 0.0001). CONCLUSIONS: Preoperative sarcopenia, diagnosed using the pectoralis muscle-area index, is an independent predictor of poor outcomes after HT. This parameter is easily measurable from commonly obtained preoperative CT scans and may be considered in transplant evaluations.
ABSTRACT
BACKGROUND: Aortic regurgitation (AR) is a common complication following left ventricular assist device (LVAD) implantation. We evaluated the hemodynamic implications of AR in patients with HeartMate 3 (HM3) LVADs at baseline and in response to speed changes. METHODS AND RESULTS: Clinically stable outpatients supported by HM3 who underwent a routine hemodynamic ramp test were retrospectively enrolled in this analysis. Patients were stratified based on the presence of at least mild AR at baseline speed. Hemodynamic and echocardiographic parameters were compared between the AR and non-AR groups. Sixty-two patients were identified. At baseline LVAD speed, 29 patients (47%) had AR, while 33 patients (53%) did not. Patients with AR were older and had been supported on HM3 for a longer duration. At baseline speed, all hemodynamic parameters were similar between the groups, including central venous pressure, pulmonary capillary wedge pressure, pulmonary arterial pressures, cardiac output and index, and pulmonary artery pulsatility index (p > 0.05 for all). During the subacute assessment, AR worsened with increases in LVAD speed in some, but not all, patients. There were no significant differences in 1-year mortality or hospitalization rates between the groups; however, at 1 year, ≥ moderate AR and right ventricular failure (RVF) were detected at higher rates in the AR group than in the non-AR group (45% vs. 0%; p < 0.01, and 75% vs. 36.8%; p = 0.02, respectively). CONCLUSIONS: In a cohort of stable outpatients supported with HM3 who underwent a routine hemodynamic ramp test, the presence of mild or greater AR did not impair the ability of the HM3 LVAD to effectively unload the left ventricle during early subacute assessment. Although the presence of AR did not affect mortality or hospitalization rates, it was associated with higher rates of late hemodynamic-related events in the form of progressive AR and RVF.
Subject(s)
Aortic Valve Insufficiency , Heart Failure , Heart-Assist Devices , Humans , Retrospective Studies , Heart Failure/diagnosis , Heart Failure/surgery , Heart-Assist Devices/adverse effects , Aortic Valve Insufficiency/diagnosis , Aortic Valve Insufficiency/etiology , Hemodynamics/physiology
ABSTRACT
BACKGROUND: A novel implantable sensor has been designed to measure the inferior vena cava (IVC) area accurately, allowing daily monitoring of IVC area and collapse to predict congestion in heart failure (HF). METHODS: A prospective, multicenter, single-arm Early Feasibility Study enrolled 15 patients with HF (irrespective of ejection fraction) who had an HF event in the previous 12 months, an elevated NT-proBNP level, and were receiving ≥ 40 mg of furosemide equivalent. Primary endpoints included successful deployment without procedure-related (30 days) or sensor-related complications (3 months) and successful data transmission to a secure database (3 months). Accuracy of the sensor-derived IVC area, patient adherence, NYHA classification, and KCCQ were assessed from baseline to 3 months. Patient-specific signal alterations were correlated with clinical presentation to guide interventions. RESULTS: Fifteen patients underwent implantation: age 66 ± 12 years; 47% female; 27% with HFpEF; median NT-proBNP level 2569 ng/L (IQR: 1674-5187); 87% NYHA class III. All patients met the primary safety and effectiveness endpoints. Sensor-derived IVC areas showed excellent agreement with concurrent computed tomography (R2 = 0.99; mean absolute error = 11.15 mm2). Median adherence to daily readings was 98% (IQR: 86%-100%) per patient-month. A significant improvement was seen in NYHA class, and a nonsignificant improvement was observed in KCCQ. CONCLUSIONS: Implantation of a novel IVC sensor (FIRE1) was feasible, uncomplicated, and safe. Sensor outputs aligned with clinical presentations and improvements in clinical outcomes. Further investigation of the IVC sensor for remote management of HF is strongly warranted.
ABSTRACT
BACKGROUND: The relationship between the age of a heart transplant (HT) program and outcomes has not been explored. METHODS: We performed a retrospective cohort analysis of the United Network for Organ Sharing database of all adult HTs between 2009 and 2019. For each patient, we created a variable corresponding to program age: new (<5 years), developing (≥5 but <10 years), and established (≥10 years). RESULTS: Of 20 997 HTs, 822 were at new, 908 at developing, and 19 267 at established programs. Patients at new programs were significantly more likely to have a history of cigarette smoking, ischemic cardiomyopathy, and prior sternotomy. These programs were less likely to accept organs from older donors and from those with a history of hypertension or cigarette use. As compared to patients at new programs, patients at established programs had lower rates of treated rejection during the index hospitalization (HR 0.43 [95% CI, 0.36-0.53], p < 0.001) and at 1 year (HR 0.58 [95% CI, 0.49-0.70], p < 0.001), less frequently required pacemaker implantation (HR 0.50 [95% CI, 0.36-0.69], p < 0.001), and less frequently required dialysis (HR 0.66 [95% CI, 0.53-0.82], p < 0.001). However, there were no significant differences in short- or long-term survival between the groups (log-rank p = 0.24). CONCLUSION: Patient and donor selection differed between new, developing, and established HT programs, but survival was equivalent across groups. New programs had a higher likelihood of treated rejection, pacemaker implantation, and need for dialysis. Standardized post-transplant practices may help to minimize this variation and ensure optimal outcomes for all patients.
Subject(s)
Heart Transplantation , Humans , Heart Transplantation/mortality , Female , Male , Retrospective Studies , Middle Aged , Follow-Up Studies , Survival Rate , Adult , Prognosis , Tissue and Organ Procurement/statistics & numerical data , Graft Survival , Risk Factors , Graft Rejection/mortality , Graft Rejection/etiology , Postoperative Complications/mortality , Tissue Donors/supply & distribution , Age Factors , Aged
ABSTRACT
BACKGROUND: Among heart transplant (HT) recipients who develop advanced graft dysfunction, cardiac re-transplantation may be considered. A smaller subset of patients will experience failure of their second allograft and undergo repeat re-transplantation. Outcomes among these individuals are not well-described. METHODS: Adult and pediatric patients in the United Network for Organ Sharing (UNOS) registry who received HT between January 1, 1990 and December 31, 2020 were included. RESULTS: Between 1990 and 2020, 90 individuals received a third HT and three underwent a fourth HT. Recipients were younger than those undergoing primary HT (mean age 32 years). Third HT was associated with significantly higher unadjusted rates of 1-year mortality (18% for third HT vs. 13% for second HT vs. 9% for primary HT, p < .001) and 10-year mortality (59% for third HT vs. 42% for second HT vs. 37% for primary HT, p < .001). Mortality was highest amongst recipients aged >60 years and those re-transplanted for acute graft failure. Long-term rates of CAV, rejection, chronic dialysis, and hospitalization for infection were also higher. CONCLUSIONS: Third HT is associated with higher morbidity and mortality than primary HT. Further consensus is needed regarding appropriate organ stewardship for this unique subgroup.
Subject(s)
Heart Transplantation , Adult , Humans , Child , Risk Factors , Survival Rate , Transplantation, Homologous , Graft Rejection/etiology , Retrospective Studies
ABSTRACT
Cardiac allograft vasculopathy (CAV) is a major cause of morbidity and mortality following heart transplantation (HT). Prior studies identified distinct CAV trajectories in the early post-HT period with unique predictors, but the evolution of CAV in later periods is not well-described. This study assessed the prevalence of late CAV progression and associated risk factors in HT recipients with ISHLT CAV 0/1 at 10 years post-HT. Consecutive adult patients who underwent HT from January 2000 to December 2008 were evaluated and grouped by CAV trajectories into progressors (developed ISHLT CAV 2/3) or nonprogressors (remained ISHLT CAV 0/1). A total of 130 patients were included with a median age at angiography of 61.7 years and a median follow-up time of 4.8 years. 8.5% progressed to CAV 2/3, while the remaining 91.5% were nonprogressors. Progression was not associated with death or retransplantation (27.3% [progressor] vs. 21.0% [nonprogressor], p = 0.70). These data may inform shared decision-making about late CAV screening.
Subject(s)
Disease Progression , Heart Transplantation , Postoperative Complications , Humans , Female , Male , Middle Aged , Follow-Up Studies , Heart Transplantation/adverse effects , Heart Transplantation/mortality , Risk Factors , Prognosis , Retrospective Studies , Graft Survival , Survival Rate , Graft Rejection/etiology , Coronary Artery Disease/surgery , Coronary Artery Disease/etiology , Adult , Aged
ABSTRACT
BACKGROUND: Since the 2018 allocation system change in heart transplantation (HT), ischemic times have increased, which may be associated with peri-operative and post-operative complications. This study aimed to compare ischemia reperfusion injury (IRI) in hearts preserved using ice-cold storage (ICS) and the Paragonix SherpaPak™ Cardiac Transport System (CTS). METHODS: From January 2021 to June 2022, consecutive endomyocardial biopsies from 90 HT recipients were analyzed by a cardiac pathologist in a single-blinded manner: 33 ICS and 57 CTS. Endomyocardial biopsies were performed at three time intervals post-HT, and the severity of IRI, manifesting histologically as coagulative myocyte necrosis (CMN), was evaluated, along with graft rejection and graft function. RESULTS: The incidence of IRI at weeks 1, 4, and 8 post-HT was similar between the ICS and CTS groups. There was a statistically significant 59.3% reduction in CMN from week 1 to week 4 with CTS, but not with ICS. By week 8, there were significant reductions in CMN in both groups. Only 1 of 33 (3%) patients in the ICS group had an ischemic time >240 min, compared to 10 of 52 (19%) patients in the CTS group. During the follow-up period of 8 weeks to 12 months, there were no significant differences in rejection rates, formation of de novo donor-specific antibodies, or overall survival between the groups. CONCLUSION: The CTS preservation system had rates of IRI and clinical outcomes similar to ICS despite longer overall ischemic times, with significantly more recovery from IRI in the early postoperative period. This study supports CTS as a viable option for preservation from remote locations, expanding the donor pool.
Subject(s)
Graft Rejection , Graft Survival , Heart Transplantation , Organ Preservation , Humans , Heart Transplantation/adverse effects , Male , Female , Organ Preservation/methods , Middle Aged , Follow-Up Studies , Graft Rejection/etiology , Graft Rejection/pathology , Prognosis , Adult , Reperfusion Injury/etiology , Reperfusion Injury/pathology , Cryopreservation/methods , Tissue Donors/supply & distribution , Postoperative Complications , Retrospective Studies
ABSTRACT
BACKGROUND: The use of glucagon-like peptide-1 receptor agonists (GLP1-RA) has increased dramatically over the past 5 years for type 2 diabetes mellitus (T2DM) and obesity. These comorbidities are prevalent in adult heart transplant (HT) recipients. However, there are limited data evaluating the efficacy of this drug class in this population. The aim of the current study was to describe cardiometabolic changes in HT recipients prescribed GLP1-RA at a large-volume transplant center. METHODS: We retrospectively reviewed all adult HT recipients who received a GLP1-RA after HT for a minimum of 1 month. Cardiometabolic parameters including body mass index (BMI), lipid panel, hemoglobin A1C, estimated glomerular filtration rate (eGFR), and NT-proBNP were compared prior to initiation of the drug and at the most recent follow-up. We also evaluated for significant dose adjustments to immunosuppression after drug initiation and for adverse effects leading to drug discontinuation. RESULTS: Seventy-four patients were included (28% female, 53% White, 20% Hispanic) and followed for a median of 383 days [IQR 209, 613] on a GLP1-RA. The majority of patients (n = 56, 76%) were prescribed semaglutide. The most common indication was T2DM alone (n = 33, 45%), followed by combined T2DM and obesity (n = 26, 35%). At the most recent follow-up, mean BMI decreased from 33.3 to 31.5 kg/m2 (p < 0.0001), HbA1C from 7.3% to 6.7% (p = 0.005), LDL from 78.6 to 70.3 mg/dL (p = 0.018), and basal insulin daily dose from 32.6 to 24.8 units (p = 0.0002). CONCLUSION: HT recipients prescribed GLP1-RA therapy showed improved glycemic control, weight loss, and improved cholesterol levels during the study follow-up period. GLP1-RA were well tolerated and rarely associated with changes in immunosuppression dosing.
Subject(s)
Glucagon-Like Peptide-1 Receptor , Heart Transplantation , Humans , Female , Male , Retrospective Studies , Middle Aged , Glucagon-Like Peptide-1 Receptor/agonists , Heart Transplantation/adverse effects , Follow-Up Studies , Prognosis , Diabetes Mellitus, Type 2/drug therapy , Glomerular Filtration Rate , Hypoglycemic Agents/therapeutic use , Kidney Function Tests , Adult , Postoperative Complications/drug therapy , Graft Rejection/etiology , Graft Rejection/prevention & control , Graft Rejection/drug therapy , Glucagon-Like Peptide-1 Receptor Agonists
ABSTRACT
BACKGROUND: There are limited data evaluating the success of a structured transition plan specifically for pediatric heart transplant (HT) recipients following their transfer of care to an adult specialist. We sought to identify risk factors for poor adherence, graft failure, and mortality following the transfer of care to adult HT care teams. METHODS: We retrospectively reviewed all patients who underwent transition from the pediatric to adult HT program at our center between January 2011 and June 2021. Demographic characteristics, comorbid conditions, and psychosocial history were collected at the time of HT, the time of transition, and the most recent follow-up. Adverse events including mortality, graft rejection, infection, and renal function were also captured before and after the transition. RESULTS: Seventy-two patients were identified (54.1% male, 54.2% Caucasian). Mean age at the time of transition was 23 years after a median of 11.6 years in the pediatric program. The use of calcineurin inhibitors was associated with reduced mortality (HR .04, 95% CI .0-.6, p = .015), while prior psychiatric hospitalization (HR 45.3, 95% CI, 6.144-333.9, p = .0001) was associated with increased mortality following transition. Medication nonadherence and young age at the time of transition were markers for high-risk individuals prior to the transition of care. CONCLUSIONS: Transition of HT recipients from a pediatric program to an adult program occurs during a vulnerable time of emerging adulthood, and we have identified risk factors for mortality following transition. Development of a formalized transition plan with a large multidisciplinary team with focused attention on high-risk patients, including those with psychiatric comorbidities, may favorably influence outcomes.
Subject(s)
Heart Transplantation , Medication Adherence , Adult , Humans , Child , Male , Female , Retrospective Studies , Risk Factors , Graft Rejection/etiology , Transplant Recipients , Patient Care Team
ABSTRACT
BACKGROUND: Donor-derived cell-free DNA (dd-cfDNA) has emerged as a reliable, noninvasive method for surveillance of allograft rejection in heart transplantation (HT) patients, but its utility in multi-organ transplants (MOT) is unknown. We describe our experience using dd-cfDNA in simultaneous MOT recipients. METHODS: We performed a single-center retrospective review of all HT recipients between 2018 and 2022 who had at least one dd-cfDNA measurement collected. Patients who had simultaneous MOT were identified and included in this study. Levels of dd-cfDNA were paired with endomyocardial biopsies (EMB) performed within 1 month of blood testing, if available. Acute cellular rejection (ACR) was defined as International Society for Heart and Lung Transplantation (ISHLT) grade ≥ 2R, and antibody-mediated rejection (AMR) was defined as pAMR grade > 0. The within-patient variability score of dd-cfDNA was calculated as the variance divided by the average. RESULTS: The study included 25 multiorgan transplant recipients: 13 heart-kidney (H-K), 8 heart-liver (H-Li), and 4 heart-lung (H-Lu). The median age was 55 years, and 44% were female; the median time from HT until the first dd-cfDNA measurement was 4.5 months (IQR 2, 10.5). The median dd-cfDNA level was 0.18% (IQR 0.15%, 0.27%) for H-K, 1.15% (IQR 0.77%, 2.33%) for H-Li, and 0.69% (IQR 0.62%, 1.07%) for H-Lu patients (p < 0.001). The prevalence of positive dd-cfDNA tests (threshold of 0.20%) was 42.2%, 97.3%, and 92.3% in the H-K, H-Li, and H-Lu groups, respectively. The within-patient variability score was highest in the H-Li group (median 0.45 [IQR 0.29, 0.94]) and lowest in the H-K group (median 0.09 [IQR 0.06, 0.12]); p = 0.002. No evidence of cardiac ACR or AMR was found. Three patients experienced renal allograft ACR and/or AMR, two patients experienced rejection of the liver allograft, and one patient experienced an episode of AMR-mediated lung rejection.
One person in the H-K group experienced an episode of cardiac allograft dysfunction that was not associated with biopsy-confirmed rejection. CONCLUSION: Dd-cfDNA is chronically elevated in most MOT recipients. There is a high degree of within-patient variability in levels (particularly for H-Li and H-Lu recipients), which may limit the utility of this assay in monitoring MOT recipients.
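The within-patient variability score described above (variance divided by the average of a patient's serial dd-cfDNA levels) can be sketched in a few lines. The function name and example values below are illustrative only, not data from the study.

```python
from statistics import mean, pvariance

def variability_score(levels):
    """Within-patient variability of serial dd-cfDNA levels (%),
    computed as variance divided by the average, per the definition above."""
    return pvariance(levels) / mean(levels)

# Illustrative values only: a stable low-level series vs. a fluctuating one
stable = [0.15, 0.18, 0.20, 0.17]
fluctuating = [0.6, 2.3, 0.8, 1.9]
print(variability_score(stable) < variability_score(fluctuating))  # True
```

As the abstract reports, a higher score (as in the H-Li group) means levels swing widely around their mean, which is what limits the assay's interpretability in those recipients.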
Subject(s)
Cell-Free Nucleic Acids , Graft Rejection , Heart Transplantation , Tissue Donors , Humans , Female , Cell-Free Nucleic Acids/blood , Male , Retrospective Studies , Middle Aged , Heart Transplantation/adverse effects , Graft Rejection/diagnosis , Graft Rejection/etiology , Graft Rejection/blood , Follow-Up Studies , Prognosis , Organ Transplantation/adverse effects , Graft Survival , Biomarkers/blood , Transplant Recipients , Risk Factors , Adult
ABSTRACT
BACKGROUND: Belatacept (BTC), a fusion protein, selectively inhibits T-cell co-stimulation by binding to the CD80 and CD86 receptors on antigen-presenting cells (APCs) and has been used as immunosuppression in adult renal transplant recipients. However, data regarding its use in heart transplant (HT) recipients are limited. This retrospective cohort study aimed to delineate BTC's application in HT, focusing on efficacy, safety, and associated complications at a high-volume HT center. METHODS: A retrospective cohort study was conducted of patients who underwent HT between January 2017 and December 2021 and subsequently received BTC as part of their immunosuppressive regimen. Twenty-one HT recipients were identified. Baseline characteristics, history of rejection, and indication for BTC use were collected. Outcomes included renal function, graft function, allograft rejection, and mortality. Follow-up data were collected through December 2023. RESULTS: Among 776 patients monitored from January 2017 to December 2021, 21 (2.7%) received BTC treatment. The average age at transplantation was 53 years (± 12 years), and 38% were women. BTC administration began a median of 689 [483, 1830] days post-HT. The primary indications for BTC were elevated pre-formed donor-specific antibodies in highly sensitized patients (66.6%) and renal sparing (23.8%), in conjunction with reduced calcineurin inhibitor (CNI) dosage. Only one (4.8%) patient experienced rejection within a year of starting BTC. Graft function by echocardiography remained stable at 6 and 12 months post-treatment. An improvement was observed in serum creatinine levels (76.2% of patients), which decreased from a median of 1.58 to 1.45 (IQR [1.0-2.1] to [1.1-1.9]) over 12 months (p = .054). eGFR improved at 3 and 6 months compared with levels 3 months pre-BTC; however, this was not statistically significant (p = .24). Treatment discontinuation occurred in seven patients (33.3%), of whom four (19%) were switched back to full-dose CNI.
Infections occurred in 11 patients (52.4%), leading to BTC discontinuation in 4 patients (19%). CONCLUSION: In this cohort, BTC was used as alternative immunosuppression for management of highly sensitized patients or for renal sparing. BTC combined with CNI dose reduction was associated with stabilization of renal function as measured by surrogate markers, although this did not reach statistical significance. Patients on BTC maintained a low rejection rate and preserved graft function. Infections were common during BTC therapy and were associated with medication pause or discontinuation in 19% of patients. Further randomized studies are needed to assess the efficacy and safety of BTC in HT recipients.
Subject(s)
Heart Transplantation , Kidney Transplantation , Adult , Humans , Female , Middle Aged , Male , Abatacept , Retrospective Studies , Kidney Transplantation/adverse effects , Immunosuppressive Agents , Calcineurin Inhibitors/therapeutic use , T-Lymphocytes , Graft Rejection/drug therapy , Graft Rejection/etiology , Transplant Recipients , Graft Survival
ABSTRACT
BACKGROUND: Cardiomyopathies account for more than half of cardiovascular disease during the peripartum period. In the extreme, patients may present with cardiogenic shock (CS) requiring mechanical circulatory support (MCS). The aim of this study was to report our experience with CS requiring MCS in the peripartum period. METHODS: We present a single-center retrospective analysis of all CS cases involving MCS during the peripartum period between 2012 and 2023. RESULTS: Eleven cases were included. Median age was 33 years, median BMI was 30.4, and 73% underwent a cesarean section for delivery. CS presentation occurred during pregnancy in 36.4% and after delivery in 63.6%. Most patients were in Society for Cardiovascular Angiography & Interventions (SCAI) Stage C shock, and in 37% the suspected etiology was peripartum cardiomyopathy. MCS usage included intra-aortic balloon pump (4), Impella microaxial blood pump (2), veno-arterial extracorporeal membrane oxygenation (6), and temporary right ventricular assist devices (2), with some patients requiring multiple MCS devices. The rate of major complications was 36.4%. During a median follow-up of 4.5 years, 7 patients (63.6%) had sustained cardiac recovery, 1 patient (9.1%) underwent cardiac transplantation, 2 patients (18.2%) received a durable LVAD, and 2 (18.2%) died. CONCLUSION: MCS for severe CS during the peripartum period is rare and was associated with favorable outcomes in this series. High recovery rates suggest favoring MCS/LVAD as a first-line strategy over transplantation.
ABSTRACT
BACKGROUND: Pre-left ventricular assist device (LVAD) pectoralis muscle assessment, an estimate of sarcopenia, has been associated with postoperative mortality and gastrointestinal bleeding, though its association with inflammation, endotoxemia, length-of-stay (LOS), and readmissions remains underexplored. METHODS: This was a single-center cohort study of LVAD patients implanted 1/2015-10/2018. Preoperative pectoralis muscle area was measured on chest computed tomography (CT), adjusted for height squared to derive pectoralis muscle area index (PMI). Those with PMI in the lowest quintile were defined as low-PMI cohort; all others constituted the reference cohort. Biomarkers of inflammation (interleukin-6, adiponectin, tumor necrosis factor-α [TNFα]) and endotoxemia (soluble (s)CD14) were measured in a subset of patients. RESULTS: Of the 254 LVAD patients, 95 had a preoperative chest CT (median days pre-LVAD: 7 [IQR 3-13]), of whom 19 (20.0%) were in the low-PMI cohort and the remainder were in the reference cohort. Compared with the reference cohort, the low-PMI cohort had higher levels of sCD14 (2594 vs. 1850 ng/mL; p = 0.04) and TNFα (2.9 vs. 1.9 pg/mL; p = 0.03). In adjusted analyses, the low-PMI cohort had longer LOS (incidence rate ratio 1.56 [95% confidence interval 1.16-2.10], p = 0.004) and higher risk of 90-day and 1-year readmissions (subhazard ratio 5.48 [1.88-16.0], p = 0.002; hazard ratio 1.73 [1.02-2.94]; p = 0.04, respectively). CONCLUSIONS: Pre-LVAD PMI is associated with inflammation, endotoxemia, and increased LOS and readmissions.
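The index used above (pectoralis muscle area adjusted for height squared) can be sketched as follows. This is a rough illustration, assuming area in cm2 and height in meters; the quintile-cutoff helper is hypothetical, since the study derives its low-PMI cohort from the distribution of its own 95 patients.

```python
def pectoralis_muscle_area_index(muscle_area_cm2, height_m):
    """PMI: CT-derived pectoralis muscle area divided by height squared (cm^2/m^2)."""
    return muscle_area_cm2 / height_m ** 2

def in_low_pmi_cohort(pmi, cohort_pmis):
    """Illustrative helper: flag a PMI falling in the lowest quintile of a
    cohort, mirroring the study's definition of the low-PMI cohort."""
    cutoff = sorted(cohort_pmis)[max(0, len(cohort_pmis) // 5 - 1)]
    return pmi <= cutoff

# Hypothetical patient: 32 cm^2 pectoralis area, 1.75 m tall
print(round(pectoralis_muscle_area_index(32.0, 1.75), 2))  # 10.45
```

The same area-over-height-squared construction appears in the sarcopenia/heart-transplant abstract earlier in this collection, where the cutoff is a sex-specific tertile rather than a quintile.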
ABSTRACT
BACKGROUND: Hospital readmissions following left ventricular assist device (LVAD) implantation remain frequent and are associated with decreased quality of life and increased resource utilization. This study sought to determine the causes, predictors, and impact on survival of hospitalizations during HeartMate 3 (HM3) support. METHODS: All patients implanted with an HM3 between November 2014 and December 2019 at Columbia University Irving Medical Center were consecutively enrolled in the study. Demographics and clinical characteristics from the index admission and the first outpatient visit were collected and used to estimate 1-year and 900-day readmission-free survival and overall survival. Multivariable analysis was performed for subsequent readmissions. RESULTS: Of 182 patients who received an HM3 LVAD, 167 (92%) were discharged after the index admission and experienced 407 unplanned readmissions over a median follow-up of 727 days (interquartile range (IQR): 410.5, 1124.5). The 1-year and 900-day mean cumulative numbers of all-cause unplanned readmissions were 0.43 (95% CI, 0.36-0.51) and 1.13 (95% CI, 0.99-1.29), respectively. The most frequent causes of rehospitalization were major infections (29.3%), bleeding (13.2%), device-related complications (12.5%), volume overload (7.1%), and other (28%). One-year and 900-day survival free from all-cause readmission was 38% (95% CI, 31-46%) and 16.6% (95% CI, 10.3-24.4%), respectively. One-year and 900-day freedom from 2, 3, and ≥4 readmissions were 60.7%, 74%, and 74.5%, and 26.2%, 33.3%, and 41.3%, respectively. One-year and 900-day survival were unaffected by the number of readmissions and remained >90%. Male sex, ischemic etiology, diabetes, lower serum creatinine, longer duration of index hospitalization, and a history of readmission between discharge and the first outpatient visit were associated with subsequent readmissions. CONCLUSIONS: Unplanned hospital readmissions after HM3 implantation are common, with infections and bleeding accounting for the majority of readmissions.
Irrespective of the number of readmissions, one-year survival remained unaffected.
Subject(s)
Heart Failure , Heart-Assist Devices , Patient Readmission , Humans , Patient Readmission/statistics & numerical data , Male , Female , Heart-Assist Devices/adverse effects , Middle Aged , Aged , Heart Failure/mortality , Heart Failure/therapy , Retrospective Studies , Adult , Risk Factors , Postoperative Complications/epidemiology , Postoperative Complications/etiology , Postoperative Complications/mortality , Quality of Life
ABSTRACT
BACKGROUND: No clear guidelines exist for perioperative anticoagulation management after durable left ventricular assist device insertion. In this study, we sought to compare outcomes between anti-factor Xa (FXa) and activated partial thromboplastin time (aPTT) monitoring of unfractionated heparin (UFH) dosing after HeartMate 3 (HM3) insertion. METHODS: This is a single-center retrospective review of patients who received UFH after HM3 insertion between 01/2020 and 12/2022. The post-operative UFH dose was titrated to an aPTT goal of 45-60 sec (n = 53) or an FXa goal of 0.1-0.2 U/mL (n = 59). Baseline differences between cohorts were balanced by inverse probability treatment weighting. RESULTS: At baseline, unadjusted FXa patients were more likely to be white (47.5% vs. 35.8%, p < 0.001) and INTERMACS 1-2 (69.5% vs. 47.2%, p = 0.013), and to have a history of coronary artery disease (66.1% vs. 43.4%, p = 0.026) and lower eGFR (54.1 vs. 63.7 mL/min/1.73 m2, p = 0.029) compared to the aPTT group. After adjusting for several bleeding/thrombosis risk factors, 97.5% of FXa and 91.0% of aPTT patients reached therapeutic levels, with comparable UFH duration and maximum dose. Moreover, in-hospital mortality (2.5% vs. 3.1%, p = 0.842), major bleeding events (4.2% vs. 9.2%, p = 0.360), and thromboembolic events (21.8% vs. 10.1%, p = 0.151) did not differ significantly between the FXa and aPTT cohorts. There was a high degree of variability in FXa (r2 = 0.20) and aPTT (r2 = 0.22) values for any given UFH dose. CONCLUSIONS: No differences in the frequency of bleeding or thromboembolic events were observed between the FXa and aPTT cohorts after HM3 implantation. More longitudinal studies are warranted to determine whether one assay is superior to the other.
ABSTRACT
Cardiac allograft vasculopathy (CAV) is a leading cause of late graft failure and mortality after heart transplantation (HT). Sharing some features with atherosclerosis, CAV results in diffuse narrowing of the epicardial coronaries and microvasculature, with consequent graft ischemia. Recently, clonal hematopoiesis of indeterminate potential (CHIP) has emerged as a risk factor for cardiovascular disease and mortality. We aimed to investigate the relationship between CHIP and posttransplant outcomes, including CAV. We analyzed 479 HT recipients with stored DNA samples at 2 high-volume transplant centers, Vanderbilt University Medical Center and Columbia University Irving Medical Center. We explored the association of CHIP mutations with CAV and mortality after HT. In this case-control analysis, carriers of CHIP mutations were not at increased risk of CAV or mortality after HT. In this large multicenter genomics study of the heart transplant population, the presence of CHIP mutations was not associated with an increased risk of CAV or posttransplant mortality.
Subject(s)
Heart Diseases , Heart Transplantation , Vascular Diseases , Humans , Clonal Hematopoiesis/genetics , Heart Transplantation/adverse effects , Vascular Diseases/etiology , Risk Factors , Allografts
ABSTRACT
RATIONALE & OBJECTIVE: The clinical implications of the discrepancy between cystatin C (cysC)- and serum creatinine (Scr)-estimated glomerular filtration rate (eGFR) in patients with heart failure (HF) and reduced ejection fraction (HFrEF) are unknown. STUDY DESIGN: Post-hoc analysis of randomized trial data. SETTING & PARTICIPANTS: 1,970 patients with HFrEF enrolled in PARADIGM-HF with available baseline cysC and Scr measurements. EXPOSURE: Intraindividual differences between eGFR based on cysC (eGFRcysC) and Scr (eGFRScr; eGFRdiffcysC-Scr). OUTCOMES: Clinical outcomes included the PARADIGM-HF primary end point (composite of cardiovascular [CV] mortality or HF hospitalization), CV mortality, all-cause mortality, and worsening kidney function. We also examined poor health-related quality of life (HRQoL), frailty, and worsening HF (WHF), defined as HF hospitalization, emergency department visit, or outpatient intensification of therapy between baseline and 8-month follow-up. ANALYTICAL APPROACH: Fine-Gray subdistribution hazard models and Cox proportional hazards models were used to regress clinical outcomes on baseline eGFRdiffcysC-Scr. Logistic regression was used to investigate the association of baseline eGFRdiffcysC-Scr with poor HRQoL and frailty. Linear regression models were used to assess the association of WHF with eGFRcysC, eGFRScr, and eGFRdiffcysC-Scr at 8-month follow-up. RESULTS: Baseline eGFRdiffcysC-Scr was higher than +10 and lower than -10 mL/min/1.73 m2 in 13.0% and 35.7% of patients, respectively. More negative values of eGFRdiffcysC-Scr were associated with worse outcomes ([sub]hazard ratio per standard deviation: PARADIGM-HF primary end point, 1.18; P=0.008; CV mortality, 1.34; P=0.001; all-cause mortality, 1.39; P<0.001; worsening kidney function, 1.31; P=0.05). For a 1-standard-deviation decrease in eGFRdiffcysC-Scr, the prevalences of poor HRQoL and frailty increased by 29% and 17%, respectively (P≤0.008).
WHF was associated with a more pronounced decrease in eGFRcysC than in eGFRScr, resulting in a change in 8-month eGFRdiffcysC-Scr of -4.67 mL/min/1.73 m2 (P<0.001). LIMITATIONS: Lack of gold-standard assessment of kidney function. CONCLUSIONS: In patients with HFrEF, discrepancies between eGFRcysC and eGFRScr are common and are associated with clinical outcomes, HRQoL, and frailty. The decline in kidney function associated with WHF is more marked when assessed with eGFRcysC than with eGFRScr. PLAIN-LANGUAGE SUMMARY: Kidney function assessment traditionally relies on serum creatinine (Scr) to establish an estimated glomerular filtration rate (eGFR). However, this has been challenged with the introduction of an alternative marker, cystatin C (cysC). Muscle mass and nutritional status have differential effects on eGFR based on cysC (eGFRcysC) and Scr (eGFRScr). Among ambulatory patients with heart failure enrolled in PARADIGM-HF, we investigated the clinical significance of the difference between eGFRcysC and eGFRScr. More negative values (ie, eGFRScr > eGFRcysC) were associated with worse clinical outcomes (including mortality), poor quality of life, and frailty. In patients with progressive heart failure, which is characterized by muscle loss and poor nutritional status, the decline in kidney function was more pronounced when eGFR was estimated using cysC rather than Scr.
ABSTRACT
Coronary allograft vasculopathy (CAV) continues to afflict a high number of heart transplant (HT) recipients, and elevated LDL is a key risk factor. Many patients cannot tolerate statin medications after HT; however, data for alternative agents remain scarce. To address this key evidence gap, we evaluated the safety and efficacy of PCSK9 inhibitors (PCSK9i) after HT through systematic review and meta-analysis. We searched Medline, Cochrane Central, and Scopus from the earliest date through July 15, 2021. Citations were included if they reported PCSK9i use in adults after HT and reported an outcome of interest. Outcomes included change in LDL cholesterol from baseline, incidence of adverse events, and evidence of CAV. Changes from baseline and outcome incidences were pooled using contemporary random-effects model methodologies. Six studies comprising 97 patients were included. Over a mean follow-up of 13 months (range 3-21), PCSK9i use lowered LDL by 82.61 mg/dL (95% CI -119.15 to -46.07; I2 = 82%) from baseline. Serious adverse drug reactions were rarely reported, and none was attributable to PCSK9i therapy. Four studies reported stable calcineurin inhibitor levels during PCSK9i initiation. One study reported outcomes in 33 patients with serial coronary angiography and intravascular ultrasound, in which PCSK9i use was associated with stable coronary plaque thickness and lumen area. One study reported on immunologic safety, showing no donor-specific antibody (DSA) development within 1 month of therapy. Preliminary data suggest that long-term PCSK9i therapy is safe, significantly lowers LDL, and may attenuate CAV after HT. Additional study in larger cohorts is warranted to confirm these findings.
Subject(s)
Heart Transplantation , Hydroxymethylglutaryl-CoA Reductase Inhibitors , PCSK9 Inhibitors , Adult , Humans , Hydroxymethylglutaryl-CoA Reductase Inhibitors/therapeutic use , PCSK9 Inhibitors/therapeutic use , Cholesterol, LDL
ABSTRACT
This review provides a comprehensive overview of the past 25+ years of research into the development of left ventricular assist devices (LVADs) to improve clinical outcomes in patients with severe end-stage heart failure, as well as the basic insights into the biology of heart failure gleaned from studies of hearts and myocardium of patients undergoing LVAD support. Clinical aspects of contemporary LVAD therapy, including evolving device technology, overall mortality, and complications, are reviewed. We explain the hemodynamic effects of LVAD support and how these lead to ventricular unloading. This includes a detailed review of the structural, cellular, and molecular aspects of LVAD-associated reverse remodeling. Synergies between LVAD support and medical therapies for heart failure related to reverse remodeling, remission, and recovery are discussed within the context of both clinical outcomes and fundamental effects on myocardial biology. The incidence and clinical implications of, and the factors most likely to be associated with, improved ventricular function and remission of heart failure are reviewed. Finally, we discuss recognized impediments to achieving myocardial recovery in the vast majority of LVAD-supported hearts and their implications for future research aimed at improving overall rates of recovery.