ABSTRACT
BACKGROUND: Aortic regurgitation (AR) is a common complication following left ventricular assist device (LVAD) implantation. We evaluated the hemodynamic implications of AR in patients with HeartMate 3 (HM3) LVADs at baseline and in response to speed changes. METHODS AND RESULTS: Clinically stable outpatients supported by HM3 who underwent a routine hemodynamic ramp test were retrospectively enrolled in this analysis. Patients were stratified based on the presence of at least mild AR at baseline speed. Hemodynamic and echocardiographic parameters were compared between the AR and non-AR groups. Sixty-two patients were identified. At the baseline LVAD speed, 29 patients (47%) had AR and 33 (53%) did not. Patients with AR were older and had been supported on HM3 for a longer duration. At baseline speed, all hemodynamic parameters were similar between the groups, including central venous pressure, pulmonary capillary wedge pressure, pulmonary arterial pressures, cardiac output and index, and pulmonary artery pulsatility index (p > 0.05 for all). During the subacute assessment, AR worsened with increases in LVAD speed in some, but not all, patients. There were no significant differences in 1-year mortality or hospitalization rates between the groups; however, at 1 year, ≥ moderate AR and right ventricular failure (RVF) were detected at higher rates in the AR group than in the non-AR group (45% vs. 0%; p < 0.01, and 75% vs. 36.8%; p = 0.02, respectively). CONCLUSIONS: In a cohort of stable outpatients supported with HM3 who underwent a routine hemodynamic ramp test, the presence of mild or greater AR did not impair the ability of the HM3 LVAD to effectively unload the left ventricle during early subacute assessment. Although the presence of AR did not affect mortality or hospitalization rates, it was associated with higher rates of late hemodynamic-related events in the form of progressive AR and RVF.
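The hemodynamic indices compared in this study follow standard definitions; as a minimal illustrative sketch (the function names and example values below are assumptions, not data from the study), cardiac index and pulmonary artery pulsatility index can be derived from routine ramp-test measurements as follows:

```python
def cardiac_index(cardiac_output_l_min: float, bsa_m2: float) -> float:
    """Cardiac index (L/min/m^2): cardiac output normalized to body surface area."""
    return cardiac_output_l_min / bsa_m2

def papi(systolic_pap_mmhg: float, diastolic_pap_mmhg: float, cvp_mmhg: float) -> float:
    """Pulmonary artery pulsatility index: (sPAP - dPAP) / CVP."""
    return (systolic_pap_mmhg - diastolic_pap_mmhg) / cvp_mmhg

# Hypothetical example: CO 4.8 L/min, BSA 2.0 m^2, PAP 35/15 mm Hg, CVP 8 mm Hg
print(cardiac_index(4.8, 2.0))  # 2.4 L/min/m^2
print(papi(35.0, 15.0, 8.0))    # 2.5
```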
Subject(s)
Aortic Valve Insufficiency , Heart Failure , Heart-Assist Devices , Humans , Retrospective Studies , Heart Failure/diagnosis , Heart Failure/surgery , Heart-Assist Devices/adverse effects , Aortic Valve Insufficiency/diagnosis , Aortic Valve Insufficiency/etiology , Hemodynamics/physiology
ABSTRACT
BACKGROUND: The use of glucagon-like peptide-1 receptor agonists (GLP1-RA) has dramatically increased over the past 5 years for type 2 diabetes mellitus (T2DM) and obesity. These comorbidities are prevalent in adult heart transplant (HT) recipients. However, there are limited data evaluating the efficacy of this drug class in this population. The aim of the current study was to describe cardiometabolic changes in HT recipients prescribed GLP1-RA at a large-volume transplant center. METHODS: We retrospectively reviewed all adult HT recipients who received GLP1-RA after HT for a minimum of 1 month. Cardiometabolic parameters including body mass index (BMI), lipid panel, hemoglobin A1C, estimated glomerular filtration rate (eGFR), and NT-proBNP were compared prior to initiation of the drug and at the most recent follow-up. We also evaluated for significant dose adjustments to immunosuppression after drug initiation and for adverse effects leading to drug discontinuation. RESULTS: Seventy-four patients were included (28% female, 53% White, 20% Hispanic) and followed for a median of 383 days [IQR 209, 613] on a GLP1-RA. The majority of patients (n = 56, 76%) were prescribed semaglutide. The most common indication for prescription was T2DM alone (n = 33, 45%), followed by combined T2DM and obesity (n = 26, 35%). At the most recent follow-up, mean BMI had decreased from 33.3 to 31.5 kg/m2 (p < 0.0001), HbA1C from 7.3% to 6.7% (p = 0.005), LDL from 78.6 to 70.3 mg/dL (p = 0.018), and basal insulin daily dose from 32.6 to 24.8 units (p = 0.0002). CONCLUSION: HT recipients prescribed GLP1-RA therapy showed improved glycemic control, weight loss, and improved cholesterol levels during the study follow-up period. GLP1-RA were well tolerated and were rarely associated with changes in immunosuppression dosing.
Subject(s)
Glucagon-Like Peptide-1 Receptor , Heart Transplantation , Humans , Female , Male , Retrospective Studies , Middle Aged , Glucagon-Like Peptide-1 Receptor/agonists , Heart Transplantation/adverse effects , Follow-Up Studies , Prognosis , Diabetes Mellitus, Type 2/drug therapy , Glomerular Filtration Rate , Hypoglycemic Agents/therapeutic use , Kidney Function Tests , Adult , Postoperative Complications/drug therapy , Graft Rejection/etiology , Graft Rejection/prevention & control , Graft Rejection/drug therapy , Glucagon-Like Peptide-1 Receptor Agonists
ABSTRACT
BACKGROUND: There are limited data evaluating the success of a structured transition plan specifically for pediatric heart transplant (HT) recipients following their transfer of care to an adult specialist. We sought to identify risk factors for poor adherence, graft failure, and mortality following the transfer of care to adult HT care teams. METHODS: We retrospectively reviewed all patients who underwent transition from the pediatric to the adult HT program at our center between January 2011 and June 2021. Demographic characteristics, comorbid conditions, and psychosocial history were collected at the time of HT, at the time of transition, and at the most recent follow-up. Adverse events including mortality, graft rejection, infection, and renal function were also captured before and after the transition. RESULTS: Seventy-two patients were identified (54.1% male, 54.2% Caucasian). Mean age at the time of transition was 23 years, after a median of 11.6 years in the pediatric program. The use of calcineurin inhibitors was associated with reduced mortality (HR .04, 95% CI .0-.6, p = .015), while prior psychiatric hospitalization (HR 45.3, 95% CI 6.144-333.9, p = .0001) was associated with increased mortality following transition. Medication nonadherence and young age at the time of transition were markers for high-risk individuals prior to the transition of care. CONCLUSIONS: Transition of HT recipients from a pediatric program to an adult program occurs during a vulnerable time of emerging adulthood, and we have identified risk factors for mortality following transition. Development of a formalized transition plan with a large multidisciplinary team and focused attention on high-risk patients, including those with psychiatric comorbidities, may favorably influence outcomes.
Subject(s)
Heart Transplantation , Medication Adherence , Adult , Humans , Child , Male , Female , Retrospective Studies , Risk Factors , Graft Rejection/etiology , Transplant Recipients , Patient Care Team
ABSTRACT
BACKGROUND: Belatacept (BTC), a fusion protein, selectively inhibits T-cell co-stimulation by binding to the CD80 and CD86 receptors on antigen-presenting cells (APCs) and has been used as immunosuppression in adult renal transplant recipients. However, data regarding its use in heart transplant (HT) recipients are limited. This retrospective cohort study aimed to delineate BTC's application in HT, focusing on efficacy, safety, and associated complications at a high-volume HT center. METHODS: A retrospective cohort study was conducted of patients who underwent HT between January 2017 and December 2021 and subsequently received BTC as part of their immunosuppressive regimen. Twenty-one HT recipients were identified. Baseline characteristics, history of rejection, and indication for BTC use were collected. Outcomes included renal function, graft function, allograft rejection, and mortality. Follow-up data were collected through December 2023. RESULTS: Among 776 patients monitored from January 2017 to December 2021, 21 (2.7%) received BTC treatment. Average age at transplantation was 53 years (± 12 years), and 38% were women. BTC administration began a median of 689 [483, 1830] days post-HT. The primary indications for BTC were elevated pre-formed donor-specific antibodies in highly sensitized patients (66.6%) and renal sparing (23.8%), in conjunction with reduced calcineurin inhibitor (CNI) dosage. Only one patient (4.8%) encountered rejection within a year of starting BTC. Graft function by echocardiography remained stable at 6 and 12 months posttreatment. An improvement was observed in serum creatinine levels (in 76.2% of patients), which decreased from a median of 1.58 to 1.45 (IQR [1.0-2.1] to [1.1-1.9]) over 12 months (p = .054). eGFR improved at 3 and 6 months compared with levels 3 months pre-BTC; however, this was not statistically significant (p = .24). Treatment discontinuation occurred in seven patients (33.3%), of whom four (19%) were switched back to full-dose CNI. Infections occurred in 11 patients (52.4%), leading to BTC discontinuation in 4 patients (19%). CONCLUSION: In this cohort, BTC therapy was used as alternative immunosuppression for the management of highly sensitized patients or for renal sparing. BTC therapy combined with CNI dose reduction resulted in stabilization of renal function as measured through surrogate renal markers, although this did not reach statistical significance. Patients on BTC maintained a low rejection rate and preserved graft function. Infections were common during BTC therapy and were associated with medication pause or discontinuation in 19% of patients. Further randomized studies are needed to assess the efficacy and safety of BTC in HT recipients.
Subject(s)
Heart Transplantation , Kidney Transplantation , Adult , Humans , Female , Middle Aged , Male , Abatacept , Retrospective Studies , Kidney Transplantation/adverse effects , Immunosuppressive Agents , Calcineurin Inhibitors/therapeutic use , T-Lymphocytes , Graft Rejection/drug therapy , Graft Rejection/etiology , Transplant Recipients , Graft Survival
ABSTRACT
Dual circulation is a common but underrecognized physiological occurrence associated with peripheral venoarterial extracorporeal membrane oxygenation (ECMO). Competitive flow develops between blood ejected from the heart and blood traveling retrograde within the aorta from the ECMO reinfusion cannula. The intersection of these two competing flows is referred to as the "mixing point". The location of this mixing point, which depends upon the relative strengths of the native and extracorporeal pumps, determines which regions of the body are perfused with blood ejected from the left ventricle and which regions are perfused by reinfused blood from the ECMO circuit, effectively establishing dual circulations. Because gas exchange within these circulations is dictated by the native lungs and the membrane lung, respectively, oxygenation and carbon dioxide removal may differ between regions (depending on how well gas exchange is preserved within each circulation), potentially leading to differential oxygenation or differential carbon dioxide levels, each of which may have important clinical implications. In this perspective, we address the identification and management of dual circulation and differential gas exchange through various clinical scenarios of venoarterial ECMO. Recognition of dual circulation, proper monitoring for differential gas exchange, and understanding of the various strategies to resolve differential oxygenation and carbon dioxide levels may allow for more optimal patient management and improved clinical outcomes.
Subject(s)
Extracorporeal Membrane Oxygenation , Respiratory Insufficiency , Humans , Extracorporeal Membrane Oxygenation/adverse effects , Respiratory Insufficiency/etiology , Carbon Dioxide , Lung , Heart
ABSTRACT
BACKGROUND: This study examines the role of extracorporeal life support flow in the development of acute kidney injury in cardiogenic shock. METHODS: We performed a retrospective analysis of 465 patients placed on extracorporeal life support at our institution between January 2015 and December 2020 for cardiogenic shock. Flow index was calculated by dividing mean flow by body surface area. Stages of acute kidney injury were determined according to Kidney Disease: Improving Global Outcomes (KDIGO) guidelines. RESULTS: There were 179 patients (38.5%) who developed acute kidney injury, 63.1% of whom were classified as Stage 3, the only subgroup associated with 1-year mortality (hazard ratio = 2.03, p < .001). Risk of kidney injury increased up to a flow index of 1.6 L/min/m2, and kidney injury was more common among patients with a flow index greater than 1.6 L/min/m2 (p = .034). Those with kidney injury had higher baseline lactate levels (4.4 vs 3.1, p = .04), and Stage 3 was associated with higher baseline creatinine (p < .001). CONCLUSIONS: In our cohort, kidney injury was common, and Stage 3 kidney injury was associated with worse outcomes compared to other stages. Low flow was not associated with an increased risk of kidney injury. Elevated baseline lactate and creatinine among patients with acute kidney injury suggest that underlying illness severity, rather than flow, may influence kidney injury risk.
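The flow index used above is a simple normalization; here is a short sketch under the stated definition (the Mosteller BSA formula is an assumption, since the abstract does not specify which body surface area formula was used):

```python
import math

def bsa_mosteller(height_cm: float, weight_kg: float) -> float:
    """Body surface area (m^2) by the Mosteller formula (assumed choice)."""
    return math.sqrt(height_cm * weight_kg / 3600.0)

def flow_index(mean_flow_l_min: float, bsa_m2: float) -> float:
    """Flow index (L/min/m^2): mean extracorporeal flow divided by BSA."""
    return mean_flow_l_min / bsa_m2

# Hypothetical example: mean circuit flow 3.2 L/min in a 175 cm, 80 kg patient
bsa = bsa_mosteller(175, 80)           # ~1.97 m^2
print(round(flow_index(3.2, bsa), 2))  # ~1.62, just above the 1.6 L/min/m2 threshold
```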
ABSTRACT
OBJECTIVES: To describe the hemodynamic efficacy and clinical outcomes of the Impella percutaneous left ventricular assist device (pLVAD) in patients with cardiogenic shock (CS). BACKGROUND: Percutaneous LVADs are increasingly used in CS management. However, device-related outcomes and optimal utilization remain active areas of investigation. METHODS: All CS patients receiving a pLVAD as mechanical circulatory support (MCS) between 2011 and 2017 were identified. Clinical characteristics and outcomes were analyzed. A multivariable logistic regression model was created to predict MCS escalation despite pLVAD. Outcomes were compared between early and late implantation. RESULTS: A total of 115 CS patients (mean age 63.6 ± 13.8 years; 69.6% male) receiving pLVAD as MCS were identified, the majority with CS secondary to acute myocardial infarction (AMI; 67.0%). Patients experienced significant cardiac output improvement (median 3.39 L/min to 3.90 L/min, p = .002) and pharmacological support reduction (median vasoactive-inotropic score [VIS] 25.4 to 16.4, p = .049). Placement of extracorporeal membrane oxygenation (ECMO) occurred in 48 patients (41.7%). Higher pre-pLVAD VIS was associated with subsequent MCS escalation in the entire cohort and in the AMI subgroup (OR 1.27 [95% CI 1.02-1.58], p = .034 and OR 1.72 [95% CI 1.04-2.86], p = .035, respectively). Complications were predominantly access site related (bleeding [9.6%], vascular injury [5.2%], and limb ischemia [2.6%]). In-hospital mortality was 57.4%; numerically greater survival was noted with earlier device implantation. CONCLUSIONS: Treatment with pLVAD for CS improved hemodynamic status but did not uniformly obviate MCS escalation. Mortality in CS remains high, though earlier device placement for appropriately selected patients may be beneficial.
Subject(s)
Heart-Assist Devices , Shock, Cardiogenic , Academic Medical Centers , Aged , Female , Humans , Male , Middle Aged , Retrospective Studies , Shock, Cardiogenic/diagnosis , Shock, Cardiogenic/etiology , Shock, Cardiogenic/therapy , Treatment Outcome
ABSTRACT
BACKGROUND: Significant weight loss due to cardiac cachexia is an independent predictor of mortality in many heart failure (HF) clinical trials. The impact of significant weight loss while on the waitlist for heart transplant (HT) has yet to be studied with respect to post-transplant survival. METHODS: Adult HT recipients from 2010 to 2021 were identified in the UNOS registry. Patients who experienced an absolute weight change from the time of listing to transplant were included and classified into two groups by percent weight loss from the time of listing to the time of transplant, using a cut-off of 10%. The primary endpoint was 1-year survival following HT. RESULTS: A total of 5951 patients were included in the analysis, of whom 763 (13%) experienced ≥10% weight loss from the time of listing to transplant. Weight loss ≥10% was associated with reduced 1-year post-transplant survival (86.9% vs. 91.0%, log-rank p = .0003). Additionally, weight loss ≥10% was an independent predictor of 1-year mortality in a multivariable model adjusting for significant risk factors (adjusted HR 1.23, 95% CI 1.04-1.46). In secondary analyses, weight loss ≥10% was associated with reduced 1-year survival independent of hospitalized status at the time of transplant as well as obesity status at listing (i.e., body mass index [BMI] < 30 kg/m2 and BMI ≥ 30 kg/m2). CONCLUSIONS: Preoperative weight loss ≥10% is associated with reduced survival in patients listed for HT. Nutrition interventions prior to transplant may prove beneficial in this population.
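The exposure variable in the methods is a percent weight change dichotomized at 10%; a minimal sketch of that classification (hypothetical helper names, not the authors' code):

```python
def percent_weight_loss(weight_at_listing_kg: float, weight_at_transplant_kg: float) -> float:
    """Percent weight loss from listing to transplant (positive values = loss)."""
    return 100.0 * (weight_at_listing_kg - weight_at_transplant_kg) / weight_at_listing_kg

def weight_loss_group(pct_loss: float, cutoff_pct: float = 10.0) -> str:
    """Dichotomize at the study's 10% cutoff."""
    return ">=10% loss" if pct_loss >= cutoff_pct else "<10% loss"

# Hypothetical example: 95 kg at listing, 84 kg at transplant -> ~11.6% loss
loss = percent_weight_loss(95.0, 84.0)
print(round(loss, 1), weight_loss_group(loss))  # 11.6 >=10% loss
```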
Subject(s)
Heart Failure , Heart Transplantation , Adult , Humans , Retrospective Studies , Obesity/epidemiology , Weight Loss , Waiting Lists
ABSTRACT
Coronavirus disease 2019 (COVID-19) is a global pandemic affecting 185 countries and >3,000,000 patients worldwide as of April 28, 2020. COVID-19 is caused by severe acute respiratory syndrome coronavirus 2, which invades cells through the angiotensin-converting enzyme 2 receptor. Among patients with COVID-19, there is a high prevalence of cardiovascular disease, and >7% of patients experience myocardial injury from the infection (22% of critically ill patients). Although angiotensin-converting enzyme 2 serves as the portal for infection, the role of angiotensin-converting enzyme inhibitors or angiotensin receptor blockers requires further investigation. COVID-19 poses a challenge for heart transplantation, affecting donor selection, immunosuppression, and posttransplant management. There are a number of promising therapies under active investigation to treat and prevent COVID-19.
Subject(s)
Betacoronavirus , Cardiovascular Diseases , Coronavirus Infections , Pandemics , Peptidyl-Dipeptidase A , Pneumonia, Viral , Angiotensin Receptor Antagonists/therapeutic use , Angiotensin-Converting Enzyme 2 , Angiotensin-Converting Enzyme Inhibitors/therapeutic use , COVID-19 , Cardiovascular Diseases/complications , Cardiovascular Diseases/enzymology , Coronavirus Infections/complications , Coronavirus Infections/drug therapy , Coronavirus Infections/enzymology , Coronavirus Infections/therapy , Coronavirus Infections/virology , Humans , Peptidyl-Dipeptidase A/metabolism , Pneumonia, Viral/complications , Pneumonia, Viral/enzymology , Pneumonia, Viral/therapy , Pneumonia, Viral/virology , Receptors, Virus/antagonists & inhibitors , Receptors, Virus/metabolism , SARS-CoV-2 , COVID-19 Drug Treatment
ABSTRACT
Cardiogenic shock (CS) is a condition associated with high mortality rates in which prognostication is uncertain for a variety of reasons, including its myriad causes, its rapidly evolving clinical course, and the plethora of established and emerging therapies for the condition. A number of validated risk scores are available for CS prognostication; however, many of these are tedious to use, are designed for application in a variety of populations, and fail to incorporate contemporary hemodynamic parameters and contemporary mechanical circulatory support interventions that can affect outcomes. It is important to separate patients with CS who may recover with conservative pharmacological therapies from those who may require advanced therapies to survive; it is equally important to identify quickly those who will succumb despite any therapy. An ideal risk-prediction model would balance incorporation of key hemodynamic parameters while still allowing dynamic use in multiple scenarios, from aiding with early decision making to device weaning. Herein, we discuss currently available CS risk scores, perform a detailed analysis of the variables in each of these scores that are most predictive of CS outcomes, and explore a framework for the development of novel risk scores that consider emerging therapies and paradigms for this challenging clinical entity.
Subject(s)
Heart Failure , Shock, Cardiogenic , Hemodynamics , Humans , Risk Factors , Shock, Cardiogenic/diagnosis , Shock, Cardiogenic/therapy
ABSTRACT
BACKGROUND: Heart failure predisposes to intracardiac thrombus (ICT) formation. There are limited data on the prevalence and impact of preexisting ICT on postoperative outcomes in left ventricular assist device patients. We examined the risk for stroke and death in this patient population. METHODS AND RESULTS: We retrospectively studied patients who were implanted with a HeartMate (HM) II or HM3 between February 2009 and March 2019. Preoperative transthoracic echocardiograms, intraoperative transesophageal echocardiograms, and operative reports were reviewed to identify ICT. There were 525 patients with a left ventricular assist device (median age 60.6 years, 81.8% male, 372 HMII and 151 HM3) included in this analysis. An ICT was identified in 44 patients (8.4%). During the follow-up, 43 patients experienced a stroke and 55 died. After multivariable adjustment, the presence of ICT increased the risk for the composite of stroke or death at 6 months (hazard ratio [HR] 1.82, 95% confidence interval [CI] 1.00-3.33, P = .049). Patients with ICT were also at higher risk for stroke (HR 2.45, 95% CI 1.14-5.28, P = .021) and death (HR 2.36, 95% CI 1.17-4.79, P = .016) at 6 months of follow-up. CONCLUSIONS: The presence of ICT is an independent predictor of stroke and death at 6 months after left ventricular assist device implantation. Additional studies are needed to help risk stratify and optimize the perioperative management of this patient population.
Subject(s)
Heart Failure , Heart-Assist Devices , Stroke , Thrombosis , Female , Heart Failure/epidemiology , Humans , Male , Middle Aged , Retrospective Studies , Stroke/epidemiology , Stroke/etiology , Thrombosis/diagnostic imaging , Thrombosis/epidemiology , Treatment Outcome
ABSTRACT
BACKGROUND: Interventricular interaction, which refers to the impact of left ventricular (LV) function on right ventricular (RV) function and vice versa, has been implicated in the pathogenesis of RV failure in LV assist device (LVAD) recipients. We sought to understand more about interventricular interaction by quantifying changes in RV systolic and diastolic function with varying LVAD speeds. METHODS AND RESULTS: Four patients (ages 22-69 years, 75% male, and 25% with ischemic cardiomyopathy) underwent a protocolized hemodynamic ramp test within 12 months of LVAD implantation, during which RV pressure-volume (PV) loops were recorded with a conductance catheter. The end-systolic PV relationship and end-diastolic PV relationship were compared using the V20 and V10 indices (the volumes at which the end-systolic and end-diastolic PV relationships reach a pressure of 20 and 10 mm Hg, respectively). The ∆V20 and ∆V10 refer to the change in V20 and V10 from the minimum to the maximum LVAD speed. RV PV loops demonstrated variable changes in systolic and diastolic function with increasing LVAD speed. The end-systolic PV relationship changed in 1 patient (patient 2, ∆V20 = 23.5 mL), reflecting a decrease in systolic function with increased speed, and was unchanged in 3 patients (average ∆V20 = 7.4 mL). The end-diastolic PV relationship changed with increasing speed in 3 of 4 patients (average ∆V10 = 12.5 mL), indicating an increase in ventricular compliance, and remained unchanged in one participant (patient 1; ∆V10 = 4.0 mL). CONCLUSIONS: Interventricular interaction can improve RV compliance and impair systolic function, but the overall effect on RV performance in this pilot investigation was heterogeneous. Further research is required to understand which patient characteristics and hemodynamic parameters influence the net impact of interventricular interaction.
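As a hedged illustration of the indices defined above (the interpolation approach and all values are assumptions, not the authors' analysis code), ∆V20 can be obtained by reading the volume at 20 mm Hg off the end-systolic PV relationship sampled at the minimum and maximum LVAD speeds:

```python
import numpy as np

def volume_at_pressure(volumes_ml, pressures_mmhg, target_mmhg):
    """Interpolate the volume at which a monotonic PV relationship reaches a
    target pressure (hypothetical helper)."""
    return float(np.interp(target_mmhg, pressures_mmhg, volumes_ml))

# End-systolic PV relationship sampled at minimum and maximum speed (illustrative)
espvr_min_speed = {"v_ml": [40.0, 80.0, 120.0], "p_mmhg": [5.0, 15.0, 30.0]}
espvr_max_speed = {"v_ml": [60.0, 100.0, 140.0], "p_mmhg": [5.0, 15.0, 30.0]}

v20_min = volume_at_pressure(espvr_min_speed["v_ml"], espvr_min_speed["p_mmhg"], 20.0)
v20_max = volume_at_pressure(espvr_max_speed["v_ml"], espvr_max_speed["p_mmhg"], 20.0)
delta_v20 = v20_max - v20_min  # rightward ESPVR shift -> reduced systolic function
print(round(delta_v20, 1))     # 20.0 mL in this toy example
```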
Subject(s)
Heart Failure , Heart-Assist Devices , Ventricular Dysfunction, Right , Adult , Aged , Female , Heart Failure/therapy , Heart Ventricles/diagnostic imaging , Humans , Male , Middle Aged , Ventricular Function, Right , Ventricular Pressure , Young Adult
ABSTRACT
BACKGROUND: Conditional survival (CS) is a dynamic method of survival analysis that provides an estimate of how an individual's future survival probability changes based on time post-transplant, individual characteristics, and post-transplant events. This study sought to provide post-transplant CS probabilities for heart transplant recipients based on different prognostic variables and to provide a discussion tool for providers and patients. METHODS: Adult heart transplant recipients from January 1, 2004, through October 18, 2018, were identified in the UNOS registry. CS probabilities were calculated using data from Kaplan-Meier survival estimates. RESULTS: CS probability exceeded actuarial survival probability at all times post-transplant. Women had similar short-term but greater long-term CS than men at all times post-transplant (10-year CS 1.8-11.5% greater [95% CI 1.2-12.9]). Patients with ECMO or a surgical BiVAD had decreased survival at the time of transplant, but their CS was indistinguishable from that of all others by 1 year post-transplant. Rejection and infection requiring hospitalization during the first year were associated with a persistently decreased CS probability. CONCLUSIONS: In this study, we report differential conditional survival outcomes based on time, patient characteristics, and clinical events post-transplant, providing a dynamic assessment of survival. These survival probabilities will better inform patients and clinicians of future outcomes.
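The conditional survival quantity described here is conventionally obtained from the Kaplan-Meier curve; the abstract does not print the formula, but the standard definition it relies on is:

```latex
% Probability of surviving an additional t years, given survival to s years
% post-transplant, with S(.) the Kaplan-Meier survival estimate
CS(t \mid s) = \frac{S(s + t)}{S(s)}
```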
Subject(s)
Heart Transplantation , Tissue and Organ Procurement , Adult , Female , Graft Rejection/etiology , Graft Survival , Humans , Kaplan-Meier Estimate , Male , Registries , Retrospective Studies , Survival Analysis , Treatment Outcome
ABSTRACT
Heart transplantation is the gold standard therapeutic option for select patients with end-stage heart failure. Unfortunately, successful long-term outcomes of heart transplantation can be hindered by immune-mediated rejection of the cardiac allograft, specifically acute cellular rejection, antibody-mediated rejection, and cardiac allograft vasculopathy. Extracorporeal photopheresis is a cellular immunotherapy that involves the collection and treatment of white blood cells contained in the buffy coat with a photoactive psoralen compound, 8-methoxypsoralen, and subsequent irradiation with ultraviolet A light. This process is thought to cause DNA and RNA crosslinking, ultimately leading to cell destruction. The true mechanism of therapeutic action remains unknown. In the last three decades, extracorporeal photopheresis has shown promising results and is indicated for a variety of conditions. The American Society for Apheresis currently recommends the use of extracorporeal photopheresis for the treatment of cutaneous T-cell lymphoma, scleroderma, psoriasis, pemphigus vulgaris, atopic dermatitis, graft-versus-host disease, Crohn's disease, nephrogenic systemic fibrosis, and solid organ rejection in heart, lung, and liver transplantation. In this review, we aim to explore the proposed effects of extracorporeal photopheresis and to summarize published data on its use as prophylaxis and therapy for heart transplant rejection.
Subject(s)
Heart Transplantation , Lymphoma, T-Cell, Cutaneous , Photopheresis , Skin Neoplasms , Graft Rejection/etiology , Graft Rejection/prevention & control , Heart Transplantation/adverse effects , Humans
ABSTRACT
Coronavirus disease 2019 (COVID-19) may predispose patients to venous thromboembolism (VTE). Limited data are available on the utilization of the Pulmonary Embolism Response Team (PERT) in the setting of the COVID-19 global pandemic. We performed a single-center study to evaluate treatment, mortality, and bleeding outcomes in patients who received PERT consultations in March and April 2020, compared to historical controls from the same period in 2019. Clinical data were abstracted from the electronic medical record. The primary study endpoints were inpatient mortality and GUSTO moderate-to-severe bleeding. The frequency of PERT utilization was nearly threefold higher during March and April 2020 (n = 74) compared to the same period in 2019 (n = 26). During the COVID-19 pandemic, there was significantly less PERT-guided invasive treatment (5.5% vs 23.1%, p = 0.02), with a numerical but not statistically significant trend toward an increase in the use of systemic fibrinolytic therapy (13.5% vs 3.9%, p = 0.3). There were nonsignificant trends toward higher in-hospital mortality and moderate-to-severe bleeding in patients receiving PERT consultations during the COVID-19 period compared to historical controls (mortality 14.9% vs 3.9%, p = 0.18 and moderate-to-severe bleeding 35.1% vs 19.2%, p = 0.13). In conclusion, PERT utilization was nearly threefold higher during the COVID-19 pandemic than during the historical control period. Among patients evaluated by PERT, in-hospital mortality and moderate-to-severe bleeding were not significantly different, despite being numerically higher, while invasive therapy was utilized less frequently during the COVID-19 pandemic.
Subject(s)
COVID-19/therapy , Health Resources/trends , Health Services Needs and Demand/trends , Patient Care Team/trends , Practice Patterns, Physicians'/trends , Pulmonary Embolism/therapy , Thrombolytic Therapy/trends , Venous Thromboembolism/therapy , Adult , Aged , Aged, 80 and over , COVID-19/complications , COVID-19/diagnosis , COVID-19/mortality , Female , Hemorrhage/etiology , Hemorrhage/mortality , Hospital Mortality , Humans , Male , Middle Aged , Pulmonary Embolism/diagnosis , Pulmonary Embolism/etiology , Pulmonary Embolism/mortality , Retrospective Studies , Risk Assessment , Risk Factors , Time Factors , Treatment Outcome , Venous Thromboembolism/diagnosis , Venous Thromboembolism/etiology , Venous Thromboembolism/mortality
ABSTRACT
BACKGROUND: Acute myocardial infarction with refractory cardiogenic shock (AMI-RCS) is associated with poor outcomes. Several percutaneous mechanical circulatory support devices exist; however, limitations exist regarding long-term use. Herein, we describe our experience with the temporary surgical CentriMag VAD. METHODS: We reviewed 74 patients with AMI-RCS who underwent CentriMag VAD insertion as a bridge-to-decision device from 2007 to 2020. Patients were divided into groups based on the introduction of the "shock team" model: Era 1 (2007-2014, n = 51) and Era 2 (2015-2020, n = 23). RESULTS: Era 2 had a higher proportion of patients with INTERMACS Profile I. The use of percutaneous MCS as a bridge to VAD and the use of minimally invasive VAD were higher in Era 2. There were fewer postoperative bleeding events in Era 2 (80% vs 61%, p = .07). Thirty-day mortality was 23% and 1-year survival was 55%, with no differences between eras. Destinations after CentriMag VAD included myocardial recovery (39%), durable LVAD (27%), and transplantation (5%). CONCLUSION: The CentriMag VAD represents a viable bridge-to-decision device with acceptable short- and long-term outcomes for patients with AMI-RCS. Stable outcomes in a progressively sicker population may be related to changes in practice patterns as well as the introduction of the "shock team" concept.
Subject(s)
Heart-Assist Devices , Myocardial Infarction/surgery , Shock, Cardiogenic/surgery , Adult , Female , Heart Valve Prosthesis Implantation/adverse effects , Heart Valve Prosthesis Implantation/instrumentation , Heart Valve Prosthesis Implantation/methods , Heart-Assist Devices/adverse effects , History, 21st Century , Humans , Japan/epidemiology , Male , Middle Aged , Myocardial Infarction/complications , Myocardial Infarction/epidemiology , Percutaneous Coronary Intervention/adverse effects , Percutaneous Coronary Intervention/instrumentation , Percutaneous Coronary Intervention/methods , Retrospective Studies , Shock, Cardiogenic/epidemiology , Time Factors , Treatment Outcome
ABSTRACT
The new heart transplantation (HT) allocation policy was introduced on 10/18/2018. Using the UNOS registry, we examined early outcomes following HT for restrictive cardiomyopathy, hypertrophic cardiomyopathy, cardiac sarcoidosis, or cardiac amyloidosis compared to the old system. Those listed who had an event (transplant, death, or waitlist removal) prior to 10/17/2018 were in Era 1, and those listed on or after 10/18/2018 were in Era 2. The primary endpoint was death on the waitlist or delisting due to clinical deterioration. A total of 1232 HT candidates were included, 855 (69.4%) in Era 1 and 377 (30.6%) in Era 2. In Era 2, there was a significant increase in the use of temporary mechanical circulatory support and a reduction in the primary endpoint (20.9 events per 100 patient-years [PY] in Era 1 vs. 18.6 events per 100 PY in Era 2; OR 1.98, p = .005). Median waitlist time decreased (91 vs. 58 days, p < .001), and the transplantation rate increased (from 119.0 to 204.7 transplants per 100 PY from Era 1 to Era 2). Under the new policy, there has been a decrease in waitlist time and in waitlist mortality/delisting due to clinical deterioration, and an increase in transplantation rates for patients with infiltrative, hypertrophic, and restrictive cardiomyopathies, without any effect on 6-month post-transplant survival.
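Event rates above are expressed per 100 patient-years (PY); a one-line sketch of that normalization (illustrative helper, not from the paper):

```python
def events_per_100_py(n_events: int, total_patient_years: float) -> float:
    """Incidence rate per 100 patient-years: events over summed follow-up, scaled."""
    return 100.0 * n_events / total_patient_years

# Hypothetical example: 38 waitlist events over 182 patient-years of follow-up
print(round(events_per_100_py(38, 182.0), 1))  # 20.9 events per 100 PY
```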
Subject(s)
Amyloidosis , Cardiomyopathies , Cardiomyopathy, Restrictive , Heart Transplantation , Cardiomyopathies/surgery , Cardiomyopathy, Restrictive/surgery , Humans , Registries , Retrospective Studies , Waiting Lists
ABSTRACT
Light-chain (AL) cardiac amyloidosis (CA) has a worse prognosis than transthyretin (ATTR) CA. In this single-center study, we compared post-orthotopic heart transplantation (OHT) survival for AL and ATTR amyloidosis, hypothesizing that these differences would persist post-OHT. Thirty-nine patients with CA (AL, n = 18; ATTR, n = 21) and 1023 non-amyloidosis subjects undergoing OHT were included. Cox proportional hazards modeling was used to evaluate the impact of amyloid subtype and era (early era: 2001 to 2007; late era: 2008 to 2018) on survival post-OHT. Survival for non-amyloid patients was greater than for ATTR (P = .034) and AL (P < .001) patients in the early era. One-, 3-, and 5-year survival rates were higher for ATTR patients than for AL patients in the early era (100% vs 75%, 67% vs 50%, and 67% vs 33%, respectively). Survival in the non-amyloid cohort was 87% at 1 year, 81% at 3 years, and 76% at 5 years post-OHT. In the late era, AL and ATTR patients had unadjusted 1-year, 3-year, and 5-year survival rates of 100%, comparable to non-amyloid patients (90%, 84%, and 81%, respectively). Overall, these findings demonstrate that in the current era, differences in post-OHT survival between AL and ATTR are diminishing; OHT outcomes for selected patients with CA do not differ from those of non-amyloidosis patients.
Subject(s)
Amyloid Neuropathies, Familial , Amyloidosis , Cardiomyopathies , Heart Transplantation , Amyloid Neuropathies, Familial/surgery , Cardiomyopathies/etiology , Humans , Prealbumin , Prognosis , Survival Rate
ABSTRACT
OBJECTIVES: We sought to determine the 1-year outcomes of patients undergoing successful chronic total occlusion (CTO) percutaneous coronary intervention (PCI), comparing subintimal versus intraplaque wire tracking patterns. BACKGROUND: CTO PCI utilizes both intraluminal and subintimal wire tracking to achieve successful percutaneous revascularization. Intravascular ultrasound (IVUS) can be used to precisely determine the path of wire tracking. METHODS: From 2014 to 2016, data from patients undergoing CTO PCI were collected in a single-center database. The primary composite endpoint was target vessel failure (TVF), defined as cardiovascular death, target vessel myocardial infarction (MI), or target vessel revascularization (TVR). RESULTS: In total, 157 patients with successful CTO PCI and concomitant IVUS imaging completed 1-year follow-up. Subintimal tracking was detected in 53.5% of cases, and those patients had a higher incidence of prior PCI, prior coronary artery bypass grafting, and higher J-CTO scores. At 1 year, the unadjusted rate of TVF in the subintimal tracking group was higher than in the intraplaque group (17.9% vs. 6.9%, HR 2.74, 95% CI 1.00-7.54, P = 0.04), driven by numerically higher rates of TVR and peri-procedural MI. After multivariable adjustment, no significant differences in the rate of TVF between the subintimal and intraplaque groups were present at 1 year (TVF: HR 1.51, 95% CI 0.38-6.00, P = 0.55). Landmark analysis excluding in-hospital events showed no significant differences in TVF to 1 year. CONCLUSIONS: IVUS-detected subintimal tracking was observed in over half of successful CTO PCI cases and correlated with baseline and angiographic factors that contributed to the overall rate of TVF at 1 year.