ABSTRACT
OBJECTIVE: To compare the clinical outcomes of radial artery (RA) grafts during coronary artery bypass grafting (CABG) with those of right internal mammary artery (RIMA) grafts. METHODS: This was a retrospective, single-institution cohort study of isolated CABG with multiple grafts performed from 2010 to 2022. To balance the graft cohorts, propensity score matching (PSM) was performed using a 1:1 match ratio. Long-term postoperative survival was compared between the RA and RIMA groups. Similarly, major adverse cardiac and cerebrovascular events (MACCE) were compared between the cohorts, with MACCE comprising death, myocardial infarction (MI), coronary revascularization, and stroke. Kaplan-Meier estimation was performed for mortality, while cumulative incidence estimation was utilized for MACCE. RESULTS: A total of 8,774 patients underwent CABG. Of those, 1,674 (19.1%) patients who underwent multiarterial CABG were included in this analysis. Of these, 326 (19.5%) received RA grafts and 1,348 (80.5%) received RIMA grafts. PSM yielded a cohort of 323 RA patients and 323 RIMA patients. After matching, the groups were well-balanced across all baseline variables. No significant differences were observed in immediate postoperative complications or long-term survival, with 5-year survival estimates of 89.5% for the RA group vs 90.1% for the RIMA group. There was a nonsignificant trend toward a higher incidence of MACCE at 5 years in the RA group compared to the RIMA group (31.3% vs 24.1%), especially after 1-year follow-up (21.6% vs 15.1%). Specifically, RA patients had higher rates of repeat revascularization in the 5-year postoperative period (14.7% vs 5.3%), particularly in the territory revascularized by the RA during the index operation (45.7% vs 10.3%).
CONCLUSION: Overall, RA and RIMA secondary conduits for CABG were associated with comparable immediate postoperative complications, 5-year MACCE, and 5-year survival after PSM. RA grafting was associated with significantly higher rates of repeat coronary revascularization at 5 years, specifically in the territory revascularized by the RA during the index operation.
Subject(s)
Coronary Artery Disease , Mammary Arteries , Humans , Retrospective Studies , Cohort Studies , Radial Artery/transplantation , Mammary Arteries/transplantation , Treatment Outcome , Coronary Artery Bypass/adverse effects , Postoperative Complications/etiology
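The survival comparison in the abstract above rests on Kaplan-Meier (product-limit) estimation. As a purely illustrative sketch (not the study's analysis code, and with invented follow-up times rather than any patient data), a minimal estimator can be written as:

```python
def kaplan_meier(times, events):
    """Return [(t, S(t))] at each distinct event time.

    times  -- follow-up time for each patient
    events -- 1 if death was observed at that time, 0 if censored
    """
    data = sorted(zip(times, events))
    at_risk = len(data)
    survival = 1.0
    curve = []
    i = 0
    while i < len(data):
        t = data[i][0]
        # Count deaths at time t; everyone with time t leaves the risk set.
        j, deaths = i, 0
        while j < len(data) and data[j][0] == t:
            deaths += data[j][1]
            j += 1
        if deaths:
            survival *= 1 - deaths / at_risk  # product-limit step
            curve.append((t, survival))
        at_risk -= j - i
        i = j
    return curve

# Invented toy data: deaths at t=1, 2, 4; censoring at t=3, 5.
print(kaplan_meier([1, 2, 3, 4, 5], [1, 1, 0, 1, 0]))
```

In practice a validated library (e.g., lifelines in Python or the R survival package) would be used; the sketch only shows the product-limit recurrence S(t) = S(t-) × (1 - d/n) that underlies the reported 5-year estimates.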
BACKGROUND: This study evaluates the clinical trends, risk factors, and impact of waitlist blood transfusion on outcomes following isolated heart transplantation. METHODS: The UNOS registry was queried to identify adult recipients from January 1, 2014, to June 30, 2022. Recipients were stratified into two groups depending on whether they received a blood transfusion while on the waitlist. The incidence of waitlist transfusion was compared before and after the 2018 allocation policy change. The primary outcome was survival. Propensity score matching was performed. Multivariable logistic regression was performed to identify predictors of waitlist transfusion. A sub-analysis was performed to evaluate the impact of waitlist time on waitlist transfusion. RESULTS: Of the 21,926 recipients analyzed in this study, 4,201 (19.2%) received a waitlist transfusion. The incidence of waitlist transfusion was lower following the allocation policy change (14.3% vs. 23.7%, p < 0.001). Recipients with waitlist transfusion had significantly reduced 1-year posttransplant survival (88.8% vs. 91.9%, p < 0.001) compared to recipients without waitlist transfusion in an unmatched comparison. However, in a propensity score-matched comparison, the two groups had similar 1-year survival (90.0% vs. 90.4%, p = 0.656). Multivariable analysis identified extracorporeal membrane oxygenation (ECMO), Impella, and pretransplant dialysis as strong predictors of waitlist transfusion. In a sub-analysis, the odds of waitlist transfusion increased nonlinearly with longer waitlist time. CONCLUSION: There is a lower incidence of waitlist transfusion among transplant recipients under the 2018 allocation system. Waitlist transfusion is not an independent predictor of adverse posttransplant outcomes but rather a marker of the patient's clinical condition. ECMO, Impella, and pretransplant dialysis are strong predictors of waitlist transfusion.
Subject(s)
Blood Transfusion , Heart Transplantation , Registries , Waiting Lists , Humans , Male , Waiting Lists/mortality , Female , Heart Transplantation/adverse effects , Heart Transplantation/mortality , Middle Aged , Follow-Up Studies , Risk Factors , Prognosis , Survival Rate , Blood Transfusion/statistics & numerical data , Graft Survival , Adult , Retrospective Studies
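Several abstracts in this collection rely on 1:1 propensity score matching. One common implementation is greedy nearest-neighbor matching on the estimated propensity score within a caliper. The sketch below illustrates only that matching step, with invented scores and an arbitrary caliper; it is not the registry analysis code, and the score estimation itself (typically logistic regression on baseline covariates) is assumed to have been done already.

```python
def greedy_match(treated, controls, caliper=0.1):
    """Greedy 1:1 nearest-neighbor matching without replacement.

    treated, controls -- lists of propensity scores (floats in [0, 1])
    caliper           -- maximum allowed |score difference| for a match
    Returns a sorted list of (treated_index, control_index) pairs.
    Treated subjects with no control inside the caliper stay unmatched,
    which is why matched cohorts can be smaller than the treated group.
    """
    available = set(range(len(controls)))
    pairs = []
    # One common convention: match treated subjects in descending score order.
    for ti in sorted(range(len(treated)), key=lambda i: -treated[i]):
        best, best_d = None, None
        for ci in sorted(available):  # sorted for deterministic tie-breaking
            d = abs(treated[ti] - controls[ci])
            if d <= caliper and (best_d is None or d < best_d):
                best, best_d = ci, d
        if best is not None:
            available.remove(best)
            pairs.append((ti, best))
    return sorted(pairs)

# Invented scores: each treated subject gets its nearest unused control.
print(greedy_match([0.30, 0.60], [0.29, 0.90, 0.61], caliper=0.05))
```

Production analyses would instead use an established package (e.g., R's MatchIt) and check covariate balance after matching, as the abstracts above describe.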
Persistent acute kidney injury (pAKI), compared with acute kidney injury (AKI) that resolves in <72 h, is associated with worse prognosis in critically ill patients. Definitions and prognosis of pAKI are not well characterized in solid organ transplant patients. Our aims were to investigate (a) definitions and incidence of pAKI; (b) association with clinical outcomes; and (c) risk factors for pAKI among heart, lung, and liver transplant recipients. We systematically reviewed the literature, including PubMed, Embase, Web of Science, and Cochrane, from inception to 8/1/2023 for human prospective and retrospective studies reporting on the development of pAKI in heart, lung, or liver transplant recipients. We assessed heterogeneity using Cochran's Q and I2. We identified 25 studies including 6330 patients. Reported incidences of AKI (8%-71.6%) and pAKI (2.7%-55.1%) varied widely. Definitions of pAKI included 48-72 h (six studies), 7 days (three studies), 14 days (four studies), or more (12 studies). Risk factors included age, body mass index (BMI), diabetes, preoperative chronic kidney disease (CKD), intraoperative vasopressor use, and intraoperative circulatory support. pAKI was associated with new-onset CKD (odds ratio [OR] 1.41-11.2), graft dysfunction (OR 1.81-8.51), and long-term mortality (OR 3.01-13.96), although significant heterogeneity limited the certainty of the CKD and graft dysfunction outcome analyses. pAKI is common and is associated with worse mortality among liver and lung transplant recipients. Standardization of the nomenclature of AKI will be important in future studies (PROSPERO CRD42022371952).
Subject(s)
Acute Kidney Injury , Organ Transplantation , Transplant Recipients , Humans , Acute Kidney Injury/epidemiology , Acute Kidney Injury/etiology , Organ Transplantation/adverse effects , Postoperative Complications/epidemiology , Postoperative Complications/etiology , Prognosis , Risk Factors , Transplant Recipients/statistics & numerical data
BACKGROUND: This study evaluated the outcomes of patients with cardiogenic shock (CS) supported with Impella 5.0 or 5.5 and identified risk factors for in-hospital mortality. METHODS: Adults with CS who were supported with Impella 5.0 or 5.5 at a single institution were included. Patients were stratified into three groups according to their CS etiology: (1) acute myocardial infarction (AMI), (2) acute decompensated heart failure (ADHF), and (3) postcardiotomy (PC). The primary outcome was survival, and secondary outcomes included adverse events during Impella support and length of stay. Multivariable logistic regression was performed to identify risk factors for in-hospital mortality. RESULTS: One hundred thirty-seven patients with CS secondary to AMI (n = 47), ADHF (n = 86), and PC (n = 4) were included. The ADHF group had the highest survival rates at all time points. Acute kidney injury (AKI) was the most common complication during Impella support in all three groups. Increased rates of AKI and de novo renal replacement therapy were observed in the PC group, and the AMI group experienced a higher incidence of bleeding requiring transfusion. Multivariable analysis demonstrated that diabetes mellitus, elevated pre-insertion serum lactate, and elevated pre-insertion serum creatinine were independent predictors of in-hospital mortality, but the etiology of CS did not impact mortality. CONCLUSIONS: This study demonstrates that the Impella 5.0 and 5.5 provide effective mechanical support for patients with CS, with favorable outcomes and nearly two-thirds of patients alive at 180 days. Diabetes, elevated pre-insertion serum lactate, and elevated pre-insertion serum creatinine are strong risk factors for in-hospital mortality.
Subject(s)
Heart-Assist Devices , Hospital Mortality , Shock, Cardiogenic , Humans , Shock, Cardiogenic/therapy , Shock, Cardiogenic/mortality , Shock, Cardiogenic/etiology , Male , Heart-Assist Devices/adverse effects , Female , Aged , Middle Aged , Risk Factors , Treatment Outcome , Retrospective Studies , Acute Kidney Injury/therapy , Acute Kidney Injury/etiology , Acute Kidney Injury/mortality , Myocardial Infarction/complications , Myocardial Infarction/mortality , Heart Failure/mortality , Heart Failure/complications
OBJECTIVES: To describe outcomes of reconstruction of the aortomitral continuity (AMC) during concomitant aortic and mitral valve replacement (ie, the "Commando" procedure). DESIGN: A retrospective study of consecutive cardiac surgeries from 2010 to 2022. SETTING: At a single institution. PARTICIPANTS: All patients undergoing double aortic and mitral valve replacement. INTERVENTIONS: Patients were dichotomized by the performance (or not) of AMC reconstruction. MEASUREMENTS AND MAIN RESULTS: A total of 331 patients underwent double-valve replacement, of whom 21 (6.3%) had a Commando procedure. The Commando group was more likely to have had a previous aortic valve replacement (AVR) or mitral valve replacement (MVR) (66.7% v 27.4%, p < 0.001), redo cardiac surgery (71.4% v 31.3%, p < 0.001), and emergent/salvage surgery (14.3% v 1.61%, p = 0.001); surgery was also more often performed for endocarditis in the Commando group (52.4% v 22.9%, p = 0.003). The Commando group had higher operative mortality (28.6% v 10.7%, p = 0.014), more prolonged ventilation (61.9% v 31.9%, p = 0.005), longer cardiopulmonary bypass time (312 ± 118 v 218 ± 85 minutes, p < 0.001), and longer ischemic time (252 ± 90 v 176 ± 66 minutes, p < 0.001). Despite increased short-term morbidity in the Commando group, Kaplan-Meier survival estimation showed no difference in long-term survival between the groups (p = 0.386, log-rank). On multivariable Cox analysis, the Commando procedure was not associated with an increased hazard of death compared to MVR + AVR (hazard ratio 1.29, 95% CI: 0.65-2.59, p = 0.496). CONCLUSIONS: Although short-term postoperative morbidity and mortality were higher for patients undergoing the Commando procedure, AMC reconstruction may be equally durable in the long term.
Subject(s)
Heart Valve Prosthesis Implantation , Mitral Valve , Humans , Mitral Valve/surgery , Heart Valve Prosthesis Implantation/methods , Retrospective Studies , Treatment Outcome , Aortic Valve/surgery
OBJECTIVES: Unexpected coronary artery bypass grafting (CABG) is occasionally required during aortic root replacement (ARR). However, the impact of unplanned CABG remains unknown. DESIGN: A single-center, retrospective observational study. SETTING: At a university-affiliated tertiary hospital. PARTICIPANTS: All patients who underwent ARR from 2011 through 2022. INTERVENTIONS: Aortic root replacement with or without unplanned CABG. MEASUREMENTS AND MAIN RESULTS: A total of 795 patients underwent ARR. Among them, 131 (16.5%) underwent planned concomitant CABG, and 34 (4.3%) required unplanned CABG. The most common indication for unplanned CABG was ventricular dysfunction (33.3%), followed by disease pathology (25.6%), anatomy (15.4%), and surgical complications (10.3%). A vein graft to the right coronary artery was the most commonly performed bypass. Infective endocarditis and aortic dissection were observed in 27.8% and 12.8% of patients, respectively. Prior cardiac surgery was seen in 40.3%. The median follow-up period was 4.3 years. Unplanned CABG was not associated with operative mortality (odds ratio [OR] 1.54, 95% CI 0.33-7.16, p = 0.58) or long-term mortality (hazard ratio 0.91, 95% CI 0.44-1.89, p = 0.81). A body surface area smaller than 1.7 m² was independently associated with an increased risk of unplanned CABG (OR 4.51, 95% CI 1.85-11.0, p < 0.001). CONCLUSIONS: Unplanned CABG occurred in 4.3% of patients during ARR but was not associated with operative or long-term mortality. A small body surface area was a factor associated with unplanned CABG.
Subject(s)
Aortic Valve Stenosis , Coronary Artery Disease , Humans , Aortic Valve/surgery , Clinical Relevance , Aortic Valve Stenosis/surgery , Treatment Outcome , Coronary Artery Bypass/adverse effects , Retrospective Studies , Coronary Artery Disease/complications , Risk Factors
BACKGROUND: The use of extracorporeal life support (ECLS) in patients after surgical repair for acute type A aortic dissection (ATAAD) has not been well documented. METHODS: We performed a systematic review and meta-analysis to assess the outcomes of ECLS after surgery for ATAAD, using data published by October 2023, in compliance with the Preferred Reporting Items for Systematic Reviews and Meta-analyses (PRISMA) and the Meta-analysis of Observational Studies in Epidemiology (MOOSE) reporting guidelines. The protocol was registered in PROSPERO (CRD42023479955). RESULTS: Twelve observational studies met our eligibility criteria, including 280 patients. Mean age was 55.0 years, and women represented 25.3% of the overall population. Although the mean preoperative left ventricular ejection fraction was 59.8%, 60.8% of patients developed left ventricular failure and 34.0% developed biventricular failure. Coronary involvement and malperfusion were found in 37.1% and 25.6%, respectively. Concomitant coronary bypass surgery was performed in 38.5% of patients. Regarding ECLS, retrograde (femoral) flow was used in 39.9% and central cannulation in 35.4%. In-hospital mortality was 62.8%, and the pooled estimate of successful weaning was 50.8%. Neurological complications, bleeding, and renal failure occurred in 25.9%, 38.7%, and 65.5%, respectively. CONCLUSION: ECLS after surgical repair for ATAAD remains associated with high rates of in-hospital death and complications, but it still represents a chance of survival in critical situations. ECLS remains a salvage therapy, and surgeons should not avoid it at all costs after repair of ATAAD.
BACKGROUND: Induction immunosuppression in heart transplant recipients varies greatly by center. Basiliximab (BAS) is the most commonly used induction immunosuppressant but has not been shown to reduce rejection or improve survival. The objective of this retrospective study was to compare rejection, infection, and mortality within the first 12 months following heart transplant in patients who received BAS or no induction. METHODS: This was a retrospective cohort study of adult heart transplant recipients given BAS or no induction from January 1, 2017 to May 31, 2021. The primary endpoint was the incidence of treated acute cellular rejection (ACR) at 12 months post-transplant. Secondary endpoints included ACR at 90 days post-transplant, incidence of antibody-mediated rejection (AMR) at 90 days and 1 year, incidence of infection, and all-cause mortality at 1 year. RESULTS: A total of 108 patients received BAS, and 26 patients received no induction within the specified timeframe. There was a lower incidence of ACR within the first year in the BAS group compared to the no-induction group (27.7% vs. 68.2%, p < .002). BAS was independently associated with a lower probability of having a rejection event during the first 12 months post-transplant (hazard ratio (HR) .285, 95% confidence interval [CI] .142-.571, p < .001). There was no difference in the rate of infection or in mortality after hospital discharge at 1 year post-transplant (6% vs. 0%, p = .20). CONCLUSION: BAS appears to be associated with greater freedom from rejection without an increase in infections. BAS may be preferred to a no-induction strategy in patients undergoing heart transplantation.
Subject(s)
Antibodies, Monoclonal , Heart Transplantation , Humans , Adult , Basiliximab , Antibodies, Monoclonal/therapeutic use , Retrospective Studies , Immunosuppressive Agents/therapeutic use , Immunosuppressive Agents/pharmacology , Graft Rejection/etiology , Recombinant Fusion Proteins/therapeutic use
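Many of the incidence comparisons reported in these abstracts can be summarized, before adjustment, as an odds ratio with a Wald 95% confidence interval from a 2x2 table. The sketch below shows that standard calculation; the counts are invented for illustration and are not any study's data.

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio and Wald confidence interval for a 2x2 table:

        a = exposed with event      b = exposed without event
        c = unexposed with event    d = unexposed without event
    """
    or_ = (a * d) / (b * c)
    # Standard error of log(OR) is sqrt of the summed reciprocal cell counts.
    se_log = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    lo = math.exp(math.log(or_) - z * se_log)
    hi = math.exp(math.log(or_) + z * se_log)
    return or_, lo, hi

# Invented example: 10/100 events in the exposed group vs 20/100 unexposed.
print(odds_ratio_ci(10, 90, 20, 80))
```

Note that the multivariable results quoted above (logistic or Cox models) adjust for covariates and cannot be reproduced from a single 2x2 table; this sketch covers only the unadjusted case.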
In this project, we describe proteasome inhibitor (PI) treatment of antibody-mediated rejection (AMR) in heart transplantation (HTX). From January 2018 to September 2021, 10 patients were treated with PI for AMR: carfilzomib (CFZ), n = 8; bortezomib (BTZ), n = 2. Patients received 1-3 cycles of PI. All patients had ≥1 strong donor-specific antibody (DSA) (mean fluorescence intensity [MFI] > 8000) in undiluted serum. Most DSAs (20/21) had HLA class II specificity. The MFI of strong DSAs had a median reduction of 56% (IQR = 13%-89%) in undiluted serum and 92% (IQR = 53%-95%) at 1:16 dilution. Seventeen DSAs in seven patients were reduced > 50% at 1:16 dilution after treatment. Four DSAs from three patients did not respond. DSAs with MFI > 8000 at 1:16 dilution were less responsive to treatment. Six of 10 patients (60%) presented with graft dysfunction; 4 of 6 recovered an ejection fraction > 40% after treatment. Pathologic AMR resolved in 5 of 7 patients (71.4%) within 1 year after treatment. Nine of 10 patients (90%) survived to 1 year after AMR diagnosis. Use of PI for AMR resulted in significant DSA reduction with some resolution of graft dysfunction. Larger studies are needed to evaluate PI for AMR.
Subject(s)
Heart Transplantation , Kidney Transplantation , Humans , Proteasome Inhibitors/therapeutic use , Isoantibodies , Kidney Transplantation/adverse effects , HLA Antigens , Tissue Donors , Graft Rejection/drug therapy , Graft Rejection/etiology , Retrospective Studies
BACKGROUND: Patients with cardiogenic shock or end-stage heart failure can be maintained on mechanical circulatory support (MCS) devices. Once a patient undergoes placement of a device, obtaining and maintaining therapeutic anticoagulation is vital. Guidelines recommend the use of institutional protocols to assist in dosing and titration of anticoagulants. OBJECTIVE: The purpose of this study was to characterize the use of bivalirudin before and after the implementation of a standardized titration protocol in patients with MCS. METHODS: A retrospective review was conducted of patients who received bivalirudin for MCS (VA ECMO [veno-arterial extracorporeal membrane oxygenation], Impella, or LVAD [left ventricular assist device]) before and after the implementation of the titration protocol in the electronic health record (EHR). The primary outcome was the proportion of therapeutic activated partial thromboplastin time (aPTT) values. Secondary outcomes included the number of subtherapeutic and supratherapeutic aPTTs, the incidence of bleeding and clotting events, bivalirudin titrations per day, and the percentage of patients with a therapeutic aPTT level. RESULTS: A total of 100 patients were included (precohort = 67; postcohort = 33). The proportion of therapeutic aPTTs was significantly higher in the postcohort than in the precohort (62% vs 48%; P < 0.001). No patients in the postcohort failed to achieve a therapeutic aPTT level. The number of titrations per day was significantly lower in the postcohort, at 1.20 versus 1.93 in the precohort (P < 0.001). CONCLUSIONS: Implementation of the bivalirudin titration nomograms within the EHR significantly increased the number of therapeutic aPTTs, reduced the number of patients who never achieved a therapeutic aPTT, and reduced the required number of titrations per day.
BACKGROUND: The Impella 5.5 (Abiomed; Danvers, MA) is a commonly used, surgically implanted, temporary mechanical circulatory support (tMCS) device that requires systemic anticoagulation and a purge solution to avoid pump failure. To avoid heparin-induced thrombocytopenia (HIT) from unfractionated heparin (UFH) use, our program has explored the utility of bivalirudin (BIV) for systemic anticoagulation and a sodium bicarbonate-dextrose purge solution (SBPS) with the Impella 5.5. METHODS: This single-center, retrospective study included 34 patients supported on the Impella 5.5 with BIV-based anticoagulation and SBPS between December 1, 2020 and December 1, 2021. The efficacy and safety endpoints were the incidence of HIT, tissue plasminogen activator (tPA) use for suspected pump thrombosis, stroke, and device failure, as well as clinically significant bleeding. RESULTS: The median duration of Impella 5.5 support was 9.8 days (IQR: 6-15). Most patients were bridged to heart transplantation (58%), followed by recovery (27%) and left ventricular assist device (LVAD) implantation (15%). Patients were therapeutic on bivalirudin for 64% of their Impella 5.5 support. One patient (2.9%) suffered an ischemic stroke, and 9 patients (26.5%) developed clinically significant bleeding. tPA was administered to 7 (21%) patients. One patient in the entire cohort developed HIT. CONCLUSIONS: Our experience supports the use of systemic BIV and SBPS as a method to avoid heparin exposure in a patient population predisposed to the development of HIT.
Subject(s)
Heparin , Thrombocytopenia , Humans , Heparin/adverse effects , Anticoagulants/adverse effects , Tissue Plasminogen Activator/adverse effects , Sodium Bicarbonate , Retrospective Studies , Hirudins/adverse effects , Peptide Fragments/adverse effects , Hemorrhage/chemically induced , Recombinant Proteins/adverse effects , Treatment Outcome
OBJECTIVE: To determine the impact of diastolic dysfunction (DD) on survival after routine cardiac surgery. DESIGN: This was an observational study of consecutive cardiac surgeries from 2010 to 2021. SETTING: At a single institution. PARTICIPANTS: Patients undergoing isolated coronary, isolated valvular, and concomitant coronary and valvular surgery were included. Patients whose transthoracic echocardiogram (TTE) was obtained more than 6 months before their index surgery were excluded from the analysis. INTERVENTIONS: Patients were categorized via preoperative TTE as having no DD, grade I DD, grade II DD, or grade III DD. MEASUREMENTS AND MAIN RESULTS: A total of 8,682 patients undergoing coronary and/or valvular surgery were identified, of whom 4,375 (50.4%) had no DD, 3,034 (34.9%) had grade I DD, 1,066 (12.3%) had grade II DD, and 207 (2.4%) had grade III DD. The median (IQR) interval between the TTE and the index surgery was 6 (2-29) days. Operative mortality was 5.8% in the grade III DD group v 2.4% for grade II DD, 1.9% for grade I DD, and 2.1% for no DD (p = 0.001). Atrial fibrillation, prolonged mechanical ventilation (>24 hours), acute kidney injury, any packed red blood cell transfusion, reexploration for bleeding, and length of stay were higher in the grade III DD group than in the rest of the cohort. The median follow-up was 4.0 (IQR: 1.7-6.5) years. Kaplan-Meier survival estimates were lower in the grade III DD group than in the rest of the cohort. CONCLUSIONS: These findings suggest that DD may be associated with poor short-term and long-term outcomes.
Subject(s)
Cardiac Surgical Procedures , Ventricular Dysfunction, Left , Humans , Ventricular Dysfunction, Left/etiology , Ventricular Dysfunction, Left/complications , Cardiac Surgical Procedures/adverse effects , Echocardiography , Heart , Retrospective Studies , Treatment Outcome
Veno-venous extracorporeal membrane oxygenation (VV ECMO) has become an important support modality for patients with acute respiratory failure refractory to optimal medical therapy, such as low-tidal-volume mechanical ventilation, early paralytic infusion, and early prone positioning. The objective of this cohort study was to investigate the causes and timing of in-hospital mortality in patients on VV ECMO. All patients, excluding trauma and bridge-to-lung-transplant patients, admitted 8/2014-6/2019 to a specialty ICU for VV ECMO were reviewed. Two hundred twenty-five patients were included. In-hospital mortality was 24.4% (n = 55). Most non-survivors (46/55, 84%) died prior to lung recovery and decannulation from VV ECMO. The most common cause of death (COD) for patients who died on VV ECMO was removal of life-sustaining therapy (LST) in the setting of multisystem organ failure (MSOF) (n = 24). Nine patients died a median of 9 days [6, 11] after decannulation; the most common COD in these patients was palliative withdrawal of LST due to poor prognosis (n = 3). Non-survivors were older and had worse predictive mortality scores than survivors. We found that death in patients supported with VV ECMO most often occurred prior to decannulation and lung recovery, and that the most common cause of death was removal of LST due to MSOF. Acute hemorrhage (systemic or intracranial) was not a common cause of death in our patient population.
Subject(s)
Extracorporeal Membrane Oxygenation , Respiratory Distress Syndrome , Humans , Extracorporeal Membrane Oxygenation/adverse effects , Cohort Studies , Cause of Death , Respiratory Distress Syndrome/therapy , Hospital Mortality , Retrospective Studies
INTRODUCTION: Veno-venous extracorporeal membrane oxygenation (VV ECMO) has become a support modality for patients with acute respiratory failure refractory to standard therapies. VV ECMO has been increasingly used during the current COVID-19 pandemic for patients with refractory respiratory failure. The objective of this study was to evaluate the outcomes of VV ECMO in patients with COVID-19 compared to patients with non-COVID-19 viral infections. METHODS: We retrospectively reviewed all patients supported with VV ECMO between 8/2014 and 8/2020 whose etiology of illness was a viral pulmonary infection. The primary outcome of this study was in-hospital mortality. Secondary outcomes included length of ECMO course, ventilator duration, hospital length of stay, and incidence of adverse events throughout the ECMO course. RESULTS: Eighty-nine patients were included (35 COVID-19 vs 54 non-COVID-19). Forty (74%) of the non-COVID-19 patients had influenza virus. Prior to cannulation, COVID-19 patients had longer ventilator duration (3 vs 1 day, p = .003), higher PaCO2 (64 vs 53 mmHg, p = .012), and higher white blood cell count (14 vs 9 ×103/µL, p = .004). Overall in-hospital mortality was 33.7% (n = 30). COVID-19 patients had higher mortality (49% vs. 24%, p = .017) than non-COVID-19 patients. COVID-19 survivors had a longer median time on ECMO than non-COVID-19 survivors (24.4 vs 16.5 days, p = .03) but a similar hospital length of stay (HLOS) (41 vs 48 days, p = .33). CONCLUSION: COVID-19 patients supported with VV ECMO have higher mortality than non-COVID-19 patients. While COVID-19 survivors had significantly longer VV ECMO runs than non-COVID-19 survivors, HLOS was similar. These data add to a growing body of literature supporting the use of ECMO for potentially reversible causes of respiratory failure.
Subject(s)
COVID-19 , Extracorporeal Membrane Oxygenation , Respiratory Distress Syndrome , Respiratory Insufficiency , Humans , COVID-19/therapy , Retrospective Studies , Pandemics , Respiratory Distress Syndrome/therapy , Respiratory Insufficiency/etiology , Respiratory Insufficiency/therapy
We report orthotopic (life-supporting) survival of genetically engineered porcine cardiac xenografts (with six gene modifications) for almost 9 months in baboon recipients. This work builds on our previously reported heterotopic cardiac xenograft (three gene modifications) survival up to 945 days with an anti-CD40 monoclonal antibody-based immunosuppression. In this current study, life-supporting xenografts containing multiple human complement regulatory, thromboregulatory, and anti-inflammatory proteins, in addition to growth hormone receptor knockout (KO) and carbohydrate antigen KOs, were transplanted in the baboons. Selective "multi-gene" xenografts demonstrate survival greater than 8 months without the requirement of adjunctive medications and without evidence of abnormal xenograft thickness or rejection. These data demonstrate that selective "multi-gene" modifications improve cardiac xenograft survival significantly and may be foundational for paving the way to bridge transplantation in humans.
Subject(s)
Graft Rejection , Heart Transplantation , Animals , Animals, Genetically Modified , Graft Survival , Heterografts , Humans , Immunosuppressive Agents , Papio , Swine , Transplantation, Heterologous
BACKGROUND: Left ventricular assist devices (LVADs) have become a standard treatment option for patients with advanced heart failure. However, these devices are prone to adverse events. Nonsurgical bleeding (NSB) is the most common complication in patients with continuous-flow (CF) LVADs. The development of acquired von Willebrand syndrome (AVWS) in CF-LVAD recipients is thought to be a key factor. However, AVWS is seen across the majority of LVAD patients, not just those with NSB. The purpose of this study was to examine the link between acquired platelet defects and NSB in CF-LVAD patients. METHODS: Blood samples were collected from 62 CF-LVAD patients before implantation and at four post-implantation timepoints. Reduced adhesion receptor expression (GPIbα and GPVI) and platelet activation (GPIIb/IIIa activation) were used as markers for acquired platelet defects. RESULTS: Twenty-three patients experienced at least one NSB episode. Significantly higher levels of platelet activation and receptor reduction were seen in the postimplantation blood samples from bleeders compared with non-bleeders. All patients experienced loss of high-molecular-weight multimers (HMWM) of von Willebrand factor (vWF), but no difference was seen between the two groups. Multivariable logistic regression showed that biomarkers for reduced platelet receptor expression (GPIbα and GPVI) and activation (GPIIb/IIIa) had more predictive power for NSB, with area under the curve (AUC) values of 0.72, 0.68, and 0.62, respectively, than the loss of HMWM of vWF (AUC: 0.57). CONCLUSION: The data from this study indicate that the severity of acquired platelet defects is directly linked to NSB in CF-LVAD recipients.
Subject(s)
Heart Failure , Heart-Assist Devices , von Willebrand Diseases , Humans , Heart-Assist Devices/adverse effects , von Willebrand Factor , Hemorrhage/therapy , Hemorrhage/complications , von Willebrand Diseases/etiology , Platelet Activation , Heart Failure/surgery
INTRODUCTION: There are no guidelines regarding the use of bovine pericardial or porcine valves for aortic valve replacement, and prior studies have yielded conflicting results. The current study sought to compare short- and long-term outcomes in propensity-matched cohorts of patients undergoing isolated aortic valve replacement (AVR) with bovine versus porcine valves. METHODS: This was a retrospective study utilizing an institutional database of all isolated bioprosthetic surgical aortic valve replacements performed at our center from 2010 to 2020. Patients were stratified according to type of bioprosthetic valve (bovine pericardial or porcine), and 1:1 propensity-score matching was applied. Kaplan-Meier survival estimation and multivariable Cox regression for mortality were performed. Cumulative incidence functions were generated for all-cause readmissions and aortic valve reinterventions. RESULTS: A total of 1502 patients were identified, 1090 (72.6%) of whom received a bovine prosthesis and 412 (27.4%) of whom received a porcine prosthesis. Propensity-score matching resulted in 412 risk-adjusted pairs. There were no significant differences in clinical or echocardiographic postoperative outcomes in the matched cohorts. Kaplan-Meier survival estimates were comparable, and, on multivariable Cox regression, valve type was not significantly associated with long-term mortality (hazard ratio: 1.02, 95% confidence interval: 0.74, 1.40, p = .924). Additionally, there were no significant differences in competing-risk cumulative incidence estimates for all-cause readmissions (p = .68) or aortic valve reinterventions (p = .25) in the matched cohorts. CONCLUSION: The use of either bovine or porcine bioprosthetic aortic valves yields comparable postoperative outcomes, long-term survival, freedom from reintervention, and freedom from readmission.
Subject(s)
Bioprosthesis, Heart Valve Prosthesis Implantation, Heart Valve Prosthesis, Animals, Cattle, Swine, Aortic Valve/surgery, Heart Valve Prosthesis Implantation/methods, Retrospective Studies, Treatment Outcome, Prosthesis Design, Heart Valve Prosthesis/adverse effects, Bioprosthesis/adverse effects, Postoperative Complications/etiology
ABSTRACT
Disseminated intravascular coagulation (DIC) is a life-threatening hematologic derangement characterized by dysregulated thrombin generation and excessive fibrinolysis. However, DIC is poorly characterized in the extracorporeal membrane oxygenation (ECMO) population, and the underlying mechanisms are not well understood. Several mechanisms contribute to DIC in ECMO, including consumption of coagulation factors, acquired von Willebrand syndrome leading to thrombocytopenia, and hyperfibrinolysis. There are few case reports of DIC in adult ECMO patients, and most are in the context of venoarterial ECMO, which is typically used in the setting of cardiogenic shock and cardiac arrest. These disease states are themselves known to be associated with DIC, liver failure, impaired anticoagulant mechanisms, and increased fibrinolysis. We present an unusual case of a 74-year-old man who developed overt DIC during veno-venous (VV) ECMO. DIC resulted in clinical bleeding and severe hypofibrinogenemia requiring massive cryoprecipitate transfusion of 87 pooled units. When the patient was decannulated from ECMO, his platelet count and fibrinogen concentration improved within 24 hours, suggesting that ECMO was a proximate cause of his DIC.
Subject(s)
Afibrinogenemia, Disseminated Intravascular Coagulation, Extracorporeal Membrane Oxygenation, Heart Arrest, Adult, Afibrinogenemia/complications, Afibrinogenemia/therapy, Aged, Disseminated Intravascular Coagulation/etiology, Disseminated Intravascular Coagulation/therapy, Extracorporeal Membrane Oxygenation/adverse effects, Extracorporeal Membrane Oxygenation/methods, Humans, Male
ABSTRACT
BACKGROUND: Extracorporeal cardiopulmonary resuscitation (ECPR) for refractory cardiac arrest has improved mortality in post-cardiac surgery patients; however, loss of neurologic function remains one of the most devastating complications. We reviewed our experience with ECPR and investigated the effect of cannulation strategy on neurologic outcome in adult patients who experienced cardiac arrest following cardiac surgery that was managed with ECPR. METHODS: Patients were categorized by central versus percutaneous peripheral VA-extracorporeal membrane oxygenation (ECMO) cannulation strategy. We reviewed patient records and evaluated in-hospital mortality, cause of death, and neurologic status 72 hours after cannulation. RESULTS: From January 2010 to September 2019, 44 patients underwent post-cardiac surgery ECPR for cardiac arrest. Twenty-six patients received central cannulation, and 18 patients received peripheral cannulation. Mean postoperative day of cardiac arrest was 3 versus 9 days (p = 0.006), and mean time from initiation of CPR to ECMO was 40 ± 24 versus 28 ± 22 minutes for central and peripheral cannulation, respectively. After 72 hours of VA-ECMO support, 30% of centrally cannulated patients versus 72% of peripherally cannulated patients attained cerebral performance category 1-2 (p = 0.01). Anoxic brain injury was the cause of death in 26.9% of centrally cannulated and 11.1% of peripherally cannulated patients. Survival to discharge was 31% and 39% for central and peripheral cannulation, respectively. CONCLUSIONS: Peripheral VA-ECMO allows for continuous CPR and systemic perfusion while vascular access is obtained. Compared to central cannulation, a peripheral cannulation strategy is associated with improved neurologic outcomes and a decreased likelihood of anoxic brain death.
Subject(s)
Cardiac Surgical Procedures, Cardiopulmonary Resuscitation, Extracorporeal Membrane Oxygenation, Heart Arrest, Adult, Cardiac Surgical Procedures/adverse effects, Catheterization, Heart Arrest/etiology, Heart Arrest/therapy, Humans, Retrospective Studies, Treatment Outcome
ABSTRACT
Background and Objectives: Post-infarct ventricular septal rupture (PIVSR) continues to carry significant morbidity and mortality, despite decreased prevalence. Impella and venoarterial extracorporeal membrane oxygenation (VA-ECMO) have been proposed as strategies to correct hemodynamic derangements and bridge patients to delayed operative repair, when success rates are higher. This review places VA-ECMO and Impella support strategies in the context of bridging patients to successful PIVSR repair, with an additional case report of successful bridging with the Impella device. Materials and Methods: We report a case of PIVSR repair utilizing 14 days of Impella support. We additionally conducted a systematic review of the contemporary literature to describe the application of VA-ECMO and Impella devices in the preoperative period prior to surgical PIVSR correction. Expert commentary on the advantages and disadvantages of each of these techniques is provided. Results: We identified 19 studies with 72 patients undergoing VA-ECMO as a bridge to PIVSR repair and 6 studies with 11 patients utilizing an Impella device as a bridge to PIVSR repair. Overall, outcomes in both groups were better than those historically reported for patients managed with medical therapy and balloon pump support alone; however, there was significant heterogeneity between studies. Impella provided excellent left ventricular unloading but raised some concerns for reversal of shunting. VA-ECMO improved end-organ perfusion but carried increased risks of device-related complications and a requirement for additional ventricular unloading. Conclusions: Patients presenting with PIVSR in cardiogenic shock who require a mechanical circulatory support (MCS) bridge to definitive surgical repair continue to pose a challenge to the multidisciplinary cardiovascular team, as their diverse presentations and management issues require individualized care plans.
Both VA-ECMO and the Impella family of devices play a role in the contemporary management of PIVSR, offering distinct advantages and disadvantages depending on the clinical scenario. The limited case numbers reported demonstrate feasibility and safety and inform recommendations for optimal management.