ABSTRACT
BACKGROUND: Data showing the efficacy and safety of the transplantation of hearts obtained from donors after circulatory death as compared with hearts obtained from donors after brain death are limited. METHODS: We conducted a randomized, noninferiority trial in which adult candidates for heart transplantation were assigned in a 3:1 ratio to receive a heart after the circulatory death of the donor or a heart from a donor after brain death if that heart was available first (circulatory-death group) or to receive only a heart that had been preserved with the use of traditional cold storage after the brain death of the donor (brain-death group). The primary end point was the risk-adjusted survival at 6 months in the as-treated circulatory-death group as compared with the brain-death group. The primary safety end point was serious adverse events associated with the heart graft at 30 days after transplantation. RESULTS: A total of 180 patients underwent transplantation; 90 (assigned to the circulatory-death group) received a heart donated after circulatory death and 90 (regardless of group assignment) received a heart donated after brain death. A total of 166 transplant recipients were included in the as-treated primary analysis (80 who received a heart from a circulatory-death donor and 86 who received a heart from a brain-death donor). The risk-adjusted 6-month survival in the as-treated population was 94% (95% confidence interval [CI], 88 to 99) among recipients of a heart from a circulatory-death donor, as compared with 90% (95% CI, 84 to 97) among recipients of a heart from a brain-death donor (least-squares mean difference, -3 percentage points; 90% CI, -10 to 3; P<0.001 for noninferiority [margin, 20 percentage points]). There were no substantial between-group differences in the mean per-patient number of serious adverse events associated with the heart graft at 30 days after transplantation. CONCLUSIONS: In this trial, risk-adjusted survival at 6 months after transplantation with a donor heart that had been reanimated and assessed with the use of extracorporeal nonischemic perfusion after circulatory death was not inferior to that after standard-care transplantation with a donor heart that had been preserved with the use of cold storage after brain death. (Funded by TransMedics; ClinicalTrials.gov number, NCT03831048.).
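For readers unfamiliar with noninferiority logic, the sketch below illustrates the decision rule implied by the reported primary analysis: noninferiority is declared when the lower confidence bound of the survival difference stays above the negative of the prespecified margin. The inputs mirror the abstract's reported values; the trial's actual risk adjustment and least-squares modeling are not reproduced.

```python
# Hedged sketch of a noninferiority check; inputs mirror the abstract's
# reported difference (-3 percentage points, 90% CI -10 to 3) and the
# prespecified 20-percentage-point margin.

def noninferior(ci_lower: float, margin: float) -> bool:
    """Declare noninferiority when the lower confidence bound of the
    between-group difference (new therapy minus control) exceeds -margin."""
    return ci_lower > -margin

print(noninferior(ci_lower=-10.0, margin=20.0))  # True -> noninferior
```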
Subject(s)
Brain Death , Heart Transplantation , Tissue and Organ Procurement , Adult , Humans , Graft Survival , Organ Preservation , Tissue Donors , Death , Patient Safety
ABSTRACT
OBJECTIVES: Extracorporeal cardiopulmonary resuscitation (ECPR) is the implementation of venoarterial extracorporeal membrane oxygenation (VA-ECMO) during refractory cardiac arrest. The role of left-ventricular (LV) unloading with Impella in addition to VA-ECMO ("ECMELLA") remains unclear during ECPR. This is the first systematic review and meta-analysis to characterize patients with ECPR receiving LV unloading and to compare in-hospital mortality between ECMELLA and VA-ECMO during ECPR. DATA SOURCES: Medline, Cochrane Central Register of Controlled Trials, Embase, and abstract websites of the three largest cardiology societies (American Heart Association, American College of Cardiology, and European Society of Cardiology). STUDY SELECTION: Observational studies with adult patients with refractory cardiac arrest receiving ECPR with ECMELLA or VA-ECMO until July 2023 according to the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) checklist. DATA EXTRACTION: Patient and treatment characteristics and in-hospital mortality from 13 study records at 32 hospitals with a total of 1014 ECPR patients. Odds ratios (ORs) and 95% CIs were computed with the Mantel-Haenszel test using a random-effects model. DATA SYNTHESIS: Seven hundred sixty-two patients (75.1%) received VA-ECMO and 252 (24.9%) ECMELLA. Compared with VA-ECMO, the ECMELLA group comprised more patients with initial shockable electrocardiogram rhythms (58.6% vs. 49.3%), acute myocardial infarctions (79.7% vs. 51.5%), and percutaneous coronary interventions (79.0% vs. 47.5%). VA-ECMO alone was more frequently used in pulmonary embolism (9.5% vs. 0.7%). Age, rate of out-of-hospital cardiac arrest, and low-flow times were similar between both groups. ECMELLA support was associated with reduced odds of mortality (OR, 0.53 [95% CI, 0.30-0.91]) and higher odds of good neurologic outcome (OR, 2.22 [95% CI, 1.17-4.22]) compared with VA-ECMO support alone. ECMELLA therapy was associated with numerically increased but not significantly higher complication rates. Primary results remained robust in multiple sensitivity analyses. CONCLUSIONS: ECMELLA support was predominantly used in patients with acute myocardial infarction and VA-ECMO for pulmonary embolism. ECMELLA support during ECPR might be associated with improved survival and neurologic outcome despite higher complication rates. However, indications and frequency of ECMELLA support varied strongly between institutions. Further scientific evidence is urgently required to elaborate standardized guidelines for the use of LV unloading during ECPR.
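As background for the pooling described above, here is a minimal sketch of random-effects meta-analysis of study-level odds ratios. The study used the Mantel-Haenszel method; this illustration substitutes the simpler inverse-variance DerSimonian-Laird estimator, and the per-study 2x2 tables are hypothetical.

```python
# Hedged sketch: random-effects pooling of log odds ratios (DerSimonian-Laird).
import numpy as np

# Hypothetical per-study counts: (deaths_ecmella, n_ecmella, deaths_va, n_va)
studies = [(10, 30, 25, 50), (8, 40, 30, 70), (15, 45, 40, 80)]

log_or, var = [], []
for a, n1, c, n2 in studies:
    b, d = n1 - a, n2 - c                      # survivors in each arm
    log_or.append(np.log((a * d) / (b * c)))   # study log odds ratio
    var.append(1/a + 1/b + 1/c + 1/d)          # its approximate variance

log_or, var = np.array(log_or), np.array(var)
w = 1 / var                                    # fixed-effect weights
mu_fe = np.sum(w * log_or) / w.sum()
q = np.sum(w * (log_or - mu_fe) ** 2)          # Cochran's Q heterogeneity
tau2 = max(0.0, (q - (len(studies) - 1)) / (w.sum() - np.sum(w**2) / w.sum()))
w_re = 1 / (var + tau2)                        # random-effects weights
pooled = np.sum(w_re * log_or) / w_re.sum()
se = np.sqrt(1 / w_re.sum())
print(f"pooled OR {np.exp(pooled):.2f} "
      f"(95% CI {np.exp(pooled - 1.96*se):.2f}-{np.exp(pooled + 1.96*se):.2f})")
```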
Subject(s)
Extracorporeal Membrane Oxygenation , Heart Arrest , Humans , Extracorporeal Membrane Oxygenation/methods , Heart Arrest/therapy , Heart Arrest/mortality , Heart-Assist Devices , Cardiopulmonary Resuscitation/methods , Hospital Mortality
ABSTRACT
BACKGROUND: Sarcopenia is underappreciated in advanced heart failure and is not routinely assessed. In patients receiving a left ventricular assist device, preoperative sarcopenia, defined by using computed tomography (CT)-derived pectoralis muscle-area index (muscle area indexed to body-surface area), is an independent predictor of postoperative mortality. The association between preoperative sarcopenia and outcomes after heart transplant (HT) is unknown. OBJECTIVES: The primary aim of this study was to determine whether preoperative sarcopenia, diagnosed using the pectoralis muscle-area index, is an independent predictor of days alive and out of the hospital (DAOHs) post-transplant. METHODS: Patients who underwent HT between January 2018 and June 2022 with available preoperative chest CT scans were included. Sarcopenia was diagnosed as pectoralis muscle-area index in the lowest sex-specific tertile. The primary endpoint was DAOHs at 1 year post-transplant. RESULTS: The study included 169 patients. Patients with sarcopenia (n = 55) had fewer DAOHs compared to those without sarcopenia, with a median difference of 17 days (320 vs 337 days; P = 0.004). Patients with sarcopenia had longer index hospitalizations and were also more likely to be discharged to a facility other than home. In a Poisson regression model, sarcopenia was a significant univariable and the strongest multivariable predictor of DAOHs at 1 year (parameter estimate = -0.17; 95% CI: -0.19 to -0.14; P < 0.0001). CONCLUSIONS: Preoperative sarcopenia, diagnosed using the pectoralis muscle-area index, is an independent predictor of poor outcomes after HT. This parameter is easily measurable from commonly obtained preoperative CT scans and may be considered in transplant evaluations.
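A minimal sketch of how the sarcopenia flag defined above could be computed from a table of pectoralis areas and body measurements. The column names and the Mosteller body-surface-area formula are illustrative assumptions, not the study's actual measurement pipeline.

```python
# Hedged sketch: pectoralis muscle-area index (PMI) and sex-specific tertiles.
import numpy as np
import pandas as pd

def bsa_mosteller(height_cm: float, weight_kg: float) -> float:
    """Body-surface area (m^2) by the Mosteller formula (assumed here)."""
    return np.sqrt(height_cm * weight_kg / 3600.0)

df = pd.DataFrame({
    "sex": ["M", "M", "M", "F", "F", "F"],
    "pectoralis_area_cm2": [38.0, 30.5, 22.1, 26.0, 20.4, 15.8],
    "height_cm": [178, 172, 180, 163, 158, 166],
    "weight_kg": [82, 75, 90, 60, 55, 70],
})
df["pmi"] = df["pectoralis_area_cm2"] / df.apply(
    lambda r: bsa_mosteller(r["height_cm"], r["weight_kg"]), axis=1)

# Sarcopenia = PMI in the lowest sex-specific tertile.
df["sarcopenia"] = df.groupby("sex")["pmi"].transform(
    lambda s: s <= s.quantile(1 / 3))
print(df[["sex", "pmi", "sarcopenia"]])
```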
ABSTRACT
BACKGROUND: Aortic regurgitation (AR) is a common complication following left ventricular assist device (LVAD) implantation. We evaluated the hemodynamic implications of AR in patients with HeartMate 3 (HM3) LVAD at baseline and in response to speed changes. METHODS AND RESULTS: Clinically stable outpatients supported by HM3 who underwent a routine hemodynamic ramp test were retrospectively enrolled in this analysis. Patients were stratified based on the presence of at least mild AR at baseline speed. Hemodynamic and echocardiographic parameters were compared between the AR and non-AR groups. Sixty-two patients were identified. At the baseline LVAD speed, 29 patients (47%) had AR, while 33 patients (53%) did not. Patients with AR were older and supported on HM3 for a longer duration. At baseline speed, all hemodynamic parameters were similar between the groups, including central venous pressure, pulmonary capillary wedge pressure, pulmonary arterial pressures, cardiac output and index, and pulmonary artery pulsatility index (p > 0.05 for all). During the subacute assessment, AR worsened in some, but not all, patients as LVAD speed was increased. There were no significant differences in 1-year mortality or hospitalization rates between the groups; however, at 1 year, ≥ moderate AR and right ventricular failure (RVF) were detected at higher rates in the AR group than in the non-AR group (45% vs. 0%; p < 0.01, and 75% vs. 36.8%; p = 0.02, respectively). CONCLUSIONS: In a cohort of stable outpatients supported with HM3 who underwent a routine hemodynamic ramp test, the presence of mild or greater AR did not impact the ability of HM3 LVADs to effectively unload the left ventricle during early subacute assessment. Although the presence of AR did not affect mortality and hospitalization rates, it resulted in higher rates of late hemodynamic-related events in the form of progressive AR and RVF.
Subject(s)
Aortic Valve Insufficiency , Heart Failure , Heart-Assist Devices , Humans , Retrospective Studies , Heart Failure/diagnosis , Heart Failure/surgery , Heart-Assist Devices/adverse effects , Aortic Valve Insufficiency/diagnosis , Aortic Valve Insufficiency/etiology , Hemodynamics/physiology
ABSTRACT
We developed short-active-length distributed Bragg reflector (DBR) lasers to reduce the power consumption of chip-to-chip optical interconnects. These lasers have buried bulk InGaAsP waveguides to increase the coupling efficiency between the active region and DBR to 99.79% from the 98.14% of our previous DBR lasers that had InP channel waveguides. We achieved continuous wave operation of 5- to 80-µm active-length DBR lasers and the 5-µm-long laser consumed 24 fJ/bit with a 10-Gbps NRZ signal. The threshold current of the 5-µm laser was 51 µA, which compares favorably to our previous 10-µm DBR lasers with a threshold current of 170 µA.
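The energy-per-bit figure can be sanity-checked with a one-line calculation: average drive power equals energy per bit multiplied by the bit rate.

```python
# 24 fJ/bit at a 10-Gbps NRZ data rate implies the average power below.
energy_per_bit = 24e-15      # J/bit (24 fJ/bit)
bit_rate = 10e9              # bit/s (10 Gbps)
power_w = energy_per_bit * bit_rate
print(f"{power_w * 1e6:.0f} uW")  # 240 uW average power consumption
```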
ABSTRACT
Cardiac allograft vasculopathy (CAV) is a major cause of morbidity and mortality following heart transplantation (HT). Prior studies identified distinct CAV trajectories in the early post-HT period with unique predictors, but the evolution of CAV in later periods is not well-described. This study assessed the prevalence of late CAV progression and associated risk factors in HT recipients with ISHLT CAV 0/1 at 10 years post-HT. Consecutive adult patients who underwent HT from January 2000 to December 2008 were evaluated and grouped by CAV trajectories into progressors (developed ISHLT CAV 2/3) or nonprogressors (remained ISHLT CAV 0/1). A total of 130 patients were included, with a median age at angiography of 61.7 years and a median follow-up time of 4.8 years. Of these, 8.5% progressed to CAV 2/3, while the remaining 91.5% were nonprogressors. Progression was not associated with death or retransplantation (27.3% [progressor] vs. 21.0% [nonprogressor], p = 0.70). These data may inform shared decision-making about late CAV screening.
Subject(s)
Disease Progression , Heart Transplantation , Postoperative Complications , Humans , Female , Male , Middle Aged , Follow-Up Studies , Heart Transplantation/adverse effects , Heart Transplantation/mortality , Risk Factors , Prognosis , Retrospective Studies , Graft Survival , Survival Rate , Graft Rejection/etiology , Coronary Artery Disease/surgery , Coronary Artery Disease/etiology , Adult , Aged
ABSTRACT
BACKGROUND: Since the 2018 allocation system change in heart transplantation (HT), ischemic times have increased, which may be associated with peri-operative and post-operative complications. This study aimed to compare ischemia reperfusion injury (IRI) in hearts preserved using ice-cold storage (ICS) and the Paragonix SherpaPak™ Cardiac Transport System (CTS). METHODS: From January 2021 to June 2022, consecutive endomyocardial biopsies from 90 HT recipients were analyzed by a cardiac pathologist in a single-blinded manner: 33 ICS and 57 CTS. Endomyocardial biopsies were performed at three time intervals post-HT, and the severity of IRI manifesting histologically as coagulative myocyte necrosis (CMN) was evaluated, along with graft rejection and graft function. RESULTS: The incidence of IRI at weeks 1, 4, and 8 post-HT was similar between the ICS and CTS groups. There was a statistically significant 59.3% reduction in CMN from week 1 to week 4 with CTS, but not with ICS. By week 8, there were significant reductions in CMN in both groups. Only 1 out of 33 (3%) patients in the ICS group had an ischemic time >240 minutes, compared to 10 out of 52 (19%) patients in the CTS group. During the follow-up period of 8 weeks to 12 months, there were no significant differences in rejection rates, formation of de novo donor-specific antibodies, and overall survival between the groups. CONCLUSION: The CTS preservation system had similar rates of IRI and clinical outcomes compared to ICS despite longer overall ischemic times. There was significantly more recovery from IRI in the early postoperative period with CTS. This study supports CTS as a viable option for preservation from remote locations, expanding the donor pool.
Subject(s)
Graft Rejection , Graft Survival , Heart Transplantation , Organ Preservation , Humans , Heart Transplantation/adverse effects , Male , Female , Organ Preservation/methods , Middle Aged , Follow-Up Studies , Graft Rejection/etiology , Graft Rejection/pathology , Prognosis , Adult , Reperfusion Injury/etiology , Reperfusion Injury/pathology , Cryopreservation/methods , Tissue Donors/supply & distribution , Postoperative Complications , Retrospective Studies
ABSTRACT
BACKGROUND: Donor-derived cell-free DNA (dd-cfDNA) has emerged as a reliable, noninvasive method for the surveillance of allograft rejection in heart transplantation (HT) patients, but its utility in multi-organ transplants (MOT) is unknown. We describe our experience using dd-cfDNA in simultaneous MOT recipients. METHODS: We performed a single-center retrospective review of all HT recipients between 2018 and 2022 who had at least one dd-cfDNA measurement collected. Patients who had simultaneous MOT were identified and included in this study. Levels of dd-cfDNA were paired with endomyocardial biopsies (EMB) performed within 1 month of blood testing if available. Acute cellular rejection (ACR) was defined as ISHLT (International Society for Heart and Lung Transplantation) grade ≥ 2R, and antibody-mediated rejection (AMR) as pAMR grade > 0. The within-patient variability score of the dd-cfDNA was calculated as the variance divided by the average. RESULTS: The study included 25 multiorgan transplant recipients: 13 heart-kidney (H-K), 8 heart-liver (H-Li), and 4 heart-lung (H-Lu). The median age was 55 years, and 44% were female; the median time from HT until the first dd-cfDNA measurement was 4.5 months (IQR 2, 10.5). The median dd-cfDNA level was 0.18% (IQR 0.15%, 0.27%) for H-K, 1.15% (IQR 0.77%, 2.33%) for H-Li, and 0.69% (IQR 0.62%, 1.07%) for H-Lu patients (p < 0.001). The prevalence of positive dd-cfDNA tests (threshold of 0.20%) was 42.2%, 97.3%, and 92.3% in the H-K, H-Li, and H-Lu groups, respectively. The within-patient variability score was highest in the H-Li group (median of 0.45 [IQR 0.29, 0.94]) and lowest in the H-K group (median of 0.09 [IQR 0.06, 0.12]); p = 0.002. No evidence of cardiac ACR or AMR was found. Three patients experienced renal allograft ACR and/or AMR, two patients experienced rejection of the liver allograft, and one patient experienced an episode of AMR-mediated lung rejection. One person in the H-K group experienced an episode of cardiac allograft dysfunction that was not associated with biopsy-confirmed rejection. CONCLUSION: Dd-cfDNA is chronically elevated in most MOT recipients. There is a high degree of within-patient variability in levels (particularly for H-Li and H-Lu recipients), which may limit the utility of this assay in monitoring MOT recipients.
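The within-patient variability score used above is defined in the abstract as variance divided by average. A minimal sketch with hypothetical serial dd-cfDNA values (the use of the sample variance, ddof=1, is an assumption; the abstract does not specify):

```python
# Sketch of the within-patient variability score (variance / average).
import numpy as np

def variability_score(values):
    values = np.asarray(values, dtype=float)
    return values.var(ddof=1) / values.mean()  # sample variance over mean

heart_kidney = [0.15, 0.18, 0.22, 0.17]  # % dd-cfDNA over serial draws
heart_liver = [0.80, 2.30, 1.10, 3.00]
print(round(variability_score(heart_kidney), 3))  # low: stable signal
print(round(variability_score(heart_liver), 3))   # high: noisy signal
```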
Subject(s)
Cell-Free Nucleic Acids , Graft Rejection , Heart Transplantation , Tissue Donors , Humans , Female , Cell-Free Nucleic Acids/blood , Male , Retrospective Studies , Middle Aged , Heart Transplantation/adverse effects , Graft Rejection/diagnosis , Graft Rejection/etiology , Graft Rejection/blood , Follow-Up Studies , Prognosis , Organ Transplantation/adverse effects , Graft Survival , Biomarkers/blood , Transplant Recipients , Risk Factors , Adult
ABSTRACT
BACKGROUND: Belatacept (BTC), a fusion protein, selectively inhibits T-cell co-stimulation by binding to the CD80 and CD86 receptors on antigen-presenting cells (APCs) and has been used as immunosuppression in adult renal transplant recipients. However, data regarding its use in heart transplant (HT) recipients are limited. This retrospective cohort study aimed to delineate BTC's application in HT, focusing on efficacy, safety, and associated complications at a high-volume HT center. METHODS: A retrospective cohort study was conducted of patients who underwent HT between January 2017 and December 2021 and subsequently received BTC as part of their immunosuppressive regimen. Twenty-one HT recipients were identified. Baseline characteristics, history of rejection, and indication for BTC use were collected. Outcomes included renal function, graft function, allograft rejection, and mortality. Follow-up data were collected through December 2023. RESULTS: Among 776 patients monitored from January 2017 to December 2021, 21 (2.7%) received BTC treatment. The average age at transplantation was 53 ± 12 years, and 38% were women. BTC administration began a median of 689 days (IQR 483-1830) post-HT. The primary indications for BTC were elevated pre-formed donor-specific antibodies in highly sensitized patients (66.6%) and renal sparing (23.8%), in conjunction with reduced calcineurin inhibitor dosage. Only one (4.8%) patient encountered rejection within a year of starting BTC. Graft function by echocardiography remained stable at 6 and 12 months posttreatment. An improvement was observed in serum creatinine levels (76.2% of patients), decreasing from a median of 1.58 to 1.45 (IQR [1.0-2.1] to [1.1-1.9]) over 12 months (p = .054). eGFR improved at 3 and 6 months compared with levels 3 months pre-BTC; however, this was not statistically significant (p = .24). Treatment discontinuation occurred in seven patients (33.3%), of whom four (19%) were switched back to full-dose CNI. Infections occurred in 11 patients (52.4%), leading to BTC discontinuation in 4 patients (19%). CONCLUSION: In this cohort, BTC therapy was used as alternative immunosuppression for the management of highly sensitized patients or for renal sparing. BTC therapy combined with CNI dose reduction resulted in stabilization of renal function as measured through renal surrogate markers, although this did not reach statistical significance. Patients on BTC maintained a low rejection rate and preserved graft function. Infections were common during BTC therapy and were associated with medication pause/discontinuation in 19% of patients. Further randomized studies are needed to assess the efficacy and safety of BTC in HT recipients.
Subject(s)
Heart Transplantation , Kidney Transplantation , Adult , Humans , Female , Middle Aged , Male , Abatacept , Retrospective Studies , Kidney Transplantation/adverse effects , Immunosuppressive Agents , Calcineurin Inhibitors/therapeutic use , T-Lymphocytes , Graft Rejection/drug therapy , Graft Rejection/etiology , Transplant Recipients , Graft Survival
ABSTRACT
OBJECTIVE: Large-scale multicentre clinical trials conducted by cooperative groups have generated substantial evidence to establish better standard treatments. The Clinical Trials Act came into force in Japan on 1 April 2018 and has remarkably increased the operational burden on investigators, but its long-term impact on cancer cooperative groups is unknown. METHODS: A survey was conducted across the nine major cooperative groups that constitute the Japan Cancer Trials Network to assess the impact of the Clinical Trials Act on the number of newly initiated trials from fiscal year (from 1 April to 31 March) 2017 to 2022 and that of ongoing trials on 1 April in each year from 2018 to 2023. RESULTS: The number of newly initiated trials dropped from 38 trials in fiscal year 2017 to 26 trials in fiscal year 2018, surged to 50 trials in fiscal year 2019, but then gradually decreased to 25 trials by fiscal year 2022. Specified clinical trials decreased from 32 trials in fiscal year 2019 to 12 trials in fiscal year 2022. The number of ongoing trials was 220 trials in 2018, peaked at 245 trials in 2020, but then gradually decreased to 219 trials by 2023. The number of specified clinical trials has been in consistent decline. By April 2023, of the 20 ongoing non-specified clinical trials, nine adhered to the Clinical Trials Act and 11 followed the Ethical Guidelines for Medical and Health Research Involving Human Subjects. CONCLUSION: The number of multicentre clinical trials in oncology gradually decreased after the Clinical Trials Act's enforcement, which underscores the need for comprehensive amendment of the Act to streamline the operational process.
Subject(s)
Clinical Trials as Topic , Medical Oncology , Neoplasms , Humans , Clinical Trials as Topic/standards , Neoplasms/therapy , Medical Oncology/legislation & jurisprudence , Japan , Surveys and Questionnaires
ABSTRACT
The insulin/insulin-like growth factor-like signaling (IIS) pathway is highly conserved across metazoans and regulates numerous physiological functions, including development, metabolism, fecundity, and lifespan. The insulin receptor (InR), a crucial membrane receptor in the IIS pathway, is known to be ubiquitously expressed in various tissues, albeit at generally low levels, and its subcellular localization remains incompletely characterized. In this study, we employed CRISPR-mediated mutagenesis in the fruit fly Drosophila to create knock-in alleles of InR tagged with fluorescent proteins (InR::mCherry or InR::EYFP). By inserting the coding sequence of the fluorescent protein mCherry or EYFP near the end of the coding sequence of the endogenous InR gene, we could trace the native InR protein through its fluorescence. As an example, we investigated epithelial cells of the male accessory gland (AG), an internal reproductive organ, and identified two distinct patterns of InR::mCherry localization. In young AG, InR::mCherry accumulated on the basal plasma membrane between cells, whereas in mature AG, it exhibited intracellular localization as multiple puncta, indicating endocytic recycling of InR during cell growth. In AG whose senescence was accelerated by mutation of Diuretic hormone 31 (Dh31), InR::mCherry puncta were more pronounced than in the wild type. These findings highlight the utility of the newly created InR::mCherry/EYFP alleles for studying the precise expression levels and subcellular localization of InR. Furthermore, this fluorescently tagged allele approach can be extended to investigate other membrane receptors with low abundance, facilitating the direct examination of their true expression and localization.
Subject(s)
Drosophila Proteins , Drosophila melanogaster , Male , Animals , Drosophila melanogaster/physiology , Receptor, Insulin/genetics , Receptor, Insulin/metabolism , Alleles , Drosophila Proteins/genetics , Drosophila Proteins/metabolism , Drosophila
ABSTRACT
BACKGROUND: Pre-left ventricular assist device (LVAD) pectoralis muscle assessment, an estimate of sarcopenia, has been associated with postoperative mortality and gastrointestinal bleeding, though its association with inflammation, endotoxemia, length-of-stay (LOS), and readmissions remains underexplored. METHODS: This was a single-center cohort study of LVAD patients implanted between January 2015 and October 2018. Preoperative pectoralis muscle area was measured on chest computed tomography (CT) and adjusted for height squared to derive the pectoralis muscle area index (PMI). Those with PMI in the lowest quintile were defined as the low-PMI cohort; all others constituted the reference cohort. Biomarkers of inflammation (interleukin-6, adiponectin, tumor necrosis factor-α [TNFα]) and endotoxemia (soluble (s)CD14) were measured in a subset of patients. RESULTS: Of the 254 LVAD patients, 95 had a preoperative chest CT (median days pre-LVAD: 7 [IQR 3-13]), of whom 19 (20.0%) were in the low-PMI cohort and the remainder were in the reference cohort. Compared with the reference cohort, the low-PMI cohort had higher levels of sCD14 (2594 vs. 1850 ng/mL; p = 0.04) and TNFα (2.9 vs. 1.9 pg/mL; p = 0.03). In adjusted analyses, the low-PMI cohort had longer LOS (incidence rate ratio 1.56 [95% confidence interval 1.16-2.10], p = 0.004) and higher risk of 90-day and 1-year readmissions (subhazard ratio 5.48 [1.88-16.0], p = 0.002; hazard ratio 1.73 [1.02-2.94], p = 0.04, respectively). CONCLUSIONS: Pre-LVAD PMI is associated with inflammation, endotoxemia, and increased LOS and readmissions.
ABSTRACT
BACKGROUND: Hospital readmissions following left ventricular assist device (LVAD) implantation remain frequent and are associated with decreased quality of life and increased resource utilization. This study sought to determine the causes, predictors, and impact on survival of hospitalizations during HeartMate 3 (HM3) support. METHODS: All patients implanted with HM3 between November 2014 and December 2019 at Columbia University Irving Medical Center were consecutively enrolled in the study. Demographics and clinical characteristics from the index admission and the first outpatient visit were collected and used to estimate 1-year and 900-day readmission-free survival and overall survival. Multivariable analysis was performed for subsequent readmissions. RESULTS: Of 182 patients who received a HM3 LVAD, 167 (92%) were discharged after the index admission and experienced 407 unplanned readmissions over a median follow-up of 727 days (interquartile range [IQR] 410.5-1124.5). The 1-year and 900-day mean cumulative numbers of all-cause unplanned readmissions were 0.43 (95% CI 0.36-0.51) and 1.13 (95% CI 0.99-1.29), respectively. The most frequent causes of rehospitalization included major infections (29.3%), bleeding (13.2%), device-related complications (12.5%), volume overload (7.1%), and other (28%). One-year and 900-day survival free from all-cause readmission were 38% (95% CI 31-46%) and 16.6% (95% CI 10.3-24.4%), respectively. One-year freedom from 2, 3, and ≥4 readmissions was 60.7%, 74%, and 74.5%, respectively; the corresponding 900-day estimates were 26.2%, 33.3%, and 41.3%. One-year and 900-day survival were unaffected by the number of readmissions and remained >90%. Male sex, ischemic etiology, diabetes, lower serum creatinine, longer duration of the index hospitalization, and a history of readmission between discharge and the first outpatient visit were associated with subsequent readmissions. CONCLUSIONS: Unplanned hospital readmissions after HM3 implantation are common, with infections and bleeding accounting for the majority of readmissions. Irrespective of the number of readmissions, one-year survival remained unaffected.
Subject(s)
Heart Failure , Heart-Assist Devices , Patient Readmission , Humans , Patient Readmission/statistics & numerical data , Male , Female , Heart-Assist Devices/adverse effects , Middle Aged , Aged , Heart Failure/mortality , Heart Failure/therapy , Retrospective Studies , Adult , Risk Factors , Postoperative Complications/epidemiology , Postoperative Complications/etiology , Postoperative Complications/mortality , Quality of Life
ABSTRACT
BACKGROUND: No clear guidelines exist for perioperative anticoagulation management after durable left ventricular assist device insertion. In this study, we sought to compare outcomes between anti-factor Xa (FXa) and activated partial thromboplastin time (aPTT) monitoring of unfractionated heparin (UFH) dosing after HeartMate 3 (HM3) insertion. METHODS: This is a single-center retrospective review of patients who received UFH after HM3 insertion between January 2020 and December 2022. Postoperative UFH dosing was titrated to an aPTT goal of 45-60 seconds (n = 53) or an FXa goal of 0.1-0.2 U/mL (n = 59). Baseline differences between cohorts were balanced by inverse probability treatment weighting. RESULTS: At baseline, unadjusted FXa patients were more likely to be white (47.5% vs. 35.8%, p < 0.001), to be INTERMACS profile 1-2 (69.5% vs. 47.2%, p = 0.013), to have a history of coronary artery disease (66.1% vs. 43.4%, p = 0.026), and to have a lower eGFR (54.1 vs. 63.7 mL/min/1.73 m2, p = 0.029) compared to the aPTT group. After adjusting for several bleeding/thrombosis risk factors, 97.5% of FXa and 91.0% of aPTT patients reached therapeutic levels with comparable UFH duration and maximum dose. Moreover, in-hospital mortality (2.5% vs. 3.1%, p = 0.842), major bleeding events (4.2% vs. 9.2%, p = 0.360), and thromboembolic events (21.8% vs. 10.1%, p = 0.151) did not differ significantly between the FXa and aPTT cohorts. There was a high degree of variability in FXa (r2 = 0.20) and aPTT (r2 = 0.22) values for any given UFH dose. CONCLUSIONS: No differences in the frequency of bleeding or thromboembolic events were observed between the FXa and aPTT cohorts after HM3 implantation. More longitudinal studies are warranted to determine whether one assay is superior to the other.
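As a sketch of the inverse probability treatment weighting step described in the methods: a propensity model predicts assay assignment from baseline covariates, and stabilized weights are used to balance the cohorts before comparing outcomes. All variable names and data below are hypothetical.

```python
# Hedged sketch of stabilized IPTW weights; data are simulated.
import numpy as np
import pandas as pd
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 112
df = pd.DataFrame({
    "fxa": rng.integers(0, 2, n),            # 1 = anti-Xa-guided, 0 = aPTT
    "intermacs_1_2": rng.integers(0, 2, n),  # baseline covariates (assumed)
    "cad": rng.integers(0, 2, n),
    "egfr": rng.normal(60, 15, n),
})
covars = df[["intermacs_1_2", "cad", "egfr"]]
ps = LogisticRegression().fit(covars, df["fxa"]).predict_proba(covars)[:, 1]

p_treat = df["fxa"].mean()                   # marginal treatment prevalence
weights = np.where(df["fxa"] == 1, p_treat / ps, (1 - p_treat) / (1 - ps))
print(weights[:5])  # stabilized weights for downstream outcome comparisons
```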
ABSTRACT
This paper introduces a forensic psychiatry database established in Japan and discusses its significance and future issues. The purpose of this Database, created under the Medical Treatment and Supervision Act (MTSA) Database Project, is to improve the quality of forensic psychiatry treatment. It collects monthly data on "basic information," "orders and hospitalizations under the MTSA," "treatment process," "criminal and medical treatment history," and "problematic behavior in the unit." The online system has accumulated data on more than 8,000 items in 24 broad categories. Medical data are exported from the medical care assisting systems of 32 designated inpatient facilities in XML format and then saved on USB memory sticks. The files are imported into the Database system client, which sends the data to the Database server via a virtual private network. This system minimizes errors and efficiently imports patient data. One limitation is that items needed to analyze everyday clinical problems are difficult to incorporate into the database system, because such items tend to change over time. By evaluating the effectiveness of the Database and collecting appropriate data, the project is expected to disseminate a wide range of knowledge that will contribute to the future development of mental health and welfare care.
Subject(s)
Mental Health Services , Humans , Forensic Psychiatry , Hospitalization , Japan , Online Systems
ABSTRACT
Combined heart-liver transplantation (CHLT) is a rarely though increasingly performed procedure with evolving indications. Despite CHLT being performed at only a handful of centers, the use of intraoperative mechanical circulatory support to optimize hemodynamics and facilitate dual-organ transplantation varies widely. At our center, we liberally utilize veno-arterial extracorporeal membrane oxygenation (V-A ECMO) when a veno-venous shunt is anticipated to be insufficient in mitigating the hemodynamic perturbations associated with liver reperfusion. In this series, we describe our experience with V-A ECMO in sequential (heart-first) CHLT and demonstrate highly favorable outcomes with this strategy.
ABSTRACT
OBJECTIVES: Veno-arterial extracorporeal life support (V-A ECLS) is increasingly being utilized for postcardiotomy shock (PCS), though data describing the relationship between the type of indexed operation and outcomes are limited. This study compared V-A ECLS outcomes across four major cardiovascular surgical procedures. METHODS: This was a single-center retrospective study of patients who required V-A ECLS for PCS between 2015 and 2022. Patients were stratified by the type of indexed operation, which included aortic surgery (AoS), coronary artery bypass grafting (CABG), valve surgery (Valve), and combined CABG and valve surgery (CABG + Valve). Factors associated with postoperative outcomes were assessed using logistic regression. RESULTS: Among 149 PCS patients who received V-A ECLS, there were 35 AoS patients (23.5%), 29 (19.5%) CABG patients, 59 (39.6%) Valve patients, and 26 (17.4%) CABG + Valve patients. Cardiopulmonary bypass times were longest in the AoS group (p < 0.01). Regarding causes of PCS, AoS patients had a greater incidence of ventricular failure, while the CABG group had a higher incidence of ventricular arrhythmia (p = 0.04). Left ventricular venting was most frequently utilized in the Valve group (p = 0.07). In-hospital mortality was worst among CABG + Valve patients (p < 0.01), and the incidence of acute kidney injury was highest in the AoS group (p = 0.03). In multivariable logistic regression, CABG + Valve surgery (odds ratio [OR] 4.20; 95% confidence interval [CI] 1.30-13.6; p = 0.02) and lactate level at ECLS initiation (OR 1.17; 95% CI 1.06-1.29; p < 0.01) were independently associated with mortality. CONCLUSIONS: We demonstrate that indications, management, and outcomes of V-A ECLS for PCS vary by the type of indexed cardiovascular surgery.
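A hedged sketch of the multivariable logistic regression pattern reported above (in-hospital mortality regressed on operation type and lactate at ECLS initiation), with odds ratios recovered by exponentiating the fitted coefficients. The data are simulated, so the outputs are illustrative only.

```python
# Hedged sketch of a multivariable logistic model; data are simulated.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n = 149
df = pd.DataFrame({
    "died": rng.integers(0, 2, n),         # in-hospital mortality (simulated)
    "cabg_valve": rng.integers(0, 2, n),   # combined CABG + valve surgery
    "lactate": rng.gamma(2.0, 2.5, n),     # mmol/L at ECLS initiation
})
fit = smf.logit("died ~ cabg_valve + lactate", data=df).fit(disp=False)
odds_ratios = np.exp(fit.params)           # exponentiated coefficients = ORs
print(pd.concat([odds_ratios, np.exp(fit.conf_int())], axis=1))
```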
ABSTRACT
RATIONALE & OBJECTIVE: The clinical implications of the discrepancy between cystatin C (cysC)- and serum creatinine (Scr)-estimated glomerular filtration rate (eGFR) in patients with heart failure (HF) and reduced ejection fraction (HFrEF) are unknown. STUDY DESIGN: Post-hoc analysis of randomized trial data. SETTING & PARTICIPANTS: 1,970 patients with HFrEF enrolled in PARADIGM-HF with available baseline cysC and Scr measurements. EXPOSURE: Intraindividual differences between eGFR based on cysC (eGFRcysC) and Scr (eGFRScr; eGFRdiffcysC-Scr). OUTCOMES: Clinical outcomes included the PARADIGM-HF primary end point (composite of cardiovascular [CV] mortality or HF hospitalization), CV mortality, all-cause mortality, and worsening kidney function. We also examined poor health-related quality of life (HRQoL), frailty, and worsening HF (WHF), defined as HF hospitalization, emergency department visit, or outpatient intensification of therapy between baseline and 8-month follow-up. ANALYTICAL APPROACH: Fine-Gray subdistribution hazard models and Cox proportional hazards models were used to regress clinical outcomes on baseline eGFRdiffcysC-Scr. Logistic regression was used to investigate the association of baseline eGFRdiffcysC-Scr with poor HRQoL and frailty. Linear regression models were used to assess the association of WHF with eGFRcysC, eGFRScr, and eGFRdiffcysC-Scr at 8-month follow-up. RESULTS: Baseline eGFRdiffcysC-Scr was higher than +10 and lower than -10 mL/min/1.73 m2 in 13.0% and 35.7% of patients, respectively. More negative values of eGFRdiffcysC-Scr were associated with worse outcomes ([sub]hazard ratio per standard deviation: PARADIGM-HF primary end point, 1.18; P=0.008; CV mortality, 1.34; P=0.001; all-cause mortality, 1.39; P<0.001; worsening kidney function, 1.31; P=0.05). For a 1-standard-deviation decrease in eGFRdiffcysC-Scr, the prevalences of poor HRQoL and frailty increased by 29% and 17%, respectively (P≤0.008). WHF was associated with a more pronounced decrease in eGFRcysC than in eGFRScr, resulting in a change in 8-month eGFRdiffcysC-Scr of -4.67 mL/min/1.73 m2 (P<0.001). LIMITATIONS: Lack of gold-standard assessment of kidney function. CONCLUSIONS: In patients with HFrEF, discrepancies between eGFRcysC and eGFRScr are common and are associated with clinical outcomes, HRQoL, and frailty. The decline in kidney function associated with WHF is more marked when assessed with eGFRcysC than with eGFRScr. PLAIN-LANGUAGE SUMMARY: Kidney function assessment traditionally relies on serum creatinine (Scr) to establish an estimated glomerular filtration rate (eGFR). However, this has been challenged with the introduction of an alternative marker, cystatin C (cysC). Muscle mass and nutritional status have differential effects on eGFR based on cysC (eGFRcysC) and Scr (eGFRScr). Among ambulatory patients with heart failure enrolled in PARADIGM-HF, we investigated the clinical significance of the difference between eGFRcysC and eGFRScr. More negative values (ie, eGFRScr > eGFRcysC) were associated with worse clinical outcomes (including mortality), poor quality of life, and frailty. In patients with progressive heart failure, which is characterized by muscle loss and poor nutritional status, the decline in kidney function was more pronounced when eGFR was estimated using cysC rather than Scr.
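A minimal sketch of the exposure construction described above: the within-patient difference eGFRcysC minus eGFRScr, categorized at the ±10 mL/min/1.73 m2 cutpoints reported in the results. The values are hypothetical.

```python
# Sketch of the eGFRdiff exposure and its categorization; values hypothetical.
import pandas as pd

df = pd.DataFrame({"egfr_cysc": [48.0, 62.0, 71.0],
                   "egfr_scr": [65.0, 60.0, 55.0]})
df["egfr_diff"] = df["egfr_cysc"] - df["egfr_scr"]  # cysC-based minus Scr-based
df["diff_group"] = pd.cut(
    df["egfr_diff"],
    bins=[-float("inf"), -10, 10, float("inf")],
    labels=["< -10", "-10 to +10", "> +10"])
print(df)
```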
ABSTRACT
INTRODUCTION: Venoarterial extracorporeal membrane oxygenation (VA-ECMO) is a prevailing option for the management of severe early graft dysfunction. This systematic review and individual patient data (IPD) meta-analysis aims to evaluate (1) mortality, (2) rates of major complications, (3) prognostic factors, and (4) the effect of different VA-ECMO strategies on outcomes in adult heart transplant (HT) recipients supported with VA-ECMO. METHODS AND RESULTS: We conducted a systematic search and included studies of adults (≥18 years) who received VA-ECMO during their index hospitalization after HT and reported on mortality at any timepoint. We pooled data using random effects models. To identify prognostic factors, we analysed IPD using mixed effects logistic regression. We assessed the certainty in the evidence using the GRADE framework. We included 49 observational studies of 1477 patients who received VA-ECMO after HT, of which 15 studies provided IPD for 448 patients. There were no differences in mortality estimates between IPD and non-IPD studies. The short-term (30-day/in-hospital) mortality estimate was 33% (moderate certainty, 95% confidence interval [CI] 28%-39%) and the 1-year mortality estimate was 50% (moderate certainty, 95% CI 43%-57%). Recipient age (odds ratio [OR] 1.02, 95% CI 1.01-1.04) and prior sternotomy (OR 1.57, 95% CI 0.99-2.49) are associated with increased short-term mortality. There is low-certainty evidence that early intraoperative cannulation and peripheral cannulation reduce the risk of short-term death. CONCLUSIONS: One-third of patients who receive VA-ECMO for early graft dysfunction do not survive 30 days or to hospital discharge, and one-half do not survive to 1 year after HT. Improving outcomes will require ongoing research focused on optimizing VA-ECMO strategies and care in the first year after HT.
Subject(s)
Extracorporeal Membrane Oxygenation , Heart Failure , Heart Transplantation , Adult , Humans , Extracorporeal Membrane Oxygenation/methods , Heart Transplantation/adverse effects , Hospital Mortality , Patient Discharge , Retrospective Studies
ABSTRACT
BACKGROUND: Patients requiring femoral venoarterial (VA) extracorporeal life support (ECLS) are at risk of distal lower limb hypoperfusion and ischemia of the cannulated leg. In the present study, we evaluated the effect of using continuous noninvasive lower limb oximetry with near-infrared reflectance spectroscopy (NIRS) to detect tissue hypoxia and guide distal perfusion catheter (DPC) placement on the rates of leg ischemia requiring surgical intervention. METHODS: We performed a retrospective analysis of patients who had undergone femoral VA-ECLS at our institution from 2010 to 2014 (pre-NIRS era) and 2017 to 2021 (NIRS era). Patients who had undergone cannulation during the 2015 to 2016 transition era were excluded. The baseline characteristics, short-term outcomes, and ischemic complications requiring surgical intervention (eg, fasciotomy, thrombectomy, amputation, exploration) were compared across the two cohorts. RESULTS: Of the 490 patients included in the present study, 141 (28.8%) and 349 (71.2%) had undergone cannulation before and after the routine use of NIRS to direct DPC placement, respectively. The patients in the NIRS cohort had a greater incidence of hyperlipidemia (53.7% vs 41.1%; P = .015) and hypertension (71.4% vs 60%; P = .020) at baseline, although they were less likely to have been supported with an intra-aortic balloon pump before ECLS cannulation (26.9% vs 37.6%; P = .026). These patients were also more likely to have experienced cardiac arrest (22.9% vs 7.8%; P ≤ .001) or a pulmonary cause (5.2% vs 0.7%; P = .04) as the indication for ECLS, with ECLS initiated less often for acute myocardial infarction (15.8% vs 34%; P ≤ .001). The patients in the NIRS cohort had a smaller arterial cannula size (P ≤ .001) and a longer duration of ECLS support (5 vs 3.25 days; P ≤ .001) but significantly lower rates of surgical intervention for limb ischemia (2.6% vs 8.5%; P = .007) despite comparable rates of DPC placement (49.1% vs 44.7%; P = .427), with only two patients (1.1%) not identified by NIRS ultimately requiring surgical intervention. CONCLUSIONS: The use of a smaller arterial cannula (≤15F) and continuous NIRS monitoring to guide selective insertion of DPCs could be a valid and effective strategy associated with a reduced incidence of ischemic events requiring surgical intervention.