ABSTRACT
BACKGROUND: It has been suggested that the disparity in outcomes between studies of transcatheter edge-to-edge repair (TEER) for functional mitral regurgitation (FMR) in heart failure with reduced ejection fraction (HFrEF) could reflect systematic differences in the populations studied. One proposal distinguishes 2 broad groups: those with proportionate FMR, who respond less favorably, and those in whom the FMR is greater than expected (disproportionate FMR), in whom TEER seems to be more effective. Whether this grouping is relevant for other percutaneous interventions for FMR is unknown. OBJECTIVES: We sought to compare clinical and echocardiographic outcomes of patients with HFrEF and proportionate or disproportionate FMR treated with indirect annuloplasty using the Carillon device. METHODS: This is a pooled analysis of 3 trials of patients with FMR. Key eligibility criteria in these trials specified persistent grade 2+ to 4+ FMR with left ventricular (LV) end-diastolic diameter (LVEDD) >5.5 cm and reduced ejection fraction. Patients with an effective regurgitant orifice area/LV end-diastolic volume (EROA/LVEDV) ratio under 0.15 were assigned to the proportionate FMR group (n = 74; 65%) and those with a ratio above 0.15 were classed as having disproportionate FMR (n = 39; 35%). RESULTS: At 12 months following treatment, both groups showed improvements in all MR variables, including regurgitant volume, EROA, and vena contracta. Moreover, patients with proportionate MR showed clinically relevant and statistically significant improvements in LV volumes and diameters. There was no independent relationship between the degree of proportionality as a continuous variable and the remodeling response to Carillon therapy (change in LVEDV r = 0.17; change in LVESV r = 0.14). CONCLUSION: Percutaneous mitral annuloplasty with the Carillon device reduces MR in patients with both proportionate and disproportionate FMR, and also results in LV reverse remodeling in those with proportionate FMR. The effect on remodeling remains to be verified in a large-scale trial.
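For illustration, the grouping rule above reduces to a single threshold check. This is a minimal sketch, assuming EROA is expressed in mm² and LVEDV in mL (units the abstract does not state) and that a ratio of exactly 0.15 is treated as disproportionate:

```python
def classify_fmr(eroa_mm2: float, lvedv_ml: float) -> str:
    """Assign the proportionality group used in the pooled analysis.

    Ratio below 0.15 -> proportionate FMR; otherwise disproportionate.
    Units and the handling of a ratio of exactly 0.15 are assumptions.
    """
    ratio = eroa_mm2 / lvedv_ml
    return "proportionate" if ratio < 0.15 else "disproportionate"

# Example: EROA 0.30 cm^2 (30 mm^2) with LVEDV 250 mL gives 0.12 -> proportionate.
print(classify_fmr(30, 250))
```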
ABSTRACT
BACKGROUND: Insertable cardiac monitors (ICMs) are a clinically effective means of detecting atrial fibrillation (AF) in high-risk patients and of guiding the initiation of non-vitamin K oral anticoagulants (NOACs). Their cost-effectiveness from a US payer perspective is not yet known. The objective of this study was to evaluate the cost-effectiveness of ICMs compared with standard of care (SoC) for detecting AF in US patients at high risk of stroke (CHADS2 ≥ 2). METHODS: Using patient data from the REVEAL AF trial (n = 393, average CHADS2 score = 2.9), a Markov model estimated the lifetime costs and benefits of detecting AF with an ICM or with SoC (specifically, intermittent use of electrocardiograms and 24-h Holter monitors). Ischemic and hemorrhagic strokes, intra- and extra-cranial hemorrhages, and minor bleeds were modelled. Diagnostic and device costs, costs of treating stroke and bleeding events, and costs of medical therapy (specifically, NOACs) were included. Costs and health outcomes, measured as quality-adjusted life years (QALYs), were discounted at 3% per annum, in line with standard practice in the US setting. One-way deterministic and probabilistic sensitivity analyses (PSA) were undertaken. RESULTS: Lifetime per-patient cost for ICMs was $31,116 versus $25,330 for SoC. ICMs generated a total of 7.75 QALYs versus 7.59 for SoC, with 34 fewer strokes projected per 1000 patients. The model estimates a number needed to treat of 29 per stroke avoided. The incremental cost-effectiveness ratio was $35,528 per QALY gained. ICMs were cost-effective in 75% of PSA simulations using a $50,000 per QALY threshold, with a 100% probability of being cost-effective at a willingness-to-pay threshold of $150,000 per QALY. CONCLUSIONS: The use of ICMs to identify AF in a high-risk population is likely to be cost-effective in the US healthcare setting.
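As a worked check of the headline figures: the incremental cost-effectiveness ratio (ICER) is incremental cost divided by incremental QALYs, and the number needed to treat follows from the projected stroke reduction. Recomputing from the rounded values in the abstract gives a slightly higher ICER than the published $35,528, the gap reflecting rounding of the reported QALYs:

```python
# Figures taken directly from the abstract.
cost_icm, cost_soc = 31_116, 25_330   # lifetime per-patient cost, USD
qaly_icm, qaly_soc = 7.75, 7.59       # lifetime QALYs per patient

icer = (cost_icm - cost_soc) / (qaly_icm - qaly_soc)
print(f"ICER ~ ${icer:,.0f}/QALY")    # ~$36,163 from rounded inputs

strokes_avoided_per_1000 = 34
nnt = 1000 / strokes_avoided_per_1000
print(f"NNT ~ {nnt:.0f}")             # ~29, matching the reported value
```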
Subject(s)
Atrial Fibrillation , Humans , Administration, Oral , Anticoagulants/administration & dosage , Atrial Fibrillation/diagnosis , Atrial Fibrillation/drug therapy , Cost-Benefit Analysis , Hemorrhage , Quality-Adjusted Life Years , Stroke , Clinical Trials as Topic
ABSTRACT
INTRODUCTION: Older patients may be less likely to receive cardiac resynchronisation therapy (CRT) for the management of heart failure. We aimed to describe differences in clinical response, complications, and subsequent outcomes following CRT implantation in older compared with younger patients. METHODS: We conducted a retrospective cohort study of unselected, consecutive patients implanted with CRT devices between March 2008 and July 2017. We recorded complications, symptomatic and echocardiographic response, hospitalisation for heart failure, and all-cause mortality, comparing patients aged <70, 70-79, and ≥80 years. RESULTS: Five hundred and seventy-four patients (median age 76 years [interquartile range 68-81], 73.3% male) received CRT. At baseline, patients aged ≥80 years had worse symptoms, were more likely to have co-morbidities, and were less likely to be receiving comprehensive medical therapy, although left ventricular function was similar. Older patients were less likely to receive CRT-defibrillators than CRT-pacemakers. Complications were infrequent and no more common in older patients. Age was not a predictor of symptomatic or echocardiographic response to CRT (67.2%, 71.2%, and 62.6% responders in patients aged <70, 70-79, and ≥80 years, respectively; P = 0.43), and time to first heart failure hospitalisation was similar across age groups (P = 0.28). Ten-year survival was lower for older patients (49.9%, 23.9%, and 6.8% in patients aged <70, 70-79, and ≥80 years, respectively; P < 0.001). CONCLUSIONS: The benefits of CRT on symptoms and left ventricular function were no different in older patients despite a greater burden of co-morbidities and less optimal medical therapy. These findings support the use of CRT in an ageing population.
Subject(s)
Cardiac Resynchronization Therapy , Heart Failure , Humans , Male , Aged , Female , Retrospective Studies , Treatment Outcome , Cardiac Resynchronization Therapy/adverse effects , Heart Failure/diagnosis , Heart Failure/therapy , Ventricular Function, Left
ABSTRACT
BACKGROUND: Many diseases are associated with chronic inflammation, resulting in widening application of anti-inflammatory therapies. Although they are effective as disease-modifying agents, these anti-inflammatory therapies increase the risk of serious infection; however, it remains unknown whether chronic systemic inflammation per se is also associated with fatal infection. METHODS: Using serum C-reactive protein (CRP) data from 461 052 UK Biobank participants, we defined incidence rate ratios (IRRs) for death from infection, cardiovascular disease, or other causes and adjusted for comorbidities and the use of anti-inflammatory therapies. RESULTS: Systemic inflammation, defined as CRP ≥2 mg/L, was common in all comorbidities considered. After adjusting for confounding factors, systemic inflammation was associated with a higher IRR point estimate for infection death (1.70; 95% confidence interval [CI], 1.51-1.92) than cardiovascular (1.48; CI, 1.40-1.57) or other death (1.41; CI, 1.37-1.45), although CIs overlapped. C-reactive protein thresholds of ≥5 and ≥10 mg/L yielded similar findings, as did analyses in people with ≥2, but not <2, comorbidities. CONCLUSIONS: Systemic inflammation per se identifies people at increased risk of infection death, potentially contributing to the observed risks of anti-inflammatory therapies in clinical trials. In future clinical trials of anti-inflammatory therapies, researchers should carefully consider risks and benefits in target populations, guided by research into mechanisms of infection risk.
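A minimal sketch of the unadjusted incidence rate ratio underlying these estimates. The published IRRs are additionally adjusted for comorbidities and anti-inflammatory therapy, which this illustrative function does not reproduce:

```python
def incidence_rate_ratio(deaths_inflamed: int, pyears_inflamed: float,
                         deaths_reference: int, pyears_reference: float) -> float:
    """IRR comparing the systemic-inflammation group (e.g., CRP >= 2 mg/L)
    against the reference group: the ratio of deaths per person-year."""
    rate_inflamed = deaths_inflamed / pyears_inflamed
    rate_reference = deaths_reference / pyears_reference
    return rate_inflamed / rate_reference

# Hypothetical counts for illustration only (not UK Biobank data).
print(round(incidence_rate_ratio(170, 100_000, 100, 100_000), 2))  # -> 1.7
```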
Subject(s)
C-Reactive Protein , Cardiovascular Diseases , Anti-Inflammatory Agents , Cohort Studies , Humans , Inflammation
ABSTRACT
PURPOSE OF REVIEW: The distinction between 'acute' and 'chronic' heart failure persists. Our review aims to explore whether reclassifying heart failure decompensation more accurately as an event within the natural history of chronic heart failure has the potential to improve outcomes. RECENT FINDINGS: Although hospitalisation for worsening heart failure confers a poor prognosis, much of this reflects chronic disease severity. Most patients survive hospitalisation with most deaths occurring in the post-discharge 'vulnerable phase'. Current evidence supports four classes of medications proven to reduce cardiovascular mortality for those who have heart failure with a reduced ejection fraction, with recent trials suggesting worsening heart failure events are opportunities to optimise these therapies. Abandoning the term 'acute heart failure' has the potential to give greater priority to initiating proven pharmacological and device therapies during decompensation episodes, in order to improve outcomes for those who are at the greatest risk.
Subject(s)
Heart Failure , Humans , Heart Failure/drug therapy , Aftercare , Patient Discharge , Hospitalization , Chronic Disease , Stroke Volume
ABSTRACT
BACKGROUND: The evidence base for the benefits of β-blockers in heart failure with reduced ejection fraction (HFrEF) suggests that higher doses are associated with better outcomes. OBJECTIVES: The aim of this study was to report the proportion of patients receiving optimized doses of β-blockers, outcomes, and factors associated with suboptimal dosing. METHODS: This was a prospective cohort study of 390 patients with HFrEF undergoing clinical and echocardiographic assessment at baseline and at 1 year. RESULTS: Two hundred thirty-seven patients (61%) were receiving optimized doses (≥5 mg/d bisoprolol equivalent), 72 (18%) could not be up-titrated (because of heart rate <60 beats/min or systolic blood pressure <100 mm Hg), and the remaining 81 (21%) should have been. Survival was similarly reduced in those who could not be up-titrated and in those who should have been receiving ≥5 mg/d, and patient factors did not explain the failure to attain optimized dosing. CONCLUSIONS: Many patients with HFrEF are not receiving optimal doses of β-blockers, and in around half of these there was no clear contraindication in terms of heart rate or blood pressure.
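The three dosing groups reduce to a simple triage rule, sketched below; boundary handling (e.g., a heart rate of exactly 60 beats/min) is an assumption the abstract does not specify:

```python
def dosing_group(bisoprolol_equiv_mg_day: float,
                 heart_rate_bpm: float,
                 systolic_bp_mmhg: float) -> str:
    """Classify a patient into the study's three beta-blocker dosing groups."""
    if bisoprolol_equiv_mg_day >= 5:
        return "optimized"                 # 61% of the cohort
    if heart_rate_bpm < 60 or systolic_bp_mmhg < 100:
        return "cannot be up-titrated"     # 18% of the cohort
    return "should have been up-titrated"  # 21% of the cohort

print(dosing_group(2.5, 72, 118))  # -> "should have been up-titrated"
```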
Subject(s)
Heart Failure , Humans , Bisoprolol/therapeutic use , Stroke Volume/physiology , Prospective Studies , Adrenergic beta-Antagonists/therapeutic use , Chronic Disease
ABSTRACT
BACKGROUND: Heart failure with reduced ejection fraction (HFrEF) is characterized by blunting of the positive relationship between heart rate and left ventricular (LV) contractility known as the force-frequency relationship (FFR). We have previously described that tailoring the rate-response programming of cardiac implantable electronic devices in patients with HFrEF on the basis of individual noninvasive FFR data acutely improves exercise capacity. We aimed to examine whether using FFR data to tailor heart rate response in patients with HFrEF with cardiac implantable electronic devices favorably influences exercise capacity and LV function 6 months later. METHODS: We conducted a single-center, double-blind, randomized, parallel-group trial in patients with stable symptomatic HFrEF taking optimal guideline-directed medical therapy and with a cardiac implantable electronic device (cardiac resynchronization therapy or implantable cardioverter-defibrillator). Participants were randomized on a 1:1 basis between tailored rate-response programming on the basis of individual FFR data and conventional age-guided rate-response programming. The primary outcome measure was change in walk time on a treadmill walk test. Secondary outcomes included changes in LV systolic function, peak oxygen consumption, and quality of life. RESULTS: We randomized 83 patients with a mean±SD age of 74.6±8.7 years and LV ejection fraction of 35.2±10.5%. Mean change in exercise time at 6 months was 75.4 (95% CI, 23.4 to 127.5) seconds for FFR-guided rate-adaptive pacing and 3.1 (95% CI, -44.1 to 50.3) seconds for conventional settings (analysis of covariance; P=0.044 between groups), despite lower peak mean±SD heart rates (98.6±19.4 versus 112.0±20.3 beats per minute). FFR-guided heart rate settings had no adverse effect on LV structure or function, whereas conventional settings were associated with a reduction in LV ejection fraction. CONCLUSIONS: In this phase II study, FFR-guided rate-response programming determined using a reproducible, noninvasive method appears to improve exercise time and limit changes to LV function in people with HFrEF and cardiac implantable electronic devices. Work is ongoing to confirm our findings in a multicenter setting and on longer-term clinical outcomes. Registration: URL: https://www.clinicaltrials.gov; Unique identifier: NCT02964650.
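The abstract does not spell out the tailoring rule; one plausible sketch of FFR-guided programming is to cap the device's sensor-driven rate at the heart rate where the noninvasively measured contractility peaks (the 'critical heart rate'):

```python
import numpy as np

def tailored_upper_rate(heart_rates_bpm, contractility_index) -> float:
    """Return the paced heart rate at which the measured force-frequency
    relationship (FFR) peaks. Using this as the maximum sensor rate is an
    assumption about the tailoring rule, not the trial's stated protocol."""
    hr = np.asarray(heart_rates_bpm, dtype=float)
    ci = np.asarray(contractility_index, dtype=float)
    return float(hr[np.argmax(ci)])

# Hypothetical FFR curve: contractility rises, then falls above ~100 bpm.
print(tailored_upper_rate([70, 80, 90, 100, 110, 120],
                          [3.1, 3.4, 3.7, 3.9, 3.6, 3.2]))  # -> 100.0
```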
Subject(s)
Cardiac Resynchronization Therapy Devices , Cardiac Resynchronization Therapy , Defibrillators, Implantable , Electric Countershock/instrumentation , Exercise Tolerance , Heart Failure/therapy , Heart Rate , Stroke Volume , Ventricular Function, Left , Aged , Aged, 80 and over , Cardiac Resynchronization Therapy/adverse effects , Double-Blind Method , Electric Countershock/adverse effects , England , Female , Functional Status , Heart Failure/diagnosis , Heart Failure/physiopathology , Humans , Male , Middle Aged , Quality of Life , Recovery of Function , Time Factors , Treatment Outcome , Walk Test
ABSTRACT
OBJECTIVE: The prevalence of obesity is growing globally. Adiposity increases the risk of metabolic syndrome, type 2 diabetes, and cardiovascular disease. Adipose tissue distribution influences systemic metabolism and impacts metabolic disease risk. The link between sexual dimorphisms of adiposity and metabolism is poorly defined. We hypothesise that depot-specific adipose tissue mitochondrial function contributes to the sexual dimorphism of metabolic flexibility in obesity. METHODS: Male and female mice fed high fat diet (HFD) or standard diet (STD) from 8-18 weeks of age underwent whole animal calorimetry and high-resolution mitochondrial respirometry analysis of adipose tissue depots. To assess translatability, we used RT-qPCR to examine expression of key brown adipocyte-associated genes: peroxisome proliferator-activated receptor-γ co-activator 1α (PGC1α), uncoupling protein 1 (UCP1), and cell death-inducing DFFA-like effector A (CIDEA) in brown adipose tissue (BAT) and subcutaneous white adipose tissue (sWAT) of 18-week-old mice, and in sWAT from human volunteers. RESULTS: Male mice exhibited greater weight gain than female mice when challenged with HFD. Relative to increased body mass, the adipose-to-body-weight ratio for BAT and sWAT depots was increased in HFD-fed males compared with HFD-fed females. Oxygen consumption, energy expenditure, respiratory exchange ratio, and food consumption did not differ between males and females fed HFD. BAT mitochondria from obese females showed increased complex I and II respiration and maximal respiration compared with lean females, whereas obese males did not exhibit adaptive mitochondrial BAT respiration. Sexual dimorphism in BAT-associated gene expression in sWAT was also associated with body mass index in humans. CONCLUSIONS: We show that sexual dimorphism in weight gain is reflected in mitochondrial respiration analysis. Female mice have greater metabolic flexibility to adapt to changes in energy intake, regulating energy expenditure through increased complex II and maximal mitochondrial respiration in BAT when HFD challenged, and through increased proton leak in sWAT mitochondria.
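For readers unfamiliar with the calorimetry endpoints, the respiratory exchange ratio and energy expenditure follow directly from measured gas exchange. The abbreviated Weir equation below is an illustrative choice; the paper does not state which formula its calorimetry system used:

```python
def respiratory_exchange_ratio(vco2_ml_min: float, vo2_ml_min: float) -> float:
    """RER = CO2 produced / O2 consumed; ~0.7 on fat, ~1.0 on carbohydrate."""
    return vco2_ml_min / vo2_ml_min

def energy_expenditure_kcal_min(vo2_l_min: float, vco2_l_min: float) -> float:
    """Abbreviated Weir equation (no urinary nitrogen term)."""
    return 3.941 * vo2_l_min + 1.106 * vco2_l_min

print(round(respiratory_exchange_ratio(1.6, 2.0), 2))        # -> 0.8
print(round(energy_expenditure_kcal_min(0.002, 0.0016), 4))  # mouse-scale VO2, kcal/min
```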
Subject(s)
Adipose Tissue , Mitochondria/metabolism , Obesity/metabolism , Sex Characteristics , Adipose Tissue/cytology , Adipose Tissue/metabolism , Animals , Disease Models, Animal , Female , Male , Mice
ABSTRACT
There has been a progressive evolution in the management of patients with chronic heart failure and reduced ejection fraction (HFrEF), including cardiac resynchronisation therapy (CRT) for those who fulfil pre-defined criteria. However, a significant proportion have refractory symptoms and CRT devices are either not clinically indicated or ineffective. Cardiac contractility modulation (CCM) is a novel therapy that delivers non-excitatory electrical impulses to the interventricular septum during the absolute refractory period. Implantation is analogous to a traditional transvenous pacemaker system, but uses two right ventricular leads. Mechanistic studies have shown augmentation of left ventricular contractility and beneficial global effects on reverse remodeling, primarily through alterations in calcium handling. This appears to occur without increasing myocardial oxygen consumption. Data from clinical trials have shown translational improvements in functional capacity and quality of life, though long-term outcome data are lacking. This review explores the rationale, evidence base and limitations of this nascent technology.
Subject(s)
Heart Failure , Ventricular Dysfunction, Left , Heart Failure/therapy , Humans , Myocardial Contraction , Quality of Life , Stroke Volume , Treatment Outcome
ABSTRACT
The coronavirus disease 2019 (COVID-19) pandemic is an unprecedented challenge. Meeting it has required changes to working practices, and the impact on the management of patients with heart failure with reduced ejection fraction (HFrEF) is largely unknown. We performed a retrospective, observational study contrasting patients diagnosed with HFrEF attending specialist heart failure clinics at a UK hospital, whose subsequent period of optimisation of medical therapy occurred during the COVID-19 pandemic, with patients diagnosed the previous year. The primary outcome was the change in equivalent dosing of ramipril and bisoprolol at 6 months. Secondary outcomes were the number and type of follow-up consultations, hospitalisation for heart failure, and all-cause mortality. In total, 60 patients were diagnosed with HFrEF between 1 December 2019 and 30 April 2020, compared with 54 during the same period of the previous year. The absolute number of consultations was numerically higher (390 vs 270; p = 0.69), driven by increases in telephone consultations, with a reduction in appointments with hospital nurse specialists. After 6 months, we observed lower equivalent dosing of ramipril (3.1 ± 3.0 mg vs 4.4 ± 0.5 mg; p = 0.035) and similar dosing of bisoprolol (4.1 ± 0.5 mg vs 4.9 ± 0.5 mg; p = 0.27); after adjustment for baseline dosing, the difference persisted for ramipril (mean difference 1.0 mg, 95% CI 0.018 to 2.09; p = 0.046) but not bisoprolol (mean difference 0.52 mg, 95% CI -0.23 to 1.28; p = 0.17). We observed no differences in the proportion of patients who died (5.0% vs 7.4%; p = 0.59) or were hospitalised with heart failure (13.3% vs 9.3%; p = 0.49). Our study suggests that the transition to telephone appointments and re-deployment of heart failure nurse specialists was associated with less successful optimisation of medical therapy, especially renin-angiotensin system inhibitors, compared with usual care.
Subject(s)
Adrenergic beta-1 Receptor Antagonists/administration & dosage , Angiotensin-Converting Enzyme Inhibitors/administration & dosage , Bisoprolol/administration & dosage , COVID-19 , Heart Failure/drug therapy , Ramipril/administration & dosage , Adrenergic beta-1 Receptor Antagonists/adverse effects , Aged , Angiotensin-Converting Enzyme Inhibitors/adverse effects , Bisoprolol/adverse effects , Chronic Disease , Female , Heart Failure/diagnosis , Heart Failure/mortality , Heart Failure/physiopathology , Humans , Male , Ramipril/adverse effects , Retrospective Studies , Time Factors , Treatment Outcome
ABSTRACT
Cardiac resynchronization therapy (CRT) is one of the most effective therapies for heart failure with reduced ejection fraction and leads to improved quality of life, reductions in heart failure hospitalization rates and all-cause mortality. Nevertheless, up to two-thirds of eligible patients are not referred for CRT. Furthermore, post-implantation follow-up is often fragmented and suboptimal, hampering the potential maximal treatment effect. This joint position statement from three European Society of Cardiology Associations, Heart Failure Association (HFA), European Heart Rhythm Association (EHRA) and European Association of Cardiovascular Imaging (EACVI), focuses on optimized implementation of CRT. We offer theoretical and practical strategies to achieve more comprehensive CRT referral and post-procedural care by focusing on four actionable domains: (i) overcoming CRT under-utilization, (ii) better understanding of pre-implant characteristics, (iii) abandoning the term 'non-response' and replacing this by the concept of disease modification, and (iv) implementing a dedicated post-implant CRT care pathway.
Subject(s)
Cardiac Resynchronization Therapy , Heart Failure , Cardiac Resynchronization Therapy Devices , Heart Failure/diagnosis , Heart Failure/therapy , Humans , Quality of Life , Referral and Consultation , Treatment Outcome
ABSTRACT
OBJECTIVE: Prevention of recurrent stroke in patients with embolic stroke of undetermined source (ESUS) is challenging. The advent of safer anticoagulation in the form of direct oral anticoagulants (DOACs) has prompted exploration of prophylactic anticoagulation for all ESUS patients, rather than anticoagulating only those with documented atrial fibrillation (AF). However, recent trials have failed to demonstrate a clinical benefit while observing increased bleeding. We modeled the economic impact of anticoagulating ESUS patients without documented AF across multiple geographies. METHODS: CRYSTAL-AF trial data were used to assess ischaemic stroke event rates in ESUS patients confirmed AF-free after long-term monitoring. Anticipated bleeding event rates (including both minor and major bleeds) with aspirin, dabigatran 150 mg, and rivaroxaban 20 mg were sourced from published meta-analyses, and a 30% ischaemic stroke reduction for both DOACs was assumed. Cost data for clinical events and pharmaceuticals were collected from the local payer perspective. RESULTS: Compared with aspirin, dabigatran and rivaroxaban resulted in 17.9 and 29.9 additional bleeding events per 100 patients over a patient's lifetime, respectively. Despite incorporating the proposed 30% reduction in ischaemic stroke risk into our model, both DOACs were cost-additive over the patient lifetime, as the costs of bleeding events and pharmaceuticals outweighed the cost savings from the reduction in ischaemic strokes. Depending on the agent prescribed, DOACs added £5953-£7018 (UK), €6683-€7368 (Netherlands), €4933-€9378 (Spain), AUD$5353-6539 (Australia), and $26,768-$32,259 (US) of payer cost per patient. Additionally, in the US, patient pharmacy co-payments ranged from $2468 to $12,844 depending on agent and patient plan. In all settings, cost savings could not be demonstrated even when the modelling assumed 100% protection from recurrent ischaemic strokes, owing to the very low underlying risk of recurrent ischaemic stroke in this population (1.27 per 100 patient-years). CONCLUSIONS: Anticoagulation of non-AF ESUS patients may cause excess bleeds and add substantial costs for uncertain benefits, suggesting a personalised approach to anticoagulation in ESUS patients.
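The cost logic reduces to a per-patient balance of added bleeding and drug costs against averted stroke costs, as in the sketch below; all inputs are placeholders, and the published figures come from the full lifetime model:

```python
def incremental_lifetime_cost(extra_bleeds_per_100: float, cost_per_bleed: float,
                              strokes_avoided_per_100: float, cost_per_stroke: float,
                              extra_drug_cost: float) -> float:
    """Per-patient lifetime cost of a DOAC vs aspirin in non-AF ESUS:
    positive values mean the DOAC is cost-additive."""
    added = extra_bleeds_per_100 / 100 * cost_per_bleed + extra_drug_cost
    saved = strokes_avoided_per_100 / 100 * cost_per_stroke
    return added - saved

# Placeholder inputs (not the paper's): 17.9 extra bleeds/100 patients, and a
# low recurrent-stroke rate means little stroke cost is available to offset.
print(incremental_lifetime_cost(17.9, 5_000, 4.0, 30_000, 6_000))  # -> 5695.0
```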
Subject(s)
Anticoagulants/adverse effects , Anticoagulants/economics , Drug Costs , Embolic Stroke/economics , Embolic Stroke/prevention & control , Hemorrhage/chemically induced , Ischemic Stroke/economics , Ischemic Stroke/prevention & control , Secondary Prevention/economics , Administration, Oral , Anticoagulants/administration & dosage , Aspirin/adverse effects , Aspirin/economics , Clinical Trials as Topic , Cost-Benefit Analysis , Dabigatran/adverse effects , Dabigatran/economics , Embolic Stroke/epidemiology , Humans , Ischemic Stroke/epidemiology , Models, Economic , Recurrence , Retrospective Studies , Risk Assessment , Risk Factors , Rivaroxaban/adverse effects , Rivaroxaban/economics , Time Factors , Treatment Outcome
ABSTRACT
BACKGROUND: Observational studies investigating risk factors in coronavirus disease 2019 (COVID-19) have not considered the confounding effects of advanced care planning, such that a valid picture of risk for elderly, frail and multi-morbid patients is unknown. We aimed to report ceiling of care and cardiopulmonary resuscitation (CPR) decisions and their association with demographic and clinical characteristics, as well as outcomes, during the COVID-19 pandemic. METHODS: Retrospective, observational study conducted between 5th March and 7th May 2020 of all hospitalised patients with COVID-19. Ceiling of care and CPR decisions were documented using the Recommended Summary Plan for Emergency Care and Treatment (ReSPECT) process. Unadjusted and multivariable regression analyses were used to determine factors associated with ceiling of care decisions and death during hospitalisation. RESULTS: A total of 485 patients were included, of whom 409 (84.3%) had a documented ceiling of care; level one for 208 (50.9%), level two for 75 (18.3%) and level three for 126 (30.8%). CPR decisions were documented for 451 (93.0%), of whom 336 (74.5%) were 'not for resuscitation'. Advanced age, frailty, White-European ethnicity, a diagnosis of any co-morbidity and receipt of cardiovascular medications were associated with ceiling of care decisions. In a multivariable model, only advanced age (odds ratio 0.89, 95% CI 0.86-0.93; p < 0.001), frailty (odds ratio 0.48, 95% CI 0.38-0.60; p < 0.001) and the cumulative number of co-morbidities (odds ratio 0.72, 95% CI 0.52-1.0; p = 0.048) were independently associated. Death during hospitalisation was independently associated with age, frailty and requirement for level two or three care. CONCLUSION: Ceiling of care decisions were made for the majority of patients during the COVID-19 pandemic, broadly in line with known predictors of poor outcomes in COVID-19, but with a focus on co-morbidities, suggesting that ICU admission might not be a reliable end-point for observational studies where advanced care planning is routine.
Subject(s)
Advance Care Planning , COVID-19/therapy , Clinical Decision-Making , Adult , Aged , Aged, 80 and over , Cardiopulmonary Resuscitation , Female , Humans , Life Support Care , Male , Middle Aged , Retrospective Studies
ABSTRACT
BACKGROUND: Mitochondrial permeability transition pore (mPTP) opening plays a crucial role in cell death during ischemia-reperfusion injury (IRI). Cyclosporine A (CsA) inhibits mPTP opening. This study aimed to investigate the effects of CsA treatment during cardioplegia on mitochondrial function and cardiac IRI. METHODS: Landrace pigs (52.9 ± 3.7 kg) were subjected to midline sternotomy, cardiopulmonary bypass at 34°C and 90 minutes of cardiac arrest. They received a single shot of either standard 4°C cold histidine-tryptophan-α-ketoglutarate (HTK)-Bretschneider solution (n = 11) or HTK-Bretschneider plus 1.2 mg/L CsA (HTK/CsA; n = 11). During reperfusion, global left-ventricular function was assessed, and myocardial biopsies were harvested at baseline, during ischemia, and 45 minutes following reperfusion. High-resolution respirometry and hydrogen peroxide production were measured. Immunohistochemical staining for apoptosis-inducing factor and hypoxia-inducible factor-1α, as well as a flow cytometry-based JC-1 mitochondrial membrane potential assay, were performed. RESULTS: Hemodynamic parameters were comparable between the groups. Cytochrome C release (HTK: 930.3 ± 804.4 pg/mg, HTK/CsA: 699.7 ± 394.0 pg/mg, p = 0.457) and PGC1α content (HTK: 66.7%, HTK/CsA: 33.3%, p = 0.284) were lower in the HTK/CsA group. Respiratory measurements revealed that oxygen flux under basal respiration was higher in the HTK/CsA group (8.2 ± 1.3 pmol O2·s⁻¹·mg⁻¹ wet weight) than in the HTK group (3.8 ± 1.4 pmol O2·s⁻¹·mg⁻¹ wet weight, p = 0.045). There were no significant differences in histological surrogates of apoptosis and necrosis. CONCLUSIONS: Supplementing cardioplegic solutions with CsA enhances basal mitochondrial respiration, thereby exerting a cardioprotective effect and diminishing IRI-induced damage. CsA seems to preserve mitochondrial function via non-ROS-related pathways.
ABSTRACT
It has been suggested that an application of a conducted electrical weapon (CEW) might cause muscle injury such as rhabdomyolysis and an acute inflammatory response. We explored this hypothesis by testing the effects of electrical weapons on circulating markers of inflammation and muscle damage. In a prospective study, 29 volunteers received a full-trunk 5-s TASER® X26(E) CEW exposure. Venous blood samples were taken before, 5 min after, and 24 h after the discharge. We tested for changes in serum levels of C-reactive protein (CRP), alkaline phosphatase (ALP), myoglobin, albumin, globulin, albumin/globulin ratio, aspartate and alanine aminotransferase, creatine kinase, total protein, bilirubin, and lactate dehydrogenase. Uncorrected CRP and myoglobin levels were lower in the immediate post-exposure period (CRP 1.44 ± 1.39 vs 1.43 ± 1.32 mg/L; p = 0.046, and myoglobin 36.8 ± 11.9 vs 36.1 ± 13.9 µg/L; p = 0.0019), but these changes were not significant after correction for multiple comparisons. There were no changes in other biomarkers. At 24 h, CRP levels had decreased by 30% to 1.01 ± 0.80 mg/L (p = 0.001 from baseline). ALP was unchanged immediately after the CEW application but was reduced by 5% from baseline (66.2 ± 16.1 to 62.7 ± 16.1 IU/L; p = 0.0003) at 24 h. No other biomarkers differed from baseline at 24 h. A full-trunk electrical weapon exposure did not lead to clinically significant changes in acute phase protein levels or in measures of muscle cellular injury. We found no biomarker evidence of rhabdomyolysis.
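The abstract does not state which correction was applied or the number of comparisons in the family; assuming a Bonferroni adjustment across the panel of biomarkers, the check looks like this:

```python
def survives_bonferroni(p_values, alpha: float = 0.05):
    """Flag raw p-values that remain significant after Bonferroni correction.
    The family size (len(p_values)) is an assumption; the study's actual
    correction method and comparison count are not given in the abstract."""
    threshold = alpha / len(p_values)
    return [p < threshold for p in p_values]

raw_p = [0.046, 0.0019] + [0.5] * 11   # 13 markers, one test each (assumed)
print(survives_bonferroni(raw_p))      # [False, True, ...]: under this family
# size the myoglobin change would survive, so the study's family was presumably
# larger (e.g., multiple timepoints) or a different correction was used.
```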
Subject(s)
Conducted Energy Weapon Injuries/complications , Rhabdomyolysis/blood , Adult , Alanine Transaminase/blood , Alkaline Phosphatase/blood , Aspartate Aminotransferases/blood , Bilirubin/blood , Biomarkers/blood , Blood Proteins/analysis , C-Reactive Protein/analysis , Creatine Kinase/blood , Female , Globulins/analysis , Humans , Lactic Acid/blood , Male , Middle Aged , Myoglobin/blood , Prospective Studies , Serum Albumin , Young Adult
ABSTRACT
BACKGROUND: Previous studies of outcomes in people who inject drugs (PWID) with infective endocarditis (IE) have often been retrospective, have had small sample sizes, and have had short follow-up limited to patients who were operated on. METHODS: PWID treated for IE between 1 January 2006 and 31 December 2016 were identified from a prospectively collected database. PWID hospitalized with other infections acted as a novel comparison group. Outcomes were all-cause mortality, cause of death, relapse, recurrence, and reoperation. RESULTS: There were 105 episodes of IE in 92 PWID and 112 episodes of other infections in 107 PWID in whom IE was suspected but rejected. Survival at 30 days for the IE group was 85%, and 30-day survival following surgery was 96%. The most common pathogens were Staphylococcus species (60%) and Streptococcus species (30%). The surgical intervention rate was 47%. Survival for the IE group at 1, 3, 5, and 10 years was 74%, 63%, 58%, and 44%, respectively. This was significantly lower than in the comparator group of other infections in PWID (P = .0002). Mortality was higher in patients who required surgery than in those who did not (hazard ratio, 1.8 [95% confidence interval, .95-3.3]). The commonest cause of death was infection (66%), usually a further episode of IE (55%). CONCLUSIONS: Although early survival was good, long-term life expectancy was low. This was attributable to ongoing infection risk rather than to other factors known to affect prognosis in PWID. Surgery conferred no long-term survival advantage. More efforts are needed to reduce reinfection risk following an episode of IE in PWID.
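The reported survival fractions at 1, 3, 5, and 10 years are Kaplan-Meier-style estimates; a self-contained product-limit estimator over hypothetical follow-up data is sketched below (illustrative only, not the study's cohort):

```python
from collections import Counter

def kaplan_meier(durations, events):
    """Product-limit survival estimates at each observed death time.
    `durations` are years of follow-up; `events` flag death (1) vs censoring (0)."""
    deaths = Counter(t for t, d in zip(durations, events) if d)
    surv, curve = 1.0, []
    for t in sorted(deaths):
        at_risk = sum(1 for u in durations if u >= t)
        surv *= 1 - deaths[t] / at_risk
        curve.append((t, round(surv, 3)))
    return curve

# Hypothetical follow-up: six patients, four deaths, two censored.
print(kaplan_meier([0.2, 1.5, 3.0, 4.8, 7.0, 10.0], [1, 1, 0, 1, 0, 1]))
```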
Subject(s)
Drug Users , Endocarditis, Bacterial , Endocarditis , Substance Abuse, Intravenous , Endocarditis/epidemiology , Endocarditis/surgery , Humans , Retrospective Studies , Substance Abuse, Intravenous/complications
ABSTRACT
INTRODUCTION: The adult congenital heart disease (ACHD) population is rapidly expanding. However, a significant proportion of these patients suffer sudden cardiac death. Recommending implantable cardioverter-defibrillator (ICD) insertion requires balancing the need for appropriate therapy in malignant arrhythmia against the consequences of inappropriate therapy and procedural complications. Here we present long-term follow-up data for ICD insertion in patients with ACHD from a large Level 1 congenital cardiac center. METHODS AND RESULTS: All patients with ACHD undergoing ICD insertion over an 18-year period were identified. Data were extracted for baseline characteristics including demographics, initial diagnosis, ventricular function, relevant medication, and indication for ICD insertion. Details regarding device insertion were gathered, along with follow-up data including appropriate and inappropriate therapy and complications. A total of 136 ICDs were implanted during this period: 79 for primary and 57 for secondary prevention. The most common congenital cardiac conditions in both groups were tetralogy of Fallot and transposition of the great arteries. In the primary prevention group, 22 individuals received appropriate antitachycardia pacing (ATP), 14 underwent appropriate cardioversion, 17 received inappropriate ATP, and 15 received inappropriate cardioversion. In the secondary prevention group, 18 individuals received appropriate ATP, 8 underwent appropriate cardioversion, 8 received inappropriate ATP, and 7 were inappropriately cardioverted. Our data demonstrate low complication rates, particularly with leads without advisories. CONCLUSION: ICD insertion in the ACHD population involves a careful balance of risks and benefits. Our data show that a significant proportion of patients received appropriate therapy, indicating that ICDs were inserted appropriately.
Subject(s)
Defibrillators, Implantable , Heart Defects, Congenital , Transposition of Great Vessels , Adult , Death, Sudden, Cardiac/epidemiology , Death, Sudden, Cardiac/prevention & control , Heart Defects, Congenital/complications , Heart Defects, Congenital/diagnosis , Heart Defects, Congenital/therapy , Humans , Registries
ABSTRACT
AIMS: Implanters of cardiac implantable electronic devices cannot easily choose devices by longevity, because current models usually have only projected longevity data while models with known real-world performance are obsolete. This study examines how projected device longevities are derived, the factors influencing them, and their role in guiding model choice. METHODS AND RESULTS: Ninety-eight implantable cardioverter-defibrillator (ICD) and cardiac resynchronization therapy-defibrillator (CRT-D) models released in Europe in 2007-17 were analysed for reported battery capacities and projected longevities, both for the standardized settings stipulated by the French Haute Autorité de Santé (HAS) and for manufacturer-chosen settings. Battery capacities and HAS projected longevities increased during the study period. Based on current drain estimation, therapy functions consumed only a small portion (2-7%) of the battery energy for single- and dual-chamber ICDs, but up to 50% (from biventricular pacing) for CRT-Ds. Large differences exist between manufacturers and models in both battery capacity and energy consumption. CONCLUSION: Battery capacity is not the sole driver of longevity for implantable electronic cardiac devices; particularly for ICDs, core functions consume a large part of the battery energy even in the absence of therapy. Providing standardized current drain figures in addition to battery capacity would allow more meaningful longevity comparisons among implantable electronic cardiac devices.
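To first order, projected longevity is usable battery capacity divided by mean current drain, as sketched below; real projections also model end-of-service voltage behaviour and therapy delivery, which this sketch ignores:

```python
HOURS_PER_YEAR = 24 * 365.25

def projected_longevity_years(usable_capacity_ah: float,
                              mean_current_drain_ua: float) -> float:
    """First-order longevity estimate: capacity (Ah) over average drain (uA).
    Illustrative only; manufacturer projections use detailed discharge models."""
    capacity_uah = usable_capacity_ah * 1e6          # Ah -> uAh
    return capacity_uah / mean_current_drain_ua / HOURS_PER_YEAR

# E.g., a 1.0 Ah usable capacity at a 10 uA average drain:
print(round(projected_longevity_years(1.0, 10), 1))  # -> 11.4 years
```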
Subject(s)
Cardiac Resynchronization Therapy , Defibrillators, Implantable , Heart Failure , Cardiac Resynchronization Therapy Devices , Device Removal , Electric Countershock , Equipment Design , Equipment Failure , Europe , Heart Failure/diagnosis , Heart Failure/therapy , Humans , Longevity , Retrospective Studies , Time Factors
ABSTRACT
BACKGROUND: Cardiac resynchronisation therapy (CRT) confers symptomatic and survival benefits in chronic heart failure with reduced ejection fraction (HFrEF). There remains a paucity of data on the long-term performance of left ventricular (LV) leads, particularly with newer quadripolar lead designs. METHODS: This single-centre study utilised an electronic, outpatient HFrEF database to identify CRT recipients (2008-2014). The primary endpoint was the temporal trend in LV pacing thresholds during follow-up. Secondary outcomes were complications relating to acute or chronic lead failure and device-related infections. RESULTS: Two hundred eighty patients were included, with a mean (±SD) age of 74.2 (±9.0) years and median follow-up of 7.6 years (interquartile range 4-9). Mean LV threshold was 1.37 V (±0.73) at implant and remained stable over the study period. No differences were observed by lead manufacturer. Compared with non-quadripolar leads (n = 216), quadripolar designs (n = 64) had a lower threshold at 6 months (1.20 vs 1.37 V; P = .04) and at the end of the study period (1.32 vs 1.46 V; P = .04). Patients with HFrEF of ischaemic aetiology had higher thresholds at implant (1.46 vs 1.34 V; P = .05), and this persisted until the end of follow-up (1.49 vs 1.34 V; P = .03). There was a low incidence of acute (0.71%; 2/280) and chronic lead failure (1.79%; 5/280), with four cases (1.43%) of device infection. CONCLUSIONS: LV leads in the context of CRT have excellent chronic stability and low rates of adverse events. Newer quadripolar lead designs have lower thresholds at initial follow-up and in the longer term.