ABSTRACT
BACKGROUND: Although guidelines recommend a low-density lipoprotein cholesterol (LDL-C) level of <70 mg/dL in patients with atherosclerotic cardiovascular disease (ASCVD), the rate of achieving this goal remains suboptimal. We sought to understand real-world contemporary practice patterns of LDL-C management in patients with ASCVD, and whether LDL-C testing influenced management across US health systems. METHODS: A retrospective cohort study utilizing electronic medical record data from five health systems participating in the CardioHealth Alliance was performed on patients with prior ASCVD and an LDL-C measurement in 2021. Multivariable regression modeling was used to determine the relationship of clinical factors with achievement of the guideline-directed LDL-C target. Changes in lipid-lowering therapy (LLT) after LDL-C testing were also described. RESULTS: Among 216,074 patients with ASCVD, 129,886 (60.1%) had uncontrolled LDL-C (i.e., ≥70 mg/dL). Compared with participants with controlled LDL-C (<70 mg/dL), those with uncontrolled LDL-C were more frequently female (50.9% vs. 35.1%) or Black (13.7% vs. 10.3%), and less commonly had coronary artery disease as the form of vascular disease (73.0% vs. 83.5%), heart failure (21.3% vs. 29.1%), diabetes (34.1% vs. 48.2%), atrial fibrillation (19.3% vs. 26.1%), or chronic kidney disease (25.1% vs. 32.2%). In multivariable analyses, the factors most strongly associated with failure to achieve LDL-C control were female sex (RR 1.13 [95% CI 1.12-1.14], p<0.001) and Black race (1.15 [1.14-1.17], p<0.001). Among the 53,957 (41.5%) patients with uncontrolled LDL-C ≥70 mg/dL who were not on LLT at baseline, only 21% were initiated on any LLT within 6 months of the uncontrolled LDL-C value. CONCLUSIONS: Within five diverse large health systems in the CardioHealth Alliance, more than half of the patients with ASCVD had uncontrolled LDL-C, with significant disparities based on sex and race at baseline. The vast majority were not initiated on any lipid-lowering therapy within 6 months of an elevated test result, indicating persistent gaps in care that will likely worsen health inequities in outcomes. This highlights the urgent need for implementation efforts to improve equitable care.
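The abstract reports adjusted relative risks (e.g., RR 1.13 for female sex) but does not specify the regression model. A minimal sketch of one common way to obtain adjusted RRs for a binary outcome, modified Poisson regression with robust standard errors, is shown below; the file and column names are hypothetical.

```python
# Minimal sketch: adjusted relative risks for uncontrolled LDL-C (>=70 mg/dL)
# via modified Poisson regression with robust (sandwich) standard errors.
# The abstract does not specify the exact model; column names are hypothetical.
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

# df columns (hypothetical): uncontrolled (0/1), female, black, cad,
# heart_failure, diabetes, afib, ckd, age
df = pd.read_csv("ascvd_cohort.csv")

model = smf.glm(
    "uncontrolled ~ female + black + cad + heart_failure + diabetes + afib + ckd + age",
    data=df,
    family=sm.families.Poisson(),
).fit(cov_type="HC1")  # robust SEs make the Poisson RRs valid for a binary outcome

rr = np.exp(model.params)        # adjusted relative risks
ci = np.exp(model.conf_int())    # 95% confidence intervals
print(pd.concat([rr.rename("RR"), ci.rename(columns={0: "2.5%", 1: "97.5%"})], axis=1))
```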
ABSTRACT
BACKGROUND: On legacy 2D PET systems utilizing a 50 mL/min Rb-82 profile, test-retest precision of quantitative perfusion is ~10%. It is unclear whether Rb-82 infusion rate significantly impacts quantitative perfusion and/or image quality on modern analog 3D PET-CT systems. We aimed to determine whether the Rb-82 infusion profile significantly impacts test-retest precision of quantitative perfusion, perfusion metrics, and/or image quality on a modern analog 3D PET-CT scanner. METHODS: Ninety-eight volunteers from 3 distinct groups (healthy volunteers [Normals], patients with risk factors and/or coronary disease [Clinicals], and patients with prior transmural myocardial infarction [Infarcts]) underwent cardiac stress testing on an analog 3D PET-CT. Participants received 3 consecutive resting scans and 2 consecutive stress scans, minutes apart, with two randomly assigned Rb-82 infusion profiles: 50 mL/min (fast [F]) and 20 mL/min (slow [S]). Perfusion metrics (resting [rMBF] and stress [sMBF] myocardial blood flow) were calculated using HeartSee software. Coefficients of variation (COV), repeatability coefficients (RC), MBF, and image quality metrics were compared. RESULTS: rMBF correlated well between F and S profiles, with intraclass correlation coefficients (ICC) ranging 0.91-0.93. sMBF was highly correlated between F and S profiles (ICC=0.97). Fast and slow profiles were associated with similar same-day test-retest precision (COV 11.5% vs. 11.3%, p=0.77; RC 21.5% vs. 22.6%, for F-F vs. S-S). There were no clinically significant differences in MBF values between F and S profiles. Image quality metrics were similar between the 2 profiles. CONCLUSIONS: There are no clinically significant differences in precision, perfusion metrics, or image quality between Rb-82 fast and slow infusions using a contemporary analog 3D PET-CT.
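The precision metrics compared here (COV and RC) can be computed from paired same-day scans in a Bland-Altman fashion. The sketch below illustrates one conventional set of formulas; the paper's exact definitions are not given in the abstract, and the numbers are illustrative only.

```python
# Minimal sketch of same-day test-retest precision metrics from paired MBF scans.
# Formulas follow common Bland-Altman conventions; the paper's exact definitions
# are not stated in the abstract, and the array contents are hypothetical.
import numpy as np

scan1 = np.array([0.82, 1.10, 0.95, 1.30])   # MBF, first acquisition (mL/min/g)
scan2 = np.array([0.88, 1.02, 1.01, 1.24])   # MBF, repeat acquisition

diff = scan2 - scan1
mean_pair = (scan1 + scan2) / 2.0

# Within-subject SD from paired differences, then COV and repeatability coefficient
wsd = np.sqrt(np.mean(diff ** 2) / 2.0)
cov = 100.0 * wsd / np.mean(mean_pair)   # coefficient of variation, %
rc = 1.96 * np.sqrt(2.0) * wsd           # repeatability coefficient (~2.77 * wSD)

print(f"COV = {cov:.1f}%  RC = {rc:.2f} mL/min/g")
```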
ABSTRACT
OBJECTIVES: This study evaluated whether a novel standardized heparin dosing protocol used during atrial fibrillation catheter ablation resulted in a higher percentage of therapeutic activated clotting time (ACT) values compared to historic nonstandardized procedures. DESIGN: A retrospective cohort study. SETTING: This study was conducted at Ochsner Medical Center, the largest tertiary-care teaching hospital in New Orleans, LA. PARTICIPANTS: Patients undergoing catheter-based atrial fibrillation ablation. INTERVENTIONS: The authors implemented a standardized heparin protocol and enrolled 202 patients between November 2020 and March 2021. The historic controls consisted of 173 patients who underwent atrial fibrillation ablation between April 2020 and September 2020. Heparin administration in the control group was based on physician preference and was nonstandardized. MEASUREMENTS AND MAIN RESULTS: The primary endpoint was the percentage of intraprocedural ACTs in therapeutic range (≥300 to <450 s). Secondary endpoints included first measured ACT at ≥300 s and percent of measured ACTs in the supratherapeutic range (>450 s). Comparisons were performed using chi-squared tests or Fisher exact tests. Patients in the intervention group had a higher mean percentage of ACTs in the therapeutic range compared to the control group (84.9% vs. 75.8%, p<0.001). More patients in the intervention group reached therapeutic ACT on the first measurement compared to the control group (70.3% vs. 31.2%, p<0.001). CONCLUSION: During catheter-based cardiac ablation procedures, a novel standardized unfractionated heparin dosing protocol resulted in a higher percentage of ACTs in the target range, and a higher proportion of initial ACTs in the therapeutic range, compared with baseline nonstandardized heparin dosing.
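Group proportions were compared with chi-squared or Fisher exact tests. A minimal sketch of that comparison for the first-measurement endpoint is below; the counts are approximated from the reported percentages (70.3% of 202 vs. 31.2% of 173), not taken from the source data.

```python
# Minimal sketch: comparing the proportion of patients reaching therapeutic ACT
# on the first measurement, intervention vs. historic control, with a chi-squared
# test (Fisher exact as a small-cell fallback). Counts approximated from the abstract.
from scipy.stats import chi2_contingency, fisher_exact

#                therapeutic  not therapeutic
table = [[142, 60],    # intervention group (n = 202)
         [54, 119]]    # historic control   (n = 173)

chi2, p, dof, expected = chi2_contingency(table)
if (expected < 5).any():            # small expected counts -> Fisher exact test
    _, p = fisher_exact(table)
print(f"p = {p:.4g}")
```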
Subject(s)
Atrial Fibrillation , Catheter Ablation , Humans , Heparin , Anticoagulants , Atrial Fibrillation/drug therapy , Atrial Fibrillation/surgery , Retrospective Studies , Treatment Outcome , Catheter Ablation/methods
ABSTRACT
INTRODUCTION: Guidelines indicate primary-prevention implantable cardioverter-defibrillators (ICDs) for most patients with left ventricular ejection fraction (LVEF) ≤ 35%. Some patients' LVEFs improve during the life of their first ICD. In patients with recovered LVEF who never received appropriate ICD therapy, the utility of generator replacement upon battery depletion remains unclear. Here, we evaluate ICD therapy based on LVEF at the time of generator change, to inform shared decision-making regarding whether to replace the depleted ICD. METHODS: We followed patients with a primary-prevention ICD who underwent generator change. Patients who received appropriate ICD therapy for ventricular tachycardia or ventricular fibrillation (VT/VF) before generator change were excluded. The primary endpoint was appropriate ICD therapy, adjusted for the competing risk of death. RESULTS: Among 951 generator changes, 423 met inclusion criteria. During 3.4 ± 2.2 years of follow-up, 78 (18%) patients received appropriate therapy for VT/VF. Compared to patients with recovered LVEF > 35% (n = 161 [38%]), those with LVEF ≤ 35% (n = 262 [62%]) were more likely to require ICD therapy (p = .002; Fine-Gray adjusted 5-year event rates: 12.7% vs. 25.0%). Receiver operating characteristic analysis revealed the optimal LVEF cutoff for VT/VF prediction to be 45%, the use of which further improved risk stratification (p < .001), with Fine-Gray adjusted 5-year rates of 6.2% versus 25.1%. CONCLUSION: Following ICD generator change, patients with primary-prevention ICDs and recovered LVEF have a significantly lower risk of subsequent ventricular arrhythmias compared to those with persistent LVEF depression. Risk stratification at LVEF 45% offers significant additional negative predictive value over a 35% cutoff, without a significant loss in sensitivity. These data may be useful during shared decision-making at the time of ICD generator battery depletion.
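The event rates above are adjusted for the competing risk of death (Fine-Gray). A Fine-Gray regression is not sketched here; instead, the snippet below shows the simpler nonparametric Aalen-Johansen estimate of cumulative incidence with death as a competing event, only to illustrate the competing-risk idea. The data set and column names are hypothetical.

```python
# Minimal sketch: cumulative incidence of appropriate ICD therapy with death as a
# competing risk, using the nonparametric Aalen-Johansen estimator (lifelines).
# Event coding (hypothetical): 0 = censored, 1 = appropriate therapy, 2 = death.
import pandas as pd
from lifelines import AalenJohansenFitter

df = pd.read_csv("generator_change_cohort.csv")   # columns: years, event, lvef_le_35

ajf = AalenJohansenFitter()
for label, grp in df.groupby("lvef_le_35"):       # 1 if LVEF <= 35% at generator change
    ajf.fit(grp["years"], grp["event"], event_of_interest=1)
    print(f"LVEF <= 35%: {label}")
    print(ajf.cumulative_density_.tail(1))        # cumulative incidence at end of follow-up
```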
Subject(s)
Defibrillators, Implantable , Tachycardia, Ventricular , Humans , Ventricular Function, Left , Stroke Volume , Arrhythmias, Cardiac/diagnosis , Arrhythmias, Cardiac/therapy , Tachycardia, Ventricular/diagnosis , Tachycardia, Ventricular/therapy , Ventricular Fibrillation/diagnosis , Ventricular Fibrillation/therapy , Death, Sudden, Cardiac/etiology , Death, Sudden, Cardiac/prevention & control , Risk Factors
ABSTRACT
INTRODUCTION: Shared decision making (SDM) may result in treatment plans that best reflect the goals and wishes of patients, increasing patient satisfaction with the decision-making process. There is a knowledge gap to support the use of decision aids in SDM for anticoagulation therapy in patients with atrial fibrillation (AF). We describe the development and testing of a new decision aid, including the multicenter, randomized, controlled, 2-arm, open-label ENHANCE-AF clinical trial (Engaging Patients to Help Achieve Increased Patient Choice and Engagement for AF Stroke Prevention), which will evaluate its effectiveness in 1,200 participants. METHODS: Participants will be randomized to either usual care or to an SDM pathway incorporating a digital tool designed to simplify the complex concepts surrounding AF, in conjunction with a clinician tool and a non-clinician navigator to guide the participants through each step of the tool. The participant-determined primary outcome for this study is the Decisional Conflict Scale, measured at 1 month after the index visit during which a decision was made regarding anticoagulation use. Secondary outcomes at both 1 and 6 months will include other decision-making-related scales as well as participant and clinician satisfaction, oral anticoagulation adherence, and a composite rate of major bleeding, death, stroke, or transient ischemic attack. The study will be conducted at four sites selected for their ability to enroll participants of varying racial and ethnic backgrounds, health literacy, and language skills. Participants will be followed in the study for 6 months. CONCLUSIONS: The results of the ENHANCE-AF trial will determine whether a decision aid facilitates high-quality shared decision making in anticoagulation discussions for stroke reduction in AF. An improved shared decision-making experience may allow patients to make decisions better aligned with their personal values and preferences, while improving overall AF care.
Subject(s)
Atrial Fibrillation , Stroke , Anticoagulants/therapeutic use , Atrial Fibrillation/complications , Atrial Fibrillation/drug therapy , Decision Making, Shared , Humans , Patient Participation , Stroke/complications , Stroke/prevention & control
ABSTRACT
INTRODUCTION: Heart failure (HF) is a major cause of morbidity and mortality, with nearly half of all HF-related deaths resulting from sudden cardiac death (SCD), most often from an arrhythmic event. The pathophysiologic changes that occur in response to the hemodynamic stress of HF may lead to increased arrhythmogenesis. Theoretically, medications that block these arrhythmogenic substrates would decrease the risk of SCD. The combined angiotensin receptor and neprilysin inhibitor (ARNi; sacubitril/valsartan, trade name Entresto) is the newest commercially available medication for the treatment of heart failure. METHODS AND RESULTS: We reviewed and synthesized the available literature regarding sacubitril/valsartan and its effects on cardiac rhythm. ARNi has been shown to decrease cardiovascular mortality and hospitalization in patients with HF with reduced ejection fraction (HFrEF). Emerging evidence suggests that ARNi also may play a role in reducing arrhythmogenesis and thereby SCD. CONCLUSION: This review summarizes the current data regarding ARNi and its potential antiarrhythmic effects.
Subject(s)
Anti-Arrhythmia Agents , Heart Failure , Humans , Anti-Arrhythmia Agents/adverse effects , Heart Failure/diagnosis , Heart Failure/drug therapy , Neprilysin/pharmacology , Neprilysin/therapeutic use , Tetrazoles/adverse effects , Angiotensin Receptor Antagonists/adverse effects , Stroke Volume , Valsartan/pharmacology , Death, Sudden, Cardiac/etiology , Death, Sudden, Cardiac/prevention & control , Arrhythmias, Cardiac/diagnosis , Arrhythmias, Cardiac/drug therapy , Treatment Outcome
ABSTRACT
BACKGROUND: Despite the high rate of sudden death after myocardial infarction among patients with a low ejection fraction, implantable cardioverter-defibrillators are contraindicated until 40 to 90 days after myocardial infarction. Whether a wearable cardioverter-defibrillator would reduce the incidence of sudden death during this high-risk period is unclear. METHODS: We randomly assigned (in a 2:1 ratio) patients with acute myocardial infarction and an ejection fraction of 35% or less to receive a wearable cardioverter-defibrillator plus guideline-directed therapy (the device group) or to receive only guideline-directed therapy (the control group). The primary outcome was the composite of sudden death or death from ventricular tachyarrhythmia at 90 days (arrhythmic death). Secondary outcomes included death from any cause and nonarrhythmic death. RESULTS: Of 2302 participants, 1524 were randomly assigned to the device group and 778 to the control group. Participants in the device group wore the device for a median of 18.0 hours per day (interquartile range, 3.8 to 22.7). Arrhythmic death occurred in 1.6% of the participants in the device group and in 2.4% of those in the control group (relative risk, 0.67; 95% confidence interval [CI], 0.37 to 1.21; P=0.18). Death from any cause occurred in 3.1% of the participants in the device group and in 4.9% of those in the control group (relative risk, 0.64; 95% CI, 0.43 to 0.98; uncorrected P=0.04), and nonarrhythmic death in 1.4% and 2.2%, respectively (relative risk, 0.63; 95% CI, 0.33 to 1.19; uncorrected P=0.15). Of the 48 participants in the device group who died, 12 were wearing the device at the time of death. A total of 20 participants in the device group (1.3%) received an appropriate shock, and 9 (0.6%) received an inappropriate shock. CONCLUSIONS: Among patients with a recent myocardial infarction and an ejection fraction of 35% or less, the wearable cardioverter-defibrillator did not lead to a significantly lower rate of the primary outcome of arrhythmic death than control. (Funded by the National Institutes of Health and Zoll Medical; VEST ClinicalTrials.gov number, NCT01446965.).
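The relative risks quoted above can be reproduced from group sizes and event counts with a standard log-scale Wald interval. The sketch below uses the 48 device-group deaths reported in the abstract and approximates the control-group deaths from the 4.9% figure.

```python
# Minimal sketch: relative risk with a Wald 95% CI on the log scale, as commonly
# reported for trial outcomes. Control-group deaths are approximated from the
# abstract's percentages (death from any cause: 48/1524 vs. ~4.9% of 778).
import math

def relative_risk(a, n1, c, n2):
    """RR of event in group 1 vs. group 2 with a 95% CI (log-scale Wald interval)."""
    rr = (a / n1) / (c / n2)
    se_log = math.sqrt(1 / a - 1 / n1 + 1 / c - 1 / n2)
    lo = math.exp(math.log(rr) - 1.96 * se_log)
    hi = math.exp(math.log(rr) + 1.96 * se_log)
    return rr, lo, hi

# ~ (0.65, 0.43, 0.98), close to the reported RR 0.64 (95% CI, 0.43 to 0.98)
print(relative_risk(48, 1524, 38, 778))
```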
Subject(s)
Death, Sudden, Cardiac/prevention & control , Defibrillators , Myocardial Infarction/therapy , Tachycardia, Ventricular/prevention & control , Wearable Electronic Devices , Aged , Death, Sudden, Cardiac/etiology , Defibrillators/adverse effects , Female , Humans , Male , Middle Aged , Myocardial Infarction/complications , Myocardial Infarction/mortality , Stroke Volume , Tachycardia, Ventricular/mortality , Treatment Outcome , Wearable Electronic Devices/adverse effects
ABSTRACT
PURPOSE OF REVIEW: Cardiomyopathy with underlying left ventricular (LV) dysfunction represents a heterogeneous group of disorders that may coexist with, and/or be secondary to, coronary artery disease (CAD). The purpose of this review is to demonstrate, via case illustrations, the benefits offered by cardiac positron-emission tomography (PET) stress testing with coronary flow capacity (CFC) in the evaluation and treatment of patients with LV dysfunction and CAD. RECENT FINDINGS: CFC, a metric that is increasing in prominence, represents the integration of several absolute perfusion metrics into clinical strata of CAD severity. Our prior work has demonstrated improvement in regional perfusion metrics as a result of revascularization to territories with severe reduction in CFC. Conversely, when CFC is adequate, there is no change in regional perfusion metrics following revascularization, despite angiographically severe stenosis. Furthermore, Gould et al. demonstrated decreased rates of myocardial infarction and death following revascularization of myocardium with severely reduced CFC, with no clinical benefit observed following revascularization of patients with preserved CFC. In a series of cases, we present pre-revascularization and post-revascularization PET scans with perfusion metrics in patients with LV dysfunction and CAD. In these examples, we demonstrate improvement in LV function and perfusion metrics following revascularization only in cases where baseline CFC is severely reduced. PET with CFC offers unique guidance regarding revascularization in patients with reduced LV function and CAD.
Subject(s)
Coronary Artery Disease , Ventricular Dysfunction, Left , Coronary Artery Disease/complications , Coronary Artery Disease/diagnostic imaging , Humans , Positron-Emission Tomography , Tomography, X-Ray Computed , Ventricular Dysfunction, Left/diagnostic imaging , Ventricular Function, Left
ABSTRACT
BACKGROUND: Class 1C antiarrhythmic drugs (AADs) are effective first-line agents for atrial fibrillation (AF) treatment. However, these agents commonly are avoided in patients with known coronary artery disease (CAD), due to known increased risk in the post-myocardial infarction population. Whether 1C AADs are safe in patients with CAD but without clinical ischemia or infarct is unknown. Reduced coronary flow capacity (CFC) on positron emission tomography (PET) reliably identifies myocardial regions supplied by vessels with CAD causing flow limitation. OBJECTIVE: To assess whether treatment with 1C AADs increases mortality in patients without known CAD but with CFC indicating significantly reduced coronary blood flow. METHODS: In this pilot study, we compared patients with AF and left ventricular ejection fraction ≥50% who were treated with 1C AADs to age-matched AF patients without 1C AAD treatment. No patient had clinically evident CAD (ie, reversible perfusion defect, known ≥70% epicardial lesion, percutaneous coronary intervention, coronary artery bypass grafting, or myocardial infarction). All patients had PET-based quantification of stress myocardial blood flow and CFC. Death was assessed by clinical follow-up and social security death index search. RESULTS: A total of 78 patients with 1C AAD exposure were matched to 78 controls. Over a mean follow-up of 2.0 years, the groups had similar survival (P = .54). Among patients with CFC indicating the presence of occult CAD (ie, reduced CFC involving ≥50% of myocardium), 1C-treated patients had survival similar to that of patients not treated with 1C agents (P = .44). CONCLUSIONS: In a limited population of AF patients with preserved left ventricular function and PET-CFC indicating occult CAD, treatment with 1C AADs appears not to increase mortality. A larger study would be required to confidently assess the safety of these drugs in this context.
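Survival was compared between 1C-treated patients and matched controls; the abstract does not name the test behind the quoted P values. The sketch below shows a conventional log-rank comparison with lifelines; the data file and column names are hypothetical.

```python
# Minimal sketch: comparing survival between 1C-AAD-treated patients and matched
# controls with a log-rank test (lifelines). The abstract does not state the exact
# test used; the data frame and column names are hypothetical.
import pandas as pd
from lifelines.statistics import logrank_test

df = pd.read_csv("af_1c_matched_cohort.csv")   # columns: years, died (0/1), group ("1C"/"control")
treated = df[df["group"] == "1C"]
control = df[df["group"] == "control"]

res = logrank_test(
    treated["years"], control["years"],
    event_observed_A=treated["died"], event_observed_B=control["died"],
)
print(f"log-rank p = {res.p_value:.3f}")
```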
Subject(s)
Anti-Arrhythmia Agents/therapeutic use , Atrial Fibrillation/drug therapy , Coronary Artery Disease/diagnostic imaging , Heart Rate/drug effects , Perfusion Imaging , Positron-Emission Tomography , Aged , Anti-Arrhythmia Agents/adverse effects , Anti-Arrhythmia Agents/classification , Atrial Fibrillation/diagnosis , Atrial Fibrillation/mortality , Atrial Fibrillation/physiopathology , Coronary Artery Disease/mortality , Coronary Artery Disease/physiopathology , Coronary Circulation , Female , Humans , Male , Middle Aged , Pilot Projects , Predictive Value of Tests , Retrospective Studies , Risk Assessment , Risk Factors , Treatment Outcome , Ventricular Function, Left
ABSTRACT
BACKGROUND: The Vest Prevention of Early Sudden Death Trial did not demonstrate a significant reduction in arrhythmic death with the wearable cardioverter-defibrillator (WCD), but compliance with the device may have substantially affected the results. The influence of WCD compliance on outcomes has not yet been fully evaluated. METHODS: Using linear and pooled logistic models, we performed as-treated analyses omitting person-time in the hospital and adjusted for correlates of WCD compliance. To assess the impact of early stopping of WCD, we performed a per-protocol Kaplan-Meier analysis, censoring after the last day the WCD was worn. Interactions of potential effect modifiers with treatment assignment and WCD compliance on outcomes were investigated. Finally, we used linear models to identify predictors of WCD compliance. RESULTS: A per-protocol analysis demonstrated a significant reduction in total (P < .001) and arrhythmic (P = .001) mortality. Better WCD compliance was independently predicted by cardiac arrest during the index myocardial infarction (MI), higher creatinine, diabetes, prior heart failure, EF ≤ 25%, enrollment at a Polish center, and number of WCD alarms, while worse compliance was predicted by being divorced, Asian race, higher body mass index, prior percutaneous coronary intervention, or any WCD shock. Neither excluding time in hospital from the as-treated analysis nor adjustment for factors affecting WCD compliance materially changed the results. No variable demonstrated a significant interaction in either the intention-to-treat or as-treated analysis. CONCLUSION: Robust sensitivity analyses of as-treated and per-protocol analyses suggest that the WCD is protective in compliant patients with ejection fraction less than or equal to 35% during the first 3 months post-MI.
Subject(s)
Arrhythmias, Cardiac/therapy , Death, Sudden, Cardiac/prevention & control , Defibrillators , Electric Countershock/instrumentation , Myocardial Infarction/therapy , Patient Compliance , Wearable Electronic Devices , Aged , Arrhythmias, Cardiac/diagnosis , Arrhythmias, Cardiac/etiology , Arrhythmias, Cardiac/mortality , Death, Sudden, Cardiac/etiology , Female , Hospitalization , Humans , Male , Middle Aged , Myocardial Infarction/complications , Myocardial Infarction/diagnosis , Myocardial Infarction/mortality , Protective Factors , Risk Assessment , Risk Factors , Time Factors , Treatment Outcome
ABSTRACT
INTRODUCTION: Sudden cardiac death is a substantial cause of mortality in patients with cardiomyopathy, but evidence supporting implantable cardioverter-defibrillator (ICD) implantation is less robust in nonischemic cardiomyopathy (NICM) than in ischemic cardiomyopathy. Improved risk stratification is needed. We assessed whether absolute quantification of stress myocardial blood flow (sMBF) measured by positron emission tomography (PET) predicts ventricular arrhythmias (VA) and/or death in patients with NICM. METHODS: In this pilot study, we prospectively followed patients with NICM (left ventricular ejection fraction ≤ 35%) and an ICD who underwent cardiac PET stress imaging with sMBF quantification. NICM was defined as the absence of angiographic obstructive coronary stenosis, significant relative perfusion defects on imaging, coronary revascularization, or acute coronary syndrome. Endpoints were appropriate device therapy for VA and all-cause mortality. Subgroup analysis was performed in patients who had no prior history of VA (ie, the primary prevention population). RESULTS: We followed 37 patients (60 ± 14 years, 46% male) for 41 ± 23 months. The median sMBF was 1.56 mL/g/min (interquartile range: 1.00-1.82). Lower sMBF predicted VA, both in the whole population (hazard ratio [HR] for each 0.1 mL/g/min increase: 0.84, P = .015) and in the primary prevention subset (n = 27; HR for each 0.1 mL/g/min increase: 0.81, P = .049). Patients with sMBF below the median had significantly more VA than those above the median, both in the whole population (P = .004) and in the primary prevention subset (P = .046). Estimated 3-year VA rates in the whole population were 67% among low-flow patients vs 13% among high-flow patients, and 39% vs 8%, respectively, among primary-prevention patients. sMBF did not predict all-cause mortality. CONCLUSIONS: In patients with NICM, lower sMBF predicts VA. This relationship may be useful for risk stratification for ventricular arrhythmia and decision making regarding ICD implantation.
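Hazard ratios here are expressed per 0.1 mL/g/min increase in sMBF. The sketch below shows how such a rescaled predictor can be fit in a Cox proportional hazards model with lifelines; whether the authors used exactly this model is not stated, and the file and column names are hypothetical.

```python
# Minimal sketch: hazard of ventricular arrhythmia as a function of stress MBF,
# with the predictor rescaled so that exp(coef) is a hazard ratio per 0.1 mL/g/min,
# as in the abstract. Data frame and column names are hypothetical.
import pandas as pd
from lifelines import CoxPHFitter

df = pd.read_csv("nicm_pet_cohort.csv")       # columns: months, va_event (0/1), smbf
df["smbf_per_0p1"] = df["smbf"] / 0.1         # one unit = 0.1 mL/g/min

cph = CoxPHFitter()
cph.fit(df[["months", "va_event", "smbf_per_0p1"]],
        duration_col="months", event_col="va_event")
cph.print_summary()                           # exp(coef) = HR per 0.1 mL/g/min of sMBF
```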
Subject(s)
Arrhythmias, Cardiac/etiology , Cardiomyopathies/diagnostic imaging , Coronary Circulation , Death, Sudden, Cardiac/etiology , Myocardial Perfusion Imaging , Positron-Emission Tomography , Aged , Arrhythmias, Cardiac/diagnosis , Arrhythmias, Cardiac/mortality , Arrhythmias, Cardiac/prevention & control , Cardiomyopathies/complications , Cardiomyopathies/mortality , Cardiomyopathies/therapy , Clinical Decision-Making , Death, Sudden, Cardiac/prevention & control , Defibrillators, Implantable , Electric Countershock/instrumentation , Female , Humans , Male , Middle Aged , Pilot Projects , Predictive Value of Tests , Progression-Free Survival , Prospective Studies , Risk Assessment , Risk Factors , Stroke Volume , Time Factors , Ventricular Function, Left
ABSTRACT
AIMS: Evidence links markers of systemic inflammation and heart failure (HF) with ventricular arrhythmias (VA) and/or death. Biomarker levels, and the risk they indicate, may vary over time. We evaluated the utility of serial laboratory measurements of inflammatory biomarkers and HF, using time-dependent analysis. METHODS AND RESULTS: We prospectively enrolled ambulatory patients with left ventricular ejection fraction (LVEF) ≤35% and a primary-prevention implanted cardioverter-defibrillator (ICD). Levels of established inflammatory biomarkers [C-reactive protein, erythrocyte sedimentation rate (ESR), suppression of tumourigenicity 2 (ST2), tumour necrosis factor alpha (TNF-α)] and brain natriuretic peptide (BNP) were assessed at 3-month intervals for 1 year. We assessed relationships between biomarkers modelled as time-dependent variables, VA, and death. Among 196 patients (66±14 years, LVEF 23±8%), 33 experienced VA, and 18 died. Using only baseline values, BNP predicted VA, and both BNP and ST2 predicted death. Using serial measurements at 3-month intervals, time-varying BNP independently predicted VA, and time-varying ST2 independently predicted death. C-statistic analysis revealed no significant benefit to repeated testing compared with baseline-only measurement. C-reactive protein, ESR, and TNF-α, either at baseline or over time, did not predict either endpoint. CONCLUSION: In stable ambulatory patients with systolic cardiomyopathy and an ICD, BNP predicts ventricular tachyarrhythmia, and ST2 predicts death. Repeated laboratory measurements over a year's time do not improve risk stratification beyond baseline measurement alone. CLINICAL TRIAL REGISTRATION: Clinicaltrials.gov NCT01892462 (https://clinicaltrials.gov/ct2/show/NCT01892462).
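Modeling serial biomarker values "as time-dependent variables" requires a long-format (start/stop) data set. The sketch below shows the general pattern with lifelines' CoxTimeVaryingFitter; it is not the authors' code, and all file and column names are hypothetical.

```python
# Minimal sketch: time-varying Cox model for serial biomarker measurements using a
# long-format (start/stop) data set and lifelines' CoxTimeVaryingFitter. The
# abstract does not give the exact modeling choices; columns are hypothetical.
import pandas as pd
from lifelines import CoxTimeVaryingFitter

# One row per patient per 3-month interval:
# id, start, stop, event (VA in that interval, 0/1), bnp, st2
long_df = pd.read_csv("biomarker_long_format.csv")

ctv = CoxTimeVaryingFitter()
ctv.fit(long_df, id_col="id", event_col="event",
        start_col="start", stop_col="stop")
ctv.print_summary()    # exp(coef) for bnp/st2 = hazard ratios using updated values
```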
Subject(s)
Cardiomyopathies , Heart Failure , Biomarkers , Humans , Inflammation/diagnosis , Natriuretic Peptide, Brain , Prognosis , Stroke Volume , Ventricular Function, Left
ABSTRACT
Pulmonary arterial hypertension (PAH) carries high morbidity and mortality despite available treatment options. In severe PAH, right ventricular (RV) diastolic pressure overload leads to interventricular septal bowing, hindering of left ventricular diastolic filling, and reduced cardiac output (CO). Some animal studies suggest that pacing may mitigate this effect. We hypothesized that eliminating late diastole via ventricular pacing could improve CO in human subjects with severe PAH. Using minimal to no sedation, we performed transvenous acute biventricular (BiV) pacing and right heart catheterization in six patients with symptomatic PAH. Hemodynamic measurements were taken at baseline and during BiV pacing at various 20-ms intervals of V-V timing. We compared baseline CO to (1) CO while pacing the RV first by 80 ms (mimicking RV-only pacing), and then to (2) CO during pacing at the V-V timing that resulted in the highest CO. All participants were female, with pulmonary artery systolic pressure 74 ± 14 mmHg and QRS duration 104 ± 20 ms. Compared with baseline, CO decreased when the RV was paced first by 80 ms (7.2 ± 1.0 vs. 6.2 ± 1.1 L/min, p = 0.028). Pacing with optimal V-V timing produced CO similar to baseline (7.2 ± 1.0 vs. 7.4 ± 1.4, p = 0.92). Two patients (33%) met the predefined endpoint of a 15% increase in CO during pacing at the optimal V-V timing. In symptomatic PAH, V-V optimized acute BiV pacing does not consistently improve CO. However, acute BiV pacing did improve CO in a subset of this cohort. Further research is needed to identify predictors of response to cardiac resynchronization therapy in this population.
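The baseline-versus-pacing CO comparisons are paired measurements in six patients. The abstract does not state which paired test produced p = 0.028; the sketch below shows both a paired t-test and a Wilcoxon signed-rank test on hypothetical values.

```python
# Minimal sketch: paired comparison of cardiac output at baseline vs. during
# RV-first (80 ms) pacing. The abstract does not name the paired test used, so
# both a paired t-test and a Wilcoxon signed-rank test are shown. Values hypothetical.
import numpy as np
from scipy.stats import ttest_rel, wilcoxon

co_baseline = np.array([6.1, 7.0, 7.5, 8.2, 6.8, 7.6])   # L/min, n = 6 (hypothetical)
co_rv_first = np.array([5.2, 6.0, 6.4, 7.3, 5.9, 6.6])   # L/min during RV-first pacing

t_stat, p_t = ttest_rel(co_baseline, co_rv_first)
w_stat, p_w = wilcoxon(co_baseline, co_rv_first)
print(f"paired t-test p = {p_t:.3f}, Wilcoxon p = {p_w:.3f}")
```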
Subject(s)
Cardiac Output , Cardiac Resynchronization Therapy , Pulmonary Arterial Hypertension/therapy , Ventricular Dysfunction, Left/therapy , Ventricular Dysfunction, Right/therapy , Ventricular Function, Left , Ventricular Function, Right , Aged , Female , Humans , Male , Middle Aged , Pulmonary Arterial Hypertension/diagnosis , Pulmonary Arterial Hypertension/physiopathology , Recovery of Function , Severity of Illness Index , Time Factors , Treatment Outcome , Ventricular Dysfunction, Left/diagnosis , Ventricular Dysfunction, Left/physiopathology , Ventricular Dysfunction, Right/diagnosis , Ventricular Dysfunction, Right/physiopathology
ABSTRACT
PURPOSE: Degenerative mitral stenosis (DMS) is an increasingly recognized cause of mitral stenosis. The goal of this study was to compare echocardiographic differences between DMS and rheumatic mitral stenosis (RMS), identify echocardiographic variables reflective of DMS severity, and propose a dimensionless mitral stenosis index (DMSI) for assessment of DMS severity. METHODS: This is a single-center, retrospective cohort study. We included patients with at least mild MS and a mean transmitral pressure gradient (TMPG) ≥4 mm Hg. Mitral valve area by the continuity equation (MVA_CEQ) was used as an independent reference. The DMSI was calculated as follows: DMSI = VTI_LVOT / VTI_MV. All-cause mortality data were collected retrospectively. RESULTS: A total of 64 patients with DMS and 24 patients with RMS were identified. MVA_CEQ was larger in patients with DMS (1.43 ± 0.4 cm2) than RMS (0.9 ± 0.3 cm2) by ~0.5 cm2 (P < .001), and mean TMPG was lower in the DMS group (6.0 ± 2 vs 7.9 ± 3 mm Hg, P = .003). A DMSI of ≤0.50 and ≤0.351 was associated with MVA_CEQ ≤1.5 cm2 and MVA_CEQ ≤1.0 cm2 (P < .001), respectively. With the progression of DMS from severe to very severe, there was a significant drop in DMSI. There was a nonsignificant trend toward worse survival in patients with MVA_CEQ ≤1.0 cm2 and DMSI ≤0.35, suggesting severe stenosis. CONCLUSION: Our results show that TMPG correlates poorly with MVA in patients with DMS. The proposed DMSI may serve as a simple echocardiographic indicator of hemodynamically significant DMS.
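The proposed index is a simple ratio of Doppler velocity-time integrals. A small sketch implementing DMSI and the abstract's approximate cut points follows; the function names and example values are ours, not the authors'.

```python
# Minimal sketch of the proposed dimensionless mitral stenosis index (DMSI):
# DMSI = VTI_LVOT / VTI_MV, with the abstract's cut points of <=0.50 (suggesting
# MVA <= 1.5 cm^2) and ~<=0.35 (suggesting MVA <= 1.0 cm^2). Names are illustrative.
def dmsi(vti_lvot_cm: float, vti_mv_cm: float) -> float:
    """Dimensionless mitral stenosis index from Doppler velocity-time integrals."""
    return vti_lvot_cm / vti_mv_cm

def dmsi_category(value: float) -> str:
    if value <= 0.35:
        return "consistent with MVA <= 1.0 cm^2 (severe)"
    if value <= 0.50:
        return "consistent with MVA <= 1.5 cm^2"
    return "above proposed cut points"

index = dmsi(vti_lvot_cm=22.0, vti_mv_cm=55.0)   # hypothetical Doppler tracings
print(f"DMSI = {index:.2f}: {dmsi_category(index)}")
```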
Subject(s)
Mitral Valve Stenosis , Echocardiography , Humans , Mitral Valve/diagnostic imaging , Mitral Valve Stenosis/diagnostic imaging , Retrospective Studies , Severity of Illness Index
ABSTRACT
PURPOSE: Revascularization aims to improve myocardial perfusion. However, changes in regional artery-specific quantitative perfusion after revascularization have not been systematically investigated. It is unclear whether artery-specific thresholds for coronary flow capacity (CFC) and/or relative perfusion predict improved stress perfusion after revascularization. We sought to determine the impact of revascularization based on predefined, artery-specific severity and size thresholds for CFC and/or relative perfusion defects. METHODS: Fifty patients underwent PET imaging before revascularization and then prospectively within 90 days after revascularization. Changes in regional myocardial blood flow (MBF) were stratified based on baseline perfusion abnormalities, baseline reduced CFC, and whether revascularization was performed in that region. RESULTS: Following angiographic stenosis-directed revascularization, in regions with relative perfusion abnormalities and decreased CFC, stress MBF (sMBF) increased by 0.51 cm3/min/g (59%) from baseline (p < 0.001). In regions without baseline perfusion abnormalities and yet decreased CFC, sMBF increased by 0.35 cm3/min/g (40%) from baseline (p < 0.001). In regions without perfusion abnormalities and normal CFC, sMBF did not increase significantly (+0.07 cm3/min/g, p = 0.56). Patients in whom revascularization was concordant with abnormal PET findings showed increased whole-heart sMBF (+0.22 cm3/min/g, p < 0.001), but in patients in whom revascularization was targeted only to regions without perfusion abnormalities or low CFC, sMBF did not change significantly (-0.06 cm3/min/g, p = 0.38). CONCLUSION: Revascularization targeted to regions with reduced CFC and relative perfusion abnormalities on baseline PET yielded significant improvements in sMBF. When revascularization was performed in regions without reduced CFC, sMBF did not improve.
Subject(s)
Coronary Circulation , Heart/diagnostic imaging , Myocardial Perfusion Imaging , Myocardial Revascularization , Positron-Emission Tomography , Adult , Aged , Angiography , Arteries , Coronary Artery Disease/diagnostic imaging , Coronary Stenosis/diagnostic imaging , Exercise Test , Female , Humans , Image Processing, Computer-Assisted , Male , Middle Aged , Myocardium , Perfusion , Prospective Studies , Registries , Tomography, X-Ray Computed
ABSTRACT
PURPOSE OF REVIEW: To discuss the role of wearable cardioverter defibrillator (WCD) vests in preventing sudden cardiac death (SCD) in at-risk populations. RECENT FINDINGS: The benefit of implantable cardioverter-defibrillator (ICD) therapy is well established by randomized clinical trials in ischemic cardiomyopathy. Although the benefits are not as clear in non-ischemic cardiomyopathy, meta-analyses show significant mortality benefits from immediate electrical cardioversion strategies. The role of WCDs in at-risk populations in whom ICD therapy is temporarily not indicated is less well established. Smaller cohort trials have shown efficacy in patients with newly diagnosed cardiomyopathy, in those requiring temporary ICD explantation, and in others with less common indications for WCD therapy. The Vest Prevention of Early Sudden Death Trial was a landmark randomized controlled study examining the benefits of WCD therapy in an at-risk population; although the primary endpoint of reducing arrhythmic death was not met, the structure of the trial and significant differences in total mortality make a compelling case for continued use of WCD therapies in our healthcare systems.
Subject(s)
Death, Sudden, Cardiac/prevention & control , Defibrillators, Implantable , Defibrillators , Wearable Electronic Devices , Electric Countershock , Humans , Patient Discharge
ABSTRACT
Aims: Several published investigations demonstrated that a longer T-peak to T-end interval (Tpe) implies increased risk for ventricular tachyarrhythmia (VT/VF) and mortality. Tpe has been measured using diverse methods. We aimed to determine the optimal Tpe measurement method for screening purposes. Methods and results: We evaluated 305 patients with LVEF ≤ 35% and an implantable cardioverter-defibrillator implanted for primary prevention. Tpe was measured using seven different methods described in the literature, including six manual methods and the automated algorithm '12SL', and was corrected for heart rate. Endpoints were VT/VF and death. To account for differences in the magnitude of Tpe measurements, results are expressed in standard deviation (SD) increments. We evaluated the clinical utility of each measurement method based on predictive ability, fraction of immeasurable tracings, and intra- and interobserver correlation. Over 31 ± 23 months, 82 (27%) patients had VT/VF, and over 49 ± 21 months, 91 (30%) died. Several rate-corrected Tpe measurement methods predicted VT/VF (HR per SD 1.20-1.34; all P < 0.05), and nearly all methods (both corrected and uncorrected) predicted death (HR per SD 1.19-1.35; all P < 0.05). Optimal predictive ability, readability, and correlation were found in the automated 12SL method and the manual tangent method in lead V2. Conclusion: For the prediction of VT/VF, the utility of Tpe depends upon the measurement method, but for the prediction of mortality, most published Tpe measurement methods are similarly predictive. Heart rate correction improves predictive ability. The automated 12SL method performs as well as any manual measurement, and among manual methods, lead V2 is most useful.
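Results here are expressed as hazard ratios per SD increment of rate-corrected Tpe. The sketch below standardizes a corrected Tpe and fits a Cox model with lifelines; the specific heart-rate correction used by the authors is not given in the abstract, so a Bazett-style correction is shown purely as an assumption, and all file and column names are hypothetical.

```python
# Minimal sketch: hazard ratio per standard-deviation increment of rate-corrected
# Tpe. The correction formula is NOT given in the abstract; a Bazett-style division
# by the square root of the RR interval is used here only as an assumption.
import pandas as pd
from lifelines import CoxPHFitter

df = pd.read_csv("tpe_cohort.csv")   # columns: months, vtvf_event (0/1), tpe_ms, rr_interval_s

df["tpe_corrected"] = df["tpe_ms"] / df["rr_interval_s"] ** 0.5          # assumed correction
df["tpe_per_sd"] = (df["tpe_corrected"] - df["tpe_corrected"].mean()) / df["tpe_corrected"].std()

cph = CoxPHFitter()
cph.fit(df[["months", "vtvf_event", "tpe_per_sd"]],
        duration_col="months", event_col="vtvf_event")
cph.print_summary()                  # exp(coef) = HR per SD of rate-corrected Tpe
```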
Subject(s)
Death, Sudden, Cardiac/prevention & control , Electric Countershock , Electrocardiography , Heart Rate , Primary Prevention , Tachycardia, Ventricular/diagnosis , Ventricular Dysfunction, Left/diagnosis , Ventricular Fibrillation/diagnosis , Action Potentials , Aged , Aged, 80 and over , Death, Sudden, Cardiac/etiology , Defibrillators, Implantable , Electric Countershock/instrumentation , Female , Humans , Male , Middle Aged , Predictive Value of Tests , Primary Prevention/instrumentation , Risk Assessment , Risk Factors , Stroke Volume , Tachycardia, Ventricular/mortality , Tachycardia, Ventricular/physiopathology , Tachycardia, Ventricular/therapy , Ventricular Dysfunction, Left/mortality , Ventricular Dysfunction, Left/physiopathology , Ventricular Dysfunction, Left/therapy , Ventricular Fibrillation/mortality , Ventricular Fibrillation/physiopathology , Ventricular Fibrillation/therapy , Ventricular Function, Left
ABSTRACT
OBJECTIVES: We examined whether regional improvement in stress myocardial blood flow (sMBF) following angiography-guided coronary revascularization depends on the existence of a perfusion defect on positron emission tomography (PET). BACKGROUND: Percent stenosis on coronary angiography often is the main factor when deciding whether to perform revascularization, but it does not reliably relate to maximum sMBF. PET is a validated method of assessing sMBF. METHODS: Nineteen patients (79% male, 65 ± 12 years) underwent PET stress imaging before and after revascularization (17 PCI, 2 CABG). Pre- and post-revascularization sMBF for each left ventricular quadrant (anterior, septal, lateral, and inferior) was stratified by the presence or absence of a baseline perfusion defect on PET and whether that region was revascularized. RESULTS: Intervention was performed on 40 of 76 quadrants. When a baseline perfusion defect existed in a region that was revascularized (n = 26), post-revascularization flow increased by 0.6 ± 0.7 cc/min/g (1.2 ± 0.4 vs 1.7 ± 0.8, P < 0.001). When no defect existed but revascularization was performed (n = 14), sMBF did not change significantly (1.7 ± 0.3 vs 1.5 ± 0.4 cc/min/g, P = 0.16). In regions without a defect that were not revascularized (n = 29), sMBF did not significantly change (2.0 ± 0.6 vs 1.9 ± 0.7, P = 0.7). CONCLUSIONS: When a stress-induced perfusion defect exists on PET, revascularization improves sMBF in that region. When there is no such defect, sMBF shows no net change, whether or not intervention is performed in that area. PET stress may be useful for identifying areas of myocardium that could benefit from revascularization, and also areas in which intervention is unlikely to yield improvement in myocardial blood flow.