ABSTRACT
Guidelines recommend that patients undergoing a first pacemaker implant who have even mild left ventricular (LV) impairment receive biventricular or conduction system pacing (CSP). There is no corresponding recommendation for patients who already have a pacemaker. We conducted a meta-analysis of randomized controlled trials (RCTs) and observational studies assessing device upgrades. The primary outcome was the echocardiographic change in LV ejection fraction (LVEF). Six RCTs (randomizing 161 patients) and 47 observational studies (2644 patients) assessing the efficacy of upgrade to biventricular pacing were eligible for analysis. Eight observational studies of CSP upgrade (217 patients) were also eligible. Fourteen additional studies contributed data on complications (25 412 patients). Randomized controlled trials of biventricular pacing upgrade showed an LVEF improvement of +8.4% from a baseline of 35.5%, and observational studies showed +8.4% from 25.7%. Observational studies of left bundle branch area pacing upgrade showed a +11.1% improvement from 39.0%, and observational studies of His bundle pacing upgrade showed a +12.7% improvement from 36.0%. New York Heart Association class decreased by 0.4, 0.8, 1.0, and 1.2, respectively. Randomized controlled trials of biventricular upgrade found improvement in Minnesota Heart Failure Score (-6.9 points) and peak oxygen uptake (+1.1 mL/kg/min); this was also seen in observational studies of biventricular upgrades (-19.67 points and +2.63 mL/kg/min, respectively). In studies of biventricular upgrade, complication rates averaged 2% for pneumothorax, 1.4% for tamponade, and 3.7% for infection over 24 months of mean follow-up. Lead-related complications occurred in 3.3% of biventricular upgrades and 1.8% of CSP upgrades. Randomized controlled trials show significant physiological and symptomatic benefits of upgrading pacemakers to biventricular pacing. Observational studies show similar effects between biventricular pacing upgrade and CSP upgrade.
Subject(s)
Cardiac Resynchronization Therapy , Heart Failure , Pacemaker, Artificial , Ventricular Dysfunction, Left , Humans , Cardiac Resynchronization Therapy/adverse effects , Cardiac Pacing, Artificial/adverse effects , Cardiac Conduction System Disease/therapy , Heart Conduction System , Ventricular Function, Left , Stroke Volume/physiology , Treatment Outcome , Heart Failure/diagnosis , Heart Failure/therapy
ABSTRACT
BACKGROUND: The prognosis of patients with untreated cardiac implantable electronic device (CIED) infection is poor. Whether removal of all leads by a successful transvenous lead extraction (TLE) procedure changes the prognosis is unclear. OBJECTIVE: To identify predictors of mortality in patients with CIED infection despite successful TLE. METHODS: Retrospective single-center analysis of a prospectively collected database of consecutive patients undergoing TLE at our center. Predictors of mortality were identified and a score predicting a high mortality rate was calculated. RESULTS: A total of 371 consecutive patients underwent TLE, of whom 337 (90.8%) had complete hardware removal. Most extractions were performed for infectious causes (81.3%). Approximately one-third of patients (35%) died during a mean follow-up of 1056 ± 868 days, with significantly higher mortality observed in the infectious group. Multivariate logistic regression models for the infectious group only identified creatinine and albumin measurements as risk markers for 30-day mortality (odds ratio [OR], 1.68; 95% confidence interval [CI], 1.19-2.38; P = .003 and OR, 0.4; 95% CI, 0.16-0.97; P = .039, respectively). A risk score was created based on cutoff values of creatinine ≥2 mg/dL (1 point) and albumin ≤3.5 g/dL (1 point). A value of 2 points predicted a 50% chance of 30-day mortality and a 75% chance of 1-year mortality (P < .0001 for both). CONCLUSIONS: Creatinine and albumin can be combined into a risk score that identifies patients at risk of death despite undergoing a successful TLE procedure for infectious reasons. This score could help decision making when weighing conservative antibiotic treatment against TLE.
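As a minimal sketch of how the two-variable score described above could be computed (the function name, signature, and example values are hypothetical; only the creatinine and albumin cutoffs and point values come from the abstract):

```python
def tle_mortality_risk_score(creatinine_mg_dl: float, albumin_g_dl: float) -> int:
    """Hypothetical helper: 1 point for creatinine >= 2 mg/dL plus
    1 point for albumin <= 3.5 g/dL, as described in the abstract."""
    score = 0
    if creatinine_mg_dl >= 2.0:
        score += 1
    if albumin_g_dl <= 3.5:
        score += 1
    return score

# Illustrative use only (made-up patient values): a score of 2 corresponded to
# ~50% 30-day and ~75% 1-year mortality in the study cohort.
print(tle_mortality_risk_score(creatinine_mg_dl=2.4, albumin_g_dl=3.0))  # -> 2
```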
Subject(s)
Defibrillators, Implantable/adverse effects , Device Removal , Pacemaker, Artificial/adverse effects , Prosthesis Failure , Prosthesis-Related Infections/surgery , Adult , Aged , Aged, 80 and over , Biomarkers/blood , Creatinine/blood , Databases, Factual , Device Removal/adverse effects , Device Removal/mortality , Female , Humans , Male , Middle Aged , Prosthesis-Related Infections/diagnosis , Prosthesis-Related Infections/etiology , Prosthesis-Related Infections/mortality , Retrospective Studies , Risk Assessment , Risk Factors , Serum Albumin, Human/analysis , Time Factors , Treatment Outcome
ABSTRACT
AIMS: Lead perforation is a rare but well-known complication of cardiac implantable electronic device (CIED) implants, whose management is largely not evidence-based. Main management strategies include a conservative approach based on clinical and lead-function follow-up vs. a routine invasive lead revision approach. This study compared the complications of both strategies using a composite endpoint of recurrent perforation-related symptoms, recurrent pericardial effusion (PEf), lead dysfunction, and device infection during 12-month follow-up. METHODS AND RESULTS: Multicentre retrospective analysis of data from imaging studies, device interrogation, pericardiocentesis, and clinical charts of patients with suspected perforating leads between 2007 and 2014 in five hospitals. All cases were reviewed by an electrophysiologist and defined as definite perforations by suggestive symptoms along with lead perforation on imaging, bloody PEf on pericardiocentesis shortly after implant, or right ventricular (RV) lead non-capture along with diaphragmatic stimulation upon bipolar pacing. Clinical outcomes associated with the two management approaches were compared with respect to the composite endpoint. The study included 48 definite perforation cases: 22 managed conservatively and 26 via lead revision. Conservative management was associated with an increased composite endpoint compared with lead revision (8/22 vs. 1/26; P = 0.007). The dominant complication in the conservative cohort was cardiac tamponade during follow-up, with 5/6 occurring in cases that presented with no or only mild PEf and were treated with antiplatelet/anticoagulant therapy during or shortly after CIED implantation. CONCLUSION: Conservative management of CIED lead perforation is associated with increased complications compared with early lead revision. Lead revision may be the preferred management, particularly in patients receiving antiplatelet/anticoagulant therapy.
Subject(s)
Cardiac Resynchronization Therapy Devices/adverse effects , Heart Injuries/etiology , Heart Injuries/therapy , Aged , Cardiac Tamponade/etiology , Cardiac Tamponade/therapy , Device Removal , Female , Humans , Male , Pericardial Effusion/etiology , Pericardial Effusion/therapy , Pericardiocentesis , Prosthesis-Related Infections/etiology , Prosthesis-Related Infections/therapy , Retreatment , Retrospective Studies
ABSTRACT
Sudden cardiac death (SCD) is one of the most important causes of death worldwide. Advancements in medical treatment, percutaneous interventions, and device therapy (ICD and CRTD) have produced consistent reductions in mortality, mainly in survivors of SCD and in patients with ischemic cardiomyopathy and depressed left ventricular function. Patients with non-ischemic cardiomyopathies, mildly reduced LV function, and channelopathies remain at increased risk for SCD. Identifying the subgroup of these patients before they experience life-threatening or fatal events is essential to further improve outcomes. In this review, we summarize the current knowledge on risk stratification and primary prevention, describe the gaps in evidence, and discuss future directions for screening and treating patients at risk for SCD. PURPOSE OF REVIEW: The purpose of this review is to provide a comprehensive description of the etiologies of sudden cardiac death and risk stratification strategies, and to describe the current medical and interventional therapies. We also discuss the current gaps in our knowledge of primary prevention of SCD and review novel approaches and interventions. RECENT FINDINGS: The incidence of SCD has decreased in the last two decades owing to improved pharmacological treatment and ICD implantation in SCD survivors and in patients with reduced left ventricular function and ischemic cardiomyopathy. The efficacy of the ICD in patients with non-ischemic cardiomyopathy is challenged by new findings from the DANISH trial. Catheter ablation is a newly emerging strategy to prevent SCD in patients with scar-related or PVC-triggered ventricular arrhythmias. Despite the new treatments, SCD remains a major burden. The ICD remains the cornerstone of therapy for patients with ischemic cardiomyopathy, whereas appropriate risk stratification of patients with non-ischemic cardiomyopathy and channelopathies is needed to further improve outcomes. The role of ablation in the treatment and prevention of SCD remains to be studied.
Subject(s)
Death, Sudden, Cardiac/prevention & control , Death, Sudden, Cardiac/etiology , Humans , Primary Prevention/trends , Risk Assessment
ABSTRACT
BACKGROUND: Catheter ablation (CA) is a well-established therapeutic option for patients with recurrent symptomatic atrial fibrillation (AF). Data on gender-related differences in baseline characteristics and long-term success rates of catheter ablation for AF are limited. METHODS: We analyzed a cohort of 251 consecutive patients who underwent a first catheter ablation for AF in our institute during the period 2008 through 2015. All patients were followed by regular annual clinic visits, electrocardiograms, periodic 24-48 hour Holter monitoring, and loop recorders. The primary endpoint was first recurrence of AF during 1 year of follow-up. RESULTS: The cohort comprised 26% women (n=65), who were older (62.1 ± 9.6 vs. 54.4 ± 11.3 years, P < 0.01) and had a higher proportion of diabetes mellitus (23.1 vs. 5.4%, P < 0.001) than the male patients. No other significant differences were evident. At 1-year follow-up, cumulative survival free of AF was significantly higher in women compared with men (83% vs. 66%, respectively, log-rank P value = 0.021). Subgroup analysis showed an interaction between female sex and a small indexed left atrial diameter (LADi < 23 mm/m²). CONCLUSIONS: Our findings suggest that women experience a significantly lower rate of AF recurrence post-CA compared with men. This gender-related advantage appears to be restricted to women without significant left atrial enlargement, implying that left atrial enlargement has a stronger negative impact on post-CA AF recurrence in women than in men. Due to the relatively small number of women, further research is warranted to validate our conclusions.
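As a hedged sketch of the type of survival comparison reported above (the lifelines library and the synthetic follow-up data are assumptions; the study's actual analysis code is not available), freedom from AF and a log-rank test could be computed like this:

```python
import numpy as np
from lifelines import KaplanMeierFitter
from lifelines.statistics import logrank_test

# Synthetic follow-up data in months, administratively censored at 12 months;
# these numbers are placeholders, not the study's data.
rng = np.random.default_rng(1)
t_women = np.minimum(rng.exponential(30.0, 65), 12.0)
t_men = np.minimum(rng.exponential(18.0, 186), 12.0)
e_women = (t_women < 12.0).astype(int)  # 1 = AF recurrence observed, 0 = censored
e_men = (t_men < 12.0).astype(int)

# Kaplan-Meier estimates of freedom from AF in each group.
km_w = KaplanMeierFitter().fit(t_women, e_women, label="women")
km_m = KaplanMeierFitter().fit(t_men, e_men, label="men")
print(km_w.survival_function_.iloc[-1], km_m.survival_function_.iloc[-1])

# Log-rank comparison of the two recurrence curves.
print(logrank_test(t_women, t_men,
                   event_observed_A=e_women, event_observed_B=e_men).p_value)
```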
Subject(s)
Atrial Fibrillation/surgery , Catheter Ablation/methods , Heart Atria/surgery , Adult , Aged , Anti-Arrhythmia Agents/administration & dosage , Atrial Fibrillation/mortality , Catheter Ablation/adverse effects , Electrocardiography, Ambulatory/methods , Female , Follow-Up Studies , Heart Atria/pathology , Humans , Male , Middle Aged , Recurrence , Registries , Retrospective Studies , Sex Factors , Treatment Outcome
ABSTRACT
BACKGROUND: Transvenous lead extraction can lead to tricuspid valve damage. OBJECTIVES: To assess the incidence, risk factors, and clinical outcome of tricuspid regurgitation (TR) following lead extraction. METHODS: We prospectively collected data on patients who underwent lead extraction at the Sheba Medical Center prior to laser use (i.e., before 2012). Echocardiography results before and following the procedure were used to confirm TR worsening, defined as an echocardiographic increase of at least one TR grade. Various clinical and echocardiographic parameters were analyzed as risk factors for TR. Clinical and echocardiographic follow-up was conducted to assess the clinical significance of extraction-induced TR. RESULTS: Of 152 patients who underwent lead extraction without laser before 2012, 86 (56%; 192 electrodes) had echocardiography results before and within one week following the procedure. New or worsening TR was discovered in 13 patients (15%). On multivariate analysis, use of mechanical tools and younger age at extraction were risk factors for TR development (P = 0.04 and P = 0.03, respectively). Average follow-up was 22.25 ± 21.34 months (range 8-93). There were no significant differences between the TR and non-TR groups in the incidence of right-sided heart failure (50% vs. 23%, P = 0.192) or hospitalizations due to heart failure exacerbations (37.5% vs. 11%, P = 0.110). No patient required tricuspid valve repair or replacement. Death rates were similar in the TR and non-TR groups (20% vs. 33%). CONCLUSIONS: TR following lead extraction is not uncommon but does not appear to affect survival or outcomes such as the need for valve surgery. Its long-term effects remain to be determined.
Subject(s)
Device Removal/adverse effects , Electrodes, Implanted/adverse effects , Heart Failure/epidemiology , Pacemaker, Artificial , Tricuspid Valve Insufficiency/epidemiology , Adult , Age Factors , Aged , Aged, 80 and over , Echocardiography , Female , Follow-Up Studies , Heart Failure/etiology , Humans , Incidence , Male , Middle Aged , Multivariate Analysis , Prospective Studies , Risk Factors , Tricuspid Valve Insufficiency/etiology
ABSTRACT
INTRODUCTION: Renal dysfunction is associated with increased morbidity and mortality in heart failure patients. Data regarding the functional and clinical efficacy of cardiac resynchronization therapy (CRT) in this population are limited. METHODS AND RESULTS: We aimed to evaluate the rate of functional response to CRT in patients with renal dysfunction and its association with long-term mortality. Our study included a total of 179 consecutive patients implanted between 2007 and 2010. The rate of functional response to CRT (defined by a composite score using New York Heart Association functional class, 6-minute walk test, and quality of life) was compared between patients with and without renal dysfunction (defined as eGFR <60 vs. ≥60 mL/min/1.73 m²). Survival estimates were constructed according to the Kaplan-Meier method and compared using the log-rank test. During a median follow-up of 4.2 years, 73 patients (40%) died. Patients with low eGFR were older (72 ± 8 years vs. 64 ± 12 years; P < 0.001) and had a higher prevalence of ischemic heart disease (75% vs. 53%; P = 0.003). Functional response rates did not differ significantly between patients with and without renal dysfunction (58% and 69%, respectively; P = 0.14). Despite overall higher mortality in patients with low eGFR (53.8% vs. 22.7%; P < 0.001), the presence of a functional response at 1 year among patients with renal dysfunction was still independently associated with improved long-term survival (HR = 0.49 [95% CI: 0.28-0.83]; P = 0.009). CONCLUSION: Functional response to CRT at 1 year does not differ significantly between patients with or without kidney disease and is an independent predictor of improved long-term survival in patients with renal dysfunction.
Subject(s)
Cardiac Resynchronization Therapy/mortality , Heart Failure/mortality , Heart Failure/therapy , Kidney Diseases/mortality , Kidney Diseases/therapy , Aged , Aged, 80 and over , Cardiac Resynchronization Therapy/methods , Cohort Studies , Female , Glomerular Filtration Rate/physiology , Heart Failure/physiopathology , Humans , Kidney Diseases/physiopathology , Male , Middle Aged , Mortality/trends , Time Factors , Treatment Outcome
ABSTRACT
BACKGROUND: Cardiac resynchronization therapy (CRT) has been shown to improve heart failure (HF) symptoms and survival. We hypothesized that a greater improvement in left-ventricular ejection fraction (LVEF) after CRT is associated with greater survival benefit. METHODS AND RESULTS: In 693 patients across 2 international centers, the improvement in LVEF after CRT was determined. Patients were grouped as non-/modest-, moderate-, or super-responders to CRT, defined as an absolute change in LVEF of ≤5%, 6-15%, and >15%, respectively. Changes in New York Heart Association (NYHA) functional class and left ventricular end-diastolic dimension (LVEDD) were assessed for each group. There were 395 non-/modest-, 186 moderate-, and 112 super-responders. Super-responders were more likely to be female and to have nonischemic cardiomyopathy, lower creatinine, and lower pulmonary artery systolic pressure than non-/modest- and moderate-responders. Super-responders were also more likely to have a lower LVEF than non-/modest-responders. There was no difference in NYHA functional class, mitral regurgitation grade, or tricuspid regurgitation grade between groups. Improvement in NYHA functional class (-0.9 ± 0.9 vs -0.4 ± 0.8 [P < .001] and -0.6 ± 0.8 [P = .02]) and LVEDD (-8.7 ± 9.9 mm vs -0.5 ± 5.0 and -2.4 ± 5.8 mm [P < .001 for both]) was greatest in super-responders. Kaplan-Meier survival analysis revealed that super-responders achieved better survival than non-/modest- (P < .001) and moderate-responders (P = .049). CONCLUSIONS: Improvement in HF symptoms and survival after CRT is proportionate to the degree of improvement in LV systolic function. Super-response is more likely in women, those with a nonischemic substrate, and those with lower pulmonary artery systolic pressure.
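A minimal sketch of the responder classification used above (the function name and example values are hypothetical; the ≤5%, 6-15%, and >15% cutoffs are those reported in the abstract):

```python
def crt_response_group(delta_lvef: float) -> str:
    """Classify CRT response by the absolute change in LVEF (percentage points)."""
    if delta_lvef <= 5:
        return "non-/modest-responder"
    if delta_lvef <= 15:
        return "moderate responder"
    return "super-responder"

# Example: LVEF rising from 24% to 42% is an 18-point change -> super-responder.
print(crt_response_group(42 - 24))
```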
Subject(s)
Cardiac Resynchronization Therapy/trends , Heart Failure/diagnosis , Heart Failure/therapy , Stroke Volume/physiology , Aged , Cardiac Resynchronization Therapy/mortality , Female , Follow-Up Studies , Heart Failure/mortality , Humans , Male , Middle Aged , Predictive Value of Tests , Retrospective Studies , Survival Rate/trends , Treatment Outcome
ABSTRACT
AIMS: Strategically chosen ventricular tachycardia (VT)/ventricular fibrillation (VF) detection and therapy parameters aimed at reducing shock deliveries have proven effective in studies that utilized single-manufacturer devices with follow-up of up to 1 year. Whether these beneficial effects can be generalized to additional manufacturers and maintained for longer periods remains to be determined. Our aim was to evaluate the durability and applicability of strategic programming of implantable cardioverter-defibrillators (ICDs) from various manufacturers, aimed at reducing the shock burden in primary prevention ICD recipients. METHODS AND RESULTS: A retrospective analysis of prospectively collected data from 300 ICD recipients of various manufacturers was conducted; 160 devices were strategically programmed to reduce shocks and 140 were not. The primary endpoint was the composite of death and appropriate shocks. Additional outcomes were inappropriate shocks, syncope events, and non-sustained VTs. At a median follow-up of 24 months, 19 patients died, 31 received appropriate shocks, and 41 received inappropriate shocks. Multivariate analysis showed that strategic programming dedicated to shock reduction was associated with a 64% risk reduction in the primary endpoint [hazard ratio (HR) 95% confidence interval: 0.13-0.93; P = 0.03] and a 70% reduction in inappropriate shock deliveries (HR 95% confidence interval: 0.16-0.72; P = 0.01). Very few syncope events occurred (five patients, 1.6%), with no between-group difference in this outcome. CONCLUSION: Utilization of strategically chosen VT/VF detection and therapy parameters was found to be effective and safe in ICDs of various manufacturers at a median follow-up of 2 years among primary prevention patients.
Subject(s)
Death, Sudden, Cardiac/prevention & control , Defibrillators, Implantable , Electric Countershock/instrumentation , Primary Prevention/methods , Tachycardia, Ventricular/therapy , Ventricular Fibrillation/therapy , Aged , Chi-Square Distribution , Death, Sudden, Cardiac/etiology , Electric Countershock/adverse effects , Female , Humans , Kaplan-Meier Estimate , Male , Middle Aged , Multivariate Analysis , Proportional Hazards Models , Prosthesis Design , Prosthesis Failure , Registries , Retrospective Studies , Risk Factors , Syncope/etiology , Syncope/prevention & control , Tachycardia, Ventricular/complications , Tachycardia, Ventricular/diagnosis , Tachycardia, Ventricular/mortality , Time Factors , Treatment Outcome , Ventricular Fibrillation/complications , Ventricular Fibrillation/diagnosis , Ventricular Fibrillation/mortality
ABSTRACT
AIMS: Absent left atrial (LA) mechanical contraction may occur following the modified Cox-maze operation and has been found to impose a potential risk for thrombo-embolic stroke. It is unknown whether certain morphological P-wave characteristics can serve as a surrogate for absent LA mechanical activity. The aim of this study was to evaluate the morphological features of the P-waves on the surface electrocardiogram (ECG) of patients who underwent the maze operation and to relate them to the contractile profile of the LA. METHODS AND RESULTS: ECG tracings of 150 consecutive patients who were in sustained sinus rhythm following the maze operation were evaluated. P-waves were scrutinized for morphology, duration, axis, and amplitude. Clinical, surgery-related, and echocardiographic data were collected and analysed. Forty-seven patients (31%) had no evidence of LA contraction at 3 months after surgery (baseline assessment) and on follow-up echocardiography. Multivariate analysis showed that a positive-only P-wave deflection at lead V1 (P = 0.03), a negative-only deflection at aVL, and a P-wave amplitude of ≤0.05 mV at the septal-anterior leads (P < 0.001 for both) were associated with absent LA mechanical contraction. In a secondary analysis, a risk score involving the above three parameters was developed for the prediction of stroke occurrence. Patients in the high-risk score group had 30% survival free of stroke compared with 70% for patients at intermediate risk (P < 0.001). CONCLUSION: Absent LA mechanical contraction following the modified maze operation may be accompanied by a distinct P-wave pattern on the surface ECG.
Subject(s)
Atrial Fibrillation/surgery , Atrial Function, Left , Catheter Ablation , Cryosurgery , Electrocardiography , Myocardial Contraction , Action Potentials , Aged , Atrial Fibrillation/complications , Atrial Fibrillation/diagnosis , Atrial Fibrillation/physiopathology , Catheter Ablation/adverse effects , Chi-Square Distribution , Cryosurgery/adverse effects , Female , Heart Atria/physiopathology , Heart Atria/surgery , Humans , Kaplan-Meier Estimate , Male , Middle Aged , Multivariate Analysis , Odds Ratio , Predictive Value of Tests , Proportional Hazards Models , Retrospective Studies , Risk Factors , Stroke/etiology , Stroke/physiopathology , Stroke/prevention & control , Time Factors , Treatment Outcome
ABSTRACT
AIMS: Implantable cardioverter-defibrillators (ICDs) improve survival in certain high arrhythmic risk populations. However, there are sex differences in both the utilization and the benefit of these devices. Using a prospective national ICD registry, we aimed to compare the indications for ICD implantation as well as outcomes in implanted women vs. men. METHODS AND RESULTS: All subjects implanted with an ICD or cardiac resynchronization therapy with a defibrillator (CRTD) in Israel between July 2010 and February 2013 were included. A total of 3544 subjects comprised the baseline cohort, of whom 615 (17%) were women. Women had the same age (64 years) and rate of secondary prevention indication (26%) as men. However, women were more likely than men to have significant heart failure symptoms (52 vs. 45%), QRS > 120 ms (41 vs. 36%), and non-ischaemic cardiomyopathy (54 vs. 21%, all P values <0.05). On multivariate analysis, women were more likely to undergo CRTD implantation (odds ratio = 1.8, P < 0.01). Follow-up data were available for 1518 subjects, with a mean follow-up of 12 months. During follow-up, there were no significant differences between the sexes in the rates of appropriate device therapies, heart failure admissions, or death, either individually or as a combined outcome. The first-year re-intervention rate was twice as high among women (5.6 vs. 3.0%, P < 0.01). CONCLUSION: In a real-world setting, women implanted with an ICD differ significantly from men in their baseline characteristics and in the use of CRTD devices. These differences, however, did not translate into outcome differences.
Subject(s)
Arrhythmias, Cardiac/therapy , Cardiac Resynchronization Therapy Devices , Cardiac Resynchronization Therapy , Death, Sudden, Cardiac/prevention & control , Defibrillators, Implantable , Electric Countershock/instrumentation , Health Status Disparities , Healthcare Disparities , Primary Prevention/instrumentation , Secondary Prevention/instrumentation , Aged , Arrhythmias, Cardiac/complications , Arrhythmias, Cardiac/diagnosis , Arrhythmias, Cardiac/mortality , Cardiac Resynchronization Therapy/adverse effects , Cardiac Resynchronization Therapy/mortality , Chi-Square Distribution , Death, Sudden, Cardiac/etiology , Electric Countershock/adverse effects , Electric Countershock/mortality , Female , Heart Failure/etiology , Heart Failure/therapy , Hospitalization , Humans , Israel , Kaplan-Meier Estimate , Logistic Models , Male , Middle Aged , Multivariate Analysis , Odds Ratio , Proportional Hazards Models , Prospective Studies , Registries , Retreatment , Risk Factors , Sex Factors , Time Factors , Treatment Outcome
ABSTRACT
Background: The traditional classification of left ventricular hypertrophy (LVH), which relies on left ventricular geometry, fails to correlate with outcomes among patients with increased LV mass (LVM). Objectives: To identify unique clinical phenotypes of patients with increased LVM using unsupervised cluster analysis, and to explore their association with clinical outcomes. Methods: Among UK Biobank participants, increased LVM was defined as an LVM index ≥72 g/m² for men and ≥55 g/m² for women. Baseline demographic, clinical, and laboratory data were collected from the database. Using Ward's minimum variance method, patients were clustered based on 27 variables. The primary outcome was a composite of all-cause mortality, heart failure (HF) admissions, ventricular arrhythmia (VA), and atrial fibrillation (AF). Cox proportional hazards models and Kaplan-Meier survival analysis were applied. Results: Increased LVM was found in 4,255 individuals, with an average age of 64 ± 7 years; 2,447 (58%) were women. Through cluster analysis, four distinct subgroups were identified. Over a median follow-up of 5 years (IQR: 4-6), 100 patients (2%) died, 118 (2.8%) were admitted due to HF, 29 (0.7%) were admitted due to VA, and 208 (5%) were admitted due to AF. Univariate Cox analysis demonstrated significantly elevated risks of major events for patients in the 2nd (HR = 1.6; 95% CI 1.2-2.16; p < .001), 3rd (HR = 2.04; 95% CI 1.49-2.78; p < .001), and 4th (HR = 2.64; 95% CI 1.92-3.62; p < .001) clusters compared to the 1st cluster. Further exploration of each cluster revealed unique clinical phenotypes: Cluster 2 comprised mostly overweight women with a high prevalence of chronic lung disease; Cluster 3 consisted mostly of men with a heightened burden of comorbidities; and Cluster 4, mostly men, exhibited the most abnormal cardiac measures. Conclusions: Unsupervised cluster analysis identified four outcome-correlated clusters among patients with increased LVM. This phenotypic classification may offer valuable insights into the clinical course and outcomes of patients with increased LVM.
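For readers unfamiliar with the clustering step, the following is a minimal sketch using SciPy's hierarchical clustering; the sample size, the random placeholder data, and the variable layout are assumptions, not the UK Biobank variables used in the study:

```python
import numpy as np
from scipy.cluster.hierarchy import fcluster, linkage
from scipy.stats import zscore

# Placeholder feature matrix: rows = patients with increased LVM, columns =
# the 27 baseline variables (the actual variable list is not given in the
# abstract); random values stand in for real measurements.
rng = np.random.default_rng(0)
X = rng.normal(size=(500, 27))

# Standardise each variable, build the hierarchy with Ward's minimum-variance
# linkage, and cut the tree into four clusters, mirroring the four phenotypes.
Z = linkage(zscore(X, axis=0), method="ward")
labels = fcluster(Z, t=4, criterion="maxclust")
print(np.bincount(labels)[1:])  # number of patients assigned to each cluster
```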
ABSTRACT
BACKGROUND: Pulmonary vein isolation (PVI) is the most effective therapy to achieve rhythm control in atrial fibrillation (AF). Peri-procedural imaging is used in many but not all centers; however, its impact on the safety and efficacy of PVI is not clear. The Israeli Catheter Ablation Registry (ICAR) offers an opportunity to explore this issue in real-world practice. AIM: To describe the real-world utilization of peri-procedural imaging technologies in a large cohort of patients undergoing ablation for AF. METHODS: A prospective multicenter cohort of AF patients who underwent PVI during the years 2019-2021. Peri-procedural imaging (CT, ICE, TEE) was utilized at the discretion of the center and operator. The study endpoints were peri-procedural complications and AF recurrence at 12 months of follow-up among patients with and without peri-procedural imaging. RESULTS: Between January 2019 and December 2021, a total of 921 patients underwent PVI. Peri-procedural imaging (at least one modality of CT, TEE, and/or ICE) was utilized in 753 patients (81.8%) and no imaging in 168 (18.2%). Cryoablation was the dominant energy used for PVI in both groups (92.3% of the non-imaging group and 95.3% of the imaging group), while RF was used in the remaining patients. Fluoroscopy time did not differ between the two groups; however, procedure duration was longer in the imaging group (90 min) than in the non-imaging group (74.5 min, p = 0.006). By 12 months, the incidence of AF recurrence and repeat ablation did not differ between the groups. Complications and re-hospitalization for cardio-cerebrovascular reasons were also not different between the two groups. A Cox regression model demonstrated no association between peri-procedural imaging and the risk of AF recurrence after ablation. CONCLUSION: This real-world multicenter prospective registry study demonstrated that the rates of complications and of AF recurrence during 1 year of follow-up did not differ between patients who underwent PVI with or without peri-procedural imaging.
ABSTRACT
AIMS: The severity of tricuspid regurgitation (TR) is a predictor of outcome among heart failure patients. The interaction between cardiac resynchronization therapy (CRT) and TR has not been described. In this study, we examined the effect of pre-implant TR, and of worsened TR post-implant, on response to CRT and overall survival. METHODS AND RESULTS: We included all patients with successfully implanted CRT systems between 2007 and 2010. Patients were divided into two groups pre-implant: (Gp 1) no-or-mild TR; and (Gp 2) moderate-or-severe TR. Post-implant, patients were divided into two groups: (Gp A) improved or stable TR; and (Gp B) worsened TR. The clinical and echocardiographic outcomes of all patients were assessed. The study included 193 patients. Thirty-five subjects (18%) had moderate or severe TR pre-implant (Gp 2). Baseline echo parameters and 6-minute walk distance were worse in Gp 2 compared with Gp 1 (mild or no TR). There was no significant difference in clinical response to CRT between the two groups. However, Gp 2 had a significantly lower echocardiographic response (35 vs. 60%, P = 0.01) and higher mortality over 3 years (OR = 6.70, 95% CI = 1.8-24.5, P = 0.004). Post-implant, 25 patients (13%) developed worsened TR (Gp B), which was not associated with deterioration in right ventricular function or elevation in pulmonary artery pressure. Worsened TR predicted a reduced clinical response to CRT (42 vs. 70%, P = 0.006) compared with Gp A. CONCLUSIONS: The presence of baseline moderate or severe TR is associated with increased mortality and a lower echocardiographic response, but does not predict clinical response to CRT. Patients with worsened TR following CRT are less likely to respond clinically to CRT. Pacing leads passing through the tricuspid valve may worsen TR, and it is conceivable that avoidance of lead-induced TR by alternative implantation techniques could improve the response rate to CRT.
Subject(s)
Cardiac Resynchronization Therapy/mortality , Heart Failure , Severity of Illness Index , Tricuspid Valve Insufficiency , Aged , Aged, 80 and over , Echocardiography , Female , Follow-Up Studies , Heart Failure/diagnostic imaging , Heart Failure/mortality , Heart Failure/therapy , Humans , Kaplan-Meier Estimate , Male , Middle Aged , Predictive Value of Tests , Proportional Hazards Models , Retrospective Studies , Tricuspid Valve Insufficiency/diagnostic imaging , Tricuspid Valve Insufficiency/mortality , Tricuspid Valve Insufficiency/therapy
ABSTRACT
BACKGROUND: Contemporary implantable cardioverter-defibrillators (ICDs) enable storage of multiple pre-episode R-R recordings in patients who have suffered ventricular tachyarrhythmia (VTA). Timely prediction of VTA, using heart rate variability (HRV) analysis techniques, may facilitate the implementation of preventive and therapeutic strategies. AIM: To evaluate the novel multipole method of HRV analysis for prediction of imminent VTAs in ICD patients. METHODS: We screened patients from the Biotronik HAWAI Registry (Heart Rate Analysis with Automated ICDs). A total of 28 patients from the HAWAI registries (phases I and II) with available medical records who had experienced documented, verified VTA during the 2-year follow-up were included in our analysis. HRV during pre-episode recordings of 4,500 R-R intervals was analyzed using the Dyx parameter and compared with HRV from similar-length recordings from the same patients that were not followed by arrhythmia. RESULTS: Our study population consisted mainly of men (25 of 28, 89%), with an average age of 64.8 ± 9.4 years; 92% had coronary artery disease. HRV during 64 pre-event recordings (2.3 events per patient on average) was analyzed and compared with 60 control recordings. The multipole method of HRV analysis showed 50% sensitivity and 91.6% specificity for prediction of ventricular tachycardia/ventricular fibrillation in the study population, with an 84.5% positive predictive value. No statistically significant correlation was found between various clinical parameters and the sensitivity of imminent VTA prediction in our patients. CONCLUSION: The multipole method of HRV analysis emerges as a highly specific possible predictor of imminent VTA, providing an early warning that may allow preparation for an arrhythmic episode.
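As a hedged illustration of how the reported performance metrics relate to a 2x2 classification table (the function and the example counts are hypothetical; the abstract reports only the summary metrics, not the underlying table):

```python
def diagnostic_metrics(tp: int, fp: int, tn: int, fn: int) -> dict:
    """Sensitivity, specificity, and positive predictive value from 2x2 counts."""
    return {
        "sensitivity": tp / (tp + fn),
        "specificity": tn / (tn + fp),
        "ppv": tp / (tp + fp),
    }

# Illustrative counts only, chosen to be roughly consistent with 64 pre-event
# and 60 control recordings; they are not taken from the study.
print(diagnostic_metrics(tp=32, fp=5, tn=55, fn=32))
```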
Subject(s)
Defibrillators, Implantable , Diagnosis, Computer-Assisted/methods , Electrocardiography, Ambulatory/instrumentation , Electrocardiography, Ambulatory/methods , Heart Rate , Tachycardia, Ventricular/diagnosis , Tachycardia, Ventricular/physiopathology , Female , Humans , Male , Middle Aged , Prognosis , Reproducibility of Results , Risk Assessment , Sensitivity and Specificity , Tachycardia, Ventricular/prevention & control , Treatment Outcome
ABSTRACT
We describe the case of a 14-year-old girl with a history of syncopal episodes triggered by stress or exercise. Catecholaminergic polymorphic ventricular tachycardia was diagnosed with the aid of an implantable loop recorder. Genetic testing of the patient and her family revealed a de novo novel missense mutation (Ser4155Tyr) in exon 90 of the ryanodine receptor gene. This mutation affects a highly conserved residue (S4155) and results in the replacement of serine (S) with tyrosine (Y), leading to a change in physical and chemical properties. The girl was treated with an implantable defibrillator, metoprolol, and flecainide. Over 1 year of follow-up she had no recurrence of ventricular tachycardia.
Subject(s)
Mutation, Missense/genetics , Ryanodine Receptor Calcium Release Channel/genetics , Syncope/genetics , Tachycardia, Ventricular/genetics , Adolescent , Anti-Arrhythmia Agents/therapeutic use , Defibrillators, Implantable , Female , Flecainide/therapeutic use , Follow-Up Studies , Genetic Predisposition to Disease/genetics , Genetic Testing/methods , Humans , Metoprolol/therapeutic use , Syncope/complications , Syncope/therapy , Tachycardia, Ventricular/complications , Tachycardia, Ventricular/therapy , Treatment Outcome
ABSTRACT
BACKGROUND: Atrial fibrillation (AF) is a common diagnosis in patients presenting to urgent care centers (UCCs), yet there is scant research regarding treatment in these centers. While some of these patients are managed within UCCs, others are referred for further care in an emergency department (ED). OBJECTIVES: We aimed to identify the rate of patients referred to an ED and define predictors of this outcome. We analyzed the rates of AF diagnosis and hospital referral over the study years. Finally, we described trends in patients' anticoagulation (AC) medication use. METHODS: This retrospective study included 5873 visits by patients over age 18 with a diagnosis of AF to the TEREM UCC network over 11 years. Multivariate analysis was used to identify predictors of ED referral. RESULTS: In a multivariate model, factors associated with referral to an ED included vascular disease (OR 1.88 (95% CI 1.43-2.45), p < 0.001), evening or night shifts (OR 1.31 (95% CI 1.11-1.55), p < 0.001; OR 1.68 (95% CI 1.32-2.15), p < 0.001; respectively), previously diagnosed AF (OR 0.31 (95% CI 0.26-0.37), p < 0.001), prior treatment with AC (OR 0.56 (95% CI 0.46-0.67), p < 0.001), beta blockers (OR 0.63 (95% CI 0.52-0.76), p < 0.001), and antiarrhythmic medication (OR 0.58 (95% CI 0.48-0.69), p < 0.001). The number of visits with an AF diagnosis increased over the years (p = 0.030), while referrals to an ED decreased (p = 0.050). The rate of novel oral anticoagulant prescriptions increased over the years. CONCLUSIONS: The rate of referral to an ED from a UCC is declining over the years but remains high. Referrals may be predicted using simple clinical variables. This knowledge may help reduce the burden of hospitalizations.
ABSTRACT
Background: Evidence regarding the mortality benefit of the implantable cardioverter defibrillator (ICD) in non-ischemic dilated cardiomyopathy (NIDCM) is inconsistent. The most recent randomized study, the DANISH trial, did not find improved outcomes with ICD. However, based on previous studies and meta-analyses, current guidelines still strongly recommend ICD implantation in NIDCM patients. The introduction of novel medications for heart failure has dramatically improved clinical outcomes. In this study, we aimed to evaluate the effect of angiotensin receptor-neprilysin inhibitors (ARNi) and sodium-glucose transport protein 2 inhibitors (SGLT2i) on the mortality benefit of ICD in NIDCM. Methods: We used a previous meta-analysis algorithm and added an updated comprehensive literature search in PubMed for randomized controlled trials that examined the mortality benefit of ICD in NIDCM vs. optimal medical treatment. The primary outcome was death from any cause. We performed a meta-regression analysis to search for a single independent factor affecting mortality. Using previous data, we evaluated the theoretical effect of ICD implantation in patients treated with SGLT2 inhibitors and ARNi. Results: No new articles were added to the results of the previous meta-analysis. A total of 2,622 patients with NIDCM from 5 studies published between 2002 and 2016 were included in the analysis; 50% underwent ICD implantation for primary prevention of sudden cardiac death and 50% did not. ICD was associated with a significantly decreased risk of death from any cause compared to control (OR = 0.79, 95% CI: 0.66-0.95, p = 0.01, I² = 0%). The theoretical addition of ARNi and of the SGLT2 inhibitor dapagliflozin did not change the significant mortality effect of ICD (OR = 0.82, 95% CI: 0.7-0.9, p = 0.001, I² = 0% for each). Meta-regression revealed no association between death from any cause and left bundle branch block (LBBB), use of amiodarone, use of angiotensin-converting enzyme inhibitors (ACEi) or angiotensin receptor blockers, year enrollment began, or year enrollment ended (R² = 0.0). Conclusion: In patients with NIDCM, the addition of ARNi and SGLT2i did not affect the mortality advantage of ICD for primary prevention. PROSPERO registry number: https://www.crd.york.ac.uk/prospero/, identifier: CRD42023403210.
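A minimal sketch of fixed-effect inverse-variance pooling of study-level odds ratios, the kind of calculation that underlies a pooled OR with an accompanying 95% CI as reported above; the three study values below are hypothetical inputs, not data from the included trials:

```python
import math

def pool_odds_ratios(ors, ci_lowers, ci_uppers):
    """Fixed-effect inverse-variance pooling of study odds ratios.
    Standard errors are recovered from the 95% CIs on the log scale."""
    weights, weighted_logs = [], []
    for or_, lo, hi in zip(ors, ci_lowers, ci_uppers):
        se = (math.log(hi) - math.log(lo)) / (2 * 1.96)  # SE of log(OR)
        w = 1.0 / se**2
        weights.append(w)
        weighted_logs.append(w * math.log(or_))
    pooled_log = sum(weighted_logs) / sum(weights)
    pooled_se = math.sqrt(1.0 / sum(weights))
    return (math.exp(pooled_log),
            math.exp(pooled_log - 1.96 * pooled_se),
            math.exp(pooled_log + 1.96 * pooled_se))

# Hypothetical inputs from three studies: pooled OR with 95% CI.
print(pool_odds_ratios([0.75, 0.85, 0.80], [0.55, 0.60, 0.58], [1.02, 1.20, 1.10]))
```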
ABSTRACT
BACKGROUND: Women with congenital long-QT syndrome experience an increased risk for cardiac events after the onset of adolescence that is more pronounced among carriers of the LQT2 genotype. We hypothesized that the hormonal changes associated with menopause may affect clinical risk in this population. METHODS AND RESULTS: We used a repeated-events analysis to evaluate the risk for recurrent syncope during the menopause transition and postmenopausal periods (5 years before and after the age at onset of menopause, respectively) among 282 LQT1 (n=151) and LQT2 (n=131) women enrolled in the Long-QT Syndrome Registry. Multivariate analysis showed that the risk for recurrent syncope (n=150) among LQT2 women was significantly increased during both menopause transition (hazard ratio, 3.38; P=0.005) and the postmenopausal period (hazard ratio, 8.10; P<0.001) compared with the reproductive period. The risk increase was evident among women who did or did not receive estrogen therapy. In contrast, among LQT1 women, the onset of menopause was associated with a reduction in the risk for recurrent syncope (hazard ratio, 0.19; P=0.05; P=0.02 for genotype-by-menopause interaction). Only 22 women (8%) experienced aborted cardiac arrest or sudden cardiac death during follow-up. The frequency of aborted cardiac arrest/sudden cardiac death showed a similar genotype-specific association with the onset of menopause. CONCLUSIONS: The onset of menopause is associated with a significant increase in the risk of cardiac events (dominated by recurrent episodes of syncope) in LQT2 women, suggesting that careful follow-up and continued long-term therapy are warranted in this population.
Subject(s)
Death, Sudden, Cardiac/epidemiology , Jervell-Lange Nielsen Syndrome/mortality , Menopause , Romano-Ward Syndrome/mortality , Adult , Age Distribution , ERG1 Potassium Channel , Estrogen Replacement Therapy/statistics & numerical data , Ether-A-Go-Go Potassium Channels/genetics , Female , Follow-Up Studies , Genotype , Humans , Jervell-Lange Nielsen Syndrome/genetics , KCNQ1 Potassium Channel/genetics , Middle Aged , Recurrence , Risk Factors , Romano-Ward Syndrome/genetics , Syncope/genetics , Syncope/mortality
ABSTRACT
INTRODUCTION: Rate smoothing algorithms, while known to help prevent ventricular tachyarrhythmias in some patients, have been shown to result in underdetection of ventricular tachycardia (VT) due to the interaction between bradycardia pacing and tachycardia detection parameters. A new algorithm named Bradycardia Tachycardia Response (BTR) has been developed to prevent rate smoothing-induced underdetection, but its efficacy is not known. The aim of this study was to assess the effectiveness of BTR in preventing VT underdetection due to rate smoothing. METHODS AND RESULTS: Two ICD models (TELIGEN and VITALITY AVT, Boston Scientific, St. Paul, MN, USA) bearing identical rate smoothing algorithms were connected to a VT simulator. The devices were programmed identically except for the BTR feature, which exists in TELIGEN only. The detection performance of both devices was tested using varying combinations of AV delay, rate smoothing down, and upper rate limit, and compared between the two models. VT underdetection (delayed detection or nondetection) occurred during pacing in 62% of the VT episodes with VITALITY AVT. In TELIGEN, all simulated VT episodes were detected appropriately as soon as their rates exceeded the programmed VT detection rate. Detection tended to be affected by higher upper rate limits, longer AV delays, and more aggressive rate smoothing. CONCLUSION: The BTR algorithm effectively counteracts the VT detection delay caused by the interaction of rate smoothing with VT detection parameters, thus enabling safe use of the rate smoothing feature.