ABSTRACT
BACKGROUND: Current estimates of atrial fibrillation (AF)-associated mortality rely on claims- or clinical-derived diagnoses of AF, limit AF to a binary entity, or are confounded by comorbidities. The objective of the present study was to assess the association between device-recognized AF and mortality among patients with cardiac implantable electronic devices capable of sensitive and continuous atrial arrhythmia detection. Secondary outcomes included relative mortality among cohorts with no AF, paroxysmal AF, persistent AF, and permanent AF. METHODS: Using the deidentified Optum Clinformatics US claims database (2015 to 2020) linked to the Medtronic CareLink database, we identified individuals with a cardiac implantable electronic device who transmitted data ≥6 months after implantation. AF burden was assessed during the first 6 months after implantation (baseline period). Subsequent mortality, assessed from claims data, was compared between patients with and without AF, with adjustment for age, geographic region, insurance type, Charlson Comorbidity Index, and implantation year. RESULTS: Of 21 391 patients (age, 72.9±10.9 years; 56.3% male) analyzed, 7798 (36.5%) had device-recognized AF. During a mean of 22.4±12.9 months (median, 20.1 [12.8-29.7] months) of follow-up, the overall incidence of mortality was 13.5%. Patients with AF had higher adjusted all-cause mortality than patients without AF (hazard ratio, 1.29 [95% CI, 1.20-1.39]; P<0.001). Among those with AF, patients with nonparoxysmal AF had the greatest risk of mortality (persistent AF versus paroxysmal AF: hazard ratio, 1.36 [95% CI, 1.18-1.58]; P<0.001; permanent AF versus paroxysmal AF: hazard ratio, 1.23 [95% CI, 1.14-1.34]; P<0.001). CONCLUSIONS: After adjustment for potential confounding factors, the presence of AF was associated with higher mortality in our cohort of patients with cardiac implantable electronic devices.
Among those with AF, nonparoxysmal AF was associated with the greatest risk of mortality.
Subject(s)
Atrial Fibrillation; Defibrillators, Implantable; Humans; Atrial Fibrillation/mortality; Atrial Fibrillation/diagnosis; Aged; Male; Female; Middle Aged; Aged, 80 and over; Pacemaker, Artificial; Risk Factors; Databases, Factual; United States/epidemiology
ABSTRACT
BACKGROUND: Objective data comparing the diagnostic performance of different ambulatory cardiac monitors (ACMs) are lacking. OBJECTIVES: To assess variation in monitoring strategy, clinical outcomes, and healthcare utilization in patients undergoing ambulatory monitoring without a pre-existing arrhythmia diagnosis. METHODS: Using the full sample (100%) of Medicare claims data, we performed a retrospective cohort study of diagnostic-naïve patients who received first-time ACM in 2017 to 2018 and evaluated arrhythmia encounter diagnosis at 3 months, repeat ACM testing at 6 months, all-cause 90-day emergency department (ED) and inpatient utilization, and cost of different strategies: Holter; long-term continuous monitor (LTCM); non-continuous, event-based external ambulatory event monitor (AEM); and mobile cardiac telemetry (MCT). We secondarily performed a device-specific analysis by manufacturer, identified from unique claim modifier codes. RESULTS: ACMs were used in 287,789 patients (AEM = 10.3%; Holter = 53.8%; LTCM = 13.3%; MCT = 22.5%). Device-specific analysis showed that compared to Holter, AEM, MCT, or other LTCM manufacturers, a specific LTCM (Zio XT 14-day patch, iRhythm Technologies, San Francisco, CA) had the highest adjusted odds of diagnosis and lowest adjusted odds of ACM retesting. Findings were consistent for specific arrhythmia diagnoses of ventricular tachycardia, atrioventricular block, and paroxysmal atrial fibrillation. As a category, LTCM was associated with the lowest 1-year incremental health care expenditures (mean Δ$10,159), followed by Holter ($10,755), AEM ($11,462), and MCT ($12,532). CONCLUSIONS: There was large variation in diagnostic monitoring strategy. A specific LTCM was associated with the highest adjusted odds of a new arrhythmia diagnosis and lowest adjusted odds of repeat ACM testing. LTCM as a category had the lowest incremental acute care utilization.
Different monitoring strategies may produce different results with respect to diagnosis and care.
Subject(s)
Atrial Fibrillation; Electrocardiography, Ambulatory; Methacrylates; United States; Humans; Aged; Retrospective Studies; Medicare; Atrial Fibrillation/diagnosis; Health Expenditures; Patient Acceptance of Health Care
ABSTRACT
INTRODUCTION: We aimed to study whether KardiaMobile 6L 30-second capture technology could shorten ECG collection time compared to standard 12L ECG without compromising data usability. METHODS: A single-center, non-randomized trial was performed on patients presenting for follow-up visits to the electrophysiology (EP) clinic. Providers in the KardiaMobile 6L group were allowed to request a standard 12L if the 6L was deemed insufficient for clinical care. Room utilization times, defined as the time from medical assistant room entry to exit, were compared for each group. RESULTS: There were 100 patients in the study, with 50 in each arm. Average room utilization times for the 12L and 6L groups were 10.33 ± 2.2 and 7.27 ± 1.93 min, respectively (p < .001). In 8 (16%) visits for the 6L group, an additional 12L was requested. CONCLUSION: For EP follow-up visits, clinic utilization time was significantly reduced with the KardiaMobile 6L compared to the 12L ECG, with infrequent need for an additional 12L.
Subject(s)
Electrocardiography; Predictive Value of Tests; Humans; Male; Female; Middle Aged; Time Factors; Aged; Action Potentials; Heart Rate; Adult; Equipment Design
ABSTRACT
BACKGROUND: Success of atypical atrial flutter (AAFL) ablation has historically been limited by difficulty mapping the complex re-entrant circuits involved. While high-density (HD) mapping has become commonplace in clinical practice, there are limited data on outcomes of HD versus non-HD mapping for AAFL ablation. OBJECTIVE: To compare clinical outcomes and healthcare utilization using HD mapping versus non-HD mapping for AAFL ablation. METHODS: Retrospective analysis of all AAFL procedures between 2005 and 2022 at an academic medical center was conducted. Procedures utilizing a 16-electrode HD Grid catheter and Precision mapping system were compared to procedures using prior generation 10-20 electrode spiral catheters and the Velocity system (Abbott, IL). Cox regression models and Poisson regression models were utilized to examine procedural and healthcare utilization outcomes. Models were adjusted for left ventricular ejection fraction, CHA2DS2-VASc, and history of prior ablation. RESULTS: There were 108 patients (62% HD mapping) included in the analysis. Baseline clinical characteristics were similar between groups. Use of HD mapping was associated with a higher rate of AAFL circuit delineation (92.5% vs. 76%; p = .014) and a greater adjusted procedure success rate, defined as non-inducibility at procedure end (aRR (95% CI) 1.26 (1.02-1.55); p = .035), than non-HD mapping. HD mapping was also associated with a lower rate of ED visits (aIRR (95% CI) 0.32 (0.14-0.71); p = .007) and hospitalizations (aIRR (95% CI) 0.32 (0.14-0.68); p = .004) for AF/AFL/HF through 1 year. While there was a lower rate of recurrent AFL through 1 year among HD mapping cases (aHR (95% CI) 0.60 (0.31-1.16); p = .13), statistical significance was not met, likely due to the small sample size and higher rate of ambulatory rhythm monitoring in the HD group (61% vs. 39%, p = .025).
CONCLUSION: Compared to non-HD mapping, AAFL ablation with HD mapping is associated with improvements in the ability to define the AAFL circuit, greater procedural success, and a reduction in the number of ED visits and hospitalization for AF/AFL/HF.
ABSTRACT
BACKGROUND: Left atrial (LA) myopathy is thought to be associated with silent brain infarctions (SBI) through changes in blood flow hemodynamics leading to thrombogenesis. 4D-flow MRI enables in-vivo hemodynamic quantification in the left atrium (LA) and LA appendage (LAA). PURPOSE: To determine whether LA and LAA hemodynamic and volumetric parameters are associated with SBI. STUDY TYPE: Prospective observational study. POPULATION: A single-site cohort of 125 participants from the Multi-Ethnic Study of Atherosclerosis (MESA), mean age: 72.3 ± 7.2 years, 56 men. FIELD STRENGTH/SEQUENCE: 1.5T. Cardiac MRI: Cine balanced steady-state free precession (bSSFP) and 4D-flow sequences. Brain MRI: T1- and T2-weighted SE and FLAIR. ASSESSMENT: Presence of SBI was determined from brain MRI by neuroradiologists according to routine diagnostic criteria in all participants without a history of stroke based on the MESA database. Minimum and maximum LA volumes and ejection fraction were calculated from bSSFP data. Blood stasis (% of voxels <10 cm/sec) and peak velocity (cm/sec) in the LA and LAA were assessed by a radiologist using an established 4D-flow workflow. STATISTICAL TESTS: Student's t test, Mann-Whitney U test, one-way ANOVA, chi-square test. Multivariable stepwise logistic regression with automatic forward and backward selection. Significance level P < 0.05. RESULTS: 26 participants (20.8%) had at least one SBI. After Bonferroni correction, participants with SBI were significantly older and had significantly lower peak velocities in the LAA. In multivariable analyses, age (per 10 years) (odds ratio (OR) = 1.99 (95% confidence interval (CI): 1.30-3.04)) and LAA peak velocity (per cm/sec) (OR = 0.87 (95% CI: 0.81-0.93)) were significantly associated with SBI. CONCLUSION: Older age and lower LAA peak velocity were associated with SBI in multivariable analyses, whereas volumetric-based measures from cardiac MRI and cardiovascular risk factors were not.
Cardiac 4D-flow MRI showed potential to serve as a novel imaging marker for SBI. LEVEL OF EVIDENCE: 3 TECHNICAL EFFICACY: Stage 2.
ABSTRACT
BACKGROUND: The impact of using direct-to-consumer wearable devices as a means to timely detect atrial fibrillation (AF) and to improve clinical outcomes is unknown. METHODS: Heartline is a pragmatic, randomized, and decentralized application-based trial of US participants aged ≥65 years. Two randomized cohorts of adults who possess an iPhone are included: those without a history of AF, and those with a diagnosis of AF taking a direct oral anticoagulant (DOAC) for ≥30 days. Participants within each cohort are randomized (3:1) to either a core digital engagement program (CDEP) via iPhone application (Heartline application) and an Apple Watch (Apple Watch Group) or CDEP alone (iPhone-only Group). The Apple Watch Group has the watch irregular rhythm notification (IRN) feature enabled and access to the ECG application on the Apple Watch. If an IRN is issued for suspected AF, the study application instructs participants in the Apple Watch Group to seek medical care. All participants are "watch-naïve" at the time of enrollment and have the option to either purchase or borrow an Apple Watch as part of this study. The primary end point is time from randomization to clinical diagnosis of AF, with confirmation by health care claims. Key secondary end points are claims-based incidence of a 6-component composite cardiovascular/systemic embolism/mortality event, DOAC medication use and adherence, costs/health resource utilization, and frequency of hospitalizations for bleeding. All study assessments, including patient-reported outcomes, are conducted through the study application. The target study enrollment is approximately 28,000 participants in total; at the time of manuscript submission, a total of 26,485 participants had been enrolled into the study.
CONCLUSION: The Heartline Study will assess if an Apple Watch with the IRN and ECG application, along with application-facilitated digital health engagement modules, improves time to AF diagnosis and cardiovascular outcomes in a real-world environment. TRIAL REGISTRATION: ClinicalTrials.gov Identifier: NCT04276441.
Subject(s)
Atrial Fibrillation; Embolism; Thromboembolism; Adult; Humans; Atrial Fibrillation/complications; Atrial Fibrillation/diagnosis; Atrial Fibrillation/drug therapy; Thromboembolism/diagnosis; Thromboembolism/etiology; Thromboembolism/prevention & control; Hemorrhage
ABSTRACT
Uninterrupted anticoagulation for atrial fibrillation (AF), regardless of AF burden, has been deeply rooted in practice since the early anticoagulation trials. However, uninterrupted anticoagulation is not without risks, and may not be beneficial for all comers with a history of AF. Indeed, contemporary data that support a critical duration threshold of AF that benefits from anticoagulation, and a temporal association between stroke and multihour AF episodes, compel the study of a more targeted approach to AF anticoagulation. In this review, we discuss data that support further investigation of "pill-in-the-pocket" anticoagulation for AF, and introduce the pivotal Rhythm Evaluation for Anticoagulation Therapy for Atrial Fibrillation (REACT-AF) trial that will robustly evaluate this strategy.
Subject(s)
Atrial Fibrillation; Stroke; Humans; Atrial Fibrillation/complications; Atrial Fibrillation/diagnosis; Atrial Fibrillation/drug therapy; Risk Factors; Stroke/diagnosis; Stroke/etiology; Stroke/prevention & control; Anticoagulants/adverse effects
ABSTRACT
INTRODUCTION: The Apple Watch (AW) irregular rhythm notification (IRN) feature uses photoplethysmography to identify prolonged episodes of irregular rhythm suggestive of atrial fibrillation (AF). The IRN is FDA cleared for those with no previous history of AF; however, these devices are increasingly being used for AF management. The objective of the present study was to determine the accuracy of the IRN in subjects with a previous diagnosis of nonpermanent AF. METHODS: Subjects with a history of nonpermanent AF and either an insertable cardiac monitor (ICM) or cardiac implantable electronic device (CIED) with <5% ventricular pacing were fitted with an AW Series 5 for 6 months. AF episodes were compared between the ICM/CIED and IRN. The primary endpoints were sensitivity, specificity, positive predictive value (PPV), and negative predictive value (NPV) of the IRN by subject for AF ≥1 h. Secondary endpoints were sensitivity and PPV by AF episode ≥1 h. Analysis was limited to a maximum of 10 ICM/CIED episodes per subject and included only those AF episodes occurring during active AW use confirmed by activity data. RESULTS: Thirty participants were enrolled. Mean age was 65.4 ± 12.2 years and 40% were female. There were 10 ICMs and 20 CIEDs. Eleven subjects had AF on ICM/CIED while the AW was worn, of whom 8 were detected by IRN. There were no false positive IRN detections by subject ("by subject" 72% sensitivity, 100% specificity, 100% PPV, and 90% NPV). Five subjects had AF only when the AW was not worn. There were a total of 70 AF episodes on ICM/CIED, 35 of which occurred while the AW was being worn. Of these, 21 were detected by IRN with 1 false positive ("by episode" sensitivity = 60.0%, PPV = 95.5%). CONCLUSION: In a population with known AF, the AW IRN had a low rate of false positive detections and high specificity. Sensitivity for detection by subject and by AF episode was lower.
The current IRN algorithm appears accurate for AF screening as currently cleared, but increased sensitivity and wear times would be necessary for disease management.
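The by-subject and by-episode figures above are standard confusion-matrix metrics. As a hedged illustration (the function name and counts below are hypothetical, not taken from the study's data), they can be computed as:

```python
def diagnostic_metrics(tp: int, fp: int, fn: int, tn: int) -> dict:
    """Standard diagnostic accuracy metrics from confusion-matrix counts.

    tp/fp/fn/tn = true/false positives and negatives.
    A zero denominator yields None instead of raising ZeroDivisionError.
    """
    def ratio(num, den):
        return num / den if den else None

    return {
        "sensitivity": ratio(tp, tp + fn),  # true-positive rate
        "specificity": ratio(tn, tn + fp),  # true-negative rate
        "ppv": ratio(tp, tp + fp),          # positive predictive value
        "npv": ratio(tn, tn + fn),          # negative predictive value
    }

# Hypothetical by-subject counts for illustration only
m = diagnostic_metrics(tp=8, fp=0, fn=3, tn=19)
```

With zero false positives, specificity and PPV are both 100% regardless of how many true positives were missed, which is why a screening tool can combine perfect specificity with modest sensitivity, the pattern reported above.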
Subject(s)
Atrial Fibrillation; Humans; Female; Middle Aged; Aged; Male; Atrial Fibrillation/diagnosis; Electrocardiography, Ambulatory; Reproducibility of Results; Predictive Value of Tests; Algorithms
ABSTRACT
INTRODUCTION: Oral sotalol initiation requires a multiple-day, inpatient admission to monitor for QT prolongation during loading. A 1-day intravenous (IV) sotalol loading protocol was approved by the United States Food and Drug Administration in March 2020, but limited data on clinical use and administration currently exist. This study describes implementation of an IV sotalol protocol within an integrated health system, provides initial efficacy and safety outcomes, and examines length of stay (LOS) compared with oral sotalol initiation. METHODS: IV sotalol was administered according to a prespecified initiation protocol to adult patients with refractory atrial or ventricular arrhythmias. Baseline characteristics, safety and feasibility outcomes, and LOS were compared with those of patients receiving oral sotalol over a similar time period. RESULTS: From January 2021 to June 2022, a total of 29 patients (average age 66.0 ± 8.6 years, 27.6% women) underwent IV sotalol load and 20 patients (average age 60.4 ± 13.9 years, 65.0% women) underwent oral sotalol load. The load was successfully completed in 22/29 (75.9%) patients receiving IV sotalol and 20/20 (100%) patients receiving oral sotalol, although 7/20 of the oral sotalol patients (35.0%) required dose reduction. Adverse events interrupting IV sotalol infusion included bradycardia (seven patients, 24.1%) and QT prolongation (three patients, 10.3%). No patients receiving IV or oral sotalol developed sustained ventricular arrhythmias before discharge. LOS for patients completing IV load was 2.6 days shorter (mean 1.0 vs. 3.6, p < .001) than LOS with oral load. CONCLUSION: IV sotalol loading has a safety profile similar to that of oral sotalol. It significantly shortens hospital LOS, potentially leading to large cost savings.
Subject(s)
Long QT Syndrome; Sotalol; Adult; Female; Humans; Middle Aged; Aged; Male; Sotalol/adverse effects; Anti-Arrhythmia Agents/therapeutic use; Length of Stay; Feasibility Studies; Arrhythmias, Cardiac/drug therapy; Long QT Syndrome/chemically induced
ABSTRACT
BACKGROUND: Hemodynamic assessment of left atrial (LA) flow using phase contrast MRI provides insight into thromboembolic risk in atrial fibrillation (AF). However, conventional flow imaging techniques are averaged over many heartbeats. PURPOSE: To evaluate beat-to-beat variability and LA hemodynamics in patients with AF using real time phase contrast (RTPC) MRI. STUDY TYPE: Prospective. SUBJECTS: Thirty-five patients with a history of AF (68 ± 10 years, nine female), 10 healthy controls (57 ± 19 years, four female). FIELD STRENGTH/SEQUENCE: 5T, 2D RTPC with through-plane velocity-encoded gradient echo sequence and 4D flow MRI with three-directional velocity-encoded gradient echo sequence. ASSESSMENT: RTPC was continuously acquired for a mid-LA slice in all subjects. 4D flow data were interpolated at the RTPC location and normally projected for comparison with RTPC. RR intervals extracted from RTPC were used to calculate heart rate variability (HRV = interquartile range over median × 100%). Patients were classified into low (<9.7%) and high (>9.7%) HRV groups. LA peak/mean velocity and stasis (%velocities < 5.8 cm/sec) were calculated from segmented 2D images. Variability in RTPC flow metrics was quantified by the coefficient of variation (CV) over all cycles. STATISTICAL TESTS: Pearson's correlation coefficient (r), Bland-Altman analysis, Kruskal-Wallis test. A P value < 0.05 was considered statistically significant. RESULTS: RTPC and 4D flow measurements were strongly and significantly correlated for all hemodynamic parameters (R² = 0.75-0.83) in controls. Twenty-four patients had low HRV (mean = 4 ± 2%) and 11 patients had high HRV (27 ± 9%). In patients, increased HRV was significantly correlated with the CV of peak velocity (r = 0.67), mean velocity (r = 0.51), and stasis (r = 0.41). A stepwise decrease in peak/mean velocity and increase in stasis were observed when comparing controls vs. low HRV vs. high HRV groups.
Mean velocity and stasis differences were significant for control vs. high HRV groups. CONCLUSIONS: RTPC may be suitable for assessing the impact of HRV on hemodynamics and provide insight for AF management in highly arrhythmic patients. EVIDENCE LEVEL: 1 TECHNICAL EFFICACY: Stage 2.
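The HRV index above is defined directly in the abstract (interquartile range of RR intervals over the median, × 100%), with 9.7% as the cutoff separating low- from high-HRV groups. A minimal sketch of that calculation (function names and RR series are illustrative, not the authors' code):

```python
import statistics

HRV_CUTOFF_PCT = 9.7  # low/high HRV threshold used in the study

def hrv_percent(rr_intervals: list[float]) -> float:
    """HRV = interquartile range of RR intervals / median RR x 100%."""
    q1, _, q3 = statistics.quantiles(rr_intervals, n=4)
    return (q3 - q1) / statistics.median(rr_intervals) * 100.0

def classify_hrv(rr_intervals: list[float]) -> str:
    """Bucket a subject into the low- or high-HRV group."""
    return "high" if hrv_percent(rr_intervals) > HRV_CUTOFF_PCT else "low"

# Illustrative RR series in milliseconds
steady = [800.0] * 8                         # regular rhythm: HRV = 0%
irregular = [600, 1000, 650, 950, 700, 900]  # AF-like beat-to-beat variability
```

Beat-to-beat variability of the flow metrics themselves is then summarized analogously by a coefficient of variation, i.e. standard deviation over mean (×100%), computed across all acquired cycles.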
Subject(s)
Atrial Fibrillation; Humans; Female; Atrial Fibrillation/diagnostic imaging; Prospective Studies; Blood Flow Velocity/physiology; Hemodynamics; Magnetic Resonance Imaging/methods
ABSTRACT
BACKGROUND: Following catheter ablation for atrial fibrillation (AF), there are dynamic changes in the atrial myocardium associated with damage to and necrosis of atrial tissue and other procedure-related changes in rhythm and anticoagulation. Early time-dependent changes in biomarkers of necrosis, inflammation, and coagulation have been reported. This study examines mid-term (4-8 weeks post-ablation) changes in biomarkers and explores their ability to predict AF recurrence at one year. METHODS: Twenty-seven patients (mean age 65.4 ± 9.7 years, 30% female) undergoing catheter ablation for AF had peripheral venous blood samples obtained at the time of ablation and 4-8 weeks later. All samples were processed to obtain plasma, which was frozen for subsequent analysis. Coagulation studies were performed at the Northwestern Special Hemostasis Laboratory: VWF, ADAMTS13, PAI-1, D-dimer, and TAT complexes. A commercial lab analyzed samples for CRP, cystatin C, fibrinogen, galectin, IL-6, MMP-2, myoglobin, NT-proBNP, PAI-1, TIMP-1, TIMP-2, TPA, and VWF. RESULTS: At post-ablation follow-up, levels of ADAMTS13 (p < 0.0001), fibrinogen (p = 0.004), MMP-2 (p = 0.0002), TIMP-2 (p = 0.003), and TPA (p = 0.001) were significantly higher, whereas levels of TAT (p < 0.0001) and NT-proBNP (p = 0.0001) were significantly lower. One year after ablation, AF had recurred in 11/26 patients (42%). None of the biomarker changes predicted the 1-year outcome, and there was no significant association with the use of warfarin versus rivaroxaban. CONCLUSION: In patients undergoing catheter ablation for AF, there were significant changes in pre- versus post-ablation levels of multiple biomarkers. However, these changes were not associated with the 1-year outcome of AF recurrence.
Subject(s)
Atrial Fibrillation; Catheter Ablation; Humans; Female; Middle Aged; Aged; Male; Atrial Fibrillation/surgery; Matrix Metalloproteinase 2; Tissue Inhibitor of Metalloproteinase-2; Treatment Outcome; Plasminogen Activator Inhibitor 1; von Willebrand Factor; Biomarkers; Catheter Ablation/methods; Recurrence
ABSTRACT
Evidence suggests that atrial fibrillation (AF) could increase the risk of worsening kidney function (WKF), which is linked to an increased risk of stroke, bleeding, and death in AF patients. However, limited data exist regarding the factors that could lead to WKF in these patients. Therefore, we sought to identify the potential factors associated with the development of WKF in patients with non-valvular AF (NVAF). We analyzed 1122 prospectively recruited NVAF patients [men 71.9%, median age 73.0 years (interquartile range: 66.0-79.0)] with a baseline estimated glomerular filtration rate (eGFR) ≥ 15 mL/min/1.73 m² from the Hokuriku-Plus AF Registry. The primary outcome was incident WKF, defined as a %eGFR change from baseline ≥ 30% during the follow-up period. We evaluated the association between baseline variables and incident WKF using univariate and multivariate Cox proportional hazard models. We also evaluated the non-linear association between the identified factors and incident WKF. During a median follow-up period of 3.0 years (interquartile range: 2.7-3.3), incident WKF was observed in 108 patients (32.6 per 1000 person-years). Compared to the patients without incident WKF, the patients with incident WKF were older and had a higher prevalence of heart failure (HF), diabetes mellitus (DM), and vascular disease at baseline. Those who experienced incident WKF also had higher diastolic blood pressure, lower hemoglobin, lower eGFR, and higher B-type natriuretic peptide (BNP), and used warfarin more frequently. Upon multivariate analysis, age ≥ 75 years, HF, DM, and anemia were independently associated with incident WKF. Additionally, age and hemoglobin were linearly associated with the risk of incident WKF, whereas a J- or U-shaped association was observed for HbA1c and BNP. Age ≥ 75 years, HF, DM, and anemia were associated with the development of WKF in Japanese patients with NVAF.
In patients with these risk factors, careful monitoring of kidney function and appropriate intervention, when possible, may be important.
Subject(s)
Atrial Fibrillation; Heart Failure; Male; Humans; Aged; Atrial Fibrillation/diagnosis; Atrial Fibrillation/epidemiology; Atrial Fibrillation/complications; Warfarin; Risk Factors; Kidney; Registries
ABSTRACT
BACKGROUND: Shared decision making (SDM) improves the likelihood that patients will receive care in a manner consistent with their priorities. To facilitate SDM, decision aids (DAs) are commonly used, both to prepare a patient before the clinician visit and to facilitate discussion during the visit. However, the relative efficacy of patient-focused or encounter-based DAs on SDM and patient outcomes remains largely unknown. We aim to directly estimate the comparative effectiveness of two DAs on SDM observed in encounters to discuss stroke prevention strategies in patients with atrial fibrillation (AF). METHODS: The study aims to recruit 1200 adult patients with non-valvular AF who qualify for anticoagulation therapy, and their clinicians who manage stroke prevention strategies, in a 2 × 2 cluster-randomized multicenter trial at six sites. Two DAs were developed as interactive, online, non-linear tools: a patient decision aid (PDA) to be used by patients before the encounter, and an encounter decision aid (EDA) to be used by clinicians with their patients during the encounter. Patients will be randomized to PDA or usual care; clinicians will be randomized to EDA or usual care. RESULTS: Primary outcomes are quality of SDM, patient decision making, and patient knowledge. Secondary outcomes include anticoagulation choice, adherence, and clinical events. CONCLUSION: This trial is the first randomized, head-to-head comparison of the effects of an EDA versus a PDA on SDM. Our results will help to inform future SDM interventions to improve patients' AF outcomes and experiences with stroke prevention strategies.
Subject(s)
Atrial Fibrillation; Stroke; Adult; Anticoagulants/therapeutic use; Atrial Fibrillation/complications; Atrial Fibrillation/drug therapy; Decision Making; Decision Support Techniques; Humans; Patient Participation; Stroke/complications; Stroke/prevention & control
ABSTRACT
PURPOSE: To evaluate the safety of MRI in patients with fragmented retained leads (FRLs) through numerical simulation and phantom experiments. METHODS: Electromagnetic and thermal simulations were performed to determine the worst-case RF heating of 10 patient-derived FRL models during MRI at 1.5 T and 3 T and at imaging landmarks corresponding to head, chest, and abdomen. RF heating measurements were performed in phantoms implanted with reconstructed FRL models that produced the highest heating in numerical simulations. The potential for unintended tissue stimulation was assessed through a conservative estimation of the electric field induced in the tissue due to gradient-induced voltages developed along the length of FRLs. RESULTS: Under a conservative simulation approach, RF exposure at B1+ ≤ 2 µT generated a cumulative equivalent minutes at 43°C (CEM43) thermal dose < 40 at all imaging landmarks at both 1.5 T and 3 T, indicating no thermal damage for acquisition times (TAs) < 10 min. In experiments, the maximum temperature rise when FRLs were positioned at the location of maximum electric field exposure was measured to be 2.4°C at 3 T and 2.1°C at 1.5 T. Electric fields induced in the tissue due to gradient-induced voltages remained below the threshold for cardiac tissue stimulation in all cases. CONCLUSIONS: Simulation and experimental results indicate that patients with FRLs can be scanned safely at both 1.5 T and 3 T with most clinical pulse sequences.
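The CEM43 criterion above comes from the standard Sapareto-Dewey thermal-dose model: time spent at each tissue temperature is converted into equivalent minutes at 43°C. A minimal sketch of that conversion (the piecewise R constants, 0.5 at or above 43°C and 0.25 below, are the commonly used literature values; this is not the authors' simulation code):

```python
def cem43(samples: list[tuple[float, float]]) -> float:
    """Cumulative equivalent minutes at 43 degC (Sapareto-Dewey thermal dose).

    samples: (duration_minutes, temperature_degC) pairs from a heating history.
    CEM43 = sum(dt * R**(43 - T)), with R = 0.5 for T >= 43 degC
    and R = 0.25 for T < 43 degC.
    """
    total = 0.0
    for dt_min, temp_c in samples:
        r = 0.5 if temp_c >= 43.0 else 0.25
        total += dt_min * r ** (43.0 - temp_c)
    return total

# 10 min at exactly 43 degC contributes 10 equivalent minutes;
# the same duration at 44 degC doubles the dose, while 42 degC quarters it.
```

A heating history is then judged thermally safe when the accumulated CEM43 at the hottest tissue location stays below the damage threshold (< 40 in the simulations above).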
Subject(s)
Magnetic Resonance Imaging; Radio Waves; Heart/diagnostic imaging; Heating; Hot Temperature; Humans; Magnetic Resonance Imaging/adverse effects; Magnetic Resonance Imaging/methods; Phantoms, Imaging
ABSTRACT
In recent years, there has been an emergence of long-term cardiac monitoring devices, particularly nonprescribed, user-initiated, wearable- and/or smartphone-based devices. With these newly available data, practitioners are challenged to interpret the results in the context of routine clinical decision-making. While there are many potential uses for long-term rhythm monitoring, in this review we will focus on the evolving role of this technology in atrial fibrillation (AF) monitoring after catheter and/or surgical ablation. Here, we explore the landscape of prescription-based tools for long-term rhythm monitoring, investigate commercially available technologies that are accessible directly to patients, and look toward the future with investigational technologies that could have a growing role in this space.
Subject(s)
Atrial Fibrillation; Catheter Ablation; Atrial Fibrillation/diagnosis; Atrial Fibrillation/surgery; Catheter Ablation/adverse effects; Catheters; Humans; Recurrence; Treatment Outcome
ABSTRACT
INTRODUCTION: Atrial fibrillation (AF) is a growing health problem and is associated with increased risk of stroke. The Cox-Maze surgical procedure has offered the highest success rate, but utilization of this technique is low due to procedure invasiveness and complexity. Advances in catheter ablation and minimally invasive surgical techniques offer new options for AF treatment. METHODS: In this review, we describe current trends and outcomes of minimally invasive treatment of persistent and long-standing persistent AF. RESULTS: Persistent and long-standing persistent AF can be successfully treated using a team approach combining cardiac surgery and electrophysiology procedures. With this approach, the 1-year freedom from AF off antiarrhythmic drugs was 85%. DISCUSSION: A variety of techniques and approaches are used around the world as technology evolves to help develop new treatment strategies for AF. Our report focuses on a hybrid treatment approach that replicates Cox-Maze IV lesions using skills from both surgery and electrophysiology, providing enhanced treatment options. Closure of the left atrial appendage as part of these procedures enhances protection from late stroke. A team approach provides a cohesive evaluation, treatment, and monitoring plan for patients. Development of successful, less invasive treatment options will help address the growing population of patients with AF.
Subject(s)
Atrial Fibrillation; Catheter Ablation; Stroke; Atrial Fibrillation/diagnosis; Atrial Fibrillation/etiology; Atrial Fibrillation/surgery; Catheter Ablation/adverse effects; Catheter Ablation/methods; Humans; Stroke/etiology; Stroke/prevention & control; Thoracoscopy/adverse effects; Thoracoscopy/methods; Treatment Outcome
ABSTRACT
INTRODUCTION: Esophageal thermal injury (ETI) is a well-recognized complication of atrial fibrillation (AF) ablation. Previous studies have demonstrated that direct esophageal cooling reduces ETI during radiofrequency AF ablation. The purpose of this study was to evaluate the use of an esophageal warming device to prevent ETI during cryoballoon ablation (CBA) for AF. METHODS: This prospective, double-blinded study enrolled 42 patients with symptomatic AF undergoing CBA. Patients were randomized to the treatment group with esophageal warming (42°C) using water recirculated through a multilumen silicone tube inserted into the esophagus (EnsoETM®; Attune Medical) (WRM) or the control group with a luminal single-electrode esophageal temperature monitoring probe (LET). Patients underwent esophagogastroduodenoscopy (EGD) the following day. ETI was classified into four grades. RESULTS: Baseline patient characteristics were similar between groups. Procedural characteristics, including number of freezes, total freeze time, early freeze terminations, coldest balloon temperature, procedure duration, posterior wall ablation, and proton pump inhibitor and transesophageal echocardiogram use before the procedure, were not different between groups. The EGD was completed in 40/42 patients. There was significantly more ETI in the WRM group compared to the LET group (n = 8 [38%] vs. n = 1 [5%], p = 0.02). All ETI lesions were grade 1 (erythema) or 2 (superficial ulceration). Total freeze time in the left inferior pulmonary vein was predictive of ETI (360 vs. 300 s, p = 0.03). CONCLUSION: Use of a luminal heat exchange tube for esophageal warming during CBA for AF was paradoxically associated with a higher risk of ETI.
Subject(s)
Atrial Fibrillation, Catheter Ablation, Cryosurgery, Pulmonary Veins, Humans, Atrial Fibrillation/diagnosis, Atrial Fibrillation/surgery, Prospective Studies, Temperature, Catheter Ablation/methods, Pulmonary Veins/diagnostic imaging, Pulmonary Veins/surgery, Cryosurgery/adverse effects
ABSTRACT
PURPOSE OF REVIEW: Atrial fibrillation is the most common sustained rhythm abnormality and is associated with stroke, heart failure, cognitive decline, and premature death. Digital health technologies using consumer-grade mobile technologies (i.e., mHealth) capable of recording heart rate and rhythm can now reliably detect atrial fibrillation using single-lead or multilead ECG or photoplethysmography (PPG). This review discusses how these developments are being used to detect and manage atrial fibrillation. RECENT FINDINGS: Studies have established the accuracy of mHealth devices for atrial fibrillation detection. The feasibility of using mHealth technology to screen for atrial fibrillation has also been established, though the utility of screening remains controversial. Beyond screening, key aspects of atrial fibrillation management can also be performed remotely and effectively using mHealth, though with some important limitations. SUMMARY: mHealth technologies have proven disruptive in the diagnosis and management of atrial fibrillation. Healthcare providers can leverage these advances to better care for their patients with atrial fibrillation.
Subject(s)
Atrial Fibrillation, Telemedicine, Atrial Fibrillation/diagnosis, Atrial Fibrillation/therapy, Electrocardiography, Humans, Photoplethysmography, Technology
ABSTRACT
Palpitations are a common symptom managed by general practitioners and cardiologists, and atrial fibrillation (AF) is the most common arrhythmia in adults. The recent commercial availability of smartphone-based devices and wearable technologies with arrhythmia detection capabilities has revolutionized the diagnosis and management of these common medical issues by placing the power of arrhythmia detection into the hands of the patient. Numerous mobile health (mHealth) devices that can detect, record, and automatically interpret irregularities in heart rhythm and abrupt changes in heart rate using photoplethysmography (PPG)- and electrocardiogram-based technologies are now commercially available. Compared with prescription-based external rhythm monitoring approaches, these devices are less expensive and allow for longer-term monitoring, thus increasing sensitivity for arrhythmia detection, particularly for patients with infrequent symptoms possibly due to cardiac arrhythmias. These devices can be used to correlate symptoms with cardiac arrhythmias, assess efficacy and toxicities of arrhythmia therapies, and screen the population for serious rhythm disturbances such as AF. Although several devices have received clearance for AF detection from the United States Food & Drug Administration, limitations include the need for ECG confirmation of arrhythmias detected by PPG alone, false positives, false negatives, battery charging requirements, and financial cost. In summary, the growth of commercially available devices for remote, patient-facing rhythm monitoring represents an exciting new opportunity in the care of patients with palpitations and known or suspected dysrhythmias. Physicians should be familiar with the evidence that underlies their added value to patient care and, importantly, their current limitations.
Subject(s)
Atrial Fibrillation, Telemedicine, Adult, Atrial Fibrillation/diagnosis, Atrial Fibrillation/therapy, Electrocardiography, Humans, Photoplethysmography, Smartphone
ABSTRACT
AIMS: The effectiveness and safety of same-day discharge (SDD) after catheter ablation (CA) for atrial fibrillation (AF) have not been fully elucidated using a large nationwide database. This study aimed to evaluate all-cause 30-day readmission rates among patients undergoing CA for AF with an SDD protocol compared with a conventional overnight stay (ONS). METHODS AND RESULTS: We performed a retrospective cohort study using the US Nationwide Readmission Database. The primary outcome was all-cause 30-day readmission following discharge in patients undergoing CA, and the secondary outcome was total healthcare cost. A 1:3 propensity score matching was conducted to compare safety and efficacy between the SDD and ONS groups. Among 30 776 patients [mean age, 67.2 ± 11.4 years; 12 590 female (41.5%)] who underwent CA from 2016 through 2018, 440 (1.42%) were discharged on the same day following CA (SDD group), and the remaining 30 336 stayed at least one night in the hospital (ONS group). Propensity score matching yielded 1751 patients (440 in the SDD group; 1311 in the ONS group). Thirty-day readmission following discharge was not significantly higher in the SDD group than in the ONS group (SDD vs. ONS: 12.7% vs. 9.7%; hazard ratio: 1.17, 95% confidence interval: 0.76-1.81, P = 0.47). Healthcare cost was significantly higher in the ONS group ($25 237 ± 14 036 vs. $30 749 ± 16 383; P < 0.01). CONCLUSION: In this nationwide database study, there was no significant difference in all-cause 30-day readmission following SDD for CA compared with ONS.