Results 1 - 20 of 103
1.
Article in English | MEDLINE | ID: mdl-38725220

ABSTRACT

INTRODUCTION: We performed a cross-sectional study using the Centers for Disease Control and Prevention's (CDC's) Wide-Ranging Online Data for Epidemiologic Research (WONDER) database to analyze trends in cardiac implantable electronic device (CIED) infection-related mortality from 1999 to 2020. METHODS: We analyzed death certificate data from the CDC WONDER database from 1999 to 2020 for CIED infections in the US population aged ≥25 years using International Classification of Diseases, Tenth Revision (ICD-10) codes listed as the underlying or contributing cause of death. Age-adjusted mortality rates (AAMRs) and 95% confidence intervals (CIs) were computed per 1 million population by standardizing crude mortality rates to the 2000 US census population. To assess annual mortality trends, we employed the Joinpoint regression model, calculating the annual percent change (APC) in AAMR and corresponding 95% CIs. RESULTS: Overall, there was a declining trend in CIED infection-related AAMRs. Males accounted for 55% of total deaths, with persistently higher AAMRs than females over the study duration. Both males and females had an overall decreasing trend in AAMRs throughout the study period. On race/ethnicity-stratified analysis, non-Hispanic (NH) Blacks exhibited the highest overall AAMR, followed by NH American Indians or Alaska Natives, NH Whites, Hispanics or Latinos, and NH Asians or Pacific Islanders. On region-stratified analysis, the South had the highest overall AAMR, followed by the Midwest, West, and Northeast. CONCLUSION: Our study demonstrates a significant decline in CIED infection-related mortality over the last two decades. Notable gender, racial/ethnic, and regional differences exist in CIED infection-related mortality rates.
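
A minimal sketch of the direct age-standardization step described above: age-specific crude rates are weighted by the 2000 US standard population to yield an AAMR per 1,000,000. The counts and weights below are invented placeholders, not CDC WONDER data.

```python
# Direct age standardization: weight age-specific crude rates by the
# standard-population share of each age group. Illustrative numbers only.
def age_adjusted_rate(deaths, population, std_population, per=1_000_000):
    """deaths, population, std_population: lists aligned by age group."""
    total_std = sum(std_population)
    aamr = 0.0
    for d, p, s in zip(deaths, population, std_population):
        crude = d / p                      # age-specific crude rate
        aamr += crude * (s / total_std)    # weight by standard-population share
    return aamr * per

# hypothetical example: three broad age groups (25-44, 45-64, 65+)
deaths       = [12, 85, 430]
population   = [80_000_000, 60_000_000, 35_000_000]
std_pop_2000 = [81_000_000, 62_000_000, 35_000_000]   # assumed standard weights
print(round(age_adjusted_rate(deaths, population, std_pop_2000), 2))
```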

2.
Article in English | MEDLINE | ID: mdl-38695242

ABSTRACT

INTRODUCTION: Leadless pacemakers (LPM) have become an important therapeutic modality in the management of selected patients with symptomatic bradycardia. We aimed to determine real-world utilization and in-hospital outcomes of LPM implantation since its approval by the Food and Drug Administration in 2016. METHODS: For this retrospective cohort study, data were extracted from the National Inpatient Sample database for the years 2016-2020. The outcomes analyzed in our study included implantation trends of LPM over the study years, mortality, major complications (defined as pericardial effusion requiring intervention, any vascular complication, or acute kidney injury), length of stay, and cost of hospitalization. Implantation trends of LPM were assessed using linear regression. Using years 2016-2017 as a reference, adjusted outcomes of mortality, major complications, prolonged length of stay (defined as >6 days), and increased hospitalization cost (defined as cost above the median of $34,098) were analyzed for subsequent years using a multivariable logistic regression model. RESULTS: There was a gradually increasing trend in LPM implantation over the study years (3230 devices in 2016-2017 to 11,815 devices in 2020, p for trend <.01). Adjusted mortality improved significantly in subsequent years compared with the reference years 2016-2017 (aOR for 2018: 0.61, 95% CI: 0.51-0.73; aOR for 2019: 0.49, 95% CI: 0.41-0.59; and aOR for 2020: 0.52, 95% CI: 0.44-0.62). No differences in adjusted rates of major complications were demonstrated over the subsequent years. The adjusted cost of hospitalization was higher for 2019 (aOR: 1.33, 95% CI: 1.22-1.46) and 2020 (aOR: 1.69, 95% CI: 1.55-1.84). CONCLUSION: Contemporary US practice has shown significantly increased LPM implantation rates since approval, with reduced rates of inpatient mortality.
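
A minimal sketch of the multivariable logistic model described above, with 2016-2017 as the reference category for adjusted odds ratios. The data are synthetic and the column names (died, year_group, age, comorbidity_index) are illustrative assumptions, not NIS variables.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# synthetic illustrative data, not NIS records
df = pd.DataFrame({
    "died": np.random.binomial(1, 0.03, 2000),
    "year_group": np.random.choice(["2016-2017", "2018", "2019", "2020"], 2000),
    "age": np.random.normal(75, 10, 2000),
    "comorbidity_index": np.random.poisson(2, 2000),
})

# logistic regression with 2016-2017 as the reference level for year_group
model = smf.logit(
    "died ~ C(year_group, Treatment(reference='2016-2017')) + age + comorbidity_index",
    data=df,
).fit(disp=False)

odds_ratios = np.exp(model.params)      # adjusted ORs
conf_int = np.exp(model.conf_int())     # 95% CIs on the OR scale
print(pd.concat([odds_ratios, conf_int], axis=1))
```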

3.
Dig Dis Sci ; 69(1): 246-253, 2024 Jan.
Article in English | MEDLINE | ID: mdl-37914889

ABSTRACT

BACKGROUND: Limited data are available on the epidemiology of gastroesophageal junction adenocarcinoma (GEJAC), particularly in comparison to esophageal adenocarcinoma (EAC). With the advent of molecular non-endoscopic Barrett's esophagus (BE) detection tests, which sample the esophagus and gastroesophageal junction, early detection of EAC and GEJAC has become a possibility and their epidemiology has gained importance. AIMS: We sought to evaluate time trends in the epidemiology and survival of patients with EAC and GEJAC in a population-based cohort. METHODS: EAC and GEJAC patients from 1976 to 2019 were identified using ICD-9 and ICD-10 diagnostic codes from the Rochester Epidemiology Project (REP). Clinical data and survival status were abstracted. Poisson regression was used to calculate incidence rate ratios (IRRs). Survival analysis and Cox proportional hazards models were used to assess predictors of survival. RESULTS: We included 443 patients (287 EAC, 156 GEJAC). The incidence of EAC and GEJAC during 1976-2019 was 1.40 (CI 1.1-1.74) and 0.83 (CI 0.61-1.11) per 100,000 people, respectively. There was an increase in the incidence of EAC (IRR = 2.45, p = 0.011) and GEJAC (IRR = 3.17, p = 0.08) from 2000 to 2004 compared to 1995-1999, plateauing in later time periods. Most patients had associated BE and presented at advanced stages, leading to high 5-year mortality rates (66% in EAC and 59% in GEJAC). Age and stage at diagnosis were predictors of mortality. CONCLUSION: The rising incidence of EAC/GEJAC appears to have plateaued somewhat in the last decade. However, both cancers present at advanced stages with persistently poor survival, underscoring the need for early detection.
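
A minimal sketch of a Poisson regression yielding incidence rate ratios against a 1995-1999 reference period, using log person-years as an offset. The period counts below are invented placeholders, not Rochester Epidemiology Project data.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

# hypothetical period-level counts; not REP data
df = pd.DataFrame({
    "cases": [10, 25, 24, 26],
    "person_years": [900_000, 1_000_000, 1_050_000, 1_100_000],
    "period": ["1995-1999", "2000-2004", "2005-2009", "2010-2014"],
})

# Poisson model of case counts with log person-years as offset,
# giving IRRs for each period relative to 1995-1999
model = smf.glm(
    "cases ~ C(period, Treatment(reference='1995-1999'))",
    data=df,
    family=sm.families.Poisson(),
    offset=np.log(df["person_years"]),
).fit()

print(np.exp(model.params))      # incidence rate ratios vs 1995-1999
print(np.exp(model.conf_int()))  # 95% CIs
```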


Subject(s)
Adenocarcinoma , Barrett Esophagus , Esophageal Neoplasms , Humans , Esophageal Neoplasms/diagnosis , Esophageal Neoplasms/epidemiology , Esophageal Neoplasms/etiology , Barrett Esophagus/diagnosis , Barrett Esophagus/epidemiology , Barrett Esophagus/complications , Adenocarcinoma/pathology , Esophagogastric Junction/pathology
4.
J Cardiovasc Electrophysiol ; 34(5): 1196-1205, 2023 05.
Article in English | MEDLINE | ID: mdl-37130436

ABSTRACT

INTRODUCTION: Most patients undergoing a left atrial appendage occlusion (LAAO) procedure are admitted for overnight observation. A same-day discharge strategy offers the opportunity to improve resource utilization without compromising patient safety. We compared patient safety outcomes and post-discharge complications between same-day discharge and hospital admission (HA) (>1 day) in patients undergoing the LAAO procedure. METHODS: A systematic search of MEDLINE and Embase was conducted. Outcomes of interest included peri-procedural complications, re-admissions, discharge complications including major bleeding and vascular complications, ischemic stroke, all-cause mortality, and peri-device leak >5 mm. Mantel-Haenszel risk ratios (RRs) with 95% CIs were calculated. RESULTS: A total of seven observational studies met the inclusion criteria. There was no statistically significant difference between same-day discharge and HA regarding readmission (RR: 0.61; 95% confidence interval [CI]: [0.29-1.31]; p = .21), ischemic stroke after discharge (RR: 1.16; 95% CI: [0.49-2.73]), peri-device leak >5 mm (RR: 1.27; 95% CI: [0.42-3.85]), and all-cause mortality (RR: 0.60; 95% CI: [0.36-1.02]). The same-day discharge group had significantly lower major bleeding or vascular complications (RR: 0.71; 95% CI: [0.54-0.94]). CONCLUSIONS: This meta-analysis of seven observational studies showed no significant difference in patient safety outcomes and post-discharge complications between same-day discharge and HA. These findings provide a solid basis for a randomized controlled trial to eliminate any potential confounders.
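
A minimal sketch of the fixed-effect Mantel-Haenszel pooled risk ratio named in the methods above; the 2x2 counts are placeholders, not the seven included studies.

```python
# Mantel-Haenszel pooled risk ratio across strata (studies).
# Each study contributes (events_sdd, n_sdd, events_ha, n_ha); numbers are invented.
def mantel_haenszel_rr(studies):
    """studies: list of (events_group1, n_group1, events_group2, n_group2)."""
    num = den = 0.0
    for e1, n1, e2, n2 in studies:
        n = n1 + n2
        num += e1 * n2 / n    # weighted events in group 1
        den += e2 * n1 / n    # weighted events in group 2
    return num / den

example_studies = [
    (3, 120, 6, 130),
    (2, 200, 5, 210),
    (4, 150, 7, 160),
]
print(round(mantel_haenszel_rr(example_studies), 2))
```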


Subject(s)
Atrial Appendage , Atrial Fibrillation , Ischemic Stroke , Stroke , Humans , Stroke/etiology , Stroke/prevention & control , Atrial Fibrillation/diagnosis , Atrial Fibrillation/surgery , Atrial Fibrillation/complications , Atrial Appendage/surgery , Patient Discharge , Aftercare , Treatment Outcome , Observational Studies as Topic
5.
J Cardiovasc Electrophysiol ; 34(12): 2514-2526, 2023 Dec.
Article in English | MEDLINE | ID: mdl-37787013

ABSTRACT

BACKGROUND: Obesity is associated with an increased risk of developing recurrent atrial fibrillation (AF) after catheter ablation (CA). However, current data on weight loss interventions show inconsistent results in preventing the recurrence of AF after CA. METHODS: We conducted a systematic search in MEDLINE and EMBASE to identify studies that reported the outcome of recurrence of AF after CA in obese patients undergoing weight interventions. The subgroup analyses included: (1) weight loss versus no weight loss, (2) >10% weight loss versus <10% weight loss, (3) <10% weight loss versus no weight loss, (4) follow-up <12 months, and (5) follow-up >12 months after CA. Mantel-Haenszel risk ratios with 95% confidence intervals (CIs) were calculated using a random-effects model, and I2 statistics were reported for heterogeneity. RESULTS: A total of 10 studies (one randomized controlled trial and nine observational studies) comprising 1851 patients were included. The recurrence of AF was numerically reduced in the weight loss group (34.5%) versus the no weight loss group (58.2%), but no statistically significant difference was observed (risk ratio [RR] = 0.76; 95% CI: 0.49-1.18, p = .22). However, there was a statistically significant reduction in recurrence of AF with weight loss versus no weight loss at follow-up >12 months after CA (RR = 0.47; 95% CI: 0.32-0.68, p < .0001). At follow-up >12 months after CA, both >10% weight loss versus <10% weight loss (RR = 0.49; 95% CI: 0.31-0.80, p = .004) and <10% weight loss versus no weight loss (RR = 0.39; 95% CI: 0.31-0.49, p < .00001) were associated with a statistically significant reduction in recurrent AF. CONCLUSION: In patients with AF undergoing CA, weight loss is associated with reduced recurrence of AF at >12 months after ablation, and these benefits are seen consistently with both >10% and <10% weight loss. The benefits of weight loss in preventing recurrent AF after CA should be examined in larger studies with extended follow-up duration.
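
A minimal sketch of random-effects (DerSimonian-Laird) pooling of log risk ratios with an I² heterogeneity estimate, the approach named in the methods above; the study counts are invented placeholders, not the ten included studies.

```python
import numpy as np

def dersimonian_laird_rr(studies):
    """studies: list of (events_wl, n_wl, events_ctrl, n_ctrl) tuples."""
    y, v = [], []
    for a, n1, c, n2 in studies:
        y.append(np.log((a / n1) / (c / n2)))          # log risk ratio
        v.append(1 / a - 1 / n1 + 1 / c - 1 / n2)      # variance of log RR
    y, v = np.array(y), np.array(v)
    w = 1 / v
    y_fixed = np.sum(w * y) / np.sum(w)
    q = np.sum(w * (y - y_fixed) ** 2)                 # Cochran's Q
    k = len(y)
    tau2 = max(0.0, (q - (k - 1)) / (np.sum(w) - np.sum(w**2) / np.sum(w)))
    w_star = 1 / (v + tau2)                            # random-effects weights
    pooled = np.sum(w_star * y) / np.sum(w_star)
    se = np.sqrt(1 / np.sum(w_star))
    i2 = max(0.0, (q - (k - 1)) / q) * 100 if q > 0 else 0.0
    return np.exp(pooled), np.exp(pooled - 1.96 * se), np.exp(pooled + 1.96 * se), i2

# placeholder studies: (AF recurrences with weight loss, n, recurrences without, n)
studies = [(30, 90, 55, 95), (25, 80, 40, 85), (20, 100, 45, 110)]
rr, lo, hi, i2 = dersimonian_laird_rr(studies)
print(f"RR {rr:.2f} (95% CI {lo:.2f}-{hi:.2f}), I^2 = {i2:.0f}%")
```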


Subject(s)
Atrial Fibrillation , Catheter Ablation , Humans , Atrial Fibrillation/diagnosis , Atrial Fibrillation/surgery , Atrial Fibrillation/etiology , Treatment Outcome , Recurrence , Obesity/complications , Obesity/diagnosis , Catheter Ablation/adverse effects , Catheter Ablation/methods , Randomized Controlled Trials as Topic
6.
Gastrointest Endosc ; 98(5): 713-721, 2023 Nov.
Article in English | MEDLINE | ID: mdl-37356631

ABSTRACT

BACKGROUND AND AIMS: Endoscopic eradication therapy (EET) is guideline endorsed for management of early-stage (T1) esophageal adenocarcinoma (EAC). Patients with baseline high-grade dysplasia (HGD) and EAC are at highest risk of recurrence after successful EET, but limited data exist on long-term (>5 year) recurrence outcomes. Our aim was to assess the incidence and predictors of long-term recurrence in a multicenter cohort of patients with T1 EAC treated with EET. METHODS: Patients with T1 EAC achieving successful endoscopic cancer eradication with a minimum of 5 years' clinical follow-up were included. The primary outcome was neoplastic recurrence, defined as dysplasia or EAC, and it was characterized as early (<2 years), intermediate (2-5 years), or late (>5 years). Predictors of recurrence were assessed by time to event analysis. RESULTS: A total of 84 T1 EAC patients (75 T1a, 9 T1b) with a median 9.1 years (range, 5.1-18.3 years) of follow-up were included. The overall incidence of neoplastic recurrence was 2.0 per 100 person-years of follow-up. Seven recurrences (3 dysplasia, 4 EAC) occurred after 5 years of EAC remission. Overall, 88% of recurrences were treated successfully endoscopically. EAC recurrence-related mortality occurred in 3 patients at a median of 5.2 years from EAC remission. Complete eradication of intestinal metaplasia was independently associated with reduced recurrence (hazard ratio, .13). CONCLUSIONS: Following successful EET of T1 EAC, neoplastic recurrence occurred after 5 years in 8.3% of cases. Careful long-term surveillance should be continued in this patient population. Complete eradication of intestinal metaplasia should be the therapeutic end point for EET.

7.
Europace ; 25(4): 1415-1422, 2023 04 15.
Article in English | MEDLINE | ID: mdl-36881781

ABSTRACT

AIMS: To determine outcomes in atrial fibrillation patients undergoing percutaneous left atrial appendage occlusion (LAAO) based on the underlying stroke risk (defined by the CHA2DS2-VASc score). METHODS AND RESULTS: Data were extracted from the National Inpatient Sample for calendar years 2016-20. Left atrial appendage occlusion implantations were identified on the basis of the International Classification of Diseases, 10th Revision, Clinical Modification code of 02L73DK. The study sample was stratified on the basis of the CHA2DS2-VASc score into three groups (scores of 3, 4, and ≥5). The outcomes assessed in our study included complications and resource utilization. A total of 73 795 LAAO device implantations were studied. Approximately 63% of LAAO device implantations occurred in patients with CHA2DS2-VASc scores of 4 and ≥5. The crude prevalence of pericardial effusion requiring intervention was higher with increased CHA2DS2-VASc score (1.4% in patients with a score of ≥5 vs. 1.1% in patients with a score of 4 vs. 0.8% in patients with a score of 3, P < 0.01). In the multivariable model adjusted for potential confounders, CHA2DS2-VASc scores of 4 and ≥5 were found to be independently associated with overall complications [adjusted odds ratio (aOR) 1.26, 95% confidence interval (CI) 1.18-1.35, and aOR 1.88, 95% CI 1.73-2.04, respectively] and prolonged length of stay (aOR 1.18, 95% CI 1.11-1.25, and aOR 1.54, 95% CI 1.44-1.66, respectively). CONCLUSION: A higher CHA2DS2-VASc score was associated with an increased risk of peri-procedural complications and resource utilization after LAAO. These findings highlight the importance of patient selection for the LAAO procedure and need validation in future studies.


Subject(s)
Atrial Appendage , Atrial Fibrillation , Stroke , Humans , Atrial Fibrillation/complications , Atrial Fibrillation/diagnosis , Atrial Fibrillation/surgery , Stroke/diagnosis , Stroke/epidemiology , Stroke/etiology , Atrial Appendage/surgery , Retrospective Studies , Treatment Outcome
8.
Europace ; 25(7)2023 07 04.
Article in English | MEDLINE | ID: mdl-37341446

ABSTRACT

BACKGROUND AND AIMS: Colchicine is an anti-inflammatory drug that may prevent post-operative atrial fibrillation (POAF). The effect of this drug has been inconsistently shown in previous clinical trials. We aimed to compare the efficacy and safety of colchicine vs. placebo to prevent POAF in patients undergoing cardiac surgery. METHODS AND RESULTS: A systematic search of EMBASE, MEDLINE, SCOPUS, ClinicalTrials.gov, and the Cochrane Library for randomized controlled trials (RCTs) was conducted from inception to April 2023. The primary outcome was the incidence of POAF after any cardiac surgery. The secondary outcomes were the rate of drug discontinuation due to adverse events and the rate of adverse gastrointestinal events. Risk ratios (RRs) were reported using the Mantel-Haenszel method. A total of eight RCTs comprising 1885 patients were included. There was a statistically significant lower risk of developing POAF with colchicine vs. placebo (RR: 0.70; 95% CI: 0.59-0.82; P < 0.01, I2 = 0%), and this effect persisted across different subgroups. There was a significantly higher risk of adverse gastrointestinal events (RR: 2.20; 95% CI: 1.38-3.51; P < 0.01, I2 = 55%) with no difference in the risk of drug discontinuation in patients receiving colchicine vs. placebo (RR: 1.33; 95% CI: 0.93-1.89; P = 0.11, I2 = 0%). CONCLUSION: This meta-analysis of eight RCTs shows that colchicine is effective at preventing POAF, with a significantly higher risk of adverse gastrointestinal events but no difference in the rate of drug discontinuation. Future studies are required to define the optimal duration and dose of colchicine for the prevention of POAF.


Subject(s)
Atrial Fibrillation , Cardiac Surgical Procedures , Humans , Colchicine/adverse effects , Atrial Fibrillation/diagnosis , Atrial Fibrillation/epidemiology , Atrial Fibrillation/etiology , Randomized Controlled Trials as Topic , Cardiac Surgical Procedures/adverse effects , Incidence
9.
Pacing Clin Electrophysiol ; 46(5): 422-424, 2023 05.
Article in English | MEDLINE | ID: mdl-36932820

ABSTRACT

BACKGROUND: The implications of left bundle branch block (LBBB) in heart failure with preserved ejection fraction (HFpEF) are unclear. Our study assesses clinical outcomes among patients with LBBB and HFpEF who were admitted with acute decompensated heart failure. METHODS: This cross-sectional study was conducted using the National Inpatient Sample (NIS) database from 2016-2019. RESULTS: We found 74,365 hospitalizations with HFpEF and LBBB and 3,892,354 hospitalizations with HFpEF without LBBB. Patients with LBBB were older (78.9 vs 74.2 years) and had higher rates of coronary artery disease (53.05% vs 40.8%), hypertension (74.7% vs 70.8%), atrial fibrillation (32.8% vs 29.4%), sick sinus syndrome (3.4% vs 2.02%), complete heart block (1.8% vs 0.66%), ventricular tachycardia (3.5% vs 1.7%), and ventricular fibrillation (0.24% vs 0.11%). Patients with LBBB were found to have decreased in-hospital mortality (OR: 0.85; 0.76-0.96; p = 0.009) but higher rates of cardiac arrest (OR: 1.39; 1.06-1.83; p = 0.02) and need for mechanical circulatory support (OR: 1.7; 1.28-2.36; p = 0.001). Patients with LBBB underwent higher rates of pacemaker (OR: 2.98; 2.75-3.23; p < 0.001) and implantable cardioverter-defibrillator (ICD) placement (OR: 3.98; 2.81-5.62; p < 0.001). Patients with LBBB were also found to have a higher mean cost of hospitalization ($81,402 vs $60,358; p < 0.001) but a shorter length of stay (4.8 vs 5.4 days; p < 0.001). CONCLUSION: In patients admitted with decompensated heart failure with preserved ejection fraction, left bundle branch block is associated with increased odds of cardiac arrest, mechanical circulatory support requirement, device implantation, and mean cost of hospitalization but decreased odds of in-hospital mortality.


Subject(s)
Heart Arrest , Heart Failure , Humans , Bundle-Branch Block , Heart Failure/complications , Heart Failure/therapy , Stroke Volume , Cross-Sectional Studies , Treatment Outcome
10.
Pacing Clin Electrophysiol ; 46(10): 1242-1245, 2023 10.
Article in English | MEDLINE | ID: mdl-37695052

ABSTRACT

The association of psychosocial risk factors with cardiovascular disease is well-established, and there is growing recognition of their influence on atrial fibrillation (AF). A recent National Heart, Lung, and Blood Institute workshop called for transforming AF research to integrate social determinants of health. There are limited data examining the impact of psychosocial risk factors (PSRFs) on outcomes in patients with an established diagnosis of AF. Catheter ablation for AF has been shown to improve arrhythmia burden and quality of life compared with medical treatment alone. It is unknown how PSRFs affect clinical outcomes in patients undergoing AF ablation. It is important to understand this relationship, especially given the increasing adoption of catheter ablation in clinical practice.


Subject(s)
Atrial Fibrillation , Catheter Ablation , Humans , Quality of Life , Treatment Outcome , Risk Factors , Catheter Ablation/adverse effects , Recurrence
11.
Neuroimage ; 249: 118871, 2022 04 01.
Article in English | MEDLINE | ID: mdl-34995797

ABSTRACT

Convolutional neural networks (CNNs) can accurately predict chronological age in healthy individuals from structural MRI brain scans. Potentially, these models could be applied during routine clinical examinations to detect deviations from healthy ageing, including early-stage neurodegeneration. This could have important implications for patient care, drug development, and optimising MRI data collection. However, existing brain-age models are typically optimised for scans which are not part of routine examinations (e.g., volumetric T1-weighted scans), generalise poorly (e.g., to data from different scanner vendors and hospitals), or rely on computationally expensive pre-processing steps which limit real-time clinical utility. Here, we sought to develop a brain-age framework suitable for use during routine clinical head MRI examinations. Using a deep learning-based neuroradiology report classifier, we generated a dataset of 23,302 'radiologically normal for age' head MRI examinations from two large UK hospitals for model training and testing (age range = 18-95 years), and demonstrate fast (< 5 s), accurate (mean absolute error [MAE] < 4 years) age prediction from clinical-grade, minimally processed axial T2-weighted and axial diffusion-weighted scans, with generalisability between hospitals and scanner vendors (Δ MAE < 1 year). The clinical relevance of these brain-age predictions was tested using 228 patients whose MRIs were reported independently by neuroradiologists as showing atrophy 'excessive for age'. These patients had systematically higher brain-predicted age than chronological age (mean predicted age difference = +5.89 years, 'radiologically normal for age' mean predicted age difference = +0.05 years, p < 0.0001). Our brain-age framework demonstrates feasibility for use as a screening tool during routine hospital examinations to automatically detect older-appearing brains in real-time, with relevance for clinical decision-making and optimising patient pathways.
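
A minimal sketch of the general idea of CNN-based brain-age regression: a small 3D network mapping a scan volume to a single predicted age, trained with an L1 (MAE) objective. This is not the authors' architecture or pre-processing; dimensions and values are toy placeholders.

```python
import torch
import torch.nn as nn

class TinyBrainAgeCNN(nn.Module):
    """Toy 3D CNN that regresses a single age value from a volume."""
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv3d(1, 16, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool3d(2),
            nn.Conv3d(16, 32, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool3d(2),
            nn.AdaptiveAvgPool3d(1),
        )
        self.regressor = nn.Linear(32, 1)    # single output: predicted age

    def forward(self, x):                    # x: (batch, 1, D, H, W)
        h = self.features(x).flatten(1)
        return self.regressor(h).squeeze(1)

model = TinyBrainAgeCNN()
scan = torch.randn(2, 1, 32, 64, 64)         # toy input, not clinical dimensions
pred_age = model(scan)
loss = nn.L1Loss()(pred_age, torch.tensor([63.0, 71.0]))  # MAE training objective
print(pred_age.shape, float(loss))
```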


Subject(s)
Aging , Brain/diagnostic imaging , Human Development , Magnetic Resonance Imaging , Neuroimaging , Adolescent , Adult , Age Factors , Aged , Aged, 80 and over , Aging/pathology , Aging/physiology , Deep Learning , Human Development/physiology , Humans , Magnetic Resonance Imaging/methods , Magnetic Resonance Imaging/standards , Middle Aged , Neuroimaging/methods , Neuroimaging/standards , Young Adult
12.
Clin Gastroenterol Hepatol ; 20(11): 2644-2646.e1, 2022 Nov.
Article in English | MEDLINE | ID: mdl-34481958

ABSTRACT

The Los Angeles (LA) classification is the most accurate means of assessing esophageal injury from caustic gastric acid, with focused and greater concentrations in areas of erosive disease.1 However, data from animal models and patients have proposed that an initial diffuse inflammatory pathway contributes to injury in gastroesophageal reflux disease (GERD) mediated by interleukin (IL) 8, IL1ß,2,3 and hypoxia-inducible factors.4,5 These observations demonstrate a lymphocyte-predominant inflammatory process over the course of 1-2 weeks associated with basal zone hyperplasia and dilation of intercellular spaces.6 In cultured human esophageal epithelial cells and patients, it is further suggested that acid causes this chronic inflammatory reaction.


Subject(s)
Gastroesophageal Reflux , Animals , Humans , Gastroesophageal Reflux/diagnostic imaging , Gastroesophageal Reflux/complications , Positron-Emission Tomography
13.
Clin Gastroenterol Hepatol ; 20(12): 2772-2779.e8, 2022 12.
Article in English | MEDLINE | ID: mdl-35217151

ABSTRACT

BACKGROUND & AIMS: Prediction of progression risk in Barrett's esophagus (BE) may enable personalized management. We aimed to assess the adjunct value of a tissue systems pathology test (TissueCypher) performed on paraffin-embedded biopsy tissue, when added to expert pathology review in predicting incident progression, pooling individual patient-level data from multiple international studies METHODS: Demographics, clinical features, the TissueCypher risk class/score, and progression status were analyzed. Conditional logistical regression analysis was used to develop multivariable models predicting incident progression with and without the TissueCypher risk class (low, intermediate, high). Concordance (c-) statistics were calculated and compared with likelihood ratio tests to assess predictive ability of models. A risk prediction calculator integrating clinical variables and TissueCypher risk class was also developed. RESULTS: Data from 552 patients with baseline no (n = 472), indefinite (n = 32), or low-grade dysplasia (n = 48) (comprising 152 incident progressors and 400 non-progressors) were analyzed. A high-risk test class independently predicted increased risk of progression to high-grade dysplasia/adenocarcinoma (odds ratio, 6.0; 95% confidence interval, 2.9-12.0), along with expert confirmed low-grade dysplasia (odds ratio, 2.9; 95% confidence interval, 1.2-7.2). Model prediction of progression with the TissueCypher risk class incorporated was significantly superior than without, in the whole cohort (c-statistic 0.75 vs 0.68; P < .0001) and the nondysplastic BE subset (c-statistic 0.72 vs 0.63; P < .0001). Sensitivity and specificity of the high risk TissueCypher class were 38% and 94%, respectively. CONCLUSIONS: An objective tissue systems pathology test high-risk class is a strong independent predictor of incident progression in patients with BE, substantially improving progression risk prediction over clinical variables alone. Although test specificity was high, sensitivity was modest.


Subject(s)
Adenocarcinoma , Barrett Esophagus , Esophageal Neoplasms , Precancerous Conditions , Humans , Barrett Esophagus/diagnosis , Barrett Esophagus/pathology , Esophageal Neoplasms/diagnosis , Esophageal Neoplasms/epidemiology , Esophageal Neoplasms/pathology , Precancerous Conditions/pathology , Disease Progression , Adenocarcinoma/pathology
14.
Clin Gastroenterol Hepatol ; 20(12): 2763-2771.e3, 2022 12.
Article in English | MEDLINE | ID: mdl-35245702

ABSTRACT

BACKGROUND & AIMS: Recommended surveillance intervals after complete eradication of intestinal metaplasia (CE-IM) after endoscopic eradication therapy (EET) are largely not evidence-based. Using recurrence rates in a multicenter international Barrett's esophagus (BE) CE-IM cohort, we aimed to generate optimal intervals for surveillance. METHODS: Patients with dysplastic BE undergoing EET and achieving CE-IM from prospectively maintained databases at 5 tertiary-care centers in the United States and the United Kingdom were included. The cumulative incidence of recurrence was estimated, accounting for the unknown date of actual recurrence that lies between the dates of current and previous endoscopy. This cumulative incidence of recurrence subsequently was used to estimate the proportion of patients with undetected recurrence for various surveillance intervals over 5 years. Intervals were selected that minimized recurrences remaining undetected for more than 6 months. Actual patterns of post-CE-IM follow-up evaluation are described. RESULTS: A total of 498 patients (with baseline low-grade dysplasia, 115 patients; high-grade dysplasia [HGD], 288 patients; and intramucosal adenocarcinoma [IMCa], 95 patients) were included. Any recurrence occurred in 27.1% and dysplastic recurrence occurred in 8.4% over a median of 2.6 years of follow-up evaluation. For pre-ablation HGD/IMCa, intervals of 6, 12, 18, and 24 months, and then annually, resulted in no patients with dysplastic recurrence undetected for more than 6 months, comparable with current guideline recommendations despite a 33% reduction in the number of surveillance endoscopies. For pre-ablation low-grade dysplasia, intervals of 1, 2, and 4 years balanced endoscopic burden and undetected recurrence risk. CONCLUSIONS: Lengthening post-CE-IM surveillance intervals would reduce the endoscopic burden after CE-IM with comparable rates of recurrent HGD/IMCa. Future guidelines should consider reduced surveillance frequency.


Subject(s)
Adenocarcinoma , Barrett Esophagus , Esophageal Neoplasms , Humans , Barrett Esophagus/epidemiology , Cohort Studies , Esophageal Neoplasms/diagnosis , Esophageal Neoplasms/epidemiology , Esophageal Neoplasms/surgery , Metaplasia , Adenocarcinoma/pathology , Endoscopy, Gastrointestinal , Hyperplasia , Esophagoscopy/methods
15.
Am J Gastroenterol ; 117(7): 1154-1157, 2022 07 01.
Article in English | MEDLINE | ID: mdl-35311761

ABSTRACT

INTRODUCTION: To describe the clinical, endoscopic, and histopathology features of esophageal graft-vs-host disease (GVHD). METHODS: Patients with biopsy-proven esophageal GVHD diagnosed at Mayo Clinic between 2000 and 2021 were included. RESULTS: Among 43 patients with esophageal GVHD, other organ GVHD was present in 58% before and 86% at esophageal GVHD diagnosis. Esophageal-specific symptoms were uncommon (dysphagia 26% and odynophagia/heartburn 5%). Esophagogastroduodenoscopy was abnormal in 72% of patients, demonstrating erosive esophagitis, ulceration, desquamation, or rings/furrows in a diffuse or focal pattern. DISCUSSION: There should be a low threshold for esophageal biopsies for GVHD because esophageal symptoms and endoscopic findings may be nonspecific or absent.


Subject(s)
Deglutition Disorders , Esophagitis , Graft vs Host Disease , Biopsy , Deglutition Disorders/etiology , Esophagitis/complications , Graft vs Host Disease/complications , Graft vs Host Disease/diagnosis , Graft vs Host Disease/pathology , Heartburn/etiology , Humans , Retrospective Studies
16.
Gastrointest Endosc ; 96(6): 918-925.e3, 2022 12.
Article in English | MEDLINE | ID: mdl-35718071

ABSTRACT

BACKGROUND AND AIMS: The risk of progression in Barrett's esophagus (BE) increases with development of dysplasia. There is a critical need to improve the diagnosis of BE dysplasia, given substantial interobserver disagreement among expert pathologists and overdiagnosis of dysplasia by community pathologists. We developed a deep learning model to predict dysplasia grade on whole-slide imaging. METHODS: We digitized nondysplastic BE (NDBE), low-grade dysplasia (LGD), and high-grade dysplasia (HGD) histology slides. Two expert pathologists confirmed all histology and digitally annotated areas of dysplasia. Training, validation, and test sets were created by a random 70/20/10 split. We used an ensemble approach combining a "you only look once" model to identify regions of interest and histology class (NDBE, LGD, or HGD) followed by a ResNet101 model pretrained on ImageNet applied to the regions of interest. Diagnostic performance was determined for the whole slide. RESULTS: We included slides from 542 patients (164 NDBE, 226 LGD, and 152 HGD), yielding 8596 bounding boxes in the training set, 1946 bounding boxes in the validation set, and 840 bounding boxes in the test set. When the ensemble model was used, sensitivity and specificity for LGD were 81.3% and 100%, respectively, and >90% for NDBE and HGD. The overall positive predictive value and sensitivity metric (calculated as the F1 score) was .91 for NDBE, .90 for LGD, and 1.0 for HGD. CONCLUSIONS: We successfully trained and validated a deep learning model to accurately identify dysplasia on whole-slide images. This model can potentially help improve the histologic diagnosis of BE dysplasia and the appropriate application of endoscopic therapy.
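
A minimal sketch of the classification stage of the ensemble described above: an ImageNet-pretrained ResNet101 re-headed for three classes (NDBE, LGD, HGD) and applied to regions of interest assumed to have been cropped by a separate YOLO-style detector (not shown). The slide-level aggregation rule here is an illustrative assumption, not the authors' method.

```python
import torch
import torch.nn as nn
from torchvision import models

# ImageNet-pretrained ResNet101 with a new 3-class head for NDBE / LGD / HGD
resnet = models.resnet101(weights=models.ResNet101_Weights.IMAGENET1K_V1)
resnet.fc = nn.Linear(resnet.fc.in_features, 3)

resnet.eval()
roi_batch = torch.randn(4, 3, 224, 224)       # toy ROI crops, not real histology
with torch.no_grad():
    logits = resnet(roi_batch)
    per_roi_class = logits.argmax(dim=1)      # predicted class per region of interest

# illustrative slide-level call: take the worst (highest-grade) ROI prediction
slide_grade = int(per_roi_class.max())
print(per_roi_class.tolist(), slide_grade)
```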


Subject(s)
Adenocarcinoma , Barrett Esophagus , Deep Learning , Esophageal Neoplasms , Humans , Barrett Esophagus/diagnosis , Barrett Esophagus/pathology , Esophageal Neoplasms/pathology , Adenocarcinoma/pathology , Disease Progression , Hyperplasia
17.
Gastrointest Endosc ; 95(3): 422-431.e2, 2022 03.
Article in English | MEDLINE | ID: mdl-34624303

ABSTRACT

BACKGROUND AND AIMS: Strong evidence supports the use of radiofrequency ablation (RFA) in the management of dysplastic/neoplastic Barrett's esophagus (BE). Recently, the efficacy of the cryoballoon ablation (CBA) system was demonstrated in multicenter cohort studies. We aimed to assess the comparative effectiveness and safety of these 2 ablation modalities for endoscopic eradication therapy (EET) in a cohort study. METHODS: Data were abstracted on patients with dysplastic BE or intramucosal carcinoma undergoing EET using RFA or CBA as the primary ablation modality at 2 referral centers. The primary outcome was the rate of complete remission of intestinal metaplasia (CRIM). Secondary outcomes were rates of complete remission of dysplasia (CRD) and adverse events. Cox proportional hazards models and propensity score-matched analyses were conducted to compare outcomes. RESULTS: Three hundred eleven patients (CBA, 85 patients; RFA, 226 patients) with a median follow-up of 1.5 years (interquartile range, .8, 2.5) in the RFA group and 2.0 years (interquartile range, 1.3, 2.5) in the CBA group were studied. On multivariable analyses, the chances of reaching CRD and CRIM were not influenced by ablation modality. Propensity score-matched analysis revealed a comparable chance of achieving CRIM (CBA vs RFA: hazard ratio, 1.24; 95% confidence interval, .79-1.96; P = .35) and CRD (CBA vs RFA: hazard ratio, 1.19; 95% confidence interval, .82-1.73; P = .36). The CBA group had a higher stricture rate compared with the RFA group (10.4% vs 4.4%, P = .04). CONCLUSIONS: Histologic outcomes of EET using CBA and RFA for dysplastic BE appear to be comparable. A randomized trial is needed to definitively compare outcomes between these 2 modalities.
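
A minimal sketch, on synthetic data, of the two analytic steps named above: 1:1 nearest-neighbour propensity-score matching of CBA to RFA patients on baseline covariates, followed by a Cox proportional hazards model for time to CRIM in the matched sample. The covariates, matching details, and variable names are illustrative assumptions, not the study's cohort or protocol.

```python
import numpy as np
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.neighbors import NearestNeighbors
from lifelines import CoxPHFitter

rng = np.random.default_rng(0)
n = 400
df = pd.DataFrame({
    "cba": rng.integers(0, 2, n),                  # 1 = cryoballoon, 0 = RFA
    "age": rng.normal(65, 8, n),
    "segment_length": rng.normal(4, 2, n).clip(1),
    "time_to_crim": rng.exponential(2.0, n),       # years of follow-up
    "crim": rng.integers(0, 2, n),                 # 1 = CRIM reached
})

# propensity of receiving CBA given baseline covariates
ps_model = LogisticRegression().fit(df[["age", "segment_length"]], df["cba"])
df["ps"] = ps_model.predict_proba(df[["age", "segment_length"]])[:, 1]

# simple 1:1 nearest-neighbour matching on the propensity score (with replacement)
treated, control = df[df["cba"] == 1], df[df["cba"] == 0]
nn = NearestNeighbors(n_neighbors=1).fit(control[["ps"]])
_, idx = nn.kneighbors(treated[["ps"]])
matched = pd.concat([treated, control.iloc[idx.ravel()]]).reset_index(drop=True)

# Cox model for time to CRIM in the matched sample; HR compares CBA vs RFA
cph = CoxPHFitter()
cph.fit(matched[["time_to_crim", "crim", "cba"]],
        duration_col="time_to_crim", event_col="crim")
print(cph.hazard_ratios_)
```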


Subject(s)
Barrett Esophagus , Catheter Ablation , Esophageal Neoplasms , Barrett Esophagus/pathology , Catheter Ablation/adverse effects , Cohort Studies , Esophageal Neoplasms/pathology , Esophagoscopy/adverse effects , Humans , Propensity Score , Treatment Outcome
18.
Am J Gastroenterol ; 116(12): 2465-2469, 2021 12 01.
Article in English | MEDLINE | ID: mdl-34534126

ABSTRACT

INTRODUCTION: We examined national Google Trends and local healthcare utilization after 3 high-impact gastroenterology publications. METHODS: Changes in US Google Trends and Olmsted County healthcare utilization were studied. RESULTS: Publication views within 30 days were 51,458 (Imperiale), 49,759 (Pimentel), and 18,750 (Gomm). Colonoscopy searches (P = 0.04) and Cologuard tests performed (P < 0.01) increased while colonoscopies decreased (P < 0.01). Searches for rifaximin (P = 0.05), irritable bowel syndrome (P < 0.01), and diarrhea (P < 0.01) increased, as did rifaximin prescriptions (P = 0.02). Increases in histamine-2 blocker searches (P = 0.02) and prescriptions (P < 0.01) and in gastroesophageal reflux disease (P < 0.01) and dementia office visits (P < 0.01) occurred. DISCUSSION: High-impact gastroenterology publications influence Google searches and local population-based healthcare utilization.


Subject(s)
Digestive System Diseases/therapy , Gastroenterology , Patient Acceptance of Health Care/statistics & numerical data , Periodicals as Topic , Search Engine/trends , Humans
19.
J Urban Health ; 97(3): 348-357, 2020 06.
Article in English | MEDLINE | ID: mdl-32333243

ABSTRACT

The informal settlements of the Global South are the least prepared for the pandemic of COVID-19 since basic needs such as water, toilets, sewers, drainage, waste collection, and secure and adequate housing are already in short supply or non-existent. Further, space constraints, violence, and overcrowding in slums make physical distancing and self-quarantine impractical, and the rapid spread of an infection highly likely. Residents of informal settlements are also economically vulnerable during any COVID-19 responses. Any responses to COVID-19 that do not recognize these realities will further jeopardize the survival of large segments of the urban population globally. Most top-down strategies to arrest an infectious disease will likely ignore the often-robust social groups and knowledge that already exist in many slums. Here, we offer a set of practice and policy suggestions that aim to (1) dampen the spread of COVID-19 based on the latest available science, (2) improve the likelihood of medical care for the urban poor whether or not they get infected, and (3) provide economic, social, and physical improvements and protections to the urban poor, including migrants, slum communities, and their residents, that can improve their long-term well-being. Immediate measures to protect residents of urban informal settlements, the homeless, those living in precarious settlements, and the entire population from COVID-19 include the following: (1) institute informal settlements/slum emergency planning committees in every urban informal settlement; (2) apply an immediate moratorium on evictions; (3) provide an immediate guarantee of payments to the poor; (4) immediately train and deploy community health workers; (5) immediately meet Sphere Humanitarian standards for water, sanitation, and hygiene; (6) provide immediate food assistance; (7) develop and implement a solid waste collection strategy; and (8) implement immediately a plan for mobility and health care. Lessons have been learned from earlier pandemics such as HIV and epidemics such as Ebola. They can be applied here. At the same time, the opportunity exists for public health, public administration, international aid, NGOs, and community groups to innovate beyond disaster response and move toward long-term plans.


Subject(s)
Coronavirus Infections/prevention & control , Pandemics/prevention & control , Pneumonia, Viral/prevention & control , Poverty Areas , Urban Population , Betacoronavirus , COVID-19 , Health Services Accessibility/organization & administration , Housing/standards , Humans , SARS-CoV-2 , Sanitation/methods , Urban Health , Vulnerable Populations
20.
J Am Chem Soc ; 141(19): 7831-7841, 2019 05 15.
Article in English | MEDLINE | ID: mdl-31042366

ABSTRACT

Living cells have the ability to control the dynamics of responsive assemblies such as the cytoskeleton by temporally activating and deactivating inert precursors. While DNA nanotechnology has demonstrated many synthetic supramolecular assemblies that rival biological ones in size and complexity, dynamic control of their formation is still challenging. Taking inspiration from nature, we developed a DNA-RNA nanotube system whose assembly and disassembly can be temporally controlled at physiological temperature using transcriptional programs. Nanotubes assemble when inert DNA monomers are directly and selectively activated by RNA molecules that become embedded in the structure, producing hybrid DNA-RNA assemblies. The reactions and molecular programs controlling nanotube formation are fueled by enzymes that produce or degrade RNA. We show that the speed of assembly and disassembly of the nanotubes can be controlled by tuning various reaction parameters in the transcriptional programs. We anticipate that these hybrid structures are a starting point to build integrated biological circuits and functional scaffolds inside natural and artificial cells, where RNA produced by gene networks could fuel the assembly of nucleic acid components on demand.


Subject(s)
DNA/chemistry , Enzymes/metabolism , Nanotechnology , Nanotubes/chemistry , RNA/chemistry , DNA/metabolism , Kinetics , Models, Molecular , Nucleic Acid Conformation , RNA/metabolism , Temperature