1.
Dig Dis Sci; 67(11): 5053-5062, 2022 Nov.
Article in English | MEDLINE | ID: mdl-35182250

ABSTRACT

BACKGROUND AND AIMS: The Coronavirus disease 2019 (COVID-19) pandemic led to the restructuring of most healthcare systems, but the impact on patients undergoing inpatient endoscopic procedures is unknown. We sought to identify factors associated with 30-day mortality among patients undergoing inpatient endoscopy before and during the first wave of the pandemic within an academic tertiary care center. METHODS: We studied patients who underwent inpatient endoscopic procedures from March 1 to May 31, 2020 (COVID-19 era), the peak of the pandemic's first wave at the center studied, and from March 1 to May 31 in 2018 and 2019 (control). Patient demographics and hospitalization/procedure data were compared between groups. Cox regression analyses were conducted to identify factors associated with 30-day mortality. RESULTS: Inpatient endoscopy volume decreased in 2020, with a higher proportion of urgent procedures, an increased proportion of patients receiving blood transfusions, and a 10.1% mortality rate. In 2020, male gender, greater distance from the hospital, need for intensive care unit (ICU) admission, and procedures conducted outside the endoscopy suite were associated with increased risk of 30-day mortality. CONCLUSIONS: Patients undergoing endoscopy during the pandemic had higher proportions of ICU admission, more urgent indications, and higher rates of 30-day mortality. The greater proportion of urgent endoscopy cases may reflect hospital restructuring or patient reluctance to seek hospital care during a pandemic. Demographic and procedural characteristics associated with higher mortality risk may be potential targets for improving outcomes during future pandemic-related hospital restructuring efforts.
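A minimal sketch of the kind of Cox regression described above, assuming a per-patient extract with 30-day follow-up; the file name and column names (followup_days, death_within_30d, male_sex, distance_from_hospital_km, icu_admission, outside_endoscopy_suite) are hypothetical placeholders rather than the study's variables.

```python
# Illustrative Cox proportional hazards model for 30-day mortality, in the
# spirit of the analysis summarized above. All names are hypothetical, and the
# fitted hazard ratios will of course differ from the published results.
import pandas as pd
from lifelines import CoxPHFitter

df = pd.read_csv("inpatient_endoscopy_cohort.csv")  # hypothetical extract

covariates = ["male_sex", "distance_from_hospital_km",
              "icu_admission", "outside_endoscopy_suite"]

cph = CoxPHFitter()
cph.fit(
    df[["followup_days", "death_within_30d"] + covariates],
    duration_col="followup_days",  # follow-up time, capped at 30 days
    event_col="death_within_30d",  # 1 = died within 30 days, 0 = censored
)
cph.print_summary()  # hazard ratios for the factors reported in the abstract
```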


Subject(s)
COVID-19; Pandemics; Humans; Male; COVID-19/epidemiology; Inpatients; Endoscopy, Gastrointestinal; Intensive Care Units; Retrospective Studies
2.
Clin Gastroenterol Hepatol; 18(5): 1091-1098.e1, 2020 May.
Article in English | MEDLINE | ID: mdl-31352090

ABSTRACT

BACKGROUND & AIMS: Guidelines recommend testing patients with peptic ulcer disease for Helicobacter pylori infection. We sought to identify factors associated with adherence to testing for H pylori in patients hospitalized for bleeding ulcers and to evaluate whether performing these tests affect risk for rebleeding. METHODS: We performed a retrospective study of 830 inpatients who underwent endoscopy from 2011 through 2016 for gastrointestinal bleeding from gastric or duodenal ulcers. We searched electronic medical records for evidence of tests to detect H pylori by biopsy, serologic, or stool antigen analyses. We used multivariable models to identify clinical, demographic, and endoscopic factors associated with testing for H pylori. Kaplan-Meier analysis was performed to determine whether H pylori testing altered risk for the composite outcome of rebleeding or death within 1 year of admission. RESULTS: Among the patients hospitalized for bleeding peptic ulcer disease during the 6-year period, 19% were not tested for H pylori within 60 days of index endoscopy. Hospitalization in the intensive care unit (ICU) was the factor most frequently associated with nonadherence to H pylori testing guidelines (only 66% of patients in the ICU were tested vs 90% of patients not in the ICU; P < .01), even after we adjusted for ulcer severity, coagulation status, extent of blood loss, and additional factors (adjusted odds ratio, 0.42; 95% CI, 0.27-0.66). Testing for H pylori was associated with a 51% decreased risk of rebleeding or death during the year after admission (adjusted hazard ratio 0.49; 95% CI, 0.36-0.67). CONCLUSIONS: In an analysis of hospitalized patients who underwent endoscopy for gastrointestinal bleeding from gastric or duodenal ulcers, we found admission to the ICU to be associated with failure to test for H pylori infection. Failure to test for H pylori was independently associated with increased risk of rebleeding or death within 1 year of hospital admission. We need strategies to increase testing for H pylori among inpatients with bleeding ulcers.


Subject(s)
Helicobacter Infections; Helicobacter pylori; Peptic Ulcer; Helicobacter Infections/complications; Helicobacter Infections/diagnosis; Hospitalization; Humans; Peptic Ulcer/complications; Retrospective Studies
10.
Clin Gastroenterol Hepatol; 19(6): 1298, 2021 Jun.
Article in English | MEDLINE | ID: mdl-33249021
11.
Liver Transpl; 22(8): 1085-1091, 2016 Aug.
Article in English | MEDLINE | ID: mdl-27302834

ABSTRACT

Identifying which liver transplantation (LT) candidates with severe kidney injury will have a full recovery of renal function after liver transplantation alone (LTA) is difficult. Avoiding unnecessary simultaneous liver-kidney transplantation (SLKT) can optimize the use of scarce kidney grafts. Incorrect predictions of spontaneous renal recovery after LTA can lead to increased morbidity and mortality. We retrospectively analyzed all LTA patients at our institution from February 2002 to February 2013 (n = 583) and identified a cohort with severe subacute renal injury (n = 40; creatinine <2 mg/dL in the 14-89 days prior to LTA and not yet on renal replacement therapy [RRT], then ≥2 mg/dL within 14 days of LTA and/or on RRT). Of 40 LTA recipients, 26 (65%) had renal recovery and 14 (35%) did not. The median (interquartile range) warm ischemia time (WIT) in recipients with and without renal recovery after LTA was 31 minutes (24-46 minutes) and 39 minutes (34-49 minutes; P = 0.02), respectively. Adjusting for the severity of the subacute kidney injury with either Acute Kidney Injury Network or Risk, Injury, Failure, Loss, and End-Stage Kidney Disease criteria, increasing WIT was associated with lack of renal recovery (recovery defined as serum creatinine <2 mg/dL after LTA without RRT), with an odds ratio (OR) of 1.08 (1.01-1.16; P = 0.03) and 1.09 (1.01-1.17; P = 0.02), respectively. For each minute of increased WIT, there was an 8%-9% increase in the risk of lack of renal recovery after LTA. In a separate cohort of 98 LTA recipients with subacute kidney injury, we confirmed the association of WIT and lack of renal recovery (OR, 1.04; P = 0.04). In LT candidates with severe subacute renal injury, operative measures to minimize WIT may improve renal recovery, potentially avoiding RRT and the need for subsequent kidney transplant.
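To make the per-minute odds ratio concrete, the small calculation below compounds it over the 8-minute gap between the median WITs reported above; the choice of an 8-minute difference is purely illustrative.

```python
# Illustrative arithmetic: compounding a per-minute odds ratio over a longer
# warm ischemia time (WIT) difference. The 1.08/minute OR is quoted from the
# abstract; the 8-minute gap (39 vs 31 minute medians) is used only as an example.
per_minute_or = 1.08
wit_difference_min = 39 - 31

compounded_or = per_minute_or ** wit_difference_min
print(f"OR for an extra {wit_difference_min} min of WIT: {compounded_or:.2f}")
# ~1.85, i.e. roughly 85% higher odds of failing to recover renal function
```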


Subject(s)
Acute Kidney Injury/diagnosis; End Stage Liver Disease/surgery; Kidney/physiopathology; Liver Transplantation/adverse effects; Recovery of Function; Warm Ischemia/adverse effects; Acute Kidney Injury/blood; Acute Kidney Injury/mortality; Adult; Creatinine/blood; End Stage Liver Disease/mortality; Female; Humans; Kidney Function Tests; Liver Transplantation/methods; Male; Middle Aged; Practice Guidelines as Topic; Retrospective Studies; Risk Factors; Severity of Illness Index; Time Factors; Treatment Outcome
12.
Liver Transpl; 21(8): 1022-1030, 2015 Aug.
Article in English | MEDLINE | ID: mdl-26074140

ABSTRACT

Donor age has become the dominant donor factor used to predict graft failure (GF) after liver transplantation (LT) in hepatitis C virus (HCV) recipients. The purpose of this study was to develop and validate a model of corrected donor age (CDA) for HCV LT recipients that transforms the risk of other donor factors into the scale of donor age. We analyzed all first LT recipients with HCV in the United Network for Organ Sharing (UNOS) registry from January 1998 to December 2007 (development cohort, n = 14,538) and January 2008 to December 2011 (validation cohort, n = 7502) using Cox regression, excluding early GF (<90 days from LT). Accuracy in predicting 1-year GF (death or repeat LT) was assessed with the net reclassification index (NRI). In the development cohort, after controlling for pre-LT recipient factors and geotemporal trends (UNOS region, LT year), the following donor factors were independent predictors of GF, all P < 0.05: donor age (hazard ratio [HR], 1.02/year), donation after cardiac death (DCD; HR, 1.31), diabetes (HR, 1.23), height < 160 cm (HR, 1.13), aspartate aminotransferase (AST) ≥ 120 U/L (HR, 1.10), female (HR, 0.94), cold ischemia time (CIT; HR, 1.02/hour), and non-African American (non-AA) donor-African American (AA) recipient (HR, 1.65). Transforming these risk factors into the donor age scale yielded the following: DCD = +16 years; diabetes = +12 years; height < 160 cm = +7 years; AST ≥ 120 U/L = +5 years; female = -4 years; and CIT = +1 year/hour > 8 hours and -1 year/hour < 8 hours. There was a large effect of donor-recipient race combinations: +29 years for a non-AA donor and an AA recipient, but only +5 years for an AA donor and an AA recipient, and -2 years for an AA donor and a non-AA recipient. In the validation cohort, CDA better classified risk of 1-year GF versus actual age (NRI, 4.9%; P = 0.009) and versus the donor risk index (NRI, 9.0%; P < 0.001). The CDA, compared to actual donor age, provides an intuitive and superior estimation of graft quality for HCV-positive LT recipients because it incorporates additional factors that impact LT GF rates.
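Because every adjustment is expressed on the donor-age scale, the corrections add directly to the actual donor age. The sketch below transcribes the point estimates quoted above into a simple calculator; the function and argument names, and the worked example, are illustrative assumptions rather than the authors' implementation.

```python
# Hedged transcription of the corrected donor age (CDA) adjustments quoted
# above for HCV-positive LT recipients. Function and argument names are
# illustrative; the year values come directly from the abstract.
def corrected_donor_age(
    donor_age: float,
    dcd: bool,                        # donation after cardiac death: +16 years
    diabetes: bool,                   # +12 years
    height_lt_160cm: bool,            # +7 years
    ast_ge_120: bool,                 # AST >= 120 U/L: +5 years
    female_donor: bool,               # -4 years
    cit_hours: float,                 # +/-1 year per hour above/below 8 hours
    non_aa_donor_aa_recipient: bool,  # +29 years
    aa_donor_aa_recipient: bool,      # +5 years
    aa_donor_non_aa_recipient: bool,  # -2 years
) -> float:
    cda = donor_age
    cda += 16 if dcd else 0
    cda += 12 if diabetes else 0
    cda += 7 if height_lt_160cm else 0
    cda += 5 if ast_ge_120 else 0
    cda -= 4 if female_donor else 0
    cda += cit_hours - 8
    if non_aa_donor_aa_recipient:
        cda += 29
    elif aa_donor_aa_recipient:
        cda += 5
    elif aa_donor_non_aa_recipient:
        cda -= 2
    return cda

# Worked example (hypothetical donor): 50-year-old DCD donor, 10 h CIT,
# non-AA donor and non-AA recipient -> 50 + 16 + 2 = 68 "corrected" years.
print(corrected_donor_age(50, dcd=True, diabetes=False, height_lt_160cm=False,
                          ast_ge_120=False, female_donor=False, cit_hours=10,
                          non_aa_donor_aa_recipient=False,
                          aa_donor_aa_recipient=False,
                          aa_donor_non_aa_recipient=False))
```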


Subject(s)
Decision Support Techniques; Donor Selection; End Stage Liver Disease/surgery; Graft Survival; Hepatitis C/complications; Liver Transplantation/methods; Tissue Donors; Adult; Age Factors; End Stage Liver Disease/diagnosis; End Stage Liver Disease/mortality; End Stage Liver Disease/virology; Female; Hepatitis C/diagnosis; Hepatitis C/mortality; Humans; Likelihood Functions; Liver Transplantation/adverse effects; Liver Transplantation/mortality; Male; Middle Aged; Multivariate Analysis; Predictive Value of Tests; Proportional Hazards Models; Reproducibility of Results; Risk Assessment; Risk Factors; Time Factors; Treatment Outcome; United States
13.
Dig Dis Sci; 59(8): 1983-1986, 2014 Aug.
Article in English | MEDLINE | ID: mdl-24599771

ABSTRACT

BACKGROUND: Cardiopulmonary resuscitation (CPR) after cardiac arrest in terminally ill patients is controversial. Patients with end-stage liver disease (ESLD) differ from other terminally ill patients in that they are generally younger and may be candidates for curative liver transplantation, but multiple studies have suggested poor outcomes when these patients require CPR. Predictors of success of CPR in ESLD have not been fully investigated, limiting end-of-life discussions. AIM: The aim of this study was to quantify the rate and predictors of successful CPR in ESLD. METHODS: We performed a retrospective chart review of patients with ESLD who received CPR from 2/2002 to 12/2013 at a single institution. Pre-cardiac arrest variables were collected for analysis as predictors of survival. Our primary outcome was survival to hospital discharge. RESULTS: Of the 38 patients who underwent CPR, six survived to hospital discharge. When comparing those who survived to discharge with those who did not, we found no significant difference in age (p = 0.34), gender (p = 0.85), presence of ascites (p = 0.67), location at time of arrest (p = 0.39), concurrent GI bleeding (p = 0.48), or multiple individual laboratory values. Significant predictors of not surviving to hospital discharge were a Model for End-Stage Liver Disease (MELD) score ≥ 20 (OR 6.0, p = 0.044) and presentation with a non-shockable rhythm (PEA/asystole) (OR 29, p < 0.001). CONCLUSION: ESLD patients requiring CPR have worse outcomes as their MELD score increases. CPR in ESLD has a greater likelihood of success when the MELD score is <20 or the presenting rhythm is shockable.
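For illustration of the type of 2 x 2 comparison behind these odds ratios, the sketch below computes an odds ratio and Fisher's exact p-value for presenting rhythm versus survival; the cell counts are hypothetical placeholders (only the totals of 38 patients and 6 survivors match the abstract; the split by rhythm is invented).

```python
# Hedged sketch: odds ratio and Fisher's exact test for a 2 x 2 table of
# presenting rhythm versus survival to discharge. The counts are hypothetical
# placeholders, NOT the study's data; they only illustrate the mechanics.
from scipy.stats import fisher_exact

#        did not survive   survived
table = [[24, 1],   # non-shockable rhythm (PEA/asystole), hypothetical counts
         [8, 5]]    # shockable rhythm, hypothetical counts

odds_ratio, p_value = fisher_exact(table)
print(f"OR for non-survival with a non-shockable rhythm: "
      f"{odds_ratio:.1f}, p = {p_value:.3f}")
```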


Subject(s)
Cardiopulmonary Resuscitation/mortality; End Stage Liver Disease/complications; Adult; Aged; Colorado/epidemiology; End Stage Liver Disease/mortality; Female; Forecasting; Humans; Male; Middle Aged; Patient Discharge/statistics & numerical data; Retrospective Studies
14.
Aliment Pharmacol Ther; 57(1): 94-102, 2023 Jan.
Article in English | MEDLINE | ID: mdl-36394111

ABSTRACT

BACKGROUND: Guidelines recommend against aspirin for primary prevention of cardiovascular events in individuals with a history of gastrointestinal bleeding (GIB). It is unknown how often patients taking aspirin for primary prevention who are hospitalised with GIB have aspirin discontinued at discharge. AIMS: To determine the rate of aspirin deprescription and explore long-term outcomes in patients taking aspirin for primary prevention of cardiovascular events. METHODS: We evaluated all patients with GIB hospitalised at Yale-New Haven Hospital between January 2014 and October 2021 who were on aspirin for primary prevention. Our primary endpoint was the frequency of aspirin deprescription at discharge. Our secondary endpoints were post-discharge hospitalisations for major adverse cardiovascular events (MACE) or GIB. Time-to-event analysis was performed using Kaplan-Meier curves and the log-rank test. RESULTS: We identified 320 patients with GIB on aspirin for primary prevention: median age was 72 (interquartile range [IQR] 61-81) years and 297 (92.8%) were on aspirin 81 mg daily. Only 25 (9.0%) patients surviving their hospitalisation were deprescribed aspirin at discharge. Among 260 patients with follow-up (median 1103 days; IQR 367-1670), MACE developed post-discharge in 2/25 (8.0%) with aspirin deprescription versus 37/235 (15.7%) with aspirin continuation (log-rank p = 0.28). None of the 25 patients (0/25) with aspirin deprescription had a subsequent hospitalisation for GIB, versus 17/235 (7.2%) of those who continued aspirin (log-rank p = 0.13). CONCLUSIONS: Aspirin for primary cardiovascular prevention was rarely deprescribed at discharge in patients hospitalised with GIB. Processes designed to ensure appropriate deprescription of aspirin are crucial to improve adherence to guidelines, thereby improving the risk-benefit balance in patients at high risk of subsequent GIB hospitalisation, with minimal increase in the risk of MACE.
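A minimal sketch of the Kaplan-Meier and log-rank comparison described above, here for post-discharge MACE by deprescription status; the file and column names (days_to_mace, mace_event, aspirin_deprescribed) are hypothetical.

```python
# Hedged sketch of the Kaplan-Meier / log-rank comparison for post-discharge
# MACE by aspirin deprescription status. File and column names are hypothetical.
import pandas as pd
from lifelines import KaplanMeierFitter
from lifelines.statistics import logrank_test

df = pd.read_csv("gib_primary_prevention_aspirin.csv")  # hypothetical extract
depre = df[df["aspirin_deprescribed"] == 1]
cont = df[df["aspirin_deprescribed"] == 0]

kmf = KaplanMeierFitter()
kmf.fit(depre["days_to_mace"], depre["mace_event"], label="deprescribed")
ax = kmf.plot_survival_function()
kmf.fit(cont["days_to_mace"], cont["mace_event"], label="continued")
kmf.plot_survival_function(ax=ax)

result = logrank_test(
    depre["days_to_mace"], cont["days_to_mace"],
    event_observed_A=depre["mace_event"],
    event_observed_B=cont["mace_event"],
)
print(result.p_value)  # the abstract reports log-rank p = 0.28 for MACE
```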


Subject(s)
Aspirin; Cardiovascular Diseases; Humans; Middle Aged; Aged; Aged, 80 and over; Aspirin/adverse effects; Patient Discharge; Aftercare; Gastrointestinal Hemorrhage/chemically induced; Gastrointestinal Hemorrhage/prevention & control; Cardiovascular Diseases/prevention & control; Primary Prevention
15.
J Immunother; 44(8): 325-334, 2021 Oct 01.
Article in English | MEDLINE | ID: mdl-34380976

ABSTRACT

BACKGROUND: Immune checkpoint inhibitors (ICIs) have transformed the management of advanced malignancies but are associated with diarrhea and colitis. The objective of our systematic review and meta-analysis was to determine the incidence and outcomes of ICI-associated diarrhea and colitis. Bibliographic databases were searched through August 13, 2019, for observational studies of ICI therapy reporting the incidence and/or treatment of diarrhea or colitis. The primary outcome was ICI-associated diarrhea and colitis. Meta-analyses were performed with random-effects models. Twenty-five studies (N=12,661) were included. All studies had a high risk of bias in at least 1 domain. The overall incidence of diarrhea/colitis was 12.8% [95% confidence interval (CI), 8.8-18.2; I2=96.5]. The incidence was lower in patients treated with anti-programmed cell death 1/programmed death-ligand 1 agents (4.1%; 95% CI, 2.6-6.5) than in those treated with anti-cytotoxic T-lymphocyte-associated antigen 4 agents (20.1%; 95% CI, 15.9-25.1). The remission rate of diarrhea and/or colitis was higher in patients treated with corticosteroids plus biologics (88.4%; 95% CI, 79.4-93.8) than in those treated with corticosteroids alone (58.3%; 95% CI, 49.3-66.7; Q=18.7, P<0.001). ICIs were permanently discontinued in 48.1% of patients (95% CI, 17.8-79.1). ICIs were restarted after temporary interruption in 48.6% of patients (95% CI, 18.2-79.4), of whom 17.0% (95% CI, 6.4-30.0) experienced recurrence. The real-world incidence of ICI-associated diarrhea/colitis exceeds 10%. These events lead to permanent ICI discontinuation in roughly half of patients (48.1%), while <20% have recurrence of symptoms if ICIs are resumed. Further studies are needed to identify patients who would benefit from early treatment with biologics as well as appropriate patients in whom to resume ICI therapy.
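The pooled estimates above come from a random-effects meta-analysis of proportions; the sketch below shows one common approach (logit-transformed proportions with a DerSimonian-Laird between-study variance) using made-up study counts, since the abstract does not specify the method beyond "random-effects models".

```python
# Minimal sketch of a random-effects pooled incidence of the kind reported
# above (e.g., 12.8% overall). Logit-transformed proportions are pooled with a
# DerSimonian-Laird between-study variance. The study counts below are made up
# for illustration and are not taken from the 25 included studies.
import numpy as np

events = np.array([12, 30, 8, 55, 20])       # hypothetical events per study
totals = np.array([100, 180, 90, 400, 150])  # hypothetical study sizes

p = events / totals
yi = np.log(p / (1 - p))                 # logit-transformed proportions
vi = 1 / events + 1 / (totals - events)  # approximate within-study variances

w = 1 / vi                               # fixed-effect (inverse-variance) weights
y_fixed = np.sum(w * yi) / np.sum(w)
Q = np.sum(w * (yi - y_fixed) ** 2)      # Cochran's Q
dof = len(yi) - 1
C = np.sum(w) - np.sum(w ** 2) / np.sum(w)
tau2 = max(0.0, (Q - dof) / C)           # DerSimonian-Laird tau^2

w_re = 1 / (vi + tau2)                   # random-effects weights
pooled_logit = np.sum(w_re * yi) / np.sum(w_re)
pooled = 1 / (1 + np.exp(-pooled_logit))  # back-transform to a proportion
print(f"pooled incidence: {pooled:.3f}")
```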


Subject(s)
Colitis/chemically induced; Diarrhea/chemically induced; Immune Checkpoint Inhibitors/adverse effects; Adrenal Cortex Hormones/therapeutic use; Biological Products/therapeutic use; Colitis/drug therapy; Colitis/epidemiology; Diarrhea/drug therapy; Diarrhea/epidemiology; Humans; Incidence; Neoplasms/drug therapy; Neoplasms/epidemiology; Observational Studies as Topic
16.
Inflamm Bowel Dis; 27(8): 1270-1276, 2021 Jul 27.
Article in English | MEDLINE | ID: mdl-33165569

ABSTRACT

BACKGROUND: There are limited data on how vedolizumab (VDZ) impacts extraintestinal manifestations (EIMs) in inflammatory bowel disease (IBD). The aim of this study was to determine the clinical outcomes of EIMs after initiation of VDZ for patients with IBD. METHODS: A multicenter retrospective study of patients with IBD who received at least 1 dose of VDZ between January 1, 2014 and August 1, 2019 was conducted. The primary outcome was the rate of worsening EIMs after VDZ. Secondary outcomes were factors associated with worsening EIMs and with peripheral arthritis (PA) specifically after VDZ. RESULTS: A total of 201 patients with IBD (72.6% with Crohn disease; median age 38.4 years [interquartile range, 29-52.4 years]; 62.2% female) with EIMs before VDZ treatment were included. The most common type of EIM before VDZ was PA (68.2%). Worsening of EIMs after VDZ occurred in 34.8% of patients. There were no statistically significant differences between the worsened EIM (n = 70) and stable EIM (n = 131) groups in terms of age, IBD subtype, or previous and current medical therapy. PA was significantly more common in the worsened EIM group (84.3% vs 59.6%; P < 0.01). Worsening of EIMs was associated with a higher rate of discontinuation of VDZ during study follow-up than in the stable EIM group (61.4% vs 44%; P = 0.02). Treatment with VDZ was discontinued specifically because of EIMs in 9.5% of patients. CONCLUSIONS: Approximately one-third of patients had worsening EIMs after VDZ, which resulted in VDZ discontinuation in approximately 10% of patients. Neither previous biologic use nor concurrent immunosuppressant or corticosteroid therapy predicted EIM course after VDZ.
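The comparison of PA prevalence between groups (84.3% vs 59.6%) can be illustrated with a chi-squared test on a 2 x 2 table; the counts below are rough reconstructions from the percentages and group sizes above and are for illustration only, not the study's exact table.

```python
# Hedged sketch: chi-squared test comparing the proportion of patients with
# peripheral arthritis (PA) between the worsened and stable EIM groups.
# Counts are reconstructed roughly from the percentages above (84.3% of 70,
# 59.6% of 131) and are illustrative only.
from scipy.stats import chi2_contingency

#         PA   no PA
table = [[59, 11],   # worsened EIMs (n = 70)
         [78, 53]]   # stable EIMs (n = 131)

chi2, p_value, dof, expected = chi2_contingency(table)
print(f"chi2 = {chi2:.2f}, p = {p_value:.4f}")
```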


Subject(s)
Antibodies, Monoclonal, Humanized/therapeutic use; Arthritis; Inflammatory Bowel Diseases; Adult; Arthritis/drug therapy; Arthritis/etiology; Female; Humans; Inflammatory Bowel Diseases/complications; Inflammatory Bowel Diseases/drug therapy; Male; Middle Aged; Retrospective Studies
17.
Inflamm Bowel Dis; 26(9): 1394-1400, 2020 Aug 20.
Article in English | MEDLINE | ID: mdl-31689354

ABSTRACT

BACKGROUND: Despite increased risk of venous thromboembolism (VTE) among hospitalized patients with inflammatory bowel disease (IBD), pharmacologic prophylaxis rates remain low. We sought to understand the reasons for this by assessing factors associated with VTE prophylaxis in patients with IBD and the safety of its use. METHODS: This was a retrospective cohort study conducted among patients hospitalized between January 2013 and August 2018. The primary outcome was VTE prophylaxis, and exposures of interest included acute and chronic bleeding. Medical records were parsed electronically for covariables, and logistic regression was used to assess factors associated with VTE prophylaxis. RESULTS: There were 22,499 patients studied, including 474 (2%) with IBD. Patients with IBD were less likely to be placed on VTE prophylaxis (79% with IBD, 87% without IBD), particularly if hematochezia was present (57% with hematochezia, 86% without hematochezia). Among patients with IBD, admission to a medical service and hematochezia (adjusted odds ratio 0.27; 95% CI, 0.16-0.46) were among the strongest independent predictors of decreased VTE prophylaxis use. Neither hematochezia nor VTE prophylaxis was associated with increased blood transfusion rates or with a clinically significant decline in hemoglobin level during hospitalization. CONCLUSION: Hospitalized patients are less likely to be placed on VTE prophylaxis if they have IBD, and hematochezia may drive this. Hematochezia appeared to be minor and was unaffected by VTE prophylaxis. Education related to the safety of VTE prophylaxis in the setting of minor hematochezia may be a high-yield way to increase VTE prophylaxis rates in patients with IBD.
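A hedged sketch of the multivariable logistic regression described above, modeling receipt of VTE prophylaxis; the file and variable names (vte_prophylaxis, hematochezia, medical_service, age, female) are hypothetical placeholders rather than the study's actual covariables.

```python
# Illustrative multivariable logistic regression for receipt of VTE prophylaxis.
# Variable and file names are hypothetical; the abstract reports an adjusted OR
# of 0.27 for hematochezia among patients with IBD.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("ibd_hospitalizations.csv")  # hypothetical extract

model = smf.logit(
    "vte_prophylaxis ~ hematochezia + medical_service + age + female",
    data=df,
).fit()

print(np.exp(model.params))      # odds ratios
print(np.exp(model.conf_int()))  # 95% confidence intervals
```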


Subject(s)
Anticoagulants/therapeutic use; Gastrointestinal Hemorrhage/complications; Inflammatory Bowel Diseases/complications; Venous Thromboembolism/prevention & control; Adolescent; Adult; Contraindications, Drug; Female; Hospitalization; Humans; Logistic Models; Male; Odds Ratio; Retrospective Studies; Risk Factors; Venous Thromboembolism/etiology; Young Adult