Results 1 - 20 of 34
1.
Surg Open Sci ; 20: 94-97, 2024 Aug.
Article in English | MEDLINE | ID: mdl-38973811

ABSTRACT

Liposomal bupivacaine (LB) has been used in multimodal pain management regimens to improve postsurgical analgesia. This retrospective cohort analysis assessed clinical and economic outcomes of LB vs non-LB analgesia in minimally invasive colorectal resection surgery using real-world patient data from the IQVIA linkage claims databases. Patients who received LB were 1:1 matched to patients who did not receive LB (non-LB) via propensity scores. Outcomes included opioid use during the perioperative (2 weeks before surgery to 2 weeks after discharge), continued (>2 weeks to 3 months after discharge), and persistent (>3 months to 6 months after discharge) periods and healthcare resource utilization (HRU) during the first 3 months after discharge. Mean opioid consumption was lower in the LB (n = 4397) versus non-LB (n = 4397) cohort perioperatively (483 vs 538 morphine milligram equivalents [MMEs]; P = 0.001) and after discharge within ∼3 months (222 vs 328 MMEs; P < 0.0001) and 3-6 months (245 vs 384 MMEs; P < 0.0001). The LB cohort had shorter mean length of stay (5.2 vs 5.7 days; P < 0.0001) and fewer inpatient readmissions (odds ratio [OR], 0.71; P < 0.0001), emergency department visits (OR, 0.78; P < 0.0001), and outpatient/office visits (OR, 0.91; P = 0.028) than the non-LB cohort 3 months after discharge. These data suggest use of LB in minimally invasive colorectal resection surgery may reduce perioperative and postdischarge opioid use as well as HRU. Although additional studies are needed to confirm these findings, this analysis provides valuable real-world data from large claims databases to evaluate clinical and economic outcomes that complement other types of retrospective and prospective studies.
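For readers interested in the mechanics of the matching step, the sketch below shows one common way to do 1:1 nearest-neighbor propensity-score matching; it is an illustrative reconstruction, not the authors' code, and the `treated` flag and covariate columns are assumed names.

```python
# Minimal sketch of 1:1 nearest-neighbor propensity-score matching.
# Illustrative only; column names ("treated", covariates) are assumptions.
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.neighbors import NearestNeighbors

def match_one_to_one(df, covariates):
    """Pair each treated (LB) patient with the nearest untreated patient on the propensity score."""
    ps_model = LogisticRegression(max_iter=1000).fit(df[covariates], df["treated"])
    df = df.assign(ps=ps_model.predict_proba(df[covariates])[:, 1])

    treated = df[df["treated"] == 1]
    control = df[df["treated"] == 0]

    # Greedy nearest-neighbor match on the propensity score; this simple
    # version matches with replacement, unlike most published analyses.
    nn = NearestNeighbors(n_neighbors=1).fit(control[["ps"]])
    _, idx = nn.kneighbors(treated[["ps"]])
    matched_controls = control.iloc[idx.ravel()]

    return pd.concat([treated, matched_controls])
```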

2.
Plast Reconstr Surg Glob Open ; 12(6): e5874, 2024 Jun.
Article in English | MEDLINE | ID: mdl-38855138

ABSTRACT

Background: Liposomal bupivacaine (LB) can be used for postsurgical analgesia after breast reconstruction. We examined real-world clinical and economic benefits of LB versus bupivacaine after deep inferior epigastric perforator (DIEP) flap breast reconstruction. Methods: This retrospective cohort study used the IQVIA claims databases to identify patients undergoing primary DIEP flap breast reconstruction in 2016-2019. Patients receiving LB and those receiving bupivacaine were compared to assess opioid utilization in morphine milligram equivalents (MMEs) and healthcare resource utilization during the perioperative (2 weeks before surgery to 2 weeks after discharge) and 6-month postdischarge periods. A generalized linear mixed-effects model and an inverse probability of treatment weighting method were used. Results: Weighted baseline characteristics were similar between cohorts (LB, n = 669; bupivacaine, n = 348). The LB cohort received significantly fewer mean MMEs than the bupivacaine cohort during the perioperative period (395 versus 512 MMEs; rate ratio [RR], 0.771 [95% confidence interval (CI), 0.677-0.879]; P = 0.0001), the first 72 hours after surgery (63 versus 140 MMEs; RR, 0.449 [95% CI, 0.347-0.581]; P < 0.0001), and the inpatient stay (154 versus 303 MMEs; RR, 0.508 [95% CI, 0.411-0.629]; P < 0.0001); postdischarge filled opioid prescriptions were comparable. The LB cohort was less likely than the bupivacaine cohort to have all-cause inpatient readmission (odds ratio, 0.670 [95% CI, 0.452-0.993]; P = 0.046) and outpatient clinic/office visits (odds ratio, 0.885 [95% CI, 0.785-0.999]; P = 0.048) 3 months after discharge; other all-cause healthcare resource utilization outcomes were not different. Conclusions: LB was associated with fewer perioperative MMEs and fewer all-cause 3-month inpatient readmissions and outpatient clinic/office visits than bupivacaine in patients undergoing DIEP flap breast reconstruction.
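As a companion to the abstract above, here is a hedged sketch of how inverse probability of treatment weighting (IPTW) and a weighted log-link model can produce an MME rate ratio; the Poisson family is a simplification of the generalized linear mixed-effects model the study describes, and all column names are hypothetical.

```python
# Sketch of IPTW with stabilized weights plus a weighted log-link GLM.
# Illustrative only; the study used a generalized linear mixed-effects model.
import numpy as np
import statsmodels.api as sm
from sklearn.linear_model import LogisticRegression

def iptw_rate_ratio(df, covariates, treatment_col="lb", outcome_col="mme"):
    ps = LogisticRegression(max_iter=1000).fit(df[covariates], df[treatment_col])
    p = ps.predict_proba(df[covariates])[:, 1]
    p_treat = df[treatment_col].mean()

    # Stabilized weights: P(T = t) / P(T = t | X)
    w = np.where(df[treatment_col] == 1, p_treat / p, (1 - p_treat) / (1 - p))

    X = sm.add_constant(df[[treatment_col]])
    fit = sm.GLM(df[outcome_col], X,
                 family=sm.families.Poisson(),  # log link, so exp(coef) is a rate ratio
                 freq_weights=w).fit()
    return np.exp(fit.params[treatment_col])
```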

3.
J Arthroplasty ; 2024 Jun 26.
Article in English | MEDLINE | ID: mdl-38942249

ABSTRACT

INTRODUCTION: Total knee arthroplasty (TKA) is performed on approximately 790,000 patients annually in the United States and is projected to increase to 1.5 million by 2050. This study assessed the use of preoperative cryoneurolysis in patients undergoing TKA by analyzing: 1) pain severity; 2) opioid use; 3) functional status; and 4) sleep disturbance over 6 months following discharge. METHODS: Patients enrolled in the Innovations in Genicular Outcomes Registry (iGOR) between September 2021 and February 2024 were followed for 6 months. Our analyses included patients undergoing unilateral primary TKA with no pre-operative opioid prescription who either received, or did not receive, cryoneurolysis. Baseline patient demographics were collected before TKA and tabulated. Pain severity was assessed via the Brief Pain Inventory-Short Form (BPI-SF). Sleep disturbance was measured using the Patient-Reported Outcomes Measurement Information System (PROMIS) questionnaire. Each outcome measure was assessed prior to TKA, weekly, and at monthly follow-up. Data were analyzed with a generalized linear mixed-effects regression model comparing cryoneurolysis versus control patients, with P < 0.05 considered significant. RESULTS: Overall, 80 patients were treated with preoperative cryoneurolysis and 60 control patients were not. Patients receiving cryoneurolysis experienced significantly lower pain severity and sleep disturbance over the 6-month follow-up than control patients (P = 0.046). Cryoneurolysis was also associated with a trend toward greater functional improvement that did not reach statistical significance (P = 0.061). Further, patients who underwent cryoneurolysis were 72% less likely than controls to take opioids over 6 months following discharge (P < 0.001). CONCLUSIONS: Pre-operative cryoneurolysis therapy in opioid-naïve patients undergoing TKA is associated with reduced pain, decreased opioid use, and reduced sleep disturbance for 6 months postoperatively. Cryoneurolysis, a non-opioid pain relief modality administered pre-operatively, demonstrated substantial benefits in patients who underwent TKA.
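The registry analysis above relies on mixed-effects regression over repeated weekly and monthly assessments; the sketch below shows a simplified linear mixed model with a random intercept per patient (not the authors' model; the formula and column names are assumptions).

```python
# Simplified linear mixed model for repeated pain-severity measurements,
# standing in for the generalized linear mixed-effects regression described
# in the abstract. Column names are hypothetical.
import statsmodels.formula.api as smf

def fit_pain_trajectory(long_df):
    """long_df: one row per patient-visit with columns
    pain_severity, week, cryoneurolysis (0/1), and patient_id."""
    model = smf.mixedlm("pain_severity ~ week * cryoneurolysis",
                        data=long_df,
                        groups=long_df["patient_id"])
    return model.fit().summary()
```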

4.
J Arthroplasty ; 2024 Jun 25.
Article in English | MEDLINE | ID: mdl-38936436

ABSTRACT

INTRODUCTION: Knee osteoarthritis (OA) affects 19% of American adults over 45 years old and costs more than $27 billion annually. A wide range of non-operative treatment options are available. This study compared six treatments: cryoneurolysis with deep genicular nerve block (Cryo-Deep/Both), cryoneurolysis with superficial nerve block (Cryo-Superficial), intra-articular hyaluronic acid (IA-HA) injections, intra-articular non-steroidal anti-inflammatory drug (IA-NSAID) injections, intra-articular corticosteroid (IA-CS) injections, and intra-articular triamcinolone extended-release (IA-TA-ER) injections over 4 months for: 1) pain severity and analgesic use; and 2) physical function (from the Knee Injury and Osteoarthritis Outcome Score for Joint Replacement [KOOS, JR]). METHODS: Patients who had unilateral knee OA and received non-operative intervention were enrolled in the Innovations in Genicular Outcomes (iGOR) registry, a novel, multi-center real-world registry, between September 2021 and February 2024. A total of 480 patients were enrolled. Pain and functional outcomes were assessed at baseline, weekly, and monthly and were analyzed by overall trend, magnitude of change from pre- to post-treatment, and distribution-based minimal clinically important difference (MCID). Multivariate linear regressions with adjustments for seven confounding factors were used to compare follow-up outcomes among the six treatment groups. RESULTS: Use of IA-TA-ER injections was associated with the lowest pain, greatest pain reduction, and highest prevalence of patients achieving the MCID relative to other treatments (P < 0.001). Cryo-Deep/Both and IA-CS were associated with a higher prevalence of achieving the MCID than IA-HA, IA-NSAIDs, and Cryo-Superficial (P ≤ 0.001). Use of IA-TA-ER was also associated with the greatest functional score, improvement from baseline, and highest prevalence of patients achieving the MCID relative to other treatments (P ≤ 0.003). CONCLUSIONS: IA-TA-ER injections appear to outperform other treatments in terms of pain relief and functional improvement for up to 4 months following treatment. In addition, outcomes with the novel cryoneurolysis treatments and conventional IA-CS injections were similar to one another and better than those with IA-HA and IA-NSAIDs.
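The abstract refers to a distribution-based MCID; one common distribution-based convention is half the standard deviation of the baseline score, and the sketch below uses that threshold as an assumption rather than the study's actual definition.

```python
# Sketch of a distribution-based MCID responder calculation. The 0.5 * SD
# threshold is a common convention and an assumption here, not the study's
# published definition.
import numpy as np

def mcid_responder_rate(baseline, follow_up):
    """Share of patients whose improvement meets or exceeds 0.5 * SD of baseline scores."""
    baseline = np.asarray(baseline, dtype=float)
    follow_up = np.asarray(follow_up, dtype=float)
    mcid = 0.5 * baseline.std(ddof=1)
    improvement = baseline - follow_up  # lower pain score = improvement
    return float((improvement >= mcid).mean())
```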

5.
Spine J ; 2024 Jun 04.
Article in English | MEDLINE | ID: mdl-38843956

ABSTRACT

BACKGROUND CONTEXT: Perioperative pain management affects cost and outcomes in elective spine surgery. PURPOSE: This study investigated the association between liposomal bupivacaine (LB) and outpatient spine surgery outcomes, including perioperative, postoperative, and postdischarge opioid use and healthcare resource utilization. STUDY DESIGN: This was a retrospective comparative study. PATIENT SAMPLE: Eligibility criteria included adults with ≥6 months of continuous data before and after outpatient spine procedures including discectomy, laminectomy, or lumbar fusion. Patients receiving LB were matched 1:3 to patients receiving non-LB analgesia by propensity scores. OUTCOME MEASURES: Outcomes included (1) opioid use in morphine milligram equivalents (MMEs) during the perioperative and postdischarge periods and (2) postdischarge readmission and emergency department (ED) visits up to 3 months after surgery. Generalized linear mixed-effects modeling with appropriate distributions was used for analysis. METHODS: Deidentified data from the IQVIA linkage claims databases (2016-2019) were used for the analysis. This study was funded by Pacira BioSciences, Inc. RESULTS: In total, 381 patients received LB and 1143 patients received non-LB analgesia. Baseline characteristics were well balanced after propensity score matching. The LB cohort used fewer MMEs versus the non-LB cohort before discharge (80 vs 132 MMEs [mean difference, -52 MMEs; p=.0041]). Following discharge, there was a nonsignificant reduction in opioid use in the LB cohort versus the non-LB cohort within 90 days (429 vs 480 MMEs [mean difference, -50 MMEs; p=.289]) and from >90 days to 180 days (349 vs 381 MMEs [mean difference, -31 MMEs; p=.507]). The LB cohort had significantly lower rates of ED visits at 2 months after discharge versus the non-LB cohort (3.9% vs 7.6% [odds ratio, 0.50; p=.015]). Postdischarge readmission rates did not differ between cohorts. CONCLUSIONS: Use of LB for outpatient spine surgery was associated with reduced opioid use at the hospital and nonsignificant reduction in opioid use at all postoperative timepoints examined through 90 days after surgery versus non-LB analgesia. ED visit rates were significantly lower at 60 days after discharge. These findings support reduced cost and improved quality metrics in patients treated with LB versus non-LB analgesia for outpatient spine surgery.

6.
J Health Econ Outcomes Res ; 9(2): 86-94, 2022.
Article in English | MEDLINE | ID: mdl-36168593

ABSTRACT

Background: Epidural analgesia can be associated with high costs and postsurgical risks such as hypotension, despite its widespread use and value in providing opioid-sparing pain management. In this real-world study, we tested the hypothesis that liposomal bupivacaine (LB) may be a reliable alternative to epidural analgesia. Objectives: To compare economic outcomes and hypotension incidence associated with use of LB and epidural analgesia for abdominal surgery. Methods: This retrospective analysis identified records of adults who underwent abdominal surgeries between January 2016 and September 2019 with either LB administration or traditional epidural analgesia using the Premier Healthcare Database. Economic outcomes included length of stay, hospital costs, rates of discharge to home, and 30-day hospital readmissions. Secondary outcomes included incidence of postsurgical hypotension and vasopressor use. Subgroup analyses were stratified by surgical procedure (colorectal, abdominal) and approach (endoscopic, open). A generalized linear model adjusted for patient and hospital characteristics was used for all comparisons. Results: A total of 5799 surgical records (LB, n=4820; epidural analgesia, n=979) were included. Compared with cases where LB was administered, cases of epidural analgesia use were associated with a 1.6-day increase in length of stay (adjusted rate ratio [95% confidence interval (CI)], 1.2 [1.2-1.3]; P<.0001) and $6304 greater hospital costs (adjusted rate ratio [95% CI], 1.2 [1.2-1.3]; P<.0001). Cost differences were largely driven by room-and-board fees. Epidural analgesia was associated with reduced rates of discharge to home (P<.0001) and increased 30-day readmission rates (P=.0073) compared with LB. Epidural analgesia was also associated with increased rates of postsurgical hypotension (30% vs 11%; adjusted odds ratio [95% CI], 2.8 [2.3-3.4]; P<.0001) and vasopressor use (22% vs 7%; adjusted odds ratio [95% CI], 3.1 [2.5-4.0]; P<.0001) compared with LB. Subgroup analyses by surgical procedure and approach were generally consistent with overall comparisons. Discussion: Our results are consistent with previous studies demonstrating that epidural analgesia can be associated with higher utilization of healthcare resources and more complications compared with LB. Conclusions: Compared with epidural analgesia, LB was associated with economic benefits and reduced incidence of postsurgical hypotension and vasopressor use.
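The adjusted rate ratios for length of stay and cost reported above come from a covariate-adjusted generalized linear model; a minimal sketch of that kind of model is shown below, with a gamma family and log link chosen as plausible (but assumed) settings and placeholder covariate names.

```python
# Sketch of a covariate-adjusted GLM producing a rate ratio for cost or LOS.
# The gamma family / log link and the covariate names are assumptions.
import numpy as np
import statsmodels.api as sm
import statsmodels.formula.api as smf

def adjusted_rate_ratio(df, outcome="total_cost"):
    formula = f"{outcome} ~ epidural + age + female + charlson + open_approach"
    fit = smf.glm(formula, data=df,
                  family=sm.families.Gamma(link=sm.families.links.Log())).fit()
    return np.exp(fit.params["epidural"])  # >1 means higher cost/LOS with epidural vs LB
```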

7.
J Mark Access Health Policy ; 9(1): 1912924, 2021 Apr 19.
Article in English | MEDLINE | ID: mdl-33968334

ABSTRACT

Background/Objective: This study evaluated emergency department (ED) visit trends and subsequent inpatient admissions for patients with an inflammatory bowel disease (IBD) diagnosis or IBD-related abdominal pain (AP), as well as hospital-level variation in inpatient admission rates, in the United States (US). Methods: This population-based, cross-sectional study included data from the Nationwide Emergency Department Sample (NEDS, 2006-2013) database. Patients ≥18 years of age with a primary ED diagnosis of IBD or IBD-related AP were included. Variables included demographics, insurance information, household income, Quan-Charlson comorbidity score, ED discharge disposition, and length of hospital stay (2006, 2010, and 2013). Variation between hospitals was estimated using risk-adjusted admission ratios. Results: Annual ED visits for IBD per 100,000 US population increased (30 in 2006 vs 42 in 2013; p = 0.09), whereas subsequent admissions remained stable (20 in 2006 vs 23 in 2013; p = 0.52). ED visits for IBD-related AP increased by 71% (7 in 2006 vs 12 in 2013; p = 0.12), whereas subsequent admissions were stable (0.50 in 2006 vs 0.58 in 2013; p = 0.88). The proportion of patients with subsequent hospitalization decreased (IBD: 65.7% to 55.7%; IBD-related AP: 6.9% to 4.9%). Variation in subsequent inpatient admissions was 1.42 (IBD) and 1.96 (IBD-related AP). Conclusions: An increase in annual ED visits was observed for patients with IBD and IBD-related AP; however, the subsequent inpatient admission rate remained stable.
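The hospital-level variation mentioned above is based on risk-adjusted admission ratios; a minimal sketch of an observed-to-expected ratio per hospital is given below (illustrative only; the risk model and column names are assumptions).

```python
# Sketch of a risk-adjusted admission ratio: observed admissions divided by
# admissions expected from a patient-level logistic model. Column names and
# the adjustment set are assumptions.
import statsmodels.formula.api as smf

def risk_adjusted_ratios(visits):
    """visits: one row per ED visit with columns admitted (0/1), age,
    charlson, insurance, and hospital_id."""
    fit = smf.logit("admitted ~ age + charlson + C(insurance)", data=visits).fit()
    visits = visits.assign(expected=fit.predict(visits))
    grouped = visits.groupby("hospital_id")
    return grouped["admitted"].sum() / grouped["expected"].sum()
```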

8.
J Cardiothorac Vasc Anesth ; 35(12): 3681-3687, 2021 12.
Article in English | MEDLINE | ID: mdl-33975790

ABSTRACT

OBJECTIVE: Effective postsurgical pain management is important for pediatric patients to improve outcomes while reducing resource use and waste. The authors examined opioid consumption and economic outcomes associated with liposomal bupivacaine (LB) or non-LB analgesia use in pediatric patients undergoing cardiothoracic surgery. DESIGN: The authors retrospectively analyzed Premier Healthcare Database records. SETTING: The data extracted from the database included patient records from hospitals across the United States in both rural and urban locations. PARTICIPANTS: The records included data from patients aged 12 to <18 years. INTERVENTIONS: The records belonged to patients undergoing video-assisted thoracoscopic procedures (VATS) who received LB or non-LB analgesia after surgery. MEASUREMENTS AND MAIN RESULTS: Outcomes included in-hospital postsurgical opioid consumption in morphine milligram equivalents (MMEs), hospital length of stay (LOS), and total hospital costs; the LB and non-LB cohorts were compared using a generalized linear model with inverse probability of treatment weighting to balance the cohorts. For VATS procedures, pediatric patients receiving LB had significant reductions in in-hospital opioid consumption (632 v 991 MMEs; p < 0.0001), shorter LOS (5.1 v 5.6 days; p = 0.0023), and lower total hospital costs ($18,084 v $21,962; p < 0.0001) compared with those receiving non-LB analgesia. CONCLUSIONS: These results support use of LB in multimodal analgesia regimens for managing pain in pediatric patients after cardiothoracic surgery.


Subject(s)
Anesthetics, Local , Bupivacaine , Analgesics, Opioid , Child , Humans , Pain, Postoperative/diagnosis , Pain, Postoperative/drug therapy , Pain, Postoperative/prevention & control , Retrospective Studies
9.
J Health Econ Outcomes Res ; 8(1): 29-35, 2021 Apr 14.
Article in English | MEDLINE | ID: mdl-33880386

ABSTRACT

Background: Approximately 60% of hospitalized children undergoing surgery experience at least 1 day of moderate-to-severe pain after surgery. Pain following spine surgery may affect opioid exposure, length of stay (LOS), and costs in hospitalized pediatric patients. This is a retrospective cohort analysis of pediatric patients undergoing inpatient primary spine surgery. Objectives: To examine the association of opioid-related and economic outcomes with postsurgical liposomal bupivacaine (LB) or non-LB analgesia in pediatric patients who received spine surgery. Methods: Premier Healthcare Database records (January 2015-September 2019) for patients aged 1-17 years undergoing inpatient primary spine surgery were retrospectively analyzed. Outcomes included in-hospital postsurgical opioid consumption (morphine milligram equivalents [MMEs]), opioid-related adverse events (ORAEs), LOS (days), and total hospital costs. A generalized linear model adjusting for baseline characteristics was used. Results: Among 10,189 pediatric patients, the LB cohort (n=373) consumed significantly fewer postsurgical opioids than the non-LB cohort (n=9816; adjusted MME ratio, 0.53 [95% confidence interval (CI), 0.45-0.61]; P<0.0001). LOS was significantly shorter in the LB versus non-LB cohort (adjusted rate ratio, 0.86 [95% CI, 0.80-0.94]; P=0.0003). Hospital costs were significantly lower in the LB versus non-LB cohort overall (adjusted rate ratio, 0.92 [95% CI, 0.86-0.99]; P=0.0227), mostly because of decreased LOS and central supply costs. ORAEs were not significantly different between groups (adjusted rate ratio, 0.84 [95% CI, 0.65-1.08]; P=0.1791). Discussion: LB analgesia was associated with shorter LOS and lower hospital costs compared with non-LB analgesia in pediatric patients undergoing spine surgery. The LB cohort had lower adjusted room and board and central supply costs than the non-LB cohort. These data suggest that treatment with LB might reduce hospital LOS and, in turn, health-care costs, and additional cost savings outside the hospital room may factor into overall health-care cost savings. LB may reduce pain and the need for supplemental postsurgical opioids, thus reducing opioid-associated expenses while improving patient satisfaction with postsurgical care. Conclusions: Pediatric patients undergoing spine surgery who received LB had significantly reduced in-hospital postsurgical opioid consumption, LOS, and hospital costs compared with those who did not.

10.
J Stroke Cerebrovasc Dis ; 30(5): 105715, 2021 May.
Article in English | MEDLINE | ID: mdl-33743312

ABSTRACT

OBJECTIVES: In a previous real-world study, rivaroxaban reduced the risk of stroke overall and of severe stroke compared with warfarin in patients with nonvalvular atrial fibrillation (NVAF). The aim of this study was to assess, in a different database and patient population, the reproducibility of our previously observed results (Alberts M, et al. Stroke. 2020;51:549-555) on the risk of severe stroke among NVAF patients treated with rivaroxaban or warfarin. MATERIAL AND METHODS: This retrospective cohort study included patients from the IBM® MarketScan® Commercial and Medicare databases (2011-2019) who initiated rivaroxaban or warfarin after a diagnosis of NVAF, had ≥6 months of continuous health plan enrollment, had a CHA2DS2-VASc score ≥2, and had no history of stroke or anticoagulant use. Patient data were assessed until the earliest occurrence of a primary inpatient diagnosis of stroke, death, end of health plan enrollment, or end of study. Stroke severity was defined by National Institutes of Health Stroke Scale (NIHSS) score, imputed by a random forest model. Cox proportional hazard regression was used to compare risk of stroke between cohorts, balanced by inverse probability of treatment weighting. RESULTS: The mean observation period from index date to stroke, end of eligibility, or end of data was 28 months. Data from 13,599 rivaroxaban and 39,861 warfarin patients were included. Stroke occurred in 272 rivaroxaban-treated patients (0.97/100 person-years [PY]) and 1,303 warfarin-treated patients (1.32/100 PY). Rivaroxaban patients had lower risk for stroke overall (hazard ratio [HR], 0.82; 95% confidence interval [CI], 0.76-0.88) and for minor (NIHSS 1 to <5; HR, 0.83; 95% CI, 0.74-0.93), moderate (NIHSS 5 to <16; HR, 0.88; 95% CI, 0.78-0.99), and severe stroke (NIHSS 16 to 42; HR, 0.44; 95% CI, 0.22-0.91). CONCLUSIONS: The results of this study in a larger population of NVAF patients align with previous real-world findings and the ROCKET-AF trial by showing improved stroke prevention with rivaroxaban versus warfarin across all stroke severities.
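A hedged sketch of the weighted survival comparison described above: a logistic propensity model builds IPTW weights, which feed a weighted Cox proportional hazards fit. Column names and the weight construction are assumptions, not the authors' code.

```python
# Sketch of an IPTW-weighted Cox proportional hazards comparison.
# Illustrative only; column names are assumptions.
import numpy as np
from lifelines import CoxPHFitter
from sklearn.linear_model import LogisticRegression

def weighted_stroke_hazard_ratio(df, covariates):
    """df needs columns: rivaroxaban (0/1), time_to_stroke, stroke (0/1), plus covariates."""
    ps = LogisticRegression(max_iter=1000).fit(df[covariates], df["rivaroxaban"])
    p = ps.predict_proba(df[covariates])[:, 1]
    df = df.assign(iptw=np.where(df["rivaroxaban"] == 1, 1 / p, 1 / (1 - p)))

    cph = CoxPHFitter()
    cph.fit(df[["time_to_stroke", "stroke", "rivaroxaban", "iptw"]],
            duration_col="time_to_stroke", event_col="stroke",
            weights_col="iptw", robust=True)
    return cph.hazard_ratios_["rivaroxaban"]
```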


Subject(s)
Anticoagulants/therapeutic use , Atrial Fibrillation/drug therapy , Factor Xa Inhibitors/therapeutic use , Rivaroxaban/therapeutic use , Stroke/prevention & control , Warfarin/therapeutic use , Aged , Aged, 80 and over , Anticoagulants/adverse effects , Atrial Fibrillation/diagnosis , Atrial Fibrillation/epidemiology , Databases, Factual , Factor Xa Inhibitors/adverse effects , Female , Humans , Male , Middle Aged , Retrospective Studies , Risk Assessment , Risk Factors , Rivaroxaban/adverse effects , Severity of Illness Index , Stroke/diagnosis , Stroke/epidemiology , Time Factors , Treatment Outcome , United States/epidemiology , Warfarin/adverse effects
11.
J Med Econ ; 24(1): 212-217, 2021.
Article in English | MEDLINE | ID: mdl-33499689

ABSTRACT

AIMS: Rivaroxaban reduces stroke compared with warfarin in patients with non-valvular atrial fibrillation (NVAF). This study compared healthcare costs before and after stroke in NVAF patients treated with rivaroxaban or warfarin. MATERIALS AND METHODS: Using de-identified IBM MarketScan Commercial and Medicare databases, this retrospective cohort study (from 2011 to 2019) included patients with NVAF who initiated rivaroxaban or warfarin within 30 days after initial NVAF diagnosis. Patients who developed stroke were identified, and stroke severity was determined by the National Institutes of Health Stroke Scale (NIHSS) score, imputed by a random forest method. Total all-cause per-patient per-year (PPPY) costs of care were determined for patients: (1) who developed stroke during the pre- and post-stroke periods and (2) who remained stroke-free during the follow-up period. Treatment groups were balanced using inverse probability of treatment weighting. RESULTS: A total of 13,599 patients initiated rivaroxaban and 39,861 initiated warfarin, of whom 272 (2.0%) and 1,303 (3.3%), respectively, developed stroke during a mean follow-up of 28 months. Among patients who developed stroke, PPPY costs increased from the pre-stroke to post-stroke period, with greater increases in the warfarin cohort relative to the rivaroxaban cohort. Overall, the costs increased by 1.78-fold for rivaroxaban vs 3.07-fold for warfarin; for less severe strokes (NIHSS < 5), costs increased 0.88-fold and 1.05-fold, respectively. Cost increases for more severe strokes (NIHSS ≥ 5) among rivaroxaban patients were half those for warfarin patients (3.19-fold vs 6.37-fold, respectively). Among patients without stroke, costs were similar during the follow-up period between the two treatment groups. CONCLUSIONS: Total all-cause costs of care increased in the post-stroke period, particularly in the patients treated with warfarin relative to those treated with rivaroxaban. The lower rate of stroke in the rivaroxaban cohort suggests that greater pre- to post-stroke cost increases result from more strokes occurring in the warfarin cohort.


Subject(s)
Atrial Fibrillation , Stroke , Aged , Anticoagulants/adverse effects , Atrial Fibrillation/complications , Atrial Fibrillation/drug therapy , Dabigatran , Health Care Costs , Humans , Medicare , Retrospective Studies , Rivaroxaban/adverse effects , United States , Warfarin/adverse effects
12.
Hematology ; 25(1): 366-371, 2020 Dec.
Article in English | MEDLINE | ID: mdl-33095117

ABSTRACT

OBJECTIVE: To describe chronic lymphocytic leukemia (CLL) treatment patterns and patient outcomes in Latin America. METHODS: This chart review study (NCT02559583; 2008-2015) evaluated time to progression (TTP) and overall survival (OS) outcomes among patients with CLL who initiated one (n = 261) to two (n = 96) lines of therapy (LOT) since diagnosis. Differences in TTP and OS were assessed by Kaplan-Meier analysis, with a log-rank test for statistical significance. Association between therapeutic regimen and risk for disease progression or death was estimated using Cox proportional hazard regression. RESULTS: The most commonly prescribed therapies in both LOTs were chlorambucil-based, followed by fludarabine- and cyclophosphamide (C)/CHOP-based therapies. Chlorambucil- and C/CHOP-based therapies were largely prescribed to elderly patients (≥65 years), while fludarabine-based therapy was predominantly used in younger patients (<65 years). In LOT1, relative to chlorambucil-treated patients, those prescribed fludarabine-based therapies had a lower risk of disease progression (hazard ratio [HR] and 95% confidence interval [CI], 0.32 [0.19-0.54]), whereas C/CHOP-prescribed patients had a higher risk (HR [95% CI], 1.88 [1.17-3.04]). Similar results were observed in LOT2. There was no difference in OS between treatments in either LOT. DISCUSSION: Novel therapies such as kinase inhibitors were rarely prescribed in LOT1 or LOT2 in Latin America. The greater TTP observed for fludarabine-based therapies could be attributed to the fact that fludarabine-based therapies are predominantly administered to young and healthy patients. CONCLUSION: Chlorambucil-based therapy, which has limited benefits, is frequently prescribed in Latin America. There is an urgent need for novel agents in patients with CLL who are ineligible for fludarabine-based therapy. Trial registration: ClinicalTrials.gov identifier: NCT02559583.
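The time-to-progression comparison above rests on Kaplan-Meier curves and a log-rank test; a minimal sketch is shown below with illustrative regimen labels (not the study's data or code).

```python
# Sketch of a Kaplan-Meier comparison of time to progression with a log-rank
# test. Regimen labels and column names are illustrative.
from lifelines import KaplanMeierFitter
from lifelines.statistics import logrank_test

def compare_ttp(df):
    """df: columns ttp_months, progressed (0/1), regimen ('chlorambucil' or 'fludarabine')."""
    chl = df[df["regimen"] == "chlorambucil"]
    flu = df[df["regimen"] == "fludarabine"]

    km_chl = KaplanMeierFitter().fit(chl["ttp_months"], chl["progressed"], label="chlorambucil")
    km_flu = KaplanMeierFitter().fit(flu["ttp_months"], flu["progressed"], label="fludarabine")

    test = logrank_test(chl["ttp_months"], flu["ttp_months"],
                        event_observed_A=chl["progressed"],
                        event_observed_B=flu["progressed"])
    return km_chl.median_survival_time_, km_flu.median_survival_time_, test.p_value
```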


Subject(s)
Antineoplastic Combined Chemotherapy Protocols/administration & dosage , Leukemia, Lymphocytic, Chronic, B-Cell , Age Factors , Aged , Antineoplastic Combined Chemotherapy Protocols/adverse effects , Disease-Free Survival , Female , Humans , Latin America/epidemiology , Leukemia, Lymphocytic, Chronic, B-Cell/diagnosis , Leukemia, Lymphocytic, Chronic, B-Cell/drug therapy , Leukemia, Lymphocytic, Chronic, B-Cell/mortality , Male , Middle Aged , Risk Factors , Survival Rate
13.
Cardiol Ther ; 9(1): 153-165, 2020 Jun.
Article in English | MEDLINE | ID: mdl-32124423

ABSTRACT

INTRODUCTION: Hospitalization is the largest component of health care spending in the United States. Most hospitalized patients first visit the emergency department (ED), where hospitalization decisions are made. Optimal utilization of hospital resources is critical for all stakeholders. METHODS: We performed a population-based, cross-sectional study evaluating ED visits and subsequent inpatient admissions for patients with coronary artery disease (CAD) and chest pain (CP) suggestive of CAD from 2006 to 2013 using the Nationwide Emergency Department Sample database weighted for national estimates. We analyzed trends using a generalized linear regression model with a Poisson distribution and Wald test. RESULTS: From 2006 to 2013, there was a 15% decrease in ED visits for CAD (p < 0.01), while ED visit rates for CP increased 31% (p < 0.01). Subsequent inpatient admission rates decreased 18% for CAD (p < 0.01) and 33% for CP (p < 0.01). Trends were not modified by patient and hospital strata. CONCLUSION: ED visits and subsequent inpatient admissions resulting from CAD decreased from 2006 to 2013. Patients with CP had a substantially higher number of ED visits, with a significant decline in inpatient admissions.
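The trend analysis above fits a Poisson model to annual counts; the sketch below shows one way to express that as a population-offset Poisson GLM with a Wald test on the year term (illustrative; the data frame contents are assumptions).

```python
# Sketch of an annual-trend Poisson GLM with the log of the population as an
# offset, so exp(year coefficient) is an annual rate ratio. Illustrative only.
import numpy as np
import statsmodels.api as sm
import statsmodels.formula.api as smf

def ed_visit_trend(annual):
    """annual: one row per year with columns year, visits, and population."""
    fit = smf.glm("visits ~ year", data=annual,
                  family=sm.families.Poisson(),
                  offset=np.log(annual["population"])).fit()
    return np.exp(fit.params["year"]), fit.pvalues["year"]  # rate ratio per year, Wald p-value
```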

14.
Stroke ; 51(4): 1297-1300, 2020 04.
Article in English | MEDLINE | ID: mdl-32078496

ABSTRACT

Background and Purpose- Although exogenous hormone therapy (HT) use has been associated with increased risk of ischemic stroke in postmenopausal women, it remains unknown whether sex hormone levels contribute to ischemic stroke risk. We aimed to estimate associations between plasma sex hormone levels and ischemic stroke risk, by HT status, in a nested case-control study of postmenopausal women from the NHS (Nurses' Health Study). Methods- Women with confirmed incident ischemic stroke (n=419) were matched with controls (n=419) by age, HT use, and other factors. Plasma estradiol and testosterone levels were measured using liquid chromatography tandem mass spectrometry; SHBG (sex hormone-binding globulin) was assayed by electrochemiluminescence immunoassay. Associations of total and free estradiol and testosterone, the estradiol/testosterone ratio, and SHBG with ischemic stroke were estimated using conditional logistic regressions stratified by HT status with adjustment for matching and cardiovascular risk factors. Results- Current HT users had different hormone profiles from never/past users. No clear linear trends were observed between estradiol (total or free) levels or the estradiol/testosterone ratio and ischemic stroke risk among either current users (Ptrend>0.1) or never/past users (Ptrend>0.6). For both current and never/past users, the associations between some of the sex hormones and ischemic stroke differed by body mass index categories (Pinteraction≤0.04). For women with a body mass index <25 kg/m2, a higher estradiol/testosterone ratio was associated with significantly elevated ischemic stroke risk among current users (Ptrend=0.01), and higher levels of total and free estradiol were significantly associated with higher ischemic stroke risk among never/past users (Ptrend≤0.04). Testosterone and SHBG were not associated with ischemic stroke in either current or never/past users. Conclusions- Our findings do not support a role of sex hormone levels in mediating ischemic stroke risk among postmenopausal women. Replications in additional larger studies are required.


Subject(s)
Brain Ischemia/blood , Estradiol/blood , Postmenopause/blood , Stroke/blood , Testosterone/blood , Aged , Biomarkers/blood , Brain Ischemia/diagnosis , Case-Control Studies , Female , Humans , Middle Aged , Prospective Studies , Risk Factors , Stroke/diagnosis
15.
BMC Med Inform Decis Mak ; 20(1): 8, 2020 01 08.
Article in English | MEDLINE | ID: mdl-31914991

ABSTRACT

BACKGROUND: Stroke severity is an important predictor of patient outcomes and is commonly measured with the National Institutes of Health Stroke Scale (NIHSS) scores. Because these scores are often recorded as free text in physician reports, structured real-world evidence databases seldom include the severity. The aim of this study was to use machine learning models to impute NIHSS scores for all patients with newly diagnosed stroke from multi-institution electronic health record (EHR) data. METHODS: NIHSS scores available in the Optum© de-identified Integrated Claims-Clinical dataset were extracted from physician notes by applying natural language processing (NLP) methods. The cohort analyzed in the study consisted of 7149 patients with an inpatient or emergency room diagnosis of ischemic stroke, hemorrhagic stroke, or transient ischemic attack and a corresponding NLP-extracted NIHSS score. A subset of these patients (n = 1033, 14%) were held out for independent validation of model performance, and the remaining patients (n = 6116, 86%) were used for training the model. Several machine learning models were evaluated, and parameters were optimized using cross-validation on the training set. The model with optimal performance, a random forest model, was ultimately evaluated on the holdout set. RESULTS: Leveraging machine learning, we identified the main factors in electronic health record data for assessing stroke severity, including death within the same month as stroke occurrence, length of hospital stay following stroke occurrence, aphagia/dysphagia diagnosis, hemiplegia diagnosis, and whether a patient was discharged to home or self-care. Comparing the imputed NIHSS scores to the NLP-extracted NIHSS scores on the holdout data set yielded an R2 (coefficient of determination) of 0.57, an R (Pearson correlation coefficient) of 0.76, and a root-mean-squared error of 4.5. CONCLUSIONS: Machine learning models built on EHR data can be used to determine proxies for stroke severity. This enables severity to be incorporated in studies of stroke patient outcomes using administrative and EHR databases.
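To make the imputation setup concrete, the sketch below trains a random forest on EHR-derived features to predict the NLP-extracted NIHSS score and reports R2, Pearson r, and RMSE on a 14% holdout, mirroring the abstract's evaluation; feature names and hyperparameters are assumptions.

```python
# Sketch of random forest imputation of NIHSS scores with holdout evaluation.
# Feature construction and hyperparameters are assumptions.
import numpy as np
from scipy.stats import pearsonr
from sklearn.ensemble import RandomForestRegressor
from sklearn.metrics import mean_squared_error, r2_score
from sklearn.model_selection import train_test_split

def impute_nihss(X, y):
    """X: EHR-derived features (e.g., LOS, hemiplegia flag); y: NLP-extracted NIHSS scores."""
    X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.14, random_state=0)
    model = RandomForestRegressor(n_estimators=500, random_state=0).fit(X_train, y_train)
    pred = model.predict(X_test)
    return {
        "r2": r2_score(y_test, pred),
        "pearson_r": pearsonr(y_test, pred)[0],
        "rmse": float(np.sqrt(mean_squared_error(y_test, pred))),
    }
```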


Subject(s)
Electronic Health Records , Machine Learning , Natural Language Processing , Severity of Illness Index , Stroke/diagnosis , Aged , Aged, 80 and over , Cohort Studies , Databases, Factual , Female , Humans , Male , Middle Aged
16.
Stroke ; 51(2): 549-555, 2020 02.
Article in English | MEDLINE | ID: mdl-31888412

ABSTRACT

Background and Purpose- Oral anticoagulation therapy is standard of care for patients with nonvalvular atrial fibrillation to prevent stroke. This study compared rivaroxaban and warfarin for stroke and all-cause mortality risk reduction in a real-world setting. Methods- This retrospective cohort study (2011-2017) included de-identified patients from the Optum Clinformatics Database who started treatment with rivaroxaban or warfarin within 30 days following initial diagnosis of nonvalvular atrial fibrillation. Before nonvalvular atrial fibrillation diagnosis, patients had 6 months of continuous health plan enrollment and CHA2DS2-VASc score ≥2. Stroke severity was determined by the National Institutes of Health Stroke Scale, imputed based on machine learning algorithms. Stroke and all-cause mortality risks were compared by treatment using Cox proportional hazard regression, with inverse probability of treatment weighting to balance cohorts for baseline risk factors. Stratified analysis by treatment duration was also performed. Results- During a mean follow-up of 27 months, 175 (1.33/100 patient-years [PY]) rivaroxaban-treated and 536 (1.66/100 PY) warfarin-treated patients developed stroke. The inverse probability of treatment weighting model showed that rivaroxaban reduced stroke risk by 19% (hazard ratio [HR], 0.81 [95% CI, 0.73-0.91]). Analysis by stroke severity revealed risk reductions by rivaroxaban of 48% for severe stroke (National Institutes of Health Stroke Scale score, 16-42; HR, 0.52 [95% CI, 0.33-0.82]) and 19% for minor stroke (National Institutes of Health Stroke Scale score, 1 to <5; HR, 0.81 [95% CI, 0.68-0.96]), but no difference for moderate stroke (National Institutes of Health Stroke Scale score, 5 to <16; HR, 0.93 [95% CI, 0.78-1.10]). A total of 41 (0.31/100 PY) rivaroxaban-treated and 147 (0.44/100 PY) warfarin-treated patients died poststroke, 12 (0.09/100 PY) and 67 (0.20/100 PY) of whom died within 30 days, representing mortality risk reductions by rivaroxaban of 24% (HR, 0.76 [95% CI, 0.61-0.95]) poststroke and 59% (HR, 0.41 [95% CI, 0.28-0.60]) within 30 days. Conclusions- After the initial diagnosis of atrial fibrillation, patients treated with rivaroxaban versus warfarin had significant risk reduction for stroke, especially severe stroke, and all-cause mortality after a stroke. Findings from this observational study may help inform anticoagulant choice for stroke prevention in patients with nonvalvular atrial fibrillation.


Subject(s)
Atrial Fibrillation/drug therapy , Atrial Fibrillation/mortality , Rivaroxaban , Warfarin , Adolescent , Adult , Aged , Aged, 80 and over , Anticoagulants/adverse effects , Anticoagulants/therapeutic use , Atrial Fibrillation/complications , Factor Xa Inhibitors/adverse effects , Factor Xa Inhibitors/therapeutic use , Female , Humans , Male , Middle Aged , Proportional Hazards Models , Retrospective Studies , Rivaroxaban/adverse effects , Rivaroxaban/therapeutic use , Stroke/diagnosis , Stroke/drug therapy , Stroke/mortality , Warfarin/adverse effects , Warfarin/therapeutic use , Young Adult
17.
Br J Haematol ; 188(3): 383-393, 2020 02.
Article in English | MEDLINE | ID: mdl-31392724

ABSTRACT

Limited data are available regarding contemporary multiple myeloma (MM) treatment practices in Latin America. In this retrospective cohort study, medical records were reviewed for a multinational cohort of 1103 Latin American MM patients (median age, 61 years) diagnosed in 2008-2015 who initiated first-line therapy (LOT1). Of these patients, 33·9% underwent autologous stem cell transplantation (ASCT). During follow-up, 501 (45·4%) and 129 (11·7%) patients initiated second- (LOT2) and third-line therapy (LOT3), respectively. In the LOT1 setting, from 2008 to 2015, there was a decrease in the use of thalidomide-based therapy, from 66·7% to 42·6%, and chemotherapy from, 20·2% to 5·9%, whereas use of bortezomib-based therapy or bortezomib + thalidomide increased from 10·7% to 45·5%. Bortezomib-based therapy and bortezomib + thalidomide were more commonly used in ASCT patients and in private clinics. In non-ASCT and ASCT patients, median progression-free survival (PFS) was 15·0 and 31·1 months following LOT1 and 10·9 and 9·5 months following LOT2, respectively. PFS was generally longer in patients treated with bortezomib-based or thalidomide-based therapy versus chemotherapy. These data shed light on recent trends in the management of MM in Latin America. Slower uptake of newer therapies in public clinics and poor PFS among patients with relapsed MM point to areas of unmet therapeutic need in Latin America.


Subject(s)
Multiple Myeloma/therapy , Practice Patterns, Physicians'/statistics & numerical data , Adult , Age Factors , Aged , Antineoplastic Combined Chemotherapy Protocols/therapeutic use , Bortezomib/administration & dosage , Comorbidity , Drug Utilization/statistics & numerical data , Female , Follow-Up Studies , Hematopoietic Stem Cell Transplantation/statistics & numerical data , Humans , Kaplan-Meier Estimate , Latin America/epidemiology , Male , Middle Aged , Multiple Myeloma/epidemiology , Private Facilities/statistics & numerical data , Public Facilities/statistics & numerical data , Retrospective Studies , Thalidomide/administration & dosage , Treatment Outcome
18.
Int J Cancer ; 147(4): 920-930, 2020 08 15.
Article in English | MEDLINE | ID: mdl-31863463

ABSTRACT

Although previous studies have suggested a potential role of sex hormones in the etiology of colorectal cancer (CRC), no study has yet examined the associations between circulating sex hormones and survival among CRC patients. We prospectively assessed the associations of prediagnostic plasma concentrations of estrone, estradiol, free estradiol, testosterone, free testosterone and sex hormone-binding globulin (SHBG) with CRC-specific and overall mortality among 609 CRC patients (370 men and 239 postmenopausal women not taking hormone therapy at blood collection) from four U.S. cohorts. Multivariable hazard ratios (HRs) and 95% confidence intervals (CIs) were estimated using Cox proportional hazard regression. We identified 174 deaths (83 CRC-specific deaths) in men and 106 deaths (70 CRC-specific deaths) in women. In men, higher circulating level of free testosterone was associated with lower risk of overall (the highest vs. lowest tertiles, HR = 0.66, 95% CI, 0.45-0.99, ptrend = 0.04) and possibly CRC-specific mortality (HR = 0.73, 95% CI, 0.41-1.29, ptrend = 0.27). We generally observed nonsignificant inverse associations for other sex steroids, and a positive association for SHBG with CRC-specific mortality among male patients. In women, however, we found a suggestive positive association of estrone with overall (HR = 1.54, 95% CI, 0.92-2.60, ptrend = 0.11) and CRC-specific mortality (HR = 1.96, 95% CI, 1.01-3.84, ptrend = 0.06). Total estradiol, free estradiol and free testosterone were generally suggestively associated with higher risk of mortality among female patients, although not statistically significant. These findings implicated a potential role of endogenous sex hormones in CRC prognosis, which warrant further investigation.


Subject(s)
Colorectal Neoplasms/blood , Estradiol/blood , Estrone/blood , Sex Hormone-Binding Globulin/metabolism , Testosterone/blood , Aged , Aged, 80 and over , Colorectal Neoplasms/diagnosis , Female , Humans , Male , Middle Aged , Multivariate Analysis , Postmenopause , Prospective Studies , Survival Analysis
19.
J Am Coll Nutr ; 36(6): 462-469, 2017 Aug.
Article in English | MEDLINE | ID: mdl-28682183

ABSTRACT

BACKGROUND: The Women's Health Initiative (WHI) Dietary Modification (DM) trial did not show that reductions in dietary fat accompanied by increases in vegetable and fruit consumption decrease the incidence of colorectal cancer. Secondary analyses suggested that aspirin use may modify the intervention effects of DM on colorectal cancer development, although a recent reanalysis including the postintervention period confirmed no main effect of the intervention on reducing colorectal cancer incidence. METHODS: We analyzed data from 48,834 postmenopausal women who were randomized into the low-fat DM (N = 19,540) or comparison (N = 29,294) group for an average of 8.1 years and followed for an additional 9.4 years through August 31, 2014. Exposure to specific class(es) or strength(s) of nonsteroidal anti-inflammatory drugs (NSAIDs) was modeled at baseline and as time-dependent use through the 9-year clinic visit. A Cox proportional hazard model was employed to assess the association of the DM, medication use, and their interaction with colorectal cancer events. RESULTS: A total of 906 incident cases of colorectal cancer were identified during the intervention and postintervention periods. In both exposure models, colorectal cancer incidence did not differ between the DM and comparison groups among users of any type of NSAID. None of the interactions with any category of NSAID use was statistically significant; however, there was modest evidence for an interaction (p = 0.07) with aspirin use at baseline (hazard ratio [HR] = 0.81, 95% confidence interval [CI], 0.60-1.11 for users; HR = 1.12, 95% CI, 0.97-1.30 for nonusers). Strength and duration of aspirin use at baseline did not alter the associations. CONCLUSION: Extended follow-up of women in the WHI DM trial did not confirm combined protective effects of aspirin and a low-fat diet on colorectal cancer risk among postmenopausal women.


Subject(s)
Anti-Inflammatory Agents, Non-Steroidal , Colorectal Neoplasms/prevention & control , Dietary Fats/administration & dosage , Aged , Female , Humans , Middle Aged , Proportional Hazards Models , Risk Factors
20.
PLoS One ; 11(7): e0158822, 2016.
Article in English | MEDLINE | ID: mdl-27438335

ABSTRACT

BACKGROUND: The burden of disease due to norovirus infection has been well described in the general United States population, but studies of norovirus occurrence among persons with chronic medical conditions have been limited mostly to the immunocompromised. We assessed the impact of norovirus gastroenteritis on health care utilization in US subjects with a range of chronic medical conditions. METHODS: We performed a retrospective cohort study using MarketScan data from July 2002 to December 2013, comparing the rates of emergency department visits, outpatient visits and hospitalizations among patients with chronic conditions (renal, cardiovascular, respiratory, immunocompromising, gastrointestinal, hepatic/pancreatic and neurological conditions and diabetes) with those in a healthy population. We estimated the rates of these outcomes due to norovirus gastroenteritis using an indirect modelling approach whereby cases of gastroenteritis of unknown cause and not attributed to a range of other causes were assumed to be due to norovirus. RESULTS: Hospitalization rates for norovirus gastroenteritis were higher in all of the risk groups analyzed compared with data in otherwise healthy subjects, ranging from 3.2 per 10,000 person-years in persons with chronic respiratory conditions, to 23.1 per 10,000 person-years in persons with chronic renal conditions, compared to 2.1 per 10,000 among persons without chronic conditions. Over 51% of all norovirus hospitalizations occurred in the 37% of the population with some form of chronic medical condition. Outpatient visits for norovirus gastroenteritis were also increased in persons with chronic gastrointestinal or immunocompromising conditions. CONCLUSION: Norovirus gastroenteritis leads to significantly higher rates of healthcare utilization in patients with a chronic medical condition compared to patients without any such condition.
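The indirect attribution approach above treats cause-unspecified gastroenteritis as norovirus and expresses rates per 10,000 person-years; a minimal arithmetic sketch follows (input values are illustrative, not taken from the study).

```python
# Sketch of the indirect attribution and rate calculation. Inputs are
# illustrative, not study data.
def rate_per_10000_person_years(all_gastro_events, events_with_known_cause, person_years):
    """Residual (cause-unspecified) events per person-time, scaled to 10,000 person-years."""
    presumed_norovirus = all_gastro_events - events_with_known_cause
    return 10_000 * presumed_norovirus / person_years

# Example: 2,310 unexplained events over 1,000,000 person-years -> 23.1 per 10,000 PY.
```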


Subject(s)
Caliciviridae Infections/epidemiology , Chronic Disease/epidemiology , Models, Statistical , Norovirus/physiology , Acute Disease , Adolescent , Adult , Age Factors , Aged , Aged, 80 and over , Child , Child, Preschool , Delivery of Health Care/statistics & numerical data , Emergency Service, Hospital/statistics & numerical data , Gastroenteritis/diagnosis , Gastroenteritis/epidemiology , Gastroenteritis/virology , Hospitalization/statistics & numerical data , Humans , Incidence , Infant , Infant, Newborn , Middle Aged , Outpatients/statistics & numerical data , Risk Factors , Seasons , Young Adult