Results 1 - 7 of 7
1.
Am J Health Syst Pharm; 76(5): 275-285, 2019 Feb 09.
Article in English | MEDLINE | ID: mdl-30698654

ABSTRACT

PURPOSE: To compare patients with atrial fibrillation (AF) initiating direct oral anticoagulants (DOACs) versus warfarin on clinical outcomes including stroke, systemic embolism (SE), bleeding events, and cost of care. METHODS: This retrospective observational study used Medicare Advantage Prescription Drug and fully insured commercial claims from the Humana Research Database. Patients with AF who initiated a DOAC or warfarin from January 1, 2012, through September 30, 2015, were included. The date of the first DOAC or warfarin prescription was the index date. Patients in the DOAC and warfarin groups were matched on propensity scores. Patients were censored at end of enrollment or study period, discontinuation, or switch of index medication. Clinical outcomes were compared in the matched groups using Cox proportional hazards models. Annualized costs and costs adjusted for censoring using Lin's interval method were also compared between the two cohorts. RESULTS: Patients on DOACs had a significantly lower risk of ischemic stroke (hazard ratio [HR], 0.88; 95% confidence interval [CI], 0.79-0.98), hemorrhagic stroke (HR, 0.65; 95% CI, 0.46-0.92), SE (HR, 0.53; 95% CI, 0.43-0.65), and the composite outcome of stroke or SE (HR, 0.78; 95% CI, 0.71-0.86) compared with patients on warfarin. The difference in bleeding risk was not statistically significant (HR, 0.85; 95% CI, 0.71-1.01). While annualized pharmacy costs were higher, annualized medical and total costs were lower in the DOAC group than in the warfarin group. CONCLUSION: The results of the study indicated that patients on DOACs had lower rates of ischemic stroke, hemorrhagic stroke, SE, and the composite outcome of stroke or SE compared with patients on warfarin. No significant differences in bleeding rates between the DOAC and warfarin groups were observed, while total cost of care was lower in the DOAC group.
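For readers who want to see the shape of the analysis, the following is a minimal Python sketch of a propensity-score-matched Cox proportional hazards comparison of the kind this abstract describes, run on synthetic data. The covariates, the 1:1 nearest-neighbor matching with replacement, and all numbers are illustrative assumptions, not the authors' actual variables or code.

```python
import numpy as np
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.neighbors import NearestNeighbors
from lifelines import CoxPHFitter

rng = np.random.default_rng(0)
n = 5000
df = pd.DataFrame({
    "age": rng.normal(74, 8, n),                 # baseline covariates (assumed)
    "chads_vasc": rng.integers(0, 7, n),
    "doac": rng.integers(0, 2, n),               # 1 = DOAC initiator, 0 = warfarin
})
df["time"] = rng.exponential(365, n)             # synthetic time to stroke / censoring
df["stroke"] = rng.integers(0, 2, n)             # 1 = event observed, 0 = censored

# 1) Propensity score: probability of initiating a DOAC given baseline covariates
ps = LogisticRegression(max_iter=1000).fit(df[["age", "chads_vasc"]], df["doac"])
df["ps"] = ps.predict_proba(df[["age", "chads_vasc"]])[:, 1]

# 2) Simple 1:1 nearest-neighbor matching on the propensity score (with replacement)
treated = df[df["doac"] == 1]
control = df[df["doac"] == 0]
nn = NearestNeighbors(n_neighbors=1).fit(control[["ps"]])
_, idx = nn.kneighbors(treated[["ps"]])
matched = pd.concat([treated, control.iloc[idx.ravel()]]).reset_index(drop=True)

# 3) Cox proportional hazards model on the matched sample; exp(coef) for `doac`
#    plays the role of the hazard ratios reported in the abstract
cph = CoxPHFitter()
cph.fit(matched[["time", "stroke", "doac", "age", "chads_vasc"]],
        duration_col="time", event_col="stroke")
cph.print_summary()
```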


Subject(s)
Anticoagulants/administration & dosage; Atrial Fibrillation/drug therapy; Atrial Fibrillation/epidemiology; Medicare Part C; Warfarin/administration & dosage; Aged; Aged, 80 and over; Anticoagulants/adverse effects; Atrial Fibrillation/diagnosis; Female; Hemorrhage/chemically induced; Hemorrhage/diagnosis; Hemorrhage/epidemiology; Humans; Longitudinal Studies; Male; Medicare Part C/trends; Retrospective Studies; Stroke/diagnosis; Stroke/epidemiology; Stroke/prevention & control; Treatment Outcome; United States/epidemiology; Warfarin/adverse effects
2.
J Comp Eff Res; 7(7): 685-691, 2018 Jul.
Article in English | MEDLINE | ID: mdl-29808717

ABSTRACT

Factors influencing differences in persistence between dabigatran and warfarin in patients with nonvalvular atrial fibrillation (NVAF) remain unclear. AIM: To compare differences in persistence between new dabigatran and warfarin users in patients newly diagnosed with NVAF, adjusting for sociodemographics, clinical characteristics, patient out-of-pocket cost, and other covariates. METHODS: A retrospective matched-cohort study was conducted using a US claims database of Medicare and commercially insured patients with NVAF aged ≥18 years. Persistence and monthly out-of-pocket costs for dabigatran or warfarin were calculated and adjusted for covariates using Cox proportional hazards models. RESULTS & CONCLUSION: Unadjusted persistence was significantly lower among dabigatran users (n = 1025) than among matched warfarin users (38% vs 46%). Adjusting for covariates rendered this difference nonsignificant (hazard ratio = 0.930).
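Unadjusted persistence proportions of this kind (e.g., 38% vs 46% still on therapy) can be read off Kaplan-Meier curves before covariate adjustment. The sketch below shows that step on synthetic data; the variable names and the 365-day evaluation point are assumptions for illustration, not the study's data or definitions.

```python
import numpy as np
import pandas as pd
from lifelines import KaplanMeierFitter

rng = np.random.default_rng(1)
n = 2000
df = pd.DataFrame({
    "drug": rng.choice(["dabigatran", "warfarin"], n),
    "days_on_therapy": rng.exponential(400, n),   # synthetic time to discontinuation
    "discontinued": rng.integers(0, 2, n),        # 0 = censored (still on therapy)
})

for drug, grp in df.groupby("drug"):
    km = KaplanMeierFitter()
    km.fit(grp["days_on_therapy"], event_observed=grp["discontinued"], label=drug)
    # Survival probability at 365 days = unadjusted proportion still persistent at 1 year
    print(drug, float(km.predict(365)))
```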


Subject(s)
Anticoagulants/therapeutic use; Atrial Fibrillation/drug therapy; Dabigatran/therapeutic use; Warfarin/therapeutic use; Aged; Antithrombins/economics; Antithrombins/therapeutic use; Atrial Fibrillation/economics; Cohort Studies; Costs and Cost Analysis; Dabigatran/economics; Databases, Factual; Drug Costs; Female; Humans; Male; Medicare/economics; Medication Adherence; Proportional Hazards Models; Retrospective Studies; Stroke/economics; Stroke/prevention & control; United States; Warfarin/economics
3.
Pain Pract; 14(3): E116-25, 2014 Mar.
Article in English | MEDLINE | ID: mdl-24268019

ABSTRACT

OBJECTIVE: Growth in the number of patients with pain conditions, and the subsequent rise in prescription opioid use for treatment, has been accompanied by an increase in diagnosed opioid abuse. Understanding what drives the incremental healthcare costs of members diagnosed with prescription opioid abuse may assist in developing better screening techniques for abuse. DESIGN: This retrospective analysis examined costs, resource use, and comorbidities 365 days pre- and postdiagnosis in prescription opioid users diagnosed with abuse (cases) vs. their matched nondiagnosed controls. The inclusion criterion for cases was a diagnosis of opioid abuse (ICD-9-CM: 304.0x, 304.7x, 305.5x, 965.0x). Multivariate analysis used generalized linear modeling with log-transformed cost as the dependent variable, controlling for comorbidities. RESULTS: Final sample sizes were 8,390 cases and 16,780 matched controls. Postindex abuse-related costs were $2,099 for commercial members, $539 for Medicare members aged < 65, and $170 for Medicare members aged ≥ 65. A higher percentage of cases had pain conditions (82.0% vs. 57.4% commercial, 95.9% vs. 87.5% Medicare members aged < 65, 92.9% vs. 82.4% Medicare members aged ≥ 65, P < 0.0001), and cases had a higher mean number of opioid prescribers (3.7 vs. 1.4 commercial, 3.3 vs. 2.2 Medicare < 65, 2.2 vs. 1.6 Medicare ≥ 65, P < 0.0001) than controls preindex. Cases had higher rates of substance abuse and psychiatric diagnoses pre- and postindex (P < 0.0001, all comparisons). Adjusted costs were 28% higher for cases than for controls (P < 0.0001). CONCLUSION: Costs of members diagnosed with prescription opioid abuse are driven by higher rates of pain and psychiatric comorbidities relative to nonabuse controls.
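The regression described here, a model of log-transformed cost with a case indicator and comorbidity adjusters, can be sketched as follows; exponentiating the case coefficient yields a cost ratio of the "28% higher" form reported above. All data and covariate names below are synthetic assumptions, not the study's variables.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(2)
n = 4000
df = pd.DataFrame({
    "case": rng.integers(0, 2, n),     # 1 = diagnosed opioid abuse, 0 = matched control
    "pain_dx": rng.integers(0, 2, n),  # hypothetical comorbidity flags
    "psych_dx": rng.integers(0, 2, n),
})
# Simulate right-skewed costs that run higher for cases (purely illustrative)
df["cost"] = np.exp(6 + 0.25 * df["case"] + 0.3 * df["pain_dx"] + rng.normal(0, 1, n))

# Model of log-transformed cost, controlling for comorbidities
model = smf.ols("np.log(cost) ~ case + pain_dx + psych_dx", data=df).fit()
ratio = np.exp(model.params["case"])
print(f"Adjusted cost ratio, cases vs controls: {ratio:.2f}")  # ~1.28 reads as "28% higher"
```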


Subject(s)
Analgesics, Opioid/economics; Health Care Costs; Medicare/economics; Opioid-Related Disorders/economics; Adolescent; Adult; Aged; Aged, 80 and over; Child; Female; Humans; Male; Middle Aged; Retrospective Studies; United States; Young Adult
4.
Pain Pract; 14(3): E106-15, 2014 Mar.
Article in English | MEDLINE | ID: mdl-24289539

ABSTRACT

PURPOSE: To measure the prevalence of diagnosed opioid abuse and prescription opioid use in a multistate managed care organization. METHODS: This retrospective claims data analysis reviewed the prevalence of diagnosed opioid abuse and the parallel prevalence of prescription opioid use in half-year intervals for commercial and Medicare members enrolled with Humana Inc., from January 1, 2008, to June 30, 2010. Diagnosis of opioid abuse was defined by ≥ 1 medical claim with any of the following ICD-9-CM codes: 304.0x, 304.7x, 305.5x, 965.0x (excluding 965.01), and opioid use was defined by ≥ 1 filled prescription for an opioid. The prevalence of opioid abuse was defined as the number of members with an opioid abuse diagnosis divided by the number of members enrolled in each 6-month interval. RESULTS: The 6-month prevalence of diagnosed opioid abuse increased from 0.84 to 1.15 per 1,000 among commercial members and from 3.17 to 6.35 per 1,000 among Medicare members. In contrast, there was no marked increase in prescription opioid use during the same time period (118.0 to 114.8 per 1,000 for commercial members, 240.6 to 256.9 per 1,000 for Medicare members). The prevalence of diagnosed opioid abuse was highest among members younger than 65 years for both genders, in 18- to 34-year-olds in the commercial population and in 35- to 54-year-olds in the Medicare population. CONCLUSIONS: Despite a stable rate of prescription opioid use in the observed population, the prevalence of diagnosed opioid abuse is increasing, particularly in the Medicare population.
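The prevalence definition used here reduces to simple arithmetic, sketched below. The member counts are made up for illustration; only the per-1,000 convention comes from the abstract.

```python
def prevalence_per_1000(diagnosed_members: int, enrolled_members: int) -> float:
    """Members with >=1 opioid-abuse diagnosis in the interval, per 1,000 enrolled."""
    return 1000 * diagnosed_members / enrolled_members

# Hypothetical counts: 840 diagnosed among 1,000,000 enrolled in a 6-month interval
# gives 0.84 per 1,000, the order of magnitude reported for commercial members.
print(prevalence_per_1000(840, 1_000_000))  # 0.84
```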


Subject(s)
Medicare/economics; Opioid-Related Disorders/epidemiology; Analgesics, Opioid/economics; Databases, Factual; Humans; Insurance Claim Review/economics; Managed Care Programs; Opioid-Related Disorders/diagnosis; Opioid-Related Disorders/economics; Prevalence; Retrospective Studies; United States
5.
Am J Manag Care; 19(10): 816-23, 2013 Oct.
Article in English | MEDLINE | ID: mdl-24304160

ABSTRACT

OBJECTIVE: To identify inefficiencies in drug and medical service utilization related to pain management in patients with osteoarthritis and chronic low back pain. STUDY DESIGN: This retrospective cohort study applied revised measures of pain management inefficiencies to Humana Medicare members with osteoarthritis and/or chronic low back pain. METHODS: Subjects had either 2 or more claims for osteoarthritis on different days or 2 or more claims for low back pain 90 or more days apart, from January 1, 2008, to June 30, 2010, with the first occurrence assigned as the index date. Inefficiencies were identified for 365 days postindex. Pain-related healthcare costs postindex were compared between members with and without inefficiencies. A generalized linear model calculated adjusted costs per member, controlling for age, sex, and comorbidities. RESULTS: Most members diagnosed with osteoarthritis, chronic low back pain, or both (N = 68,453) had at least 1 inefficiency measure (n = 37,863) during the postindex period. The highest per-member costs were for repeated surgical procedures ($26,451) and inpatient admissions ($19,372), compared with $781 for members without inefficiencies (P < .0001). The highest total costs (prevalence times per-member cost) were for repeated diagnostic testing and excessive office visits. Members with an inefficiency had adjusted pain-related costs 5.42 times higher than those of members without an inefficiency (P < .0001). CONCLUSIONS: Pain management inefficiencies are common and costly among Humana Medicare members with osteoarthritis and/or chronic low back pain. Further work by providers and payers is needed to determine the benefits of member identification and early intervention for these inefficiencies.
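The distinction drawn here between high per-member costs and high total costs follows from multiplying prevalence by per-member cost, as in the sketch below. The per-member costs for the first two categories come from the abstract; the member counts and the remaining per-member costs are illustrative assumptions only.

```python
# (members_affected, cost_per_member): per-member costs for the first two entries
# are taken from the abstract; every other number is an illustrative assumption.
inefficiencies = {
    "repeated surgical procedures": (1_200, 26_451),
    "inpatient admissions": (2_500, 19_372),
    "repeated diagnostic testing": (20_000, 3_000),
    "excessive office visits": (25_000, 2_000),
}

# Rank by total cost burden = prevalence (members affected) x per-member cost
for name, (members, per_member) in sorted(
        inefficiencies.items(), key=lambda kv: kv[1][0] * kv[1][1], reverse=True):
    print(f"{name}: total ~ ${members * per_member:,}")
```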


Subject(s)
Low Back Pain/therapy; Osteoarthritis/therapy; Pain Management/economics; Adolescent; Adult; Chronic Pain/economics; Chronic Pain/therapy; Humans; Insurance Claim Review; Low Back Pain/economics; Middle Aged; Osteoarthritis/economics; Outcome Assessment, Health Care; Pain Management/standards; Quality of Health Care; Retrospective Studies; Young Adult
6.
J Theor Biol; 329: 20-31, 2013 Jul 21.
Article in English | MEDLINE | ID: mdl-23567649

ABSTRACT

There is a need to advance our ability to conduct credible human risk assessments for inhalational anthrax associated with exposure to a low number of bacteria. Combining animal data with computational models of disease will be central to the low-dose and cross-species extrapolations required to achieve this goal. The objective of the current work was to apply and advance the competing risks (CR) computational model of inhalational anthrax using data collected from NZW rabbits exposed to aerosols of Ames strain Bacillus anthracis. An initial aim was to parameterize the CR model using high-dose rabbit data and then conduct a low-dose extrapolation. The CR low-dose attack rate was then compared against known low-dose rabbit data as well as the low-dose curve obtained when the entire rabbit dose-response data set was fitted to an exponential dose-response (EDR) model. The CR model predictions demonstrated excellent agreement with actual low-dose rabbit data. We next used a modified CR (MCR) model to examine the disease incubation period (the time to reach a fever > 40 °C). The MCR model predicted a germination period of 14.5 h following exposure to a low spore dose, which was confirmed by monitoring spore germination in the rabbit lung using PCR, and predicted a low-dose disease incubation period in the rabbit of between 14.7 and 16.8 days. Overall, the CR and MCR models appeared to describe rabbit inhalational anthrax well. These results are discussed in the context of conducting laboratory studies in other relevant animal models, combining the CR/MCR model with other computational models of inhalational anthrax, and using the resulting information toward extrapolating a low-dose response prediction for humans.
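The exponential dose-response (EDR) model referred to in this abstract is commonly written as P(infection) = 1 - exp(-k · dose). The sketch below fits that one-parameter form to synthetic attack-rate data and extrapolates to a low dose; the dose-response points and the fitted k are assumptions for illustration, not the rabbit data from the study.

```python
import numpy as np
from scipy.optimize import curve_fit

def edr(dose, k):
    """Exponential dose-response model: probability of infection at a given dose."""
    return 1.0 - np.exp(-k * dose)

# Synthetic attack-rate data (dose in spores, fraction of animals infected) -- assumed
doses = np.array([1e2, 1e3, 1e4, 1e5, 1e6])
attack_rate = np.array([0.01, 0.08, 0.55, 0.99, 1.0])

k_hat, _ = curve_fit(edr, doses, attack_rate, p0=[1e-4])
print(f"fitted k = {k_hat[0]:.2e}")
# Low-dose extrapolation: predicted probability of infection at 10 spores
print(f"P(infection | 10 spores) = {edr(10, k_hat[0]):.5f}")
```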


Subject(s)
Anthrax/microbiology; Bacillus anthracis/pathogenicity; Infectious Disease Incubation Period; Models, Biological; Respiratory Tract Infections/microbiology; Animals; Anthrax/prevention & control; Anthrax Vaccines; Bacillus anthracis/physiology; Bacterial Load; Disease Models, Animal; Lung/microbiology; Male; Rabbits; Respiratory Tract Infections/prevention & control; Risk Assessment/methods; Spores, Bacterial/pathogenicity; Spores, Bacterial/physiology
7.
Article in English | MEDLINE | ID: mdl-22919678

ABSTRACT

There is a need to better understand inhalational anthrax in relevant animal models. This understanding could aid risk assessment, help define therapeutic windows, and provide a better understanding of disease. The aim here was to characterize and quantify bacterial deposition and dissemination in rabbits from immediately after exposure to a single high aerosol dose (>100 LD50) of Bacillus anthracis (Ames) spores through 36 h post-exposure. The primary goal of collecting the data was to support investigators in developing computational models of inhalational anthrax disease. Rabbits were vaccinated prior to exposure with the human vaccine (Anthrax Vaccine Adsorbed, AVA) or were sham-vaccinated, and were then exposed in pairs (one sham and one AVA) so that disease kinetics could be characterized in equally dosed hosts, with one group fully protected and able to clear the infection (AVA-vaccinated) while the other is susceptible to disease, allowing the bacteria to escape containment and replicate uncontrolled (sham-vaccinated rabbits). Between 4% and 5% of the presented aerosol dose was retained in the lung of sham- and AVA-vaccinated rabbits, as measured by dilution plate analysis of homogenized lung tissue or bronchoalveolar lavage (BAL) fluid. After 6 and 36 h, >80% and >96%, respectively, of the deposited spores were no longer detected in BAL, with no detectable difference between sham- and AVA-vaccinated rabbits. Thereafter, differences between the two groups became noticeable. In sham-vaccinated rabbits the bacteria were detected in the tracheobronchial lymph nodes (TBLN) 12 h post-exposure and in the circulation at 24 h, a time point that was also associated with dramatic increases in vegetative CFU in the lung tissue of some animals. In all sham-vaccinated rabbits, bacteria increased in both TBLN and blood through 36 h, at which point some rabbits succumbed to disease. In contrast, AVA-vaccinated rabbits showed small numbers of CFU in TBLN between 24 and 36 h post-exposure, with small numbers of bacteria in the circulation only at 24 h post-exposure. These results characterize and quantify disease progression in naïve rabbits following aerosol administration of Ames spores, which may be useful in a number of different research applications, including developing quantitative models of infection for use in human inhalational anthrax risk assessment.
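A back-of-envelope illustration of the deposition figures quoted above (roughly 4-5% of the presented dose retained in the lung, and >96% of deposited spores no longer detected in BAL by 36 h). The presented dose below is an assumed example value, not the study's actual challenge dose.

```python
presented_dose = 1_000_000              # spores presented in the aerosol (assumed example)
deposited = presented_dose * 0.045      # ~4-5% of the presented dose retained in the lung
still_detected_36h = deposited * 0.04   # <4% of deposited spores still detected in BAL at 36 h

print(f"deposited   ~ {deposited:,.0f} spores")
print(f"at 36 h     ~ {still_detected_36h:,.0f} spores "
      f"({still_detected_36h / presented_dose:.2%} of the presented dose)")
```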


Subject(s)
Anthrax Vaccines/immunology; Anthrax/complications; Anthrax/pathology; Bacillus anthracis/pathogenicity; Bacteremia/pathology; Blood/microbiology; Lung/microbiology; Respiratory Tract Infections/complications; Respiratory Tract Infections/pathology; Animals; Anthrax/microbiology; Anthrax/prevention & control; Anthrax Vaccines/administration & dosage; Bacteremia/microbiology; Bacteremia/prevention & control; Bacterial Load; Disease Models, Animal; Follow-Up Studies; Inhalation Exposure; Lymph Nodes/microbiology; Rabbits; Respiratory Tract Infections/microbiology; Respiratory Tract Infections/prevention & control; Time Factors