Results 1 - 20 of 256
1.
Pharmacoepidemiol Drug Saf ; 32(5): 545-557, 2023 05.
Article in English | MEDLINE | ID: mdl-36464785

ABSTRACT

BACKGROUND: We sought to develop and prospectively validate a dynamic model that incorporates changes in biomarkers to predict rapid clinical deterioration in patients hospitalized for COVID-19. METHODS: We established a retrospective cohort of hospitalized patients aged ≥18 years with laboratory-confirmed COVID-19 using electronic health records (EHR) from a large integrated care delivery network in Massachusetts comprising >40 facilities, from March to November 2020. A total of 71 factors, including time-varying vital signs and laboratory findings during hospitalization, were screened. We used elastic net regression and tree-based scan statistics for variable selection to predict rapid deterioration, defined as progression by two levels of a published severity scale in the next 24 h. The development cohort included the first 70% of patients identified chronologically in calendar time; the remaining 30% served as the validation cohort. A cut-off point was estimated to alert clinicians to a high risk of imminent clinical deterioration. RESULTS: Overall, 3706 patients (2587 in the development and 1119 in the validation cohort) met the eligibility criteria, with a median of 6 days of follow-up. Twenty-four variables were selected in the final model, including 16 dynamic changes in laboratory results or vital signs. The area under the ROC curve was 0.81 (95% CI, 0.79-0.82) in the development set and 0.74 (95% CI, 0.71-0.78) in the validation set. The model was well calibrated (slope = 0.84 and intercept = -0.07 on the calibration plot in the validation set). The estimated cut-off point, with a positive predictive value of 83%, was 0.78. CONCLUSIONS: Our prospectively validated dynamic prognostic model demonstrated temporal generalizability in a rapidly evolving pandemic and can be used to inform day-to-day treatment and resource allocation decisions based on dynamic changes in biophysiological factors.
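The chronological 70/30 split used for temporal validation in this study can be sketched in a few lines. The `admit_date` field and dict layout below are hypothetical, for illustration only, not the study's EHR schema:

```python
def chronological_split(patients, dev_fraction=0.7):
    """Split patients into a development cohort (earliest fraction) and a
    validation cohort (the rest), ordered by admission date, mirroring a
    temporal validation design."""
    ordered = sorted(patients, key=lambda p: p["admit_date"])
    cut = int(len(ordered) * dev_fraction)
    return ordered[:cut], ordered[cut:]
```

Because the split is by calendar time rather than at random, validation performance reflects how the model generalizes to later phases of the pandemic.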


Subject(s)
COVID-19 , Clinical Deterioration , Humans , Adolescent , Adult , COVID-19/epidemiology , Prognosis , Retrospective Studies , Hospitalization
2.
Pharmacoepidemiol Drug Saf ; 31(12): 1280-1286, 2022 12.
Article in English | MEDLINE | ID: mdl-36089808

ABSTRACT

Traditional approaches to hypothesis testing in comparative post-approval safety and effectiveness studies of medical products are often inadequate because of a limited scope of possible inferences (e.g., superiority or inferiority). Often there is interest in simultaneously testing for superiority, equivalence, inferiority, non-inferiority, and non-superiority, which can be achieved using a partition testing framework. Partition testing requires only selection of an equivalence margin and calculation of a two-sided Wald confidence interval. In addition to permitting a broader range of inferences, the strengths of the approach include mitigating publication bias, avoiding use of a clinically irrelevant nil hypothesis, and more transparent and impartial appraisal of the clinical importance of a study's findings by pre-specifying an equivalence margin. However, a challenge in implementing the approach can be the process of identifying an equivalence margin. The methodology is illustrated using a published study of the safety of ondansetron for the off-label treatment of nausea and vomiting during pregnancy. Applying the method to the study results would have led to the conclusion that women exposed to ondansetron were equivalent to unexposed women with respect to risk of cardiac malformations and oral clefts. These conclusions are more in line with the magnitude of the observed effects than the conclusions resulting from the traditional inferiority/superiority testing conducted by the study authors.
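The readout of such a partition test reduces to checking one two-sided Wald confidence interval against the pre-specified margin. A minimal sketch, assuming effects on a scale where 0 means no difference (e.g., log risk ratio) and positive values mean the treated group fares worse; the function name is illustrative, not from the paper:

```python
def classify_effect(ci_low, ci_high, margin):
    """Read simultaneous conclusions off a two-sided Wald CI.

    `margin` (> 0) is the pre-specified equivalence margin on the same
    scale as the CI. Several conclusions can hold at once, which is the
    point of partition testing.
    """
    conclusions = set()
    if ci_high < 0:
        conclusions.add("superiority")      # CI entirely below no-effect
    if ci_low > 0:
        conclusions.add("inferiority")      # CI entirely above no-effect
    if ci_low > -margin:
        conclusions.add("non-inferiority")  # rules out harm beyond margin... in favor of comparator
    if ci_high < margin:
        conclusions.add("non-superiority")  # rules out excess harm beyond margin
    if -margin < ci_low and ci_high < margin:
        conclusions.add("equivalence")      # CI entirely inside the margin
    return conclusions
```

For example, a CI that sits entirely inside the margin supports equivalence, non-inferiority, and non-superiority simultaneously, which is the richer inference the abstract describes.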


Subject(s)
Ondansetron , Research Design , Humans , Female
3.
Pharmacoepidemiol Drug Saf ; 31(4): 411-423, 2022 04.
Article in English | MEDLINE | ID: mdl-35092316

ABSTRACT

PURPOSE: The high-dimensional propensity score (HDPS) is a semi-automated procedure for confounder identification, prioritisation and adjustment in large healthcare databases that requires investigators to specify data dimensions, a prioritisation strategy and tuning parameters. In practice, reporting of these decisions is inconsistent, which can undermine the transparency and reproducibility of the results obtained. We illustrate reporting tools, graphical displays and sensitivity analyses to increase transparency and facilitate evaluation of the robustness of analyses involving the HDPS. METHODS: Using a study from the UK Clinical Practice Research Datalink that implemented the HDPS, we demonstrate the application of the proposed recommendations. RESULTS: We identify seven considerations surrounding the implementation of the HDPS, such as the identification of data dimensions, the method for code prioritisation and the number of variables selected. Graphical diagnostic tools include assessing the balance of key confounders before and after adjusting for empirically selected HDPS covariates and identifying potentially influential covariates. Sensitivity analyses include varying the number of covariates selected and assessing the impact of covariates behaving empirically as instrumental variables. In our example, results were robust to both the number of covariates selected and the inclusion of potentially influential covariates. Furthermore, our HDPS models achieved good balance in key confounders. CONCLUSIONS: The data-adaptive approach of the HDPS and its resulting benefits have led to its popularity as a method for confounder adjustment in pharmacoepidemiological studies. Reporting of HDPS analyses in practice may be improved by the considerations and tools proposed here, increasing the transparency and reproducibility of study results.
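The balance diagnostic mentioned here is conventionally based on standardized differences between exposure groups. A hedged sketch for a binary covariate, using the customary 0.1 threshold; function names are illustrative:

```python
import math

def standardized_difference(p_treated, p_control):
    """Standardized difference for a binary covariate: the difference in
    prevalence divided by the pooled standard deviation."""
    pooled_var = (p_treated * (1 - p_treated) + p_control * (1 - p_control)) / 2
    if pooled_var == 0:
        return 0.0
    return (p_treated - p_control) / math.sqrt(pooled_var)

def balanced(p_treated, p_control, threshold=0.1):
    """Absolute standardized differences below ~0.1 are conventionally
    taken to indicate adequate balance after adjustment."""
    return abs(standardized_difference(p_treated, p_control)) < threshold
```

Plotting these values for key confounders before versus after HDPS adjustment is one way to produce the graphical display the abstract recommends.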


Subject(s)
Algorithms , Pharmacoepidemiology , Confounding Factors, Epidemiologic , Humans , Propensity Score , Reproducibility of Results
4.
Ann Intern Med ; 174(9): 1214-1223, 2021 09.
Article in English | MEDLINE | ID: mdl-34280330

ABSTRACT

BACKGROUND: The role of differing levels of frailty in the choice of oral anticoagulants for older adults with atrial fibrillation (AF) is unclear. OBJECTIVE: To examine the outcomes of direct oral anticoagulants (DOACs) versus warfarin by frailty levels. DESIGN: 1:1 propensity score-matched analysis of Medicare data, 2010 to 2017. SETTING: Community. PATIENTS: Medicare beneficiaries with AF who initiated use of dabigatran, rivaroxaban, apixaban, or warfarin. MEASUREMENTS: Composite end point of death, ischemic stroke, or major bleeding by frailty levels, defined by a claims-based frailty index. RESULTS: In the dabigatran-warfarin cohort (n = 158 730; median follow-up, 72 days), the event rate per 1000 person-years was 63.5 for dabigatran initiators and 65.6 for warfarin initiators (hazard ratio [HR], 0.98 [95% CI, 0.92 to 1.05]; rate difference [RD], -2.2 [CI, -6.5 to 2.1]). For nonfrail, prefrail, and frail persons, HRs were 0.81 (CI, 0.68 to 0.97), 0.98 (CI, 0.90 to 1.08), and 1.09 (CI, 0.96 to 1.23), respectively. In the rivaroxaban-warfarin cohort (n = 275 944; median follow-up, 82 days), the event rate per 1000 person-years was 77.8 for rivaroxaban initiators and 83.7 for warfarin initiators (HR, 0.98 [CI, 0.94 to 1.02]; RD, -5.9 [CI, -9.4 to -2.4]). For nonfrail, prefrail, and frail persons, HRs were 0.88 (CI, 0.77 to 0.99), 1.04 (CI, 0.98 to 1.10), and 0.96 (CI, 0.89 to 1.04), respectively. In the apixaban-warfarin cohort (n = 218 738; median follow-up, 84 days), the event rate per 1000 person-years was 60.1 for apixaban initiators and 92.3 for warfarin initiators (HR, 0.68 [CI, 0.65 to 0.72]; RD, -32.2 [CI, -36.1 to -28.3]). For nonfrail, prefrail, and frail persons, HRs were 0.61 (CI, 0.52 to 0.71), 0.66 (CI, 0.61 to 0.70), and 0.73 (CI, 0.67 to 0.80), respectively. LIMITATIONS: Residual confounding and lack of clinical frailty assessment. 
CONCLUSION: For older adults with AF, apixaban was associated with lower rates of adverse events across all frailty levels. Dabigatran and rivaroxaban were associated with lower event rates only among nonfrail patients. PRIMARY FUNDING SOURCE: National Institute on Aging.


Subject(s)
Anticoagulants/administration & dosage , Atrial Fibrillation/drug therapy , Frail Elderly , Warfarin/administration & dosage , Administration, Oral , Aged , Dabigatran/administration & dosage , Female , Humans , Male , Massachusetts , Medicare , Propensity Score , Pyrazoles/administration & dosage , Pyridones/administration & dosage , Retrospective Studies , Rivaroxaban/administration & dosage , United States
5.
JAMA ; 327(11): 1051-1060, 2022 03 15.
Article in English | MEDLINE | ID: mdl-35289881

ABSTRACT

Importance: Guidelines for managing venous thromboembolism (VTE) recommend at least 90 days of therapy with oral anticoagulants. Limited evidence exists about the optimal drug for continuing therapy beyond 90 days. Objective: To compare having prescriptions dispensed for apixaban, rivaroxaban, or warfarin after an initial 90 days of anticoagulation therapy for the outcomes of hospitalization for recurrent VTE, major bleeding, and death. Design, Setting, and Participants: This exploratory retrospective cohort study used data from fee-for-service Medicare (2009-2017) and from 2 commercial health insurance (2004-2018) databases and included 64 642 adults who initiated oral anticoagulation following hospitalization discharge for VTE and continued treatment beyond 90 days. Exposures: Apixaban, rivaroxaban, or warfarin prescribed after an initial 90-day treatment for VTE. Main Outcomes and Measures: Primary outcomes included hospitalization for recurrent VTE and hospitalization for major bleeding. Analyses were adjusted using propensity score weighting. Patients were followed up from the end of the initial 90-day treatment episode until treatment cessation, outcome, death, disenrollment, or end of available data. Weighted Cox proportional hazards models were used to estimate hazard ratios (HRs) and 95% CIs. Results: The study included 9167 patients prescribed apixaban (mean [SD] age, 71 [14] years; 5491 [59.9%] women), 12 468 patients prescribed rivaroxaban (mean [SD] age, 69 [14] years; 7067 [56.7%] women), and 43 007 patients prescribed warfarin (mean [SD] age, 70 [15] years; 25 404 [59.1%] women). The median (IQR) follow-up was 109 (59-228) days for recurrent VTE and 108 (58-226) days for major bleeding outcome. 
After propensity score weighting, the incidence rate of hospitalization for recurrent VTE was significantly lower for apixaban compared with warfarin (9.8 vs 13.5 per 1000 person-years; HR, 0.69 [95% CI, 0.49-0.99]), but the incidence rates were not significantly different between apixaban and rivaroxaban (9.8 vs 11.6 per 1000 person-years; HR, 0.80 [95% CI, 0.53-1.19]) or rivaroxaban and warfarin (HR, 0.87 [95% CI, 0.65-1.16]). Rates of hospitalization for major bleeding were 44.4 per 1000 person-years for apixaban, 50.0 per 1000 person-years for rivaroxaban, and 47.1 per 1000 person-years for warfarin, yielding HRs of 0.92 (95% CI, 0.78-1.09) for apixaban vs warfarin, 0.86 (95% CI, 0.71-1.04) for apixaban vs rivaroxaban, and 1.07 (95% CI, 0.93-1.24) for rivaroxaban vs warfarin. Conclusions and Relevance: In this exploratory analysis of patients prescribed extended-duration oral anticoagulation therapy after hospitalization for VTE, prescription dispenses for apixaban beyond 90 days, compared with warfarin beyond 90 days, were significantly associated with a modestly lower rate of hospitalization for recurrent VTE, but no significant difference in rate of hospitalization for major bleeding. There were no significant differences for comparisons of apixaban vs rivaroxaban or rivaroxaban vs warfarin.


Subject(s)
Anticoagulants/adverse effects , Pyrazoles/adverse effects , Pyridones/adverse effects , Rivaroxaban/adverse effects , Venous Thromboembolism/drug therapy , Warfarin/adverse effects , Administration, Oral , Aged , Aged, 80 and over , Anticoagulants/administration & dosage , Cohort Studies , Female , Hospitalization , Humans , Male , Middle Aged , Pyrazoles/administration & dosage , Pyridones/administration & dosage , Recurrence , Retrospective Studies , Rivaroxaban/administration & dosage , Time Factors , Warfarin/administration & dosage
6.
Am J Epidemiol ; 190(7): 1424-1433, 2021 07 01.
Article in English | MEDLINE | ID: mdl-33615330

ABSTRACT

The tree-based scan statistic (TreeScan; Martin Kulldorff, Harvard Medical School, Boston, Massachusetts) is a data-mining method that adjusts for multiple testing of correlated hypotheses when screening thousands of potential adverse events for signal identification. Simulation has demonstrated the promise of TreeScan with a propensity score (PS)-matched cohort design. However, it is unclear which variables to include in a PS for applied signal identification studies to simultaneously adjust for confounding across potential outcomes. We selected 4 pairs of medications with well-understood safety profiles. For each pair, we evaluated 5 candidate PSs with different combinations of 1) predefined general covariates (comorbidity, frailty, utilization), 2) empirically selected (data-driven) covariates, and 3) covariates tailored to the drug pair. For each pair, statistical alerting patterns were similar with alternative PSs (≤11 alerts in 7,996 outcomes scanned). Inclusion of covariates tailored to exposure did not appreciably affect screening results. Inclusion of empirically selected covariates can provide better proxy coverage for confounders but can also decrease statistical power. Unlike tailored covariates, empirical and predefined general covariates can be applied "out of the box" for signal identification. The choice of PS depends on the level of concern about residual confounding versus loss of power. Potential signals should be followed by pharmacoepidemiologic assessment where confounding control is tailored to the specific outcome(s) under investigation.


Subject(s)
Data Interpretation, Statistical , Data Mining/methods , Drug Evaluation/statistics & numerical data , Pharmacoepidemiology/methods , Propensity Score , Cohort Studies , Humans
7.
Am Heart J ; 233: 109-121, 2021 03.
Article in English | MEDLINE | ID: mdl-33358690

ABSTRACT

BACKGROUND: In patients with atrial fibrillation, incomplete adherence to anticoagulants increases the risk of stroke. Non-warfarin oral anticoagulants (NOACs) are expensive; we evaluated whether higher copayments are associated with lower NOAC adherence. METHODS: Using a national claims database of commercially insured patients, we performed a cohort study of patients with atrial fibrillation who newly initiated a NOAC from 2012 to 2018. Patients were stratified into low (<$35), medium ($35-$59), or high (≥$60) copayment groups and propensity-score weighted based on demographics, insurance characteristics, comorbidities, prior health care utilization, calendar year, and the NOAC received. Follow-up was 1 year, with censoring for switching to a different anticoagulant, undergoing an ablation procedure, disenrolling from the insurance plan, or death. The primary outcome was adherence, measured by the proportion of days covered (PDC). Secondary outcomes included NOAC discontinuation (no refill for 30 days after the end of NOAC supply) and switching anticoagulants. We compared PDC using a Kruskal-Wallis test and rates of discontinuation and switching using Cox proportional hazards models. RESULTS: After weighting patients across the 3 copayment groups, the effective sample size was 17,558 patients, with balance across 50 clinical and demographic covariates (standardized differences <0.1). Mean age was 62 years, 29% of patients were female, and apixaban (43%) and rivaroxaban (38%) were the most common NOACs. Higher copayments were associated with lower adherence (P < .001), with a PDC of 0.82 (interquartile range [IQR] 0.36-0.98) among those with high copayments, 0.85 (IQR 0.41-0.98) among those with medium copayments, and 0.88 (IQR 0.41-0.99) among those with low copayments. Compared to patients with low copayments, patients with high copayments had higher rates of discontinuation (hazard ratio [HR] 1.13, 95% confidence interval [CI] 1.08-1.19; P < .001).
CONCLUSIONS: Among atrial fibrillation patients newly initiating NOACs, higher copayments in commercial insurance were associated with lower adherence and higher rates of discontinuation in the first year. Policies to lower or limit cost-sharing of important medications may lead to improved adherence and better outcomes among patients receiving NOACs.
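Proportion of days covered, the adherence measure used above, can be sketched as follows. Representing each fill as a `(start_day, days_supply)` tuple is an assumption for illustration; overlapping supply is counted once:

```python
def proportion_of_days_covered(fills, period_days=365):
    """Fraction of days in the observation period on which the patient
    had drug supply on hand. `fills` is a list of (start_day, days_supply)
    tuples, with day 0 the start of the period."""
    covered = set()
    for start_day, days_supply in fills:
        for day in range(start_day, start_day + days_supply):
            if 0 <= day < period_days:
                covered.add(day)  # a set deduplicates overlapping fills
    return len(covered) / period_days
```

A PDC threshold such as 0.80 is often used to dichotomize patients into adherent and non-adherent, as in study 14 below.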


Subject(s)
Atrial Fibrillation/complications , Deductibles and Coinsurance/economics , Medication Adherence/statistics & numerical data , Stroke/prevention & control , Anticoagulants/economics , Anticoagulants/therapeutic use , Antithrombins/economics , Antithrombins/therapeutic use , Cohort Studies , Dabigatran/economics , Dabigatran/therapeutic use , Databases, Factual/statistics & numerical data , Deductibles and Coinsurance/statistics & numerical data , Drug Costs , Factor Xa Inhibitors/economics , Factor Xa Inhibitors/therapeutic use , Female , Humans , Male , Medicare Part C/statistics & numerical data , Middle Aged , Pyrazoles/economics , Pyrazoles/therapeutic use , Pyridines/economics , Pyridines/therapeutic use , Pyridones/economics , Pyridones/therapeutic use , Rivaroxaban/economics , Rivaroxaban/therapeutic use , Sample Size , Stroke/etiology , Thiazoles/economics , Thiazoles/therapeutic use , United States , Warfarin/economics , Warfarin/therapeutic use
8.
Ophthalmology ; 128(2): 248-255, 2021 02.
Article in English | MEDLINE | ID: mdl-32777229

ABSTRACT

PURPOSE: There is an urgent need for treatments that prevent or delay progression to advanced age-related macular degeneration (AMD). Drugs already on the market for other conditions could affect progression to neovascular AMD (nAMD); if identified, they could point to targets for drug development. The objective of this study was to use a novel data-mining method that can simultaneously evaluate thousands of correlated hypotheses, while adjusting for multiple testing, to screen for associations between drugs and delayed progression to nAMD. DESIGN: We applied a nested case-control study to administrative insurance claims data to identify cases with nAMD and risk-set-sampled controls matched in a 1:4 variable ratio on age, gender, and recent healthcare use. PARTICIPANTS: The study population included cases with nAMD and risk-set-matched controls. METHODS: We used a tree-based scanning method to evaluate associations between hierarchical classifications of drugs that patients were exposed to within 6 months, 7 to 24 months, or ever before their index date. The index date was the date of first nAMD diagnosis in cases. Risk-set-sampled controls were assigned the same index date as the case to which they were matched. The study was implemented using Medicare data from New Jersey and Pennsylvania and national data from the IBM MarketScan Research Database. We set an a priori threshold for statistical alerting at P ≤ 0.01 and focused on associations of large magnitude (relative risks ≥ 2.0). MAIN OUTCOME MEASURES: Progression to nAMD. RESULTS: Of approximately 4000 generic drugs and drug classes evaluated, the method detected 19 distinct drug exposures with statistically significant, large relative risks indicating that cases were less frequently exposed than controls.
These included (1) drugs with prior evidence for a causal relationship (e.g., megestrol); (2) drugs without prior evidence for a causal relationship, but potentially worth further exploration (e.g., donepezil, epoetin alfa); (3) drugs with alternative biologic explanations for the association (e.g., sevelamer); and (4) drugs that may have resulted in statistical alerts due to their correlation with drugs that alerted for other reasons. CONCLUSIONS: This exploratory drug-screening study identified several potential targets for follow-up studies to further evaluate and determine if they may prevent or delay progression to advanced AMD.


Subject(s)
Choroidal Neovascularization/diagnosis , Drug Evaluation, Preclinical/methods , Drugs, Generic/therapeutic use , Wet Macular Degeneration/diagnosis , Aged , Aged, 80 and over , Case-Control Studies , Choroidal Neovascularization/prevention & control , Data Mining , Disease Progression , Drug Repositioning/methods , Female , Humans , Insurance Claim Review , Male , Medicare/statistics & numerical data , United States , Wet Macular Degeneration/prevention & control
9.
Value Health ; 24(6): 804-811, 2021 06.
Article in English | MEDLINE | ID: mdl-34119078

ABSTRACT

OBJECTIVES: In the United States, brand-name prescription drugs remain expensive until market exclusivity ends and lower-cost generics become available. Delayed generic drug uptake may increase spending and worsen medication adherence and patient outcomes. We assessed recent trends and factors associated with generic uptake. METHODS: Among 227 drugs facing new generic competition from 2012 to 2017, we used a national claims database to measure generic uptake in the first and second year after generic entry, defined as the proportion of claims for a generic version of the drug. Using linear regression, we evaluated associations between generic uptake and key drug characteristics. RESULTS: Mean generic uptake was 66.1% (standard deviation, 22.1%) in the first year and 82.7% (standard deviation, 21.6%) in the second year after generic entry. From 2012 to 2017, generic uptake decreased by 4.3% per year in the first year (95% confidence interval, 2.8%-5.8%; P < .001) and by 3.2% per year in the second year (95% confidence interval, 1.2%-5.1%). Generic uptake was lower for injected than for oral drugs in the first year (38.5% vs 70.0%, P < .001) and second year (50.3% vs 86.9%, P < .001). In the second year, generic uptake was higher among drugs with an authorized generic (86.1% vs 80.1%, P = .045) and among those with ≥3 generic competitors (87.7% vs 78.6%, P = .055). CONCLUSION: Early generic uptake decreased over the past several years. This trend may adversely affect patients and increase prescription drug spending. Policies are needed to encourage generic competition, particularly among injected drugs administered in a hospital or clinic setting.


Subject(s)
Drug Costs/trends , Drug Substitution/trends , Drugs, Generic/therapeutic use , Practice Patterns, Physicians'/trends , Prescription Drugs/therapeutic use , Cost-Benefit Analysis , Databases, Factual , Drug Prescriptions , Drug Substitution/economics , Drug Utilization/trends , Drugs, Generic/economics , Economic Competition/trends , Humans , Medication Adherence , Practice Patterns, Physicians'/economics , Prescription Drugs/economics , Time Factors , United States
10.
Pharmacoepidemiol Drug Saf ; 30(6): 671-684, 2021 06.
Article in English | MEDLINE | ID: mdl-33715267

ABSTRACT

PURPOSE: Consensus is needed on conceptual foundations, terminology and relationships among the various self-controlled "trigger" study designs that control for time-invariant confounding factors and target the association between transient exposures (potential triggers) and abrupt outcomes. The International Society for Pharmacoepidemiology (ISPE) funded a working group of ISPE members to develop guidance material for the application and reporting of self-controlled study designs, similar to the Strengthening the Reporting of Observational Studies in Epidemiology (STROBE) guidelines. This first paper focuses on navigating among the types of self-controlled designs, providing a foundational understanding and guiding principles. METHODS: We leveraged a systematic review of applications of these designs, which we term Self-controlled Crossover Observational PharmacoEpidemiologic (SCOPE) studies. Starting from first principles and using case examples, we reviewed outcome-anchored (case-crossover [CCO], case-time control [CTC], case-case-time control [CCTC]) and exposure-anchored (self-controlled case series [SCCS]) study designs. RESULTS: Key methodological features related to exposure, outcome and time-related concerns are clarified, and a common language and worksheet to facilitate the design of SCOPE studies are introduced. CONCLUSIONS: Consensus on conceptual foundations, terminology and relationships among SCOPE designs will facilitate understanding and critical appraisal of published studies, as well as help in the design, analysis and review of new SCOPE studies. This manuscript is endorsed by ISPE.


Subject(s)
Pharmacoepidemiology , Research Design , Case-Control Studies , Cross-Over Studies , Humans , Time Factors
11.
Pharmacoepidemiol Drug Saf ; 30(3): 320-333, 2021 03.
Article in English | MEDLINE | ID: mdl-33099844

ABSTRACT

PURPOSE: Drug-induced acute liver injury (ALI) is a frequent cause of liver failure. Case-based designs were empirically assessed and calibrated in the French national claims database (SNDS), aiming to identify the optimum design for generating drug safety alerts associated with ALI. METHODS: All cases of ALI were extracted from the SNDS (2009-2014) using specific and sensitive definitions. Positive and negative drug controls were used to compare 196 self-controlled case series (SCCS), case-control (CC) and case-population (CP) design variants, using the area under the receiver operating characteristic curve (AUC), mean square error (MSE) and coverage probability. Parameters that had major impacts on results were identified through logistic regression. RESULTS: Using a specific ALI definition, AUCs ranged from 0.78 to 0.94, 0.64 to 0.92 and 0.48 to 0.85 for SCCS, CC and CP, respectively. MSE ranged from 0.12 to 0.40, 0.22 to 0.39 and 1.03 to 5.29, respectively. Variants adjusting for multiple drug use had higher coverage probabilities. Univariate regressions showed that high AUCs were achieved with SCCS using exposed time as the risk window. The top SCCS variant yielded AUC = 0.93, MSE = 0.22 and coverage = 86%, with 1/7 negative and 13/18 positive controls presenting significant estimates. CONCLUSIONS: SCCS adjusting for multiple drugs and using exposed time as the risk window performed best in generating ALI-related drug safety alerts and in providing estimates of the magnitude of the risk. This approach may be useful in ad hoc pharmacoepidemiology studies to support regulatory actions.


Subject(s)
Pharmaceutical Preparations , Pharmacoepidemiology , Databases, Factual , Delivery of Health Care , Humans , Liver
12.
Ann Intern Med ; 172(7): 463-473, 2020 04 07.
Article in English | MEDLINE | ID: mdl-32150751

ABSTRACT

Background: Apixaban and rivaroxaban are the most commonly prescribed direct oral anticoagulants for adults with atrial fibrillation, but head-to-head data comparing their safety and effectiveness are lacking. Objective: To compare the safety and effectiveness of apixaban versus rivaroxaban for patients with nonvalvular atrial fibrillation. Design: New-user, active-comparator, retrospective cohort study. Setting: A U.S. nationwide commercial health care claims database from 28 December 2012 to 1 January 2019. Patients: Adults newly prescribed apixaban (n = 59 172) or rivaroxaban (n = 40 706). Measurements: The primary effectiveness outcome was a composite of ischemic stroke or systemic embolism. The primary safety outcome was a composite of intracranial hemorrhage or gastrointestinal bleeding. Results: 39 351 patients newly prescribed apixaban were propensity score matched to 39 351 patients newly prescribed rivaroxaban. Mean age was 69 years, 40% of patients were women, and mean follow-up was 288 days for new apixaban users and 291 days for new rivaroxaban users. The incidence rate of ischemic stroke or systemic embolism was 6.6 per 1000 person-years for adults prescribed apixaban compared with 8.0 per 1000 person-years for those prescribed rivaroxaban (hazard ratio [HR], 0.82 [95% CI, 0.68 to 0.98]; rate difference, 1.4 fewer events per 1000 person-years [CI, 0.0 to 2.7]). Adults prescribed apixaban also had a lower rate of gastrointestinal bleeding or intracranial hemorrhage (12.9 per 1000 person-years) compared with those prescribed rivaroxaban (21.9 per 1000 person-years), corresponding to an HR of 0.58 (CI, 0.52 to 0.66) and a rate difference of 9.0 fewer events per 1000 person-years (CI, 6.9 to 11.1). Limitation: Unmeasured confounding, incomplete laboratory data. 
Conclusion: In routine care, adults with atrial fibrillation prescribed apixaban had a lower rate of both ischemic stroke or systemic embolism and bleeding compared with those prescribed rivaroxaban. Primary Funding Source: Division of Pharmacoepidemiology and Pharmacoeconomics, Brigham and Women's Hospital.


Subject(s)
Atrial Fibrillation/drug therapy , Factor Xa Inhibitors/therapeutic use , Pyrazoles/therapeutic use , Pyridones/therapeutic use , Rivaroxaban/therapeutic use , Administration, Oral , Aged , Embolism/epidemiology , Embolism/prevention & control , Factor Xa Inhibitors/administration & dosage , Female , Humans , Incidence , Male , Propensity Score , Pyrazoles/administration & dosage , Pyridones/administration & dosage , Retrospective Studies , Rivaroxaban/administration & dosage , Stroke/epidemiology , Stroke/prevention & control
13.
Am J Epidemiol ; 189(12): 1467-1477, 2020 12 01.
Article in English | MEDLINE | ID: mdl-32639512

ABSTRACT

Using nationwide Danish registries, we conducted a population-based case-crossover study evaluating the association between switching from a vitamin K antagonist (VKA) to a direct oral anticoagulant (DOAC), and vice versa, and 30-day risks of bleeding and arterial thromboembolism in patients with atrial fibrillation (AF). The case-crossover population was identified among oral anticoagulant users during 2011-2018 (n = 123,217) as patients with AF with 1) a case-defining outcome and 2) an anticoagulant switch during the 180 days preceding the outcome. Odds ratios were estimated using conditional logistic regression by comparing the occurrence of switching during the 30-day window immediately preceding the outcome with that in reference windows 60-180 days before the outcome in the same individual. The case-crossover populations for switching from VKA to DOAC and from DOAC to VKA comprised 1,382 and 287 case patients, respectively. Switching from VKA to DOAC, but not from DOAC to VKA, was associated with an increased short-term risk of bleeding (VKA to DOAC: odds ratio = 1.42, 95% confidence interval: 1.13, 1.79; DOAC to VKA: odds ratio = 1.06, 95% confidence interval: 0.64, 1.75) and ischemic stroke (VKA to DOAC: odds ratio = 1.74, 95% confidence interval: 1.21, 2.51; DOAC to VKA: odds ratio = 0.92, 95% confidence interval: 0.46, 1.83). Our findings suggest that switching from VKA to DOAC is an intermittent risk factor for bleeding and ischemic stroke in patients with AF.
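For a single binary exposure, the conditional logistic regression used in a case-crossover analysis like this one agrees with the Mantel-Haenszel odds ratio over matched sets (here, one case window plus several reference windows per person). A sketch under that simplification; the tuple layout is hypothetical:

```python
def mantel_haenszel_or(sets):
    """Mantel-Haenszel odds ratio over 1:M matched sets.

    Each element of `sets` is a tuple
    (case_window_exposed: bool, n_exposed_ref_windows: int, n_ref_windows: int)
    describing one person's case window and reference windows.
    """
    num = 0.0
    den = 0.0
    for case_exposed, exposed_refs, n_refs in sets:
        n = n_refs + 1  # total windows in the matched set
        if case_exposed:
            num += (n_refs - exposed_refs) / n  # exposed case, unexposed refs
        else:
            den += exposed_refs / n             # unexposed case, exposed refs
    return num / den
```

Only sets where the case window and reference windows disagree on exposure contribute, which is why each person serves as their own control for time-invariant confounders.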


Subject(s)
Anticoagulants/adverse effects , Atrial Fibrillation/complications , Drug Substitution/adverse effects , Hemorrhage/chemically induced , Thromboembolism/prevention & control , Aged , Aged, 80 and over , Cohort Studies , Cross-Over Studies , Female , Humans , Male , Thromboembolism/etiology , Vitamin K/antagonists & inhibitors
14.
Am Heart J ; 228: 36-43, 2020 10.
Article in English | MEDLINE | ID: mdl-32768690

ABSTRACT

BACKGROUND: Less than half of patients with cardiometabolic disease consistently take prescribed medications. While health insurers and some delivery organizations use claims to measure adherence, most clinicians do not have access to claims data during routine interactions. Self-reported scales exist, but their practical utility is often limited by length or cost. By contrast, the accuracy of a new 3-item self-reported measure has been demonstrated in individuals with HIV. We evaluated its concordance with claims-based adherence measures in cardiometabolic disease. METHODS: We used data from a recently completed pragmatic trial of patients with cardiometabolic conditions. After 12 months of follow-up, intervention subjects were mailed a survey with the 3-item measure, which queries about medication use in the prior 30 days. Responses were linearly transformed and averaged. Adherence was also measured in claims in month 12 and in months 1-12 of the trial using proportion of days covered (PDC) metrics. We compared validation metrics for non-adherence by self-report (average <0.80) against claims (PDC <0.80). RESULTS: Of 459 patients returning the survey (response rate: 43.5%), 50.1% were non-adherent in claims in month 12, while 20.9% were non-adherent based on the survey. Specificity of the 3-item metric for non-adherence was high (month 12: 0.83). Sensitivity was relatively poor (month 12: 0.25). Month 12 positive and negative predictive values were 0.59 and 0.52, respectively. CONCLUSIONS: A 3-item self-reported measure has high specificity but poor sensitivity for non-adherence versus claims in cardiometabolic disease. Despite this, the tool could help target those needing adherence support, particularly in the absence of claims data.
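The validation metrics reported above (sensitivity, specificity, PPV, NPV) all derive from the 2x2 table of self-reported versus claims-based non-adherence. A minimal sketch, with claims as the reference standard:

```python
def validation_metrics(tp, fp, fn, tn):
    """Sensitivity, specificity, PPV and NPV from a 2x2 table, where a
    "positive" is a non-adherence flag and claims are the reference.
    tp = flagged by both; fp = self-report only; fn = claims only;
    tn = flagged by neither."""
    return {
        "sensitivity": tp / (tp + fn),  # of claims non-adherers, fraction flagged
        "specificity": tn / (tn + fp),  # of claims adherers, fraction not flagged
        "ppv": tp / (tp + fp),          # flagged patients truly non-adherent
        "npv": tn / (tn + fn),          # unflagged patients truly adherent
    }
```

The study's pattern (high specificity, low sensitivity) means a positive self-report is informative, but a negative one misses many claims-defined non-adherers.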


Subject(s)
Medication Adherence/statistics & numerical data , Metabolic Syndrome/drug therapy , Patient Reported Outcome Measures , Surveys and Questionnaires/standards , Female , Humans , Insurance Claim Review/statistics & numerical data , Male , Metabolic Syndrome/epidemiology , Metabolic Syndrome/psychology , Middle Aged , Outcome Assessment, Health Care , Pharmaceutical Services, Online , Remote Consultation/methods , Remote Consultation/statistics & numerical data , Self Report/standards , Sensitivity and Specificity , United States/epidemiology
15.
Epidemiology ; 31(6): 860-871, 2020 11.
Article in English | MEDLINE | ID: mdl-32897909

ABSTRACT

BACKGROUND: We examined whether the apparent association between renal cell carcinoma (RCC) and use of dihydropyridine calcium channel blockers (CCBs) was explained by confounding by indication since hypertension, the main indication for CCBs, is a risk factor for RCC. METHODS: Using Danish health registries, we conducted a nested case-control study including 7315 RCC cases during 2000-2015. We matched each case with up to 20 controls on age and sex using risk-set sampling. We estimated odds ratios (ORs) for long-term CCB use associated with RCC using conditional logistic regression. We addressed confounding by indication by (1) adjusting for hypertension severity indicators; (2) evaluating dose-response patterns; (3) examining whether other first-line anti-hypertensives were associated with RCC; and (4) using an active comparator new user design by nesting the study in new users of CCBs or angiotensin-converting enzyme inhibitors (ACEIs). RESULTS: The adjusted OR for RCC associated with long-term CCB use compared to non-use was 1.76 (95% CI = 1.63, 1.90). After we additionally adjusted for hypertension severity indicators, the OR remained elevated (OR = 1.37; 95% CI = 1.25, 1.49) with evidence of a dose-response pattern. Other anti-hypertensives were also associated with RCC, for example, ACEIs (OR 1.27; 95% CI = 1.16, 1.39) and thiazides (OR 1.22; 95% CI = 1.12, 1.34). In the active comparator new user design, the OR was 1.21 (95% CI = 0.95, 1.53) for use of CCBs compared with ACEIs. CONCLUSIONS: In this population, confounding by indication appeared to explain at least part of the association between RCC and dihydropyridine CCBs.
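The risk-set sampling step described above (matching each case with up to 20 controls on age and sex, drawn from subjects still at risk at the case's event time) can be sketched generically. This is a simplified illustration of the general technique, not the registry implementation used in the study; the dict keys and function names are assumptions.

```python
import random


def risk_set_sample(cases, population, n_controls=20, seed=0):
    """Risk-set sampling: for each case, draw up to `n_controls`
    controls who are still at risk (no event yet) at the case's event
    time, matched on age and sex. Subjects are dicts with 'id', 'age',
    'sex', and 'event_time' (None if no event occurred)."""
    rng = random.Random(seed)
    matched = []
    for case in cases:
        t = case["event_time"]
        risk_set = [s for s in population
                    if s["id"] != case["id"]
                    and (s["event_time"] is None or s["event_time"] > t)
                    and s["age"] == case["age"]
                    and s["sex"] == case["sex"]]
        controls = rng.sample(risk_set, min(n_controls, len(risk_set)))
        matched.append((case, controls))
    return matched
```

Because controls are sampled from the risk set at each case's event time, a subject who later becomes a case can still serve as a control for earlier cases, which is what makes the conditional logistic regression ORs estimate incidence rate ratios.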


Subject(s)
Calcium Channel Blockers , Carcinoma, Renal Cell , Kidney Neoplasms , Calcium Channel Blockers/adverse effects , Carcinoma, Renal Cell/chemically induced , Carcinoma, Renal Cell/epidemiology , Case-Control Studies , Confounding Factors, Epidemiologic , Denmark/epidemiology , Humans , Kidney Neoplasms/chemically induced , Kidney Neoplasms/epidemiology , Risk
16.
Epidemiology ; 31(6): 806-814, 2020 11.
Article in English | MEDLINE | ID: mdl-32841986

ABSTRACT

We use simulated data to examine the consequences of depletion of susceptibles for hazard ratio (HR) estimators based on a propensity score (PS). First, we show that the depletion of susceptibles attenuates marginal HRs toward the null by amounts that increase with the incidence of the outcome, the variance of susceptibility, and the impact of susceptibility on the outcome. If susceptibility is binary then the Bross bias multiplier, originally intended to quantify bias in a risk ratio from a binary confounder, also quantifies the ratio of the instantaneous marginal HR to the conditional HR as susceptibles are depleted differentially. Second, we show how HR estimates that are conditioned on a PS tend to be between the true conditional and marginal HRs, closer to the conditional HR if treatment status is strongly associated with susceptibility and closer to the marginal HR if treatment status is weakly associated with susceptibility. We show that associations of susceptibility with the PS matter to the marginal HR in the treated (ATT) though not to the marginal HR in the entire cohort (ATE). Third, we show how the PS can be updated periodically to reduce depletion-of-susceptibles bias in conditional estimators. Although marginal estimators can hit their ATE or ATT targets consistently without updating the PS, we show how their targets themselves can be misleading as they are attenuated toward the null. Finally, we discuss implications for the interpretation of HRs and their relevance to underlying scientific and clinical questions. See video Abstract: http://links.lww.com/EDE/B727.
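The Bross bias multiplier mentioned above has a closed form for a binary covariate. A minimal sketch, using the classic formula for a binary confounder with prevalences p1 and p0 in the two comparison groups and risk ratio r for the covariate-outcome association (the function name is illustrative):

```python
def bross_multiplier(p_exposed, p_unexposed, risk_ratio):
    """Bross bias multiplier for a binary covariate (here, susceptibility):
    (p1 * (r - 1) + 1) / (p0 * (r - 1) + 1), where p1 and p0 are the
    covariate prevalences among the exposed and unexposed and r is the
    risk ratio for the covariate-outcome association."""
    return ((p_exposed * (risk_ratio - 1) + 1)
            / (p_unexposed * (risk_ratio - 1) + 1))
```

When susceptibles are depleted differentially, the prevalence of susceptibility drifts apart between treatment arms, so the multiplier moves away from 1 and the instantaneous marginal HR is attenuated relative to the conditional HR; when the prevalences are equal, the multiplier is exactly 1.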


Subject(s)
Bias , Propensity Score , Proportional Hazards Models , Cohort Studies , Humans
17.
Epidemiology ; 31(1): 82-89, 2020 01.
Article in English | MEDLINE | ID: mdl-31569120

ABSTRACT

Estimating hazard ratios (HR) presents challenges for propensity score (PS)-based analyses of cohorts with differential depletion of susceptibles. When the treatment effect is not null, cohorts that were balanced at baseline tend to become unbalanced on baseline characteristics over time as "susceptible" individuals drop out of the population at risk differentially across treatment groups due to having outcome events. This imbalance in baseline covariates causes marginal (population-averaged) HRs to diverge from conditional (covariate-adjusted) HRs over time and systematically move toward the null. Methods that condition on a baseline PS yield HR estimates that fall between the marginal and conditional HRs when these diverge. Unconditional methods that match on the PS or weight by a function of the PS can estimate the marginal HR consistently but are prone to misinterpretation when the marginal HR diverges toward the null. Here, we present results from a series of simulations to help analysts gain insight on these issues. We propose a novel approach that uses time-dependent PSs to consistently estimate conditional HRs, regardless of whether susceptibles have been depleted differentially. Simulations show that adjustment for time-dependent PSs can adjust for covariate imbalances over time that are caused by depletion of susceptibles. Updating the PS is unnecessary when outcome incidence is so low that depletion of susceptibles is negligible. But if incidence is high, and covariates and treatment affect risk, then covariate imbalances arise as susceptibles are depleted, and PS-based methods can consistently estimate the conditional HR only if the PS is periodically updated.


Subject(s)
Cohort Studies , Propensity Score , Proportional Hazards Models , Research Design , Humans , Time Factors
18.
Cardiovasc Diabetol ; 19(1): 25, 2020 02 25.
Article in English | MEDLINE | ID: mdl-32098624

ABSTRACT

BACKGROUND: The low cost of thiazolidinediones makes them a potentially valuable therapeutic option for the > 300 million economically disadvantaged persons worldwide with type 2 diabetes mellitus. Differential selectivity of thiazolidinediones for peroxisome proliferator-activated receptors in the myocardium may lead to disparate arrhythmogenic effects. We examined real-world effects of thiazolidinediones on outpatient-originating sudden cardiac arrest (SCA) and ventricular arrhythmia (VA). METHODS: We conducted population-based high-dimensional propensity score-matched cohort studies in five Medicaid programs (California, Florida, New York, Ohio, Pennsylvania | 1999-2012) and a commercial health insurance plan (Optum Clinformatics | 2000-2016). We defined exposure based on incident rosiglitazone or pioglitazone dispensings; the latter served as an active comparator. We controlled for confounding by matching exposure groups on propensity score, informed by baseline covariates identified via a data adaptive approach. We ascertained SCA/VA outcomes precipitating hospital presentation using a validated, diagnosis-based algorithm. We generated marginal hazard ratios (HRs) via Cox proportional hazards regression that accounted for clustering within matched pairs. We prespecified Medicaid and Optum findings as primary and secondary, respectively; the latter served as a conceptual replication dataset. RESULTS: The adjusted HR for SCA/VA among rosiglitazone (vs. pioglitazone) users was 0.91 (0.75-1.10) in Medicaid and 0.88 (0.61-1.28) in Optum. Among Medicaid but not Optum enrollees, we found treatment effect heterogeneity by sex (adjusted HRs = 0.71 [0.54-0.93] and 1.16 [0.89-1.52] in men and women respectively, interaction term p-value = 0.01). CONCLUSIONS: Rosiglitazone and pioglitazone appear to be associated with similar risks of SCA/VA.


Subject(s)
Arrhythmias, Cardiac/epidemiology , Death, Sudden, Cardiac/epidemiology , Diabetes Mellitus, Type 2/drug therapy , Hypoglycemic Agents/therapeutic use , Pioglitazone/therapeutic use , Rosiglitazone/therapeutic use , Adult , Aged , Arrhythmias, Cardiac/diagnosis , Arrhythmias, Cardiac/prevention & control , Databases, Factual , Death, Sudden, Cardiac/prevention & control , Diabetes Mellitus, Type 2/diagnosis , Diabetes Mellitus, Type 2/epidemiology , Female , Humans , Hypoglycemic Agents/adverse effects , Incidence , Male , Medicaid , Middle Aged , Pioglitazone/adverse effects , Protective Factors , Risk Assessment , Risk Factors , Rosiglitazone/adverse effects , Time Factors , Treatment Outcome , United States/epidemiology
19.
Stat Med ; 39(3): 340-351, 2020 02 10.
Article in English | MEDLINE | ID: mdl-31769079

ABSTRACT

Sequential analysis is used in clinical trials and postmarket drug safety surveillance to prospectively monitor efficacy and safety to quickly detect benefits and problems, while taking the multiple testing of repeated analyses into account. When there are multiple outcomes, each one may be given a weight corresponding to its severity. This paper introduces an exact sequential analysis procedure for multiple weighted binomial end points; the analysis incorporates a drug's combined benefit and safety profile. It works with a variety of alpha spending functions for continuous, group, or mixed group-continuous sequential analysis. The binomial probabilities may vary over time and do not need to be known a priori. The new method was implemented in the free R Sequential package for both one- and two-tailed sequential analysis. An example is given examining myocardial infarction and major bleeding events in patients who initiated nonsteroidal anti-inflammatory drugs.
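The core idea of group-sequential monitoring with alpha spending can be illustrated with a deliberately simplified sketch. This is not the exact weighted-endpoint procedure of the paper or the R Sequential package: it monitors a single binomial endpoint and signals when the exact cumulative tail probability under the null falls below the alpha newly spent at each look (a conservative allocation); all names and the spending schedule are assumptions.

```python
from math import comb


def binom_sf(k, n, p):
    """Exact upper tail P(X >= k) for X ~ Binomial(n, p)."""
    return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k, n + 1))


def group_sequential_monitor(look_events, look_sizes, p0, alpha_spend):
    """Simplified group-sequential binomial monitoring: at look j,
    signal if the exact cumulative upper-tail p-value under null
    probability `p0` falls below the alpha newly spent at that look.
    `alpha_spend` is the cumulative alpha spending schedule."""
    spent_prev = 0.0
    cum_events = cum_n = 0
    for j, (events, n) in enumerate(zip(look_events, look_sizes)):
        cum_events += events
        cum_n += n
        alpha_j = alpha_spend[j] - spent_prev
        if binom_sf(cum_events, cum_n, p0) <= alpha_j:
            return j  # signal raised at look j
        spent_prev = alpha_spend[j]
    return None  # no signal by the final look
```

The spending schedule caps overall type I error across repeated looks: alpha consumed at earlier looks is unavailable later, which is the mechanism that handles the multiple testing the abstract refers to.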


Subject(s)
Biometry/methods , Endpoint Determination/methods , Computer Simulation , Humans , Probability
20.
Value Health ; 23(4): 434-440, 2020 04.
Article in English | MEDLINE | ID: mdl-32327160

ABSTRACT

OBJECTIVES: Outcomes-based contracts tie rebates and discounts for expensive drugs to outcomes. The objective was to estimate the utility of outcomes-based contracts for diabetes medications using real-world data and to identify methodologic limitations of this approach. METHODS: A population-based cohort study of adults newly prescribed a medication for diabetes with a publicly announced outcomes-based contract (ie, exenatide microspheres ["exenatide"], dulaglutide, or sitagliptin) was conducted. The comparison group included patients receiving canagliflozin or glipizide. The primary outcome was announced in the outcomes-based contract: the percentage of adults with a follow-up hemoglobin A1C <8% up to 1 year later. Secondary outcomes included the percentage of patients diagnosed with hypoglycemia and the cost of a 1-month supply. RESULTS: Thousands of adults newly filled prescriptions for exenatide (n = 5079), dulaglutide (n = 6966), sitagliptin (n = 40 752), canagliflozin (n = 16 404), or glipizide (n = 59 985). The percentage of adults subsequently achieving a hemoglobin A1C below 8% ranged from 83% (dulaglutide, sitagliptin) to 71% (canagliflozin). The rate of hypoglycemia was 25 per 1000 person-years for exenatide, 37 per 1000 person-years for dulaglutide, 28 per 1000 person-years for sitagliptin, 18 per 1000 person-years for canagliflozin, and 34 per 1000 person-years for glipizide. The cash price for a 1-month supply was $847 for exenatide, $859 for dulaglutide, $550 for sitagliptin, $608 for canagliflozin, and $14 for glipizide. CONCLUSION: Outcomes-based pricing of diabetes medications has the potential to lower the cost of medications, but using outcomes such as hemoglobin A1C may not be clinically meaningful because similar changes in A1C can be achieved with generic medications at a far lower cost.
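The two outcome summaries reported above (share of patients reaching the A1C goal, and hypoglycemia incidence per 1,000 person-years) are simple to compute; a minimal sketch, with illustrative function names and toy data not drawn from the study:

```python
def rate_per_1000_py(events, person_years):
    """Incidence rate expressed per 1,000 person-years of follow-up."""
    return 1000 * events / person_years


def pct_at_goal(a1c_values, goal=8.0):
    """Share of patients whose follow-up hemoglobin A1C is below the
    goal used in the outcomes-based contracts described above."""
    below = sum(1 for v in a1c_values if v < goal)
    return below / len(a1c_values)
```

For example, 34 hypoglycemia events over 1,000 person-years is a rate of 34 per 1,000 person-years, and 2 of 4 patients with A1C values below 8% gives a goal-attainment share of 0.5.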


Subject(s)
Contracts/economics , Diabetes Mellitus, Type 2/drug therapy , Hypoglycemic Agents/administration & dosage , Outcome Assessment, Health Care/methods , Aged , Canagliflozin/administration & dosage , Canagliflozin/economics , Cohort Studies , Diabetes Mellitus, Type 2/economics , Exenatide/administration & dosage , Exenatide/economics , Female , Follow-Up Studies , Glipizide/administration & dosage , Glipizide/economics , Glucagon-Like Peptides/administration & dosage , Glucagon-Like Peptides/analogs & derivatives , Glucagon-Like Peptides/economics , Humans , Hypoglycemic Agents/economics , Immunoglobulin Fc Fragments/administration & dosage , Immunoglobulin Fc Fragments/economics , Male , Middle Aged , Recombinant Fusion Proteins/administration & dosage , Recombinant Fusion Proteins/economics , Sitagliptin Phosphate/administration & dosage , Sitagliptin Phosphate/economics