Results 1 - 16 of 16
1.
Clin Lymphoma Myeloma Leuk ; 24(2): e1-e12, 2024 02.
Article in English | MEDLINE | ID: mdl-37923653

ABSTRACT

Multiple myeloma (MM) accounts for 10% of hematologic cancers in the US; however, incidence and mortality differ disproportionately between racial groups in real-world settings. Our study's objective was to systematically characterize disparities in overall survival (OS) between Black and White patients with MM in the US using real-world evidence studies. A systematic literature review was undertaken by searching Embase and MEDLINE for observational studies conducted in the US, published between January 1, 2015 and October 25, 2021, and reporting OS for Black and White patients with MM. Records were reviewed by 2 independent researchers. OS data were extracted as hazard ratios (HRs), median survival, or percentages, with methods of adjustment, as reported. Evidence quality was assessed by data source, population, and the variables for which HRs for risk of death were adjusted. We included 33 US studies comprising 410,086 patients (21.5% Black; 78.5% White) with MM. Receipt of treatment varied; however, most studies reported that patients underwent stem cell transplant and/or received systemic therapy. HRs from 9 studies were considered "high quality" because they compared nationally representative, generalizable cohorts and adjusted for key prognostic, treatment, and/or socioeconomic factors. After adjustment, these data suggested that Black patients exhibit similar or better survival than their White counterparts, indicating that comparable patient populations and equal access to treatment can bridge the disparity in outcomes between races.


Subject(s)
Healthcare Disparities , Multiple Myeloma , Humans , Multiple Myeloma/therapy , Proportional Hazards Models , Racial Groups , Black or African American , White , Survival Rate
2.
Clin Lymphoma Myeloma Leuk ; 24(3): 177-186, 2024 03.
Article in English | MEDLINE | ID: mdl-37996264

ABSTRACT

BACKGROUND: We sought to understand the clinical effectiveness associated with use of the hypomethylating agents (HMAs) azacitidine (AZA) and decitabine (DEC) for patients with refractory anemia with excess blasts (RAEB; an established proxy for higher-risk myelodysplastic syndromes/neoplasms) in contemporary and representative real-world settings. PATIENTS AND METHODS: We used the Surveillance, Epidemiology, and End Results (SEER)-Medicare database, a linkage of cancer registry and Medicare claims data, to identify patients aged ≥ 66 years diagnosed with RAEB between 2009 and 2017 in the United States who received AZA or DEC as first-line therapy. Outcomes measured were overall survival (OS), event-free survival (EFS), and incidence of progression-related acute myeloid leukemia (AML). RESULTS: Of 973 eligible patients, 738 (75.8%) received AZA and 235 (24.2%) received DEC; 6.4% received hematopoietic cell transplantation during follow-up. In the overall population, median OS was 13.9 months (95% confidence interval [CI]: 12.9-15.0), median EFS was 5.2 months (95% CI: 4.9-5.7), and 38.0% of patients progressed to AML. Incidences of AML progression and death were 25.6% and 29.9%, respectively, at Year 1, and 34.3% and 44.8%, respectively, at Year 2. There were no significant differences in clinical benefit between AZA and DEC. CONCLUSION: Median OS with both HMAs remained significantly shorter than in the AZA-001 clinical trial, highlighting how patient outcomes vary between clinical and real-world settings. Further research is required to understand why these disparities exist.


Subject(s)
Anemia, Refractory, with Excess of Blasts , Leukemia, Myeloid, Acute , Humans , Aged , United States/epidemiology , Anemia, Refractory, with Excess of Blasts/drug therapy , Decitabine/pharmacology , Decitabine/therapeutic use , Antimetabolites, Antineoplastic/pharmacology , Antimetabolites, Antineoplastic/therapeutic use , Medicare , Azacitidine/therapeutic use , Leukemia, Myeloid, Acute/drug therapy
3.
Dig Dis Sci ; 64(8): 2095-2106, 2019 08.
Article in English | MEDLINE | ID: mdl-30820708

ABSTRACT

Celiac disease (CD) is an immune-mediated gastrointestinal (GI) disorder driven by innate and adaptive immune responses to gluten. Presentation of CD has changed over time, with non-GI symptoms, such as anemia and osteoporosis, presenting more commonly. With improved screening and diagnostic methods, the reported prevalence of CD has increased globally, and there is considerable global variation in diagnostic and treatment practices. The objective of this study was to describe the current state of CD diagnosis and treatment patterns. A targeted review of literature from MEDLINE, Embase, the Cochrane Library, and screening of relevant conference abstracts was performed. The generally recommended diagnostic approach is GI endoscopy with small bowel biopsy; however, in selected patients, biopsy may be avoided and diagnosis based on positive serology and clinical symptoms. Diagnosis is often delayed; the average diagnostic delay after symptom onset is highly variable and can last up to 12 years. Barriers to accurate and timely diagnosis include atypical presentation, lack of physician awareness about current diagnostic criteria, misdiagnosis, and limited access to specialists. Currently, strict adherence to a gluten-free diet (GFD) is the only recommended treatment, which is not successful in all patients. Only one-third of patients are monitored regularly following diagnosis. Unmet needs for CD include improvements in the accuracy and timeliness of diagnosis, and the development of treatments for both refractory CD and GFD-nonresponsive CD. Further research should investigate whether education about gluten-free eating and the availability of gluten-free foods support adherence and improve outcomes in patients with CD.


Subject(s)
Celiac Disease/diet therapy , Celiac Disease/diagnosis , Diet, Gluten-Free , Biopsy , Delayed Diagnosis , Endoscopy, Gastrointestinal , Humans
4.
Nutrients ; 11(2)2019 Feb 12.
Article in English | MEDLINE | ID: mdl-30759885

ABSTRACT

Celiac disease (CD) is an immune-mediated gastrointestinal disorder driven by innate and adaptive immune responses to gluten. Patients with CD are at an increased risk of several neurological manifestations, frequently peripheral neuropathy and gluten ataxia. A systematic literature review of the most commonly reported neurological manifestations (neuropathy and ataxia) associated with CD was performed. MEDLINE, Embase, the Cochrane Library, and conference proceedings were systematically searched from January 2007 through September 2018. Included studies evaluated patients with CD with at least one neurological manifestation of interest and reported prevalence, and/or incidence, and/or clinical outcomes. Sixteen studies were included describing the risk of gluten neuropathy and/or gluten ataxia in patients with CD. Thirteen studies reported gluten neuropathy as a neurological manifestation of CD, with a prevalence of up to 39%. Nine studies reported a lower risk and/or prevalence of gluten ataxia, ranging from 0% to 6%. Adherence to a gluten-free diet appeared to improve symptoms of both neuropathy and ataxia. The prevalence of gluten neuropathy and gluten ataxia in patients with CD varied across the reported studies, but the increased risk supports the need for physicians to consider CD in patients with ataxia or other neurological manifestations of unknown etiology.


Subject(s)
Ataxia/etiology , Celiac Disease/complications , Central Nervous System Diseases/etiology , Humans
5.
Pharmacoeconomics ; 37(1): 45-61, 2019 01.
Article in English | MEDLINE | ID: mdl-30221333

ABSTRACT

BACKGROUND: The prevalence of celiac disease (CD) has rapidly increased over recent decades, but costs related to CD remain poorly quantified. OBJECTIVE: This systematic review assessed the economic burden of CD in North America and Europe. METHODS: MEDLINE, EMBASE, EconLit, and the Cochrane Library databases were systematically searched to identify English-language literature from 2007 to 2018 that assessed costs, cost-effectiveness, and health resource utilization for CD. RESULTS: Forty-nine studies met the inclusion criteria, of which 28 (57.1%) addressed costs of testing and diagnosis; 33 (67.3%) were from Europe. The cost per positive CD diagnosis of testing patients already undergoing esophagogastroduodenoscopy for other indications ranged from 1300 Canadian dollars ($Can) in Canada (2016 value) to €44,712 in the Netherlands (2013 value). Adding the CD test was cost-effective when it combined diagnostic modalities (e.g., serology and biopsy). Direct annual excess costs to a US payer per diagnosed CD patient totaled $US6000 (2013 value) more than for a person without CD, chiefly due to outpatient care. Hospitalizations, emergency visits, and medication use were more common with CD. After initiating a gluten-free diet (GFD), patients visited primary care providers less often, used more medications, and missed fewer days from school and work. CONCLUSIONS: Most of the few available economic studies of CD assess testing and diagnosis costs, especially in Europe. Methods of testing are generally considered cost-effective when they combine diagnostic modalities in symptomatic patients. Most costs to a payer of managing CD derive from outpatient care. Following GFD initiation, patients lose fewer days from work and school than pretreatment.


Subject(s)
Celiac Disease/economics , Celiac Disease/therapy , Cost of Illness , Ambulatory Care/economics , Cost-Benefit Analysis , Diet, Gluten-Free/economics , Europe , Humans , North America , Treatment Adherence and Compliance
6.
Melanoma Manag ; 5(1): MMT01, 2018 Jun.
Article in English | MEDLINE | ID: mdl-30190927

ABSTRACT

INTRODUCTION: Immunotherapies, including checkpoint inhibitors (CIs) such as cytotoxic T-lymphocyte antigen-4 (CTLA-4) and programmed death-1 (PD-1) inhibitors, are revolutionizing the treatment of advanced melanoma. Combining CTLA-4 and PD-1 inhibitors provides additional clinical benefit compared with either agent alone. However, combination therapy can increase the incidence of gastrointestinal adverse events (GI AEs). This systematic review assessed the epidemiological, clinical, economic, and humanistic burden of GI AEs due to combination CIs in advanced melanoma. METHODS: MEDLINE, EMBASE, and the Cochrane Library were systematically searched (December 2011 to December 2016) to identify primary studies, systematic reviews, meta-analyses, and conference proceedings (2014-2016) evaluating adults treated with ≥2 CIs for advanced melanoma. RESULTS: Of the 3391 identified articles, 14 were included. Most studies examined the ipilimumab plus nivolumab combination. GI AEs of any grade and of grade 3-4 occurred in more patients receiving ipilimumab plus nivolumab than ipilimumab or nivolumab alone. The most common grade 3-4 GI AEs were diarrhea and colitis. Grade 3-4 colitis occurred in more patients receiving ipilimumab plus nivolumab, whereas grade 3-4 diarrhea occurred at the same rate as with ipilimumab alone. GI AEs developed approximately 6.6 weeks after initiating ipilimumab plus nivolumab. No studies assessing the economic or humanistic burden of GI AEs were identified. CONCLUSION: GI AEs occurred at a higher rate and with greater severity in patients treated with ipilimumab plus nivolumab than with ipilimumab or nivolumab monotherapy. The lack of research on the economic and humanistic burden of GI AEs with combination CIs for advanced melanoma represents an unmet need and should be explored in future studies.

7.
J Comp Eff Res ; 7(2): 149-165, 2018 02.
Article in English | MEDLINE | ID: mdl-29076747

ABSTRACT

Chronic dermal ulcers affect approximately 2.4-4.5 million people in the USA and are associated with loss of function, decreased quality of life and significant economic burden. Debridement is a critical component of wound care involving removal of nonviable tissue from chronic wounds to stimulate the granulation and epithelialization process. Clostridial collagenase ointment has been used as a method of wound debridement for more than 50 years and is currently the only enzymatic debriding ointment with US FDA approval. This review discusses the results of recent real-world studies that build upon the evidence demonstrating the clinical effectiveness, cost-effectiveness and safety of clostridial collagenase ointment across wound types and care settings.


Subject(s)
Microbial Collagenase/administration & dosage , Skin Ulcer/drug therapy , Chronic Disease , Cost-Benefit Analysis , Debridement/economics , Debridement/methods , Epidemiologic Methods , Humans , Microbial Collagenase/economics , Ointments , Quality of Life , Skin Ulcer/economics , Treatment Outcome , Wound Healing/drug effects
8.
Clinicoecon Outcomes Res ; 9: 485-494, 2017.
Article in English | MEDLINE | ID: mdl-28860830

ABSTRACT

OBJECTIVES: Pressure ulcer (PU) treatment poses significant clinical and economic challenges to health-care systems. The aim of this study was to assess the cost-effectiveness and budget impact of enzymatic debridement with clostridial collagenase ointment (CCO) compared with autolytic debridement with medicinal honey (MH) for PU treatment from a US payer/Medicare perspective in the hospital outpatient department setting. METHODS: A cost-effectiveness analysis using a Markov model was developed with a 1-week cycle length across a 1-year time horizon. The three health states were inflammation/senescence, granulation/proliferation (ie, patients achieving 100% granulation), and epithelialization. Data sources included the US Wound Registry, Medicare fee schedules, and other published clinical and cost studies of PU treatment. RESULTS: In the base case analysis over a 1-year time horizon, CCO was the economically dominant strategy (ie, simultaneously conferring greater benefit at less cost). Patients treated with CCO experienced 22.7 quality-adjusted life weeks (QALWs) at a cost of $6,161 over 1 year, whereas MH patients experienced 21.9 QALWs at a cost of $7,149. Patients treated with CCO achieved 11.5 granulation weeks and 6.0 epithelialization weeks compared with 10.6 and 4.4 weeks for MH, respectively. The number of clinic visits was 40.1 for CCO vs 43.4 for MH, and the number of debridements was 12.3 for CCO compared with 17.6 for MH. Probabilistic sensitivity analyses found CCO to be dominant in 72% of 10,000 iterations and cost-effective in 91%, assuming a benchmark willingness-to-pay threshold of $50,000/quality-adjusted life year ($962/QALW). The budget impact analysis showed that for every 1% of patients shifted from MH to CCO, the payer saved $9,883 over 1 year for a cohort of 1,000 patients.
CONCLUSION: The results of these economic analyses suggest that CCO is a cost-effective, economically dominant alternative to MH in the treatment of patients with PUs in the hospital outpatient department setting.
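The reported totals above imply a simple incremental comparison. The sketch below reimplements that dominance check in Python using only the figures from the abstract; it is an illustrative convention for reading cost-effectiveness results, not the authors' Markov model, and the function name is our own.

```python
# Incremental comparison of two treatment strategies using the 1-year
# totals reported in the abstract (CCO vs. medicinal honey).
# Illustrative sketch only; not the authors' model.

def incremental_result(cost_a, effect_a, cost_b, effect_b):
    """Compare strategy A against strategy B (costs in $, effects in QALWs)."""
    d_cost = cost_a - cost_b
    d_effect = effect_a - effect_b
    if d_cost <= 0 and d_effect >= 0:
        return "A dominates B"  # cheaper and at least as effective
    if d_cost >= 0 and d_effect <= 0:
        return "B dominates A"
    # otherwise report the incremental cost-effectiveness ratio
    return f"ICER = {d_cost / d_effect:.0f} $/QALW"

# CCO: $6,161 for 22.7 QALWs; MH: $7,149 for 21.9 QALWs (from the abstract)
print(incremental_result(6161, 22.7, 7149, 21.9))  # → "A dominates B"
```

Because CCO is both cheaper and more effective, no ICER needs to be computed; this is the "economic dominance" the abstract reports.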

9.
Thromb J ; 14: 14, 2016.
Article in English | MEDLINE | ID: mdl-27303213

ABSTRACT

Vitamin K antagonists (VKAs) are effective oral anticoagulants that are titrated to a narrow therapeutic international normalized ratio (INR) range. We reviewed published literature assessing the impact of INR stability (getting into and staying in the target INR range) on outcomes including thrombotic events, major bleeding, and treatment costs, as well as key factors that affect INR stability. A time in therapeutic range (TTR) of ≥65% is commonly accepted as the definition of INR stability. In the real-world setting, this is seldom achieved with standard-of-care management, increasing patients' risks of thrombotic or major bleeding events. Many factors are associated with poor INR control. Being treated in community settings, being newly initiated on a VKA, younger age, nonadherence to therapy, polymorphisms of CYP2C9 or VKORC1, and multiple physical or mental comorbidities have all been associated with lower TTR. Clinical prediction tools are available, though they explain <10% of the variance in INR control. Clinicians caring for patients who require anticoagulation are encouraged to intensify diligence in INR management when using VKAs and to consider appropriate use of newer anticoagulants as a therapeutic option.
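TTR in such studies is most often computed by Rosendaal linear interpolation between consecutive INR measurements; the review does not prescribe an implementation, so the sketch below (with invented example INR values) is a minimal illustration of that standard method under those assumptions.

```python
from datetime import date

def rosendaal_ttr(measurements, low=2.0, high=3.0):
    """Time in therapeutic range by Rosendaal linear interpolation.

    measurements: list of (date, INR) tuples sorted by date.
    Returns the fraction of elapsed days whose interpolated INR
    lies in [low, high].
    """
    in_range_days = 0.0
    total_days = 0.0
    for (d0, inr0), (d1, inr1) in zip(measurements, measurements[1:]):
        days = (d1 - d0).days
        if days == 0:
            continue
        total_days += days
        for step in range(days):
            # assume the INR changes linearly between measurements
            inr = inr0 + (inr1 - inr0) * step / days
            if low <= inr <= high:
                in_range_days += 1
    return in_range_days / total_days if total_days else 0.0

# Hypothetical INR series for one patient
inrs = [(date(2016, 1, 1), 1.5), (date(2016, 1, 11), 2.5),
        (date(2016, 1, 21), 3.5)]
print(f"TTR = {rosendaal_ttr(inrs):.0%}")  # → TTR = 55%
```

A TTR of 55% would fall below the ≥65% stability threshold cited in the abstract, marking this hypothetical patient as poorly controlled.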

10.
Pharmacotherapy ; 36(5): 488-95, 2016 May.
Article in English | MEDLINE | ID: mdl-27015873

ABSTRACT

OBJECTIVE: To estimate the quality-adjusted life-years (QALYs), costs, and cost-effectiveness of high-dose edoxaban compared with adjusted-dose warfarin in patients at risk for stroke who have nonvalvular atrial fibrillation (NVAF) and a creatinine clearance (Clcr) of 15-95 ml/minute. METHODS: A Markov model was created to compare the cost-effectiveness of high-dose edoxaban and adjusted-dose warfarin in patients with a Clcr of 15-95 ml/minute. The model took a U.S. societal perspective and assumed patients initiated therapy at 70 years of age, had a mean CHADS2 (congestive heart failure, hypertension, age 75 or older, diabetes, stroke) score of 3, and had no contraindications to anticoagulation. The model assumed a cycle length of 1 month and a lifetime horizon (maximum of 30 years/360 cycles). Data sources included the renal subgroup analysis of the Effective Anticoagulation with Factor Xa Next Generation in Atrial Fibrillation (ENGAGE-AF) trial and other published studies. Outcomes included lifetime costs (2014 US$), QALYs, and incremental cost-effectiveness ratios. The robustness of the model's conclusions was tested using one-way and 10,000-iteration probabilistic sensitivity analyses (PSA). RESULTS: Patients treated with high-dose edoxaban lived an average of 10.50 QALYs at a lifetime treatment cost of $99,833, compared with 10.11 QALYs and $123,516 for those treated with adjusted-dose warfarin. The model's conclusions were robust to one-way sensitivity analyses. PSA suggested high-dose edoxaban was economically dominant compared with adjusted-dose warfarin in more than 99% of the 10,000 iterations run. CONCLUSIONS: High-dose edoxaban appears to be an economically dominant strategy compared with adjusted-dose warfarin for the prevention of stroke in NVAF patients with a Clcr of 15-95 ml/minute and an appreciable risk of stroke.


Subject(s)
Atrial Fibrillation/economics , Cost-Benefit Analysis/statistics & numerical data , Pyridines/economics , Stroke/economics , Thiazoles/economics , Warfarin/economics , Aged , Anticoagulants/economics , Anticoagulants/therapeutic use , Atrial Fibrillation/drug therapy , Female , Health Care Costs/statistics & numerical data , Humans , Male , Markov Chains , Pyridines/therapeutic use , Quality-Adjusted Life Years , Stroke/prevention & control , Thiazoles/therapeutic use , Warfarin/therapeutic use
11.
PLoS One ; 10(4): e0125879, 2015.
Article in English | MEDLINE | ID: mdl-25919293

ABSTRACT

INTRODUCTION: When first-line therapy with metformin is insufficient for patients with type 2 diabetes (T2D), the optimal adjunctive therapy is unclear. We assessed the efficacy and safety of adjunctive antidiabetic agents in patients with inadequately controlled T2D on metformin alone. MATERIALS AND METHODS: A search of MEDLINE, CENTRAL, clinicaltrials.gov, and regulatory websites was performed. We included randomized controlled trials of 3-12 months' duration evaluating Food and Drug Administration or European Union approved agents (noninsulin and long-acting, once-daily basal insulins) in patients experiencing inadequate glycemic control with metformin monotherapy (≥ 1500 mg daily or maximally tolerated dose for ≥ 4 weeks). Random-effects network meta-analyses were used to compare the weighted mean difference for changes from baseline in HbA1c, body weight (BW), and systolic blood pressure (SBP), and the risk of developing hypoglycemia, urinary tract infection (UTI), and genital tract infection (GTI). RESULTS: Sixty-two trials evaluating 25 agents were included. All agents significantly reduced HbA1c vs. placebo, albeit not to the same extent (range, 0.43% for miglitol to 1.29% for glibenclamide). Glargine, sulfonylureas (SUs), and nateglinide were associated with increased hypoglycemia risk vs. placebo (range, 4.00-11.67). Sodium glucose cotransporter-2 (SGLT2) inhibitors, glucagon-like peptide-1 analogs, miglitol, and empagliflozin/linagliptin significantly reduced BW (range, 1.15-2.26 kg), whereas SUs, thiazolidinediones, glargine, and alogliptin/pioglitazone caused weight gain (range, 1.19-2.44 kg). SGLT2 inhibitors, empagliflozin/linagliptin, liraglutide, and sitagliptin decreased SBP (range, 1.88-5.43 mmHg). No therapy increased UTI risk vs. placebo; however, SGLT2 inhibitors were associated with an increased risk of GTI (range, 2.16-8.03).
CONCLUSIONS: Adding different antidiabetic agents to metformin was associated with varying effects on HbA1c, BW, SBP, hypoglycemia, UTI, and GTI, which should inform clinician choice when selecting adjunctive therapy.
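The random-effects pooling underlying such a network meta-analysis can be illustrated for a single pairwise comparison with the DerSimonian-Laird estimator. The sketch below uses hypothetical HbA1c mean differences and variances invented for illustration (they are not data from the review, and the full network model is considerably more involved).

```python
import math

def dersimonian_laird(effects, variances):
    """Random-effects pooled estimate via the DerSimonian-Laird tau^2."""
    w = [1 / v for v in variances]
    # fixed-effect (inverse-variance) estimate, needed for Cochran's Q
    fixed = sum(wi * yi for wi, yi in zip(w, effects)) / sum(w)
    q = sum(wi * (yi - fixed) ** 2 for wi, yi in zip(w, effects))
    df = len(effects) - 1
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - df) / c)  # between-study variance, truncated at 0
    # re-weight each study by total (within + between) variance
    w_star = [1 / (v + tau2) for v in variances]
    pooled = sum(wi * yi for wi, yi in zip(w_star, effects)) / sum(w_star)
    se = math.sqrt(1 / sum(w_star))
    return pooled, (pooled - 1.96 * se, pooled + 1.96 * se)

# Hypothetical HbA1c mean differences (%) vs. placebo and their variances
effects = [-0.9, -1.1, -0.7, -1.0]
variances = [0.01, 0.02, 0.015, 0.012]
pooled, ci = dersimonian_laird(effects, variances)
print(f"pooled = {pooled:.2f}%, 95% CI ({ci[0]:.2f}, {ci[1]:.2f})")
```

The between-study variance tau^2 widens the confidence interval relative to a fixed-effect analysis, which is why random-effects models are preferred when trials differ in populations and comparators, as here.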


Subject(s)
Diabetes Mellitus, Type 2/drug therapy , Hypoglycemic Agents/therapeutic use , Metformin/therapeutic use , Blood Pressure/drug effects , Body Weight/drug effects , Diabetes Mellitus, Type 2/complications , Diabetes Mellitus, Type 2/physiopathology , Drug Therapy, Combination , Glycated Hemoglobin , Humans , Hypoglycemia/etiology , Hypoglycemic Agents/pharmacology , Metformin/pharmacology , Randomized Controlled Trials as Topic , Systole/drug effects , Treatment Outcome
12.
Chest ; 147(4): 1043-1062, 2015 Apr.
Article in English | MEDLINE | ID: mdl-25317677

ABSTRACT

BACKGROUND: Studies suggest outpatient treatment or early discharge of patients with acute pulmonary embolism (aPE) is reasonable for those deemed to be at low risk of early mortality. We sought to determine the accuracy of clinical prediction rules for identifying patients with aPE at low risk for mortality. METHODS: We performed a literature search of Medline and Embase from January 2000 to March 2014, along with a manual search of references. We included studies deriving/validating a clinical prediction rule for early post-aPE all-cause mortality and providing mortality data over at least the index aPE hospitalization but ≤ 90 days. A bivariate model was used to pool sensitivity and specificity estimates using a random-effects approach. Traditional random-effects meta-analysis was performed to estimate the weighted proportion of patients deemed at low risk for early mortality and their odds ratios for death compared with high-risk patients. RESULTS: Forty studies (52 cohort-clinical prediction rule analyses) reporting on 11 clinical prediction rules were included. The highest sensitivities were observed with the Global Registry of Acute Coronary Events (0.99, 95% CI = 0.89-1.00), Aujesky 2006 (0.97, 95% CI = 0.95-0.99), simplified Pulmonary Embolism Severity Index (0.92, 95% CI = 0.89-0.94), Pulmonary Embolism Severity Index (0.89, 95% CI = 0.87-0.90), and European Society of Cardiology (0.88, 95% CI = 0.77-0.94) tools, with the remaining clinical prediction rule sensitivities ranging from 0.41 to 0.82. Of these five clinical prediction rules with the highest sensitivities, none had a specificity > 0.48. They suggested anywhere from 22% to 45% of patients with aPE were at low risk, and that low-risk patients had 77% to 97% lower odds of death compared with those at high risk. CONCLUSIONS: Numerous clinical prediction rules for prognosticating early mortality in patients with aPE are available, but not all demonstrate the high sensitivity needed to reassure clinicians.


Subject(s)
Decision Support Techniques , Pulmonary Embolism/mortality , Risk Assessment/methods , Cause of Death/trends , Global Health , Humans , Prognosis , Risk Factors , Survival Rate/trends
13.
Thromb J ; 12: 14, 2014.
Article in English | MEDLINE | ID: mdl-25024644

ABSTRACT

BACKGROUND: Atrial fibrillation (AF) patients frequently require anticoagulation with vitamin K antagonists (VKAs) to prevent thromboembolic events, but their use increases the risk of hemorrhage. We evaluated time spent in therapeutic range (TTR), proportion of international normalized ratio (INR) measurements in range (PINRR), adverse events in relation to INR, and predictors of INR control in AF patients using VKAs. METHODS: We searched MEDLINE, CENTRAL and EMBASE (1990-June 2013) for studies of AF patients receiving adjusted-dose VKAs that reported INR control measures (TTR and PINRR) and/or reported an INR measurement coinciding with thromboembolic or hemorrhagic events. Random-effects meta-analyses and meta-regression were performed. RESULTS: Ninety-five articles were included. Sixty-eight VKA-treated study groups reported measures of INR control, while 43 studies reported an INR around the time of the adverse event. Patients spent 61% (95% CI, 59-62%), 25% (95% CI, 23-27%) and 14% (95% CI, 13-15%) of their time within, below or above the therapeutic range. PINRR assessments were within, below, and above range 56% (95% CI, 53-59%), 26% (95% CI, 23-29%) and 13% (95% CI, 11-17%) of the time. Patients receiving VKA management in the community spent less TTR than those managed by anticoagulation clinics or in randomized trials. Patients newly receiving VKAs spent less TTR than those with prior VKA use. Patients in Europe/United Kingdom spent more TTR than patients in North America. Fifty-seven percent (95% CI, 50-64%) of thromboembolic events and 42% (95% CI, 35 - 51%) of hemorrhagic events occurred at an INR <2.0 and >3.0, respectively; while 56% (95% CI, 48-64%) of ischemic strokes and 45% of intracranial hemorrhages (95% CI, 29-63%) occurred at INRs <2.0 and >3.0, respectively. CONCLUSIONS: Patients on VKAs for AF frequently have INRs outside the therapeutic range. 
While thromboembolic and hemorrhagic events do occur in patients with a therapeutic INR, patients with an INR <2.0 account for many of the cases of thromboembolism, while those with an INR >3.0 account for many of the cases of hemorrhage. Managing anticoagulation outside of a clinical trial or anticoagulation clinic is associated with poorer INR control, as is initiation of therapy in VKA-naïve patients. Patients in Europe/UK have better INR control than those in North America.

14.
J Clin Epidemiol ; 67(10): 1093-102, 2014 Oct.
Article in English | MEDLINE | ID: mdl-25018102

ABSTRACT

OBJECTIVES: Decision makers use models to assist in evaluating the cost-effectiveness of pharmacologic stroke prevention in atrial fibrillation (SPAF). STUDY DESIGN AND SETTING: We performed a search of databases through October 3, 2012 to identify pharmacologic SPAF cost-effectiveness models. RESULTS: Of 30 identified models, 28 included warfarin, but only 60% assessed the impact of warfarin control on conclusions. Aspirin, dual antiplatelet therapy, and newer anticoagulants were included in 41%, 10%, and 63% of models, respectively. Models used similar structures but included varying health states and made varying assumptions. They rarely reported performing a literature search to identify anticoagulant-specific inputs and relied on similar, older sources. Sixteen models used a single randomized trial to reflect the efficacy and safety of the main comparisons. One-third of models claimed a societal perspective; however, none included indirect costs. Patients typically initiated anticoagulation in the sixth or seventh decade of life and were followed for their lifetimes. Almost 70% of incremental cost-effectiveness ratios were below reported willingness-to-pay thresholds. All models used deterministic sensitivity analyses, and 77% conducted Monte Carlo simulation. Less than half of the models were rated "high quality," yet they were frequently published in high-impact journals. CONCLUSION: Pharmacologic SPAF cost-effectiveness models have been reported extensively, but many may have flaws, giving decision makers reason for caution. We provide 10 recommendations to avoid common flaws in SPAF cost-effectiveness models.


Subject(s)
Anticoagulants/economics , Atrial Fibrillation/drug therapy , Models, Economic , Publishing/standards , Stroke/prevention & control , Anticoagulants/therapeutic use , Cost-Benefit Analysis , Humans , Publishing/statistics & numerical data , Randomized Controlled Trials as Topic , Warfarin/economics , Warfarin/therapeutic use
15.
BMJ Open ; 4(6): e005379, 2014 Jun 20.
Article in English | MEDLINE | ID: mdl-24951111

ABSTRACT

OBJECTIVE: To aid trialists, systematic reviewers and others, we evaluated the degree of standardisation of control measure reporting that has occurred in atrial fibrillation (AF) and venous thromboembolism (VTE) studies since 2000; and attempted to determine whether the prior recommendation of reporting ≥2 measures per study has been employed. DESIGN: Systematic review. SEARCH STRATEGY: We searched bibliographic databases (2000 to June 2013) to identify AF and VTE studies evaluating dose-adjusted vitamin K antagonists (VKAs) and reporting ≥1 control measure. The types of measures reported, proportion of studies reporting ≥2 measures and mean (±SD) number of measures per study were determined for all studies and compared between subgroups. DATA EXTRACTION: Through the use of a standardised data extraction tool, we independently extracted all data, with disagreements resolved by a separate investigator. RESULTS: 148 studies were included, 57% of which reported ≥2 control measures (mean/study=2.13±1.36). The proportion of time spent in the target international normalised ratio range (TTR) was most commonly reported (79%), and was frequently accompanied by time above/below range (52%). AF studies more frequently reported ≥2 control measures compared with VTE studies (63% vs 37%; p=0.004), and reported a greater number of measures per study (mean=2.36 vs 1.53; p<0.001). Observational studies were more likely to provide ≥2 measures compared with randomised trials (76% vs 33%; p<0.001) and report a greater number of measures (mean=2.58 vs 1.63; p<0.001). More recent studies (2004-2013) reported ≥2 measures more often than older (2000-2003) studies (59% vs 35%; p=0.05) and reported more measures per study (mean=2.23 vs 1.48; p=0.02). CONCLUSIONS: While TTR was often utilised, studies reported ≥2 measures of VKA control only about half of the time and lacked consistency in the types of measures reported. 
A trend toward reporting greater numbers of VKA control measures over time was observed across our review horizon, particularly among AF and observational studies.


Subject(s)
Anticoagulants/administration & dosage , Clinical Studies as Topic , Drug Monitoring , Venous Thromboembolism/prevention & control , Vitamin K/antagonists & inhibitors , Atrial Fibrillation/complications , Humans , International Normalized Ratio , Venous Thromboembolism/etiology
16.
Thromb Res ; 134(2): 310-9, 2014 Aug.
Article in English | MEDLINE | ID: mdl-24935672

ABSTRACT

INTRODUCTION: Patients with venous thromboembolism (VTE) frequently require vitamin K antagonists (VKAs) to prevent recurrent events, but their use increases hemorrhage risk. We performed a meta-analysis to assess the quality of international normalized ratio (INR) control, identify study-level predictors of poor control, and examine the relationship between INR control and adverse outcomes in VTE patients. MATERIALS AND METHODS: We searched bibliographic databases (1990-June 2013) for studies of VTE patients receiving adjusted-dose VKAs that reported time in range (2.0-3.0) or proportion of INRs in range and/or reported INR measurements coinciding with thromboembolic or hemorrhagic events. Meta-analysis and meta-regression analyses were performed. RESULTS: Upon meta-analysis, studies found that 59% (95% CI: 54-64%) of measured INRs and 61% (95% CI: 59-63%) of patients' treatment time fell outside the target range of 2.0-3.0, with a tendency toward under- rather than over-anticoagulation. Moreover, this poor INR control resulted in a greater chance of recurrent VTE (beta-coefficient = -0.46, p = 0.01) and major bleeding (beta-coefficient = -0.30, p = 0.02). Patients with an INR <2.0 made up 58% (95% CI: 39-77%) of VTE cases, while those with an INR >3.0 made up 48% (95% CI: 34-61%) of major hemorrhage cases. Upon meta-regression, being VKA-naïve (-14%, p = 0.04) and being treated in the community (-7%, p < 0.001) were associated with less time in range, while being treated in Europe/United Kingdom (compared with North America) was associated with greater time in range (11%, p = 0.003). CONCLUSIONS: Strategies to improve INR control, or alternative anticoagulants including the newer oral agents, should be widely implemented in VTE patients to reduce the rate of recurrent events and bleeding.


Subject(s)
Anticoagulants/therapeutic use , International Normalized Ratio , Venous Thromboembolism/drug therapy , Vitamin K/antagonists & inhibitors , Anticoagulants/adverse effects , Drug Monitoring , Hemorrhage/chemically induced , Humans , Treatment Outcome , Venous Thromboembolism/diagnosis