ABSTRACT
AIMS: We investigated whether the use of an atrial fibrillation (AF) risk prediction algorithm could improve AF detection compared with opportunistic screening in primary care and assessed the associated budget impact. METHODS AND RESULTS: Eligible patients were registered with a general practice in the UK, were aged 65 years or older in 2018/19, and had complete data for weight, height, body mass index, and systolic and diastolic blood pressure recorded within 1 year. Three screening scenarios were assessed: (i) opportunistic screening and diagnosis (standard care); (ii) standard care replaced by the use of the algorithm; and (iii) combined use of standard care and the algorithm. The analysis considered a 3-year time horizon and the budget impact for National Health Service (NHS) costs alone or combined with personal social services (PSS) costs. Scenario 1 would identify 79 410 new AF cases (detection gap reduced by 22%). Scenario 2 would identify 70 916 (gap reduced by 19%) and Scenario 3 would identify 99 267 new cases (gap reduction 27%). These rates translate into 2639 strokes being prevented in Scenario 1, 2357 in Scenario 2, and 3299 in Scenario 3. The 3-year budget impact for the NHS alone would be £45.3 million in Scenario 1, £3.6 million (difference −92.0%) in Scenario 2, and £46.3 million (difference 2.2%) in Scenario 3; for the NHS plus PSS it would be −£48.8 million, −£80.4 million (64.8%), and −£71.3 million (46.1%), respectively. CONCLUSION: Implementation of an AF risk prediction algorithm alongside standard opportunistic screening could close the AF detection gap and prevent strokes while substantially reducing combined NHS and PSS care costs.
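As a back-of-envelope check on these figures, the sketch below reproduces the scenario arithmetic. The detection-gap denominator (roughly 360,000 undiagnosed cases) and the strokes-prevented rate are back-calculated from the abstract's own numbers, not taken from the study model.

```python
# Illustrative arithmetic only; gap size and per-case stroke rate are
# implied by the reported figures, not study inputs.
DETECTION_GAP = 360_000            # implied undiagnosed AF cases (approx.)
STROKES_PER_CASE = 2639 / 79410    # implied strokes prevented per case found

scenarios = {
    "1 (standard care)":    79_410,
    "2 (algorithm only)":   70_916,
    "3 (care + algorithm)": 99_267,
}
for name, cases in scenarios.items():
    print(f"Scenario {name}: {cases / DETECTION_GAP:.0%} of gap closed, "
          f"~{cases * STROKES_PER_CASE:,.0f} strokes prevented")
```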
Subject(s)
Atrial Fibrillation, Stroke, Algorithms, Atrial Fibrillation/diagnosis, Atrial Fibrillation/epidemiology, Electrocardiography, Humans, Machine Learning, Primary Health Care, State Medicine, Stroke/diagnosis, Stroke/epidemiology, Stroke/etiology

ABSTRACT
We report that the small Escherichia coli membrane protein DrpB (formerly YedR) is involved in cell division. We discovered DrpB in a screen for multicopy suppressors of a ΔftsEX mutation that prevents divisome assembly when cells are plated on low ionic strength medium, such as lysogeny broth without NaCl. Characterization of DrpB revealed that (i) translation initiates at an ATG annotated as codon 22 rather than the GTG annotated as codon 1, (ii) DrpB localizes to the septal ring when cells are grown in medium of low ionic strength but localization is greatly reduced in medium of high ionic strength, (iii) overproduction of DrpB in a ΔftsEX mutant background improves recruitment of the septal peptidoglycan synthase FtsI, implying multicopy suppression works by rescuing septal ring assembly, (iv) a ΔdrpB mutant divides quite normally, but a ΔdrpB ΔdedD double mutant has a strong division and viability defect, albeit only in medium of high ionic strength, and (v) DrpB homologs are found in E. coli and a few closely related enteric bacteria, but not outside this group. In sum, DrpB is a poorly conserved nonessential division protein that improves the efficiency of cytokinesis under suboptimal conditions. Proteins like DrpB are likely to be a widespread feature of the bacterial cell division apparatus, but they are easily overlooked because mutants lack obvious shape defects. IMPORTANCE: A thorough understanding of bacterial cell division requires identifying and characterizing all of the proteins that participate in this process. Our discovery of DrpB brings us one step closer to this goal in E. coli.
Subject(s)
Escherichia coli/cytology, Escherichia coli/metabolism, Cell Division, Cytokinesis, Escherichia coli/genetics, Mutation

ABSTRACT
The aim of this study was to systematically review published network meta-analyses (NMAs) comparing venous thromboembolism (VTE) treatments. A systematic literature review (SLR; searching MEDLINE, Embase, and the Cochrane Database of Systematic Reviews through September 2017) was conducted to identify NMAs that compared the safety and efficacy of direct oral anticoagulants (DOACs) for the treatment of VTE in the acute and extended treatment settings. The NMAs included randomized controlled trials comparing multiple DOACs, low-molecular-weight heparin, unfractionated heparin, and vitamin K antagonists (VKAs). The quality of the NMA results was evaluated using the Grading of Recommendations Assessment, Development and Evaluation (GRADE) approach. The SLR identified 294 records and nine NMAs (68 trials). Among the NMAs, three evaluated the acute treatment setting, five the extended setting, and one both settings. The NMAs showed a significant reduction in major bleeding and clinically relevant bleeding (CRB) with apixaban compared to other DOACs. Major bleeding with apixaban was reduced compared to dabigatran, edoxaban, and the fondaparinux-VKA combination in all comparisons in the acute setting (range of effect estimates: 0.30-0.43). CRB was reduced with apixaban compared to dabigatran, edoxaban, and rivaroxaban in the acute and extended settings (range of effect estimates: 0.23-0.72). No significant differences were seen in efficacy outcomes between the DOACs. This SLR of NMAs systematically collected all indirect evidence of the impact of apixaban compared to other anticoagulants in patients with VTE. In the absence of head-to-head trials, well-conducted NMAs provide the best available evidence.
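NMAs synthesize direct and indirect evidence across a network of trials. Their simplest building block is the Bucher adjusted indirect comparison, sketched below with wholly hypothetical effect estimates (none of these numbers come from the included trials).

```python
import math

def bucher_indirect(log_or_ac, se_ac, log_or_bc, se_bc):
    """Adjusted indirect comparison of A vs B via common comparator C:
    log(OR_AB) = log(OR_AC) - log(OR_BC); variances add."""
    log_or_ab = log_or_ac - log_or_bc
    se_ab = math.sqrt(se_ac**2 + se_bc**2)
    ci = (math.exp(log_or_ab - 1.96 * se_ab),
          math.exp(log_or_ab + 1.96 * se_ab))
    return math.exp(log_or_ab), ci

# Hypothetical inputs: apixaban vs VKA (OR 0.45) and rivaroxaban vs VKA (OR 0.90)
or_ab, (lo, hi) = bucher_indirect(math.log(0.45), 0.20, math.log(0.90), 0.15)
print(f"Indirect OR, apixaban vs rivaroxaban: {or_ab:.2f} (95% CI {lo:.2f}-{hi:.2f})")
```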
Subject(s)
Anticoagulants/therapeutic use, Heparin/therapeutic use, Venous Thromboembolism/drug therapy, Anticoagulants/adverse effects, Hemorrhage/chemically induced, Heparin/adverse effects, Humans, Randomized Controlled Trials as Topic, Treatment Outcome, Venous Thromboembolism/mortality, Vitamin K/antagonists & inhibitors

ABSTRACT
BACKGROUND: Atrial fibrillation (AF) is associated with increased morbidity and mortality and exerts an increasingly significant burden on global healthcare resources, with its prevalence rising as the population ages. Despite a substantial thromboembolic risk, particularly in the period immediately following diagnosis, oral anti-coagulation is frequently not initiated or is delayed. The aim of this study was to evaluate healthcare costs in people with AF, comparing those who commenced oral anti-coagulation immediately after the index diagnosis date with those in whom initiation was late and those who never started anti-coagulation. METHODS: This retrospective cost analysis used linked Scottish health data to identify adults newly diagnosed with AF between January 1st 2012 and April 30th 2019 with a baseline CHA2DS2-VASc score of ≥2. This AF population was sub-divided according to the timing of the first prescription of an oral anti-coagulant (OAC) during a 2-year follow-up period: never started (OAC never initiated), immediate OAC (OAC prescribed within 60 days of incident AF diagnosis), and delayed OAC (OAC prescribed more than 60 days after incident AF diagnosis). A two-part model, adjusted for key covariates including age, sex, and frailty, was developed to estimate costs for inpatient admissions, outpatient care, prescriptions, and care home admissions, as well as overall costs. RESULTS: Of an overall AF population of 54,385, 26,805 (49.3%) never commenced an OAC, 7654 (14.1%) initiated an OAC late, and 19,926 (36.6%) were prescribed anti-coagulation immediately. The mean adjusted cost for the overall AF population was £7807 per person per year (unadjusted: £8491). Delayed OAC initiation was associated with the greatest estimated mean annual cost (unadjusted: £13,983; adjusted: £9763), compared with those who never started (unadjusted: £10,433; adjusted: £7981) and those who received an immediate OAC prescription (unadjusted: £3976; adjusted: £6621). Increasing frailty, mortality, and female sex were associated with greater healthcare costs. CONCLUSION: AF is associated with significant healthcare resource utilisation and costs, particularly in the context of delayed or non-initiation of anti-coagulation. There is thus substantial opportunity to improve the uptake and prompt initiation of anti-coagulation in people newly diagnosed with AF in Scotland. Interventions to mitigate the growing economic burden of AF should focus on reducing admissions to hospitals and care homes, which are the principal drivers of costs; prescriptions and outpatient appointments account for a relatively small proportion of overall costs.
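A two-part cost model of the kind described typically pairs a logistic model for incurring any cost with a gamma GLM (log link) for the size of positive costs. The sketch below, on synthetic data with assumed covariates, illustrates that structure only; it is not the study's specification.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 5_000
# Synthetic covariates: age, sex, frailty score (placeholders)
X = sm.add_constant(np.column_stack([
    rng.normal(75, 8, n), rng.integers(0, 2, n), rng.uniform(0, 1, n)]))
any_cost = rng.binomial(1, 0.7, n)
cost = np.where(any_cost, rng.gamma(2.0, 4000, n), 0.0)

# Part 1: probability of incurring any cost (logistic regression)
p1 = sm.Logit(any_cost, X).fit(disp=0)
# Part 2: mean cost among those with positive costs (gamma GLM, log link)
pos = cost > 0
p2 = sm.GLM(cost[pos], X[pos],
            family=sm.families.Gamma(sm.families.links.Log())).fit()

# Expected cost per person = Pr(cost > 0) * E[cost | cost > 0]
expected = p1.predict(X) * p2.predict(X)
print(f"Adjusted mean annual cost: £{expected.mean():,.0f}")
```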
Subject(s)
Anticoagulants, Atrial Fibrillation, Health Care Costs, Humans, Atrial Fibrillation/drug therapy, Atrial Fibrillation/economics, Retrospective Studies, Male, Female, Anticoagulants/economics, Anticoagulants/administration & dosage, Anticoagulants/therapeutic use, Aged, Administration, Oral, Middle Aged, Health Care Costs/statistics & numerical data, Aged, 80 and over, Scotland, Time-to-Treatment, Time Factors, Adult, Hospitalization/economics, Hospitalization/statistics & numerical data

ABSTRACT
Aims: Whilst anti-coagulation is typically recommended for thromboprophylaxis in atrial fibrillation (AF), it is often never prescribed or prematurely discontinued. The aim of this study was to evaluate the effect of inequalities in anti-coagulant prescribing by assessing stroke/systemic embolism (SSE) and bleeding risk in people with AF who continue anti-coagulation compared with those who stop transiently, stop permanently, or never start. Methods and results: This retrospective cohort study utilized linked Scottish healthcare data to identify adults diagnosed with AF between January 2010 and April 2016 with a CHA2DS2-VASc score of ≥2. They were sub-categorized based on anti-coagulant exposure: never started, continuous, discontinuous, and cessation. Inverse probability of treatment weighting-adjusted Cox regression and competing risk regression were used to compare SSE and bleeding risks between cohorts during 5-year follow-up. Of an overall cohort of 47 427 people, 26 277 (55.41%) were never anti-coagulated, 7934 (16.72%) received continuous anti-coagulation, 9107 (19.20%) temporarily discontinued, and 4109 (8.66%) permanently discontinued. Lower socio-economic status, elevated frailty score, and age ≥75 years were associated with a reduced likelihood of initiation and continuation of anti-coagulation. Stroke/systemic embolism risk was significantly greater in those with discontinuous anti-coagulation compared with continuous anti-coagulation [subhazard ratio (SHR) 2.65; 95% CI 2.39-2.94]. In the context of a major bleeding event, there was no significant difference in bleeding risk between the cessation and continuous cohorts (SHR 0.94; 95% CI 0.42-2.14). Conclusion: Our data suggest significant inequalities in anti-coagulation prescribing, with substantial opportunity to improve initiation and continuation. Decision-making should be patient-centred and must recognize that discontinuation or cessation is associated with considerable thromboembolic risk that is not offset by a mitigated bleeding risk.
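IPTW-adjusted survival analysis of this kind proceeds in two stages: estimate each patient's probability of their observed exposure from confounders, then fit a weighted Cox model. A minimal sketch on synthetic data follows (variable names, effect sizes, and the lifelines/sklearn toolchain are assumptions, not study choices).

```python
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(1)
n = 2_000
df = pd.DataFrame({
    "age": rng.normal(75, 8, n),
    "frailty": rng.uniform(0, 1, n),
    "treated": rng.binomial(1, 0.4, n),        # 1 = continuous OAC
})
# Synthetic follow-up: treated patients get longer event-free time
df["time"] = rng.exponential(5 / (1 + df["treated"]), n).clip(0.01, 5)
df["event"] = rng.binomial(1, 0.3, n)

# Step 1: propensity score for treatment given confounders
ps = LogisticRegression().fit(df[["age", "frailty"]], df["treated"])
p = ps.predict_proba(df[["age", "frailty"]])[:, 1]
# Step 2: stabilized inverse-probability-of-treatment weights
pt = df["treated"].mean()
df["iptw"] = np.where(df["treated"] == 1, pt / p, (1 - pt) / (1 - p))

# Step 3: weighted Cox model with a robust variance estimator
cph = CoxPHFitter()
cph.fit(df[["time", "event", "treated", "iptw"]],
        duration_col="time", event_col="event",
        weights_col="iptw", robust=True)
print(cph.summary[["exp(coef)", "exp(coef) lower 95%", "exp(coef) upper 95%"]])
```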
ABSTRACT
OBJECTIVE: To evaluate the feasibility of recruiting participants diagnosed with atrial fibrillation (AF) taking oral anticoagulation therapies (OATs) who had recently experienced a bleed, in order to collect health-related quality of life (HRQoL) information. DESIGN: Observational feasibility study. The study aimed to determine the feasibility of recruiting participants with minor and major bleeds, the most appropriate route for recruitment, the appropriateness of the patient-reported outcome measures (PROMs) selected for collecting HRQoL information in AF patients, and the preferred format of the surveys. SETTING: Primary care, secondary care, and an online patient forum. PARTICIPANTS: The study population was adult patients (≥18 years) with AF taking OATs who had experienced a major or minor bleed within the last 4 weeks. PRIMARY AND SECONDARY OUTCOME MEASURES: Primary outcomes were the PROMs: the EuroQol 5 Dimensions-5 Levels (EQ-5D-5L), the Perception of Anticoagulant Treatment Questionnaire (part 2 only), and the Atrial Fibrillation Effect on QualiTy-of-Life (AFEQT) questionnaire. Secondary outcomes were location of bleed, bleed severity, current treatment, and patient perceptions of HRQoL in relation to bleeding events. RESULTS: We received initial expressions of interest from 103 participants. We subsequently recruited 32 participants to the study: 14 from primary care and 18 through the AF forum. No participants were recruited through secondary care. Despite 32 participants consenting, only 26 initial surveys were completed. We received follow-up surveys from 11 participants (8 primary care and 3 AF forum). COVID-19 had a major impact on the study. CONCLUSIONS: Primary care was the most successful route for recruitment. Most participants recruited to the study had experienced a minor bleed. Further ways to recruit in secondary care should be explored, especially to capture more serious bleeds. TRIAL REGISTRATION NUMBER: The study is registered in the ClinicalTrials.gov database, NCT04921176.
Subject(s)
Atrial Fibrillation, COVID-19, Adult, Humans, Atrial Fibrillation/drug therapy, Atrial Fibrillation/diagnosis, Secondary Care, Feasibility Studies, Quality of Life, Wales, Hemorrhage/diagnosis, Anticoagulants/therapeutic use

ABSTRACT
Aims: In patients with non-valvular atrial fibrillation (NVAF) prescribed warfarin, the association between guideline-defined international normalised ratio (INR) control and adverse outcomes is unknown. We aimed to (i) determine stroke and systemic embolism (SSE) and bleeding event rates in NVAF patients prescribed warfarin and (ii) estimate the increased risk of these adverse events associated with poor INR control in this population. Methods and results: Individual-level population-scale linked patient data were used to investigate the association between INR control and both SSE and bleeding events using the National Institute for Health and Care Excellence (NICE) criteria of poor INR control [time in therapeutic range (TTR) <65%, two INRs <1.5 or two INRs >5 in a 6-month period, or any INR >8]. A total of 35 891 patients were included for SSE and 35 035 for bleeding outcome analyses. Mean CHA2DS2-VASc score was 3.5 (SD = 1.7), and the mean follow-up was 4.3 years for both analyses. Mean TTR was 71.9%, with 34% of time spent in poor INR control according to NICE criteria. SSE and bleeding event rates (per 100 patient-years) were 1.01 (95% CI 0.95-1.08) and 3.4 (95% CI 3.3-3.5), respectively, during adequate INR control, rising to 1.82 (95% CI 1.70-1.94) and 4.8 (95% CI 4.6-5.0) during poor INR control. Poor INR control was independently associated with increased risk of both SSE [HR = 1.69 (95% CI 1.54-1.86), P < 0.001] and bleeding [HR = 1.40 (95% CI 1.33-1.48), P < 0.001] in Cox multivariable models. Conclusion: Guideline-defined poor INR control is associated with significantly higher SSE and bleeding event rates, independent of recognised risk factors for stroke or bleeding.
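TTR in analyses like this is conventionally computed with the Rosendaal linear-interpolation method (an assumption here; the abstract does not state the method). A minimal sketch, interpolating a daily INR between consecutive tests:

```python
from datetime import date

def rosendaal_ttr(readings, low=2.0, high=3.0):
    """Time in therapeutic range by Rosendaal linear interpolation:
    INR is assumed to change linearly between consecutive tests, and each
    between-test day is classified as in or out of range (2.0-3.0)."""
    days_total, days_in_range = 0, 0.0
    for (d1, inr1), (d2, inr2) in zip(readings, readings[1:]):
        span = (d2 - d1).days
        if span <= 0:
            continue
        for step in range(span):
            inr = inr1 + (inr2 - inr1) * step / span   # interpolated INR
            days_in_range += low <= inr <= high
        days_total += span
    return days_in_range / days_total if days_total else float("nan")

readings = [(date(2017, 1, 1), 1.8), (date(2017, 1, 15), 2.6),
            (date(2017, 2, 12), 3.4), (date(2017, 3, 12), 2.4)]
print(f"TTR = {rosendaal_ttr(readings):.0%}")   # vs the NICE 65% threshold
```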
ABSTRACT
Background Important disparities in the treatment and outcomes of women and men with atrial fibrillation (AF) are well recognized. Whether introduction of direct oral anticoagulants has reduced disparities in treatment is uncertain. Methods and Results All patients who had an incident hospitalization from 2010 to 2019 with nonvalvular AF in Scotland were included in the present cohort study. Community drug dispensing data were used to determine prescribed oral anticoagulation therapy and comorbidity status. Logistic regression modeling was used to evaluate patient factors associated with treatment with vitamin K antagonists and direct oral anticoagulants. A total of 172 989 patients (48% women [82 833 of 172 989]) had an incident hospitalization with nonvalvular AF in Scotland between 2010 and 2019. By 2019, factor Xa inhibitors accounted for 83.6% of all oral anticoagulants prescribed, while treatment with vitamin K antagonists and direct thrombin inhibitors declined to 15.9% and 0.6%, respectively. Women were less likely to be prescribed any oral anticoagulation therapy compared with men (adjusted odds ratio [aOR], 0.68 [95% CI, 0.67-0.70]). This disparity was mainly attributed to vitamin K antagonists (aOR, 0.68 [95% CI, 0.66-0.70]), while there was less disparity in the use of factor Xa inhibitors between women and men (aOR, 0.92 [95% CI, 0.90-0.95]). Conclusions Women with nonvalvular AF were significantly less likely to be prescribed vitamin K antagonists compared with men. Most patients admitted to the hospital in Scotland with incident nonvalvular AF are now treated with factor Xa inhibitors and this is associated with fewer treatment disparities between women and men.
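Adjusted odds ratios of this kind come from logistic regression of treatment receipt on sex plus covariates. The sketch below runs on synthetic data; the -0.38 sex coefficient is an assumption chosen only so the toy aOR lands near the reported 0.68, not a study estimate.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(2)
n = 10_000
female = rng.binomial(1, 0.48, n)
age = rng.normal(76, 9, n)
# Synthetic prescribing pattern: women less likely to receive an OAC
logit = 0.3 - 0.38 * female - 0.02 * (age - 76)
oac = rng.binomial(1, 1 / (1 + np.exp(-logit)))

X = sm.add_constant(np.column_stack([female, age]))
fit = sm.Logit(oac, X).fit(disp=0)
aor = np.exp(fit.params[1])                 # adjusted OR, women vs men
lo, hi = np.exp(fit.conf_int()[1])
print(f"aOR (women vs men): {aor:.2f} [95% CI {lo:.2f}-{hi:.2f}]")
```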
Subject(s)
Atrial Fibrillation, Humans, Female, Male, Atrial Fibrillation/complications, Atrial Fibrillation/diagnosis, Atrial Fibrillation/drug therapy, Sex Characteristics, Cohort Studies, Factor Xa Inhibitors/therapeutic use, Anticoagulants, Fibrinolytic Agents, Vitamin K

ABSTRACT
BACKGROUND: Oral anticoagulation therapies (OATs) are often prescribed in conjunction with medications to restore normal heart rate or rhythm, and can limit the risk of atrial fibrillation (AF)-related stroke and systemic thromboembolism. However, they are associated with the serious side effect of bleeding. Both clinically relevant non-major bleeding (CRNMB) and major bleeding while anticoagulated are believed to have a significant impact on patient quality of life (QoL). There is currently limited research into the effect bleeding has on QoL. The aim of this study is to evaluate the feasibility of identifying and recruiting patients diagnosed with AF who are taking OATs and have recently experienced a bleed, and of collecting information on their QoL. METHODS: We will recruit a minimum of 50 patients to this cross-sectional, observational study. We will recruit from general practices, secondary care, and through an online AF forum. We will ask participants to complete three validated patient-reported outcome measures (PROMs), EQ5D, AFEQT, and PACT-Q, approximately 4 weeks following a bleed and again 3 months later. We will randomly select a subset of 10 participants (of those who agree to be interviewed) to undergo a structured interview with a member of the research team to explore the impact of bleeding on their QoL and to gain feedback on the three PROMs used. We will undertake a descriptive analysis of the PROMs and demographic data. We will analyse the qualitative interviews thematically to identify key themes. DISCUSSION: We aim to establish whether it is possible to recruit patients and use PROMs to collect information regarding how patient QoL is affected when they experience either a clinically relevant non-major bleed or a major bleed while taking OATs for the management of AF. We will also explore the appropriateness, or otherwise, of the three identified PROMs for assessing quality of life following a bleed. PROMS: Three PROMs were selected following a literature review of similar QoL studies and using the COnsensus-based Standards for the selection of health Measurement INstruments (COSMIN) checklist for comparison. The review of the current literature produced no suitable validated PROM to record QoL experiences in patients who have been diagnosed with AF and have experienced a bleed while anticoagulated. As such, the EQ5D, AFEQT, and PACT-Q (part 2) were deemed most appropriate for use in this feasibility study. TRIAL REGISTRATION: The trial has been adopted onto the NIHR Portfolio (ID no. 47771) and registered with www.clinicaltrials.gov (NCT04921176), retrospectively registered in June 2021.
ABSTRACT
OBJECTIVE: The PULsE-AI trial sought to determine the effectiveness of a screening strategy that included a machine learning risk prediction algorithm in conjunction with diagnostic testing for identification of undiagnosed atrial fibrillation (AF) in primary care. This study aimed to evaluate the cost-effectiveness of implementing the screening strategy in a real-world setting. METHODS: Data from the PULsE-AI trial - a prospective, randomized, controlled trial conducted across six general practices in England from June 2019 to February 2021 - were used to inform a cost-effectiveness analysis that included a hybrid screening decision tree and Markov AF disease progression model. Model outcomes were reported at both individual and population level (the estimated UK population ≥30 years of age at high risk of undiagnosed AF) and included the number of patients screened, the number of AF cases identified, mean total and incremental costs (screening, events, treatment), quality-adjusted life-years (QALYs), and the incremental cost-effectiveness ratio (ICER). RESULTS: The screening strategy was estimated to result in 45,493 new diagnoses of AF across the high-risk population in the UK (3.3 million), and an estimated additional 14,004 lifetime diagnoses compared with routine care only. Per-patient costs for high-risk individuals who underwent the screening strategy were estimated at £1,985 (vs £1,888 for individuals receiving routine care only). At a population level, the screening strategy was associated with a cost increase of approximately £322 million and an increase of 81,000 QALYs. The screening strategy demonstrated cost-effectiveness versus routine care only at an accepted ICER threshold of £20,000 per QALY gained, with an ICER of £3,994/QALY. CONCLUSIONS: Compared with routine care only, it is cost-effective to target individuals at high risk of undiagnosed AF through an AF risk prediction algorithm, who should then undergo diagnostic testing. This AF risk prediction algorithm can reduce the number of patients needed to be screened to identify undiagnosed AF, thus alleviating primary care burden.
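The hybrid model couples a screening decision tree to a Markov AF progression model. The sketch below shows the Markov/ICER mechanics only, with invented states, transition probabilities, costs, and utilities; it is emphatically not the PULsE-AI model.

```python
import numpy as np

def run(p_stroke, af_cost, years=20, disc=0.035):
    # Three-state Markov cohort: AF, post-stroke, dead. All transition
    # probabilities, costs, and utilities are illustrative placeholders.
    P = np.array([[1 - p_stroke - 0.05, p_stroke, 0.05],
                  [0.0, 0.85, 0.15],
                  [0.0, 0.00, 1.00]])
    state_cost = np.array([af_cost, 9000.0, 0.0])   # annual cost per state (GBP)
    state_qaly = np.array([0.80, 0.55, 0.0])        # annual utility per state
    cohort = np.array([1.0, 0.0, 0.0])              # everyone starts in AF
    cost = qaly = 0.0
    for t in range(years):
        d = (1 + disc) ** -t                        # discount factor for year t
        cost += d * cohort @ state_cost
        qaly += d * cohort @ state_qaly
        cohort = cohort @ P                         # advance one annual cycle
    return cost, qaly

s_cost, s_qaly = run(p_stroke=0.012, af_cost=1050.0)  # screened: treated AF
r_cost, r_qaly = run(p_stroke=0.030, af_cost=250.0)   # routine: undetected AF
s_cost += 60.0                                        # assumed screening cost
print(f"ICER = £{(s_cost - r_cost) / (s_qaly - r_qaly):,.0f} per QALY gained")
```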
Subject(s)
Atrial Fibrillation, Algorithms, Artificial Intelligence, Atrial Fibrillation/complications, Cost-Benefit Analysis, Electrocardiography, Humans, Machine Learning, Mass Screening, Primary Health Care, Prospective Studies, Quality-Adjusted Life Years

ABSTRACT
AIMS: In patients with non-valvular atrial fibrillation prescribed warfarin, the UK National Institute for Health and Care Excellence (NICE) defines poor anticoagulation as a time in therapeutic range (TTR) of <65%, any two international normalized ratios (INRs) ≤1.5 within a 6-month period ('low'), two INRs ≥5 within 6 months, or any INR ≥8 ('high'). Our objectives were to (i) quantify the number of patients with poor INR control and (ii) describe the demographic and clinical characteristics associated with poor INR control. METHODS AND RESULTS: Linked anonymized health record data for Wales, UK (2006-2017) were used to evaluate patients prescribed warfarin who had at least 6 months of INR data. A total of 32 380 patients were included. Of these, 13 913 (43.0%) had at least one of the NICE markers of poor INR control. Importantly, among the 24 123 (74.6%) of the cohort with an acceptable TTR (≥65%), 5676 (23.5%) had either low or high INR readings at some point in their history. In a multivariable regression, female sex, age (≥75 years), excess alcohol, diabetes, heart failure, ischaemic heart disease, and respiratory disease were independently associated with all markers of poor INR control. CONCLUSION: INR control according to NICE standards is frequently poor. Of those with an acceptable TTR (≥65%), one-quarter still had unacceptably low or high INR levels according to NICE criteria. Thus, using TTR alone to assess effectiveness of warfarin has the potential to miss a large number of patients with non-therapeutic INRs who are likely to be at increased risk.
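The four NICE markers translate directly into flags over a patient's dated INR series. A sketch (pandas assumed; 6 months approximated as 182 days; the TTR value is supplied, e.g. from a Rosendaal calculation):

```python
import pandas as pd

def nice_poor_control(inr: pd.Series, ttr: float) -> dict:
    """Flags for the NICE markers of poor INR control. `inr` is a
    date-indexed series of INR results for one patient."""
    low_6m = (inr[inr <= 1.5].rolling("182D").count() >= 2).any()
    high_6m = (inr[inr >= 5].rolling("182D").count() >= 2).any()
    return {
        "TTR < 65%": ttr < 0.65,
        "two INRs <= 1.5 within 6 months": bool(low_6m),
        "two INRs >= 5 within 6 months": bool(high_6m),
        "any INR >= 8": bool((inr >= 8).any()),
    }

inr = pd.Series([2.4, 1.4, 1.3, 2.8, 5.2, 2.1],
                index=pd.to_datetime(["2016-01-05", "2016-02-02", "2016-03-01",
                                      "2016-06-07", "2016-09-13", "2016-10-11"]))
flags = nice_poor_control(inr, ttr=0.71)
print({k: v for k, v in flags.items() if v})   # markers triggered
```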
Subject(s)
Atrial Fibrillation, Warfarin, Aged, Atrial Fibrillation/drug therapy, Female, Humans, International Normalized Ratio, Male, Warfarin/therapeutic use

ABSTRACT
Aims: As many cases of atrial fibrillation (AF) are asymptomatic, patients often remain undiagnosed until complications (e.g. stroke) manifest. Risk-prediction algorithms may help to efficiently identify people with undiagnosed AF. However, the cost-effectiveness of targeted screening remains uncertain. This study aimed to assess the cost-effectiveness of targeted screening, informed by a machine learning (ML) risk prediction algorithm, to identify patients with AF. Methods: Cost-effectiveness analyses were undertaken utilizing a hybrid screening decision tree and Markov disease progression model. Costs and outcomes associated with the detection of AF were compared between traditional systematic and opportunistic AF screening strategies and targeted screening informed by an ML risk prediction algorithm. Model analyses were based on adults ≥50 years and adopted the UK NHS perspective. Results: Targeted screening using the ML risk prediction algorithm required fewer patients to be screened (61 per 1,000 patients, compared with 534 and 687 patients in the systematic and opportunistic strategies) and detected more AF cases (11 per 1,000 patients, compared with 6 and 8 AF cases in the systematic and opportunistic screening strategies). The targeted approach demonstrated cost-effectiveness under base case settings (cost per QALY gained of £4,847 and £5,544 against systematic and opportunistic screening, respectively). The targeted screening strategy was predicted to provide an additional 3.40 and 2.05 QALYs per 1,000 patients screened versus systematic and opportunistic strategies, and remained cost-effective in all scenarios evaluated. Limitations: The analysis relied on assumptions, including the extrapolation of outcomes over patients' remaining lifespan, the lack of consideration of treatment discontinuation or switching, and the assumption that the ML risk prediction algorithm will identify asymptomatic AF. Conclusions: Targeted screening using an ML risk prediction algorithm has the potential to enhance the clinical and cost-effectiveness of AF screening, improving health outcomes through efficient use of limited healthcare resources.
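The reported per-1,000 screening and detection counts imply very different numbers needed to screen (NNS). The arithmetic below derives NNS and diagnostic yield from the abstract's figures; NNS itself is derived here, not reported.

```python
# Screening efficiency per 1,000 adults >=50 years, from the abstract's figures.
strategies = {
    "systematic":    {"screened": 534, "af_found": 6},
    "opportunistic": {"screened": 687, "af_found": 8},
    "ML-targeted":   {"screened": 61,  "af_found": 11},
}
for name, s in strategies.items():
    nns = s["screened"] / s["af_found"]   # patients screened per AF case found
    print(f"{name:>13}: NNS = {nns:5.1f}, "
          f"yield = {s['af_found'] / s['screened']:.1%}")
```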
Subject(s)
Atrial Fibrillation/diagnosis, Machine Learning, Mass Screening/economics, Mass Screening/methods, Risk Assessment, Algorithms, Cost-Benefit Analysis, Decision Trees, Humans, Markov Chains, Quality-Adjusted Life Years, Risk Assessment/statistics & numerical data, Undiagnosed Diseases/diagnosis, United Kingdom

ABSTRACT
Atrial fibrillation (AF) is associated with an increased risk of stroke, enhanced stroke severity, and other comorbidities. However, AF is often asymptomatic and frequently remains undiagnosed until complications occur. Current screening approaches for AF lack either cost-effectiveness or diagnostic sensitivity; thus, there is interest in tools that could be used for population screening. An AF risk prediction algorithm, developed using machine learning from a UK dataset of 2,994,837 patients, was found to be more effective than existing models at identifying patients at risk of AF. The aim of this trial is therefore to assess the effectiveness of this risk prediction algorithm combined with diagnostic testing for the identification of AF in a real-world primary care setting. Eligible participants (aged ≥30 years and without an existing AF diagnosis) registered at participating UK general practices will be randomised into intervention and control arms. Intervention arm participants identified at highest risk of developing AF (algorithm risk score ≥7.4%) will be invited for a 12-lead electrocardiogram (ECG) followed by two weeks of home-based ECG monitoring with a KardiaMobile device. Control arm participants will be used for comparison and will be managed routinely. The primary outcome is the number of AF diagnoses in the intervention arm compared with the control arm during the research window. If the trial is successful, there is potential for the risk prediction algorithm to be implemented throughout primary care to narrow the population considered at highest risk of AF who could benefit from more intensive screening. Trial Registration: NCT04045639.
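Operationally, selecting the intervention-arm invitees reduces to filtering the eligible register against the published risk threshold. A toy sketch with hypothetical patients; the `af_risk` column stands in for the algorithm's output, which is not reproduced here.

```python
import pandas as pd

RISK_THRESHOLD = 0.074   # the trial's invitation threshold (risk score >= 7.4%)

# Hypothetical register extract (all values invented for illustration)
patients = pd.DataFrame({
    "patient_id": [101, 102, 103, 104],
    "age":        [58, 71, 66, 83],
    "has_af_dx":  [False, False, False, False],
    "af_risk":    [0.021, 0.112, 0.068, 0.195],
})
eligible = patients[(patients["age"] >= 30) & ~patients["has_af_dx"]]
invited = eligible[eligible["af_risk"] >= RISK_THRESHOLD]
# Invitees proceed to a 12-lead ECG plus two weeks of KardiaMobile monitoring.
print(invited[["patient_id", "af_risk"]])
```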
Subject(s)
Atrial Fibrillation, Algorithms, Atrial Fibrillation/diagnosis, Atrial Fibrillation/epidemiology, Electrocardiography, Heart Rate, Humans, Machine Learning, Mass Screening, Randomized Controlled Trials as Topic

ABSTRACT
Background There is little evidence on how the occurrence of a bleed in individuals on vitamin K antagonists (VKAs) impacts the risk of subsequent bleeds, and thromboembolic and ischemic events. Such information would help to inform treatment decisions following bleeds. Objective To estimate the impact of bleeding events on the risk of subsequent bleeds, venous thromboembolism (VTE), stroke, and myocardial infarction (MI) among patients initiating VKA treatment for new-onset nonvalvular atrial fibrillation (NVAF). Methods We conducted an observational cohort study using a linked Clinical Practice Research Datalink-Hospital Episode Statistics dataset. Among a cohort of individuals with NVAF, the risk of clinically relevant bleeding, VTE, stroke, and MI was compared between the period prior to the first bleed and the periods following each subsequent bleed. The rate and cost of general practitioner (GP) consultations, prescriptions, and hospitalizations were also compared across these periods. Results The risk of clinically relevant bleeding events was observed to be elevated at least twofold in all periods following the first bleeding event. The risk of VTE, stroke, and MI was not found to differ according to the number of clinically relevant bleeding events. The rate and cost of GP consultations, GP prescriptions, and hospitalizations were increased in all periods relative to the period prior to the first bleed. Conclusions The doubling in the risk of bleeding following the first bleed, taken alongside the stable risk of MI, VTE, and stroke, suggests that the risk-benefit balance for VKA treatment should be reconsidered following the first clinically relevant bleed.
ABSTRACT
BACKGROUND: Atrial fibrillation (AF) is the most common sustained heart arrhythmia. However, as many cases are asymptomatic, a large proportion of patients remain undiagnosed until serious complications arise. Efficient, cost-effective detection of undiagnosed AF may be supported by risk-prediction models relating patient factors to AF risk. However, there is a need for an implementable risk model that is contemporaneous and informed by routinely collected patient data, reflecting the real-world pathology of AF. METHODS: This study sought to develop and evaluate novel and conventional statistical and machine learning models for risk prediction of AF. This was a retrospective cohort study of adults (aged ≥30 years) without a history of AF, listed on the Clinical Practice Research Datalink, from January 2006 to December 2016. Models evaluated included published risk models (Framingham, ARIC, CHARGE-AF), machine learning models evaluating baseline and time-updated information (neural network, LASSO, random forests, support vector machines), and Cox regression. RESULTS: Analysis of 2,994,837 individuals (3.2% AF) identified time-varying neural networks as the optimal model, achieving an AUROC of 0.827 (vs. 0.725) with a number needed to screen of 9 (vs. 13) patients at 75% sensitivity, compared with the best existing model, CHARGE-AF. The optimal model confirmed known baseline risk factors (age, previous cardiovascular disease, antihypertensive medication usage) and identified additional time-varying predictors (proximity of cardiovascular events, body mass index (both levels and changes), pulse pressure, and the frequency of blood pressure measurements). CONCLUSION: The optimal time-varying machine learning model exhibited greater predictive performance than existing AF risk models and reflected known and new patient risk factors for AF.
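Number needed to screen at a fixed sensitivity is 1/PPV at the corresponding operating point on the ROC curve. The sketch below reproduces that calculation on synthetic scores; the 1.35 separation between cases and non-cases is chosen only so the toy AUROC lands near the reported 0.827, and nothing here is the study model.

```python
import numpy as np
from sklearn.metrics import roc_auc_score, roc_curve

rng = np.random.default_rng(3)
n, prevalence = 100_000, 0.032          # ~3.2% AF, as in the study cohort
y = rng.binomial(1, prevalence, n)
# Synthetic risk scores: cases score higher on average (placeholder model)
score = rng.normal(0.0, 1.0, n) + 1.35 * y

print(f"AUROC = {roc_auc_score(y, score):.3f}")
fpr, tpr, _ = roc_curve(y, score)
i = np.argmax(tpr >= 0.75)              # first operating point with >=75% sens.
flagged = tpr[i] * prevalence + fpr[i] * (1 - prevalence)  # fraction screened
nns = flagged / (tpr[i] * prevalence)   # screened per true case found (1/PPV)
print(f"Number needed to screen at 75% sensitivity: {nns:.1f}")
```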
Subject(s)
Atrial Fibrillation/diagnosis, Machine Learning, Primary Health Care/methods, Adult, Age Factors, Aged, Antihypertensive Agents/therapeutic use, Atrial Fibrillation/etiology, Blood Pressure, Body Mass Index, Cardiovascular Diseases/complications, Female, Humans, Male, Middle Aged, Neural Networks, Computer, Retrospective Studies, Risk Assessment/methods, Risk Factors

ABSTRACT
OBJECTIVES: Patients refractory to older therapies for neuropathic pain (NeP) have few remaining therapeutic options. This study evaluates the cost-utility of pregabalin in the treatment of patients with refractory NeP in Sweden, from a healthcare and a societal perspective. STUDY LIMITATIONS: The use of non-randomized (observational) data to determine the effectiveness of treatments for NeP, and the use of non-Swedish data for some input parameters in the model. METHODS: A previously constructed discrete event simulation model was adapted to compare pregabalin combined with usual care to usual care alone in a Swedish setting. Pain profiles were generated using clinical data from five non-randomized pregabalin studies in refractory NeP patients. Utility data were generated from a UK survey of patients with NeP. Cost data were generated from the Swedish Dental and Pharmaceutical Benefits Board (TLV) product price database, a national NeP register, and a regional registry study. Indirect costs were estimated from published sources. One-way and probabilistic sensitivity analyses evaluated uncertainty in the model's output. RESULTS: The incremental cost-effectiveness ratio (ICER) for pregabalin plus usual care compared to usual care alone was 51,616 SEK (€5,364) and 123,993 SEK (€12,886) with and without indirect costs, respectively. One-way sensitivity analyses confirmed the clinical input data as the main driver of the model; even considerable changes to all other input parameters had only a modest effect on the ICER. The ICER remained well below a conservative threshold of 347,495 SEK (€36,113; £30,000) in all scenarios modelled. CONCLUSIONS: This study found pregabalin combined with usual care to be cost-effective compared to usual care alone in patients with refractory NeP from a Swedish healthcare perspective. Moreover, sensitivity analysis showed pregabalin's cost-effectiveness to be robust in all scenarios modelled.
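A probabilistic sensitivity analysis (PSA) of the kind mentioned draws model parameters from distributions and summarises the resulting spread of ICERs. The sketch below uses invented distributions loosely centred on the reported base case; neither the distributions nor their parameters come from the study.

```python
import numpy as np

rng = np.random.default_rng(4)
N = 10_000                                    # PSA draws

# Illustrative parameter distributions (placeholders, not study inputs)
inc_cost = rng.gamma(shape=25.0, scale=2_000.0, size=N)   # incremental SEK
inc_qaly = rng.normal(loc=0.42, scale=0.10, size=N)       # incremental QALYs

wtp = 347_495                                 # SEK/QALY threshold (~GBP 30,000)
icer = inc_cost / inc_qaly
prob_ce = np.mean(inc_cost < wtp * inc_qaly)  # net-benefit rule at this WTP
print(f"Median ICER: {np.median(icer):,.0f} SEK/QALY; "
      f"P(cost-effective at {wtp:,} SEK/QALY) = {prob_ce:.1%}")
```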
Subject(s)
Analgesics/economics, Analgesics/therapeutic use, Neuralgia/drug therapy, gamma-Aminobutyric Acid/analogs & derivatives, Clinical Trials as Topic, Cost-Benefit Analysis, Health Services/economics, Health Services/statistics & numerical data, Humans, Models, Econometric, Neuralgia/economics, Pain Management/economics, Pain Management/methods, Pregabalin, Quality of Life, Quality-Adjusted Life Years, Sweden, Time Factors, gamma-Aminobutyric Acid/economics, gamma-Aminobutyric Acid/therapeutic use

ABSTRACT
OBJECTIVES: A small but significant proportion of patients with peripheral neuropathic pain (NeP) are refractory to the typical treatments applied in clinical practice, including amitriptyline and gabapentin, and thus continue to suffer the debilitating effects of NeP. This study aimed to evaluate the cost-effectiveness of pregabalin in comparison to usual care in patients with refractory NeP, from a third-party payer's perspective (the NHS). METHODS: A stochastic simulation model was constructed, using clinical data from four non-randomized studies, to generate pain pathways of patients receiving usual care and pregabalin. Treatment effect (pain reduction) was converted to quality-of-life (QoL) data using a regression analysis based on new utility data collected from a survey of refractory NeP patients presenting to pain clinics in Cardiff, Wales. All relevant direct costs were estimated using resource use from the survey data (where available) and unit costs from the British National Formulary (BNF). The analysis was run over a 5-year time horizon, with costs and benefits discounted at 3.5%. STUDY LIMITATIONS: The use of non-randomized (observational) data to characterize the effectiveness of treatments for NeP, and the exclusion of productivity costs and consequences from the analysis. RESULTS: In the base case analysis, an incremental cost-effectiveness ratio (ICER) of £10,803 per quality-adjusted life year (QALY) was attained. This result was found to be reasonably insensitive to variations in the key input parameters, with ICERs ranging from £8,505 to £22,845 per QALY gained. CONCLUSIONS: The analysis shows that pregabalin is a cost-effective alternative to usual care in patients with refractory NeP, with an ICER well below the threshold typically adopted by UK health technology assessment groups, such as NICE.
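Discounting at 3.5% applies a factor of 1/(1.035)^t to each year's costs and QALYs before forming the ICER. A minimal sketch with hypothetical annual streams (the yearly cost and utility gains below are invented for illustration, not study outputs):

```python
def discounted(values, rate=0.035):
    """Present value of a stream of annual costs or QALYs, discounted at
    3.5% per year as in the analysis (year 0 undiscounted)."""
    return sum(v / (1 + rate) ** t for t, v in enumerate(values))

# Hypothetical 5-year streams for one patient (illustrative numbers only)
annual_cost_gain = [450, 450, 450, 450, 450]   # extra cost of pregabalin, GBP
annual_qaly_gain = [0.04] * 5                  # utility gain vs usual care

icer = discounted(annual_cost_gain) / discounted(annual_qaly_gain)
print(f"ICER over 5 years: £{icer:,.0f} per QALY")
```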