Results 1 - 20 of 37
1.
J Bone Miner Res ; 2024 Apr 15.
Article in English | MEDLINE | ID: mdl-38619297

ABSTRACT

Evidence on the comparative effectiveness of osteoporosis treatments is heterogeneous. This may be attributed to different populations and clinical practice, but also to differing methodologies ensuring comparability of treatment groups before treatment effect estimation and the amount of residual confounding by indication. This study assessed the comparability of denosumab vs oral bisphosphonate (OBP) groups using propensity score (PS) methods and negative control outcome (NCO) analysis. A total of 280 288 women aged ≥50 years initiating denosumab or OBP in 2011-2018 were included from the UK Clinical Practice Research Datalink (CPRD) and the Danish National Registries (DNR). Balance of observed covariates was assessed using absolute standardised mean difference (ASMD) before and after PS weighting, matching, and stratification, with ASMD >0.1 indicating imbalance. Residual confounding was assessed using NCOs with ≥100 events. Hazard ratios (HRs) and 95% confidence intervals (CIs) between treatment and each NCO were estimated using Cox models. Presence of residual confounding was evaluated with two approaches: (1) >5% of NCOs with a 95% CI excluding 1; (2) >5% of NCOs with an upper CI <0.75 or a lower CI >1.3. The number of imbalanced covariates before adjustment (CPRD 22/87; DNR 18/83) decreased, with 2-11% imbalance remaining after weighting, matching or stratification. Using approach 1, residual confounding was present for all PS methods in both databases (≥8% of NCOs), except for stratification in DNR (3.8%). Using approach 2, residual confounding was present in CPRD with PS matching (5.3%) and stratification (6.4%), but not with weighting (4.3%). Within DNR, no NCOs had HR estimates with upper or lower CI limits beyond the specified bounds indicating residual confounding for any PS method. Achievement of covariate balance and determination of residual bias were dependent upon several factors including the population under study, PS method, prevalence of NCO, and the threshold indicating residual confounding.


Treatment groups in clinical practice may not be comparable as patient characteristics differ according to the need for the prescribed medication, known as confounding. We assessed comparability of two common osteoporosis treatments, denosumab and oral bisphosphonate, in 280 288 postmenopausal women using electronic health records from UK Clinical Practice Research Datalink (CPRD) and Danish National Registries (DNR). We evaluated comparability of recorded patient characteristics with three propensity score (PS) methods, matching, stratification, and weighting. We assessed residual confounding from unrecorded patient characteristics via negative control outcomes (NCO), events known not to be associated with treatment such as delirium. We found that achieving comparability of osteoporosis treatment groups depended on the study population, PS method, and definition of residual confounding. Weighting and stratification performed the best in DNR and CPRD, respectively. Using a stricter threshold based on statistical significance for the NCO suggested the treatment groups were not comparable, except for PS stratification in DNR. Applying clinically significant thresholds of treatment effect size showed comparability using weighting in CPRD and all PS methods in DNR. Studies should consider more than one PS method to test robustness and identify the largest number of NCO to give the greatest flexibility in detecting residual confounding.
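The two residual-confounding decision rules described in this record reduce to simple checks over the per-NCO hazard-ratio estimates. Below is a minimal sketch (not the authors' code), assuming a pandas DataFrame of precomputed Cox hazard ratios with illustrative column names hr, ci_low and ci_high:

```python
# Minimal sketch of the two negative control outcome (NCO) decision rules.
# Assumes `nco` has one row per NCO with columns "hr", "ci_low", "ci_high"
# (assumed names, not taken from the study's code).
import pandas as pd

def residual_confounding_flags(nco: pd.DataFrame) -> dict:
    # Approach 1: the 95% CI excludes the null value of 1.
    excludes_null = (nco["ci_low"] > 1) | (nco["ci_high"] < 1)
    # Approach 2: effect-size rule - upper CI < 0.75 or lower CI > 1.3.
    beyond_bounds = (nco["ci_high"] < 0.75) | (nco["ci_low"] > 1.3)
    return {
        "pct_approach_1": 100 * excludes_null.mean(),
        "pct_approach_2": 100 * beyond_bounds.mean(),
        "residual_confounding_approach_1": excludes_null.mean() > 0.05,
        "residual_confounding_approach_2": beyond_bounds.mean() > 0.05,
    }
```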

2.
Drug Saf ; 47(2): 117-123, 2024 Feb.
Article in English | MEDLINE | ID: mdl-38019365

ABSTRACT

The use of artificial intelligence (AI)-based tools to guide prescribing decisions is full of promise and may enhance patient outcomes. These tools can perform actions such as choosing the 'safest' medication, choosing between competing medications, promoting de-prescribing or even predicting non-adherence. These tools can exist in a variety of formats; for example, they may be directly integrated into electronic medical records or they may exist as a stand-alone website accessible by a web browser. One potential impact of these tools is that they could manipulate our understanding of the benefit-risk of medicines in the real world. Currently, the benefit-risk of approved medications is assessed according to carefully planned agreements covering spontaneous reporting systems and planned surveillance studies. But AI-based tools may limit or even block prescription to high-risk patients or prevent off-label use. The uptake and temporal availability of these tools may be uneven across healthcare systems and geographies, creating artefacts in data that are difficult to account for. It is also hard to estimate the 'true impact' that a tool had on a prescribing decision. International borders may also be highly porous to these tools, especially in cases where tools are available over the web. These tools already exist, and their use is likely to increase in the coming years. How they can be accounted for in benefit-risk decisions is yet to be seen.


Subject(s)
Artificial Intelligence, Delivery of Health Care, Humans, Drug Prescriptions, Electronic Health Records, Risk Assessment
3.
Front Pharmacol ; 14: 988605, 2023.
Article in English | MEDLINE | ID: mdl-37033623

ABSTRACT

Purpose: Surgeon and hospital-related features, such as volume, can be associated with treatment choices and outcomes. Accounting for these covariates with propensity score (PS) analysis can be challenging due to the clustered nature of the data. We studied six different PS estimation strategies for clustered data using random effects modelling (REM) compared with logistic regression. Methods: Monte Carlo simulations were used to generate variable cluster-level confounding intensity [odds ratio (OR) = 1.01-2.5] and cluster size (20-1,000 patients per cluster). The following PS estimation strategies were compared: i) logistic regression omitting cluster-level confounders; ii) logistic regression including cluster-level confounders; iii) the same as ii) but including cross-level interactions; iv), v), and vi), similar to i), ii), and iii), respectively, but using REM instead of logistic regression. The same strategies were tested in a trial emulation of partial versus total knee replacement (TKR) surgery, where observational versus trial-based estimates were compared as a proxy for bias. Performance metrics included bias and mean square error (MSE). Results: In most simulated scenarios, logistic regression, including cluster-level confounders, led to the lowest bias and MSE, for example, with 50 clusters × 200 individuals and confounding intensity OR = 1.5, a relative bias of 10%, and MSE of 0.003 for (i) compared to 32% and 0.010 for (iv). The results from the trial emulation also gave similar trends. Conclusion: Logistic regression, including patient and surgeon-/hospital-level confounders, appears to be the preferred strategy for PS estimation.
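As an illustration of the propensity-score estimation strategies compared above, the sketch below (not the study code) fits ordinary logistic regression with and without a cluster-level confounder; the variable names, including hospital_volume, are assumptions. Strategies iv)-vi) would replace the logit with a random-effects (mixed) logistic model with a random intercept per cluster.

```python
import pandas as pd
import statsmodels.formula.api as smf

def estimate_ps(df: pd.DataFrame) -> pd.DataFrame:
    # Strategy i): patient-level covariates only (cluster-level confounder omitted).
    m1 = smf.logit("treated ~ age + sex + comorbidity", data=df).fit(disp=False)
    # Strategy ii): patient-level covariates plus the cluster-level confounder.
    m2 = smf.logit("treated ~ age + sex + comorbidity + hospital_volume", data=df).fit(disp=False)
    out = df.copy()
    out["ps_no_cluster"] = m1.predict(df)
    out["ps_with_cluster"] = m2.predict(df)
    return out
```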

4.
Front Pharmacol ; 14: 1118203, 2023.
Article in English | MEDLINE | ID: mdl-37033631

ABSTRACT

Background: Thrombosis with thrombocytopenia syndrome (TTS) has been identified as a rare adverse event following some COVID-19 vaccines. Various guidelines have been issued on the treatment of TTS. We aimed to characterize the treatment of TTS and other thromboembolic events (venous thromboembolism (VTE) and arterial thromboembolism (ATE)) after COVID-19 vaccination and to compare it with historical (pre-vaccination) data in Europe and the US. Methods: We conducted an international network cohort study using 8 primary care, outpatient, and inpatient databases from France, Germany, the Netherlands, Spain, the United Kingdom, and the United States. We investigated treatment pathways after the diagnosis of TTS, VTE, or ATE for a pre-vaccination (background) cohort (01/2017-11/2020) and for a vaccinated cohort of people followed for 28 days after a dose of any COVID-19 vaccine recorded from 12/2020 onwards. Results: Great variability was observed in the proportion of people treated (with any recommended therapy) across databases, both before and after vaccination. Most patients with TTS received heparins, platelet aggregation inhibitors, or direct Xa inhibitors. The majority of VTE patients (before and after vaccination) were first treated with heparins in inpatient settings and direct Xa inhibitors in outpatient settings. In ATE patients, treatments were also similar before and after vaccinations, with platelet aggregation inhibitors prescribed most frequently. Inpatient and claims data also showed substantial heparin use. Conclusion: TTS, VTE, and ATE after COVID-19 vaccination were treated similarly to background events. Heparin use for post-vaccine TTS suggests most events were not identified as vaccine-induced thrombosis with thrombocytopenia by the treating clinicians.
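A treatment pathway of the kind summarised above can be sketched as the first dispensing of a recommended therapy class within 28 days of the index event; the code below is a generic illustration with assumed column names, not the study's analysis.

```python
import pandas as pd

def first_line_treatment(events: pd.DataFrame, rx: pd.DataFrame) -> pd.Series:
    # events: "person_id", "event_date" (datetime); rx: "person_id", "rx_date" (datetime), "drug_class".
    merged = events.merge(rx, on="person_id")
    in_window = merged[(merged["rx_date"] >= merged["event_date"]) &
                       (merged["rx_date"] <= merged["event_date"] + pd.Timedelta(days=28))]
    first = in_window.sort_values("rx_date").groupby(["person_id", "event_date"]).first()
    return first["drug_class"].value_counts(normalize=True)  # proportion treated first with each class
```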

5.
BMC Geriatr ; 23(1): 58, 2023 01 31.
Article in English | MEDLINE | ID: mdl-36721104

ABSTRACT

BACKGROUND: While several definitions exist for multimorbidity, frailty or polypharmacy, it is yet unclear to what extent single healthcare markers capture the complexity of health-related needs in older people in the community. We aimed to identify and characterise older people with complex health needs based on healthcare resource use (unplanned hospitalisations or polypharmacy) or frailty using large population-based linked records. METHODS: In this cohort study, data were extracted from UK primary care records (CPRD GOLD), with linked Hospital Episode Statistics inpatient data. People aged > 65 on 1st January 2010, registered in CPRD for ≥ 1 year, were included. We identified complex health needs as the top quintile of unplanned hospitalisations, number of prescribed medicines, and electronic frailty index. We characterised all three cohorts, and quantified point-prevalence and incidence rates of preventive medicines use. RESULTS: Overall, 90,597, 110,225 and 116,076 individuals were included in the hospitalisation, frailty, and polypharmacy cohorts respectively; 28,259 (5.9%) were in all three cohorts, while 277,332 (58.3%) were not in any (background population). Frailty and polypharmacy cohorts had the highest bi-directional overlap. Most comorbidities such as diabetes and chronic kidney disease were more common in the frailty and polypharmacy cohorts compared to the hospitalisation cohort. Generally, prevalence of preventive medicines use was highest in the polypharmacy cohort compared to the other two cohorts: for instance, one-year point-prevalence of statins was 64.2% in the polypharmacy cohort vs. 60.5% in the frailty cohort. CONCLUSIONS: Three distinct groups of older people with complex health needs were identified. Compared to the hospitalisation cohort, frailty and polypharmacy cohorts had more comorbidities and higher preventive therapies use. Research is needed into the benefit-risk of different definitions of complex health needs and use of preventive therapies in the older population.
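The cohort definitions above (top quintile of unplanned admissions, prescribed medicines, and the electronic frailty index) can be sketched as below; the DataFrame and its column names are assumptions for illustration, not the study's code.

```python
import pandas as pd

def complex_needs_cohorts(df: pd.DataFrame) -> pd.DataFrame:
    # df: one row per person with columns "unplanned_adm", "n_medicines", "efi".
    out = df.copy()
    for col, flag in [("unplanned_adm", "hospitalisation_cohort"),
                      ("n_medicines", "polypharmacy_cohort"),
                      ("efi", "frailty_cohort")]:
        out[flag] = df[col] >= df[col].quantile(0.8)   # top quintile
    cohort_cols = ["hospitalisation_cohort", "polypharmacy_cohort", "frailty_cohort"]
    out["all_three"] = out[cohort_cols].all(axis=1)
    return out
```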


Subject(s)
Frailty, Humans, Aged, Cohort Studies, Frailty/diagnosis, Frailty/epidemiology, Semantic Web, Hospitals, Primary Health Care, United Kingdom/epidemiology
6.
Rheumatology (Oxford) ; 62(11): 3592-3600, 2023 11 02.
Article in English | MEDLINE | ID: mdl-36688706

ABSTRACT

OBJECTIVES: To explore clustering of comorbidities among patients with a new diagnosis of OA and estimate the 10-year mortality risk for each identified cluster. METHODS: This is a population-based cohort study of individuals with first incident diagnosis of OA of the hip, knee, ankle/foot, wrist/hand or 'unspecified' site between 2006 and 2020, using SIDIAP (a primary care database representative of Catalonia, Spain). At the time of OA diagnosis, conditions associated with OA in the literature that were found in ≥1% of the individuals (n = 35) were fitted into two cluster algorithms, k-means and latent class analysis. Models were assessed using a range of internal and external evaluation procedures. Mortality risk of the obtained clusters was assessed by survival analysis using Cox proportional hazards. RESULTS: We identified 633 330 patients with a diagnosis of OA. Our proposed best solution used latent class analysis to identify four clusters: 'low-morbidity' (relatively low number of comorbidities), 'back/neck pain plus mental health', 'metabolic syndrome' and 'multimorbidity' (higher prevalence of all studied comorbidities). Compared with the 'low-morbidity' cluster, the 'multimorbidity' cluster had the highest risk of 10-year mortality (adjusted hazard ratio [HR]: 2.19 [95% CI: 2.15, 2.23]), followed by the 'metabolic syndrome' cluster (adjusted HR: 1.24 [95% CI: 1.22, 1.27]) and the 'back/neck pain plus mental health' cluster (adjusted HR: 1.12 [95% CI: 1.09, 1.15]). CONCLUSION: Patients with a new diagnosis of OA can be clustered into groups based on their comorbidity profile, with significant differences in 10-year mortality risk. Further research is required to understand the interplay between OA and particular comorbidity groups, and the clinical significance of such results.
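The two-step analysis described above (cluster patients on comorbidity indicators, then compare 10-year mortality across clusters) might look like the sketch below. It uses k-means for brevity; the record's preferred solution used latent class analysis, which needs a dedicated package. Column names are assumptions.

```python
import pandas as pd
from sklearn.cluster import KMeans
from lifelines import CoxPHFitter

def cluster_and_survival(comorb: pd.DataFrame, surv: pd.DataFrame):
    # comorb: 0/1 comorbidity flags per patient; surv: "time" (years) and "died" per patient.
    km = KMeans(n_clusters=4, n_init=10, random_state=0).fit(comorb)
    data = surv.copy()
    data["cluster"] = km.labels_
    # One-hot encode clusters, using the first cluster as the reference group.
    data = pd.get_dummies(data, columns=["cluster"], drop_first=True, dtype=float)
    cph = CoxPHFitter()
    cph.fit(data, duration_col="time", event_col="died")
    return km, cph  # cph.summary holds hazard ratios (exp(coef)) and 95% CIs
```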


Subject(s)
Hip Osteoarthritis, Knee Osteoarthritis, Humans, Spain/epidemiology, Knee Osteoarthritis/epidemiology, Cohort Studies, Neck Pain, Hip Osteoarthritis/epidemiology, Hip Osteoarthritis/diagnosis, Comorbidity
7.
Nat Commun ; 13(1): 7167, 2022 11 23.
Article in English | MEDLINE | ID: mdl-36418291

ABSTRACT

Population-based studies can provide important evidence on the safety of COVID-19 vaccines. Using data from the United Kingdom, here we compare observed rates of thrombosis and thrombocytopenia following vaccination against SARS-CoV-2 and infection with SARS-CoV-2 with background (expected) rates in the general population. First and second dose cohorts for ChAdOx1 or BNT162b2 between 8 December 2020 and 2 May 2021 in the United Kingdom were identified. A further cohort consisted of people with no prior COVID-19 vaccination who were infected with SARS-CoV-2, identified by a first positive PCR test between 1 September 2020 and 2 May 2021. The fourth, general population cohort for background rates included those people in the database as of 1 January 2017. In total, we included 3,768,517 ChAdOx1 and 1,832,841 BNT162b2 vaccinees, 401,691 people infected with SARS-CoV-2, and 9,414,403 people from the general population. An increased risk of venous thromboembolism was seen after first dose of ChAdOx1 (standardized incidence ratio: 1.12 [95% CI: 1.05 to 1.20]), BNT162b2 (1.12 [1.03 to 1.21]), and positive PCR test (7.27 [6.86 to 7.72]). Rates of cerebral venous sinus thrombosis were higher than otherwise expected after first dose of ChAdOx1 (4.14 [2.54 to 6.76]) and a SARS-CoV-2 PCR positive test (3.74 [1.56 to 8.98]). Rates of arterial thromboembolism after vaccination were no higher than expected but were increased after a SARS-CoV-2 PCR positive test (1.39 [1.21 to 1.61]). Rates of venous thromboembolism with thrombocytopenia were higher than expected after a SARS-CoV-2 PCR positive test (5.76 [3.19 to 10.40]).
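The standardised incidence ratio (SIR) used above is the observed event count divided by the count expected from background (e.g. age- and sex-specific) rates; a textbook version with an exact Poisson 95% CI is sketched below (not the study's code).

```python
from scipy.stats import chi2

def sir(observed: int, expected: float):
    # Exact Poisson confidence limits for the observed count, scaled by the expected count.
    lower = chi2.ppf(0.025, 2 * observed) / 2 if observed > 0 else 0.0
    upper = chi2.ppf(0.975, 2 * (observed + 1)) / 2
    return observed / expected, lower / expected, upper / expected

# Hypothetical example: 120 observed events where background rates predict 107.1.
point_estimate, ci_low, ci_high = sir(120, 107.1)
```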


Subject(s)
COVID-19 Vaccines, COVID-19, Thrombocytopenia, Thrombosis, Venous Thromboembolism, Humans, BNT162 Vaccine, COVID-19/epidemiology, COVID-19/prevention & control, COVID-19 Vaccines/adverse effects, SARS-CoV-2, Thrombocytopenia/epidemiology, Thrombocytopenia/etiology, Thrombosis/epidemiology, Thrombosis/etiology, Vaccination/adverse effects, Venous Thromboembolism/epidemiology, Venous Thromboembolism/etiology, United Kingdom
8.
Nat Commun ; 13(1): 7169, 2022 11 23.
Article in English | MEDLINE | ID: mdl-36418321

ABSTRACT

Population-based studies can provide important evidence on the safety of COVID-19 vaccines. Here we compare rates of thrombosis and thrombocytopenia following vaccination against SARS-CoV-2 with the background (expected) rates in the general population. In addition, we compare the rates of the same adverse events among persons infected with SARS-CoV-2 with background rates. Primary care and linked hospital data from Catalonia, Spain informed the study, with participants vaccinated with BNT162b2 or ChAdOx1 (27/12/2020-23/06/2021), COVID-19 cases (01/09/2020-23/06/2021) or present in the database as of 01/01/2017. We included 2,021,366 BNT162b2 (1,327,031 with 2 doses), 592,408 ChAdOx1, 174,556 COVID-19 cases, and 4,573,494 background participants. Standardised incidence ratios for venous thromboembolism were 1.18 (95% CI 1.06-1.32) and 0.92 (0.81-1.05) after first- and second-dose BNT162b2, and 0.92 (0.71-1.18) after first-dose ChAdOx1. The standardised incidence ratio for venous thromboembolism in COVID-19 was 10.19 (9.43-11.02). Standardised incidence ratios for arterial thromboembolism were 1.02 (0.95-1.09) and 1.04 (0.97-1.12) after first- and second-dose BNT162b2, 1.06 (0.91-1.23) after first-dose ChAdOx1 and 4.13 (3.83-4.45) for COVID-19. Standardised incidence ratios for thrombocytopenia were 1.49 (1.43-1.54) and 1.40 (1.35-1.45) after first- and second-dose BNT162b2, 1.28 (1.19-1.38) after first-dose ChAdOx1 and 4.59 (4.41-4.77) for COVID-19. While rates of thrombosis with thrombocytopenia were generally similar to background rates, the standardised incidence ratio for pulmonary embolism with thrombocytopenia after first-dose BNT162b2 was 1.70 (1.11-2.61). These findings suggest that the safety profiles of BNT162b2 and ChAdOx1 are similar, with rates of adverse events seen after vaccination typically similar to background rates. Meanwhile, rates of adverse events are much increased for COVID-19 cases, further underlining the importance of vaccination.


Subject(s)
COVID-19, Thrombocytopenia, Thrombosis, Venous Thromboembolism, Humans, SARS-CoV-2, Spain/epidemiology, Venous Thromboembolism/epidemiology, Venous Thromboembolism/etiology, COVID-19/epidemiology, COVID-19/prevention & control, COVID-19 Vaccines/adverse effects, BNT162 Vaccine, Thrombocytopenia/epidemiology, Thrombocytopenia/etiology, Thrombosis/epidemiology, Thrombosis/etiology, Vaccination/adverse effects
9.
BMJ ; 379: e071594, 2022 10 26.
Article in English | MEDLINE | ID: mdl-36288813

ABSTRACT

OBJECTIVE: To quantify the comparative risk of thrombosis with thrombocytopenia syndrome or thromboembolic events associated with use of adenovirus based covid-19 vaccines versus mRNA based covid-19 vaccines. DESIGN: International network cohort study. SETTING: Routinely collected health data from contributing datasets in France, Germany, the Netherlands, Spain, the UK, and the US. PARTICIPANTS: Adults (age ≥18 years) registered at any contributing database and who received at least one dose of a covid-19 vaccine (ChAdOx1-S (Oxford-AstraZeneca), BNT162b2 (Pfizer-BioNTech), mRNA-1273 (Moderna), or Ad26.COV2.S (Janssen/Johnson & Johnson)), from December 2020 to mid-2021. MAIN OUTCOME MEASURES: Thrombosis with thrombocytopenia syndrome or venous or arterial thromboembolic events within the 28 days after covid-19 vaccination. Incidence rate ratios were estimated after propensity scores matching and were calibrated using negative control outcomes. Estimates specific to the database were pooled by use of random effects meta-analyses. RESULTS: Overall, 1 332 719 of 3 829 822 first dose ChAdOx1-S recipients were matched to 2 124 339 of 2 149 679 BNT162b2 recipients from Germany and the UK. Additionally, 762 517 of 772 678 people receiving Ad26.COV2.S were matched to 2 851 976 of 7 606 693 receiving BNT162b2 in Germany, Spain, and the US. All 628 164 Ad26.COV2.S recipients from the US were matched to 2 230 157 of 3 923 371 mRNA-1273 recipients. A total of 862 thrombocytopenia events were observed in the matched first dose ChAdOx1-S recipients from Germany and the UK, and 520 events after a first dose of BNT162b2. Comparing ChAdOx1-S with a first dose of BNT162b2 revealed an increased risk of thrombocytopenia (pooled calibrated incidence rate ratio 1.33 (95% confidence interval 1.18 to 1.50) and calibrated incidence rate difference of 1.18 (0.57 to 1.8) per 1000 person years). Additionally, a pooled calibrated incidence rate ratio of 2.26 (0.93 to 5.52) for venous thrombosis with thrombocytopenia syndrome was seen with Ad26.COV2.S compared with BNT162b2. CONCLUSIONS: In this multinational study, a pooled 30% increased risk of thrombocytopenia after a first dose of the ChAdOx1-S vaccine was observed, as was a trend towards an increased risk of venous thrombosis with thrombocytopenia syndrome after Ad26.COV2.S compared with BNT162b2. Although rare, the observed risks after adenovirus based vaccines should be considered when planning further immunisation campaigns and future vaccine development.
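The final pooling step described above (database-specific calibrated estimates combined by random-effects meta-analysis) follows standard DerSimonian-Laird formulae on the log scale; a generic sketch, not the study's code, is given below.

```python
import numpy as np

def pool_random_effects(irr, ci_low, ci_high):
    # irr, ci_low, ci_high: arrays of database-specific estimates and 95% CI limits.
    log_irr = np.log(irr)
    se = (np.log(ci_high) - np.log(ci_low)) / (2 * 1.96)    # SE recovered from the CI
    w = 1 / se**2                                           # fixed-effect weights
    fixed = np.sum(w * log_irr) / np.sum(w)
    q = np.sum(w * (log_irr - fixed) ** 2)                  # Cochran's Q
    tau2 = max(0.0, (q - (len(log_irr) - 1)) / (np.sum(w) - np.sum(w**2) / np.sum(w)))
    w_re = 1 / (se**2 + tau2)                               # random-effects weights
    pooled = np.sum(w_re * log_irr) / np.sum(w_re)
    se_pooled = np.sqrt(1 / np.sum(w_re))
    return np.exp([pooled, pooled - 1.96 * se_pooled, pooled + 1.96 * se_pooled])
```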


Subject(s)
COVID-19 Vaccines, Thrombocytopenia, Thromboembolism, Thrombosis, Adolescent, Adult, Humans, Ad26COVS1/adverse effects, BNT162 Vaccine/adverse effects, Cohort Studies, COVID-19/epidemiology, COVID-19/prevention & control, COVID-19 Vaccines/adverse effects, Thrombocytopenia/epidemiology, Thromboembolism/epidemiology, Thrombosis/epidemiology, Venous Thrombosis/epidemiology
10.
J Thromb Haemost ; 20(12): 2887-2895, 2022 12.
Article in English | MEDLINE | ID: mdl-36111372

ABSTRACT

BACKGROUND: COVID-19 vaccination has been associated with increased venous thromboembolism (VTE) risk. However, it is unknown whether genetic predisposition to VTE is associated with an increased risk of thrombosis following vaccination. METHODS: Using data from the UK Biobank, which contains in-depth genotyping and linked vaccination and health outcomes information, we generated a polygenic risk score (PRS) using 299 genetic variants. We prospectively assessed associations between the PRS and incident VTE immediately after first- and second-dose vaccination, and among historical unvaccinated cohorts during the pre-pandemic and early pandemic periods. We estimated hazard ratios (HR) for PRS-VTE associations using Cox models. RESULTS: Of 359 310 individuals receiving one dose of a COVID-19 vaccine, 160 327 (44.6%) were males, and the mean age at the vaccination date was 69.05 (standard deviation [SD] 8.04) years. After 28 and 90 days' follow-up, 88 and 299 individuals developed VTE, respectively, equivalent to an incidence rate of 0.88 (95% confidence interval [CI] 0.70-1.08) and 0.92 (0.82-1.04) per 100 000 person-days. The PRS was significantly associated with a higher risk of VTE (HR per 1 SD increase in PRS, 1.41 (1.15-1.73) within 28 days and 1.36 (1.22-1.52) within 90 days). Similar associations were found in the historical unvaccinated cohorts. CONCLUSIONS: The strength of genetic susceptibility with post-COVID-19-vaccination VTE is similar to that seen in historical data. Additionally, the observed PRS-VTE associations were equivalent for adenovirus- and mRNA-based vaccines. These findings suggest that, at the population level, VTE occurring after COVID-19 vaccination has a similar genetic etiology to conventional VTE.
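The genetic analysis described above combines a polygenic risk score (a weighted sum of risk-allele dosages) with a Cox model for incident VTE; a hedged sketch with assumed column names (not the UK Biobank pipeline) follows.

```python
import pandas as pd
from lifelines import CoxPHFitter

def prs_cox(dosages: pd.DataFrame, weights: pd.Series, surv: pd.DataFrame) -> CoxPHFitter:
    # dosages: one column per variant (0-2 allele counts); weights: per-variant effect
    # sizes indexed by the same variant IDs; surv: "time", "vte" plus numeric covariates.
    prs = dosages.mul(weights, axis=1).sum(axis=1)          # weighted allele-dosage sum
    data = surv.copy()
    data["prs_per_sd"] = (prs - prs.mean()) / prs.std()     # standardise to per-1-SD units
    cph = CoxPHFitter()
    cph.fit(data, duration_col="time", event_col="vte")
    return cph                                              # exp(coef) of prs_per_sd = HR per 1 SD
```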


Subject(s)
COVID-19 Vaccines, COVID-19, Venous Thromboembolism, Aged, Female, Humans, Male, Middle Aged, COVID-19/epidemiology, COVID-19/prevention & control, COVID-19 Vaccines/adverse effects, Genetic Predisposition to Disease, Risk Factors, Vaccination/adverse effects, Venous Thromboembolism/epidemiology, Venous Thromboembolism/etiology
11.
Front Pharmacol ; 13: 912361, 2022.
Article in English | MEDLINE | ID: mdl-35754470

ABSTRACT

Objective: To characterize the trend of opioid use (number of users, dispensations and oral morphine milligram equivalents) in Catalonia (Spain). Design, setting, and participants: This population-based cohort study included all individuals aged 18 years or older, registered in the Information System for Research in Primary Care (SIDIAP), which covers >75% of the population in Catalonia, Spain, from 1 January 2007 to 31 December 2019. Main exposure and outcomes: The exposures were all commercialized opioids and their combinations (ATC codes): codeine, tramadol, oxycodone, tapentadol, fentanyl, morphine, and other opioids (dihydrocodeine, hydromorphone, dextropropoxyphene, buprenorphine, pethidine, pentazocine). The main outcomes were the annual figures per 1,000 individuals of 1) opioid users, 2) dispensations, and 3) oral morphine milligram equivalents (MME). Results were stratified separately by opioid type, age (5-year age groups), sex (male or female), living area (rural or urban), and socioeconomic status (from least, U1, to most deprived, U5). The overall trends were quantified using the percentage change (PC) between 2007 and 2019. Results: Among the 4,656,197 residents in 2007 and 4,798,114 in 2019, the number of opioid users, dispensations and morphine milligram equivalents per 1,000 individuals increased by 12% (95% confidence interval (CI) 11.9-12.3%), 105% (95% CI 83%-126%) and 339% (95% CI 289%-390%), respectively. Tramadol represented the majority of opioid use in 2019 (61%, 59%, and 54% of opioid users, dispensations, and total MME, respectively). Individuals aged 80 years or over showed the sharpest increases in opioid users (PC: 162%), dispensations (PC: 424%), and MME (PC: 830%). Strong opioids were increasingly prescribed for non-cancer pain over the years. Conclusion: Despite the modest increase in opioid users, opioid dispensations and MME increased substantially, particularly in the older population. In addition, strong opioids were increasingly indicated for non-cancer pain over the years. These findings suggest a transition of opioid prescriptions from intermittent to chronic use and from weak to strong opioids, and call for more rigorous opioid stewardship.
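The main outcome construction above (oral morphine milligram equivalents and annual figures per 1,000 residents) can be sketched as follows; the conversion factors and column names are illustrative placeholders, not the study's values.

```python
import pandas as pd

MME_PER_MG = {"tramadol": 0.1, "codeine": 0.15, "oxycodone": 1.5, "morphine": 1.0}  # examples only

def annual_opioid_figures(disp: pd.DataFrame, population: pd.Series):
    # disp: one row per dispensation with "year", "person_id", "drug", "qty_mg".
    # population: adult residents per year, indexed by year.
    disp = disp.assign(mme=disp["qty_mg"] * disp["drug"].map(MME_PER_MG))
    yearly = disp.groupby("year").agg(users=("person_id", "nunique"),
                                      dispensations=("drug", "size"),
                                      mme=("mme", "sum"))
    per_1000 = yearly.div(population, axis=0) * 1000          # annual figures per 1,000 residents
    pct_change = 100 * (per_1000.loc[2019] - per_1000.loc[2007]) / per_1000.loc[2007]
    return per_1000, pct_change
```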

12.
BMJ ; 376: e068373, 2022 03 16.
Article in English | MEDLINE | ID: mdl-35296468

ABSTRACT

OBJECTIVE: To study the association between covid-19 vaccines, SARS-CoV-2 infection, and risk of immune mediated neurological events. DESIGN: Population based historical rate comparison study and self-controlled case series analysis. SETTING: Primary care records from the United Kingdom, and primary care records from Spain linked to hospital data. PARTICIPANTS: 8 330 497 people who received at least one dose of covid-19 vaccines ChAdOx1 nCoV-19, BNT162b2, mRNA-1273, or Ad26.COV2.S between the rollout of the vaccination campaigns and end of data availability (UK: 9 May 2021; Spain: 30 June 2021). The study sample also comprised a cohort of 735 870 unvaccinated individuals with a first positive reverse transcription polymerase chain reaction test result for SARS-CoV-2 from 1 September 2020, and 14 330 080 participants from the general population. MAIN OUTCOME MEASURES: Outcomes were incidence of Bell's palsy, encephalomyelitis, Guillain-Barré syndrome, and transverse myelitis. Incidence rates were estimated in the 21 days after the first vaccine dose, 90 days after a positive test result for SARS-CoV-2, and between 2017 and 2019 for background rates in the general population cohort. Indirectly standardised incidence ratios were estimated. Adjusted incidence rate ratios were estimated from the self-controlled case series. RESULTS: The study included 4 376 535 people who received ChAdOx1 nCoV-19, 3 588 318 who received BNT162b2, 244 913 who received mRNA-1273, and 120 731 who received Ad26.COV2.S; 735 870 people with SARS-CoV-2 infection; and 14 330 080 people from the general population. Overall, post-vaccine rates were consistent with expected (background) rates for Bell's palsy, encephalomyelitis, and Guillain-Barré syndrome. A self-controlled case series analysis was conducted only for Bell's palsy, given limited statistical power, and showed no safety signal for those vaccinated. Rates were, however, higher than expected after SARS-CoV-2 infection. For example, in the data from the UK, the standardised incidence ratio for Bell's palsy was 1.33 (1.02 to 1.74), for encephalomyelitis was 6.89 (3.82 to 12.44), and for Guillain-Barré syndrome was 3.53 (1.83 to 6.77). Transverse myelitis was rare (<5 events in all vaccinated cohorts) and could not be analysed. CONCLUSIONS: No safety signal was observed between covid-19 vaccines and the immune mediated neurological events of Bell's palsy, encephalomyelitis, Guillain-Barré syndrome, and transverse myelitis. An increased risk of Bell's palsy, encephalomyelitis, and Guillain-Barré syndrome was, however, observed for people with SARS-CoV-2 infection.


Subject(s)
Bell Palsy/epidemiology, COVID-19 Vaccines/administration & dosage, COVID-19/prevention & control, Encephalomyelitis/epidemiology, Guillain-Barre Syndrome/epidemiology, Transverse Myelitis/epidemiology, SARS-CoV-2/immunology, Adult, Aged, Female, Humans, Incidence, Male, Middle Aged, Routinely Collected Health Data, Spain, United Kingdom, Vaccination/adverse effects
13.
Health Technol Assess ; 25(66): 1-126, 2021 11.
Article in English | MEDLINE | ID: mdl-34812138

ABSTRACT

BACKGROUND: Although routine NHS data potentially include all patients, confounding limits their use for causal inference. Methods to minimise confounding in observational studies of implantable devices are required to enable the evaluation of patients with severe systemic morbidity who are excluded from many randomised controlled trials. OBJECTIVES: Stage 1 - replicate the Total or Partial Knee Arthroplasty Trial (TOPKAT), a surgical randomised controlled trial comparing unicompartmental knee replacement with total knee replacement using propensity score and instrumental variable methods. Stage 2 - compare the risk benefits and cost-effectiveness of unicompartmental knee replacement with total knee replacement surgery in patients with severe systemic morbidity who would have been ineligible for TOPKAT using the validated methods from stage 1. DESIGN: This was a cohort study. SETTING: Data were obtained from the National Joint Registry database and linked to hospital inpatient (Hospital Episode Statistics) and patient-reported outcome data. PARTICIPANTS: Stage 1 - people undergoing unicompartmental knee replacement surgery or total knee replacement surgery who met the TOPKAT eligibility criteria. Stage 2 - participants with an American Society of Anesthesiologists grade of ≥ 3. INTERVENTION: The patients were exposed to either unicompartmental knee replacement surgery or total knee replacement surgery. MAIN OUTCOME MEASURES: The primary outcome measure was the postoperative Oxford Knee Score. The secondary outcome measures were 90-day postoperative complications (venous thromboembolism, myocardial infarction and prosthetic joint infection) and 5-year revision risk and mortality. The main outcome measures for the health economic analysis were health-related quality of life (EuroQol-5 Dimensions) and NHS hospital costs. RESULTS: In stage 1, propensity score stratification and inverse probability weighting replicated the results of TOPKAT. Propensity score adjustment, propensity score matching and instrumental variables did not. Stage 2 included 2256 unicompartmental knee replacement patients and 57,682 total knee replacement patients who had severe comorbidities, of whom 145 and 23,344 had linked Oxford Knee Scores, respectively. A statistically significant but clinically irrelevant difference favouring unicompartmental knee replacement was observed, with a mean postoperative Oxford Knee Score difference of < 2 points using propensity score stratification; no significant difference was observed using inverse probability weighting. Unicompartmental knee replacement more than halved the risk of venous thromboembolism [relative risk 0.33 (95% confidence interval 0.15 to 0.74) using propensity score stratification; relative risk 0.39 (95% confidence interval 0.16 to 0.96) using inverse probability weighting]. Unicompartmental knee replacement was not associated with myocardial infarction or prosthetic joint infection using either method. In the long term, unicompartmental knee replacement had double the revision risk of total knee replacement [hazard ratio 2.70 (95% confidence interval 2.15 to 3.38) using propensity score stratification; hazard ratio 2.60 (95% confidence interval 1.94 to 3.47) using inverse probability weighting], but half of the mortality [hazard ratio 0.52 (95% confidence interval 0.36 to 0.74) using propensity score stratification; insignificant effect using inverse probability weighting]. 
Unicompartmental knee replacement had lower costs and higher quality-adjusted life-year gains than total knee replacement for stage 2 participants. LIMITATIONS: Although some propensity score methods successfully replicated TOPKAT, unresolved confounding may have affected stage 2. Missing Oxford Knee Scores may have led to information bias. CONCLUSIONS: Propensity score stratification and inverse probability weighting successfully replicated TOPKAT, implying that some (but not all) propensity score methods can be used to evaluate surgical innovations and implantable medical devices using routine NHS data. Unicompartmental knee replacement was safer and more cost-effective than total knee replacement for patients with severe comorbidity and should be considered the first option for suitable patients. FUTURE WORK: Further research is required to understand the performance of propensity score methods for evaluating surgical innovations and implantable devices. TRIAL REGISTRATION: This trial is registered as EUPAS17435. FUNDING: This project was funded by the National Institute for Health Research (NIHR) Health Technology Assessment programme and will be published in full in Health Technology Assessment; Vol. 25, No. 66. See the NIHR Journals Library website for further project information.


We compared the risks and benefits of partial and total knee replacements in NHS patients with a complex medical history who would normally be excluded from randomised trials on this topic. We used information that was collected during hospital appointments for people who had a knee replacement between 2009 and 2016. It is difficult to directly compare the two groups because each individual patient has a different medical history. We tested advanced statistical methods to account for these differences. In stage 1, we showed that some of these advanced statistical methods could replicate the results of a recently published surgical trial using routine data from the NHS. We compared patients in the trial with similar patients who were operated on in the NHS. Three of the proposed methods showed results similar to those obtained from the Total or Partial Knee Arthroplasty Trial (TOPKAT). In stage 2, we used the successful methods from stage 1 to study the risks, benefits and costs of partial and total knee replacement surgery in patients with complex medical histories. Two of the statistical methods found that patients who had a partial knee replacement had less self-reported pain and better function after surgery than patients who had a total knee replacement. All three methods found that partial knee replacement was safer, was associated with a lower risk of blood clots (a known complication of knee surgery) and had lower mortality over 5 years. However, patients who had a partial knee replacement were twice as likely as those with a total knee replacement to need a second surgery within 5 years. We found that partial knee replacements were less costly to the NHS and were associated with better overall quality of life for patients than total knee replacement.
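One of the methods that replicated the trial above, inverse probability weighting, can be sketched as below: a propensity model for receiving partial (unicompartmental) rather than total knee replacement, stabilised weights, and a weighted regression for the postoperative Oxford Knee Score. Column names are assumptions, not the project's code.

```python
import pandas as pd
import statsmodels.formula.api as smf

def ipw_mean_difference(df: pd.DataFrame):
    # df columns assumed: "ukr" (1 = unicompartmental, 0 = total knee replacement),
    # baseline covariates "age", "sex", "asa_grade", and outcome "oks_post".
    ps = smf.logit("ukr ~ age + sex + asa_grade", data=df).fit(disp=False).predict(df)
    p_treated = df["ukr"].mean()
    weights = df["ukr"] * p_treated / ps + (1 - df["ukr"]) * (1 - p_treated) / (1 - ps)
    # Weighted regression of the outcome on treatment gives the IPW mean difference;
    # robust (sandwich) errors partially account for the estimated weights.
    fit = smf.wls("oks_post ~ ukr", data=df, weights=weights).fit(cov_type="HC1")
    return fit.params["ukr"], fit.conf_int().loc["ukr"]
```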


Subject(s)
Knee Replacement Arthroplasty, Cohort Studies, Cost-Benefit Analysis, Humans, Propensity Score, Quality of Life, Quality-Adjusted Life Years
14.
JAMA ; 326(15): 1504-1515, 2021 10 19.
Article in English | MEDLINE | ID: mdl-34665205

ABSTRACT

Importance: Although tramadol is increasingly used to manage chronic noncancer pain, few safety studies have compared it with other opioids. Objective: To assess the associations of tramadol, compared with codeine, with mortality and other adverse clinical outcomes as used in outpatient settings. Design, Setting, and Participants: Retrospective, population-based, propensity score-matched cohort study using a primary care database with routinely collected medical records and pharmacy dispensations covering more than 80% of the population of Catalonia, Spain (≈6 million people). Patients 18 years or older with 1 or more year of available data and dispensation of tramadol or codeine (2007-2017) were included and followed up to December 31, 2017. Exposures: New prescription dispensation of tramadol or codeine (no dispensation in the previous year). Main Outcomes and Measures: Outcomes studied were all-cause mortality, cardiovascular events, fractures, constipation, delirium, falls, opioid abuse/dependence, and sleep disorders within 1 year after the first dispensation. Absolute rate differences (ARDs) and hazard ratios (HRs) with 95% confidence intervals were calculated using cause-specific Cox models. Results: Of the 1 093 064 patients with a tramadol or codeine dispensation during the study period (326 921 for tramadol, 762 492 for codeine, 3651 for both drugs concomitantly), a total of 368 960 patients (184 480 propensity score-matched pairs) were included after study exclusions and propensity score matching (mean age, 53.1 [SD, 16.1] years; 57.3% women). Compared with codeine, tramadol dispensation was significantly associated with a higher risk of all-cause mortality (incidence, 13.00 vs 5.61 per 1000 person-years; HR, 2.31 [95% CI, 2.08-2.56]; ARD, 7.37 [95% CI, 6.09-8.78] per 1000 person-years), cardiovascular events (incidence, 10.03 vs 8.67 per 1000 person-years; HR, 1.15 [95% CI, 1.05-1.27]; ARD, 1.36 [95% CI, 0.45-2.36] per 1000 person-years), and fractures (incidence, 12.26 vs 8.13 per 1000 person-years; HR, 1.50 [95% CI, 1.37-1.65]; ARD, 4.10 [95% CI, 3.02-5.29] per 1000 person-years). No significant difference was observed for the risk of falls, delirium, constipation, opioid abuse/dependence, or sleep disorders. Conclusions and Relevance: In this population-based cohort study, a new prescription dispensation of tramadol, compared with codeine, was significantly associated with a higher risk of subsequent all-cause mortality, cardiovascular events, and fractures, but there was no significant difference in the risk of constipation, delirium, falls, opioid abuse/dependence, or sleep disorders. The findings should be interpreted cautiously, given the potential for residual confounding.
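Propensity-score matching of the kind used above (new users of tramadol matched to new users of codeine) is often implemented as greedy nearest-neighbour matching within a caliper on the logit of the score; the sketch below is a generic illustration, not the study's algorithm.

```python
import numpy as np
import pandas as pd

def greedy_caliper_match(ps: pd.Series, treated: pd.Series, caliper_sd: float = 0.2) -> pd.DataFrame:
    # ps: estimated propensity scores; treated: 1/0 indicator, both sharing an index.
    logit = np.log(ps / (1 - ps))
    caliper = caliper_sd * logit.std()
    controls = logit[treated == 0]
    used, pairs = set(), []
    for idx, value in logit[treated == 1].sample(frac=1, random_state=0).items():
        candidates = [c for c in controls[(controls - value).abs() <= caliper].index if c not in used]
        if candidates:
            best = (controls.loc[candidates] - value).abs().idxmin()
            pairs.append((idx, best))
            used.add(best)
    return pd.DataFrame(pairs, columns=["treated_id", "control_id"])
```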


Subject(s)
Opioid Analgesics/adverse effects, Cause of Death, Codeine/adverse effects, Tramadol/adverse effects, Accidental Falls/statistics & numerical data, Ambulatory Care, Cardiovascular Diseases/chemically induced, Cardiovascular Diseases/epidemiology, Factual Databases, Delirium/epidemiology, Drug Prescriptions/statistics & numerical data, Female, Bone Fractures/chemically induced, Bone Fractures/epidemiology, Humans, Incidence, Male, Middle Aged, Opioid-Related Disorders/epidemiology, Propensity Score, Proportional Hazards Models, Retrospective Studies, Sleep Wake Disorders/epidemiology
15.
J Bone Miner Res ; 36(11): 2153-2161, 2021 11.
Article in English | MEDLINE | ID: mdl-34173277

ABSTRACT

Conflicting results exist about the relationship between bariatric surgery and fracture risk. Also, prediction of who is at increased risk of fracture after bariatric surgery is not currently available. Hence, we used a self-controlled case series (SCCS) study to establish the association between bariatric surgery and fracture, and a cohort study to develop a prediction model for postoperative fracture risk. Patients from UK primary care records (Clinical Practice Research Datalink GOLD, linked to Hospital Episode Statistics) undergoing bariatric surgery with body mass index (BMI) ≥30 kg/m2 between 1997 and 2018 were included in the cohort. Those sustaining one or more fractures in the 5 years before or after surgery were included in the SCCS. Fractures were considered in three categories: (i) any except skull and digits (primary outcome); (ii) major (hip, vertebrae, wrist/forearm, and humerus); and (iii) peripheral (forearm and lower leg). Of 5487 participants, 252 (4.6%) experienced 272 fractures (of which 80 were major and 135 peripheral) and were included in the SCCS analyses. Major fracture risk increased after surgery, with incidence rate ratios (IRRs) of 2.77 (95% CI, 1.34-5.75) and 3.78 (95% CI, 1.42-10.08) at ≤3 years and 3.1 to 5 years postsurgery, respectively, when compared with the 5 years prior to surgery. Any fracture risk was higher only in the 2.1 to 5 years following surgery (IRR 1.73; 95% CI, 1.08-2.77) when compared to 5 years prior to surgery. No excess risk of peripheral fracture after surgery was identified. A prediction tool for major fracture was developed using the 5487 participants included in the cohort study. It was also internally validated (area under the receiver-operating characteristic curve [AUC ROC] 0.70), with use of anxiolytics/sedatives/hypnotics and female sex as major predictors. Hence, major fractures are nearly threefold more likely after bariatric surgery. A simple prediction tool with five variables identifies patients at high risk of major fracture. © 2021 The Authors. Journal of Bone and Mineral Research published by Wiley Periodicals LLC on behalf of American Society for Bone and Mineral Research (ASBMR).
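The self-controlled case series comparison described above estimates within-person incidence rate ratios for post-surgery risk windows versus the pre-surgery baseline. A rough equivalent, sketched below with assumed variable names, fits a Poisson model with person fixed effects and an offset for time at risk; it is an illustration, not the paper's code.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

def sccs_irr(intervals: pd.DataFrame) -> pd.Series:
    # intervals: one row per person x risk window with columns "person_id",
    # "window" (e.g. "baseline", "0-3y_post", "3-5y_post"), "events", "days".
    model = smf.glm("events ~ C(window, Treatment('baseline')) + C(person_id)",
                    data=intervals,
                    family=sm.families.Poisson(),
                    offset=np.log(intervals["days"])).fit()
    return np.exp(model.params.filter(like="window"))   # IRRs of each window vs baseline
```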


Subject(s)
Bariatric Surgery, Bone Fractures, Bariatric Surgery/adverse effects, Cohort Studies, Female, Bone Fractures/epidemiology, Humans, Risk Factors, United Kingdom
16.
Health Technol Assess ; 25(17): 1-106, 2021 03.
Article in English | MEDLINE | ID: mdl-33739919

ABSTRACT

BACKGROUND: Bisphosphonates are contraindicated in patients with stage 4+ chronic kidney disease. However, they are widely used to prevent fragility fractures in stage 3 chronic kidney disease, despite a lack of good-quality data on their effects. OBJECTIVES: The aims of each work package were as follows. Work package 1: to study the relationship between bisphosphonate use and chronic kidney disease progression. Work package 2: to study the association between using bisphosphonates and fracture risk. Work package 3: to determine the risks of hypocalcaemia, hypophosphataemia, acute kidney injury and upper gastrointestinal events associated with using bisphosphonates. Work package 4: to investigate the association between using bisphosphonates and changes in bone mineral density over time. DESIGN: This was a new-user cohort study design with propensity score matching. SETTING AND DATA SOURCES: Data were obtained from UK NHS primary care (Clinical Practice Research Datalink GOLD database) and linked hospital inpatient records (Hospital Episode Statistics) for work packages 1-3 and from the Danish Odense University Hospital Databases for work package 4. PARTICIPANTS: Patients registered in the data sources who had at least one measurement of estimated glomerular filtration rate of < 45 ml/minute/1.73 m2 were eligible. A second estimated glomerular filtration rate value of < 45 ml/minute/1.73 m2 within 1 year after the first was requested for work packages 1 and 3. Patients with no Hospital Episode Statistics linkage were excluded from work packages 1-3. Patients with < 1 year of run-in data before index estimated glomerular filtration rate and previous users of anti-osteoporosis medications were excluded from work packages 1-4. INTERVENTIONS/EXPOSURE: Bisphosphonate use, identified from primary care prescriptions (for work packages 1-3) or pharmacy dispensations (for work package 4), was the main exposure. MAIN OUTCOME MEASURES: Work package 1: chronic kidney disease progression, defined as stage worsening or starting renal replacement. Work package 2: hip fracture. Work package 3: acute kidney injury, hypocalcaemia and hypophosphataemia identified from Hospital Episode Statistics, and gastrointestinal events identified from Clinical Practice Research Datalink or Hospital Episode Statistics. Work package 4: annualised femoral neck bone mineral density percentage change. RESULTS: Bisphosphonate use was associated with an excess risk of chronic kidney disease progression (subdistribution hazard ratio 1.12, 95% confidence interval 1.02 to 1.24) in work package 1, but did not increase the probability of other safety outcomes in work package 3. The results from work package 2 suggested that bisphosphonate use increased fracture risk (hazard ratio 1.25, 95% confidence interval 1.13 to 1.39) for hip fractures, but sensitivity analyses suggested that this was related to unresolved confounding. Conversely, work package 4 suggested that bisphosphonates improved bone mineral density, with an average 2.65% (95% confidence interval 1.32% to 3.99%) greater gain in femoral neck bone mineral density per year in bisphosphonate users than in matched non-users. LIMITATIONS: Confounding by indication was a concern for the clinical effectiveness (i.e. work package 2) data. Bias analyses suggested that these findings were due to inappropriate adjustment for pre-treatment risk. work packages 3 and 4 were based on small numbers of events and participants, respectively. 
CONCLUSIONS: Bisphosphonates were associated with a 12% excess risk of chronic kidney disease progression in participants with stage 3B+ chronic kidney disease. No other safety concerns were identified. Bisphosphonate therapy increased bone mineral density, but the research team failed to demonstrate antifracture effectiveness. FUTURE WORK: Randomised controlled trial data are needed to demonstrate antifracture efficacy in patients with stage 3B+ chronic kidney disease. More safety analyses are needed to characterise the renal toxicity of bisphosphonates in stage 3A chronic kidney disease, possibly using observational data. STUDY REGISTRATION: This study is registered as EUPAS10029. FUNDING: This project was funded by the National Institute for Health Research (NIHR) Health Technology Assessment programme and will be published in full in Health Technology Assessment; Vol. 25, No. 17. See the NIHR Journals Library website for further project information. The project was also supported by the National Institute for Health Research Biomedical Research Centre, Oxford.


RATIONALE AND AIMS: Bisphosphonates are used to prevent fractures in people with fragile bones. People with chronic kidney disease have a high risk of fracturing, but the safety and effectiveness of bisphosphonates in severe chronic kidney disease is unclear. The aim of this study was to assess the benefits (e.g. bone strength improvement and fracture prevention) and the risks of unwanted effects associated with bisphosphonates for people with moderate to severe chronic kidney disease. METHODS: Anonymised primary and secondary care electronic medical records data from the UK NHS were used, as well as a Danish equivalent that included bone density scans. Anyone in these databases with a measure of reduced kidney function that suggested moderate to severe chronic kidney disease was eligible; this amounted to more than 220,000 people from the UK. Over 20,000 of them used bisphosphonates. Bisphosphonate users were matched to non-users with similar age, sex and other characteristics. RESULTS: Bisphosphonate users had a 12% higher risk of their chronic kidney disease getting worse than non-users. Their risks of other side effects, such as acute kidney injuries and gastrointestinal problems, did not change. Bisphosphonate users had a 25% higher risk of fractures than non-users in the UK database, probably because the matching methods did not create similar-enough groups of users and non-users. However, bisphosphonates were found to improve bone density in the Danish database. Bone density is a proxy for bone strength, so better bone density should mean fewer fractures. CONCLUSIONS: These results suggest that bisphosphonate therapy may make moderate to severe chronic kidney disease worse. More studies are needed on how bisphosphonates affect milder chronic kidney disease. Bisphosphonates were associated with better bone strength, but it could not be demonstrated that they reduced fracture risk. More data are required, probably from a placebo-controlled trial, to determine whether or not bisphosphonates prevent fractures in people with moderate to severe chronic kidney disease and whether or not this is worth the risk of their chronic kidney disease worsening.


Subject(s)
Bone Fractures, Chronic Renal Insufficiency, Cohort Studies, Diphosphonates/adverse effects, Bone Fractures/epidemiology, Humans, Propensity Score, Chronic Renal Insufficiency/complications, Chronic Renal Insufficiency/epidemiology
17.
Heart ; 107(11): 902-908, 2021 06.
Article in English | MEDLINE | ID: mdl-33692093

ABSTRACT

OBJECTIVE: To improve the echocardiographic assessment of heart failure in patients with atrial fibrillation (AF) by comparing conventional averaging of consecutive beats with an index-beat approach, whereby measurements are taken after two cycles with similar R-R interval. METHODS: Transthoracic echocardiography was performed using a standardised and blinded protocol in patients enrolled in the RATE-AF (RAte control Therapy Evaluation in permanent Atrial Fibrillation) randomised trial. We compared reproducibility of the index-beat and conventional consecutive-beat methods to calculate left ventricular ejection fraction (LVEF), global longitudinal strain (GLS) and E/e' (mitral E wave max/average diastolic tissue Doppler velocity), and assessed intraoperator/interoperator variability, time efficiency and validity against natriuretic peptides. RESULTS: 160 patients were included, 46% of whom were women, with a median age of 75 years (IQR 69-82) and a median heart rate of 100 beats per minute (IQR 86-112). The index-beat had the lowest within-beat coefficient of variation for LVEF (32%, vs 51% for 5 consecutive beats and 53% for 10 consecutive beats), GLS (26%, vs 43% and 42%) and E/e' (25%, vs 41% and 41%). Intraoperator (n=50) and interoperator (n=18) reproducibility were both superior for index-beats and this method was quicker to perform (p<0.001): 35.4 s to measure E/e' (95% CI 33.1 to 37.8) compared with 44.7 s for 5-beat (95% CI 41.8 to 47.5) and 98.1 s for 10-beat (95% CI 91.7 to 104.4) analyses. Using a single index-beat did not compromise the association of LVEF, GLS or E/e' with natriuretic peptide levels. CONCLUSIONS: Compared with averaging of multiple beats in patients with AF, the index-beat approach improves reproducibility and saves time without a negative impact on validity, potentially improving the diagnosis and classification of heart failure in patients with AF.
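The reproducibility metric reported above is a within-patient coefficient of variation for each beat-selection strategy; a minimal sketch with assumed column names is shown below.

```python
import pandas as pd

def within_patient_cv(measurements: pd.DataFrame) -> pd.Series:
    # measurements: one row per patient x beat with columns "patient" and "lvef"
    # (repeat for GLS or E/e' by swapping the value column).
    per_patient = measurements.groupby("patient")["lvef"].agg(["mean", "std"])
    return 100 * per_patient["std"] / per_patient["mean"]   # percent CV per patient
```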


Subject(s)
Atrial Fibrillation/physiopathology, Pulsed Doppler Echocardiography, Heart Failure/diagnosis, Aged, Aged 80 and over, Biomarkers/blood, Diastole/physiology, Female, Humans, Male, Brain Natriuretic Peptide/blood, Peptide Fragments/blood, Reproducibility of Results, Stroke Volume/physiology, Systole/physiology, Left Ventricular Function/physiology
18.
Rheumatology (Oxford) ; 60(10): 4832-4843, 2021 10 02.
Article in English | MEDLINE | ID: mdl-33560340

ABSTRACT

OBJECTIVES: Better indicators from affordable, sustainable data sources are needed to monitor population burden of musculoskeletal conditions. We propose five indicators of musculoskeletal health and assessed if routinely available primary care electronic health records (EHR) can estimate population levels in musculoskeletal consulters. METHODS: We collected validated patient-reported measures of pain experience, function and health status through a local survey of adults (≥35 years) presenting to English general practices over 12 months for low back pain, shoulder pain, osteoarthritis and other regional musculoskeletal disorders. Using EHR data we derived and validated models for estimating population levels of five self-reported indicators: prevalence of high impact chronic pain, overall musculoskeletal health (based on Musculoskeletal Health Questionnaire), quality of life (based on EuroQoL health utility measure), and prevalence of moderate-to-severe low back pain and moderate-to-severe shoulder pain. We applied models to a national EHR database (Clinical Practice Research Datalink) to obtain national estimates of each indicator for three successive years. RESULTS: The optimal models included recorded demographics, deprivation, consultation frequency, analgesic and antidepressant prescriptions, and multimorbidity. Applying models to national EHR, we estimated that 31.9% of adults (≥35 years) presenting with non-inflammatory musculoskeletal disorders in England in 2016/17 experienced high impact chronic pain. Estimated population health levels were worse in women, older aged and those in the most deprived neighbourhoods, and changed little over 3 years. CONCLUSION: National and subnational estimates for a range of subjective indicators of non-inflammatory musculoskeletal health conditions can be obtained using information from routine electronic health records.
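The estimation approach described above fits a model for each self-reported indicator in the surveyed sample using routinely recorded EHR features, then applies it to a national EHR cohort; a hedged sketch with illustrative feature names follows.

```python
import pandas as pd
import statsmodels.formula.api as smf

FEATURES = ("age + sex + deprivation + consult_freq + "
            "analgesic_rx + antidepressant_rx + multimorbidity")   # assumed names

def estimate_population_prevalence(survey: pd.DataFrame, national: pd.DataFrame) -> float:
    # survey: linked sample with the self-reported indicator "high_impact_pain" (0/1);
    # national: the EHR cohort with the same predictor columns but no survey outcome.
    model = smf.logit(f"high_impact_pain ~ {FEATURES}", data=survey).fit(disp=False)
    return float(model.predict(national).mean())   # model-based prevalence estimate
```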


Subject(s)
Cost of Illness, Musculoskeletal Diseases/epidemiology, Adult, Age Factors, Aged, Aged 80 and over, Electronic Health Records/statistics & numerical data, England/epidemiology, Female, Humans, Male, Middle Aged, Statistical Models, Primary Health Care/statistics & numerical data, Sex Factors, Surveys and Questionnaires
19.
J Bone Miner Res ; 36(5): 820-832, 2021 05.
Article in English | MEDLINE | ID: mdl-33373491

ABSTRACT

Bisphosphonates are the first-line treatment for preventing fractures in osteoporosis patients. However, their use is contraindicated or cautioned against in chronic kidney disease (CKD) patients, primarily because of a lack of information about their safety and effectiveness. We aimed to investigate the safety of oral bisphosphonates in patients with moderate to severe CKD, using primary-care electronic records from two cohorts, CPRD GOLD (1997-2016) and SIDIAP (2007-2015), in the UK and Catalonia, respectively. Both databases were linked to hospital records. SIDIAP was also linked to end-stage renal disease registry data. Patients with CKD stages 3b to 5, based on two or more estimated glomerular filtration rate measurements less than 45 mL/min/1.73 m2, aged 40 years or older, were identified. New bisphosphonate users were propensity score-matched with up to five non-users to minimize confounding within this population. Our primary outcome was CKD stage worsening (estimated glomerular filtration rate [eGFR] decline or renal replacement therapy). Secondary outcomes were acute kidney injury, gastrointestinal bleeding/ulcers, and severe hypocalcemia. Hazard ratios (HRs) were estimated using Cox regression, and Fine and Gray sub-HRs were calculated for competing risks. We matched 2447 bisphosphonate users with 8931 non-users from CPRD and 1399 users with 6547 non-users from SIDIAP. Bisphosphonate use was associated with greater risk of CKD progression in CPRD (sub-HR [95% CI]: 1.14 [1.04, 1.26]) and SIDIAP (sub-HR: 1.15 [1.04, 1.27]). No risk differences were found for acute kidney injury, gastrointestinal bleeding/ulcers, or hypocalcemia. Hence, a modest (15%) increased risk of CKD progression was identified in association with bisphosphonate use. No other safety concerns were identified. Our findings should be considered before prescribing bisphosphonates to patients with moderate to severe CKD. © 2020 American Society for Bone and Mineral Research (ASBMR).


Subject(s)
Osteoporosis, Chronic Renal Insufficiency, Cohort Studies, Diphosphonates/adverse effects, Glomerular Filtration Rate, Humans, Osteoporosis/drug therapy, Osteoporosis/epidemiology, Chronic Renal Insufficiency/complications, Chronic Renal Insufficiency/drug therapy, Chronic Renal Insufficiency/epidemiology, Risk Factors
20.
JAMA ; 324(24): 2497-2508, 2020 12 22.
Article in English | MEDLINE | ID: mdl-33351042

ABSTRACT

Importance: There is little evidence to support selection of heart rate control therapy in patients with permanent atrial fibrillation, in particular those with coexisting heart failure. Objective: To compare low-dose digoxin with bisoprolol (a β-blocker). Design, Setting, and Participants: Randomized, open-label, blinded end-point clinical trial including 160 patients aged 60 years or older with permanent atrial fibrillation (defined as no plan to restore sinus rhythm) and dyspnea classified as New York Heart Association class II or higher. Patients were recruited from 3 hospitals and primary care practices in England from 2016 through 2018; last follow-up occurred in October 2019. Interventions: Digoxin (n = 80; dose range, 62.5-250 µg/d; mean dose, 161 µg/d) or bisoprolol (n = 80; dose range, 1.25-15 mg/d; mean dose, 3.2 mg/d). Main Outcomes and Measures: The primary end point was patient-reported quality of life using the 36-Item Short Form Health Survey physical component summary score (SF-36 PCS) at 6 months (higher scores are better; range, 0-100), with a minimal clinically important difference of 0.5 SD. There were 17 secondary end points (including resting heart rate, modified European Heart Rhythm Association [EHRA] symptom classification, and N-terminal pro-brain natriuretic peptide [NT-proBNP] level) at 6 months, 20 end points at 12 months, and adverse event (AE) reporting. Results: Among 160 patients (mean age, 76 [SD, 8] years; 74 [46%] women; mean baseline heart rate, 100/min [SD, 18/min]), 145 (91%) completed the trial and 150 (94%) were included in the analysis for the primary outcome. There was no significant difference in the primary outcome of normalized SF-36 PCS at 6 months (mean, 31.9 [SD, 11.7] for digoxin vs 29.7 [11.4] for bisoprolol; adjusted mean difference, 1.4 [95% CI, -1.1 to 3.8]; P = .28). Of the 17 secondary outcomes at 6 months, there were no significant between-group differences for 16 outcomes, including resting heart rate (a mean of 76.9/min [SD, 12.1/min] with digoxin vs a mean of 74.8/min [SD, 11.6/min] with bisoprolol; difference, 1.5/min [95% CI, -2.0 to 5.1/min]; P = .40). The modified EHRA class was significantly different between groups at 6 months; 53% of patients in the digoxin group reported a 2-class improvement vs 9% of patients in the bisoprolol group (adjusted odds ratio, 10.3 [95% CI, 4.0 to 26.6]; P < .001). At 12 months, 8 of 20 outcomes were significantly different (all favoring digoxin), with a median NT-proBNP level of 960 pg/mL (interquartile range, 626 to 1531 pg/mL) in the digoxin group vs 1250 pg/mL (interquartile range, 847 to 1890 pg/mL) in the bisoprolol group (ratio of geometric means, 0.77 [95% CI, 0.64 to 0.92]; P = .005). Adverse events were less common with digoxin; 20 patients (25%) in the digoxin group had at least 1 AE vs 51 patients (64%) in the bisoprolol group (P < .001). There were 29 treatment-related AEs and 16 serious AEs in the digoxin group vs 142 and 37, respectively, in the bisoprolol group. Conclusions and Relevance: Among patients with permanent atrial fibrillation and symptoms of heart failure treated with low-dose digoxin or bisoprolol, there was no statistically significant difference in quality of life at 6 months. These findings support potentially basing decisions about treatment on other end points. Trial Registration: ClinicalTrials.gov Identifier: NCT02391337 and clinicaltrialsregister.eu Identifier: 2015-005043-13.
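An adjusted mean difference of the kind reported for the primary outcome above is commonly obtained by ANCOVA (regressing the 6-month score on treatment and the baseline score), with the minimal clinically important difference taken as 0.5 SD; the sketch below is a generic illustration with assumed column names, not the trial's statistical code.

```python
import pandas as pd
import statsmodels.formula.api as smf

def adjusted_mean_difference(df: pd.DataFrame):
    # df columns assumed: "pcs_6m", "pcs_baseline", and "digoxin" (1 = digoxin, 0 = bisoprolol).
    model = smf.ols("pcs_6m ~ digoxin + pcs_baseline", data=df).fit()
    diff = model.params["digoxin"]                 # adjusted between-group mean difference
    ci_low, ci_high = model.conf_int().loc["digoxin"]
    mcid = 0.5 * df["pcs_6m"].std()                # minimal clinically important difference
    return diff, (ci_low, ci_high), mcid
```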


Subject(s)
Anti-Arrhythmia Agents/therapeutic use, Atrial Fibrillation/drug therapy, Bisoprolol/therapeutic use, Digoxin/therapeutic use, Heart Rate/drug effects, Quality of Life, Adrenergic beta-1 Receptor Antagonists/therapeutic use, Aged, Aged 80 and over, Anti-Arrhythmia Agents/adverse effects, Anti-Arrhythmia Agents/pharmacology, Atrial Fibrillation/complications, Atrial Fibrillation/physiopathology, Bisoprolol/adverse effects, Bisoprolol/pharmacology, Digoxin/adverse effects, Digoxin/pharmacology, Female, Heart Failure/complications, Heart Failure/drug therapy, Humans, Male, Middle Aged, Single-Blind Method, Stroke Volume