Results 1 - 20 of 29
1.
J Bone Miner Res ; 2024 Apr 15.
Article in English | MEDLINE | ID: mdl-38619297

ABSTRACT

Evidence on the comparative effectiveness of osteoporosis treatments is heterogeneous. This may be attributed to different populations and clinical practice, but also to differing methodologies ensuring comparability of treatment groups before treatment effect estimation and the amount of residual confounding by indication. This study assessed the comparability of denosumab vs oral bisphosphonate (OBP) groups using propensity score (PS) methods and negative control outcome (NCO) analysis. A total of 280 288 women aged ≥50 years initiating denosumab or OBP in 2011-2018 were included from the UK Clinical Practice Research Datalink (CPRD) and the Danish National Registries (DNR). Balance of observed covariates was assessed using absolute standardised mean difference (ASMD) before and after PS weighting, matching, and stratification, with ASMD >0.1 indicating imbalance. Residual confounding was assessed using NCOs with ≥100 events. Hazard ratios (HRs) and 95% confidence intervals (CIs) between treatment and each NCO were estimated using Cox models. Presence of residual confounding was evaluated with two approaches: (1) >5% of NCOs with a 95% CI excluding 1; (2) >5% of NCOs with an upper CI <0.75 or lower CI >1.3. The number of imbalanced covariates before adjustment (CPRD 22/87; DNR 18/83) decreased, with 2-11% imbalance remaining after weighting, matching or stratification. Using approach 1, residual confounding was present for all PS methods in both databases (≥8% of NCOs), except for stratification in DNR (3.8%). Using approach 2, residual confounding was present in CPRD with PS matching (5.3%) and stratification (6.4%), but not with weighting (4.3%). Within DNR, no NCOs had HR estimates with upper or lower CI limits beyond the specified bounds indicating residual confounding for any PS method. Achievement of covariate balance and determination of residual bias were dependent upon several factors including the population under study, PS method, prevalence of the NCOs, and the threshold indicating residual confounding.
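The covariate-balance check described here can be sketched in a few lines. This is a minimal illustration, not the study's code: it assumes a pandas DataFrame with a hypothetical binary `treated` column (1 = denosumab, 0 = oral bisphosphonate) and numeric baseline covariates, estimates the propensity score with logistic regression, and compares absolute standardised mean differences before and after inverse-probability-of-treatment weighting against the 0.1 threshold used in the abstract.

```python
import numpy as np
import pandas as pd
from sklearn.linear_model import LogisticRegression

def asmd(x, w, t):
    """Weighted absolute standardised mean difference for one covariate."""
    m1 = np.average(x[t == 1], weights=w[t == 1])
    m0 = np.average(x[t == 0], weights=w[t == 0])
    v1 = np.average((x[t == 1] - m1) ** 2, weights=w[t == 1])
    v0 = np.average((x[t == 0] - m0) ** 2, weights=w[t == 0])
    return abs(m1 - m0) / np.sqrt((v1 + v0) / 2)

def balance_table(df, covariates, treatment="treated"):
    X, t = df[covariates].to_numpy(), df[treatment].to_numpy()
    ps = LogisticRegression(max_iter=1000).fit(X, t).predict_proba(X)[:, 1]
    iptw = np.where(t == 1, 1.0 / ps, 1.0 / (1.0 - ps))   # inverse-probability-of-treatment weights
    ones = np.ones(len(df))                               # unweighted comparison, for "before"
    rows = {c: {"asmd_before": asmd(df[c].to_numpy(), ones, t),
                "asmd_after_iptw": asmd(df[c].to_numpy(), iptw, t)}
            for c in covariates}
    out = pd.DataFrame(rows).T
    out["imbalanced_after"] = out["asmd_after_iptw"] > 0.1  # >0.1 flags imbalance, as in the abstract
    return out
```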


Treatment groups in clinical practice may not be comparable, as patient characteristics differ according to the need for the prescribed medication, known as confounding. We assessed the comparability of two common osteoporosis treatments, denosumab and oral bisphosphonate, in 280 288 postmenopausal women using electronic health records from the UK Clinical Practice Research Datalink (CPRD) and the Danish National Registries (DNR). We evaluated comparability of recorded patient characteristics with three propensity score (PS) methods: matching, stratification, and weighting. We assessed residual confounding from unrecorded patient characteristics via negative control outcomes (NCOs), events known not to be associated with treatment, such as delirium. We found that achieving comparability of osteoporosis treatment groups depended on the study population, PS method, and definition of residual confounding. Weighting and stratification performed the best in DNR and CPRD, respectively. Using a stricter threshold based on statistical significance for the NCOs suggested the treatment groups were not comparable, except for PS stratification in DNR. Applying clinically significant thresholds of treatment effect size showed comparability using weighting in CPRD and all PS methods in DNR. Studies should consider more than one PS method to test robustness and identify the largest possible number of NCOs to give the greatest flexibility in detecting residual confounding.

2.
Drug Saf ; 47(2): 117-123, 2024 Feb.
Article in English | MEDLINE | ID: mdl-38019365

ABSTRACT

The use of artificial intelligence (AI)-based tools to guide prescribing decisions is full of promise and may enhance patient outcomes. These tools can perform actions such as choosing the 'safest' medication, choosing between competing medications, promoting de-prescribing or even predicting non-adherence. These tools can exist in a variety of formats; for example, they may be directly integrated into electronic medical records, or they may exist as stand-alone websites accessible through a web browser. One potential impact of these tools is that they could manipulate our understanding of the benefit-risk of medicines in the real world. Currently, the benefit-risk of approved medications is assessed according to carefully planned agreements covering spontaneous reporting systems and planned surveillance studies. But AI-based tools may limit or even block prescription to high-risk patients or prevent off-label use. The uptake and temporal availability of these tools may be uneven across healthcare systems and geographies, creating artefacts in data that are difficult to account for. It is also hard to estimate the 'true impact' that a tool had on a prescribing decision. International borders may also be highly porous to these tools, especially in cases where tools are available over the web. These tools already exist, and their use is likely to increase in the coming years. How they can be accounted for in benefit-risk decisions is yet to be seen.


Subjects
Artificial Intelligence, Delivery of Health Care, Humans, Drug Prescriptions, Electronic Health Records, Risk Assessment
3.
Front Pharmacol ; 14: 988605, 2023.
Article in English | MEDLINE | ID: mdl-37033623

ABSTRACT

Purpose: Surgeon and hospital-related features, such as volume, can be associated with treatment choices and outcomes. Accounting for these covariates with propensity score (PS) analysis can be challenging due to the clustered nature of the data. We studied six different PS estimation strategies for clustered data using random effects modelling (REM) compared with logistic regression. Methods: Monte Carlo simulations were used to generate variable cluster-level confounding intensity [odds ratio (OR) = 1.01-2.5] and cluster size (20-1,000 patients per cluster). The following PS estimation strategies were compared: i) logistic regression omitting cluster-level confounders; ii) logistic regression including cluster-level confounders; iii) the same as ii) but including cross-level interactions; iv), v), and vi), similar to i), ii), and iii), respectively, but using REM instead of logistic regression. The same strategies were tested in a trial emulation of partial versus total knee replacement (TKR) surgery, where observational versus trial-based estimates were compared as a proxy for bias. Performance metrics included bias and mean square error (MSE). Results: In most simulated scenarios, logistic regression, including cluster-level confounders, led to the lowest bias and MSE, for example, with 50 clusters × 200 individuals and confounding intensity OR = 1.5, a relative bias of 10%, and MSE of 0.003 for (i) compared to 32% and 0.010 for (iv). The results from the trial emulation also gave similar trends. Conclusion: Logistic regression, including patient and surgeon-/hospital-level confounders, appears to be the preferred strategy for PS estimation.
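A rough sketch of three of the six propensity-score estimation strategies compared in this study (the logistic-regression ones; the random-effects counterparts would add a random intercept per cluster and are not shown). The simulated data and variable names are illustrative only, loosely mimicking one simulation scenario rather than the paper's actual data-generating mechanism.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n_clusters, n_per = 50, 200
cluster = np.repeat(np.arange(n_clusters), n_per)
hospital_volume = np.repeat(rng.normal(size=n_clusters), n_per)  # cluster-level confounder
x1 = rng.normal(size=n_clusters * n_per)                         # patient-level covariate
logit_p = -0.5 + 0.8 * x1 + np.log(1.5) * hospital_volume        # OR = 1.5 confounding intensity
treat = rng.binomial(1, 1 / (1 + np.exp(-logit_p)))
df = pd.DataFrame(dict(cluster=cluster, hospital_volume=hospital_volume, x1=x1, treat=treat))

# (i) patient-level covariates only (cluster-level confounder omitted)
ps_i = smf.logit("treat ~ x1", data=df).fit(disp=0)
# (ii) patient-level plus cluster-level confounders
ps_ii = smf.logit("treat ~ x1 + hospital_volume", data=df).fit(disp=0)
# (iii) as (ii), adding cross-level interactions
ps_iii = smf.logit("treat ~ x1 * hospital_volume", data=df).fit(disp=0)

# The abstract favours logistic regression with cluster-level confounders included
df["ps"] = ps_ii.predict(df)
```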

4.
Front Pharmacol ; 14: 1118203, 2023.
Article in English | MEDLINE | ID: mdl-37033631

ABSTRACT

Background: Thrombosis with thrombocytopenia syndrome (TTS) has been identified as a rare adverse event following some COVID-19 vaccines. Various guidelines have been issued on the treatment of TTS. We aimed to characterize the treatment of TTS and other thromboembolic events (venous thromboembolism [VTE] and arterial thromboembolism [ATE]) after COVID-19 vaccination and to compare it with historical (pre-vaccination) data in Europe and the US. Methods: We conducted an international network cohort study using 8 primary care, outpatient, and inpatient databases from France, Germany, the Netherlands, Spain, the United Kingdom, and the United States. We investigated treatment pathways after the diagnosis of TTS, VTE, or ATE for a pre-vaccination (background) cohort (01/2017-11/2020) and a vaccinated cohort of people followed for 28 days after a dose of any COVID-19 vaccine recorded from 12/2020 onwards. Results: Great variability was observed in the proportion of people treated (with any recommended therapy) across databases, both before and after vaccination. Most patients with TTS received heparins, platelet aggregation inhibitors, or direct Xa inhibitors. The majority of VTE patients (before and after vaccination) were first treated with heparins in inpatient settings and direct Xa inhibitors in outpatient settings. In ATE patients, treatments were also similar before and after vaccination, with platelet aggregation inhibitors prescribed most frequently. Inpatient and claims data also showed substantial heparin use. Conclusion: TTS, VTE, and ATE after COVID-19 vaccination were treated similarly to background events. Heparin use in post-vaccine TTS suggests most events were not identified as vaccine-induced thrombosis with thrombocytopenia by the treating clinicians.

5.
BMC Geriatr ; 23(1): 58, 2023 01 31.
Article in English | MEDLINE | ID: mdl-36721104

ABSTRACT

BACKGROUND: While several definitions exist for multimorbidity, frailty or polypharmacy, it remains unclear to what extent single healthcare markers capture the complexity of health-related needs in older people in the community. We aimed to identify and characterise older people with complex health needs based on healthcare resource use (unplanned hospitalisations or polypharmacy) or frailty using large population-based linked records. METHODS: In this cohort study, data were extracted from UK primary care records (CPRD GOLD), with linked Hospital Episode Statistics inpatient data. People aged >65 years on 1 January 2010 who had been registered in CPRD for ≥1 year were included. We identified complex health needs as the top quintile of unplanned hospitalisations, number of prescribed medicines, and electronic frailty index. We characterised all three cohorts, and quantified point-prevalence and incidence rates of preventive medicines use. RESULTS: Overall, 90,597, 110,225 and 116,076 individuals were included in the hospitalisation, frailty, and polypharmacy cohorts respectively; 28,259 (5.9%) were in all three cohorts, while 277,332 (58.3%) were not in any (background population). The frailty and polypharmacy cohorts had the highest bi-directional overlap. Most comorbidities, such as diabetes and chronic kidney disease, were more common in the frailty and polypharmacy cohorts than in the hospitalisation cohort. Generally, prevalence of preventive medicines use was highest in the polypharmacy cohort compared with the other two cohorts: for instance, one-year point-prevalence of statins was 64.2% in the polypharmacy cohort vs. 60.5% in the frailty cohort. CONCLUSIONS: Three distinct groups of older people with complex health needs were identified. Compared to the hospitalisation cohort, the frailty and polypharmacy cohorts had more comorbidities and higher preventive therapies use. Research is needed into the benefit-risk of different definitions of complex health needs and use of preventive therapies in the older population.
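A minimal sketch of the cohort definitions above, assuming a DataFrame with hypothetical columns unplanned_admissions, n_prescribed_medicines and efi_score; "top quintile" is taken here as values at or above the 80th percentile, which may differ in detail from the study's implementation.

```python
import pandas as pd

def flag_cohorts(df: pd.DataFrame) -> pd.DataFrame:
    """Flag membership of the three 'complex health needs' cohorts and the background group."""
    out = df.copy()
    for col, flag in [("unplanned_admissions", "hosp_cohort"),
                      ("n_prescribed_medicines", "poly_cohort"),
                      ("efi_score", "frail_cohort")]:
        out[flag] = out[col] >= out[col].quantile(0.8)   # top quintile
    flags = out[["hosp_cohort", "poly_cohort", "frail_cohort"]]
    out["in_all_three"] = flags.all(axis=1)
    out["background"] = ~flags.any(axis=1)
    return out
```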


Subjects
Frailty, Humans, Aged, Cohort Studies, Frailty/diagnosis, Frailty/epidemiology, Semantic Web, Hospitals, Primary Health Care, United Kingdom/epidemiology
6.
Rheumatology (Oxford) ; 62(11): 3592-3600, 2023 11 02.
Article in English | MEDLINE | ID: mdl-36688706

ABSTRACT

OBJECTIVES: To explore clustering of comorbidities among patients with a new diagnosis of OA and estimate the 10-year mortality risk for each identified cluster. METHODS: This is a population-based cohort study of individuals with first incident diagnosis of OA of the hip, knee, ankle/foot, wrist/hand or 'unspecified' site between 2006 and 2020, using SIDIAP (a primary care database representative of Catalonia, Spain). At the time of OA diagnosis, conditions associated with OA in the literature that were found in ≥1% of the individuals (n = 35) were fitted into two cluster algorithms, k-means and latent class analysis. Models were assessed using a range of internal and external evaluation procedures. Mortality risk of the obtained clusters was assessed by survival analysis using Cox proportional hazards. RESULTS: We identified 633 330 patients with a diagnosis of OA. Our proposed best solution used latent class analysis to identify four clusters: 'low-morbidity' (relatively low number of comorbidities), 'back/neck pain plus mental health', 'metabolic syndrome' and 'multimorbidity' (higher prevalence of all studied comorbidities). Compared with the 'low-morbidity' cluster, the 'multimorbidity' cluster had the highest risk of 10-year mortality (adjusted hazard ratio [HR]: 2.19 [95% CI: 2.15, 2.23]), followed by the 'metabolic syndrome' cluster (adjusted HR: 1.24 [95% CI: 1.22, 1.27]) and the 'back/neck pain plus mental health' cluster (adjusted HR: 1.12 [95% CI: 1.09, 1.15]). CONCLUSION: Patients with a new diagnosis of OA can be clustered into groups based on their comorbidity profile, with significant differences in 10-year mortality risk. Further research is required to understand the interplay between OA and particular comorbidity groups, and the clinical significance of such results.
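As a sketch of the clustering-plus-survival workflow described above (k-means shown; the latent class analysis used for the paper's preferred solution requires a dedicated LCA implementation and is not shown). It assumes a 0/1 comorbidity matrix and a survival DataFrame with hypothetical numeric columns time_years and died plus any adjustment covariates.

```python
import pandas as pd
from sklearn.cluster import KMeans
from lifelines import CoxPHFitter

def cluster_and_model(comorb: pd.DataFrame, surv: pd.DataFrame, k: int = 4) -> pd.DataFrame:
    """Cluster patients on baseline comorbidities, then model 10-year mortality by cluster."""
    clusters = KMeans(n_clusters=k, n_init=10, random_state=0).fit_predict(comorb)
    data = surv.copy()
    data["cluster"] = clusters
    # One-hot encode clusters; the dropped dummy (cluster 0) acts as the reference group
    data = pd.get_dummies(data, columns=["cluster"], drop_first=True, dtype=float)
    cph = CoxPHFitter()
    cph.fit(data, duration_col="time_years", event_col="died")
    return cph.summary   # hazard ratios appear as exp(coef) for each cluster dummy
```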


Subjects
Hip Osteoarthritis, Knee Osteoarthritis, Humans, Spain/epidemiology, Knee Osteoarthritis/epidemiology, Cohort Studies, Neck Pain, Hip Osteoarthritis/epidemiology, Hip Osteoarthritis/diagnosis, Comorbidity
7.
Nat Commun ; 13(1): 7167, 2022 11 23.
Article in English | MEDLINE | ID: mdl-36418291

ABSTRACT

Population-based studies can provide important evidence on the safety of COVID-19 vaccines. Using data from the United Kingdom, here we compare observed rates of thrombosis and thrombocytopenia following vaccination against SARS-CoV-2 and infection with SARS-CoV-2 with background (expected) rates in the general population. First- and second-dose cohorts for ChAdOx1 or BNT162b2 between 8 December 2020 and 2 May 2021 in the United Kingdom were identified. A further cohort consisted of people with no prior COVID-19 vaccination who were infected with SARS-CoV-2, identified by a first positive PCR test between 1 September 2020 and 2 May 2021. The fourth cohort, used for general population background rates, included people present in the database as of 1 January 2017. In total, we included 3,768,517 ChAdOx1 and 1,832,841 BNT162b2 vaccinees, 401,691 people infected with SARS-CoV-2, and 9,414,403 people from the general population. An increased risk of venous thromboembolism was seen after a first dose of ChAdOx1 (standardized incidence ratio: 1.12 [95% CI: 1.05 to 1.20]), BNT162b2 (1.12 [1.03 to 1.21]), and a positive PCR test (7.27 [6.86 to 7.72]). Rates of cerebral venous sinus thrombosis were higher than otherwise expected after a first dose of ChAdOx1 (4.14 [2.54 to 6.76]) and a SARS-CoV-2 PCR-positive test (3.74 [1.56 to 8.98]). Rates of arterial thromboembolism after vaccination were no higher than expected but were increased after a SARS-CoV-2 PCR-positive test (1.39 [1.21 to 1.61]). Rates of venous thromboembolism with thrombocytopenia were higher than expected after a SARS-CoV-2 PCR-positive test (5.76 [3.19 to 10.40]).
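The standardised incidence ratios reported here compare observed event counts with the number expected from background rates applied to the cohort's person-time (summed over age/sex strata). A minimal sketch with purely illustrative numbers, using an exact Poisson interval on the observed count; the study's exact interval method may differ.

```python
from scipy.stats import chi2

def sir(observed: int, expected: float, alpha: float = 0.05):
    """Standardised incidence ratio with an exact (Garwood) Poisson CI on the observed count."""
    lower = chi2.ppf(alpha / 2, 2 * observed) / 2 if observed > 0 else 0.0
    upper = chi2.ppf(1 - alpha / 2, 2 * (observed + 1)) / 2
    return observed / expected, lower / expected, upper / expected

# Illustrative numbers only (not taken from the study): 300 events observed, 268 expected
print("SIR %.2f (95%% CI %.2f to %.2f)" % sir(300, 268.0))
```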


Subjects
COVID-19 Vaccines, COVID-19, Thrombocytopenia, Thrombosis, Venous Thromboembolism, Humans, BNT162 Vaccine, COVID-19/epidemiology, COVID-19/prevention & control, COVID-19 Vaccines/adverse effects, SARS-CoV-2, Thrombocytopenia/epidemiology, Thrombocytopenia/etiology, Thrombosis/epidemiology, Thrombosis/etiology, Vaccination/adverse effects, Venous Thromboembolism/epidemiology, Venous Thromboembolism/etiology, United Kingdom
8.
Nat Commun ; 13(1): 7169, 2022 11 23.
Article in English | MEDLINE | ID: mdl-36418321

ABSTRACT

Population-based studies can provide important evidence on the safety of COVID-19 vaccines. Here we compare rates of thrombosis and thrombocytopenia following vaccination against SARS-CoV-2 with the background (expected) rates in the general population. In addition, we compare the rates of the same adverse events among persons infected with SARS-CoV-2 with background rates. Primary care and linked hospital data from Catalonia, Spain, informed the study, with participants vaccinated with BNT162b2 or ChAdOx1 (27/12/2020-23/06/2021), COVID-19 cases (01/09/2020-23/06/2021), or present in the database as of 01/01/2017. We included 2,021,366 BNT162b2 (1,327,031 with 2 doses), 592,408 ChAdOx1, 174,556 COVID-19 cases, and 4,573,494 background participants. Standardised incidence ratios for venous thromboembolism were 1.18 (95% CI 1.06-1.32) and 0.92 (0.81-1.05) after first- and second-dose BNT162b2, and 0.92 (0.71-1.18) after first-dose ChAdOx1. The standardised incidence ratio for venous thromboembolism in COVID-19 was 10.19 (9.43-11.02). Standardised incidence ratios for arterial thromboembolism were 1.02 (0.95-1.09) and 1.04 (0.97-1.12) after first- and second-dose BNT162b2, 1.06 (0.91-1.23) after first-dose ChAdOx1, and 4.13 (3.83-4.45) for COVID-19. Standardised incidence ratios for thrombocytopenia were 1.49 (1.43-1.54) and 1.40 (1.35-1.45) after first- and second-dose BNT162b2, 1.28 (1.19-1.38) after first-dose ChAdOx1, and 4.59 (4.41-4.77) for COVID-19. While rates of thrombosis with thrombocytopenia were generally similar to background rates, the standardised incidence ratio for pulmonary embolism with thrombocytopenia after first-dose BNT162b2 was 1.70 (1.11-2.61). These findings suggest that the safety profiles of BNT162b2 and ChAdOx1 are similar, with rates of adverse events seen after vaccination typically similar to background rates. Meanwhile, rates of adverse events were much increased for COVID-19 cases, further underlining the importance of vaccination.


Subjects
COVID-19, Thrombocytopenia, Thrombosis, Venous Thromboembolism, Humans, SARS-CoV-2, Spain/epidemiology, Venous Thromboembolism/epidemiology, Venous Thromboembolism/etiology, COVID-19/epidemiology, COVID-19/prevention & control, COVID-19 Vaccines/adverse effects, BNT162 Vaccine, Thrombocytopenia/epidemiology, Thrombocytopenia/etiology, Thrombosis/epidemiology, Thrombosis/etiology, Vaccination/adverse effects
9.
J Thromb Haemost ; 20(12): 2887-2895, 2022 12.
Article in English | MEDLINE | ID: mdl-36111372

ABSTRACT

BACKGROUND: COVID-19 vaccination has been associated with increased venous thromboembolism (VTE) risk. However, it is unknown whether genetic predisposition to VTE is associated with an increased risk of thrombosis following vaccination. METHODS: Using data from the UK Biobank, which contains in-depth genotyping and linked vaccination and health outcomes information, we generated a polygenic risk score (PRS) using 299 genetic variants. We prospectively assessed associations between the PRS and incident VTE immediately after first- and second-dose vaccination, and in historical unvaccinated cohorts from the pre-pandemic and early pandemic periods. We estimated hazard ratios (HRs) for PRS-VTE associations using Cox models. RESULTS: Of 359 310 individuals receiving one dose of a COVID-19 vaccine, 160 327 (44.6%) were males, and the mean age at the vaccination date was 69.05 (standard deviation [SD] 8.04) years. After 28 and 90 days of follow-up, 88 and 299 individuals developed VTE, respectively, equivalent to incidence rates of 0.88 (95% confidence interval [CI] 0.70-1.08) and 0.92 (0.82-1.04) per 100 000 person-days. The PRS was significantly associated with a higher risk of VTE (HR per 1 SD increase in PRS, 1.41 [1.15-1.73] at 28 days and 1.36 [1.22-1.52] at 90 days). Similar associations were found in the historical unvaccinated cohorts. CONCLUSIONS: The strength of the association between genetic susceptibility and post-COVID-19-vaccination VTE is similar to that seen in historical data. Additionally, the observed PRS-VTE associations were equivalent for adenovirus- and mRNA-based vaccines. These findings suggest that, at the population level, VTE occurring after COVID-19 vaccination has a similar genetic etiology to conventional VTE.
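A minimal sketch of the PRS construction and the per-SD hazard ratio: the PRS is a weighted sum of risk-allele dosages (weights being published per-variant effect sizes), standardised before entering a Cox model. All column names are hypothetical; this is not the study's pipeline.

```python
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

def prs_hazard_ratio(dosages: pd.DataFrame, weights: pd.Series, surv: pd.DataFrame) -> float:
    """dosages: individuals x variants, risk-allele counts (0-2);
    weights: per-variant effect sizes (log odds ratios) indexed by variant id;
    surv: numeric columns follow_up_days and vte (event indicator) plus covariates."""
    prs = dosages.to_numpy() @ weights.loc[dosages.columns].to_numpy()
    data = surv.copy()
    data["prs_std"] = (prs - prs.mean()) / prs.std()   # standardised, so the HR is per 1 SD of PRS
    cph = CoxPHFitter().fit(data, duration_col="follow_up_days", event_col="vte")
    return float(np.exp(cph.params_["prs_std"]))
```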


Subjects
COVID-19 Vaccines, COVID-19, Venous Thromboembolism, Aged, Female, Humans, Male, Middle Aged, COVID-19/epidemiology, COVID-19/prevention & control, COVID-19 Vaccines/adverse effects, Genetic Predisposition to Disease, Risk Factors, Vaccination/adverse effects, Venous Thromboembolism/epidemiology, Venous Thromboembolism/etiology
10.
Front Pharmacol ; 13: 912361, 2022.
Article in English | MEDLINE | ID: mdl-35754470

ABSTRACT

Objective: To characterize the trend of opioid use (number of users, dispensations and oral morphine milligram equivalents) in Catalonia (Spain). Design, setting, and participants: This population-based cohort study included all individuals aged 18 years or older, registered in the Information System for Research in Primary Care (SIDIAP), which covers >75% of the population in Catalonia, Spain, from 1 January 2007 to 31 December 2019. Main exposure and outcomes: The exposures were all commercialized opioids and their combinations (ATC codes): codeine, tramadol, oxycodone, tapentadol, fentanyl, morphine, and other opioids (dihydrocodeine, hydromorphone, dextropropoxyphene, buprenorphine, pethidine, pentazocine). The main outcomes were the annual figures per 1,000 individuals of 1) opioid users, 2) dispensations, and 3) oral morphine milligram equivalents (MME). Results were stratified separately by opioid type, age (5-year age groups), sex (male or female), living area (rural or urban), and socioeconomic status (from least, U1, to most deprived, U5). The overall trends were quantified using the percentage change (PC) between 2007 and 2019. Results: Among the 4,656,197 residents in 2007 and 4,798,114 in 2019, the numbers of opioid users, dispensations and morphine milligram equivalents per 1,000 individuals increased by 12% (95% confidence interval [CI] 11.9-12.3%), 105% (95% CI 83-126%), and 339% (95% CI 289-390%), respectively. Tramadol represented the majority of opioid use in 2019 (61, 59, and 54% of opioid users, dispensations, and total MME, respectively). Individuals aged 80 years or over showed the sharpest increases in opioid users (PC: 162%), dispensations (PC: 424%), and MME (PC: 830%). Strong opioids were increasingly prescribed for non-cancer pain over the years. Conclusion: Despite the modest increase in opioid users, opioid dispensations and MME increased substantially, particularly in the older population. In addition, strong opioids were increasingly indicated for non-cancer pain over the years. These findings suggest a transition of opioid prescriptions from intermittent to chronic use and from weak to strong opioids, and call for more rigorous opioid stewardship.
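A sketch of the oral morphine milligram equivalent (MME) outcome: each dispensation's total milligrams are multiplied by an opioid-specific oral morphine conversion factor and summed. The factors below are commonly cited values used purely for illustration, not the study's exact conversion table, and the column names are hypothetical.

```python
import pandas as pd

# Oral MME per mg of drug (illustrative values, not the study's table)
CONVERSION = {"codeine": 0.15, "tramadol": 0.1, "oxycodone": 1.5,
              "tapentadol": 0.4, "morphine": 1.0}

def total_mme(dispensations: pd.DataFrame) -> float:
    """Expects columns: drug, n_units, mg_per_unit."""
    mg = dispensations["n_units"] * dispensations["mg_per_unit"]
    return float((mg * dispensations["drug"].map(CONVERSION)).sum())
```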

11.
Health Technol Assess ; 25(66): 1-126, 2021 11.
Article in English | MEDLINE | ID: mdl-34812138

ABSTRACT

BACKGROUND: Although routine NHS data potentially include all patients, confounding limits their use for causal inference. Methods to minimise confounding in observational studies of implantable devices are required to enable the evaluation of patients with severe systemic morbidity who are excluded from many randomised controlled trials. OBJECTIVES: Stage 1 - replicate the Total or Partial Knee Arthroplasty Trial (TOPKAT), a surgical randomised controlled trial comparing unicompartmental knee replacement with total knee replacement using propensity score and instrumental variable methods. Stage 2 - compare the risk benefits and cost-effectiveness of unicompartmental knee replacement with total knee replacement surgery in patients with severe systemic morbidity who would have been ineligible for TOPKAT using the validated methods from stage 1. DESIGN: This was a cohort study. SETTING: Data were obtained from the National Joint Registry database and linked to hospital inpatient (Hospital Episode Statistics) and patient-reported outcome data. PARTICIPANTS: Stage 1 - people undergoing unicompartmental knee replacement surgery or total knee replacement surgery who met the TOPKAT eligibility criteria. Stage 2 - participants with an American Society of Anesthesiologists grade of ≥ 3. INTERVENTION: The patients were exposed to either unicompartmental knee replacement surgery or total knee replacement surgery. MAIN OUTCOME MEASURES: The primary outcome measure was the postoperative Oxford Knee Score. The secondary outcome measures were 90-day postoperative complications (venous thromboembolism, myocardial infarction and prosthetic joint infection) and 5-year revision risk and mortality. The main outcome measures for the health economic analysis were health-related quality of life (EuroQol-5 Dimensions) and NHS hospital costs. RESULTS: In stage 1, propensity score stratification and inverse probability weighting replicated the results of TOPKAT. Propensity score adjustment, propensity score matching and instrumental variables did not. Stage 2 included 2256 unicompartmental knee replacement patients and 57,682 total knee replacement patients who had severe comorbidities, of whom 145 and 23,344 had linked Oxford Knee Scores, respectively. A statistically significant but clinically irrelevant difference favouring unicompartmental knee replacement was observed, with a mean postoperative Oxford Knee Score difference of < 2 points using propensity score stratification; no significant difference was observed using inverse probability weighting. Unicompartmental knee replacement more than halved the risk of venous thromboembolism [relative risk 0.33 (95% confidence interval 0.15 to 0.74) using propensity score stratification; relative risk 0.39 (95% confidence interval 0.16 to 0.96) using inverse probability weighting]. Unicompartmental knee replacement was not associated with myocardial infarction or prosthetic joint infection using either method. In the long term, unicompartmental knee replacement had double the revision risk of total knee replacement [hazard ratio 2.70 (95% confidence interval 2.15 to 3.38) using propensity score stratification; hazard ratio 2.60 (95% confidence interval 1.94 to 3.47) using inverse probability weighting], but half of the mortality [hazard ratio 0.52 (95% confidence interval 0.36 to 0.74) using propensity score stratification; insignificant effect using inverse probability weighting]. 
Unicompartmental knee replacement had lower costs and higher quality-adjusted life-year gains than total knee replacement for stage 2 participants. LIMITATIONS: Although some propensity score methods successfully replicated TOPKAT, unresolved confounding may have affected stage 2. Missing Oxford Knee Scores may have led to information bias. CONCLUSIONS: Propensity score stratification and inverse probability weighting successfully replicated TOPKAT, implying that some (but not all) propensity score methods can be used to evaluate surgical innovations and implantable medical devices using routine NHS data. Unicompartmental knee replacement was safer and more cost-effective than total knee replacement for patients with severe comorbidity and should be considered the first option for suitable patients. FUTURE WORK: Further research is required to understand the performance of propensity score methods for evaluating surgical innovations and implantable devices. TRIAL REGISTRATION: This trial is registered as EUPAS17435. FUNDING: This project was funded by the National Institute for Health Research (NIHR) Health Technology Assessment programme and will be published in full in Health Technology Assessment; Vol. 25, No. 66. See the NIHR Journals Library website for further project information.


We compared the risks and benefits of partial and total knee replacements in NHS patients with a complex medical history who would normally be excluded from randomised trials on this topic. We used information that was collected during hospital appointments for people who had a knee replacement between 2009 and 2016. It is difficult to directly compare the two groups because each individual patient has a different medical history. We tested advanced statistical methods to account for these differences. In stage 1, we showed that some of these advanced statistical methods could replicate the results of a recently published surgical trial using routine data from the NHS. We compared patients in the trial with similar patients who were operated on in the NHS. Three of the proposed methods showed results similar to those obtained from the Total or Partial Knee Arthroplasty Trial (TOPKAT). In stage 2, we used the successful methods from stage 1 to study the risks, benefits and costs of partial and total knee replacement surgery in patients with complex medical histories. Two of the statistical methods found that patients who had a partial knee replacement had less self-reported pain and better function after surgery than patients who had a total knee replacement. All three methods found that partial knee replacement was safer, was associated with a lower risk of blood clots (a known complication of knee surgery) and had lower mortality over 5 years. However, patients who had a partial knee replacement were twice as likely as those with a total knee replacement to need a second surgery within 5 years. We found that partial knee replacements were less costly to the NHS and were associated with better overall quality of life for patients than total knee replacement.


Subjects
Knee Arthroplasty, Cohort Studies, Cost-Benefit Analysis, Humans, Propensity Score, Quality of Life, Quality-Adjusted Life Years
12.
JAMA ; 326(15): 1504-1515, 2021 10 19.
Article in English | MEDLINE | ID: mdl-34665205

ABSTRACT

Importance: Although tramadol is increasingly used to manage chronic noncancer pain, few safety studies have compared it with other opioids. Objective: To assess the associations of tramadol, compared with codeine, with mortality and other adverse clinical outcomes as used in outpatient settings. Design, Setting, and Participants: Retrospective, population-based, propensity score-matched cohort study using a primary care database with routinely collected medical records and pharmacy dispensations covering more than 80% of the population of Catalonia, Spain (≈6 million people). Patients 18 years or older with 1 or more year of available data and dispensation of tramadol or codeine (2007-2017) were included and followed up to December 31, 2017. Exposures: New prescription dispensation of tramadol or codeine (no dispensation in the previous year). Main Outcomes and Measures: Outcomes studied were all-cause mortality, cardiovascular events, fractures, constipation, delirium, falls, opioid abuse/dependence, and sleep disorders within 1 year after the first dispensation. Absolute rate differences (ARDs) and hazard ratios (HRs) with 95% confidence intervals were calculated using cause-specific Cox models. Results: Of the 1 093 064 patients with a tramadol or codeine dispensation during the study period (326 921 for tramadol, 762 492 for codeine, 3651 for both drugs concomitantly), a total of 368 960 patients (184 480 propensity score-matched pairs) were included after study exclusions and propensity score matching (mean age, 53.1 [SD, 16.1] years; 57.3% women). Compared with codeine, tramadol dispensation was significantly associated with a higher risk of all-cause mortality (incidence, 13.00 vs 5.61 per 1000 person-years; HR, 2.31 [95% CI, 2.08-2.56]; ARD, 7.37 [95% CI, 6.09-8.78] per 1000 person-years), cardiovascular events (incidence, 10.03 vs 8.67 per 1000 person-years; HR, 1.15 [95% CI, 1.05-1.27]; ARD, 1.36 [95% CI, 0.45-2.36] per 1000 person-years), and fractures (incidence, 12.26 vs 8.13 per 1000 person-years; HR, 1.50 [95% CI, 1.37-1.65]; ARD, 4.10 [95% CI, 3.02-5.29] per 1000 person-years). No significant difference was observed for the risk of falls, delirium, constipation, opioid abuse/dependence, or sleep disorders. Conclusions and Relevance: In this population-based cohort study, a new prescription dispensation of tramadol, compared with codeine, was significantly associated with a higher risk of subsequent all-cause mortality, cardiovascular events, and fractures, but there was no significant difference in the risk of constipation, delirium, falls, opioid abuse/dependence, or sleep disorders. The findings should be interpreted cautiously, given the potential for residual confounding.
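One common way to implement the propensity-score matching used in this kind of new-user comparison is nearest-neighbour matching on the logit of the PS with a caliper. The sketch below is a simplified version (matching with replacement, unlike most pharmacoepidemiology implementations) and assumes hypothetical columns treat (1 = tramadol, 0 = codeine) and ps; it is not the study's algorithm.

```python
import numpy as np
import pandas as pd
from sklearn.neighbors import NearestNeighbors

def match_1to1(df: pd.DataFrame, caliper_sd: float = 0.2) -> pd.DataFrame:
    """1:1 nearest-neighbour match on logit(PS), caliper = 0.2 SD, with replacement."""
    logit = np.log(df["ps"] / (1 - df["ps"]))
    caliper = caliper_sd * logit.std()
    treated = df[df["treat"] == 1]
    control = df[df["treat"] == 0]
    nn = NearestNeighbors(n_neighbors=1).fit(logit.loc[control.index].to_numpy().reshape(-1, 1))
    dist, idx = nn.kneighbors(logit.loc[treated.index].to_numpy().reshape(-1, 1))
    keep = dist.ravel() <= caliper                      # drop treated with no control within caliper
    return pd.concat([treated[keep], control.iloc[idx.ravel()[keep]]])
```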


Subjects
Opioid Analgesics/adverse effects, Cause of Death, Codeine/adverse effects, Tramadol/adverse effects, Accidental Falls/statistics & numerical data, Ambulatory Care, Cardiovascular Diseases/chemically induced, Cardiovascular Diseases/epidemiology, Factual Databases, Delirium/epidemiology, Drug Prescriptions/statistics & numerical data, Female, Bone Fractures/chemically induced, Bone Fractures/epidemiology, Humans, Incidence, Male, Middle Aged, Opioid-Related Disorders/epidemiology, Propensity Score, Proportional Hazards Models, Retrospective Studies, Sleep Wake Disorders/epidemiology
13.
J Bone Miner Res ; 36(11): 2153-2161, 2021 11.
Article in English | MEDLINE | ID: mdl-34173277

ABSTRACT

Conflicting results exist about the relationship between bariatric surgery and fracture risk. Also, prediction of who is at increased risk of fracture after bariatric surgery is not currently available. Hence, we used a combination of a self-controlled case series (SCCS) study to establish the association between bariatric surgery and fracture, and develop a prediction model for postoperative fracture risk estimation using a cohort study. Patients from UK Primary care records from the Clinical Practice Research Datalink GOLD linked to Hospital Episode Statistics undergoing bariatric surgery with body mass index (BMI) ≥30 kg/m2 between 1997 and 2018 were included in the cohort. Those sustaining one or more fractures in the 5 years before or after surgery were included in the SCCS. Fractures were considered in three categories: (i) any except skull and digits (primary outcome); (ii) major (hip, vertebrae, wrist/forearm, and humerus); and (iii) peripheral (forearm and lower leg). Of 5487 participants, 252 (4.6%) experienced 272 fractures (of which 80 were major and 135 peripheral) and were included in the SCCS analyses. Major fracture risk increased after surgery, incidence rate ratios (IRRs) and 95% confidence intervals (CIs): 2.77 (95% CI, 1.34-5.75) and 3.78 (95% CI, 1.42-10.08) at ≤3 years and 3.1 to 5 years postsurgery when compared to 5 years prior to surgery, respectively. Any fracture risk was higher only in the 2.1 to 5 years following surgery (IRR 1.73; 95% CI, 1.08-2.77) when compared to 5 years prior to surgery. No excess risk of peripheral fracture after surgery was identified. A prediction tool for major fracture was developed using 5487 participants included in the cohort study. It was also internally validated (area under the receiver-operating characteristic curve [AUC ROC] 0.70) with use of anxiolytics/sedatives/hypnotics and female as major predictors. Hence, major fractures are nearly threefold more likely after bariatric surgery. A simple prediction tool with five variables identifies high risk patients for major fracture. © 2021 The Authors. Journal of Bone and Mineral Research published by Wiley Periodicals LLC on behalf of American Society for Bone and Mineral Research (ASBMR).
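A simplified sketch of the self-controlled case-series comparison: person-time for each case is split into a pre-surgery reference window and a post-surgery risk window, and the incidence rate ratio is estimated with a Poisson model that includes per-person fixed effects and an offset for interval length, which is one standard way of fitting the SCCS likelihood. Age adjustment and the finer risk windows used in the study are omitted, and the long-format column names are hypothetical.

```python
import numpy as np
import statsmodels.formula.api as smf

def sccs_irr(intervals):
    """intervals: one row per person-window with columns
    person_id, window ('pre' or 'post'), n_fractures, days."""
    model = smf.poisson(
        "n_fractures ~ C(window, Treatment(reference='pre')) + C(person_id)",
        data=intervals,
        offset=np.log(intervals["days"]),
    ).fit(disp=0)
    # Incidence rate ratio for the post-surgery window vs the pre-surgery reference
    return np.exp(model.params.filter(like="window"))
```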


Subjects
Bariatric Surgery, Bone Fractures, Bariatric Surgery/adverse effects, Cohort Studies, Female, Bone Fractures/epidemiology, Humans, Risk Factors, United Kingdom
14.
Health Technol Assess ; 25(17): 1-106, 2021 03.
Article in English | MEDLINE | ID: mdl-33739919

ABSTRACT

BACKGROUND: Bisphosphonates are contraindicated in patients with stage 4+ chronic kidney disease. However, they are widely used to prevent fragility fractures in stage 3 chronic kidney disease, despite a lack of good-quality data on their effects. OBJECTIVES: The aims of each work package were as follows. Work package 1: to study the relationship between bisphosphonate use and chronic kidney disease progression. Work package 2: to study the association between using bisphosphonates and fracture risk. Work package 3: to determine the risks of hypocalcaemia, hypophosphataemia, acute kidney injury and upper gastrointestinal events associated with using bisphosphonates. Work package 4: to investigate the association between using bisphosphonates and changes in bone mineral density over time. DESIGN: This was a new-user cohort study design with propensity score matching. SETTING AND DATA SOURCES: Data were obtained from UK NHS primary care (Clinical Practice Research Datalink GOLD database) and linked hospital inpatient records (Hospital Episode Statistics) for work packages 1-3 and from the Danish Odense University Hospital Databases for work package 4. PARTICIPANTS: Patients registered in the data sources who had at least one measurement of estimated glomerular filtration rate of < 45 ml/minute/1.73 m2 were eligible. A second estimated glomerular filtration rate value of < 45 ml/minute/1.73 m2 within 1 year after the first was required for work packages 1 and 3. Patients with no Hospital Episode Statistics linkage were excluded from work packages 1-3. Patients with < 1 year of run-in data before the index estimated glomerular filtration rate and previous users of anti-osteoporosis medications were excluded from work packages 1-4. INTERVENTIONS/EXPOSURE: Bisphosphonate use, identified from primary care prescriptions (for work packages 1-3) or pharmacy dispensations (for work package 4), was the main exposure. MAIN OUTCOME MEASURES: Work package 1: chronic kidney disease progression, defined as stage worsening or starting renal replacement therapy. Work package 2: hip fracture. Work package 3: acute kidney injury, hypocalcaemia and hypophosphataemia identified from Hospital Episode Statistics, and gastrointestinal events identified from Clinical Practice Research Datalink or Hospital Episode Statistics. Work package 4: annualised femoral neck bone mineral density percentage change. RESULTS: Bisphosphonate use was associated with an excess risk of chronic kidney disease progression (subdistribution hazard ratio 1.12, 95% confidence interval 1.02 to 1.24) in work package 1, but did not increase the probability of other safety outcomes in work package 3. The results from work package 2 suggested that bisphosphonate use increased the risk of hip fracture (hazard ratio 1.25, 95% confidence interval 1.13 to 1.39), but sensitivity analyses suggested that this was related to unresolved confounding. Conversely, work package 4 suggested that bisphosphonates improved bone mineral density, with an average 2.65% (95% confidence interval 1.32% to 3.99%) greater gain in femoral neck bone mineral density per year in bisphosphonate users than in matched non-users. LIMITATIONS: Confounding by indication was a concern for the clinical effectiveness (i.e. work package 2) data. Bias analyses suggested that these findings were due to inappropriate adjustment for pre-treatment risk. Work packages 3 and 4 were based on small numbers of events and participants, respectively.
CONCLUSIONS: Bisphosphonates were associated with a 12% excess risk of chronic kidney disease progression in participants with stage 3B+ chronic kidney disease. No other safety concerns were identified. Bisphosphonate therapy increased bone mineral density, but the research team failed to demonstrate antifracture effectiveness. FUTURE WORK: Randomised controlled trial data are needed to demonstrate antifracture efficacy in patients with stage 3B+ chronic kidney disease. More safety analyses are needed to characterise the renal toxicity of bisphosphonates in stage 3A chronic kidney disease, possibly using observational data. STUDY REGISTRATION: This study is registered as EUPAS10029. FUNDING: This project was funded by the National Institute for Health Research (NIHR) Health Technology Assessment programme and will be published in full in Health Technology Assessment; Vol. 25, No. 17. See the NIHR Journals Library website for further project information. The project was also supported by the National Institute for Health Research Biomedical Research Centre, Oxford.


RATIONALE AND AIMS: Bisphosphonates are used to prevent fractures in people with fragile bones. People with chronic kidney disease have a high risk of fracturing, but the safety and effectiveness of bisphosphonates in severe chronic kidney disease is unclear. The aim of this study was to assess the benefits (e.g. bone strength improvement and fracture prevention) and the risks of unwanted effects associated with bisphosphonates for people with moderate to severe chronic kidney disease. METHODS: Anonymised primary and secondary care electronic medical records data from the UK NHS were used, as well as a Danish equivalent that included bone density scans. Anyone in these databases with a measure of reduced kidney function that suggested moderate to severe chronic kidney disease was eligible, which was > 220,000 people from the UK. Over 20,000 of them used bisphosphonates. Bisphosphonate users were matched to non-users with similar age, sex and other characteristics. RESULTS: Bisphosphonate users had a 12% higher risk of their chronic kidney disease getting worse than non-users. Their risks of other side effects, such as acute kidney injuries and gastrointestinal problems, did not change. Bisphosphonate users had a 25% higher risk of fractures than non-users in the UK database, probably because the matching methods did not create similar-enough groups of users and non-users. However, it was found that bisphosphonate improved bone density in the Danish database. Bone density is a proxy for bone strength, so better bone density should mean fewer fractures. CONCLUSIONS: These results suggest that bisphosphonate therapy may make moderate to severe chronic kidney disease worse. More studies are needed on how bisphosphonates affect milder chronic kidney disease. Bisphosphonates were associated with better bone strength, but it could not be demonstrated that they reduced fracture risk. More data are required, probably from a placebo-controlled trial, to determine whether or not bisphosphonates prevent fractures in people with moderate to severe chronic kidney disease and whether or not this is worth the risk of their chronic kidney disease worsening.


Subjects
Bone Fractures, Chronic Renal Insufficiency, Cohort Studies, Diphosphonates/adverse effects, Bone Fractures/epidemiology, Humans, Propensity Score, Chronic Renal Insufficiency/complications, Chronic Renal Insufficiency/epidemiology
15.
Heart ; 107(11): 902-908, 2021 06.
Article in English | MEDLINE | ID: mdl-33692093

ABSTRACT

OBJECTIVE: To improve the echocardiographic assessment of heart failure in patients with atrial fibrillation (AF) by comparing conventional averaging of consecutive beats with an index-beat approach, whereby measurements are taken after two cycles with similar R-R interval. METHODS: Transthoracic echocardiography was performed using a standardised and blinded protocol in patients enrolled in the RATE-AF (RAte control Therapy Evaluation in permanent Atrial Fibrillation) randomised trial. We compared reproducibility of the index-beat and conventional consecutive-beat methods to calculate left ventricular ejection fraction (LVEF), global longitudinal strain (GLS) and E/e' (mitral E wave max/average diastolic tissue Doppler velocity), and assessed intraoperator/interoperator variability, time efficiency and validity against natriuretic peptides. RESULTS: 160 patients were included, 46% of whom were women, with a median age of 75 years (IQR 69-82) and a median heart rate of 100 beats per minute (IQR 86-112). The index-beat had the lowest within-beat coefficient of variation for LVEF (32%, vs 51% for 5 consecutive beats and 53% for 10 consecutive beats), GLS (26%, vs 43% and 42%) and E/e' (25%, vs 41% and 41%). Intraoperator (n=50) and interoperator (n=18) reproducibility were both superior for index-beats and this method was quicker to perform (p<0.001): 35.4 s to measure E/e' (95% CI 33.1 to 37.8) compared with 44.7 s for 5-beat (95% CI 41.8 to 47.5) and 98.1 s for 10-beat (95% CI 91.7 to 104.4) analyses. Using a single index-beat did not compromise the association of LVEF, GLS or E/e' with natriuretic peptide levels. CONCLUSIONS: Compared with averaging of multiple beats in patients with AF, the index-beat approach improves reproducibility and saves time without a negative impact on validity, potentially improving the diagnosis and classification of heart failure in patients with AF.
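A sketch of the two ingredients of the comparison above: selecting index beats (beats preceded by two cardiac cycles with similar R-R intervals) and the coefficient of variation used to quantify beat-to-beat measurement variability. The 10% similarity tolerance is an assumption for illustration, not taken from the trial protocol.

```python
import numpy as np

def coefficient_of_variation(values) -> float:
    """Within-patient variability of repeated measurements (SD / mean)."""
    values = np.asarray(values, dtype=float)
    return values.std(ddof=1) / values.mean()

def index_beat_positions(rr_intervals, tolerance=0.10):
    """Return indices of beats whose two preceding R-R intervals differ by
    less than `tolerance` (as a fraction of their mean)."""
    rr = np.asarray(rr_intervals, dtype=float)
    keep = []
    for i in range(2, len(rr)):
        pair = rr[i - 2:i]
        if abs(pair[1] - pair[0]) / pair.mean() < tolerance:
            keep.append(i)
    return keep
```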


Subjects
Atrial Fibrillation/physiopathology, Pulsed Doppler Echocardiography, Heart Failure/diagnosis, Aged, Aged 80 and over, Biomarkers/blood, Diastole/physiology, Female, Humans, Male, Brain Natriuretic Peptide/blood, Peptide Fragments/blood, Reproducibility of Results, Stroke Volume/physiology, Systole/physiology, Left Ventricular Function/physiology
16.
Rheumatology (Oxford) ; 60(10): 4832-4843, 2021 10 02.
Article in English | MEDLINE | ID: mdl-33560340

ABSTRACT

OBJECTIVES: Better indicators from affordable, sustainable data sources are needed to monitor population burden of musculoskeletal conditions. We propose five indicators of musculoskeletal health and assessed if routinely available primary care electronic health records (EHR) can estimate population levels in musculoskeletal consulters. METHODS: We collected validated patient-reported measures of pain experience, function and health status through a local survey of adults (≥35 years) presenting to English general practices over 12 months for low back pain, shoulder pain, osteoarthritis and other regional musculoskeletal disorders. Using EHR data we derived and validated models for estimating population levels of five self-reported indicators: prevalence of high impact chronic pain, overall musculoskeletal health (based on Musculoskeletal Health Questionnaire), quality of life (based on EuroQoL health utility measure), and prevalence of moderate-to-severe low back pain and moderate-to-severe shoulder pain. We applied models to a national EHR database (Clinical Practice Research Datalink) to obtain national estimates of each indicator for three successive years. RESULTS: The optimal models included recorded demographics, deprivation, consultation frequency, analgesic and antidepressant prescriptions, and multimorbidity. Applying models to national EHR, we estimated that 31.9% of adults (≥35 years) presenting with non-inflammatory musculoskeletal disorders in England in 2016/17 experienced high impact chronic pain. Estimated population health levels were worse in women, older aged and those in the most deprived neighbourhoods, and changed little over 3 years. CONCLUSION: National and subnational estimates for a range of subjective indicators of non-inflammatory musculoskeletal health conditions can be obtained using information from routine electronic health records.


Subjects
Cost of Illness, Musculoskeletal Diseases/epidemiology, Adult, Age Factors, Aged, Aged 80 and over, Electronic Health Records/statistics & numerical data, England/epidemiology, Female, Humans, Male, Middle Aged, Statistical Models, Primary Health Care/statistics & numerical data, Sex Factors, Surveys and Questionnaires
17.
J Bone Miner Res ; 36(5): 820-832, 2021 05.
Article in English | MEDLINE | ID: mdl-33373491

ABSTRACT

Bisphosphonates are the first-line treatment for preventing fractures in osteoporosis patients. However, their use is contraindicated, or recommended only with caution, in chronic kidney disease (CKD) patients, primarily because of a lack of information about their safety and effectiveness. We aimed to investigate the safety of oral bisphosphonates in patients with moderate to severe CKD, using primary-care electronic records from two cohorts, CPRD GOLD (1997-2016) and SIDIAP (2007-2015), in the UK and Catalonia, respectively. Both databases were linked to hospital records. SIDIAP was also linked to end-stage renal disease registry data. Patients with CKD stages 3b to 5, based on two or more estimated glomerular filtration rate measurements less than 45 mL/min/1.73 m2, aged 40 years or older, were identified. New bisphosphonate users were propensity score-matched with up to five non-users to minimize confounding within this population. Our primary outcome was CKD stage worsening (estimated glomerular filtration rate [eGFR] decline or renal replacement therapy). Secondary outcomes were acute kidney injury, gastrointestinal bleeding/ulcers, and severe hypocalcemia. Hazard ratios (HRs) were estimated using Cox regression, and Fine and Gray sub-HRs were calculated for competing risks. We matched 2447 bisphosphonate users with 8931 non-users from CPRD and 1399 users with 6547 non-users from SIDIAP. Bisphosphonate use was associated with greater risk of CKD progression in CPRD (sub-HR [95% CI]: 1.14 [1.04, 1.26]) and SIDIAP (sub-HR: 1.15 [1.04, 1.27]). No risk differences were found for acute kidney injury, gastrointestinal bleeding/ulcers, or hypocalcemia. Hence, a modest (15%) increased risk of CKD progression was identified in association with bisphosphonate use. No other safety concerns were identified. Our findings should be considered before prescribing bisphosphonates to patients with moderate to severe CKD. © 2020 American Society for Bone and Mineral Research (ASBMR).


Subjects
Osteoporosis, Chronic Renal Insufficiency, Cohort Studies, Diphosphonates/adverse effects, Glomerular Filtration Rate, Humans, Osteoporosis/drug therapy, Osteoporosis/epidemiology, Chronic Renal Insufficiency/complications, Chronic Renal Insufficiency/drug therapy, Chronic Renal Insufficiency/epidemiology, Risk Factors
18.
JAMA ; 324(24): 2497-2508, 2020 12 22.
Article in English | MEDLINE | ID: mdl-33351042

ABSTRACT

Importance: There is little evidence to support selection of heart rate control therapy in patients with permanent atrial fibrillation, in particular those with coexisting heart failure. Objective: To compare low-dose digoxin with bisoprolol (a ß-blocker). Design, Setting, and Participants: Randomized, open-label, blinded end-point clinical trial including 160 patients aged 60 years or older with permanent atrial fibrillation (defined as no plan to restore sinus rhythm) and dyspnea classified as New York Heart Association class II or higher. Patients were recruited from 3 hospitals and primary care practices in England from 2016 through 2018; last follow-up occurred in October 2019. Interventions: Digoxin (n = 80; dose range, 62.5-250 µg/d; mean dose, 161 µg/d) or bisoprolol (n = 80; dose range, 1.25-15 mg/d; mean dose, 3.2 mg/d). Main Outcomes and Measures: The primary end point was patient-reported quality of life using the 36-Item Short Form Health Survey physical component summary score (SF-36 PCS) at 6 months (higher scores are better; range, 0-100), with a minimal clinically important difference of 0.5 SD. There were 17 secondary end points (including resting heart rate, modified European Heart Rhythm Association [EHRA] symptom classification, and N-terminal pro-brain natriuretic peptide [NT-proBNP] level) at 6 months, 20 end points at 12 months, and adverse event (AE) reporting. Results: Among 160 patients (mean age, 76 [SD, 8] years; 74 [46%] women; mean baseline heart rate, 100/min [SD, 18/min]), 145 (91%) completed the trial and 150 (94%) were included in the analysis for the primary outcome. There was no significant difference in the primary outcome of normalized SF-36 PCS at 6 months (mean, 31.9 [SD, 11.7] for digoxin vs 29.7 [11.4] for bisoprolol; adjusted mean difference, 1.4 [95% CI, -1.1 to 3.8]; P = .28). Of the 17 secondary outcomes at 6 months, there were no significant between-group differences for 16 outcomes, including resting heart rate (a mean of 76.9/min [SD, 12.1/min] with digoxin vs a mean of 74.8/min [SD, 11.6/min] with bisoprolol; difference, 1.5/min [95% CI, -2.0 to 5.1/min]; P = .40). The modified EHRA class was significantly different between groups at 6 months; 53% of patients in the digoxin group reported a 2-class improvement vs 9% of patients in the bisoprolol group (adjusted odds ratio, 10.3 [95% CI, 4.0 to 26.6]; P < .001). At 12 months, 8 of 20 outcomes were significantly different (all favoring digoxin), with a median NT-proBNP level of 960 pg/mL (interquartile range, 626 to 1531 pg/mL) in the digoxin group vs 1250 pg/mL (interquartile range, 847 to 1890 pg/mL) in the bisoprolol group (ratio of geometric means, 0.77 [95% CI, 0.64 to 0.92]; P = .005). Adverse events were less common with digoxin; 20 patients (25%) in the digoxin group had at least 1 AE vs 51 patients (64%) in the bisoprolol group (P < .001). There were 29 treatment-related AEs and 16 serious AEs in the digoxin group vs 142 and 37, respectively, in the bisoprolol group. Conclusions and Relevance: Among patients with permanent atrial fibrillation and symptoms of heart failure treated with low-dose digoxin or bisoprolol, there was no statistically significant difference in quality of life at 6 months. These findings support potentially basing decisions about treatment on other end points. Trial Registration: ClinicalTrials.gov Identifier: NCT02391337 and clinicaltrialsregister.eu Identifier: 2015-005043-13.


Subjects
Anti-Arrhythmia Agents/therapeutic use, Atrial Fibrillation/drug therapy, Bisoprolol/therapeutic use, Digoxin/therapeutic use, Heart Rate/drug effects, Quality of Life, Adrenergic beta-1 Receptor Antagonists/therapeutic use, Aged, Aged 80 and over, Anti-Arrhythmia Agents/adverse effects, Anti-Arrhythmia Agents/pharmacology, Atrial Fibrillation/complications, Atrial Fibrillation/physiopathology, Bisoprolol/adverse effects, Bisoprolol/pharmacology, Digoxin/adverse effects, Digoxin/pharmacology, Female, Heart Failure/complications, Heart Failure/drug therapy, Humans, Male, Middle Aged, Single-Blind Method, Stroke Volume
19.
Pain ; 161(12): 2841-2851, 2020 12.
Article in English | MEDLINE | ID: mdl-32639366

ABSTRACT

Knee osteoarthritis (OA) is a heterogeneous disease, and identification of its subgroups/phenotypes can improve patient treatment and drug development. We aimed to identify homogeneous OA subgroups/phenotypes using pain development over time; to understand the interplay between pain and functional limitation in time course; and to investigate subgroups' responses to available pharmacological and surgical treatments. We used group-based trajectory modelling to identify pain trajectories in the phase-3 VIDEO trial (n = 474, 3-year follow-up) and also in the Osteoarthritis Initiative cohort study (n = 4796, 9-year follow-up). We extended trajectory models by (1) fitting dual trajectories to investigate the interplay between pain and functional limitation over time, and (2) including analgesic use as a time-varying covariate. Also, we investigated the relationship between trajectory groups and knee replacement in regression models. We identified 4 pain trajectory groups in the trial and 6 in the cohort. These overlapped and led us to define 4 OA phenotypes: low-fluctuating, mild-increasing, moderate-treatment-sensitive, and severe-treatment-insensitive pain. Over time, functional knee limitation followed the same trajectory as pain with almost complete concordance (94.3%) between pain and functional limitation trajectory groups. Notably, we identified a phenotype with severe pain that did not benefit from available treatments, and another one most likely to benefit from knee replacement. Thus, knee OA subgroups/phenotypes can be identified based on patients' pain experiences in studies with long and regular follow-up. We provided a robust approach, reproducible between different study designs, which informs clinicians about symptom development and delivery of treatment options and opens a new avenue toward personalized medicine in OA.


Subjects
Knee Osteoarthritis, Cohort Studies, Disease Progression, Humans, Knee Joint, Knee Osteoarthritis/complications, Pain/etiology
20.
Clin Transl Radiat Oncol ; 22: 44-49, 2020 May.
Article in English | MEDLINE | ID: mdl-32211520

ABSTRACT

BACKGROUND AND PURPOSE: Oxygen-enhanced magnetic resonance imaging (MRI) and T1 mapping were used to explore their effectiveness as a prognostic imaging biomarker for chemoradiotherapy outcome in anal squamous cell carcinoma. MATERIALS AND METHODS: T2-weighted images, T1 maps, and oxygen-enhanced T1 maps were acquired before and after 8-10 fractions of chemoradiotherapy, and we examined whether the oxygen-enhanced MRI response relates to clinical outcome. Patient response to treatment was assessed 3 months after completion of chemoradiotherapy. A mean T1 was extracted from manually segmented tumour regions of interest (ROIs), and a paired two-tailed t-test was used to compare changes across the patient population. Regions of subcutaneous fat and muscle tissue were examined as control ROIs. RESULTS: There was a significant increase in T1 of the tumour ROIs across patients following the 8-10 fractions of chemoradiotherapy (paired t-test, p < 0.001, n = 7). At baseline, prior to receiving chemoradiotherapy, there were no significant changes in T1 across patients from breathing oxygen (n = 9). In the post-chemoradiotherapy scans (8-10 fractions), there was a significant decrease in T1 of the tumour ROIs across patients when breathing 100% oxygen (paired t-test, p < 0.001, n = 8). Of the 12 patients from whom we successfully acquired a visit 1 T1 map, only 1 did not respond to treatment; therefore, we cannot correlate these results with clinical outcome. CONCLUSIONS: These clinical data demonstrate feasibility and potential for T1 mapping and oxygen-enhanced T1 mapping to indicate perfusion or treatment response in tumours of this nature. These data show promise for future work with a larger cohort containing more non-responders, which would allow us to relate these measurements to clinical outcome.
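The statistical comparison described here is a two-tailed paired t-test on per-patient mean tumour T1 before versus after 8-10 fractions of chemoradiotherapy. A minimal sketch follows; the T1 values are illustrative placeholders, not study data.

```python
import numpy as np
from scipy.stats import ttest_rel

# Illustrative per-patient mean tumour T1 values (ms), not taken from the study
t1_baseline = np.array([1350, 1420, 1390, 1310, 1450, 1380, 1400], dtype=float)
t1_post_crt = np.array([1480, 1530, 1500, 1410, 1560, 1470, 1520], dtype=float)

t_stat, p_value = ttest_rel(t1_post_crt, t1_baseline)   # paired, two-tailed by default
print(f"mean change = {np.mean(t1_post_crt - t1_baseline):.0f} ms, p = {p_value:.4f}")
```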
