1.
Br J Anaesth ; 131(5): 805-809, 2023 Nov.
Article in English | MEDLINE | ID: mdl-37481434

ABSTRACT

Causal inference in observational research requires a careful approach to adjustment for confounding. One such approach is the use of propensity score analyses. In this editorial, we focus on the role of propensity score-based methods in estimating causal effects from non-randomised observational data. We highlight the details, assumptions, and limitations of these methods and provide authors with guidelines for the conduct and reporting of propensity score analyses.
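As a concrete illustration of one propensity score-based method, the sketch below computes an inverse-probability-of-treatment-weighted difference in means. The function name and the toy data are hypothetical, not taken from the editorial.

```python
import numpy as np

def iptw_effect(y, treated, ps):
    """Average treatment effect via inverse-probability-of-treatment
    weighting: each subject is weighted by 1/Pr(treatment received)."""
    y, treated, ps = (np.asarray(a, dtype=float) for a in (y, treated, ps))
    w = np.where(treated == 1, 1.0 / ps, 1.0 / (1.0 - ps))
    mean_treated = np.sum(w * y * (treated == 1)) / np.sum(w * (treated == 1))
    mean_control = np.sum(w * y * (treated == 0)) / np.sum(w * (treated == 0))
    return mean_treated - mean_control

# Toy data in which treatment raises the outcome by exactly 1.0:
effect = iptw_effect(y=[2.0, 1.0, 2.0, 1.0],
                     treated=[1, 0, 1, 0],
                     ps=[0.8, 0.8, 0.2, 0.2])
print(effect)  # 1.0
```

Weighting creates a pseudo-population in which treatment is independent of the measured covariates that entered the propensity score; the standard assumptions (no unmeasured confounding, positivity) still apply.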


Subject(s)
Propensity Score , Humans , Causality , Data Interpretation, Statistical , Observational Studies as Topic
2.
Stat Med ; 39(29): 4386-4404, 2020 Dec 20.
Article in English | MEDLINE | ID: mdl-32854161

ABSTRACT

Instrumental variable (IV) analysis can be used to address bias due to unobserved confounding when estimating the causal effect of a treatment on an outcome of interest. However, if a proposed IV is correlated with unmeasured confounders and/or weakly correlated with the treatment, the standard IV estimator may be more biased than an ordinary least squares (OLS) estimator. Several methods have been proposed that compare the bias of the IV and OLS estimators, relying on the belief that measured covariates can be used as proxies for the unmeasured confounder. Despite these developments, there is a lack of discussion about approaches that can be used to formally test whether the IV estimator may be less biased than the OLS estimator. Thus, we have developed a testing framework to compare the bias, along with a criterion to select informative measured covariates for bias comparison and regression adjustment. We have also developed a bias-correction method, which allows one to use an invalid IV to correct the bias of the OLS or IV estimator. Numerical studies demonstrate that the proposed methods perform well with realistic sample sizes.
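A minimal numerical sketch of the OLS-versus-IV contrast the abstract describes, using NumPy only; the data-generating process and coefficients below are invented for illustration and are not the paper's simulation design.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100_000
u = rng.normal(size=n)                 # unmeasured confounder
z = rng.normal(size=n)                 # instrument: shifts x, not y directly
x = 0.8 * z + u + rng.normal(size=n)   # treatment
y = 2.0 * x + 3.0 * u + rng.normal(size=n)  # true causal effect of x is 2.0

# OLS is biased upward because x and y share the confounder u.
beta_ols = np.cov(x, y)[0, 1] / np.var(x, ddof=1)

# The Wald/2SLS estimator recovers the causal effect when z is valid.
beta_iv = np.cov(z, y)[0, 1] / np.cov(z, x)[0, 1]

print(round(beta_ols, 2), round(beta_iv, 2))
```

With a weak instrument (coefficient on `z` near zero), the denominator of `beta_iv` shrinks and the IV estimate can become more variable, and more biased, than OLS, which is the trade-off the paper's testing framework addresses.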


Subject(s)
Models, Statistical , Bias , Causality , Humans , Least-Squares Analysis , Sample Size
3.
Pharmacoepidemiol Drug Saf ; 27(2): 229-238, 2018 Feb.
Article in English | MEDLINE | ID: mdl-29316026

ABSTRACT

PURPOSE: To examine the dynamics of treatment with 2 bone-targeting agents (BTAs)-denosumab and zoledronic acid-among men with bone metastases from prostate cancer. METHODS: Using electronic health record data from oncology practices across the US, we identified prostate cancer patients diagnosed with bone metastasis in 2012/2013 without evidence of BTA use within 6 months prior to diagnosis. We examined the risk and predictors of BTA initiation, interruption, and re-initiation. RESULTS: Among 897 men diagnosed with prostate cancer, the cumulative incidence of BTA initiation after bone metastasis diagnosis was 34% (95% confidence interval [CI], 31-37%) at 30 days, 64% (95% CI, 61-68%) at 180 days, and 88% (95% CI, 85-91%) at 2 years. Denosumab was initiated more frequently than zoledronic acid. Men with diabetes, more bone lesions, history of androgen deprivation therapy, or no hospice enrollment were more likely to initiate treatment. Following initiation, the cumulative incidence of treatment interruption was 17% (95% CI, 14-19%) at 60 days and 70% (95% CI, 66-74%) at 2 years, with interruption more likely among patients receiving emerging therapies for prostate cancer or enrolling in hospice. The cumulative incidence of re-initiation following interruption was 36.3% (95% CI, 32.7-40.2%) at 15 days, 49.8% (95% CI, 45.9-54.1%) at 30 days, and 81.0% (95% CI, 77.5-84.7%) at 1 year. CONCLUSIONS: Bone-targeting agent therapy is initiated by the majority of men living with bone metastases following a prostate cancer diagnosis; however, the timing of initiation is highly variable. Once on treatment, gaps or interruptions in therapy are common.
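The cumulative incidence of initiation can be sketched as one minus a product-limit (Kaplan-Meier) survival estimate. The data below are invented, and competing risks (e.g., death before initiation) are ignored for simplicity, so this is an illustration of the quantity, not the study's estimator.

```python
def cumulative_incidence(times, events, at_day):
    """1 - Kaplan-Meier survival at `at_day`, treating initiation as the
    event and any other exit from follow-up as censoring."""
    event_times = sorted({t for t, e in zip(times, events) if e and t <= at_day})
    surv = 1.0
    for t in event_times:
        at_risk = sum(1 for ti in times if ti >= t)
        d = sum(1 for ti, e in zip(times, events) if e and ti == t)
        surv *= 1.0 - d / at_risk
    return 1.0 - surv

# Five hypothetical patients: days to initiation (event=1) or censoring (0).
ci_30 = cumulative_incidence(times=[10, 20, 20, 40, 50],
                             events=[1, 1, 0, 1, 0],
                             at_day=30)
print(ci_30)  # approximately 0.4
```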


Subject(s)
Bone Density Conservation Agents/therapeutic use , Bone Neoplasms/drug therapy , Drug Utilization/statistics & numerical data , Practice Patterns, Physicians'/statistics & numerical data , Prostatic Neoplasms/pathology , Aged , Bone Neoplasms/secondary , Denosumab/therapeutic use , Humans , Longitudinal Studies , Male , Middle Aged , Time Factors , United States , Zoledronic Acid/therapeutic use
4.
J Pediatric Infect Dis Soc ; 12(Supplement_2): S3-S8, 2023 Dec 26.
Article in English | MEDLINE | ID: mdl-38146860

ABSTRACT

BACKGROUND: At-home COVID-19 tests became available in the USA in April 2021 with widespread use by January 2022; however, the lack of infrastructure to report test results to public health agencies created a gap in public health data. Kindergarten through grade 12 (K-12) schools often tracked COVID-19 cases among students and staff; leveraging school data may have helped bridge data gaps. METHODS: We examined infection rates reported by school districts to ABC Science Collaborative with corresponding community rates from March 15, 2021 to June 3, 2022. We computed weekly ratios of community-to-district-reported rates (reporting ratios) across 3 study periods (spring 2021, fall 2021, and spring 2022) and estimated the difference and 95% confidence intervals (CIs) in the average reporting ratio between study periods. RESULTS: In spring 2021, before approval or widespread use of at-home testing, the community-reported infection rate was higher than the school-reported infection rate (reporting ratio: 1.40). In fall 2021 and spring 2022, as at-home testing rapidly increased, school-reported rates were higher than community-reported rates (reporting ratios: 0.82 and 0.66). Average reporting ratios decreased between spring 2021 and fall 2021 (-0.58, 95% CI -0.84, -0.32) and spring 2021 and spring 2022 (-0.73, 95% CI -0.96, -0.48); there was no significant change between fall 2021 and spring 2022 (-0.15, 95% CI -0.36, 0.06). CONCLUSIONS: At-home COVID-19 testing resulted in significant data gaps; K-12 data could have supplemented community data. In future public health emergencies, reporting of school data could minimize data gaps, but requires additional resources including funding to track infections and standardized data reporting methods.
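The reporting-ratio comparison reduces to simple arithmetic: a weekly ratio of community-reported to district-reported rates, averaged within each study period. The weekly rates below are invented for illustration, chosen only so the toy ratios echo the direction of the reported estimates.

```python
def reporting_ratios(community_rates, district_rates):
    """Weekly ratios of community-reported to district-reported rates."""
    return [c / d for c, d in zip(community_rates, district_rates)]

def mean(xs):
    return sum(xs) / len(xs)

# Hypothetical weekly infection rates for two study periods:
spring_2021 = reporting_ratios([14.0, 10.5], [10.0, 7.5])  # ratios > 1
fall_2021 = reporting_ratios([8.2, 4.1], [10.0, 5.0])      # ratios < 1

change = mean(fall_2021) - mean(spring_2021)
print(round(change, 2))  # negative: schools began reporting relatively more
```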


Subject(s)
COVID-19 , Humans , COVID-19/epidemiology , COVID-19 Testing , Schools , Educational Status , Dietary Supplements
6.
Eur Heart J ; 31(5): 561-72, 2010 Mar.
Article in English | MEDLINE | ID: mdl-19942600

ABSTRACT

AIMS: To evaluate the effectiveness and safety of bivalirudin as used in routine care. Bivalirudin has been studied as an alternative to heparin plus glycoprotein IIb/IIIa inhibitor (GPI) during percutaneous coronary intervention (PCI). Trials have indicated that bivalirudin is non-inferior to heparin with respect to death and repeat revascularization and may decrease the risk of major bleeds. The use of bivalirudin in routine care has not been evaluated. METHODS AND RESULTS: Using a representative database, we identified 127,185 individuals who underwent inpatient PCI between June 2003 and December 2006 and were administered either bivalirudin plus provisional GPI or the comparator, heparin plus GPI. We estimated relative risks of blood transfusion, repeated PCI, and in-hospital death. The adjusted hazard ratio (HR) for blood transfusion was 0.67 (0.61-0.73); instrumental variable analysis showed an HR of 0.72 (0.12-4.47). We observed a risk of in-hospital death of 0.80% in the bivalirudin group and 2.1% in the heparin group; the adjusted HR was 0.51 (0.44-0.60). CONCLUSION: In our non-randomized study of routine care, we observed a reduction in blood transfusions and in short-term mortality for patients treated with bivalirudin compared with heparin plus GPI. The mortality benefit was more pronounced in our study than in randomized trials.


Subject(s)
Acute Coronary Syndrome/therapy , Angioplasty/methods , Antithrombins/therapeutic use , Myocardial Infarction/therapy , Peptide Fragments/therapeutic use , Acute Coronary Syndrome/mortality , Antithrombins/adverse effects , Cohort Studies , Female , Hirudins/adverse effects , Hospital Mortality , Hospitalization/statistics & numerical data , Humans , Male , Middle Aged , Myocardial Infarction/mortality , Peptide Fragments/adverse effects , Recombinant Proteins/adverse effects , Recombinant Proteins/therapeutic use , Treatment Outcome , United States
7.
Sci Rep ; 6: 19612, 2016 Apr 13.
Article in English | MEDLINE | ID: mdl-27071541

ABSTRACT

Cinacalcet lowers parathyroid hormone levels. Whether it can prolong survival of people with chronic kidney disease (CKD) complicated by secondary hyperparathyroidism (SHPT) remains controversial, in part because a recent randomized trial excluded patients with iPTH <300 pg/ml. We examined cinacalcet's effects at different iPTH levels. This was a prospective case-cohort and cohort study involving 8229 patients with CKD stage 5D requiring maintenance hemodialysis who had SHPT. We studied relationships between cinacalcet initiation and important clinical outcomes. To avoid confounding by treatment selection, we used marginal structural models, adjusting for time-dependent confounders. Over a mean follow-up of 33 months, cinacalcet was more effective in patients with more severe SHPT. In patients with iPTH ≥ 500 pg/ml, the reduction in the risk of death from any cause was about 50% (Incidence Rate Ratio [IRR] = 0.49; 95% Confidence Interval [95% CI]: 0.29-0.82). For a composite of cardiovascular hospitalization and mortality, the association was not statistically significant, but the IRR was 0.67 (95% CI: 0.43-1.06). These findings indicate that decisions about using cinacalcet should take into account the severity of SHPT.
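Marginal structural models handle time-dependent confounding by reweighting person-time rather than conditioning on post-treatment covariates. The helper below sketches a stabilized inverse-probability-of-treatment weight for a single patient; the probabilities are placeholders, not fitted values from this study.

```python
def stabilized_weight(p_marginal, p_conditional):
    """Cumulative product over visits of (marginal probability of the
    treatment actually received) / (probability conditional on the
    patient's time-varying covariate history)."""
    w = 1.0
    for p_num, p_den in zip(p_marginal, p_conditional):
        w *= p_num / p_den
    return w

# One hypothetical patient observed at three visits:
w = stabilized_weight(p_marginal=[0.3, 0.3, 0.3],
                      p_conditional=[0.6, 0.5, 0.25])
print(w)  # < 1: down-weighted, since treatment was likelier than average
```

In practice the conditional probabilities come from a model of treatment given covariate history, and the weighted outcome model then estimates the marginal causal effect.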


Subject(s)
Calcimimetic Agents/adverse effects , Cinacalcet/adverse effects , Hyperparathyroidism, Secondary/drug therapy , Parathyroid Hormone/blood , Renal Dialysis , Renal Insufficiency, Chronic/therapy , Aged , Calcimimetic Agents/administration & dosage , Calcimimetic Agents/therapeutic use , Cinacalcet/administration & dosage , Cinacalcet/therapeutic use , Female , Humans , Hyperparathyroidism, Secondary/complications , Male , Middle Aged , Renal Insufficiency, Chronic/complications
8.
Comput Methods Programs Biomed ; 95(1): 89-94, 2009 Jul.
Article in English | MEDLINE | ID: mdl-19321221

ABSTRACT

The clustered permutation test is a nonparametric method of significance testing for correlated data. It is often used in cluster randomized trials where groups of patients rather than individuals are randomized to either a treatment or control intervention. We describe a flexible and efficient SAS macro that implements the 2-sample clustered permutation test. We discuss the theory and applications behind this test as well as details of the SAS code.
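The macro itself is in SAS; as an illustration of the underlying method, here is a minimal Python analogue that enumerates every reassignment of treatment labels at the cluster level and compares cluster means. Names and data are hypothetical.

```python
import itertools
from statistics import mean

def clustered_permutation_test(cluster_means, labels):
    """Exact two-sided p-value for the difference in the mean of
    cluster-level summaries between two arms, permuting labels at the
    cluster (not patient) level to respect within-cluster correlation."""
    def stat(lab):
        g1 = [m for m, l in zip(cluster_means, lab) if l == 1]
        g0 = [m for m, l in zip(cluster_means, lab) if l == 0]
        return abs(mean(g1) - mean(g0))

    observed = stat(labels)
    n, n1 = len(labels), sum(labels)
    hits = total = 0
    for idx in itertools.combinations(range(n), n1):
        lab = [1 if i in idx else 0 for i in range(n)]
        total += 1
        if stat(lab) >= observed - 1e-12:
            hits += 1
    return hits / total

# Six hypothetical clusters, three randomized to each arm:
p = clustered_permutation_test([1.0, 1.2, 1.1, 2.0, 2.2, 2.1],
                               [0, 0, 0, 1, 1, 1])
print(p)  # 0.1: only 2 of the 20 possible assignments are as extreme
```

Enumerating all label assignments gives an exact p-value; with many clusters one would sample random permutations (Monte Carlo) instead, as the subject terms for this record suggest.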


Subject(s)
Computational Biology/methods , Randomized Controlled Trials as Topic , Research Design , Algorithms , Cluster Analysis , Computers , Data Interpretation, Statistical , Humans , Monte Carlo Method , Programming Languages , Reproducibility of Results , Software , Statistics, Nonparametric
9.
J Am Geriatr Soc ; 56(9): 1644-50, 2008 Sep.
Article in English | MEDLINE | ID: mdl-18691283

ABSTRACT

OBJECTIVES: To investigate the potential mechanisms through which conventional antipsychotic medication (APM) might act, the specific causes of death in elderly patients newly started on conventional APM were compared with those of patients taking atypical APM. DESIGN: Cohort study. SETTING: Community. PARTICIPANTS: All British Columbia residents aged 65 and older who initiated a conventional or atypical APM between 1996 and 2004. MEASUREMENTS: Cox proportional hazards models were used to compare risks of developing a specific cause of death within 180 days of APM initiation. Potential confounders were adjusted for using traditional multivariable, propensity-score, and instrumental-variable adjustments. RESULTS: The study cohort included 12,882 initiators of conventional APM and 24,359 initiators of atypical APM. Of 3,821 total deaths within the first 180 days of use, cardiovascular (CV) deaths accounted for 49%. Initiators of conventional APM had a significantly higher adjusted risk of all CV death (hazard ratio (HR)=1.23, 95% confidence interval (CI)=1.10-1.36) and out-of-hospital CV death (HR=1.36, 95% CI=1.19-1.56) than initiators of atypical APM. Initiators of conventional APM also had a higher risk of death due to respiratory diseases, nervous system diseases, and other causes. CONCLUSION: These data suggest that greater risk of CV deaths might explain approximately half of the excess mortality in initiators of conventional APM. The risk of death due to respiratory causes was also significantly higher with conventional APM use.
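For a single binary exposure, a Cox hazard ratio like those reported above can be sketched with a few lines of Newton's method on the partial likelihood. The function and toy data below are illustrative only (Breslow handling of ties), not the study's adjusted models.

```python
import math

def cox_hr(times, events, exposed, iters=25):
    """Hazard ratio for one binary covariate, maximizing the Cox partial
    likelihood (Breslow ties) by Newton's method."""
    beta = 0.0
    for _ in range(iters):
        score = info = 0.0
        for i, (ti, ei) in enumerate(zip(times, events)):
            if not ei:
                continue  # censored observations contribute only to risk sets
            risk = [exposed[j] for j, tj in enumerate(times) if tj >= ti]
            s0 = sum(math.exp(beta * x) for x in risk)
            s1 = sum(x * math.exp(beta * x) for x in risk)
            s2 = sum(x * x * math.exp(beta * x) for x in risk)
            score += exposed[i] - s1 / s0       # gradient of log-likelihood
            info += s2 / s0 - (s1 / s0) ** 2    # observed information
        beta += score / info
    return math.exp(beta)

# Four hypothetical subjects, all with observed events:
hr = cox_hr(times=[1, 2, 3, 4], events=[1, 1, 1, 1], exposed=[1, 0, 0, 1])
print(round(hr, 3))
```

An HR above 1, such as the reported 1.23 for CV death, means the exposed group experiences events at a proportionally higher instantaneous rate at every time point.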


Subject(s)
Antipsychotic Agents/toxicity , Cause of Death , Administration, Oral , Aged , Aged, 80 and over , Alzheimer Disease/drug therapy , Alzheimer Disease/mortality , Antipsychotic Agents/therapeutic use , British Columbia , Cardiovascular Diseases/chemically induced , Cardiovascular Diseases/mortality , Cohort Studies , Comorbidity , Cross-Sectional Studies , Delirium/drug therapy , Delirium/mortality , Dose-Response Relationship, Drug , Drug Therapy, Combination , Female , Humans , Male , Mood Disorders/drug therapy , Mood Disorders/mortality , Nervous System Diseases/chemically induced , Nervous System Diseases/mortality , Probability , Proportional Hazards Models , Psychotic Disorders/drug therapy , Psychotic Disorders/mortality , Respiratory Tract Diseases/chemically induced , Respiratory Tract Diseases/mortality , Risk Factors
10.
Nephrol Dial Transplant ; 21(12): 3559-66, 2006 Dec.
Article in English | MEDLINE | ID: mdl-17040993

ABSTRACT

BACKGROUND: Anaemia is prevalent in kidney transplant recipients (KTR), and only a few KTR with anaemia receive treatment with erythropoietin. Some have claimed that this undertreatment might contribute to suboptimal outcomes such as mortality and cardiovascular events in these patients. However, no evidence is currently available that anaemia is actually associated with such risks in KTR. METHODS: We merged two cohorts of KTR to study the associations between anaemia and two outcomes: all-cause mortality and kidney allograft loss. Detailed information on the demographic and clinical characteristics of these 825 patients was available at baseline. As recommended by the American Society of Transplantation, anaemia was considered present if the haemoglobin concentration was ≤13 g/dl in men or ≤12 g/dl in women. Patients were followed using the Austrian Dialysis and Transplant Registry. RESULTS: After 8.2 years of follow-up, 251 patients died and 401 allografts were lost. In multivariate analyses, anaemia was not associated with all-cause mortality (HR: 1.08; 95% CI: 0.80-1.45), but it was associated with 25% greater risk of allograft loss (HR = 1.25; 95% CI: 1.02-1.59). This association was even more pronounced in death-censored analyses. Analyses using haemoglobin as a continuous variable or in categories also found no association with mortality. CONCLUSIONS: Anaemia may not be associated with mortality in KTR. In light of the recent findings of increased mortality in chronic kidney disease patients with higher haemoglobin treatment targets, further evidence is needed to guide clinicians in the treatment of anaemia in these patients.
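The anaemia definition the study applies is simple enough to restate as code; the helper below is only a restatement of the threshold rule, with hypothetical names.

```python
def is_anaemic(haemoglobin_g_dl, sex):
    """American Society of Transplantation rule used in the study:
    anaemia if haemoglobin <= 13 g/dl in men or <= 12 g/dl in women."""
    threshold = 13.0 if sex == "male" else 12.0
    return haemoglobin_g_dl <= threshold

print(is_anaemic(12.5, "male"))    # True
print(is_anaemic(12.5, "female"))  # False
```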


Subject(s)
Anemia/etiology , Kidney Transplantation/adverse effects , Anemia/epidemiology , Female , Humans , Male , Middle Aged , Prospective Studies , Time Factors , Treatment Outcome