Bias-corrected and doubly robust inference for the three-level longitudinal cluster-randomized trials with missing continuous outcomes and small number of clusters: Simulation study and application to a study for adults with serious mental illnesses.
Contemp Clin Trials Commun; 35: 101194, 2023 Oct.
Article in En | MEDLINE | ID: mdl-37588771
ABSTRACT
Longitudinal cluster-randomized designs have been popular tools for comparative effectiveness research in clinical trials. The methodologies for the three-level hierarchical design with longitudinal outcomes need to be better understood under more pragmatic settings; that is, with a small number of clusters, heterogeneous cluster sizes, and missing outcomes. Generalized estimating equations (GEEs) have been frequently used when the distribution of the data and the correlation model are unknown. Standard GEEs lead to bias and an inflated type I error rate when the number of available clinics is small and longitudinal outcomes are missing not completely at random. We evaluate the performance of inverse probability weighted (IPW) estimating equations, with and without augmentation, for two types of missing data in continuous outcomes and individual-level treatment allocation mechanisms, combined with two bias-corrected variance estimators. Our intensive simulation results suggest that the proposed augmented IPW method with bias-corrected variance estimation successfully prevents the inflation of false positive findings and improves efficiency when the number of clinics is small, with moderate to severe missing outcomes. Our findings are expected to aid researchers in choosing appropriate analysis methods for three-level longitudinal cluster-randomized designs. The proposed approaches were applied to analyze data from a longitudinal cluster-randomized clinical trial involving adults with serious mental illnesses.
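As a toy illustration of the inverse probability weighting idea the abstract describes (this is not the authors' method; the data-generating setup, variable names, and probabilities below are all assumptions for the sketch), the following simulates a continuous outcome with covariate-dependent (MAR) missingness and shows how the IPW mean corrects the bias of a complete-case analysis:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200_000

# Baseline covariate and a continuous outcome that depends on it (MAR setup).
x = rng.binomial(1, 0.5, size=n)           # e.g. a binary severity indicator
y = 2.0 + 3.0 * x + rng.normal(size=n)     # true mean of y is 2 + 3*0.5 = 3.5

# Missingness depends on x only (missing at random): outcomes are observed
# with probability 0.3 when x == 1 and 0.9 when x == 0.
p_obs = np.where(x == 1, 0.3, 0.9)
r = rng.binomial(1, p_obs)                 # r == 1 means y is observed

# Complete-case mean is biased: it under-represents the x == 1 group.
cc_mean = y[r == 1].mean()

# IPW mean: weight each observed outcome by 1 / P(observed | x).
w = r / p_obs
ipw_mean = (w * y).sum() / w.sum()

print(f"true mean     : 3.50")
print(f"complete case : {cc_mean:.2f}")    # noticeably below 3.5
print(f"IPW corrected : {ipw_mean:.2f}")   # close to 3.5
```

In practice, the observation probabilities are unknown and are themselves estimated (e.g. by a logistic model for missingness), and the weights enter the GEE rather than a simple mean; the augmented variant discussed in the abstract further adds an outcome-regression term to gain efficiency and double robustness.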
Full text: 1
Collections: 01-internacional
Database: MEDLINE
Study type: Clinical_trials / Prognostic_studies
Language: En
Publication year: 2023
Document type: Article