DeepFDR: A Deep Learning-based False Discovery Rate Control Method for Neuroimaging Data.
Kim, Taehyo; Shu, Hai; Jia, Qiran; de Leon, Mony J.
Affiliation
  • Kim T; Department of Biostatistics, School of Global Public Health, New York University.
  • Shu H; Department of Biostatistics, School of Global Public Health, New York University.
  • Jia Q; Department of Biostatistics, School of Global Public Health, New York University.
  • de Leon MJ; Department of Population and Public Health Sciences, University of Southern California.
Proc Mach Learn Res ; 238: 946-954, 2024 May.
Article in En | MEDLINE | ID: mdl-38741695
ABSTRACT
Voxel-based multiple testing is widely used in neuroimaging data analysis. Traditional false discovery rate (FDR) control methods often ignore the spatial dependence among the voxel-based tests and thus suffer from a substantial loss of testing power. While spatial FDR control methods have recently emerged, their validity and optimality remain questionable when handling the complex spatial dependencies of the brain. Concurrently, deep learning methods have revolutionized image segmentation, a task closely related to voxel-based multiple testing. In this paper, we propose DeepFDR, a novel spatial FDR control method that leverages unsupervised deep learning-based image segmentation to address the voxel-based multiple testing problem. Numerical studies, including comprehensive simulations and Alzheimer's disease FDG-PET image analysis, demonstrate DeepFDR's superiority over existing methods. DeepFDR not only excels in FDR control and effectively reduces the false nondiscovery rate, but also offers exceptional computational efficiency well suited to large-scale neuroimaging data.
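For context, the "traditional FDR control" the abstract contrasts against is typified by the Benjamini-Hochberg (BH) step-up procedure, which treats all tests as exchangeable and ignores spatial structure. A minimal illustrative sketch of BH (not DeepFDR itself; the p-values below are made up for demonstration):

```python
import numpy as np

def benjamini_hochberg(pvals, q=0.05):
    """Boolean mask of hypotheses rejected by the BH step-up procedure at level q.

    BH sorts the m p-values and rejects the smallest k of them, where k is the
    largest index with p_(k) <= (k/m) * q. It ignores spatial dependence among
    tests, which is the limitation spatial methods such as DeepFDR target.
    """
    p = np.asarray(pvals, dtype=float)
    m = p.size
    order = np.argsort(p)
    thresholds = q * np.arange(1, m + 1) / m
    passing = np.nonzero(p[order] <= thresholds)[0]
    reject = np.zeros(m, dtype=bool)
    if passing.size:
        reject[order[: passing[-1] + 1]] = True  # reject all p-values up to the cutoff rank
    return reject

# Hypothetical example: two strong signals survive, the rest do not.
pvals = [0.001, 0.008, 0.039, 0.041, 0.042, 0.06, 0.1, 0.3, 0.5, 0.9]
print(benjamini_hochberg(pvals, q=0.05))
```

In neuroimaging, each entry of `pvals` would correspond to one voxel-level test; DeepFDR's contribution is to exploit the spatial dependence among such voxels rather than treating them independently as BH does.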

Full text: 1 Collections: 01-international Database: MEDLINE Language: En Journal: Proc Mach Learn Res Publication year: 2024 Document type: Article