Applications of a Kullback-Leibler Divergence for Comparing Non-nested Models.
Wang, Chen-Pin; Jo, Booil.
Affiliation
  • Wang CP; Department of Epidemiology and Biostatistics, University of Texas Health Science Center, San Antonio, TX 78229, USA.
  • Jo B; Department of Psychiatry and Behavioral Sciences, Stanford University, Stanford, CA 94305, USA.
Stat Modelling ; 13(5-6): 409-429, 2013 Dec.
Article in En | MEDLINE | ID: mdl-24795532
Wang and Ghosh (2011) proposed a Kullback-Leibler divergence (KLD) that is asymptotically equivalent to the KLD of Goutis and Robert (1998) when the reference model (against which a competing fitted model is compared) is correctly specified and certain regularity conditions hold. While the properties of the Wang and Ghosh (2011) KLD have been investigated in the Bayesian framework, this paper further explores its properties in the frequentist framework through four application examples, each fitted by two competing non-nested models.
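To make the idea concrete, the following is a minimal illustrative sketch (not the authors' estimator) of how a plug-in KLD can compare two non-nested fitted models: both models are fit by maximum likelihood, and the divergence is estimated as the average log-likelihood ratio over the observed data. The data, the normal/Laplace model pair, and the sample size are hypothetical choices made for illustration only.

```python
# Hedged sketch: plug-in KLD estimate between two competing non-nested
# models (normal vs. Laplace), both fit to the same data by maximum
# likelihood. This illustrates the general idea, not the specific
# estimator of Wang and Ghosh (2011).
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
x = rng.normal(loc=1.0, scale=2.0, size=5000)  # hypothetical data

# Fit the two candidate models by maximum likelihood.
norm_params = stats.norm.fit(x)        # reference model
laplace_params = stats.laplace.fit(x)  # competing model

# Plug-in KLD estimate: sample mean of the log-likelihood ratio.
kld = np.mean(stats.norm.logpdf(x, *norm_params)
              - stats.laplace.logpdf(x, *laplace_params))
print(f"Estimated KLD (normal vs. Laplace fit): {kld:.4f}")
```

Because the data here are actually normal, the estimated divergence favors the normal fit (a positive value); with real data, the sign and magnitude indicate which candidate model sits closer to the data-generating distribution.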
Full text: 1 Database: MEDLINE Study type: Prognostic studies Language: En Publication year: 2013 Document type: Article