Applications of a Kullback-Leibler Divergence for Comparing Non-nested Models.
Stat Modelling; 13(5-6): 409-429, 2013 Dec.
Article in English | MEDLINE | ID: mdl-24795532
Wang and Ghosh (2011) proposed a Kullback-Leibler divergence (KLD) that is asymptotically equivalent to the KLD of Goutis and Robert (1998) when the reference model (in comparison with a competing fitted model) is correctly specified and certain regularity conditions hold. While the properties of the KLD of Wang and Ghosh (2011) have been investigated in the Bayesian framework, this paper further explores its properties in the frequentist framework using four application examples, each fitted by two competing non-nested models.
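The abstract does not reproduce the specific estimator of Wang and Ghosh (2011), but the underlying quantity, the Kullback-Leibler divergence between two candidate densities, can be sketched with a minimal Monte Carlo estimate. The sketch below is an illustrative assumption, not the paper's method: it compares two Gaussians (a simple stand-in for two non-nested fitted models) and checks the estimate against the closed-form Gaussian KLD.

```python
import math
import random

def log_normal_pdf(x, mu, sigma):
    """Log-density of N(mu, sigma^2) at x."""
    return -0.5 * math.log(2 * math.pi * sigma**2) - (x - mu)**2 / (2 * sigma**2)

def kl_monte_carlo(mu0, s0, mu1, s1, n=100_000, seed=42):
    """Estimate KL(f0 || f1) = E_{f0}[log f0(X) - log f1(X)] by sampling from f0.

    Here f0 plays the role of the reference model and f1 a competing
    (hypothetical) non-nested model; both are Gaussians for illustration.
    """
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n):
        x = rng.gauss(mu0, s0)
        total += log_normal_pdf(x, mu0, s0) - log_normal_pdf(x, mu1, s1)
    return total / n

def kl_gaussian_exact(mu0, s0, mu1, s1):
    """Closed-form KL(N(mu0, s0^2) || N(mu1, s1^2)) for checking the estimate."""
    return math.log(s1 / s0) + (s0**2 + (mu0 - mu1)**2) / (2 * s1**2) - 0.5

estimate = kl_monte_carlo(0.0, 1.0, 1.0, 2.0)
exact = kl_gaussian_exact(0.0, 1.0, 1.0, 2.0)  # ~0.443
```

A larger divergence indicates that the competing density f1 is a poorer approximation to the reference f0; model comparison then amounts to ranking candidate models by their estimated divergence from the reference.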
Full text: 1
Database: MEDLINE
Study type: Prognostic_studies
Language: En
Publication year: 2013
Document type: Article