Abstract
Wang and Ghosh (2011) proposed a Kullback-Leibler divergence (KLD) that is asymptotically equivalent to the KLD of Goutis and Robert (1998) when the reference model (in comparison with a competing fitted model) is correctly specified and certain regularity conditions hold. While the properties of the Wang and Ghosh (2011) KLD have been investigated in the Bayesian framework, this paper further explores its properties in the frequentist framework through four application examples, each fitted by two competing non-nested models.
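For orientation, the divergences discussed in the abstract are model-comparison variants of the standard Kullback-Leibler divergence between two densities f and g; the sketch below gives only this general textbook form, not the specific constructions of Goutis and Robert (1998) or Wang and Ghosh (2011).

% Standard KLD between densities f and g (general background form;
% the cited papers build model-comparison divergences on this quantity)
\[
  \mathrm{KL}(f \,\|\, g) = \int f(x) \, \log \frac{f(x)}{g(x)} \, dx
\]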
Keywords: Comparison of Non-nested Models; Information Criterion; Kullback-Leibler Divergence
Year: 2013 PMID: 24795532 PMCID: PMC4006220 DOI: 10.1177/1471082X13494610
Source DB: PubMed Journal: Stat Modelling ISSN: 1471-082X Impact factor: 2.039