1.
Joseph E. Cavanaugh 《Australian & New Zealand Journal of Statistics》2004,46(2):257-274
Model selection criteria are frequently developed by constructing estimators of discrepancy measures that assess the disparity between the 'true' model and a fitted approximating model. The Akaike information criterion (AIC) and its variants result from utilizing Kullback's directed divergence as the targeted discrepancy. The directed divergence is an asymmetric measure of separation between two statistical models, meaning that an alternative directed divergence can be obtained by reversing the roles of the two models in the definition of the measure. The sum of the two directed divergences is Kullback's symmetric divergence. In the framework of linear models, a comparison of the two directed divergences reveals an important distinction between the measures. When used to evaluate fitted approximating models that are improperly specified, the directed divergence which serves as the basis for AIC is more sensitive towards detecting overfitted models, whereas its counterpart is more sensitive towards detecting underfitted models. Since the symmetric divergence combines the information in both measures, it functions as a gauge of model disparity which is arguably more balanced than either of its individual components. With this motivation, the paper proposes a new class of criteria for linear model selection based on targeting the symmetric divergence. The criteria can be regarded as analogues of AIC and two of its variants: 'corrected' AIC or AICc and 'modified' AIC or MAIC. The paper examines the selection tendencies of the new criteria in a simulation study and the results indicate that they perform favourably when compared to their AIC analogues.
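As a self-contained illustration of the quantities the abstract discusses, the two directed divergences and their symmetric sum can be computed for discrete distributions. The distributions `p` and `q` below are hypothetical examples, not from the paper; the point is only to show the asymmetry of the directed divergence and the symmetry of Kullback's J:

```python
import numpy as np

def kl(p, q):
    """Kullback's directed divergence I(p : q) = sum p * log(p / q)."""
    p, q = np.asarray(p, dtype=float), np.asarray(q, dtype=float)
    return float(np.sum(p * np.log(p / q)))

def symmetric_divergence(p, q):
    """Kullback's symmetric divergence J(p, q) = I(p : q) + I(q : p)."""
    return kl(p, q) + kl(q, p)

# Two hypothetical discrete distributions (illustrative only).
p = np.array([0.5, 0.4, 0.1])
q = np.array([0.2, 0.3, 0.5])

# The two directed divergences generally differ (asymmetry),
# while their sum J is the same regardless of argument order.
forward = kl(p, q)
reverse = kl(q, p)
j = symmetric_divergence(p, q)
```

Since each directed component penalizes a different kind of misspecification, targeting `j` rather than `forward` alone is the motivation the paper gives for its AIC-style criteria.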
2.
Inder Jeet Taneja 《Communications in Statistics - Theory and Methods》2013,42(9):1654-1672
Three classical divergence measures exist in the literature on information theory and statistics: the Jeffreys-Kullback-Leibler J-divergence (Jeffreys, 1946; Kullback and Leibler, 1951); the Sibson-Burbea-Rao Jensen-Shannon divergence (Sibson, 1969; Burbea and Rao, 1982); and the Taneja (1995) arithmetic-geometric divergence. These three measures bear an interesting relationship to one another. Divergence measures such as the Hellinger (1909) discrimination, the symmetric χ2-divergence, and the triangular discrimination are also known in the literature. In this article, we consider generalized symmetric divergence measures having the measures given above as particular cases. Bounds on the probability of error are obtained in terms of generalized symmetric divergence measures, and the study of such bounds is extended to differences of divergence measures.
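A minimal sketch of the three classical measures named above, using their textbook definitions for discrete distributions (the function names and example distributions are illustrative assumptions, not taken from the article):

```python
import numpy as np

def kl(p, q):
    """Directed Kullback-Leibler divergence sum p * log(p / q)."""
    return float(np.sum(p * np.log(p / q)))

def j_divergence(p, q):
    """Jeffreys J-divergence: KL(p||q) + KL(q||p)."""
    return kl(p, q) + kl(q, p)

def js_divergence(p, q):
    """Jensen-Shannon divergence: average KL to the midpoint m = (p+q)/2."""
    m = (p + q) / 2
    return 0.5 * kl(p, m) + 0.5 * kl(q, m)

def ag_divergence(p, q):
    """Arithmetic-geometric divergence (Taneja): compares the arithmetic
    mean (p+q)/2 against the geometric mean sqrt(p*q)."""
    m = (p + q) / 2
    return float(np.sum(m * np.log(m / np.sqrt(p * q))))

# Hypothetical example distributions.
p = np.array([0.5, 0.4, 0.1])
q = np.array([0.2, 0.3, 0.5])
```

All three are symmetric in `p` and `q` and vanish when `p == q`, which is what qualifies them as candidates for the article's generalized symmetric family.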
3.
G.R. Dargahi-Noubary 《Communications in Statistics - Theory and Methods》2013,42(9):2439-2458
Discrimination between two Gaussian time series is examined, assuming that the important difference between the alternative processes is their covariance (spectral) structure. Using the likelihood ratio method in the frequency domain, a discriminant function is derived and its approximate distribution is obtained. It is demonstrated that, utilizing the Kullback-Leibler information measure, the frequencies or frequency bands which carry information for discrimination can be determined. Using this, it is shown that when the mean functions are equal, discrimination based on the frequency with the largest discrimination information is equivalent to the classification procedure based on the best linear discriminant. Application to seismology is described, including a discussion of the spectral ratio discriminant for underground nuclear explosions and natural earthquakes, and is illustrated numerically using Rayleigh wave data from an underground and an atmospheric explosion.
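Under the standard frequency-domain (Whittle) approximation, the Kullback-Leibler discrimination information between two zero-mean stationary Gaussian processes decomposes across frequencies, so the most informative frequency can be located by scanning the per-frequency contribution, which is (up to a constant factor) `f1/f2 - log(f1/f2) - 1`. The sketch below applies this to two hypothetical AR(1) spectra; the processes and parameters are illustrative assumptions, not the seismological data of the paper:

```python
import numpy as np

def ar1_spectrum(phi, sigma2, freqs):
    """Spectral density of an AR(1) process at angular frequencies freqs."""
    return sigma2 / (2 * np.pi * np.abs(1 - phi * np.exp(-1j * freqs)) ** 2)

def per_freq_discrimination(f1, f2):
    """Per-frequency KL discrimination information (Whittle approximation,
    up to a constant): r - log(r) - 1 with r = f1/f2.
    Nonnegative, and zero exactly where the two spectra agree."""
    r = f1 / f2
    return r - np.log(r) - 1

freqs = np.linspace(0.01, np.pi, 500)
f1 = ar1_spectrum(0.8, 1.0, freqs)   # hypothetical process 1
f2 = ar1_spectrum(0.2, 1.0, freqs)   # hypothetical process 2

d = per_freq_discrimination(f1, f2)
best = freqs[np.argmax(d)]  # frequency carrying the most discrimination info
```

For these two AR(1) processes the spectra differ most at low frequencies, so the scan picks out the lowest frequency in the grid; this mirrors the paper's idea of classifying on the single most informative frequency when the means are equal.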