Search results: 3 articles
1.
Model selection criteria are frequently developed by constructing estimators of discrepancy measures that assess the disparity between the 'true' model and a fitted approximating model. The Akaike information criterion (AIC) and its variants result from utilizing Kullback's directed divergence as the targeted discrepancy. The directed divergence is an asymmetric measure of separation between two statistical models, meaning that an alternative directed divergence can be obtained by reversing the roles of the two models in the definition of the measure. The sum of the two directed divergences is Kullback's symmetric divergence. In the framework of linear models, a comparison of the two directed divergences reveals an important distinction between the measures. When used to evaluate fitted approximating models that are improperly specified, the directed divergence which serves as the basis for AIC is more sensitive towards detecting overfitted models, whereas its counterpart is more sensitive towards detecting underfitted models. Since the symmetric divergence combines the information in both measures, it functions as a gauge of model disparity which is arguably more balanced than either of its individual components. With this motivation, the paper proposes a new class of criteria for linear model selection based on targeting the symmetric divergence. The criteria can be regarded as analogues of AIC and two of its variants: 'corrected' AIC or AICc and 'modified' AIC or MAIC. The paper examines the selection tendencies of the new criteria in a simulation study and the results indicate that they perform favourably when compared to their AIC analogues.
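For context, the quantities this abstract refers to can be written down compactly. The following is a minimal sketch; the penalty of the symmetric-divergence analogue (denoted KIC here, following its commonly cited form in Cavanaugh's related work) is an assumption, not a statement taken from the abstract itself.

```latex
% Kullback's directed divergence of a fitted model g from the true model f
% (asymmetric: reversing the roles of f and g gives the second measure):
d(f,g) = \mathrm{E}_f\!\left[\log\frac{f(y)}{g(y)}\right],
\qquad
d(g,f) = \mathrm{E}_g\!\left[\log\frac{g(y)}{f(y)}\right].

% Kullback's symmetric divergence is the sum of the two directed divergences:
J(f,g) = d(f,g) + d(g,f).

% AIC estimates a variant of the expected directed divergence d(f,g) for a
% model with k estimated parameters and maximized likelihood L(\hat\theta):
\mathrm{AIC} = -2\log L(\hat\theta) + 2k.

% Assumed form of the symmetric-divergence analogue (often written KIC):
\mathrm{KIC} = -2\log L(\hat\theta) + 3k.
```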
2.
Three classical divergence measures exist in the literature on information theory and statistics: the Jeffreys-Kullback-Leibler J-divergence (Jeffreys, 1946; Kullback and Leibler, 1951), the Sibson-Burbea-Rao Jensen-Shannon divergence (Sibson, 1969; Burbea and Rao, 1982), and the arithmetic-geometric divergence (Taneja, 1995). These three measures bear an interesting relationship to one another. Divergence measures such as the Hellinger (1909) discrimination, the symmetric χ2-divergence, and the triangular discrimination are also known in the literature. In this article, we consider generalized symmetric divergence measures having the measures above as particular cases. Bounds on the probability of error are obtained in terms of generalized symmetric divergence measures, and the study of these bounds is extended to differences of divergence measures.
References:
Jeffreys, H. (1946). An invariant form for the prior probability in estimation problems. Proc. Roy. Soc. Lond. Ser. A 186:453-461.
Kullback, S., Leibler, R. A. (1951). On information and sufficiency. Ann. Math. Statist. 22:79-86.
Sibson, R. (1969). Information radius. Z. Wahrsch. und verw. Geb. 14:149-160.
Burbea, J., Rao, C. R. (1982). On the convexity of some divergence measures based on entropy functions. IEEE Trans. Inform. Theor. IT-28:489-495.
Taneja, I. J. (1995). New developments in generalized information measures. In: Hawkes, P. W., ed., Advances in Imaging and Electron Physics 91:37-136.
Hellinger, E. (1909). Neue Begründung der Theorie der quadratischen Formen von unendlichvielen Veränderlichen. J. Reine Angew. Math. 136:210-271.
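A minimal sketch in Python of the measures named above, for strictly positive discrete distributions; the function names and the particular normalizations (for example, writing the Hellinger discrimination as 1 - sum sqrt(p_i q_i)) follow common conventions in this literature and are assumptions, not definitions quoted from the article.

```python
import numpy as np

def kl(p, q):
    """Kullback-Leibler divergence sum_i p_i log(p_i / q_i), natural log."""
    p, q = np.asarray(p, float), np.asarray(q, float)
    mask = p > 0
    return float(np.sum(p[mask] * np.log(p[mask] / q[mask])))

def j_divergence(p, q):
    """Jeffreys-Kullback-Leibler J-divergence: KL(p||q) + KL(q||p)."""
    return kl(p, q) + kl(q, p)

def jensen_shannon(p, q):
    """Sibson-Burbea-Rao Jensen-Shannon divergence via the midpoint m."""
    m = (np.asarray(p, float) + np.asarray(q, float)) / 2
    return 0.5 * (kl(p, m) + kl(q, m))

def arithmetic_geometric(p, q):
    """Arithmetic-geometric divergence: arithmetic mean against geometric mean."""
    p, q = np.asarray(p, float), np.asarray(q, float)
    m = (p + q) / 2
    return float(np.sum(m * np.log(m / np.sqrt(p * q))))

def hellinger(p, q):
    """Hellinger discrimination: 1 - sum_i sqrt(p_i q_i)."""
    p, q = np.asarray(p, float), np.asarray(q, float)
    return float(1.0 - np.sum(np.sqrt(p * q)))

def symmetric_chi2(p, q):
    """Symmetric chi-square divergence: sum_i (p_i - q_i)^2 (p_i + q_i) / (p_i q_i)."""
    p, q = np.asarray(p, float), np.asarray(q, float)
    return float(np.sum((p - q) ** 2 * (p + q) / (p * q)))

def triangular(p, q):
    """Triangular discrimination: sum_i (p_i - q_i)^2 / (p_i + q_i)."""
    p, q = np.asarray(p, float), np.asarray(q, float)
    return float(np.sum((p - q) ** 2 / (p + q)))

# Illustrative pair of distributions (hypothetical data):
p, q = np.array([0.5, 0.3, 0.2]), np.array([0.4, 0.4, 0.2])
for name, fn in [("J", j_divergence), ("JS", jensen_shannon),
                 ("AG", arithmetic_geometric), ("Hellinger", hellinger),
                 ("sym chi^2", symmetric_chi2), ("triangular", triangular)]:
    print(name, round(fn(p, q), 6))
```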
3.
Discrimination between two Gaussian time series is examined assuming that the important difference between the alternative processes is their covariance (spectral) structure. Using the likelihood ratio method in the frequency domain, a discriminant function is derived and its approximate distribution is obtained. It is demonstrated that, utilizing the Kullback-Leibler information measure, the frequencies or frequency bands which carry information for discrimination can be determined. Using this, it is shown that when the mean functions are equal, discrimination based on the frequency with the largest discrimination information is equivalent to the classification procedure based on the best linear discriminant. An application to seismology is described, including a discussion of the spectral ratio discriminant for distinguishing underground nuclear explosions from natural earthquakes, and is illustrated numerically using Rayleigh wave data from an underground and an atmospheric explosion.
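The frequency-domain likelihood-ratio discriminant described above can be sketched with the Whittle approximation. Everything below is illustrative: the periodogram normalization, the AR(1) demo spectra, and the function names are assumptions for the sketch, not the article's actual derivation or data.

```python
import numpy as np

def periodogram(x):
    """Periodogram I(l_j) at the Fourier frequencies, normalized so that
    E[I(l_j)] is roughly the spectral density f(l_j)."""
    n = len(x)
    dft = np.fft.rfft(x - x.mean())
    return np.abs(dft) ** 2 / (2 * np.pi * n)

def whittle_discriminant(x, f1, f2):
    """Approximate frequency-domain log-likelihood ratio for classifying x
    as coming from spectrum f1 rather than f2 (equal means assumed):
    sum_j [ log(f2/f1) + I_j (1/f2 - 1/f1) ].  Positive values favor f1."""
    I = periodogram(x)
    return float(np.sum(np.log(f2 / f1) + I * (1.0 / f2 - 1.0 / f1)))

def kl_information(f1, f2):
    """Per-frequency Kullback-Leibler discrimination information between
    Gaussian series with spectra f1 and f2: f1/f2 - log(f1/f2) - 1.
    Its peaks mark the frequency bands that carry discriminating information."""
    r = f1 / f2
    return r - np.log(r) - 1.0

# Hypothetical AR(1) example: classify a phi = 0.7 series against a
# phi = 0.2 alternative.
def ar1_spectrum(phi, lam, sigma2=1.0):
    return sigma2 / (2 * np.pi * np.abs(1.0 - phi * np.exp(-1j * lam)) ** 2)

n = 512
lam = 2 * np.pi * np.fft.rfftfreq(n)           # Fourier frequencies in [0, pi]
rng = np.random.default_rng(0)
x = np.zeros(n)
for t in range(1, n):                          # simulate AR(1) with phi = 0.7
    x[t] = 0.7 * x[t - 1] + rng.standard_normal()
f1, f2 = ar1_spectrum(0.7, lam), ar1_spectrum(0.2, lam)
print(whittle_discriminant(x, f1, f2) > 0)     # expected: True
print(lam[np.argmax(kl_information(f1, f2))])  # most informative frequency
```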