Full text (subscription): 289 articles; free access: 6 articles.
By subject: Management 4; Collected works 2; Theory and methodology 2; General 24; Sociology 4; Statistics 259.
By year: 2022: 1; 2021: 1; 2020: 4; 2019: 10; 2018: 14; 2017: 20; 2016: 9; 2015: 4; 2014: 9; 2013: 85; 2012: 31; 2011: 9; 2010: 13; 2009: 9; 2008: 8; 2007: 16; 2006: 2; 2005: 6; 2004: 9; 2003: 1; 2001: 5; 2000: 5; 1999: 3; 1998: 3; 1997: 1; 1995: 1; 1994: 1; 1993: 2; 1992: 2; 1991: 1; 1990: 2; 1989: 1; 1986: 2; 1985: 1; 1984: 1; 1983: 1; 1981: 1; 1980: 1.
295 results found (search time: 78 ms).
71.
We construct the distributions minimizing Fisher information for scale in Kolmogorov neighbourhoods $K_\varepsilon(G) = \{F \mid \sup_x |F(x) - G(x)| \le \varepsilon\}$ of distribution functions $G$ satisfying certain mild conditions. The theory is sufficiently general to include the cases in which $G$ is normal, Laplace, logistic, Student's $t$, etc. We also consider $G(x) = 1 - e^{-x}$, $x \ge 0$, and correct some errors in the literature concerning this case.
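For reference, a standard formulation of the object being minimized (Huber-type notation assumed here, not quoted from the paper): for an absolutely continuous $F$ with density $f$, the Fisher information for scale is

$$ I_s(F) = \int_{-\infty}^{\infty} \Bigl( 1 + x\,\frac{f'(x)}{f(x)} \Bigr)^{2} f(x)\, dx, $$

and the minimax problem is to find $F_0 \in K_\varepsilon(G)$ attaining $\inf_{F \in K_\varepsilon(G)} I_s(F)$.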
72.
A weak Fisher effect and nominal interest rate stickiness are prerequisites for monetary policy to be effective. This paper uses a Fourier transform to handle the time variation of the real interest rate, extends the cointegration model to examine the long-run Fisher effect, and builds a threshold error-correction model that separates the long-run from the short-run Fisher effect and characterizes the short-run dynamic adjustment of the nominal interest rate. Based on monthly data for China from January 1990 to December 2017, we find: (1) a long-run weak Fisher effect between the nominal interest rate and inflation; (2) a significant double-threshold effect in the short-run dynamic adjustment of the nominal interest rate: adjustment is significant and rapid when the nominal rate is far above its equilibrium value, while no significant adjustment is found when the nominal rate is below equilibrium or in the middle regime, i.e., the nominal interest rate is sticky. The results indicate that quantity-based monetary policy remains effective in China at the current stage, so there is room for the combined use of quantity-based and price-based monetary policy.
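As a sketch of the kind of specification this involves (the abstract does not give the exact model; the form below is a common Fourier-augmented cointegrating regression, assumed here for illustration):

$$ i_t = \alpha_0 + \gamma_1 \sin\!\Bigl(\frac{2\pi k t}{T}\Bigr) + \gamma_2 \cos\!\Bigl(\frac{2\pi k t}{T}\Bigr) + \beta\,\pi_t + u_t, $$

where $i_t$ is the nominal interest rate, $\pi_t$ is inflation, $T$ is the sample size and $k$ is the Fourier frequency; a long-run weak Fisher effect corresponds to $0 < \beta < 1$, while the full Fisher effect would be $\beta = 1$.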
73.
We propose a bivariate Farlie–Gumbel–Morgenstern (FGM) copula model for bivariate meta-analysis, and develop a maximum likelihood estimator for the common mean vector. With the aid of novel mathematical identities for the FGM copula, we derive the expression of the Fisher information matrix. We also derive an approximation formula for the Fisher information matrix, which is accurate and easy to compute. Based on the theory of independent but not identically distributed (i.n.i.d.) samples, we examine the asymptotic properties of the estimator. Simulation studies are given to demonstrate the performance of the proposed method, and a real data analysis is provided to illustrate the method.
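For context, the FGM copula itself has the standard closed form (assumed here, not quoted from the paper)

$$ C_\theta(u, v) = u v \bigl[ 1 + \theta (1-u)(1-v) \bigr], \qquad \theta \in [-1, 1], $$

with density $c_\theta(u, v) = 1 + \theta (1-2u)(1-2v)$; a bivariate meta-analysis likelihood of the kind described would couple the two outcome-specific margins of each study through $c_\theta$.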
74.
The Wilcoxon–Mann–Whitney (WMW) test is a popular rank-based two-sample testing procedure for the strong null hypothesis that the two samples come from the same distribution. A modified WMW test, the Fligner–Policello (FP) test, has been proposed for comparing the medians of two populations. A fact that may be under-appreciated among some practitioners is that the FP test, like the WMW, can also be used to test the strong null. In this article, we compare the power of the WMW and FP tests for testing the strong null. Our results show that neither test is uniformly better than the other and that there can be substantial differences in power between the two choices. We propose a new, modified WMW test that combines the WMW and FP tests. Monte Carlo studies show that the combined test has good power compared to either the WMW or the FP test. We provide a fast implementation of the proposed test in open-source software. Supplementary materials for this article are available online.
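To make the two building blocks concrete, here is a minimal Python sketch: the WMW test via SciPy and a textbook-style implementation of the FP statistic (Hollander and Wolfe's placement form is assumed; this is not the authors' proposed combined test, whose exact construction is not given in the abstract).

```python
import numpy as np
from scipy import stats

def fligner_policello(x, y):
    """Fligner-Policello statistic and two-sided normal p-value (a sketch)."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    # Placements: P[i] = #{y_j < x_i}, Q[j] = #{x_i < y_j}
    P = np.array([np.sum(y < xi) for xi in x])
    Q = np.array([np.sum(x < yj) for yj in y])
    Pbar, Qbar = P.mean(), Q.mean()
    V1 = np.sum((P - Pbar) ** 2)
    V2 = np.sum((Q - Qbar) ** 2)
    U = (Q.sum() - P.sum()) / (2.0 * np.sqrt(V1 + V2 + Pbar * Qbar))
    return U, 2.0 * stats.norm.sf(abs(U))

rng = np.random.default_rng(0)
x = rng.normal(0.0, 1.0, size=30)
y = rng.normal(0.5, 2.0, size=40)
print(stats.mannwhitneyu(x, y, alternative="two-sided"))  # WMW
print(fligner_policello(x, y))                            # FP
```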
75.
If the amount of information contained in a random variable is greater than that contained in another random variable for one measure of information, it seems reasonable to require that this relation remain true for any other valid measure. In this paper we investigate divergence and Fisher-type measures of information with respect to this property, which is due to Shiva, Ahmed and Georganas (1973). It is shown that the property is satisfied only for a certain region of values of the parameter (order) $\alpha$ of the measures of information.
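As one concrete instance of an order-$\alpha$ family (an illustration of the kind of measure involved, not necessarily the exact family studied), the Rényi divergence of order $\alpha$ is

$$ D_\alpha(P \,\|\, Q) = \frac{1}{\alpha - 1} \log \int p(x)^{\alpha}\, q(x)^{1-\alpha}\, dx, \qquad \alpha > 0,\ \alpha \neq 1, $$

and the ordering property asks whether $D_\alpha(P_1 \,\|\, Q) \ge D_\alpha(P_2 \,\|\, Q)$ for one admissible measure implies the same inequality for every other one.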
76.
Nonlinear mixed-effects (NLME) modeling is one of the most powerful tools for analyzing longitudinal data, especially under a sparse sampling design. The determinant of the Fisher information matrix is a commonly used global metric of the information that can be provided by the data under a given model. However, in clinical studies it is also important to measure how much information the data provide for a certain parameter of interest under the assumed model, for example, the clearance in population pharmacokinetic models. This paper proposes a new, easy-to-interpret information metric, the "relative information" (RI), which is designed for specific parameters of a model and takes a value between 0% and 100%. We establish the relationship between the interindividual variability for a specific parameter and the variance of the associated parameter estimator, demonstrating that, under a "perfect" experiment (e.g., infinite samples and/or minimal experimental error), the RI and the variance of the model parameter estimator converge, respectively, to 100% and to the ratio of the interindividual variability for that parameter to the number of subjects. Extensive simulation experiments and analyses of three real datasets show that the proposed RI metric can accurately characterize the information for parameters of interest in NLME models. The new information metric can readily be used to facilitate study design and model diagnosis.
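In symbols, with notation assumed here ($\omega_\theta^2$ for the interindividual variability of parameter $\theta$ and $N$ for the number of subjects), the stated limits under a "perfect" experiment read

$$ \mathrm{RI}(\theta) \to 100\%, \qquad \operatorname{Var}\bigl(\hat{\theta}\bigr) \to \frac{\omega_\theta^{2}}{N}. $$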
77.
The correction for grouping is a sum of two terms, the first depending on the length of the grouping interval, the second being a periodic function of the position. Thiele (1873) studied the second term but missed the first. Sheppard (1898) studied the first term but missed the second. Bruns (1906) derived the first term as the aperiodic term of a Fourier series and the second as the sum of the periodic terms. He found the correction to the coefficients of the Gram–Charlier series and proved that the second term is negligible for a grouped normal distribution with at least eight groups. Independently, Fisher (1922) used the same method to derive the correction to the moments. For the normal distribution with a grouping interval less than the standard deviation, Fisher proved that the second term is negligible compared with the first and with the standard error of the first four moments. Moreover, he proved that the estimates of the mean and the standard deviation obtained by the method of moments from a grouped sample with Sheppard's corrections have nearly the same variances as the maximum likelihood estimates, thus providing a new and compelling reason for using Sheppard's corrections.
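For example, Sheppard's correction to the second moment is the familiar adjustment

$$ \mu_2 \approx m_2 - \frac{h^2}{12}, $$

where $m_2$ is the second central moment computed from the grouped data and $h$ is the length of the grouping interval; this is the aperiodic first term, the periodic second term being the one shown to be negligible in the cases above.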
78.
Introducing a shape parameter to an exponential model is nothing new, and there are many ways to do it. The different methods may result in a variety of weighted exponential (WE) distributions. In this article, we introduce a shape parameter to an exponential model using the idea of Azzalini, which results in a new class of WE distributions. This new WE model has a probability density function (PDF) whose shape is very close to the shapes of the PDFs of the Weibull, gamma or generalized exponential distributions, so it can be used as an alternative to any of these distributions. It is observed that this model can also be obtained as a hidden truncation model. Different properties of this new model are discussed and compared with the corresponding properties of well-known distributions. Two data sets are analysed for illustrative purposes, and in both cases the model fits better than the Weibull, gamma or generalized exponential distributions.
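A hedged sketch of the construction: applying Azzalini's weighting idea to an exponential base distribution yields a density of the standard weighted-exponential form (assumed here, not quoted from the article)

$$ f(x; \alpha, \lambda) = \frac{\alpha + 1}{\alpha}\, \lambda e^{-\lambda x} \bigl( 1 - e^{-\alpha \lambda x} \bigr), \qquad x > 0,\ \alpha > 0,\ \lambda > 0, $$

where $\alpha$ is the added shape parameter and the factor $(\alpha + 1)/\alpha$ normalizes the weighted density.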
79.
The author investigates least squares as a method for fitting small-circle models to a sample of unit vectors in $\mathbb{R}^3$. He highlights a local linear model underlying the estimation of the parameters of a circle. This model is used to construct an estimation algorithm and regression-type inference procedures for the parameters of a circle, and it makes it possible to compare the fit of a small circle with that of a spherical ellipse. The limitations of the least-squares approach are emphasized: when the errors are bounded away from 0, the least-squares estimators are not consistent as the sample size goes to infinity. Two examples, concerned with the migration of elephant seals and with the classification of geological folds, are analyzed using the linear model techniques proposed in this work.
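One plausible least-squares formulation (a sketch under assumed notation, not necessarily the author's exact algorithm): parametrize the small circle by a unit axis $a$ and an angular radius $\rho$, and minimize the squared angular residuals $\bigl(\arccos(a^\top x_i) - \rho\bigr)^2$.

```python
import numpy as np
from scipy.optimize import least_squares

def fit_small_circle(X):
    """Least-squares small-circle fit to unit vectors X (n x 3), a sketch.

    The circle is parametrized by an axis direction (theta, phi) and an
    angular radius rho; residuals are arccos(a . x_i) - rho.
    """
    def unpack(p):
        theta, phi, rho = p
        a = np.array([np.sin(theta) * np.cos(phi),
                      np.sin(theta) * np.sin(phi),
                      np.cos(theta)])
        return a, rho

    def resid(p):
        a, rho = unpack(p)
        return np.arccos(np.clip(X @ a, -1.0, 1.0)) - rho

    # Start from the mean direction with a rough angular radius.
    m = X.mean(axis=0)
    m /= np.linalg.norm(m)
    p0 = [np.arccos(m[2]),
          np.arctan2(m[1], m[0]),
          np.arccos(np.clip(X @ m, -1.0, 1.0)).mean()]
    sol = least_squares(resid, p0)
    return unpack(sol.x)
```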
80.
This article presents a design approach for sequential constant-stress accelerated life tests (ALT) with an auxiliary acceleration factor (AAF). An AAF, if one exists, is used to further amplify the failure probability of highly reliable test items at low stress levels while maintaining an acceptable degree of extrapolation for reliability inference. Based on a Bayesian design criterion, the optimal plan optimizes the sample allocation, the stress combination, and the loading profile of the AAF. In particular, a step-stress loading profile based on an appropriate cumulative exposure (CE) model is chosen for the AAF so that the initial auxiliary stress is not too harsh. A case study, providing the motivation and practical importance of our study, is presented to illustrate the proposed planning approach.
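For reference, the cumulative exposure idea for a two-step loading profile (a standard Nelson-type CE model, assumed rather than quoted) joins the distributions $F_1, F_2$ at the two stress levels through an equivalent starting time $\tau^{*}$:

$$ F(t) = \begin{cases} F_1(t), & 0 \le t < \tau, \\ F_2(t - \tau + \tau^{*}), & t \ge \tau, \end{cases} \qquad \text{where } \tau^{*} \text{ solves } F_2(\tau^{*}) = F_1(\tau), $$

so that the test population carries its accumulated damage across the stress change.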