Full-text access type (articles)
Paid full text | 974 |
Free | 17 |
Free (domestic) | 2 |
Subject classification (articles)
Management | 166 |
Ethnology | 1 |
Talent studies | 1 |
Demography | 16 |
Collected works | 16 |
Theory and methodology | 6 |
General | 374 |
Sociology | 23 |
Statistics | 390 |
Publication year (articles)
2024 | 1 |
2023 | 3 |
2022 | 2 |
2021 | 7 |
2020 | 6 |
2019 | 11 |
2018 | 19 |
2017 | 62 |
2016 | 29 |
2015 | 45 |
2014 | 36 |
2013 | 224 |
2012 | 67 |
2011 | 39 |
2010 | 23 |
2009 | 48 |
2008 | 37 |
2007 | 37 |
2006 | 26 |
2005 | 35 |
2004 | 26 |
2003 | 20 |
2002 | 22 |
2001 | 18 |
2000 | 21 |
1999 | 12 |
1998 | 18 |
1997 | 13 |
1996 | 15 |
1995 | 18 |
1994 | 5 |
1993 | 13 |
1992 | 13 |
1991 | 3 |
1990 | 2 |
1989 | 7 |
1988 | 3 |
1987 | 5 |
1986 | 1 |
1985 | 1 |
Sort order: 993 results in total, search time 136 ms
11.
Here, we consider a generalized form of the alternative zero-inflated logarithmic series distribution of Kumar and Riyaz (J. Statist. Comp. Simul., 2015) and study some of its important aspects. The parameters of the distribution are estimated by the method of maximum likelihood, and test procedures are developed for assessing the significance of the model's additional parameter. The estimation and testing procedures are illustrated on several real-life datasets, and a simulation study is carried out to assess the performance of the estimators.
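The generalized alternative form of Kumar and Riyaz is not reproduced here. As a rough illustration of the estimation step only, the following sketch fits the *basic* zero-inflated logarithmic series distribution, with assumed pmf P(0) = π and P(k) = (1 − π)·pᵏ/(−k·log(1 − p)) for k ≥ 1, by maximum likelihood; all function names are mine.

```python
import numpy as np
from scipy.optimize import minimize

def zils_logpmf(k, pi, p):
    """Log-pmf of a basic zero-inflated logarithmic series distribution:
    P(0) = pi,  P(k) = (1 - pi) * p**k / (-k * log(1 - p)) for k >= 1."""
    k = np.asarray(k, dtype=float)
    a = -1.0 / np.log1p(-p)                 # normalising constant of the log-series
    tail = np.log1p(-pi) + np.log(a) + k * np.log(p) - np.log(np.maximum(k, 1.0))
    return np.where(k == 0, np.log(pi), tail)

def fit_zils(data):
    """Maximum-likelihood estimates of (pi, p) by direct optimisation."""
    nll = lambda th: -zils_logpmf(data, th[0], th[1]).sum()
    return minimize(nll, x0=[0.3, 0.5], bounds=[(1e-6, 1 - 1e-6)] * 2).x
```

Because the log-series has no mass at zero, the MLE of π here is simply the observed proportion of zeros; the generalized model of the paper adds a further parameter on top of this structure.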
12.
Journal of Statistical Computation and Simulation, 2012, 82(8): 1621–1643
When a spatial point process model is fitted to spatial point pattern data using standard software, the parameter estimates are typically biased. Contrary to folklore, the bias does not reflect weaknesses of the underlying mathematical methods, but is mainly due to the effects of discretization of the spatial domain. We investigate two approaches to correcting the bias: a Newton–Raphson-type correction and Richardson extrapolation. In simulation experiments, Richardson extrapolation performs best.
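The paper's specific discretization correction is not reproduced here, but generic Richardson extrapolation, which combines the same estimator at two discretization levels to cancel the leading bias term, can be sketched as follows (the `p=1` bias order and the forward-difference toy example are my assumptions, not the paper's setting):

```python
import math

def richardson(theta_h, theta_half, p=1):
    """Extrapolate two estimates computed at spacings h and h/2,
    assuming the leading bias term is c * h**p."""
    return (2**p * theta_half - theta_h) / (2**p - 1)

# Illustration on a generic discretization bias: the forward-difference
# derivative of exp at 0 has O(h) bias, which extrapolation removes.
fd = lambda h: (math.exp(h) - 1.0) / h
crude = fd(0.1)                                # biased by roughly h/2
better = richardson(fd(0.1), fd(0.05), p=1)    # leading bias cancelled
```

In the point-process context, "h" plays the role of the pixel size of the discretized domain, and theta is the fitted model parameter.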
13.
14.
Journal of Statistical Computation and Simulation, 2012, 82(2–3): 87–92
The standard frequency-domain approximation to the Gaussian likelihood of a sample from an ARMA process is considered. The Newton–Raphson and Gauss–Newton numerical maximisation algorithms are evaluated for this approximate likelihood, and the relationships between these algorithms and those of Akaike and Hannan are explored. In particular, it is shown that Hannan's method has certain computational advantages over the other spectral estimation methods considered.
15.
Frailty models can be fit as mixed-effects Poisson models after transforming time-to-event data to the Poisson model framework. We assess, through simulations, the robustness of Poisson likelihood estimation for Cox proportional hazards models with log-normal frailties under misspecified frailty distribution. The log-gamma and Laplace distributions were used as true distributions for frailties on a natural log scale. Factors such as the magnitude of heterogeneity, censoring rate, number and sizes of groups were explored. In the simulations, the Poisson modeling approach that assumes log-normally distributed frailties provided accurate estimates of within- and between-group fixed effects even under a misspecified frailty distribution. Non-robust estimation of variance components was observed in the situations of substantial heterogeneity, large event rates, or high data dimensions.
16.
Journal of Statistical Computation and Simulation, 2012, 82(3–4): 227–236
The widely used Tietjen–Moore multiple-outlier statistic has a defect as originally proposed, in that it may test the wrong observations as outliers. The defect is corrected by redefinition, and the statistic is extended to make use of possible additional information on the underlying variance. Results of a simulation of the revised statistic are presented.
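The revised statistic itself is not restated in this abstract. For orientation, the standard two-sided Tietjen–Moore E_k (as documented in the NIST/SEMATECH handbook; my assumption about which variant is meant) removes the k observations farthest from the mean and compares residual sums of squares, with the null distribution obtained by simulation:

```python
import numpy as np

def tietjen_moore_ek(y, k):
    """E_k for the k observations farthest from the sample mean
    (two-sided version); small E_k points to outliers."""
    y = np.asarray(y, dtype=float)
    keep = y[np.argsort(np.abs(y - y.mean()))[: len(y) - k]]
    return np.sum((keep - keep.mean()) ** 2) / np.sum((y - y.mean()) ** 2)

def tietjen_moore_pvalue(y, k, n_sim=5000, seed=0):
    """Monte Carlo p-value under the normal null distribution."""
    rng = np.random.default_rng(seed)
    e_obs = tietjen_moore_ek(y, k)
    sims = np.array([tietjen_moore_ek(rng.standard_normal(len(y)), k)
                     for _ in range(n_sim)])
    return float(np.mean(sims <= e_obs))
```

The defect the paper addresses concerns which observations the statistic ends up testing; this sketch shows only the basic form, not the correction.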
17.
Jared L. Deutsch, Clayton V. Deutsch, Journal of Statistical Planning and Inference, 2012, 142(3): 763–772
Complex models can only be realized a limited number of times due to large computational requirements. Methods exist for generating input parameters for model realizations including Monte Carlo simulation (MCS) and Latin hypercube sampling (LHS). Recent algorithms such as maximinLHS seek to maximize the minimum distance between model inputs in the multivariate space. A novel extension of Latin hypercube sampling (LHSMDU) for multivariate models is developed here that increases the multidimensional uniformity of the input parameters through sequential realization elimination. Correlations are considered in the LHSMDU sampling matrix using a Cholesky decomposition of the correlation matrix. Computer code implementing the proposed algorithm supplements this article. A simulation study comparing MCS, LHS, maximinLHS and LHSMDU demonstrates that increased multidimensional uniformity can significantly improve realization efficiency and that LHSMDU is effective for large multivariate problems.
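LHSMDU's sequential realization elimination step is not reproduced here, but its two standard ingredients, stratified Latin hypercube sampling and Cholesky-induced correlation, can be sketched as follows (function names are mine, not the authors'):

```python
import numpy as np
from scipy.stats import norm

def lhs(n, d, rng=None):
    """Plain Latin hypercube sample: n points in [0, 1]^d with exactly
    one point in each of the n equal-width strata of every margin."""
    rng = np.random.default_rng(rng)
    strata = rng.permuted(np.tile(np.arange(n), (d, 1)), axis=1).T
    return (strata + rng.random((n, d))) / n

def correlated_lhs(n, corr, rng=None):
    """Induce a target correlation via a Cholesky factor applied to
    normal scores of the LHS columns. Note: the linear mixing slightly
    perturbs the Latin strata; a rank-based (Iman-Conover) reordering
    would preserve the margins exactly."""
    z = norm.ppf(lhs(n, corr.shape[0], rng))
    return norm.cdf(z @ np.linalg.cholesky(corr).T)
```

The paper's contribution is to then prune candidate realizations so that the retained points are more uniformly spread in the full multidimensional space, which neither of these basic steps guarantees on its own.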
18.
This paper compares the performance of regression analysis and a clustering-based neural network approach when the data deviate from the homoscedasticity assumption of regression. Heteroscedasticity arises in linear regression when the error variances are unequal; one classical remedy is weighted least-squares (WLS) regression. Here, a backpropagation neural network is applied to the problem instead, and an algorithm is proposed, based on robust estimates of location and the dispersion matrix, that helps preserve the error assumptions of the linear regression. The analysis is carried out with appropriate designs using simulated data, and the results are presented.
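The neural-network procedure is not sketched here, but the WLS baseline the paper compares against is compact enough to show. This is a minimal sketch on toy data, assuming error variance proportional to x² (an illustrative choice, not the paper's design):

```python
import numpy as np

def wls(X, y, w):
    """Weighted least squares: minimises sum_i w_i * (y_i - x_i'b)^2,
    i.e. solves the normal equations (X'WX) b = X'W y.
    w should be proportional to 1 / Var(error_i)."""
    Xw = X * w[:, None]
    return np.linalg.solve(X.T @ Xw, Xw.T @ y)

# Heteroscedastic toy data: error s.d. grows with x, so weight by 1/x^2.
rng = np.random.default_rng(0)
x = rng.uniform(1.0, 10.0, 500)
y = 2.0 + 3.0 * x + rng.normal(0.0, 0.5 * x)    # Var(e_i) proportional to x_i^2
X = np.column_stack([np.ones_like(x), x])
beta = wls(X, y, 1.0 / x**2)                    # close to (2, 3)
```

With equal weights the estimator reduces to ordinary least squares, which is a convenient sanity check.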
19.
Journal of Statistical Computation and Simulation, 2012, 82(4): 229–248
Identical numerical integration experiments are performed on a CYBER 205 and an IBM 3081 in order to gauge the relative performance of several methods of integration. The methods employed are the general methods of Gauss–Legendre, iterated Gauss–Legendre, Newton–Cotes, Romberg and Monte Carlo, as well as three methods, due to Owen, Dutt, and Clark respectively, for integrating the normal density. The bi- and trivariate normal densities and four other functions are integrated; the latter four have integrals expressible in closed form, and some of them can be parameterized to exhibit singularities or highly periodic behaviour. The various Gauss–Legendre methods tend to be the most accurate (when applied to the normal density they are even more accurate than the special-purpose methods designed for it), and while they are not the fastest, they are at least competitive. In scalar mode the CYBER is about 2–6 times faster than the IBM 3081, and the speed advantage of vectorised over scalar mode ranges from 6 to 15. Large-scale econometric problems of the probit type should now be routinely solvable.
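The accuracy gap between Gauss–Legendre quadrature and Monte Carlo on a smooth integrand like the normal density is easy to reproduce. A minimal sketch (node count and interval are my choices, not the paper's experimental design):

```python
import numpy as np

def gauss_legendre(f, a, b, n):
    """n-node Gauss-Legendre rule on [a, b]; exact for polynomials of
    degree up to 2n - 1, and rapidly convergent for smooth integrands."""
    x, w = np.polynomial.legendre.leggauss(n)
    t = 0.5 * (b - a) * x + 0.5 * (a + b)   # map nodes from [-1, 1] to [a, b]
    return 0.5 * (b - a) * np.sum(w * f(t))

def monte_carlo(f, a, b, n, seed=0):
    """Crude Monte Carlo estimate; error shrinks only like 1/sqrt(n)."""
    u = np.random.default_rng(seed).uniform(a, b, n)
    return (b - a) * f(u).mean()

# The univariate standard normal density, one of the paper's integrands.
phi = lambda t: np.exp(-0.5 * t**2) / np.sqrt(2.0 * np.pi)
```

With 20 Gauss–Legendre nodes the integral of `phi` over [-3, 3] is correct to many digits, while 100,000 Monte Carlo draws still leave an error around the third decimal place, illustrating the accuracy ordering the experiments report.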
20.
Based on multiphase transient-flow theory, a gas–liquid two-phase flow model for the wellbore is established, and the effect of the gas injection rate on wellbore pressure in low-pressure underbalanced drilling is computed and analysed. The results show that this effect depends not only on the injection rate itself but also on well depth, wellbore and drill-string geometry, and the flow rate and properties of the liquid phase. Simply increasing the gas injection rate does not necessarily lower the wellbore pressure; the outcome is governed by the balance between the hydrostatic pressure and the flow resistance of the gas–liquid mixture under the given conditions. The model and method offer guidance for determining the relevant parameters, configuring the surface compressor units, and preparing designs for low-pressure underbalanced drilling.
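The transient multiphase model itself is not reproduced here. The qualitative point, that hydrostatic head and frictional loss pull in opposite directions as gas injection increases, can be illustrated with a steady-state homogeneous-mixture toy; every parameter name and value below is illustrative only (constant gas density, single friction factor), not field data or the paper's model:

```python
import numpy as np

def bottomhole_pressure(q_gas, q_liq=0.01, depth=2000.0, diam=0.1,
                        rho_liq=1000.0, rho_gas=50.0, f=0.02):
    """Toy bottomhole pressure [Pa] for a homogeneous gas-liquid mixture:
    hydrostatic head of the mixture plus a Darcy-Weisbach friction loss.
    Gas density is held constant with depth, a deliberate simplification."""
    area = np.pi * diam**2 / 4.0
    q = q_gas + q_liq                                   # total volumetric rate
    rho_m = (rho_liq * q_liq + rho_gas * q_gas) / q     # mixture density
    v = q / area                                        # mixture velocity
    hydro = rho_m * 9.81 * depth                        # falls as gas fraction rises
    frict = f * (depth / diam) * 0.5 * rho_m * v**2     # rises with velocity
    return hydro + frict
```

Sweeping the gas rate in this toy shows pressure first falling (the column lightens) and then rising again (friction dominates), mirroring the abstract's conclusion that more gas does not necessarily mean lower wellbore pressure.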