901.
Progressively censored data from a classical Pareto distribution are used to make inferences about its shape and precision parameters and about the reliability function. The approximation due to Tierney and Kadane (1986) is used to obtain the Bayes estimates. Bayesian prediction of further observations from this distribution is also considered. For the Bayesian approach, conjugate priors are used in both the one-parameter and two-parameter cases. A numerical example and a simulation study illustrate the proposed procedures.
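For orientation only, the sketch below shows a conjugate-prior Bayes estimate of the shape parameter of a classical Pareto distribution from a complete sample with known scale. It is a simplified stand-in, not the paper's method: the progressive censoring scheme, the precision parameter, the reliability function, and the Tierney-Kadane approximation are all omitted, and the Gamma(a, b) prior and synthetic data are assumptions for illustration.

```python
import numpy as np

# Minimal sketch (not the paper's full method): Bayes estimation of the shape
# parameter alpha of a classical Pareto(alpha, beta) with known scale beta,
# using a conjugate Gamma(a, b) prior and a complete (uncensored) sample.
# Progressive censoring and the Tierney-Kadane approximation are omitted.

def pareto_shape_bayes(x, beta, a=1.0, b=1.0):
    """Posterior mean of alpha under a Gamma(a, b) prior.

    The Pareto likelihood in alpha is proportional to
    alpha^n * exp(-alpha * sum(log(x_i / beta))), so the posterior is
    Gamma(a + n, b + T) with T = sum(log(x_i / beta)).
    """
    x = np.asarray(x, dtype=float)
    n = x.size
    T = np.sum(np.log(x / beta))
    return (a + n) / (b + T)   # posterior mean = Bayes estimate under squared error loss

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    beta, alpha_true = 1.0, 2.5
    # Pareto draws via inverse CDF: x = beta * (1 - U)^(-1/alpha)
    x = beta * (1.0 - rng.uniform(size=200)) ** (-1.0 / alpha_true)
    print(pareto_shape_bayes(x, beta))   # close to 2.5 for moderate n
```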
902.

A test for exponentiality based on progressively Type-II right censored spacings was recently proposed by Balakrishnan et al. (2002), who derived the asymptotic null distribution of the test statistic. In this work, we use the algorithm of Huffer and Lin (2001) to evaluate the exact null probabilities and the exact critical values of this test statistic.
903.

The power of Pearson's chi-square test for uniformity depends heavily on the choice of the partition of the unit interval used in the test statistic. We propose a selection rule that chooses a suitable partition based on the data. This rule usually leads to markedly unequal cells that are well suited to the observed distribution. We investigate the corresponding data-driven chi-square test and present a Monte Carlo simulation study. The conclusion is that this test achieves high and very stable power over a large class of alternatives, and is much more stable than any of the tests we compare it with.
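As a point of reference, the sketch below computes the ordinary Pearson chi-square statistic for uniformity on a supplied partition of (0, 1). The paper's data-driven rule for selecting the partition is not described in the abstract and is not reproduced here; an equal-width partition is assumed in the example.

```python
import numpy as np
from scipy.stats import chi2

# Minimal sketch: Pearson's chi-square statistic for testing uniformity on (0, 1)
# for a given partition. The data-driven partition-selection rule of the paper
# is not reproduced; an equal-width partition is assumed below.

def chisq_uniformity(u, cutpoints):
    """Chi-square statistic and asymptotic p-value for H0: U(0, 1)."""
    u = np.asarray(u, dtype=float)
    edges = np.concatenate(([0.0], np.sort(cutpoints), [1.0]))
    observed = np.histogram(u, bins=edges)[0]
    expected = u.size * np.diff(edges)     # n * cell probabilities under H0
    stat = np.sum((observed - expected) ** 2 / expected)
    df = len(edges) - 2                    # (number of cells) - 1
    return stat, chi2.sf(stat, df)

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    u = rng.uniform(size=500)
    stat, p = chisq_uniformity(u, np.linspace(0.1, 0.9, 9))  # 10 equal cells
    print(f"X2 = {stat:.2f}, p = {p:.3f}")
```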
904.

Ordinal data are often modeled using a continuous latent response distribution, which is partially observed through windows of adjacent intervals defined by cutpoints. In this paper we propose the beta distribution as a model for the latent response. The beta distribution has several advantages over the other commonly used distributions, e.g., the normal and logistic. In particular, it enables separate modeling of location and dispersion effects, which is essential in the Taguchi method of robust design. First, we study the problem of estimating the location and dispersion parameters of a single beta distribution (representing a single treatment) from ordinal data, assuming known equispaced cutpoints. Two methods of estimation are compared: the maximum likelihood method and the method of moments. Two ways of treating the data are considered: in raw discrete form and in smoothed, continuousized form. A large-scale simulation study is carried out to compare the different methods. The mean square errors of the estimates are obtained under a variety of parameter configurations, and comparisons are made based on the ratios of the mean square errors (the relative efficiencies). No method is universally best, but the maximum likelihood method using continuousized data generally performs well, especially for estimating the dispersion parameter. This method is also computationally much faster than the others and does not experience convergence difficulties in the case of sparse or empty cells. Next, the problem of estimating unknown cutpoints is addressed. Here the multiple-treatments setup is considered, since in an actual application the cutpoints are common to all treatments and must be estimated from all the data. A two-step iterative algorithm is proposed for estimating the location and dispersion parameters of the treatments and the cutpoints. The proposed beta model and McCullagh's (1980) proportional odds model are compared by fitting them to two real data sets.
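The sketch below illustrates the basic ingredient of such a model: with known cutpoints on (0, 1), the ordinal cell probabilities are differences of the Beta CDF, the cell counts are multinomial, and the latent Beta parameters can be fitted by maximum likelihood. The (p, q) parameterization, the optimizer, and the synthetic data are assumptions for illustration; the paper's location/dispersion parameterization, method of moments, "continuousized" data treatment, and cutpoint estimation step are not reproduced.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import beta as beta_dist

# Minimal sketch: maximum likelihood for a latent Beta(p, q) response observed
# only through ordinal categories defined by known cutpoints on (0, 1).

def fit_latent_beta(counts, cutpoints):
    """Fit (p, q) of a latent Beta variable from ordinal cell counts.

    Cell probabilities are differences of the Beta CDF at the cutpoints,
    and the counts are treated as multinomial.
    """
    counts = np.asarray(counts, dtype=float)
    edges = np.concatenate(([0.0], np.asarray(cutpoints, dtype=float), [1.0]))

    def neg_loglik(log_params):
        p, q = np.exp(log_params)               # keep both parameters positive
        probs = np.diff(beta_dist.cdf(edges, p, q))
        probs = np.clip(probs, 1e-12, None)     # guard against log(0) in empty cells
        return -np.sum(counts * np.log(probs))

    res = minimize(neg_loglik, x0=np.log([1.0, 1.0]), method="Nelder-Mead")
    return np.exp(res.x)

if __name__ == "__main__":
    rng = np.random.default_rng(2)
    cutpoints = np.array([0.2, 0.4, 0.6, 0.8])   # 5 equispaced categories
    latent = rng.beta(3.0, 2.0, size=1000)
    counts = np.histogram(latent, bins=np.concatenate(([0], cutpoints, [1])))[0]
    print(fit_latent_beta(counts, cutpoints))     # roughly (3, 2)
```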
905.

A sign test using median ranked set samples (MRSS) is introduced and investigated. We show that this test is more powerful than the sign tests based on a simple random sample (SRS) and on a ranked set sample (RSS) for finite sample sizes. When the set size of the MRSS is odd, the null distribution of the MRSS sign test is the same as that of the sign test based on an SRS. The exact null distributions and the power functions of these tests are derived for finite sample sizes, and the asymptotic distribution of the MRSS sign test is also derived. A numerical comparison of the power of the MRSS sign test with those of the SRS and RSS sign tests is given. The procedure is illustrated using a real data set of bilirubin levels in jaundiced babies in neonatal intensive care.
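A minimal sketch of the idea follows: a median ranked set sample with odd set size is formed by keeping the median of each ranked set, and the sign statistic is referred to the Binomial(n, 1/2) null, which, as stated in the abstract, coincides with the SRS sign test null when the set size is odd. The sampling function, set size, and normal data below are assumptions for illustration, not the paper's data or derivations.

```python
import numpy as np
from scipy.stats import binom

# Minimal sketch: median ranked set sampling (odd set size m) and a sign test
# for H0: median = m0, using the Binomial(n, 1/2) null distribution.

def median_rss(population_draw, m, n_cycles, rng):
    """Draw an MRSS: in each of n_cycles * m sets of size m, keep the set median."""
    sample = []
    for _ in range(n_cycles * m):
        s = np.sort(population_draw(m, rng))
        sample.append(s[m // 2])               # median of the set (m odd)
    return np.array(sample)

def sign_test(x, m0):
    """Two-sided sign test p-value for H0: median = m0 (ties at m0 dropped)."""
    x = np.asarray(x, dtype=float)
    x = x[x != m0]
    s = np.sum(x > m0)
    n = x.size
    return min(1.0, 2.0 * min(binom.cdf(s, n, 0.5), binom.sf(s - 1, n, 0.5)))

if __name__ == "__main__":
    rng = np.random.default_rng(3)
    draw = lambda k, rng: rng.normal(loc=0.5, scale=1.0, size=k)
    x = median_rss(draw, m=3, n_cycles=10, rng=rng)   # n = 30 set medians
    print(sign_test(x, m0=0.0))                        # typically a small p-value
```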
906.
Since Robbins (1951) first introduced the compound decision problem, a large literature has evolved on the subject, for the most part dealing with the construction of compound rules whose excess risk over the simple envelope is no greater than zero in the limit as the number N of component problems goes to infinity. Such rules have compound risk that is asymptotically subminimax. Johns (1967) introduced more stringent (extended) envelopes and proposed extended compound rules whose risks achieve these envelopes in the limit. This paper reports some Monte Carlo results on the compound risk behavior of selected unextended and extended rules for moderate values of N and certain parameter sequences in Robbins's original example. The results show that the extended rules compare favorably with the minimax rule and with the unextended rules for moderate N and for parameter sequences exhibiting higher-order empirical dependencies, for example those generated by a Markov process.
907.
908.
In this paper, we develop a likelihood ratio factorization technique for variable selection in the multiple-observations model. The asymptotic distribution of the selection criterion is also given. The technique is illustrated with an application to a marketing data set.
909.
This article shows that the main hypotheses used in the economic literature to explain the existence of low-skill traps are not necessary. In particular, if we relax two strong assumptions, perfect information in the labor market and individual homogeneity, less-developed countries may remain caught in a poverty trap even when there are no intergenerational or intertemporal spillovers, intersectoral complementarities, increasing returns to scale, or credit market imperfections. Because of the lack of coordination among workers, the role played by institutions such as universities or unions in escaping the trap becomes crucial. A numerical calibration of the model supports our conclusions.
910.
Philosophical reflections on the geological disposal of CO2 in deep coal seams in China
Using the epistemology and the theory of contradiction of materialist dialectics, this paper reviews the development of geological disposal of CO2 in deep coal seams in China, and analyzes its application prospects in China from the standpoint of practice as the criterion of truth and of the universal connection and development of things. Geological disposal of CO2 in deep coal seams originated in the practice of oil and gas resource development and has been gradually improved through continued practice in coalbed methane development and scientific research. Its selection and application must be adapted to China's national conditions and be further tested and developed in China's carbon emission reduction practice.