3,724 results found (search time: 296 ms).
81.
The logarithmic general error distribution (log GED), an extension of the log-normal distribution, is proposed. Some interesting properties of the log GED are derived. These properties are applied to establish the asymptotic behavior of the ratio of the probability densities and the ratio of the tails of the log GED and log-normal distributions, and to derive the asymptotic distribution of the partial maximum of an independent and identically distributed sequence following the log GED.
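As a numerical illustration of the construction described above (not code from the paper), the sketch below treats scipy's `gennorm` as the GED and exponentiates it to obtain a log GED; the density ratio computed at the end mirrors the kind of quantity whose asymptotics the paper studies. The shape parameter and evaluation points are illustrative assumptions.

```python
import numpy as np
from scipy.stats import gennorm, norm
from scipy.integrate import quad

# GED shape parameter; beta = 2 recovers the normal, so exp(X) is log-normal.
beta = 1.5

def log_ged_pdf(y, beta):
    """Density of exp(X) for X ~ GED(beta), by change of variables:
    f_Y(y) = f_X(log y) / y."""
    return gennorm.pdf(np.log(y), beta) / y

# Sanity check: the log-GED density integrates to one over (0, inf).
total, _ = quad(lambda y: log_ged_pdf(y, beta), 0, np.inf)

# Ratio of the log-GED density to the log-normal density at a few points.
y = np.array([2.0, 5.0, 10.0])
ratio = log_ged_pdf(y, beta) / (norm.pdf(np.log(y)) / y)
```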
82.
In this paper, we propose a discrete-time risk model in which the claim number follows an integer-valued autoregressive conditional heteroscedasticity (ARCH) process with Poisson deviates, so that the current claim number depends on previous observations. Within this framework, the equation for finding the adjustment coefficient is derived. Numerical studies examine the impact of the Poisson ARCH dependence structure on the ruin probability.
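The kind of model described in this abstract can be sketched as a Monte Carlo simulation: the claim intensity follows the Poisson ARCH recursion lam_t = a0 + a1 * N_{t-1}. The premium, parameter values, and unit claim sizes below are illustrative assumptions, not the paper's specification.

```python
import numpy as np

def ruin_probability(u0, premium, a0, a1, horizon=100, n_paths=5_000, seed=1):
    """Monte Carlo ruin probability for a discrete-time risk model whose
    claim count follows a Poisson ARCH recursion lam_t = a0 + a1 * N_{t-1}.
    Claim sizes are taken as unit (an illustrative assumption)."""
    rng = np.random.default_rng(seed)
    ruined = 0
    for _ in range(n_paths):
        surplus, n_prev = u0, 0
        for _ in range(horizon):
            lam = a0 + a1 * n_prev          # intensity driven by last count
            n = rng.poisson(lam)
            surplus += premium - n          # premium income minus unit claims
            n_prev = n
            if surplus < 0:                 # ruin: surplus drops below zero
                ruined += 1
                break
    return ruined / n_paths

# Stronger dependence (larger a1) raises the stationary mean claim rate
# a0 / (1 - a1) and hence the ruin probability at a fixed premium.
p_low = ruin_probability(10, 2.0, 1.0, 0.2, n_paths=4_000)
p_high = ruin_probability(10, 2.0, 1.0, 0.45, n_paths=4_000)
```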
83.
In this article, we propose a new criterion for evaluating the similarity of probability density functions (pdfs). We call it the criterion on the similar coefficient of cluster (SCC) and use it as a tool for handling overlap coefficients of pdfs, normalized to [0, 1]. Supported by a self-updating algorithm for determining a suitable number of clusters, the SCC then becomes a criterion for assigning pdfs to clusters. Moreover, results on the determination of the SCC for two and for more than two pdfs, as well as relations between different SCCs and other measures, are presented. Numerical examples on both synthetic and real data illustrate the suitability of the proposed theory and algorithms and demonstrate the applicability and novelty of the proposed algorithm.
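The overlap coefficient that underlies the SCC can be illustrated with a small numerical sketch; the grid limits and the choice of two normal densities are assumptions made for the example, not part of the paper's method.

```python
import numpy as np
from scipy.stats import norm

def overlap_coefficient(f, g, lo, hi, n=40_001):
    """Overlap coefficient of two pdfs on a grid: the integral of min(f, g).
    It lies in [0, 1] and equals 1 exactly when the densities coincide."""
    xs = np.linspace(lo, hi, n)
    return float(np.sum(np.minimum(f(xs), g(xs))) * (xs[1] - xs[0]))

# Two unit-variance normals one mean apart; the exact overlap is 2*Phi(-1/2).
ovl = overlap_coefficient(norm(0, 1).pdf, norm(1, 1).pdf, -8.0, 9.0)
```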
84.
The access divide was once the basic form of the digital divide. The development of Internet infrastructure has narrowed the access divide and broadened application coverage, but it has also given rise to a connectivity dividend difference. Taking the online market as an example, we examine the sources of this dividend difference and the factors influencing it within a connectivity framework. We find that the narrowing of the access divide has enhanced connectivity and platform development, giving people the chance to benefit from converting the various assets in which they previously invested into differentiated compound connectivity capital. In this process, the scale and rate of the conversion are shaped by two multiplier effects, especially by online platforms, and the process is ultimately expressed in the dividend difference.
85.
This article discusses the minimax estimator in the partial linear model y = Zβ + f + ε under ellipsoidal restrictions on the parameter space and a quadratic loss function. The superiority of the minimax estimator over the two-step estimator is studied under the mean squared error matrix criterion.
86.
In the present article we propose the modified lambda family (MLF), the Freimer, Mudholkar, Kollia, and Lin (FMKL) parametrization of the generalized lambda distribution (GLD), as a model for censored data. Expressions for the probability weighted moments of the MLF are derived and used to estimate the parameters of the distribution, with the probability-weighted-moment estimation technique modified accordingly. It is shown that the distribution provides a reasonable fit to a real censored data set.
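A hedged sketch of the parametrization named above: the quantile function below is the standard FMKL form, while the parameter values and the simple plotting-position estimate of the probability weighted moments are illustrative assumptions, not the paper's censored-data estimator.

```python
import numpy as np

def fmkl_quantile(u, lam1, lam2, lam3, lam4):
    """FMKL quantile function of the generalized lambda distribution:
    Q(u) = lam1 + [ (u**lam3 - 1)/lam3 - ((1-u)**lam4 - 1)/lam4 ] / lam2."""
    u = np.asarray(u, dtype=float)
    return lam1 + ((u**lam3 - 1.0) / lam3
                   - ((1.0 - u)**lam4 - 1.0) / lam4) / lam2

# Inverse-transform sampling: push uniforms through the quantile function.
rng = np.random.default_rng(0)
sample = fmkl_quantile(rng.uniform(size=100_000), 0.0, 1.0, 0.1, 0.1)

# Sample probability weighted moments beta_r = E[X * F(X)**r], estimated
# from order statistics with plotting positions as F(x) estimates.
x = np.sort(sample)
p = (np.arange(1, x.size + 1) - 0.5) / x.size
beta = [float(np.mean(x * p**r)) for r in range(4)]
```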
87.
Considering time-varying and case-specific factors in forecasting China's future origin-destination (OD) distributions, such as changes in the composition of the rail network, shifts in the regional economic layout, and differences in regional economic growth rates, a "three-stage method", adapted from the "four-stage" traffic forecasting method, is used for short-term high-speed rail passenger volume forecasting. First, a combination forecasting model predicts trend traffic; next, a multinomial logit model performs the mode split; finally, an elasticity-coefficient model estimates induced traffic. Taking the Beijing-Shanghai high-speed railway as an example, the three-stage method is used to forecast one-way passenger flows on the line itself and across lines for 2014 and 2015.
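The mode-split and induced-traffic steps of such a forecast can be sketched as follows; the utilities, elasticity value, and sign convention are illustrative assumptions, not values from the study.

```python
import numpy as np

def logit_shares(utilities):
    """Multinomial logit mode shares P_k = exp(V_k) / sum_j exp(V_j),
    computed stably by subtracting the maximum utility first."""
    v = np.asarray(utilities, dtype=float)
    e = np.exp(v - v.max())
    return e / e.sum()

# Hypothetical systematic utilities for high-speed rail, air, and bus.
shares = logit_shares([1.2, 0.8, -0.5])

def induced_volume(base, elasticity, cost_change_pct):
    """Elasticity-based induced traffic: base volume times the relative
    generalized-cost reduction, scaled by an (assumed) positive elasticity.
    A negative cost_change_pct (a cost cut) induces extra traffic."""
    return base * elasticity * (-cost_change_pct)
```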
88.
Eunju Hwang, Statistics, 2017, 51(4): 844-861
This paper studies the applicability of the stationary bootstrap to realized covariations of high-frequency asynchronous financial data. The stationary bootstrap, a block bootstrap with random block lengths, is applied to estimate integrated covariations. The bootstrap realized covariance, bootstrap realized regression coefficient, and bootstrap realized correlation coefficient are proposed, and the validity of stationary bootstrapping for them is established for both large and finite samples. Consistency of the bootstrap distributions is established, yielding valid stationary bootstrap confidence intervals. Unlike intervals based on normal asymptotic theory, these bootstrap confidence intervals do not require a consistent estimator of the nuisance parameter arising from nonsynchronous, unequally spaced sampling. A Monte Carlo comparison reveals that the proposed stationary bootstrap confidence intervals have better coverage probabilities than those based on the normal approximation.
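The stationary bootstrap named above is the Politis-Romano scheme with geometric block lengths; a minimal sketch follows. The simulated series, block parameter `p`, and the choice of the mean as the target statistic are illustrative assumptions, not the paper's realized-covariation setting.

```python
import numpy as np

def stationary_bootstrap(x, p, n_boot, rng):
    """Politis-Romano stationary bootstrap: resample a series in blocks whose
    lengths are geometric with mean 1/p, wrapping circularly past the end.
    Returns an (n_boot, n) array of bootstrap series."""
    x = np.asarray(x)
    n = x.size
    out = np.empty((n_boot, n))
    for b in range(n_boot):
        idx = np.empty(n, dtype=int)
        t = 0
        while t < n:
            start = rng.integers(n)              # uniform new block start
            length = rng.geometric(p)            # geometric block length
            for j in range(min(length, n - t)):
                idx[t + j] = (start + j) % n     # circular wrap-around
            t += length
        out[b] = x[idx]
    return out

rng = np.random.default_rng(0)
series = rng.standard_normal(500)
boot = stationary_bootstrap(series, p=0.1, n_boot=200, rng=rng)

# Percentile bootstrap confidence interval for the mean of the series.
ci = np.percentile(boot.mean(axis=1), [2.5, 97.5])
```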
89.
Previous literature has shown that adding an untested surplus lag to a Granger causality test can make it highly robust to stationary, nonstationary, long-memory, and structural-break processes in the forcing variables. This study extends the approach to the partial unit root framework by simulation. Results show good size and power; the surplus-lag approach is therefore also robust to partial unit root processes.
90.
The weighted kappa coefficient of a binary diagnostic test measures the beyond-chance agreement between the diagnostic test and the gold standard, and allows us to assess and compare the performance of binary diagnostic tests. In the presence of partial disease verification, the weighted kappa coefficients of two or more binary diagnostic tests cannot be compared while ignoring individuals with unknown disease status, since the resulting estimators would be affected by verification bias. In this article, we propose a global hypothesis test based on the chi-square distribution to simultaneously compare the weighted kappa coefficients when, in the presence of partial disease verification, the missing data mechanism is ignorable. Simulation experiments study the type I error and power of the global hypothesis test. The results are applied to the diagnosis of coronary disease.
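As an illustration of the agreement measure involved, the sketch below computes Cohen's weighted kappa from a hypothetical, fully verified 2x2 table (rows: test result; columns: gold standard). The table and the disagreement weights are invented for the example, and the code does not implement the paper's verification-bias correction or global chi-square test.

```python
import numpy as np

def weighted_kappa(table, weights):
    """Cohen's weighted kappa from a contingency table:
    kappa_w = 1 - sum(w * p_obs) / sum(w * p_exp), where w is a
    disagreement-weight matrix (0 on the diagonal) and p_exp is the
    outer product of the marginals (chance agreement)."""
    p = np.asarray(table, dtype=float)
    p = p / p.sum()
    row, col = p.sum(axis=1), p.sum(axis=0)
    expected = np.outer(row, col)
    w = np.asarray(weights, dtype=float)
    return 1.0 - np.sum(w * p) / np.sum(w * expected)

# Hypothetical verified counts: rows = test result, cols = disease status.
table = [[80, 10],
         [15, 95]]
w = [[0.0, 1.0],   # equal weights on false positives and false negatives;
     [1.0, 0.0]]   # unequal weights would encode different losses
kappa = weighted_kappa(table, w)
```

With equal off-diagonal weights this reduces to the ordinary Cohen's kappa.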
Copyright©北京勤云科技发展有限公司  京ICP备09084417号