A total of 1,724 query results were returned; items 31–40 are shown below.
31.
This paper deals with the problem of estimating all the unknown parameters of geometric fractional Brownian processes from discrete observations. The estimation procedure is built upon the marriage of the quadratic variation and the maximum likelihood approach. The asymptotic properties of the estimators are provided. Moreover, we compare our derived method with the approach proposed by Misiran et al. [Fractional Black-Scholes models: complete MLE with application to fractional option pricing. In International conference on optimization and control; Guiyang, China; 2010. p. 573–586.], namely the complete maximum likelihood estimation. Simulation studies confirm the theoretical findings and illustrate that our methodology is efficient and reliable. To show how to apply our approach in realistic contexts, an empirical study of the Chinese financial market is also presented.
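As a rough illustration of the quadratic-variation idea mentioned in this abstract (not the authors' full estimator, which also involves maximum likelihood), the sketch below recovers the Hurst exponent and volatility of a discretely observed log-price from squared increments at two sampling spacings. The function name, parameter values, and the Brownian-motion sanity check are placeholders of my own.

```python
import numpy as np

def estimate_h_sigma(log_prices, dt):
    """Estimate (H, sigma) of a geometric fBm from discretely observed log-prices.

    Uses the scaling E[(X_{t+d} - X_t)^2] ~ sigma**2 * d**(2*H), evaluated at
    spacings dt and 2*dt (a standard quadratic-variation argument). The drift
    could additionally be estimated from the mean increment; it is omitted here.
    """
    x = np.asarray(log_prices, dtype=float)
    q1 = np.mean(np.diff(x) ** 2)        # mean squared increment at spacing dt
    q2 = np.mean(np.diff(x[::2]) ** 2)   # mean squared increment at spacing 2*dt
    h = 0.5 * np.log2(q2 / q1)
    sigma = np.sqrt(q1 / dt ** (2 * h))
    return h, sigma

# Sanity check on ordinary Brownian motion (H = 0.5), a special case of fBm.
rng = np.random.default_rng(0)
dt, n, sigma_true = 1 / 252, 10_000, 0.2
log_p = np.cumsum(sigma_true * np.sqrt(dt) * rng.standard_normal(n))
print(estimate_h_sigma(log_p, dt))  # roughly (0.5, 0.2)
```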
32.
The small-sample bias and root mean squared error of several distribution-free estimators of the variance of the sample median are examined. A new estimator is proposed that is easy to compute and tends to have the smallest bias and root mean squared error.
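For context only, a standard distribution-free baseline for the quantity studied here is the bootstrap estimator of the variance of the sample median sketched below; it is not the new estimator proposed in the abstract, and the sample used is simulated.

```python
import numpy as np

def bootstrap_median_variance(sample, n_boot=2000, seed=0):
    """Distribution-free bootstrap estimate of Var(sample median)."""
    rng = np.random.default_rng(seed)
    x = np.asarray(sample, dtype=float)
    n = x.size
    meds = np.array([np.median(rng.choice(x, size=n, replace=True))
                     for _ in range(n_boot)])
    return meds.var(ddof=1)

# Small-sample illustration with simulated data.
rng = np.random.default_rng(1)
x = rng.standard_normal(25)
print(bootstrap_median_variance(x))
```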
33.

This paper arises out of psychoanalytically oriented consultancy to teams of staff in the helping professions where there is a statutory 'duty to care'. It takes as its premise the seemingly paradoxical hypothesis that workers may need to split off part of their emotional experience in order to preserve their own mental health and provide reliable services to their clients. I argue that while a professional 'duty to care' requires us to be emotionally 'in touch', the demands of our clients, together with the demands of the institutional response to the 'duty to care', cause us to split off parts of our awareness. I also argue that, provided the splitting does not become extreme, we are doing no more or less than the rest of society. In other words, there is a degree of 'normal splitting' which numbs our awareness of danger and destructiveness and seeks to protect us from too much anxiety and pain. Yet if professional workers are charged with the responsibility of assessing risk and acting accordingly for the protection of all concerned, they need ways of being 'in touch' (re-integrating the splits) for some or enough of the time. Finally, I describe ways of being 'in touch', illustrating the difficulty and the pain of re-integrating the splits, and some of the insights that can arise out of this work, with examples from my consultancy work.
34.
In this paper, we propose a new three-parameter model called the exponential–Weibull distribution, which includes as special models some widely known lifetime distributions. Some mathematical properties of the proposed distribution are investigated. We derive four explicit expressions for the generalized ordinary moments and a general formula for the incomplete moments based on infinite sums of Meijer's G functions. We also obtain explicit expressions for the generating function and mean deviations. We estimate the model parameters by maximum likelihood and determine the observed information matrix. Some simulations are run to assess the performance of the maximum likelihood estimators. The flexibility of the new distribution is illustrated by means of an application to real data.
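The estimation workflow described above (maximum likelihood plus the observed information matrix) can be illustrated on the Weibull distribution, which the abstract lists among the special models; the exponential–Weibull density itself is not reproduced here. The helper names, starting values, and step size below are my own choices.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import weibull_min

def neg_loglik(theta, x):
    """Negative log-likelihood of a Weibull(shape=c, scale=s) sample."""
    c, s = np.exp(theta)                     # optimise on the log scale for positivity
    return -np.sum(weibull_min.logpdf(x, c, scale=s))

def observed_information(theta, x, eps=1e-4):
    """Numerical Hessian of the negative log-likelihood at theta (log-parameter scale)."""
    k = len(theta)
    h = np.zeros((k, k))
    for i in range(k):
        for j in range(k):
            e_i, e_j = np.eye(k)[i] * eps, np.eye(k)[j] * eps
            h[i, j] = (neg_loglik(theta + e_i + e_j, x)
                       - neg_loglik(theta + e_i - e_j, x)
                       - neg_loglik(theta - e_i + e_j, x)
                       + neg_loglik(theta - e_i - e_j, x)) / (4 * eps ** 2)
    return h

rng = np.random.default_rng(2)
x = 2.0 * rng.weibull(1.7, size=500)           # simulated data: shape 1.7, scale 2.0
fit = minimize(neg_loglik, x0=np.zeros(2), args=(x,), method="Nelder-Mead")
print(np.exp(fit.x))                           # estimated (shape, scale)
print(observed_information(fit.x, x))          # observed information at the MLE
```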
35.
We propose autoregressive moving average (ARMA) and generalized autoregressive conditional heteroscedastic (GARCH) models driven by asymmetric Laplace (AL) noise. The AL distribution plays, in the geometric-stable class, the analogous role played by the normal in the alpha-stable class, and has shown promise in the modelling of certain types of financial and engineering data. In the case of an ARMA model we derive the marginal distribution of the process, as well as its bivariate distribution when separated by a finite number of lags. The calculation of exact confidence bands for minimum mean-squared error linear predictors is shown to be straightforward. Conditional maximum likelihood-based inference is advocated, and corresponding asymptotic results are discussed. The models are particularly suited for processes that are skewed, peaked, and leptokurtic, but which appear to have some higher-order moments. A case study of a fund of real estate returns reveals that AL noise models tend to deliver a superior fit with substantially fewer parameters than their normal noise counterparts, and provide both a competitive fit and a greater degree of numerical stability with respect to other skewed distributions.
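To make the noise specification concrete, the sketch below simulates asymmetric Laplace innovations via a difference-of-exponentials construction and feeds them into an AR(1) recursion (a simple ARMA special case). The rates, the AR coefficient, and the function names are arbitrary placeholders, and this is not the authors' estimation code.

```python
import numpy as np

def al_noise(n, lam_pos=1.5, lam_neg=1.0, rng=None):
    """Asymmetric Laplace noise built as the difference of two independent
    exponentials with rates lam_pos and lam_neg, centred to have mean zero."""
    rng = rng or np.random.default_rng()
    e = rng.exponential(1 / lam_pos, n) - rng.exponential(1 / lam_neg, n)
    return e - (1 / lam_pos - 1 / lam_neg)     # subtract the mean

def ar1_with_al_noise(n, phi=0.6, **kwargs):
    """Simulate y_t = phi * y_{t-1} + eps_t with asymmetric Laplace innovations."""
    eps = al_noise(n, **kwargs)
    y = np.zeros(n)
    for t in range(1, n):
        y[t] = phi * y[t - 1] + eps[t]
    return y

y = ar1_with_al_noise(1000, rng=np.random.default_rng(3))
print(y.mean(), y.std())
```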
36.
A statistical test can be seen as a procedure to produce a decision based on observed data, where some decisions consist of rejecting a hypothesis (yielding a significant result) and some do not, and where one controls the probability of making a wrong rejection at some prespecified significance level. Whereas traditional hypothesis testing involves only two possible decisions (to reject or not reject a null hypothesis), Kaiser’s directional two-sided test as well as the more recently introduced testing procedure of Jones and Tukey, each equivalent to running two one-sided tests, involve three possible decisions to infer the value of a unidimensional parameter. The latter procedure assumes that a point null hypothesis is impossible (e.g., that two treatments cannot have exactly the same effect), allowing a gain in statistical power. There are, however, situations where a point hypothesis is indeed plausible, for example, when considering hypotheses derived from Einstein’s theories. In this article, we introduce a five-decision rule testing procedure, equivalent to running a traditional two-sided test in addition to two one-sided tests, which combines the advantages of the testing procedures of Kaiser (no assumption on a point hypothesis being impossible) and Jones and Tukey (higher power), allowing for a non-negligible (typically 20%) reduction in the sample size needed to reach a given statistical power of obtaining a significant result, compared to the traditional approach.
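A minimal sketch of the decision logic, assuming a standard normal test statistic and one plausible choice of critical values (a two-sided test at level alpha combined with two one-sided tests); the exact levels and decision labels in the article may differ.

```python
from scipy.stats import norm

def five_decision(z, alpha=0.05):
    """One plausible formalisation of a five-decision rule that combines a
    two-sided test (level alpha) with two one-sided tests; illustrative only."""
    z_two = norm.ppf(1 - alpha / 2)   # two-sided critical value
    z_one = norm.ppf(1 - alpha)       # one-sided critical value
    if z > z_two:
        return "theta > theta0"
    if z > z_one:
        return "theta >= theta0"
    if z < -z_two:
        return "theta < theta0"
    if z < -z_one:
        return "theta <= theta0"
    return "no decision"

for z in (-2.5, -1.8, 0.3, 1.8, 2.5):
    print(z, "->", five_decision(z))
```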
37.
In this work, we discuss the class of bilinear GARCH (BL-GARCH) models, which are capable of capturing simultaneously two key properties of non-linear time series: volatility clustering and leverage effects. It has often been observed that the marginal distributions of such time series have heavy tails; we therefore examine the BL-GARCH model in a general setting under some non-normal distributions. We investigate some probabilistic properties of this model and conduct a Monte Carlo experiment to evaluate the small-sample performance of the maximum likelihood estimation (MLE) methodology for various models. Finally, within-sample estimation properties are studied using S&P 500 daily returns, which exhibit the features of interest: volatility clustering and leverage effects. The main results suggest that the Student-t BL-GARCH seems highly appropriate for describing the S&P 500 daily returns.
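For readers unfamiliar with the model class, the sketch below simulates one commonly cited BL-GARCH(1,1)-type recursion with standardized Student-t innovations; the parameter values are arbitrary and the specification estimated in the article may differ in detail.

```python
import numpy as np

def simulate_bl_garch(n, omega=0.05, alpha=0.08, beta=0.85, gamma=-0.06,
                      df=6, seed=0):
    """Simulate a BL-GARCH(1,1)-type process with Student-t innovations.

    h_t = omega + alpha*eps_{t-1}**2 + beta*h_{t-1} + gamma*eps_{t-1}*sqrt(h_{t-1})
    A negative gamma produces a leverage effect (negative shocks raise volatility more).
    """
    rng = np.random.default_rng(seed)
    z = rng.standard_t(df, n) / np.sqrt(df / (df - 2))   # unit-variance t innovations
    eps, h = np.zeros(n), np.zeros(n)
    h[0] = omega / (1 - alpha - beta)                    # rough unconditional start value
    eps[0] = np.sqrt(h[0]) * z[0]
    for t in range(1, n):
        ht = (omega + alpha * eps[t - 1] ** 2 + beta * h[t - 1]
              + gamma * eps[t - 1] * np.sqrt(h[t - 1]))
        h[t] = max(ht, 1e-8)          # guard against a negative variance in this sketch
        eps[t] = np.sqrt(h[t]) * z[t]
    return eps, h

returns, cond_var = simulate_bl_garch(2000)
print(returns.std(), cond_var.mean())
```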
38.
We study a new family of continuous distributions with two extra shape parameters, called the Burr generalized family of distributions. We investigate the shapes of the density and hazard rate functions. We derive explicit expressions for some of its mathematical quantities. The estimation of the model parameters is performed by maximum likelihood. We demonstrate the flexibility of the new family by means of applications to two real data sets. Furthermore, we propose a new extended regression model based on the logarithm of the Burr generalized distribution. This model can be very useful for the analysis of real data and provides more realistic fits than other special regression models.
39.
In this article, the Ridge–GME parameter estimator, which combines Ridge Regression and Generalized Maximum Entropy, is improved in order to eliminate the subjectivity in the analysis of the ridge trace. A serious concern with the visual inspection of the ridge trace to define the supports for the parameters in the Ridge–GME parameter estimator is the misinterpretation of some ridge traces, in particular where some of them are very close to the axes. A simulation study and two empirical applications are used to illustrate the performance of the improved estimator. A MATLAB code is provided as supplementary material.
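To illustrate what "visual inspection of the ridge trace" refers to, the sketch below computes ridge-regression coefficient paths over a grid of penalties on simulated collinear data; the GME step and the improved support-selection rule from the article are not reproduced, and the data are placeholders.

```python
import numpy as np

def ridge_trace(X, y, penalties):
    """Ridge coefficient paths: beta(k) = (X'X + k*I)^{-1} X'y for each penalty k."""
    XtX, Xty = X.T @ X, X.T @ y
    p = X.shape[1]
    return np.array([np.linalg.solve(XtX + k * np.eye(p), Xty) for k in penalties])

rng = np.random.default_rng(4)
n, p = 100, 4
X = rng.standard_normal((n, p))
X[:, 3] = X[:, 2] + 0.05 * rng.standard_normal(n)      # induce collinearity
y = X @ np.array([1.0, -0.5, 2.0, 0.0]) + rng.standard_normal(n)
ks = np.logspace(-3, 2, 30)
trace = ridge_trace(X, y, ks)   # rows: penalties, columns: coefficients
# Inspecting where these paths stabilise is the subjective step that the
# Ridge-GME refinement described above seeks to replace with an automatic rule.
print(trace[0], trace[-1])
```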
40.
Sample selection and attrition are inherent in a range of treatment evaluation problems, such as the estimation of the returns to schooling or training. Conventional estimators tackling selection bias typically rely on restrictive functional form assumptions that are unlikely to hold in reality. This paper shows identification of average and quantile treatment effects in the presence of a double selection problem into (i) a selective subpopulation (e.g., working: selection on unobservables) and (ii) a binary treatment (e.g., training: selection on observables), based on weighting observations by the inverse of a nested propensity score that characterizes each selection probability. Weighting estimators based on parametric propensity score models are applied to female labor market data to estimate the returns to education.
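A stylized sketch of the nested weighting idea under a deliberately simplified data-generating process (selection on observables only, logistic propensity models); the variable names and simulated data are placeholders, and the estimator in the paper includes features not shown here.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(5)
n = 5000
x = rng.standard_normal((n, 2))                                     # observed covariates
d = rng.binomial(1, 1 / (1 + np.exp(-x[:, 0])))                     # treatment (e.g., training)
s = rng.binomial(1, 1 / (1 + np.exp(-(0.5 + x[:, 1] + 0.5 * d))))   # selection (e.g., working)
y = np.where(s == 1, 1.0 * d + x.sum(axis=1) + rng.standard_normal(n), np.nan)

# Nested propensity scores: P(D=1 | X) and P(S=1 | D, X).
p_d = LogisticRegression().fit(x, d).predict_proba(x)[:, 1]
xd = np.column_stack([x, d])
p_s = LogisticRegression().fit(xd, s).predict_proba(xd)[:, 1]

obs = s == 1
w1 = (d / (p_d * p_s))[obs]              # inverse-probability weights, treated
w0 = ((1 - d) / ((1 - p_d) * p_s))[obs]  # inverse-probability weights, controls
effect = np.sum(w1 * y[obs]) / np.sum(w1) - np.sum(w0 * y[obs]) / np.sum(w0)
print("weighted treatment effect estimate:", round(effect, 3))      # true effect is 1.0
```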