Full-text access: fee-based 1,520 articles; free 33; domestic free 15.
By subject: Management 221; Ethnology 3; Talent Studies 1; Demography 32; Collected Works 59; Theory and Methodology 16; General 796; Sociology 15; Statistics 425.
By year: 2024: 1; 2023: 7; 2022: 20; 2021: 14; 2020: 31; 2019: 27; 2018: 31; 2017: 51; 2016: 38; 2015: 51; 2014: 73; 2013: 184; 2012: 89; 2011: 84; 2010: 91; 2009: 69; 2008: 75; 2007: 85; 2006: 78; 2005: 54; 2004: 51; 2003: 51; 2002: 40; 2001: 39; 2000: 34; 1999: 28; 1998: 17; 1997: 28; 1996: 22; 1995: 36; 1994: 11; 1993: 16; 1992: 10; 1991: 5; 1990: 8; 1989: 6; 1988: 4; 1987: 3; 1985: 2; 1984: 2; 1981: 1; 1976: 1.
A total of 1,568 results were found (search time: 15 ms).
21.
This paper studies the effects of non-normality and autocorrelation on the performance of various individuals control charts for monitoring the process mean and/or variance. The traditional Shewhart X chart and moving range (MR) chart are investigated, as well as several types of exponentially weighted moving average (EWMA) charts and combinations of control charts involving these EWMA charts. It is shown that the combination of the X and MR charts will not detect small and moderate parameter shifts as fast as combinations involving the EWMA charts, and that the performance of the X and MR charts is very sensitive to the normality assumption. It is also shown that certain combinations of EWMA charts can be designed to be robust to non-normality and very effective at detecting small and moderate shifts in the process mean and/or variance. Although autocorrelation can have a significant effect on the in-control performance of these combinations of EWMA charts, their relative out-of-control performance under independence is generally maintained for low to moderate levels of autocorrelation.
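For intuition, the EWMA statistic behind these charts is Z_t = λX_t + (1 − λ)Z_{t−1}, plotted against time-varying control limits. A minimal Python sketch with illustrative parameters (λ = 0.1, L = 2.7), not the specific chart combinations evaluated in the paper, might look like this:

```python
import numpy as np

def ewma_chart(x, lam=0.1, L=2.7, mu0=0.0, sigma0=1.0):
    """EWMA statistic and exact (time-varying) control limits."""
    z = np.empty(len(x))
    prev = mu0
    for t, xt in enumerate(x):
        prev = lam * xt + (1 - lam) * prev   # Z_t = lam*X_t + (1 - lam)*Z_{t-1}
        z[t] = prev
    t = np.arange(1, len(x) + 1)
    # Var(Z_t) = sigma0^2 * lam/(2 - lam) * (1 - (1 - lam)^(2t)) under independence
    sd = sigma0 * np.sqrt(lam / (2 - lam) * (1 - (1 - lam) ** (2 * t)))
    return z, mu0 - L * sd, mu0 + L * sd

rng = np.random.default_rng(0)
x = np.concatenate([rng.normal(0, 1, 50), rng.normal(0.75, 1, 50)])  # mean shift at t = 50
z, lcl, ucl = ewma_chart(x)
ooc = np.flatnonzero((z < lcl) | (z > ucl))
print("first out-of-control index:", ooc[0] if ooc.size else None)
```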
22.
This paper deals with the problem of estimating all the unknown parameters of geometric fractional Brownian processes from discrete observations. The estimation procedure combines the quadratic variation method with the maximum likelihood approach. The asymptotic properties of the estimators are provided. Moreover, we compare our method with the approach proposed by Misiran et al. [Fractional Black-Scholes models: complete MLE with application to fractional option pricing. In International conference on optimization and control; Guiyang, China; 2010. p. 573–586.], namely complete maximum likelihood estimation. Simulation studies confirm the theoretical findings and illustrate that our methodology is efficient and reliable. To show how to apply our approach in realistic contexts, an empirical study of the Chinese financial market is also presented.
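As a rough illustration of the quadratic variation idea, the generic change-of-frequency estimator below recovers the Hurst exponent H from equally spaced log-prices by comparing quadratic variations at two sampling scales. This is a sketch of the standard device, not the paper's exact procedure, which also estimates the drift and volatility parameters:

```python
import numpy as np

def hurst_qv(logprice):
    d1 = np.diff(logprice)         # increments at the base scale delta
    d2 = np.diff(logprice[::2])    # increments at scale 2*delta
    v1 = np.sum(d1 ** 2)           # ~ N * sigma^2 * delta^(2H)
    v2 = np.sum(d2 ** 2)           # ~ (N/2) * sigma^2 * (2*delta)^(2H)
    return 0.5 * (np.log2(v2 / v1) + 1)   # since v2/v1 ~ 2^(2H - 1)

rng = np.random.default_rng(1)
bm = np.cumsum(rng.normal(0.0, 0.01, 10_000))  # ordinary BM, so H should be near 0.5
print(round(hurst_qv(bm), 2))
```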
23.
The conditional value-at-risk (CVaR) model is a financial risk measure that is widely supported and accepted by the international financial community. Its optimized form can be regarded as an optimized certainty equivalent (OCE) risk measure. In this paper, we discuss and analyze the strong laws of large numbers and the convergence rate of the OCE estimator under α-mixing sequences. The results show that the almost sure convergence rate of the CVaR estimator follows from that of the OCE estimator, and that under certain conditions the rate is inversely proportional to the square root of the sample size. The effectiveness of the estimator is verified by simulation experiments for two classical α-mixing sequences.
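The OCE form of CVaR, CVaR_α(X) = min_c { c + E[(X − c)+] / (1 − α) }, suggests the plug-in estimator sketched below. This is a generic illustration evaluated on iid data; the paper's contribution concerns the estimator's behavior under α-mixing dependence:

```python
import numpy as np

def cvar(losses, alpha=0.95):
    # CVaR_a(X) = min_c { c + E[(X - c)+] / (1 - a) }; any a-quantile minimizes
    c = np.quantile(losses, alpha)
    return c + np.mean(np.maximum(losses - c, 0.0)) / (1 - alpha)

rng = np.random.default_rng(2)
sample = rng.normal(0, 1, 100_000)
print(round(cvar(sample), 3))   # for N(0,1) losses, CVaR_0.95 is about 2.063
```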
24.
The recently developed rolling year GEKS procedure makes maximum use of all matches in the data to construct nonrevisable price indexes that are approximately free from chain drift. A potential weakness is that unmatched items are ignored. In this article we use imputation Törnqvist price indexes as inputs into the rolling year GEKS procedure. These indexes account for quality changes by imputing the "missing prices" associated with new and disappearing items. Three imputation methods are discussed. The first makes explicit imputations using a hedonic regression model estimated for each time period. The other two make implicit imputations; they are based on time dummy hedonic and time-product dummy regression models and are estimated on pooled bilateral data. We present empirical evidence for New Zealand from scanner data on eight consumer electronics products and find that accounting for quality change can make a substantial difference.
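For reference, a minimal sketch of the bilateral Törnqvist index on matched items is given below; the GEKS step then takes geometric means of such bilateral comparisons across all link periods. The prices and quantities are made up, and the imputation of missing prices, the paper's actual contribution, is not shown:

```python
import numpy as np

def tornqvist(p0, q0, p1, q1):
    p0, q0, p1, q1 = map(np.asarray, (p0, q0, p1, q1))
    s0 = p0 * q0 / np.sum(p0 * q0)   # expenditure shares in period 0
    s1 = p1 * q1 / np.sum(p1 * q1)   # expenditure shares in period 1
    # ln P = sum_i 0.5 * (s0_i + s1_i) * ln(p1_i / p0_i)
    return np.exp(np.sum(0.5 * (s0 + s1) * np.log(p1 / p0)))

# two matched items: prices and quantities in periods 0 and 1
print(round(tornqvist([10, 5], [3, 8], [11, 4], [3, 9]), 4))
```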
25.
In this paper, we propose a multiple deferred state repetitive group sampling plan, a new plan developed by incorporating the features of both the multiple deferred state sampling plan and the repetitive group sampling plan, for assuring the mean life of products whose lifetimes follow a Weibull or gamma distribution. The quality of the product is represented by the ratio of the true mean life to the specified mean life. The two-points-on-the-operating-characteristic-curve approach is used to determine the optimal parameters of the proposed plan. The plan parameters are determined by formulating an optimization problem for various combinations of the producer's risk and the consumer's risk under both distributions. A sensitivity analysis of the proposed plan is discussed, and its implementation is explained using real-life and simulated data. The proposed plan under the Weibull distribution is compared with existing sampling plans. The average sample number (ASN) of the proposed plan and the failure probability of the product are obtained under the Weibull, gamma, and Birnbaum–Saunders distributions for a specified value of the shape parameter and compared with each other. In addition, a comparative study is made between the ASN of the proposed plan under the Weibull and gamma distributions.
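To make the two-point design idea concrete, the following simplified sketch searches for an ordinary binomial single-sampling plan meeting both risk points. This is a stand-in illustration only: the paper's multiple deferred state repetitive plan has a different acceptance rule and more parameters, and the values of p1, p2, alpha, and beta below are illustrative:

```python
from scipy.stats import binom

def design_plan(p1=0.01, p2=0.05, alpha=0.05, beta=0.10, n_max=1000):
    # find the smallest (n, c) with OC(p1) >= 1 - alpha and OC(p2) <= beta,
    # where OC(p) = P(accept) = P(Binomial(n, p) <= c)
    for n in range(1, n_max + 1):
        for c in range(n + 1):
            if binom.cdf(c, n, p1) >= 1 - alpha and binom.cdf(c, n, p2) <= beta:
                return n, c
    return None

print(design_plan())
```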
26.
Clinical phase II trials in oncology are conducted to determine whether the activity of a new anticancer treatment is promising enough to merit further investigation. Two-stage designs are commonly used in this situation to allow for early termination. Designs proposed in the literature so far have the common drawback that the sample sizes for the two stages must be specified in the protocol and adhered to strictly during the course of the trial. Designs that allow a greater degree of flexibility are therefore desirable. In this article, we propose a new adaptive method that allows an arbitrary modification of the sample size of the second stage using the results of the interim analysis or external information while controlling the type I error rate. If the sample size is not changed during the trial, the proposed design shows very similar characteristics to the optimal two-stage design of Chang et al. (Biometrics 1987; 43:865–874). However, the new design allows the use of mid-course information for the planning of the second stage, thus meeting practical requirements when performing clinical phase II trials in oncology.
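As a baseline for such designs, the exact rejection probability of a classical fixed two-stage single-arm design can be computed as below. This is a sketch with made-up design parameters, not the paper's adaptive procedure, which additionally lets the stage-2 sample size change at interim while controlling this error rate:

```python
from scipy.stats import binom

def reject_prob(p, n1, r1, n2, r):
    # continue to stage 2 iff stage-1 responses x1 > r1;
    # reject H0 (declare activity) iff total responses > r
    prob = 0.0
    for x1 in range(r1 + 1, n1 + 1):
        prob += binom.pmf(x1, n1, p) * binom.sf(r - x1, n2, p)
    return prob

# made-up design parameters; value below is the type I error at p0 = 0.05
print(round(reject_prob(0.05, n1=10, r1=0, n2=19, r=3), 4))
```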
27.
In this paper, a new test statistic is presented for testing the null hypothesis of equal multinomial cell probabilities against various trend alternatives. Exact asymptotic critical values are obtained. The power of the test is compared with that of several other statistics considered by Choulakian et al. (1995), and the test is shown to have better power for certain trend alternatives.
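A generic Monte Carlo version of such a test, using a simple linear-score statistic as a stand-in for the statistic actually proposed in the paper, could look like this:

```python
import numpy as np

def trend_pvalue(counts, n_sim=100_000, seed=0):
    counts = np.asarray(counts)
    k, n = len(counts), counts.sum()
    scores = np.arange(1, k + 1)
    t_obs = scores @ counts                      # linear-score trend statistic
    rng = np.random.default_rng(seed)
    sims = rng.multinomial(n, np.full(k, 1 / k), size=n_sim)  # H0: equal cells
    return np.mean(sims @ scores >= t_obs)       # one-sided Monte Carlo p-value

print(trend_pvalue([3, 6, 9, 14]))   # counts increasing across 4 cells
```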
28.
We introduce a new family of distributions using the truncated discrete Linnik distribution. This is a rich family that includes many important families of distributions, such as the Marshall–Olkin family, the family generated through the truncated negative binomial distribution, and the family generated through the truncated discrete Mittag–Leffler distribution. Some properties of the new family are derived. A particular case, a five-parameter generalization of the Weibull distribution called the discrete Linnik Weibull distribution, is given special attention. This distribution generalizes many distributions, including the extended exponentiated Weibull, exponentiated Weibull, Weibull truncated negative binomial, generalized exponential truncated negative binomial, Marshall–Olkin extended Weibull, Marshall–Olkin generalized exponential, exponential truncated negative binomial, Marshall–Olkin exponential, and generalized exponential distributions. The shape properties, moments, median, distribution of order statistics, stochastic ordering, and stress–strength properties of the new generalized Weibull distribution are derived. The unknown parameters are estimated by the maximum likelihood method. The discrete Linnik Weibull distribution is fitted to a survival time data set and shown to be more appropriate than competing models.
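As a minimal stand-in for fitting the five-parameter model, the sketch below estimates the ordinary two-parameter Weibull distribution, the baseline that all the listed generalizations extend, by maximum likelihood:

```python
import numpy as np
from scipy.optimize import minimize

def weibull_mle(x):
    x = np.asarray(x, dtype=float)
    def nll(theta):
        k, lam = np.exp(theta)   # log-parameterization keeps k, lam positive
        # -log-likelihood of Weibull pdf f(x) = (k/lam)(x/lam)^(k-1) exp(-(x/lam)^k)
        return -np.sum(np.log(k / lam) + (k - 1) * np.log(x / lam) - (x / lam) ** k)
    res = minimize(nll, x0=[0.0, np.log(x.mean())], method="Nelder-Mead")
    return np.exp(res.x)         # (shape k, scale lam)

rng = np.random.default_rng(3)
print(weibull_mle(rng.weibull(1.5, 5_000) * 2.0))  # true shape 1.5, scale 2.0
```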
29.
30.
We revisit the comparison by Su and Judd (2012) of the mathematical programming with equilibrium constraints (MPEC) and nested fixed point (NFXP) algorithms for estimating structural dynamic models. Their implementation of the nested fixed point algorithm used successive approximations to solve the inner fixed point problem (NFXP-SA). We redo their comparison using the more efficient version of NFXP proposed by Rust (1987), which combines successive approximations with Newton–Kantorovich iterations to solve the fixed point problem (NFXP-NK). We show that MPEC and NFXP are similar in speed and numerical performance when the more efficient NFXP-NK variant is used.
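The NFXP-NK idea, cheap successive approximations followed by Newton–Kantorovich steps on a smoothed Bellman operator, can be sketched on a toy model. All primitives below (payoffs u, transitions P, discount factor beta) are made up for illustration; this is not Rust's bus engine model:

```python
import numpy as np

rng = np.random.default_rng(4)
S, A, beta = 5, 2, 0.95
u = rng.normal(size=(S, A))                    # made-up per-state, per-action payoffs
P = rng.dirichlet(np.ones(S), size=(A, S))     # P[a, s, :] = transition probabilities

def G(V):
    # smoothed Bellman operator G(V)_s = log sum_a exp(u[s, a] + beta*(P_a V)_s)
    vals = u + beta * np.stack([P[a] @ V for a in range(A)], axis=1)
    m = vals.max(axis=1)
    return m + np.log(np.exp(vals - m[:, None]).sum(axis=1))   # stable log-sum-exp

V = np.zeros(S)
for _ in range(50):                            # cheap successive approximations
    V = G(V)
for _ in range(5):                             # Newton-Kantorovich polish
    vals = u + beta * np.stack([P[a] @ V for a in range(A)], axis=1)
    pr = np.exp(vals - vals.max(axis=1, keepdims=True))
    pr /= pr.sum(axis=1, keepdims=True)        # conditional choice probabilities
    J = beta * sum(pr[:, a:a + 1] * P[a] for a in range(A))   # Jacobian of G
    V = V - np.linalg.solve(np.eye(S) - J, V - G(V))          # Newton step on V - G(V)
print("fixed-point residual:", np.max(np.abs(V - G(V))))
```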