981.
ABSTRACT

In profile monitoring, control charts are proposed to detect unanticipated changes, and the in-control parameters are usually assumed to be known. In practice, however, certain prespecified changes may appear in the process because of the characteristics of the system, and in most applications the in-control parameters are unknown. To address these issues, we develop zone control charts with estimated parameters to detect small shifts of these prespecified changes. The effects of estimation error on the performance of the proposed charts are investigated. To account for practitioner-to-practitioner variability, the expected average run length (ARL) and the standard deviation of the average run length (SDARL) are used as performance metrics. Our results show that estimation error leads to significant variation in the ARL distribution. Furthermore, to adequately reduce this variability, more Phase I samples are required under the SDARL metric than under the expected ARL metric. In addition, more observations on each sampled profile are recommended to improve the charts' performance, especially for small Phase I sample sizes. Finally, an illustrative example shows the performance of the proposed zone control charts.
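The practitioner-to-practitioner variability described above can be illustrated with a small Monte Carlo sketch. For brevity it uses a plain 3-sigma individuals chart in place of the paper's zone charts, and all constants (Phase I size, number of runs, the run-length cap) are illustrative: each simulated "practitioner" estimates limits from their own Phase I data, so each gets a different in-control ARL, and the spread of those ARLs is the SDARL.

```python
import random
import statistics

def run_length(limits, rng, max_t=2000):
    """Chart in-control N(0, 1) observations against estimated limits;
    return the time of the first (false) alarm, censored at max_t."""
    lo, hi = limits
    for t in range(1, max_t + 1):
        x = rng.gauss(0.0, 1.0)
        if x < lo or x > hi:
            return t
    return max_t

def practitioner_arl(m, n_runs, rng):
    """One practitioner: estimate mu and sigma from m Phase I observations,
    set 3-sigma limits, then estimate that practitioner's in-control ARL."""
    phase1 = [rng.gauss(0.0, 1.0) for _ in range(m)]
    mu_hat = statistics.fmean(phase1)
    sd_hat = statistics.stdev(phase1)
    limits = (mu_hat - 3.0 * sd_hat, mu_hat + 3.0 * sd_hat)
    return statistics.fmean(run_length(limits, rng) for _ in range(n_runs))

rng = random.Random(1)
arls = [practitioner_arl(m=50, n_runs=100, rng=rng) for _ in range(50)]
print(f"expected ARL ~ {statistics.fmean(arls):.0f}")
print(f"SDARL ~ {statistics.stdev(arls):.0f}")
```

Increasing `m` shrinks the SDARL much more slowly than it stabilizes the expected ARL, which is the paper's point about needing more Phase I samples under the SDARL criterion.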
982.
Abstract

In time series analysis, it is essential to check the independence of the data with an appropriate statistical test before any further analysis. Among the available independence tests, a powerful and productive one was introduced by Matilla-García and Marín via an m-dimensional vectorial process in which the value of the process at time t consists of the m-history of the primary process. However, this construction induces dependence among the vectors even when the underlying random variables are independent. Accounting for this dependence, a modified test is obtained in this article by deriving a new asymptotic distribution based on weighted chi-square random variables. Further modifications of the test are made via the bootstrap and by controlling the overlap. Compared with the original test, the modified test is shown to be not only more accurate but also more powerful.
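The overlap problem the modification addresses can be seen in a small sketch. Below is a naive symbolic statistic in the spirit of the Matilla-García and Marín construction (function names and the choice m = 3 are illustrative, not the paper's code): each overlapping m-history is symbolized by its ordinal pattern, and a likelihood-ratio statistic compares pattern frequencies with the uniform distribution over the m! patterns implied by iid data. The nominal chi-square reference has m! − 1 degrees of freedom, but consecutive m-histories share m − 1 observations and are therefore dependent, which is exactly what the weighted chi-square correction accounts for.

```python
import math
import random

def ordinal_pattern(window):
    # rank order of the m values, as a tuple (assumes no ties)
    return tuple(sorted(range(len(window)), key=window.__getitem__))

def symbolic_independence_stat(x, m):
    """Likelihood-ratio statistic over ordinal patterns of overlapping
    m-histories; under iid data every pattern has probability 1/m!."""
    n = len(x) - m + 1
    counts = {}
    for i in range(n):
        pat = ordinal_pattern(x[i:i + m])
        counts[pat] = counts.get(pat, 0) + 1
    k = math.factorial(m)
    g = 2.0 * sum(c * math.log(c * k / n) for c in counts.values())
    return g, k - 1  # statistic and its *nominal* chi-square df

rng = random.Random(0)
x = [rng.random() for _ in range(2000)]
g, df = symbolic_independence_stat(x, m=3)
print(f"G = {g:.2f} on {df} nominal df")
```

For truly iid data the statistic is small relative to the nominal chi-square quantiles, but the overlap means the nominal reference is only approximate, hence the modified asymptotic distribution.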
983.
In this paper a specification strategy is proposed for determining the orders of ARMA models. The strategy is based on two newly defined concepts: the q-conditioned partial autoregressive function and the p-conditioned partial moving average function. These concepts are similar to the generalized partial autocorrelation function, which has recently been suggested for order determination. The main difference is that they are defined and employed in connection with an asymptotically efficient estimation method instead of the rather inefficient generalized Yule–Walker method. The specification is performed using sequential Wald-type tests. In contrast to traditional hypothesis testing, these tests use critical values that increase with the sample size at an appropriate rate.
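The flavor of sequential identification with sample-size-dependent critical values can be shown in a much simpler setting: pure AR order selection with the ordinary partial autocorrelation function, keeping lags whose sample PACF exceeds a band that grows (relative to the usual 2/√n band) with n. This is only an illustrative analogue of the paper's conditioned functions and Wald tests, and the √(2 log n / n) threshold below is an assumption for the sketch.

```python
import math
import random

def acf(x, max_lag):
    n = len(x)
    m = sum(x) / n
    c0 = sum((v - m) ** 2 for v in x) / n
    return [sum((x[t] - m) * (x[t + k] - m) for t in range(n - k)) / n / c0
            for k in range(max_lag + 1)]

def pacf(x, max_lag):
    """Sample PACF via the Durbin-Levinson recursion."""
    r = acf(x, max_lag)
    phi = [[0.0] * (max_lag + 1) for _ in range(max_lag + 1)]
    out = []
    for k in range(1, max_lag + 1):
        num = r[k] - sum(phi[k - 1][j] * r[k - j] for j in range(1, k))
        den = 1.0 - sum(phi[k - 1][j] * r[j] for j in range(1, k))
        phi[k][k] = num / den
        for j in range(1, k):
            phi[k][j] = phi[k - 1][j] - phi[k][k] * phi[k - 1][k - j]
        out.append(phi[k][k])
    return out

# simulate an AR(1) series with coefficient 0.6
rng = random.Random(6)
x, prev = [], 0.0
for _ in range(1000):
    prev = 0.6 * prev + rng.gauss(0.0, 1.0)
    x.append(prev)

n = len(x)
cut = math.sqrt(2.0 * math.log(n) / n)  # threshold growing with n (vs. 2/sqrt(n))
order = max([k for k, v in enumerate(pacf(x, 10), 1) if abs(v) > cut], default=0)
print("selected AR order:", order)
```

Because the threshold grows with n at an appropriate rate, the probability of spuriously retaining a superfluous lag goes to zero, mirroring the consistency argument behind the paper's sequential tests.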
984.
Economic selection of process parameters is an important topic in modern statistical process control, since the process parameter settings have a major effect on the expected profit or cost per item. Boucher and Jafari (1991) first considered an attribute single sampling plan applied to the selection of the process target. Pulak and Al-Sultan (1996) extended Boucher and Jafari's model and presented a rectifying inspection plan for determining the optimum process mean. In this article, we further propose a modified Pulak and Al-Sultan model for determining the optimum process mean and standard deviation under a rectifying inspection plan with average outgoing quality limit (AOQL) protection. Taguchi's (1986) symmetric quadratic quality loss function is adopted for evaluating product quality. By solving the modified model, we obtain the optimum process parameters that maximize the expected profit per item while attaining the specified quality level.
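The structure of such a process-targeting problem can be sketched with a stylized filling example: revenue per item, material cost that grows with the process mean, a rework cost for under-filled items, and Taguchi's quadratic loss around the target. All constants below are illustrative assumptions, and the model omits the paper's sampling-plan and AOQL machinery; it only shows the profit trade-off that the optimum mean balances.

```python
import math

def norm_cdf(z):
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

def expected_profit(mu, sigma=1.0, a=10.0, c=0.5, r=2.0, L=8.0, T=10.0, k=0.08):
    """Stylized filling problem: revenue a, material cost c*mu, rework cost r
    for under-filled items (x < L), and expected Taguchi loss around target T.
    E[k (x - T)^2] = k (sigma^2 + (mu - T)^2) for x ~ N(mu, sigma^2)."""
    taguchi = k * (sigma ** 2 + (mu - T) ** 2)
    return a - c * mu - r * norm_cdf((L - mu) / sigma) - taguchi

# grid-search the process mean over [8.00, 11.99]
best = max((expected_profit(m / 100.0), m / 100.0) for m in range(800, 1200))
print(f"optimum mean ~ {best[1]:.2f}, expected profit ~ {best[0]:.2f}")
```

Raising the mean reduces rework but increases material cost and quadratic loss, so the optimum sits below the target here; in the paper's model the same trade-off is resolved jointly for the mean and standard deviation under the inspection plan.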
985.
Recently, several new control chart procedures for short production runs have been introduced. Bothe (1989) and Burr (1989) proposed control chart statistics obtained by scaling the quality characteristic by target values or process estimates of a location and scale parameter. The performance of these control charts can be significantly degraded by incorrect scaling parameters, resulting in either an excessive false alarm rate or insensitivity to moderate shifts in the process. To correct these deficiencies, Quesenberry (1990, 1991) developed the Q-chart, which is formed from running process estimates of the sample mean and variance. When both the process mean and variance are unknown, the Q-chart statistic is formed from the standard inverse Z-transformation of a t-statistic. Q-charts do not perform correctly, however, in the presence of special-cause disturbances at process startup. This has recently been supported by results published by Del Castillo and Montgomery (1992), who recommend an alternative control chart procedure based on a first-order adaptive Kalman filter model. Consistent with their recommendations, we propose an alternative short-run control chart procedure based on the second-order dynamic linear model (DLM). The control chart is shown to be useful for the early detection of unwanted process trends. Model and control chart parameters are updated sequentially in a Bayesian estimation framework, providing great flexibility in the level of prior information incorporated into the model. The result is a weighted moving average control chart statistic that can also provide running estimates of process capability.
The average run length performance of the control chart is compared with the optimal performance of the exponentially weighted moving average (EWMA) chart, as reported by Gan (1991). Using simulation, the second-order DLM control chart is shown to provide better overall performance than the EWMA for short production run applications.
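A minimal version of the second-order DLM monitoring idea can be sketched with a local linear trend (level + slope) Kalman filter whose standardized one-step forecast errors serve as the control statistic. The variances V and W below are illustrative assumptions, and for brevity they are held fixed rather than updated in the full Bayesian fashion the abstract describes.

```python
def dlm_filter(y, V=1.0, W=(0.1, 0.01)):
    """Kalman filter for a second-order (local linear trend) DLM:
    state = (level, slope), y_t = level_t + noise. Returns the one-step
    forecast errors standardized by their forecast variance."""
    m = [0.0, 0.0]                    # posterior state mean
    C = [[1e6, 0.0], [0.0, 1e6]]      # diffuse prior covariance
    zs = []
    for obs in y:
        # predict: a = G m, R = G C G' + diag(W), with G = [[1, 1], [0, 1]]
        a = [m[0] + m[1], m[1]]
        R = [[C[0][0] + C[0][1] + C[1][0] + C[1][1] + W[0], C[0][1] + C[1][1]],
             [C[1][0] + C[1][1], C[1][1] + W[1]]]
        f, Q = a[0], R[0][0] + V      # one-step forecast mean and variance
        e = obs - f
        zs.append(e / Q ** 0.5)
        # update: Kalman gain A = R F' / Q with F = [1, 0]
        A = [R[0][0] / Q, R[1][0] / Q]
        m = [a[0] + A[0] * e, a[1] + A[1] * e]
        C = [[R[0][0] - A[0] * R[0][0], R[0][1] - A[0] * R[0][1]],
             [R[1][0] - A[1] * R[0][0], R[1][1] - A[1] * R[0][1]]]
    return zs

# noise-free demo: stable level, then an unwanted trend sets in at t = 30
y = [10.0] * 30 + [10.0 + 2.0 * k for k in range(1, 21)]
zs = dlm_filter(y)
print(f"|z| just before trend: {abs(zs[29]):.3f}, at trend onset: {zs[30]:.2f}")
```

Plotting the standardized errors against fixed limits gives the trend-detection chart: they settle near zero while the level is stable and jump as soon as the trend begins, because the slope component has not yet been learned.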
986.
The statistical properties of control charts are usually evaluated under the assumption that observations from the process are independent. For many processes, however, observations that are closely spaced in time are correlated. This paper considers EWMA and CUSUM control charts for the process mean when the observations follow an AR(1) process with additional random error, a simple model that may be reasonable for many processes encountered in practice. The ARL and steady-state ARL of the EWMA and CUSUM charts are evaluated numerically using an integral equation approach and a Markov chain approach. The numerical results show that correlation can have a significant effect on the properties of these charts. Tables are given to aid in the design of these charts when the observations follow the assumed model.
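The effect of correlation is easy to demonstrate by simulation (the paper itself evaluates the ARL numerically via integral equations and Markov chains; this Monte Carlo version, with illustrative parameter values, only shows the qualitative point). An EWMA chart is designed with the usual iid-based limit applied to the process's total variance, and its in-control ARL is estimated for independent versus autocorrelated observations under the AR(1)-plus-error model.

```python
import random

def run_length(phi, lam, L, rng, sigma_a=1.0, sigma_e=0.5, max_t=20000):
    """In-control run length of an EWMA chart whose limit uses the iid
    formula with the process's total variance (a naive design)."""
    var_y = sigma_a ** 2 / (1.0 - phi ** 2) + sigma_e ** 2
    limit = L * (var_y * lam / (2.0 - lam)) ** 0.5
    x, z = 0.0, 0.0
    for t in range(1, max_t + 1):
        x = phi * x + rng.gauss(0.0, sigma_a)   # latent AR(1) component
        y = x + rng.gauss(0.0, sigma_e)         # observed = AR(1) + error
        z = lam * y + (1.0 - lam) * z
        if abs(z) > limit:
            return t
    return max_t

rng = random.Random(2)
arls = {}
for phi in (0.0, 0.7):
    rls = [run_length(phi, lam=0.2, L=2.86, rng=rng) for _ in range(300)]
    arls[phi] = sum(rls) / len(rls)
    print(f"phi={phi}: in-control ARL ~ {arls[phi]:.0f}")
```

Positive autocorrelation inflates the variance of the EWMA statistic beyond what the iid formula assumes, so the in-control ARL collapses and false alarms become frequent, which is why the paper's correlation-adjusted design tables are needed.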
987.
Abstract

In a quantitative linear model with errors following a stationary Gaussian, first-order autoregressive or AR(1) process, Generalized Least Squares (GLS) on raw data and Ordinary Least Squares (OLS) on prewhitened data are efficient methods of estimation of the slope parameters when the autocorrelation parameter of the error AR(1) process, ρ, is known. In practice, ρ is generally unknown. In the so-called two-stage estimation procedures, ρ is then estimated first before using the estimate of ρ to transform the data and estimate the slope parameters by OLS on the transformed data. Different estimators of ρ have been considered in previous studies. In this article, we study nine two-stage estimation procedures for their efficiency in estimating the slope parameters. Six of them (i.e., three noniterative, three iterative) are based on three estimators of ρ that have been considered previously. Two more (i.e., one noniterative, one iterative) are based on a new estimator of ρ that we propose: it is provided by the sample autocorrelation coefficient of the OLS residuals at lag 1, denoted r(1). Lastly, REstricted Maximum Likelihood (REML) represents a different type of two-stage estimation procedure whose efficiency has not been compared to the others yet. We also study the validity of the testing procedures derived from GLS and the nine two-stage estimation procedures. Efficiency and validity are analyzed in a Monte Carlo study. Three types of explanatory variable x in a simple quantitative linear model with AR(1) errors are considered in the time domain: Case 1, x is fixed; Case 2, x is purely random; and Case 3, x follows an AR(1) process with the same autocorrelation parameter value as the error AR(1) process. In a preliminary step, the number of inadmissible estimates and the efficiency of the different estimators of ρ are compared empirically, whereas their approximate expected value in finite samples and their asymptotic variance are derived theoretically. 
Thereafter, the efficiency of the estimation procedures and the validity of the derived testing procedures are discussed in terms of the sample size and the magnitude and sign of ρ. The noniterative two-stage estimation procedure based on the new estimator of ρ is shown to be more efficient for moderate values of ρ at small sample sizes. Except at small sample sizes, REML and its derived F-test perform best overall. The asymptotic equivalence of the two-stage estimation procedures other than REML is observed empirically. Differences related to the nature of the explanatory variable, fixed or random (uncorrelated or autocorrelated), are also discussed.
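The noniterative two-stage procedure based on r(1) can be sketched in a few lines: fit OLS, take the lag-1 sample autocorrelation of the residuals as the estimate of ρ, quasi-difference (prewhiten) the data, and refit OLS. The sketch below is a generic illustration in a simulated Case 2 setting (purely random x); the true parameter values are illustrative.

```python
import random

def ols(x, y):
    """Simple-regression OLS: returns (intercept, slope)."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    b = sxy / sxx
    return my - b * mx, b

def two_stage(x, y):
    """Noniterative two-stage fit: OLS, then rho_hat = r(1) of the OLS
    residuals, then OLS on the quasi-differenced data."""
    a0, b0 = ols(x, y)
    e = [yi - a0 - b0 * xi for xi, yi in zip(x, y)]
    rho = sum(e[t] * e[t - 1] for t in range(1, len(e))) / sum(v ** 2 for v in e)
    xs = [x[t] - rho * x[t - 1] for t in range(1, len(x))]
    ys = [y[t] - rho * y[t - 1] for t in range(1, len(y))]
    a1, b1 = ols(xs, ys)
    return rho, a1 / (1.0 - rho), b1   # intercept rescaled by (1 - rho)

# simulate y = 1 + 2 x + u with AR(1) errors u_t = 0.6 u_{t-1} + eps_t
rng = random.Random(3)
n = 500
x = [rng.gauss(0.0, 1.0) for _ in range(n)]
u, us = 0.0, []
for _ in range(n):
    u = 0.6 * u + rng.gauss(0.0, 1.0)
    us.append(u)
y = [1.0 + 2.0 * xi + ui for xi, ui in zip(x, us)]
rho, a, b = two_stage(x, y)
print(f"rho_hat = {rho:.2f}, intercept = {a:.2f}, slope = {b:.2f}")
```

The quasi-differenced model has intercept α(1 − ρ), hence the rescaling; the iterative variants studied in the article simply repeat the residual/transform cycle until ρ̂ stabilizes.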
988.
We consider the Prais–Houthakker heteroscedastic normal regression model, in which the variance of the dependent variable equals the square of its expectation. Bayes predictors for the regression coefficient and the mean of a finite population are derived under Zellner's balanced loss function. Bayes predictive expected losses are obtained and compared with those of classical predictors and of Bayes predictors under squared error loss to examine their loss robustness.
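Under a balanced loss of the form L(θ, δ) = w(δ − θ̂)² + (1 − w)(δ − θ)², with θ̂ a classical (least-squares-type) estimate, the Bayes rule is the convex combination wθ̂ + (1 − w)E[θ | data], interpolating between the classical predictor (w = 1) and the squared-error Bayes predictor (w = 0). A tiny normal-normal sketch, with all numbers illustrative and no claim to reproduce the paper's heteroscedastic setting:

```python
def posterior_mean_normal(xbar, n, sigma2, mu0, tau2):
    """Posterior mean of a normal mean with known variance sigma2
    and a N(mu0, tau2) prior: precision-weighted average."""
    w_data = n / sigma2
    w_prior = 1.0 / tau2
    return (w_data * xbar + w_prior * mu0) / (w_data + w_prior)

def balanced_loss_bayes(w, theta_classical, theta_post_mean):
    """Bayes rule under balanced loss: convex combination of the
    classical estimate and the posterior mean."""
    return w * theta_classical + (1.0 - w) * theta_post_mean

xbar, n = 4.2, 25   # illustrative sample mean and size
post = posterior_mean_normal(xbar, n, sigma2=4.0, mu0=0.0, tau2=1.0)
for w in (0.0, 0.5, 1.0):
    print(f"w = {w}: predictor = {balanced_loss_bayes(w, xbar, post):.3f}")
```

The weight w governs the robustness comparison in the abstract: as w moves from 0 to 1, the predictor trades posterior shrinkage for fidelity to the classical estimate.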
989.
The classical confidence interval approach has failed to find exact intervals, or even a consensus on the best approximate intervals, for the ratio of two binomial probabilities, the so-called risk ratio. The problem is reexamined from a Bayesian viewpoint, and a simple graphical presentation of the risk ratio assessment is given in such a way that sensitivity to the selected prior distribution can be readily examined.
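The Bayesian risk ratio assessment is straightforward to compute by simulation: with independent beta priors, the two posteriors are beta as well, and draws of p1/p2 give the posterior of the risk ratio directly. The Jeffreys Beta(1/2, 1/2) prior and the counts below are illustrative defaults; rerunning with different priors is exactly the sensitivity check the abstract recommends (a histogram of the draws gives the graphical presentation).

```python
import random
import statistics

def risk_ratio_posterior(x1, n1, x2, n2, draws=20000, seed=4):
    """Posterior draws of p1/p2 under independent Jeffreys Beta(0.5, 0.5)
    priors; returns (2.5% quantile, median, 97.5% quantile)."""
    rng = random.Random(seed)
    rr = []
    for _ in range(draws):
        p1 = rng.betavariate(x1 + 0.5, n1 - x1 + 0.5)
        p2 = rng.betavariate(x2 + 0.5, n2 - x2 + 0.5)
        rr.append(p1 / p2)
    rr.sort()
    return rr[int(0.025 * draws)], statistics.median(rr), rr[int(0.975 * draws)]

# illustrative data: 15/100 events vs. 5/100 events
lo, med, hi = risk_ratio_posterior(15, 100, 5, 100)
print(f"risk ratio: median {med:.2f}, 95% interval ({lo:.2f}, {hi:.2f})")
```

Unlike the classical approach, nothing here depends on an approximate pivot for the ratio; the interval is read off the simulated posterior.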
990.
Stochastic Models, 2013, 29(2): 193-227
The Double Chain Markov Model is a fully Markovian model for the representation of time series in random environments. In this article, we show that it can handle high-order transitions between both a set of observations and a set of hidden states. To reduce the number of parameters, each transition matrix can be replaced by a Mixture Transition Distribution model. We provide a complete derivation of the algorithms needed to compute the model. Three applications, the analysis of a DNA sequence, the song of the wood pewee, and the behavior of young monkeys, show that this model is of great interest for representing data that can be decomposed into a finite set of patterns.
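The parameter reduction offered by the Mixture Transition Distribution (MTD) idea can be sketched directly: an order-l transition law is built from a single k-by-k matrix Q and l lag weights, instead of a full k^l-by-k table. The sketch below simulates from an order-2 MTD over two states (all matrices and weights are illustrative; the hidden-chain half of the Double Chain model is omitted).

```python
import random

def mtd_prob(history, nxt, Q, lam):
    """P(X_t = nxt | last l states) = sum_g lam[g] * Q[x_{t-1-g}][nxt].
    Rows of Q and the weights lam each sum to 1, so this is a proper law."""
    return sum(w * Q[history[-1 - g]][nxt] for g, w in enumerate(lam))

def mtd_step(history, Q, lam, rng):
    """Draw the next state from the MTD conditional distribution."""
    k = len(Q)
    u, acc = rng.random(), 0.0
    for j in range(k):
        acc += mtd_prob(history, j, Q, lam)
        if u < acc:
            return j
    return k - 1

Q = [[0.8, 0.2], [0.3, 0.7]]   # one shared transition matrix
lam = [0.7, 0.3]               # weights on lags 1 and 2
rng = random.Random(5)
seq = [0, 1]
for _ in range(50):
    seq.append(mtd_step(seq, Q, lam, rng))
print("".join(map(str, seq)))
# parameter count, order 2 over 2 states: full table 2^2 * (2 - 1) = 4
# free parameters vs. MTD 2 * (2 - 1) + (2 - 1) = 3; the saving grows
# quickly with the order and the number of states
```

In the Double Chain model the same replacement is applied to the high-order transition matrices of both the observed and the hidden chains, which is what keeps high-order fits tractable.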