81.
Sample size determination for testing the hypothesis of equality of proportions with specified type I and type II error probabilities is often based on the normal approximation to the binomial distribution. When the proportions involved are very small, the exact distribution of the test statistic may not follow the assumed distribution. Consequently, the sample size determined by the test statistic may not yield the specified error probabilities. In this paper the author proposes a square root formula and compares it with several existing sample size approximation methods. It is found that with small proportions (p ≤ .01) the square root formula provides the closest approximation to the exact sample sizes that attain the specified type I and type II error probabilities. The square root formula is simple in form and has the advantage that equal differences are equally detectable.
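The abstract does not reproduce the formula itself. As a minimal sketch, the comparison below uses one common variance-stabilizing square-root form, n = (z_{α/2} + z_β)² / (2(√p1 − √p2)²), next to the usual normal-approximation formula; the square-root expression is an assumption about the kind of formula meant, not the paper's exact result.

```python
import math
from scipy.stats import norm

def n_normal_approx(p1, p2, alpha=0.05, power=0.80):
    """Per-group sample size from the usual normal approximation to the
    two-sample test of proportions (simple unpooled-variance version)."""
    z_a = norm.ppf(1 - alpha / 2)
    z_b = norm.ppf(power)
    var = p1 * (1 - p1) + p2 * (1 - p2)
    return (z_a + z_b) ** 2 * var / (p1 - p2) ** 2

def n_square_root(p1, p2, alpha=0.05, power=0.80):
    """Per-group sample size from a variance-stabilizing square-root
    transform: sqrt(p_hat) has approximate variance 1/(4n), so the
    difference of square roots has variance about 1/(2n).
    (Illustrative form only -- not necessarily the paper's formula.)"""
    z_a = norm.ppf(1 - alpha / 2)
    z_b = norm.ppf(power)
    return (z_a + z_b) ** 2 / (2 * (math.sqrt(p1) - math.sqrt(p2)) ** 2)

if __name__ == "__main__":
    for p1, p2 in [(0.010, 0.005), (0.008, 0.004), (0.006, 0.002)]:
        print(p1, p2,
              round(n_normal_approx(p1, p2)),
              round(n_square_root(p1, p2)))
```

Note that the square-root version depends on p1 and p2 only through √p1 − √p2, which is one way to read the "equal differences are equally detectable" property mentioned in the abstract.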
82.
Economic selection of process parameters has been an important topic in modern statistical process control. The optimum process parameter settings have a major effect on the expected profit/cost per item. There are some concerns with the problem of setting process parameters. Boucher and Jafari (1991) first considered the attribute single sampling plan applied in the selection of the process target. Pulak and Al-Sultan (1996) extended Boucher and Jafari's model and presented the rectifying inspection plan for determining the optimum process mean. In this article, we further propose a modified Pulak and Al-Sultan model for determining the optimum process mean and standard deviation under the rectifying inspection plan with average outgoing quality limit (AOQL) protection. Taguchi's (1986) symmetric quadratic quality loss function is adopted for evaluating the product quality. By solving the modified model, we obtain the optimum process parameters with the maximum expected profit per item while the specified quality level is reached.
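The modified model itself is not given in the abstract. The sketch below only illustrates the general shape of such a problem: numerically maximizing an expected profit per item that combines revenue, a content-proportional cost, and Taguchi's quadratic loss over the process mean and standard deviation. All prices, limits, and the profit expression are hypothetical placeholders, not the paper's model, and the AOQL constraint is omitted.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import norm

# Hypothetical ingredients (stand-ins, not the paper's model):
SELLING_PRICE = 10.0   # revenue per conforming item
UNIT_COST = 1.0        # cost per unit of filled content
LOSS_COEFF = 4.0       # Taguchi loss coefficient k in k*(x - target)^2
TARGET = 5.0           # target value of the quality characteristic
LSL = 4.5              # lower specification limit (items below go to rework)

def expected_profit(params):
    mu, sigma = params
    if sigma <= 0:
        return -np.inf
    # Expected Taguchi quadratic loss E[k (X - T)^2] for X ~ N(mu, sigma^2)
    exp_loss = LOSS_COEFF * ((mu - TARGET) ** 2 + sigma ** 2)
    # Fraction nonconforming (sent to rectifying inspection / rework)
    p_nc = norm.cdf(LSL, loc=mu, scale=sigma)
    return SELLING_PRICE * (1 - p_nc) - UNIT_COST * mu - exp_loss

res = minimize(lambda x: -expected_profit(x), x0=[5.0, 0.3],
               bounds=[(LSL, LSL + 3.0), (1e-3, 1.0)])
print("optimal mean, std:", res.x, "expected profit per item:", -res.fun)
```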
83.
This article discusses regression analysis of current status data, which occur in many fields including cross-sectional studies, demographical investigations, and tumorigenicity experiments (Keiding, 1991; Sun, 2006). For this problem, we focus on the situation where the survival time of interest can be described by the additive hazards model, and a multiple imputation approach is presented for inference. A major advantage of the approach is its simplicity: it can easily be implemented with existing software packages for right-censored failure time data. Extensive simulation studies are conducted and indicate that the approach performs well in practical situations and is comparable to existing methods. The methodology is applied to a set of current status data arising from a tumorigenicity experiment, and model checking is discussed.
84.
The basic idea of an interaction spline model was presented in Barry (1983). The general interaction spline models were proposed by Wahba (1986). The purely periodic spline model, a special case of the general interaction spline models, is considered in this paper. A stepwise approach using generalized cross validation (GCV) for fitting the model is proposed. Because of the nice orthogonality properties of purely periodic functions, the stepwise approach is a promising method for the interaction spline model. The approach can also be generalized to non-purely-periodic spline models, but this is not done here.
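As a minimal sketch of GCV in this setting: a purely periodic smoothing spline can be expressed as a penalized Fourier-series fit, and GCV(λ) = n·RSS/(n − tr(S))² is minimized over the smoothing parameter. The basis size, penalty weights, and toy data below are illustrative assumptions, not the paper's stepwise procedure.

```python
import numpy as np

def fourier_basis(t, K):
    """Periodic (Fourier) design matrix on [0, 1): intercept plus
    sin/cos pairs up to frequency K."""
    cols = [np.ones_like(t)]
    for k in range(1, K + 1):
        cols.append(np.cos(2 * np.pi * k * t))
        cols.append(np.sin(2 * np.pi * k * t))
    return np.column_stack(cols)

def gcv_score(y, X, penalty, lam):
    """GCV(lambda) = n * RSS / (n - tr(S))^2 for the linear smoother
    S = X (X'X + lam * P)^{-1} X'."""
    n = len(y)
    A = np.linalg.solve(X.T @ X + lam * penalty, X.T)
    S = X @ A
    resid = y - S @ y
    return n * np.sum(resid ** 2) / (n - np.trace(S)) ** 2

# Toy data: a smooth periodic signal plus noise.
rng = np.random.default_rng(0)
t = np.sort(rng.uniform(0, 1, 200))
y = np.sin(2 * np.pi * t) + 0.3 * np.cos(6 * np.pi * t) + rng.normal(0, 0.2, 200)

K = 10
X = fourier_basis(t, K)
# Roughness-type penalty: weight k^4 on each sin/cos pair, none on the intercept.
penalty = np.diag([0.0] + [k ** 4 for k in range(1, K + 1) for _ in (0, 1)])

lams = 10.0 ** np.arange(-8, 2)
best = min(lams, key=lambda lam: gcv_score(y, X, penalty, lam))
print("lambda chosen by GCV:", best)
```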
85.
Consider a life testing experiment in which n units are put on test, successive failure times are recorded, and the observation is terminated either at a specified number r of failures or at a specified time T, whichever is reached first. This mixture of type I and type II censoring schemes, called hybrid censoring, is widely used. Under this censoring scheme and the assumption of an exponential life distribution, the distribution of the maximum likelihood estimator of the mean life θ is derived. It is then used to construct an exact lower confidence bound for θ.
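The exact distribution and the confidence bound are not reproduced in the abstract; the sketch below only shows the data-generating scheme and the standard MLE for exponential lifetimes under hybrid censoring, θ̂ = (total time on test)/(number of failures). The simulation settings are illustrative.

```python
import numpy as np

def hybrid_censored_mle(theta, n, r, T, rng):
    """Simulate one hybrid-censored exponential sample and return the
    MLE of the mean life: (total time on test) / (number of failures).
    Observation stops at the r-th failure or at time T, whichever comes first."""
    times = np.sort(rng.exponential(theta, n))
    stop = min(times[r - 1], T)            # actual stopping time
    failures = times[times <= stop]
    d = len(failures)                      # number of observed failures
    ttt = failures.sum() + (n - d) * stop  # total time on test
    return ttt / d if d > 0 else np.nan    # MLE undefined with no failures

rng = np.random.default_rng(1)
est = [hybrid_censored_mle(theta=100.0, n=20, r=10, T=500.0, rng=rng)
       for _ in range(5000)]
print("mean of MLE over simulations:", np.nanmean(est))
```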
86.
Consider a life testing experiment in which n units are put on test, successive failure times are recorded, and the observation is terminated either at a specified number r of failures or at a specified time T, whichever is reached first. This mixture of type I and type II censoring schemes, called hybrid censoring, is widely used. Under this censoring scheme and the assumption of an exponential life distribution, the distribution of the maximum likelihood estimator of the mean life θ is derived. It is then used to construct an exact lower confidence bound for θ.
87.
In a two-treatment trial, a two-sided test is often used to reach a conclusion. Usually we are interested in a two-sided test because there is no prior preference between the two treatments and we want a three-decision framework. When the standard control is just as good as the new experimental treatment (with the same toxicity and cost), we accept both treatments; only when the standard control is clearly worse or better than the new experimental treatment do we choose a single treatment. In this paper, we extend the concept of a two-sided test to the multiple-treatment trial where three or more treatments are involved. The procedure turns out to be a subset selection procedure; however, the theoretical framework and performance requirement are different from those of existing subset selection procedures. Two procedures (exclusion and inclusion) are developed here for the case of normal data with equal known variance. If the sample size is large, they can be applied with unknown variance and with binomial data or survival data with random censoring.
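For readers unfamiliar with subset selection, the sketch below shows a classical Gupta-style rule for normal means with common known variance as a reference point only; it is plainly not the exclusion/inclusion procedures developed in the paper, and the constant used is a crude approximation.

```python
import numpy as np
from scipy.stats import norm

def gupta_subset(means, sigma, n, p_star=0.95):
    """Classical Gupta-style subset selection for k normal means with
    common known variance: retain treatment i if
        xbar_i >= max_j xbar_j - d,   d = c * sigma * sqrt(2 / n),
    where c is chosen so the best treatment is retained with probability
    at least p_star (a simple normal-quantile approximation to c is used
    here instead of the exact multivariate constant)."""
    means = np.asarray(means, dtype=float)
    k = len(means)
    c = norm.ppf(p_star ** (1.0 / (k - 1)))   # crude approximation
    d = c * sigma * np.sqrt(2.0 / n)
    return np.flatnonzero(means >= means.max() - d)

# Example: four treatments, 25 observations each, sigma known.
selected = gupta_subset([4.8, 5.0, 5.6, 5.5], sigma=1.0, n=25)
print("treatments retained in the subset:", selected)
```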
88.
The p-value-based adjustment of individual endpoints and the global test for an overall inference are the two general approaches for the analysis of multiple endpoints. Statistical procedures developed for testing multivariate outcomes often assume that the multivariate endpoints are either independent or normally distributed. This paper presents a general approach for the analysis of multivariate binary data under the framework of generalized linear models. The generalized estimating equations (GEE) approach is applied to estimate the correlation matrix of the test statistics using the identity and exchangeable working correlation matrices with the model-based as well as robust estimators. The objectives of the approaches are the adjustment of the p-values of individual endpoints to identify the affected endpoints, and the global test of an overall effect. A Monte Carlo simulation was conducted to evaluate the overall familywise error (FWE) rates of the single-step-down p-value adjustment approach under two adjustment methods and of three global test statistics. The p-value adjustment approach appears to control the FWE better than the global approach. Applications of the proposed methods are illustrated by analyzing a carcinogenicity experiment designed to study the dose-response trend for 10 tumor sites, and a developmental toxicity experiment with three malformation types: external, visceral, and skeletal.
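A minimal sketch of the overall workflow, under simplifying assumptions: correlated binary endpoints are fit by GEE with an exchangeable working correlation (statsmodels), and endpoint-specific p-values are then adjusted. The Holm step-down adjustment is used here as a simpler stand-in for the correlation-based adjustment described in the paper, and the toy data, design, and variable names are all hypothetical.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm
from statsmodels.stats.multitest import multipletests

rng = np.random.default_rng(0)
n_subj, n_endpoints = 200, 3

# Toy long-format data: one binary outcome per endpoint per subject; a
# shared subject effect induces within-subject correlation, and only
# endpoint 0 has a true dose effect.
subj = np.repeat(np.arange(n_subj), n_endpoints)
endpoint = np.tile(np.arange(n_endpoints), n_subj)
dose = np.repeat(rng.choice([0.0, 1.0, 2.0], n_subj), n_endpoints)
u = np.repeat(rng.normal(0, 0.8, n_subj), n_endpoints)
logit = -1.0 + 0.4 * dose * (endpoint == 0) + u
y = rng.binomial(1, 1 / (1 + np.exp(-logit)))

df = pd.DataFrame({"y": y, "dose": dose, "subj": subj,
                   "endpoint": pd.Categorical(endpoint)})

# GEE with an exchangeable working correlation; endpoint-specific dose slopes.
X = pd.get_dummies(df["endpoint"], prefix="ep").astype(float)
for k in range(n_endpoints):
    X[f"dose_ep{k}"] = df["dose"] * (df["endpoint"] == k)
fit = sm.GEE(df["y"], X, groups=df["subj"],
             family=sm.families.Binomial(),
             cov_struct=sm.cov_struct.Exchangeable()).fit()

# Endpoint-specific dose p-values, then a Holm step-down adjustment
# (a simpler stand-in for the paper's correlation-based adjustment).
pvals = [fit.pvalues[f"dose_ep{k}"] for k in range(n_endpoints)]
reject, p_adj, _, _ = multipletests(pvals, alpha=0.05, method="holm")
print("adjusted p-values:", p_adj, "affected endpoints:", np.flatnonzero(reject))
```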
89.
When the X̄ control chart is used to monitor a process, three parameters should be determined: the sample size, the sampling interval between successive samples, and the control limits of the chart. Duncan presented a cost model to determine the three parameters for an X̄ chart. Alexander et al. combined Duncan's cost model with the Taguchi loss function to present a loss model for determining the three parameters. In this paper, the Burr distribution is employed to conduct the economic-statistical design of X̄ charts for non-normal data. Alexander's loss model is used as the objective function, and the cumulative distribution function of the Burr distribution is applied to derive the statistical constraints of the design. An example is presented to illustrate the solution procedure. From the results of the sensitivity analyses, we find that small values of the skewness coefficient have no significant effect on the optimal design; however, a larger value of the skewness coefficient leads to a slightly larger sample size and sampling interval, as well as wider control limits. Meanwhile, an increase in the kurtosis coefficient results in an increase in the sample size and wider control limits.
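To illustrate the statistical-constraint side of such a design, the sketch below estimates the in-control false-alarm probability of an X̄ chart when the process data follow a Burr XII distribution, F(x) = 1 − (1 + x^c)^(−k), instead of the normal. The shape parameters and the Monte Carlo approach are illustrative; the paper works analytically with the Burr cumulative function and tabulated constants not reproduced here.

```python
import numpy as np

def burr_xii_rvs(c, k, size, rng):
    """Burr XII samples by inverse CDF: F(x) = 1 - (1 + x^c)^(-k), x > 0."""
    u = rng.uniform(size=size)
    return ((1 - u) ** (-1.0 / k) - 1.0) ** (1.0 / c)

def false_alarm_prob(c, k, n, L=3.0, n_sim=200_000, seed=0):
    """Monte Carlo estimate of the in-control false-alarm probability of an
    X-bar chart with limits at +/- L standard errors when the (standardized)
    process data actually follow a Burr XII distribution."""
    rng = np.random.default_rng(seed)
    x = burr_xii_rvs(c, k, (n_sim, n), rng)
    x = (x - x.mean()) / x.std()          # standardize to mean 0, sd 1
    xbar = x.mean(axis=1)
    return np.mean(np.abs(xbar) > L / np.sqrt(n))

# Illustrative Burr XII shape pairs (c, k); compare with the normal-theory
# false-alarm rate of about 0.0027 for 3-sigma limits.
for c, k in [(4.874, 6.158), (3.0, 2.0)]:
    print(c, k, false_alarm_prob(c, k, n=5))
```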
90.
The authors give easy-to-check sufficient conditions for the geometric ergodicity and the finiteness of the moments of a random process xt = φ(xt−1,…, xt−p) + εt σ(xt−1,…, xt−q), in which φ: R^p → R, σ: R^q → R, and (εt) is a sequence of independent and identically distributed random variables. They deduce strong mixing properties for this class of nonlinear autoregressive models with changing conditional variances, which includes, among others, the ARCH(p), the AR(p)-ARCH(p), and the double-threshold autoregressive models.
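As a small illustration of the model class (not of the paper's conditions themselves), the sketch below simulates an AR(1)-ARCH(1) member, xt = a·xt−1 + εt·sqrt(w + b·xt−1²), with illustrative parameter values chosen in a regime where the chain mixes quickly.

```python
import numpy as np

def simulate_ar_arch(n, a=0.5, w=0.1, b=0.3, burn=500, seed=0):
    """Simulate x_t = phi(x_{t-1}) + eps_t * sigma(x_{t-1}) with
    phi(x) = a*x (AR(1) part) and sigma(x) = sqrt(w + b*x^2) (ARCH(1) part),
    one member of the nonlinear autoregressive class in the abstract."""
    rng = np.random.default_rng(seed)
    x = np.empty(n + burn)
    x[0] = 0.0
    eps = rng.standard_normal(n + burn)
    for t in range(1, n + burn):
        x[t] = a * x[t - 1] + eps[t] * np.sqrt(w + b * x[t - 1] ** 2)
    return x[burn:]

x = simulate_ar_arch(10_000)
# A quickly decaying sample autocorrelation is consistent with strong mixing.
acf1 = np.corrcoef(x[:-1], x[1:])[0, 1]
print("sample mean:", x.mean(), "lag-1 autocorrelation:", acf1)
```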