Full-text access type
Paid full text | 333 articles |
Free | 7 articles |
Free (domestic) | 3 articles |
Subject classification
Management | 4 articles |
Talent studies | 1 article |
Collected works and series | 7 articles |
Theory and methodology | 1 article |
General | 136 articles |
Statistics | 194 articles |
Publication year
2022 | 1 article |
2020 | 3 articles |
2019 | 7 articles |
2018 | 6 articles |
2017 | 18 articles |
2016 | 17 articles |
2015 | 9 articles |
2014 | 6 articles |
2013 | 69 articles |
2012 | 11 articles |
2011 | 12 articles |
2010 | 12 articles |
2009 | 14 articles |
2008 | 20 articles |
2007 | 16 articles |
2006 | 10 articles |
2005 | 7 articles |
2004 | 5 articles |
2003 | 16 articles |
2002 | 5 articles |
2001 | 6 articles |
2000 | 5 articles |
1999 | 10 articles |
1998 | 6 articles |
1997 | 6 articles |
1996 | 2 articles |
1995 | 6 articles |
1994 | 4 articles |
1993 | 3 articles |
1992 | 7 articles |
1991 | 2 articles |
1990 | 2 articles |
1989 | 2 articles |
1988 | 2 articles |
1987 | 1 article |
1986 | 1 article |
1985 | 2 articles |
1984 | 4 articles |
1982 | 2 articles |
1980 | 1 article |
1978 | 1 article |
1977 | 1 article |
1976 | 2 articles |
1975 | 1 article |
343 results found in 15 ms.
101.
This paper is concerned with estimating a mixing density g using a random sample from the mixture distribution f(x) = ∫ f(x | θ)g(θ) dθ, where f(· | θ) is a known discrete exponential family of density functions. Recently two techniques for estimating g have been proposed. The first uses Fourier analysis and the method of kernels, and the second uses orthogonal polynomials. It is known that the first technique is capable of yielding estimators that achieve (or almost achieve) the minimax convergence rate. We show that this is true for the technique based on orthogonal polynomials as well. The practical implementation of these estimators is also addressed. Computer experiments indicate that the kernel estimators give somewhat disappointing finite-sample results. However, the orthogonal polynomial estimators appear to do much better. To improve the finite-sample performance of the orthogonal polynomial estimators, a way of estimating the optimal truncation parameter is proposed. The resulting estimators retain the convergence rates of the previous estimators, and a Monte Carlo finite-sample study reveals that they perform well relative to the ones based on the optimal truncation parameter.
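The mixture setting above can be made concrete with a small simulation. The sketch below is *not* the paper's kernel or orthogonal-polynomial estimator; it is a simpler, commonly used alternative (a discretized nonparametric MLE fitted by EM) for the same Poisson-mixture model f(x) = ∫ Poisson(x | θ)g(θ) dθ. The true mixing density, grid, and iteration count are all arbitrary choices for illustration.

```python
import numpy as np
from math import lgamma

rng = np.random.default_rng(0)

# Simulate from the mixture f(x) = ∫ Poisson(x | θ) g(θ) dθ,
# with true mixing density g = Gamma(shape=4, scale=0.5), so E[θ] = 2.0
theta = rng.gamma(4.0, 0.5, size=2000)
x = rng.poisson(theta)

# Discretize θ on a grid and estimate the mixing weights by EM
grid = np.linspace(0.05, 8.0, 80)
w = np.full(grid.size, 1.0 / grid.size)      # uniform initial weights
lg = np.array([lgamma(k + 1) for k in x])    # log(x!) for each observation
logL = x[:, None] * np.log(grid[None, :]) - grid[None, :] - lg[:, None]
L = np.exp(logL)                              # n × m Poisson likelihood matrix

for _ in range(300):
    R = L * w                                 # E-step: responsibilities
    R /= R.sum(axis=1, keepdims=True)
    w = R.mean(axis=0)                        # M-step: re-estimate grid weights

est_mean = float((grid * w).sum())            # mean of the fitted mixing density
```

At convergence the fitted mixing distribution's mean closely matches the sample mean of the data, so `est_mean` lands near the true E[θ] = 2.0.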
102.
Anupama Prashar, Production Planning & Control, 2016, 27(16): 1389-1404
Design of Experiments (DoE) has evolved into a powerful industrial-statistics tool for managing and improving processes in diverse industries. The three broad approaches to DoE in practice, classical (traditional) methods, Taguchi methods and the Shainin System (SS), each have merits and demerits for extensive industrial experimentation. Nonetheless, the power of these DoE approaches is not fully harnessed in established process-improvement frameworks in industry. On the other hand, Six Sigma DMAIC is a well-accepted methodology for improving process capability through a focus on the customer's critical-to-quality (CTQ) requirements. The use of DoE in the traditional DMAIC framework is limited to quantifying the influence of factors on the CTQ. To address this gap, this paper proposes a conceptual Six Sigma/DoE hybrid framework that integrates SS, Taguchi methods and Six Sigma DMAIC for process improvement in complex industrial environments. A case on improving the DF generation process in shock-absorber assembly was developed to validate the effectiveness of the framework for intricate problem-solving.
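The "quantification of factor influence on the CTQ" step that DMAIC borrows from DoE can be illustrated generically. This sketch is not the paper's shock-absorber case: it uses a 2^3 full factorial in coded units with a hypothetical response in which factor A dominates, and recovers the main effects as the difference of mean responses at the high and low levels.

```python
import numpy as np

rng = np.random.default_rng(1)

# 2^3 full factorial in coded units (-1/+1); columns = factors A, B, C
X = np.array([[a, b, c] for a in (-1, 1) for b in (-1, 1) for c in (-1, 1)])

# Hypothetical CTQ response: strong A effect (+3 per unit), mild B, small noise
y = 10 + 3.0 * X[:, 0] + 1.0 * X[:, 1] + rng.normal(0, 0.1, size=8)

# Main effect of each factor = mean(y at level +1) - mean(y at level -1)
effects = 2 * (X.T @ y) / len(y)
```

With this response model, the estimated main effects come out near 6 for A, 2 for B, and 0 for C, which is the kind of ranking a DMAIC Analyze phase would use to prioritize improvement actions.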
103.
Hisham Hilow, Journal of Applied Statistics, 2014, 41(4): 802-816
Time-trend-resistant fractional factorial experiments have often been based on regular fractionated designs, for which several algorithms exist for sequencing their runs in the minimum number of factor-level changes (i.e. minimum cost) such that main effects and/or two-factor interactions are orthogonal to, and free from aliasing with, the time trend that may be present in the sequentially generated responses. On the other hand, only one algorithm exists for sequencing runs of the more economical non-regular fractional factorial experiments, namely that of Angelopoulos et al. [1]. This research studies sequential factorial experimentation under non-regular fractionated designs and constructs a catalog of eight minimum-cost linear-trend-free 12-run designs (of resolution III) in 4 to 11 two-level factors by applying the interactions-main effects assignment technique of Cheng and Jacroux [3] to the standard 12-run Plackett–Burman design, where factor-level changes between runs are minimal and main effects are orthogonal to the linear time trend. These eight 12-run designs are non-orthogonal but are more economical than the linear-trend-free designs of Angelopoulos et al. [1], since they can accommodate a larger number of two-level factors in a smaller number of experimental runs. These non-regular designs are also more economical than many regular trend-free designs. The following is provided for each proposed systematic design:
(1) The run order in minimum number of factor-level changes.
(2) The total number of factor-level changes between the 12 runs (i.e. the cost).
(3) The closed-form least-squares contrast estimates for all main effects as well as their closed-form variance–covariance structure.
104.
Avinash C. Singh, Communications in Statistics - Theory and Methods, 2013, 42(11): 3255-3273
105.
Some incomplete block designs for partial diallel crosses have been given in the literature. These designs are obtained by regarding the crosses as treatments, and consequently require several replications of each cross. The need to resort to a partial diallel cross itself implies that fewer crosses are desired. A method for constructing single-replicate incomplete block designs for circulant partial diallel crosses is provided in this paper. The designs are orthogonal, and thus retain full efficiency for estimating the contrasts of interest.
106.
Communications in Statistics - Theory and Methods, 2013, 42(5): 875-885
The order of experimental runs in a fractional factorial experiment is essential when the cost of level changes in factors is considered. The generalized foldover scheme given by [1] assigns an optimal order to the experimental runs in an experiment with specified defining contrasts. An experiment can be specified by a design requirement such as resolution or estimation of some interactions. To meet such a requirement, we can find several sets of defining contrasts. Applying the generalized foldover scheme to these sets of defining contrasts, we obtain designs with different numbers of level changes, and hence the design with the minimum number of level changes. The difficulty is to find all the sets of defining contrasts. An alternative approach is investigated by [2] for two-level fractional factorial experiments. In this paper, we investigate experiments with all factors at s levels.
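The cost criterion used in entries 103 and 106, the total number of factor-level changes across the run sequence, is easy to compute. As a toy illustration (not the generalized foldover scheme or any of the cited algorithms), the sketch below compares a 2^3 factorial in standard Yates order against the same runs in Gray-code order, where consecutive runs differ in exactly one factor level.

```python
import numpy as np

def level_changes(runs):
    """Total number of factor-level changes between consecutive runs."""
    runs = np.asarray(runs)
    return int(np.sum(runs[1:] != runs[:-1]))

# 2^3 factorial in standard (Yates) order: run i = binary digits of i
std = [[(i >> k) & 1 for k in range(3)] for i in range(8)]

# Same 8 runs in Gray-code order: one factor changes level per step
gray = [[((i ^ (i >> 1)) >> k) & 1 for k in range(3)] for i in range(8)]

cost_std, cost_gray = level_changes(std), level_changes(gray)  # 11 vs 7
```

Both orderings visit the same 8 treatment combinations, but the Gray-code sequence attains the minimum possible cost of 7 changes for 8 runs, against 11 for standard order; run-sequencing algorithms pursue this kind of saving while also protecting effect estimates from time trends.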
107.
An echo state network (ESN) can be viewed as a temporal expansion that naturally gives rise to regressors of varying relevance to a teacher output. We illustrate that often only a certain fraction of the generated echo-regressors effectively explains the teacher output, and we propose to determine the importance of the echo-regressors by jointly calculating the individual variance contributions and the Bayesian relevance using locally regularized orthogonal forward regression (LROFR). This information can be used advantageously in a variety of ways to analyse the structure of an ESN. We present a locally regularized linear readout built using LROFR. The readout may have a smaller dimensionality than the ESN model itself, and improves the robustness and accuracy of an ESN. Its main advantage is the ability to determine what type of additional readout is suitable for the task at hand. A comparison with PCA is also provided. Since the flexibility of the linear readout has limitations and may be insufficient for complex tasks, we also propose a radial basis function (RBF) readout built using LROFR. Its excellent generalization ability makes it a viable alternative to feed-forward neural networks or relevance vector machines. For cases where more temporal capacity is required, we propose the well-studied delay&sum readout.
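The LROFR machinery itself is involved, but the basic ESN pipeline the abstract builds on, a fixed random reservoir plus a trained linear readout, fits in a few lines. This sketch substitutes a plain ridge-regularized readout for the paper's LROFR readout, on a standard delay-recall task; reservoir size, spectral radius, and scalings are arbitrary choices.

```python
import numpy as np

rng = np.random.default_rng(2)
n_res, T, washout = 100, 500, 20

# Random reservoir, rescaled to spectral radius 0.9 (echo-state heuristic)
W = rng.normal(size=(n_res, n_res))
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))
w_in = rng.normal(0, 0.5, size=n_res)

# Drive the reservoir with a random input signal and collect the states
u = rng.uniform(-1, 1, size=T)
x = np.zeros(n_res)
states = np.empty((T, n_res))
for t in range(T):
    x = np.tanh(W @ x + w_in * u[t])
    states[t] = x

# Teacher signal: the input delayed by 2 steps (a classic ESN memory task)
y = np.roll(u, 2)
S, y = states[washout:], y[washout:]

# Ridge-regularized linear readout (stand-in for the paper's LROFR readout)
lam = 1e-6
w_out = np.linalg.solve(S.T @ S + lam * np.eye(n_res), S.T @ y)
mse = float(np.mean((S @ w_out - y) ** 2))
```

A 100-unit reservoir recalls a delay of 2 essentially perfectly, so the training MSE is tiny; the point of LROFR in the paper is to prune such a state matrix down to the echo-regressors that actually matter.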
108.
Robust parameter design is an effective methodology for reducing variance and improving the quality of a product and a process. Recent work has mainly concentrated on two-level robust parameter designs. We consider general robust parameter designs with factors having two or more, or mixed, levels, these levels being either qualitative or quantitative. We propose a methodology and develop a generalised minimum aberration optimality criterion for selecting optimal robust parameter designs. A catalogue of 18-run optimal designs is constructed and tabulated.
109.
Angela Montanari, Journal of Applied Statistics, 2010, 37(3): 473-487
Classical factor analysis relies on the assumption of normally distributed factors, which guarantees that the model can be estimated via the maximum likelihood method. Even when the assumption of Gaussian factors is not explicitly formulated and estimation is performed via the iterated principal factors method, interest is actually focussed mainly on the linear structure of the data, since only moments up to the second are involved. In many real situations, the factors cannot be adequately described by the first two moments alone. For example, the skewness characterizing most latent variables in social analysis can be properly measured by the third moment: the factors are not normally distributed, and covariance is no longer a sufficient statistic. In this work we propose a factor model characterized by skew-normally distributed factors. Skew-normal refers to a parametric class of probability distributions that extends the normal distribution by an additional shape parameter regulating the skewness. The model can be estimated by the generalized EM algorithm, in which an iterative Newton–Raphson procedure is needed in the M-step to estimate the factor shape parameter. The proposed skew-normal factor analysis is applied to the study of student satisfaction with university courses, in order to identify the factors representing different aspects of the latent overall satisfaction.
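The skew-normal class used above has a well-known stochastic representation that makes it easy to simulate and to see the third moment at work. This sketch (the shape parameter α = 5 is an arbitrary choice, not from the paper) draws standard skew-normal variates and checks their sample skewness.

```python
import numpy as np

rng = np.random.default_rng(3)

def rskewnorm(alpha, size, rng):
    """Sample SN(0, 1, alpha) via the additive representation:
    Z = δ|U0| + sqrt(1 - δ²)·U1, with δ = α / sqrt(1 + α²)."""
    delta = alpha / np.sqrt(1.0 + alpha**2)
    u0 = rng.normal(size=size)
    u1 = rng.normal(size=size)
    return delta * np.abs(u0) + np.sqrt(1.0 - delta**2) * u1

z = rskewnorm(5.0, 200_000, rng)
m = z - z.mean()
sample_skew = float((m**3).mean() / (m**2).mean() ** 1.5)  # ≈ 0.85 for α = 5
```

At α = 0 the distribution reduces to the standard normal (skewness 0); for α = 5 the theoretical skewness is about 0.85, which the sample estimate reproduces, illustrating why covariance alone cannot summarize such factors.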
110.
Hydroxyapatite (HAP) powder was synthesized by a precipitation method using Ca(NO3)2 and (NH4)2HPO4 as raw materials. An orthogonal experimental design was used to examine the effects of reactant concentration, reaction temperature and the dosage of the dispersant polyethylene glycol (PEG) on the particle size of the hydroxyapatite powder, and on this basis the effect of heat-treatment temperature on particle size was also investigated. Particle size was measured with a laser particle-size analyzer, and the powder was characterized by XRD, IR and other techniques. The results show that the optimal synthesis conditions are a temperature of 60 °C, a concentration of 0.8 mol/L and a PEG dosage of 3%. As the heat-treatment temperature increases, the hydroxyapatite particles grow and agglomerate. XRD and IR analyses indicate that high-purity ultrafine hydroxyapatite powder can be prepared by this route.
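Orthogonal experimental designs of the kind used in this abstract are typically analyzed by range analysis over a standard orthogonal array. The sketch below uses the standard L9(3^4) array with made-up particle-size responses (the paper's actual data are not reproduced here); the range R of the level means ranks how strongly each factor influences the response.

```python
import numpy as np

# Standard L9(3^4) orthogonal array, levels coded 0/1/2
L9 = np.array([
    [0, 0, 0, 0], [0, 1, 1, 1], [0, 2, 2, 2],
    [1, 0, 1, 2], [1, 1, 2, 0], [1, 2, 0, 1],
    [2, 0, 2, 1], [2, 1, 0, 2], [2, 2, 1, 0],
])

# Hypothetical particle-size responses (nm) for the 9 runs (NOT the paper's data)
y = np.array([82.0, 75.0, 70.0, 68.0, 66.0, 72.0, 64.0, 69.0, 71.0])

# Range analysis: mean response at each level of each assigned factor;
# a larger range R means the factor matters more for particle size
factors = ["temperature", "concentration", "PEG dosage"]
ranges = {}
for j, name in enumerate(factors):
    level_means = [float(y[L9[:, j] == lev].mean()) for lev in range(3)]
    ranges[name] = max(level_means) - min(level_means)
```

Because the array is orthogonal (each level of each factor appears in exactly three runs, balanced against the other factors), these level means are fair comparisons even though only 9 of the 27 possible runs were performed.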