Sort by: 13 results found, search time 15 ms
1.
Egmar Rödel 《Statistics》2013,47(4):573-585
Normed bivariate density functions were introduced by Hoeffding (1940/41). In the present paper, estimators for normed bivariate density functions are constructed, based on ranks and on a Fourier series expansion in Legendre polynomials. The estimation of normed bivariate density functions under positive dependence is also described.
2.
3.
We investigate the effect of measurement error on principal component analysis in the high-dimensional setting. The effects of random, additive errors are characterized by the expectation and variance of the changes in the eigenvalues and eigenvectors. The results show that the impact of uncorrelated measurement error on the principal component scores is mainly in terms of increased variability and not bias. In practice, the error-induced increase in variability is small compared with the original variability for the components corresponding to the largest eigenvalues. This suggests that the impact will be negligible when these component scores are used in classification and regression or for visualizing data. However, the measurement error will contribute to a large variability in component loadings, relative to the loading values, such that interpretation based on the loadings can be difficult. The results are illustrated by simulating additive Gaussian measurement error in microarray expression data from cancer tumours and control tissues.
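The abstract's central claim, that uncorrelated additive error perturbs component scores far less than one might expect, can be checked with a small simulation. This is an illustrative sketch with synthetic low-rank data, not the paper's microarray data; all sizes and noise scales here are assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic low-rank signal: 200 samples, 50 variables, 2 strong components.
n, p = 200, 50
scores_true = rng.normal(size=(n, 2)) * np.array([10.0, 5.0])
loadings_true = np.linalg.qr(rng.normal(size=(p, 2)))[0]
X = scores_true @ loadings_true.T + rng.normal(scale=0.5, size=(n, p))

# Additive, uncorrelated Gaussian measurement error.
X_noisy = X + rng.normal(scale=0.5, size=(n, p))

def top_pc_scores(M):
    """Scores on the first principal component, via SVD of centered data."""
    Mc = M - M.mean(axis=0)
    _, _, Vt = np.linalg.svd(Mc, full_matrices=False)
    return Mc @ Vt[0]

# Correlation between clean and noisy scores (abs() guards against sign flips).
corr = np.corrcoef(top_pc_scores(X), top_pc_scores(X_noisy))[0, 1]
print(abs(corr) > 0.9)
```

Because the first component's variance dominates the error variance, the clean and noisy score vectors remain almost perfectly correlated, matching the abstract's "increased variability but not bias" finding for the leading components.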
4.
Laleh Behjat Dorothy Kucar Anthony Vannelli 《Journal of Combinatorial Optimization》2002,6(3):271-286
Modern integrated circuit design involves laying out circuits consisting of millions of switching elements, or transistors. Because of this sheer complexity, optimizing the connectivity between transistors is a very difficult problem. How a circuit is interconnected is the single most important factor in performance criteria such as signal delay, power dissipation, circuit size and cost. These factors dictate that interconnections (wires) be made as short as possible. The wire-minimization problem is formulated as a sequence of discrete optimization subproblems. These subproblems are known to be NP-hard, hence they can only be solved approximately, using meta-heuristics or linear programming techniques. However, these methods are computationally expensive, and the quality of the solution depends to a great extent on an appropriate choice of starting configuration. A matrix reordering technique for solving very hard discrete optimization problems in Very Large Scale Integrated (VLSI) design which overcomes some of these shortcomings is proposed. In particular, the computational cost is reasonable: on the order of n^1.4 running time.
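The abstract does not spell out its reordering algorithm, but the general idea of matrix reordering for connectivity matrices can be illustrated with reverse Cuthill-McKee, a standard baseline; the toy "netlist" below (a scrambled chain of cells with local links) is an assumption for demonstration only.

```python
import numpy as np
from scipy.sparse import csr_matrix
from scipy.sparse.csgraph import reverse_cuthill_mckee

rng = np.random.default_rng(1)

# Toy stand-in for a netlist connectivity matrix: a chain of 60 "cells"
# with short local links, scrambled by a random relabeling.
n = 60
A = np.zeros((n, n), dtype=int)
for i in range(n - 1):
    A[i, i + 1] = A[i + 1, i] = 1      # chain edges
for i in range(n - 2):
    A[i, i + 2] = A[i + 2, i] = 1      # short local shortcuts
sigma = rng.permutation(n)             # scramble the labeling
A = A[np.ix_(sigma, sigma)]

def bandwidth(M):
    """Maximum distance of a nonzero entry from the diagonal."""
    rows, cols = np.nonzero(M)
    return int(np.max(np.abs(rows - cols)))

# Reverse Cuthill-McKee finds a permutation that pulls nonzeros
# back toward the diagonal, i.e. makes connected cells adjacent.
perm = reverse_cuthill_mckee(csr_matrix(A), symmetric_mode=True)
A_reordered = A[np.ix_(perm, perm)]
print(bandwidth(A_reordered) < bandwidth(A))
```

In a placement setting, a low-bandwidth ordering corresponds to connected cells being placed near each other, which is exactly the short-wire objective the abstract describes.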
5.
Double arrays of n rows and p columns can be regarded as n drawings from some p-dimensional population. A sequence of such arrays is considered. Principal component analysis for each array forms sequences of sample principal components and eigenvalues. The continuity of these sequences, in the sense of convergence with probability one and convergence in probability, is investigated; this appears to be informative for pattern study and prediction of principal components. Various features of the paths of sequences of population principal components are highlighted through an example.
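The setup can be made concrete with a short sketch: a sequence of n x p double arrays drawn from one fixed population, with PCA applied to each. The population (a diagonal covariance with leading eigenvalue 3.0) and the sample sizes are illustrative assumptions, not the paper's example.

```python
import numpy as np

rng = np.random.default_rng(5)

# One fixed 4-dimensional population; leading eigenvalue is 3.0.
p = 4
Sigma = np.diag([3.0, 1.0, 1.0, 1.0])

# A sequence of double arrays of growing row count, PCA on each:
# the sample eigenvalues form a sequence settling toward the
# population values.
top_eigs = []
for n in [50, 500, 5000]:
    X = rng.normal(size=(n, p)) @ np.sqrt(Sigma)
    S = np.cov(X, rowvar=False)
    top_eigs.append(np.linalg.eigvalsh(S)[-1])

print(abs(top_eigs[-1] - 3.0) < 0.5)
```

The convergence of such sequences (here visible numerically for the top eigenvalue) is what the paper studies in the almost-sure and in-probability senses.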
6.
Xue Ding 《Communications in Statistics - Theory and Methods》2013,42(18):3825-3840
In this paper, for the general non-Gaussian spiked population model, in which a few fixed eigenvalues of the population covariance matrix are separated from the others, we investigate the convergence properties of the eigenvectors of sample covariance matrices corresponding to the spiked population eigenvalues, and the angle between the population eigenvectors and the sample eigenvectors, as both the sample size and the population size grow large.
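A minimal simulation of the spiked-model setting shows the behaviour the abstract refers to: when the spike is well separated, the sample eigenvector aligns closely with the population one. The dimensions, spike size, and the choice of standardized uniform (non-Gaussian) entries below are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(2)

# Spiked population: one eigenvalue (5.0) separated from the rest (all 1.0).
p, n, spike = 100, 2000, 5.0
v = np.zeros(p)
v[0] = 1.0                               # population spike eigenvector
Sigma_sqrt = np.eye(p)
Sigma_sqrt[0, 0] = np.sqrt(spike)

# Non-Gaussian data: standardized uniform entries (mean 0, variance 1).
Z = rng.uniform(-np.sqrt(3), np.sqrt(3), size=(n, p))
X = Z @ Sigma_sqrt

S = X.T @ X / n                          # sample covariance matrix
eigvals, eigvecs = np.linalg.eigh(S)
u = eigvecs[:, -1]                       # sample eigenvector for top eigenvalue

# Cosine of the angle between population and sample spike eigenvectors.
cos_angle = abs(v @ u)
print(cos_angle > 0.9)
```

With spike 5.0 and p/n = 0.05, the spike is far above the detection threshold, so the angle between the two eigenvectors is small; shrinking n or the spike pushes the cosine down, which is the regime the paper's convergence analysis addresses.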
7.
We present two new statistics for estimating the number of factors underlying a multivariate system. One of the two new methods, the original NUMFACT, has been used in high-profile environmental studies. The two new methods are first explained from a geometrical viewpoint. We then present an algebraic development and asymptotic cutoff points. Next, we present a simulation study showing that for skewed data the new methods are typically superior to traditional methods, and that for normally distributed data they are competitive with the best of the traditional methods. Finally, we show how the methods compare on two environmental data sets.
8.
This study compares the SPSS ordinary least squares (OLS) regression and ridge regression procedures in dealing with multicollinear data. The OLS method is one of the most frequently applied statistical procedures. It is well documented that the OLS method is extremely unreliable in parameter estimation when the independent variables are correlated (the multicollinearity problem). The ridge regression procedure deals with the multicollinearity problem by introducing a small bias into the parameter estimation. Applying ridge regression involves the selection of a bias parameter, and it is not clear whether it works better in applications. This study uses a Monte Carlo method to compare the results of the OLS procedure with the ridge regression procedure in SPSS.
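The kind of Monte Carlo comparison the abstract describes can be sketched without SPSS: generate near-collinear predictors repeatedly and compare the coefficient mean squared error of OLS against ridge. The sample size, correlation strength, and ridge bias parameter k below are illustrative assumptions, not the study's design.

```python
import numpy as np

rng = np.random.default_rng(3)

# Two highly correlated predictors; true coefficients are (1, 1).
beta = np.array([1.0, 1.0])
n, reps, k = 30, 500, 1.0   # k is the ridge bias parameter (assumed value)

def fit(X, y, k=0.0):
    """OLS for k = 0; ridge estimate (X'X + kI)^{-1} X'y for k > 0."""
    p = X.shape[1]
    return np.linalg.solve(X.T @ X + k * np.eye(p), X.T @ y)

mse_ols = mse_ridge = 0.0
for _ in range(reps):
    x1 = rng.normal(size=n)
    x2 = x1 + rng.normal(scale=0.05, size=n)   # near-collinear with x1
    X = np.column_stack([x1, x2])
    y = X @ beta + rng.normal(size=n)
    mse_ols += np.sum((fit(X, y) - beta) ** 2) / reps
    mse_ridge += np.sum((fit(X, y, k) - beta) ** 2) / reps

print(mse_ridge < mse_ols)
```

Under strong collinearity the OLS coefficient variance explodes along the near-degenerate direction, so the small bias introduced by ridge is more than paid back in variance, which is the trade-off the study evaluates.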
9.
Bert M. Steece 《Communications in Statistics - Theory and Methods》2013,42(12):3599-3605
A simple analytical expression is derived for leverage in ridge regression. Leverage is shown to be a monotonically decreasing function of the value of the ridge parameter. This reduction in leverage is greatest for those observations lying substantially in the direction of the minor principal axes. Thus, ridge estimation copes with outliers in regressor space by downweighting their influence. A brief illustration is provided.
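The monotone-decrease result is easy to verify numerically: taking leverage as the diagonal of the ridge hat matrix H(k) = X(X'X + kI)^{-1}X', every observation's leverage shrinks as k grows. The design matrix and the k values below are arbitrary illustrations.

```python
import numpy as np

rng = np.random.default_rng(4)

n, p = 20, 3
X = rng.normal(size=(n, p))

def ridge_leverage(X, k):
    """Diagonal of the ridge hat matrix H(k) = X (X'X + kI)^{-1} X'."""
    p = X.shape[1]
    return np.diag(X @ np.linalg.solve(X.T @ X + k * np.eye(p), X.T))

h0 = ridge_leverage(X, 0.0)   # ordinary least-squares leverage
h1 = ridge_leverage(X, 1.0)
h5 = ridge_leverage(X, 5.0)

# Each observation's leverage decreases monotonically in k.
print(bool(np.all(h1 <= h0) and np.all(h5 <= h1)))
```

This follows because (X'X + kI)^{-1} decreases in the Loewner order as k increases, so each diagonal entry x_i'(X'X + kI)^{-1}x_i can only shrink, consistent with the abstract's downweighting interpretation.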
10.