Similar Documents
20 similar documents found (search time: 281 ms)
1.
Consider the linear regression model y = Xβ + ε in the usual notation. It is argued that the class of ordinary ridge estimators obtained by shrinking the least squares estimator by the matrix (X'X + kI)^(-1)X'X is sensitive to outliers in the y-variable. To overcome this problem, we propose a new class of ridge-type M-estimators, obtained by shrinking an M-estimator (instead of the least squares estimator) by the same matrix. Since the optimal value of the ridge parameter k is unknown, we suggest a procedure for choosing it adaptively. In a reasonably large-scale simulation study with a particular M-estimator, we found that if the conditions are such that the M-estimator is more efficient than the least squares estimator, then the corresponding ridge-type M-estimator proposed here is better, in terms of a mean squared error criterion, than the ordinary ridge estimator with k chosen suitably. An example illustrates that the estimators proposed here are less sensitive to outliers in the y-variable than ordinary ridge estimators.
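A minimal numerical sketch of the construction described above, assuming a Huber M-estimator fitted by iteratively reweighted least squares (the abstract does not fix a particular M-estimator, and its adaptive choice of k is omitted here):

```python
import numpy as np

def huber_m_estimate(X, y, c=1.345, n_iter=50):
    """Huber M-estimate of the coefficients via iteratively reweighted
    least squares; c = 1.345 gives ~95% efficiency under normal errors."""
    beta = np.linalg.lstsq(X, y, rcond=None)[0]           # least squares start
    for _ in range(n_iter):
        r = y - X @ beta
        s = np.median(np.abs(r - np.median(r))) / 0.6745  # robust scale (MAD)
        u = np.abs(r) / max(s, 1e-12)
        w = np.minimum(1.0, c / np.maximum(u, 1e-12))     # Huber weights
        Xw = X * w[:, None]
        beta = np.linalg.solve(X.T @ Xw, Xw.T @ y)        # weighted LS step
    return beta

def ridge_type_m_estimate(X, y, k):
    """Shrink the M-estimate by (X'X + kI)^(-1) X'X, as the abstract describes."""
    p = X.shape[1]
    S = np.linalg.solve(X.T @ X + k * np.eye(p), X.T @ X)  # shrinkage matrix
    return S @ huber_m_estimate(X, y)
```

With k = 0 the shrinkage matrix is the identity and the plain M-estimate is recovered; increasing k shrinks every component toward zero, exactly as the ordinary ridge estimator does to the least squares estimate.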

2.
Consider the linear regression model y = Xβ + ε in the usual notation, with X'X in correlation form. Galpin (1980) claimed that the ridge estimators of Hoerl, Kennard and Baldwin (1975) and Lawless and Wang (1976) give guaranteed lower mean squared error than the least squares estimator when X'X has at least two very small eigenvalues. We show that the arguments of Galpin (1980) leading to the above claim are incorrect, and hence the claim itself is unsubstantiated. A Monte Carlo study shows that Galpin's claim is not correct in general.
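For concreteness, a sketch of the two data-driven ridge rules at issue, assuming their usual textbook forms computed from the OLS fit: k = p·σ̂²/(β̂'β̂) for Hoerl, Kennard and Baldwin, and k = p·σ̂²/(β̂'X'Xβ̂) for Lawless and Wang:

```python
import numpy as np

def ridge(X, y, k):
    """Ordinary ridge estimator (X'X + kI)^(-1) X'y."""
    p = X.shape[1]
    return np.linalg.solve(X.T @ X + k * np.eye(p), X.T @ y)

def hkb_and_lw_k(X, y):
    """Hoerl-Kennard-Baldwin and Lawless-Wang ridge parameters, both
    built from the OLS estimate b and residual variance estimate."""
    n, p = X.shape
    b = np.linalg.lstsq(X, y, rcond=None)[0]
    sigma2 = np.sum((y - X @ b) ** 2) / (n - p)
    k_hkb = p * sigma2 / (b @ b)
    k_lw = p * sigma2 / (b @ (X.T @ X @ b))
    return k_hkb, k_lw
```

Because both rules are random (they depend on the same data through b and σ̂²), any claim of *guaranteed* MSE improvement over least squares has to survive that randomness, which is the point the abstract's Monte Carlo study probes.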

3.
In multiple linear regression analysis, the ridge regression estimator and the Liu estimator are often used to address multicollinearity. Besides multicollinearity, outliers are also a problem in multiple linear regression analysis. We propose new biased estimators based on the least trimmed squares (LTS) ridge estimator and the LTS Liu estimator for the case where both outliers and multicollinearity are present. For this purpose, a simulation study is conducted to compare the robust ridge estimator and the robust Liu estimator in terms of their effectiveness, as measured by mean square error. In our simulations, the behavior of the new biased estimators is examined for three types of outliers: X-space outliers, Y-space outliers, and X- and Y-space outliers. The results for a number of different illustrative cases are presented. This paper also provides results for the robust ridge regression and robust Liu estimators based on a real-life data set that combines the problems of multicollinearity and outliers.
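A rough sketch of the LTS-ridge idea, assuming a FAST-LTS-style approximation (random elemental starts plus concentration steps) rather than the exact LTS fit used in the paper: locate a "clean" h-subset, then apply a ridge fit with parameter k on that subset.

```python
import numpy as np

def lts_ridge(X, y, k, h=None, n_starts=20, n_csteps=10, seed=0):
    """Approximate LTS-ridge: find an h-subset with small trimmed sum of
    squares by concentration steps, then ridge-fit on that subset.
    Illustrative only; not the paper's exact estimator."""
    rng = np.random.default_rng(seed)
    n, p = X.shape
    if h is None:
        h = (n + p + 1) // 2                               # default LTS coverage
    best_obj, best_idx = np.inf, None
    for _ in range(n_starts):
        idx = rng.choice(n, size=p + 1, replace=False)     # elemental start
        for _ in range(n_csteps):                          # concentration steps
            b = np.linalg.lstsq(X[idx], y[idx], rcond=None)[0]
            idx = np.argsort((y - X @ b) ** 2)[:h]         # h best-fitting points
        b = np.linalg.lstsq(X[idx], y[idx], rcond=None)[0]
        obj = np.sort((y - X @ b) ** 2)[:h].sum()          # trimmed sum of squares
        if obj < best_obj:
            best_obj, best_idx = obj, idx
    Xh, yh = X[best_idx], y[best_idx]
    return np.linalg.solve(Xh.T @ Xh + k * np.eye(p), Xh.T @ yh)
```

Replacing the final ridge fit with a Liu fit on the same clean subset gives the corresponding LTS Liu variant.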

4.
The problems of multicollinearity and outliers in a data set produce undesirable effects on the ordinary least squares estimator. Therefore, a robust two-parameter ridge estimator based on the M-estimator (ME) is introduced to deal with multicollinearity and outliers in the y-direction. The proposed estimator outperforms the ME, the two-parameter ridge estimator and the robust ridge M-estimator according to the mean square error criterion. Moreover, a numerical example and a Monte Carlo simulation experiment are presented.

5.
The ordinary least squares estimator for linear regression analysis with multicollinearity and outliers leads to unfavorable results. In this article, we propose a new robust modified ridge M-estimator (MRME) based on the M-estimator (ME) to deal with the combined problem resulting from multicollinearity and outliers in the y-direction. The MRME outperforms the modified ridge estimator, the robust ridge estimator and the ME according to the mean squared error criterion. Furthermore, a numerical example and a Monte Carlo simulation experiment are given to illustrate some of the theoretical results.

6.
It is common in a linear regression model for the error terms to display some form of heteroscedasticity while, at the same time, the regressors are linearly correlated. Both of these problems have a serious impact on the ordinary least squares (OLS) estimates. In the presence of heteroscedasticity, the OLS estimator becomes inefficient, and a similar adverse impact is found on the ridge regression estimator that is used as an alternative to cope with multicollinearity. In the available literature, the adaptive estimator has been established to be more efficient than the OLS estimator when there is heteroscedasticity of unknown form. The present article proposes a similar adaptation for the ridge regression setting in an attempt to obtain a more efficient estimator. Our numerical results, based on Monte Carlo simulations, show very attractive performance of the proposed estimator in terms of efficiency. Three different existing methods are used for the selection of the biasing parameter, and three different distributions of the error term, namely the normal, Student's t and F distributions, are studied to evaluate the proposed estimator.
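A two-step sketch of what an "adaptive" ridge fit under heteroscedasticity of unknown form can look like. The variance model used here (a linear fit to the log squared OLS residuals) is a simplifying assumption of this sketch; the adaptive estimator in the literature uses a nonparametric estimate of the unknown variance function.

```python
import numpy as np

def adaptive_ridge(X, y, k):
    """Estimate the skedastic function from squared OLS residuals, then
    run a weighted (feasible GLS-style) ridge fit with parameter k."""
    n, p = X.shape
    b_ols = np.linalg.lstsq(X, y, rcond=None)[0]
    r2 = (y - X @ b_ols) ** 2
    Z = np.column_stack([np.ones(n), X])
    # crude variance-function estimate: model E[log r^2 | x] linearly
    g = np.linalg.lstsq(Z, np.log(r2 + 1e-8), rcond=None)[0]
    w = np.exp(-(Z @ g))                  # weights ~ 1 / estimated variance
    Xw = X * w[:, None]
    return np.linalg.solve(X.T @ Xw + k * np.eye(p), Xw.T @ y)
```

Down-weighting high-variance observations is what restores efficiency relative to the unweighted ridge fit when the error variance genuinely varies with the regressors.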

7.
In the presence of multicollinearity, the rk class estimator is proposed as an alternative to the ordinary least squares (OLS) estimator; it is a general estimator that includes the ordinary ridge regression (ORR), principal components regression (PCR) and OLS estimators as special cases. Comparison of competing estimators of a parameter in the sense of the mean square error (MSE) criterion is of central interest. An alternative to the MSE criterion is Pitman's (1937) closeness (PC) criterion. In this paper, we compare the rk class estimator to the OLS estimator in terms of the PC criterion, thereby obtaining both the comparison of the ORR estimator to the OLS estimator under the PC criterion, carried out by Mason et al. (1990), and the comparison of the PCR estimator to the OLS estimator under the PC criterion, carried out by Lin and Wei (2002).

8.
Multicollinearity and model misspecification are frequently encountered problems in practice that produce undesirable effects on the classical ordinary least squares (OLS) regression estimator. The ridge regression estimator is an important tool for reducing the effects of multicollinearity, but it is still sensitive to misspecification of the error distribution. Although rank-based statistical inference has desirable robustness properties compared to OLS procedures, it can be unstable in the presence of multicollinearity. This paper introduces a rank regression estimator for the regression parameters and develops tests for general linear hypotheses in a multiple linear regression model. The proposed estimator and tests have desirable robustness features against multicollinearity and misspecification of the error distribution. Asymptotic behaviours of the proposed estimator and the test statistics are investigated. Real and simulated data sets are used to demonstrate the feasibility and performance of the estimator and the tests.

9.
Newhouse and Oman (1971) identified the orientations, with respect to the eigenvectors of X'X, of the true coefficient vector of the linear regression model for which the ordinary ridge regression estimator performs best and worst when mean squared error is the measure of performance. In this paper, the corresponding result is derived for generalized ridge regression for two risk functions: mean squared error and mean squared error of prediction.
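The orientation effect is easy to verify numerically from the standard bias-variance decomposition of the ridge MSE in canonical coordinates, a sketch of which follows (here lam holds the eigenvalues of X'X and alpha = V'β the coefficient vector rotated into the eigenbasis):

```python
import numpy as np

def ridge_mse(lam, alpha, k, sigma2=1.0):
    """Scalar MSE of the ordinary ridge estimator, E||b_k - beta||^2,
    in the eigenbasis of X'X: variance term sigma^2*lam/(lam+k)^2 plus
    squared bias k^2*alpha^2/(lam+k)^2, summed over components."""
    return np.sum((sigma2 * lam + k**2 * alpha**2) / (lam + k) ** 2)

lam = np.array([10.0, 1.0, 0.01])                     # ill-conditioned X'X
k = 0.5
best = ridge_mse(lam, np.array([1.0, 0.0, 0.0]), k)   # beta along largest eigenvalue
worst = ridge_mse(lam, np.array([0.0, 0.0, 1.0]), k)  # beta along smallest eigenvalue
# best ≈ 0.576 < worst ≈ 1.535: ridge is most favourable when beta points
# along the eigenvector of the largest eigenvalue of X'X and least
# favourable along the smallest, the Newhouse-Oman orientation result.
```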

10.
The presence of collinearity among the explanatory variables results in larger standard errors of the estimated parameters. When multicollinearity is present among the explanatory variables, the ordinary least squares (OLS) estimators tend to be unstable due to the larger variance of the estimators of the regression coefficients. As alternatives to the OLS estimator, a few ridge estimators are available in the literature. This article presents some of the popular ridge estimators and attempts to provide (i) a generalized class of ridge estimators and (ii) a modified ridge estimator. The performance of the proposed estimators is investigated with the help of the Monte Carlo simulation technique. Simulation results indicate that the suggested estimators perform better than the OLS estimator and the other estimators considered in this article.
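A sketch of the general class such articles build on: generalized ridge in canonical coordinates, where each component gets its own shrinkage constant k_i (this is the generic family, not any one of the article's proposed estimators):

```python
import numpy as np

def generalized_ridge(X, y, k_vec):
    """Generalized ridge estimator: rotate into the eigenbasis of X'X,
    shrink each canonical coefficient with its own k_i, rotate back."""
    lam, V = np.linalg.eigh(X.T @ X)           # X'X = V diag(lam) V'
    Z = X @ V                                  # canonical regressors, Z'Z = diag(lam)
    alpha_hat = (Z.T @ y) / (lam + k_vec)      # component-wise ridge shrinkage
    return V @ alpha_hat
```

Setting every k_i equal to a common k recovers the ordinary ridge estimator (X'X + kI)^(-1)X'y, and particular data-driven choices of the k_i give the various estimators compared in the article.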

11.
This paper deals with the problem of multicollinearity in a multiple linear regression model with linear equality restrictions. The restricted two-parameter estimator, which was proposed for the multicollinearity case, satisfies the restrictions. The performance of the restricted two-parameter estimator relative to the restricted least squares (RLS) estimator and the ordinary least squares (OLS) estimator is examined under the mean square error (MSE) matrix criterion when the restrictions are correct and when they are not. The necessary and sufficient conditions for the restricted ridge regression, restricted Liu and restricted shrunken estimators, which are special cases of the restricted two-parameter estimator, to have a smaller MSE matrix than the RLS and OLS estimators are derived for both cases. Theoretical results are illustrated with numerical examples based on the Webster, Gunst and Mason data and the Gorman and Toman data. We conclude with a Monte Carlo simulation, which shows that when the variance of the error term and the correlation between the explanatory variables are large, the restricted two-parameter estimator performs better than the RLS estimator and the OLS estimator under the configurations examined.
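A minimal sketch of the restricted-ridge special case: a ridge fit forced to satisfy the linear equality restrictions Rβ = r exactly, via the usual Lagrangian correction. With k = 0 it reduces to restricted least squares; specific restricted-ridge proposals in the literature (e.g. Groß) use slightly different forms, so treat this as the generic construction.

```python
import numpy as np

def restricted_ridge(X, y, R, r, k):
    """Ridge estimator constrained to satisfy R @ beta = r exactly."""
    p = X.shape[1]
    A = X.T @ X + k * np.eye(p)
    b = np.linalg.solve(A, X.T @ y)                        # unrestricted ridge
    Ainv_Rt = np.linalg.solve(A, R.T)                      # A^{-1} R'
    # Lagrangian correction pulling b onto the restriction set
    correction = Ainv_Rt @ np.linalg.solve(R @ Ainv_Rt, r - R @ b)
    return b + correction
```

By construction R(b + correction) = r, whether or not the restriction is true in the population; the MSE-matrix comparisons in the abstract concern exactly that trade-off.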

12.
It is well established that non-sample prior information about the regression parameter vector, usually in the form of constraints, improves the risk performance of the ordinary least squares estimator (OLSE) when the latter is shrunken. In practice, however, it may happen that multicollinearity and outliers exist simultaneously in the data. In such a situation, the use of a robust ridge estimator is suggested to overcome the undesirable effects of the OLSE. In this article, prior information in the form of constraints is employed to improve the performance of this estimator in the multiple regression model. In this regard, shrinkage ridge robust estimators are defined. Advantages of the proposed estimators over the usual robust ridge estimator are investigated using Monte Carlo simulation as well as a real data example.

13.
This article considers both the Partial Least Squares (PLS) and Ridge Regression (RR) methods to combat the multicollinearity problem. A simulation study has been conducted to compare their performance with that of Ordinary Least Squares (OLS). With varying degrees of multicollinearity, it is found that both the PLS and RR estimators produce significant reductions in the Mean Square Error (MSE) and Prediction Mean Square Error (PMSE) over OLS. However, the simulation study makes it evident that RR performs better when the error variance is large, whereas the PLS estimator achieves its best results when the model includes more variables. Moreover, an advantage of ridge regression over PLS is that it can provide a 95% confidence interval for the regression coefficients while PLS cannot.

14.
In this article, we consider the problem of variable selection in linear regression when multicollinearity is present in the data. It is well known that in the presence of multicollinearity, the performance of the least squares (LS) estimator of the regression parameters is not satisfactory. Consequently, subset selection methods, such as Mallows' Cp, which are based on LS estimates, lead to selection of inadequate subsets. To overcome the problem of multicollinearity in subset selection, a new subset selection algorithm based on the ridge estimator is proposed. It is shown that the new algorithm is a better alternative to Mallows' Cp when the data exhibit multicollinearity.
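A toy version of the idea, assuming a Cp-type score built from ridge fits rather than the paper's exact algorithm: enumerate subsets (feasible only for small p) and score each with RSS/σ̂² − n + 2·df, where df = tr(H) is the effective degrees of freedom of the ridge fit on that subset.

```python
import numpy as np
from itertools import combinations

def ridge_cp_selection(X, y, k):
    """Exhaustive subset selection scored by a Cp-type statistic computed
    from ridge fits instead of least squares fits.  Illustrative sketch."""
    n, p = X.shape
    b_full = np.linalg.solve(X.T @ X + k * np.eye(p), X.T @ y)
    sigma2 = np.sum((y - X @ b_full) ** 2) / (n - p)      # full-model scale
    best, best_S = np.inf, None
    for m in range(1, p + 1):
        for S in combinations(range(p), m):
            Xs = X[:, S]
            A = Xs.T @ Xs + k * np.eye(m)
            H = Xs @ np.linalg.solve(A, Xs.T)             # ridge hat matrix
            rss = np.sum((y - H @ y) ** 2)
            cp = rss / sigma2 - n + 2.0 * np.trace(H)     # Cp with df = tr(H)
            if cp < best:
                best, best_S = cp, S
    return best_S
```

The ridge fits keep the per-subset coefficient estimates stable under collinearity, which is what makes the resulting subset scores more trustworthy than their LS-based counterparts.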

15.
The shrinkage estimator is a commonly applied solution to the general problem caused by multicollinearity. Recently, ridge regression (RR) estimators for estimating the ridge parameter k in negative binomial (NB) regression have been proposed. Jackknifed estimators are obtained to remedy the multicollinearity and reduce the bias. A simulation study is provided to evaluate the performance of the estimators, with both mean squared error (MSE) and percentage relative error (PRE) considered as performance criteria. The simulation results indicate that some of the proposed Jackknifed estimators should be preferred to the ML method and the ridge estimators, as they reduce MSE and bias.

16.
In 2005, Lipovetsky and Conklin proposed an estimator, the two-parameter ridge estimator (TRE), as an alternative to the ordinary least squares estimator (OLSE) and the ordinary ridge estimator (RE) in the presence of multicollinearity, and in 2006 Lipovetsky improved the two-parameter model. In this paper, we introduce two new estimators, one of which is the modified two-parameter ridge estimator (MTRE), defined by following Swindel's paper of 1976. The other is the restricted two-parameter ridge estimator (RTRE), which is derived by imposing additional linear restrictions on the parameter vector. This estimator is a generalization of the restricted least squares estimator (RLSE) and includes the restricted ridge estimator (RRE) proposed by Groß in 2003. A numerical example is provided and a simulation study is conducted to compare the RTRE with the OLSE, RLSE, RE, RRE and TRE.

17.
Tsallis entropy is a generalized form of entropy that tends to Shannon entropy as q → 1. Using Tsallis entropy, an alternative estimation methodology (generalized maximum Tsallis entropy) is introduced and used to estimate the parameters in a linear regression model when the basic data are ill-conditioned. We describe the generalized maximum Tsallis entropy estimator and, for q = 2, call it the GMET2 estimator. We apply the GMET2 estimator to the linear regression model Y = Xβ + e, where the design matrix X is subject to severe multicollinearity. We compare the GMET2, generalized maximum entropy (GME), ordinary least squares (OLS), and inequality restricted least squares (IRLS) estimators on the well-known Portland cement dataset.

18.
It is known that multicollinearity inflates the variance of the maximum likelihood estimator in logistic regression. Especially if the primary interest is in the coefficients, the impact of collinearity can be very serious. To deal with collinearity, a ridge estimator was proposed by Schaefer et al. The primary aim of this article is to introduce a Liu-type estimator that has a smaller total mean squared error (MSE) than Schaefer's ridge estimator under certain conditions. Simulation studies were conducted to evaluate the performance of this estimator. Furthermore, the proposed estimator was applied to a real-life dataset.
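For orientation, a sketch of ridge-penalised logistic regression fitted by Newton/IRLS. This is the penalised-likelihood formulation, in the spirit of Schaefer-type ridge logistic estimators; the paper's Liu-type estimator instead applies a different shrinkage transformation to the MLE.

```python
import numpy as np

def logistic_ridge(X, y, k, n_iter=50):
    """Logistic regression with an L2 (ridge) penalty k, fitted by
    Newton's method on the penalised log-likelihood."""
    n, p = X.shape
    beta = np.zeros(p)
    for _ in range(n_iter):
        eta = X @ beta
        mu = 1.0 / (1.0 + np.exp(-eta))          # fitted probabilities
        grad = X.T @ (y - mu) - k * beta         # penalised score
        W = mu * (1.0 - mu)                      # IRLS weights
        H = X.T @ (X * W[:, None]) + k * np.eye(p)
        beta = beta + np.linalg.solve(H, grad)   # Newton step
    return beta
```

The added kI term in the Hessian is what stabilises the Newton steps when X'WX is nearly singular, the same mechanism by which the ridge penalty tames the variance inflation described in the abstract.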

19.
A new modified Jackknifed estimator for the Poisson regression model
Poisson regression is very popular in applied research for analyzing count data. However, a multicollinearity problem arises for the Poisson regression model when the independent variables are highly intercorrelated. The shrinkage estimator is a commonly applied solution to the general problem caused by multicollinearity. Recently, ridge regression (RR) estimators and some methods for estimating the ridge parameter k in Poisson regression have been proposed. Some of these estimators have been found to be better than the commonly used maximum likelihood (ML) estimator and some other RR estimators. In this study, the modified Jackknifed Poisson ridge regression (MJPR) estimator is proposed to remedy the multicollinearity. A simulation study and a real data example are provided to evaluate the performance of the estimators; both mean squared error and percentage relative error are considered as performance criteria. The simulation study and the real data example show that the proposed MJPR method outperforms the Poisson ridge regression, the Jackknifed Poisson ridge regression and the ML estimator in all of the situations evaluated in this paper.
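A baseline sketch of the Poisson ridge fit that the Jackknifed and modified Jackknifed estimators build on: penalised IRLS with a log link. The jackknife bias-correction layer of the MJPR estimator itself is omitted here.

```python
import numpy as np

def poisson_ridge(X, y, k, n_iter=50):
    """Poisson regression (log link) with an L2 ridge penalty k, fitted
    by Newton's method on the penalised log-likelihood."""
    p = X.shape[1]
    beta = np.zeros(p)
    for _ in range(n_iter):
        mu = np.exp(X @ beta)                    # fitted means
        grad = X.T @ (y - mu) - k * beta         # penalised score
        H = X.T @ (X * mu[:, None]) + k * np.eye(p)
        beta = beta + np.linalg.solve(H, grad)   # Newton step
    return beta
```

As in the linear case, the kI term keeps the information matrix well conditioned when the regressors are highly intercorrelated, at the cost of some bias that the Jackknifed variants then try to reduce.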

20.
This article introduces a general class of biased estimators, namely the generalized diagonal ridge-type (GDR) estimator, for the linear regression model when multicollinearity occurs. The estimator reduces to different kinds of biased estimators when different parameters are chosen. Some properties of this estimator are discussed and an iterative procedure is provided for selecting the parameters. A Monte Carlo simulation study and an application show that the GDR estimator performs much better than the ordinary least squares (OLS) estimator under the mean square error (MSE) criterion when severe multicollinearity is present.


Copyright © Beijing Qinyun Technology Development Co., Ltd. (北京勤云科技发展有限公司)  京ICP备09084417号