Similar Documents
20 similar documents found (search time: 31 ms)
1.
As the ordinary least squares (OLS) method is very sensitive to outliers as well as to correlated responses, a robust coefficient estimation method based on M-estimators is proposed in this paper for multi-response surfaces in multistage processes. In this approach, experimental designs are used in which the intermediate response variables may act as covariates in the next stages. The performances of both the ordinary multivariate OLS and the proposed robust multi-response surface approach are analyzed and compared through extensive simulation experiments. The sum of squared errors in estimating the regression coefficients reveals the efficiency of the proposed robust approach.
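The abstract leaves the M-estimation step abstract. As a rough sketch of the underlying idea — ordinary single-response Huber M-estimation via iteratively reweighted least squares, not the paper's multi-response multistage method; the tuning constant, data, and names below are illustrative assumptions:

```python
import numpy as np

def huber_irls(X, y, c=1.345, n_iter=50):
    """M-estimation via iteratively reweighted least squares
    with Huber weights and a MAD-based robust scale estimate."""
    beta = np.linalg.lstsq(X, y, rcond=None)[0]  # OLS starting point
    for _ in range(n_iter):
        r = y - X @ beta
        s = np.median(np.abs(r - np.median(r))) / 0.6745  # robust scale
        u = r / max(s, 1e-12)
        w = np.where(np.abs(u) <= c, 1.0, c / np.abs(u))  # Huber weights
        beta = np.linalg.solve(X.T @ (w[:, None] * X), X.T @ (w * y))
    return beta

# Illustrative data: y = 1 + 2x with one gross outlier
rng = np.random.default_rng(0)
x = np.linspace(0, 1, 40)
y = 1 + 2 * x + 0.05 * rng.standard_normal(40)
y[5] += 10.0  # contaminated observation
X = np.column_stack([np.ones_like(x), x])
b_ols = np.linalg.lstsq(X, y, rcond=None)[0]
b_rob = huber_irls(X, y)
```

The outlier receives a downweight proportional to c/|u|, so the robust fit stays close to the true line while OLS is pulled toward the contaminated point.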

2.
In modern quality engineering, dual response surface methodology is a powerful tool to model an industrial process by using both the mean and the standard deviation of the measurements as the responses. The least squares method in regression is often used to estimate the coefficients in the mean and standard deviation models, and various decision criteria are proposed by researchers to find the optimal conditions. Based on the inherent hierarchical structure of the dual response problems, we propose a Bayesian hierarchical approach to model dual response surfaces. Such an approach is compared with two frequentist least squares methods by using two real data sets and simulated data.

3.
Regression analysis is one of the methods most widely used in prediction problems. Although many methods exist for parameter estimation in regression analysis, the ordinary least squares (OLS) technique is the most commonly used among them. However, this technique is highly sensitive to outlier observations. Therefore, the literature suggests robust techniques when the data set includes outliers. Moreover, in a prediction problem, techniques that reduce the influence of outliers and that use the median as a target function, rather than a mean error, are more successful in modeling such data. In this study, a new parameter estimation method based on particle swarm optimization is proposed; it minimizes the median of the absolute relative errors, obtained by dividing the difference between observed and predicted values by the observed value. The performance of the proposed method was evaluated in a simulation study by comparing it with OLS and several other robust methods from the literature.
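The target function described in the abstract — the median of the absolute relative errors — can be sketched as follows. The paper's particle swarm optimizer is not reproduced here; a generic derivative-free Nelder-Mead search stands in, and the data and names are illustrative:

```python
import numpy as np
from scipy.optimize import minimize

def median_abs_relative_error(beta, X, y):
    """Median of |observed - predicted| / |observed|, the target
    function described in the abstract."""
    return np.median(np.abs((y - X @ beta) / y))

# Illustrative data: y = 3 + 0.5x with two large outliers
rng = np.random.default_rng(1)
x = np.linspace(1, 10, 60)
y = 3 + 0.5 * x + 0.1 * rng.standard_normal(60)
y[[3, 40]] += 8.0  # outliers
X = np.column_stack([np.ones_like(x), x])

b_start = np.linalg.lstsq(X, y, rcond=None)[0]  # OLS starting point
res = minimize(median_abs_relative_error, b_start, args=(X, y),
               method="Nelder-Mead")
b_med = res.x
```

Because the median ignores the upper half of the sorted errors, the two outliers have essentially no influence on the fitted line.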

4.
We consider the estimation of a regression coefficient in a linear regression when observations are missing due to nonresponse. Response is assumed to be determined by a nonobservable variable which is linearly related to an observable variable. The values of the observable variable are assumed to be available for the whole sample, but the variable is not included in the regression relationship of interest. Several alternative estimators have been proposed for this situation under various simplifying assumptions. A sampling theory approach provides three alternative estimators by considering the observations as obtained from a sub-sample, selected on the basis of the fully observable variable, as formulated by Nathan and Holt (1980). Under an econometric approach, Heckman (1979) proposed a two-stage (probit and OLS) estimator which is consistent under specific conditions. A simulation comparison of the four estimators and the ordinary least squares estimator, under multivariate normality of all the variables involved, indicates that the econometric approach estimator is not robust to departures from the conditions underlying its derivation, while two of the other estimators exhibit a similarly stable performance over a wide range of conditions. Simulations for a non-normal distribution show that gains in performance can be obtained if observations on the independent variable are available for the whole population.

5.
Inverse response plots are a useful tool in determining a response transformation function for response linearization in regression. Under some mild conditions it is possible to seek such transformations by plotting ordinary least squares fits versus the responses. A common approach is then to use nonlinear least squares to estimate a transformation by modelling the fits on the transformed response where the transformation function depends on an unknown parameter to be estimated. We provide insight into this approach by considering sensitivity of the estimation via the influence function. For example, estimation is insensitive to the method chosen to estimate the fits in the initial step. Additionally, the inverse response plot does not provide direct information on how well the transformation parameter is being estimated and poor inverse response plots may still result in good estimates. We also introduce a simple robustified process that can vastly improve estimation.

6.
During drug development, the calculation of inhibitory concentration that results in a response of 50% (IC50) is performed thousands of times every day. The nonlinear model most often used to perform this calculation is a four‐parameter logistic, suitably parameterized to estimate the IC50 directly. When performing these calculations in a high‐throughput mode, each and every curve cannot be studied in detail, and outliers in the responses are a common problem. A robust estimation procedure to perform this calculation is desirable. In this paper, a rank‐based estimate of the four‐parameter logistic model that is analogous to least squares is proposed. The rank‐based estimate is based on the Wilcoxon norm. The robust procedure is illustrated with several examples from the pharmaceutical industry. When no outliers are present in the data, the robust estimate of IC50 is comparable with the least squares estimate, and when outliers are present in the data, the robust estimate is more accurate. A robust goodness‐of‐fit test is also proposed. To investigate the impact of outliers on the traditional and robust estimates, a small simulation study was conducted.
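The Wilcoxon-norm rank-based fit described in the abstract is not reproduced here. As the baseline it is compared against, a minimal least squares fit of the four-parameter logistic model, parameterized so that the IC50 is estimated directly (the concentrations, noise level, and bounds are illustrative assumptions):

```python
import numpy as np
from scipy.optimize import curve_fit

def four_pl(conc, bottom, top, ic50, hill):
    """Four-parameter logistic curve, parameterized so that the
    IC50 is one of the estimated parameters."""
    return bottom + (top - bottom) / (1 + (conc / ic50) ** hill)

# Simulated dose-response data with a true IC50 of 1.0
conc = np.array([0.01, 0.03, 0.1, 0.3, 1.0, 3.0, 10.0, 30.0, 100.0])
rng = np.random.default_rng(2)
resp = four_pl(conc, 0.0, 100.0, 1.0, 1.0) + rng.normal(0.0, 1.0, conc.size)

# Least squares fit; bounds keep ic50 positive during the search
popt, _ = curve_fit(four_pl, conc, resp, p0=[0, 100, 1, 1],
                    bounds=([-20, 50, 1e-3, 0.1], [20, 150, 100, 5]))
ic50_hat = popt[2]
```

With this parameterization the IC50 comes out of the fit directly, with its own standard error, rather than being back-calculated from other parameters.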

7.
We propose a new robust regression estimator, DPM, that combines a data partition technique with M estimation. The data partition technique defines a small fixed number of subsets of the partitioned data set and produces a corresponding ordinary least squares (OLS) fit in each subset, in contrast to the resampling technique of existing robust estimators such as the least trimmed squares estimator. The proposed estimator shares a common strategy with the median ball algorithm estimator, which is obtained from OLS trial fits on only a fixed number of subsets of the data. We examine the performance of the DPM estimator on eleven challenging data sets and in simulation studies. We also compare the DPM with five commonly used robust estimators in terms of empirical convergence rates relative to OLS for clean data, robustness measured through mean squared error and bias, masking and swamping probabilities, the ability to detect known outliers, and regression and affine equivariance.

8.
In split-plot experiments, estimation of unknown parameters by generalized least squares (GLS), as opposed to ordinary least squares (OLS), is required, owing to the existence of whole- and subplot errors. However, estimating the error variances is often necessary for GLS. Restricted maximum likelihood (REML) is an established method for estimating the error variances, and its benefits have been highlighted in many previous studies. This article proposes a new two-step residual-based approach for estimating error variances. Results of numerical simulations indicate that the proposed method performs sufficiently well to be considered as a suitable alternative to REML.
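To make the OLS/GLS distinction concrete: a minimal GLS sketch under a split-plot-like error structure, where subplots in the same whole plot share a whole-plot error. Unlike the REML and two-step approaches above, the error variances are here assumed known, and all numbers are illustrative:

```python
import numpy as np

# Split-plot-like data: 8 whole plots, 4 subplots each; subplots in the
# same whole plot share a whole-plot error (illustrative variances).
rng = np.random.default_rng(7)
groups = np.repeat(np.arange(8), 4)
x = rng.uniform(0, 1, groups.size)
whole_plot_err = rng.normal(0, 1.0, 8)[groups]
y = 1 + 2 * x + whole_plot_err + rng.normal(0, 0.3, groups.size)
X = np.column_stack([np.ones_like(x), x])

# GLS with the (assumed known) error covariance: sigma_wp^2 shared
# within a whole plot, plus sigma_sp^2 on the diagonal.
V = 1.0 * (groups[:, None] == groups[None, :]) + 0.3 ** 2 * np.eye(groups.size)
Vinv = np.linalg.inv(V)
beta_gls = np.linalg.solve(X.T @ Vinv @ X, X.T @ Vinv @ y)
```

In practice the two variance components are unknown, which is exactly why REML or a residual-based two-step estimator is needed before this GLS step can be applied.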

9.
Control charts for residuals, based on a regression model, require a robust fitting technique that minimizes the error of the fitted model. In the multivariate case, however, when the number of variables is high and the data become complex, traditional fitting techniques such as ordinary least squares (OLS) lose efficiency. In this paper, support vector regression (SVR) is used to construct robust control charts for residuals, called SVR-charts. This choice is based on the fact that SVR is designed to minimize the structural error, whereas other techniques minimize the empirical error. An application shows that the SVR method gives competitive results in comparison with OLS and the partial least squares method, in terms of the standard deviation of the prediction error and the standard error of performance. A sensitivity study evaluating SVR-chart performance based on the average run length (ARL) shows that the SVR-chart has the best ARL behaviour among the residual control charts compared.
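A minimal sketch of the idea, assuming scikit-learn is available: fit an SVR to in-control data, then derive Shewhart-style control limits from its residuals. The kernel, C, and epsilon values are illustrative, not the paper's settings:

```python
import numpy as np
from sklearn.svm import SVR

# Simulated in-control process data with three input variables
rng = np.random.default_rng(3)
X = rng.uniform(0, 1, (100, 3))
y = 2 * X[:, 0] - X[:, 1] + 0.5 * X[:, 2] + 0.05 * rng.standard_normal(100)

svr = SVR(kernel="linear", C=10.0, epsilon=0.01).fit(X, y)
residuals = y - svr.predict(X)

# Shewhart-style control limits on the residuals (mean +/- 3 sigma)
center, sigma = residuals.mean(), residuals.std()
ucl, lcl = center + 3 * sigma, center - 3 * sigma
```

New observations whose residual falls outside (lcl, ucl) would then be flagged as out-of-control signals on the chart.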

10.
Optimization of multi-response problems is a popular subject in the literature. However, the problem becomes complicated when the responses are functional, owing to the existence of signal factors. In this article, we propose a combined index to optimize multivariate multiple functional responses by considering functional specification limits and a target. The relation among the responses and controllable factors is characterized by polynomial equations to capture the curvature of the response functions. The validity of the proposed method is verified with a simulation example. To show the applicability of the proposed method, a real case concerning Tehran air quality is analyzed. Latitude and longitude are considered to be signal factors, and different pollutant values are responses of the experiment. Government policies in each time interval are considered as controllable factors. Finally, an optimization algorithm is used to find the best decisions for government policies.

11.
In this paper, we propose a robust statistical inference approach for varying coefficient partially nonlinear models based on quantile regression. A three-stage estimation procedure is developed to estimate the parameter and the coefficient functions involved in the model. Under some mild regularity conditions, the asymptotic properties of the resulting estimators are established. Simulation studies are conducted to evaluate the finite-sample performance and the robustness of the proposed quantile regression method versus the well-known profile least squares estimation procedure. Moreover, the Boston housing price data are analyzed to further illustrate the application of the new method.

12.
A unified approach is developed for testing hypotheses in the general linear model based on the ranks of the residuals. It complements the nonparametric estimation procedures recently reported in the literature. The testing and estimation procedures together provide a robust alternative to least squares. The methods are similar in spirit to least squares so that results are simple to interpret. Hypotheses concerning a subset of specified parameters can be tested, while the remaining parameters are treated as nuisance parameters. Asymptotically, the test statistic is shown to have a chi-square distribution under the null hypothesis. This result is then extended to cover a sequence of contiguous alternatives from which the Pitman efficacy is derived. The general application of the test requires the consistent estimation of a functional of the underlying distribution and one such estimate is furnished.

13.
In this paper, a penalized weighted composite quantile regression estimation procedure is proposed to estimate unknown regression parameters and autoregression coefficients in the linear regression model with heavy-tailed autoregressive errors. Under some conditions, we show that the proposed estimator possesses the oracle properties. In addition, we introduce an iterative algorithm to achieve the proposed optimization problem, and use a data-driven method to choose the tuning parameters. Simulation studies demonstrate that the proposed new estimation method is robust and works much better than the least squares based method when there are outliers in the dataset or the autoregressive error distribution follows heavy-tailed distributions. Moreover, the proposed estimator works comparably to the least squares based estimator when there are no outliers and the error is normal. Finally, we apply the proposed methodology to analyze the electricity demand dataset.

14.
A common problem in multivariate general linear models is partially missing response data. The simplest method of analysis in the presence of missing data has been to delete all observations on any individual with any missing data (listwise deletion) and to use a traditional complete-data approach. However, this can result in a great loss of information, and perhaps inconsistencies in the estimation of the variance-covariance matrix. For the generalized multivariate analysis of variance (GMANOVA) model with missing data, Kleinbaum (1973) proposed an estimated generalized least squares approach. In order to apply it, however, a consistent estimate of the variance-covariance matrix is needed. Kleinbaum proposed an estimator which is unbiased and consistent, but it does not take advantage of the fact that the underlying model is GMANOVA and not MANOVA. Using the fact that the underlying model is GMANOVA, we construct four other consistent estimators. A Monte Carlo simulation experiment is conducted to examine how well these estimators compare to the estimator proposed by Kleinbaum.

15.
In linear regression models it is common that the error variances are not the same for all observations and that some high leverage data points are present. In such situations, the available literature advocates the use of heteroscedasticity consistent covariance matrix estimators (HCCME) for testing the regression coefficients. Primarily, such estimators are based on the residuals derived from the ordinary least squares (OLS) estimator, which can itself be seriously inefficient in the presence of heteroscedasticity. Many efficient estimators, namely the adaptive estimators, are available, but their performance has not yet been evaluated when heteroscedasticity is accompanied by high leverage data. In this article, the presence of high leverage data is taken into account to evaluate the efficiency of the adaptive estimator. Furthermore, our numerical work also evaluates the performance of the robust standard errors based on this efficient estimator in terms of interval estimation and null rejection rate (NRR).

16.
Numerous estimation techniques for regression models have been proposed. These procedures differ in how sample information is used in the estimation procedure. The efficiency of least squares (OLS) estimators implicitly assumes normally distributed residuals and is very sensitive to departures from normality, particularly to "outliers" and thick-tailed distributions. Least absolute deviation (LAD) estimators are less sensitive to outliers and are optimal for Laplace random disturbances, but not for normal errors. This paper reports Monte Carlo comparisons of OLS, LAD, two robust estimators discussed by Huber, three partially adaptive estimators, Newey's generalized method of moments estimator, and an adaptive maximum likelihood estimator based on a normal kernel studied by Manski. This paper is the first to compare the relative performance of some adaptive robust estimators (partially adaptive and adaptive procedures) with some common nonadaptive robust estimators. The partially adaptive estimators are based on three flexible parametric distributions for the errors. These include the power exponential (Box-Tiao) and generalized t distributions, as well as a distribution for the errors which is not necessarily symmetric. The adaptive procedures are "fully iterative" rather than one-step estimators. The adaptive estimators have desirable large sample properties, but these properties do not necessarily carry over to the small sample case.

The Monte Carlo comparisons of the alternative estimators are based on four different specifications of the error distribution: a normal, a mixture of normals (or variance-contaminated normal), a bimodal mixture of normals, and a lognormal. Five hundred samples of size 50 are used. The adaptive and partially adaptive estimators perform very well relative to the other estimation procedures considered, and preliminary results suggest that in some important cases they can perform much better than OLS, with 50 to 80% reductions in standard errors.
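The OLS/LAD contrast discussed above can be sketched on a variance-contaminated normal error distribution, one of the four specifications used in the study. The LAD fit here minimizes the sum of absolute residuals with a generic Nelder-Mead search; the contamination proportions and scales are illustrative assumptions:

```python
import numpy as np
from scipy.optimize import minimize

# Variance-contaminated normal errors: 90% N(0, 0.1^2), 10% N(0, 2^2)
rng = np.random.default_rng(5)
x = rng.uniform(0, 1, 80)
scale = np.where(rng.uniform(size=80) < 0.9, 0.1, 2.0)
y = 2 + 3 * x + scale * rng.standard_normal(80)
X = np.column_stack([np.ones_like(x), x])

b_ols = np.linalg.lstsq(X, y, rcond=None)[0]
# LAD: minimize the sum of absolute residuals, starting from OLS
lad = minimize(lambda b: np.abs(y - X @ b).sum(), b_ols, method="Nelder-Mead")
b_lad = lad.x
```

The occasional large-variance draws inflate the OLS loss quadratically but enter the LAD loss only linearly, which is why LAD stays close to the true coefficients here.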


17.
Nowadays, many manufacturing and service systems provide products and services to their customers in several consecutive stages of operations, in each of which one or more quality characteristics of interest are monitored. In these environments, the final quality in the last stage depends not only on the quality of the task performed in that stage but also on the quality of the products and services in intermediate stages, as well as on the design parameters in each stage. In this paper, a novel methodology based on the posterior preference approach is proposed to robustly optimize these multistage processes. In this methodology, a multi-response surface optimization problem is solved in order to find preferred solutions among different nondominated solutions (NDSs) according to the decision maker's preference. In addition, as the intermediate response variables (quality characteristics) may act as covariates in the next stages, a robust multi-response estimation method is applied to extract the relationships between the outputs and inputs of each stage. NDSs are generated by the ε-constraint method. The robust preferred solutions are selected considering some newly defined conformance criteria. The applicability of the proposed approach is illustrated by a numerical example at the end.


19.
The hazard function plays an important role in reliability and survival studies, since it describes the instantaneous risk of failure of items at a time point, given that they have not failed before. In some real-life applications, abrupt changes in the hazard function are observed due to overhauls, major operations or specific maintenance activities. In such situations it is of interest to detect the location where such a change occurs and to estimate the size of the change. In this paper we consider the problem of estimating a single change point in a piecewise constant hazard function when the observed variables are subject to random censoring. We suggest an estimation procedure that is based on certain structural properties and on least squares ideas. A simulation study compares the performance of this estimator with two estimators available in the literature: an estimator based on a functional of the Nelson-Aalen estimator and a maximum likelihood estimator. The proposed least squares estimator turns out to be less biased than the other two estimators, but has a larger variance. We illustrate the estimation method on some real data sets.

20.
In heteroskedastic regression models, the least squares (OLS) covariance matrix estimator is inconsistent and inference is not reliable. To deal with the inconsistency, one can estimate the regression coefficients by OLS and then apply a heteroskedasticity consistent covariance matrix (HCCM) estimator. Unfortunately, the HCCM estimator is biased. The bias is reduced by implementing a robust regression and by using the robust residuals to compute the HCCM estimator (RHCCM). A Monte Carlo study analyzes the behavior of RHCCM and of other HCCM estimators in the presence of systematic and random heteroskedasticity and of outliers in the explanatory variables.
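A minimal sketch of HCCM estimation: White's HC0 and the leverage-adjusted HC3 variant, computed from plain OLS residuals (not the robust-residual RHCCM proposed above); the simulated heteroskedastic data are illustrative:

```python
import numpy as np

def hccm(X, resid, kind="HC3"):
    """White-type heteroskedasticity consistent covariance matrix
    for OLS coefficients (HC0, and the leverage-adjusted HC3)."""
    XtX_inv = np.linalg.inv(X.T @ X)
    h = np.einsum("ij,jk,ik->i", X, XtX_inv, X)  # hat-matrix diagonal
    omega = resid ** 2 if kind == "HC0" else (resid / (1 - h)) ** 2
    meat = X.T @ (omega[:, None] * X)             # sandwich "meat"
    return XtX_inv @ meat @ XtX_inv

# Simulated data whose error standard deviation grows with x
rng = np.random.default_rng(6)
x = rng.uniform(0, 1, 200)
X = np.column_stack([np.ones_like(x), x])
y = 1 + 2 * x + x * rng.standard_normal(200)
beta = np.linalg.lstsq(X, y, rcond=None)[0]
resid = y - X @ beta
se_hc3 = np.sqrt(np.diag(hccm(X, resid, "HC3")))
```

HC3 divides each residual by its (1 - leverage) factor before squaring, which counteracts the downward bias of squared OLS residuals at high leverage points and motivates the robust-residual refinement studied in the paper.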
