Similar Articles
20 similar articles found.
1.
Recent advances in statistical estimation theory have resulted in the development of new procedures, called robust methods, that can be used to estimate the coefficients of a regression model. Because such methods take into account the impact of discrepant data points during the initial estimation process, they offer a number of advantages over ordinary least squares and other analytical procedures (such as the analysis of outliers or regression diagnostics). This paper describes the robust method of analysis and illustrates its potential usefulness by applying the technique to two data sets. The first application uses artificial data; the second uses a data set analyzed previously by Tufte [15] and, more recently, by Chatterjee and Wiseman [6].
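For orientation, a minimal sketch of M-estimator robust regression against ordinary least squares, using statsmodels on synthetic data with a few planted discrepant points (the paper's two data sets are not reproduced here):

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
x = rng.uniform(0, 10, 50)
y = 2.0 + 0.5 * x + rng.normal(0, 1, 50)
y[:3] += 15.0                      # a few discrepant points

X = sm.add_constant(x)
ols = sm.OLS(y, X).fit()                              # ordinary least squares
rlm = sm.RLM(y, X, M=sm.robust.norms.HuberT()).fit()  # Huber M-estimator

print("OLS coefficients:   ", ols.params)
print("Robust coefficients:", rlm.params)
```

The robust fit downweights the three contaminated observations rather than letting them pull the slope and intercept, which is the advantage the abstract describes.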

2.
Because the eight largest bank failures in United States history have occurred since 1973 [24], the development of early-warning problem-bank identification models is an important undertaking. It has been shown previously [3] [5] that M-estimator robust regression provides such a model. The present paper develops a similar model for the multivariate case using both a robustified Mahalanobis distance analysis [21] and principal components analysis [10]. In addition to providing a successful presumptive problem-bank identification model, combining the use of the M-estimator robust regression procedure and the robust Mahalanobis distance procedure with principal components analysis is also demonstrated to be a general method of outlier detection. The results from using these procedures are compared to some previously suggested procedures, and general conclusions are drawn.
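A hedged sketch of outlier screening with a robustified Mahalanobis distance; scikit-learn's Minimum Covariance Determinant estimator stands in for the paper's robustification [21], and the data are synthetic:

```python
import numpy as np
from scipy.stats import chi2
from sklearn.covariance import MinCovDet

rng = np.random.default_rng(1)
X = rng.multivariate_normal([0, 0], [[1, 0.6], [0.6, 1]], size=200)
X[:5] += 6.0                                  # planted outliers

mcd = MinCovDet(random_state=1).fit(X)
d2 = mcd.mahalanobis(X)                       # squared robust distances
cutoff = chi2.ppf(0.975, df=X.shape[1])       # chi-square reference cutoff
print("flagged outliers:", np.where(d2 > cutoff)[0])
```

Because the location and scatter estimates are themselves robust, the planted points cannot mask themselves by inflating the covariance used to measure their own distances.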

3.
An approach to analyzing experimental data with multiple criteria is explained and demonstrated on data from a test of the effectiveness of two posters. As a supplement to traditional multivariate analysis of variance and covariance, the application of a step-down F test is advocated when an ordering of the criteria is meaningful, and an analysis of contrasts is recommended when such an ordering is not managerially relevant. The step-down procedure has the advantage of simultaneously testing an overall hypothesis and hypotheses on each criterion variable.
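A minimal sketch of the step-down idea, assuming the criteria have a meaningful a-priori ordering (here y1 before y2): each criterion is tested with the higher-priority criteria entered as covariates, so its F test reflects only its incremental information. The variable names and data are illustrative, not from the poster study.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf
from statsmodels.stats.anova import anova_lm

rng = np.random.default_rng(2)
n = 40
df = pd.DataFrame({"poster": np.repeat(["A", "B"], n // 2)})
df["y1"] = rng.normal(0, 1, n) + (df["poster"] == "B") * 0.8
df["y2"] = 0.5 * df["y1"] + rng.normal(0, 1, n) + (df["poster"] == "B") * 0.3

# Step 1: test the treatment on the highest-priority criterion alone.
step1 = smf.ols("y1 ~ C(poster)", data=df).fit()
# Step 2: test y2 with y1 entered as a covariate, removing its contribution.
step2 = smf.ols("y2 ~ y1 + C(poster)", data=df).fit()

print(anova_lm(step1, typ=2))
print(anova_lm(step2, typ=2))
```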

4.
5.
Decision support for managers and policy makers, such as required in planning and evaluation efforts, often requires ad hoc behavioral modeling to account for context-specific phenomena and to handle data limitations. This paper introduces a systematic approach useful for meeting these requirements in which time-varying parameter estimation plays an important role. A case study evaluating the impacts of public policy actions on residential natural gas conservation illustrates the approach.
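A rough sketch of tracking a drifting coefficient with a rolling-window regression; this is only a simple stand-in for time-varying parameter estimation, and the gas-consumption data below are simulated for illustration.

```python
import numpy as np
import statsmodels.api as sm
from statsmodels.regression.rolling import RollingOLS

rng = np.random.default_rng(3)
T = 120
price = rng.uniform(1, 3, T)
beta_t = np.linspace(-0.5, -1.5, T)           # conservation effect drifts over time
usage = 10 + beta_t * price + rng.normal(0, 0.3, T)

X = sm.add_constant(price)
res = RollingOLS(usage, X, window=24).fit()   # 24-period moving estimate
print(res.params[-5:])                        # most recent coefficient estimates
```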

6.
In this paper, we discuss some disturbing features of two linear programming (LP) approaches to the discriminant problem. Specifically, we show that both approaches are sensitive to the choice of origin for the data although, intuitively, placement of origin should have no effect on the method of assigning cases to groups. In addition, we show that these LP approaches may lead to discriminant functions which assign all cases to the same group. We show that the usual statistical approach to this problem does not share these difficulties, and we make recommendations for implementing these LP approaches which help to alleviate the difficulties.

7.
The identification and location of materials losses in nuclear facilities is an important issue. Many complexities arise in monitoring such losses, including the dependency among materials balance observations and the influence of errors (outliers) on the parameter estimates of various monitoring methods. The proposed Joint Estimation procedure is superior to standard methods (control chart and CUSUM) and to methods that build in correlation (ARMA control chart, ARMA CUSUM, and the Generalized M procedure) in the detection of nuclear materials losses. The Joint Estimation procedure is robust to the influence of outliers, is flexible in accommodating a range of dependencies among observations, and provides information on the type of loss. Further, the procedure is reliable in that its false-alarm and loss-detection probabilities are closer to specifications.
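For orientation only, here is a minimal one-sided tabular CUSUM of the kind the paper uses as a baseline; the Joint Estimation procedure itself is not reproduced, and the reference value k and decision interval h are illustrative design constants.

```python
import numpy as np

rng = np.random.default_rng(4)
balances = rng.normal(0, 1, 60)
balances[40:] += 1.5                # a sustained loss begins at period 40

k, h = 0.5, 5.0                     # reference value and decision interval
s = 0.0
for t, x in enumerate(balances):
    s = max(0.0, s + x - k)         # accumulate evidence of an upward shift
    if s > h:
        print(f"loss signalled at period {t}")
        break
```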

8.
Four discriminant models were compared in a simulation study: Fisher's linear discriminant function [14], Smith's quadratic discriminant function [34], the logistic discriminant model, and a model based on linear programming [17]. The study was conducted to estimate expected rates of misclassification for these four procedures when observations were sampled from a variety of normal and nonnormal distributions. In contrast to previous research, data were taken from four types of kurtotic population distributions. The results indicate the four discriminant procedures are robust toward data from many types of distributions. The misclassification rates for both the logistic discriminant model and the formulation based on linear programming consistently decreased as the kurtosis in the data increased. The decreases, however, were of small magnitude. None of these procedures yielded statistically significant lower rates of misclassification under nonnormality. The quadratic discriminant function produced significantly lower error rates when the variances across groups were heterogeneous.
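A hedged sketch comparing three of the four classifiers on simulated data with heterogeneous group variances: Fisher's linear discriminant, the quadratic discriminant, and the logistic model (the LP formulation is sketched separately under item 10 below). Misclassification is estimated on a holdout sample.

```python
import numpy as np
from sklearn.discriminant_analysis import (LinearDiscriminantAnalysis,
                                           QuadraticDiscriminantAnalysis)
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(5)
X0 = rng.multivariate_normal([0, 0], [[1, 0], [0, 1]], 200)
X1 = rng.multivariate_normal([1.5, 1.5], [[3, 0], [0, 3]], 200)  # larger variance
X = np.vstack([X0, X1]); y = np.repeat([0, 1], 200)
Xtr, Xte, ytr, yte = train_test_split(X, y, random_state=5)

for name, clf in [("LDA", LinearDiscriminantAnalysis()),
                  ("QDA", QuadraticDiscriminantAnalysis()),
                  ("Logit", LogisticRegression())]:
    err = 1 - clf.fit(Xtr, ytr).score(Xte, yte)
    print(f"{name}: misclassification rate {err:.3f}")
```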

9.
Small business loan applications have not been evaluated successfully by traditional methods. This paper explores the possibility of using three types of nonfinancial ratio variables (owner, firm, and loan characteristics) to predict whether a small business will pay off or default on its loan. The owner and loan variables were better predictors of loan success than the firm variables.

10.
The two-group discriminant problem has applications in many areas, for example, differentiating between good credit risks and poor ones, between promising new firms and those likely to fail, or between patients with strong prospects for recovery and those highly at risk. To expand our tools for dealing with such problems, we propose a class of nonparametric discriminant procedures based on linear programming (LP). Although these procedures have attracted considerable attention recently, only a limited number of computational studies have examined the relative merits of alternative formulations. In this paper we provide a detailed study of three contrasting formulations for the two-group problem. The experimental design provides a variety of test conditions involving both normal and nonnormal populations. Our results establish the LP model which seeks to minimize the sum of deviations beyond the two-group boundary as a promising alternative to more conventional linear discriminant techniques.
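A minimal sketch of a minimize-sum-of-deviations (MSD) formulation: choose weights w and a cutoff c so that the total deviation of misplaced cases beyond the group boundary is minimized. The normalization constraint w·(mean_B − mean_A) = 1 is one common device for ruling out the trivial all-zero solution; published formulations differ in this detail.

```python
import numpy as np
from scipy.optimize import linprog

rng = np.random.default_rng(6)
A = rng.normal(0.0, 1.0, (30, 2))          # group A observations
B = rng.normal(2.0, 1.0, (30, 2))          # group B observations
nA, nB, k = len(A), len(B), A.shape[1]

# Variables: w (k), c (1), deviations d (nA + nB), with d >= 0.
obj = np.concatenate([np.zeros(k + 1), np.ones(nA + nB)])

# Group A should satisfy w.x <= c + d_i  ->  w.x - c - d_i <= 0.
GA = np.hstack([A, -np.ones((nA, 1)), -np.eye(nA), np.zeros((nA, nB))])
# Group B should satisfy w.x >= c - d_i  ->  -w.x + c - d_i <= 0.
GB = np.hstack([-B, np.ones((nB, 1)), np.zeros((nB, nA)), -np.eye(nB)])
A_ub = np.vstack([GA, GB]); b_ub = np.zeros(nA + nB)

# Normalization to exclude the trivial solution w = 0, c = 0.
A_eq = np.concatenate([B.mean(0) - A.mean(0), [0.0], np.zeros(nA + nB)])[None, :]
b_eq = [1.0]

bounds = [(None, None)] * (k + 1) + [(0, None)] * (nA + nB)
res = linprog(obj, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq, bounds=bounds)
w, c = res.x[:k], res.x[k]
print("weights:", w, "cutoff:", c)
```

New cases are then assigned to group A when w·x falls at or below c, and to group B otherwise.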

11.
Ravinder Nath, Decision Sciences, 1984, 15(2): 248-252
Expressions for misclassification probabilities are derived under a contaminated multivariate normal model for the linear-programming approaches to the two-group discriminant problem.
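A hedged Monte-Carlo sketch of the setting: estimate a two-group misclassification rate when each group is drawn from a contaminated multivariate normal (a 90/10 mixture with inflated covariance). The LP discriminant itself is not reproduced; LDA stands in as the classifier, and all constants are illustrative.

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

rng = np.random.default_rng(7)

def contaminated(mean, n, eps=0.1, scale=3.0):
    """Draw n points from (1-eps)*N(mean, I) + eps*N(mean, scale^2 I)."""
    base = rng.normal(mean, 1.0, (n, 2))
    bad = rng.normal(mean, scale, (n, 2))
    pick = rng.random(n) < eps
    return np.where(pick[:, None], bad, base)

rates = []
for _ in range(200):
    Xtr = np.vstack([contaminated(0.0, 100), contaminated(1.5, 100)])
    ytr = np.repeat([0, 1], 100)
    Xte = np.vstack([contaminated(0.0, 500), contaminated(1.5, 500)])
    yte = np.repeat([0, 1], 500)
    rates.append(1 - LinearDiscriminantAnalysis().fit(Xtr, ytr).score(Xte, yte))
print("mean misclassification rate:", np.mean(rates))
```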

12.
This paper empirically investigates the performance of four relatively new nonparametric techniques against four different parametric versions of discriminant analysis. The models were constructed and analyzed using financial data drawn from 232 bankrupt and nonbankrupt companies. Generally, the nonparametric approaches, with the exception of linear programming, performed as well as or better than the more traditional discriminant analysis.

13.
The bootstrap method is used to compute the standard error of regression parameters when the data are non-Gaussian distributed. Simulation results with L1 and L2 norms for various degrees of “non-Gaussianness” are provided. The computationally efficient L2 norm, based on the bootstrap method, provides a good approximation to the L1 norm. The methodology is illustrated with daily security return data. The results show that decisions can be reversed when the ordinary least-squares estimate of standard errors is used with non-Gaussian data.
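A minimal sketch of bootstrap standard errors for regression coefficients under the L2 norm (OLS) and the L1 norm (median regression, fitted here with statsmodels' QuantReg at q = 0.5); the heavy-tailed data are simulated, not the security returns used in the paper.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(8)
n = 100
x = rng.uniform(0, 1, n)
y = 1.0 + 2.0 * x + rng.standard_t(df=2, size=n)   # non-Gaussian errors
X = sm.add_constant(x)

boot_l2, boot_l1 = [], []
for _ in range(500):
    idx = rng.integers(0, n, n)                    # resample rows with replacement
    boot_l2.append(sm.OLS(y[idx], X[idx]).fit().params)
    boot_l1.append(sm.QuantReg(y[idx], X[idx]).fit(q=0.5).params)

print("L2 bootstrap SE:", np.std(boot_l2, axis=0))
print("L1 bootstrap SE:", np.std(boot_l1, axis=0))
```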

14.
Estimating the unknown minimum (location) of a random variable has received some attention in the statistical literature, but not enough in the area of decision sciences. This is surprising, given that such estimation needs often arise in simulation and global optimization. This study explores the characteristics of two previously used simple percentile estimators of location. The study also identifies a new percentile estimator of the location parameter for the gamma, Weibull, and log-normal distributions with a smaller bias than the other two estimators. The performance of the new estimator, the minimum-bias percentile (MBP) estimator, and the other two percentile estimators is compared using Monte-Carlo simulation. The results indicate that, of the three estimators, the MBP estimator developed in this study provides, in most cases, the estimate with the lowest bias and smallest mean square error of the location for populations drawn from log-normal and gamma or Weibull (but not exponential) distributions. A decision diagram is provided for location estimator selection, based on the value of the coefficient of variation, when the statistical distribution is known or unknown.
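A hedged sketch of the study's Monte-Carlo design: compare the bias and mean square error of two simple location estimators for a shifted (three-parameter) gamma sample. The paper's MBP estimator is not reproduced; the sample minimum and a first-percentile rule are illustrative stand-ins.

```python
import numpy as np

rng = np.random.default_rng(9)
true_location, shape, n = 10.0, 2.0, 50

est_min, est_pct = [], []
for _ in range(5000):
    x = true_location + rng.gamma(shape, 1.0, n)
    est_min.append(x.min())                    # naive: the sample minimum
    est_pct.append(np.percentile(x, 1.0))      # a low-percentile estimator

for name, est in [("minimum", est_min), ("1st percentile", est_pct)]:
    e = np.asarray(est)
    print(f"{name}: bias {e.mean() - true_location:.4f}, "
          f"MSE {np.mean((e - true_location) ** 2):.4f}")
```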

15.
A substantial body of empirical accounting, finance, management, and marketing research utilizes single equation models with discrete dependent variables. Generally, the interpretation of the coefficients of the exogenous variables is limited to the sign and relative magnitude. This paper presents three methods of interpreting the coefficients in these models. The first method interprets the coefficients as marginal probabilities and the second method interprets the coefficients as elasticities of probability. The third method utilizes sensitivity analysis and examines the effect of hypothetical changes in exogenous variables on the probability of choice. This paper applies these methods to a published research study.
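A minimal sketch of the first two interpretation methods for a logit model: marginal probabilities via statsmodels' get_margeff, and elasticities of probability (method="eyex", the percent change in the choice probability per percent change in a regressor); the data are simulated.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(10)
n = 500
x = rng.normal(2.0, 1.0, n)
p = 1 / (1 + np.exp(-(-1.0 + 0.8 * x)))
y = rng.binomial(1, p)

res = sm.Logit(y, sm.add_constant(x)).fit(disp=0)
print(res.get_margeff().summary())               # marginal probabilities
print(res.get_margeff(method="eyex").summary())  # elasticities of probability
```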

16.
This paper presents a minimum-cost methodology for determining a statistical sampling plan in substantive audit tests. In this model, the auditor specifies β, the risk of accepting an account balance as correct when it is not, according to audit evidence requirements. Using β as a constraint, the auditor then selects a sampling plan to optimize the trade-off between sampling costs and the costs of follow-up audit procedures. Tables to aid in this process and an illustration are provided.
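A hedged sketch of the idea: enumerate (n, a) acceptance-sampling plans, keep those satisfying the β-risk constraint P(accept | error rate = tolerable rate) ≤ β, and pick the plan minimizing a simple cost of sampling plus expected follow-up work. The cost coefficients and rates below are illustrative, not the paper's tables.

```python
from scipy.stats import binom

beta, p_tol = 0.05, 0.06          # acceptable beta risk, tolerable error rate
c_sample, c_followup = 1.0, 50.0  # cost per sampled item / per rejection

best = None
for n in range(30, 301):
    for a in range(0, 6):                           # acceptance number
        if binom.cdf(a, n, p_tol) <= beta:          # beta-risk constraint met
            reject_prob = 1 - binom.cdf(a, n, 0.01) # at the expected error rate
            cost = c_sample * n + c_followup * reject_prob
            if best is None or cost < best[0]:
                best = (cost, n, a)
print("minimum-cost plan (cost, n, a):", best)
```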

17.
Forecasters typically select a statistical forecasting model from among a set of alternative models. Subsequently, forecasts are generated with the chosen model and reported to management (forecast consumers) as if specification uncertainty did not exist (i.e., as if the chosen model were the “true” model of the forecast variable). In this note, a well-known Bayesian model-comparison procedure is used to illustrate some of the ambiguities and distortions of forecasts that do not reflect specification uncertainty. It is shown that a single selected forecasting model (however chosen) will generally misstate measures of forecast risk and lead to point and interval forecasts that are misplaced from a decision-theoretic point of view.
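A rough sketch of the point: a single selected model understates forecast risk relative to a posterior-weighted mixture of candidate models. BIC-based weights approximate posterior model probabilities here, and the trend models are illustrative, not the note's procedure.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(11)
t = np.arange(80)
y = 5 + 0.1 * t + 0.002 * t**2 + rng.normal(0, 1, 80)

X1 = sm.add_constant(t)                           # model 1: linear trend
X2 = sm.add_constant(np.column_stack([t, t**2]))  # model 2: quadratic trend
fits = [sm.OLS(y, X).fit() for X in (X1, X2)]

bic = np.array([f.bic for f in fits])
w = np.exp(-0.5 * (bic - bic.min())); w /= w.sum()  # approx. posterior probs

x_new = 85
means = np.array([fits[0].params @ [1, x_new],
                  fits[1].params @ [1, x_new, x_new**2]])
sigmas = np.array([np.sqrt(f.scale) for f in fits])

mix_mean = w @ means
mix_var = w @ (sigmas**2 + means**2) - mix_mean**2  # mixture variance
print("model weights:", w)
print("single-model sd:", sigmas, " mixture sd:", np.sqrt(mix_var))
```

When the candidate models disagree about the point forecast, the mixture standard deviation exceeds either single-model value, which is the understatement of forecast risk the note describes.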

18.
The application of optimization techniques in digital simulation experiments is frequently complicated by the presence of large experimental error variances. Two of the more widely accepted design strategies for the resolution of this problem include the assignment of common pseudorandom number streams and the assignment of antithetic pseudorandom number streams to the experimental points. When considered separately, however, each of these variance-reduction procedures has rather restrictive limitations. This paper examines the simultaneous use of these two techniques as a variance-reduction strategy in response surface methodology (RSM) analysis of simulation models. A simulation of an inventory system is used to illustrate the application and benefits of this assignment procedure, as well as the basic components of an RSM analysis.
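A minimal sketch of the two devices on a toy simulation response: common random numbers (reusing one uniform stream at two design points so noise cancels in the difference) and antithetic variates (pairing each run with its mirrored stream 1 − u). The response function is illustrative, not the inventory simulation.

```python
import numpy as np

rng = np.random.default_rng(12)

def response(theta, u):
    """Toy simulation output at design point theta driven by uniforms u."""
    return theta + np.sin(2 * np.pi * u).mean() + u.mean() * theta

# Common random numbers: the same stream drives both design points, so the
# sin() noise term cancels exactly in each paired difference.
u = rng.random((1000, 20))
diff_crn = np.mean([response(1.0, ui) - response(2.0, ui) for ui in u])

# Antithetic variates: average each run with its mirrored stream 1 - u.
u_half = rng.random((500, 20))
anti = [(response(1.0, ui) + response(1.0, 1 - ui)) / 2 for ui in u_half]

print("CRN difference estimate:", diff_crn)
print("antithetic mean estimate:", np.mean(anti), " sd:", np.std(anti))
```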

19.
In the previous paper, Cooley and Houck [1] examined the simultaneous use of common and antithetic random number streams as a variance-reduction strategy for simulation studies employing response surface methodology (RSM). Our paper supplements their work and further explores pseudorandom number assignments in response surface designs. Specifically, an alternative strategy for assigning pseudorandom numbers is proposed; this strategy is more efficient than that given by Cooley and Houck, especially when more than two factors are involved.

20.
Using a regression approach to discriminant analysis is often incorrect because it forces the use of a binary dependent variable, which violates virtually any distributional assumption for a linear model. However, assuming a Laplace distribution in an LP framework leads to a theoretical foundation for MSD (minimize the sum of deviations) discriminant analysis.
