Similar Documents
20 similar documents found.
1.
In an earlier paper, Cooley and Houck [1] examined the simultaneous use of common and antithetic random number streams as a variance-reduction strategy for simulation studies employing response surface methodology (RSM). Our paper supplements their work and further explores pseudorandom number assignments in response surface designs. Specifically, an alternative strategy for assigning pseudorandom numbers is proposed; this strategy is more efficient than that given by Cooley and Houck, especially when more than two factors are involved.
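The following Python fragment is a minimal sketch (not taken from either paper) of the antithetic-stream idea the authors build on: a simulation run driven by uniforms U is paired with a run driven by 1-U, and the variance of the paired average is compared with that of two independent runs. The toy response model, seed, and sample sizes are illustrative assumptions.

```python
# A minimal sketch: antithetic random-number pairs as a variance-reduction
# device for a simulated response at one design point. Model is illustrative.
import numpy as np

rng = np.random.default_rng(1)

def response(x, u):
    # Toy stochastic response: linear trend plus Exp(1) noise via inverse transform
    noise = -np.log(u)
    return 10.0 + 2.0 * x + (noise.mean() - 1.0)

x = 1.5
n_pairs, n_draws = 500, 50
indep, anti = [], []
for _ in range(n_pairs):
    u1, u2 = rng.random(n_draws), rng.random(n_draws)
    indep.append(0.5 * (response(x, u1) + response(x, u2)))       # independent streams
    anti.append(0.5 * (response(x, u1) + response(x, 1.0 - u1)))  # antithetic pair

print("variance, independent pairs:", np.var(indep))
print("variance, antithetic pairs :", np.var(anti))   # typically smaller
```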

2.
SONET (Synchronous Optical NETwork) add-drop multiplexers (ADMs) are the dominant cost factor in WDM (Wavelength Division Multiplexing)/SONET rings. The number of SONET ADMs required by a set of traffic streams is determined by the routing and wavelength assignment of the traffic streams. Previous work took as input traffic streams with routings given a priori and developed various heuristics for wavelength assignment to minimize the SONET ADM costs. However, little was known about the performance guarantees of these heuristics. This paper makes two main contributions. First, in addition to traffic streams with pre-specified routing, this paper also studies minimizing the ADM requirement for traffic streams without given routings, a problem shown to be NP-hard. Several heuristics for integrated routing and wavelength assignment are proposed to minimize the SONET ADM costs. Second, the approximation ratios of the heuristics for wavelength assignment only and of the heuristics for integrated routing and wavelength assignment are analyzed. The new Preprocessed Iterative Matching heuristic has the best approximation ratio: at most 3/2.

3.
In recent years, much research has been done on the application of mathematical programming (MP) techniques to the discriminant problem. While promising results have been obtained, many of these techniques are plagued by a number of problems associated with the model formulation, including unbounded, improper, and unacceptable solutions as well as solution instability under linear transformation of the data. In attempting to solve these problems, numerous formulations have been proposed involving additional variables and/or normalization constraints. While effective, these models can also become quite complex. In this paper we demonstrate that a simple, well-known special case of Hand's [13] original formulation provides an implicit normalization which avoids the problems for which various complicated remedies have been devised. While other researchers have made use of this formulation, its properties have not previously been fully recognized.

4.
The crucial steps in a quantitative analysis of a decision problem are problem formulation, model building, analysis, and implementation. Given an initial model specification, the goal of analysis is to determine the values of the controllable or decision variables that optimize the objective function. Frequently the initial model is inadequate and must be reformulated. While modeling is an evolutionary process involving art and science, under certain conditions Response Surface Methodology (RSM) is an effective vehicle for constructing and parameterizing optimization models. RSM, which draws upon the areas of experimental design, modeling, inference, and optimization, utilizes different opening and ending strategies. Through simultaneous and sequential experimentation, the approximate region of the model's maximum response is found by employing the steepest ascent method. Subsequently, the exact values of the controllable variables that maximize the model's response are determined by canonical analysis. The RSM concepts are first developed within the context of a manufacturing problem. A potential application to simulation studies is then presented.
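As a rough illustration of the steepest-ascent step described above, the sketch below (not from the paper) fits a first-order model to a 2^2 factorial design in coded units and walks along the gradient of the fitted surface. The design, responses, and step sizes are made-up values.

```python
# A minimal sketch of the steepest-ascent step in RSM: fit a first-order model
# to a 2^2 factorial design (coded units) and move along the fitted gradient.
import numpy as np

# 2^2 factorial design in coded units, with observed responses y (illustrative)
X = np.array([[-1, -1], [1, -1], [-1, 1], [1, 1]], dtype=float)
y = np.array([54.0, 60.0, 64.0, 73.0])

A = np.column_stack([np.ones(len(X)), X])          # intercept + main effects
b0, b1, b2 = np.linalg.lstsq(A, y, rcond=None)[0]  # first-order fit

direction = np.array([b1, b2])
direction /= np.linalg.norm(direction)             # unit steepest-ascent direction

# Candidate points along the path of steepest ascent (to be run as new experiments)
for step in [0.5, 1.0, 1.5, 2.0]:
    print(step, np.round(step * direction, 3))
```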

5.
A number of recent studies have compared the performance of neural networks (NNs) to a variety of statistical techniques for the classification problem in discriminant analysis. The empirical results of these comparative studies indicate that while NNs often outperform the more traditional statistical approaches to classification, this is not always the case. Thus, decision makers interested in solving classification problems are left in a quandary as to what tool to use on a particular data set. We present a new approach to solving classification problems by combining the predictions of a well-known statistical tool with those of an NN to create composite predictions that are more accurate than either of the individual techniques used in isolation.
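A minimal sketch of the composite idea, not the authors' exact procedure: average the predicted class probabilities of a classical statistical tool (here logistic regression) and a neural network, then classify with the combined probability. The dataset, network size, and equal 0.5 weighting are illustrative assumptions.

```python
# A minimal sketch: composite predictions from logistic regression and an NN.
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

X, y = make_classification(n_samples=600, n_features=8, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

lr = LogisticRegression(max_iter=1000).fit(X_tr, y_tr)
nn = MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000, random_state=0).fit(X_tr, y_tr)

# Composite prediction: average the two predicted class-1 probabilities
p_composite = 0.5 * (lr.predict_proba(X_te)[:, 1] + nn.predict_proba(X_te)[:, 1])
y_composite = (p_composite >= 0.5).astype(int)

print("logistic :", accuracy_score(y_te, lr.predict(X_te)))
print("neural   :", accuracy_score(y_te, nn.predict(X_te)))
print("composite:", accuracy_score(y_te, y_composite))
```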

6.
This paper demonstrates the feasibility of applying nonlinear programming methods to solve the classification problem in discriminant analysis. The application represents a useful extension of previously proposed linear programming-based solutions for discriminant analysis. The analysis of data obtained by conducting a Monte Carlo simulation experiment shows that these new procedures are promising. Future research that should promote application of the proposed methods for solving classification problems in a business decision-making environment is discussed.  相似文献   

7.
Congruence has served as an important research framework for many leadership research topics. Perhaps the most frequently used methodological/statistical approach for testing the congruence framework is polynomial regression analysis (PRA) with response surface methodology (RSM). As this approach was introduced to the organizational sciences more than two decades ago, we can now identify the main issues with its use in leadership research. To systematically investigate these issues, we first review how PRA and RSM have been used in various leadership studies. We then review the levels-of-analysis and rater model assumptions prevalent in PRA in terms of multilevel techniques, choice of centering options, and issues of endogeneity. Finally, to better understand the inconsistencies and variabilities that exist in leadership research, we review the use of two main RSM features and summarize additional statistical techniques for assessment in this realm. Overall, we aim to promote the rigor of this methodology within the study of congruence in leadership research by enhancing its capability in theory testing and theory building.
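For readers unfamiliar with PRA/RSM in congruence studies, the sketch below (not from the article) fits the standard quadratic polynomial of an outcome Z on X, Y, X^2, XY, Y^2 and reports the usual surface-test values along the congruence (X = Y) and incongruence (X = -Y) lines. Data are simulated and variable names are placeholders.

```python
# A minimal sketch: polynomial regression analysis with response surface features.
import numpy as np

rng = np.random.default_rng(0)
n = 300
X = rng.normal(size=n)            # e.g., leader rating (illustrative)
Y = rng.normal(size=n)            # e.g., follower rating (illustrative)
Z = 3 - 0.8 * (X - Y) ** 2 + rng.normal(scale=0.5, size=n)   # simulated outcome

D = np.column_stack([np.ones(n), X, Y, X**2, X * Y, Y**2])
b0, b1, b2, b3, b4, b5 = np.linalg.lstsq(D, Z, rcond=None)[0]

# Standard response-surface test values along the two reference lines
print("congruence line   slope a1 =", b1 + b2, " curvature a2 =", b3 + b4 + b5)
print("incongruence line slope a3 =", b1 - b2, " curvature a4 =", b3 - b4 + b5)
# A clearly negative a4 with a1, a2, a3 near zero is the classic congruence-effect pattern.
```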

8.
An approach to analyzing experimental data with multiple criteria is explained and demonstrated on data from a test of the effectiveness of two posters. As a supplement to traditional multivariate analysis of variance and covariance, the application of a step-down F test is advocated when an ordering of the criteria is meaningful, and an analysis of contrasts is recommended when such an ordering is not managerially relevant. The step-down procedure has the advantage of simultaneously testing an overall hypothesis and hypotheses on each criterion variable.

9.
10.
Recently developed large sample inference procedures for least absolute value (LAV) regression are examined via Monte Carlo simulation to determine when sample sizes are large enough for the procedures to work effectively. A variety of different experimental settings were created by varying the disturbance distribution, the number of explanatory variables and the way the explanatory variables were generated. Necessary sample sizes range from as small as 20 when disturbances are normal to as large as 200 in extreme outlier-producing distributions.
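A minimal sketch (not the authors' experimental code) of LAV regression cast as a linear program, splitting each residual into nonnegative positive and negative parts. The simulated data and solver choice (scipy's HiGHS backend) are assumptions.

```python
# A minimal sketch: least absolute value (LAV) regression as a linear program.
import numpy as np
from scipy.optimize import linprog

rng = np.random.default_rng(0)
n, p = 50, 2
X = np.column_stack([np.ones(n), rng.normal(size=(n, p))])
beta_true = np.array([1.0, 2.0, -1.0])
y = X @ beta_true + rng.standard_t(df=2, size=n)      # heavy-tailed disturbances

k = X.shape[1]
# Variables: [beta (free), u (>=0), v (>=0)], with y - X beta = u - v
c = np.concatenate([np.zeros(k), np.ones(n), np.ones(n)])   # minimize sum(u) + sum(v)
A_eq = np.hstack([X, np.eye(n), -np.eye(n)])
b_eq = y
bounds = [(None, None)] * k + [(0, None)] * (2 * n)

res = linprog(c, A_eq=A_eq, b_eq=b_eq, bounds=bounds, method="highs")
print("LAV coefficient estimates:", np.round(res.x[:k], 3))
```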

11.
We look at a specific but pervasive problem in the use of secondary or published data in which the data are summarized in a histogram format, perhaps with additional mean or median information provided; two published sources yield histogram-type summaries involving the same variable, but the two sources do not group the values of the variable the same way; the researcher wishes to answer a question using information from both data streams; and the original, detailed data underlying the published summary, which could give a better answer to the question, are unavailable. We review relevant aspects of maximum-entropy (ME) estimation, and develop a heuristic for generating ME density estimates from data in histogram form when additional means and medians may be known. Application examples from several business and scientific areas illustrate the heuristic's use. Areas of application include business and social or market research, risk analysis, and individual risk profile analysis. Some instructional or classroom applications are possible as well.
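The sketch below illustrates only the simplest building block of such a heuristic: when nothing but the bin masses of a histogram are known, the maximum-entropy density is uniform within each bin, from which implied means and quantiles can be read off. Handling extra mean/median information and reconciling two differently grouped histograms, as the paper does, requires more machinery. Bin edges and counts are illustrative.

```python
# A minimal sketch: piecewise-uniform maximum-entropy density from histogram bins.
import numpy as np

edges  = np.array([0.0, 10.0, 20.0, 40.0, 80.0])    # published bin boundaries (illustrative)
counts = np.array([15, 30, 40, 15], dtype=float)     # published frequencies (illustrative)

probs  = counts / counts.sum()
widths = np.diff(edges)
density = probs / widths                              # uniform within each bin

# Implied mean and quantiles under the ME density
mean = np.sum(probs * (edges[:-1] + edges[1:]) / 2.0)

def quantile(q):
    cum = np.concatenate([[0.0], np.cumsum(probs)])
    i = np.searchsorted(cum, q, side="right") - 1
    return edges[i] + (q - cum[i]) / probs[i] * widths[i]

print("implied mean   :", mean)
print("implied median :", quantile(0.5))
```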

12.
This paper investigates the use of the coefficient of variation (CV) as a measure of requirements lumpiness (the amount of variation in requirements from period to period) in material requirements planning (MRP) research. CV is used as a factor in MRP research even though it is limited as a measure. Any sequence of requirements will have a unique CV, but any CV can represent a variety of requirements sequences. This limitation raises questions concerning the robustness of CV as a measure. In this paper, two other aspects of the requirements sequence for a given CV are investigated: the procedure used to generate the requirements and the manner in which the requirements are grouped. A simulation comparison of selected lot-sizing techniques is used to conduct the investigation. With respect to these two (new) aspects of the requirements sequence, CV appears to be a robust measure.
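A small illustration (not from the paper) of the limitation discussed above: two requirements sequences with identical CV but very different period-to-period grouping. The numbers are made up.

```python
# A minimal sketch: the CV of a requirements sequence, and two sequences with
# the same CV but different grouping of the requirements across periods.
import numpy as np

def cv(requirements):
    r = np.asarray(requirements, dtype=float)
    return r.std(ddof=0) / r.mean()

alternating = [0, 200, 0, 200, 0, 200]   # requirements spread across periods
grouped     = [200, 200, 200, 0, 0, 0]   # same values, bunched together

print("CV alternating:", cv(alternating))   # 1.0
print("CV grouped    :", cv(grouped))       # 1.0 -- same CV, different lumpiness pattern
```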

13.
Weapon-target assignment (WTA) is one of the key decision problems in multi-UAV beyond-visual-range (BVR) air combat. Considering the new challenges that the strongly adversarial and uncertain nature of BVR air combat poses for the WTA problem, this paper models the multi-UAV WTA problem in BVR air combat as a bimatrix game based on the idea of Nash equilibrium; within this model, a multi-level information fusion method based on evidence theory is designed to compute the overall attack effectiveness when multiple UAVs engage multiple targets. On this basis, using the relationship between the probability that a strategy is selected in a Nash equilibrium and its regret value, together with the requirements of BVR air combat, the bimatrix game model is converted into a mixed-integer programming model for solution. The computational procedure and effectiveness of the proposed method are analyzed through a typical case, numerical experiments, and simulation experiments. The results show that the proposed method can effectively produce multi-UAV weapon-target assignment schemes for BVR air combat.
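A minimal sketch, far simpler than the paper's model: a 2x2 bimatrix game in which pure-strategy Nash equilibria are located by checking that neither player gains from a unilateral deviation (zero regret). The payoff numbers are illustrative attack-effectiveness values, not the evidence-theory fusion results used in the paper.

```python
# A minimal sketch: pure-strategy Nash equilibria of a small bimatrix game
# found via a regret check (no profitable unilateral deviation).
import numpy as np

A = np.array([[0.6, 0.2],     # row player's payoff (e.g., attack effectiveness)
              [0.4, 0.5]])
B = np.array([[0.7, 0.3],     # column player's payoff
              [0.2, 0.6]])

equilibria = []
for i in range(A.shape[0]):
    for j in range(A.shape[1]):
        row_regret = A[:, j].max() - A[i, j]   # row player's gain from deviating
        col_regret = B[i, :].max() - B[i, j]   # column player's gain from deviating
        if row_regret == 0 and col_regret == 0:
            equilibria.append((i, j))

print("pure-strategy Nash equilibria:", equilibria)
```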

14.
Fred Glover, Decision Sciences, 1990, 21(4): 771-785
Discriminant analysis is an important tool for practical problem solving. Classical statistical applications have been joined recently by applications in the fields of management science and artificial intelligence. In a departure from the methodology of statistics, a series of proposals have appeared for capturing the goals of discriminant analysis in a collection of linear programming formulations. The evolution of these formulations has brought advances that have removed a number of initial shortcomings and deepened our understanding of how these models differ in essential ways from other familiar classes of LP formulations. We will demonstrate, however, that the full power of the LP discriminant analysis models has not been achieved, due to a previously undetected distortion that inhibits the quality of solutions generated. The purpose of this paper is to show how to eliminate this distortion and thereby increase the scope and flexibility of these models. We additionally show how these outcomes open the door to special model manipulations and simplifications, including the use of a successive goal method for establishing a series of conditional objectives to achieve improved discrimination.

15.
Depending on the techniques employed, the due date assignment, release, and sequencing procedures in job shop scheduling may depend on one another. This research investigates the effects of these interactions with a simulation model of a dynamic five-machine job shop in which early shipments are prohibited. Performance of the system is measured primarily in terms of the total cost (work-in-process cost, finished goods holding cost, and late penalty) incurred by the shop, but a number of non-cost performance measures are also reported. The results support the existence of a three-way interaction between the due date, release, and sequencing procedures, as well as an interaction between shop utilization and procedure combination. Statistical tests are used to identify those rules that perform best both overall and in combination with other rules.

16.
The two-group discriminant problem has applications in many areas, for example, differentiating between good credit risks and poor ones, between promising new firms and those likely to fail, or between patients with strong prospects for recovery and those highly at risk. To expand our tools for dealing with such problems, we propose a class of nonparametric discriminant procedures based on linear programming (LP). Although these procedures have attracted considerable attention recently, only a limited number of computational studies have examined the relative merits of alternative formulations. In this paper we provide a detailed study of three contrasting formulations for the two-group problem. The experimental design provides a variety of test conditions involving both normal and nonnormal populations. Our results establish the LP model which seeks to minimize the sum of deviations beyond the two-group boundary as a promising alternative to more conventional linear discriminant techniques.
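A minimal sketch (not the paper's experimental code) of the "minimize the sum of deviations" LP discriminant model mentioned above, with a simple gap normalization added to rule out the trivial all-zero solution. The normalization choice and simulated data are assumptions for illustration.

```python
# A minimal sketch: two-group LP discriminant model minimizing the sum of
# deviations beyond the group boundary, solved with scipy's linprog.
import numpy as np
from scipy.optimize import linprog

rng = np.random.default_rng(0)
G1 = rng.normal(loc=[0, 0], scale=1.0, size=(30, 2))   # group 1 observations
G2 = rng.normal(loc=[2, 2], scale=1.0, size=(30, 2))   # group 2 observations
n1, n2, p = len(G1), len(G2), 2
n = n1 + n2

# Variables: [w (p, free), c (free), d (n, >= 0)]; minimize sum of deviations d
c_vec = np.concatenate([np.zeros(p + 1), np.ones(n)])

# Group 1: w'x <= c + d_i ; group 2: w'x >= c - d_i
A_ub = np.zeros((n, p + 1 + n))
A_ub[:n1, :p] = G1;  A_ub[:n1, p] = -1.0
A_ub[n1:, :p] = -G2; A_ub[n1:, p] = 1.0
A_ub[np.arange(n), p + 1 + np.arange(n)] = -1.0
b_ub = np.zeros(n)

# Gap normalization (an assumed choice): w'(mean(G2) - mean(G1)) = 1
A_eq = np.zeros((1, p + 1 + n))
A_eq[0, :p] = G2.mean(axis=0) - G1.mean(axis=0)
b_eq = [1.0]

bounds = [(None, None)] * (p + 1) + [(0, None)] * n
res = linprog(c_vec, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq,
              bounds=bounds, method="highs")
w, cut = res.x[:p], res.x[p]
print("weights:", np.round(w, 3), "cutoff:", round(cut, 3))
```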

17.
Recent advances in statistical estimation theory have resulted in the development of new procedures, called robust methods, that can be used to estimate the coefficients of a regression model. Because such methods take into account the impact of discrepant data points during the initial estimation process, they offer a number of advantages over ordinary least squares and other analytical procedures (such as the analysis of outliers or regression diagnostics). This paper describes the robust method of analysis and illustrates its potential usefulness by applying the technique to two data sets. The first application uses artificial data; the second uses a data set analyzed previously by Tufte [15] and, more recently, by Chatterjee and Wiseman [6].
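As a rough illustration of the robust-regression idea (not necessarily the specific estimator applied in the paper), the sketch below fits a Huber-type estimator by iteratively reweighted least squares so that gross outliers are downweighted rather than dominating the fit. Data and tuning constant are illustrative.

```python
# A minimal sketch: robust regression via IRLS with Huber weights.
import numpy as np

rng = np.random.default_rng(0)
n = 60
x = rng.uniform(0, 10, n)
y = 1.0 + 2.0 * x + rng.normal(scale=1.0, size=n)
y[:4] += 30.0                                          # a few gross outliers

X = np.column_stack([np.ones(n), x])

def huber_irls(X, y, k=1.345, iters=50):
    beta = np.linalg.lstsq(X, y, rcond=None)[0]        # OLS start
    for _ in range(iters):
        r = y - X @ beta
        s = np.median(np.abs(r)) / 0.6745 + 1e-12      # robust scale via MAD
        u = np.abs(r / s)
        w = np.where(u <= k, 1.0, k / u)               # Huber weights
        sw = np.sqrt(w)
        beta = np.linalg.lstsq(sw[:, None] * X, sw * y, rcond=None)[0]
    return beta

print("OLS   :", np.round(np.linalg.lstsq(X, y, rcond=None)[0], 3))
print("Huber :", np.round(huber_irls(X, y), 3))
```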

18.
This paper presents a new linear model methodology for clustering judges with homogeneous decision policies and differentiating dimensions which distinguish judgment policies. This linear policy capturing model based on canonical correlation analysis is compared to the standard model based on regression analysis and hierarchical agglomerative clustering. Potential advantages of the new methodology include simultaneous instead of sequential consideration of information in the dependent and independent variable sets, decreased interpretational difficulty in the presence of multicollinearity and/or suppressor/moderator variables, and a more clearly defined solution structure allowing assessment of a judge's relationship to all of the derived, ideal policy types. An application to capturing policies of information systems recruiters responsible for hiring entry-level personnel is used to compare and contrast the two techniques.
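A minimal sketch (not from the paper) of the standard policy-capturing baseline the article compares against: a regression per judge over common cue profiles, followed by hierarchical agglomerative clustering of the estimated weight vectors. Judges, cues, and ratings are simulated for illustration.

```python
# A minimal sketch: per-judge regression policy capturing + agglomerative clustering.
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster

rng = np.random.default_rng(0)
n_profiles, n_cues, n_judges = 40, 3, 10
cues = rng.normal(size=(n_profiles, n_cues))            # candidate cue profiles

# Two latent policy types among the judges (illustrative)
true_policies = np.array([[0.8, 0.1, 0.1], [0.1, 0.1, 0.8]])
weights = []
for j in range(n_judges):
    w = true_policies[j % 2] + rng.normal(scale=0.05, size=n_cues)
    ratings = cues @ w + rng.normal(scale=0.2, size=n_profiles)
    # Capture judge j's policy by regressing ratings on the cues
    X = np.column_stack([np.ones(n_profiles), cues])
    beta = np.linalg.lstsq(X, ratings, rcond=None)[0]
    weights.append(beta[1:])

Z = linkage(np.array(weights), method="ward")
print("cluster membership:", fcluster(Z, t=2, criterion="maxclust"))
```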

19.
A procedure is developed for determining two-group linear discriminant classifiers that misclassify the fewest observations in the training sample. An experimental study confirms the value of this approach.

20.
Stochastic DEA can deal effectively with noise in the non-parametric measurement of efficiency, but unfortunately formal statistical inference on efficiency measures is not possible. In this paper, we provide a Bayesian approach to the problem organized around simulation techniques that allow for finite-sample inferences on efficiency scores. The new methods are applied to efficiency analysis of the Greek banking system for the period 1993–1999. The results show that the majority of the Greek banks operate close to best market practices.
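A minimal sketch (not the paper's Bayesian procedure) of the deterministic, input-oriented CCR DEA envelopment LP that underlies efficiency scoring; the paper's stochastic treatment adds noise and simulation-based inference on top of this building block. The inputs and outputs below are illustrative, not Greek banking data.

```python
# A minimal sketch: input-oriented CCR DEA efficiency scores via linear programming.
import numpy as np
from scipy.optimize import linprog

X = np.array([[4.0, 2.0], [6.0, 3.0], [5.0, 5.0], [8.0, 4.0]])   # inputs  (n x m)
Y = np.array([[10.0], [12.0], [11.0], [13.0]])                    # outputs (n x s)
n, m = X.shape
s = Y.shape[1]

def ccr_efficiency(k):
    # Variables: [theta, lambda_1..lambda_n]; minimize theta
    c = np.concatenate([[1.0], np.zeros(n)])
    A_ub, b_ub = [], []
    for i in range(m):      # sum_j lam_j * x_ij <= theta * x_ik
        A_ub.append(np.concatenate([[-X[k, i]], X[:, i]]))
        b_ub.append(0.0)
    for r in range(s):      # sum_j lam_j * y_rj >= y_rk
        A_ub.append(np.concatenate([[0.0], -Y[:, r]]))
        b_ub.append(-Y[k, r])
    bounds = [(None, None)] + [(0, None)] * n
    res = linprog(c, A_ub=np.array(A_ub), b_ub=b_ub, bounds=bounds, method="highs")
    return res.fun

for k in range(n):
    print(f"unit {k}: efficiency = {ccr_efficiency(k):.3f}")
```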
