Similar Articles
20 similar articles found (search time: 421 ms)
1.
For multiobjective decision problems with weighted objectives, a class of β-preferred ("β-较重") optimal solutions is proposed in the sense of "cone efficiency," and a solution method is given for the case of finitely many alternatives.

2.
Discrete-choice models are widely used to model consumer purchase behavior in assortment optimization and revenue management. In many applications, each customer segment is associated with a consideration set that represents the set of products that customers in this segment consider for purchase. The firm has to decide what assortment to offer at each point in time without the ability to identify the customer's segment. A linear program called the Choice-based Deterministic Linear Program (CDLP) has been proposed to determine these offer sets. Unfortunately, its size grows exponentially in the number of products, and it is NP-hard to solve when the consideration sets of the segments overlap. The Segment-based Deterministic Concave Program with some additional consistency equalities (SDCP+) is an approximation of CDLP that provides an upper bound on CDLP's optimal objective value. SDCP+ can be solved in a fraction of the time required to solve CDLP and often achieves the same optimal objective value. This raises the question of under what conditions one can guarantee equivalence of CDLP and SDCP+. In this study, we obtain a structural result to this end, namely that if the segment consideration sets overlap with a certain tree structure, or if they are fully nested, CDLP can be equivalently replaced with SDCP+. We give a number of examples from the literature where this tree structure arises naturally in modeling customer behavior.
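The segment-level choice behavior described above can be illustrated with a small multinomial-logit computation restricted to a consideration set. This is a generic sketch, not the paper's CDLP/SDCP+ formulation; the product names and utilities are invented:

```python
from math import exp

def choice_probabilities(offer_set, consideration_set, utilities):
    """MNL purchase probabilities for one segment, restricted to the
    products the segment actually considers (no-purchase utility = 0)."""
    considered = offer_set & consideration_set
    # the 1.0 term is the no-purchase alternative, exp(0) = 1
    denom = 1.0 + sum(exp(utilities[p]) for p in considered)
    return {p: exp(utilities[p]) / denom for p in considered}

utilities = {"A": 1.0, "B": 0.5, "C": 2.0}
# product C is offered but not considered by this segment, so it gets no share
probs = choice_probabilities({"A", "B", "C"}, {"A", "B"}, utilities)
```

The firm's difficulty is exactly that different segments apply different consideration sets to the same offered assortment.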

3.
This paper attempts to isolate and analyze the principal ideas of multiobjective optimization. We do this without casting aspersions on single-objective optimization or championing any one multiobjective technique. We examine each fundamental idea for strengths and weaknesses and subject two—efficiency and utility—to extended consideration. Some general recommendations are made in light of this analysis. Besides the simple advice to retain single-objective optimization as a possible approach, we suggest that three broad classes of multiobjective techniques are very promising in terms of reliably, and believably, achieving a most preferred solution. These are: (1) partial generation of the efficient set, a rubric we use to unify a wide spectrum of both interactive and analytic methods; (2) explicit utility maximization, a much-overlooked approach combining multiattribute decision theory and mathematical programming; and (3) interactive implicit utility maximization, the popular class of methods introduced by Geoffrion, Dyer, and Feinberg [24] and extended significantly by others.
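For a finite set of alternatives, the efficiency idea discussed above reduces to filtering out dominated points. A minimal sketch, assuming all objectives are to be minimized:

```python
def efficient_set(points):
    """Return the non-dominated (efficient) points of a finite set,
    assuming every objective is to be minimized."""
    def dominates(a, b):
        # a dominates b: no worse in every objective, strictly better in one
        return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))
    return [p for p in points if not any(dominates(q, p) for q in points)]

pts = [(1, 5), (2, 2), (4, 1), (3, 3), (5, 5)]
eff = efficient_set(pts)   # (3, 3) and (5, 5) are dominated by (2, 2)
```

"Partial generation of the efficient set" then amounts to producing only a subset of such points, guided by interaction with the decision maker.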

4.
In this paper, we present a Pairwise Aggregated Hierarchical Analysis of Ratio-Scale Preferences (PAHAP), a new method for solving discrete alternative multicriteria decision problems. Following the Analytic Hierarchy Process (AHP), PAHAP uses pairwise preference judgments to assess the relative attractiveness of the alternatives. By first aggregating the pairwise judgment ratios of the alternatives across all criteria, and then synthesizing based on these aggregate measures, PAHAP determines overall ratio scale priorities and rankings of the alternatives which are not subject to rank reversal, provided that certain weak consistency requirements are satisfied. Hence, PAHAP can serve as a useful alternative to the original AHP if rank reversal is undesirable, for instance when the system is open and criterion scarcity does not affect the relative attractiveness of the alternatives. Moreover, the single matrix of pairwise aggregated ratings constructed in PAHAP provides useful insights into the decision maker's preference structure. PAHAP requires the same preference information as the original AHP (or, alternatively, the same information as the Referenced AHP, if the criteria are compared based on average (total) value of the alternatives). As it is easier to implement and interpret than previously proposed variants of the conventional AHP which prevent rank reversal, PAHAP also appears attractive from a practitioner's viewpoint.
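For background, AHP-style ratio-scale priorities are derived from a pairwise judgment matrix. The sketch below uses the row geometric mean, which coincides with the textbook eigenvector method on perfectly consistent matrices; it illustrates the kind of input PAHAP shares with AHP, not the PAHAP aggregation itself:

```python
from math import prod

def ahp_priorities(pairwise):
    """Approximate priority weights from a pairwise ratio matrix via the
    row geometric mean (agrees with the eigenvector method when the
    matrix is perfectly consistent)."""
    n = len(pairwise)
    gm = [prod(row) ** (1.0 / n) for row in pairwise]
    total = sum(gm)
    return [g / total for g in gm]

# a perfectly consistent 3x3 judgment matrix with true weights 0.6 : 0.3 : 0.1
M = [[1,   2,   6],
     [1/2, 1,   3],
     [1/6, 1/3, 1]]
w = ahp_priorities(M)
```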

5.
Criteria to protect aquatic life are intended to protect diverse ecosystems, but in practice are usually developed from compilations of single-species toxicity tests using standard test organisms that were tested in laboratory environments. Species sensitivity distributions (SSDs) developed from these compilations are extrapolated to set aquatic ecosystem criteria. The protectiveness of the approach was critically reviewed with a chronic SSD for cadmium comprising 27 species within 21 genera. Within the data set, one genus had lower cadmium effects concentrations than the SSD fifth percentile-based criterion, so in theory this genus, the amphipod Hyalella, could be lost, or at least allowed some level of harm, under this criterion approach. However, population matrix modeling projected only slightly increased extinction risks for a temperate Hyalella population under scenarios similar to the SSD fifth percentile criterion. The criterion value was further compared to cadmium effects concentrations in ecosystem experiments and field studies. Generally, few adverse effects were inferred from ecosystem experiments at concentrations less than the SSD fifth percentile criterion. Exceptions were behavioral impairments in simplified food web studies. No adverse effects were apparent in field studies under conditions that seldom exceeded the criterion. At concentrations greater than the SSD fifth percentile, the magnitudes of adverse effects in the field studies were roughly proportional to the laboratory-based fraction of species with adverse effects in the SSD. Overall, the modeling and field validation comparisons of the chronic criterion values generally supported the relevance and protectiveness of the SSD fifth percentile approach with cadmium.
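The SSD fifth-percentile criterion can be illustrated by fitting a log-normal distribution to chronic effect concentrations and reading off the 5th percentile (often called the HC5). The concentrations below are invented for illustration, not the cadmium data from the study:

```python
from statistics import NormalDist, mean, stdev
from math import log10

def ssd_fifth_percentile(effect_concs):
    """Fit a log-normal species sensitivity distribution to per-species
    chronic effect concentrations and return its 5th percentile (HC5)."""
    logs = [log10(c) for c in effect_concs]
    dist = NormalDist(mean(logs), stdev(logs))
    return 10 ** dist.inv_cdf(0.05)

# hypothetical chronic effect concentrations (ug/L) for ten species
concs = [0.3, 0.8, 1.2, 2.0, 3.5, 5.0, 8.0, 12.0, 20.0, 40.0]
hc5 = ssd_fifth_percentile(concs)
```

A criterion set at the HC5 is, by construction, expected to leave roughly 5% of species unprotected — which is exactly the scenario the study probes with Hyalella.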

6.
This paper develops a framework for performing estimation and inference in econometric models with partial identification, focusing particularly on models characterized by moment inequalities and equalities. Applications of this framework include the analysis of game-theoretic models, revealed preference restrictions, regressions with missing and corrupted data, auction models, structural quantile regressions, and asset pricing models. Specifically, we provide estimators and confidence regions for the set of minimizers Θ_I of an econometric criterion function Q(θ). In applications, the criterion function embodies testable restrictions on economic models. A parameter value θ that describes an economic model satisfies these restrictions if Q(θ) attains its minimum at this value. Interest therefore focuses on the set of minimizers, called the identified set. We use the inversion of the sample analog, Q_n(θ), of the population criterion, Q(θ), to construct estimators and confidence regions for the identified set, and develop consistency, rates of convergence, and inference results for these estimators and regions. To derive these results, we develop methods for analyzing the asymptotic properties of sample criterion functions under set identification.

7.
We present the Integrated Preference Functional (IPF) for comparing the quality of proposed sets of near-Pareto-optimal solutions to bi-criteria optimization problems. Evaluating the quality of such solution sets is one of the key issues in developing and comparing heuristics for multiple objective combinatorial optimization problems. The IPF is a set functional that, given a weight density function provided by a decision maker and a discrete set of solutions for a particular problem, assigns a numerical value to that solution set. This value can be used to compare the quality of different sets of solutions, and therefore provides a robust, quantitative approach for comparing different heuristic, a posteriori solution procedures for difficult multiple objective optimization problems. We provide specific examples of decision maker preference functions and illustrate the calculation of the resulting IPF for specific solution sets and a simple family of combined objectives.
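A discretized version of the IPF idea for a bi-criteria minimization problem, assuming a uniform weight density on [0, 1], can be sketched as follows (a generic illustration, not the paper's exact functional):

```python
def ipf(solution_set, n_grid=1000):
    """Discretized IPF: average, over sampled weights w, of the best
    weighted objective w*f1 + (1-w)*f2 achieved by the solution set
    (both objectives minimized, uniform weight density)."""
    total = 0.0
    for i in range(n_grid):
        w = (i + 0.5) / n_grid  # midpoint rule over the weight interval
        total += min(w * f1 + (1 - w) * f2 for f1, f2 in solution_set)
    return total / n_grid

good = [(1, 5), (2, 2), (5, 1)]
worse = [(3, 5), (4, 4), (5, 3)]   # uniformly worse trade-off curve
```

A lower IPF means the set serves the whole range of possible decision-maker weights better, which is what makes the functional usable as a single quality score for heuristic comparisons.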

8.
A combinatorial optimization problem, called the Bandpass Problem, is introduced. Given a rectangular matrix A of binary elements {0,1} and a positive integer B called the Bandpass Number, a set of B consecutive non-zero elements in any column is called a Bandpass. No two bandpasses in the same column can have common rows. The Bandpass problem consists of finding an optimal permutation of rows of the matrix, which produces the maximum total number of bandpasses having the same given bandpass number in all columns. This combinatorial problem arises in considering the optimal packing of information flows on different wavelengths into groups to obtain the highest available cost reduction in designing and operating optical communication networks using wavelength-division multiplexing technology. Integer programming models of two versions of the bandpass problem are developed. For a matrix A with three or more columns the Bandpass problem is proved to be NP-hard. For matrices with two or one column a polynomial algorithm solving the problem to optimality is presented. For the general case, fast heuristic polynomial algorithms are presented, which provide near-optimal solutions acceptable for applications. The high quality of the generated heuristic solutions has been confirmed in extensive computational experiments. As an NP-hard combinatorial optimization problem with important applications, the Bandpass problem offers a challenge for researchers to develop efficient computational solution methods. To encourage further research, a Library of Bandpass Problems has been developed. The Library is open to the public and consists of 90 problems of different sizes (numbers of rows, columns, density of non-zero elements of matrix A, and bandpass number B), half of them with known optimal solutions and the other half without.
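Counting bandpasses for a fixed row order is the easy part; the optimization is over row permutations. A sketch of the counting step, enforcing the rule that bandpasses in the same column share no rows by counting non-overlapping runs:

```python
def count_bandpasses(matrix, B):
    """Count non-overlapping runs of B consecutive 1s in each column of
    a 0/1 matrix, with rows taken in their current order."""
    total = 0
    for col in range(len(matrix[0])):
        run = 0
        for row in matrix:
            if row[col] == 1:
                run += 1
                if run == B:   # bandpass completed; reset so runs never overlap
                    total += 1
                    run = 0
            else:
                run = 0
    return total

A = [[1, 1],
     [1, 0],
     [1, 1],
     [0, 1],
     [1, 1]]
```

An exhaustive solver would evaluate `count_bandpasses` over all row permutations, which is exactly what becomes intractable for three or more columns.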

9.
Cluster-based segmentation usually involves two sets of variables: (i) the needs-based variables (referred to as the bases variables), which are used in developing the original segments to identify the value, and (ii) the classification or background variables, which are used to profile or target the customers. The managers' goal is to utilize these two sets of variables in the most efficient manner. Pragmatic managerial interests recognize the underlying need to start shifting from methodologies that obtain highly precise value-based segments but may be of limited practical use as they provide less targetable segments. Consequently, the imperative is to shift toward newer segmentation approaches that provide greater focus on targetable segments while maintaining homogeneity. This requires dual objective segmentation, which is a combinatorially difficult problem. Hence, we propose and examine a new evolutionary methodology based on genetic algorithms to address this problem. We show, based on a large-scale Monte Carlo simulation and a case study, that the proposed approach consistently outperforms the existing methods for a wide variety of problem instances. We are able to obtain statistically significant and managerially important improvements in targetability with little diminution in the identifiability of value-based segments. Moreover, the proposed methodology provides a set of good solutions, unlike existing methodologies that provide a single solution. We also show how these good solutions can be used to plot an efficient Pareto frontier. Finally, we present useful insights that would help managers in implementing the proposed solution approach effectively.

10.
Solving multicriteria decision making problems often requires the assessment of certain preferential information. On some occasions, this information must be given by several individuals or social groups, and these individual assessments need to be aggregated into single global preferences. This cardinal preference aggregation problem has been tackled using different techniques, including multicriteria decision making ones. In this paper, a Meta-Goal Programming approach is proposed, where different target values can be set on several achievement functions that measure the goodness of the global assessments. This methodology presents strong advantages due to its modeling flexibility and its ability to find balanced solutions. The proposed approach is demonstrated with an illustrative example and a series of computational experiments, and it is shown that the Meta-Goal Programming method produces results with better values of the achievement functions than other classical and multicriteria approaches.

11.
We consider the problem of defining a strategy consisting of a set of facilities, taking into account also the location where they have to be assigned and the time at which they have to be activated. The facilities are evaluated with respect to a set of criteria. The plan has to be devised respecting some constraints related to different aspects of the problem, such as precedence restrictions due to the nature of the facilities. Among the constraints, there are some related to the available budget. We also consider the uncertainty related to the performances of the facilities with respect to the considered criteria and the plurality of stakeholders participating in the decision. The considered problem can be seen as the combination of several prototypical operations research problems: the knapsack problem, the location problem, and project scheduling. Indeed, the basic brick of our model is a variable x_{ilt} which takes value 1 if facility i is activated in location l at time t, and 0 otherwise. Due to the joint consideration of a location and a time in the decision variables, what we propose can be seen as a general space-time model for operations research problems. We discuss how such a model makes it possible to handle complex problems using several methodologies, including multiple attribute value theory and multiobjective optimization. With respect to the latter point, without loss of generality, we consider compromise programming and an interactive methodology based on the Dominance-based Rough Set Approach. We illustrate the application of our model with a simple didactic example.

12.
E-government refers to the use of information and communication technologies (ICT) by governments to provide digital services to citizens and businesses over the Internet, at the local, national, or international level. Benchmarking and assessing e-government is therefore necessary to monitor performance and progress by individual countries and identify areas for improvement. Although such measurements have already been initiated by various organizations, they scarcely highlight the multidimensional nature of the assessment. This paper outlines a multicriteria methodology to evaluate e-government using a system of eight evaluation criteria that are built on four points of view: (1) infrastructures, (2) investments, (3) e-processes, and (4) users' attitude. The overall evaluation is obtained through an additive value model which is assessed with the involvement of a single decision maker–evaluator and the use of a multicriteria ordinal regression approach. Specifically, the UTA II method is used, whose interactive application process is divided into two phases. Its implementation is supported by MIIDAS (multicriteria interactive intelligent decision aiding system). This research work aims at supporting potential stakeholders to perform a global e-government evaluation, based on their own viewpoints and preferences. Finally, 21 European countries are evaluated and ranked considering the latest criteria data.

13.
We study a decentralized assembly supply chain in which an assembler (she) assembles a set of n components, each produced by a different supplier (he), into a final product to satisfy an uncertain market demand. Each supplier holds cost information that is private to himself, for which the assembler only has a subjective estimate. Furthermore, the assembler believes that the suppliers' costs follow a joint discrete probability distribution. The assembler aims to design an optimal menu of contracts to maximize her own expected profit. The assembler's problem is a complex multi-dimensional constrained optimization problem. We prove that there exists a unique optimal menu of contracts for the assembler, and we further develop an efficient algorithm with a complexity of O(n) to compute the optimal contract. In addition, we conduct a comprehensive sensitivity analysis to analyze how environmental parameters affect individual firms' performance and the value of information to the assembler, to each supplier, and to the supply chain. Our results suggest that each supplier's private cost information becomes more valuable to the assembler and each supplier when the average market demand increases or when the final product unit revenue increases. Surprisingly, when a supplier's cost volatility increases and its mean remains the same, the value of information to the assembler or to each supplier does not necessarily increase. Furthermore, we show that when the suppliers' cost distributions become more positively correlated, the suppliers are always worse off, but the assembler is better off. However, the value of information for the assembler might increase or decrease.

14.
Hybridization techniques are very effective for the solution of combinatorial optimization problems. This paper presents a genetic algorithm based on the Expanding Neighborhood Search technique (Marinakis, Migdalas, and Pardalos, Computational Optimization and Applications, 2004) for the solution of the traveling salesman problem. The initial population of the algorithm is created not entirely at random but rather using a modified version of the Greedy Randomized Adaptive Search Procedure. Furthermore, a stopping criterion based on Lagrangean Relaxation is proposed. The combination of these different techniques produces high quality solutions. The proposed algorithm was tested on numerous benchmark problems from TSPLIB with very satisfactory results. Comparisons with the algorithms of the DIMACS Implementation Challenge are also presented.

15.
In a make-to-order product recovery environment, we consider the allocation decision for returned products under stochastic demand of a firm with three options: refurbishing to resell, parts harvesting, and recycling. We formulate the problem as a multiperiod Markov decision process (MDP) and present a linear programming (LP) approximation that provides an upper bound on the optimal objective function value of the MDP model. We then present two solution approaches to the MDP using the LP solution: a static approach that uses the LP solution directly and a dynamic approach that adopts a revenue management perspective and employs the bid-price control technique, where the LP is resolved after each demand arrival. We calculate the bid prices based on the shadow price interpretation of the dual variables for the inventory constraints and accept a demand if the marginal value is higher than the bid price. Since solving the LP at each demand arrival requires a very efficient solution procedure, we present a transportation problem formulation of the LP via variable redefinitions and develop a one-pass optimal solution procedure for it. We carry out an extensive numerical analysis to compare the two approaches and find that the dynamic approach provides better performance in all of the tested scenarios. Furthermore, the solutions obtained are within 2% of the upper bound on the optimal objective function value of the MDP model.
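The bid-price acceptance rule described above can be sketched simply: accept an arriving demand only if its revenue covers the shadow-price value of the inventory it consumes. The resource names and prices below are invented for illustration:

```python
def accept_demand(revenue, usage, bid_prices):
    """Bid-price control: accept a demand only if its revenue covers
    the opportunity cost of the inventory it consumes, where bid prices
    are the LP duals of the inventory constraints."""
    opportunity_cost = sum(bid_prices[r] * q for r, q in usage.items())
    return revenue >= opportunity_cost

bid = {"core_A": 30.0, "part_X": 12.0}   # hypothetical shadow prices
ok = accept_demand(50.0, {"core_A": 1, "part_X": 1}, bid)   # 50 >= 42
no = accept_demand(35.0, {"core_A": 1, "part_X": 1}, bid)   # 35 <  42
```

In the paper's dynamic approach the LP (and hence the bid prices) is re-solved after every arrival, which is why the fast transportation-problem reformulation matters.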

16.
Herbert Moskowitz, Omega, 1982, 10(6): 647-661
Linear aggregation models employing unit and equal weights have been shown to be superior to human decisions in a surprising range of decision situations. In addition, decisions based on these models have often been found to be superior to those based on linear regression models (LRMs). This general issue was explored for repetitive decisions in production planning. The problem considered differs in several aspects from the types of problems investigated previously: (1) the problem is dynamic rather than static; (2) a set (or vector) of interactive decisions dependent on previous decisions is required to be made, where a decision in stage t, the dependent variable, becomes an independent variable in stage t + 1; and (3) the criterion function is cost with a quadratic loss function (rather than the correlation measure R²). Moreover, since repetitive decisions were involved, the parameters of the models were estimated using past human decisions. These were used to predict specific values of the decision variables (rather than rank order), which in turn were employed recursively to predict values of the decision variables at subsequent stages. While decisions from equal weighting rules were found to be superior to human decisions and not greatly inferior to decisions from linear regression models, decisions from unit weighting rules performed poorly. The rationale for such performance is discussed, indicating that previous theoretical and empirical research on linear weighting models is not generally applicable to dynamic, multivariate interactive decision problems with lagged variables.

17.
Invasive species risk maps provide broad guidance on where to allocate resources for pest monitoring and regulation, but they often present individual risk components (such as climatic suitability, host abundance, or introduction potential) as independent entities. These independent risk components are integrated using various multicriteria analysis techniques that typically require prior knowledge of the risk components' importance. Such information is often nonexistent for many invasive pests. This study proposes a new approach for building integrated risk maps using the principle of a multiattribute efficient frontier and analyzing the partial order of elements of a risk map as distributed in multidimensional criteria space. The integrated risks are estimated as subsequent multiattribute frontiers in dimensions of individual risk criteria. We demonstrate the approach with the example of Agrilus biguttatus Fabricius, a high-risk pest that may threaten North American oak forests in the near future. Drawing on U.S. and Canadian data, we compare the performance of the multiattribute ranking against a multicriteria linear weighted averaging technique in the presence of uncertainties, using the concept of robustness from info-gap decision theory. The results show major geographic hotspots where the consideration of tradeoffs between multiple risk components changes integrated risk rankings. Both methods delineate similar geographical regions of high and low risks. Overall, aggregation based on a delineation of multiattribute efficient frontiers can be a useful tool to prioritize risks for anticipated invasive pests, which usually have an extremely poor prior knowledge base.
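Ranking by successive multiattribute efficient frontiers amounts to peeling off non-dominated layers, as in non-dominated sorting. A sketch with invented risk-component values (higher values = higher risk), not the study's actual data:

```python
def frontier_ranks(points):
    """Rank elements by successive efficient frontiers: rank 1 is the
    non-dominated set, rank 2 is non-dominated after removing rank 1,
    and so on. Here larger criterion values mean greater risk."""
    def dominates(a, b):
        # a dominates b: at least as risky in every criterion, riskier in one
        return all(x >= y for x, y in zip(a, b)) and any(x > y for x, y in zip(a, b))
    ranks, remaining = {}, dict(enumerate(points))
    level = 1
    while remaining:
        front = [i for i, p in remaining.items()
                 if not any(dominates(q, p) for j, q in remaining.items() if j != i)]
        for i in front:
            ranks[i] = level
            del remaining[i]
        level += 1
    return [ranks[i] for i in range(len(points))]

# (climatic suitability, host abundance, introduction potential) per map cell
cells = [(0.9, 0.8, 0.7), (0.4, 0.9, 0.2), (0.3, 0.3, 0.1), (0.2, 0.2, 0.05)]
```

Unlike weighted averaging, no importance weights are needed: the partial order alone determines the layers, which is the appeal when weight information is unavailable.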

18.
The design of distributed computer systems (DCSs) requires compromise among several conflicting objectives. For instance, high system availability conflicts with low cost, which in turn conflicts with quick response time. This paper presents an approach, based on multi-criteria decision-making techniques, to arrive at a good design in this multiobjective environment. An interactive procedure is developed to support the decision making of system designers. Starting from an initial solution, the procedure presents a sequence of non-dominated vectors to designers, allowing them to systematically explore alternative possibilities on the path to a final design. The model user has control over trade-offs among different design objectives. This paper focuses on the details of the mathematical model used to provide decision support. Accordingly, a formulation of DCS design as a multicriteria decision problem is developed. The exchange search heuristic used to generate nondominated solutions is also presented. We argue that multicriteria models provide a more realistic formulation of the DCS design problem than the single-criterion models used widely in the literature. While obtaining a clear definition of design objectives (single or multiple) is an important activity, by explicitly acknowledging the trade-offs among multiple objectives in the design process, our methodology is more likely to produce a better overall design than methods addressing a single criterion in isolation.

19.
This paper evaluates the applicability of different multi-objective optimization methods for environmentally conscious supply chain design. We analyze a case study with three objectives: costs, CO2 emissions, and fine dust (also known as PM, particulate matter) emissions. We approximate the Pareto front using the weighted sum and epsilon constraint scalarization methods with pre-defined or adaptively selected parameters, two popular evolutionary algorithms, SPEA2 and NSGA-II, with different selection strategies, and their interactive counterparts that incorporate the Decision Maker's (DM's) indirect preferences into the search process. Within this case study, the CO2 emissions could be lowered significantly by accepting a marginal increase of costs over their global minimum. NSGA-II and SPEA2 enabled faster estimation of the Pareto front, but produced significantly worse solutions than the exact optimization methods. The interactive methods outperformed their a posteriori counterparts, and could discover solutions corresponding better to the DM's preferences. In addition, by appropriately adjusting the elicitation interval and the starting generation of the elicitation, the number of pairwise comparisons needed by the interactive evolutionary methods to construct a satisfactory solution could be decreased.
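Epsilon-constraint scalarization on a finite set of candidate designs can be sketched as follows. The cost/CO2 numbers are invented, and the fine-dust objective is omitted to keep the example bi-criteria:

```python
def epsilon_constraint(solutions, eps_co2):
    """Epsilon-constraint scalarization on a finite design set:
    minimize cost subject to CO2 <= eps_co2."""
    feasible = [s for s in solutions if s["co2"] <= eps_co2]
    return min(feasible, key=lambda s: s["cost"]) if feasible else None

designs = [{"cost": 100, "co2": 90},
           {"cost": 105, "co2": 60},   # small cost increase, big CO2 cut
           {"cost": 140, "co2": 55}]
best = epsilon_constraint(designs, eps_co2=70)
```

Sweeping `eps_co2` over a range traces out Pareto-optimal points one at a time; this is the trade-off the case study observes, where a marginal cost increase buys a large CO2 reduction.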

20.
Jang W. Ra, Decision Sciences, 1999, 30(2): 581-599
The pairwise comparison technique is a building block of the Analytic Hierarchy Process (AHP), which is widely used for multicriteria decision analysis. This paper develops a shortcut technique in which only n paired comparisons forming a closed chain are needed for n decision elements. Together with the development of a simple and intuitive measure of (in)consistency, this technique derives the relative weights of decision elements via easy step-by-step calculations in a spreadsheet format. Its performance has been tested on Saaty's wealth of nations example. Notably, the rankings and weights yielded by this alternative technique are identical to Harker's incomplete pairwise comparison solution for the same chain orientation for the example tested.
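The closed-chain idea can be sketched as follows: given ratios a1/a2, a2/a3, ..., an/a1, the weights follow by chaining, and the product of all ratios, which equals 1 for a perfectly consistent judge, provides a natural consistency check. This is a generic reconstruction of the idea, not Ra's exact inconsistency measure:

```python
from math import prod

def chain_weights(chain_ratios):
    """Derive relative weights of n elements from n pairwise judgments
    a1/a2, a2/a3, ..., an/a1 forming a closed chain; also return the
    product of all chain ratios (equal to 1 when perfectly consistent)."""
    w = [1.0]
    for r in chain_ratios[:-1]:       # w_{k+1} = w_k / (a_k / a_{k+1})
        w.append(w[-1] / r)
    total = sum(w)
    weights = [x / total for x in w]
    consistency_product = prod(chain_ratios)
    return weights, consistency_product

# consistent chain for true weights 4 : 2 : 1  ->  ratios 2, 2, 1/4
weights, cp = chain_weights([2.0, 2.0, 0.25])
```

The saving over a full pairwise matrix is substantial: n judgments instead of n(n-1)/2, at the price of trusting a single chain orientation.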
