Similar Documents
20 similar documents found (search time: 15 ms)
1.

This paper discusses the process of designing a tabu search-based heuristic for the two-stage flow shop problem with makespan minimization as the primary criterion and the minimization of total flow time as the secondary criterion. A factorial experiment is designed to analyse thoroughly the effects of four different factors, i.e. the initial solution, the type of move, the size of the neighbourhood and the list size, on the performance of the tabu search-based heuristic. Using the techniques of evolution curves, response tables and response graphs, coupled with the Taguchi method, the best combination of the factors for the tabu search-based heuristic is identified, and the effectiveness of the heuristic algorithm in finding an optimal solution is evaluated by comparing its performance with the best-known heuristic for this problem.
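As an illustration of the kind of procedure this abstract describes, the following is a minimal Python sketch (not the authors' implementation): the random initial sequence, the adjacent-swap move, the full neighbourhood and the tabu tenure stand in for the four factors studied, and the lexicographic objective (makespan first, total flow time second) is encoded by comparing tuples. All processing times are illustrative.

```python
import random

def evaluate(seq, p1, p2):
    """Makespan and total flow time of a permutation in a 2-machine flow shop."""
    t1 = t2 = flow = 0
    for j in seq:
        t1 += p1[j]                  # machine 1 finishes job j
        t2 = max(t1, t2) + p2[j]     # machine 2 starts once both are free
        flow += t2
    return t2, flow                  # (makespan, total flow time)

def tabu_search(p1, p2, iters=500, tenure=7, seed=0):
    rng = random.Random(seed)
    n = len(p1)
    current = list(range(n))
    rng.shuffle(current)             # random initial solution (one of the studied factors)
    best, best_val = current[:], evaluate(current, p1, p2)
    tabu = {}                        # move -> iteration until which it stays tabu
    for it in range(iters):
        candidates = []
        for i in range(n - 1):       # neighbourhood: adjacent pairwise interchanges
            move = (current[i], current[i + 1])
            neigh = current[:]
            neigh[i], neigh[i + 1] = neigh[i + 1], neigh[i]
            val = evaluate(neigh, p1, p2)
            if tabu.get(move, -1) < it or val < best_val:   # aspiration criterion
                candidates.append((val, neigh, move))
        if not candidates:
            break
        val, neigh, move = min(candidates)   # lexicographic: makespan, then flow time
        current = neigh
        tabu[(move[1], move[0])] = it + tenure   # forbid the reverse move for a while
        if val < best_val:
            best, best_val = neigh[:], val
    return best, best_val

# Example: 5 jobs with processing times on machines 1 and 2
p1, p2 = [4, 2, 6, 3, 5], [3, 7, 2, 5, 4]
print(tabu_search(p1, p2))
```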

2.

The aim of this paper is to propose a new approach to the issue of positioning and pricing a firm's new product line; it is shown here how to conceive an artificial neural system able to find the optimal solution. The formulation of the problem follows the model proposed by Dobson and Kalish, while the implementation is based on the Hopfield neural network, which is thus applied here to a new class of problems. The model shows a capacity to converge to the optimal solution; besides being effective, the technique also appears to be efficient when applied to such a production problem.

3.

In this paper, we propose a productivity model for solving the machine-part grouping problem in cellular manufacturing (CM) systems. First, a non-linear 0-1 integer programming model is developed to identify machine groups and part families simultaneously. This model aims to maximize the system productivity, defined as the ratio of total output to total material handling cost. Second, an efficient simulated annealing (SA) algorithm is developed to solve large-scale problems. This algorithm provides several advantages over existing algorithms. It forms part families and machine cells simultaneously, and it considers production volume, sales price, the maximum number of machines in each cell and total material handling cost. The proposed SA is also able to determine the optimum number of manufacturing cells. The performance of the developed models is tested on eight problems of different sizes and complexities selected from the literature. The results show the superiority of the SA algorithm over the mathematical programming model in both productivity and computational time.
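A rough Python sketch of this kind of annealing move scheme follows. The data, the handling-cost parameter and the surrogate objective (total output divided by inter-cell handling cost) are hypothetical simplifications of the paper's non-linear 0-1 model, used only to show the mechanics of simultaneously reassigning machines and parts to cells.

```python
import math, random

# Toy data (hypothetical): rows = machines, columns = parts, entries = routed volume
volume = [
    [10, 0, 8, 0],
    [12, 0, 9, 0],
    [0, 15, 0, 7],
    [0, 14, 0, 6],
]
UNIT_HANDLING_COST = 2.0   # assumed cost per unit moved between cells
N_CELLS = 2

def productivity(m_cell, p_cell):
    """Simplified surrogate: total output / total inter-cell material handling cost."""
    output, handling = 0.0, 1e-6            # small constant avoids division by zero
    for i, row in enumerate(volume):
        for j, v in enumerate(row):
            if v == 0:
                continue
            output += v
            if m_cell[i] != p_cell[j]:      # operation performed outside the part's cell
                handling += UNIT_HANDLING_COST * v
    return output / handling

def simulated_annealing(iters=2000, t0=5.0, alpha=0.999, seed=1):
    rng = random.Random(seed)
    m_cell = [rng.randrange(N_CELLS) for _ in volume]       # machine -> cell
    p_cell = [rng.randrange(N_CELLS) for _ in volume[0]]    # part -> cell
    best = cur = productivity(m_cell, p_cell)
    best_sol = (m_cell[:], p_cell[:])
    temp = t0
    for _ in range(iters):
        # neighbour: reassign one random machine or part to a random cell
        assign = m_cell if rng.random() < 0.5 else p_cell
        idx = rng.randrange(len(assign))
        old = assign[idx]
        assign[idx] = rng.randrange(N_CELLS)
        cand = productivity(m_cell, p_cell)
        if cand >= cur or rng.random() < math.exp((cand - cur) / temp):
            cur = cand
            if cand > best:
                best, best_sol = cand, (m_cell[:], p_cell[:])
        else:
            assign[idx] = old                                # reject: undo the move
        temp *= alpha
    return best, best_sol

print(simulated_annealing())
```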

4.

Simulated annealing (SA) has been widely used to solve hard combinatorial optimization problems during the last decade. The application of SA requires the initialization of certain parameters and an initial solution from which to start the search. Emphasis in applications of SA has been on determining the best initial values of the parameters; the starting solution, however, has traditionally been generated randomly. In this paper, we study the effects of the quality of the starting solution and the use of dominance rules on the performance of SA. We show that the better the initial solution used by the SA, the better the final solution it produces, i.e. the level of improvement achieved by the SA depends on the quality of the initial solution. This is demonstrated using various parallel processor scheduling problems. We have also found that dominance rules (applied in conjunction with SA) may in some cases lead to further improved solutions, but their inclusion in an SA scheme must be determined during the preliminary experimentation, in parallel with the determination of the best SA-parameter values. These findings have a long-term impact, as they suggest that the performance of SA schemes currently available in the literature can be further improved by starting from a good solution, if available, or by implementing SA in a sequential manner, i.e. running it several times, each time starting from the best solution found in the previous run.
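The central finding, that SA started from a better solution tends to end at a better solution, can be reproduced on a toy identical-parallel-machine makespan problem. The sketch below is only illustrative: LPT plays the role of the "good" start, the instance is random, and no dominance rules are applied.

```python
import math, random

def makespan(assignment, proc, m):
    """Makespan of assigning jobs to m identical parallel machines."""
    loads = [0] * m
    for job, mach in enumerate(assignment):
        loads[mach] += proc[job]
    return max(loads)

def lpt_initial(proc, m):
    """Longest Processing Time rule: a good (non-random) starting solution."""
    assignment, loads = [0] * len(proc), [0] * m
    for job in sorted(range(len(proc)), key=lambda j: -proc[j]):
        mach = loads.index(min(loads))
        assignment[job] = mach
        loads[mach] += proc[job]
    return assignment

def sa(start, proc, m, iters=3000, t0=10.0, alpha=0.999, seed=0):
    rng = random.Random(seed)
    cur, cur_val = start[:], makespan(start, proc, m)
    best, best_val = cur[:], cur_val
    temp = t0
    for _ in range(iters):
        cand = cur[:]
        cand[rng.randrange(len(proc))] = rng.randrange(m)   # move one job to a random machine
        val = makespan(cand, proc, m)
        if val <= cur_val or rng.random() < math.exp((cur_val - val) / temp):
            cur, cur_val = cand, val
            if val < best_val:
                best, best_val = cand[:], val
        temp *= alpha
    return best, best_val

rng = random.Random(42)
proc = [rng.randint(1, 50) for _ in range(40)]
m = 4
random_start = [rng.randrange(m) for _ in proc]
print("SA from random start:", sa(random_start, proc, m)[1])
print("SA from LPT start:   ", sa(lpt_initial(proc, m), proc, m)[1])
```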

5.

In this paper, a Multi-Objective Genetic Algorithm (MOGA) is proposed to derive the optimal machine-wise priority dispatching rules (pdrs) for resolving the conflict among contending jobs in the Giffler and Thompson (GT) procedure applied to job shop problems. The performance criterion considered is the weighted sum of multiple objectives: minimization of makespan, minimization of the total idle time of machines and minimization of total tardiness. The weights used to combine the objectives into a scalar fitness function are not constant; they are specified randomly for each evaluation. This leads to a multidirectional search in the proposed MOGA, which mitigates entrapment of the solution in local minima. The applicability and usefulness of the proposed methodology for the scheduling of job shops is illustrated with 28 benchmark problems available in the open literature.
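The random-weight scalarization described here can be sketched as follows. The three objective values are assumed to be already computed for a candidate schedule, and the weights are redrawn for every fitness evaluation, which is what gives the search its multidirectional character.

```python
import random

def scalar_fitness(makespan, total_idle, total_tardiness, rng=random):
    """Weighted sum with weights drawn anew for each evaluation (to be minimized)."""
    w = [rng.random() for _ in range(3)]
    s = sum(w)
    w = [wi / s for wi in w]                 # normalize so the weights sum to 1
    return w[0] * makespan + w[1] * total_idle + w[2] * total_tardiness

# Example: two candidate schedules evaluated with independently drawn weights
print(scalar_fitness(120, 35, 18))
print(scalar_fitness(118, 40, 25))
```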

6.
Portfolio performance evaluation is a topic of intense academic interest. Within the classical economic framework, this paper gives an explicit definition of portfolio efficiency based on the true efficient frontier. Because of the influence of the real investment environment, portfolio optimization models are highly complex and analytical solutions for the true frontier are difficult to obtain, which greatly hinders the application of portfolio efficiency. Based on portfolio theory, and in the case where the frontier corresponding to the portfolio model is a concave function, this paper adopts a data-based portfolio DEA evaluation model to construct a frontier that approximates the true frontier, thereby estimating portfolio efficiency in the general case. On this basis, the evaluation of portfolio efficiency in the presence of transaction costs is studied, and a numerical example illustrates the rationality and feasibility of the proposed method.  相似文献
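A hedged sketch of the data-based DEA step: each observed portfolio is treated as a DMU with risk as the input and expected return as the output, and a variable-returns-to-scale envelopment LP gives its efficiency against the empirical frontier. The data, the single-input/single-output choice and the omission of transaction costs are illustrative assumptions, not the paper's model.

```python
import numpy as np
from scipy.optimize import linprog

# Hypothetical portfolios: input = risk (variance), output = expected return
risk = np.array([0.04, 0.09, 0.02, 0.06, 0.12])
ret  = np.array([0.08, 0.11, 0.05, 0.10, 0.12])
n = len(risk)

def dea_efficiency(k):
    """Input-oriented VRS (BCC) efficiency of portfolio k: minimize theta."""
    # Decision variables: [theta, lambda_1, ..., lambda_n]
    c = np.zeros(n + 1)
    c[0] = 1.0
    # sum(lambda_j * risk_j) - theta * risk_k <= 0
    a1 = np.concatenate(([-risk[k]], risk))
    # -sum(lambda_j * ret_j) <= -ret_k  (composite output at least at DMU k's level)
    a2 = np.concatenate(([0.0], -ret))
    A_ub = np.vstack([a1, a2])
    b_ub = np.array([0.0, -ret[k]])
    A_eq = np.concatenate(([0.0], np.ones(n))).reshape(1, -1)   # convexity: sum lambda = 1
    b_eq = np.array([1.0])
    bounds = [(None, None)] + [(0, None)] * n
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq,
                  bounds=bounds, method="highs")
    return res.fun

for k in range(n):
    print(f"portfolio {k}: efficiency = {dea_efficiency(k):.3f}")
```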

7.
黄帝, 周泓. 《中国管理科学》 2018, 26(10): 102-112
The collection and remanufacturing of used products is typically subject to uncertainty in collection quality, remanufacturing cost, remanufacturing yield and demand for remanufactured products, which greatly increases the complexity of remanufacturing production decisions. This paper studies the two-stage joint collection-remanufacturing optimization problem in a collection-remanufacturing system with multiple quality grades of returns, and extends it to two further settings: price-dependent demand and stochastic remanufacturing yield. With the objective of maximizing the remanufacturer's expected profit, an efficient production frontier of the remanufacturing system is constructed from the unit collection and remanufacturing cost of each quality grade, analytical solutions for the remanufacturer's optimal collection quantities and selling price are derived for the different decision settings, and the effects of the main parameters on the optimal decisions are analysed. The results show that: (1) a remanufacturing system with multiple quality grades of returns has a convex efficient production frontier, and returns of any quality grade not on this frontier will not be used for remanufacturing; (2) for the same level of government subsidy, a collection subsidy affects the remanufacturer's decisions more strongly than a remanufacturing subsidy; (3) when market demand for remanufactured products is price-dependent, the optimal selling price is at least greater than the marginal collection and remanufacturing cost of the first quality grade used; (4) any two quality grades exhibit a substitution or complementarity effect determined by their cost difference, and this effect grows as demand uncertainty increases; (5) there is a "hedging" effect between yield uncertainty and demand uncertainty for remanufactured products, which weakens as yield uncertainty decreases. The findings provide useful managerial insights for the collection and production decisions of remanufacturing firms under uncertainty.  相似文献

8.

From the available literature, there seems to be no well-defined approach to the resource smoothing exercise apart from those attempted by Weist (1967, Management Science, 13, B359-B377) and Burgess and Killebrew (1962, Journal of Industrial Engineering, 13, 76-83). The aim of the smoothing exercise is to achieve optimal resource usage by avoiding high peaks and deep valleys in the project resource profile. The general approach has always been to move some activities with float in the high-peak regions so that they start at a later date; as this is done, the valleys are filled and the resource profile is smoothed, subject of course to the time constraint. If this approach is followed as it stands, it is difficult to determine optimality, especially when many resources are involved. A cost minimization approach is proposed in the present study, with no limitation on the number of resource inputs. Where the resources are assumed to have the same value, the cost assigned to each of them should be similar. The method follows the general concept but with a difference: the cost of the activity in question is considered. The exercise continues until all the floats are exhausted; the optimum result is then the one with the minimum cost profile. In the examples used for evaluation, the results obtained are comparable to those of the two studies above, with better results in the majority of cases.
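For reference, the classical shifting idea can be sketched as below, here in the Burgess-Killebrew spirit with squared daily usage as the smoothness measure rather than the paper's cost criterion; activity data and floats are hypothetical and precedence constraints are ignored for brevity.

```python
# Each activity: earliest start, total float (days), duration, resource rate per day
activities = {
    "A": {"es": 0, "float": 0, "dur": 3, "rate": 4},
    "B": {"es": 0, "float": 4, "dur": 2, "rate": 6},
    "C": {"es": 3, "float": 2, "dur": 4, "rate": 3},
    "D": {"es": 3, "float": 5, "dur": 2, "rate": 5},
}
HORIZON = 12

def profile(starts):
    """Daily resource usage for the given start times."""
    usage = [0] * HORIZON
    for name, a in activities.items():
        for d in range(starts[name], starts[name] + a["dur"]):
            usage[d] += a["rate"]
    return usage

def smoothness(starts):
    return sum(u * u for u in profile(starts))      # squared-usage criterion

starts = {name: a["es"] for name, a in activities.items()}
improved = True
while improved:                                     # shift activities within their float
    improved = False
    for name, a in activities.items():
        best_s, best_val = starts[name], smoothness(starts)
        for s in range(a["es"], a["es"] + a["float"] + 1):
            starts[name] = s
            val = smoothness(starts)
            if val < best_val:
                best_s, best_val = s, val
                improved = True
        starts[name] = best_s

print("start times:", starts)
print("profile:    ", profile(starts))
```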

9.
We consider the problem of scheduling operations in bufferless robotic cells that produce identical parts using either single-gripper or dual-gripper robots. The objective is to find a cyclic sequence of robot moves that minimizes the long-run average time to produce a part or, equivalently, maximizes the throughput. Obtaining an efficient algorithm for an optimum k-unit cyclic solution (k ≥ 1) has been a longstanding open problem. For both single-gripper and dual-gripper cells, the approximation algorithms in this paper provide the best-known performance guarantees (obtainable in polynomial time) for an optimal cyclic solution. We provide two algorithms with running time linear in the number of machines: for single-gripper cells (respectively, dual-gripper cells), the performance guarantee is 9/7 (respectively, 3/2). The domain considered is free-pickup cells with constant intermachine travel time. Our structural analysis is an important step toward resolving the complexity status of finding an optimal cyclic solution in either a single-gripper or a dual-gripper cell. We also identify optimal cyclic solutions for a variety of special cases. Our analysis provides production managers with valuable insights into the schedules that maximize productivity for both single-gripper and dual-gripper cells for any combination of processing requirements and physical parameters.

10.

In a JIT manufacturing environment it may be desirable to learn from an archived history of data that contains information reflecting less than optimal factory performance. The purpose of this paper is to use rule induction to predict JIT factory performance from past data that reflects both poor (saturated or starved) and good (efficient) factory performance. Inductive learning techniques have previously been applied to JIT production systems (Markham et al., Computers and Industrial Engineering, 34, 717-726, 1998; Markham et al., International Journal of Manufacturing Technology Management, 11(4), 239-246, 2000), but only to data sets that reflected a well-performing factory. This paper presents an approach based on inductive learning in a JIT manufacturing environment that (1) accurately classifies and predicts factory performance based on shop factors, and (2) identifies the important relationships between the shop factors that determine factory performance. An example application is presented in which the classification and regression tree (CART) technique is used to predict saturated, starved or efficient factory performance based on dynamic shop floor data. This means that the relationships between the variables that cause poor factory performance can be discovered, and measures to assure efficient performance can then be taken.
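As a rough illustration of the CART step (not the authors' dataset or feature set), scikit-learn's decision tree classifier can be trained on shop-floor snapshots labelled saturated, starved or efficient; the feature names and values below are assumptions.

```python
from sklearn.tree import DecisionTreeClassifier, export_text

# Hypothetical shop-floor snapshots: [kanban count, avg queue length, demand rate]
X = [
    [2, 1.0,  5], [2, 0.5,  4], [3, 0.8,  6],     # efficient
    [2, 6.0, 12], [3, 7.5, 14], [2, 8.0, 15],     # saturated
    [5, 0.1,  1], [6, 0.0,  1], [5, 0.2,  2],     # starved
]
y = ["efficient"] * 3 + ["saturated"] * 3 + ["starved"] * 3

tree = DecisionTreeClassifier(max_depth=3, random_state=0).fit(X, y)
# The induced rules make the relationships between shop factors explicit
print(export_text(tree, feature_names=["kanbans", "queue_len", "demand_rate"]))
print(tree.predict([[3, 5.0, 13]]))   # classify a new shop-floor state
```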

11.
In a make-to-order product recovery environment, we consider the allocation decision for returned products under stochastic demand for a firm with three options: refurbishing to resell, parts harvesting, and recycling. We formulate the problem as a multiperiod Markov decision process (MDP) and present a linear programming (LP) approximation that provides an upper bound on the optimal objective function value of the MDP model. We then present two solution approaches to the MDP using the LP solution: a static approach that uses the LP solution directly and a dynamic approach that adopts a revenue management perspective and employs a bid-price control technique in which the LP is resolved after each demand arrival. We calculate the bid prices based on the shadow price interpretation of the dual variables for the inventory constraints and accept a demand if the marginal value is higher than the bid price. Since solving the LP at each demand arrival requires a very efficient solution procedure, we present a transportation problem formulation of the LP via variable redefinitions and develop a one-pass optimal solution procedure for it. We carry out an extensive numerical analysis to compare the two approaches and find that the dynamic approach provides better performance in all of the tested scenarios. Furthermore, the solutions obtained are within 2% of the upper bound on the optimal objective function value of the MDP model.
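A compressed sketch of the bid-price logic: an LP allocates the remaining returned units to the most valuable expected uses, and an arriving demand is accepted only if its margin covers the marginal value of one unit of inventory, approximated here by the change in the LP value when one unit is removed. The revenues and forecasts are hypothetical, and the paper's one-pass transportation reformulation is replaced by a generic LP solve.

```python
from scipy.optimize import linprog

revenue  = {"refurbish": 120.0, "harvest": 70.0, "recycle": 15.0}   # per returned unit
forecast = {"refurbish": 3.0,   "harvest": 5.0,  "recycle": 50.0}   # remaining expected demand
options  = list(revenue)

def lp_value(inventory):
    """Optimal expected revenue from allocating `inventory` returned units."""
    c = [-revenue[o] for o in options]              # linprog minimizes, so negate
    A_ub = [[1.0, 1.0, 1.0]]                        # total allocation <= inventory
    b_ub = [inventory]
    bounds = [(0, forecast[o]) for o in options]    # allocation <= demand forecast
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=bounds, method="highs")
    return -res.fun

def accept(option, inventory):
    """Bid-price control: accept if the margin covers the marginal value of inventory."""
    if inventory == 0:
        return False
    bid_price = lp_value(inventory) - lp_value(inventory - 1)
    return revenue[option] >= bid_price

inventory = 4
for arrival in ["recycle", "harvest", "refurbish", "recycle", "refurbish"]:
    if accept(arrival, inventory):
        inventory -= 1
        print(f"accept {arrival:10s} (inventory now {inventory})")
    else:
        print(f"reject {arrival:10s} (inventory stays {inventory})")
```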

12.
A heuristic to minimize total flow time in permutation flow shop
In this paper, we address an n-job, m-machine permutation flow shop scheduling problem for the objective of minimizing the total flow time. We propose a modification of the best-known method of Framinan and Leisten [An efficient constructive heuristic for flowtime minimization in permutation flow shops. Omega 2003;31:311–7] for this problem. We show, through computational experimentation, that this modification significantly improves its performance while not affecting its time-complexity.
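The constructive idea behind heuristics of this family, an insertion-based scheme in the spirit of Framinan and Leisten's method but not their exact algorithm or the proposed modification, can be sketched as follows with illustrative processing times.

```python
def total_flowtime(seq, p):
    """Total flow time of a permutation in an m-machine permutation flow shop."""
    m = len(p[0])
    completion = [0.0] * m          # completion time of the last scheduled job on each machine
    flow = 0.0
    for job in seq:
        for k in range(m):
            prev = completion[k - 1] if k else 0.0      # this job's finish on machine k-1
            completion[k] = max(completion[k], prev) + p[job][k]
        flow += completion[-1]
    return flow

def insertion_heuristic(p):
    """Insert jobs one by one at the position minimizing the partial total flow time."""
    jobs = sorted(range(len(p)), key=lambda j: sum(p[j]))   # shortest total work first
    seq = [jobs[0]]
    for job in jobs[1:]:
        seq = min((seq[:i] + [job] + seq[i:] for i in range(len(seq) + 1)),
                  key=lambda s: total_flowtime(s, p))
    return seq

# Example: 5 jobs x 3 machines
p = [[4, 3, 2], [2, 5, 1], [3, 2, 4], [5, 1, 3], [1, 4, 2]]
seq = insertion_heuristic(p)
print(seq, total_flowtime(seq, p))
```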

13.

The design of a physical distribution system (PDS) involves determining the number and locations of distribution centres, estimating the required number of vehicles and designing the vehicle routings. Because of the enormous number of possible combinatorial designs of the system, it is often difficult to obtain an optimal design with acceptable computational effort. In this paper, a new solution framework for the design of a PDS based on the genetic algorithm (GA) is presented. With its characteristic of simultaneously optimizing a large population of configurations, the proposed methodology proves to be an extremely efficient optimizer. The experimental simulations conducted in this paper also indicate that the approach can provide a near-optimal solution to the design of a PDS. To analyse the growth and decay of the many schemas contained in the population, the effects of the reproduction, crossover and mutation operations on the schemas are studied. A simulation evaluation of the system performance and the genetic parameters is presented, along with a discussion, at the end of the paper.

14.

In any business process reengineering (BPR) project, a thorough understanding of the various tasks and activities of the organization is required. Very often this is captured using a simple flow chart or static representation diagram. The weakness here is that the complexity of the process design is not adequately represented by flow charts, which allows only limited human-computer interaction during process design and analysis. In this paper, we propose an enhanced flow chart approach to support BPR, the activity-section flow chart, which combines the existing activity flow chart and section flow chart. Using this approach, a human-computer interactive model for BPR is developed. The model can identify unreasonable activity loops and excessive business rounds between sections by means of adjacency and reachability matrices. Through human-computer interaction, the process can then be revised on the basis of human experience. The approach provides an efficient tool for the BPR of large-scale systems. It has been applied to the material supply management system of an iron and steel works, and satisfactory results have been achieved.
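The adjacency/reachability check mentioned can be sketched with Warshall's algorithm: the adjacency matrix records direct flows between activities (or sections), and the reachability matrix (transitive closure) exposes loops, since activity i lies on a loop whenever reach[i][i] is true. The small matrix below is illustrative.

```python
def reachability(adj):
    """Transitive closure of an adjacency matrix (Warshall's algorithm)."""
    n = len(adj)
    reach = [row[:] for row in adj]
    for k in range(n):
        for i in range(n):
            for j in range(n):
                reach[i][j] = reach[i][j] or (reach[i][k] and reach[k][j])
    return reach

# Activities 0 -> 1 -> 2 -> 1 (a loop between 1 and 2), and 2 -> 3
adj = [
    [0, 1, 0, 0],
    [0, 0, 1, 0],
    [0, 1, 0, 1],
    [0, 0, 0, 0],
]
reach = reachability(adj)
loops = [i for i in range(len(adj)) if reach[i][i]]
print("activities on a loop:", loops)   # -> [1, 2]
```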

15.

This paper addresses the two-machine bicriteria dynamic flowshop problem in which the setup time of a job is separated from its processing time and is sequenced independently. The performance measure considered is the simultaneous minimization of total flowtime and makespan, which is more effective in reducing the total scheduling cost than a single objective. A frozen-event procedure is first proposed to transform the dynamic scheduling problem into a static one. To solve the transformed static scheduling problem, an integer programming model with N² + 5N variables and 7N constraints is formulated. Because the problem is known to be NP-complete, a heuristic algorithm with complexity O(N³) is provided; a decision index is developed as the basis for the heuristic. Experimental results show that the proposed heuristic algorithm is effective and efficient. The average solution quality of the heuristic algorithm is above 99%, and a 15-job case requires only 0.0235 s, on average, to obtain a near-optimal or even optimal solution.

16.
In the context of deterministic multicriteria maximization, a Pareto optimal, nondominated, or efficient solution is a feasible solution for which an increase in value of any one criterion can only be achieved at the expense of a decrease in value of at least one other criterion. Without restrictions of convexity or continuity, it is shown that a solution is efficient if and only if it solves an optimization problem that bounds the various criteria values from below and maximizes a strictly increasing function of these several criteria values. Also included are discussions of previous work concerned with generating or characterizing the set of efficient solutions, and of the interactive approach for resolving multicriteria optimization problems. The final section argues that the paper's main result should not actually be used to generate the set of efficient solutions, relates this result to Simon's theory of satisficing, and then indicates why and how it can be used as the basis for interactive procedures with desirable characteristics.
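Stated roughly in symbols (our notation, not the paper's, and with the lower bounds taken at the candidate solution's own criterion values, which is our reading of the abstract): a feasible x* in X is efficient for criteria f_1, ..., f_p (all to be maximized) if and only if, for some strictly increasing scalar function g, it solves

```latex
\max_{x \in X} \; g\bigl(f_1(x), \dots, f_p(x)\bigr)
\quad \text{subject to} \quad f_i(x) \ge f_i(x^{*}), \qquad i = 1, \dots, p .
```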

17.
Aggregate production planning decisions are intermediate-range decisions that can have a significant impact on both productivity and profitability. In this paper, we examine an interactive computer-based method that provides decision support for the aggregate planner. The proposed approach combines the judgement of the planner with the optimization of subproblems to arrive at an effective solution for multi-family aggregate production planning problems. In the interactive approach, the planner exercises direct control over sensitive workforce levels and production capacities. A network flow subproblem solver is used to generate optimal production plans and inventory levels given the user-specified production capacities. Decision aids are provided to help the planner achieve a cost-effective solution that is consistent with judgement concerning workforce levels. Computational testing on five test problems indicates that very cost-effective solutions can be obtained. The results of applying the interactive method to a real-world problem are also reported.

18.
In this paper, we extend the standard data envelopment analysis (DEA) model to include longer term top management goals. This extension is in recognition of the fact that benchmarking for decision making units (DMUs) is more than a purely monitoring process, and includes a component of future planning. The new model uses a goal programming structure to find points on the efficient frontier which are realistically achievable by DMUs, but at the same time achieving a closer approach to long term organizational goals (as distinct from the local performance of individual DMUs). Consideration is given to the possibility of adjusting constraints on the DMU by investment in extended inputs or new technologies, in which case minimization of associated investment costs becomes an additional management objective.

19.

This work is an investigation into the relative effectiveness of two approaches to scheduling in flexible flow shops: one advocating the possible use of different dispatching rules at different stages of the flow shop, and the other suggesting the use of the same dispatching rule at all stages. In the latter approach, the dispatching rule contains information related to both processing time and due date. Both approaches aim at minimizing measures related to the flowtime and tardiness of jobs. This paper is essentially an attempt to explore the relative effectiveness of these two approaches to scheduling.

20.
The determination of closest efficient targets has attracted increasing interest from researchers in the recent Data Envelopment Analysis (DEA) literature, and several methods have been introduced in this respect. However, only a few attempts exist that analyse the implications of using closest targets for technical inefficiency measurement. In particular, least-distance measures based on Hölder norms satisfy neither weak nor strong monotonicity on the strongly efficient frontier. In this paper, we study Hölder distance functions and show why strong monotonicity fails. Along this line, we provide a solution for output-oriented models that ensures strong monotonicity on the strongly efficient frontier. Our approach may also be extended to the most general case, i.e. non-oriented models, under some regularity conditions.
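For reference, the Hölder (least-distance) measures in question are built on norms of the form below; the inefficiency of a unit (x0, y0) is its minimum distance to the strongly efficient frontier of the technology T (the notation is ours, not the paper's):

```latex
\lVert z \rVert_q =
\begin{cases}
\left( \sum_{k} \lvert z_k \rvert^{\,q} \right)^{1/q}, & 1 \le q < \infty,\\[4pt]
\max_{k} \lvert z_k \rvert, & q = \infty,
\end{cases}
\qquad
d_q(x_0, y_0) = \min \bigl\{ \lVert (x_0, y_0) - (x, y) \rVert_q : (x, y) \in \partial^{S}(T) \bigr\}.
```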
