Related Articles
 20 related articles found (search time: 31 ms)
1.
This paper reports the results of a study of the use of heterogeneous dispatching rules for the scheduling of work in a job shop. The methodology employed included discrete event simulation, using rule combinations determined by prior genetic algorithm searches and generalization using neural networks. Eight dispatching rules were considered, including first in first out (FIFO), earliest due date (EDD), shortest processing time (SPT), slack/number of operations (SLK), critical ratio (CR), modified due date (MDD), modified operation due date (MOD), and apparent tardiness cost (ATC). A three-machine job shop was studied, in which three work organizations were employed: pure flow (fixed sequence), pure job shop (random sequence), and a hybrid shop where flow is random but with unequal probabilities. Three levels of machine loading were used, and average tardiness was used as the performance measure. In most cases, modified due date and apparent tardiness cost were the best rules. The application of the best rules affected the results primarily when applied to bottleneck machines or the first machine in a pure flow shop. Nearly any other rule was acceptable on non-bottleneck machines except FIFO and CR, which consistently perform poorly. No major advantage of mixing rules was found.
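As a point of reference for the rules compared above, the sketch below shows how one of the consistently strong rules, modified due date (MDD), picks the next job from a machine queue. The job data are invented for illustration; this is a minimal sketch of the standard MDD priority, not code from the study.

```python
# Modified due date (MDD) priority: each waiting job is scored by
# max(due date, now + processing time); the job with the smallest score goes next.
def mdd_priority(job, now):
    return max(job["due"], now + job["proc"])

def next_job_mdd(queue, now):
    return min(queue, key=lambda j: mdd_priority(j, now))

queue = [
    {"id": "A", "proc": 4.0, "due": 10.0},
    {"id": "B", "proc": 2.0, "due": 5.0},
    {"id": "C", "proc": 6.0, "due": 7.0},
]
print(next_job_mdd(queue, now=3.0)["id"])  # job B: max(5, 3+2) = 5 is the smallest score
```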

2.
This paper investigates the effects of four simple dispatching rules on just-in-time production-related performance measures of mean and maximum absolute lateness. The rules used are modified due date (MDD), shortest processing time (SPT), earliest due date (EDD), and first in first out (FIFO). A single machine is used under three utilization levels. Due dates are set according to the total work content rule. The results indicate that each rule performs well under certain conditions. The MDD rule is the best one to minimize mean absolute lateness. The EDD and FIFO rules do well in minimizing the maximum absolute lateness. Economic interpretations of these performance measures are also discussed.
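To make the two performance measures concrete, the snippet below computes mean and maximum absolute lateness for a given single-machine sequence; the job data and the EDD ordering are illustrative, not taken from the paper.

```python
# Completion times accumulate along the sequence on a single machine;
# absolute lateness of a job is |completion time - due date|.
def lateness_stats(sequence):
    t, abs_lateness = 0.0, []
    for job in sequence:
        t += job["proc"]                      # completion time under this sequence
        abs_lateness.append(abs(t - job["due"]))
    return sum(abs_lateness) / len(abs_lateness), max(abs_lateness)

jobs = [{"proc": 3, "due": 4}, {"proc": 1, "due": 9}, {"proc": 5, "due": 8}]
edd = sorted(jobs, key=lambda j: j["due"])    # earliest-due-date sequence
print(lateness_stats(edd))                    # (mean absolute lateness, max absolute lateness)
```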

3.
On January 18, 2011, President Obama signed Executive Order 13563, Improving Regulation and Regulatory Review, which instructs federal regulators to do the following: coordinate their agencies' activities to simplify and harmonize rules that may be overlapping, inconsistent, or redundant; determine whether the present and future benefits of a proposed regulation justify its potential costs (taking into account both quantitative and qualitative factors); increase participation of industry, experts, and the public (“stakeholders”) in the formal rule‐making process; encourage the use of warnings, default rules, disclosure requirements, and provision of information to the public as an alternative to traditional “command‐and‐control” rule‐making restricting consumer choice; and mandate a government‐wide review of all existing administrative rules to remove outdated regulations. Executive Order 13563 includes a qualitative “values” provision to be considered in the required cost–benefit analysis, which can potentially counteract President Obama's stated regulatory reform rationale. Furthermore, in Executive Order 13563, President Obama established a deadline of May 18, 2011, for all executive branch agencies to submit their plans to streamline their rulemaking operations and repeal those “overlapping, inconsistent, or redundant” rules. These two issues, along with complementary regulatory review proposals being discussed in the U.S. Congress, are evaluated in this essay.

4.
While the majority of the literature on shop scheduling has emphasized time-based performance criteria such as mean flow time, lateness, and tardiness, the primary goal of management should be the maximization of shop profitability. In this research the net present value (NPV) criterion is introduced to measure shop profitability. This measure combines aspects of job flow time and inventory holding costs into a single measure. A simulation model of a job shop is used to examine the performance of a variety of time- and value-based scheduling rules. These rules are evaluated with respect to the NPV criterion in both random and flow shop environments. The results suggest that priority rules that utilize monetary information about jobs yield a higher NPV than many time-based rules in most situations, with little sacrifice in job tardiness. A well-researched time-based rule, critical ratio, also provides excellent performance when the shop is heavily loaded.
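The sketch below gives one plausible reading of how a job-level NPV figure can fold flow time and holding cost into a single monetary measure: revenue is discounted by the completion time, while material and holding costs accrue over the job's flow time. The parameter values and the exact functional form are illustrative assumptions, not the paper's formulation.

```python
# Hypothetical NPV of a single job: discounted revenue at completion,
# minus material cost at release, minus inventory holding cost over flow time.
def job_npv(revenue, material_cost, holding_rate, release, completion, discount):
    flow_time = completion - release
    discounted_revenue = revenue / (1 + discount) ** completion
    holding_cost = material_cost * holding_rate * flow_time
    return discounted_revenue - material_cost - holding_cost

# Example with invented figures: a job released at time 0 and finished at time 12.
print(job_npv(revenue=1000, material_cost=400, holding_rate=0.002,
              release=0, completion=12, discount=0.001))
```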

5.
An ecological risk assessment framework for low-altitude aircraft overflights was developed, with special emphasis on military applications. The problem formulation and exposure analysis phases are presented in this article; an analysis of effects and risk characterization is presented in a companion article. The intent of this article is threefold: (1) to illustrate the development of a generic framework for the ecological risk assessment of an activity, (2) to show how the U.S. Environmental Protection Agency's ecological risk assessment paradigm can be applied to an activity other than the release of a chemical, and (3) to provide guidance for the assessment of ecological risks from low-altitude aircraft overflights. The key stressor for low-altitude aircraft overflights is usually sound, although visual and physical (collision) stressors may also be significant. Susceptible and regulated wildlife populations are the major assessment endpoint entities, although plant communities may be impacted by takeoffs and landings. The exposure analysis utilizes measurements of wildlife locations, measurements of sound levels at the wildlife locations, measurements of slant distances from aircraft to wildlife, models that extrapolate sound from the source aircraft to the ground, and bird-strike probability models. Some of the challenges to conducting a risk assessment for aircraft overflights include prioritizing potential stressors and endpoints, choosing exposure metrics that relate to wildlife responses, obtaining good estimates of sound or distance, and estimating wildlife locations.

6.
This study introduces a universal “Dome” appointment rule that can be parameterized through a planning constant for different clinics characterized by the environmental factors of no-shows, walk-ins, number of appointments per session, variability of service times, and the cost of the doctor's time relative to patients' time. Simulation and nonlinear regression are used to derive an equation that predicts the planning constant as a function of the environmental factors. We also introduce an adjustment procedure for appointment systems to explicitly minimize the disruptive effects of no-shows and walk-ins. The procedure adjusts the mean and standard deviation of service times based on the expected probabilities of no-shows and walk-ins for a given target number of patients to be served, and it is thus relevant for any appointment rule that uses the mean and standard deviation of service times to construct an appointment schedule. The results show that our Dome rule with the adjustment procedure performs better than the traditional rules in the literature, with a lower total system cost calculated as a weighted sum of patients' waiting time, doctor's idle time, and doctor's overtime. An open-source decision-support tool is also provided so that healthcare managers can easily develop appointment schedules for their clinical environment.
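One simple way to see how a no-show probability can be folded into the service-time moments an appointment rule consumes is shown below: a booked slot's effective service time is treated as a mixture of zero (no-show) and the usual service time. This is an illustration of the idea under that mixture assumption, not the authors' exact adjustment procedure, and the numbers are invented.

```python
import math

# Mixture moments: with probability p the slot takes 0 minutes (no-show),
# otherwise it follows the usual service-time distribution (mean, std).
def adjusted_moments(mean, std, p_no_show):
    adj_mean = (1 - p_no_show) * mean
    adj_var = (1 - p_no_show) * std ** 2 + (1 - p_no_show) * p_no_show * mean ** 2
    return adj_mean, math.sqrt(adj_var)

print(adjusted_moments(mean=20.0, std=5.0, p_no_show=0.1))  # lower mean, wider spread
```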

7.
An ecological risk assessment framework for aircraft overflights has been developed, with special emphasis on military applications. This article presents the analysis of effects and risk characterization phases; the problem formulation and exposure analysis phases are presented in a companion article. The framework addresses the effects of sound, visual stressors, and collision on the abundance and production of wildlife populations. Profiles of effects, including thresholds, are highlighted for two groups of endpoint species: ungulates (hoofed mammals) and pinnipeds (seals, sea lions, walruses). Several factors complicate the analysis of effects for aircraft overflights. Studies of the effects of aircraft overflights previously have not been associated with a quantitative assessment framework; therefore no consistent relations between exposure and population-level response have been developed. Information on behavioral effects of overflights by military aircraft (or component stressors) on most wildlife species is sparse. Moreover, models that relate behavioral changes to abundance or reproduction, and those that relate behavioral or hearing effects thresholds from one population to another, are generally not available. The aggregation of sound frequencies, durations, and the view of the aircraft into the single exposure metric of slant distance is not always the best predictor of effects, but effects associated with more specific exposure metrics (e.g., narrow sound spectra) may not be easily determined or added. The weight of evidence and uncertainty analyses of the risk characterization for overflights are also discussed in this article.

8.
We consider cooperative games (TU-games) enriched by a system of a priori unions and a communication forest graph, which are independent of each other. These two structures reflect the limitations of cooperation possibilities. In this framework, we introduce four Owen-type allocation rules, which are defined by a two-step application of an allocation rule à la Owen (in: Henn R, Moeschlin O (eds) Essays in mathematical economics and game theory, Springer, Berlin, 1977) to TU-games with a priori unions, where the TU-game is replaced by Myerson's (Math Oper Res 2:225–229, 1977) graph-restricted TU-game. The four possibilities arise by applying, at each step, either the Myerson value (Myerson 1977) or the average tree solution (Herings et al. in Games Econ Behav 62:77–92, 2008). Our main result offers comparable axiomatizations of these four allocation rules.

9.
Repairing obsolete data items to their up-to-date values poses great challenges for improving data quality. Previous methods of data repairing are based on either quality rules or statistical techniques, but both types of methods have their limitations. To overcome the shortcomings of the previous methods, this paper focuses on combining quality rules and statistical techniques to improve data currency. (1) A new class of currency repairing rules (CRRs for short) is proposed to express both domain knowledge and statistical information. Domain knowledge is expressed by the rule pattern, and the statistical information is described by the conditional probability distribution corresponding to each rule. (2) The problem of generating minimized CRRs is studied in both static and dynamic settings. In the static setting, the problem of generating minimized CRR patterns is proved to be NP-hard, and two approximate algorithms are provided to solve it. In the dynamic setting, methods are provided to update the CRRs without recomputing the whole CRR set when the data change. In some special cases, the updates can be finished in O(1) time. In both settings, methods for learning the conditional probabilities for each CRR pattern are provided. (3) Based on the CRRs, the problems of finding optimal repairing plans with and without a cost budget are studied, and methods are provided to solve them. (4) Experiments based on both real and synthetic data sets show that the proposed methods are efficient and effective.

10.
Editorial decisions, based in part on reported hypothesis test results, affect the probabilities associated with those results: the probabilities of Type I and Type II errors thus become different for readers than for authors. The distributions of published parameter estimates are similarly affected. A framework for studying the consequences of test-based information filtering is developed and illustrative examples are provided. The examples indicate that filtering can markedly distort the power functions of hypothesis tests and can induce large estimator biases and increases in mean square error. It is argued that test-based filtering is relevant not only to journal publication but to other forms of information dissemination as well.
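The estimator bias induced by test-based filtering is easy to reproduce with a quick simulation: generate many study-level estimates of a small true effect, "publish" only those that clear a significance threshold, and compare the mean published estimate with the truth. The numbers below are illustrative and not from the article.

```python
import random, statistics

# Simulate studies of a true effect, keep only those with |t| > 1.96,
# and report the mean of the published (filtered) estimates.
def filtered_estimate(true_effect=0.2, n=50, studies=20000):
    published = []
    for _ in range(studies):
        sample = [random.gauss(true_effect, 1.0) for _ in range(n)]
        mean = statistics.mean(sample)
        se = statistics.stdev(sample) / n ** 0.5
        if abs(mean / se) > 1.96:            # passes the significance filter
            published.append(mean)
    return statistics.mean(published)

print(filtered_estimate())  # noticeably larger than the true effect of 0.2
```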

11.
Although the dual resource-constrained (DRC) system has been studied, the decision rule used to determine when workers are eligible for transfer largely has been ignored. Some earlier studies examined the impact of this rule [5] [12] [15] but did not include labor-transfer times in their models. Gunther [6] incorporated labor-transfer times into his model, but the model involved only one worker and two machines. No previous study has examined decision rules that initiate labor transfers based on labor needs (“pull” rules); labor transfers always have been initiated based on lack of need (“push” rules). This study examines three “pull” variations of the “when” labor-assignment decision rule. It compares their performances to the performances of two “push” rules and a comparable machine-limited system. A nonparametric statistical test, Jonckheere's S statistic, is used to test for significance of the rankings of the rules; a robust parametric multiple-comparison statistical test, Tukey's B statistic, is used to test the differences. One “pull” and one “push” decision rule provide similar performances and top the rankings consistently. Decision rules for determining when labor should be transferred from one work area to another are valuable aids for managers. This is especially true for the ever-increasing number of managers operating in organizations that recognize the benefits of a cross-trained work force. Recently there has been much interest in cross-training workers, perhaps because one of the mechanisms used in just-in-time systems to handle unbalanced work loads is to have cross-trained workers who can be shifted as demand dictates [8]. If management is to take full advantage of a cross-trained work force, it needs to know when to transfer workers.

12.
We establish global convergence results for stochastic fictitious play for four classes of games: games with an interior ESS, zero sum games, potential games, and supermodular games. We do so by appealing to techniques from stochastic approximation theory, which relate the limit behavior of a stochastic process to the limit behavior of a differential equation defined by the expected motion of the process. The key result in our analysis of supermodular games is that the relevant differential equation defines a strongly monotone dynamical system. Our analyses of the other cases combine Lyapunov function arguments with a discrete choice theory result: that the choice probabilities generated by any additive random utility model can be derived from a deterministic model based on payoff perturbations that depend nonlinearly on the vector of choice probabilities.
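The best-known special case of the additive random utility models referred to above is the logit choice rule, which arises when the payoff perturbations are i.i.d. extreme-value distributed. The sketch below computes those choice probabilities; the payoff values and the noise level eta are illustrative, and this is only the standard logit ingredient, not the paper's general result.

```python
import math

# Logit (softmax) choice probabilities over expected payoffs u_1, ..., u_k,
# with eta controlling the size of the payoff perturbations.
def logit_choice(payoffs, eta=0.1):
    m = max(payoffs)                                   # shift for numerical stability
    weights = [math.exp((u - m) / eta) for u in payoffs]
    total = sum(weights)
    return [w / total for w in weights]

print(logit_choice([1.0, 0.8, 0.2]))  # most weight on the highest-payoff action
```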

13.
A simulation study was conducted to investigate the behavior of family scheduling procedures in a dynamic dispatching environment. Two scheduling rules that incorporate setup avoidance mechanisms (FCFS-F and SPT-F) and two that do not (FCFS and SPT) were applied to a single machine. The scheduling environment was varied by controlling several important factors: the machine utilization, the number of setup configurations (families), the size of the family setup times relative to the job run times, the frequency with which members of the part families were released for processing, and the distribution of job interarrival and job run times. The major results from the study are the following: (1) The degree of stability in the system is the most influential factor with respect to mean flow time and flow time variance. Under low variance service and interarrival time distributions, the impact of scheduling rule selection is minor. (2) Conversely, under unstable scheduling situations, family scheduling procedures can have a substantial impact. (3) Clear interaction effects are noticed between all factors. The environment most conducive to family scheduling is characterized by high resource utilization, low setup-to-run time ratio, few part families, and erratic job arrivals. (4) Under conditions favorable to family scheduling, setup-avoiding procedures can be used to increase output while at the same time reducing the mean and variance of flow time. (5) The shortest processing time rule (SPT) performs well with respect to mean flow time when relative setup times are small. Overall, however, SPT-F generates the lowest mean flow time while FCFS-F produces the lowest flow time variance. This study shows that scheduling procedures that consider setups in their structure can outperform rules that do not under many different operating conditions. However, the magnitude of this advantage very much depends on the scheduling environment. The results also highlight the fact that it may be better to try to reshape the manufacturing environment than worry about selecting the correct scheduling rule. If the environment cannot be stabilized, then the choice of a setup-avoiding procedure, allocation of families to machines, and setup reduction become important issues.
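A minimal sketch of a setup-avoiding rule in the spirit of SPT-F is shown below: if any waiting job belongs to the family the machine is already set up for, the shortest such job is chosen and the setup is avoided; otherwise the rule falls back to SPT over the whole queue and incurs a family setup. The job fields and data are assumptions for illustration, not the study's exact implementation.

```python
# SPT-F-style dispatching: prefer jobs of the current setup family, break ties by SPT.
def next_job_spt_f(queue, current_family):
    same_family = [j for j in queue if j["family"] == current_family]
    candidates = same_family if same_family else queue   # setup avoided if possible
    return min(candidates, key=lambda j: j["proc"])

queue = [{"id": 1, "family": "A", "proc": 7},
         {"id": 2, "family": "B", "proc": 3},
         {"id": 3, "family": "A", "proc": 5}]
print(next_job_spt_f(queue, current_family="A")["id"])   # job 3: same family, shortest
```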

14.
This paper proposes a realistic queueing model of automated guided vehicle (AGV) systems in just-in-time production systems. The model takes into consideration return paths, Erlang-distributed service times, and a pull-type dispatching rule, assuming finite buffer capacities. Since the model has no product-form solution and no natural decomposability, owing to its complex non-tree fork-cum-join architecture and dynamic dispatching rules, we propose a machine-based decomposition algorithm for its performance evaluation. Each decomposed module consists of the processing machine and its dispatching station. Three flow probabilities, derived from flow conservation analysis, relate the modules and are updated iteratively until the parameters converge. The numerical results from a real-life AGV system application show that the algorithm is reasonably accurate.

15.
This study examines the effects of sequencing flexibility on the performance of rules used to schedule operations in manufacturing systems. The findings show that taking advantage of even low levels of sequencing flexibility in the set of operations required to do a job results in substantial improvement in the performance of scheduling rules with respect to mean flowtime. Differences in the mean flowtime measure for various rules also diminish significantly with increasing sequencing flexibility. Performance improvements additionally result for such due-date related performance measures as mean tardiness and the proportion of jobs tardy. At high levels of sequencing flexibility, some nonparametric scheduling rules outperform the shortest processing time rule in terms of the mean flowtime criterion. Rules based on job due dates also outperform rules based on operation milestones in terms of tardiness related criteria at high levels of sequencing flexibility. The implications of these findings for the design of manufacturing systems and product design are noted.

16.
A GRASP for Aircraft Routing in Response to Groundings and Delays
This paper presents a greedy randomized adaptive search procedure (GRASP) to reconstruct aircraft routings in response to groundings and delays experienced over the course of the day. Whenever the schedule is disrupted, the immediate objective of the airlines is to minimize the cost of reassigning aircraft to flights, taking into account available resources and other system constraints. Associated costs are measured by flight delays and cancellations. In the procedure, the neighbors of an incumbent solution are generated and evaluated, and the most desirable are placed on a restricted candidate list. One is selected randomly and becomes the incumbent. The heuristic is polynomial with respect to the number of flights and aircraft. This is reflected in our computational experience with data provided by Continental Airlines. Empirical results demonstrate the ability of the GRASP to quickly explore a wide range of scenarios and, in most cases, to produce an optimal or near-optimal solution.
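The search loop described above follows the generic GRASP pattern, sketched below: score the neighbors of the incumbent, keep the best few on a restricted candidate list (RCL), and draw the next incumbent at random from that list. The `neighbors` and `cost` functions are placeholders for the routing-specific moves and delay/cancellation costs; this is a generic sketch, not the authors' implementation.

```python
import random

# Generic GRASP-style improvement loop with a restricted candidate list.
def grasp_search(initial, neighbors, cost, rcl_size=5, iterations=100):
    incumbent, best = initial, initial
    for _ in range(iterations):
        candidates = sorted(neighbors(incumbent), key=cost)[:rcl_size]  # build the RCL
        if not candidates:
            break
        incumbent = random.choice(candidates)      # randomized selection from the RCL
        if cost(incumbent) < cost(best):
            best = incumbent
    return best
```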

17.
A common manufacturing environment in many industries (such as the glass, steel, paper, costume jewelry, and textile industries) is a hybrid flow shop. This system has continuous-process machinery in the first stage of manufacturing and repetitive-batch equipment in the second. Little research has investigated this type of system. Scheduling managers of hybrid flow shops tend either to use existing job-shop rules or to devise their own rules. These approaches often are less than adequate for efficient scheduling. In this paper we extend the rule presented by Narasimhan and Panwalker [4] to include a general class of hybrid flow shops. This extension, called the generalized cumulative minimum-deviation (GCMD) rule, is compared under various operating conditions to three other sequencing rules: shortest processing time, longest processing time, and minimum deviation. The operating conditions are determined by the number of machines at both stages. The results of 7200 simulation runs demonstrate that the GCMD rule is better than the other rules in minimizing each of five chosen criteria. Thus, the GCMD rule can help managers to schedule hybrid flow shops efficiently to achieve various corporate objectives.

18.
Typical forecast-error measures such as mean squared error, mean absolute deviation, and bias are generally accepted indicators of forecasting performance. However, the eventual cost impact of forecast errors on system performance, and the degree to which cost consequences are explained by typical error measures, have not been studied thoroughly. The present paper demonstrates that these typical error measures often are not good predictors of cost consequences in material requirements planning (MRP) settings. MRP systems rely directly on the master production schedule (MPS) to specify gross requirements. These MRP environments receive forecast errors indirectly when the errors create inaccuracies in the MPS. Our study results suggest that within MRP environments the predictive capabilities of forecast-error measures are contingent on the lot-sizing rule and the product component structure. When forecast errors and MRP system costs are coanalyzed, bias emerges as having reasonable predictive ability. In further investigations of bias, loss functions are evaluated to explain the MRP cost consequences of forecast errors. Estimating the loss functions of forecast errors through regression analysis demonstrates the superiority of loss functions over typical forecast-error measures in the MPS.
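For reference, the three error measures named above are computed as follows; the demand and forecast figures are illustrative only.

```python
# Mean squared error, mean absolute deviation, and bias (mean signed error)
# for a series of actuals and forecasts.
def error_measures(actual, forecast):
    errors = [a - f for a, f in zip(actual, forecast)]
    n = len(errors)
    mse = sum(e ** 2 for e in errors) / n
    mad = sum(abs(e) for e in errors) / n
    bias = sum(errors) / n
    return {"MSE": mse, "MAD": mad, "bias": bias}

print(error_measures(actual=[100, 120, 90, 110], forecast=[95, 130, 100, 105]))
```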

19.
In many problems of risk analysis, failure is equivalent to the event of a random risk factor exceeding a given threshold. Failure probabilities can be controlled if a decisionmaker is able to set the threshold at an appropriate level. This abstract situation applies, for example, to environmental risks with infrastructure controls; to supply chain risks with inventory controls; and to insurance solvency risks with capital controls. However, uncertainty around the distribution of the risk factor implies that parameter error will be present and the measures taken to control failure probabilities may not be effective. We show that parameter uncertainty increases the probability (understood as expected frequency) of failures. For a large class of loss distributions, arising from increasing transformations of location-scale families (including the log-normal, Weibull, and Pareto distributions), the article shows that failure probabilities can be exactly calculated, as they are independent of the true (but unknown) parameters. Hence it is possible to obtain an explicit measure of the effect of parameter uncertainty on failure probability. Failure probability can be controlled in two different ways: (1) by reducing the nominal required failure probability, depending on the size of the available data set, and (2) by modifying the distribution itself that is used to calculate the risk control. Approach (1) corresponds to a frequentist/regulatory view of probability, while approach (2) is consistent with a Bayesian/personalistic view. We furthermore show that the two approaches are consistent in achieving the required failure probability. Finally, we briefly discuss the effects of data pooling and its systemic risk implications.
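The core effect can be seen with a small Monte Carlo experiment: fit the distribution to a modest sample, set the threshold at the fitted quantile corresponding to the nominal failure probability, and record how often a fresh loss exceeds it. The log-normal-loss setup, sample size, and quantile below are illustrative assumptions, not the article's exact calculation.

```python
import math, random, statistics

# Losses are log-normal, so log-losses are normal; the threshold is set at the
# fitted 99% quantile on the log scale. With known parameters the exceedance
# probability would be exactly 1%; plug-in estimation pushes it higher.
def simulated_failure_rate(n_sample=30, reps=20000, mu=0.0, sigma=1.0, q=0.99):
    z = statistics.NormalDist().inv_cdf(q)
    failures = 0
    for _ in range(reps):
        sample = [random.gauss(mu, sigma) for _ in range(n_sample)]   # observed log-losses
        m, s = statistics.mean(sample), statistics.stdev(sample)
        threshold = m + s * z                                         # fitted quantile
        failures += random.gauss(mu, sigma) > threshold               # next period's loss
    return failures / reps

print(simulated_failure_rate())  # typically somewhat above the nominal 0.01
```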

20.
Every production planning concept that incorporates controlled order release will initially withhold jobs from the shop floor and create a pre‐shop pool. Order release is a key component of the Workload Control concept that aims to maintain work‐in‐process within limits while ensuring due dates are met. Order release includes two decisions: (i) a sequencing decision that establishes the order in which jobs are considered for release; and, (ii) a selection decision that determines the criteria for choosing jobs for release. While selection has received much research attention, sequencing has been largely neglected. Using simulation, this study uncovers the potential for performance improvement in the sequencing decision and improves our understanding of how order release methods should be designed. Although most prior studies apply time‐oriented sequencing rules and load‐oriented selection rules, analysis reveals that load balancing considerations should also be incorporated in the sequencing decision. But an exclusive focus on load balancing is shown to increase mean tardiness and, paradoxically, require high workloads. A new sequencing rule is developed that only balances loads when multiple orders become urgent. It avoids high mean tardiness and allows the shop to operate at a low workload level. At the same time, the percentage tardy is reduced by up to 50% compared to a purely time‐oriented rule. The findings have implications not only for Workload Control but for any concept that features order release control, such as ConWIP and Drum‐Buffer‐Rope.
