Similar Articles
20 similar articles found (search time: 0 ms)
1.
In production and stock planning, the relationship between customer service, defined as the ability to meet demand for finished goods from in-stock inventory, and expected profits or expected costs can be represented by a simple reliability curve. The shape of this curve depends upon the parameters of the demand process, specifically the expected level of demand, standard deviation and correlation structure, as well as upon the capacities and initial state of the production and inventory system. A model is presented which explicitly determines this trade-off curve for a firm. The model is intended both as an operational model to aid managers in setting revenue and service targets which are compatible with the capacities and resources of the firm, and as a tool for exploring relationships between the parameters of the demand process and the constraints of the physical production and inventory system. The results illustrate that the level of risk depends strongly on the variability of the demand process, the cost structure, the capacities and initial state of the system and, to a lesser extent, the correlation in demand between succeeding periods. Results suggest that establishing service level targets consistent with the firm's strategic orientation must be done in consideration of both the characteristics of the demand process and the capacities of the production and inventory system. The model provides a tool for estimating the premium above unit cost which must be paid to provide a designated service level.
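The service-level-versus-cost trade-off curve can be illustrated with a minimal single-period sketch (not the paper's multi-period capacitated model): for normally distributed demand, each target in-stock probability implies a stock level and an expected overage/underage cost. All parameter values below are illustrative assumptions.

```python
import math

def phi(z):
    """Standard normal pdf."""
    return math.exp(-z * z / 2) / math.sqrt(2 * math.pi)

def Phi(z):
    """Standard normal cdf."""
    return 0.5 * (1 + math.erf(z / math.sqrt(2)))

def z_for(p):
    """Inverse normal cdf by bisection."""
    lo, hi = -8.0, 8.0
    for _ in range(80):
        mid = (lo + hi) / 2
        if Phi(mid) < p:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2

def tradeoff_curve(mu, sigma, h, p_short, levels):
    """For each target in-stock probability, return (stock level, expected cost):
    h per unit left over, p_short per unit short."""
    curve = []
    for beta in levels:
        z = z_for(beta)
        S = mu + z * sigma
        loss = phi(z) - z * (1 - Phi(z))   # unit normal loss function
        exp_short = sigma * loss           # expected units short
        exp_left = S - mu + exp_short      # expected units left over
        curve.append((S, h * exp_left + p_short * exp_short))
    return curve

levels = [0.5, 0.8, 0.9, 0.95, 0.99]
for beta, (S, cost) in zip(levels, tradeoff_curve(100, 20, 1.0, 9.0, levels)):
    print(f"service {beta:.0%}: stock {S:6.1f}, expected cost {cost:6.1f}")
```

Sweeping the service level traces out the reliability curve: cost first falls as shortages are avoided, then rises as excess stock accumulates.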

2.
The traditional quantity discount problem is analyzed from the perspective of game theory, including both noncooperative and cooperative models. For the noncooperative case, the Stackelberg equilibrium is derived. For the cooperative case, the Pareto Optimality criteria are used to find a group of optimal strategies. Both scenarios are illustrated through an example which quantifies the benefits resulting from cooperation between the buyer and the seller for game-theoretic solutions using geometric programming.
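A rough flavor of the noncooperative (Stackelberg) case can be sketched with an all-units discount and an EOQ-responding buyer; the leader searches over offers, the follower best-responds. The holding rate, cost figures, and the grid of offers below are illustrative assumptions, not the paper's geometric-programming formulation.

```python
import math

HOLD_RATE = 0.2  # buyer's annual holding cost as a fraction of unit price (assumed)

def buyer_best(D, K, price0, price1, q_break):
    """Buyer's best response under an all-units discount:
    pay price1 (<= price0) when Q >= q_break, else price0."""
    def cost(Q, p):
        return K * D / Q + 0.5 * Q * HOLD_RATE * p + D * p
    cands = []
    q0 = math.sqrt(2 * K * D / (HOLD_RATE * price0))  # EOQ at the base price
    if q0 < q_break:
        cands.append((cost(q0, price0), q0, price0))
    # EOQ at the discounted price, pushed up to the breakpoint if needed
    q1 = max(math.sqrt(2 * K * D / (HOLD_RATE * price1)), q_break)
    cands.append((cost(q1, price1), q1, price1))
    _, Q, p = min(cands)
    return Q, p

def seller_stackelberg(D, K, c, price0, S, discounts, breaks):
    """Seller (leader) announces (discount, breakpoint); buyer (follower)
    responds; seller earns margin minus a per-order setup cost S."""
    best = None
    for d in discounts:
        for qb in breaks:
            Q, p = buyer_best(D, K, price0, price0 * (1 - d), qb)
            profit = (p - c) * D - S * D / Q
            if best is None or profit > best[0]:
                best = (profit, d, qb, Q)
    return best
```

With a sizable seller setup cost, the equilibrium offer typically involves a strictly positive discount that induces the buyer to order in larger lots.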

3.
Traditional approaches for modeling and solving dynamic demand lotsize problems are based on Zangwill's single-source network and dynamic programming algorithms. In this paper, we propose an arborescent fixed-charge network (ARBNET) programming model and dual ascent based branch-and-bound procedure for the two-stage multi-item dynamic demand lotsize problem. Computational results show that the new approach is significantly more efficient than earlier solution strategies. The largest set of problems that could be solved using dynamic programming contained 4 end items and 12 time periods, and required 475.38 CPU seconds per problem. The dual ascent algorithms averaged .06 CPU seconds for this problem set, and problems with 30 end items and 24 time periods were solved in 85.65 CPU seconds. Similar results verify the superiority of the new approach for handling backlogged demand. An additional advantage of the algorithm is the availability of a feasible solution, with a known worst-case optimality gap, throughout the problem-solving process.
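For context, the dynamic-programming baseline such approaches are compared against can be sketched for a single item (a Wagner-Whitin style recursion); the setup cost K, holding cost h, and demand vector below are illustrative.

```python
def wagner_whitin(demand, K, h):
    """Classic single-item dynamic lot-sizing DP: F[t] is the minimum cost of
    covering periods 0..t-1; a setup in period j covers demand for j..t-1,
    paying holding cost h per unit per period carried."""
    T = len(demand)
    F = [0.0] + [float("inf")] * T
    for t in range(1, T + 1):
        for j in range(t):  # j = period of the last setup
            hold = sum(h * (i - j) * demand[i] for i in range(j, t))
            F[t] = min(F[t], F[j] + K + hold)
    return F[T]

print(wagner_whitin([60, 100, 140, 200], K=150, h=1))
```

This single-item recursion runs in polynomial time; it is the multi-item, multi-stage coupling that motivates the fixed-charge network and dual-ascent machinery.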

4.
When analyzing reorder point, order quantity (r, Q) inventory systems, one important question that often gets very little, if any, attention is: when a stockout occurs, how large is it? This paper is directed at researchers and practicing inventory planners, with two objectives. First, we provide several models and algorithms to compute the Expected Shortage When a Stockout Occurs (ESWSO) for a variety of stochastic environments. We show that when the ESWSO is used in conjunction with traditional fill rate measures, it greatly enhances a planner's ability to plan for shortages. Second, we develop two cost-minimizing inventory models—one addressing the backorder and the other the shortage scenario—to show how the ESWSO can be seamlessly integrated into an inventory-cost framework to specify lot sizes and safety stocks.
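For one common stochastic environment, normally distributed lead-time demand, the conditional shortage size has a closed form via the unit normal loss function: ESWSO = E[(D − r)+] / P(D > r). The sketch below is our illustration of that identity, not the paper's full set of models.

```python
import math

def eswso_normal(mu, sigma, r):
    """Expected Shortage When a Stockout Occurs, for lead-time demand
    D ~ Normal(mu, sigma), with reorder point r."""
    z = (r - mu) / sigma
    pdf = math.exp(-z * z / 2) / math.sqrt(2 * math.pi)        # phi(z)
    tail = 1 - 0.5 * (1 + math.erf(z / math.sqrt(2)))           # P(D > r)
    exp_short = sigma * (pdf - z * tail)                        # E[(D - r)+]
    return exp_short / tail
```

Note that the ESWSO stays positive even at high service levels, which is exactly the information a fill-rate measure alone hides.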

5.
This paper analyzes the problem of choosing the optimal order quantity and its associated number of standard containers making up the order for single-period inventory models under standard container size discounts. A range is determined that contains the optimal order quantity. Two algorithms are presented. The first algorithm solves the general case in which there is no restriction on the types of containers included in an order. The second algorithm solves a more restricted policy that requires the buyer to accept the order with successively smaller container sizes.
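A brute-force version of the idea can be sketched for a single container type per order: enumerate container counts for each (size, discounted price) pair against a discrete demand distribution. The demand pmf, prices, and overage/underage costs below are illustrative assumptions, and the paper's algorithms avoid this enumeration.

```python
def newsvendor_containers(demand_pmf, sizes_prices, co, cu, max_containers=50):
    """Single-period order built from n containers of one size; unit price
    depends on the container size (larger container -> lower price).
    demand_pmf: list of (demand, probability); co/cu: overage/underage cost."""
    best = None
    for size, price in sizes_prices:
        for n in range(1, max_containers + 1):
            q = n * size
            cost = price * q
            for d, p in demand_pmf:
                cost += p * (co * max(q - d, 0) + cu * max(d - q, 0))
            if best is None or cost < best[0]:
                best = (cost, size, n)
    return best  # (expected cost, container size, number of containers)
```

Even this toy version shows the trade-off: a larger, cheaper container is only worth taking when the induced order quantity stays close to the newsvendor optimum.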

6.
Scott Jordan, Decision Sciences, 1988, 19(3): 672-681
Production lines are often modeled as queueing networks with finite inventory between each stage. Little is known, however, about the average production rate and inventory levels when the service distribution at each stage is normal. This paper approximates the service distribution using iterative methods rather than simulation. The results show that iterative methods are useful when the problem is small and that approximation of the service distribution, by another distribution with the same mean and variance, is valid for steady-state results such as average production rate or average inventory level.  
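As a point of comparison for such iterative approximations, the simulation route can be sketched for a saturated two-stage line with normal service times and a finite intermediate buffer. The truncation at a small positive value, the blocking convention, and all parameters are our assumptions.

```python
import random

def line_throughput(n, buf, mean1, sd1, mean2, sd2, seed=42):
    """Saturated two-stage line: stage 1 is never starved; `buf` parts may sit
    between the stages (counting the one on machine 2). Blocking after service,
    normal service times truncated below at 0.01. Returns throughput."""
    rng = random.Random(seed)
    dep1 = [0.0] * n   # time part i leaves stage 1
    dep2 = [0.0] * n   # time part i leaves stage 2
    for i in range(n):
        s1 = max(0.01, rng.gauss(mean1, sd1))
        s2 = max(0.01, rng.gauss(mean2, sd2))
        free = dep2[i - buf - 1] if i > buf else 0.0   # a buffer slot frees up
        dep1[i] = max((dep1[i - 1] if i else 0.0) + s1, free)
        dep2[i] = max(dep1[i], dep2[i - 1] if i else 0.0) + s2
    return n / dep2[-1]
```

Enlarging the buffer can only relax the blocking constraint, so simulated throughput is nondecreasing in `buf`, which gives a quick sanity check on any analytical approximation.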

7.
Most studies in multiechelon inventory systems have concentrated on understanding the specific aspects of a system's behavior. The problem of optimal policy computation has largely been ignored. In this paper, we investigate a two-echelon inventory system experiencing stochastic demand and a pull system of inventory allocation. Both echelons use an order-up-to-level type control policy. A mathematical model is developed to determine the optimal order level at all echelons and validated through simulation. Two simple algorithms to locate the optimum solution are presented. The use of graphical tools in optimal policy calculation is also discussed.

8.
Setting the mean (target value) for a container-filling process is an important decision for a producer when the material cost is a significant portion of the production cost. Because the process mean determines the process conforming rate, it affects other production decisions, including, in particular, the production setup and raw material procurement policies. In this paper, we consider the situation in which quantity discounts exist in the raw material acquisition cost, and incorporate the quantity-discount issue into an existing model that was developed for simultaneously determining the process mean, production setup, and raw material procurement policies for a container-filling process. The product of interest is assumed to have a lower specification limit, and the items that do not conform to the specification limit are scrapped with no salvage value. The production cost of an item is proportional to the amount of the raw material used in producing the item. A two-echelon model is formulated for a single-product production process, and an algorithm is developed for finding the optimal solution. A sensitivity analysis is performed to study the effects of the model parameters on the optimal solution.
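The core tension in process-mean setting can be sketched in isolation from the setup and procurement decisions: raising the fill mean costs material on every item but raises the conforming rate. A minimal grid search, assuming normal fill variation and cost proportional to the mean fill:

```python
import math

def best_process_mean(L, sigma, c, lo, hi, step=0.001):
    """Grid search for the fill mean minimizing expected material cost per
    conforming item; fills below the lower spec limit L are scrapped with
    no salvage. Fill amount ~ Normal(mu, sigma); material cost c per unit."""
    best = None
    steps = int(round((hi - lo) / step))
    for k in range(steps + 1):
        mu = lo + k * step
        p = 0.5 * (1 + math.erf((mu - L) / (sigma * math.sqrt(2))))  # conforming rate
        if p > 0:
            cost = c * mu / p   # expected material cost per good item
            if best is None or cost < best[0]:
                best = (cost, mu)
    return best  # (cost per conforming item, optimal mean)
```

The optimum sits a couple of standard deviations above the specification limit: far enough that scrap is rare, but no further, since every extra unit of mean is paid for on every fill.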

9.
An agency model is presented in which outsourcing strictly dominates in-house production. We argue that firms outsource in order to improve managerial incentives. Conditions are established under which the firm is strictly better off with outsourcing. The benefit of outsourcing, however, is constrained by the trade-off between the incremental coordination costs of outsourcing and the improved incentive structure. The optimal contract is also shown to be a function of whether or not the firm is publicly held. For a publicly held firm, the contract is constant. For a privately held supplier, the contract is likely to be of a cost-sharing type. These findings offer preliminary incentive explanations for commonly observed outsourcing practices.

10.
It is difficult to devise a statistical test to detect one student copying from another. Many prior efforts falsely accuse students of cheating. A general methodology based on the quantal choice model of decision theory overcomes these problems. Three steps are involved: (1) for each item estimate the probability and the variance that a given respondent will select each response, (2) for pairs of respondents, these probabilities determine the expected number of matches, and (3) compare the critical value to the number of items matched. Methods differ based on the probability estimation technique. Four methods (simple frequencies; the Frary, Tideman, and Watts modification [8]; logit; and multinomial probit) are compared on theoretical and empirical grounds. Theory and results show that it is crucial to incorporate the variance of the probability estimates. The probit model has theoretical advantages over the other methods and produces more accurate results.
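Steps (2) and (3) can be sketched directly once per-item choice probabilities are in hand (step (1), where the four methods differ, is taken as given here). Assuming independent items and both respondents sharing the estimated item probabilities, the match count is a sum of Bernoullis, which a normal approximation turns into a z-score:

```python
import math

def match_zscore(resp_a, resp_b, probs):
    """probs[i][k]: estimated probability of picking option k on item i.
    P(match on item i) = sum_k P(k)^2 under independence; compare observed
    matches against the implied mean and variance."""
    exp_m = var_m = 0.0
    matches = 0
    for i, (a, b) in enumerate(zip(resp_a, resp_b)):
        p = sum(q * q for q in probs[i])   # P(both choose the same option)
        exp_m += p
        var_m += p * (1 - p)
        matches += (a == b)
    z = (matches - exp_m) / math.sqrt(var_m)
    return matches, exp_m, z
```

The abstract's central warning applies here: this sketch treats the probabilities as known, whereas a proper test must also carry the variance of the probability estimates from step (1).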

11.
Early formulations of conjoint models focused on part-worth estimation at the individual level. As the methodology's popularity grew, so did industry demands for increasingly larger numbers of attributes and levels. In response to these demands, new approaches, based on partial or full data aggregation (such as clusterwise/latent class conjoint and choice-based conjoint), have appeared. This paper suggests that pooled-data models will often be successful in predicting market shares when researchers employ monotonic attributes. In these cases more of a good attribute (or less of a bad attribute) is always more preferred. In the more realistic case, in which some of the attributes may be nonmonotonic, we find that data aggregation does not predict holdout sample preferences as well as individual part-worth models.

12.
Consider a set of chemical products to be produced in a single facility. Each product has its own unique reaction time (which is assumed to be independent of its batch size), as well as other cost and demand values. In this paper, we address the problem of determining the optimal number of batches, batch sizes, and an accompanying production schedule for these products in the single facility that will minimize the total cost. Two different algorithms have been developed for this problem, the performances of which are contrasted with classical cyclic production schedules. Finally, some guidelines for the application of these methods to real-life problems are outlined.

13.
Inventory management has undergone significant philosophical changes in recent decades such as the advent of the zero inventory concept. However, as attractive as the concept of minimal inventories may be, it is often unrealistic in application. Attention to basic features of inventory control systems such as order quantities, base stock levels, and reorder points remains crucial to ensure customer service at minimal cost. A nonlinear optimization model for determining base stock levels in a multi-echelon inventory network is presented. Lagrangian relaxation results in (1) newsboy-style relations that provide the optimal solutions, and (2) instantaneous shadow prices for the budget constraint. Sensitivity analysis of this model will facilitate making decisions concerning the desired investment in inventory for the entire system. This model may be solved on standard nonlinear programming software and is generalizable to problems in both production and distribution settings.
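The newsboy-style relation plus budget multiplier can be sketched for the simplest setting: independent normal-demand locations sharing one inventory budget, with the Lagrange multiplier on the budget found by bisection. The network structure, parameters, and the specific optimality condition below are our simplifying assumptions, not the paper's full multi-echelon model.

```python
import math

def Phi(z):
    return 0.5 * (1 + math.erf(z / math.sqrt(2)))

def inv_Phi(p):
    lo, hi = -8.0, 8.0
    for _ in range(80):
        mid = (lo + hi) / 2
        if Phi(mid) < p:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2

def base_stocks(items, budget):
    """items: (mu, sigma, unit_cost, h, p) per location. Newsboy-style
    condition Phi((S - mu)/sigma) = (p - lam*c)/(p + h); the multiplier lam
    (the budget's shadow price) is found by bisection."""
    def levels(lam):
        out = []
        for mu, sig, c, h, p in items:
            ratio = min(max((p - lam * c) / (p + h), 1e-9), 1 - 1e-9)
            out.append(mu + sig * inv_Phi(ratio))
        return out

    def spend(S):
        return sum(item[2] * s for item, s in zip(items, S))

    if spend(levels(0.0)) <= budget:
        return levels(0.0), 0.0   # budget not binding: shadow price is zero
    lo, hi = 0.0, max(p / c for _, _, c, _, p in items)
    for _ in range(100):
        lam = (lo + hi) / 2
        if spend(levels(lam)) > budget:
            lo = lam
        else:
            hi = lam
    return levels(hi), hi
```

The returned multiplier is the instantaneous shadow price the abstract mentions: the marginal cost reduction obtainable from one more unit of inventory budget.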

14.
This paper reports the development of an instrument to measure the organizational benefits of IS projects. The basis for this instrument was a published framework that suggests three categories of such benefits: strategic, informational, and transactional. In a cross-sectional study of 178 IS projects proposed and approved for development, this framework was operationalized and empirically tested using the measurement model of LISREL. The analysis culminated in the validation and refinement of these categories. The final instrument offers items under three separate subdimensions of strategic benefits: competitive advantage, alignment, and customer relations. Informational benefits similarly comprise information access, information quality, and information flexibility. Finally, transactional benefits are also shown to be of three types: communications efficiency, systems development efficiency, and business efficiency. Implications of this multidimensional instrument for IS practitioners and researchers are discussed.

15.
The pressure to reduce inventory investments in supply chains has increased as competition expands and product variety grows. Managers are looking for areas they can improve to reduce inventories without hurting the level of service provided. Two areas that managers focus on are the reduction of the replenishment lead time from suppliers and the variability of this lead time. The normal approximation of the lead time demand distribution indicates that both actions reduce inventories for cycle service levels above 50%. The normal approximation also indicates that reducing lead time variability tends to have a greater impact than reducing lead times, especially when lead time variability is large. We build on the work of Eppen and Martin (1988) to show that the conclusions from the normal approximation are flawed, especially in the range of service levels where most companies operate. We show the existence of a service-level threshold greater than 50% below which reorder points increase with a decrease in lead time variability. Thus, for a firm operating just below this threshold, reducing lead times decreases reorder points, whereas reducing lead time variability increases reorder points. For firms operating at these service levels, decreasing lead time, not lead time variability, is the right lever for cutting inventories.
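The normal approximation at issue computes the reorder point from the first two moments of lead-time demand. The sketch below reproduces that approximation (the very one the paper shows can mislead near the threshold), with illustrative parameters; it is not the paper's exact analysis.

```python
import math

def Phi(z):
    return 0.5 * (1 + math.erf(z / math.sqrt(2)))

def inv_Phi(p):
    lo, hi = -8.0, 8.0
    for _ in range(80):
        mid = (lo + hi) / 2
        if Phi(mid) < p:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2

def reorder_point(mu_d, sig_d, mu_l, sig_l, csl):
    """Normal approximation of lead-time demand: mean = mu_d * mu_l,
    variance = mu_l * sig_d^2 + mu_d^2 * sig_l^2; csl = cycle service level."""
    mean = mu_d * mu_l
    std = math.sqrt(mu_l * sig_d ** 2 + mu_d ** 2 * sig_l ** 2)
    return mean + inv_Phi(csl) * std
```

Under this formula, any cycle service level above 50% makes the reorder point fall when lead time variability falls; the paper's contribution is precisely that the exact distribution reverses this conclusion below a threshold service level above 50%.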

16.
The primary objective of this study is to examine the performance of order-based dispatching rules in a general job shop, where the environmental factors are shop utilization and due date tightness. An order is defined as a collection of jobs that are shipped as a group—an order—to the customer, only on completion of the last job of the order. We specifically compare dispatching rules from past job-based studies to some rules adapted to encompass order characteristics. Standard flow time and tardiness measures are used, but in addition, we introduce measures that combine average performance with variation in an attempt to capture the performance of a majority of the orders processed in the shop. Of the 16 dispatching rules tested, our results show that four of the simple rules dominate the others. We also found that order-based rules perform better than their job-based counterparts. The study makes use of multivariate statistical analysis, in addition to the usual univariate tests, which can provide additional insight to managers using multiple criteria in their decision process.

17.
Minimum surgical times are positive and often large. The lognormal distribution has been proposed for modeling surgical data, and the three-parameter form of the lognormal, which includes a location parameter, should be appropriate for surgical data. We studied the goodness-of-fit performance, as measured by the Shapiro-Wilk p-value, of three estimators of the location parameter for the lognormal distribution, using a large data set of surgical times. Alternative models considered included the normal distribution and the two-parameter lognormal model, which sets the location parameter to zero. At least for samples with n > 30, data adequately fit by the normal had significantly smaller skewness than data not well fit by the normal, and data with larger relative minima (smallest order statistic divided by the mean) were better fit by a lognormal model. The rule "If the skewness of the data is greater than 0.35, use the three-parameter lognormal with the location parameter estimate proposed by Muralidhar & Zanakis (1992), otherwise, use the two-parameter model" works almost as well at specifying the lognormal model as more complex guidelines formulated by linear discriminant analysis and by tree induction.
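The quoted decision rule is simple enough to sketch directly. The skewness computation is standard; the location estimate below is a hypothetical stand-in (a fraction of the sample minimum), since the Muralidhar & Zanakis (1992) estimator itself is not reproduced in the abstract.

```python
def skewness(x):
    """Sample skewness (population-moment form)."""
    n = len(x)
    m = sum(x) / n
    s2 = sum((v - m) ** 2 for v in x) / n
    return sum((v - m) ** 3 for v in x) / n / s2 ** 1.5

def choose_lognormal(x, threshold=0.35):
    """Abstract's rule: skewness > 0.35 -> three-parameter lognormal
    (estimate a location shift), else two-parameter (location = 0).
    The location estimate here is an illustrative placeholder, NOT the
    Muralidhar & Zanakis estimator."""
    if skewness(x) > threshold:
        loc = 0.9 * min(x)   # hypothetical simple location estimate
        return "three-parameter", loc
    return "two-parameter", 0.0
```

In use, one would subtract the estimated location from the data before fitting the lognormal, which is exactly what a large positive minimum surgical time calls for.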

18.
Scientific techniques for inventory management typically are applied to systems containing many items. Such techniques require an estimation of the demand variance (and mean) of each item from historical data. This research demonstrates a significant potential for improvement in system cost performance from using least-squares regression fits of a variance-to-mean functional relation instead of the standard statistical variance estimate. Even when there is a moderate degree of heterogeneity among items and when the form of the variance-to-mean relation is misspecified, substantial cost savings may be realized. The cost of statistical uncertainty may be reduced by half. The research also provides evidence that system cost is fairly insensitive to the number of items used to fit the regression. This paper provides the underlying reason why a regression-derived variance estimator yields lower cost: it is less variable than the usual individual item variance estimator.
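A common functional form for such a variance-to-mean relation is the power law Var = a · Mean^b, which becomes linear in logs. A minimal least-squares sketch (the power-law form and log-scale fit are our assumptions; the paper's experiments cover other forms too):

```python
import math

def power_law_variance(means, variances):
    """Fit Var = a * Mean^b by ordinary least squares on logs across items;
    the fitted relation then replaces each item's own sample variance."""
    xs = [math.log(m) for m in means]
    ys = [math.log(v) for v in variances]
    n = len(xs)
    xbar, ybar = sum(xs) / n, sum(ys) / n
    sxx = sum((x - xbar) ** 2 for x in xs)
    sxy = sum((x - xbar) * (y - ybar) for x, y in zip(xs, ys))
    b = sxy / sxx
    a = math.exp(ybar - b * xbar)
    return a, b

def smoothed_variance(mean, a, b):
    """Regression-derived variance estimate for one item."""
    return a * mean ** b
```

The point made in the abstract falls out of this structure: the fitted estimate pools information across all items, so it fluctuates far less than any single item's sample variance.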

19.
Generally accepted auditing standards require auditors to plan audits of clients' account balances. If accounts are to be sampled, then part of this planning must include setting the tolerable misstatement for each account or class of transactions to be sampled. Although classical sampling approaches provide certain advantages, they have not been widely used because they are viewed as complex and difficult to implement. We present a remedy to these difficulties in an efficient, easily implemented optimal solution method for the problem of setting tolerable misstatements given constraints on tolerable misstatements for individual account balances as well as the overall audit. Further, our method suggests when the materialities of certain accounts or the materiality of the overall audit are irrelevant to the problem. Several example auditing problems demonstrate both our solution approach and the settings in which our approach provides a more effective or more efficient sampling plan than that provided by monetary unit sampling.

20.
A supply chain is a series of manufacturing plants that transform raw material into finished product. A pipeline within a supply chain refers to the stream of information, material, components, and assemblies that are associated with a particular product. It is typical for manufacturing plants to put considerable effort into optimizing the performance of a horizontal slice of a supply chain (such as coordination among parts that share a common resource). The need to optimize the performance of the vertical slice (the supply chain connecting raw material to finished product) by controlling the transmission of schedule instability and the resulting inventory fluctuation is often overlooked. A schedule is stable if actual production requirements for a given period do not change from the forecast production requirements. Stable production schedules are important when managing supply chains as they help control inventory fluctuation and inventory accumulation. Failure to control schedule instability results in high average inventory levels in the system. In this paper, a simulation analysis of supply chain instability and inventory is conducted, and it is shown how supply chains can be analyzed for continuous improvement opportunities using simulation. The focus is on a stamping pipeline in an automobile supply chain based on operating data from General Motors (GM). It is shown that the techniques used in this paper are a useful tool for supply chain analysis.
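How schedule instability propagates up a pipeline can be sketched with a generic serial-chain simulation: each stage forecasts its incoming requirements by exponential smoothing and follows an order-up-to policy, and the variance of the requirements grows stage by stage. The policy form, smoothing constant, lead time, and demand parameters are illustrative assumptions, not GM's stamping-pipeline data.

```python
import random
import statistics

def stage_orders(demand, L=2, alpha=0.3):
    """Order-up-to policy with an exponential-smoothing forecast F:
    the order-up-to level is (L + 1) * F, so O_t = D_t + (L + 1) * (F_t - F_{t-1})."""
    F = demand[0]
    orders = []
    for d in demand:
        F_new = alpha * d + (1 - alpha) * F
        orders.append(max(0.0, d + (L + 1) * (F_new - F)))
        F = F_new
    return orders

def chain_variances(stages, periods, mu=100, sigma=10, seed=7):
    """Pass a demand stream up a serial chain; return the requirement
    variance seen at each stage (index 0 = end-customer demand)."""
    rng = random.Random(seed)
    stream = [max(0.0, rng.gauss(mu, sigma)) for _ in range(periods)]
    variances = [statistics.pvariance(stream)]
    for _ in range(stages):
        stream = stage_orders(stream)
        variances.append(statistics.pvariance(stream))
    return variances
```

Running this shows the instability amplification the paper measures: each upstream stage faces a noisier requirement stream than the one below it, and so carries more inventory to buffer it.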


Copyright © Beijing Qinyun Technology Development Co., Ltd. (北京勤云科技发展有限公司) | 京ICP备09084417号