Similar Documents
A total of 20 similar documents were retrieved.
1.
Uncertainty of environmental concentrations is calculated with the regional multimedia exposure model of EUSES 1.0 by considering probability input distributions for aqueous solubility, vapor pressure, and the octanol-water partition coefficient, K_ow. Only reliable experimentally determined data are selected from the available literature for eight reference chemicals representing a wide spectrum of substance properties. Monte Carlo simulations are performed with uniform, triangular, and log-normal input distributions to assess the influence of the choice of input distribution type on the predicted concentration distributions. The impact of input distribution shape on output variance exceeds its effect on the output mean by one order of magnitude. Both are also affected by the influence and the uncertainty (i.e., variance) of the input variable. Distributional shape has no influence when the sensitivity function of the respective parameter is perfectly linear. For nonlinear relationships, the overlap of the probability mass of the input distribution with influential ranges of the parameter space is important. Differences in the computed output distributions are greatest when the input distributions differ in the most influential parameter range.
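As a concrete illustration of the kind of experiment described above, the sketch below propagates uniform, triangular, and log-normal distributions for a single parameter through a toy nonlinear fate model and compares output means and variances. The model function and all numerical values are hypothetical assumptions, not those of EUSES 1.0.

import numpy as np

rng = np.random.default_rng(42)
n = 100_000

def model(log_kow):
    # Toy nonlinear "fate model": concentration as a function of log K_ow.
    return 1.0 / (1.0 + 10 ** (0.5 * log_kow - 2.0))

# Three input distributions sharing roughly the same central value.
samples = {
    "uniform":    rng.uniform(3.0, 5.0, n),
    "triangular": rng.triangular(3.0, 4.0, 5.0, n),
    "lognormal":  rng.lognormal(mean=np.log(4.0), sigma=0.12, size=n),
}

for name, log_kow in samples.items():
    out = model(log_kow)
    print(f"{name:10s} mean={out.mean():.4f} var={out.var():.6f}")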

2.
The aging domestic oil production infrastructure poses a high risk to the environment because of the type of fluids being handled (oil and brine) and the potential for accidental release of these fluids into sensitive ecosystems. Currently, there is no quantitative risk model directly applicable to onshore oil exploration and production (E&P) facilities. We report on a probabilistic reliability model created for such facilities. Reliability theory, failure modes and effects analysis (FMEA), and event trees were used to develop the model estimates of the failure probability of typical oil production equipment. Monte Carlo simulation was used to translate uncertainty in input parameter values into uncertainty in the model output. The predicted failure rates were calibrated to available failure rate information by adjusting the probability density function parameters used as random variates in the Monte Carlo simulations; the mean and standard deviation of the normal distributions from which the Weibull characteristic life was drawn served as the adjustable calibration parameters. The model was applied to oil production leases in the Tallgrass Prairie Preserve, Oklahoma. We present the estimated failure probability due to the combination of the most significant failure modes associated with each type of equipment (pumps, tanks, and pipes). The results show that the estimated probability of failure for tanks is about the same as that for pipes, while pumps have a much lower failure probability. The model can provide the equipment reliability information needed for proactive risk management at the lease level, supplying quantitative grounds for allocating maintenance resources to high-risk equipment so as to minimize both lost production and ecosystem damage.
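A minimal sketch of the sampling step at the heart of such a model: the Weibull characteristic life is drawn from a normal distribution (the adjustable calibration parameters mentioned above), and the probability of equipment failure within a mission time is estimated by Monte Carlo. All numerical values are hypothetical.

import numpy as np

rng = np.random.default_rng(0)
n = 50_000

# Calibration parameters: characteristic life eta (years) drawn from a
# normal variate distribution; shape parameter beta held fixed.
eta_mean, eta_sd, shape_beta = 12.0, 2.0, 1.5   # hypothetical values
mission_time = 5.0                               # years

eta = np.maximum(rng.normal(eta_mean, eta_sd, n), 1e-6)  # crude truncation at 0
# Weibull CDF: P(failure by t) = 1 - exp(-(t/eta)^beta)
p_fail = 1.0 - np.exp(-(mission_time / eta) ** shape_beta)

print(f"mean P(fail by {mission_time} yr) = {p_fail.mean():.3f}")
print(f"90% interval: {np.percentile(p_fail, [5, 95])}")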

3.
Deterministic goal programs for employee scheduling decisions attempt to minimize expected operating costs by assigning the ideal number of employees to each feasible schedule. For each period in the planning horizon, managers must first determine the amount of labor that should be scheduled for duty. These requirements are often established with marginal analysis techniques, which use estimates of incremental labor costs and shortage expenses. Typically, each period in the planning horizon is evaluated as an independent epoch. An implicit assumption is that individual employees can be assigned to schedules with as little as a single period of work. If this assumption violates local work rules, the labor requirements parameters for the deterministic goal program may be suboptimal. As we show in this research, this well-known limitation can lead to costly staffing and scheduling errors. We propose an employee scheduling model that overcomes this limitation by integrating the labor requirements and scheduling decisions. Instead of a single, externally determined staffing goal for each period, the model uses a probability distribution for the quantity of labor required. The model is free to choose an appropriate staffing level for each period, eliminating the need for a separate goal-setting procedure. In most cases this results in better, less costly decisions. In addition, the proposed model easily accommodates both linear and nonlinear under- and overstaffing penalties. We use simple examples to demonstrate many of these advantages and to illustrate the key techniques needed to implement our model. We also assess its performance in a study of more than 1,700 simulated stochastic employee scheduling problems.
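For a single period with linear penalties, the staffing choice implied by such a labor-requirement distribution reduces to minimizing expected under- and overstaffing costs. A sketch under a hypothetical Poisson demand distribution and invented cost figures:

import numpy as np
from scipy import stats

# Hypothetical labor requirement distribution for one period and linear costs.
demand = stats.poisson(mu=14)          # employees required
c_under, c_over = 40.0, 15.0           # shortage vs. surplus cost per employee

levels = np.arange(0, 40)
k = np.arange(0, 200)                  # support used to evaluate expectations
pmf = demand.pmf(k)

def expected_cost(s):
    shortage = np.maximum(k - s, 0)
    surplus = np.maximum(s - k, 0)
    return np.sum(pmf * (c_under * shortage + c_over * surplus))

costs = [expected_cost(s) for s in levels]
best = levels[int(np.argmin(costs))]
print(f"optimal staffing level = {best}, expected cost = {min(costs):.2f}")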

4.
Risk analysis often depends on complex, computer-based models to describe links between policies (e.g., required emission-control equipment) and consequences (e.g., probabilities of adverse health effects). Appropriate specification of many model aspects is uncertain, including details of the model structure; transport, reaction-rate, and other parameters; and application-specific inputs such as pollutant-release rates. Because these uncertainties preclude calculation of the precise consequences of a policy, it is important to characterize the plausible range of effects. In principle, a probability distribution function for the effects can be constructed using Monte Carlo analysis, but the combinatorics of multiple uncertainties and the often high cost of model runs quickly exhaust available resources. This paper presents and applies a method for choosing sets of input conditions (scenarios) that efficiently represent knowledge about the joint probability distribution of the inputs. A simple score function is developed that approximately relates the inputs to a policy-relevant output, in this case globally averaged stratospheric ozone depletion. The probability density function for the score-function value is derived analytically from a subjective joint probability density for the inputs. Scenarios are defined by selected quantiles of the score function. Using this method, scenarios can be systematically selected in terms of the approximate probability distribution function for the output of concern, and probability intervals for the joint effect of the inputs can be readily constructed.
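A sketch of the scenario-selection idea under assumed independent normal inputs and a hypothetical linear score function; here the score distribution is obtained by sampling rather than derived analytically as in the paper.

import numpy as np

rng = np.random.default_rng(1)

# Hypothetical linear score function s(x) = w . x approximating the output.
w = np.array([0.7, 0.2, 0.1])
mu = np.array([1.0, 2.0, 0.5])        # subjective input means
sd = np.array([0.3, 0.5, 0.2])        # subjective input standard deviations

x = rng.normal(mu, sd, size=(100_000, 3))
score = x @ w

# Define scenarios at selected quantiles of the score distribution, taking
# the mean input vector within a narrow band around each quantile.
for q in (0.05, 0.50, 0.95):
    cut = np.quantile(score, q)
    band = np.abs(score - cut) < 0.01 * score.std()
    print(f"q={q:.2f}: score={cut:.3f}, scenario inputs={x[band].mean(axis=0).round(3)}")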

5.
Y Roll  A Sachish 《Omega》1981,9(1):37-42
A system of productivity indices, suitable for the plant level, is developed. The productivity of each input factor is measured separately against standard figures of input per unit of output. This system enables the decomposition of comparative indices into components, whereby the contribution of each component to productivity differentials can be determined. A model is proposed for aggregating the single-factor (physical) productivities into an overall (economic) index.
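In miniature, such an index system might look as follows, with hypothetical standards and cost-share weights used as one simple way to aggregate single-factor (physical) productivities into an overall (economic) index.

import numpy as np

# Actual input per unit of output vs. standard, for each input factor.
factors    = ["labor", "energy", "materials"]
actual     = np.array([2.4, 1.1, 5.0])      # input units per output unit
standard   = np.array([2.0, 1.0, 5.5])
cost_share = np.array([0.5, 0.2, 0.3])      # weights for the economic index

# Partial (physical) productivity index per factor: standard / actual.
partial = standard / actual
overall = float(np.sum(cost_share * partial))   # aggregate (economic) index

for f, p in zip(factors, partial):
    print(f"{f:9s} index = {p:.2f}")
print(f"overall index = {overall:.2f}")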

6.
Estimates of the cost of potential disasters, including indirect economic consequences, are an important input to the design of risk management strategies. The adaptive regional input-output (ARIO) inventory model is a tool to assess indirect disaster losses and to analyze their drivers. It is based on an input-output structure, but it also (i) explicitly represents production bottlenecks and input scarcity and (ii) introduces inventories as an additional source of flexibility in the production system. This modeling strategy distinguishes among (i) essential supplies that cannot be stocked (e.g., electricity, water), whose scarcity can paralyze all economic activity; (ii) essential supplies that can be stocked at least temporarily (e.g., steel, chemicals), whose scarcity creates problems only over the medium term; and (iii) supplies that are not essential in the production process, whose scarcity is problematic only over the long run and which are therefore easy to replace with imports. The model is applied to the landfall of Hurricane Katrina in Louisiana and identifies two periods in the disaster aftermath: (1) the first year, during which production bottlenecks are responsible for large output losses, and (2) the rest of the reconstruction period, during which bottlenecks are absent and output losses are lower. The analysis also suggests important research questions and policy options for mitigating disaster-related output losses.
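A toy illustration of the inventory mechanism, assuming one-for-one input use: output each period is capped by capacity, by a non-stockable input (electricity), and by a stockable one (steel) that can be drawn from inventory. All numbers are invented and the example is far simpler than ARIO itself.

import numpy as np

periods = 12
capacity = 100.0                      # pre-disaster output per period
elec_supply = np.array([30, 50, 80, 100] + [100] * 8, dtype=float)  # non-stockable
steel_flow = np.full(periods, 70.0)   # stockable input deliveries per period
steel_inv = 60.0                      # initial steel inventory

output = []
for t in range(periods):
    steel_available = steel_flow[t] + steel_inv
    # Production limited by capacity, electricity (no stock), and steel (stock).
    q = min(capacity, elec_supply[t], steel_available)
    steel_inv = steel_available - q   # unused steel carried forward
    output.append(q)

print(np.round(output, 1))  # bottlenecked early, recovering later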

7.
Juan Du  Liang Liang  Yao Chen  Gong-bing Bi   《Omega》2010,38(1-2):105-112
Production in large organizations with a centralized decision-making environment, such as supermarket chains or factories with many workshops, usually involves the participation of more than one individual unit, each contributing a part of the total production. This study is motivated by a production-planning problem regularly faced by the central decision-making unit: arranging new input and output plans for all individual units in the next production season, when demand changes can be forecast. Two planning ideas are proposed in this paper. One is to optimize the average or overall production performance of the entire organization, measured by the CCR efficiency of the average input and output levels of all units. The other is to simultaneously maximize the total outputs produced and minimize the total inputs consumed by all units. Based on these two ideas, we develop two DEA-based production planning approaches to find the most preferred production plans. The individual units, treated as decision-making units (DMUs), are assumed to be able to modify their input usage and output production. A simple numerical example and a real-world data set are used to illustrate these approaches.
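The CCR efficiency score used as the planning criterion can be computed with the standard input-oriented envelopment LP; a sketch with hypothetical data, using scipy's linear programming routine.

import numpy as np
from scipy.optimize import linprog

# Hypothetical data: columns are DMUs, rows are inputs / outputs.
X = np.array([[4.0, 7.0, 8.0, 4.0],    # input 1
              [3.0, 3.0, 1.0, 2.0]])   # input 2
Y = np.array([[1.0, 1.0, 1.0, 1.0]])   # single output

def ccr_efficiency(j0):
    m, n = X.shape
    s = Y.shape[0]
    # Variables: [theta, lambda_1..lambda_n]; minimize theta.
    c = np.r_[1.0, np.zeros(n)]
    # sum_j lambda_j * x_ij - theta * x_i,j0 <= 0
    A_ub = np.c_[-X[:, j0], X]
    b_ub = np.zeros(m)
    # -sum_j lambda_j * y_rj <= -y_r,j0  (outputs at least those of DMU j0)
    A_ub = np.r_[A_ub, np.c_[np.zeros(s), -Y]]
    b_ub = np.r_[b_ub, -Y[:, j0]]
    bounds = [(None, None)] + [(0, None)] * n
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=bounds)
    return res.x[0]

for j in range(X.shape[1]):
    print(f"DMU {j}: CCR efficiency = {ccr_efficiency(j):.3f}")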

8.
We address the issue of performance analysis of fabrication/assembly (F/A) systems, which first fabricate components and then join the components and subassemblies into a product. Here we consider an F/A system consisting of a single assembly station with input from K fabrication stations. We assume that the system uses a kanban control mechanism with a fixed number of kanbans circulating between each input station and the assembly station. Even with Markovian assumptions, computing an exact solution for the performance of such systems becomes intractable because of an explosion in the state space. We develop computationally efficient algorithms to approximate the throughput and mean queue lengths. The accuracy of the approximations is studied by comparison to exact results (K = 2) and to simulations (K > 2). Part II of this paper demonstrates how these models can be used as building blocks to evaluate more complex F/A systems with multiple levels of assembly stations.
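The simulation benchmark for such approximations can be sketched crudely as a time-stepped model of the kanban loops: each fabrication station works while a kanban is free, and assembly starts once every input buffer is non-empty. Rates and kanban counts are hypothetical, and the discretization is deliberately coarse.

import numpy as np

rng = np.random.default_rng(7)
K, kanbans = 3, 4                      # fabrication stations, kanbans per loop
dt, steps = 0.01, 200_000              # coarse time discretization
mu_fab, mu_asm = 1.0, 1.2              # hypothetical exponential service rates

buffer = np.zeros(K, dtype=int)        # finished components at each fab station
busy, completed = False, 0

for _ in range(steps):
    # Fabrication: a station works whenever a kanban is free (buffer below limit).
    can_work = buffer < kanbans
    finished = can_work & (rng.random(K) < mu_fab * dt)
    buffer[finished] += 1
    # Assembly: starts when one component from every station is available.
    if not busy and np.all(buffer > 0):
        buffer -= 1
        busy = True
    elif busy and rng.random() < mu_asm * dt:
        completed += 1
        busy = False

print(f"estimated throughput ~ {completed / (steps * dt):.3f} assemblies per unit time")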

9.
The purposes of this paper are, first, to introduce several concepts and definitions related to Theory of Constraints design and management of unbalanced lines and, second, to illustrate the concepts of productive and protective capacity and inventory in a constrained line. Drum-buffer-rope is the Theory of Constraints scheduling mechanism used to manage throughput at constraint workstations and flow at non-constraint workstations. A small simulation model is given to illustrate the importance of protective capacity and protective inventory at non-constraint stations. The line consists of several stations, with the centre station being the constraint. The capacity of (and inventory at) non-constraint stations is varied during the simulation. Line output increases as inventory at non-constraint stations increases. This result is contrary to traditional teaching about line design, which holds that line output is solely a function of the capacity of the slowest station.
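The paper's central observation is easy to reproduce with a small simulation: in a three-station line whose centre station is the constraint, with random output at the non-constraint stations, line output rises toward the constraint rate as protective inventory grows. The distributions below are hypothetical.

import numpy as np

rng = np.random.default_rng(3)

def line_output(buffer_cap, periods=50_000):
    b1 = b2 = buffer_cap // 2              # protective inventory, start half full
    produced = 0
    for _ in range(periods):
        s1 = rng.integers(0, 3)            # non-constraint: 0-2 units per period
        s3 = rng.integers(0, 3)            # non-constraint: 0-2 units per period
        b1 = min(b1 + s1, buffer_cap)      # buffer feeding the constraint
        take = min(1, b1)                  # constraint: exactly 1 unit per period
        b1 -= take
        b2 = min(b2 + take, buffer_cap)    # buffer after the constraint
        ship = min(s3, b2)
        b2 -= ship
        produced += ship
    return produced / periods

for cap in (1, 2, 4, 8, 16):
    print(f"protective inventory cap {cap:2d}: output rate = {line_output(cap):.3f}")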

10.
A change in profit can originate from both the output side and the input side. In the spirit of Tone [1] and following Grifell-Tatjé and Lovell [2], we propose a non-oriented slacks-based measure (SBM) model to decompose the change in operating profit into meaningful components: a quantity effect and a price effect. The quantity effect can be decomposed into a productivity effect and an activity effect. The productivity effect is further decomposed into a technical effect and an operating efficiency effect; each of these has an output side, which results in a change in revenue, and an input side, which results in a change in cost. The activity effect can be decomposed into a product mix effect, a resource mix effect, and a scale effect. We apply our decomposition to the Taiwanese banking sector over the period 1994-2002, using the average of base and current prices to evaluate these contributions. We find that ignoring input-side effects in the decomposition of profit change can produce misleading managerial conclusions.
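The top-level split using averages of base- and current-period prices is a Bennet-type indicator; a worked miniature is below (the finer productivity and activity effects require the SBM model itself). All figures are invented.

import numpy as np

# Hypothetical two-period data: output prices/quantities, input prices/quantities.
p0, q0 = np.array([10.0, 8.0]), np.array([5.0, 3.0])   # base period outputs
p1, q1 = np.array([11.0, 7.5]), np.array([6.0, 3.5])   # current period outputs
w0, x0 = np.array([4.0]), np.array([7.0])              # base period inputs
w1, x1 = np.array([4.5]), np.array([7.5])

profit_change = (p1 @ q1 - w1 @ x1) - (p0 @ q0 - w0 @ x0)
# Quantity effect: quantity changes valued at average prices.
quantity_effect = ((p0 + p1) / 2) @ (q1 - q0) - ((w0 + w1) / 2) @ (x1 - x0)
# Price effect: price changes valued at average quantities.
price_effect = ((q0 + q1) / 2) @ (p1 - p0) - ((x0 + x1) / 2) @ (w1 - w0)

print(f"profit change   = {profit_change:+.2f}")
print(f"quantity effect = {quantity_effect:+.2f}")
print(f"price effect    = {price_effect:+.2f}")   # the two effects sum exactly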

11.
In the usual data envelopment analysis (DEA) setting, as pioneered by Charnes et al. (1978) [1], a set of decision making units (DMUs) is evaluated in terms of their relative efficiencies in converting a bundle of inputs into a bundle of outputs. The usual assumption in DEA is that each output is impacted by each and every member of the input set. One area of recent research concerns partial input-to-output impacts, where the main issue addressed is that in many settings not all inputs impact all outputs. In that situation the authors view the DMU as consisting of a set of mutually exclusive subunits, each with its own unique bundle of inputs and outputs. Also examined in this area is the presence of multiple processes for generating sets of outputs. Missing from that earlier work is consideration of outputs in the form of by-products, giving rise to a parent-offspring phenomenon. One modeling complication is that the parent assumes two different roles: it is an input affecting the offspring while at the same time being the dominant output. This gives rise to a model that we refer to as conditional two-stage. Another complication is that in the presence of multiple processes, by-products often arise from only a subset of those processes. In the current paper we develop a DEA-type methodology to handle partial input-to-output impacts in the presence of by-products.

12.
A wide range of uncertainties is inevitably introduced during a safety assessment of engineering systems. The impact of all these uncertainties must be addressed if the analysis is to serve as a tool in the decision-making process. Uncertainties in the components of the model (input parameters or basic events) are propagated to quantify their impact on the final results. Several methods are available in the literature, namely the method of moments, discrete probability analysis, Monte Carlo simulation, fuzzy arithmetic, and Dempster-Shafer theory. The methods differ both in how uncertainty is characterized at the component level and in how it is propagated to the system level, and each has desirable and undesirable features that make it more or less useful in different situations. In the probabilistic framework, which is the most widely used, probability distributions characterize uncertainty. However, in situations where one cannot specify (1) parameter values for input distributions, (2) precise probability distributions (shapes), or (3) dependencies between input parameters, these methods have limitations and are found to be ineffective. To address some of these limitations, this article presents uncertainty analysis in the context of level-1 probabilistic safety assessment (PSA) based on a probability bounds (PB) approach. PB analysis combines probability theory and interval arithmetic to produce probability boxes (p-boxes), structures that allow comprehensive and rigorous propagation through calculation. A practical case study is also carried out with a code developed on the PB approach, and the results are compared with those of a two-phase Monte Carlo simulation.
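A sketch of the p-box idea for a single input and a monotone model: the input's imprecise distribution is carried as lower and upper quantile bounds, which a monotone transform maps through directly, yielding bounds on output probabilities. Combining several possibly dependent inputs requires the full interval-arithmetic machinery (e.g., Williamson-Downs convolution), which is omitted here; all curves and numbers are invented.

import numpy as np

# A p-box: for each probability level, an interval [q_lo, q_hi] bounding
# the (unknown) quantile of the input X.
probs = np.linspace(0.01, 0.99, 99)
q_lo = 1.0 + 2.0 * probs              # quantiles of the upper CDF bound
q_hi = 2.0 + 2.0 * probs              # quantiles of the lower CDF bound

def model(x):
    return 1.0 - np.exp(-0.3 * x)     # monotone increasing toy model

# A monotone transform maps quantile bounds to quantile bounds.
out_lo, out_hi = model(q_lo), model(q_hi)

# Bounds on P(output > level): evaluate both CDF bounds at the level.
level = 0.5
F_upper = np.interp(level, out_lo, probs, left=0.0, right=1.0)
F_lower = np.interp(level, out_hi, probs, left=0.0, right=1.0)
print(f"P(output > {level}) lies in [{1 - F_upper:.2f}, {1 - F_lower:.2f}]")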

13.
This paper addresses the problem of sequencing in decentralized kanban-controlled flow shops. The kanban production control system considered uses two card types and a constant withdrawal period. The flow shops are decentralized in the sense that sequencing decisions are made at the local workstation level rather than by a centralized scheduling system. Further, there are no material requirements planning (MRP)-generated due dates available to drive dispatching rules such as earliest due date, slack, and critical ratio. Local sequencing rules suitable for the decentralized kanban production-control environment are proposed and tested in a simulation experiment. These rules are designed so that they can be implemented with only the information available at the workstation level. Example sequencing problems are used to show why the shortest processing time rule minimizes neither average work-in-process inventory nor average finished-goods withdrawal-kanban waiting time. Further, it is shown how workstation supervisors can use the withdrawal period, in addition to the number of kanbans, to manage work-in-process inventories.

14.
In production, firms do not possess complete information about the production technology, so different inputs may exhibit different levels of technical efficiency. Traditional "radial" measures of technical efficiency cannot identify efficiency differences across inputs. To address this problem, the literature has proposed measures based on profit functions and on directional distance functions. However, estimating a profit function requires price data that are difficult to collect, and estimating a directional distance function requires specifying in advance the direction in which inputs are contracted, a direction that is unknown to the researcher. Based on the input distance function, this paper constructs a new framework for measuring input-specific technical efficiency that requires neither price data nor a pre-specified "direction." The model is estimated with a two-step Bayesian procedure: first, the model parameters are estimated; second, given those parameter estimates, the technical efficiency of each input is estimated. Monte Carlo simulations show that, compared with estimating the input-specific efficiencies directly, this two-step approach achieves faster convergence of the Markov chain Monte Carlo (MCMC) process and estimates the efficiency of each input more accurately. The method is then applied to data from the Peking University corporate social responsibility survey to estimate the technical efficiency of capital and labor. The results show that firms suffer almost no efficiency loss in their use of capital, with little variation in capital efficiency across firms; efficiency losses stem mainly from the underutilization of labor, where efficiency varies considerably across firms. On average, labor is used at a technical efficiency of 77%, meaning that labor input could be reduced by 23% while holding output and capital constant. This example shows that the proposed method can identify the main sources of technical inefficiency in production and thereby help find ways to raise productive efficiency.
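A drastically simplified sketch of the two-step idea, applied to a single-input stochastic production frontier rather than the paper's input distance function: step 1 samples the frontier parameters by random-walk Metropolis; step 2, holding those estimates fixed, estimates each observation's inefficiency via the standard Jondrow et al. conditional-mean formula. All model choices and values here are illustrative assumptions.

import numpy as np
from scipy import stats

rng = np.random.default_rng(10)

# Simulated data: log-output = b0 + b1*log-input - u + v, with u >= 0 inefficiency.
n = 200
x = rng.normal(2.0, 0.5, n)
u_true = np.abs(rng.normal(0.0, 0.3, n))           # half-normal inefficiency
y = 1.0 + 0.6 * x - u_true + rng.normal(0.0, 0.15, n)

def loglik(theta):
    b0, b1, log_sv, log_su = theta
    sv, su = np.exp(log_sv), np.exp(log_su)
    eps = y - b0 - b1 * x
    sigma, lam = np.hypot(sv, su), su / sv
    # Normal/half-normal composed-error log-likelihood.
    return np.sum(np.log(2.0) - np.log(sigma)
                  + stats.norm.logpdf(eps / sigma)
                  + stats.norm.logcdf(-eps * lam / sigma))

# Step 1: random-walk Metropolis on the frontier parameters (flat priors).
slope, intercept = np.polyfit(x, y, 1)             # OLS starting values
theta = np.array([intercept, slope, np.log(0.2), np.log(0.2)])
ll, chain = loglik(theta), []
for _ in range(20_000):
    prop = theta + rng.normal(0.0, 0.02, 4)
    ll_prop = loglik(prop)
    if np.log(rng.random()) < ll_prop - ll:
        theta, ll = prop, ll_prop
    chain.append(theta)
b0, b1, log_sv, log_su = np.mean(chain[5_000:], axis=0)  # posterior means
sv, su = np.exp(log_sv), np.exp(log_su)

# Step 2: inefficiency per observation given the estimates (Jondrow et al.).
eps = y - b0 - b1 * x
mu_star = -eps * su**2 / (su**2 + sv**2)
s_star = sv * su / np.hypot(sv, su)
z = mu_star / s_star
u_hat = mu_star + s_star * stats.norm.pdf(z) / stats.norm.cdf(z)
print(f"mean estimated technical efficiency = {np.exp(-u_hat).mean():.3f}")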

15.
Models for assessing the risk of complex engineering systems are affected by uncertainties due to the randomness of several phenomena involved and to incomplete knowledge of some characteristics of the system. The objective of this article is to provide operative guidelines for handling some conceptual and technical issues related to the treatment of uncertainty in risk assessment for engineering practice. In particular, the following issues are addressed: (1) quantitative modeling and representation of uncertainty coherently with the information available on the system of interest; (2) propagation of the uncertainty from the input(s) to the output(s) of the system model; (3) (Bayesian) updating as new information on the system becomes available; and (4) modeling and representation of dependences among the input variables and parameters of the system model. Different approaches and methods are recommended for efficiently tackling each of issues (1)-(4); the tools considered are derived from classical probability theory as well as from alternative, not fully probabilistic frameworks for representing uncertainty (e.g., possibility theory). The recommendations are supported by the results obtained in illustrative applications from the literature.

16.
17.
Bhp Rivett 《Omega》1980,8(1):81-93
Indifference mapping uses multidimensional scaling techniques to locate multicriteria policies as points in a space, based on an input consisting of those pairs of policies that are equally attractive. The paper takes sets of policies, for each of which a single value has been pre-assigned, and maps them in two dimensions by applying a probability law to the assessment of indifferent pairs. The paper shows that the maps can be used both to deduce the original value of each policy and to deduce the probability law that was applied.

18.
A risk assessment was performed to incorporate uncertainty in food processing conditions into a risk-based sterilization process design. The focus of this analysis was the uncertainty associated with heterogeneous food products. Quartered button mushrooms were chosen as the food product because they represent the most typical case. A model for sterilization of spherical particles was used, and the uncertainty of each parameter was characterized for use in Monte Carlo simulation. Various particle distributions and fluid types were compared. The output of the model was the sterilization time required to achieve the target sterilization conditions with 95% probability. This value was then used to determine the mean fluid velocity for a given tube length. Finally, the model output was analyzed to determine the confidence in the results given the uncertainty in the input parameters. The model was more sensitive to variation in particle size distribution than to fluid type for power-law fluids. The 90% confidence interval spanned a holding-time range of 1 min. With 95% confidence that only 8% of the data would fall below the target sterilization conditions, a maximum of 9% of the data was expected to reach double the target level. The results of such an analysis would be useful for management decisions concerning the design of aseptic food processing operations.
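A sketch of the core Monte Carlo step under invented physics and numbers: uncertain particle sizes and fluid temperature produce a distribution of required holding times, and the design time is the value that meets the target with 95% probability.

import numpy as np

rng = np.random.default_rng(5)
n = 100_000

# Hypothetical uncertain inputs for quartered-mushroom particles.
d = rng.triangular(8.0, 12.0, 20.0, n)       # particle size, mm
t_fluid = rng.normal(130.0, 1.0, n)          # holding-tube fluid temp, deg C
lag = 0.04 * d**2                            # toy cold-spot temperature lag, deg C
t_cold = t_fluid - lag

f_target, t_ref, z = 6.0, 121.1, 10.0        # target F0 (min), reference temp, z-value
lethal_rate = 10 ** ((t_cold - t_ref) / z)   # minutes of F0 accrued per minute held
t_required = f_target / lethal_rate          # holding time needed per particle

t_design = np.percentile(t_required, 95)     # meets target with 95% probability
print(f"design holding time = {t_design:.2f} min")
print(f"fraction reaching 2x target at that time: "
      f"{np.mean(t_design * lethal_rate >= 2 * f_target):.2%}")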

19.
Bayesian Forecasting via Deterministic Model
Rational decision making requires that the total uncertainty about a variate of interest (a predictand) be quantified in terms of a probability distribution, conditional on all available information and knowledge. Suppose the state of knowledge is embodied in a deterministic model, which is imperfect and outputs only an estimate of the predictand. Fundamentals are presented of two Bayesian methods for producing a probabilistic forecast via any deterministic model. The Bayesian Processor of Forecast (BPF) quantifies the total uncertainty in terms of a posterior distribution, conditional on the model output. The Bayesian Forecasting System (BFS) decomposes the total uncertainty into input uncertainty and model uncertainty, which are characterized independently and then integrated into a predictive distribution. The BFS is compared with Monte Carlo simulation and the ensemble forecasting technique, neither of which can alone produce a probabilistic forecast that quantifies the total uncertainty, though each can serve as a component of the BFS.
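In its simplest Gaussian-linear form, the BPF is a conjugate update: a prior on the predictand is combined with a calibrated likelihood relating the deterministic model output to it. A sketch with hypothetical numbers:

import numpy as np

# Prior on the predictand W (e.g., river stage), from climatology.
m, s = 3.0, 1.0                   # prior mean and standard deviation

# Calibration of the deterministic model: its output X given W is roughly
# linear with residual scatter, X | W ~ N(a + b*W, tau^2).
a, b, tau = 0.2, 0.9, 0.5

x_obs = 3.4                       # today's deterministic model output

# Conjugate Gaussian update: posterior of W given X = x_obs.
prec_post = 1 / s**2 + b**2 / tau**2
var_post = 1 / prec_post
mean_post = var_post * (m / s**2 + b * (x_obs - a) / tau**2)

print(f"posterior: W | X ~ N({mean_post:.2f}, {np.sqrt(var_post):.2f}^2)")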

20.
A DEA Model for Determining the Scale Elasticity of Arbitrary Input-Output Combinations
This paper examines the quantitative analysis of returns to scale with data envelopment analysis (DEA). Previous studies have addressed only the output elasticity of the full input set with respect to the full output set. By building a general model, this paper computes the output elasticity of any input combination with respect to any output combination, making it possible to identify the optimal input combination that dominates the output elasticity. Expanding the scale of production along this optimal input combination avoids blind investment and wasted resources, achieving maximum output from minimum input.

