Similar articles
20 similar articles retrieved (search time: 15 ms)
1.
The purpose of the paper is to demonstrate the usefulness of (1) system dynamics as a structural theory for operations management and (2) system dynamics models as content theories in operations management. The key findings are that, although feedback loops, accumulation processes, and delays are widespread in operations management, these phenomena are often ignored completely or not considered appropriately. Hence, the paper argues that system dynamics is well suited as an approach for many operations management studies, and it shows how system dynamics theory can be used to explain, analyze, and understand such phenomena in operations management. The discussion is based on a literature review and on conceptual considerations, with examples of operations management studies based on system dynamics. Implications of using this theory include the necessary re-framing of some operations management issues and the extension of empirical studies by dynamic modeling and simulation. The value of the paper lies in the conceptualization of the link between system dynamics and operations management, which is discussed at the level of theory.
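The feedback-and-delay structure the paper emphasizes can be made concrete with a few lines of simulation. The following is a minimal sketch, assuming a generic inventory-adjustment model with a first-order delivery delay; all variable names and parameter values are illustrative, not taken from the paper.

```python
import numpy as np

# Minimal stock-and-flow sketch: an inventory stock adjusted toward a
# target through a supply line with a delivery delay. Feedback plus delay
# is what produces the dynamics system dynamics studies highlight.
# All parameter values are illustrative assumptions, not from the paper.
dt, horizon = 0.25, 100.0
target, delay, adjust_time = 100.0, 4.0, 2.0
demand = 10.0

inventory, supply_line = 80.0, 0.0
history = []
for _ in np.arange(0.0, horizon, dt):
    orders = max(0.0, demand + (target - inventory) / adjust_time)
    deliveries = supply_line / delay           # first-order delivery delay
    supply_line += (orders - deliveries) * dt  # accumulation (stock)
    inventory += (deliveries - demand) * dt    # accumulation (stock)
    history.append(inventory)

print(f"final inventory: {history[-1]:.1f}, "
      f"min: {min(history):.1f}, max: {max(history):.1f}")
```

With these settings, ordering that ignores the supply line typically makes the inventory overshoot and oscillate around its target, the kind of behavior a static formulation cannot reveal.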

2.
Risk-Based Ranking of Dominant Contributors to Maritime Pollution Events
This report describes a conceptual approach for identifying dominant contributors to risk from maritime shipping of hazardous materials. Maritime transportation accidents are relatively common compared with other, more frequently analyzed contributors to public risk, yet research on maritime safety and pollution incidents has not been guided by a systematic, risk-based approach. Maritime shipping accidents can be analyzed using event trees to group the accidents into "bins," or groups, of similar characteristics such as type of cargo, location of accident (e.g., harbor, inland waterway), type of accident (e.g., fire, collision, grounding), and size of release. The importance of specific types of events to each accident bin can be quantified. The overall importance of accident events to risk can then be estimated by weighting each event's per-bin importance measures by the risk associated with each accident bin.
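The weighting step described above is simple to express in code. Below is a minimal sketch, assuming hypothetical accident bins, bin risks, and per-bin importance values for a single event type ("navigation error"); none of the numbers come from the report.

```python
# Sketch of the weighting step described above: an event's overall risk
# importance is its per-bin importance weighted by each bin's risk.
# Bins and numbers are hypothetical, for illustration only.
bin_risk = {"harbor_fire": 0.8, "waterway_collision": 2.5, "harbor_grounding": 1.1}

# importance of 'navigation error' within each accident bin (e.g., a
# Fussell-Vesely-style fractional contribution)
event_importance = {"harbor_fire": 0.05, "waterway_collision": 0.40,
                    "harbor_grounding": 0.30}

overall = sum(event_importance[b] * bin_risk[b] for b in bin_risk)
print(f"risk-weighted importance of navigation error: {overall:.2f}")
```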

3.
Accidents with automatic production systems are reported to occur on the order of once per hundred to once per thousand robot-years, while fatal accidents occur one to two orders of magnitude less frequently. Occupational safety traditions tend to set safety targets for automatic systems at zero severe accidents. Decision-making requires a risk assessment that balances potential risk-reduction measures and costs within the cultural environment of a production company. This paper presents a simplified procedure that acts as a decision tool. The procedure is based on a risk concept that approaches prevention in both a deterministic and a probabilistic manner. Eight accident scenarios are shown to represent the potential accident processes involving robot interactions with people, and seven prevention policies are shown to cover these scenarios in principle. An additional probabilistic analysis can indicate which extra safety measures are available, at what risk reduction, and at what additional cost. The risk evaluation process aims at achieving a quantitatively acceptable risk level; for that purpose, three risk evaluation methods are discussed with respect to reaching broad consensus on the safety targets.

4.
A Flexible Count Data Regression Model for Risk Analysis
In many cases, risk and reliability analyses involve estimating the probabilities of discrete events such as hardware failures and occurrences of disease or death. There is often additional information in the form of explanatory variables that can be used to help estimate the likelihood of different numbers of events in the future through an appropriate regression model, such as a generalized linear model. However, existing generalized linear models (GLMs) are limited in their ability to handle the variance structures often encountered with count data in risk and reliability analysis. In particular, standard models cannot handle both underdispersed data (variance less than the mean) and overdispersed data (variance greater than the mean) in a single coherent modeling framework. This article presents a new GLM based on a reformulation of the Conway-Maxwell Poisson (COM) distribution that is useful for both underdispersed and overdispersed count data, and demonstrates the model by applying it to the assessment of electric power system reliability. The results show that the proposed COM GLM fits overdispersed data sets as well as the commonly used existing models while outperforming those models on underdispersed data sets.
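As a rough illustration of how such a model can be fitted, the sketch below writes down a COM-Poisson log-likelihood under the mean-like reparameterization mu = lambda^(1/nu) with a log link (one common reformulation; the article's exact formulation may differ) and fits it to simulated counts by direct optimization. The truncation point for the normalizing constant and all data are assumptions.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.special import gammaln

# Sketch of a Conway-Maxwell-Poisson (COM) GLM negative log-likelihood,
# assuming the reparameterization mu = lambda**(1/nu) with log link.
def com_nll(params, X, y, max_count=200):
    beta, log_nu = params[:-1], params[-1]
    nu = np.exp(log_nu)              # nu > 0; nu < 1 over-, nu > 1 underdispersed
    lam = np.exp(X @ beta) ** nu     # lambda = mu**nu
    j = np.arange(max_count)[:, None]
    # normalizing constant Z(lambda, nu), truncated sum over counts j
    logZ = np.log(np.exp(j * np.log(lam) - nu * gammaln(j + 1)).sum(axis=0))
    loglik = y * np.log(lam) - nu * gammaln(y + 1) - logZ
    return -loglik.sum()

rng = np.random.default_rng(0)
X = np.column_stack([np.ones(200), rng.normal(size=200)])
y = rng.poisson(np.exp(0.5 + 0.3 * X[:, 1]))   # toy data (plain Poisson)
fit = minimize(com_nll, x0=np.zeros(3), args=(X, y), method="Nelder-Mead")
print("beta:", fit.x[:2], "nu:", np.exp(fit.x[-1]))
```

On Poisson-generated data the fitted nu should land near 1; values away from 1 flag under- or overdispersion.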

5.
In this study, we consider the integrated inventory replenishment and transportation operations in a supply chain where the orders placed by the downstream retailer are dispatched by the upstream warehouse via an in-house fleet of limited size. We first consider the single-item, single-echelon case where the retailer operates with a quantity-based replenishment policy, (r,Q), and the warehouse is an ample supplier. We model the transportation operations as a queueing system and derive the operating characteristics of the system in exact terms. We extend this basic model to a two-echelon supply chain where the warehouse employs a base-stock policy. The departure process of the warehouse is characterized in distribution and is then approximated by an Erlang arrival process, matching the first two moments, for the analysis of the transportation queueing system. The operating characteristics and the expected cost rate are derived. An extension of this system to multiple retailers is also discussed. Numerical results are presented to illustrate the performance and sensitivity of the models and the value of coordinating inventory and transportation operations.
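The two-moment Erlang approximation mentioned above amounts to a short calculation: pick the Erlang shape whose squared coefficient of variation matches the target, then set the phase rate from the mean. A minimal sketch with illustrative moments:

```python
import numpy as np

# Sketch of the two-moment Erlang approximation: match an arrival
# stream's mean and variance with an Erlang-k renewal process.
# The moment values below are illustrative assumptions.
mean, var = 5.0, 7.0            # first two moments of interdeparture times
cv2 = var / mean**2             # squared coefficient of variation
k = max(1, round(1.0 / cv2))    # Erlang shape: cv^2 of Erlang-k is 1/k
rate = k / mean                 # phase rate so that the mean matches
print(f"Erlang-{k} with phase rate {rate:.3f} "
      f"(matched cv^2 = {1.0/k:.3f}, target {cv2:.3f})")
```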

6.
Statistical procedures are developed to estimate accident occurrence rates from historical event records, to predict future rates and trends, and to estimate the accuracy of the rate estimates and predictions. Maximum likelihood estimation is applied to several learning models, and the results are compared to earlier graphical and analytical estimates. The models are based on (1) the cumulative number of operating years, (2) the cumulative number of plants built, and (3) accidents (explicitly), with the accident rate distinctly different before and after an accident. The statistical accuracies of the estimated parameters are obtained in analytical form using the Fisher information matrix. Using data on core damage accidents in electricity-producing plants, it is estimated that the probability of a plant having a serious flaw has decreased from 0.1 to 0.01 during the developmental phase of the nuclear industry. At the same time, the equivalent frequency of accidents has decreased from 0.04 to 0.0004 per reactor year, partly due to the increasing population of plants.
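As a simplified illustration of the estimation machinery (the paper's learning models are richer, with rates that change over time), the sketch below computes the maximum likelihood estimate of a constant Poisson accident rate and its asymptotic standard error from the Fisher information; the counts and exposure are hypothetical.

```python
import numpy as np

# Sketch of the rate-estimation step: maximum likelihood for a constant
# accident rate from event counts, with the asymptotic standard error
# taken from the Fisher information. Data values are hypothetical.
n_events, exposure_years = 4, 10000.0   # e.g., accidents over reactor-years

lam_hat = n_events / exposure_years     # MLE of a homogeneous Poisson rate
fisher_info = exposure_years / lam_hat  # I(lambda) = T / lambda for Poisson
se = np.sqrt(1.0 / fisher_info)         # asymptotic std. error of lam_hat
print(f"rate = {lam_hat:.2e} per year, approx. 95% CI: "
      f"[{lam_hat - 1.96*se:.2e}, {lam_hat + 1.96*se:.2e}]")
```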

7.
The risk of catastrophic failures, for example in the aviation and aerospace industries, can be approached from different angles (e.g., statistics when they exist, or a detailed probabilistic analysis of the system). Each new accident either carries information already included in the experience base or constitutes new evidence that can be used to update a previous assessment of the risk. In this paper, we take a different approach and consider the risk and the updating from the investor's point of view. Based on the market response to past airplane accidents, we examine which ones created a surprise response and which ones were considered part of the risk of the airline business as previously assessed. To do so, we quantify the magnitude and timing of the observed market response to catastrophic accidents and compare it to an estimate of the response that would be expected based on the actual cost of the accident, including direct and indirect costs (the full-cost information response). First, we develop a method based on stock market data to measure the actual market response to an accident, and we construct an estimate of the full-cost information response to such an event. We then compare the two figures for the immediate and the long-term response of the market for the affected firm, as well as for the whole industry group to which the firm belongs. As an illustration, we analyze a sample of ten fatal accidents experienced by major US domestic airlines during the last seven years. In four cases, we observed an abnormal market response. In these instances, it seems that the shareholders may have updated their estimates of the probability of a future accident in the affected airlines or, more generally, of the firm's future business prospects. This market reaction is not always easy to explain, much less to anticipate, a fact which management should bear in mind when planning a firm's response to such an event.
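The market-response measurement is in the spirit of a standard event study. The sketch below, on simulated returns, fits a market model over a pre-event window and accumulates abnormal returns over the event window; the windows, dates, and all data are illustrative assumptions, not the paper's sample.

```python
import numpy as np

# Sketch of the market-response measurement: an event-study computation
# of cumulative abnormal returns around an accident date, using a
# market-model fit on a pre-event window. Data are simulated.
rng = np.random.default_rng(1)
market = rng.normal(0.0004, 0.01, 300)            # daily market returns
stock = 0.0002 + 1.2 * market + rng.normal(0, 0.008, 300)
stock[250:255] -= 0.02                            # accident hits days 250-254

est, event = slice(0, 250), slice(250, 260)
beta, alpha = np.polyfit(market[est], stock[est], 1)  # market-model fit
abnormal = stock[event] - (alpha + beta * market[event])
car = abnormal.cumsum()
print(f"CAR over the event window: {car[-1]:.3f}")
```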

8.
The aim of this article is to investigate some implications of complexity in workplace risk assessment. The workplace is examined as a complex system, and some of its attributes and aspects of its behavior are investigated. The failure probability of various workplace elements is examined as a time variable, and interference phenomena among these probabilities are presented. Potential inefficiencies of common perceptions in applying probabilistic risk assessment models are also discussed. This investigation is conducted through mathematical modeling and qualitative examples of workplace situations. A mathematical model is developed to simulate the evolution of workplace accident probability over time; its findings are then translated into real-world terms and discussed through simple examples of workplace situations. The mathematical model indicates that the workplace is likely to exhibit unpredictable behavior. Such behavior raises issues about usual key assumptions for the workplace, such as aggregation. Chaotic phenomena (nonlinear feedback mechanisms) are also investigated in simple cases of workplace systems. The main conclusions are (1) that time is an important variable for risk assessment, since behavior patterns are complex and unpredictable in the long term, and (2) that workplace risk identification should take a holistic view (rather than proceeding work post by work post).
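To see how a simple nonlinear feedback can make long-term behavior unpredictable, consider the toy iteration below. It uses a logistic-map-style update as a stand-in for the article's model (an assumption for illustration only): two failure probabilities that start 0.001 apart diverge after a few dozen steps.

```python
# Toy nonlinear feedback: a logistic-map-style update for a workplace
# failure probability. Tiny differences in initial conditions diverge
# over time (illustrative model, not the article's equations).
r = 3.9                   # feedback strength in the chaotic regime
p, q = 0.200, 0.201       # two nearly identical initial probabilities
for t in range(50):
    p = r * p * (1 - p)
    q = r * q * (1 - q)
print(f"after 50 steps: p = {p:.4f}, q = {q:.4f} (initially 0.001 apart)")
```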

9.
Defining a baseline for the frequency of occurrences at underground natural gas storage facilities is critical to maintaining safe operation and to developing appropriate risk management plans and regulatory approaches. Currently used frequency-estimation methods are reviewed and broadened in this article to include critical factors of cause, severity, and uncertainty that contribute to risk. A Bayesian probabilistic analysis characterizes the aleatory historical occurrence frequencies given imperfect sampling. Frequencies for the three main storage facility types in the United States (depleted oil-and-gas field storage, aquifer storage, and solution-mined salt cavern storage) are generally on the order of 3 to 9 × 10⁻² occurrences per facility-year, across all causes (surface, well integrity, subsurface integrity) and severities (nuisance, serious, catastrophic). Loss of well integrity is associated with many, but not all, occurrences, whether within the subsurface or from there up to the surface. The probability of one serious or catastrophic leakage occurrence to the ground surface within the next 10 years, assuming a constant number of facilities, is approximately 0.1-0.3% for any facility type. Storage operators and industry regulators can use occurrence frequencies, their associated probabilities and uncertainties, and forecasts of severity magnitudes to better prioritize resources, establish a baseline against which progress toward a reduction target can be measured, and develop more effective mitigation, monitoring, and reduction programs in a risk management plan.
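A conjugate Bayesian treatment of such occurrence frequencies is compact enough to sketch. The example below updates a Gamma prior with Poisson count data to get a posterior over the per-facility-year frequency; the prior parameters, counts, and exposure are hypothetical, and the article's analysis is more involved.

```python
from scipy import stats

# Sketch of a conjugate Bayesian occurrence-frequency estimate: a Gamma
# prior updated with Poisson count data gives a posterior over the
# per-facility-year frequency. Prior and data values are hypothetical.
a0, b0 = 0.5, 1.0                   # vague Gamma(shape, rate) prior
events, facility_years = 12, 300.0  # observed occurrences and exposure

post = stats.gamma(a0 + events, scale=1.0 / (b0 + facility_years))
print(f"posterior mean: {post.mean():.3f} per facility-year")
print(f"90% credible interval: [{post.ppf(0.05):.3f}, {post.ppf(0.95):.3f}]")
```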

10.
Urban road tunnels provide an increasingly cost-effective engineering solution, especially in compact cities like Singapore. For some urban road tunnels, characteristics such as tunnel configuration, geometry, provision of electrical and mechanical systems, and traffic volume may vary from one section to another. Urban road tunnels with such nonuniform characteristics are referred to as nonhomogeneous urban road tunnels. In this study, a novel quantitative risk assessment (QRA) model is proposed for nonhomogeneous urban road tunnels, because the existing QRA models for road tunnels cannot assess the risks in them. This model uses a tunnel segmentation principle whereby a nonhomogeneous urban road tunnel is divided into homogenous sections. Individual risk for road tunnel sections as well as integrated risk indices for the entire road tunnel are defined. The article then develops a new QRA model for each of the homogeneous sections. Compared to the existing QRA models for road tunnels, this section-based model incorporates one additional top event (toxic gases due to traffic congestion) and employs the Poisson regression method to estimate the vehicle accident frequencies of tunnel sections. The article further illustrates an aggregated QRA model for nonhomogeneous urban tunnels by integrating the section-based QRA models. Finally, a case study in Singapore is carried out.
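The Poisson-regression step for section accident frequencies can be sketched directly. The example below regresses simulated section counts on a traffic covariate, with section length entering as an exposure offset; the covariates and data are illustrative assumptions.

```python
import numpy as np
import statsmodels.api as sm

# Sketch of the frequency-estimation step: Poisson regression of section
# accident counts on section characteristics, with exposure as an offset.
# Covariates and data are simulated for illustration.
rng = np.random.default_rng(2)
n = 40                                          # tunnel sections
traffic = rng.uniform(0.5, 2.0, n)              # relative traffic volume
length_km = rng.uniform(0.2, 1.5, n)            # section length (exposure)
counts = rng.poisson(np.exp(-1.0 + 0.8 * traffic) * length_km)

X = sm.add_constant(traffic)
model = sm.GLM(counts, X, family=sm.families.Poisson(),
               offset=np.log(length_km)).fit()
print(model.params)   # intercept and traffic effect on log-frequency
```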

11.
This paper proposes a risk analysis model to determine the risk degrees of the risk factors occurring in product development processes. The model uses both fuzzy theory and Markov processes on a concurrent engineering (CE) basis. Fuzzy models determine the impact values of the risk factors, and Markov processes determine the probability of risk occurrences. The analysis model computes the risk degrees by multiplying the probability of risk occurrence by the impact value. This study can be used to analyze the influence of risk factors on product development projects and will contribute toward the development of a risk management framework (RMF) to defend against various risk factors. Implications and directions for future research are discussed.
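The risk-degree computation combines the two ingredients just described. The sketch below takes the long-run occurrence probability from a two-state Markov chain and multiplies it by a centroid-defuzzified triangular impact value; the transition matrix and fuzzy numbers are illustrative, not from the paper.

```python
import numpy as np

# Sketch of the two ingredients combined above: a Markov chain gives the
# long-run probability of a risk factor being active, and a defuzzified
# impact score scales it into a risk degree. All numbers are illustrative.
P = np.array([[0.9, 0.1],    # transition matrix over {inactive, active}
              [0.4, 0.6]])
eigvals, eigvecs = np.linalg.eig(P.T)
stationary = np.real(eigvecs[:, np.argmax(np.real(eigvals))])
stationary /= stationary.sum()
p_active = stationary[1]     # long-run probability the risk occurs

# centroid defuzzification of a triangular impact value (low, mode, high)
low, mode, high = 2.0, 5.0, 8.0
impact = (low + mode + high) / 3.0

print(f"risk degree = {p_active:.3f} * {impact:.1f} = {p_active * impact:.3f}")
```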

12.
Risk Analysis, 2018, 38(9): 1988-2009
Harbor seals in Iliamna Lake, Alaska, are a small, isolated population, and one of only two freshwater populations of harbor seals in the world, yet little is known about their abundance or risk of extinction. Bayesian hierarchical models were used to estimate the abundance and trend of this population. Observational models were developed from aerial survey and harvest data, and they included effects of time of year and time of day on survey counts. Underlying models of abundance and trend were based on a Leslie matrix model that used prior information on vital rates from the literature. We developed three scenarios for variability in the priors and used them as part of a sensitivity analysis. The models were fitted using Markov chain Monte Carlo methods. The population production rate implied by the vital rate estimates was about 5% per year, very similar to the average annual harvest rate. After a period of growth in the 1980s, the population appears to be relatively stable at around 400 individuals. A population viability analysis assessed the risk of quasi-extinction, defined as any reduction to 50 animals or below in the next 100 years, at 1% to 3%, depending on the prior scenario. Although this risk is moderately low, the analysis does not include genetic effects or catastrophic environmental events, which may have affected the population in the past, so our results should be applied cautiously.
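The underlying projection-matrix logic can be sketched with a small Monte Carlo. The code below propagates a three-stage Leslie-style matrix with lognormal vital-rate noise and a constant harvest, and counts runs that fall to the quasi-extinction threshold; the matrix entries, noise level, and harvest rate are illustrative assumptions, not the paper's fitted values, and the actual analysis is Bayesian and hierarchical.

```python
import numpy as np

# Sketch of a projection-matrix viability check: propagate a stage-
# structured population with a Leslie-style matrix, add vital-rate noise
# and a harvest, and count runs that dip to the quasi-extinction level.
# Matrix entries and rates are illustrative, not the paper's estimates.
rng = np.random.default_rng(3)
A = np.array([[0.00, 0.15, 0.25],   # fecundities (pups per female)
              [0.80, 0.00, 0.00],   # pup -> juvenile survival
              [0.00, 0.85, 0.95]])  # juvenile/adult survival
n0 = np.array([80.0, 60.0, 260.0])  # about 400 animals
harvest_rate, threshold = 0.05, 50.0

hits = 0
for _ in range(2000):
    n = n0.copy()
    for _ in range(100):            # 100-year horizon
        noise = rng.lognormal(0.0, 0.05, size=A.shape)
        n = (A * noise) @ n * (1 - harvest_rate)
        if n.sum() <= threshold:
            hits += 1
            break
print(f"quasi-extinction probability ~ {hits / 2000:.3f}")
```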

13.
Human factors are widely regarded as major contributors to failures of maritime accident prevention systems. The conventional methods for human factor assessment, especially quantitative techniques such as fault trees and bow-ties, are static and cannot deal with models involving uncertainty, which limits their application to human factors risk analysis. To alleviate these drawbacks, the present study introduces a new human factor analysis framework called the multidimensional analysis model of accident causes (MAMAC). MAMAC combines the human factors analysis and classification system with business process management. In addition, intuitionistic fuzzy set theory and Bayesian networks are integrated into MAMAC to form a comprehensive dynamic human factors analysis model characterized by flexibility and uncertainty handling. The proposed model is tested on maritime accident scenarios from a sand carrier accident database in China to investigate the human factors involved, and the 10 most highly contributing primary events associated with the human factors leading to sand carrier accidents are identified. According to the results of this study, direct human factors, classified as unsafe acts, are not a focus for maritime investigators and scholars. Meanwhile, unsafe preconditions and unsafe supervision are listed as the top two considerations for human factors analysis, especially supervision failures of shipping companies and ship owners. Moreover, potential safety countermeasures for the most highly contributing human factors are proposed in this article. Finally, an application of the proposed model verifies its advantages in calculating the failure probability of accidents induced by human factors.

14.
Integrated manufacturing operations typically are organized along hierarchical lines. Characterized by product aggregation and time horizon, hierarchical decompositions aim at easing problems associated with the complexity and scale of the manufacturing function taken as a whole. Static models have been developed and employed that facilitate the analysis and functioning of these organizations. Existing models are valuable aids in assisting goal-planning functions, but provide little guidance for directing the pursuit of goals. This paper presents a new hierarchical model of integrated manufacturing operations based on concepts of management control. The model is congruent with commonly used static planning models, while at the same time depicting real-time, goal-achievement efforts within a dynamic operating environment. Emphasizing the interactions between goal planning and goal achievement, the dynamic model provides a means of assessing the effects of decentralization and autonomy on the goal planning and achievement process. The model is used to identify two resource-consuming chain reactions linked to replanning and goal pursuit within the hierarchy. A simple example based on the dynamic extension of a typical static decomposition illustrates the key findings.

15.
Terje Aven, Risk Analysis, 2007, 27(2): 303-312
To protect people from hazards, the common safety regulation regime in many industries is based on the use of minimum standards formulated as risk acceptance or tolerability limits. The limits are seen as absolute, and in principle they should be met regardless of costs. The justification is ethical: people should not be exposed to a risk level exceeding certain limits. In this article, we discuss this approach to safety regulation and its justification. We argue that the use of such limits rests on some critical assumptions: that low accident risk has a value in itself, that risk can be accurately measured, and that the authorities can specify the limits. However, these assumptions are not in general valid, and hence the justification of the approach can be questioned. In the article, we look more closely into these issues, and we conclude that there is a need to rethink this regulatory approach: its ethical justification is no stronger than that of alternative approaches. Essential for the analysis is the distinction between ethics of the mind and ethics of the consequences, which has several implications that are discussed.

16.
In this paper we extend the work reported in prior studies. The conclusion drawn from the aggregate of those studies was that the limitation on liability imposed by the Price–Anderson Act for a catastrophic accident at a nuclear power plant ($560 million) is comparable to de facto limitations on recovery following catastrophic events in many other industries. The analysis in those reports was at a high level of abstraction, comparing almost exclusively the potential loss from high-consequence accidents with the current assets of major firms in the relevant industries. We found that potential loss exceeded assets.

17.
We study inference in structural models with a jump in the conditional density, where location and size of the jump are described by regression curves. Two prominent examples are auction models, where the bid density jumps from zero to a positive value at the lowest cost, and equilibrium job-search models, where the wage density jumps from one positive level to another at the reservation wage. General inference in such models remained a long-standing, unresolved problem, primarily due to nonregularities and computational difficulties caused by discontinuous likelihood functions. This paper develops likelihood-based estimation and inference methods for these models, focusing on optimal (Bayes) and maximum likelihood procedures. We derive convergence rates and distribution theory, and develop Bayes and Wald inference. We show that Bayes estimators and confidence intervals are attractive both theoretically and computationally, and that Bayes confidence intervals, based on posterior quantiles, provide a valid large sample inference method.
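The nature of the nonregularity can be seen in a toy version of the problem. With density f(y | theta) = exp(-(y - theta)) for y >= theta, the likelihood jumps to zero for theta above the sample minimum, the MLE is that minimum, and its error shrinks like 1/n rather than 1/sqrt(n); the simulation below illustrates this (the jump location is a single parameter here, whereas the paper treats regression curves).

```python
import numpy as np

# Toy illustration of the nonregularity: with density
# f(y|th) = exp(-(y - th)) for y >= th (a jump from zero at th), the
# likelihood is discontinuous in th and the MLE is the sample minimum,
# converging at rate n rather than sqrt(n). Simulated data.
rng = np.random.default_rng(4)
for n in (100, 1000, 10000):
    y = 2.0 + rng.exponential(1.0, n)  # true jump location th = 2.0
    th_hat = y.min()                   # MLE: likelihood zero for th > min(y)
    print(f"n = {n:6d}: error = {th_hat - 2.0:.5f} "
          f"(roughly 1/n, not 1/sqrt(n))")
```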

18.
Whether a non-performing loan yields any recovery is one of the decisive factors in its pricing, day-to-day management, and recovery strategy, and macroeconomic conditions and disposal timing are two key forces influencing whether any recovery occurs. Drawing on LossMetrics™, China's largest database of non-performing loans, this paper uses a family of logistic models to study the factors driving the dynamic changes in the zero-recovery intensity of non-performing loans worked out between 2001 and 2008. Sub-models and a full model are built for different samples, and the results of the models are compared. The full-horizon model analyzes the direct relationship between GDP growth and zero-recovery intensity. The sample of individual loans is further divided by disposal time into four groups (under 12 months, 12-22 months, 23-60 months, and over 60 months), and separate models are built for each to analyze how the factors influencing zero-recovery intensity differ among them. The results show that GDP growth is significantly negatively related to zero-recovery intensity in most models, and that effective collateral is significant in most sub-models, although its significance varies across the disposal-time sub-models. Studying zero-recovery intensity supports the design of effective, scientifically grounded recovery strategies that take both macroeconomic conditions and disposal timing into account.
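A minimal version of such a zero-recovery logistic model is easy to sketch. The example below fits a logit of a zero-recovery indicator on GDP growth and a collateral dummy using simulated data; the variable names and coefficients are illustrative assumptions, not LossMetrics™ fields or the paper's estimates.

```python
import numpy as np
import statsmodels.api as sm

# Sketch of the kind of logistic model described above: probability of
# zero recovery as a function of GDP growth and a collateral indicator.
# Data are simulated; names are illustrative, not LossMetrics fields.
rng = np.random.default_rng(5)
n = 500
gdp_growth = rng.normal(9.0, 2.0, n)     # percent
collateral = rng.integers(0, 2, n)       # effective collateral dummy
logit = 1.0 - 0.25 * gdp_growth - 0.8 * collateral
zero_recovery = rng.binomial(1, 1 / (1 + np.exp(-logit)))

X = sm.add_constant(np.column_stack([gdp_growth, collateral]))
fit = sm.Logit(zero_recovery, X).fit(disp=0)
print(fit.params)  # negative GDP-growth coefficient mirrors the reported sign
```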

19.
Surgical suites are a key driver of a hospital's costs, revenues, and utilization of postoperative resources such as beds. This article describes some commonly occurring operations management problems faced by the managers of surgical suites. For three of these problems, the article also provides preliminary models and possible solution approaches. Its goal is to identify open challenges to spur further research by the operations management community on an important class of problems that have not received adequate attention in the literature, despite their economic importance.

20.
This article proposes a modified cognitive reliability and error analysis method (CREAM) for estimating the human error probability in the maritime accident process on the basis of an evidential reasoning approach. The modified CREAM is developed to quantify precisely the linguistic variables of the common performance conditions and to overcome the existing CREAM models' neglect of the uncertainty caused by incomplete information. Moreover, this article views maritime accident development from a sequential perspective, proposing a scenario- and barrier-based framework to describe the maritime accident process. The evidential reasoning-based CREAM approach, together with the proposed accident development framework, is applied to the human reliability analysis of a ship capsizing accident. It will facilitate subjective human reliability analysis in different engineering systems where uncertainty exists in practice.
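For orientation, the basic-CREAM step that the modified method refines can be sketched as a lookup from common performance condition (CPC) tallies to a control mode and a generic failure-probability interval. In the sketch below, the interval endpoints follow commonly cited CREAM tables, but the mode-boundary rule is a deliberately simplified assumption, not the article's evidential-reasoning mapping.

```python
# Sketch of the basic-CREAM step: count the common performance
# conditions (CPCs) judged to reduce or improve reliability, map the
# tally to a control mode, and read off a generic failure-probability
# interval. The mode boundaries below are a simplified assumption.
HEP_INTERVALS = {
    "strategic":     (5e-5, 1e-2),
    "tactical":      (1e-3, 1e-1),
    "opportunistic": (1e-2, 5e-1),
    "scrambled":     (1e-1, 1.0),
}

def control_mode(n_reduced: int, n_improved: int) -> str:
    # crude threshold rule standing in for Hollnagel's region diagram
    if n_reduced >= 7:
        return "scrambled"
    if n_reduced >= 4:
        return "opportunistic"
    if n_reduced >= 2 and n_improved < 4:
        return "tactical"
    return "strategic"

mode = control_mode(n_reduced=3, n_improved=1)  # example CPC tally
print(mode, HEP_INTERVALS[mode])
```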
