Similar Documents
20 similar documents found
1.
This article proposes an intertemporal risk-value (IRV) model that integrates probability-time tradeoff, time-value tradeoff, and risk-value tradeoff into one unified framework. We obtain a general probability-time tradeoff, which yields a formal representation of the psychological distance a decision maker perceives when evaluating a temporal lottery. This intuition of probability-time tradeoff is supported by robust empirical findings as well as by psychological theory. Through an explicit formalization of the probability-time tradeoff, we establish an IRV model that accounts for three fundamental dimensions: value, probability, and time. The object of evaluation in our framework is a complex lottery. We also give some insights into the structure of the IRV model using a wildcatter problem.
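As a schematic illustration of how the three dimensions can enter a single functional (the form below is a notational assumption for exposition, not the authors' exact representation), a probability-time tradeoff maps delay into an equivalent probability discount, so a single-outcome temporal lottery (x, p, t) might be evaluated as

```latex
V(x, p, t) = w\bigl(p \cdot \varphi(t)\bigr)\, v(x)
```

where v is a value function, w a probability weighting function, and φ(t) in (0, 1] decreases with the delay t, acting as a discount factor inside the probability dimension.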

2.
Knowledge of failure events and their associated factors, gained from past construction projects, is regarded as potentially extremely useful in risk management. However, several circumstances constrain its wider use: such knowledge is usually scarce, seldom documented, and often unavailable when it is required, and proven methods to integrate and analyze it cost-effectively are lacking. This article addresses possible options to overcome these difficulties. Focusing on a limited number of critical potential failure events, the article demonstrates how knowledge on several important potential failure events in tunnel works can be integrated. The problem of unavailable or incomplete information was addressed by gathering judgments from a group of experts. The elicited expert knowledge consisted of failure scenarios and associated probabilistic information. This information was integrated using models based on Bayesian belief networks, customized to deal with the expected divergence in judgments caused by the epistemic uncertainty of risks. The work described in the article shows that the developed models, which integrate risk-related knowledge, provide guidance as to the use of specific remedial measures.
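The integration step can be pictured with a hand-rolled two-layer belief network evaluated by enumeration; the structure (ground conditions driving face instability and support failure, which jointly drive collapse) and all numbers below are illustrative assumptions, not the paper's elicited tunnel models.

```python
from itertools import product

# Toy network: GroundPoor -> {FaceInstability, SupportFailure} -> Collapse
p_ground_poor = 0.3                                   # expert prior
p_face = {True: 0.20, False: 0.05}                    # P(face instability | ground poor?)
p_support = {True: 0.10, False: 0.02}                 # P(support failure | ground poor?)
p_collapse = {(True, True): 0.60, (True, False): 0.15,
              (False, True): 0.25, (False, False): 0.01}

# Marginal P(collapse) by summing the joint distribution over parent states
p = 0.0
for g, f, s in product([True, False], repeat=3):
    pg = p_ground_poor if g else 1 - p_ground_poor
    pf = p_face[g] if f else 1 - p_face[g]
    ps = p_support[g] if s else 1 - p_support[g]
    p += pg * pf * ps * p_collapse[(f, s)]
print(f"P(collapse) = {p:.4f}")
```

A dedicated library (e.g., pgmpy) would replace the enumeration loop for larger elicited networks.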

3.
This paper uses “revealed probability trade‐offs” to provide a natural foundation for probability weighting in the famous von Neumann and Morgenstern axiomatic set‐up for expected utility. In particular, it shows that a rank‐dependent preference functional is obtained in this set‐up when the independence axiom is weakened to stochastic dominance and a probability trade‐off consistency condition. In contrast with the existing axiomatizations of rank‐dependent utility, the resulting axioms allow for complete flexibility regarding the outcome space. Consequently, a parameter‐free test/elicitation of rank‐dependent utility becomes possible. The probability‐oriented approach of this paper also provides theoretical foundations for probabilistic attitudes towards risk. It is shown that the preference conditions that characterize the shape of the probability weighting function can be derived from simple probability trade‐off conditions.
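For reference, the rank-dependent functional characterized by such axioms has the standard form: for outcomes ranked x_1 ≥ x_2 ≥ … ≥ x_n with probabilities p_1, …, p_n,

```latex
\mathrm{RDU}(X) = \sum_{i=1}^{n} \left[ w\!\Bigl(\sum_{j=1}^{i} p_j\Bigr) - w\!\Bigl(\sum_{j=1}^{i-1} p_j\Bigr) \right] u(x_i)
```

where u is the utility function and w is a strictly increasing probability weighting function with w(0) = 0 and w(1) = 1; the probability trade-off conditions in the paper constrain the shape of w.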

4.
Regulatory agencies often perform microbial risk assessments to evaluate the change in the number of human illnesses as the result of a new policy that reduces the level of contamination in the food supply. These agencies generally have regulatory authority over the production and retail sectors of the farm‐to‐table continuum. Any predicted change in contamination that results from new policy that regulates production practices occurs many steps prior to consumption of the product. This study proposes a framework for conducting microbial food‐safety risk assessments; this framework can be used to quantitatively assess the annual effects of national regulatory policies. Advantages of the framework are that estimates of human illnesses are consistent with national disease surveillance data (which are usually summarized on an annual basis) and some of the modeling steps that occur between production and consumption can be collapsed or eliminated. The framework leads to probabilistic models that include uncertainty and variability in critical input parameters; these models can be solved using a number of different Bayesian methods. The Bayesian synthesis method performs well for this application and generates posterior distributions of parameters that are relevant to assessing the effect of implementing a new policy. An example, based on Campylobacter and chicken, estimates the annual number of illnesses avoided by a hypothetical policy; this output could be used to assess the economic benefits of a new policy. Empirical validation of the policy effect is also examined by estimating the annual change in the numbers of illnesses observed via disease surveillance systems.
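A minimal Monte Carlo sketch of the framework's annual-scale logic, assuming (purely for illustration) that annual illnesses scale proportionally with contamination prevalence and that the policy's prevalence reduction is uncertain; the distributions below are not the paper's Campylobacter estimates.

```python
import numpy as np

rng = np.random.default_rng(42)

n = 50_000
# uncertainty about the annual baseline, anchored to disease surveillance
baseline_illnesses = rng.normal(1.2e6, 2e5, n)
# uncertain policy effect: a 0-50% reduction in prevalence at production
prevalence_reduction = rng.beta(8, 2, n) * 0.5

avoided = baseline_illnesses * prevalence_reduction   # proportionality assumption
print(f"mean annual illnesses avoided: {avoided.mean():,.0f}")
print(f"90% interval: ({np.quantile(avoided, 0.05):,.0f}, "
      f"{np.quantile(avoided, 0.95):,.0f})")
```

The posterior distributions from the Bayesian synthesis step would replace the assumed input distributions here.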

5.
In this article, we propose an integrated direct and indirect flood risk model for small- and large-scale flood events, allowing for dynamic modeling of total economic losses from a flood event to a full economic recovery. A novel approach is taken that translates direct losses of both capital and labor into production losses using the Cobb-Douglas production function, aiming at improved consistency in loss accounting. The recovery of the economy is modeled using a hybrid input-output model and applied to the port region of Rotterdam, using six different flood events (1/10 up to 1/10,000). This procedure provides better insight into the consequences of both high- and low-probability floods. The results show that in terms of expected annual damage, direct losses remain more substantial than the indirect losses (approximately 50% larger), but for low-probability events the indirect losses outweigh the direct losses. Furthermore, we explored parameter uncertainty using a global sensitivity analysis, and varied critical assumptions in the modeling framework related to, among others, flood duration and labor recovery, using a scenario approach. Our findings have two important implications for disaster modelers and practitioners. First, high-probability events are qualitatively different from low-probability events in terms of the scale of damages and the full recovery period. Second, there are substantial differences in parameter influence between high-probability and low-probability flood modeling. These findings suggest that a detailed approach is required when assessing the flood risk for a specific region.
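The capital-and-labor translation step can be sketched in a few lines; the Cobb-Douglas parameters and loss fractions are illustrative assumptions, not the Rotterdam calibration.

```python
A, alpha = 1.0, 0.35               # technology level and capital share (assumed)

def output(K, L):
    # Cobb-Douglas production: Y = A * K^alpha * L^(1 - alpha)
    return A * K**alpha * L**(1 - alpha)

K0, L0 = 100.0, 100.0              # pre-flood capital and labor
k_loss, l_loss = 0.20, 0.05        # direct loss fractions caused by the flood

y0 = output(K0, L0)
y1 = output(K0 * (1 - k_loss), L0 * (1 - l_loss))
print(f"production loss: {100 * (1 - y1 / y0):.1f}% of pre-flood output")
```

Feeding this production shock into an input-output model then yields the indirect losses over the recovery path.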

6.
Recent advances in approaches and production technologies for the production of goods and services have made just-in-time (JIT) a strong alternative for use in intermittent and small batch production systems, especially when time-based competition is the norm and a low inventory is a must. However, the conventional JIT system is designed for mass production with a stable master production schedule. This paper suggests supplementing the information provided by production kanbans with information about customer waiting lines, to be used by operators to schedule production in each workstation of intermittent and small batch production systems. This paper uses simulation to analyze the effect of four scheduling policy variables (number of kanbans, length of the withdrawal cycle, information about customer waiting lines, and priority rules) on two performance measures (customer wait-time and inventory). The results show that using information about customer waiting lines reduces customer wait-time by about 30% while also reducing inventory by about 2%. In addition, the effect of information about customer waiting lines overshadows the effect of priority rules on customer wait-time and inventory.
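The suggested supplement to the kanban signal can be read as a dispatching rule; the scoring weights below are an illustrative assumption, not the paper's simulated policy.

```python
def next_item_kanban_only(kanbans):
    # Conventional rule: produce the part type with the most waiting kanbans.
    return max(kanbans, key=kanbans.get)

def next_item_queue_informed(kanbans, waiting_customers, weight=2.0):
    # Supplemented rule: bias the decision toward part types whose customer
    # waiting lines are longest (the extra information the paper proposes).
    score = {p: kanbans[p] + weight * waiting_customers[p] for p in kanbans}
    return max(score, key=score.get)

kanbans = {"A": 2, "B": 3, "C": 1}
waiting = {"A": 4, "B": 0, "C": 1}
print(next_item_kanban_only(kanbans))               # -> B
print(next_item_queue_informed(kanbans, waiting))   # -> A
```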

7.
Expert knowledge is an important source of input to risk analysis. In practice, experts might be reluctant to characterize their knowledge and the related (epistemic) uncertainty using precise probabilities. The theory of possibility allows for imprecision in probability assignments. The associated possibilistic representation of epistemic uncertainty can be combined with, and transformed into, a probabilistic representation; in this article, we show this with reference to a simple fault tree analysis. We apply an integrated (hybrid) probabilistic‐possibilistic computational framework for the joint propagation of the epistemic uncertainty on the values of the (limiting relative frequency) probabilities of the basic events of the fault tree, and we use possibility‐probability (probability‐possibility) transformations for propagating the epistemic uncertainty within purely probabilistic and possibilistic settings. The results of the different approaches (hybrid, probabilistic, and possibilistic) are compared with respect to the representation of uncertainty about the top event (limiting relative frequency) probability. Both the rationale underpinning the approaches and the computational efforts they require are critically examined. We conclude that the approaches relevant in a given setting depend on the purpose of the risk analysis, and that further research is required to make the possibilistic approaches operational in a risk analysis context.
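A compressed sketch of the hybrid propagation for a two-event OR gate: one basic-event probability carries probabilistic epistemic uncertainty, the other only a triangular possibility distribution sampled by alpha-cuts. The distributions and gate are illustrative assumptions, not the article's fault tree.

```python
import numpy as np

rng = np.random.default_rng(1)

n = 20_000
p1 = rng.beta(2, 50, n)                  # probabilistic description of event 1

a, m, b = 0.001, 0.01, 0.05              # triangular possibility for event 2
alpha = rng.random(n)                    # random possibility levels
lo = a + alpha * (m - a)                 # alpha-cut lower bounds
hi = b - alpha * (b - m)                 # alpha-cut upper bounds

# OR gate with independent basic events: P(top) = 1 - (1 - p1)(1 - p2)
top_lo = 1 - (1 - p1) * (1 - lo)
top_hi = 1 - (1 - p1) * (1 - hi)
print(f"top-event probability bounds (means): [{top_lo.mean():.4f}, {top_hi.mean():.4f}]")
```

The gap between the bounds reflects the imprecision that a purely probabilistic transformation would collapse into a single distribution.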

8.
The development of catastrophe models in recent years allows for assessment of the flood hazard much more effectively than when the federally run National Flood Insurance Program (NFIP) was created in 1968. We propose and then demonstrate a methodological approach to determine pure premiums based on the entire distribution of possible flood events. We apply hazard, exposure, and vulnerability analyses to a sample of 300,000 single-family residences in two counties in Texas (Travis and Galveston) using state-of-the-art flood catastrophe models. Even in zones of similar flood risk classification by FEMA, there is substantial variation in exposure between coastal and inland flood risk. For instance, homes in the designated moderate-risk X500/B zones in Galveston are exposed to a flood risk on average 2.5 times greater than residences in X500/B zones in Travis. The results also show very similar average annual loss (corrected for exposure) for a number of residences despite their being in different FEMA flood zones. We also find significant storm-surge exposure outside of the FEMA-designated storm-surge risk zones. Taken together, these findings highlight the importance of a microanalysis of flood exposure. The process of aggregating risk at a flood zone level, as currently undertaken by FEMA, provides a false sense of uniformity. As our analysis indicates, the technology to delineate flood risks exists today.
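The pure-premium computation the abstract proposes reduces to integrating loss over the event distribution; the event table below is an illustrative assumption, not the Texas model output.

```python
# (annual occurrence rate, loss to the residence in $) per modeled flood event
event_set = [
    (0.10,    5_000),
    (0.02,   40_000),
    (0.004, 150_000),
    (0.001, 300_000),
]

# pure premium = expected annual loss over the entire event distribution
pure_premium = sum(rate * loss for rate, loss in event_set)
print(f"pure premium (AAL): ${pure_premium:,.0f} per year")
```

Flood-zone-average pricing would instead apply one rate to residences whose event sets, as the article shows, can differ severalfold.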

9.
We develop a real‐options model for optimizing production and sourcing choices under evolutionary supply‐chain risk. We model lead time as an endogenous decision and calculate the cost differential required to compensate for the risk exposure coming from lead time. The shape of the resulting cost‐differential frontier reveals the term structure of supply‐chain risk premiums and provides guidance as to the potential value of lead‐time reduction. Under constant demand volatility, the break‐even cost differential increases in volatility and lead time at a decreasing rate, making incremental lead‐time reduction less valuable than full lead‐time reduction. Stochastic demand volatility increases the relative value of incremental lead‐time reduction. When demand has a heavy right tail, the value of lead‐time reduction depends on how extreme values of demand are incorporated into the forecasting process. The cost‐differential frontier is invariant to discount rates, making the cost of capital irrelevant for choosing between lead times. We demonstrate the managerial implications of the model by applying it first to the classic Sport‐Obermeyer case and then to a supplier‐selection problem faced by a global manufacturer.
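A newsvendor-flavored Monte Carlo sketch of the break-even cost differential: committing under a longer lead time means ordering against a wider demand forecast. The forecast-evolution and cost parameters are illustrative assumptions, not the paper's real-options model.

```python
import numpy as np

rng = np.random.default_rng(3)

mu, sigma_per_period = 100.0, 0.15      # mean demand; forecast volatility per period
cu, co = 4.0, 1.0                       # underage and overage cost per unit

def expected_mismatch_cost(lead_time, n=200_000):
    sigma = sigma_per_period * np.sqrt(lead_time)   # uncertainty grows with lead time
    demand = rng.lognormal(np.log(mu) - 0.5 * sigma**2, sigma, n)
    q = np.quantile(demand, cu / (cu + co))         # critical-fractile order quantity
    return np.mean(cu * np.maximum(demand - q, 0) + co * np.maximum(q - demand, 0))

long_cost, short_cost = expected_mismatch_cost(8), expected_mismatch_cost(1)
breakeven = (long_cost - short_cost) / mu           # per-unit premium worth paying
print(f"break-even unit-cost differential for full lead-time reduction: {breakeven:.3f}")
```

Evaluating the same quantity over intermediate lead times traces out the cost-differential frontier described in the abstract.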

10.
Risk Analysis, 2018, 38(4): 694-709
Subsurface energy activities entail the risk of induced seismicity, including low-probability high-consequence (LPHC) events. For designing the corresponding risk communication, the scientific literature lacks empirical evidence of how the public reacts to different written risk communication formats about such LPHC events and to related uncertainty or expert confidence. This study presents findings from an online experiment (N = 590) that empirically tested the public's responses to risk communication about induced seismicity and to different technology frames, namely deep geothermal energy (DGE) and shale gas (between-subject design). Three incrementally different formats of written risk communication were tested: (i) qualitative, (ii) qualitative and quantitative, and (iii) qualitative and quantitative with risk comparison. Respondents found the latter two the easiest to understand and the most exact, and liked them the most. Adding uncertainty and expert confidence statements made the risk communication less clear and less easy to understand, and increased concern. Above all, the technology for which risks are communicated, and its acceptance, mattered strongly: respondents in the shale gas condition found the identical risk communication less trustworthy and more concerning than those in the DGE condition, and liked it less overall. For practitioners in DGE or shale gas projects, the study shows that the public would appreciate efforts to describe LPHC risks with numbers and, optionally, risk comparisons. However, there seems to be a trade-off between aiming for transparency by disclosing uncertainty and limited expert confidence, and thereby decreasing clarity and increasing concern in the view of the public.

11.
Security risk management is essential for ensuring effective airport operations. This article introduces AbSRiM, a novel agent‐based modeling and simulation approach to perform security risk management for airport operations that uses formal sociotechnical models that include temporal and spatial aspects. The approach contains four main steps: scope selection, agent‐based model definition, risk assessment, and risk mitigation. The approach is based on traditional security risk management methodologies, but uses agent‐based modeling and Monte Carlo simulation at its core. Agent‐based modeling is used to model threat scenarios, and Monte Carlo simulations are then performed with this model to estimate security risks. The use of the AbSRiM approach is demonstrated with an illustrative case study. This case study includes a threat scenario in which an adversary attacks an airport terminal with an improvised explosive device. The approach provides a promising way to include important elements, such as human aspects and spatiotemporal aspects, in the assessment of risk. More research is still needed to better identify the strengths and weaknesses of the AbSRiM approach in different case studies, but results demonstrate the feasibility of the approach and its potential.
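The Monte Carlo core of the approach can be caricatured in a few lines: simulate the threat scenario many times with stochastic agent interactions and read risk off the outcome distribution. The two-barrier layout and all probabilities below are illustrative assumptions, not the paper's calibrated airport model.

```python
import random

random.seed(11)

def run_scenario():
    # adversary must evade a patrolling agent, then pass screening
    if random.random() < 0.30:        # detected by patrol
        return 0.0
    if random.random() < 0.60:        # stopped at the screening checkpoint
        return 0.0
    return random.uniform(5, 50)      # consequence of a successful attack (M$)

n = 100_000
losses = [run_scenario() for _ in range(n)]
p_success = sum(l > 0 for l in losses) / n
print(f"P(successful attack) = {p_success:.3f}; expected loss = {sum(losses) / n:.2f} M$")
```

In AbSRiM proper, the branch probabilities emerge from the agents' modeled spatiotemporal behavior rather than fixed constants.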

12.
In this paper, we analyze the performance of call centers of financial service providers with two levels of support and a time‐dependent overflow mechanism. Waiting calls from the front‐office queue flow over to the back office if a waiting‐time limit is reached and at least one back‐office agent is available. The analysis of such a system with time‐dependent overflow is reduced to the analysis of a continuous‐time Markov chain with state‐dependent overflow probabilities. To approximate the system with time‐dependent overflow, some waiting‐based performance measures are modified. Numerical results demonstrate the reliability of this Markovian performance approximation for different parameter settings. A sensitivity analysis shows the impact of the waiting‐time limit and the dependence of the performance measures on the arrival rate.
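The Markovian core of such an analysis can be sketched with a deliberately small state space; the capacities, rates, and the full-buffer overflow trigger below are simplifying assumptions standing in for the paper's waiting-time-based mechanism.

```python
import numpy as np
from itertools import product

lam, mu_f, mu_b = 3.0, 2.0, 1.5                       # arrival and service rates
states = list(product(range(3), range(2)))            # (front office, back office)
idx = {s: k for k, s in enumerate(states)}
Q = np.zeros((len(states), len(states)))

for i, j in states:
    if i < 2:
        Q[idx[(i, j)], idx[(i + 1, j)]] += lam        # call joins the front office
    elif j < 1:
        Q[idx[(i, j)], idx[(i, j + 1)]] += lam        # overflow to the back office
    if i > 0:
        Q[idx[(i, j)], idx[(i - 1, j)]] += mu_f       # front-office service completion
    if j > 0:
        Q[idx[(i, j)], idx[(i, j - 1)]] += mu_b       # back-office service completion
Q -= np.diag(Q.sum(axis=1))                           # each row of Q sums to zero

# stationary distribution: solve pi Q = 0 together with sum(pi) = 1
A = np.vstack([Q.T[:-1], np.ones(len(states))])
pi = np.linalg.solve(A, np.r_[np.zeros(len(states) - 1), 1.0])
print(f"fraction of arrivals overflowing: {pi[idx[(2, 0)]]:.3f}")   # PASTA
print(f"fraction of arrivals blocked:     {pi[idx[(2, 1)]]:.3f}")
```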

13.
Yacov Y. Haimes, Risk Analysis, 2012, 32(9): 1451-1467
This article is grounded on the premise that the complex process of risk assessment, management, and communication, when applied to systems of systems, should be guided by universal systems-based principles. It is written from the perspective of systems engineering with the hope and expectation that the principles introduced here will be supplemented and complemented by principles from the perspectives of other disciplines. Indeed, there is no claim that the following 10 guiding principles constitute a complete set; rather, the intent is to initiate a discussion on this important subject that will incrementally lead us to a more complete set of guiding principles. The 10 principles are as follows:
First Principle: Holism is the common denominator that bridges risk analysis and systems engineering.
Second Principle: The process of risk modeling, assessment, management, and communication must be systemic and integrated.
Third Principle: Models and state variables are central to quantitative risk analysis.
Fourth Principle: Multiple models are required to represent the essence of the multiple perspectives of complex systems of systems.
Fifth Principle: Meta-modeling and subsystems integration must be derived from the intrinsic states of the system of systems.
Sixth Principle: Multiple conflicting and competing objectives are inherent in risk management.
Seventh Principle: Risk analysis must account for epistemic and aleatory uncertainties.
Eighth Principle: Risk analysis must account for risks of low probability with extreme consequences.
Ninth Principle: The time frame is central to quantitative risk analysis.
Tenth Principle: Risk analysis must be holistic, adaptive, incremental, and sustainable, and it must be supported with appropriate data collection, metrics with which to measure efficacious progress, and criteria on the basis of which to act.
The relevance and efficacy of each guiding principle is demonstrated by applying it to the U.S. Federal Aviation Administration complex Next Generation (NextGen) system of systems.

14.
Finite-time ruin probability of the Erlang risk model
Jiang Tao, Chinese Journal of Management Science, 2006, 14(1): 112-116
The Erlang risk model is widely used in queueing theory, control theory, and financial risk processes. Under the assumptions that claim arrivals follow an Erlang process, claim sizes follow a Pareto distribution, and the surplus earns interest at a constant force, this paper derives an asymptotic formula for the finite-time ruin probability. This result substantially generalizes those of Klüppelberg and Stadtmüller [1] and Tang [2]: the former considered the infinite-time ruin probability, while the latter restricted the claim-arrival process to a Poisson process. Given the connection between ruin models and queueing models, these results have many applications in management science.
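A Monte Carlo check of the setting in the abstract is straightforward and a useful companion to an asymptotic formula; the parameter values below are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(7)

u, c, delta, T = 50.0, 5.0, 0.03, 10.0   # initial surplus, premium rate, interest force, horizon
k, beta = 2, 1.0                         # Erlang(2) inter-arrival gaps with rate parameter beta
alpha, x_m = 1.5, 1.0                    # Pareto tail index and scale for claim sizes

def ruin_probability(n_paths=20_000):
    ruined = 0
    for _ in range(n_paths):
        t, surplus = 0.0, u
        while True:
            gap = rng.gamma(k, 1.0 / beta)            # Erlang inter-arrival time
            t += gap
            if t > T:
                break
            # between claims, surplus earns interest and premiums accrue continuously
            growth = np.exp(delta * gap)
            surplus = surplus * growth + c * (growth - 1.0) / delta
            surplus -= x_m * (1.0 - rng.random()) ** (-1.0 / alpha)   # Pareto claim
            if surplus < 0:                           # ruin can occur only at claim epochs
                ruined += 1
                break
    return ruined / n_paths

print(f"estimated finite-time ruin probability: {ruin_probability():.4f}")
```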

15.
Risk Analysis, 2018, 38(8): 1618-1633
Climate change and its projected natural hazards have an adverse impact on the functionality and operation of transportation infrastructure systems. This study presents a comprehensive framework to analyze the risk to transportation infrastructure networks that are affected by natural hazards. The proposed risk analysis method considers both the failure probability of infrastructure components and the expected infrastructure network efficiency and capacity loss due to component failure. This comprehensive approach facilitates the identification of high‐risk network links in terms of not only their susceptibility to natural hazards but also their overall impact on the network. The Chinese national rail system and its exposure to rainfall‐related multihazards are used as a case study. The importance of various links is comprehensively assessed from the perspectives of topological, efficiency, and capacity criticality. Risk maps of the national railway system are generated, which can guide decisive action regarding investments in preventative and adaptive measures to reduce risk.
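The link-risk computation described here (failure probability times network consequence) can be sketched with networkx; the toy graph and failure probabilities are assumptions, not the Chinese rail data.

```python
import networkx as nx

G = nx.Graph()
G.add_edges_from([("A", "B"), ("B", "C"), ("C", "D"), ("A", "D"), ("B", "D")])
p_fail = {("A", "B"): 0.02, ("B", "C"): 0.05, ("C", "D"): 0.01,
          ("A", "D"): 0.03, ("B", "D"): 0.04}        # hazard-driven failure probabilities

base_eff = nx.global_efficiency(G)
risk = {}
for edge, p in p_fail.items():
    H = G.copy()
    H.remove_edge(*edge)                             # component failure
    risk[edge] = p * (base_eff - nx.global_efficiency(H))   # probability x consequence

for edge, r in sorted(risk.items(), key=lambda kv: -kv[1]):
    print(edge, round(r, 5))
```

Replacing the efficiency drop with a capacity-loss measure gives the capacity-criticality ranking the abstract mentions.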

16.
Cost-benefit analysis (CBA) is commonly applied as a tool for deciding on risk protection. With CBA, one can identify risk mitigation strategies that lead to an optimal tradeoff between the costs of the mitigation measures and the achieved risk reduction. In practical applications of CBA, the strategies are typically evaluated through efficiency indicators such as the benefit-cost ratio (BCR) and the marginal cost (MC) criterion. In many of these applications, the BCR is not consistently defined, which, as we demonstrate in this article, can lead to the identification of suboptimal solutions. This is of particular relevance when the overall budget for risk reduction measures is limited and an optimal allocation of resources among different subsystems is necessary. We show that this problem can be formulated as a hierarchical decision problem, where the general rules and decisions on the available budget are made at a central level (e.g., a central government agency or top management), whereas the decisions on specific measures are made at the subsystem level (e.g., local communities or company divisions). It is shown that the MC criterion provides optimal solutions in such hierarchical optimization. Since most practical applications include only a discrete set of possible risk protection measures, the MC criterion is extended to this situation. The findings are illustrated through a hypothetical numerical example. This study was prepared as part of our work on the optimal management of natural hazard risks, but its conclusions also apply to other fields of risk management.
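A three-measure toy (numbers assumed for illustration) shows how greedy selection by BCR can be suboptimal under a fixed budget, which is the kind of inconsistency the article warns about:

```python
from itertools import combinations

measures = {"A": (10, 20.0), "B": (6, 12.6), "C": (4, 6.0)}   # cost, risk reduction
budget = 10

# Greedy selection by benefit-cost ratio
chosen, spent, benefit = [], 0, 0.0
for name, (cost, ben) in sorted(measures.items(), key=lambda kv: -kv[1][1] / kv[1][0]):
    if spent + cost <= budget:
        chosen.append(name); spent += cost; benefit += ben
print("greedy by BCR:", chosen, benefit)             # ['B', 'C'] 18.6

# Exhaustive optimum under the same budget
best = max((s for r in range(len(measures) + 1)
            for s in combinations(measures, r)
            if sum(measures[m][0] for m in s) <= budget),
           key=lambda s: sum(measures[m][1] for m in s))
print("optimal:", list(best), sum(measures[m][1] for m in best))   # ['A'] 20.0
```

Measure B has the highest BCR (2.1), yet funding A alone achieves more risk reduction within the budget.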

17.
We develop an inventory placement model in the context of general multi‐echelon supply chains where the delivery lead time promised to the customer must be respected. The delivery lead time is calculated based on the available stocks of the different input and output products in the different facilities and takes into account the purchasing lead times, the manufacturing lead times, and the transportation lead times. We assume finite manufacturing capacities and consider the interactions of manufacturing orders between time periods. Each facility manages the stocks of its input and output products. The size of customer orders and their arrival dates and due dates are assumed to be known as in many B2B situations. We perform extensive computational experiments to derive managerial insights. We also derive analytical insights regarding the manufacturing capacities to be installed and the impacts of the frequency of orders on the system cost.
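One toy reading of the delivery-lead-time logic (my simplification, not the paper's model): which lead-time components an order incurs depends on where stock is available along the chain.

```python
def delivery_lead_time(order_qty, on_hand, purch_lead, mfg_lead, trans_lead):
    # Output stock covers the order: only transportation remains.
    if on_hand["output"] >= order_qty:
        return trans_lead
    # Input stock covers the shortfall: manufacture, then ship.
    if on_hand["input"] >= order_qty - on_hand["output"]:
        return mfg_lead + trans_lead
    # Otherwise inputs must be purchased first.
    return purch_lead + mfg_lead + trans_lead

print(delivery_lead_time(80, {"output": 100, "input": 0}, 10, 5, 2))   # 2
print(delivery_lead_time(80, {"output": 30, "input": 60}, 10, 5, 2))   # 7
print(delivery_lead_time(80, {"output": 0, "input": 10}, 10, 5, 2))    # 17
```

The paper's model additionally respects finite manufacturing capacity and interactions between time periods, which this sketch ignores.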

18.
Dose-response modeling of biological agents has traditionally focused on describing laboratory-derived experimental data. Limited consideration has been given to factors that are controlled in a laboratory but are likely to vary in real-world scenarios. In this study, a probabilistic framework is developed that extends Brookmeyer's competing-risks dose-response model to allow for variation in factors such as dose dispersion, dose deposition, and other within-host parameters. With data sets drawn from dose-response experiments of inhalational anthrax, plague, and tularemia, we illustrate how, in certain cases, models that consider only the experimental data in isolation can overestimate infection numbers.
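The overestimation mechanism can be seen with an exponential dose-response curve: because P(d) = 1 - exp(-r d) is concave in dose, evaluating it at the mean dose (as a fit to tightly controlled experiments effectively does) exceeds its average over a dispersed real-world dose distribution. The parameters below are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

r, mean_dose = 0.005, 400.0
# lognormal real-world doses with the same mean but substantial dispersion
doses = rng.lognormal(np.log(mean_dose) - 0.5 * 1.0**2, 1.0, 200_000)

p_point = 1 - np.exp(-r * mean_dose)       # ignores dose variation
p_avg = (1 - np.exp(-r * doses)).mean()    # accounts for dose variation

print(f"P at the mean dose:   {p_point:.3f}")   # larger (Jensen's inequality)
print(f"mean P over doses:    {p_avg:.3f}")     # smaller
```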

19.
Groundwater leakage into subsurface constructions can cause reduction of pore pressure and subsidence in clay deposits, even at large distances from the location of the construction. The potential cost of damage is substantial, particularly in urban areas. The large‐scale process also implies heterogeneous soil conditions that cannot be described in complete detail, which causes a need for estimating uncertainty of subsidence with probabilistic methods. In this study, the risk for subsidence is estimated by coupling two probabilistic models, a geostatistics‐based soil stratification model with a subsidence model. Statistical analyses of stratification and soil properties are inputs into the models. The results include spatially explicit probabilistic estimates of subsidence magnitude and sensitivities of included model parameters. From these, areas with significant risk for subsidence are distinguished from low‐risk areas. The efficiency and usefulness of this modeling approach as a tool for communication to stakeholders, decision support for prioritization of risk‐reducing measures, and identification of the need for further investigations and monitoring are demonstrated with a case study of a planned tunnel in Stockholm.
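The coupling of the two probabilistic models can be caricatured as sampling soil inputs and pushing them through a subsidence relation; the one-parameter model and all distributions below are gross simplifications assumed for illustration.

```python
import numpy as np

rng = np.random.default_rng(5)

n = 100_000
thickness = rng.lognormal(np.log(8.0), 0.4, n)           # clay thickness [m], stratification model
dp = rng.normal(30.0, 8.0, n).clip(min=0.0)              # pore-pressure drop [kPa], from leakage
compressibility = rng.lognormal(np.log(2e-3), 0.3, n)    # [m per (m of clay * kPa)]

subsidence = compressibility * thickness * dp            # [m]
print(f"P(subsidence > 0.5 m) = {(subsidence > 0.5).mean():.3f}")
```

Run per map cell with spatially correlated inputs, the same computation yields the spatially explicit risk maps the study describes.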

20.
Given the growing concern over environmental issues, whether companies voluntarily incorporate green policies in practice or are forced to do so by new legislation, change is foreseen in the future of transportation management. Assigning and scheduling vehicles to service a pre-determined set of clients is a common distribution problem. Accounting for time-dependent travel times between customers, we present a model that considers travel time, fuel, and CO2 emissions costs. Specifically, we propose a framework for modeling CO2 emissions in a time-dependent vehicle routing context. The model is solved via a tabu search procedure. As the amount of CO2 emissions is correlated with vehicle speed, our model considers limiting vehicle speed as part of the optimization. The emissions per kilometer as a function of speed are minimized at a unique speed. However, we show that in a time-dependent environment this speed is suboptimal in terms of total emissions: vehicles may emit less overall if they drive faster and thereby avoid running into congestion periods in which they incur high emissions. Clearly, considering this trade-off in the vehicle routing problem has great practical potential. Along the same lines, we construct bounds on the total amount of emissions to be saved by making use of the standard VRP solutions. As fuel consumption is correlated with CO2 emissions, we show that reducing emissions leads to reducing costs. For a number of experimental settings, we show that limiting vehicle speeds is desirable from a total cost perspective; this stems mainly from the trade-off between fuel and travel-time costs.
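The speed trade-off in the abstract can be reproduced with a toy convex emissions curve and a single congestion window (both assumed for illustration): the per-kilometer optimum is not the total-emissions optimum when driving slightly faster lets the vehicle clear the link before the jam.

```python
import numpy as np

def co2_per_km(v):
    # convex in speed; per-km minimum at sqrt(1500 / 0.3) ~ 70.7 km/h
    return 1500.0 / v + 0.3 * v

def link_emissions(v_free, dist=60.0, congestion_start=0.8, v_cong=25.0):
    t_free = dist / v_free
    if t_free <= congestion_start:            # clears the link before congestion
        return dist * co2_per_km(v_free)
    d_before = v_free * congestion_start      # km covered before the jam starts
    return d_before * co2_per_km(v_free) + (dist - d_before) * co2_per_km(v_cong)

speeds = np.linspace(40.0, 110.0, 701)
best = min(speeds, key=link_emissions)
print(f"per-km-optimal speed:          {np.sqrt(1500.0 / 0.3):.1f} km/h")
print(f"total-emissions-optimal speed: {best:.1f} km/h")   # faster: beats the jam
```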
