Similar Documents
A total of 20 similar documents were found.
1.
Graphical Communication of Uncertain Quantities to Nontechnical People
Nine pictorial displays for communicating quantitative information about the value of an uncertain quantity, x, were evaluated for their ability to communicate p(x > a) and p(b > x > a) to well-educated semi- and nontechnical subjects. Different displays performed best in different applications. Cumulative distribution functions alone can severely mislead some subjects in estimating the mean. A "rusty" knowledge of statistics did not improve performance, and even people with a good basic knowledge of statistics did not perform as well as one would like. Until further experiments are performed, the authors recommend the use of a cumulative distribution function plotted directly above a probability density function with the same horizontal scale, and with the location of the mean clearly marked on both curves.
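The recommended display is easy to sketch. Below is a minimal Python illustration (not taken from the study) that stacks a cumulative distribution function directly above a probability density function with a shared horizontal scale and marks the mean on both curves; the normal distribution and the thresholds a and b are purely illustrative assumptions.

```python
# Minimal sketch of the recommended display: CDF above PDF, shared x-scale,
# mean marked on both. The normal distribution is an illustrative assumption.
import numpy as np
from scipy import stats
import matplotlib.pyplot as plt

mu, sigma = 10.0, 2.0                      # hypothetical uncertain quantity x
x = np.linspace(mu - 4 * sigma, mu + 4 * sigma, 400)
dist = stats.norm(mu, sigma)

fig, (ax_cdf, ax_pdf) = plt.subplots(2, 1, sharex=True, figsize=(6, 6))
ax_cdf.plot(x, dist.cdf(x))
ax_cdf.axvline(mu, linestyle="--")         # mark the mean on the CDF
ax_cdf.set_ylabel("P(X <= x)")

ax_pdf.plot(x, dist.pdf(x))
ax_pdf.axvline(mu, linestyle="--")         # mark the mean on the PDF
ax_pdf.set_ylabel("density")
ax_pdf.set_xlabel("x")

# The probabilities such displays are meant to convey, for hypothetical a, b:
a, b = 11.0, 14.0
print("p(x > a)     =", 1 - dist.cdf(a))
print("p(b > x > a) =", dist.cdf(b) - dist.cdf(a))
plt.tight_layout()
plt.show()
```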

2.
Tim Bedford, Risk Analysis, 2013, 33(10): 1884-1898
Group risk is usually represented by FN curves showing the frequency of different accident sizes for a given activity. Many governments regulate group risk through FN criterion lines, which define the tolerable location of an FN curve. However, to compare different risk reduction alternatives, one must be able to rank FN curves. The two main problems in doing this are that the FN curve contains multiple frequencies, and that there are usually large epistemic uncertainties about the curve. Since the mid-1970s, a number of authors have used the concept of "disutility" to summarize FN curves, defining a family of disutility functions with a single parameter controlling the degree of "risk aversion." Here, we show that this approach is in fact risk neutral, disaster averse, and insensitive to epistemic uncertainty on accident frequencies. A new approach is outlined that has a number of attractive properties. The formulation allows us to distinguish between risk aversion and disaster aversion, two concepts that have been confused in the literature until now. A two-parameter family of disutilities generalizing the previous approach is defined, where one parameter controls risk aversion and the other disaster aversion. The family is sensitive to epistemic uncertainties. Such disutilities may, for example, be used to compare the impact of system design changes on group risks, or might form the basis for valuing reductions in group risk in a cost-benefit analysis.

3.
This paper presents a new decision-making problem of fair optimization with respect to two equally important conflicting objective functions, cost and customer service level, in the presence of supply chain disruption risks. Given a set of customer orders for products, the decision maker needs to select suppliers of parts required to complete the orders, allocate the demand for parts among the selected suppliers, and schedule the orders over the planning horizon, so as to equitably optimize expected cost and expected customer service level. The supplies of parts are subject to independent random local and regional disruptions. The fair decision-making aims at achieving normalized expected cost and customer service level values that are as close to each other as possible. The resulting combinatorial stochastic optimization problem is formulated as a stochastic mixed integer program with the ordered weighted averaging aggregation of the two conflicting objective functions. Numerical examples and computational results, in particular a comparison with the weighted-sum aggregation of the two objective functions, are presented and some managerial insights are reported. The findings indicate that for the minimum cost objective the cheapest supplier is usually selected, and for the maximum service level objective a subset of the most reliable and most expensive suppliers is usually chosen, whereas the equitably efficient supply portfolio usually combines the most reliable and the cheapest suppliers. While the minimum cost objective leads to the largest expected unfulfilled demand, and the expected production schedule for the maximum service level objective follows the customer demand with the smallest expected unfulfilled demand, the equitably efficient solution ensures a reasonable value of expected unfulfilled demand.
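As a rough illustration of the fairness mechanism described above (and not the paper's mixed integer program), the sketch below applies an ordered weighted averaging aggregation that puts the larger weight on the worse of the two normalized objectives, so a balanced supply portfolio scores higher than either extreme; all portfolio numbers are hypothetical.

```python
# Ordered weighted averaging (OWA) of two normalized objectives, with the
# larger weight on the worse-performing one, so balanced (equitable)
# solutions are favoured. Portfolio data are hypothetical.
def normalize(value, best, worst):
    """Map an objective value onto [0, 1], where 1 is best and 0 is worst."""
    return (worst - value) / (worst - best)

def owa(normalized_objectives, weights=(0.7, 0.3)):
    """OWA: order objectives from worst to best, then apply fixed weights."""
    ordered = sorted(normalized_objectives)            # worst first
    return sum(w * v for w, v in zip(weights, ordered))

# Hypothetical supply portfolios: (expected cost, expected service level)
portfolios = {
    "cheapest supplier":       (100.0, 0.70),
    "most reliable suppliers": (180.0, 0.97),
    "mixed portfolio":         (130.0, 0.90),
}
cost_best, cost_worst = 100.0, 180.0        # cost is minimized
service_best, service_worst = 0.97, 0.70    # service level is maximized

for name, (cost, service) in portfolios.items():
    n_cost = normalize(cost, cost_best, cost_worst)
    n_service = normalize(service, service_best, service_worst)
    print(f"{name:25s} OWA = {owa((n_cost, n_service)):.3f}")
# The mixed portfolio scores highest, mirroring the finding that the
# equitably efficient portfolio combines reliable and cheap suppliers.
```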

4.
We study inference in structural models with a jump in the conditional density, where location and size of the jump are described by regression curves. Two prominent examples are auction models, where the bid density jumps from zero to a positive value at the lowest cost, and equilibrium job-search models, where the wage density jumps from one positive level to another at the reservation wage. General inference in such models remained a long-standing, unresolved problem, primarily due to nonregularities and computational difficulties caused by discontinuous likelihood functions. This paper develops likelihood-based estimation and inference methods for these models, focusing on optimal (Bayes) and maximum likelihood procedures. We derive convergence rates and distribution theory, and develop Bayes and Wald inference. We show that Bayes estimators and confidence intervals are attractive both theoretically and computationally, and that Bayes confidence intervals, based on posterior quantiles, provide a valid large sample inference method.

5.
A method to determine how much reduction in public exposure to power frequency magnetic fields can be obtained for different levels of investment is presented. Which "effects function," if any, best describes the relationship between field exposure and biological effect is uncertain at this time. Also, in a particular context, such as the construction of new transmission lines, a variety of different technologies might be used to reduce exposure. We describe and demonstrate a method by which exposure reduction supply curves (i.e., the cost of purchasing different amounts of exposure reduction given various mitigation options) can be estimated parametrically for different exposure conditions and effects functions, and we display illustrative results.
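A minimal sketch of how such an exposure reduction supply curve might be assembled is given below: candidate mitigation options are ordered by cost per unit of exposure reduced and then accumulated. The option names, costs, and reduction values are hypothetical and are not taken from the paper.

```python
# Build one exposure-reduction "supply curve" by ordering mitigation options
# by cost per unit of exposure reduced and accumulating cost and reduction.
# All option data below are hypothetical.
options = [
    # (name, cost in $, exposure reduction in arbitrary exposure units)
    ("phase optimization",      50_000,  400.0),
    ("increased line height",  300_000,  900.0),
    ("compact line design",    500_000, 1100.0),
    ("undergrounding",       4_000_000, 1600.0),
]

options.sort(key=lambda o: o[1] / o[2])    # cheapest reduction first

cum_cost, cum_reduction = 0.0, 0.0
print(f"{'option':22s}{'cum. cost ($)':>15s}{'cum. reduction':>16s}")
for name, cost, reduction in options:
    cum_cost += cost
    cum_reduction += reduction
    print(f"{name:22s}{cum_cost:15,.0f}{cum_reduction:16.0f}")
# Plotting cum_reduction against cum_cost gives one supply curve; repeating
# this for each assumed effects function and exposure condition gives the
# parametric family of curves described above.
```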

6.
Catastrophic events, such as floods, earthquakes, hurricanes, and tsunamis, are rare, yet the cumulative risk of each event occurring at least once over an extended time period can be substantial. In this work, we assess the perception of cumulative flood risks, how those perceptions affect the choice of insurance, and whether perceptions and choices are influenced by cumulative risk information. We find that participants' cumulative risk judgments are well represented by a bimodal distribution, with a group that severely underestimates the risk and a group that moderately overestimates it. Individuals who underestimate cumulative risks make more risk-seeking choices compared to those who overestimate cumulative risks. Providing explicit cumulative risk information for relevant time periods, as opposed to annual probabilities, is an inexpensive and effective way to improve both the perception of cumulative risk and the choices people make to protect against that risk.
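The cumulative risk information discussed above follows from a one-line calculation; the short example below converts an illustrative 1% annual flood probability into the chance of at least one flood over longer horizons.

```python
# Worked example of the cumulative-risk framing: probability of at least one
# occurrence over a horizon, from an illustrative 1% annual probability.
annual_p = 0.01            # e.g., a "1-in-100-year" flood
for years in (1, 10, 30):
    cumulative_p = 1 - (1 - annual_p) ** years
    print(f"P(at least one flood in {years:2d} years) = {cumulative_p:.1%}")
# 1 year: 1.0%, 10 years: 9.6%, 30 years: 26.0% -- the kind of explicit
# cumulative information the study found improves judgments and choices.
```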

7.
Designers and retailers in the consumer products industry face high demand volatility and potential loss of profit from design piracy. Many retailers rely on third-party supply chain managers (SCMs) to manage global supply chains. An SCM starts raw materials procurement and the production process based on expected demand and takes on the financial risks associated with demand uncertainty. But a retailer often delays sharing product design information with the SCM, forcing it to expedite production and distribution processes and incur additional financial penalties. To analyse the economic impact of delayed information sharing under uncertain demand, we develop a mathematical model. Our model indicates that higher demand volatility lessens the effect of the penalty associated with delayed information sharing for retailers. The model also shows that, for a given demand volatility, the per-unit premium for the retailer increases asymptotically compared to the marginal production cost increase for the SCM. Such findings are not intuitive for SCMs or retailers.

8.
Automobile accident risks vary significantly across populations, places, and times. This study describes the time-varying pattern of societal risk. The relative risks of occupant fatality per person-mile of travel are estimated here for each hour of the week, using 1983 data. The results exhibit a strong time-of-day effect and have a highly skewed frequency distribution, implying wide variations in risk-taking behavior. Indeed, the 168 hourly estimates ranged from a low of 0.32 times the average around Sunday noon to a high of 43 times the average at 3:00 a.m. on Sunday, i.e., by a factor of 134 from bottom to top. Quantile-quantile plots or "Lorenz curves," introduced to display the unequal distribution of risks, show that approximately 34% of the vehicle occupant fatalities occur in hours representing only 5% of the travel. These findings have serious implications for risk analysis. First, when attempting to reconcile objective and subjective risk estimates, risk communicators should carefully control for when and to whom the risk in question is applicable. Second, comparisons of hazards on the basis of average risk are necessarily misleading for risks distributed so unevenly. Third, resource allocation decisions can benefit by knowing how incidence, exposure, and risk vary across time, place, and other relevant variables. Finally, certain cost-benefit analyses that use average values to estimate risk exposure can be misleading.
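The concentration of risk described above can be summarized with a Lorenz-type curve; the sketch below uses synthetic hourly data (not the 1983 data set) to show how the share of fatalities occurring in the riskiest hours compares with those hours' share of travel.

```python
# Lorenz-curve sketch with synthetic data: sort the 168 weekly hours by risk
# (fatalities per mile), then accumulate travel share and fatality share to
# see how concentrated the risk is.
import numpy as np

rng = np.random.default_rng(0)
travel = rng.uniform(0.5, 2.0, size=168)                       # hypothetical travel per hour slot
relative_risk = rng.lognormal(mean=0.0, sigma=1.2, size=168)   # hypothetical risk ratios
fatalities = travel * relative_risk

order = np.argsort(relative_risk)[::-1]                        # riskiest hours first
travel_share = np.cumsum(travel[order]) / travel.sum()
fatality_share = np.cumsum(fatalities[order]) / fatalities.sum()

# Share of fatalities occurring in the riskiest hours carrying ~5% of travel
idx = np.searchsorted(travel_share, 0.05)
print(f"{fatality_share[idx]:.0%} of fatalities in hours with ~5% of travel")
```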

9.
This paper focuses on evaluating the impact of supply disruption risks on the choice between the widely used single and dual sourcing methods in a two-stage supply chain with non-stationary and price-sensitive demand. The expected profit functions of the two sourcing modes in the presence of supply chain disruption risks are first derived and then compared, so that the critical values of the key factors affecting the final choice are identified. Finally, the sensitivity of the buyer's expected profit to various input factors is examined through numerical examples, which provide guidelines for when to use each sourcing method.

10.
This paper presents a bi-objective stochastic mixed integer programming approach for the joint selection of suppliers and scheduling of production and distribution in a multi-echelon supply chain subject to local and regional disruption risks. The two conflicting problem objectives are minimization of cost and maximization of service level. Three shipping methods are considered for the distribution of products: batch shipping with a single shipment of different customer orders, batch shipping with multiple shipments of different customer orders, and individual shipping of each customer order immediately after its completion. The stochastic combinatorial optimization problem is formulated as a time-indexed mixed integer program with the weighted-sum aggregation of the two objective functions. The supply portfolio is determined by binary selection and fractional allocation variables, while time-indexed assignment variables determine the production and distribution schedules. The problem formulation incorporates supply-production, production-distribution and supply-distribution coordinating constraints to efficiently coordinate supply, production and distribution schedules. Numerical examples modelled after an electronics supply chain and computational results are presented and some managerial insights are reported. The findings indicate that for all shipping methods the service-oriented supply portfolio is more diversified than the cost-oriented portfolio, and that the more cost-oriented the decision-making, the more delayed the expected supply, production and distribution schedules.

11.
The elements of societal risk from a nuclear power plant accident are clearly illustrated by the Fukushima accident: land contamination, long-term relocation of large numbers of people, loss of productive farm area, loss of industrial production, and significant loss of electric capacity. NUREG-1150 and other studies have provided compelling evidence that the individual health risk of nuclear power plant accidents is effectively negligible relative to other comparable risks, even for people living in close proximity to a plant. The objective of this study is to compare the societal risk of nuclear power plant accidents to that of other events to which the public is exposed. We have characterized the monetized societal risk in the United States from major societally disruptive events, such as hurricanes, in the form of a complementary cumulative distribution function. These risks are compared with nuclear power plant risks, based on NUREG-1150 analyses and new MACCS code calculations to account for differences in source terms determined in the more recent SOARCA study. A candidate quantitative societal objective is discussed for potential adoption by the NRC. The results are also interpreted with regard to the acceptability of nuclear power as a major source of future energy supply.
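A minimal sketch of the risk representation used in such a comparison is shown below: an empirical complementary cumulative distribution function (exceedance-frequency curve) built from a catalogue of events, each with an annual rate and a monetized consequence. The event data are purely illustrative and are not from the study.

```python
# Empirical complementary CDF (exceedance-frequency curve) from a catalogue
# of events, each with an annual rate and a monetized loss. Data are
# illustrative only.
def exceedance_curve(events):
    """events: list of (annual_rate, loss). Returns (loss, rate of losses >= loss)."""
    events = sorted(events, key=lambda e: e[1], reverse=True)  # largest loss first
    curve, cumulative_rate = [], 0.0
    for rate, loss in events:
        cumulative_rate += rate
        curve.append((loss, cumulative_rate))
    return list(reversed(curve))

hypothetical_events = [
    (1e-1, 1e8),    # frequent, moderate-consequence event
    (1e-2, 1e9),
    (1e-4, 5e10),   # rare, severe event
]
for loss, rate in exceedance_curve(hypothetical_events):
    print(f"frequency of losses >= ${loss:.1e}: {rate:.1e} per year")
```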

12.
We perform a statistical study of risk in nuclear energy systems. This study provides and analyzes a data set that is twice the size of the previous best data set on nuclear incidents and accidents, comparing three measures of severity: the industry standard International Nuclear Event Scale, the Nuclear Accident Magnitude Scale of radiation release, and cost in U.S. dollars. The rate of nuclear accidents with cost above 20 million 2013 USD, per reactor per year, has decreased from the 1970s until the present time. Along the way, the rate dropped significantly after Chernobyl (April 1986) and is expected to be roughly stable around a level of 0.003, suggesting an average of just over one event per year across the current global fleet. The distribution of costs appears to have changed following the Three Mile Island major accident (March 1979). The median cost became approximately 3.5 times smaller, but an extremely heavy tail emerged, being well described by a Pareto distribution with parameter α = 0.5–0.6. For instance, the cost of the two largest events, Chernobyl and Fukushima (March 2011), is equal to nearly five times the sum of the 173 other events. We also document a significant runaway disaster regime in both radiation release and cost data, which we associate with the "dragon-king" phenomenon. Since the major accident at Fukushima (March 2011) occurred recently, we are unable to quantify an impact of the industry response to this disaster. Excluding such improvements, in terms of costs, our range of models suggests that there is presently a 50% chance that (i) a Fukushima event (or larger) occurs every 60–150 years, and (ii) a Three Mile Island event (or larger) occurs every 10–20 years. Further, even assuming that it is no longer possible to suffer an event more costly than Chernobyl or Fukushima, the expected annual cost and its standard error bracket the cost of a new plant. This highlights the importance of improvements not only immediately following Fukushima, but also deeper improvements to effectively exclude the possibility of "dragon-king" disasters. Finally, we find that the International Nuclear Event Scale (INES) is inconsistent in terms of both cost and radiation released. To be consistent with the cost data, the Chernobyl and Fukushima disasters would need to have an INES level of between 10 and 11, rather than the maximum of 7.
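To make the heavy-tail arithmetic concrete, the sketch below computes an approximate return period for an event at least as costly as a reference accident, assuming a Pareto tail P(cost > c) = (c_min/c)^α and a fleet-wide rate of about one costly event per year. The α values follow the range quoted above, but the reference cost and the event rate are rough illustrative assumptions, so the output will not reproduce the paper's 60–150 year estimate exactly.

```python
# Return-period arithmetic under a Pareto tail: P(cost > c) = (c_min / c)**alpha.
# All numerical inputs are rough illustrative values, not the paper's fits.
def return_period(cost, c_min, alpha, events_per_year):
    """Expected years between events with cost >= `cost`."""
    p_exceed = (c_min / cost) ** alpha           # Pareto survival function
    return 1.0 / (events_per_year * p_exceed)

events_per_year = 1.0        # roughly one >20 million USD event per year (see above)
c_min = 20e6                 # 20 million USD tail threshold
reference_cost = 170e9       # hypothetical Fukushima-scale cost, not the paper's figure
for alpha in (0.5, 0.6):
    years = return_period(reference_cost, c_min, alpha, events_per_year)
    print(f"alpha = {alpha}: a cost >= ${reference_cost / 1e9:.0f}B "
          f"expected roughly every {years:.0f} years")
```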

13.
This article presents an analysis of postattack response strategies to mitigate the risks of reoccupying contaminated areas following a release of Bacillus anthracis spores (the bacterium responsible for causing anthrax) in an urban setting. The analysis is based on a hypothetical attack scenario in which individuals are exposed to B. anthracis spores during an initial aerosol release and then placed on prophylactic antibiotics that successfully protect them against the initial aerosol exposure. The risk from reoccupying buildings contaminated with spores due to their reaerosolization and inhalation is then evaluated. The response options considered include: decontamination of the buildings, vaccination of individuals reoccupying the buildings, extended evacuation of individuals from the contaminated buildings, and combinations of these options. The study uses a decision tree to estimate the costs and benefits of alternative response strategies across a range of exposure risks. Results for best estimates of model inputs suggest that the most cost-effective response for high-risk scenarios (individual chance of infection exceeding 11%) consists of evacuation and building decontamination. For infection risks between 4% and 11%, the preferred option is to evacuate for a short period, vaccinate, and then reoccupy once the vaccine has taken effect. For risks between 0.003% and 4%, the preferred option is to vaccinate only. For risks below 0.003%, none of the mitigation actions have positive expected monetary benefits. A sensitivity analysis indicates that for high-infection-likelihood scenarios, vaccination is recommended in the case where decontamination efficacy is less than 99.99%.
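The decision-tree logic can be illustrated with a stripped-down expected-cost comparison: for each reoccupation risk level, choose the option minimizing mitigation cost plus expected monetized infection cost. The option set is simplified and every number below is hypothetical, so the crossover thresholds will not match the study's 0.003%, 4%, and 11% values.

```python
# Simplified decision-tree comparison: for each infection risk, pick the
# response option with the lowest expected monetized cost. All numbers are
# hypothetical and serve only to illustrate the mechanism.
COST_PER_INFECTION = 5e6        # hypothetical monetized cost of one infection
POPULATION = 10_000

options = {
    "no action":                {"cost": 0.0,  "risk_reduction": 0.00},
    "vaccinate only":           {"cost": 5e6,  "risk_reduction": 0.90},
    "evacuate + decontaminate": {"cost": 2e8,  "risk_reduction": 0.9999},
}

for p_infection in (0.20, 0.02, 0.001, 0.00001):
    expected_costs = {
        name: o["cost"] + COST_PER_INFECTION * POPULATION
                          * p_infection * (1 - o["risk_reduction"])
        for name, o in options.items()
    }
    best = min(expected_costs, key=expected_costs.get)
    print(f"infection risk {p_infection:.3%}: best option = {best}")
```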

14.
This article presents a framework for using probabilistic terrorism risk modeling in regulatory analysis. We demonstrate the framework with an example application involving a regulation under consideration, the Western Hemisphere Travel Initiative for the Land Environment (WHTI-L). First, we estimate annualized loss from terrorist attacks with the Risk Management Solutions (RMS) Probabilistic Terrorism Model. We then estimate the critical risk reduction, which is the risk-reducing effectiveness of WHTI-L needed for its benefit, in terms of reduced terrorism loss in the United States, to exceed its cost. Our analysis indicates that the critical risk reduction depends strongly not only on uncertainties in the terrorism risk level, but also on uncertainty in the cost of regulation and how casualties are monetized. For a terrorism risk level based on the RMS standard risk estimate, the baseline regulatory cost estimate for WHTI-L, and a range of casualty cost estimates based on the willingness-to-pay approach, our estimate for the expected annualized loss from terrorism ranges from $2.7 billion to $5.2 billion. For this range in annualized loss, the critical risk reduction for WHTI-L ranges from 7% to 13%. Basing results on a lower risk level that results in halving the annualized terrorism loss would double the critical risk reduction (14–26%), and basing the results on a higher risk level that results in a doubling of the annualized terrorism loss would cut the critical risk reduction in half (3.5–6.6%). Ideally, decisions about terrorism security regulations and policies would be informed by true benefit-cost analyses in which the estimated benefits are compared to costs. Such analyses for terrorism security efforts face substantial impediments stemming from the great uncertainty in the terrorist threat and the very low recurrence interval for large attacks. Several approaches can be used to estimate how a terrorism security program or regulation reduces the distribution of risks it is intended to manage. But continued research to develop additional tools and data is necessary to support application of these approaches. These include refinement of models and simulations, engagement of subject matter experts, implementation of program evaluation, and estimating the costs of casualties from terrorism events.
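The critical risk reduction described above reduces to a simple ratio: the regulation's benefit exceeds its cost once (risk reduction) × (annualized terrorism loss) is at least the annualized regulatory cost. The worked example below uses the quoted $2.7–5.2 billion loss range together with a hypothetical $350 million annualized cost chosen only to reproduce the quoted 7–13% range; it is not the study's cost estimate.

```python
# Worked example of the critical-risk-reduction ratio: cost / annualized loss.
# The $350M annualized cost is a hypothetical value, not the study's estimate.
annual_cost = 0.35e9                     # hypothetical annualized cost of WHTI-L
for annual_loss in (2.7e9, 5.2e9):       # annualized terrorism loss range quoted above
    critical_reduction = annual_cost / annual_loss
    print(f"loss ${annual_loss / 1e9:.1f}B -> critical risk reduction "
          f"{critical_reduction:.0%}")
# Halving the loss doubles the critical reduction; doubling the loss halves it,
# matching the sensitivity described above.
```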

15.
Risk analysis often depends on complex, computer-based models to describe links between policies (e.g., required emission-control equipment) and consequences (e.g., probabilities of adverse health effects). Appropriate specification of many model aspects is uncertain, including details of the model structure; transport, reaction-rate, and other parameters; and application-specific inputs such as pollutant-release rates. Because these uncertainties preclude calculation of the precise consequences of a policy, it is important to characterize the plausible range of effects. In principle, a probability distribution function for the effects can be constructed using Monte Carlo analysis, but the combinatorics of multiple uncertainties and the often high cost of model runs quickly exhaust available resources. This paper presents and applies a method to choose sets of input conditions (scenarios) that efficiently represent knowledge about the joint probability distribution of inputs. A simple score function approximately relating inputs to a policy-relevant output—in this case, globally averaged stratospheric ozone depletion—is developed. The probability density function for the score-function value is analytically derived from a subjective joint probability density for the inputs. Scenarios are defined by selected quantiles of the score function. Using this method, scenarios can be systematically selected in terms of the approximate probability distribution function for the output of concern, and probability intervals for the joint effect of the inputs can be readily constructed.
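A minimal sketch of the scenario-selection idea, under assumptions not taken from the paper, is shown below: a simple score function of two uncertain inputs is sampled, and the input vectors whose scores fall at chosen quantiles are taken as representative scenarios for full model runs.

```python
# Scenario selection via score-function quantiles. The inputs, their
# distributions, and the linear surrogate weights are hypothetical.
import numpy as np

rng = np.random.default_rng(1)
n = 100_000
# Hypothetical uncertain inputs (e.g., a release rate and a reaction-rate parameter)
release = rng.lognormal(mean=0.0, sigma=0.3, size=n)
reaction = rng.normal(loc=1.0, scale=0.2, size=n)

score = 0.7 * release + 0.3 * reaction     # crude surrogate for the model output

quantiles = [0.05, 0.50, 0.95]
targets = np.quantile(score, quantiles)
for q, t in zip(quantiles, targets):
    i = int(np.argmin(np.abs(score - t)))  # sampled input vector closest to the quantile
    print(f"q={q:.2f}: scenario release={release[i]:.2f}, "
          f"reaction={reaction[i]:.2f}, score={score[i]:.2f}")
# Running the full model only on these few scenarios approximates the output
# distribution at a fraction of the full Monte Carlo cost.
```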

16.
In many problems of risk analysis, failure is equivalent to the event of a random risk factor exceeding a given threshold. Failure probabilities can be controlled if a decisionmaker is able to set the threshold at an appropriate level. This abstract situation applies, for example, to environmental risks with infrastructure controls; to supply chain risks with inventory controls; and to insurance solvency risks with capital controls. However, uncertainty around the distribution of the risk factor implies that parameter error will be present and the measures taken to control failure probabilities may not be effective. We show that parameter uncertainty increases the probability (understood as expected frequency) of failures. For a large class of loss distributions, arising from increasing transformations of location-scale families (including the log-normal, Weibull, and Pareto distributions), the article shows that failure probabilities can be exactly calculated, as they are independent of the true (but unknown) parameters. Hence it is possible to obtain an explicit measure of the effect of parameter uncertainty on failure probability. Failure probability can be controlled in two different ways: (1) by reducing the nominal required failure probability, depending on the size of the available data set, and (2) by modifying the distribution itself that is used to calculate the risk control. Approach (1) corresponds to a frequentist/regulatory view of probability, while approach (2) is consistent with a Bayesian/personalistic view. We furthermore show that the two approaches are consistent in achieving the required failure probability. Finally, we briefly discuss the effects of data pooling and its systemic risk implications.
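The effect of parameter uncertainty can be reproduced with a small Monte Carlo experiment; the sketch below (illustrative sample size and nominal level, with lognormal losses as an example of an increasing transformation of a location-scale family) estimates the parameters from a small sample, sets the control threshold at the fitted quantile, and shows that the achieved failure frequency exceeds the nominal level regardless of the true parameters.

```python
# Monte Carlo check: plug-in thresholds under parameter uncertainty yield an
# achieved failure frequency above the nominal level, and that frequency does
# not depend on the true (unknown) parameters. Sample size and level are
# illustrative choices.
import numpy as np

rng = np.random.default_rng(2)
true_mu, true_sigma = 0.0, 1.0       # unknown to the decisionmaker
nominal_failure_p = 0.01
sample_size, trials = 20, 50_000

z_nominal = 2.3263478740408408       # standard normal 99% quantile

failures = 0
for _ in range(trials):
    sample = rng.normal(true_mu, true_sigma, sample_size)    # log of observed losses
    mu_hat, sigma_hat = sample.mean(), sample.std(ddof=1)
    threshold = mu_hat + z_nominal * sigma_hat                # fitted 99% quantile (log scale)
    next_loss = rng.normal(true_mu, true_sigma)               # new risk-factor realization
    failures += next_loss > threshold

print(f"nominal failure probability: {nominal_failure_p:.3f}")
print(f"achieved failure frequency:  {failures / trials:.3f}  (exceeds nominal)")
```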

17.
In this paper we propose a framework for conducting a decision analysis for a societal problem such as earthquake safety. The application deals with the formulation and evaluation of alternative policies for the seismic safety problem faced by the city of Los Angeles with regard to its old masonry buildings. A social decision analysis compares the costs and benefits of the alternative policies from the viewpoints of the impacted constituents. The emphasis is on identifying an acceptable policy that considers the interests of the impacted constituents and provides incentives for their cooperation. Alternatives ranging from strict regulation to free market are examined. In order to evaluate the trade-offs between additional cost and savings in lives, a direct willingness-to-pay approach and an economic approach based on property value differentials are used. Recommendations range from strict regulation for residential and critical buildings (schools, hospitals, fire stations, etc.) to simply informing the occupants (in the case of commercial and industrial buildings) of the risks involved.

18.
Decision biases can distort cost-benefit evaluations of uncertain risks, leading to risk management policy decisions with predictably high retrospective regret. We argue that well-documented decision biases encourage learning aversion, or predictably suboptimal learning and premature decision making in the face of high uncertainty about the costs, risks, and benefits of proposed changes. Biases such as narrow framing, overconfidence, confirmation bias, optimism bias, ambiguity aversion, and hyperbolic discounting of the immediate costs and delayed benefits of learning contribute to deficient individual and group learning, avoidance of information seeking, underestimation of the value of further information, and hence needlessly inaccurate risk-cost-benefit estimates and suboptimal risk management decisions. In practice, such biases can create predictable regret in the selection of potential risk-reducing regulations. Low-regret learning strategies based on computational reinforcement learning models can potentially overcome some of these suboptimal decision processes by replacing aversion to uncertain probabilities with actions calculated to balance exploration (deliberate experimentation and uncertainty reduction) and exploitation (taking actions to maximize the sum of expected immediate reward, expected discounted future reward, and value of information). We discuss the proposed framework for understanding and overcoming learning aversion and for implementing low-regret learning strategies, using regulation of air pollutants with uncertain health effects as an example.

19.
Existing research on portfolio optimization generally assumes that investors are mutually independent and that the return series of the underlying assets are uncorrelated across periods. In actual investment practice, however, investors influence one another and asset return series exhibit serial dependence. Building on multi-period portfolio optimization and Nash equilibrium theory, this paper uses relative performance to characterize the game among investors and constructs a multi-period portfolio game model in which each investor maximizes the expected utility of relative terminal wealth. For serially dependent asset returns, analytical expressions for the Nash equilibrium investment strategies and the corresponding value functions are derived, together with the relationship between the Nash equilibrium strategies and the traditional strategies. Using indicators such as the cumulative empirical distribution function and the Sharpe ratio, the Nash equilibrium strategies are compared with the traditional strategies through simulation, and the behaviour of the Nash equilibrium strategies as the investors' sensitivity coefficients vary is analyzed. The results show that, compared with the traditional strategy, investors following the Nash equilibrium strategy are more willing to take on high risk in pursuit of high returns when the relative performance of competitors is taken into account, and that the larger an investor's sensitivity coefficient, the stronger that investor's preference for risk.

20.
Current stochastic dominance algorithms use step functions to approximate the cumulative distributions of the alternatives, even when the underlying random variables are known to be continuous. Since stochastic dominance tests require repeated integration of the cumulative distribution functions, a compounding of errors may result from this type of approximation. This article introduces a new stochastic dominance algorithm that approximates the cumulative distribution function by piecewise linear approximations. Comparisons between the new and old algorithms are performed for normally distributed alternatives. In about 95 percent of all cases, the two algorithms produce the same result.
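A minimal sketch of the piecewise-linear idea is given below: each cumulative distribution function is stored as knot points, linearly interpolated onto a merged grid, and then tested for first- and second-order stochastic dominance, with the trapezoidal rule integrating the piecewise-linear CDFs exactly. The knot values are hypothetical and are not the article's test problems.

```python
# Piecewise-linear CDF representation and first/second-order stochastic
# dominance checks. Knot data below are hypothetical.
import numpy as np

def fsd(x_a, F_a, x_b, F_b):
    """True if alternative A first-order dominates B (F_A <= F_B everywhere)."""
    grid = np.union1d(x_a, x_b)
    Fa, Fb = np.interp(grid, x_a, F_a), np.interp(grid, x_b, F_b)
    return np.all(Fa <= Fb + 1e-12) and np.any(Fa < Fb - 1e-12)

def ssd(x_a, F_a, x_b, F_b):
    """True if A second-order dominates B (cumulative integral of F_A <= that of F_B)."""
    grid = np.union1d(x_a, x_b)
    Fa, Fb = np.interp(grid, x_a, F_a), np.interp(grid, x_b, F_b)
    # The trapezoidal rule integrates piecewise-linear CDFs exactly, avoiding
    # the compounding error of step-function approximations; the dominance
    # condition is checked at the knots (a refined grid adds extra safety).
    int_a = np.concatenate(([0.0], np.cumsum(np.diff(grid) * (Fa[1:] + Fa[:-1]) / 2)))
    int_b = np.concatenate(([0.0], np.cumsum(np.diff(grid) * (Fb[1:] + Fb[:-1]) / 2)))
    return np.all(int_a <= int_b + 1e-12) and np.any(int_a < int_b - 1e-12)

# Hypothetical alternatives: B is A shifted left (worse), so A dominates B.
x_a, F_a = np.array([0.0, 5.0, 10.0]), np.array([0.0, 0.5, 1.0])
x_b, F_b = np.array([-1.0, 4.0, 9.0]), np.array([0.0, 0.5, 1.0])
print("A FSD B:", fsd(x_a, F_a, x_b, F_b))
print("A SSD B:", ssd(x_a, F_a, x_b, F_b))
```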
