Similar Literature
20 similar records found.
1.
The Petroleum Safety Authority Norway (PSA‐N) has recently adopted a new definition of risk: “the consequences of an activity with the associated uncertainty.” The PSA‐N has also been using “deficient risk assessment” for some time as a basis for assigning nonconformities in audit reports. This creates an opportunity to study the link between risk perspective and risk assessment quality in a regulatory context, and, in the present article, we take a hard look at the term “deficient risk assessment” both normatively and empirically. First, we perform a conceptual analysis of how a risk assessment can be deficient in light of a particular risk perspective consistent with the new PSA‐N risk definition. Then, we examine the usages of the term “deficient” in relation to risk assessments in PSA‐N audit reports and classify these into a set of categories obtained from the conceptual analysis. At an overall level, we were able to identify on what aspects of the risk assessment the PSA‐N is focusing and where deficiencies are being identified in regulatory practice. A key observation is that there is a diversity in how the agency officials approach the risk assessments in audits. Hence, we argue that improving the conceptual clarity of what the authorities characterize as “deficient” in relation to the uncertainty‐based risk perspective may contribute to the development of supervisory practices and, eventually, potentially strengthen the learning outcome of the audit reports.

2.
3.
Risk Analysis, 2018, 38(5): 1009–1035
The predominant definition of extinction risk in conservation biology involves evaluating the cumulative distribution function (CDF) of extinction time at a particular point (the “time horizon”). Using the principles of decision theory, this article develops an alternative definition of extinction risk as the expected loss (EL) to society resulting from eventual extinction of a species. Distinct roles are identified for time preference and risk aversion. Ranges of tentative values for the parameters of the two approaches are proposed, and the performances of the two approaches are compared and contrasted for a small set of real‐world species with published extinction time distributions and a large set of hypothetical extinction time distributions. Potential issues with each approach are evaluated, and the EL approach is recommended as the better of the two. The CDF approach suffers from the fact that extinctions that occur at any time before the specified time horizon are weighted equally, while extinctions that occur beyond the specified time horizon receive no weight at all. It also suffers from the fact that the time horizon does not correspond to any natural phenomenon, and so is impossible to specify nonarbitrarily; yet the results can depend critically on the specified value. In contrast, the EL approach has the advantage of weighting extinction time continuously, with no artificial time horizon, and the parameters of the approach (the rates of time preference and risk aversion) do correspond to natural phenomena, and so can be specified nonarbitrarily.
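As a rough illustration of the contrast described above, the sketch below compares the two measures on a simulated extinction-time distribution: the CDF evaluated at an arbitrary time horizon versus a discounted expected loss. The lognormal extinction-time distribution, the unit loss at extinction, and the discount rate are assumptions chosen for illustration only, not values from the paper (risk aversion is omitted for brevity).

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical extinction-time distribution for one species (years from now).
# The lognormal form and its parameters are illustrative assumptions.
extinction_times = rng.lognormal(mean=np.log(200), sigma=1.0, size=100_000)

# CDF approach: probability of extinction before an arbitrary time horizon.
time_horizon = 100.0  # the "time horizon" the paper criticizes as arbitrary
cdf_risk = np.mean(extinction_times <= time_horizon)

# EL approach (sketch): expected present value of a unit loss incurred at the
# random extinction time, discounted at a rate of pure time preference.
discount_rate = 0.01  # assumed annual rate; the paper proposes tentative ranges
expected_loss = np.mean(np.exp(-discount_rate * extinction_times))

print(f"CDF risk at t = {time_horizon:.0f} y : {cdf_risk:.3f}")
print(f"Discounted expected loss      : {expected_loss:.3f}")
```

Moving the horizon from 100 to 200 years changes the CDF measure sharply, while the discounted EL measure weights all extinction times continuously, which is the point made above.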

4.
Adam M. Finkel, Risk Analysis, 2014, 34(10): 1785–1794
If exposed to an identical concentration of a carcinogen, every human being would face a different level of risk, determined by his or her genetic, environmental, medical, and other uniquely individual characteristics. Various lines of evidence indicate that this susceptibility variable is distributed rather broadly in the human population, with perhaps a factor of 25‐ to 50‐fold between the center of this distribution and either of its tails, but cancer risk assessment at the EPA and elsewhere has always treated every (adult) human as identically susceptible. The National Academy of Sciences “Silver Book” concluded that EPA and the other agencies should fundamentally correct their mis‐computation of carcinogenic risk in two ways: (1) adjust individual risk estimates upward to provide information about the upper tail; and (2) adjust population risk estimates upward (by about sevenfold) to correct an underestimation due to a mathematical property of the interindividual distribution of human susceptibility, in which the susceptibility averaged over the entire (right‐skewed) population exceeds the median value for the typical human. In this issue of Risk Analysis, Kenneth Bogen disputes the second adjustment and endorses the first, though he also relegates the problem of underestimated individual risks to the realm of “equity concerns” that he says should have little if any bearing on risk management policy. In this article, I show why the basis for the population risk adjustment that the NAS recommended is correct—that current population cancer risk estimates, whether they are derived from animal bioassays or from human epidemiologic studies, likely provide estimates of the median with respect to human variation, which in turn must be an underestimate of the mean. If cancer risk estimates have larger “conservative” biases embedded in them, a premise I have disputed in many previous writings, such a defect would not excuse ignoring this additional bias in the direction of underestimation. I also demonstrate that sensible, legally appropriate, and ethical risk policy must not only inform the public when the tail of the individual risk distribution extends into the “high‐risk” range, but must alter benefit‐cost balancing to account for the need to try to reduce these tail risks preferentially.
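A minimal sketch of the mean-versus-median point: when interindividual susceptibility is right-skewed, the population-average risk exceeds the risk of the median individual. The lognormal form and the definition of the "tail" below are assumptions chosen only to reproduce the orders of magnitude quoted above (a 25- to 50-fold spread and a roughly sevenfold mean/median ratio); they are not the NAS calculation itself.

```python
import numpy as np
from scipy import stats

# Why the population-average (mean) susceptibility exceeds the median for a
# right-skewed interindividual distribution. The lognormal form is an
# illustrative assumption, not necessarily the distribution used by the NAS.
tail_factor = 50.0     # assumed ratio between the median and the upper tail
tail_quantile = 0.975  # assumed definition of "tail"

z = stats.norm.ppf(tail_quantile)
sigma = np.log(tail_factor) / z          # lognormal shape implied by the tail factor
mean_over_median = np.exp(sigma**2 / 2)  # lognormal mean/median ratio

print(f"sigma (log scale): {sigma:.2f}")
print(f"mean / median of susceptibility: {mean_over_median:.1f}")  # ~7 under these assumptions
```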

5.
Public perception research in different countries has suggested that real and perceived periods of high temperature strengthen people's climate change beliefs. Such findings raise questions about the climate change beliefs of people in regions with moderate climates. Relatively little is known about whether public concerns about climate change may also be associated with perceived changes in other weather‐related events, such as precipitation or flooding. We examine the relationship between perceived changes in weather‐related events and climate change beliefs among U.K. residents at a time of below‐average winter temperatures and recent flooding. National survey data (n = 1,848) revealed that heat waves and hot summers were perceived to have become less common during respondents’ lifetimes, while flooding, periods of heavy rainfall, coastal erosion, and mild winters were perceived to have increased in frequency and cold winters were perceived to be unchanged. Although perceived changes in hot‐weather‐related events were positively associated with climate change beliefs, perceived changes in wet‐weather‐related events were found to be an even stronger predictor. Self‐reported experience of “flooding in own area” and “heat‐wave discomfort” also significantly contributed to climate change beliefs. These findings highlight the importance of salient weather‐related events and experiences in the formation of beliefs about climate change. We link our findings to research in judgment and decision making, and propose that those wishing to engage with the public on the issue of climate change should not limit their focus to heat.

6.
The printing press was a game‐changing information technology. Risk assessment could be also. At present, risk assessments are commonly used as one‐time decision aids: they provide justification for a particular decision, and afterwards usually sit on a shelf. However, when viewed as information technologies, their potential uses are much broader. Risk assessments: (1) are repositories of structured information and a medium for communication; (2) embody evaluative structures for setting priorities; (3) can preserve information over time and permit asynchronous communication, thus encouraging learning and adaptation; and (4) explicitly address uncertain futures. Moreover, because of their “what‐if” capabilities, risk assessments can serve as a platform for constructive discussion among parties that hold different values. The evolution of risk assessment in the nuclear industry shows how such attributes have been used to lower core‐melt risks substantially through improved templates for maintenance and more effective coordination with regulators (although risk assessment has been less commonly used in improving emergency‐response capabilities). The end result of this evolution in the nuclear industry has been the development of “living” risk assessments that are updated more or less in real time to answer even routine operational questions. Similar but untapped opportunities abound for the use of living risk assessments to reduce risks in small operational decisions as well as large policy decisions in other areas of hazard management. They can also help improve understanding of and communication about risks, and future risk assessment and management. Realization of these opportunities will require significant changes in incentives and active promotion by the risk analytic community.

7.
Managers at all stages of a supply chain are concerned about meeting profit targets. We study contract design for a buyer–supplier supply chain, with each party maximizing expected profit subject to a chance constraint on meeting his respective profit target. We derive the optimal contract form (across all contract types) with randomized and deterministic payments. The best contract has the property that, if the chance constraints are binding, at most one party fails to satisfy his profit target for any given demand realization. This implies that “least risk sharing,” that is, minimizing the probability of outcomes for which both parties fail to achieve their profit targets, is optimal, contrary to the usual expectations of “risk sharing.” We show that an optimal contract can possess only two of the following three properties simultaneously: (i) supply chain coordination, (ii) truth‐telling, and (iii) non‐randomized payments. We discuss methods to mitigate the consequent implementation challenges. We also derive the optimal contract form when chance constraints are incorporated into several simpler and easier‐to‐implement contracts. From a numerical study, we find that an incremental returns contract (in which the marginal rebate rate depends on the return quantity) performs quite well across a relatively broad range of conditions.
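The sketch below only illustrates what a chance constraint on a profit target means in this setting: each party requires that the probability of reaching its target be at least 1 − ε. The wholesale-price contract, demand distribution, order quantity, and targets are illustrative assumptions, not the optimal contract derived in the paper.

```python
import numpy as np

# Monte Carlo check of profit-target chance constraints for a buyer-supplier
# pair under a simple wholesale-price contract with newsvendor-style demand.
# All parameters below are assumed for illustration.
rng = np.random.default_rng(3)
demand = rng.gamma(shape=4, scale=25, size=100_000)   # assumed demand scenarios

r, w, c, q = 10.0, 6.0, 3.0, 110.0    # retail price, wholesale price, unit cost, order qty
sales = np.minimum(demand, q)
buyer_profit = r * sales - w * q
supplier_profit = (w - c) * q          # deterministic under this simple contract

buyer_target, supplier_target, eps = 150.0, 300.0, 0.05
p_buyer = np.mean(buyer_profit >= buyer_target)
buyer_ok = p_buyer >= 1 - eps
supplier_ok = supplier_profit >= supplier_target
print(f"P(buyer profit >= target) = {p_buyer:.3f}, buyer constraint satisfied: {buyer_ok}; "
      f"supplier constraint satisfied: {supplier_ok}")
```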

8.
Humans are continuously exposed to chemicals that are suspected or proven endocrine disrupting chemicals (EDCs). Risk management of EDCs presents a major unmet challenge because the available data for adverse health effects are generated by examining one compound at a time, whereas real‐life exposures are to mixtures of chemicals. In this work, we integrate epidemiological and experimental evidence toward a whole mixture strategy for risk assessment. To illustrate, we conduct the following four steps in a case study: (1) identification of single EDCs (“bad actors”)—measured in prenatal blood/urine in the SELMA study—that are associated with a shorter anogenital distance (AGD) in baby boys; (2) definition and construction of a “typical” mixture consisting of the “bad actors” identified in Step 1; (3) experimental testing of this mixture in an in vivo animal model to estimate a dose–response relationship and determine a point of departure (i.e., reference dose [RfD]) associated with an adverse health outcome; and (4) use of a statistical measure of “sufficient similarity” to compare the experimental RfD (from Step 3) to the exposure measured in the human population and generate a “similar mixture risk indicator” (SMRI). The objective of this exercise is to generate a proof of concept for the systematic integration of epidemiological and experimental evidence with mixture risk assessment strategies. Using a whole mixture approach, we find a higher rate of pregnant women at risk (13%) than with more traditional additivity models (3%) or a compound‐by‐compound strategy (1.6%).

9.
I recently discussed pitfalls in attempted causal inference based on reduced‐form regression models. I used as motivation a real‐world example from a paper by Dr. Sneeringer, which interpreted a reduced‐form regression analysis as implying the startling causal conclusion that “doubling of [livestock] production leads to a 7.4% increase in infant mortality.” This conclusion is based on: (A) fitting a reduced‐form regression model to aggregate (e.g., county‐level) data; and (B) (mis)interpreting a regression coefficient in this model as a causal coefficient, without performing any formal statistical tests for potential causation (such as conditional independence, Granger‐Sims, or path analysis tests). Dr. Sneeringer now adds comments that confirm and augment these deficiencies, while advocating methodological errors that, I believe, risk analysts should avoid if they want to reach logically sound, empirically valid, conclusions about cause and effect. She explains that, in addition to (A) and (B) above, she also performed other steps such as (C) manually selecting specific models and variables and (D) assuming (again, without testing) that hand‐picked surrogate variables are valid (e.g., that log‐transformed income is an adequate surrogate for poverty). In her view, these added steps imply that “critiques of A and B are not applicable” to her analysis and that therefore “a causal argument can be made” for “such a strong, robust correlation” as she believes her regression coefficient indicates. However, multiple wrongs do not create a right. Steps (C) and (D) exacerbate the problem of unjustified causal interpretation of regression coefficients, without rendering irrelevant the fact that (A) and (B) do not provide evidence of causality. This reply focuses on whether any statistical techniques can produce the silk purse of a valid causal inference from the sow's ear of a reduced‐form regression analysis of ecological data. We conclude that Dr. Sneeringer's analysis provides no valid indication that air pollution from livestock operations causes any increase in infant mortality rates. More generally, reduced‐form regression modeling of aggregate population data—no matter how it is augmented by fitting multiple models and hand‐selecting variables and transformations—is not adequate for valid causal inference about health effects caused by specific, but unmeasured, exposures.
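The following sketch, built entirely on invented data, illustrates the core objection: an unmeasured county-level confounder can produce a "strong, robust" reduced-form coefficient with no causal effect at all, and a simple adjustment (one elementary form of conditional-independence check) makes the association vanish. Variable names and coefficients are hypothetical.

```python
import numpy as np
from scipy import stats

# Simulated ecological data: a confounder (e.g., rurality/poverty) drives both
# exposure and outcome; exposure has no causal effect on outcome by construction.
rng = np.random.default_rng(4)
n = 3000
confounder = rng.normal(size=n)                    # unmeasured county characteristic
exposure = 0.8 * confounder + rng.normal(size=n)   # e.g., livestock production
outcome = 0.5 * confounder + rng.normal(size=n)    # e.g., infant mortality (no causal link)

# Reduced-form regression of outcome on exposure alone: a significant "effect".
slope, _, _, p, _ = stats.linregress(exposure, outcome)
print(f"Unadjusted slope = {slope:.2f}, p = {p:.1e}")

# Conditioning on the confounder removes the association, as formal causality
# tests (conditional independence, path analysis) would require checking.
X = np.column_stack([np.ones(n), exposure, confounder])
beta, *_ = np.linalg.lstsq(X, outcome, rcond=None)
print(f"Exposure coefficient after adjustment = {beta[1]:.3f}")
```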

10.
Self‐driving vehicles (SDVs) promise to considerably reduce traffic crashes. One pressing concern facing the public, automakers, and governments is “How safe is safe enough for SDVs?” To answer this question, a new expressed‐preference approach was proposed for the first time to determine the socially acceptable risk of SDVs. In our between‐subject survey (N = 499), we determined the respondents’ risk‐acceptance rate of scenarios with varying traffic‐risk frequencies to examine the logarithmic relationships between the traffic‐risk frequency and risk‐acceptance rate. Logarithmic regression models of SDVs were compared to those of human‐driven vehicles (HDVs); the results showed that SDVs were required to be safer than HDVs. Given the same traffic‐risk‐acceptance rates for SDVs and HDVs, the associated acceptable risk frequencies were predicted and compared. Two risk‐acceptance criteria emerged: the tolerable risk criterion, which indicates that SDVs should be four to five times as safe as HDVs, and the broadly acceptable risk criterion, which suggests that half of the respondents hoped that the traffic risk of SDVs would be two orders of magnitude lower than the current estimated traffic risk. The approach and these results could provide insights for government regulatory authorities for establishing clear safety requirements for SDVs.
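A minimal sketch of the expressed-preference logic, using made-up survey numbers rather than the study's data: fit acceptance rate against the logarithm of traffic-risk frequency for each vehicle type, then compare the risk frequencies that yield the same acceptance rate.

```python
import numpy as np

# Fit acceptance_rate = a + b * log10(risk frequency) separately for SDVs and HDVs,
# then compare the risk frequencies acceptable at the same acceptance rate.
# All numbers below are invented illustrations, not the survey data.
freq = np.array([1e-2, 1e-3, 1e-4, 1e-5, 1e-6])   # assumed traffic-risk frequencies
accept_hdv = np.array([0.05, 0.20, 0.45, 0.70, 0.90])
accept_sdv = np.array([0.02, 0.10, 0.30, 0.60, 0.85])

b_hdv, a_hdv = np.polyfit(np.log10(freq), accept_hdv, 1)
b_sdv, a_sdv = np.polyfit(np.log10(freq), accept_sdv, 1)

target = 0.5  # same risk-acceptance rate for both vehicle types
freq_hdv = 10 ** ((target - a_hdv) / b_hdv)
freq_sdv = 10 ** ((target - a_sdv) / b_sdv)
print(f"Acceptable risk, HDV: {freq_hdv:.2e}, SDV: {freq_sdv:.2e}, "
      f"ratio HDV/SDV: {freq_hdv / freq_sdv:.1f}")
```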

11.
We consider the problem of managing demand risk in tactical supply chain planning for a particular global consumer electronics company. The company follows a deterministic replenishment‐and‐planning process despite considerable demand uncertainty. As a possible way to formally address uncertainty, we provide two risk measures, “demand‐at‐risk” (DaR) and “inventory‐at‐risk” (IaR), and two linear programming models to help manage demand uncertainty. The first model is deterministic and can be used to allocate the replenishment schedule from the plants among the customers as per the existing process. The other model is stochastic and can be used to determine the “ideal” replenishment request from the plants under demand uncertainty. The gap between the outputs of the two models regarding requested replenishment, together with the values of the risk measures, can be used by the company to reallocate capacity among different products and thus manage demand/inventory risk.
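Since the abstract does not give the formal definitions of DaR and IaR, the sketch below treats them, by analogy with value-at-risk, as high quantiles of unmet demand and of leftover inventory across demand scenarios; the demand distribution and replenishment quantity are assumptions, not the company's data or the paper's linear programs.

```python
import numpy as np

# Quantile-style risk measures in the spirit of "demand-at-risk" (DaR) and
# "inventory-at-risk" (IaR); the formulas are illustrative assumptions.
rng = np.random.default_rng(1)
demand = rng.normal(loc=1000, scale=200, size=50_000)   # assumed demand scenarios
replenishment = 1100                                    # planned replenishment quantity

shortfall = np.maximum(demand - replenishment, 0)   # unmet demand per scenario
leftover = np.maximum(replenishment - demand, 0)    # excess inventory per scenario

alpha = 0.95
dar = np.quantile(shortfall, alpha)   # demand at risk at the 95% level
iar = np.quantile(leftover, alpha)    # inventory at risk at the 95% level
print(f"DaR(95%) = {dar:.0f} units, IaR(95%) = {iar:.0f} units")
```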

12.
Pricing below cost is often classified as “dumping” in international trade and as “predatory pricing” in local markets. It is legally prohibited because of earlier findings that it leads to predatory behavior by either eliminating competition or stealing market share. This study shows that a stochastic exchange rate can create incentives for a profit‐minded monopoly firm to set price below marginal cost. Our result departs from earlier findings because the optimal pricing decision is based on rational behavior that exhibits no malicious intent against the competition and therefore cannot be considered a violation of anti‐trust laws. The finding is robust: our analysis demonstrates that this behavior occurs under various settings, such as when the firm (i) is risk‐averse, (ii) can postpone prices until after exchange rates are realized, (iii) is capable of manufacturing in multiple countries, and (iv) operates under demand uncertainty in addition to the random exchange rate.

13.
In spite of increased attention to quality and efforts to provide safe medical care, adverse events (AEs) are still frequent in clinical practice. Reports from various sources indicate that a substantial number of hospitalized patients suffer treatment‐caused injuries while in the hospital. While risk cannot be entirely eliminated from health‐care activities, an important goal is to develop effective and durable mitigation strategies to render the system “safer.” In order to do this, though, we must develop models that comprehensively and realistically characterize the risk. In the health‐care domain, this can be extremely challenging due to the wide variability in the way that health‐care processes and interventions are executed and also due to the dynamic nature of risk in this particular domain. In this study, we have developed a generic methodology for evaluating dynamic changes in AE risk in acute care hospitals as a function of organizational and nonorganizational factors, using a combination of modeling formalisms. First, a system dynamics (SD) framework is used to demonstrate how organizational‐level and policy‐level contributions to risk evolve over time, and how policies and decisions may affect the general system‐level contribution to AE risk. It also captures the feedback of organizational factors and decisions over time and the nonlinearities in these feedback effects. SD is a popular approach to understanding the behavior of complex social and economic systems. It is a simulation‐based, differential equation modeling tool that is widely used in situations where the formal model is complex and an analytical solution is very difficult to obtain. Second, a Bayesian belief network (BBN) framework is used to represent patient‐level factors and also physician‐level decisions and factors in the management of an individual patient, which contribute to the risk of hospital‐acquired AE. BBNs are networks of probabilities that can capture probabilistic relations between variables and contain historical information about their relationship, and are powerful tools for modeling causes and effects in many domains. The model is intended to support hospital decisions with regard to staffing, length of stay, and investments in safety, which evolve dynamically over time. The methodology has been applied in modeling the two types of common AEs: pressure ulcers and vascular‐catheter‐associated infection, and the models have been validated with eight years of clinical data and use of expert opinion.
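A highly simplified sketch of the hybrid idea, under assumed equations and probabilities: an organizational stock evolves over time via Euler integration (the SD layer), and its discretized state enters a small conditional probability table for a patient-level adverse event (the BBN layer). None of the numbers or states come from the validated models described above.

```python
import numpy as np

# --- SD layer: Euler integration of a workload stock under a staffing policy ---
dt, horizon = 0.1, 52.0             # time step and horizon, in weeks (assumed)
workload = 1.0                      # normalized workload stock
target, adjust_rate = 1.0, 0.2      # staffing policy parameters (assumed)
demand_growth = 0.01                # rising patient demand (assumed)
history = []
for _ in np.arange(0, horizon, dt):
    inflow = demand_growth * workload               # demand pushes workload up
    correction = adjust_rate * (target - workload)  # management hires/reallocates staff
    workload += dt * (inflow + correction)
    history.append(workload)

# --- BBN layer: P(AE | workload level, patient frailty), by enumeration ---
p_frail = 0.3                              # prior on the patient-level factor (assumed)
workload_high = bool(history[-1] > 1.05)   # discretize the SD output
cpt = {  # assumed conditional probability table: P(AE | workload_high, frail)
    (False, False): 0.01, (False, True): 0.03,
    (True, False): 0.02,  (True, True): 0.08,
}
p_ae = (1 - p_frail) * cpt[(workload_high, False)] + p_frail * cpt[(workload_high, True)]
print(f"End-of-year workload: {history[-1]:.2f}, P(AE per stay): {p_ae:.3f}")
```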

14.
This paper studies information aggregation in dynamic markets with a finite number of partially informed strategic traders. It shows that, for a broad class of securities, information in such markets always gets aggregated. Trading takes place in a bounded time interval, and in every equilibrium, as time approaches the end of the interval, the market price of a “separable” security converges in probability to its expected value conditional on the traders' pooled information. If the security is “non‐separable,” then there exists a common prior over the states of the world and an equilibrium such that information does not get aggregated. The class of separable securities includes, among others, Arrow–Debreu securities, whose value is 1 in one state of the world and 0 in all others, and “additive” securities, whose value can be interpreted as the sum of traders' signals.

15.
We perform a statistical study of risk in nuclear energy systems. This study provides and analyzes a data set that is twice the size of the previous best data set on nuclear incidents and accidents, comparing three measures of severity: the industry standard International Nuclear Event Scale, the Nuclear Accident Magnitude Scale of radiation release, and cost in U.S. dollars. The rate of nuclear accidents with cost above 20 million 2013 USD, per reactor per year, has decreased from the 1970s until the present time. Along the way, the rate dropped significantly after Chernobyl (April 1986) and is expected to be roughly stable around a level of 0.003, suggesting an average of just over one event per year across the current global fleet. The distribution of costs appears to have changed following the Three Mile Island major accident (March 1979). The median cost became approximately 3.5 times smaller, but an extremely heavy tail emerged, being well described by a Pareto distribution with parameter α = 0.5–0.6. For instance, the cost of the two largest events, Chernobyl and Fukushima (March 2011), is equal to nearly five times the sum of the 173 other events. We also document a significant runaway disaster regime in both radiation release and cost data, which we associate with the “dragon‐king” phenomenon. Since the major accident at Fukushima (March 2011) occurred recently, we are unable to quantify an impact of the industry response to this disaster. Excluding such improvements, in terms of costs, our range of models suggests that there is presently a 50% chance that (i) a Fukushima event (or larger) occurs every 60–150 years, and (ii) that a Three Mile Island event (or larger) occurs every 10–20 years. Further—even assuming that it is no longer possible to suffer an event more costly than Chernobyl or Fukushima—the expected annual cost and its standard error bracket the cost of a new plant. This highlights the importance of improvements not only immediately following Fukushima, but also deeper improvements to effectively exclude the possibility of “dragon‐king” disasters. Finally, we find that the International Nuclear Event Scale (INES) is inconsistent in terms of both cost and radiation released. To be consistent with cost data, the Chernobyl and Fukushima disasters would need an INES level of between 10 and 11, rather than the maximum of 7.
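Combining two figures quoted above (the post-Chernobyl event rate of about 0.003 per reactor-year above the cost threshold and a Pareto cost tail with α ≈ 0.5) gives a back-of-the-envelope return period for a Fukushima-scale loss; the fleet size and the Fukushima cost figure in the sketch are placeholder assumptions, not values taken from the paper.

```python
import numpy as np

# Return period of an event exceeding a given cost, combining the event rate
# with a Pareto tail for cost above the 20-million-USD threshold.
rate_per_reactor_year = 0.003      # events above 20 million 2013 USD (from the abstract)
reactors = 400                     # assumed size of the current global fleet
fleet_rate = rate_per_reactor_year * reactors   # ~1.2 events per year

alpha = 0.5                        # Pareto tail exponent (abstract: 0.5-0.6)
threshold = 20e6                   # cost threshold in 2013 USD (from the abstract)
fukushima_cost = 150e9             # assumed cost of a Fukushima-scale event (placeholder)

# Pareto survival function: P(cost > x | event) = (threshold / x) ** alpha
p_exceed = (threshold / fukushima_cost) ** alpha
return_period = 1.0 / (fleet_rate * p_exceed)
print(f"P(cost > Fukushima-scale | event) = {p_exceed:.4f}")
print(f"Approximate return period: {return_period:.0f} years")
```

With these placeholder numbers the return period lands near the lower end of the 60–150 year range quoted above.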

16.
We show that a simple “reputation‐style” test can always identify which of two experts is informed about the true distribution. The test presumes no prior knowledge of the true distribution, achieves any desired degree of precision in some fixed finite time, and does not use “counterfactual” predictions. Our analysis capitalizes on a result of Fudenberg and Levine (1992) on the rate of convergence of supermartingales. We use our setup to shed some light on the apparent paradox that a strategically motivated expert can ignorantly pass any test. We point out that this paradox arises because in the single‐expert setting, any mixed strategy for Nature over distributions is reducible to a pure strategy. This eliminates any meaningful sense in which Nature can randomize. Comparative testing reverses the impossibility result because the presence of an expert who knows the realized distribution eliminates the reducibility of Nature's compound lotteries.

17.
We study mechanism design in dynamic quasilinear environments where private information arrives over time and decisions are made over multiple periods. We make three contributions. First, we provide a necessary condition for incentive compatibility that takes the form of an envelope formula for the derivative of an agent's equilibrium expected payoff with respect to his current type. It combines the familiar marginal effect of types on payoffs with novel marginal effects of the current type on future ones that are captured by “impulse response functions.” The formula yields an expression for dynamic virtual surplus that is instrumental to the design of optimal mechanisms and to the study of distortions under such mechanisms. Second, we characterize the transfers that satisfy the envelope formula and establish a sense in which they are pinned down by the allocation rule (“revenue equivalence”). Third, we characterize perfect Bayesian equilibrium‐implementable allocation rules in Markov environments, which yields tractable sufficient conditions that facilitate novel applications. We illustrate the results by applying them to the design of optimal mechanisms for the sale of experience goods (“bandit auctions”).

18.
In this article, we examine how firms embedded in supply networks engage in decision making over time. The supply network, as a complex adaptive system, is simulated using cellular automata (CA) through a dynamic evolution of cooperation (i.e., the “voice” decision) and defection (i.e., the “exit” decision) among supply network agents (i.e., firms). Simple local rules of interaction among firms generate complex patterns of cooperation and defection decisions in the supply network. The incentive schemes underlying decision making are derived through different configurations of the payoff matrix, following game‐theoretic arguments. The prisoner's dilemma game captures the localized decision‐making process by rational agents, and the CA model allows the self‐organizing outcome to emerge. By observing the evolution of decision making by cooperating and defecting agents, we offer testable propositions regarding relationship development and the distributed nature of governance mechanisms for managing supply networks.
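The sketch below is a standard spatial prisoner's dilemma on a CA grid, in the spirit of the simulation described above: each firm cooperates ("voice") or defects ("exit"), accumulates payoffs against its neighbors, and imitates its best-performing neighbor. The payoff values, neighborhood, and update rule are common illustrative choices, not the paper's exact configuration.

```python
import numpy as np

rng = np.random.default_rng(2)
n = 20
grid = (rng.random((n, n)) < 0.7).astype(int)   # 1 = cooperate ("voice"), 0 = defect ("exit")

# Prisoner's dilemma payoffs with T > R > P > S (assumed values).
T, R, P, S = 1.4, 1.0, 0.1, 0.0
SHIFTS = [(-1, 0), (1, 0), (0, -1), (0, 1)]     # von Neumann neighborhood, wrapping edges

def step(grid):
    # Accumulate each firm's payoff from pairwise games with its four neighbors.
    payoff = np.zeros_like(grid, dtype=float)
    for dx, dy in SHIFTS:
        nb = np.roll(np.roll(grid, dx, axis=0), dy, axis=1)
        payoff += np.where(grid == 1, np.where(nb == 1, R, S), np.where(nb == 1, T, P))
    # Each firm imitates the strategy of its highest-payoff neighbor (including itself).
    best_strat, best_pay = grid.copy(), payoff.copy()
    for dx, dy in SHIFTS:
        nb_pay = np.roll(np.roll(payoff, dx, axis=0), dy, axis=1)
        nb_strat = np.roll(np.roll(grid, dx, axis=0), dy, axis=1)
        better = nb_pay > best_pay
        best_strat = np.where(better, nb_strat, best_strat)
        best_pay = np.where(better, nb_pay, best_pay)
    return best_strat

for _ in range(50):
    grid = step(grid)
print(f"Cooperation rate after 50 steps: {grid.mean():.2f}")
```

Varying the temptation payoff T changes the long-run mix of cooperating and defecting firms, which is the kind of emergent pattern the propositions above are built on.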

19.
Two images, “black swans” and “perfect storms,” have struck the public's imagination and are used—at times indiscriminately—to describe the unthinkable or the extremely unlikely. These metaphors have been used as excuses to wait for an accident to happen before taking risk management measures, both in industry and government. These two images represent two distinct types of uncertainties (epistemic and aleatory). Existing statistics are often insufficient to support risk management because the sample may be too small and the system may have changed. Rationality as defined by the von Neumann axioms leads to a combination of both types of uncertainties into a single probability measure—Bayesian probability—and accounts only for risk aversion. Yet, the decisionmaker may also want to be ambiguity averse. This article presents an engineering risk analysis perspective on the problem, using all available information in support of proactive risk management decisions and considering both types of uncertainty. These measures involve monitoring of signals, precursors, and near‐misses, as well as reinforcement of the system and a thoughtful response strategy. It also involves careful examination of organizational factors such as the incentive system, which shape human performance and affect the risk of errors. In all cases, including rare events, risk quantification does not allow “prediction” of accidents and catastrophes. Instead, it is meant to support effective risk management rather than simply reacting to the latest events and headlines.

20.
This article considers a class of fresh‐product supply chains in which products need to be transported by the upstream producer from a production base to a distant retail market. Due to high perishability, a portion of the products being shipped may decay during transportation and therefore become unsaleable. We consider a supply chain consisting of a single producer and a single distributor, and investigate two commonly adopted business models: (i) In the “pull” model, the distributor places an order, then the producer determines the shipping quantity, taking into account potential product decay during transportation, and transports the products to the destination market of the distributor; (ii) In the “push” model, the producer ships a batch of products to a distant wholesale market, and then the distributor purchases and resells to end customers. By considering a price‐sensitive end‐customer demand, we investigate the optimal decisions for supply chain members, including order quantity, shipping quantity, and retail price. Our research shows that both the producer and distributor (and thus the supply chain) will perform better if the pull model is adopted. To improve supply chain performance, we propose a fixed inventory‐plus factor (FIPF) strategy, in which the producer announces a pre‐determined inventory‐plus factor and the distributor compensates the producer for any surplus inventory that would otherwise be wasted. We show that this strategy is a Pareto improvement over the pull and push models for both parties. Finally, numerical experiments are conducted, which reveal some interesting managerial insights on the comparison between different business models.
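As an illustration of the producer's decision in the pull model, the sketch below picks a shipping quantity for a given order when a random fraction of the shipment decays in transit; the Beta decay distribution, prices, and costs are assumptions for illustration, not the paper's model or parameters.

```python
import numpy as np

# Producer's shipping decision in the "pull" model: the distributor has ordered
# q units, a random fraction of the shipment decays in transit, and the producer
# picks the shipping quantity Q trading off shipping cost against lost margin
# on undelivered units. All parameters are assumed.
rng = np.random.default_rng(5)
decay = rng.beta(2, 18, size=100_000)   # random decayed fraction, mean ~0.10 (assumed)

q = 1000.0        # distributor's order
w = 5.0           # wholesale price received per delivered unit
c = 3.0           # production plus shipping cost per unit shipped

def expected_profit(Q):
    delivered = np.minimum((1 - decay) * Q, q)   # cannot sell more than the order
    return np.mean(w * delivered - c * Q)

grid = np.arange(q, 1.4 * q, 5.0)
best_Q = grid[np.argmax([expected_profit(Q) for Q in grid])]
print(f"Shipping quantity maximizing expected profit: {best_Q:.0f} units "
      f"(vs. order of {q:.0f})")
```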
