Similar Documents
20 similar documents found (search time: 31 ms)
1.
Combining Probability Distributions From Experts in Risk Analysis   (Cited by 33; self-citations: 0; others: 33)
This paper concerns the combination of experts' probability distributions in risk analysis, discussing a variety of combination methods and attempting to highlight the important conceptual and practical issues to be considered in designing a combination process in practice. The role of experts is important because their judgments can provide valuable information, particularly in view of the limited availability of hard data regarding many important uncertainties in risk analysis. Because uncertainties are represented in terms of probability distributions in probabilistic risk analysis (PRA), we consider expert information in terms of probability distributions. The motivation for the use of multiple experts is simply the desire to obtain as much information as possible. Combining experts' probability distributions summarizes the accumulated information for risk analysts and decision-makers. Procedures for combining probability distributions are often compartmentalized as mathematical aggregation methods or behavioral approaches, and we discuss both categories. However, an overall aggregation process could involve both mathematical and behavioral aspects, and no single process is best in all circumstances. An understanding of the pros and cons of different methods and the key issues to consider is valuable in the design of a combination process for a specific PRA. The output, a combined probability distribution, can ideally be viewed as representing a summary of the current state of expert opinion regarding the uncertainty of interest.
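As a concrete illustration of the mathematical aggregation methods this abstract refers to, here is a minimal sketch of the two most common pooling rules, the linear and logarithmic opinion pools; the expert densities and weights are invented for illustration, not taken from the paper.

```python
import numpy as np

# Two experts express their uncertainty as normal densities (illustrative
# values), and the analyst assigns weights to each expert.
x = np.linspace(-5.0, 12.0, 2001)
dx = x[1] - x[0]

def normal_pdf(x, mu, sigma):
    return np.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * np.sqrt(2 * np.pi))

experts = [normal_pdf(x, 1.0, 1.0), normal_pdf(x, 3.0, 2.0)]
w = [0.6, 0.4]  # analyst-assigned weights (assumption)

# Linear opinion pool: weighted arithmetic mixture of the expert densities.
linear_pool = w[0] * experts[0] + w[1] * experts[1]

# Logarithmic opinion pool: weighted geometric mean, renormalized.
log_pool = experts[0] ** w[0] * experts[1] ** w[1]
log_pool /= (log_pool * dx).sum()

print("linear pool mean: %.2f" % (x * linear_pool * dx).sum())
print("log pool mean:    %.2f" % (x * log_pool * dx).sum())
```

The linear pool preserves multimodality when experts disagree, while the logarithmic pool concentrates mass where the experts overlap; this difference is one of the practical trade-offs the paper discusses.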

2.
The risk of catastrophic failures, for example in the aviation and aerospace industries, can be approached from different angles (e.g., statistics when they exist, or a detailed probabilistic analysis of the system). Each new accident carries information that has already been included in the experience base or constitutes new evidence that can be used to update a previous assessment of the risk. In this paper, we take a different approach and consider the risk and the updating from the investor's point of view. Based on the market response to past airplane accidents, we examine which ones have created a surprise response and which ones are considered part of the risk of the airline business as previously assessed. To do so, we quantify the magnitude and the timing of the observed market response to catastrophic accidents, and we compare it to an estimate of the response that would be expected based on the true actual cost of the accident including direct and indirect costs (full-cost information response). First, we develop a method based on stock market data to measure the actual market response to an accident and we construct an estimate of the full-cost information response to such an event. We then compare the two figures for the immediate and the long-term response of the market for the affected firm, as well as for the whole industry group to which the firm belongs. As an illustration, we analyze a sample of ten fatal accidents experienced by major US domestic airlines during the last seven years. In four cases, we observed an abnormal market response. In these instances, it seems that the shareholders may have updated their estimates of the probability of a future accident in the affected airlines or more generally of the firm's future business prospects. This market reaction is not always easy to explain much less to anticipate, a fact which management should bear in mind when planning a firm's response to such an event.
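A compact sketch of the standard market-model event study that this kind of measurement builds on: fit the stock's relationship to the market in a pre-event window, then cumulate abnormal returns around the event. The return series, factor loading, and window choices below are invented for illustration, not the paper's data.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic daily returns: a market index and one airline stock (illustrative).
market = rng.normal(0.0004, 0.01, 250)
stock = 0.0002 + 1.2 * market + rng.normal(0, 0.01, 250)
stock[200] -= 0.05  # a sharp drop inserted at the "accident day" (day 200)

est, event = slice(0, 180), slice(195, 211)  # estimation and event windows

# Fit the market model r_stock = alpha + beta * r_market on the estimation window.
beta, alpha = np.polyfit(market[est], stock[est], 1)

# Abnormal return = actual minus model-predicted; cumulate over the event window.
ar = stock[event] - (alpha + beta * market[event])
print("cumulative abnormal return over event window: %.3f" % ar.sum())
```

Comparing such a cumulative abnormal return against a full-cost estimate of the accident is, in spirit, how a "surprise" response can be distinguished from a priced-in one.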

3.
Challenges to the Acceptance of Probabilistic Risk Analysis   (Cited by 3; self-citations: 0; others: 3)
Bier, Vicki M. Risk Analysis, 1999, 19(4): 703-710.
This paper discusses a number of the key challenges to the acceptance and application of probabilistic risk analysis (PRA). Those challenges include: (a) the extensive reliance on subjective judgment in PRA, requiring the development of guidance for the use of PRA in risk-informed regulation, and possibly the development of robust or reference prior distributions to minimize the reliance on judgment; and (b) the treatment of human performance in PRA, including not only human error per se but also management and organizational factors more broadly. Both of these areas are seen as presenting interesting research challenges at the interface between engineering and other disciplines.

4.
Context in the Risk Assessment of Digital Systems   (Cited by 1; self-citations: 0; others: 1)
As the use of digital computers for instrumentation and control of safety-critical systems has increased, there has been a growing debate over the issue of whether probabilistic risk assessment techniques can be applied to these systems. This debate has centered on the issue of whether software failures can be modeled probabilistically. This paper describes a context-based approach to software risk assessment that explicitly recognizes the fact that the behavior of software is not probabilistic. The perceived uncertainty in its behavior arises from both the input to the software and the application and environment in which the software operates. Failures occur as the result of encountering some context for which the software was not properly designed, as opposed to the software simply failing randomly. The paper elaborates on the concept of error-forcing context as it applies to software. It also illustrates a methodology that utilizes event trees, fault trees, and the Dynamic Flowgraph Methodology (DFM) to identify error-forcing contexts for software in the form of fault tree prime implicants.
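For readers unfamiliar with prime implicants, the following toy sketch (not the paper's DFM tooling) enumerates the minimal combinations of basic conditions that force a coherent fault-tree top event, i.e., the error-forcing contexts of a made-up three-event model.

```python
from itertools import combinations

# Tiny illustration: enumerate minimal event combinations that force a
# coherent fault-tree top event. The structure function is invented.
EVENTS = ("mode_A", "sensor_stuck", "input_spike")

def top_event(active):
    m, s, i = (e in active for e in EVENTS)
    return (m and s) or (m and i)  # assumed structure function

# Every event combination that triggers the top event is a cut set;
# keep only the minimal ones (prime implicants for a coherent tree).
cut_sets = [set(c) for r in range(1, len(EVENTS) + 1)
            for c in combinations(EVENTS, r) if top_event(set(c))]
minimal = [c for c in cut_sets if not any(o < c for o in cut_sets)]

# The two minimal cut sets: {mode_A, sensor_stuck} and {mode_A, input_spike}.
print(minimal)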

5.
Public policies to mitigate the impacts of extreme events such as hurricanes or terrorist attacks will differ depending on whether they focus on reducing risk or reducing vulnerability. Here we present and defend six assertions aimed at exploring the benefits of vulnerability-based policies. (1) Risk-based approaches to covering the costs of extreme events do not depend for their success on reduction of vulnerability. (2) Risk-based approaches to preparing for extreme events are focused on acquiring accurate probabilistic information about the events themselves. (3) Understanding and reducing vulnerability does not demand accurate predictions of the incidence of extreme events. (4) Extreme events are created by context. (5) It is politically difficult to justify vulnerability reduction on economic grounds. (6) Vulnerability reduction is a human rights issue; risk reduction is not.

6.
The Constrained Extremal Distribution Selection Method   (Cited by 5; self-citations: 0; others: 5)
Engineering design and policy formulation often involve the assessment of the likelihood of future events, commonly expressed through a probability distribution. Determination of these distributions is based, when possible, on observational data. Unfortunately, these data are often incomplete, biased, and/or incorrect. These problems are exacerbated when policy formulation involves the risk of extreme events—situations of low likelihood and high consequences. Usually, observational data simply do not exist for such events. Therefore, determination of probabilities that characterize extreme events must utilize all available knowledge, be it subjective or observational, so as to most accurately reflect the likelihood of such events. Extending previous work on the statistics of extremes, the Constrained Extremal Distribution Selection Method is a methodology that assists in the selection of probability distributions characterizing the risk of extreme events, using expert opinion to constrain the feasible values for the parameters that explicitly define a distribution. An extremal distribution is then "fit" to observational data, subject to the condition that the selected parameters do not violate any constraints. Using genetic algorithms, a random search technique, parameters that minimize a measure of fit between a hypothesized distribution and observational data are estimated. The Constrained Extremal Distribution Selection Method is applied to a real-world policy problem faced by the U.S. Environmental Protection Agency. Selected distributions characterize the likelihood of extreme, fatal hazardous material accidents in the United States. These distributions are used to characterize the risk of large-scale accidents with numerous fatalities.
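A toy version of the constrained-fitting idea, with expert constraints encoded as parameter bounds on a Gumbel (Type I extremal) distribution; SciPy's differential evolution stands in here for the paper's genetic algorithm, and all numbers are illustrative.

```python
import numpy as np
from scipy.optimize import differential_evolution
from scipy.stats import gumbel_r

# Sparse "observations" of an extreme quantity (synthetic data).
data = gumbel_r.rvs(loc=10.0, scale=2.0, size=40, random_state=42)

# Expert-derived constraints on location and scale, encoded as search
# bounds (assumption standing in for the paper's constraint elicitation).
bounds = [(5.0, 15.0), (0.5, 5.0)]

def neg_log_lik(theta):
    loc, scale = theta
    return -gumbel_r.logpdf(data, loc=loc, scale=scale).sum()

# Evolutionary random search minimizes misfit subject to the constraints.
result = differential_evolution(neg_log_lik, bounds, seed=1)
print("fitted loc=%.2f, scale=%.2f" % tuple(result.x))
```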

7.
Hattis, Dale; Banati, Prerna; Goble, Rob; Burmaster, David E. Risk Analysis, 1999, 19(4): 711-726.
This paper reviews existing data on the variability in parameters relevant for health risk analyses. We cover both exposure-related parameters and parameters related to individual susceptibility to toxicity. The toxicity/susceptibility data base under construction is part of a longer term research effort to lay the groundwork for quantitative distributional analyses of non-cancer toxic risks. These data are broken down into a variety of parameter types that encompass different portions of the pathway from external exposure to the production of biological responses. The discrete steps in this pathway, as we now conceive them, are:

- Contact Rate (breathing rates per body weight; fish consumption per body weight)
- Uptake or Absorption as a Fraction of Intake or Contact Rate
- General Systemic Availability Net of First-Pass Elimination and Dilution via Distribution Volume (e.g., initial blood concentration per mg/kg of uptake)
- Systemic Elimination (half-life or clearance)
- Active Site Concentration per Systemic Blood or Plasma Concentration
- Physiological Parameter Change per Active Site Concentration (expressed as the dose required to make a given percentage change in different people, or the dose required to achieve some proportion of an individual's maximum response to the drug or toxicant)
- Functional Reserve Capacity: the change in a baseline physiological parameter needed to produce a biological response or pass a criterion of abnormal function

Comparison of the amounts of variability observed for the different parameter types suggests that appreciable variability is associated with the final step in the process: differences among people in functional reserve capacity. This has the implication that relevant information for estimating effective toxic susceptibility distributions may be gleaned by direct studies of the population distributions of key physiological parameters in people who are not exposed to the environmental and occupational toxicants thought to perturb those parameters. This is illustrated with some recent observations of the population distributions of low-density lipoprotein cholesterol from the second and third National Health and Nutrition Examination Surveys.

8.
The increased frequency of extreme events in recent years highlights the emerging need for methods that could contribute to mitigating the impact of such events on critical infrastructures and boost their resilience against them. This article proposes an online spatial risk analysis capable of providing an indication of the evolving risk of power system regions subject to extreme events. A Severity Risk Index (SRI), supported by real-time monitoring, assesses the impact of extreme events on power system resilience, with application to the effect of windstorms on transmission networks. The index considers the spatial and temporal evolution of the extreme event, system operating conditions, and the degraded system performance during the event. SRI is based on probabilistic risk, condensing the probability and impact of possible failure scenarios while the event is spatially moving across a power system. Because of the large number of possible failures during an extreme event, a scenario generation and reduction algorithm is applied to reduce the computation time. SRI provides the operator with a probabilistic assessment that could lead to effective resilience-based decisions for risk mitigation. The IEEE 24-bus Reliability Test System has been used to demonstrate the effectiveness of the proposed online risk analysis, which was embedded in a sequential Monte Carlo simulation to capture the spatiotemporal effects of extreme events and evaluate the proposed method.
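The following sketch conveys the condensation step only: at each snapshot of a moving storm, sampled failure scenarios are reduced to a single probability-weighted impact number. The fragility curve, network size, and per-line load shed are invented, not the article's.

```python
import numpy as np

rng = np.random.default_rng(2)

# For each hourly snapshot of the moving windstorm: sample failure scenarios,
# each line failing independently per an assumed wind-fragility curve.
n_lines = 38
for hour in range(3):
    wind = 20 + 15 * hour                           # wind rising as storm crosses
    p_fail = min(0.5, 0.01 * np.exp(0.08 * wind))   # assumed fragility curve
    scen = rng.random((500, n_lines)) < p_fail      # 500 sampled failure scenarios
    impact = scen.sum(axis=1) * 30.0                # assumed 30 MW shed per lost line
    sri = impact.mean()                             # probability-weighted impact
    print(f"hour {hour}: severity-risk indication = {sri:7.1f} MW expected load shed")
```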

9.
Putzrath, Resha M.; Wilson, James D. Risk Analysis, 1999, 19(2): 231-247.
We investigated the way results of human health risk assessments are used, and the theory used to describe those methods, sometimes called the NAS paradigm. Contrary to a key tenet of that theory, current methods have strictly limited utility. The characterizations now considered standard, Safety Indices such as the Acceptable Daily Intake, the Reference Dose, and so on, usefully inform only decisions that require a choice between two policy alternatives (e.g., approve a food additive or not), decided solely on the basis of a finding of safety. Risk is characterized as the quotient of one of these Safety Indices divided by an estimate of exposure: a quotient greater than one implies that the situation may be considered safe. Such decisions are very widespread, both in the U.S. federal government and elsewhere. No current method is universal; different policies lead to different practices, for example in California's Proposition 65, where statutory provisions specify some practices. Further, an important kind of human health risk assessment is not recognized by this theory: it characterizes risk as the likelihood of harm, given estimates of exposure consequent to various decision choices. Likelihood estimates are necessary whenever decision makers have many possible decision choices and must weigh more than two societal values, as in EPA's implementation of standards for conventional air pollutants. These estimates cannot be derived using current methods; different methods are needed. Our analysis suggests changes needed in both the theory and practice of human health risk assessment, and in how that practice is described.

10.
Expert knowledge is an important source of input to risk analysis. In practice, experts might be reluctant to characterize their knowledge and the related (epistemic) uncertainty using precise probabilities. The theory of possibility allows for imprecision in probability assignments. The associated possibilistic representation of epistemic uncertainty can be combined with, and transformed into, a probabilistic representation; in this article, we show this with reference to a simple fault tree analysis. We apply an integrated (hybrid) probabilistic-possibilistic computational framework for the joint propagation of the epistemic uncertainty on the values of the (limiting relative frequency) probabilities of the basic events of the fault tree, and we use possibility-probability (probability-possibility) transformations for propagating the epistemic uncertainty within purely probabilistic and possibilistic settings. The results of the different approaches (hybrid, probabilistic, and possibilistic) are compared with respect to the representation of uncertainty about the top event (limiting relative frequency) probability. Both the rationale underpinning the approaches and the computational efforts they require are critically examined. We conclude that the approaches relevant in a given setting depend on the purpose of the risk analysis, and that further research is required to make the possibilistic approaches operational in a risk analysis context.
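One common possibility-to-probability transformation samples a possibility level and then a value uniformly within the corresponding alpha-cut. The triangular possibility distribution below is an invented example, and this sampling scheme is one standard transformation rather than necessarily the one used in the article.

```python
import numpy as np

rng = np.random.default_rng(3)

# Triangular possibility distribution for a basic-event probability
# (illustrative numbers): support [1e-4, 5e-3], core at 1e-3.
low, mode, high = 1e-4, 1e-3, 5e-3

# Possibility -> probability: draw a random possibility level alpha,
# then a value uniformly inside that alpha-cut.
alpha = rng.random(200_000)
cut_lo = low + alpha * (mode - low)
cut_hi = high - alpha * (high - mode)
samples = rng.uniform(cut_lo, cut_hi)

print("mean basic-event probability: %.2e" % samples.mean())
print("95th percentile:              %.2e" % np.percentile(samples, 95))
```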

11.
Recently, the concept of black swans has gained increased attention in the fields of risk assessment and risk management. Different types of black swans have been suggested, distinguishing between unknown unknowns (nothing in the past can convincingly point to their occurrence), unknown knowns (known to some, but not to the relevant analysts), and known knowns whose probability of occurrence is judged as negligible. Traditional risk assessments have been questioned, as their standard probabilistic methods may not be capable of predicting or even identifying these rare and extreme events, thus creating a source of possible black swans. In this article, we show how a simulation model can be used to identify previously unknown, potentially extreme events that, if not identified and treated, could occur as black swans. We show that by manipulating a verified and validated model used to predict the impacts of hazards on a system of interest, we can identify hazard conditions not previously experienced that could lead to impacts much larger than any previous level of impact. This makes these potential black swan events known and allows risk managers to consider them more fully. We demonstrate this method using a model developed to evaluate the effect of hurricanes on energy systems in the United States; we identify hurricanes with potentially extreme impacts, storms well beyond what the historic record suggests is possible in terms of impacts.
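A minimal sketch of the search idea: sweep a validated impact model over hazard-parameter combinations beyond the historical record and flag synthetic storms whose impacts exceed anything previously observed. The impact model and its coefficients are stand-ins, not the authors' hurricane model.

```python
from itertools import product
import numpy as np

# Stand-in impact model: damage as a function of wind speed and landfall
# angle (invented functional form and coefficients).
def impact_model(wind_mph, landfall_angle_deg):
    return 1e3 * np.exp(0.04 * wind_mph) * (1 + 0.2 * np.cos(np.radians(landfall_angle_deg)))

historic_max_wind = 165  # worst storm in the record (assumption)

# Sweep hazard parameters beyond the historical envelope and keep the worst case.
worst = max(
    ((w, a, impact_model(w, a))
     for w, a in product(range(140, 221, 10), range(0, 91, 15))),
    key=lambda t: t[2],
)
print("worst synthetic storm: wind=%d mph, angle=%d deg, impact=%.0f" % worst)
print("exceeds historic wind record:", worst[0] > historic_max_wind)
```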

12.
Various methods for risk characterization have been developed using probabilistic approaches, and data on Vietnamese farmers allow a comparison of the outcomes of different probabilistic methods. This article addresses the health risk characterization of chlorpyrifos using epidemiological dose-response data and probabilistic techniques obtained from a case study with rice farmers in Vietnam. Urine samples were collected from farmers and analyzed for trichloropyridinol (TCP), which was converted into an absorbed daily dose of chlorpyrifos. Adverse health response doses due to chlorpyrifos exposure were collected from epidemiological studies to develop dose-adverse health response relationships. The health risk of chlorpyrifos was quantified using the hazard quotient (HQ), Monte Carlo simulation (MCS), and overall risk probability (ORP) methods. With baseline (prior to pesticide spraying) and lifetime exposure levels (over a lifetime of pesticide spraying events), the HQ ranged from 0.06 to 7.1. The MCS method indicated that less than 0.05% of the population would be affected, while the ORP method indicated that less than 1.5% of the population would be adversely affected. With postapplication exposure levels, the HQ ranged from 1 to 32.5; the risk calculated by the MCS method was that 29% of the population would be affected, and the risk calculated by the ORP method was 33%. The MCS and ORP methods have an advantage in risk characterization because they use the full distributions of both exposure and dose-response data, whereas the HQ method uses only the exposure distribution. These evaluations indicated that single-event spraying is likely to have adverse effects on Vietnamese rice farmers.
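A compact sketch of the hazard quotient and Monte Carlo calculations described here; the dose distribution and reference dose are illustrative placeholders, not the study's values.

```python
import numpy as np

rng = np.random.default_rng(4)

# Lognormal absorbed daily dose of chlorpyrifos (mg/kg/day) and a reference
# dose; both parameterizations are illustrative assumptions.
dose = rng.lognormal(mean=np.log(0.002), sigma=0.8, size=100_000)
rfd = 0.003  # reference dose (assumption)

# Hazard quotient per simulated individual: dose divided by the safe dose.
hq = dose / rfd
print("HQ at 95th percentile: %.2f" % np.percentile(hq, 95))

# Monte Carlo style characterization: fraction of the simulated population
# whose dose exceeds the adverse-effect threshold (HQ > 1).
print("fraction with HQ > 1: %.3f" % (hq > 1).mean())
```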

13.
Health Risk Assessment of a Modern Municipal Waste Incinerator   (Cited by 2; self-citations: 0; others: 2)
During the modernization of the municipal waste incinerator (MWI, maximum capacity of 180,000 tons per year) of Metropolitan Grenoble (405,000 inhabitants), in France, a risk assessment was conducted, based on four tracer pollutants: two volatile organic compounds (benzene and 1,1,1-trichloroethane) and two heavy metals (nickel and cadmium, measured in particles). A Gaussian plume dispersion model, applied to maximum emissions measured at the MWI stacks, was used to estimate the distribution of these pollutants in the atmosphere throughout the metropolitan area. A random-sample telephone survey (570 subjects) gathered data on time-activity patterns, according to demographic characteristics of the population. Life-long exposure was assessed as a time-weighted average of ambient air concentrations. Inhalation alone was considered because, in the Grenoble urban setting, other routes of exposure are not likely. A Monte Carlo simulation was used to describe probability distributions of exposures and risks. The median of the life-long personal exposure distribution to MWI benzene was 3.2·10⁻⁵ µg/m³ (20th and 80th percentiles = 1.5·10⁻⁵ and 6.5·10⁻⁵ µg/m³), yielding a 2.6·10⁻¹⁰ carcinogenic risk (1.2·10⁻¹⁰ to 5.4·10⁻¹⁰). For nickel, the corresponding life-long exposure and cancer risk were 1.8·10⁻⁴ µg/m³ (0.9·10⁻⁴ to 3.6·10⁻⁴ µg/m³) and 8.6·10⁻⁸ (4.3·10⁻⁸ to 17.3·10⁻⁸); for cadmium they were respectively 8.3·10⁻⁶ µg/m³ (4.0·10⁻⁶ to 17.6·10⁻⁶) and 1.5·10⁻⁸ (7.2·10⁻⁹ to 3.1·10⁻⁸). Inhalation exposure to cadmium emitted by the MWI represented less than 1% of the WHO Air Quality Guideline (5 ng/m³), while there was a margin of exposure of more than 10⁹ between the NOAEL (150 ppm) and exposure estimates for trichloroethane. Neither dioxins nor mercury, a volatile metal, were measured; this could lessen the attributable life-long risks estimated here. The minute (VOCs and cadmium) to moderate (nickel) exposure and risk estimates are, however, in accord with other studies of modern MWIs meeting recent emission regulations.
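The time-weighted exposure and risk calculation can be sketched as follows; the concentrations, time-activity fractions, and unit risk below are illustrative placeholders rather than the study's inputs.

```python
import numpy as np

rng = np.random.default_rng(5)

# Time-weighted life-long exposure: modeled ambient concentration at home and
# at work, weighted by survey-derived time-activity fractions (all invented).
c_home = rng.lognormal(np.log(3e-5), 0.7, 50_000)  # µg/m³ at residence
c_work = rng.lognormal(np.log(5e-5), 0.7, 50_000)  # µg/m³ at workplace
f_home = rng.uniform(0.55, 0.75, 50_000)           # fraction of time at home

exposure = f_home * c_home + (1 - f_home) * c_work
unit_risk = 8e-6                                   # per µg/m³ (assumed slope)
risk = exposure * unit_risk

print("median lifetime exposure: %.1e µg/m³" % np.median(exposure))
print("median lifetime risk:     %.1e" % np.median(risk))
```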

14.
Bayesian Forecasting via Deterministic Model   (Cited by 1; self-citations: 0; others: 1)
Rational decision making requires that the total uncertainty about a variate of interest (a predictand) be quantified in terms of a probability distribution, conditional on all available information and knowledge. Suppose the state of knowledge is embodied in a deterministic model, which is imperfect and outputs only an estimate of the predictand. Fundamentals are presented of two Bayesian methods for producing a probabilistic forecast via any deterministic model. The Bayesian Processor of Forecast (BPF) quantifies the total uncertainty in terms of a posterior distribution, conditional on the model output. The Bayesian Forecasting System (BFS) decomposes the total uncertainty into input uncertainty and model uncertainty, which are characterized independently and then integrated into a predictive distribution. The BFS is compared with Monte Carlo simulation and the ensemble forecasting technique, neither of which can alone produce a probabilistic forecast that quantifies the total uncertainty, but each of which can serve as a component of the BFS.
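A minimal normal-normal sketch of the BPF idea: the deterministic model output is treated as a noisy, calibrated observation of the predictand, and Bayes's rule yields the posterior. All parameter values here are invented.

```python
# Normal-normal conjugate sketch of a Bayesian Processor of Forecast.
mu0, var0 = 10.0, 9.0        # prior on the predictand: x ~ N(mu0, var0)
a, b, var_e = 0.9, 1.0, 4.0  # calibration of the model: y | x ~ N(a*x + b, var_e)

y = 12.5                     # today's deterministic model output (assumption)

# Conjugate update: the posterior of x given y is again normal.
prec = 1 / var0 + a**2 / var_e
post_var = 1 / prec
post_mean = post_var * (mu0 / var0 + a * (y - b) / var_e)
print(f"posterior for the predictand: N({post_mean:.2f}, {post_var:.2f})")
```

The posterior variance is smaller than the prior variance, which is the sense in which the imperfect model output reduces, but does not eliminate, the total uncertainty.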

15.
A central part of probabilistic public health risk assessment is the selection of probability distributions for the uncertain input variables. In this paper, we apply the first-order reliability method (FORM)(1-3) as a probabilistic tool to assess the effect of the probability distributions of the input random variables on the probability that risk exceeds a threshold level (termed the probability of failure) and on the relevant probabilistic sensitivities. The analysis was applied to a case study given by Thompson et al.(4) on cancer risk caused by the ingestion of benzene-contaminated soil. Normal, lognormal, and uniform distributions were used in the analysis. The results show that the selection of a probability distribution function for the uncertain variables in this case study had a moderate impact on the probability that values would fall above a given threshold risk when the threshold risk is at the 50th percentile of the original distribution given by Thompson et al.,(4) and a much greater impact when the threshold risk level is at the 95th percentile. The impact on uncertainty sensitivity, however, showed a reversed trend: the impact was more appreciable for the 50th percentile of the original distribution of risk than for the 95th percentile. Nevertheless, the choice of distribution shape did not alter the order of probabilistic sensitivity of the basic uncertain variables.
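For a linear limit state with independent normal inputs, the FORM reliability index has a closed form, which the sketch below evaluates; the coefficients and threshold are invented, not taken from Thompson et al.

```python
import numpy as np
from scipy.stats import norm

# FORM sketch for a linear limit state g = threshold - risk(X) with
# independent normal inputs (all values illustrative).
mu = np.array([2.0, 0.5])    # means of the two uncertain inputs
sd = np.array([0.6, 0.2])    # standard deviations
coef = np.array([1.0, 3.0])  # risk = coef @ X (assumed linear risk model)
threshold = 4.5              # risk level defining "failure"

# For linear g, the reliability index beta = E[g] / std[g], and the
# probability of failure (risk exceeding the threshold) is Phi(-beta).
g_mean = threshold - coef @ mu
g_sd = np.sqrt(((coef * sd) ** 2).sum())
beta = g_mean / g_sd
print("beta = %.2f, P(risk > threshold) = %.4f" % (beta, norm.cdf(-beta)))
```

Swapping the normal inputs for lognormal or uniform ones changes the transformation to standard normal space and hence beta, which is exactly the distribution-selection effect the paper studies.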

16.
Revie, Matthew. Risk Analysis, 2011, 31(7): 1120-1132.
Traditional statistical procedures for estimating the probability of an event result in an estimate of zero when no events are realized. Alternative inferential procedures have been proposed for the situation where zero events have been realized, but these are often ad hoc, relying on selecting methods dependent on the data that have been realized. Such data-dependent inference decisions violate fundamental statistical principles, resulting in estimation procedures whose benefits are difficult to assess. In this article, we propose estimating the probability of an event occurring through minimax inference on the probability that future samples of equal size realize no more events than that in the data on which the inference is based. Although motivated by inference on rare events, the method is not restricted to zero-event data and closely approximates the maximum likelihood estimate (MLE) for nonzero data. The use of the minimax procedure provides a risk-averse inferential procedure where no events are realized. A comparison is made with the MLE, and regions of the underlying probability are identified where this approach is superior. Moreover, a comparison is made with three standard approaches to supporting inference where no event data are realized, which we argue are unduly pessimistic. We show that for situations of zero events the estimator can be approximated by a simple closed-form expression in n, the number of trials.
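The article's closed-form approximation is not reproduced here, but the flavor of prediction-based inference for zero-event data can be illustrated as follows. Matching the median of the predictive distribution is our own illustrative reading, not necessarily the article's estimator.

```python
# Zero events in n trials: the MLE is 0/n = 0. A prediction-based alternative
# (an illustrative reading, not necessarily the article's minimax estimator)
# picks the p for which a future sample of the same size has a 50% chance of
# again showing zero events: (1 - p)**n = 0.5.
for n in (10, 100, 1000):
    p = 1 - 0.5 ** (1 / n)
    print(f"n = {n:4d}: MLE = 0, prediction-matched p = {p:.5f} (~{p * n:.2f}/n)")
```

Any such estimator is strictly positive and shrinks roughly like a constant over n, which is the risk-averse behavior the abstract describes for the zero-event case.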

17.
Risk Analysis, 2018, 38(8): 1534-1540.
An extreme space weather event has the potential to disrupt or damage the infrastructure systems and technologies that many societies rely on for economic and social well-being. Space weather events occur regularly, but extreme events are less frequent, with a small number of historical examples over the last 160 years. During the past decade, published works have (1) examined the physical characteristics of the extreme historical events and (2) discussed the probability or return rate of select extreme geomagnetic disturbances, including the 1859 Carrington event. Here we present initial findings on a unified framework approach to visualizing space weather event probability, using a Bayesian model average, in the context of historical extreme events. We present disturbance storm time (Dst) probability (a proxy for geomagnetic disturbance intensity) across multiple return periods and discuss parameters of interest to policymakers and planners in the context of past extreme space weather events. We discuss the current state of these analyses, their utility to policymakers and planners, the current limitations when compared to other hazards, and several gaps that need to be filled to enhance space weather risk assessments.
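A minimal sketch of Bayesian model averaging as applied to exceedance probabilities; the per-model tail probabilities and posterior weights are invented for illustration.

```python
import numpy as np

# BMA sketch: several candidate tail models each give a probability that the
# geomagnetic disturbance exceeds an extreme threshold in a given period;
# they are combined by posterior model weights. All numbers are invented.
p_exceed = np.array([0.008, 0.015, 0.030])  # per-model exceedance probabilities
weights = np.array([0.5, 0.3, 0.2])         # posterior model weights (assumption)

p_bma = weights @ p_exceed
print("model-averaged exceedance probability: %.4f" % p_bma)
```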

18.
Use of probability distributions by regulatory agencies often focuses on the extreme events and scenarios that correspond to the tail of probability distributions. This paper makes the case that assessment of the tail of the distribution can, and often should, be performed separately from assessment of the central values. Factors to consider when developing distributions that account for tail behavior include (a) the availability of data, (b) the characteristics of the tail of the distribution, and (c) the value of additional information in assessment. The integration of these elements will improve the modeling of extreme events by the tail of distributions, thereby providing policymakers with critical information on the risk of extreme events. Two examples provide insight into the theme of the paper. The first demonstrates the need for a parallel analysis that separates the extreme events from the central values. The second shows a link between the selection of the tail distribution and a decision criterion. In addition, the phenomenon of record-breaking values in time-series data gives insight into the information that characterizes extreme values. One methodology for treating the risk of extreme events explicitly adopts the conditional expected value as a measure of risk; theoretical results concerning this measure are given to clarify some of the concepts of the risk of extreme events.
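The conditional expected value mentioned here is easy to illustrate by simulation: the expected loss given that the loss exceeds a high percentile. The damage distribution below is an invented example.

```python
import numpy as np

rng = np.random.default_rng(6)

# Conditional expected value as an extreme-event risk measure: the expected
# damage given that damage exceeds a high threshold (illustrative lognormal).
damage = rng.lognormal(mean=1.0, sigma=1.0, size=1_000_000)
threshold = np.percentile(damage, 99)

tail = damage[damage > threshold]
print("unconditional mean damage:       %.2f" % damage.mean())
print("E[damage | damage > 99th pct]:   %.2f" % tail.mean())
```

The gap between the two numbers shows why the tail warrants a separate assessment: the unconditional mean says little about what happens once an extreme event has occurred.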

19.
The paper applies classical statistical principles to yield new tools for risk assessment and makes new use of epidemiological data for human risk assessment. An extensive clinical and epidemiological study of workers engaged in the manufacturing and formulation of aldrin and dieldrin provides occupational hygiene and biological monitoring data on individual exposures over the years of employment and provides unusually accurate measures of individual lifetime average daily doses. In the cancer dose-response modeling, each worker is treated as a separate experimental unit with his own unique dose. Maximum likelihood estimates of added cancer risk are calculated for multistage, multistage-Weibull, and proportional hazards models. Distributional characterizations of added cancer risk are based on bootstrap and relative likelihood techniques. The cancer mortality data on these male workers suggest that low-dose exposures to aldrin and dieldrin do not significantly increase human cancer risk and may even decrease the human hazard rate for all types of cancer combined at low doses (e.g., 1 µg/kg/day). The apparent hormetic effect in the best-fitting dose-response models for this data set is statistically significant. The decrease in cancer risk at low doses of aldrin and dieldrin is in sharp contrast to the U.S. Environmental Protection Agency's upper bound on cancer potency based on mouse liver tumors. The EPA's upper bound implies that lifetime average daily doses of 0.0000625 and 0.00625 µg/kg body weight/day would correspond to increased cancer risks of 0.000001 and 0.0001, respectively. However, the best estimate from the Pernis epidemiological data is that there is no increase in cancer risk in these workers at these doses or even at doses as large as 2 µg/kg/day.

20.
As far as we know, for most polynomially solvable network optimization problems, their inverse problems under the l₁ or l∞ norm have been studied, except the inverse maximum-weight matching problem in non-bipartite networks. In this paper we discuss the inverse problem of maximum-weight perfect matching in a non-bipartite network under the l₁ and l∞ norms. It has been proved that the inverse maximum-weight perfect matching under the l∞ norm can be formulated as a maximum-mean alternating cycle problem on an undirected network, and can be solved in polynomial time by a binary search algorithm and in strongly polynomial time by an ascending algorithm; under the l₁ norm it can be solved by the ellipsoid method. Therefore, inverse problems of maximum-weight perfect matching under the l₁ and l∞ norms are solvable in polynomial time.
