Similar Documents
1.
Expert knowledge is an important source of input to risk analysis. In practice, experts might be reluctant to characterize their knowledge and the related (epistemic) uncertainty using precise probabilities. The theory of possibility allows for imprecision in probability assignments. The associated possibilistic representation of epistemic uncertainty can be combined with, and transformed into, a probabilistic representation; in this article, we show this with reference to a simple fault tree analysis. We apply an integrated (hybrid) probabilistic-possibilistic computational framework for the joint propagation of the epistemic uncertainty on the values of the (limiting relative frequency) probabilities of the basic events of the fault tree, and we use possibility-probability (probability-possibility) transformations for propagating the epistemic uncertainty within purely probabilistic and possibilistic settings. The results of the different approaches (hybrid, probabilistic, and possibilistic) are compared with respect to the representation of uncertainty about the top event (limiting relative frequency) probability. Both the rationale underpinning the approaches and the computational efforts they require are critically examined. We conclude that the approaches relevant in a given setting depend on the purpose of the risk analysis, and that further research is required to make the possibilistic approaches operational in a risk analysis context.
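To make the hybrid propagation concrete, here is a minimal Python sketch in the spirit of the framework described (the gate structure, distributions, and all numbers are invented for illustration, not taken from the article). One basic event carries probabilistic epistemic uncertainty sampled by Monte Carlo; the other carries a triangular possibility distribution processed through alpha-cuts; the OR gate is monotone, so interval endpoints propagate directly:

```python
import numpy as np

rng = np.random.default_rng(42)

def triangular_alpha_cut(a, m, b, alpha):
    """Alpha-cut [lo, hi] of a triangular possibility distribution (a, m, b)."""
    return a + alpha * (m - a), b - alpha * (b - m)

def top_event_or(p1, p2):
    """Top event probability for an OR gate over independent basic events."""
    return 1.0 - (1.0 - p1) * (1.0 - p2)

n_samples, alphas = 1000, np.linspace(0.0, 1.0, 11)
lower = np.empty((n_samples, alphas.size))
upper = np.empty((n_samples, alphas.size))

for i in range(n_samples):
    p1 = rng.beta(2.0, 50.0)          # probabilistic epistemic uncertainty on event 1
    for j, alpha in enumerate(alphas):
        lo, hi = triangular_alpha_cut(1e-3, 5e-3, 2e-2, alpha)  # possibilistic event 2
        # the OR gate is monotonically increasing in both arguments, so each
        # alpha-cut interval maps to the interval of its endpoint images
        lower[i, j] = top_event_or(p1, lo)
        upper[i, j] = top_event_or(p1, hi)

# Summarize the random fuzzy output by averaging the cut bounds over the samples
print("alpha=1 (core)    :", lower[:, -1].mean(), upper[:, -1].mean())
print("alpha=0 (support) :", lower[:, 0].mean(), upper[:, 0].mean())
```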

2.
A better understanding of the uncertainty that exists in models used for seismic risk assessment is critical to improving risk-based decisions pertaining to earthquake safety. Current models estimating the probability of collapse of a building do not consider comprehensively the nature and impact of uncertainty. This article presents a model framework to enhance seismic risk assessment and thus gives decisionmakers a fuller understanding of the nature and limitations of the estimates. This can help ensure that risks are not over- or underestimated and the value of acquiring accurate data is appreciated fully. The methodology presented provides a novel treatment of uncertainties in input variables, their propagation through the model, and their effect on the results. The study presents ranges of possible annual collapse probabilities for different case studies on buildings in different parts of the world, exposed to different levels of seismicity, and with different vulnerabilities. A global sensitivity analysis was conducted to determine the significance of uncertain variables. Two key outcomes are (1) that the uncertainty in ground-motion conversion equations has the largest effect on the uncertainty in the calculation of annual collapse probability; and (2) the vulnerability of a building appears to have an effect on the range of annual collapse probabilities produced, i.e., the level of uncertainty in the estimate of annual collapse probability, with less vulnerable buildings having a smaller uncertainty.
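A stylized version of that propagation-plus-sensitivity workflow (the collapse surrogate, distributions, and hazard numbers here are placeholders, not the article's model): uncertain inputs are sampled, pushed through a toy annual-collapse-probability surrogate, and a crude variance-based first-order index is computed for each input:

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(0)
n = 100_000

# Illustrative uncertain inputs (all distributions are assumptions of this sketch):
ln_capacity = rng.normal(np.log(0.4), 0.2, n)   # log median collapse capacity (g)
dispersion = rng.uniform(0.3, 0.6, n)           # fragility dispersion
gmce_bias = rng.normal(0.0, 0.15, n)            # ground-motion conversion equation bias

# Toy surrogate for annual collapse probability: an assumed annual rate of the
# design-level intensity measure times a lognormal fragility evaluated there
rate_of_im = 1e-3
p_collapse = rate_of_im * norm.cdf((np.log(0.4) + gmce_bias - ln_capacity) / dispersion)

def first_order_index(x, y, bins=50):
    """Crude variance-based first-order sensitivity, Var(E[y|x]) / Var(y),
    estimated by binning x into equal-probability bins."""
    edges = np.quantile(x, np.linspace(0.0, 1.0, bins + 1))
    idx = np.clip(np.digitize(x, edges[1:-1]), 0, bins - 1)
    cond_means = np.array([y[idx == k].mean() for k in range(bins)])
    counts = np.array([(idx == k).sum() for k in range(bins)])
    return np.average((cond_means - y.mean()) ** 2, weights=counts) / y.var()

for name, x in [("median capacity", ln_capacity),
                ("dispersion", dispersion),
                ("GMCE bias", gmce_bias)]:
    print(f"S1[{name}] ~ {first_order_index(x, p_collapse):.2f}")
```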

3.
4.
Our concept of nine risk evaluation criteria, six risk classes, a decision tree, and three management categories was developed to improve the effectiveness, efficiency, and political feasibility of risk management procedures. The main task of risk evaluation and management is to develop adequate tools for dealing with the problems of complexity, uncertainty, and ambiguity. Based on the characteristics of different risk types and these three major problems, we distinguished three types of management: risk-based, precaution-based, and discourse-based strategies. The risk-based strategy is the common solution to risk problems. Once the probabilities and their corresponding damage potentials are calculated, risk managers are required to set priorities according to the severity of the risk, which may be operationalized as a linear combination of damage and probability or as a weighted combination thereof (a minimal numeric sketch follows this abstract). Within our new risk classification, the two central components have been augmented with other physical and social criteria that still demand risk-based strategies as long as uncertainty is low and ambiguity absent. Risk-based strategies are the best solutions to problems of complexity and some components of uncertainty, for example, variation among individuals. If the two most important risk criteria, probability of occurrence and extent of damage, are relatively well known and little uncertainty is left, the traditional risk-based approach seems reasonable. If uncertainty plays a large role, in particular indeterminacy or lack of knowledge, the risk-based approach becomes counterproductive. Judging the relative severity of risks on the basis of uncertain parameters does not make much sense. Under these circumstances, management strategies belonging to the precautionary management style are required. The precautionary approach has been the basis for much of the European environmental and health protection legislation and regulation. Our own approach to risk management has been guided by the proposition that any conceptualization of the precautionary principle should be (1) in line with established methods of scientific risk assessment, (2) consistent and discriminatory (avoiding arbitrary results) when it comes to prioritization, and (3) at the same time, specific with respect to precautionary measures, such as ALARA or BACT, or the strategy of containing risks in time and space. This suggestion does, however, entail a major problem: looking only to the uncertainties does not provide risk managers with a clue about where to set priorities for risk reduction. Risks vary in their degree of remaining uncertainties. How can one judge the severity of a situation when the potential damage and its probability are unknown or contested? In this dilemma, we advise risk managers to use additional criteria of hazardousness, such as "ubiquity," "reversibility," and "pervasiveness over time," as proxies for judging severity. Our approach also distinguishes clearly between uncertainty and ambiguity. Uncertainty refers to a situation of being unclear about factual statements; ambiguity to a situation of contested views about the desirability or severity of a given hazard. Uncertainty can in principle be resolved by further cognitive advances (with the exception of indeterminacy), ambiguity only by discourse. Discursive procedures include legal deliberations as well as novel participatory approaches. In addition, discursive methods of planning and conflict resolution can be used.
If ambiguities are associated with a risk problem, it is not enough to demonstrate that risk regulators are open to public concerns and address the issues that many people wish them to take care of. The process of risk evaluation itself needs to be open to public input and new forms of deliberation. We have recommended a tested set of deliberative processes that are, at least in principle, capable of resolving ambiguities in risk debates (for a review, see Renn, Webler, & Wiedemann, 1995). Deliberative processes are needed, however, for all three types of management. Risk-based management relies on epistemological, uncertainty-based management on reflective, and discourse-based management on participatory discourse forms. These three types of discourse could be labeled as an analytic-deliberative procedure for risk evaluation and management. We see the advantage of a deliberative style of regulation and management in a dynamic balance between procedure and outcome. Procedure should not have priority over the outcome; outcome should not have priority over the procedure. An intelligent combination of both can elaborate the required prerequisites of democratic deliberation and its substantial outcomes to enhance the legitimacy of political decisions (Gutmann & Thompson, 1996; Bohman, 1997, 1998).
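The "linear combination of damage and probability" used for risk-based prioritization, referenced above, fits in a few lines; the risks, their numbers, and the weights below are all hypothetical:

```python
# Hypothetical risks: (name, annual probability of occurrence, extent of damage)
risks = [("plant fire", 1e-3, 5e6), ("toxic release", 1e-5, 2e8), ("minor spill", 1e-1, 1e4)]

p_max = max(p for _, p, _ in risks)
d_max = max(d for _, _, d in risks)

def severity(p, d, w_p=0.5, w_d=0.5):
    """Weighted linear combination of normalized probability and damage;
    equal weights give the plain (unweighted) linear combination."""
    return w_p * (p / p_max) + w_d * (d / d_max)

# Rank the hypothetical risks by severity, equal-weighted and damage-weighted
for name, p, d in sorted(risks, key=lambda r: -severity(r[1], r[2])):
    print(f"{name:13s}  severity = {severity(p, d):.3f}  "
          f"damage-weighted = {severity(p, d, w_p=0.3, w_d=0.7):.3f}")
```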

5.
The aim of this article is to investigate some implications of complexity in workplace risk assessment. The workplace is examined as a complex system, and some of its attributes and aspects of its behavior are investigated. The failure probability of various workplace elements is examined as a time variable, and interference phenomena among these probabilities are presented. Potential inefficiencies of common perceptions in applying probabilistic risk assessment models are also discussed. This investigation is conducted through mathematical modeling and qualitative examples of workplace situations. A mathematical model is developed to simulate the evolution of workplace accident probability over time. Its findings are then translated into real-world terms and discussed through simple examples of workplace situations. The mathematical model indicates that the workplace is likely to exhibit unpredictable behavior. Such behavior raises issues about usual key assumptions for the workplace, such as aggregation. Chaotic phenomena (nonlinear feedback mechanisms) are also investigated in simple workplace systems. The main conclusions are (1) that time is an important variable for risk assessment, since behavior patterns are complex and unpredictable in the long term, and (2) that workplace risk identification should take a holistic view (not proceed work post by work post).
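The long-term unpredictability the authors point to can be illustrated with a textbook nonlinear-feedback recursion; this logistic map is a generic stand-in, not the article's actual model:

```python
import numpy as np

def evolve(p0, r, steps):
    """Logistic-map recursion p_{t+1} = r * p_t * (1 - p_t), a standard example
    of nonlinear feedback that turns a simple deterministic rule chaotic."""
    p = np.empty(steps)
    p[0] = p0
    for t in range(steps - 1):
        p[t + 1] = r * p[t] * (1.0 - p[t])
    return p

# Two nearly identical initial "accident probabilities" diverge under strong feedback
a = evolve(0.200000, r=3.9, steps=50)
b = evolve(0.200001, r=3.9, steps=50)
print("step 10 difference:", abs(a[10] - b[10]))
print("step 49 difference:", abs(a[49] - b[49]))  # sensitive dependence on initial conditions
```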

6.
The aging domestic oil production infrastructure represents a high risk to the environment because of the type of fluids being handled (oil and brine) and the potential for accidental release of these fluids into sensitive ecosystems. Currently, there is no quantitative risk model directly applicable to onshore oil exploration and production (E&P) facilities. We report on a probabilistic reliability model created for such facilities. Reliability theory, failure modes and effects analysis (FMEA), and event trees were used to develop the model estimates of the failure probability of typical oil production equipment. Monte Carlo simulation was used to translate uncertainty in input parameter values into uncertainty in the model output. The predicted failure rates were calibrated to available failure rate information by adjusting the probability density function parameters used as random variates in the Monte Carlo simulations; the mean and standard deviation of the normal distributions from which the Weibull characteristic life was chosen served as the adjustable parameters in the model calibration. The model was applied to oil production leases in the Tallgrass Prairie Preserve, Oklahoma. We present the estimated failure probability due to the combination of the most significant failure modes associated with each type of equipment (pumps, tanks, and pipes). The results show that the estimated probability of failure for tanks is about the same as that for pipes, but that pumps have a much lower failure probability. The model can provide the equipment reliability information needed for proactive risk management at the lease level, supplying quantitative grounds for allocating maintenance resources to high-risk equipment so as to minimize both lost production and ecosystem damage.
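A stripped-down sketch of the calibration machinery described, with every number a placeholder: the Weibull characteristic life is drawn from a normal distribution whose mean and standard deviation play the role of the adjustable calibration parameters:

```python
import numpy as np

rng = np.random.default_rng(1)

def failure_prob(t, eta, beta):
    """Weibull cumulative failure probability at time t
    (characteristic life eta, shape parameter beta)."""
    return 1.0 - np.exp(-(t / eta) ** beta)

n, mission_time, shape = 10_000, 365.0, 1.5   # days; shape > 1 assumes wear-out
# Calibration parameters: mean/std of the normal from which characteristic life is drawn
eta_mean, eta_std = 2000.0, 400.0
eta = np.clip(rng.normal(eta_mean, eta_std, n), 1e-6, None)   # keep lives positive

p_fail = failure_prob(mission_time, eta, shape)
print(f"pump failure probability: mean={p_fail.mean():.4f}, "
      f"90% interval=({np.quantile(p_fail, 0.05):.4f}, {np.quantile(p_fail, 0.95):.4f})")
```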

7.
Probabilistic safety analysis (PSA) has been used in nuclear, chemical, petrochemical, and several other industries. The probability and/or frequency results of most PSAs are based on average component unavailabilities during the mission of interest. While these average results are useful, they provide no indication of the significance of the facility's current status when one or more components are known to be out of service. Recently, several interactive computational models have been developed for nuclear power plants to allow the user to specify the plant's status at a particular time (i.e., to specify equipment known to be out of service) and then to receive updated PSA information. As with conventional PSA results, there are uncertainties associated with the numerical updated results. These uncertainties stem from a number of sources, including parameter uncertainty (uncertainty in equipment failure rates and human error probabilities). This paper presents an analysis of the impact of parameter uncertainty on updated PSA results.
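The following toy sketch (component types, gate logic, medians, and error factors are all invented) shows the basic mechanics of such an update: a component known to be out of service has its unavailability set to 1, and lognormal parameter uncertainty is re-propagated to the top event:

```python
import numpy as np

rng = np.random.default_rng(7)
n = 50_000

def lognormal(median, error_factor, size):
    """Lognormal epistemic uncertainty; error factor defined at the 95th percentile."""
    sigma = np.log(error_factor) / 1.645
    return rng.lognormal(np.log(median), sigma, size)

q_pump_a = lognormal(1e-3, 3.0, n)
q_pump_b = lognormal(1e-3, 3.0, n)
q_valve = lognormal(5e-4, 5.0, n)

def top_event(qa, qb, qv):
    """System fails if the valve fails OR both redundant pumps are unavailable."""
    return 1.0 - (1.0 - qa * qb) * (1.0 - qv)

baseline = top_event(q_pump_a, q_pump_b, q_valve)
updated = top_event(np.ones(n), q_pump_b, q_valve)   # pump A out of service -> q = 1

for label, q in [("baseline", baseline), ("pump A out of service", updated)]:
    print(f"{label:22s} mean={q.mean():.2e}  95th pct={np.quantile(q, 0.95):.2e}")
```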

8.
A central part of probabilistic public health risk assessment is the selection of probability distributions for the uncertain input variables. In this paper, we apply the first-order reliability method (FORM)(1-3) as a probabilistic tool to assess the effect of the probability distributions of the input random variables on the probability that risk exceeds a threshold level (termed the probability of failure) and on the relevant probabilistic sensitivities. The analysis was applied to a case study given by Thompson et al.(4) on cancer risk caused by the ingestion of benzene-contaminated soil. Normal, lognormal, and uniform distributions were used in the analysis. The results show that the selection of a probability distribution function for the uncertain variables in this case study had a moderate impact on the probability that values would fall above a given threshold risk when the threshold risk is at the 50th percentile of the original distribution given by Thompson et al.,(4) and a much greater impact when the threshold risk level was at the 95th percentile. The impact on uncertainty sensitivity, however, showed a reversed trend: the impact was more appreciable at the 50th percentile of the original risk distribution than at the 95th percentile. Nevertheless, the choice of distribution shape did not alter the order of probabilistic sensitivity of the basic uncertain variables.
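FORM itself iterates to a design point, which we do not reproduce here; the plain Monte Carlo sketch below captures the qualitative finding on a toy multiplicative risk model (not the benzene case study's actual equations). Matching mean and standard deviation across normal, lognormal, and uniform inputs, the distribution choice matters far more at a 95th-percentile threshold than at the median:

```python
import numpy as np

rng = np.random.default_rng(3)
n = 200_000

def normal(mean, std, size):
    return np.clip(rng.normal(mean, std, size), 1e-9, None)   # truncate at zero

def lognormal(mean, std, size):
    sigma2 = np.log(1.0 + (std / mean) ** 2)                  # moment matching
    return rng.lognormal(np.log(mean) - sigma2 / 2.0, np.sqrt(sigma2), size)

def uniform(mean, std, size):
    half = np.sqrt(3.0) * std                                  # same mean and std
    return rng.uniform(mean - half, mean + half, size)

def risk(sampler):
    """Toy multiplicative risk: concentration * intake * potency, all uncertain."""
    return sampler(1.0, 0.5, n) * sampler(1.0, 0.3, n) * sampler(1.0, 0.4, n)

base = risk(lognormal)
for label, thresh in [("50th pct", np.quantile(base, 0.50)),
                      ("95th pct", np.quantile(base, 0.95))]:
    probs = {f.__name__: (risk(f) > thresh).mean() for f in (normal, lognormal, uniform)}
    print(label, {k: round(v, 4) for k, v in probs.items()})
```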

9.
Compliance Versus Risk in Assessing Occupational Exposures
Assessments of occupational exposures to chemicals are generally based upon the practice of compliance testing in which the probability of compliance is related to the exceedance [γ, the likelihood that any measurement would exceed an occupational exposure limit (OEL)] and the number of measurements obtained. On the other hand, workers' chronic health risks generally depend upon cumulative lifetime exposures which are not directly related to the probability of compliance. In this paper we define the probability of "overexposure" (θ) as the likelihood that individual risk (a function of cumulative exposure) exceeds the risk inherent in the OEL (a function of the OEL and duration of exposure). We regard θ as a relevant measure of individual risk for chemicals, such as carcinogens, which produce chronic effects after long-term exposures but not necessarily for acutely-toxic substances which can produce effects relatively quickly. We apply a random-effects model to data from 179 groups of workers, exposed to a variety of chemical agents, and obtain parameter estimates for the group mean exposure and the within- and between-worker components of variance. These estimates are then combined with OELs to generate estimates of γ and θ. We show that compliance testing can significantly underestimate the health risk when sample sizes are small. That is, there can be large probabilities of compliance with typical sample sizes, despite the fact that large proportions of the working population have individual risks greater than the risk inherent in the OEL. We demonstrate further that, because the relationship between θ and γ depends upon the within- and between-worker components of variance, it cannot be assumed a priori that exceedance is a conservative surrogate for overexposure. Thus, we conclude that assessment practices which focus upon either compliance or exceedance are problematic and recommend that employers evaluate exposures relative to the probabilities of overexposure.
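Under a random-effects lognormal exposure model, both quantities reduce to closed forms; the sketch below (parameter values invented, and with overexposure simplified to the probability that a worker's long-run mean exposure exceeds the OEL) shows how θ can exceed γ when within-worker variability is large:

```python
import numpy as np
from scipy.stats import norm

# Random-effects lognormal model: ln X_ij = mu + b_i + e_ij,
# with b_i ~ N(0, sB^2) between workers and e_ij ~ N(0, sW^2) within a worker.
mu, sB, sW = np.log(0.5), 0.8, 1.2        # illustrative parameters, in OEL units
oel = 1.0

# Exceedance: probability that any single measurement exceeds the OEL
gamma = 1.0 - norm.cdf((np.log(oel) - mu) / np.hypot(sB, sW))

# Overexposure (simplified here to the probability that a worker's long-run
# mean exposure, exp(mu + b_i + sW^2 / 2), exceeds the OEL)
theta = 1.0 - norm.cdf((np.log(oel) - mu - sW**2 / 2.0) / sB)

print(f"exceedance   gamma = {gamma:.3f}")
print(f"overexposure theta = {theta:.3f}")   # exceeds gamma when sW is large
```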

10.
11.
Ali Mosleh. Risk Analysis, 2012, 32(11): 1888-1900
Credit risk is the potential exposure of a creditor to an obligor's failure or refusal to repay the debt in principal or interest. The potential of exposure is measured in terms of probability of default. Many models have been developed to estimate credit risk, with rating agencies dating back to the 19th century. They provide their assessment of probability of default and transition probabilities of various firms in their annual reports. Regulatory capital requirements for credit risk outlined by the Basel Committee on Banking Supervision have made it essential for banks and financial institutions to develop sophisticated models in an attempt to measure credit risk with higher accuracy. The Bayesian framework proposed in this article uses the techniques developed in physical sciences and engineering for dealing with model uncertainty and expert accuracy to obtain improved estimates of credit risk and associated uncertainties. The approach uses estimates from one or more rating agencies and incorporates their historical accuracy (past performance data) in estimating future default risk and transition probabilities. Several examples demonstrate that the proposed methodology can assess default probability with accuracy exceeding the estimations of all the individual models. Moreover, the methodology accounts for potentially significant departures from "nominal predictions" due to "upsetting events" such as the 2008 global banking crisis.
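The Bayesian machinery is summarized here by a deliberately simplified conjugate stand-in, not Mosleh's actual formulation: each agency's estimate is treated as a noisy observation of the true default probability on the logit scale, with a noise level standing in for its historical accuracy:

```python
import numpy as np

def logit(p):
    return np.log(p / (1.0 - p))

def inv_logit(x):
    return 1.0 / (1.0 + np.exp(-x))

# Agencies' default-probability estimates and (as an assumption of this sketch)
# their historical accuracy encoded as noise std devs on the logit scale
estimates = {"agency_A": (0.020, 0.40), "agency_B": (0.035, 0.80), "agency_C": (0.015, 0.30)}

# Vague normal prior on logit(PD); normal likelihoods give a conjugate update
prior_mean, prior_var = logit(0.05), 4.0
precision = 1.0 / prior_var
weighted_sum = prior_mean / prior_var

for pd_est, noise_sd in estimates.values():
    precision += 1.0 / noise_sd**2            # more accurate agencies weigh more
    weighted_sum += logit(pd_est) / noise_sd**2

post_mean, post_var = weighted_sum / precision, 1.0 / precision
lo, hi = post_mean - 1.96 * np.sqrt(post_var), post_mean + 1.96 * np.sqrt(post_var)
print(f"combined PD = {inv_logit(post_mean):.4f}  "
      f"95% credible interval = ({inv_logit(lo):.4f}, {inv_logit(hi):.4f})")
```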

12.
Andrea Herrmann. Risk Analysis, 2013, 33(8): 1510-1531
How well can people estimate IT-related risk? Although estimating risk is a fundamental activity in software management and risk is the basis for many decisions, little is known about how well IT-related risk can be estimated at all. Therefore, we executed a risk estimation experiment with 36 participants. They estimated the probabilities of IT-related risks, and we investigated the effect of the following factors on the quality of the risk estimation: the estimator's age, work experience in computing, (self-reported) safety awareness and previous experience with the risk in question, the absolute value of the risk's probability, and the effect of knowing the estimates of the other participants (as in the Delphi method). Our main findings are: risk probabilities are difficult to estimate. Younger and inexperienced estimators were not significantly worse than older and more experienced estimators, but the older and more experienced subjects made better use of the knowledge gained by knowing the other estimators' results. Persons with higher safety awareness tend to overestimate risk probabilities, but can better estimate the ordinal ranks of risk probabilities. Previous personal experience with a risk leads to an overestimation of its probability (unlike in fields such as medicine or disasters, where experience with a disease leads to more realistic probability estimates and nonexperience to an underestimation).

13.
Operational risk management of autonomous vehicles in extreme environments is heavily dependent on expert judgments and, in particular, judgments of the likelihood that a failure mitigation action, via correction and prevention, will annul the consequences of a specific fault. However, extant research has not examined the reliability of experts in estimating the probability of failure mitigation. For systems operating in extreme environments, the probability of failure mitigation is taken as a proxy for the probability of a fault not reoccurring. Using a priori expert judgments for an autonomous underwater vehicle mission in the Arctic and a posteriori mission field data, we subsequently developed a generalized linear model that enabled us to investigate this relationship. We found that the probability of failure mitigation alone cannot be used as a proxy for the probability of a fault not reoccurring. We conclude that it is also essential to include the effort to implement the failure mitigation when estimating the probability of a fault not reoccurring. The effort is the time taken by a person (measured in person-months) to execute the task required to implement the fault correction action. We show that once a modicum of operational data is obtained, it is possible to define a generalized linear logistic model to estimate the probability of a fault not reoccurring. We discuss how our findings are important to all autonomous vehicle operations and how similar operations can benefit from revising expert judgments of risk mitigation to take account of the effort required to reduce key risks.
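A sketch of the kind of generalized linear logistic model described, fitted to synthetic records because the mission field data are not reproduced in the abstract; the coefficients and distributions below are invented:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(11)
n = 200

# Synthetic mission records: expert-judged probability of failure mitigation
# and the effort (person-months) spent implementing the mitigation
p_mitigation = rng.uniform(0.3, 0.99, n)
effort = rng.gamma(2.0, 1.5, n)

# Assumed ground truth: reoccurrence is avoided more often when mitigation
# confidence is high AND real implementation effort was expended
logits = -2.0 + 3.0 * p_mitigation + 0.8 * np.log1p(effort)
not_reoccurred = rng.random(n) < 1.0 / (1.0 + np.exp(-logits))

X = np.column_stack([p_mitigation, np.log1p(effort)])
model = LogisticRegression().fit(X, not_reoccurred)

new_fault = np.array([[0.9, np.log1p(2.0)]])   # 90% judged mitigation, 2 person-months
print("P(fault does not reoccur) ~", model.predict_proba(new_fault)[0, 1].round(3))
```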

14.
Risk Analysis, 2018, 38(9): 1847-1870
In flood risk analysis, limitations in the multivariate statistical models adopted to model the hydraulic load have restricted the probability of a defense suffering structural failure to be expressed conditionally on a single hydraulic loading variable. This is an issue at the coast, where multiple loadings act on defenses and the exact combination of loadings dictates their failure probabilities. Recently, a methodology containing a multivariate statistical model with the flexibility to robustly capture the dependence structure between the individual loadings was used to derive extreme nearshore loading conditions. Its adoption will permit the incorporation of more precise representations of a structure's vulnerability in future analyses. In this article, a fragility representation of a shingle beach, where the failure probability is expressed over a three-dimensional loading parameter space (water level, wave height, and period), is derived at two localities. Within the approach, a Gaussian copula is used to capture any dependencies between the simplified geometric parameters of the beach's shape. Beach profiles are simulated from the copula, and the failure probability, given the hydraulic load, is determined by the reformulated Bradbury barrier inertia parameter model. At one site, substantial differences in the annual failure probability distribution are observed between the new and existing approaches. At the other, the beach only becomes vulnerable after a significant reduction of the crest height, with its mean annual failure probability close to that presently predicted. It is concluded that further application of multivariate approaches is likely to yield more effective flood risk management.
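The Gaussian-copula step can be sketched compactly; the correlation, marginals, loading values, and the stand-in fragility rule below are all placeholders, and the reformulated Bradbury barrier inertia parameter model is not reproduced:

```python
import numpy as np
from scipy.stats import norm, multivariate_normal

rng = np.random.default_rng(5)
n = 10_000

# Gaussian copula linking two simplified beach-geometry parameters
rho = 0.6
z = multivariate_normal(mean=[0, 0], cov=[[1, rho], [rho, 1]]).rvs(n, random_state=rng)
u = norm.cdf(z)                                         # copula samples on [0, 1]^2

# Map through the inverse CDFs of the chosen marginals
crest_height = norm.ppf(u[:, 0], loc=5.0, scale=0.5)    # metres
beach_slope = 0.05 + 0.10 * u[:, 1]                     # uniform(0.05, 0.15)

# Toy fragility: failure if the hydraulic load exceeds a resistance built
# from the simulated profile (a stand-in for the barrier inertia model)
water_level, wave_height = 4.0, 1.5                     # one loading combination
resistance = crest_height * beach_slope * 20.0
p_fail = np.mean(water_level + wave_height > resistance)
print(f"P(failure | this loading) ~ {p_fail:.3f}")
```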

15.
Risk Analysis, 2018, 38(8): 1576-1584
Fault trees are used in reliability modeling to create logical models of fault combinations that can lead to undesirable events. The output of a fault tree analysis (the top event probability) is expressed in terms of the failure probabilities of the basic events that are input to the model. Typically, the basic event probabilities are not known exactly but are modeled as probability distributions; therefore, the top event probability is also represented as an uncertainty distribution. Monte Carlo methods are generally used for evaluating the uncertainty distribution, but such calculations are computationally intensive and do not readily reveal the dominant contributors to the uncertainty. In this article, a closed-form approximation for the fault tree top event uncertainty distribution is developed, which is applicable when the uncertainties in the basic events of the model are lognormally distributed. The results of the approximate method are compared with results from two sampling-based methods: namely, the Monte Carlo method and the Wilks method based on order statistics. It is shown that the closed-form expression can provide a reasonable approximation to results obtained by Monte Carlo sampling, without incurring the computational expense. The Wilks method is found to be a useful means of providing an upper bound for the percentiles of the uncertainty distribution while being computationally inexpensive compared with full Monte Carlo sampling. The lognormal approximation method and Wilks's method appear to be attractive, practical alternatives for the evaluation of uncertainty in the output of fault trees and similar multilinear models.
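The article's closed-form expression is not reproduced in the abstract; the sketch below moment-matches a lognormal to the top event of a toy two-cut-set tree, in the same spirit, and compares it against full Monte Carlo and a Wilks-style order-statistic bound (59 runs give a 95%/95% upper tolerance bound on the 95th percentile, since 0.95^59 < 0.05):

```python
import numpy as np
from scipy.stats import lognorm

rng = np.random.default_rng(2024)

def ln_params(median, error_factor):
    """(mu, sigma) of a lognormal from its median and 95th-percentile error factor."""
    return np.log(median), np.log(error_factor) / 1.645

# Top event = Q1*Q2 + Q3 (two minimal cut sets); all basic events lognormal
params = [ln_params(1e-3, 3), ln_params(2e-3, 3), ln_params(1e-5, 10)]

def sample_top(n):
    q = [rng.lognormal(mu, s, n) for mu, s in params]
    return q[0] * q[1] + q[2]

mc = sample_top(100_000)                      # full Monte Carlo reference

def ln_mean_var(mu, s):
    m = np.exp(mu + s**2 / 2)
    return m, (np.exp(s**2) - 1) * m**2

# Analytic mean/variance of the top event (independent basic events and cut sets)
(m1, v1), (m2, v2), (m3, v3) = (ln_mean_var(mu, s) for mu, s in params)
mean_cs1 = m1 * m2
var_cs1 = (v1 + m1**2) * (v2 + m2**2) - mean_cs1**2
mean_top, var_top = mean_cs1 + m3, var_cs1 + v3

# Fit a lognormal to the matched moments and read percentiles from it
sigma2 = np.log(1 + var_top / mean_top**2)
approx = lognorm(s=np.sqrt(sigma2), scale=mean_top * np.exp(-sigma2 / 2))

wilks_bound = sample_top(59).max()            # Wilks 95%/95% upper bound

print(f"95th percentile  MC={np.quantile(mc, 0.95):.3e}  "
      f"lognormal approx={approx.ppf(0.95):.3e}  Wilks bound={wilks_bound:.3e}")
```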

16.
Risk and uncertainty are integral parts of modern technology, and they must be managed effectively to allow the development of reliable, high-quality products. Because so many facets of technology and society involve risk and uncertainty, it is essential that risk management be handled in a systematic manner. Fault-tree analysis is one of the principal methods used in the analysis of systems' safety. Its detailed and systematic deductive structure makes it a valuable tool for design and diagnostic purposes. Point probability and the minimization of the expected failure probability have, until recently, dominated fault-tree analysis. A methodology that incorporates uncertainty analysis, conditional expected risk, and multiple objectives with fault-tree analysis is presented. A computer software package termed the "Distribution Analyzer and Risk Evaluator (DARE) Using Fault Trees," which translates the new methodology into a working decision-support system, is developed. DARE Using Fault Trees is a flexible computer code that is capable of analyzing the risk of the overall system in terms of the probability density function of failure probability. Emphasis is placed on the uncertainty and risk of extreme events. A comparative study between existing codes for fault-tree analysis and DARE demonstrates the strengths of the methodology. A case study of NASA's solid rocket booster is used to perform the comparative analysis.

17.
Decisions in the real world usually involve imprecise information or uncertainty about the processes by which outcomes may be determined. This research reports the results of a laboratory experiment that examined whether the structure of uncertainty, namely, both the center and the range of the probability distribution describing the uncertainty, is an important determinant of choice. Specifically, it examines how the uncertainty of audit by the Internal Revenue Service of income tax returns affects taxpayers' decisions about intentional noncompliance. The context is relevant because almost nothing is known about how taxpayers assess detection risks using the probability information they have. The study focuses on intentional noncompliance; the factors affecting it are distinct and separate from those affecting unintentional noncompliance. Other factors that affect intentional tax noncompliance, such as risk, tax rates, and penalty rates, were controlled in the experiment. It was hypothesized that the lower the mean and the smaller the range (ambiguity) of the perceived audit probability, the greater the intentional noncompliance. As hypothesized, the analysis indicates that both the mean and the range of the perceived audit probability affect intentional noncompliance, though the effect of ambiguity is greater at a relatively higher level of the mean. This result suggests that the strength of the information describing an uncertain event is captured better by both the mean and the range of the uncertainty than by either of those components singly.

18.
Risk Analysis, 2018, 38(4): 666-679
We test here the risk communication proposition that explicit expert acknowledgment of uncertainty in risk estimates can enhance trust and other reactions. We manipulated such a scientific uncertainty message, accompanied by probabilities (20%, 70%, implicit ["will occur"] 100%) and time periods (10 or 30 years) in major (≥magnitude 8) earthquake risk estimates to test potential effects on residents potentially affected by seismic activity on the San Andreas fault in the San Francisco Bay Area (n = 750). The uncertainty acknowledgment increased belief that these specific experts were more honest and open, and led to statistically (but not substantively) significant increases in trust in seismic experts generally only for the 20% probability (vs. certainty) and shorter versus longer time period. The acknowledgment did not change judged risk, preparedness intentions, or mitigation policy support. Probability effects independent of the explicit admission of expert uncertainty were also insignificant except for judged risk, which rose or fell slightly depending upon the measure of judged risk used. Overall, both qualitative expressions of uncertainty and quantitative probabilities had limited effects on public reaction. These results imply that both theoretical arguments for positive effects, and practitioners' potential concerns for negative effects, of uncertainty expression may have been overblown. There may be good reasons to still acknowledge experts' uncertainties, but those merit separate justification and their own empirical tests.

19.
This article tries to clarify the potential role to be played by uncertainty theories such as imprecise probabilities, random sets, and possibility theory in the risk analysis process. Instead of opposing an objective bounding analysis, where only statistically founded probability distributions are taken into account, to the full-fledged probabilistic approach, exploiting expert subjective judgment, we advocate the idea that both analyses are useful and should be articulated with one another. Moreover, the idea that risk analysis under incomplete information is purely objective is misconceived. The use of uncertainty theories cannot be reduced to a choice between probability distributions and intervals. Indeed, they offer representation tools that are more expressive than each of the latter approaches and can capture expert judgments while being faithful to their limited precision. Consequences of this thesis are examined for uncertainty elicitation, propagation, and at the decision-making step.

20.
In risk analysis, the treatment of the epistemic uncertainty associated with the probability of occurrence of an event is fundamental. Traditionally, probabilistic distributions have been used to characterize the epistemic uncertainty due to imprecise knowledge of the parameters in risk models. On the other hand, it has been argued that in certain instances such uncertainty may be best accounted for by fuzzy or possibilistic distributions. This seems to be the case in particular for parameters for which the information available is scarce and of a qualitative nature. In practice, it is to be expected that a risk model contains some parameters affected by uncertainties that may be best represented by probability distributions and some other parameters that may be more properly described in terms of fuzzy or possibilistic distributions. In this article, a hybrid method that jointly propagates probabilistic and possibilistic uncertainties is considered and compared with pure probabilistic and pure fuzzy methods for uncertainty propagation. The analyses are carried out on a case study concerning the uncertainties in the probabilities of occurrence of accident sequences in an event tree analysis of a nuclear power plant.
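A standard building block behind such comparisons is the transformation between representations; the sketch below implements the discrete probability-to-possibility transformation in the style of Dubois and Prade (with invented probabilities), under which the most probable outcome receives possibility 1:

```python
import numpy as np

def prob_to_poss(p):
    """Discrete probability-to-possibility transformation: with p sorted in
    descending order, pi_i = sum of p_j over j >= i (tail sums)."""
    order = np.argsort(p)[::-1]
    pi_sorted = np.cumsum(p[order][::-1])[::-1]   # tail sums of sorted probabilities
    pi = np.empty_like(p)
    pi[order] = pi_sorted                          # undo the sort
    return pi

p = np.array([0.5, 0.3, 0.15, 0.05])   # illustrative epistemic probabilities
print(prob_to_poss(p))                  # [1.0, 0.5, 0.2, 0.05]
```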
