Similar Documents
Found 20 similar documents (search time: 78 ms)
1.
Quantifying safety goals is key to regulating activities that are beneficial on the whole but entail some risk in their performance. Determining compliance with safety goals involves dealing with uncertainties. A recent article by Bier(1) describes some of the difficulties encountered when using uncertain measures to determine compliance with safety goals for nuclear reactors. This paper uses a hierarchical Bayes approach to address two practical modeling problems in determining safety goal compliance under uncertainty: (1) allowing some modeling assumptions to be relaxed, and (2) allowing data from previous related samples to be included in the analysis. The two issues affect each other to the extent that relaxing some assumptions allows the use of a broader range of data. The usefulness of these changes and their impact on assessing safety compliance for nuclear reactors are shown.
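A minimal empirical-Bayes sketch of the pooling idea described above, in Python: event counts from several related plants are combined through a shared gamma prior on the event rate, and each plant's posterior is compared against a rate-based safety goal. The counts, exposure times, goal, and gamma-Poisson model are hypothetical placeholders, not the paper's actual data or model.

```python
# Empirical-Bayes (gamma-Poisson) sketch: pool data from related plants to
# judge whether a rate-based safety goal is met.  All numbers are hypothetical.
import numpy as np
from scipy import stats

events   = np.array([0, 1, 0, 2, 1])          # hypothetical event counts per plant
exposure = np.array([12., 9., 15., 20., 7.])  # reactor-years of observation
goal     = 1e-1                               # hypothetical goal: rate < 0.1 / yr

# Method-of-moments fit of a shared Gamma(alpha, beta) prior from crude rates
rates = events / exposure
m, v = rates.mean(), rates.var(ddof=1)
alpha = m**2 / v
beta  = m / v

# Plant-specific posteriors: Gamma(alpha + x_i, beta + t_i)
for x, t in zip(events, exposure):
    post = stats.gamma(a=alpha + x, scale=1.0 / (beta + t))
    p_exceed = post.sf(goal)                  # P(rate exceeds the goal)
    print(f"events={x:d}, t={t:4.0f} yr -> P(rate > goal) = {p_exceed:.3f}")
```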

2.
Risk Analysis, 2018, 38(5): 876-888
To solve real-life problems—such as those related to technology, health, security, or climate change—and make suitable decisions, risk is nearly always a main issue. Different sciences often support this work, for example statistics, the natural sciences, and the social sciences. Risk analysis approaches and methods are also commonly used, but risk analysis is not broadly accepted as a science in itself. A key problem is the lack of explanatory power and the large uncertainties involved when assessing risk. This article presents an emerging risk analysis science based on novel ideas and theories developed in recent years by the risk analysis community. It builds on a fundamental change in thinking: from the search for accurate predictions and risk estimates to knowledge generation related to concepts, theories, frameworks, approaches, principles, methods, and models to understand, assess, characterize, communicate, and (in a broad sense) manage risk. Examples are used to illustrate the importance of this distinct risk analysis science for solving risk problems, supporting science in general and other disciplines in particular.

3.
Risk Analysis, 2018, 38(4): 724-754
A bounding risk assessment is presented that evaluates possible human health risk from a hypothetical scenario involving a 10,000-gallon release of flowback water from horizontal fracturing of the Marcellus Shale. The water is assumed to be spilled on the ground and to infiltrate groundwater that serves as a source of drinking water; an adult and a child located downgradient are assumed to drink the groundwater. Key uncertainties in estimating risk are given explicit quantitative treatment using Monte Carlo analysis. Chemicals that contribute significantly to estimated health risks are identified, as are key uncertainties and variables to which the risk estimates are sensitive. The results show that hypothetical exposure via drinking water impacted by chemicals in Marcellus Shale flowback water, assumed to be spilled onto the ground surface, results in predicted bounds on excess lifetime cancer risk between 10^-10 and 10^-6 for both adult and child receptors. Cumulative hazard indices (HI_cumulative) resulting from these hypothetical exposures have predicted bounds (5th to 95th percentile) between 0.02 and 35 for assumed adult receptors and between 0.1 and 146 for assumed child receptors. Predicted health risks are dominated by noncancer endpoints related to ingestion of barium and lithium in impacted groundwater; hazard indices above unity are largely related to exposure to lithium. Salinity taste thresholds are likely to be exceeded before drinking water exposures result in adverse health effects. The findings provide focus for policy discussions concerning flowback water risk management. They also indicate ways to improve the ability to estimate health risks from drinking water impacted by a flowback water spill (i.e., by reducing uncertainty).
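A compact Monte Carlo sketch of the kind of drinking-water exposure calculation described above: ingested dose = C × IR × EF × ED / (BW × AT), cancer risk = dose × slope factor, hazard quotient = dose / reference dose. The distributions, exposure factors, and toxicity values below are placeholders, not the study's inputs.

```python
# Monte Carlo propagation of a drinking-water ingestion exposure model.
# All distributions and toxicity values are illustrative placeholders.
import numpy as np

rng = np.random.default_rng(42)
n = 100_000

C  = rng.lognormal(mean=np.log(0.05), sigma=1.0, size=n)   # mg/L in well water
IR = rng.triangular(1.0, 2.0, 3.0, size=n)                 # L/day ingested
EF, ED, BW, AT_c, AT_nc = 350, 30, 70.0, 70 * 365, 30 * 365  # days/yr, yr, kg, days

dose_c  = C * IR * EF * ED / (BW * AT_c)     # mg/kg-day, averaged over lifetime
dose_nc = C * IR * EF * ED / (BW * AT_nc)    # mg/kg-day, averaged over exposure

SF, RfD = 0.01, 0.2                          # placeholder slope factor and RfD
risk = dose_c * SF                           # excess lifetime cancer risk
hq   = dose_nc / RfD                         # noncancer hazard quotient

print("cancer risk 5th-95th pct:", np.percentile(risk, [5, 95]))
print("hazard quotient 5th-95th pct:", np.percentile(hq, [5, 95]))
```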

4.
This article discusses the methodologies presently available for analyzing the contribution of "external initiators" to overall risk in the context of PRA (probabilistic risk assessment) of large commercial nuclear power reactors. "External initiators" include earthquakes, fires and floods inside the plant, external floods, high winds, aircraft, barge, and ship collisions, noxious or explosive gases offsite, and so on. These contrast with "internal initiators" such as active or passive plant equipment failures, human errors, and loss of electrical power. The ability to consider external initiators within PRA has undergone major advances in recent years. In general, the uncertainties associated with the calculated risks from external initiators are much larger than those associated with internal initiators. The principal uncertainties lie in the development of hazard curves (such as the frequency of occurrence of an event exceeding a given size: for example, the likelihood of a hurricane with winds exceeding 125 knots). For the assessment of earthquakes, internal fires and floods, and high winds, the methodology is reasonably mature for qualitative assessment but not for quantitative application. The risks from other external initiators are generally considered to be low, either because of the very long recurrence times associated with the events or because the plants are judged to be well designed to withstand them.
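A small sketch of how an external-initiator hazard curve is typically used: interpolate the annual frequency of exceedance at a given event size (the 125-knot hurricane example from the abstract) and combine it with a conditional failure probability (fragility). The hazard curve and fragility values are invented for illustration.

```python
# Hazard-curve interpolation and a crude hazard-x-fragility convolution.
# All numbers are hypothetical.
import numpy as np

# Hypothetical hazard curve: wind speed (knots) vs. annual exceedance frequency
speeds = np.array([75., 100., 125., 150.])
freq   = np.array([1e-2, 1e-3, 1e-4, 1e-5])   # per year

def exceedance(v):
    """Log-linear interpolation of the annual frequency of exceeding speed v."""
    return np.exp(np.interp(v, speeds, np.log(freq)))

print("freq of winds > 125 kt:", exceedance(125.0), "per year")

# Hypothetical fragility: probability of plant damage given wind speed
fragility = np.array([0.0, 0.05, 0.3, 0.8])
# Discrete convolution over speed bands (events beyond the last point ignored)
interval_freq = freq[:-1] - freq[1:]           # frequency of each speed band
band_frag = 0.5 * (fragility[:-1] + fragility[1:])
print("annual damage frequency ~", np.sum(interval_freq * band_frag))
```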

5.

Reliability determines, to a large extent, operational productivity. Nevertheless, a frequent problem is the absence of effective mechanisms to support maintenance management; in particular, there is a need for methodologies that improve the detection and analysis of the risks that affect reliability. This article presents a methodological proposal for resolving these problems using high-impact failure mode analysis. The methodology comprises four phases: identification of failure modes; ranking and criticality analysis of those modes; identification of the root cause(s); and the search for highly effective solutions. Among the variety of tools that can be used, three specific tools are proposed: Criticality Analysis, which allows discrimination and ranking of phenomena and assets; Root Cause Analysis, which focuses on identifying the real causes of the problems; and a tool for generating effective and efficient solutions (TRIZ), which is not usually applied to reliability problems. The proposal is applied in a mining filtration plant, identifying and classifying current problems and generating solutions.
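A minimal sketch of the criticality-analysis step in the first two phases: score each failure mode by frequency times consequence and rank the results to isolate the high-impact modes. The failure modes and numbers are illustrative, not data from the filtration plant study.

```python
# Criticality ranking: score = failure frequency x consequence, sorted descending.
# Failure modes and values are invented for illustration.
failure_modes = [
    # (name, failures per year, downtime hours per failure)
    ("vacuum pump seal leak",      6.0,  4.0),
    ("filter cloth tear",          2.0, 12.0),
    ("discharge conveyor jam",     9.0,  1.5),
    ("hydraulic unit overheating", 1.0, 30.0),
]

ranked = sorted(failure_modes, key=lambda m: m[1] * m[2], reverse=True)
for name, freq, cons in ranked:
    print(f"{name:28s} criticality = {freq * cons:6.1f} h/yr")
```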

6.
The bounding analysis methodology described by Ha-Duong et al. (this issue) is logically incomplete and invites serious misuse and misinterpretation, as their own example and interpretation illustrate. A key issue is the extent to which these problems are inherent in their methodology and resolvable by a logically complete assessment (such as Monte Carlo or Bayesian risk assessment), as opposed to being general problems in any risk-assessment methodology. I attempt here to apportion the problems between those inherent in the proposed bounding analysis and those that are more general, such as reliance on questionable expert elicitations. I conclude that the specific methodology of Ha-Duong et al. suffers from logical gaps in the definition and construction of inputs, and hence should not be used in the form proposed. Furthermore, the labor required to do a sound bounding analysis is great enough that one may as well skip that analysis and carry out a more logically complete probabilistic analysis, one that will better inform the consumer of the appropriate level of uncertainty. If analysts insist on carrying out a bounding analysis in place of more thorough assessments, extensive analyses of sensitivity to inputs and assumptions will be essential to display the uncertainties, arguably more essential than they would be in full probabilistic analyses.

7.
Methods for Uncertainty Analysis: A Comparative Survey
This paper presents a survey and comparative evaluation of methods which have been developed for the determination of uncertainties in accident consequences and probabilities, for use in probabilistic risk assessment. The methods considered are: analytic techniques, Monte Carlo simulation, response surface approaches, differential sensitivity techniques, and evaluation of classical statistical confidence bounds. It is concluded that only the response surface and differential sensitivity approaches are sufficiently general and flexible for use as overall methods of uncertainty analysis in probabilistic risk assessment. The other methods considered, however, are very useful in particular problems.
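A compact sketch contrasting two of the surveyed approaches on a toy consequence model: a first-order differential sensitivity (finite-difference) estimate of the output variance versus a direct Monte Carlo propagation. The model and input distributions are invented for illustration.

```python
# Differential sensitivity (linearized variance) vs. Monte Carlo propagation
# for a made-up consequence model y = f(release, wind, stability).
import numpy as np

def consequence(release, wind, stability):
    """Toy consequence model; stands in for an accident-consequence code."""
    return release * np.exp(-0.1 * wind) / stability

x0 = np.array([100.0, 5.0, 2.0])          # nominal inputs
sd = np.array([20.0, 1.0, 0.3])           # input standard deviations

# Differential sensitivity: variance ~ sum (df/dx_i)^2 * var(x_i)
grad = np.empty(3)
for i in range(3):
    dx = 1e-4 * x0[i]
    xp, xm = x0.copy(), x0.copy()
    xp[i] += dx
    xm[i] -= dx
    grad[i] = (consequence(*xp) - consequence(*xm)) / (2 * dx)
var_lin = np.sum(grad**2 * sd**2)

# Monte Carlo propagation for comparison
rng = np.random.default_rng(0)
samples = rng.normal(x0, sd, size=(50_000, 3))
y = consequence(samples[:, 0], samples[:, 1], samples[:, 2])

print("linearized std:", np.sqrt(var_lin), " Monte Carlo std:", y.std())
```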

8.
The management of natural hazards over a territory entails two main phases: a preoperational (pre-event) phase, whose objective is to relocate resources closer to the sites characterized by the highest hazard, and an operational (during-the-event) phase, whose objective is to manage the available resources in real time by allocating them to the sites where their intervention is needed. The two phases are closely related and demand a unified, integrated treatment. This work presents a unifying framework that integrates the various decision problems arising in the management of different kinds of natural hazards. The proposed approach, based on a mathematical programming formulation, can support decision makers in the optimal allocation of resources before (preoperational phase) and during (operational phase) an emergency due to natural hazard events. Different alternatives for modeling the resources and the territory are proposed and discussed according to their appropriateness in the preoperational and operational phases. The proposed approach can be applied to the management of any natural hazard and, from an integration perspective, may be particularly useful for risk management in civil protection operations. An application to the management of wildfire hazard is presented.
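A minimal sketch of the preoperational allocation problem cast as a linear program, in the spirit of the mathematical programming formulation mentioned above: place a limited number of resource units at candidate sites so that hazard-weighted coverage is maximized. Sites, hazard weights, and capacities are made up.

```python
# Preoperational resource allocation as a small LP: maximize hazard-weighted
# coverage subject to a total-resource budget and per-site capacities.
from scipy.optimize import linprog
import numpy as np

hazard = np.array([0.7, 0.2, 0.5])       # relative hazard of sites A, B, C
total_units = 10                         # resource units available
site_capacity = np.array([6, 6, 6])      # max units each site can host

# Variables x_j = units placed at site j.  Maximize sum(hazard_j * x_j)
# -> minimize the negative.  Constraints: sum(x) <= total, 0 <= x_j <= cap_j.
res = linprog(
    c=-hazard,
    A_ub=np.ones((1, 3)), b_ub=[total_units],
    bounds=[(0, cap) for cap in site_capacity],
)
print("allocation:", res.x, "weighted coverage:", -res.fun)
```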

9.
Risk Analysis, 2018, 38(8): 1634-1655
This article presents the development of an application guide based on feedback and comments from various railway actors on their practices for allocating SILs (safety integrity levels) to railway safety-related functions. The initial generic methodology for SIL allocation has been updated so that it can be applied to railway rolling stock safety-related functions and thereby resolve issues in applying the SIL concept. The methodology is intended for the various actors dealing with railway SIL allocation problems; its principles are summarized in this article, with a focus on the modifications and clarifications made in order to establish a practical guide for railway safety authorities. The methodology is based on the flowchart formalism used in the CSM (common safety method) European regulation. It starts from quantitative safety requirements, particularly tolerable hazard rates (THRs), to which apportioning rules are applied. On the one hand, the rules relate to classical logical combinations of safety-related functions that prevent hazard occurrence. On the other hand, to take technical conditions into account (last safety weak link, functional dependencies, technological complexity, etc.), specific rules implicitly used in existing practices are defined for readjusting some THR values. The SIL allocation process, based on apportioned and validated THR values, is finally illustrated with the example of "emergency brake" subsystems. Some specific SIL allocation rules are also defined and illustrated.
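A small sketch of the two steps described above: apportion a hazard-level THR over sub-functions whose individual failures each lead to the hazard (so their failure rates add), then map each apportioned THR to a SIL band. The band limits follow the usual IEC 61508-style continuous-mode table; the sub-function names and THR value are illustrative, not the guide's actual rules.

```python
# THR apportionment and SIL-band lookup; hazard, sub-functions, and THR are
# hypothetical, and the band table is the commonly cited per-hour SIL table.

def sil_from_thr(thr_per_hour):
    """Return the SIL whose band contains the tolerable hazard rate."""
    bands = [(1e-9, 1e-8, 4), (1e-8, 1e-7, 3), (1e-7, 1e-6, 2), (1e-6, 1e-5, 1)]
    for low, high, sil in bands:
        if low <= thr_per_hour < high:
            return sil
    return 0  # outside the SIL bands (basic integrity)

# Hazard-level THR apportioned equally over independent sub-functions whose
# individual failures each lead to the hazard (failure rates add up).
hazard_thr = 1e-7            # per hour, hypothetical "loss of emergency brake"
subfunctions = ["brake command", "brake actuation", "brake energy supply"]
apportioned = hazard_thr / len(subfunctions)

for name in subfunctions:
    print(f"{name:22s} THR = {apportioned:.2e}/h -> SIL {sil_from_thr(apportioned)}")
```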

10.
The transparent and fair characterization of scientific evidence when reporting the results of a hazard assessment is a demanding task. In this article, we present an approach for characterizing evidence: the evidence map approach. The theoretical starting point is to view evidence characterization as a form of argumentation. Thus, evidence maps are designed to depict the evidence base, the pro and con arguments, and the remaining uncertainties, which together lead experts to their conclusions when summarizing and evaluating the scientific evidence about a potential hazard. To illustrate its use, the evidence map approach is applied to characterizing the health-relevant effects of engineered nanoparticles. Empirical data from an online survey suggest that the use of evidence maps improves the reporting of hazard assessments, and that nonexperts prefer to receive the information included in an evidence map in order to come to an informed judgment. Furthermore, the benefits and limitations of evidence maps are discussed in the light of recent literature on risk communication. Finally, the article underlines the need for further research to increase the quality of evidence reporting.

11.
Prediction of natural disasters and their consequences is difficult because of the uncertainties and the complexity of multiple related factors. This article explores the use of domain knowledge and spatial data to construct a Bayesian network (BN) that facilitates the integration of multiple factors and the quantification of uncertainties within a consistent system for assessing catastrophic risk. A BN is chosen because of its advantages, such as merging multiple source data and domain knowledge in a consistent system, learning from the data set, inference with missing data, and support of decision making. A key advantage of our methodology is the combination of domain knowledge and learning from data to construct a robust network. To improve the assessment, we employ spatial data analysis and data mining to extend the training data set, select risk factors, and fine-tune the network. Another major advantage of our methodology is the integration of an optimal discretizer, an informative feature selector, learners, search strategies for local topologies, and Bayesian model averaging. These techniques all contribute to a robust prediction of the risk probability of natural disasters. In the flood disaster case study, our methodology achieved a better probability of detection of high risk, better precision, and a better ROC area than other methods, using both cross-validation and prediction of catastrophic risk based on historical data. Our results suggest that the BN is a good alternative for risk assessment and as a decision tool in the management of catastrophic risk.
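A hand-rolled miniature Bayesian network in the spirit of the article, with inference by direct enumeration: two hypothetical risk factors (heavy rainfall, poor drainage) feed a "flood damage" node. All structure and conditional probabilities are invented; the article's networks are learned from spatial data.

```python
# Tiny Bayesian network (rain, drainage -> damage) with exact inference by
# enumeration.  All probabilities are invented for illustration.
import itertools

P_rain = {True: 0.3, False: 0.7}
P_drain_poor = {True: 0.4, False: 0.6}
# P(damage | rain, poor drainage)
P_damage = {(True, True): 0.80, (True, False): 0.35,
            (False, True): 0.10, (False, False): 0.02}

def joint(rain, drain, damage):
    """Joint probability of one full assignment of the three nodes."""
    p = P_rain[rain] * P_drain_poor[drain]
    p_d = P_damage[(rain, drain)]
    return p * (p_d if damage else 1 - p_d)

# Marginal probability of damage, and P(damage | heavy rain observed)
p_damage = sum(joint(r, d, True)
               for r, d in itertools.product([True, False], repeat=2))
p_damage_given_rain = (
    sum(joint(True, d, True) for d in [True, False])
    / sum(joint(True, d, dm) for d in [True, False] for dm in [True, False])
)
print(f"P(damage) = {p_damage:.3f},  P(damage | heavy rain) = {p_damage_given_rain:.3f}")
```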

12.
The tragic events of 9/11 and the concerns about the potential for a terrorist or hostile state attack with weapons of mass destruction have led to an increased emphasis on risk analysis for homeland security. Uncertain hazards (natural and engineering) have been successfully analyzed using probabilistic risk analysis (PRA). Unlike uncertain hazards, terrorists and hostile states are intelligent adversaries who can observe our vulnerabilities and dynamically adapt their plans and actions to achieve their objectives. This article compares uncertain hazard risk analysis with intelligent adversary risk analysis, describes the intelligent adversary risk analysis challenges, and presents a probabilistic defender–attacker–defender model to evaluate the baseline risk and the potential risk reduction provided by defender investments. The model includes defender decisions prior to an attack; attacker decisions during the attack; defender actions after an attack; and the uncertainties of attack implementation, detection, and consequences. The risk management model is demonstrated with an illustrative bioterrorism problem with notional data.
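A toy sketch of the defender-attacker-defender logic described above: for each candidate defender investment, the attacker is assumed to pick the worst attack, and the defender chooses the investment whose worst case is least costly. Targets, investment costs, and consequence values are notional, echoing the article's use of notional data.

```python
# Min-max enumeration over defender investments and attacker options.
# All names, costs, and consequences are notional.
defenses = {"no new defense": 0, "harden site A": 3, "detectors at A and B": 5}
attacks = ["attack A", "attack B"]

# Expected consequence (arbitrary units) given (defense, attack)
consequence = {
    ("no new defense", "attack A"): 100, ("no new defense", "attack B"): 80,
    ("harden site A", "attack A"): 40,  ("harden site A", "attack B"): 80,
    ("detectors at A and B", "attack A"): 35, ("detectors at A and B", "attack B"): 30,
}

best = None
for d, cost in defenses.items():
    worst_attack = max(attacks, key=lambda a: consequence[(d, a)])
    total = cost + consequence[(d, worst_attack)]
    print(f"{d:22s} worst case: {worst_attack} -> cost + consequence = {total}")
    if best is None or total < best[1]:
        best = (d, total)
print("min-max choice:", best[0])
```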

13.
The life cycle assessment (LCA) framework has established itself as the leading tool for assessing the environmental impact of products. Several works have established the need for integrating the LCA and risk analysis methodologies, given their many common aspects. One way to achieve such integration is to guarantee that uncertainties in LCA modeling are carefully treated, and it has been claimed that more attention should be paid to quantifying the uncertainties present in the various phases of LCA. Though the topic has been attracting the increasing attention of practitioners and experts in LCA, there is still a lack of understanding and a limited use of the available statistical tools. In this work, we introduce a protocol for conducting global sensitivity analysis in LCA. The article focuses on life cycle impact assessment (LCIA), and particularly on the relevance of global techniques for the development of trustworthy impact assessment models. As a test case, we use a novel characterization model developed for quantifying the impacts of noise on humans. We show that global sensitivity analysis is fundamental to guarantee that the modeler has a complete understanding of (i) the structure of the model and (ii) the importance of the uncertain model inputs and the interactions among them.
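A compact sketch of a variance-based (Sobol-style) first-order sensitivity estimate for a toy characterization model, using a simple binning estimator of Var(E[Y|Xi])/Var(Y). The "noise impact" model and input ranges are placeholders, not the article's characterization factors or protocol.

```python
# First-order variance-based sensitivity via a binning estimator of
# Var(E[Y|X_i]) / Var(Y).  Model and ranges are invented for illustration.
import numpy as np

rng = np.random.default_rng(1)
n, n_bins = 200_000, 50

# Hypothetical inputs: sound power level, distance attenuation, exposed population
x1 = rng.uniform(60, 90, n)      # dB
x2 = rng.uniform(0.5, 2.0, n)    # attenuation factor
x3 = rng.uniform(100, 5000, n)   # persons

y = (x1 / x2) * np.log10(x3)     # toy impact score

def first_order_index(x, y, bins):
    """Estimate S_i = Var(E[Y|X_i]) / Var(Y) by averaging Y within bins of X_i."""
    edges = np.quantile(x, np.linspace(0, 1, bins + 1))
    idx = np.clip(np.digitize(x, edges[1:-1]), 0, bins - 1)
    cond_means = np.array([y[idx == b].mean() for b in range(bins)])
    return cond_means.var() / y.var()

for name, x in [("source level", x1), ("attenuation", x2), ("population", x3)]:
    print(f"S_{name}: {first_order_index(x, y, n_bins):.2f}")
```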

14.
The abandoned mine legacy is critical in many countries around the world, where mine cave-ins and surface subsidence are perpetual risks that can affect the population, infrastructure, historical legacies, land use, and the environment. This article establishes approaches and quantification techniques for evaluating the failure risk of abandoned metal mines, based on the Canadian mining experience. These utilize clear geomechanics considerations, such as failure mechanisms, which depend on well-defined rock mass parameters. Quantified risk is computed as the probability of failure (obtained probabilistically from limit-equilibrium factors of safety or from applicable numerical-modeling factor-of-safety quantifications) times a consequence impact value. Semi-quantified risk can be based on empirical data from failure case studies used in calculating the probability of failure, while personal experience can provide qualified hazard and impact-consequence assessments. The article provides outlines for land use and for the selection of remediation measures based on risk.
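A short sketch of the quantification described above: simulate a limit-equilibrium factor of safety from uncertain inputs, take P(FS < 1) as the probability of failure, and multiply by a consequence impact value to obtain a risk score. The parameter values and consequence scale are illustrative.

```python
# Probability of failure from a simulated factor of safety, times a
# consequence impact value.  All inputs are hypothetical.
import numpy as np

rng = np.random.default_rng(7)
n = 200_000

# Hypothetical crown-pillar stability: resisting strength vs. driving stress
strength = rng.lognormal(mean=np.log(8.0), sigma=0.25, size=n)   # MPa
stress   = rng.lognormal(mean=np.log(5.0), sigma=0.30, size=n)   # MPa

fs = strength / stress                 # limit-equilibrium factor of safety
p_failure = np.mean(fs < 1.0)

consequence = 4                        # impact score, e.g. infrastructure above the workings
risk_score = p_failure * consequence
print(f"P(FS < 1) = {p_failure:.3f}, risk score = {risk_score:.2f}")
```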

15.
We consider multicriteria group decision-making problems in which the decision makers (DMs) want to identify their most preferred alternative(s) based on uncertain or inaccurate criteria measurements. In many real-life problems the uncertainties may be dependent. In this paper, we focus on multicriteria decision-making (MCDM) problems where the criteria and their uncertainties are computed using a stochastic simulation model. The model is based on decision variables and stochastic parameters with given distributions. The simulation model determines a joint probability distribution for the criteria, which quantifies the uncertainties and their dependencies. We present and compare two methods for treating the uncertainty and dependency information within the SMAA-2 multicriteria decision aid method. The first method directly applies the discrete sample generated by the simulation model. The second method is based on a multivariate Gaussian distribution. We demonstrate the methods using a decision support model for a retailer operating in the deregulated European electricity market.
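A bare-bones SMAA-2-style sketch of the rank-acceptability idea: draw criteria values (here from a synthetic multivariate normal standing in for the simulation sample), draw random weights, and count how often each alternative ranks first. Alternatives, criteria, and distributions are invented for illustration.

```python
# Rank-1 acceptability indices: sample criteria and weights, count first places.
# All alternatives and distributions are synthetic placeholders.
import numpy as np

rng = np.random.default_rng(3)
n_iter, n_alt, n_crit = 20_000, 3, 2

# Hypothetical mean criteria values (higher is better) and a shared covariance
means = np.array([[5.0, 3.0], [4.0, 4.5], [3.5, 5.0]])
cov = np.array([[0.8, 0.3], [0.3, 0.8]])

first_rank = np.zeros(n_alt)
for _ in range(n_iter):
    crit = np.array([rng.multivariate_normal(m, cov) for m in means])
    w = rng.dirichlet(np.ones(n_crit))          # random normalized weights
    scores = crit @ w                           # additive value function
    first_rank[np.argmax(scores)] += 1

print("rank-1 acceptability indices:", first_rank / n_iter)
```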

16.
A major issue in all risk communication efforts is the distinction between the terms "risk" and "hazard." The potential to harm a target such as human health or the environment is normally defined as a hazard, whereas risk also encompasses the probability of exposure and the extent of damage. Misunderstandings and communication gaps related to these crucial terms can be observed again and again in risk communication processes. We asked a sample of 53 experts from public authorities, business and industry, and environmental and consumer organizations in Germany to outline their understanding and use of these terms, using both expert interviews and focus groups. The empirical study made clear that the terms risk and hazard are perceived and used very differently in risk communication depending on the perspective of the stakeholders. Several contributing factors can be identified, such as responsibility for hazard avoidance, economic interest, or a watchdog role. The communication gaps can thus be reduced to a four-fold problem matrix comprising a semantic, a conceptual, a strategic, and a control problem. The stakeholders' own worldviews played a major role in their specific use of the two terms hazard and risk in communication.

17.
Omega, 1986, 14(3): 221-231
A distinguishing feature of contracted production is that the contractual terms must be agreed upon before the product exists. If the specified product is the result of an innovative and complex process, the uncertainties in its production cost estimates can be enormous, rendering the traditional cost-plus-fixed-fee and fixed-price contractual arrangements increasingly untenable. An alternative contractual arrangement that is currently gaining acceptance is the fixed-price-incentive contract, which may be as simple as a linear cost-sharing formula or as complex as a state-contingent reward-penalty structure. The problem from the viewpoint of the contractee (principal) is to identify the optimal incentive structure that minimizes his expected cost; the incentive affects the contractor (agent) through his risk aversion and his propensity for "moral hazard," due to the co-insurance effect of cost-sharing incentives. Taking the contractee's risk neutrality as a premise, the problem is formulated as a constrained optimization problem, with the constraint arising from the returns to each of the parties to the contract. It is thereby demonstrated that the currently popular linear incentive mechanism is ineffective from the viewpoint of the contractee even before the problem of moral hazard is introduced. We then show that the problem can be segregated into independent components: the contractor's risk aversion and his propensity for moral hazard. Having established this, we proceed to examine the incentive form as related to the "monitoring mechanism" instituted by the contractor during the performance of the contract. Acceptance of the conclusions requires that the contracting parties can interpret their objective as the maximization of the ratio of their expected profits to target profits, which is demonstrated to be equivalent to the minimization of the certainty equivalent of costs.
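A short numerical sketch of the linear cost-sharing formula mentioned above, under the common fixed-price-incentive convention: final price = actual cost + target fee + share × (target cost - actual cost), so the contractor keeps a fraction of any cost underrun and bears the same fraction of any overrun. All numbers are illustrative, not the article's model.

```python
# Linear fixed-price-incentive payoff; target cost, fee, and share are made up.
def fpi_price(actual_cost, target_cost=100.0, target_fee=10.0, share=0.3):
    """Contractee's payment under a linear cost-sharing incentive formula."""
    return actual_cost + target_fee + share * (target_cost - actual_cost)

for actual in (80.0, 100.0, 130.0):
    price = fpi_price(actual)
    print(f"actual cost {actual:6.1f} -> price {price:6.1f}, "
          f"contractor profit {price - actual:5.1f}")
```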

18.
Probabilistic risk analysis (PRA) can be an effective tool to assess risks and uncertainties and to set priorities among safety policy options. Based on systems analysis and Bayesian probability, PRA has been applied to a wide range of cases, three of which are briefly presented here: the maintenance of the tiles of the space shuttle, the management of patient risk in anesthesia, and the choice of seismic provisions of building codes for the San Francisco Bay Area. In the quantification of a risk, a number of problems arise in the public sector where multiple stakeholders are involved. In this article, I describe different approaches to the treatments of uncertainties in risk analysis, their implications for risk ranking, and the role of risk analysis results in the context of a safety decision process. I also discuss the implications of adopting conservative hypotheses before proceeding to what is, in essence, a conditional uncertainty analysis, and I explore some implications of different levels of "conservatism" for the ranking of risk mitigation measures.

19.
The purpose of this article is to discuss the role of quantitative risk assessments in characterizing risk and uncertainty and in delineating appropriate risk management options. Our main concern is situations (risk problems) with large potential consequences, large uncertainties, and/or ambiguities (related to the relevance, meaning, and implications of the decision basis, or related to the values to be protected and the priorities to be set), in particular terrorism risk. We look into the scientific basis of quantitative risk assessments and the boundaries of the assessments in such a context. Based on a risk perspective that defines risk as the uncertainty about and severity of the consequences (or outcomes) of an activity with respect to something that humans value, we advocate a broad risk assessment approach that characterizes uncertainties beyond probabilities and expected values. Key features of this approach are qualitative uncertainty assessment and scenario-building instruments.

20.
Terje Aven, Risk Analysis, 2010, 30(3): 354-360
It is a common perspective in risk analysis that there are two kinds of uncertainty: (i) variability resulting from heterogeneity and stochasticity (aleatory uncertainty) and (ii) partial ignorance or epistemic uncertainty resulting from systematic measurement error and lack of knowledge. Probability theory is recognized as the proper tool for treating aleatory uncertainty, but there are different views on the best approach for describing partial ignorance and epistemic uncertainty. Subjective probabilities are often used for representing this type of ignorance and uncertainty, but several alternative approaches have been suggested, including interval analysis, probability bounds analysis, and bounds based on evidence theory. It is argued that probability theory yields results that are too precise when the background knowledge supporting the probabilities is poor. In this article, we look more closely into this issue. We argue that this critique of probability theory rests on a conception of risk assessment as a tool for objectively reporting the true risk and variabilities. If risk assessment is instead seen as a method for describing the analysts' (and possibly other stakeholders') uncertainties about unknown quantities, the alternative approaches (such as interval analysis) often fail to provide the necessary decision support.
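A tiny sketch contrasting the two treatments discussed: propagating a poorly known input as a plain interval versus as a precise (subjective) probability distribution. The toy model and numbers are illustrative only.

```python
# Interval propagation vs. precise-probability propagation of an epistemic input.
# All values are hypothetical.
import numpy as np

# Quantity of interest: failure rate = demand rate * probability of failure on demand
demand_rate = 2.0                 # per year, assumed well known

# Epistemic input: probability of failure on demand, poorly known
pfd_interval = (1e-4, 1e-2)       # interval-analysis representation
rng = np.random.default_rng(11)
pfd_samples = rng.uniform(*pfd_interval, size=100_000)   # one precise choice of prior

# Interval propagation: bounds only, no distribution inside
bounds = (demand_rate * pfd_interval[0], demand_rate * pfd_interval[1])

# Probabilistic propagation: full distribution, hence percentiles and a mean
rate_samples = demand_rate * pfd_samples
print("interval bounds (per yr):", bounds)
print("subjective-probability 5th/50th/95th pct:",
      np.percentile(rate_samples, [5, 50, 95]))
```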
