Similar Literature
20 similar documents found.
1.
Bin Li, Ming Li, Carol Smidts. Risk Analysis, 2005, 25(4): 1061–1077
Probabilistic risk assessment (PRA) is a methodology to assess the probability of failure or success of a system's operation. PRA has proven to be a systematic, logical, and comprehensive technique for risk assessment. Software plays an increasing role in modern safety-critical systems, and a significant number of failures can be attributed to software. Unfortunately, current probabilistic risk assessment concentrates on representing the behavior of hardware systems and humans, and their contributions (to a limited extent) to risk, while neglecting the contribution of software because software failure phenomena are poorly understood. It is thus imperative to model the impact of software to reflect the risk in current and future systems. The objective of our research is to develop a methodology to account for the impact of software on system failure that can be used in the classical PRA analysis process. A test-based approach for integrating software into PRA is discussed in this article. The approach includes identification of the software functions to be modeled in the PRA and modeling of the software contributions in the event sequence diagram (ESD) and fault tree. It also introduces the concepts of input tree and output tree and proposes a quantification strategy based on a software safety-testing technique. The method is applied to an example system, PACS.
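A minimal sketch of the test-based quantification idea (not the authors' PACS model): a software function's per-demand failure probability is estimated from safety-test results with a Bayesian beta posterior and then propagated through a small fault tree alongside a hardware basic event. All numbers are illustrative assumptions.

```python
from scipy import stats

# Safety testing: n demands executed, k observed failures (assumed data).
n_tests, n_failures = 5000, 2
# Jeffreys prior Beta(0.5, 0.5) -> posterior Beta(k + 0.5, n - k + 0.5).
posterior = stats.beta(n_failures + 0.5, n_tests - n_failures + 0.5)
p_sw = posterior.mean()      # point estimate of software failure probability

p_hw = 1e-4                  # assumed hardware basic-event probability

# Fault tree: top event occurs if the software OR the hardware fails
# (independence assumed, as in a classical cut-set quantification).
p_top = 1 - (1 - p_sw) * (1 - p_hw)

print(f"software p(failure/demand) = {p_sw:.2e}")
print(f"top-event probability      = {p_top:.2e}")
```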

2.
Ralph F. Miles, Jr. Risk Analysis, 2004, 24(2): 415–424
This article develops a decision-theoretic methodology for the risk-adjusted mission value (RAMV) for selecting between alternative missions in the presence of uncertainty in the outcomes of the missions. This methodology permits trading off mission risk for mission value, something that probabilistic risk analysis cannot do unless it explicitly incorporates both mission value and the risk aversion of the project management. In its complete implementation, the methodology is consistent with the decision theory known as expected utility theory, although it differs from conventional decision theory in that the probabilities and all but one of the utilities are not those of the decision maker. The article also introduces a new interpretation of risk aversion. The methodology is consistent with the elementary management concept of division of labor. An example is presented for selecting between discrete alternatives: four landing sites on Mars. A second example is presented for selecting among a set of continuous alternatives: a comet flyby distance. The methodology is developed within the context of scientific missions, but it is equally applicable to any situation requiring outcome value judgments, probability judgments, and risk-aversion judgments by different constituencies.
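A minimal sketch of the risk-adjusted value idea: each alternative is scored by the certainty equivalent of its outcome lottery under an exponential utility with risk tolerance R, so riskier alternatives are penalized relative to their expected value. The site names, values, and probabilities are hypothetical, not the article's Mars data.

```python
import math

R = 50.0  # risk tolerance, in the same units as mission value (assumed)

# Each alternative: list of (probability, mission value) outcomes.
sites = {
    "Site A": [(0.90, 100), (0.10, 0)],
    "Site B": [(0.70, 160), (0.30, 0)],
    "Site C": [(0.99, 70),  (0.01, 0)],
    "Site D": [(0.50, 250), (0.50, 0)],
}

def ramv(outcomes, R):
    """Certainty equivalent under u(x) = 1 - exp(-x/R)."""
    eu = sum(p * (1 - math.exp(-v / R)) for p, v in outcomes)
    return -R * math.log(1 - eu)

for name, outcomes in sites.items():
    ev = sum(p * v for p, v in outcomes)
    print(f"{name}: expected value = {ev:6.1f}, RAMV = {ramv(outcomes, R):6.1f}")
```

Under these numbers the high-variance Site D has the largest expected value but a much lower risk-adjusted value, which is exactly the trade-off the methodology is designed to expose.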

3.
4.
This article discusses how an analyst's or expert's beliefs about the credibility and quality of models can be assessed and incorporated into the uncertainty assessment of an unknown of interest. The proposed methodology is a specialization of the Bayesian framework for the assessment of model uncertainty presented in an earlier paper. This formalism treats models as sources of information in assessing the uncertainty of an unknown, and it allows the use of predictions from multiple models as well as experimental validation data about the models' performance. In this article, the methodology is extended to incorporate additional types of information about a model, namely, subjective information about its credibility and its applicability when it is used outside its intended domain of application. An example in the context of fire risk modeling is also provided.
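A minimal sketch of treating models as information sources: each model gives a Gaussian prediction of the unknown, subjective credibility sets the prior model weights, and validation errors update them through a likelihood, yielding a posterior mixture. All inputs are assumed for illustration and are not the article's fire-risk example.

```python
import math

models = {                       # name: (predicted mean, predictive std)
    "model_A": (4.2, 0.5),
    "model_B": (5.0, 0.8),
}
credibility = {"model_A": 0.7, "model_B": 0.3}        # subjective prior weights
validation_error = {"model_A": 0.4, "model_B": 1.1}   # observed |prediction - data|

def normal_pdf(x, mu, sigma):
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))

# Posterior weight proportional to prior credibility times the likelihood of
# the validation error, scored against the model's own predictive spread.
raw = {m: credibility[m] * normal_pdf(validation_error[m], 0.0, s)
       for m, (_, s) in models.items()}
z = sum(raw.values())
weights = {m: w / z for m, w in raw.items()}

mean = sum(weights[m] * mu for m, (mu, _) in models.items())
print("posterior model weights:", {m: round(w, 3) for m, w in weights.items()})
print(f"mixture mean estimate of the unknown: {mean:.2f}")
```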

5.
Prediction of natural disasters and their consequences is difficult due to the uncertainties and complexity of multiple related factors. This article explores the use of domain knowledge and spatial data to construct a Bayesian network (BN) that facilitates the integration of multiple factors and quantification of uncertainties within a consistent system for assessment of catastrophic risk. A BN is chosen due to its advantages, such as merging multiple source data and domain knowledge in a consistent system, learning from the data set, inference with missing data, and support of decision making. A key advantage of our methodology is the combination of domain knowledge and learning from the data to construct a robust network. To improve the assessment, we employ spatial data analysis and data mining to extend the training data set, select risk factors, and fine-tune the network. Another major advantage of our methodology is the integration of an optimal discretizer, an informative feature selector, learners, search strategies for local topologies, and Bayesian model averaging. These techniques all contribute to a robust prediction of the risk probability of natural disasters. In the flood disaster study, our methodology achieved a better probability of detection of high risk, better precision, and a better ROC area compared with other methods, using both cross-validation and prediction of catastrophic risk based on historic data. Our results suggest that the BN is a good alternative for risk assessment and as a decision tool in the management of catastrophic risk.
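A minimal hand-rolled sketch of the BN idea (not the article's learned network): two hypothetical risk factors feed a flood-risk node, and the risk probability is queried by enumeration over the variables consistent with the evidence. The conditional probability table entries are assumed.

```python
from itertools import product

P_rain = {True: 0.3, False: 0.7}          # P(heavy rainfall)
P_low  = {True: 0.4, False: 0.6}          # P(low-lying terrain)
# P(high flood risk | rainfall, low terrain): an assumed CPT.
P_risk = {(True, True): 0.90, (True, False): 0.50,
          (False, True): 0.30, (False, False): 0.05}

def p_high_risk(evidence):
    """Enumerate over parent states consistent with the evidence."""
    num = den = 0.0
    for rain, low in product([True, False], repeat=2):
        if any(evidence.get(k) not in (None, v)
               for k, v in (("rain", rain), ("low", low))):
            continue                       # contradicts the evidence
        w = P_rain[rain] * P_low[low]
        num += w * P_risk[(rain, low)]
        den += w
    return num / den

print("P(high risk)              =", round(p_high_risk({}), 3))
print("P(high risk | heavy rain) =", round(p_high_risk({"rain": True}), 3))
```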

6.
This article proposes a methodology for the application of Bayesian networks in conducting quantitative risk assessment of operations in the offshore oil and gas industry. The method translates a flow chart of operations directly into a Bayesian network and consists of five steps: first, the flow chart is translated into a Bayesian network; second, the influencing factors of the network nodes are classified; third, a Bayesian network for each factor is established; fourth, the entire Bayesian network model is assembled; and lastly, the model is analyzed. Subsequently, five categories of influencing factors, namely, human, hardware, software, mechanical, and hydraulic, are modeled and added to the main Bayesian network. The methodology is demonstrated through a case study that evaluates the probability of failure on demand when closing a subsea ram blowout preventer. The results show that mechanical and hydraulic factors have the most important effects on operation safety. Software and hardware factors have almost no influence, whereas human factors are in between. The results of the sensitivity analysis agree with the findings of the quantitative analysis. The three-axiom-based analysis partially validates the correctness and rationality of the proposed Bayesian network model.
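A minimal sketch of one way such a failure-on-demand node could be quantified from the five factor categories, using a noisy-OR gate; this gate and all activation probabilities are illustrative assumptions, not the article's CPTs, which are built from the translated flow chart.

```python
factors = {              # P(factor is in a failure-causing state), assumed
    "human":      0.020,
    "hardware":   0.001,
    "software":   0.001,
    "mechanical": 0.050,
    "hydraulic":  0.040,
}

# Noisy-OR: the operation succeeds only if every failure cause is absent;
# the factors are assumed to act independently.
p_ok = 1.0
for p in factors.values():
    p_ok *= (1.0 - p)
print(f"probability of failure on demand ≈ {1.0 - p_ok:.4f}")

# One-at-a-time sensitivity: remove each factor and recompute the PFD.
for name in factors:
    p_without = 1.0
    for f, p in factors.items():
        if f != name:
            p_without *= (1.0 - p)
    print(f"  without {name:10s}: PFD = {1.0 - p_without:.4f}")
```

With these assumed inputs, removing the mechanical or hydraulic factor moves the result most, mirroring the kind of dominance the abstract reports.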

7.
We take a novel approach to analyzing hazardous materials transportation risk in this research. Previous studies analyzed this risk from an operations research (OR) or quantitative risk assessment (QRA) perspective by minimizing or calculating risk along a transport route. Further, even though the majority of incidents occur when containers are unloaded, research has not focused on such transportation-related activities as container loading and unloading. In this work, we developed a decision model of a hazardous materials release during unloading using actual data and an exploratory data-modeling approach. Previous studies have taken a theoretical perspective in identifying and advancing the key variables related to this risk, without a focus on probability- and statistics-based approaches for doing so. Our decision model empirically identifies the critical variables using an exploratory methodology for a large, highly categorical database involving latent class analysis (LCA), loglinear modeling, and Bayesian networking. Our model identified the most influential variables and countermeasures for two consequences of a hazmat incident, dollar loss and release quantity, and is one of the first models to do this. The most influential variables were found to be related to the failure of the container. In addition to analyzing hazmat risk, our methodology can be used to develop data-driven models for strategic decision making in other domains involving risk.

8.
The Europa mission approved in 2019 is still in the development phase. It is designed to conduct a detailed reconnaissance of that moon of Jupiter, as it could possibly support life as we know it. This article is based on a top-down approach (mission → system → subsystems → components) to model the probability of mission failure. The focus here is on the case where the (uncertain) radiation load exceeds the (uncertain) capacity of critical subsystems of the spacecraft. The model is an illustrative quantification of the uncertainties about (1) the complex external radiation environment in repeated exposures, (2) the effectiveness of the shielding in different zones of the spacecraft, and (3) the components' capacities, by modeling all three as dynamic random variables. A simulation including a sensitivity analysis is used to obtain the failure probability of the whole mission over forty-five revolutions around Jupiter. This article illustrates how probabilistic risk analysis based on engineering models, test results, and expert opinions can be used in the early stages of the design of space missions, when uncertainties are large. It also describes the optimization of the spacecraft design, taking into account the decision makers' risk attitude and the mission resource constraints.
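A minimal sketch of the load-versus-capacity simulation: per revolution, an uncertain radiation dose passes through uncertain shielding and accumulates, and a component fails when the cumulative dose exceeds its sampled capacity. The distributions and parameters are illustrative assumptions, not the mission's engineering data.

```python
import numpy as np

rng = np.random.default_rng(0)
n_sims, n_revs = 100_000, 45

# Uncertain per-revolution environment dose (lognormal, assumed).
dose = rng.lognormal(mean=0.0, sigma=0.5, size=(n_sims, n_revs))
# Uncertain shielding attenuation factor for a spacecraft zone (assumed).
attenuation = rng.uniform(0.05, 0.15, size=(n_sims, 1))
cumulative = (dose * attenuation).cumsum(axis=1)   # dose absorbed so far

# Uncertain component capacity (normal, truncated at zero; assumed).
capacity = np.clip(rng.normal(6.0, 1.0, size=n_sims), 1e-6, None)

fails = cumulative[:, -1] > capacity               # failure within 45 revolutions
print(f"estimated mission failure probability ≈ {fails.mean():.4f}")
```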

9.
The use of autonomous underwater vehicles (AUVs) for various scientific, commercial, and military applications has become more common with maturing technology and improved accessibility. One relatively new development lies in the use of AUVs for under-ice marine science research in the Antarctic. The extreme environment, ice cover, and inaccessibility, as compared to open-water missions, can result in a higher risk of loss. Therefore, an effective assessment of risks before undertaking any Antarctic under-ice mission is crucial to ensure an AUV's survival. Existing risk assessment approaches have predominantly focused on the use of an AUV's historical fault-log data and the elicitation of experts' opinions for probabilistic quantification. However, an AUV program in its early phases lacks historical data, and any assessment of risk may be vague and ambiguous. In this article, a fuzzy-based risk assessment framework is proposed for quantifying the risk of AUV loss under ice. The framework uses the knowledge and prior experience of available subject matter experts and the widely used semiquantitative risk assessment matrix, albeit in a new form. A well-developed example based on an upcoming mission by an ISE-explorer class AUV is presented to demonstrate the application and effectiveness of the proposed framework. The example demonstrates that the proposed fuzzy-based risk assessment framework is pragmatically useful for future under-ice AUV deployments. Sensitivity analysis demonstrates the validity of the proposed method.
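A minimal sketch of a fuzzy risk score in the spirit of a fuzzified risk matrix: experts rate likelihood and severity of AUV loss as triangular fuzzy numbers, risk is their product (approximated vertex-wise), and centroid defuzzification gives a crisp score. The ratings are hypothetical.

```python
def tri_mul(a, b):
    """Approximate product of triangular fuzzy numbers (l, m, u), vertex-wise."""
    return tuple(x * y for x, y in zip(a, b))

def centroid(t):
    """Centroid defuzzification of a triangular fuzzy number (l, m, u)."""
    return sum(t) / 3.0

# Expert ratings on a 0-1 scale as (low, most likely, high); assumed values.
likelihood = (0.2, 0.4, 0.6)   # e.g., "possible" loss under ice
severity   = (0.6, 0.8, 1.0)   # e.g., "severe" consequence of a loss

risk = tri_mul(likelihood, severity)
print("fuzzy risk (l, m, u):", tuple(round(x, 2) for x in risk))
print("crisp risk score    :", round(centroid(risk), 3))
```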

10.
This article proposes a methodology for incorporating electrical component failure data into the human error assessment and reduction technique (HEART) for estimating human error probabilities (HEPs). The existing HEART method contains factors known as error-producing conditions (EPCs) that adjust a generic HEP to the specific situation being assessed. The selection and proportioning of these EPCs are at the discretion of an assessor and are therefore subject to the assessor's experience and potential bias. This dependence on expert opinion is prevalent in similar HEP assessment techniques used in numerous industrial areas. The proposed method incorporates factors based on observed trends in electrical component failures to produce a revised HEP that can trigger risk-mitigation actions more effectively, based on the presence of component categories or other hazardous conditions with a history of failure due to human error. The data used for the additional factors come from an analysis of electronic-component failures experienced during system integration and testing at NASA Goddard Space Flight Center; the analysis includes the determination of root failure mechanisms and trend analysis. The major causes of these defects were attributed to electrostatic damage, electrical overstress, mechanical overstress, or thermal overstress. These factors representing user-induced defects are quantified and incorporated into specific hardware factors based on the system's electrical parts list. The proposed methodology is demonstrated with an example comparing the original HEART method and the proposed modified technique.
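A minimal worked sketch of the HEART adjustment with one added hardware factor. The EPC arithmetic follows the standard HEART form; the hardware factor and all numerical values are illustrative, not the article's NASA-derived data.

```python
generic_hep = 0.003                  # generic task unreliability (assumed)

# (EPC multiplier, assessed proportion of affect) pairs, assessor-chosen.
epcs = [(11.0, 0.4),                 # e.g., shortage of time
        (4.0, 0.2)]                  # e.g., operator inexperience

# Standard HEART form: HEP = GEP * product((EPC - 1) * proportion + 1).
hep = generic_hep
for multiplier, proportion in epcs:
    hep *= (multiplier - 1.0) * proportion + 1.0
print(f"HEART HEP (original method)            = {hep:.4f}")

# Extension as sketched here: a factor > 1 when the parts list contains
# component categories with a history of user-induced defects (e.g.,
# ESD-sensitive parts). The value is a placeholder assumption.
hardware_factor = 1.5
print(f"revised HEP with hardware-failure data = {hep * hardware_factor:.4f}")
```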

11.
This article evaluates the nine empirical studies that have been conducted on expert versus lay judgments of risk. Contrary to received wisdom, this study finds that there is little empirical evidence for the propositions (1) that experts judge risk differently from members of the public or (2) that experts are more veridical in their risk assessments. Methodological weaknesses in the early research are documented, and it is shown that the results of more recent studies are confounded by social and demographic factors that have been found to correlate with judgments of risk. Using a task-analysis taxonomy, a template is provided for the documentation of future studies of expert-lay differences/similarities that will facilitate analytic comparison.

12.
Kara Morgan. Risk Analysis, 2005, 25(6): 1621–1635
Decisions are often made even when there is uncertainty about the possible outcomes. However, methods for making decisions when the problem framework itself is uncertain are scarce. At present, safety assessment for a product containing engineered nano-scale particles is a very poorly structured problem. Many fields of study may inform the safety assessment of such particles (e.g., ultrafines, aerosols, debris from medical devices), but engineered nano-scale particles may present such unique properties that extrapolating from other types of studies may introduce, rather than resolve, uncertainty. Some screening-level health-effects studies conducted specifically on engineered nano-scale materials have been published, and many more are underway. However, it is clear that the research needed to fully and confidently understand the potential for health or environmental risk from engineered nano-scale particles may take years or even decades to complete. Despite this great uncertainty, existing research and experience among researchers can help to provide a taxonomy of particle properties, perhaps indicating a relative likelihood of risk, in order to prioritize nanoparticle risk research. To help structure this problem, a framework was developed from expert interviews of nanotechnology researchers. The analysis organizes the information as a system based on the risk assessment framework, in order to support the decision about safety. In the long term, this framework is designed to incorporate research results as they are generated, and therefore to serve as a tool for estimating the potential for human health and environmental risk.

13.
Operators of long field-life systems such as airplanes are faced with hazards in the supply of spare parts. If the original manufacturers or suppliers of parts end their supply, this may have a large impact on the operating costs of firms needing these parts. Existing end-of-supply evaluation methods focus mostly on the downstream supply chain, which is of interest mainly to spare-part manufacturers. Firms that purchase spare parts have limited information on parts sales, and indicators of end-of-supply risk can also be found in the upstream supply chain. This article proposes a methodology for firms purchasing spare parts to manage end-of-supply risk by utilizing proportional hazard models in terms of the supply chain conditions of the parts. The considered risk indicators fall into four main categories, of which two are related to supply (price and lead time) and two to demand (cycle time and throughput). The methodology is demonstrated using data on about 2,000 spare parts collected from a maintenance repair organization in the aviation industry. Cross-validation results and out-of-sample risk assessments show good performance of the method in identifying spare parts with high end-of-supply risk. Further validation is provided by survey results obtained from the maintenance repair organization, which show strong agreement between the firm's and the model's identification of high-risk spare parts.
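A minimal sketch of scoring end-of-supply risk with a proportional hazards model over the four indicator categories, using the lifelines library. The synthetic data frame and coefficient structure are stand-ins for the roughly 2,000 real parts.

```python
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(1)
n = 500
df = pd.DataFrame({
    "price_trend": rng.normal(0, 1, n),   # supply-side indicators
    "lead_time":   rng.normal(0, 1, n),
    "cycle_time":  rng.normal(0, 1, n),   # demand-side indicators
    "throughput":  rng.normal(0, 1, n),
})
# Synthetic supply lifetimes whose hazard rises with price trend and lead time.
risk = 0.8 * df["price_trend"] + 0.5 * df["lead_time"] - 0.4 * df["throughput"]
df["duration"] = rng.exponential(np.exp(-risk.to_numpy()))  # time to end of supply
df["event"] = rng.uniform(size=n) < 0.7                     # 1 = supply ended (observed)

cph = CoxPHFitter()
cph.fit(df, duration_col="duration", event_col="event")
cph.print_summary()

# Rank parts by estimated relative hazard: high scores flag end-of-supply risk.
df["risk_score"] = cph.predict_partial_hazard(df)
print(df.nlargest(5, "risk_score")[["risk_score"]])
```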

14.
For some critical applications, both successfully accomplishing the mission and saving the system, by aborting the mission and performing a rescue procedure once a certain deterioration condition is met, are pivotal. This has motivated considerable study of mission abort policies (MAPs) to mitigate the risk of system loss in the past several years, especially for standby systems that use one or multiple standby sparing components to continue the mission when the online component fails, improving the mission success probability. Existing MAPs are mainly based on the number of failed online components, ignoring the status of the standby components. This article contributes by modeling standby systems subject to MAPs that depend not only on the number of failed online components but also on the number of standby components remaining. Further, dynamic MAPs that consider an additional factor, the time elapsed since the beginning of the mission at the moment of the abort decision, are investigated. The solution methodology is an event-transition-based numerical algorithm for evaluating the mission success probability and system survival probability of standby systems subject to the considered MAPs. Examples demonstrate the benefit of considering the state of the standby components and the elapsed operation time in obtaining more flexible MAPs.
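A minimal sketch of evaluating a MAP for a 1-out-of-2 cold-standby system by Monte Carlo rather than the article's event-transition algorithm. The policy is an assumption for illustration: when the online unit fails, switch to the standby; if that failure happens before time TAU, abort and run a rescue of length T_RESCUE on the standby, otherwise press on with the mission. All parameters are invented.

```python
import random

random.seed(0)
T_MISSION, T_RESCUE, TAU = 10.0, 2.0, 6.0
FAIL_RATE = 0.12           # exponential failure rate per unit (assumed)
N = 200_000

success = survival = 0
for _ in range(N):
    t1 = random.expovariate(FAIL_RATE)      # online-unit lifetime
    if t1 >= T_MISSION:                     # no failure: mission and system safe
        success += 1
        survival += 1
        continue
    t2 = random.expovariate(FAIL_RATE)      # standby lifetime once switched in
    if t1 < TAU:
        # Abort early: mission fails, but the system survives if the
        # standby lasts through the rescue procedure.
        survival += t2 >= T_RESCUE
    elif t2 >= T_MISSION - t1:
        # Press on with the last unit: mission succeeds and system survives
        # only if the standby covers the remaining mission time.
        success += 1
        survival += 1

print(f"mission success probability ≈ {success / N:.3f}")
print(f"system survival probability ≈ {survival / N:.3f}")
```

Sweeping TAU trades mission success against system survival, which is the dependence on elapsed time that the dynamic MAPs exploit.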

15.
In the past few years, the field of dam safety has moved toward risk-informed methodologies worldwide, and several methodologies and programs are appearing to systematize the calculations. The most common way of implementing these calculations is through event trees, computing event probabilities and incremental consequences. This methodology is flexible enough for many situations, but generalizing it to systems of several dams is complex, and implementing it in a completely general calculation methodology presents some problems. Retaining the event tree framework, a new methodology is proposed to calculate incremental risks. Its main advantage is the ease with which it can be applied to systems of several dams: with a single risk model that describes the complete system, a single calculation yields the incremental risks of the system, and the risk can be allocated to each dam and each failure mode. The article shows that the two methodologies are equivalent and applies them to a case study.
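A minimal sketch of incremental-risk quantification on a single-dam event tree: for each flood-load branch, the incremental consequences are those with dam failure minus those of the same flood without failure, weighted by the branch probabilities. The numbers are illustrative, not from the case study.

```python
# Branches: (annual probability of flood level, P(failure | flood),
#            lives lost with failure, lives lost without failure) -- assumed.
branches = [
    (1e-2, 1e-4,   50,  2),
    (1e-3, 1e-3,  200, 10),
    (1e-4, 1e-2, 1000, 40),
]

incremental_risk = sum(p_flood * p_fail * (c_fail - c_no_fail)
                       for p_flood, p_fail, c_fail, c_no_fail in branches)
print(f"incremental risk ≈ {incremental_risk:.3e} lives/year")
```

The proposed system-level method extends this accounting so one model of several dams allocates such increments per dam and per failure mode.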

16.
Some program managers share the belief that adding a redundant component to a system halves its probability of failure. This is true only if the failures of the redundant components are independent events, which is rarely the case; for example, the redundant components may be subjected to the same external loads. There is, however, generally a decrease in the failure probability of the system. Nonetheless, the redundant element comes at a cost, even if that cost is lower than that of developing the first unit when both are based on the same design. Identical parts save the most in design costs but are subject to common failure modes from possible design errors, which limit the effectiveness of the redundancy. In developing critical systems, managers thus need to decide whether the costs of a parallel system are justified by the increase in the system's reliability. NASA, for example, has used redundant spacecraft to increase the chances of mission success, which worked well in the cases of the Viking and Voyager missions. These two successes, however, do not guarantee future ones. We present here a risk analysis framework that accounts for dependencies to support the decision to launch a twin mission of identical spacecraft at the same time, given the incremental costs and risk-reduction benefits of the second spacecraft. We illustrate this analytical approach with the case of the Mars Exploration Rovers launched by NASA in 2003, for which we performed this assessment in 2001.
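A minimal worked example of why redundancy rarely squares the failure probability: under a beta-factor common-cause model, a fraction beta of each unit's failure probability is shared between the twins (e.g., common design errors), so only the remainder benefits from independence. The numbers are illustrative.

```python
p = 0.10       # failure probability of a single spacecraft (assumed)
beta = 0.20    # fraction of failures from common causes (assumed)

p_independent = p ** 2                              # naive independence assumption
p_common_cause = beta * p + ((1 - beta) * p) ** 2   # beta-factor model

print(f"single spacecraft            : {p:.4f}")
print(f"twin, independence assumed   : {p_independent:.4f}")
print(f"twin, beta-factor dependence : {p_common_cause:.4f}")
```

Here dependence raises the twin-mission failure probability from 0.0100 to about 0.0264, the kind of gap that determines whether the second spacecraft's incremental cost is justified.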

17.
Losses due to natural hazard events can be extraordinarily high and difficult to cope with. Therefore, there is considerable interest in estimating the potential impact of current and future extreme events at all scales in as much detail as possible. As hazards typically spread over wider areas, risk assessment must take into account interrelations between regions. Neglecting such interdependencies can lead to a severe underestimation of potential losses, especially for extreme events. This underestimation of extreme risk can lead to the failure of risk-management strategies when they are most needed, namely, in times of unprecedented events. In this article, we suggest a methodology to incorporate such interdependencies in risk via the use of copulas. We demonstrate that by coupling losses, dependencies can be incorporated in risk analysis, avoiding the underestimation of risk. Based on maximum discharge data of river basins and stream networks, we present and discuss different ways to couple the loss distributions of basins while explicitly incorporating tail dependence. We distinguish between coupling methods that require river-structure data for the analysis and those that do not. For the latter approach we propose a minimax algorithm that chooses coupled basin pairs so that underestimation of risk is avoided and river-structure data are not needed. The proposed methodology is especially useful for large-scale analysis, and we motivate and apply our method using the case of Romania. The approach can easily be extended to other countries and natural hazards.
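A minimal sketch of coupling two basins' loss distributions with a Student-t copula, which (unlike a Gaussian copula) carries tail dependence, so joint extremes are not underestimated. The marginals and parameters are assumed, not the Romanian discharge data.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n, rho, df = 100_000, 0.6, 4

# Sample a bivariate t-copula: correlated normals scaled by sqrt(df/chi2),
# then mapped through the t CDF to uniforms.
L = np.linalg.cholesky(np.array([[1.0, rho], [rho, 1.0]]))
z = rng.standard_normal((n, 2)) @ L.T
w = np.sqrt(df / rng.chisquare(df, size=(n, 1)))
u = stats.t.cdf(z * w, df)

# Apply lognormal loss marginals per basin (assumed shapes).
loss1 = stats.lognorm.ppf(u[:, 0], s=1.0, scale=10.0)
loss2 = stats.lognorm.ppf(u[:, 1], s=1.2, scale=8.0)

q = 0.999
coupled = np.quantile(loss1 + loss2, q)
perm = rng.permutation(loss1)                 # break the dependence for comparison
independent = np.quantile(perm + loss2, q)
print(f"99.9% total-loss quantile, coupled      : {coupled:,.0f}")
print(f"99.9% total-loss quantile, independence : {independent:,.0f}")
```

The coupled quantile exceeds the independence one, illustrating how ignoring tail dependence understates extreme aggregate losses.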

18.
Underlying information about failure, including observations recorded as free text, can be a good source for understanding, analyzing, and extracting meaningful information for determining causation. The unstructured nature of natural-language expression demands advanced methodology to identify its underlying features, and no ready solution exists for utilizing unstructured data for risk assessment. Because relevant structured data are scarce, textual data can be a vital learning source for developing a risk assessment methodology. This work addresses the knowledge gap in extracting relevant features from textual data to develop cause-effect scenarios with minimal manual interpretation. The study applies natural language processing and text-mining techniques to extract features from past accident reports. The extracted features are transformed into parametric form with the help of fuzzy set theory and used in Bayesian networks as prior probabilities for risk assessment. An application of the proposed methodology is shown for microbiologically influenced corrosion-related incident reports available from the Pipeline and Hazardous Materials Safety Administration database. In addition, the trained named entity recognition (NER) model is verified on eight incidents, showing promising preliminary results in identifying all relevant features from textual data and demonstrating the robustness and applicability of the NER method. The proposed methodology can be used in domain-specific risk assessment to analyze, predict, and prevent future mishaps, improving overall process safety.

19.
Risk Analysis for Critical Asset Protection
This article proposes a quantitative risk assessment and management framework that supports strategic asset-level resource allocation decision making for critical infrastructure and key resource protection. The proposed framework consists of five phases: scenario identification, consequence and criticality assessment, security vulnerability assessment, threat likelihood assessment, and benefit-cost analysis. Key innovations in this methodology include its initial focus on fundamental asset characteristics to generate an exhaustive set of plausible threat scenarios based on a target-susceptibility matrix (which we refer to as asset-driven analysis), and an approach to threat likelihood assessment that captures adversaries' tendency to shift their preferences in response to security investments, based on the expected utilities of alternative attack profiles assessed from the adversary's perspective. A notional example demonstrates an application of the proposed framework. Extensions of this model to support strategic portfolio-level analysis and tactical risk analysis are suggested.
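A minimal sketch of the threat-likelihood step under an assumed adversary preference model: each attack profile is scored by its expected utility from the adversary's perspective, attack probabilities are taken as proportional to those utilities, and the probabilities are recomputed after a security investment lowers one profile's success probability. All names and numbers are invented.

```python
profiles = {              # name: (P(success | attack), adversary payoff)
    "vehicle bomb": (0.30, 100),
    "sabotage":     (0.50, 60),
    "cyber":        (0.40, 40),
}

def attack_probs(profiles):
    """Preference model (assumed): P(attack i) proportional to expected utility."""
    eu = {a: p * v for a, (p, v) in profiles.items()}
    z = sum(eu.values())
    return {a: u / z for a, u in eu.items()}

before = attack_probs(profiles)
profiles["vehicle bomb"] = (0.10, 100)       # invest in vehicle barriers
after = attack_probs(profiles)

for a in profiles:
    print(f"{a:12s}: P(attack) {before[a]:.2f} -> {after[a]:.2f}")
```

The shift of probability mass toward the undefended profiles is the preference-shifting behavior the framework is designed to capture.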

20.
Ali Mosleh. Risk Analysis, 2012, 32(11): 1888–1900
Credit risk is the potential exposure of a creditor to an obligor's failure or refusal to repay a debt, in principal or interest. This potential exposure is measured as a probability of default. Many models have been developed to estimate credit risk, with rating agencies dating back to the 19th century; they provide their assessments of the default probabilities and transition probabilities of various firms in their annual reports. Regulatory capital requirements for credit risk outlined by the Basel Committee on Banking Supervision have made it essential for banks and financial institutions to develop sophisticated models in an attempt to measure credit risk with higher accuracy. The Bayesian framework proposed in this article uses techniques developed in the physical sciences and engineering for dealing with model uncertainty and expert accuracy to obtain improved estimates of credit risk and the associated uncertainties. The approach uses estimates from one or more rating agencies and incorporates their historical accuracy (past performance data) in estimating future default risk and transition probabilities. Several examples demonstrate that the proposed methodology can assess default probability with accuracy exceeding that of any of the individual models. Moreover, the methodology accounts for potentially significant departures from "nominal predictions" due to "upsetting events" such as the 2008 global banking crisis.
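A minimal sketch of one simple way to fuse agency estimates with their historical accuracy: convert each agency's default probability to log-odds, weight by an accuracy score derived from past performance, and invert. This is a crude stand-in for the article's full Bayesian treatment, and all numbers are invented.

```python
import math

agencies = {        # name: (estimated P(default), historical accuracy weight)
    "agency_A": (0.020, 0.9),
    "agency_B": (0.035, 0.6),
    "agency_C": (0.010, 0.4),
}

def logit(p):
    return math.log(p / (1 - p))

def inv_logit(l):
    return 1.0 / (1.0 + math.exp(-l))

# Accuracy-weighted average in log-odds space (assumed fusion rule).
w_sum = sum(w for _, w in agencies.values())
fused = sum(w * logit(p) for p, w in agencies.values()) / w_sum
print(f"fused probability of default ≈ {inv_logit(fused):.4f}")
```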
