Similar Literature
20 similar records found.
1.
Risk Analysis, 2018, 38(10): 2087-2104
In the United Kingdom, dwelling fires are responsible for the majority of all fire-related fatalities. The development of these incidents involves the interaction of a multitude of variables that combine in many different ways. Consequently, assessment of dwelling fire risk can be complex, which often results in ambiguity during fire safety planning and decision making. In this article, a three-part Bayesian network model is proposed to study dwelling fires from ignition through to extinguishment, in order to improve confidence in dwelling fire safety assessment. The model incorporates both hard and soft data, delivering posterior probabilities for selected outcomes. Case studies demonstrate how the model functions and provide evidence of its use for planning and accident investigation.

2.
Domino Effect Analysis Using Bayesian Networks
A new methodology based on Bayesian networks is introduced both to model domino effect propagation patterns and to estimate domino effect probabilities at different levels. The flexible structure and the unique modeling techniques offered by Bayesian networks make it possible to analyze domino effects through a probabilistic framework, considering synergistic effects, noisy probabilities, and common cause failures. Further, the uncertainties and the complex interactions among the domino effect components are captured by the Bayesian network. The probabilities of events are updated in the light of new information, and the most probable path of the domino effect is determined on the basis of the newly gathered data. This study shows how probability updating helps refine the domino effect model either qualitatively or quantitatively. The methodology is applied to a hypothetical example and to a previously studied case. These examples demonstrate the effectiveness of Bayesian networks in modeling domino effects in processing facilities.
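A minimal sketch of the probability updating described above, using a toy two-event network; the events and numbers are illustrative, not taken from the paper:

```python
import itertools

# Toy domino network (illustrative numbers, not from the paper):
# F = primary fire at a unit, E = escalation to a neighbouring tank.
p_f = 0.01                                 # prior P(primary fire)
p_e_given_f = {True: 0.30, False: 0.001}   # P(escalation | F)

# Joint distribution by enumeration
joint = {}
for f, e in itertools.product([True, False], repeat=2):
    p_e = p_e_given_f[f] if e else 1.0 - p_e_given_f[f]
    joint[(f, e)] = (p_f if f else 1.0 - p_f) * p_e

# Marginal probability of the first domino level
p_escalation = sum(p for (f, e), p in joint.items() if e)

# Probability updating: escalation was observed, so revise belief in F
p_f_given_e = sum(p for (f, e), p in joint.items() if f and e) / p_escalation
print(f"P(escalation) = {p_escalation:.5f}")
print(f"P(primary fire | escalation observed) = {p_f_given_e:.3f}")
```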

3.
This study presents a probabilistic analysis of dam accidents worldwide in the period 1911–2016. The accidents are classified by dam purpose and by the country cluster in which they occurred, distinguishing between countries of the Organization for Economic Cooperation and Development (OECD) and nonmember countries (non-OECD, excluding China). A Bayesian hierarchical approach is used to model distributions of accident frequency and severity. This approach treats accident data as a multilevel system with subsets sharing specific characteristics. To model accident probabilities for a particular dam characteristic, it samples from the entire data set, borrowing strength across subsets and making it possible to model distributions even for subsets with scarce data. The modeled frequencies and severities are combined in frequency-consequence curves, showing that accidents for all dam purposes are more frequent in non-OECD countries (excluding China) and that their maximum consequences are larger than in OECD countries. Multipurpose dams also have higher frequencies and maximum consequences than single-purpose dams. In addition, the developed methodology explicitly models time dependence to identify trends in accident frequencies over the analyzed period. Downward trends are found for almost all dam purposes, confirming that technological development and the implementation of safety measures are likely to have a positive impact on dam safety. The results of the analysis provide insights for dam risk management and decision-making processes by identifying key risk factors related to country groups and dam purposes, as well as changes over time.
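A minimal sketch of the partial pooling ("borrowing strength") idea, assuming a conjugate gamma-Poisson hierarchy with crudely moment-matched hyperparameters; the counts and exposures are invented for illustration:

```python
import numpy as np

# Illustrative accident counts and exposures (dam-years) for four subsets
# of the data (e.g., dam purposes); not the study's actual data.
counts   = np.array([12.0, 3.0, 0.0, 7.0])
exposure = np.array([400.0, 150.0, 60.0, 250.0])

# Shared gamma(a, b) population of subset rates lambda_i, with hyperparameters
# set by a crude moment match (a full analysis would place priors on a, b too).
raw = counts / exposure
m, v = raw.mean(), raw.var() + 1e-12
b = m / v
a = m * b

# Conjugate gamma-Poisson update: each subset's estimate is shrunk toward
# the pooled rate, so even the zero-count subset gets a usable distribution.
post_mean = (a + counts) / (b + exposure)
for r, p in zip(raw, post_mean):
    print(f"raw rate {r:.4f} -> pooled posterior mean {p:.4f}")
```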

4.
The current trends of climate change will increase people's exposure to urban risks related to events such as landslides, floods, forest fires, food production, health, and water availability, which are stochastic and highly localized in nature. This research uses a Bayesian network (BN) approach to analyze the intensity of such urban risks for the Andean municipality of Pasto, Colombia, under climate change scenarios. The stochastic BN model is linked to correlational models and local representative concentration pathway (RCP) scenarios to project the possible risks to which the municipality of Pasto will be exposed in the future. The results show significant risks to crop yields, food security, water availability, and disaster risk, but no significant risks for the incidence of acute diarrheal diseases (ADD) and acute respiratory infections (ARI), whereas positive outcomes are likely in livestock production, influenced by population growth. The advantage of the BN approach is the possibility of updating beliefs about the probabilities of occurrence of events, especially in intermediate cities in developing countries with information-limited contexts.

5.
Pesticide risk assessment for food products involves combining information from consumption and concentration data sets to estimate a distribution for pesticide intake in a human population. Using this distribution, one can obtain probabilities of individuals exceeding specified levels of pesticide intake. In this article, we present a probabilistic, Bayesian approach to modeling the daily consumption of the pesticide iprodione through multiple food products. Modeling data on food consumption and pesticide concentration poses a variety of problems, such as the large proportions of consumptions and concentrations that are recorded as zero, and correlation between the consumptions of different foods. We consider daily food consumption data from the Netherlands National Food Consumption Survey and concentration data collected by the Netherlands Ministry of Agriculture. We develop a multivariate latent-Gaussian model for the consumption data that allows for correlated intakes between products. For the concentration data, we propose a univariate latent-t model. We then combine predicted consumptions and concentrations from these models to obtain a distribution for individual daily iprodione exposure. The latent-variable models allow for both skewness and large numbers of zeros in the consumption and concentration data. The use of a probabilistic approach is intended to yield more robust estimates of high percentiles of the exposure distribution than an empirical approach. Bayesian inference is used to facilitate the treatment of data with a complex structure.
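A minimal sketch of how zero-inflated consumption and concentration models combine into an exposure distribution, with simple zero-inflated lognormals standing in for the paper's latent-Gaussian and latent-t models; all parameters are illustrative:

```python
import numpy as np

rng = np.random.default_rng(7)
n = 100_000  # simulated person-days

# Zero-inflated models for one food: probability of consuming it on a given
# day, and probability that a sample carries a measurable residue.
p_eat, p_residue = 0.35, 0.20
consumption = np.where(rng.random(n) < p_eat,
                       rng.lognormal(3.0, 0.8, n), 0.0)     # g/day
concentration = np.where(rng.random(n) < p_residue,
                         rng.lognormal(-2.0, 1.0, n), 0.0)  # mg/kg

# Daily intake (mg/day); the large zero mass propagates automatically.
intake = consumption / 1000.0 * concentration
print(f"P(intake = 0)     = {np.mean(intake == 0):.3f}")
print(f"97.5th percentile = {np.percentile(intake, 97.5):.5f} mg/day")
```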

6.
M. C. Kennedy. Risk Analysis, 2011, 31(10): 1597-1609
Two-dimensional Monte Carlo simulation is frequently used to implement probabilistic risk models, as it allows uncertainty and variability to be quantified separately. In many cases, we are interested in the proportion of individuals from a variable population exceeding a critical threshold, together with the uncertainty about this proportion. In this article we introduce a new method that can estimate these quantities accurately and much more efficiently than conventional algorithms. We also show how those model parameters having the greatest impact on the probabilities of rare events can be quickly identified via this method. The algorithm combines elements from well-established statistical techniques in extreme value theory and Bayesian analysis of computer models. We demonstrate the practical application of these methods with a simple example, in which the true distributions are known exactly, and also with a more realistic seven-parameter model of microbial contamination of milk. For the latter, sensitivity analysis (SA) is shown to identify the two inputs explaining the majority of variation in distribution tail behavior. In the subsequent prediction of probabilities of large contamination events, similar results are obtained by the new approach, which takes 43 seconds, and by the conventional simulation, which requires more than 3 days.
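For reference, the conventional two-dimensional Monte Carlo that the new method accelerates can be sketched as a nested loop: the outer loop draws the uncertain parameters, the inner loop draws the variable population. The priors and threshold below are illustrative:

```python
import numpy as np

rng = np.random.default_rng(0)
threshold = 5.0
n_outer, n_inner = 200, 5_000   # uncertainty draws x variability draws

exceed = np.empty(n_outer)
for i in range(n_outer):
    # Outer loop: one draw of the uncertain parameters (illustrative priors)
    mu = rng.normal(1.0, 0.2)
    sigma = rng.uniform(0.3, 0.6)
    # Inner loop: the variable population given those parameters
    population = rng.lognormal(mu, sigma, size=n_inner)
    exceed[i] = np.mean(population > threshold)

lo, med, hi = np.percentile(exceed, [2.5, 50.0, 97.5])
print(f"P(exceed {threshold}): median {med:.4f}, 95% interval [{lo:.4f}, {hi:.4f}]")
```

For rare events the inner sample must be enormous to resolve small exceedance fractions, which is exactly the inefficiency the paper's extreme-value-based method addresses.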

7.
8.
Safety analysis of rare events with potentially catastrophic consequences is challenged by data scarcity and uncertainty. Traditional causation-based approaches, such as fault trees and event trees (used to model rare events), suffer from a number of weaknesses. These include the static structure of the event causation, the lack of event occurrence data, and the need for reliable prior information. In this study, a new technique based on hierarchical Bayesian modeling is proposed to overcome these drawbacks. It offers a flexible framework for risk analysis of major accidents, enabling both forward and backward analysis in quantitative reasoning and the treatment of interdependence among the model parameters. Source-to-source variability in data sources is also taken into account through a robust probabilistic safety analysis. The applicability of the proposed technique is demonstrated through a case study in the marine and offshore industry.
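A minimal sketch of the source-to-source variability idea, assuming a beta-binomial hierarchy with a coarse grid posterior over the hyperparameters; the failure counts are invented for illustration:

```python
import numpy as np
from scipy.stats import betabinom

# Failures and demands reported by different data sources (illustrative):
fails   = np.array([1, 0, 3, 2])
demands = np.array([120, 90, 400, 250])

# Source-to-source variability: each source's failure probability is drawn
# from a beta(a, b) population. Coarse grid posterior over the hyperparameters
# (population mean mu, concentration k) with a flat hyperprior.
mus    = np.linspace(0.001, 0.05, 60)
kappas = np.geomspace(2.0, 2000.0, 60)
M, K = np.meshgrid(mus, kappas)
a, b = M * K, (1.0 - M) * K

loglik = np.zeros_like(M)
for f, n in zip(fails, demands):
    loglik += betabinom.logpmf(f, n, a, b)
post = np.exp(loglik - loglik.max())
post /= post.sum()

# Posterior mean of the population-average failure probability
print(f"E[mu | data] = {(post * M).sum():.5f}")
```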

9.
Vulnerability of human beings exposed to a catastrophic disaster is affected by multiple factors, including hazard intensity, environment, and individual characteristics. The traditional approach to vulnerability assessment, based on the aggregate-area method and unsupervised learning, cannot incorporate spatial information; thus, vulnerability can be only roughly assessed. In this article, we propose Bayesian network (BN) and spatial analysis techniques to mine spatial data sets for evaluating the vulnerability of human beings. In our approach, spatial analysis is leveraged to preprocess the data; for example, kernel density analysis (KDA) and accumulative road cost surface modeling (ARCSM) are employed to quantify the influence of geofeatures on vulnerability and to relate that influence to spatial distance. The knowledge- and data-based BN provides a consistent platform for integrating a variety of factors, including those extracted by KDA and ARCSM, to model vulnerability under uncertainty. We also consider model uncertainty, using Bayesian model averaging and Occam's window to average the multiple models obtained by our approach for robust prediction of risk and vulnerability. We compare our approach with other probabilistic models in a case study of seismic risk and conclude that it is an effective means of mining spatial data sets to evaluate vulnerability.
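A minimal sketch of the kernel density analysis (KDA) step, assuming scipy's Gaussian KDE as a stand-in; the coordinates are invented for illustration:

```python
import numpy as np
from scipy.stats import gaussian_kde

rng = np.random.default_rng(3)

# Illustrative coordinates (km) of a hazardous geofeature's observation points
# and of the sites whose vulnerability is being assessed.
feature_xy = rng.normal(0.0, 1.0, size=(2, 200))
sites_xy = np.array([[0.1, 2.5, -4.0],
                     [0.0, 1.0,  3.0]])

# Kernel density analysis: the feature's density at each site gives a
# distance-decayed "influence" score that can feed the Bayesian network.
kde = gaussian_kde(feature_xy)
influence = kde(sites_xy)
print(np.round(influence, 4))
```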

10.
The association between daily variations in urban air quality and mortality has been well documented using time series statistical methods. This approach assumes a constant association over time. We develop a space-time dynamic model that relaxes this assumption, thus more directly examining the hypothesis that improvements in air quality translate into improvements in public health. We postulate a Bayesian hierarchical two-level model to estimate annual mortality risks at regional and national levels and to track both risk and the heterogeneity of risk within and between regions over time. We illustrate our methods using daily nitrogen dioxide (NO2) concentrations and nonaccidental mortality data collected for 1984-2004 in 24 Canadian cities. Estimates of risk are compared by cause of mortality (cardiopulmonary [CP] versus non-CP), and estimates of heterogeneity by season. Over the entire period, the NO2 risk for CP mortality was slightly lower, but with a narrower credible interval, than that for non-CP mortality, mainly due to an unusually low risk for a single year (1998). Warm-season NO2 risk was higher than cold-season risk for both CP and non-CP mortality. Over the 21 years as a whole, no significant differences were detected among the four regional NO2 risks, and we found no strong evidence for time trends in NO2 risk at national or regional levels. However, an increasing linear time trend in the annual between-region heterogeneities was detected, which suggests that the differences in risk among the four regions are growing; further studies are necessary to understand this increasing heterogeneity.

11.
Stakeholders making decisions in public health and world trade need improved estimates of the burden of illness of foodborne infectious diseases. In this article, we propose a Bayesian meta-analysis, or more precisely a Bayesian evidence synthesis, to assess the burden of illness of campylobacteriosis in France. Using this case study, we investigate campylobacteriosis prevalence, as well as the probabilities of the events that guide the disease pathway, by (i) applying a Bayesian approach to French and foreign human studies (from active surveillance systems, laboratory surveys, physician surveys, epidemiological surveys, and so on) through the chain of events that occur during an episode of illness and (ii) including expert knowledge about this chain of events. We split the target population using an exhaustive and exclusive partition based on health status and the level of disease investigation, and we assume an approximate multinomial model over this partition. Each observed data set related to the partition thereby brings information on the parameters of the multinomial model, improving the burden-of-illness estimates that can be deduced from them. This multinomial model serves as the core model for the Bayesian evidence synthesis, and expert knowledge is introduced by way of pseudo-data. The result is a global estimate of the burden-of-illness parameters with their accompanying uncertainty.
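A minimal sketch of the core idea, assuming a conjugate Dirichlet-multinomial over an illustrative four-state partition, with expert knowledge entering as pseudo-counts:

```python
import numpy as np

rng = np.random.default_rng(1)

# Exhaustive, exclusive partition of episodes (illustrative four states):
# 0 = not ill, 1 = ill without physician visit, 2 = physician visit,
# 3 = visit with laboratory confirmation.
survey_counts = np.array([9500.0, 380.0, 110.0, 10.0])  # observed data
expert_pseudo = np.array([0.0, 20.0, 8.0, 2.0])         # expert knowledge as pseudo-data

# Conjugate Dirichlet-multinomial update over the partition probabilities
alpha_post = 1.0 + survey_counts + expert_pseudo

# A burden-of-illness quantity with uncertainty: P(physician visit | ill)
draws = rng.dirichlet(alpha_post, size=20_000)
p_visit_given_ill = draws[:, 2:].sum(axis=1) / draws[:, 1:].sum(axis=1)
lo, hi = np.percentile(p_visit_given_ill, [2.5, 97.5])
print(f"P(visit | ill) = {p_visit_given_ill.mean():.3f} (95% CrI [{lo:.3f}, {hi:.3f}])")
```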

12.
The ability to accurately measure the recovery rate of infrastructure systems and communities impacted by disasters is vital to ensure effective response and resource allocation before, during, and after a disruption. A challenge in quantifying such measures, however, is the lack of data, as community recovery information is seldom recorded. To provide accurate community recovery measures, a hierarchical Bayesian kernel model (HBKM) is developed to predict the recovery rate of communities experiencing power outages during storms. The performance of the proposed method is evaluated using cross-validation and compared with two models, the hierarchical Bayesian regression model and the Poisson generalized linear model. A case study focusing on the recovery of communities in Shelby County, Tennessee, after severe storms between 2007 and 2017 illustrates the proposed approach. The predictive accuracy of the models is evaluated using the log-likelihood and the root mean squared error; the HBKM yields on average the highest out-of-sample predictive accuracy. This approach can help assess the recoverability of a community when data are scarce and inform decision making in the aftermath of a disaster. An illustrative example demonstrates how accurate measures of community resilience can help reduce the cost of infrastructure restoration.

13.
The present study analyzes the effects of different socioeconomic factors on the frequency of fire ignitions, according to their original causes. The data comprise a set of documented ignition points in the region of Catalonia for the period 1995–2008. The analysis focused on the spatial aggregation patterns of the ignitions for each specific cause. The point-based ignition data were interpolated into municipality-level information using kernel methods, which served as the basis for defining five ignition density levels. Afterwards, the combination of socioeconomic factors influencing the municipalities' ignition density levels was analyzed for each documented cause of ignition using a principal component analysis. The results confirmed that both the spatial aggregation patterns of fire ignitions and the factors defining their occurrence are specific to each cause. Intentional fires and those of unknown origin were found to have similar spatial aggregation patterns, and the presence of high-ignition-density areas was related to high population and high unemployment rates. Additionally, fires originating from forest work, agricultural activities, pasture burning, and lightning were found to behave in distinctive ways of their own, differing from the similarities found in the spatial aggregation of ignitions caused by smokers, electric lines, machinery, campfires, and those of intentional or unknown origin.

14.
We study decision problems in which consequences of the various alternative actions depend on states determined by a generative mechanism representing some natural or social phenomenon. Model uncertainty arises because decision makers may not know this mechanism. Two types of uncertainty result: a state uncertainty within models and a model uncertainty across them. We discuss some two-stage static decision criteria proposed in the literature that address state uncertainty in the first stage and model uncertainty in the second (by considering subjective probabilities over models). We consider two approaches to the Ellsberg-type phenomena characteristic of such decision problems: a Bayesian approach based on the distinction between subjective attitudes toward the two kinds of uncertainty, and a non-Bayesian approach that permits multiple subjective probabilities. Several applications are used to illustrate concepts as they are introduced.

15.
Ali Mosleh. Risk Analysis, 2012, 32(11): 1888-1900
Credit risk is the potential exposure of a creditor to an obligor's failure or refusal to repay a debt in principal or interest, and this potential exposure is measured in terms of the probability of default. Many models have been developed to estimate credit risk, with rating agencies' assessments dating back to the 19th century; the agencies provide their estimates of default probabilities and transition probabilities for various firms in their annual reports. Regulatory capital requirements for credit risk outlined by the Basel Committee on Banking Supervision have made it essential for banks and financial institutions to develop sophisticated models in an attempt to measure credit risk with higher accuracy. The Bayesian framework proposed in this article uses techniques developed in the physical sciences and engineering for dealing with model uncertainty and expert accuracy to obtain improved estimates of credit risk and the associated uncertainties. The approach uses estimates from one or more rating agencies and incorporates their historical accuracy (past performance data) in estimating future default risk and transition probabilities. Several examples demonstrate that the proposed methodology can assess default probability with accuracy exceeding that of any of the individual models. Moreover, the methodology accounts for potentially significant departures from “nominal predictions” due to “upsetting events” such as the 2008 global banking crisis.
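A minimal sketch of accuracy-weighted combination of agency estimates, assuming a simple precision-weighted pooling on the log-odds scale as a stand-in for the article's full framework; all numbers are illustrative:

```python
import numpy as np

# One-year default-probability estimates from three agencies for one firm,
# and each agency's historical error sd on the log-odds scale (illustrative).
estimates = np.array([0.020, 0.035, 0.015])
err_sd    = np.array([0.40, 0.60, 0.50])

def logit(p):
    return np.log(p / (1.0 - p))

def inv_logit(x):
    return 1.0 / (1.0 + np.exp(-x))

# Precision-weighted pooling with a vague prior: agencies with better past
# performance (smaller historical error) receive more weight.
w = 1.0 / err_sd**2
post_mean = np.sum(w * logit(estimates)) / np.sum(w)
post_sd = np.sqrt(1.0 / np.sum(w))

lo, hi = inv_logit(post_mean - 1.96 * post_sd), inv_logit(post_mean + 1.96 * post_sd)
print(f"pooled default probability = {inv_logit(post_mean):.4f} (95% CrI [{lo:.4f}, {hi:.4f}])")
```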

16.
A Bayesian forecasting model is developed to quantify uncertainty about the postflight state of a field-joint primary O-ring (not damaged or damaged), given the O-ring temperature at the time of launch of the space shuttle Challenger in 1986. The crux of this problem is the enormous extrapolation that must be performed: 23 previous shuttle flights were launched at temperatures between 53 °F and 81 °F, but the next launch was planned at 31 °F. The fundamental advantage of the Bayesian model is its theoretic structure, which remains correct over the entire sample space of the predictor and affords flexibility of implementation. A novel approach to extrapolating the input elements based on expert judgment is presented; it recognizes that extrapolation is equivalent to changing the conditioning of the model elements. The prior probability of O-ring damage can be assessed subjectively by experts following a nominal-interacting process in a group setting. The Bayesian model can output several posterior probabilities of O-ring damage, each conditional on the given temperature and on a different strength of the temperature-effect hypothesis. A lower bound on, or a value of, the posterior probability can be selected for decision making consistent with expert judgment, which encapsulates engineering information, knowledge, and experience. The Bayesian forecasting model is posed as a replacement for the logistic regression and the nonparametric approach advocated in earlier analyses of the Challenger O-ring data. A comparison demonstrates the inherent deficiency of generalized linear models for risk analyses that require (1) forecasting an event conditional on a predictor value outside the sampling interval, and (2) combining empirical evidence with expert judgment.
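A minimal sketch of the forecast structure, computing a posterior damage probability by Bayes' rule from a subjective prior and temperature likelihoods; the distributions below are illustrative stand-ins, not the article's expert-assessed inputs:

```python
from scipy.stats import norm

# Subjective prior P(damage) and temperature models under each O-ring state;
# these are illustrative stand-ins for the article's expert-assessed inputs.
p_damage_prior = 0.15
lik_damage    = norm(loc=57.0, scale=8.0)   # launch temperature if damage occurs
lik_no_damage = norm(loc=72.0, scale=7.0)   # launch temperature if no damage

def posterior_damage(temp_f: float) -> float:
    """Bayes' rule: P(damage | launch temperature in deg F)."""
    num = lik_damage.pdf(temp_f) * p_damage_prior
    den = num + lik_no_damage.pdf(temp_f) * (1.0 - p_damage_prior)
    return num / den

# The extrapolation at issue: well inside the 53-81 F sample vs. the 31 F launch
for t in (75.0, 53.0, 31.0):
    print(f"P(damage | T = {t:g} F) = {posterior_damage(t):.3f}")
```

Unlike a fitted regression curve, this structure stays probabilistically coherent at 31 °F because the likelihoods are defined over the predictor's whole sample space.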

17.
In this paper, we aim to design a monetary policy for the euro area that is robust to the high degree of model uncertainty at the start of monetary union and allows for learning about model probabilities. To this end, we compare and ultimately combine Bayesian and worst-case analysis using four reference models estimated with pre-European Monetary Union (EMU) synthetic data. We start by computing the cost of insurance against model uncertainty, that is, the relative performance of the worst-case (minimax) policy versus the Bayesian policy. While maximum insurance comes at moderate cost, we highlight three shortcomings of this worst-case insurance policy: (i) prior beliefs that would rationalize it from a Bayesian perspective indicate that such insurance is strongly oriented toward the model with the highest baseline losses; (ii) the minimax policy is not as tolerant of small perturbations of policy parameters as the Bayesian policy; and (iii) the minimax policy offers no avenue for incorporating posterior model probabilities derived from data available since monetary union. We therefore propose preferences for robust policy design that reflect a mixture of the Bayesian and minimax approaches. We show how the incoming EMU data may then be used to update model probabilities, and we investigate the implications for policy. (JEL: E52, E58, E61)
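A minimal sketch of the Bayesian-versus-minimax comparison and the "cost of insurance", assuming a small loss matrix over candidate policies and reference models; all numbers are illustrative:

```python
import numpy as np

# Losses of three candidate policy rules under four reference models
# (rows: policies, columns: models; all numbers illustrative).
loss = np.array([[1.0, 2.0, 6.0, 1.5],
                 [1.4, 1.8, 3.0, 1.6],
                 [2.2, 2.1, 2.4, 2.3]])
model_prob = np.array([0.4, 0.3, 0.2, 0.1])  # (posterior) model probabilities

expected = loss @ model_prob
bayes_policy   = int(np.argmin(expected))          # minimise expected loss
minimax_policy = int(np.argmin(loss.max(axis=1)))  # minimise worst-case loss

# "Cost of insurance": extra expected loss from playing the minimax rule
cost = expected[minimax_policy] - expected[bayes_policy]
print(f"Bayes policy {bayes_policy}, minimax policy {minimax_policy}, "
      f"insurance cost {cost:.3f}")
```

As new data arrive, updating `model_prob` changes the Bayesian policy, whereas the minimax choice ignores the updated probabilities entirely, which is shortcoming (iii) above.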

18.
In spite of increased attention to quality and efforts to provide safe medical care, adverse events (AEs) are still frequent in clinical practice. Reports from various sources indicate that a substantial number of hospitalized patients suffer treatment-caused injuries while in the hospital. While risk cannot be entirely eliminated from health-care activities, an important goal is to develop effective and durable mitigation strategies to render the system “safer.” In order to do this, though, we must develop models that comprehensively and realistically characterize the risk. In the health-care domain this can be extremely challenging, owing to the wide variability in the way health-care processes and interventions are executed and to the dynamic nature of risk in this domain. In this study, we have developed a generic methodology for evaluating dynamic changes in AE risk in acute care hospitals as a function of organizational and nonorganizational factors, using a combination of modeling formalisms. First, a system dynamics (SD) framework is used to demonstrate how organizational-level and policy-level contributions to risk evolve over time, and how policies and decisions may affect the general system-level contribution to AE risk; it also captures the feedback of organizational factors and decisions over time and the nonlinearities in these feedback effects. SD is a popular approach to understanding the behavior of complex social and economic systems. It is a simulation-based, differential-equation modeling tool that is widely used where the formal model is complex and an analytical solution is very difficult to obtain. Second, a Bayesian belief network (BBN) framework is used to represent patient-level factors, as well as physician-level decisions and factors in the management of an individual patient, which contribute to the risk of hospital-acquired AEs. BBNs are networks of probabilities that can capture probabilistic relations between variables, contain historical information about their relationships, and are powerful tools for modeling causes and effects in many domains. The model is intended to support hospital decisions with regard to staffing, length of stay, and investments in safety, which evolve dynamically over time. The methodology has been applied to modeling two common types of AE, pressure ulcers and vascular-catheter-associated infection, and the models have been validated with eight years of clinical data and the use of expert opinion.

19.
A conventional dose–response function must be refitted as additional data become available. A predictive Bayesian dose–response function, in contrast, requires no curve-fitting step, only additional data; it presents the unconditional probabilities of illness, reflecting the level of information it contains, and becomes progressively less conservative as more information is included. This investigation evaluated the potential of predictive Bayesian methods to develop a dose–response function for human infection that improves on existing models, showed how predictive Bayesian statistical methods can utilize additional data, and expanded the Bayesian methods for a broad audience, including those concerned about oversimplified use of dose–response curves in quantitative microbial risk assessment (QMRA). The study used a dose–response relationship incorporating six separate data sets for Cryptosporidium parvum. A Pareto II distribution with known priors was applied to one of the six data sets to calibrate the model, while the others were used for subsequent updating. Although epidemiological principles indicate that local variations, host susceptibility, and organism strain virulence may vary, all six data sets appear to be well characterized using the Bayesian approach. For validation, the adaptable model was applied to an existing data set for Campylobacter jejuni, yielding results that demonstrate the ability to analyze a dose–response function with limited data and to update the relationship with new data. A goodness-of-fit analysis, compared with beta-Poisson methods, also demonstrated agreement between the predictive Bayesian model and the data.
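A minimal sketch of Bayesian updating of a dose–response curve, assuming a beta-Poisson-type form with a grid posterior as a stand-in for the study's Pareto II predictive model; the dose-group data are invented for illustration:

```python
import numpy as np
from scipy.stats import binom

dose = np.array([30.0, 100.0, 300.0, 1000.0])   # dose (oocysts), illustrative
n    = np.array([8, 8, 8, 8])                   # subjects per dose group
y    = np.array([1, 3, 5, 7])                   # infected per group

def p_ill(d, alpha, beta):
    """Beta-Poisson-type dose-response: P(ill | dose d)."""
    return 1.0 - (1.0 + d / beta) ** (-alpha)

# Grid posterior over (alpha, beta), flat prior on the log-spaced grid
alphas = np.geomspace(0.01, 5.0, 80)
betas  = np.geomspace(1.0, 5000.0, 80)
A, B = np.meshgrid(alphas, betas)
loglik = sum(binom.logpmf(yi, ni, p_ill(di, A, B))
             for di, ni, yi in zip(dose, n, y))
post = np.exp(loglik - loglik.max())
post /= post.sum()

# Predictive (unconditional) probability of illness at a new low dose:
# the posterior-weighted average of the curve. A new data set would simply
# add more terms to loglik -- no separate refitting step.
d_new = 10.0
print(f"predictive P(ill | dose={d_new:g}) = {(post * p_ill(d_new, A, B)).sum():.4f}")
```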

20.
Pet-Armacost, Julia J.; Sepulveda, Jose; Sakude, Milton. Risk Analysis, 1999, 19(6): 1173-1184
The US Department of Transportation was interested in the risks associated with transporting hydrazine in tanks with and without relief devices. Hydrazine is highly toxic and flammable, as well as corrosive; consequently, there was conflict over whether a relief device should be used. Data were not available on the impact of relief devices on release probabilities or on the impact of hydrazine on the likelihood of fires and explosions. In this paper, a Monte Carlo sensitivity analysis of the unknown parameters is used to assess the risks associated with highway transport of hydrazine. To help determine whether relief devices should be used, fault trees and event trees were used to model the sequences of events that could lead to adverse consequences during transport. The event probabilities in the event trees were derived as functions of the parameters whose effects were not known. The impacts of these parameters on the risks of toxic exposures, fires, and explosions were examined through a Monte Carlo sensitivity analysis and assessed statistically through an analysis of variance. The analysis identified which of the unknown parameters had a significant impact on the risks, and it provided the necessary support for a critical transportation decision even though the values of several key parameters were not known.
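A minimal sketch of the approach, propagating uniformly sampled unknown parameters through a simple event tree and ranking their influence (rank correlation stands in for the paper's analysis of variance); all numbers are illustrative:

```python
import numpy as np
from scipy.stats import spearmanr

rng = np.random.default_rng(42)
n = 50_000

# Unknown event-tree parameters sampled over wide ranges (illustrative):
p_release  = rng.uniform(1e-5, 1e-3, n)   # release probability per shipment
p_ignition = rng.uniform(0.01, 0.30, n)   # ignition probability given a release
p_explode  = rng.uniform(0.05, 0.50, n)   # explosion probability given ignition

# Event-tree outcome frequencies per shipment
f_toxic     = p_release * (1 - p_ignition)                # toxic exposure only
f_fire      = p_release * p_ignition * (1 - p_explode)
f_explosion = p_release * p_ignition * p_explode
risk = 1.0 * f_toxic + 5.0 * f_fire + 20.0 * f_explosion  # weighted consequences

# Which unknown parameter drives the spread in risk?
for name, x in [("p_release", p_release), ("p_ignition", p_ignition),
                ("p_explode", p_explode)]:
    rho, _ = spearmanr(x, risk)
    print(f"{name:10s} Spearman rho = {rho:+.3f}")
```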
