Similar Literature (20 records)
1.
Dose‐response models are the essential link between exposure assessment and computed risk values in quantitative microbial risk assessment, yet the uncertainty that is inherent to computed risks because the dose‐response model parameters are estimated using limited epidemiological data is rarely quantified. Second‐order risk characterization approaches incorporating uncertainty in dose‐response model parameters can provide more complete information to decisionmakers by separating variability and uncertainty to quantify the uncertainty in computed risks. Therefore, the objective of this work is to develop procedures to sample from posterior distributions describing uncertainty in the parameters of exponential and beta‐Poisson dose‐response models using Bayes's theorem and Markov Chain Monte Carlo (in OpenBUGS). The theoretical origins of the beta‐Poisson dose‐response model are used to identify a decomposed version of the model that enables Bayesian analysis without the need to evaluate Kummer confluent hypergeometric functions. Herein, it is also established that the beta distribution in the beta‐Poisson dose‐response model cannot address variation among individual pathogens, criteria to validate use of the conventional approximation to the beta‐Poisson model are proposed, and simple algorithms to evaluate actual beta‐Poisson probabilities of infection are investigated. The developed MCMC procedures are applied to analysis of a case study data set, and it is demonstrated that an important region of the posterior distribution of the beta‐Poisson dose‐response model parameters is attributable to the absence of low‐dose data. This region includes beta‐Poisson models for which the conventional approximation is especially invalid and in which many beta distributions have an extreme shape with questionable plausibility.
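As a rough illustration of the sampling problem this abstract describes, the following Python sketch fits the exponential dose-response model P(d) = 1 − exp(−r·d) to a synthetic data set with a random-walk Metropolis sampler. The doses, counts, step size, and flat prior on log r are all illustrative assumptions, not the paper's case-study data or its OpenBUGS implementation.

```python
import numpy as np
from scipy.stats import binom

rng = np.random.default_rng(1)

# Illustrative dose-response data (doses, subjects, infected) -- not the
# case-study data set analyzed in the paper.
dose = np.array([9e4, 9e5, 9e6, 9e7])
n    = np.array([10, 10, 10, 10])
y    = np.array([2, 5, 8, 10])

def log_post(log_r):
    """Log posterior of the exponential model P(d) = 1 - exp(-r d),
    assuming a flat prior on log(r)."""
    p = 1.0 - np.exp(-np.exp(log_r) * dose)
    p = np.clip(p, 1e-12, 1 - 1e-12)
    return binom.logpmf(y, n, p).sum()

# Random-walk Metropolis on log(r)
cur = np.log(1e-6)
cur_lp = log_post(cur)
samples = []
for _ in range(20000):
    prop = cur + 0.3 * rng.standard_normal()
    lp = log_post(prop)
    if np.log(rng.uniform()) < lp - cur_lp:
        cur, cur_lp = prop, lp
    samples.append(cur)
r = np.exp(np.array(samples[5000:]))          # discard burn-in
print("posterior median r:", np.median(r))
# Parameter uncertainty induces uncertainty in the risk at any dose:
print("P(infection | d=1e5), 95% CrI:",
      np.percentile(1 - np.exp(-r * 1e5), [2.5, 97.5]))
```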

2.
A. E. Ades & G. Lu, Risk Analysis, 2003, 23(6): 1165–1172
Monte Carlo simulation has become the accepted method for propagating parameter uncertainty through risk models. It is widely appreciated, however, that correlations between input variables must be taken into account if models are to deliver correct assessments of uncertainty in risk. Various two-stage methods have been proposed that first estimate a correlation structure and then generate Monte Carlo simulations, which incorporate this structure while leaving marginal distributions of parameters unchanged. Here we propose a one-stage alternative, in which the correlation structure is estimated from the data directly by Bayesian Markov Chain Monte Carlo methods. Samples from the posterior distribution of the outputs then correctly reflect the correlation between parameters, given the data and the model. Besides its computational simplicity, this approach utilizes the available evidence from a wide variety of structures, including incomplete data and correlated and uncorrelated repeat observations. The major advantage of a Bayesian approach is that, rather than assuming the correlation structure is fixed and known, it captures the joint uncertainty induced by the data in all parameters, including variances and covariances, and correctly propagates this through the decision or risk model. These features are illustrated with examples on emissions of dioxin congeners from solid waste incinerators.
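A minimal sketch of the one-stage idea, assuming a conjugate setup: under a Jeffreys prior, the posterior of (μ, Σ) for multivariate normal data factorizes into an inverse-Wishart draw for Σ and a conditional normal draw for μ, so each posterior sample carries the joint uncertainty in means, variances, and covariances through a placeholder risk model. The data and the risk model here are synthetic stand-ins, not the dioxin-congener application.

```python
import numpy as np
from scipy.stats import invwishart

rng = np.random.default_rng(2)

# Synthetic stand-in for correlated input measurements.
X = rng.multivariate_normal([1.0, 2.0], [[0.5, 0.3], [0.3, 0.8]], size=30)
n, p = X.shape
xbar = X.mean(axis=0)
S = (X - xbar).T @ (X - xbar)

def risk_model(theta):
    return theta[0] * theta[1]        # placeholder downstream risk model

# One-stage posterior sampling under the Jeffreys prior:
# Sigma | X ~ Inverse-Wishart(n-1, S),  mu | Sigma, X ~ N(xbar, Sigma/n)
out = []
for _ in range(5000):
    Sigma = invwishart.rvs(df=n - 1, scale=S, random_state=rng)
    mu = rng.multivariate_normal(xbar, Sigma / n)
    # Each draw of (mu, Sigma) carries the joint uncertainty, including
    # the uncertainty in the correlation structure itself.
    theta = rng.multivariate_normal(mu, Sigma)
    out.append(risk_model(theta))
print("risk 2.5/50/97.5 percentiles:", np.percentile(out, [2.5, 50, 97.5]))
```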

3.
Bayesian Prediction of Internal Fraud in Banks Based on a Variable Location Parameter
Internal fraud is the most severe type of operational risk for Chinese commercial banks. Because of the intrinsic characteristics of operational risk and the short history of internal-fraud loss data collection at Chinese commercial banks, data are scarce. To measure the risk more accurately from small samples, this paper adopts a Bayesian posterior predictive distribution approach, assuming that loss frequency follows a Poisson–Gamma distribution and loss severity follows a generalized Pareto–mixed Gamma distribution, and derives the form of the posterior distributions. Because the choice of the location parameter strongly affects parameter estimation for the generalized Pareto distribution, a Bayesian analysis with a linear trend in the variable location parameter is adopted to stabilize parameter prediction and reduce the influence of the location-parameter choice. Posterior predictive and marginal distributions of internal-fraud loss frequency and severity for Chinese commercial banks are obtained, and Monte Carlo simulation then combines the predictive distributions of frequency and severity into a joint risk distribution for internal fraud. Compared with the traditional Poisson-GPD extreme-value approach, the value-at-risk and expected shortfall are markedly lower, which helps banks reduce the operational-risk capital held against internal fraud. The posterior distribution obtained from the Bayesian analysis can serve as the prior for future analyses, which helps obtain more realistic parameter estimates from small samples.
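A compact Monte Carlo sketch of the final step described above: loss frequency is drawn from a Poisson rate with a Gamma posterior (so annual counts follow the posterior predictive negative binomial), severities are drawn from a generalized Pareto distribution, and VaR and expected shortfall are read off the simulated aggregate loss. All parameter values are hypothetical, not those estimated from the paper's bank data.

```python
import numpy as np
from scipy.stats import genpareto

rng = np.random.default_rng(3)

# Hypothetical posterior parameters for illustration only.
a_post, b_post = 12.0, 4.0        # Gamma posterior for the Poisson rate
xi, mu, beta = 0.4, 1.0, 2.0      # GPD shape / location / scale (severity)

years = 20_000
agg = np.empty(years)
for i in range(years):
    lam = rng.gamma(a_post, 1.0 / b_post)   # rate drawn from its posterior
    k = rng.poisson(lam)                    # number of losses in the year
    losses = genpareto.rvs(xi, loc=mu, scale=beta, size=k, random_state=rng)
    agg[i] = losses.sum()

var999 = np.percentile(agg, 99.9)
es999 = agg[agg >= var999].mean()           # expected shortfall beyond VaR
print(f"VaR 99.9% = {var999:.2f}, ES 99.9% = {es999:.2f}")
```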

4.
R. Li, J. D. Englehardt & X. Li, Risk Analysis, 2012, 32(2): 345–359
Multivariate probability distributions, such as may be used for mixture dose‐response assessment, are typically highly parameterized and difficult to fit to available data. However, such distributions may be useful in analyzing the large electronic data sets becoming available, such as dose‐response biomarker and genetic information. In this article, a new two‐stage computational approach is introduced for estimating multivariate distributions and addressing parameter uncertainty. The proposed first stage comprises a gradient Markov chain Monte Carlo (GMCMC) technique to find Bayesian posterior mode estimates (PMEs) of parameters, equivalent to maximum likelihood estimates (MLEs) in the absence of subjective information. In the second stage, these estimates are used to initialize a Markov chain Monte Carlo (MCMC) simulation, replacing the conventional burn‐in period to allow convergent simulation of the full joint Bayesian posterior distribution and the corresponding unconditional multivariate distribution (not conditional on uncertain parameter values). When the distribution of parameter uncertainty is such a Bayesian posterior, the unconditional distribution is termed predictive. The method is demonstrated by finding conditional and unconditional versions of the recently proposed emergent dose‐response function (DRF). Results are shown for the five‐parameter common‐mode and seven‐parameter dissimilar‐mode models, based on published data for eight benzene–toluene dose pairs. The common‐mode conditional DRF is obtained with a 21‐fold reduction in data requirement versus MCMC. Example common‐mode unconditional DRFs are then found using synthetic data, showing a 71% reduction in required data. The approach is further demonstrated for a PCB 126–PCB 153 mixture. Applicability is analyzed and discussed. Matlab® computer programs are provided.
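The two-stage structure can be sketched as follows: stage one finds the posterior mode with a gradient-based optimizer, and stage two runs an MCMC chain initialized at that mode so the conventional burn-in is unnecessary. The two-parameter logistic function and the synthetic quantal data below are placeholders for the paper's five- and seven-parameter emergent DRFs and its GMCMC sampler.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import binom

rng = np.random.default_rng(4)

# Toy single-agent quantal data standing in for the benzene-toluene pairs.
dose = np.array([0.0, 1.0, 2.0, 4.0, 8.0])
n    = np.array([50, 50, 50, 50, 50])
y    = np.array([1, 4, 9, 21, 38])

def neg_log_post(theta):
    a, b = theta
    p = 1.0 / (1.0 + np.exp(-(a + b * dose)))   # placeholder 2-parameter DRF
    p = np.clip(p, 1e-12, 1 - 1e-12)
    return -binom.logpmf(y, n, p).sum()         # flat prior -> mode = MLE

# Stage 1: gradient-based search for the posterior mode (PME).
mode = minimize(neg_log_post, x0=[-2.0, 0.5], method="BFGS").x

# Stage 2: Metropolis chain started at the mode -- no burn-in period.
cur, cur_lp = mode, -neg_log_post(mode)
draws = []
for _ in range(10000):
    prop = cur + 0.1 * rng.standard_normal(2)
    lp = -neg_log_post(prop)
    if np.log(rng.uniform()) < lp - cur_lp:
        cur, cur_lp = prop, lp
    draws.append(cur)
draws = np.array(draws)
print("posterior mode:", mode.round(3))
print("posterior mean:", draws.mean(axis=0).round(3))
```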

5.
In this paper, we aim to design a monetary policy for the euro area that is robust to the high degree of model uncertainty at the start of monetary union and allows for learning about model probabilities. To this end, we compare and ultimately combine Bayesian and worst‐case analysis using four reference models estimated with pre–European Monetary Union (EMU) synthetic data. We start by computing the cost of insurance against model uncertainty, that is, the relative performance of worst‐case or minimax policy versus Bayesian policy. While maximum insurance comes at moderate costs, we highlight three shortcomings of this worst‐case insurance policy: (i) prior beliefs that would rationalize it from a Bayesian perspective indicate that such insurance is strongly oriented towards the model with highest baseline losses; (ii) the minimax policy is not as tolerant towards small perturbations of policy parameters as the Bayesian policy; and (iii) the minimax policy offers no avenue for incorporating posterior model probabilities derived from data available since monetary union. Thus, we propose preferences for robust policy design that reflect a mixture of the Bayesian and minimax approaches. We show how the incoming EMU data may then be used to update model probabilities, and investigate the implications for policy. (JEL: E52, E58, E61)
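A toy numerical rendering of the policy comparison, assuming a loss matrix L[i, j] for policy j under reference model i: the Bayesian policy minimizes prior-weighted expected loss, the minimax policy minimizes worst-case loss, and a mixture criterion interpolates between them; incoming data update the model probabilities via Bayes' rule. The losses, prior, and log marginal likelihoods are invented for illustration.

```python
import numpy as np

# Illustrative losses L[i, j] of policy j evaluated in reference model i;
# the paper computes these from four estimated euro-area models.
L = np.array([[1.0, 1.4, 2.0],
              [1.6, 1.2, 1.8],
              [3.0, 2.2, 1.9],
              [1.2, 1.1, 1.6]])
prior = np.array([0.25, 0.25, 0.25, 0.25])   # initial model probabilities

bayes   = np.argmin(prior @ L)               # minimize expected loss
minimax = np.argmin(L.max(axis=0))           # minimize worst-case loss

# Mixture preference: weight lam on the worst case, 1 - lam on the
# Bayesian average (a stand-in for the paper's proposed compromise).
lam = 0.5
mixed = np.argmin(lam * L.max(axis=0) + (1 - lam) * (prior @ L))

# Incoming EMU data update the model probabilities via Bayes' rule:
loglik = np.array([-10.2, -9.1, -12.5, -9.8])   # hypothetical log marginal likelihoods
post = prior * np.exp(loglik - loglik.max())
post /= post.sum()
print("policies (bayes, minimax, mixed):", bayes, minimax, mixed)
print("updated model probabilities:", post.round(3))
```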

6.
Bayesian Dynamic Process Capability Estimation and Evaluation Based on Multiple Subsamples
To evaluate production process capability when the model parameters are treated as random, a new method for estimating and evaluating process capability indices is proposed. Through statistical analysis of the structure of the quality-control model, the posterior distributions of the parameters under diffuse priors are derived, from which Bayesian point and interval estimates of the process capability index are constructed. On this basis, the posterior distribution of the model parameters from one stage is used as the parameter prior for the next stage, making full use of historical data, and a Bayesian dynamic evaluation model for the process capability index and its lower limit is established. The results show that, compared with existing Bayesian estimation methods for process capability indices, the Bayesian dynamic process capability index predicts more accurately and better reflects the actual capability of the production process.
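A minimal sketch of the dynamic updating scheme under a conjugate Normal-Inverse-Gamma model: each stage's posterior becomes the next stage's prior, and the accumulated posterior of σ² yields point and lower-limit estimates of Cp = (USL − LSL)/(6σ). The specification limits, subsample sizes, and nearly diffuse initial prior are assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(6)
USL, LSL = 10.6, 9.4                      # hypothetical specification limits

def update_nig(prior, x):
    """Conjugate Normal-Inverse-Gamma update for (mu, sigma^2)."""
    m0, k0, a0, b0 = prior
    n, xbar = len(x), x.mean()
    kn = k0 + n
    mn = (k0 * m0 + n * xbar) / kn
    an = a0 + n / 2.0
    bn = b0 + 0.5 * ((x - xbar) ** 2).sum() + k0 * n * (xbar - m0) ** 2 / (2 * kn)
    return (mn, kn, an, bn)

# Nearly diffuse initial prior; each stage's posterior becomes the next
# stage's prior, accumulating the historical subsample information.
prior = (10.0, 1e-3, 1e-3, 1e-3)
for stage in range(4):
    x = rng.normal(10.0, 0.15, size=25)   # one stage's subsample
    prior = update_nig(prior, x)

mn, kn, an, bn = prior
sigma2 = bn / rng.gamma(an, 1.0, size=20000)   # sigma^2 ~ InvGamma(an, bn)
cp = (USL - LSL) / (6.0 * np.sqrt(sigma2))
print("Cp posterior mean:", cp.mean().round(3))
print("Cp 5th percentile (lower limit):", np.percentile(cp, 5).round(3))
```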

7.
Keisuke Himoto, Risk Analysis, 2020, 40(6): 1124–1138
Post-earthquake fires are high-consequence events with extensive damage potential. They are also low-frequency events, so their nature remains underinvestigated. One difficulty in modeling post-earthquake ignition probabilities is reducing the model uncertainty attributed to the scarce source data. The data scarcity problem has been resolved by pooling the data indiscriminately collected from multiple earthquakes. However, this approach neglects the inter-earthquake heterogeneity in the regional and seasonal characteristics, which is indispensable for risk assessment of future post-earthquake fires. Thus, the present study analyzes the post-earthquake ignition probabilities of five major earthquakes in Japan from 1995 to 2016 (1995 Kobe, 2003 Tokachi-oki, 2004 Niigata–Chuetsu, 2011 Tohoku, and 2016 Kumamoto earthquakes) by a hierarchical Bayesian approach. As the ignition causes of earthquakes share a certain commonality, common prior distributions were assigned to the parameters, and samples were drawn from the target posterior distribution of the parameters by a Markov chain Monte Carlo simulation. The results of the hierarchical model were comparatively analyzed with those of pooled and independent models. Although the pooled and hierarchical models were both robust in comparison with the independent model, the pooled model underestimated the ignition probabilities of earthquakes with few data samples. Among the tested models, the hierarchical model was least affected by the source-to-source variability in the data. The heterogeneity of post-earthquake ignitions with different regional and seasonal characteristics has long been desired in the modeling of post-earthquake ignition probabilities but has not been properly considered in the existing approaches. The presented hierarchical Bayesian approach provides a systematic and rational framework to effectively cope with this problem, which consequently enhances the statistical reliability and stability of estimating post-earthquake ignition probabilities.
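The hierarchical structure can be sketched with a logit-normal partial-pooling model: each earthquake gets its own ignition probability, tied together by a common normal prior whose hyperparameters are learned from the data, sampled here with a simple random-walk Metropolis. The counts, hyperpriors, and tuning constants are hypothetical stand-ins for the paper's five-earthquake data set and its MCMC setup.

```python
import numpy as np
from scipy.stats import binom, norm

rng = np.random.default_rng(7)

# Hypothetical (ignitions, exposed buildings) for five earthquakes --
# stand-ins for the Kobe ... Kumamoto data in the paper.
y = np.array([60, 8, 5, 110, 15])
n = np.array([400000, 90000, 70000, 900000, 200000])
K = len(y)

def log_post(th):
    theta, mu, log_tau = th[:K], th[K], th[K + 1]
    tau = np.exp(log_tau)
    p = 1.0 / (1.0 + np.exp(-theta))                 # logit link
    return (binom.logpmf(y, n, p).sum()
            + norm.logpdf(theta, mu, tau).sum()      # common prior ties quakes together
            + norm.logpdf(mu, -9.0, 3.0)             # weak hyperpriors (assumed)
            + norm.logpdf(log_tau, 0.0, 1.0))

cur = np.concatenate([np.full(K, -9.0), [-9.0, 0.0]])
cur_lp = log_post(cur)
keep = []
for i in range(60000):
    prop = cur + 0.05 * rng.standard_normal(K + 2)
    lp = log_post(prop)
    if np.log(rng.uniform()) < lp - cur_lp:
        cur, cur_lp = prop, lp
    if i > 20000:
        keep.append(cur.copy())
keep = np.array(keep)
p_hat = 1.0 / (1.0 + np.exp(-keep[:, :K]))
print("posterior mean ignition prob. per quake:", p_hat.mean(axis=0))
```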

8.
For fractional factorial experiments with non-normal responses, when the number of factors in a screening experiment is large, a Bayesian variable and model selection method based on generalized linear models (GLMs) is proposed. First, an empirical Bayes prior is chosen to address the uncertainty in the model parameters. Second, a binary indicator variable is attached to each variable in the linear predictor of the GLM, and a mapping between variable indicators and model indicators is established. The posterior probabilities of the variable and model indicators are then used to identify significant factors and select the best model. Finally, a real industrial case shows that the method effectively identifies significant factors in fractional factorial experiments with non-normal responses.
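A rough sketch of indicator-based selection for a non-normal response, with one simplification flagged up front: instead of the paper's empirical Bayes priors and posterior sampling of indicators, each of the 2³ indicator vectors is scored by a BIC approximation to its marginal likelihood, from which posterior model and variable-inclusion probabilities follow. The small two-level design in three factors and its Poisson response are synthetic.

```python
import itertools
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(8)

# Synthetic full 2^3 design with a Poisson (non-normal) response;
# only factors A and C are truly active.
Xf = np.array(list(itertools.product([-1, 1], repeat=3)), dtype=float)
A, B, C = Xf[:, 0], Xf[:, 1], Xf[:, 2]
y = rng.poisson(np.exp(1.0 + 0.8 * A - 0.6 * C))

factors = {"A": A, "B": B, "C": C}
post = {}
for gamma in itertools.product([0, 1], repeat=3):          # indicator vectors
    cols = [v for g, v in zip(gamma, factors.values()) if g]
    X = sm.add_constant(np.column_stack(cols)) if cols else np.ones((len(y), 1))
    res = sm.GLM(y, X, family=sm.families.Poisson()).fit()
    bic = -2 * res.llf + X.shape[1] * np.log(len(y))
    post[gamma] = np.exp(-bic / 2)          # BIC approx. to marginal likelihood

Z = sum(post.values())
incl = {name: sum(w for g, w in post.items() if g[i]) / Z
        for i, name in enumerate(factors)}
print("posterior inclusion probabilities:", {k: round(v, 2) for k, v in incl.items()})
print("best model (indicator vector):", max(post, key=post.get))
```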

9.
Bayesian Forecasting via Deterministic Model
Rational decision making requires that the total uncertainty about a variate of interest (a predictand) be quantified in terms of a probability distribution, conditional on all available information and knowledge. Suppose the state of knowledge is embodied in a deterministic model, which is imperfect and outputs only an estimate of the predictand. The fundamentals of two Bayesian methods for producing a probabilistic forecast via any deterministic model are presented. The Bayesian Processor of Forecast (BPF) quantifies the total uncertainty in terms of a posterior distribution, conditional on the model output. The Bayesian Forecasting System (BFS) decomposes the total uncertainty into input uncertainty and model uncertainty, which are characterized independently and then integrated into a predictive distribution. The BFS is compared with Monte Carlo simulation and the ensemble forecasting technique; none of these can alone produce a probabilistic forecast that quantifies the total uncertainty, but each can serve as a component of the BFS.
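For the normal-linear special case, the BPF update has a closed form. The sketch below assumes a normal prior for the predictand W and a linear-normal likelihood for the model output X given W, both with invented parameter values, and returns the posterior that quantifies the total uncertainty.

```python
import numpy as np

# Normal-linear Bayesian Processor of Forecast (BPF): prior W ~ N(M, S2)
# for the predictand; deterministic model output X related to W through
# X | W = w ~ N(a + b*w, s2), with (a, b, s2) estimated from past (w, x) pairs.
M, S2 = 20.0, 16.0          # hypothetical prior (e.g., climatology)
a, b, s2 = 2.0, 0.9, 4.0    # hypothetical likelihood parameters

def bpf_posterior(x):
    """Posterior of the predictand given model output x (conjugate update)."""
    prec = 1.0 / S2 + b * b / s2
    mean = (M / S2 + b * (x - a) / s2) / prec
    return mean, 1.0 / prec

mean, var = bpf_posterior(x=25.0)
print(f"posterior: N({mean:.2f}, {var:.2f})")
# The posterior variance is always below S2: the model output reduces,
# but never eliminates, the total uncertainty about the predictand.
```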

10.
U.S. Environmental Protection Agency benchmark doses for dichotomous cancer responses are often estimated using a multistage model based on a monotonic dose‐response assumption. To account for model uncertainty in the estimation process, several model averaging methods have been proposed for risk assessment. In this article, we extend the usual parameter space in the multistage model for monotonicity to allow for the possibility of a hormetic dose‐response relationship. Bayesian model averaging is used to estimate the benchmark dose and to provide posterior probabilities for monotonicity versus hormesis. Simulation studies show that the newly proposed method provides robust point and interval estimation of a benchmark dose in the presence or absence of hormesis. We also apply the method to two data sets on carcinogenic response of rats to 2,3,7,8‐tetrachlorodibenzo‐p‐dioxin.

11.
A simple and useful characterization of many predictive models is in terms of model structure and model parameters. Accordingly, uncertainties in model predictions arise from uncertainties in the values assumed by the model parameters (parameter uncertainty) and the uncertainties and errors associated with the structure of the model (model uncertainty). When assessing uncertainty one is interested in identifying, at some level of confidence, the range of possible and then probable values of the unknown of interest. All sources of uncertainty and variability need to be considered. Although parameter uncertainty assessment has been extensively discussed in the literature, model uncertainty is a relatively new topic of discussion by the scientific community, despite often being the major contributor to the overall uncertainty. This article describes a Bayesian methodology for the assessment of model uncertainties, where models are treated as sources of information on the unknown of interest. The general framework is then specialized for the case where models provide point estimates about a single‐valued unknown, and where information about models are available in form of homogeneous and nonhomogeneous performance data (pairs of experimental observations and model predictions). Several example applications for physical models used in fire risk analysis are also provided.
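A minimal sketch of the homogeneous-performance-data case, assuming a multiplicative lognormal error model: the log ratio of observation to prediction has a Student-t posterior for its mean under a diffuse prior, which yields an uncertain bias factor that can be applied to a new model prediction. The observation-prediction pairs are hypothetical, not the paper's fire-model data.

```python
import numpy as np

rng = np.random.default_rng(11)

# Hypothetical performance data: pairs of experimental observations y
# and model predictions m (e.g., peak gas temperature).
m = np.array([310.0, 450.0, 520.0, 610.0, 700.0])
y = np.array([340.0, 430.0, 560.0, 680.0, 720.0])

# Multiplicative error model y = B * m * eps: log B is the systematic
# bias, eps lognormal scatter; on the log scale everything is normal.
d = np.log(y) - np.log(m)
n, dbar, s2 = len(d), d.mean(), d.var(ddof=1)

# Posterior of log B under a diffuse prior: Student-t with n-1 dof.
draws = dbar + np.sqrt(s2 / n) * rng.standard_t(n - 1, size=20000)
B = np.exp(draws)
print("bias factor B, mean and 95%% CrI: %.3f (%.3f, %.3f)"
      % (B.mean(), *np.percentile(B, [2.5, 97.5])))

# Treating the model as an information source: a new prediction is
# corrected by the uncertain bias to give a posterior for the unknown.
m_new = 500.0
unknown = m_new * B
print("corrected prediction 95% interval:",
      np.percentile(unknown, [2.5, 97.5]).round(1))
```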

12.
Regulatory agencies often perform microbial risk assessments to evaluate the change in the number of human illnesses as the result of a new policy that reduces the level of contamination in the food supply. These agencies generally have regulatory authority over the production and retail sectors of the farm‐to‐table continuum. Any predicted change in contamination that results from new policy that regulates production practices occurs many steps prior to consumption of the product. This study proposes a framework for conducting microbial food‐safety risk assessments; this framework can be used to quantitatively assess the annual effects of national regulatory policies. Advantages of the framework are that estimates of human illnesses are consistent with national disease surveillance data (which are usually summarized on an annual basis) and some of the modeling steps that occur between production and consumption can be collapsed or eliminated. The framework leads to probabilistic models that include uncertainty and variability in critical input parameters; these models can be solved using a number of different Bayesian methods. The Bayesian synthesis method performs well for this application and generates posterior distributions of parameters that are relevant to assessing the effect of implementing a new policy. An example, based on Campylobacter and chicken, estimates the annual number of illnesses avoided by a hypothetical policy; this output could be used to assess the economic benefits of a new policy. Empirical validation of the policy effect is also examined by estimating the annual change in the numbers of illnesses observed via disease surveillance systems.

13.
Bayesian methods are presented for updating the uncertainty in the predictions of an integrated Environmental Health Risk Assessment (EHRA) model. The methods allow the estimation of posterior uncertainty distributions based on the observation of different model outputs along the chain of the linked assessment framework. Analytical equations are derived for the case of the multiplicative lognormal risk model where the sequential log outputs (log ambient concentration, log applied dose, log delivered dose, and log risk) are each normally distributed. Given observations of a log output made with a normally distributed measurement error, the posterior distributions of the log outputs remain normal, but with modified means and variances, and induced correlations between successive log outputs and log inputs. The analytical equations for forward and backward propagation of the updates are generally applicable to sums of normally distributed variables. The Bayesian Monte-Carlo (BMC) procedure is presented to provide an approximate, but more broadly applicable method for numerically updating uncertainty with concurrent backward and forward propagation. Illustrative examples, presented for the multiplicative lognormal model, demonstrate agreement between the analytical and BMC methods, and show how uncertainty updates can propagate through a linked EHRA. The Bayesian updating methods facilitate the pooling of knowledge encoded in predictive models with that transmitted by research outcomes (e.g., field measurements), and thereby support the practice of iterative risk assessment and value of information appraisals.
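The BMC procedure reduces to likelihood weighting in this setting: prior Monte Carlo samples of the lognormal chain are reweighted by the normal measurement likelihood of an observed log output, which updates the upstream (log concentration) and downstream (log risk) stages simultaneously. All distribution parameters below are illustrative.

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(13)
N = 200_000

# Prior samples of the multiplicative lognormal chain (on the log scale):
# log dose = log concentration + log intake factor,
# log risk = log dose + log potency.
log_conc = rng.normal(1.0, 0.6, N)
log_dose = log_conc + rng.normal(-2.0, 0.4, N)
log_risk = log_dose + rng.normal(-5.0, 0.5, N)

# A field measurement of log dose with normal measurement error sd = 0.3.
obs, err_sd = -1.2, 0.3
w = norm.pdf(obs, loc=log_dose, scale=err_sd)   # likelihood weights
w /= w.sum()

def post_mean_sd(x):
    m = np.sum(w * x)
    return m, np.sqrt(np.sum(w * (x - m) ** 2))

# The update propagates backward (to concentration) and forward (to risk).
for name, x in [("log conc", log_conc), ("log dose", log_dose),
                ("log risk", log_risk)]:
    print(name, "posterior mean/sd: %.3f / %.3f" % post_mean_sd(x))
```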

14.
Experimental animal studies often serve as the basis for predicting risk of adverse responses in humans exposed to occupational hazards. A statistical model is applied to exposure-response data, and this fitted model may be used to obtain estimates of the exposure associated with a specified level of adverse response. Unfortunately, a number of different statistical models are candidates for fitting the data and may result in wide-ranging estimates of risk. Bayesian model averaging (BMA) offers a strategy for addressing uncertainty in the selection of statistical models when generating risk estimates. This strategy is illustrated with two examples: applying the multistage model to cancer responses, and a second example where different quantal models are fit to kidney lesion data. BMA provides excess risk estimates or benchmark dose estimates that reflect model uncertainty.
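A condensed BMA sketch under stated assumptions: two candidate quantal models (logistic and a Weibull with background) are fit by maximum likelihood to synthetic dichotomous data, weighted by exp(−BIC/2) as a rough stand-in for posterior model probabilities, and the benchmark dose at 10% extra risk is model-averaged.

```python
import numpy as np
from scipy.optimize import minimize, brentq
from scipy.stats import binom

dose = np.array([0.0, 0.5, 1.0, 2.0, 4.0])
n    = np.array([50, 50, 50, 50, 50])
y    = np.array([2, 4, 8, 18, 33])          # illustrative quantal responses

expit = lambda t: 1.0 / (1.0 + np.exp(-t))
logistic = lambda d, th: expit(th[0] + th[1] * d)
weibull  = lambda d, th: expit(th[0]) + (1 - expit(th[0])) * (
    1.0 - np.exp(-np.exp(th[1]) * d ** np.exp(th[2])))

def neg_ll(pfun, th):
    p = np.clip(pfun(dose, th), 1e-9, 1 - 1e-9)
    return -binom.logpmf(y, n, p).sum()

models = {"logistic": (logistic, [-3.0, 1.0]),
          "weibull":  (weibull,  [-3.0, -1.0, 0.0])}
fits, bic = {}, {}
for name, (pfun, x0) in models.items():
    res = minimize(lambda th: neg_ll(pfun, th), x0, method="Nelder-Mead")
    fits[name] = res.x
    bic[name] = 2 * res.fun + len(x0) * np.log(n.sum())

w = {m: np.exp(-(bic[m] - min(bic.values())) / 2) for m in models}
Z = sum(w.values())
w = {m: v / Z for m, v in w.items()}

def bmd(pfun, th, ber=0.10):
    """Dose giving 10% extra risk over background."""
    p0 = pfun(0.0, th)
    return brentq(lambda d: (pfun(d, th) - p0) / (1 - p0) - ber, 1e-9, 50.0)

bmd_avg = sum(w[m] * bmd(models[m][0], fits[m]) for m in models)
print("model weights:", {m: round(v, 2) for m, v in w.items()})
print("model-averaged BMD10: %.3f" % bmd_avg)
```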

15.
Vulnerability of human beings exposed to a catastrophic disaster is affected by multiple factors that include hazard intensity, environment, and individual characteristics. The traditional approach to vulnerability assessment, based on the aggregate‐area method and unsupervised learning, cannot incorporate spatial information; thus, vulnerability can be only roughly assessed. In this article, we propose Bayesian network (BN) and spatial analysis techniques to mine spatial data sets to evaluate the vulnerability of human beings. In our approach, spatial analysis is leveraged to preprocess the data; for example, kernel density analysis (KDA) and accumulative road cost surface modeling (ARCSM) are employed to quantify the influence of geofeatures on vulnerability and relate such influence to spatial distance. The knowledge‐ and data‐based BN provides a consistent platform to integrate a variety of factors, including those extracted by KDA and ARCSM, to model vulnerability uncertainty. We also consider the model's uncertainty and use Bayesian model averaging and Occam's window to average the multiple models obtained by our approach for robust prediction of risk and vulnerability. We compare our approach with other probabilistic models in a case study of seismic risk and conclude that our approach is a good means of mining spatial data sets to evaluate vulnerability.

16.
Stakeholders making decisions in public health and world trade need improved estimations of the burden‐of‐illness of foodborne infectious diseases. In this article, we propose a Bayesian meta‐analysis or more precisely a Bayesian evidence synthesis to assess the burden‐of‐illness of campylobacteriosis in France. Using this case study, we investigate campylobacteriosis prevalence, as well as the probabilities of different events that guide the disease pathway, by (i) employing a Bayesian approach on French and foreign human studies (from active surveillance systems, laboratory surveys, physician surveys, epidemiological surveys, and so on) through the chain of events that occur during an episode of illness and (ii) including expert knowledge about this chain of events. We split the target population using an exhaustive and exclusive partition based on health status and the level of disease investigation. We assume an approximate multinomial model over this population partition. Thereby, each observed data set related to the partition brings information on the parameters of the multinomial model, improving burden‐of‐illness parameter estimates that can be deduced from the parameters of the basic multinomial model. This multinomial model serves as a core model to perform a Bayesian evidence synthesis. Expert knowledge is introduced by way of pseudo‐data. The result is a global estimation of the burden‐of‐illness parameters with their accompanying uncertainty.
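When a single data set covers the full partition, the core multinomial model admits a conjugate Dirichlet analysis, with expert knowledge entering as pseudo-counts. The sketch below uses that simplified case with hypothetical states and counts, whereas the paper synthesizes several partial data sets over a finer partition.

```python
import numpy as np

rng = np.random.default_rng(16)

# Exhaustive, exclusive partition of the population during one year
# (hypothetical states, coarser than the paper's partition).
states = ["not ill", "ill, no visit", "visit, unconfirmed", "lab-confirmed"]

# Observed multinomial counts from a survey covering the full partition.
survey = np.array([9600, 310, 70, 20])

# Expert knowledge enters as pseudo-data added to the Dirichlet prior.
prior  = np.array([1.0, 1.0, 1.0, 1.0])
pseudo = np.array([0.0, 30.0, 10.0, 5.0])   # hypothetical expert pseudo-counts

post = prior + pseudo + survey              # conjugate Dirichlet update
theta = rng.dirichlet(post, size=20000)

# Burden parameter: overall probability of illness (last three states).
burden = theta[:, 1:].sum(axis=1)
print("P(ill) mean: %.4f, 95%% CrI: (%.4f, %.4f)"
      % (burden.mean(), *np.percentile(burden, [2.5, 97.5])))
```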

17.
In this article, Bayesian networks are used to model semiconductor lifetime data obtained from a cyclic stress test system. The data of interest are a mixture of log‐normal distributions, representing two dominant physical failure mechanisms. Moreover, the data can be censored due to limited test resources. For a better understanding of the complex lifetime behavior, interactions between test settings, geometric designs, material properties, and physical parameters of the semiconductor device are modeled by a Bayesian network. Statistical toolboxes in MATLAB® have been extended and applied to find the best structure of the Bayesian network and to perform parameter learning. Due to censored observations, Markov chain Monte Carlo (MCMC) simulations are employed to determine the posterior distributions. For model selection, the automatic relevance determination (ARD) algorithm and goodness‐of‐fit criteria such as marginal likelihoods, Bayes factors, posterior predictive density distributions, and sum of squared errors of prediction (SSEP) are applied and evaluated. The results indicate that the application of Bayesian networks to semiconductor reliability provides useful information about the interactions between the significant covariates and serves as a reliable alternative to currently applied methods.
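The statistical core, a two-component lognormal mixture with right-censoring, can be sketched independently of the Bayesian network: the log-likelihood combines mixture densities for observed failures with mixture survival probabilities for censored units, maximized here over synthetic stress-test-like data.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import norm

rng = np.random.default_rng(17)

# Synthetic cycles-to-failure from two failure mechanisms, right-censored
# at the test budget -- a stand-in for the semiconductor stress-test data.
z = np.where(rng.uniform(size=300) < 0.6,
             rng.lognormal(8.0, 0.5, 300), rng.lognormal(10.0, 0.3, 300))
cens = 25000.0
t = np.minimum(z, cens)
delta = (z < cens).astype(float)          # 1 = observed failure, 0 = censored

def nll(th):
    w = 1.0 / (1.0 + np.exp(-th[0]))      # mixture weight via logit
    m1, s1, m2, s2 = th[1], np.exp(th[2]), th[3], np.exp(th[4])
    x = np.log(t)
    pdf = (w * norm.pdf(x, m1, s1) + (1 - w) * norm.pdf(x, m2, s2)) / t
    sf  = w * norm.sf(x, m1, s1) + (1 - w) * norm.sf(x, m2, s2)
    # Observed failures contribute densities; censored units contribute
    # survival probabilities.
    return -(delta * np.log(pdf) + (1 - delta) * np.log(sf)).sum()

res = minimize(nll, x0=[0.0, 8.5, np.log(0.6), 9.5, np.log(0.4)],
               method="Nelder-Mead", options={"maxiter": 5000})
w = 1.0 / (1.0 + np.exp(-res.x[0]))
print("mixture weight: %.2f, log-means: %.2f / %.2f" % (w, res.x[1], res.x[3]))
```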

18.
This paper is concerned with the Bayesian estimation of nonlinear stochastic differential equations when observations are discretely sampled. The estimation framework relies on the introduction of latent auxiliary data to complete the missing diffusion between each pair of measurements. Tuned Markov chain Monte Carlo (MCMC) methods based on the Metropolis‐Hastings algorithm, in conjunction with the Euler‐Maruyama discretization scheme, are used to sample the posterior distribution of the latent data and the model parameters. Techniques for computing the likelihood function, the marginal likelihood, and diagnostic measures (all based on the MCMC output) are developed. Examples using simulated and real data are presented and discussed in detail.
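A stripped-down version of the estimation scheme, assuming observations fine enough that the Euler-Maruyama transition density is an adequate likelihood (the paper's latent-data augmentation handles the coarsely sampled case): random-walk Metropolis on the log-parameters of a simulated Ornstein-Uhlenbeck process.

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(18)

# Simulate an Ornstein-Uhlenbeck process dX = kappa*(mu - X) dt + sigma dW,
# observed on a fine grid so the Euler density is a fair approximation.
kappa, mu, sigma, dt, N = 2.0, 1.0, 0.5, 0.01, 1000
x = np.empty(N)
x[0] = 1.0
for i in range(1, N):
    x[i] = x[i-1] + kappa * (mu - x[i-1]) * dt \
           + sigma * np.sqrt(dt) * rng.standard_normal()

def log_lik(th):
    k, m, s = np.exp(th)                    # positivity via log-parameters
    mean = x[:-1] + k * (m - x[:-1]) * dt   # Euler-Maruyama transition mean
    return norm.logpdf(x[1:], mean, s * np.sqrt(dt)).sum()

cur = np.log([1.0, 0.5, 1.0])               # flat prior on the log scale
cur_lp = log_lik(cur)
draws = []
for _ in range(10000):
    prop = cur + 0.05 * rng.standard_normal(3)
    lp = log_lik(prop)
    if np.log(rng.uniform()) < lp - cur_lp:
        cur, cur_lp = prop, lp
    draws.append(np.exp(cur))
print("posterior means (kappa, mu, sigma):",
      np.array(draws[2000:]).mean(axis=0).round(3))
```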

19.
The recent decision of the U.S. Supreme Court on the regulation of CO2 emissions from new motor vehicles(1) shows the need for a robust methodology to evaluate the fraction of attributable risk from such emissions. The methodology must enable decisionmakers to reach practically relevant conclusions on the basis of expert assessments the decisionmakers see as an expression of research in progress, rather than as knowledge consolidated beyond any reasonable doubt.(2,3,4) This article presents such a methodology and demonstrates its use for the Alpine heat wave of 2003. In a Bayesian setting, different expert assessments on temperature trends and volatility can be formalized as probability distributions, with initial weights (priors) attached to them. By Bayesian learning, these weights can be adjusted in the light of data. The fraction of heat wave risk attributable to anthropogenic climate change can then be computed from the posterior distribution. We show that very different priors consistently lead to the result that anthropogenic climate change has contributed more than 90% to the probability of the Alpine summer heat wave in 2003. The present method can be extended to a wide range of applications where conclusions must be drawn from divergent assessments under uncertainty.
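A toy rendering of the attribution calculation: two expert assessments of the temperature record are formalized as normal models, their weights updated by Bayes' rule, and the fraction of attributable risk computed as FAR = 1 − p0/p1 for exceeding a heat-wave threshold. The temperatures, models, and threshold are all invented.

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(19)

# Hypothetical summer-mean temperature anomalies over 30 years.
years = np.arange(30)
temps = 0.04 * years + rng.normal(0.0, 0.5, 30)

# Two expert assessments formalized as models of the same record:
# M0: no trend, N(0, 0.5^2);  M1: warming trend 0.04 K/yr, N(trend, 0.5^2).
ll0 = norm.logpdf(temps, 0.0, 0.5).sum()
ll1 = norm.logpdf(temps, 0.04 * years, 0.5).sum()

prior = np.array([0.5, 0.5])                 # initial expert weights
post = prior * np.exp([ll0 - max(ll0, ll1), ll1 - max(ll0, ll1)])
post /= post.sum()                           # Bayesian learning of weights

# Fraction of attributable risk for exceeding a heat-wave threshold
# in the final year: FAR = 1 - p0 / p1.
thr = 1.0
p0 = norm.sf(thr, 0.0, 0.5)
p1 = norm.sf(thr, 0.04 * years[-1], 0.5)
print("posterior model weights:", post.round(3))
print("FAR = %.2f" % (1 - p0 / p1))
```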

20.
Research on the Optimal Abandonment Timing of Venture Capital Based on Venture Project Value
The ability to make abandonment decisions is one of the important determinants of a venture capital firm's long-term performance, yet existing abandonment-decision methods ignore the effect of information arriving during a venture project's development on project value, and hence distort the timing of abandonment. To address this problem, this paper first formulates the venture capitalist's signal-learning process in terms of Bayesian posterior probability estimation; it then uses dynamic programming to derive, given the optimal investment level for the project, an optimal abandonment-timing model based on venture project value; finally, a numerical example of the optimal abandonment-timing model is analyzed. The proposed optimal abandonment model accounts for the effect of newly released information on abandonment timing during the project's development and solves the optimal abandonment problem under staged investment, providing the venture capitalist with a basis for making timely abandonment decisions. The model can also be used to analyze the venture capitalist's optimal investment level in a venture project, providing a reference for optimal decisions. In addition, the analysis of the model offers a reasonable explanation for the adverse-selection behavior commonly exhibited by venture project founders.
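A minimal dynamic-programming sketch of the abandonment problem under an assumed Beta-Bernoulli signal-learning process: the venture capitalist's posterior success probability is updated after each stage's signal, and backward induction compares the value of continuing (paying the stage cost) against abandoning. The stage count, cost, terminal value, and prior are hypothetical.

```python
import numpy as np

# Staged venture: at each of T stages the VC pays c to continue and then
# observes a binary signal about project quality; quality has a Beta(a0, b0)
# prior, updated to Beta(a0 + s, b0 + t - s) after s good signals in t stages.
T, c, V = 4, 1.0, 12.0            # stages, stage cost, terminal project value
a0, b0 = 1.0, 1.0                 # hypothetical prior

def value(t, s):
    """Expected value of an optimally managed project at stage t with s
    good signals so far (backward induction / dynamic programming)."""
    p = (a0 + s) / (a0 + b0 + t)  # posterior mean success probability
    if t == T:
        return p * V              # terminal payoff
    cont = -c + p * value(t + 1, s + 1) + (1 - p) * value(t + 1, s)
    return max(0.0, cont)         # abandon (worth 0) vs continue

# Optimal abandonment rule: abandon at state (t, s) whenever continuing
# has non-positive expected value.
for t in range(T):
    rule = ["continue" if value(t, s) > 0 else "abandon" for s in range(t + 1)]
    print(f"stage {t}:", dict(zip(range(t + 1), rule)))
```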
