Similar Articles
20 similar articles found.
1.
Concern about the degree of uncertainty and potential conservatism in deterministic point estimates of risk has prompted researchers to turn increasingly to probabilistic methods for risk assessment. With Monte Carlo simulation techniques, distributions of risk reflecting uncertainty and/or variability are generated as an alternative. In this paper the compounding of conservatism between the level associated with point-estimate inputs selected from probability distributions and the level associated with the deterministic value of risk calculated from these inputs is explored. Two measures of compounded conservatism are compared and contrasted. The first measure, F, is defined as the ratio of the risk value R_d, calculated deterministically as a function of n inputs each at the j-th percentile of its probability distribution, to the risk value R_j that falls at the j-th percentile of the simulated risk distribution (i.e., F = R_d / R_j). The percentile of the simulated risk distribution that corresponds to the deterministic value R_d serves as a second measure of compounded conservatism. Analytical results for simple products of lognormal distributions are presented. In addition, several complex cases are treated numerically, using five simulation analyses from the literature as illustrations. Overall, there are cases in which conservatism compounds dramatically for deterministic point estimates of risk constructed from upper percentiles of input parameters, as well as cases in which the effect is less notable. The analytical and numerical techniques discussed are intended to help analysts explore the factors that influence the magnitude of compounded conservatism in specific cases.
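For risk modeled as a product of independent lognormal inputs, both measures described above have closed forms. The following sketch uses hypothetical log-scale means and standard deviations (not values from the paper) to compute F and the percentile at which the deterministic estimate falls:

```python
from math import exp, sqrt
from statistics import NormalDist

# Hedged sketch, hypothetical parameters: risk R = X1 * X2 * X3 with
# independent lognormal inputs, ln(X_i) ~ Normal(mu_i, sigma_i). Then
# ln(R) ~ Normal(sum(mu_i), sqrt(sum(sigma_i^2))), giving closed forms
# for both measures of compounded conservatism.
mus = [0.0, 0.0, 0.0]        # hypothetical log-scale means
sigmas = [1.0, 1.0, 1.0]     # hypothetical log-scale standard deviations
j = 0.95                     # percentile used for each point-estimate input
z = NormalDist().inv_cdf(j)

# Deterministic risk: every input at its own j-th percentile.
R_d = exp(sum(mus) + z * sum(sigmas))
# j-th percentile of the actual risk distribution.
R_j = exp(sum(mus) + z * sqrt(sum(s**2 for s in sigmas)))

F = R_d / R_j   # first measure: the ratio R_d / R_j
# Second measure: the percentile of the risk distribution where R_d falls.
p_of_Rd = NormalDist().cdf(z * sum(sigmas) / sqrt(sum(s**2 for s in sigmas)))
print(f"F = {F:.2f}; R_d sits at the {100 * p_of_Rd:.2f}th percentile")
```

With these made-up inputs, three 95th-percentile point estimates multiply into a risk value roughly eight times the true 95th percentile, landing near the 99.8th percentile of the simulated distribution.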

2.
Quantitative risk analysis (QRA) is a systematic approach for evaluating the likelihood, consequences, and risk of adverse events. QRA based on event tree analysis (ETA) and fault tree analysis (FTA) employs two basic assumptions. The first concerns the likelihood values of input events, and the second concerns interdependence among the events (for ETA) or basic events (for FTA). Traditionally, FTA and ETA both use crisp probabilities; to deal with uncertainties, probability distributions for the input event likelihoods are assumed instead. These probability distributions are often hard to come by, and even when available they are subject to incompleteness (partial ignorance) and imprecision. Furthermore, both FTA and ETA assume that events (or basic events) are independent. In practice, these two assumptions are often unrealistic. This article focuses on handling uncertainty in a QRA framework for a process system. Fuzzy set theory and evidence theory are used to describe the uncertainties in the input event likelihoods, and a method based on a dependency coefficient is used to express interdependencies of events (or basic events) in ETA and FTA. Two case studies demonstrate the approach.

3.
Probabilistic risk analysis (PRA) can be an effective tool to assess risks and uncertainties and to set priorities among safety policy options. Based on systems analysis and Bayesian probability, PRA has been applied to a wide range of cases, three of which are briefly presented here: the maintenance of the tiles of the space shuttle, the management of patient risk in anesthesia, and the choice of seismic provisions of building codes for the San Francisco Bay Area. In the quantification of a risk, a number of problems arise in the public sector where multiple stakeholders are involved. In this article, I describe different approaches to the treatment of uncertainties in risk analysis, their implications for risk ranking, and the role of risk analysis results in the context of a safety decision process. I also discuss the implications of adopting conservative hypotheses before proceeding to what is, in essence, a conditional uncertainty analysis, and I explore some implications of different levels of "conservatism" for the ranking of risk mitigation measures.

4.
Roger Cooke, Risk Analysis, 2010, 30(3): 330-339
The practice of uncertainty factors as applied to noncancer endpoints in the IRIS database harkens back to traditional safety factors. In the era before risk quantification, these were used to build in a "margin of safety." As risk quantification takes hold, safety-factor methods yield to quantitative risk calculations to guarantee safety. Many authors believe that uncertainty factors can be given a probabilistic interpretation as ratios of response rates, and that the reference values computed according to the IRIS methodology can thus be converted to random variables whose distributions can be computed with Monte Carlo methods, based on the distributions of the uncertainty factors. Recent proposals from the National Research Council echo this view. Based on probabilistic arguments, several authors claim that the current practice of uncertainty factors is overprotective. When interpreted probabilistically, however, uncertainty factors entail very strong assumptions on the underlying response rates. For example, the factor for extrapolating from animal to human is the same whether the dosage is chronic or subchronic. Together with independence assumptions, these assumptions entail that the covariance matrix of the logged response rates is singular. In other words, the accumulated assumptions entail a log-linear dependence between the response rates. This in turn means that any uncertainty analysis based on these assumptions is ill-conditioned; it effectively computes uncertainty conditional on a set of zero probability. The practice of uncertainty factors is due for a thorough review. Two directions are briefly sketched, one based on standard regression models and one based on nonparametric continuous Bayesian belief nets.
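The singularity claim can be checked numerically. In this hedged sketch (hypothetical lognormal response rates, not IRIS data), assuming the animal-to-human factor is identical for chronic and subchronic dosing forces an exact linear relation among the logged rates, so one linear combination of them has essentially zero variance and the covariance matrix is singular:

```python
import random
from statistics import variance

# Hedged sketch with hypothetical logged response rates. Probabilistic
# uncertainty factors assume the animal-to-human ratio is the same for
# chronic and subchronic dosing, i.e. for every draw (in logs):
#   h_chronic - a_chronic = h_subchronic - a_subchronic
random.seed(0)
n = 10_000
a_sub = [random.gauss(0.0, 1.0) for _ in range(n)]   # animal, subchronic
a_chr = [random.gauss(0.5, 1.0) for _ in range(n)]   # animal, chronic
h_sub = [random.gauss(1.0, 1.0) for _ in range(n)]   # human, subchronic
# human chronic is then fully determined -- a log-linear dependence:
h_chr = [hs + (ac - a) for hs, ac, a in zip(h_sub, a_chr, a_sub)]

# The combination (+1, -1, -1, +1) of the logged rates has (numerically)
# zero variance, so the 4x4 covariance matrix of the logged rates has a
# nonzero null vector and is singular:
combo = [hc - hs - ac + a for hc, hs, ac, a in zip(h_chr, h_sub, a_chr, a_sub)]
print(variance(combo))   # ~0 (floating-point noise only)
print(variance(h_chr))   # an ordinary, nonzero variance by contrast
```

Any Monte Carlo "distribution" built on these assumptions is therefore supported on a measure-zero subspace, which is the ill-conditioning the abstract describes.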

5.
A Note on Compounded Conservatism
Compounded conservatism (or "creeping safety") describes the impact of using conservative, upper-bound estimates of the values of multiple input variates to obtain a conservative estimate of risk modeled as an increasing function of those variates. In a simple multiplicative model of risk, for example, if upper p-fractile (100p-th percentile) values are used for each of several statistically independent input variates, the resulting risk estimate will be the upper p'-fractile of risk predicted according to that multiplicative model, where p' > p. The amount of compounded conservatism reflected by the difference between p' and p may be substantial, depending on the number of inputs, their relative uncertainties, and the value of p selected. Particular numerical examples of compounded conservatism are often cited, but an analytic approach may better serve to conceptualize and communicate its potential quantitative impact. This note briefly outlines such an approach and illustrates its application to the case of risk modeled as a product of lognormally distributed inputs.

6.
In a quantitative model with uncertain inputs, the uncertainty of the output can be summarized by a risk measure. We propose a sensitivity analysis method based on derivatives of the output risk measure, in the direction of model inputs. This produces a global sensitivity measure, explicitly linking sensitivity and uncertainty analyses. We focus on the case of distortion risk measures, defined as weighted averages of output percentiles, and prove a representation of the sensitivity measure that can be evaluated on a Monte Carlo sample, as a weighted average of gradients over the input space. When the analytical model is unknown or hard to work with, nonparametric techniques are used for gradient estimation. This process is demonstrated through the example of a nonlinear insurance loss model. Furthermore, the proposed framework is extended in order to measure sensitivity to constant model parameters, uncertain statistical parameters, and random factors driving dependence between model inputs.
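A minimal illustration of a distortion risk measure evaluated on a Monte Carlo sample: CVaR at level alpha puts equal weight on the top (1 - alpha) order statistics of the output and zero weight elsewhere. The loss model and all parameters below are hypothetical, not the article's insurance example:

```python
import random
from math import exp

# Hedged sketch, hypothetical loss model: a distortion risk measure is a
# weighted average of output percentiles. CVaR at level alpha weights the
# top (1 - alpha) order statistics equally.
random.seed(1)
n, alpha = 100_000, 0.95
losses = sorted(
    exp(random.gauss(0.0, 0.5)) * max(0.0, random.gauss(1.0, 1.0))
    for _ in range(n))                     # hypothetical nonlinear loss

k = int(n * alpha)
var_95 = losses[k]                         # the 95th-percentile loss (VaR)
cvar_95 = sum(losses[k:]) / (n - k)        # mean of the worst 5% of outcomes
print(f"VaR_95 = {var_95:.3f}, CVaR_95 = {cvar_95:.3f}")
```

Derivative-based sensitivities of such a measure can then be approximated by perturbing an input and re-evaluating the same weighted average, in the spirit of the representation the article proves.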

7.
Currently, there is a trend away from the use of single (often conservative) estimates of risk to summarize the results of risk analyses, in favor of stochastic methods that provide a more complete characterization of risk. The use of such stochastic methods leads to a distribution of possible values of risk, taking into account both uncertainty and variability in all of the factors affecting risk. In this article, we propose a general framework for the analysis of uncertainty and variability in the commonly encountered case of multiplicative risk models, in which risk may be expressed as a product of two or more risk factors. Our analytical methods facilitate the evaluation of overall uncertainty and variability in risk assessment, as well as the contributions of individual risk factors to each, which is cumbersome to evaluate using Monte Carlo methods. The use of these methods is illustrated in the analysis of potential cancer risks due to the ingestion of radon in drinking water.

8.
Considering uncertainty in upstream production and downstream demand, this paper studies the optimal design of a production-distribution network consisting of plants, distribution centers, and end markets. Upstream production uncertainty is modeled with two states, failure and no failure; downstream demand uncertainty is modeled with three states: low, medium, and high. Because production failures may yield defective products, whether to implement product inspection at the upstream production stage is also considered. Combining network operating costs with the performance risk induced by uncertainty, a two-stage stochastic programming model of the production-distribution network is built on the mean-conditional value-at-risk (CVaR) criterion, characterized by the decision maker's risk-aversion level and pessimism coefficient. In particular, to cope with the very large scenario set created by the many candidate network nodes, a scenario-reduction technique is used to screen scenarios and ease the computational burden of solving the model. Finally, numerical experiments analyze how the relevant parameters affect network performance and trace the Pareto-efficient frontier that trades off the two objectives of expected cost and conditional value-at-risk. A regression-based design of experiments further tests how strongly the decision maker's risk-aversion level and pessimism coefficient affect the performance of the designed production-distribution network. The results show that the pessimism coefficient has a greater effect on network performance than the decision maker's degree of risk aversion.

9.
Exposure guidelines for potentially toxic substances are often based on a reference dose (RfD) that is determined by dividing a no-observed-adverse-effect level (NOAEL), lowest-observed-adverse-effect level (LOAEL), or benchmark dose (BD) corresponding to a low level of risk by a product of uncertainty factors. The uncertainty factors for animal-to-human extrapolation, variable sensitivities among humans, extrapolation from measured subchronic effects to unknown results for chronic exposures, and extrapolation from a LOAEL to a NOAEL can be thought of as random variables that vary from chemical to chemical. Selected databases that provide distributions across chemicals of inter- and intraspecies effects, ratios of LOAELs to NOAELs, and differences between acute and chronic effects are examined to illustrate the determination of percentiles for uncertainty factors. The distributions of uncertainty factors tend to be approximately lognormal. The logarithm of the product of independent uncertainty factors is therefore approximately distributed as a sum of normally distributed variables, making it possible to estimate percentiles for the product. Hence, the product of uncertainty factors can be sized to provide adequate safety for a large percentage (e.g., approximately 95%) of RfDs. For the databases used to describe the distributions of uncertainty factors, values of 10 appear to be reasonable and conservative. For the databases examined, the following simple "Rule of 3s" is suggested, which exceeds the estimated 95th percentile of the product of uncertainty factors: if only a single uncertainty factor is required, use 33; for any two uncertainty factors, use 3 x 33, or approximately 100; for any three, use a combined factor of 3 x 100 = 300; and if all four are needed, use a total factor of 3 x 300 = 900. If roughly the 99th percentile is desired, apply another factor of 3. An additional factor may be needed for inadequate data, or a modifying factor for other uncertainties (e.g., different routes of exposure) not covered above.
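Because the log of a product of independent lognormal factors is a sum of normals, the percentile arithmetic above can be sketched directly. The geometric mean and log-scale spread below are hypothetical choices, not values fitted to the databases discussed:

```python
from math import exp, log, sqrt
from statistics import NormalDist

# Hedged sketch with hypothetical parameters: each uncertainty factor UF_i
# is lognormal, so log(UF_1 * ... * UF_k) is a sum of normals and any
# percentile of the product follows in closed form.
z95 = NormalDist().inv_cdf(0.95)
gm = 3.0                       # hypothetical geometric mean of each factor
sigma = log(10) / z95          # chosen so each factor's own 95th pct. is 30
k = 4                          # number of uncertainty factors applied

# multiplying each factor's individual 95th percentile...
naive = (gm * exp(z95 * sigma)) ** k
# ...versus the actual 95th percentile of the product:
true95 = gm**k * exp(z95 * sqrt(k) * sigma)
print(f"product of 95th pcts. = {naive:.0f}; 95th pct. of product = {true95:.0f}")
```

With these made-up numbers the product of the individual 95th percentiles overshoots the product's true 95th percentile by a factor of 100; this is the same effect that lets a combined factor far smaller than the naive product still cover roughly 95% of cases.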

10.
Peng Tao, Huang Fuguang, Sun Lingxia, 《管理科学》 (Journal of Management Science), 2021, 24(3): 98-114
Economic policy uncertainty is introduced into a decision model of venture capital, showing theoretically that economic policy uncertainty reduces risk taking both directly and indirectly through its negative effect on venture capital exit performance. Using matched data on Chinese economic policy uncertainty and venture capital from 1996 to 2016, the empirical results support the theoretical predictions. When economic policy uncertainty is high, risk taking is significantly lower, reflected in a smaller share of investments in early-stage and high-tech firms. Exit performance mediates the relationship between economic policy uncertainty and risk taking: when economic policy uncertainty is high, venture capital completes fewer successful exits via IPO or M&A, faces longer exit horizons, and earns lower exit returns, and therefore takes less risk. The study suggests that, to steer venture capital toward early-stage high-tech firms, the government should maintain the stability and continuity of relevant policies in addition to supporting and rewarding venture capital through fiscal, tax, and other policies.

11.
This study examines how government safety regulations affect the uncertainty of work-related road accident loss (UWRAL) by considering the multi-identity of local governments in the relationship among the central government, the local governments, and enterprises. Fixed effects panel models and mediation analyses with bootstrapping were conducted to test the hypotheses using Chinese provincial panel data from 2008 to 2014. Given the complexity and nonlinear characteristics of road safety systems, a new approach based on self-organized criticality theory is proposed to measure the uncertainty of road accident loss from a complex system perspective. We find that a regional government with detailed safety work planning (SWP), high safety supervision intensity (SSI), and safety information transparency (SIT) can decrease the UWRAL. Furthermore, our findings suggest that SSI and SIT partially mediate the relationship between the SWP of regional governments and the UWRAL, with 19.7% and 23.6% indirect effects, respectively. This study also provides the government with managerial implications by linking the results of risk assessment to decision making for risk management.

12.
In pest risk assessment it is frequently necessary to make management decisions regarding emerging threats under severe uncertainty. Although risk maps provide useful decision support for invasive alien species, they rarely address knowledge gaps associated with the underlying risk model or how they may change the risk estimates. Failure to recognize uncertainty leads to risk-ignorant decisions and miscalculation of expected impacts as well as the costs required to minimize these impacts. Here we use the information gap concept to evaluate the robustness of risk maps to uncertainties in key assumptions about an invading organism. We generate risk maps with a spatial model of invasion that simulates potential entries of an invasive pest via international marine shipments, their spread through a landscape, and establishment on a susceptible host. In particular, we focus on the question of how much uncertainty in risk model assumptions can be tolerated before the risk map loses its value. We outline this approach with an example of a forest pest recently detected in North America, Sirex noctilio Fabricius. The results provide a spatial representation of the robustness of predictions of S. noctilio invasion risk to uncertainty and show major geographic hotspots where the consideration of uncertainty in model parameters may change management decisions about a new invasive pest. We then illustrate how the dependency between the extent of uncertainties and the degree of robustness of a risk map can be used to select a surveillance network design that is most robust to knowledge gaps about the pest.

13.
Biomagnification of organochlorine and other persistent organic contaminants by higher trophic level organisms represents one of the most significant sources of uncertainty and variability in evaluating potential risks associated with disposal of dredged materials. While it is important to distinguish between population variability (e.g., true population heterogeneity in fish weight and lipid content) and uncertainty (e.g., measurement error), the two can be operationally difficult to define separately in probabilistic estimates of human health and ecological risk. We propose a disaggregation of uncertain and variable parameters based on: (1) availability of supporting data; (2) the specific management and regulatory context (in this case, the U.S. Army Corps of Engineers/U.S. Environmental Protection Agency tiered approach to dredged material management); and (3) professional judgment and experience in conducting probabilistic risk assessments. We describe and quantitatively evaluate several sources of uncertainty and variability in estimating risk to human health from trophic transfer of polychlorinated biphenyls (PCBs), using a case study of sediments obtained from the New York-New Jersey Harbor that are being evaluated for disposal at an open-water offshore disposal site within the northeast region. The estimates of PCB concentrations in fish and of dietary doses of PCBs to humans ingesting fish are expressed as distributions of values, of which the arithmetic mean or mode represents a particular fractile. The distribution of risk values is obtained using a food chain biomagnification model developed by Gobas, by specifying distributions for input parameters disaggregated to represent either uncertainty or variability. Only those sources of uncertainty that could be quantified were included in the analysis. Results for several different two-dimensional Latin Hypercube analyses are provided to evaluate the influence of the uncertain versus variable disaggregation of model parameters. The analysis suggests that variability in human exposure parameters is greater than the uncertainty bounds on any particular fractile, given the described assumptions.
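The uncertainty/variability disaggregation can be sketched as a nested ("two-dimensional") Monte Carlo, a simple stand-in for the two-dimensional Latin Hypercube analyses described above. All distributions and parameters here are hypothetical:

```python
import random

# Hedged sketch, all distributions hypothetical: the outer loop draws
# uncertain parameters; the inner loop draws variable ones, so each outer
# draw yields a full variability distribution.
random.seed(2)
outer, inner = 200, 500
fractile_of_mean = []

for _ in range(outer):
    # uncertainty: an imperfectly known biomagnification factor (log scale)
    log_bmf = random.gauss(1.0, 0.3)
    # variability: person-to-person differences, e.g. fish ingestion rate
    doses = sorted(
        random.lognormvariate(log_bmf, 0.8) * random.lognormvariate(-1.0, 0.5)
        for _ in range(inner))
    mean_dose = sum(doses) / inner
    # record where the mean dose falls within the variability distribution
    fractile_of_mean.append(sum(d <= mean_dose for d in doses) / inner)

lo, hi = min(fractile_of_mean), max(fractile_of_mean)
print(f"mean dose falls between the {100*lo:.0f}th and {100*hi:.0f}th "
      f"variability fractiles, depending on the uncertainty draw")
```

Separating the loops is what lets an analysis report, as this abstract does, how variability fractiles move under parameter uncertainty rather than mixing the two into one distribution.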

14.
15.
A probabilistic model (SHEDS-Wood) was developed to examine children's exposure and dose to chromated copper arsenate (CCA)-treated wood, as described in Part 1 of this two-part article. This Part 2 article discusses sensitivity and uncertainty analyses conducted to assess the key model inputs and areas of needed research for children's exposure to CCA-treated playsets and decks. The following types of analyses were conducted: (1) sensitivity analyses using a percentile scaling approach and multiple stepwise regression; and (2) uncertainty analyses using the bootstrap and two-stage Monte Carlo techniques. The five most important variables, based on both sensitivity and uncertainty analyses, were: wood surface residue-to-skin transfer efficiency; wood surface residue levels; fraction of hand surface area mouthed per mouthing event; average fraction of nonresidential outdoor time a child plays on/around CCA-treated public playsets; and frequency of hand washing. In general, parameter uncertainty produced a spread of roughly a factor of 8 in predicted population dose estimates at the 5th and 95th percentiles, and a factor of 4 at the 50th percentile. Data were available for most of the key model inputs identified with sensitivity and uncertainty analyses; however, there were few or no data for some key inputs. To evaluate and improve the accuracy of model results, future measurement studies should obtain longitudinal time-activity diary information on children, spatial and temporal measurements of residue and soil concentrations on or near CCA-treated playsets and decks, and key exposure factors. Future studies should also address other sources of uncertainty in addition to parameter uncertainty, such as scenario and model uncertainty.

16.
Most public health risk assessments assume and combine a series of average, conservative, and worst-case values to derive a conservative point estimate of risk. This procedure has major limitations. This paper demonstrates a new methodology for extended uncertainty analyses in public health risk assessments using Monte Carlo techniques. The extended method begins, as some conventional methods do, with the preparation of a spreadsheet to estimate exposure and risk. This method, however, continues by modeling key inputs as random variables described by probability density functions (PDFs). Overall, the technique provides a quantitative way to estimate the probability distributions for exposure and health risks within the validity of the model used. As an example, this paper presents a simplified case study of children playing in soils contaminated with benzene and benzo(a)pyrene (BaP).
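A minimal sketch of the extended method, assuming a toy multiplicative exposure model with hypothetical input distributions (not the paper's benzene/BaP parameters):

```python
import random

# Hedged sketch of the extended method: the same quantities a deterministic
# spreadsheet would hold, but with key inputs as random variables. Every
# number below is a hypothetical placeholder, not a value from the case study.
random.seed(3)
N = 50_000
SF = 0.1                          # hypothetical slope factor, (mg/kg-day)^-1

def risk_draw():
    conc = random.lognormvariate(1.0, 0.8)      # soil concentration, mg/kg
    intake = random.lognormvariate(-3.0, 0.5)   # soil ingested, kg/day
    bw = max(random.gauss(20.0, 3.0), 5.0)      # child body weight, kg
    return conc * intake * SF / bw              # simple multiplicative risk

risks = sorted(risk_draw() for _ in range(N))
p50 = risks[N // 2]
p95 = risks[int(0.95 * N)]
print(f"median risk = {p50:.2e}, 95th percentile = {p95:.2e}")
```

Instead of a single conservative number, the output is a distribution from which any percentile of exposure or risk can be read off.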

17.
Risk and uncertainty are integral parts of modern technology, and they must be managed effectively to allow the development of reliable, high-quality products. Because so many facets of technology and society involve risk and uncertainty, it is essential that risk management be handled in a systematic manner. Fault-tree analysis is one of the principal methods used in the analysis of systems' safety. Its detailed and systematic deductive structure makes it a valuable tool for design and diagnostic purposes. Point probability and the minimization of the expected failure probability have, until recently, dominated fault-tree analysis. A methodology that incorporates uncertainty analysis, conditional expected risk, and multiple objectives with fault-tree analysis is presented. A computer software package termed the "Distribution Analyzer and Risk Evaluator (DARE) Using Fault Trees," which translates the new methodology into a working decision-support system, is developed. DARE Using Fault Trees is a flexible computer code capable of analyzing the risk of the overall system in terms of the probability density function of failure probability. Emphasis is placed on the uncertainty and risk of extreme events. A comparative study between existing codes for fault-tree analysis and DARE demonstrates the strengths of the methodology. A case study of NASA's solid rocket booster is used to perform the comparative analysis.
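The core step, propagating uncertain basic-event probabilities through a fault tree to a distribution for the top-event probability, can be sketched on a toy tree. The tree (TOP = A OR (B AND C)) and all numbers are hypothetical, not DARE's implementation:

```python
import random

# Hedged sketch of moving from a point probability to a distribution of the
# top-event failure probability. Toy tree: TOP = A OR (B AND C), with
# independent basic events; all probabilities are hypothetical.
random.seed(4)

def top_event(pa, pb, pc):
    # P(TOP) = 1 - (1 - pa) * (1 - pb * pc) for independent basic events
    return 1 - (1 - pa) * (1 - pb * pc)

point = top_event(1e-3, 1e-2, 5e-2)   # classical point-probability estimate

# uncertainty analysis: hypothetical lognormal error factors on each event
n = 20_000
samples = sorted(
    top_event(1e-3 * random.lognormvariate(0.0, 0.5),
              1e-2 * random.lognormvariate(0.0, 0.5),
              5e-2 * random.lognormvariate(0.0, 0.5))
    for _ in range(n))
p95 = samples[int(0.95 * n)]
print(f"point estimate = {point:.2e}; 95th percentile = {p95:.2e}")
```

The resulting sample approximates the probability density function of the failure probability, the quantity this abstract says the methodology emphasizes for extreme events.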

18.
Researchers in judgment and decision making have long debunked the idea that we are economically rational optimizers. However, problematic assumptions of rationality remain common in studies of agricultural economics and climate change adaptation, especially those that involve quantitative models. Recent movement toward more complex agent-based modeling provides an opportunity to reconsider the empirical basis for farmer decision making. Here, we reconceptualize farmer decision making from the ground up, using an in situ mental models approach to analyze weather and climate risk management. We assess how large-scale commercial grain farmers in South Africa (n = 90) coordinate decisions about weather, climate variability, and climate change with those around other environmental, agronomic, economic, political, and personal risks that they manage every day. Contrary to common simplifying assumptions, we show that these farmers tend to satisfice rather than optimize as they face intractable and multifaceted uncertainty; they make imperfect use of limited information; they are differently averse to different risks; they make decisions on multiple time horizons; they are cautious in responding to changing conditions; and their diverse risk perceptions contribute to important differences in individual behaviors. We find that they use two important nonoptimizing strategies, which we call cognitive thresholds and hazy hedging, to make practical decisions under pervasive uncertainty. These strategies, evident in farmers' simultaneous use of conservation agriculture and livestock to manage weather risks, are the messy in situ performance of naturalistic decision-making techniques. These results may inform continued research on such behavioral tendencies in narrower lab- and modeling-based studies.

19.
This article explores the use of an approach for setting default values for the noncancer toxicity, developed as part of the Threshold of Toxicological Concern (TTC), for the evaluation of the chronic noncarcinogenic effects of certain chemical mixtures. Individuals are exposed to many mixtures where there are little or no toxicological data on some or all of the mixture components. The approach developed in the TTC can provide a basis for conservative estimates of the toxicity of the mixture components when compound-specific data are not available. The application of this approach to multiple chemicals in a mixture, however, has implications for the statistical assumptions made in developing component-based estimates of mixtures. Specifically, conservative assumptions that are appropriate for one compound may become overly conservative when applied to all components of a mixture. This overestimation can be investigated by modeling the uncertainty in toxicity standards. In this article the approach is applied to both hypothetical and actual examples of chemical mixtures and the potential for overestimation is investigated. The results indicate that the use of the approach leads to conservative estimates of mixture toxicity and therefore its use is most appropriate for screening assessments of mixtures.

20.
Regional estimates of cryptosporidiosis risks from drinking water exposure were developed and validated, accounting for AIDS status and age. We constructed a model with probability distributions and point estimates representing Cryptosporidium in tap water and tap water consumed per day (exposure characterization); dose response, illness given infection, and prolonged illness given illness; and three conditional probabilities describing the likelihood of case detection by active surveillance (health effects characterization). The model predictions were combined with population data to derive expected case numbers and incidence rates per 100,000 population, by age and AIDS status, borough specific and for New York City overall in 2000 (risk characterization). They were compared with same-year surveillance data, assumed to represent the true incidence of waterborne cryptosporidiosis, to evaluate predictive ability. The predicted mean risks, similar to previously published estimates for this region, overpredicted observed incidence, most extensively when accounting for AIDS status. The results suggest that overprediction may be due to conservative parameters applied to both non-AIDS and AIDS populations, and that biological differences for children need to be incorporated. Interpretations are limited by the unknown accuracy of available surveillance data, in addition to variability and uncertainty of model predictions. The model appears sensitive to geographical differences in AIDS prevalence. The use of surveillance data for validation and model parameters pertinent to susceptibility are discussed.
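The exposure-to-cases chain described above can be sketched with a commonly used exponential dose-response form; every parameter below is a hypothetical placeholder, not a value from the model being validated:

```python
import random
from math import exp

# Hedged sketch of the exposure -> infection -> illness -> expected-cases
# chain, using an exponential dose-response form. All parameters are
# hypothetical placeholders.
random.seed(5)
N = 100_000
POPULATION = 8_000_000            # hypothetical citywide population

total_p_ill = 0.0
for _ in range(N):
    conc = random.lognormvariate(-6.0, 1.0)     # oocysts per liter of tap water
    liters = random.lognormvariate(0.0, 0.4)    # liters consumed per day
    p_inf_day = 1 - exp(-0.004 * conc * liters) # hypothetical dose-response slope
    p_inf_year = 1 - (1 - p_inf_day) ** 365     # annualized infection risk
    total_p_ill += 0.5 * p_inf_year             # hypothetical P(illness | infection)

expected_cases = total_p_ill / N * POPULATION
incidence_per_100k = 100_000 * expected_cases / POPULATION
print(f"expected annual cases = {expected_cases:.0f} "
      f"({incidence_per_100k:.1f} per 100,000)")
```

Comparing such a predicted incidence against same-year surveillance counts is the validation step the abstract describes; conservative choices anywhere along the chain push the prediction above observed incidence.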

