20 similar documents retrieved.
1.
Estimation of uncertainties associated with model predictions is an important component of the application of environmental and biological models. "Traditional" methods for propagating uncertainty, such as standard Monte Carlo and Latin Hypercube Sampling, however, often require performing a prohibitive number of model simulations, especially for complex, computationally intensive models. Here, a computationally efficient method for uncertainty propagation, the Stochastic Response Surface Method (SRSM), is coupled with another method, the Automatic Differentiation of FORTRAN (ADIFOR). The SRSM is based on series expansions of model inputs and outputs in terms of a set of "well-behaved" standard random variables. The ADIFOR method is used to transform the model code into one that calculates the derivatives of the model outputs with respect to inputs or transformed inputs. The calculated model outputs and the derivatives at a set of sample points are used to approximate the unknown coefficients in the series expansions of outputs. A framework for the coupling of the SRSM and ADIFOR is developed and presented here. Two case studies are presented, involving (1) a physiologically based pharmacokinetic model for perchloroethylene for humans, and (2) an atmospheric photochemical model, the Reactive Plume Model. The results obtained agree closely with those of traditional Monte Carlo and Latin hypercube sampling methods, while reducing the required number of model simulations by about two orders of magnitude.
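The response-surface idea above can be illustrated with a minimal sketch (this is not the authors' SRSM/ADIFOR implementation, which also exploits code-level derivatives): run a hypothetical expensive model at a few sample points of a standard normal variable, fit a low-order polynomial surrogate, and propagate the input uncertainty through the surrogate instead of the full model. The toy model and input distribution below are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

def expensive_model(x):
    """Stand-in for a costly simulation with a single uncertain input."""
    return np.log1p(x) + 0.1 * x**2

# Uncertain input expressed through a standard normal variable z:
# X = exp(mu + sigma * z), i.e., a lognormal input (values are invented).
mu, sigma = 0.5, 0.3

def to_input(z):
    return np.exp(mu + sigma * z)

# 1. Run the expensive model at a handful of sample points of z.
z_design = np.array([-2.0, -1.0, 0.0, 1.0, 2.0])
y_design = expensive_model(to_input(z_design))

# 2. Fit a low-order polynomial "response surface" in z (the surrogate expansion).
coeffs = np.polyfit(z_design, y_design, deg=3)

# 3. Propagate input uncertainty through the cheap surrogate instead of the model.
z_mc = rng.standard_normal(100_000)
y_surrogate = np.polyval(coeffs, z_mc)

# Reference: brute-force Monte Carlo on the expensive model itself.
y_mc = expensive_model(to_input(z_mc))

print("surrogate   mean/std:", y_surrogate.mean().round(4), y_surrogate.std().round(4))
print("Monte Carlo mean/std:", y_mc.mean().round(4), y_mc.std().round(4))
```

For a smooth model like this one, the surrogate's output moments track the brute-force Monte Carlo estimates closely even though only five full model runs were used, which is the source of the savings in simulations reported above.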
2.
Propagation of Uncertainty in Risk Assessments: The Need to Distinguish Between Uncertainty Due to Lack of Knowledge and Uncertainty Due to Variability. Total citations: 16 (self-citations: 0; by others: 16)
In quantitative uncertainty analysis, it is essential to define rigorously the endpoint or target of the assessment. Two distinctly different approaches using Monte Carlo methods are discussed: (1) the endpoint is a fixed but unknown value (e.g., the maximally exposed individual, the average individual, or a specific individual) or (2) the endpoint is an unknown distribution of values (e.g., the variability of exposures among unspecified individuals in the population). In the first case, values are sampled at random from distributions representing various "degrees of belief" about the unknown "fixed" values of the parameters to produce a distribution of model results. The distribution of model results represents a subjective confidence statement about the true but unknown assessment endpoint. The important input parameters are those that contribute most to the spread in the distribution of the model results. In the second case, Monte Carlo calculations are performed in two dimensions, producing numerous alternative representations of the true but unknown distribution. These alternative distributions permit subjective confidence statements to be made from two perspectives: (1) for the individual exposure occurring at a specified fractile of the distribution or (2) for the fractile of the distribution associated with a specified level of individual exposure. The relative importance of input parameters will depend on the fractile or exposure level of interest. The quantification of uncertainty for the simulation of a true but unknown distribution of values represents the state of the art in assessment modeling.
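A minimal two-dimensional Monte Carlo sketch of the second approach, with invented distributions: an outer loop samples "degree-of-belief" distributions for the unknown parameters (uncertainty), and an inner loop samples individuals from the resulting exposure distribution (variability), so that confidence statements can be made about any fractile of the unknown distribution.

```python
import numpy as np

rng = np.random.default_rng(1)

n_uncertainty = 500    # outer loop: alternative "degrees of belief" about the parameters
n_variability = 2000   # inner loop: individuals in the population

# Epistemic uncertainty about the parameters of a lognormal exposure distribution
# (all numbers are invented for illustration).
mu_belief = rng.normal(loc=1.0, scale=0.2, size=n_uncertainty)
sigma_belief = rng.uniform(0.3, 0.6, size=n_uncertainty)

# Each outer draw yields one alternative representation of the
# true-but-unknown distribution of exposures among individuals.
p95 = np.empty(n_uncertainty)
for i in range(n_uncertainty):
    exposures = rng.lognormal(mean=mu_belief[i], sigma=sigma_belief[i], size=n_variability)
    p95[i] = np.percentile(exposures, 95)   # variability fractile of interest

# Subjective confidence statement about the 95th-percentile exposure.
lo, hi = np.percentile(p95, [5, 95])
print(f"90% credible interval for the 95th-percentile exposure: [{lo:.2f}, {hi:.2f}]")
```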
3.
Bayesian Methods for Model Uncertainty Analysis with Application to Future Sea Level Rise. Total citations: 1 (self-citations: 0; by others: 1)
This paper addresses the use of data for identifying and characterizing uncertainties in model parameters and predictions. The Bayesian Monte Carlo method is formally presented and elaborated, and applied to the analysis of the uncertainty in a predictive model for global mean sea level change. The method uses observations of output variables, made with an assumed error structure, to determine a posterior distribution of model outputs. This is used to derive a posterior distribution for the model parameters. Results demonstrate the resolution of the uncertainty that is obtained as a result of the Bayesian analysis and also indicate the key contributors to the uncertainty in the sea level rise model. While the technique is illustrated with a simple, preliminary model, the analysis provides an iterative framework for model refinement. The methodology developed in this paper provides a mechanism for the incorporation of ongoing data collection and research in decision-making for problems involving uncertain environmental change.
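A likelihood-weighting sketch of the Bayesian Monte Carlo idea, using a deliberately trivial stand-in for the sea level model and invented numbers: prior parameter samples are weighted by the likelihood of an observed output under an assumed Gaussian error structure, yielding posterior distributions for both outputs and parameters.

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(2)

def sea_level_model(theta, t=50.0):
    """Deliberately trivial stand-in: sea level change after t years for sensitivity theta."""
    return theta * t

# Prior samples of the uncertain model parameter (invented prior).
theta_prior = rng.normal(loc=0.3, scale=0.1, size=20_000)
y_prior = sea_level_model(theta_prior)

# One observation of the output, with an assumed Gaussian measurement-error structure.
y_obs, obs_sd = 12.0, 3.0
weights = norm.pdf(y_obs, loc=y_prior, scale=obs_sd)
weights /= weights.sum()

# Posterior summaries by likelihood weighting.
theta_mean = np.sum(weights * theta_prior)
theta_sd = np.sqrt(np.sum(weights * (theta_prior - theta_mean) ** 2))
print(f"posterior parameter: {theta_mean:.3f} +/- {theta_sd:.3f}")

# Resample with the weights to obtain an explicit posterior sample.
idx = rng.choice(theta_prior.size, size=5000, p=weights)
theta_post = theta_prior[idx]
print("90% posterior interval:", np.percentile(theta_post, [5, 95]).round(3))
```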
4.
Methods for Uncertainty Analysis: A Comparative Survey. Total citations: 1 (self-citations: 0; by others: 1)
This paper presents a survey and comparative evaluation of methods which have been developed for the determination of uncertainties in accident consequences and probabilities, for use in probabilistic risk assessment. The methods considered are: analytic techniques, Monte Carlo simulation, response surface approaches, differential sensitivity techniques, and evaluation of classical statistical confidence bounds. It is concluded that only the response surface and differential sensitivity approaches are sufficiently general and flexible for use as overall methods of uncertainty analysis in probabilistic risk assessment. The other methods considered, however, are very useful in particular problems.
5.
Previous research on mass customization (MC) has focused on what it is and how it is implemented. In this study we examine when MC is an appropriate strategy for firms to follow by scrutinizing the effects of three environmental uncertainty variables (demand uncertainty, competitive intensity, and supply chain complexity) on the MC–performance relationship. Specifically, we distinguish the direct effect of environmental uncertainty on MC ability and the moderation effect of environmental uncertainty on MC ability's impact on customer satisfaction. We examine six competing hypotheses using data collected from 266 manufacturing plants. Our results show that competitive intensity has a direct positive impact on MC ability. However, demand uncertainty moderates the relationship between MC ability and customer satisfaction, and the direct and positive relationship between MC ability and customer satisfaction holds only when customer demand is highly uncertain. Supply chain complexity neither has a direct relationship with MC, nor moderates the MC–performance relationship. Implications of these research findings are discussed and future research directions are identified.
6.
Methods to Approximate Joint Uncertainty and Variability in Risk. Total citations: 3 (self-citations: 0; by others: 3)
Kenneth T. Bogen, Risk Analysis, 1995, 15(3): 411–419
As interest in quantitative analysis of joint uncertainty and interindividual variability (JUV) in risk grows, so does the need for related computational shortcuts. To quantify JUV in risk, Monte Carlo methods typically require nested sampling of JUV in distributed inputs, which is cumbersome and time-consuming. Two approximation methods proposed here allow simpler and more rapid analysis. The first consists of new upper-bound JUV estimators that involve only uncertainty or variability, not both, and so never require nested sampling to calculate. The second is a discrete-probability-calculus procedure that uses only the mean and one upper-tail mean for each input in order to estimate mean and upper-bound risk, a procedure that is simpler and more intuitive than similar ones in use. Application of these methods is illustrated in an assessment of cancer risk from residential exposures to chloroform in the Kanawha Valley, West Virginia. Because each of the multiple exposure pathways considered in this assessment had separate modeled sources of uncertainty and variability, the assessment illustrates a realistic case where a standard Monte Carlo approach to JUV analysis requires nested sampling. In the illustration, the first proposed method quantified JUV in cancer risk much more efficiently than corresponding nested Monte Carlo calculations. The second proposed method also nearly duplicated JUV-related and other estimates of risk obtained using Monte Carlo methods. Both methods were thus found adequate to obtain basic risk estimates accounting for JUV in a realistically complex risk assessment. These methods make routine JUV analysis more convenient and practical.
7.
A sequence of linear, monotonic, and nonmonotonic test problems is used to illustrate sampling-based uncertainty and sensitivity analysis procedures. Uncertainty results obtained with replicated random and Latin hypercube samples are compared, with the Latin hypercube samples tending to produce more stable results than the random samples. Sensitivity results obtained with the following procedures and/or measures are illustrated and compared: correlation coefficients (CCs), rank correlation coefficients (RCCs), common means (CMNs), common locations (CLs), common medians (CMDs), statistical independence (SI), standardized regression coefficients (SRCs), partial correlation coefficients (PCCs), standardized rank regression coefficients (SRRCs), partial rank correlation coefficients (PRCCs), stepwise regression analysis with raw and rank-transformed data, and examination of scatter plots. The effectiveness of a given procedure and/or measure depends on the characteristics of the individual test problems, with (1) linear measures (i.e., CCs, PCCs, SRCs) performing well on the linear test problems, (2) measures based on rank transforms (i.e., RCCs, PRCCs, SRRCs) performing well on the monotonic test problems, and (3) measures predicated on searches for nonrandom patterns (i.e., CMNs, CLs, CMDs, SI) performing well on the nonmonotonic test problems.
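A small sketch of the sampling-based approach (Latin hypercube sampling implemented by hand; the three-input test function is invented): it generates an LHS design, evaluates a monotonic test function, and reports both ordinary and rank correlation coefficients for each input, the two families of measures compared in the abstract.

```python
import numpy as np

rng = np.random.default_rng(3)
n, k = 200, 3

def latin_hypercube(n, k, rng):
    """Basic Latin hypercube sample on [0, 1]^k: one point per stratum in each dimension."""
    u = (rng.random((n, k)) + np.arange(n)[:, None]) / n
    for j in range(k):
        u[:, j] = rng.permutation(u[:, j])
    return u

def ranks(a):
    """Rank-transform a 1-D array (no ties expected for continuous samples)."""
    return a.argsort().argsort()

x = latin_hypercube(n, k, rng)

# A monotonic, nonlinear test function of three inputs (invented).
y = np.exp(2 * x[:, 0]) + 5 * x[:, 1] ** 3 + 0.5 * x[:, 2]

for j in range(k):
    cc = np.corrcoef(x[:, j], y)[0, 1]                   # correlation coefficient (CC)
    rcc = np.corrcoef(ranks(x[:, j]), ranks(y))[0, 1]    # rank correlation coefficient (RCC)
    print(f"x{j + 1}:  CC = {cc:+.2f}   RCC = {rcc:+.2f}")
```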
8.
A Monte Carlo method is presented to study the effect of systematic and random errors on computer models mainly dealing with experimental data. It is a common assumption in these types of models (linear and nonlinear regression, and nonregression computer models) involving experimental measurements that the error sources are mainly random and independent, with no constant background errors (systematic errors). However, comparisons of different experimental data sources often reveal evidence of significant bias or calibration errors. The uncertainty analysis approach presented in this work is based on the analysis of cumulative probability distributions for output variables of the models involved, taking into account the effect of both types of errors. The probability distributions are obtained by performing Monte Carlo simulation coupled with appropriate definitions for the random and systematic errors. The main objectives are to detect the error source with stochastic dominance on the uncertainty propagation and the combined effect on output variables of the models. The results from the case studies analyzed show that the approach is able to distinguish which error type has a more significant effect on the performance of the model. It was also found that systematic or calibration errors, if present, cannot be neglected in uncertainty analysis of models dependent on experimental measurements such as chemical and physical properties. The approach can be used to facilitate decision making in fields related to safety factor selection, modeling, experimental data measurement, and experimental design.
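A toy sketch of the distinction drawn above, with invented numbers: random errors are drawn independently for every measurement, while a systematic (calibration) error is drawn once and applied to all measurements in a given realization. Comparing the resulting output distributions shows why a systematic component, if present, cannot simply be averaged away.

```python
import numpy as np

rng = np.random.default_rng(4)
n_trials, n_meas = 5000, 10

true_value = 100.0   # the "true" quantity being measured (invented)
random_sd = 2.0      # random, independent measurement error
bias_sd = 3.0        # possible systematic (calibration) error of the instrument

out_random_only = np.empty(n_trials)
out_with_bias = np.empty(n_trials)

for i in range(n_trials):
    random_err = rng.normal(0.0, random_sd, size=n_meas)   # new error for every measurement
    bias = rng.normal(0.0, bias_sd)                        # one constant offset per realization
    out_random_only[i] = np.mean(true_value + random_err)  # model output = averaged measurement
    out_with_bias[i] = np.mean(true_value + random_err + bias)

# Random errors shrink when averaged over n_meas; the systematic component does not.
print("random errors only : output std =", out_random_only.std().round(2))
print("random + systematic: output std =", out_with_bias.std().round(2))
```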
9.
Branden B. Johnson, Risk Analysis, 2003, 23(4): 781–789
Given the prevalence of uncertainty and variability in estimates of environmental health risks, it is important to know how citizens interpret information representing uncertainty in risk estimates. Ranges of risk estimates from a hypothetical industry source elicited divergent evaluations of risk assessors' honesty and competence among New Jersey residents within one mile of one or more factories. A plurality saw ranges of risk estimates as both honest and competent, but most judged such ranges deficient on one or both dimensions. Respondents wanted definitive conclusions about safety, tended to believe the high end of the range was more likely to be an accurate estimate of the risk, and believed that institutions only discuss risks when they are "high." Acknowledgment of scientific, as opposed to self-interested, reasons for uncertainty and disputes among experts was low. Attitude toward local industry seemed associated with, if not a cause of, attitudes about ranges of risk estimates. These reactions by industry neighbors appear to replicate the findings of Johnson and Slovic (1995, 1998), despite the hypothetical producer of risk estimates being industry instead of government. Respondents were older and less educated on average than were the earlier samples, but more diverse. Regression analyses suggested attitude toward industry was a major factor in these reactions, although other explanations (e.g., level of scientific understanding independent of general education) were not tested in this study.
10.
Variability arises due to differences in the value of a quantity among different members of a population. Uncertainty arises due to lack of knowledge regarding the true value of a quantity for a given member of a population. We describe and evaluate two methods for quantifying both variability and uncertainty. These methods, bootstrap simulation and a likelihood-based method, are applied to three datasets. The datasets include a synthetic sample of 19 values from a lognormal distribution, a sample of nine values obtained from measurements of the PCB concentration in leafy produce, and a sample of five values for the partitioning of chromium in the flue gas desulfurization system of coal-fired power plants. For each of these datasets, we employ the two methods to characterize uncertainty in the arithmetic mean and standard deviation, cumulative distribution functions based upon fitted parametric distributions, the 95th percentile of variability, and the 63rd percentile of uncertainty for the 81st percentile of variability. The latter is intended to show that it is possible to describe any point within the uncertain frequency distribution by specifying an uncertainty percentile and a variability percentile. Using the bootstrap method, we compare results based upon use of the method of matching moments and the method of maximum likelihood for fitting distributions to data. Our results indicate that with only 5–19 data points, as in the datasets we have evaluated, there is substantial uncertainty due to random sampling error. Both the bootstrap and likelihood-based approaches yield comparable uncertainty estimates in most cases.
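A compact sketch of the bootstrap branch of the analysis, with invented data: each bootstrap resample of a small dataset is refit with a lognormal distribution, and the spread of the refitted statistics quantifies uncertainty in the mean and in the 95th percentile of variability. This is only the generic bootstrap idea, not the authors' exact procedure or datasets.

```python
import numpy as np

rng = np.random.default_rng(5)

# A small measured dataset (values invented, e.g., concentrations in mg/kg).
data = np.array([0.8, 1.1, 1.5, 2.3, 3.0, 0.6, 1.9, 4.2, 1.2])
n_boot = 5000

means, p95s = [], []
for _ in range(n_boot):
    resample = rng.choice(data, size=data.size, replace=True)
    log_r = np.log(resample)
    mu_hat, sigma_hat = log_r.mean(), log_r.std(ddof=1)   # lognormal fit in log space
    means.append(np.exp(mu_hat + 0.5 * sigma_hat**2))     # arithmetic mean of the fitted lognormal
    p95s.append(np.exp(mu_hat + 1.645 * sigma_hat))       # 95th percentile of variability

print("90% uncertainty interval for the mean:           ", np.percentile(means, [5, 95]).round(2))
print("90% uncertainty interval for the 95th percentile:", np.percentile(p95s, [5, 95]).round(2))
```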
11.
Brian Veitch, Risk Analysis, 2011, 31(1): 86–107
Quantitative risk analysis (QRA) is a systematic approach for evaluating the likelihood, consequences, and risk of adverse events. QRA based on event tree analysis (ETA) and fault tree analysis (FTA) employs two basic assumptions. The first concerns the likelihood values of input events, and the second concerns interdependence among the events (for ETA) or basic events (for FTA). Traditionally, FTA and ETA both use crisp probabilities; however, to deal with uncertainties, probability distributions of input event likelihoods are assumed. These probability distributions are often hard to come by and, even if available, are subject to incompleteness (partial ignorance) and imprecision. Furthermore, both FTA and ETA assume that events (or basic events) are independent. In practice, these two assumptions are often unrealistic. This article focuses on handling uncertainty in a QRA framework of a process system. Fuzzy set theory and evidence theory are used to describe the uncertainties in the input event likelihoods. A method based on a dependency coefficient is used to express interdependencies of events (or basic events) in ETA and FTA. To demonstrate the approach, two case studies are discussed.
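The fuzzy part of such an approach can be sketched with triangular fuzzy probabilities propagated through AND/OR gates by alpha-cut interval arithmetic. The sketch below assumes independent events (it does not implement the article's dependency coefficient) and uses invented numbers.

```python
# Triangular fuzzy probabilities (low, mode, high) for two basic events (invented values).
p_A = (0.010, 0.020, 0.050)
p_B = (0.001, 0.005, 0.010)

def alpha_cut(tri, alpha):
    """Interval [lo, hi] of a triangular fuzzy number at membership level alpha."""
    lo, mode, hi = tri
    return lo + alpha * (mode - lo), hi - alpha * (hi - mode)

def gate_and(a, b):
    """AND gate for independent events: interval product (monotone, so endpoints suffice)."""
    return a[0] * b[0], a[1] * b[1]

def gate_or(a, b):
    """OR gate for independent events: 1 - (1 - pA)(1 - pB), also monotone."""
    return 1 - (1 - a[0]) * (1 - b[0]), 1 - (1 - a[1]) * (1 - b[1])

for alpha in (0.0, 0.5, 1.0):
    a, b = alpha_cut(p_A, alpha), alpha_cut(p_B, alpha)
    top_and, top_or = gate_and(a, b), gate_or(a, b)
    print(f"alpha={alpha:.1f}  AND: [{top_and[0]:.2e}, {top_and[1]:.2e}]"
          f"  OR: [{top_or[0]:.4f}, {top_or[1]:.4f}]")
```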
12.
An auditor gives a going concern uncertainty opinion when the client company is at risk of failure or exhibits other signs of distress that threaten its ability to continue as a going concern. The decision to issue a going concern opinion is an unstructured task that requires the use of the auditor's judgment. In cases where judgment is required, the auditor may benefit from the use of statistical analysis or other forms of decision models to support the final decision. This study uses the generalized reduced gradient (GRG2) optimizer for neural network learning, a backpropagation neural network, and a logit model to predict which firms would receive audit reports reflecting a going concern uncertainty modification. The GRG2 optimizer has previously been used as a more efficient optimizer for solving business problems. The neural network model formulated using GRG2 has the highest prediction accuracy, at 95 percent. It performs best when tested with a small number of variables on a group of data sets, each containing 70 observations. While the logit procedure fails to converge when using our eight-variable model, the GRG2-based neural network analysis provides consistent results using either eight- or four-variable models. The GRG2-based neural network is proposed as a robust alternative model for auditors to support their assessment of going concern uncertainty affecting the client company.
13.
A Combined Monte Carlo and Possibilistic Approach to Uncertainty Propagation in Event Tree Analysis. Total citations: 1 (self-citations: 0; by others: 1)
In risk analysis, the treatment of the epistemic uncertainty associated with the probability of occurrence of an event is fundamental. Traditionally, probabilistic distributions have been used to characterize the epistemic uncertainty due to imprecise knowledge of the parameters in risk models. On the other hand, it has been argued that in certain instances such uncertainty may be best accounted for by fuzzy or possibilistic distributions. This seems to be the case in particular for parameters for which the available information is scarce and of a qualitative nature. In practice, it is to be expected that a risk model contains some parameters affected by uncertainties that may be best represented by probability distributions and some other parameters that may be more properly described in terms of fuzzy or possibilistic distributions. In this article, a hybrid method that jointly propagates probabilistic and possibilistic uncertainties is considered and compared with pure probabilistic and pure fuzzy methods for uncertainty propagation. The analyses are carried out on a case study concerning the uncertainties in the probabilities of occurrence of accident sequences in an event tree analysis of a nuclear power plant.
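A stripped-down sketch of hybrid propagation, with invented numbers: a probabilistic parameter is sampled by Monte Carlo while a possibilistic parameter is represented by an alpha-cut interval, so every random sample yields an interval-valued output and hence lower and upper bounds on an exceedance probability. A full analysis would sweep alpha over [0, 1] and post-process the resulting possibility distributions; only one level is shown here.

```python
import numpy as np

rng = np.random.default_rng(6)

# Probabilistic parameter: an initiating-event frequency (per year), lognormal (invented).
freq = rng.lognormal(mean=np.log(1e-3), sigma=0.5, size=2000)

# Possibilistic parameter: a triangular possibility distribution for a conditional
# failure probability known only qualitatively (invented values).
tri = (0.05, 0.10, 0.30)

def alpha_cut(tri, alpha):
    lo, mode, hi = tri
    return lo + alpha * (mode - lo), hi - alpha * (hi - mode)

alpha = 0.5                      # one membership level; a full analysis sweeps alpha in [0, 1]
p_lo, p_hi = alpha_cut(tri, alpha)

# The output (accident-sequence frequency) is monotone in p, so each random
# sample of freq yields an interval of possible outputs.
seq_lo, seq_hi = freq * p_lo, freq * p_hi

threshold = 2e-4                 # frequency of concern (invented)
print("lower bound on P(exceedance):", np.mean(seq_lo > threshold).round(3))
print("upper bound on P(exceedance):", np.mean(seq_hi > threshold).round(3))
```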
14.
Job exposure matrices (JEMs) are used to measure exposures based on information about particular jobs and tasks. JEMs are especially useful when individual exposure data cannot be obtained. Nonetheless, there may be other workplace exposures associated with the study disease that are not measured in available JEMs. When these exposures are also associated with the exposures measured in the JEM, biases due to uncontrolled confounding will be introduced. Furthermore, individual exposures differ from JEM measurements due to differences in job conditions and worker practices. Uncertainty may also be present at the assessor level since exposure information for each job may be imprecise or incomplete. Assigning individuals a fixed exposure determined by the JEM ignores these uncertainty sources. We examine the uncertainty displayed by bias analyses in a study of occupational electric shocks, occupational magnetic fields, and amyotrophic lateral sclerosis.
15.
Human populations are exposed to environmental carcinogens in both indoor and outdoor atmospheres. Recent studies indicate that pollutant concentrations are generally higher in indoor atmospheres than in outdoor ones. Environmental pollutants that occur in indoor air from a variety of sources include radon, asbestos, organic and inorganic compounds, and certain particles (e.g., tobacco smoke). Some of the gases or vapors are adsorbed on suspended particulate matter, whereas others exist entirely in the gas phase or are distributed between the latter and a particle-bound state. Because of differences in chemical and physical properties, each class of carcinogens generally requires different sampling and analytical methods. In addition, a single indoor environment may contain a wide variety of air pollutants from different sources. Unfortunately, no single best approach currently exists for the quantitative determination of such complex mixtures and, for practical reasons, only the more toxic or the more abundant pollutants are usually measured. This paper summarizes the currently available monitoring methods for selected environmental pollutants found in indoor atmospheres. In addition, some possible sources for those pollutants are identified.
16.
John A. Edwards, Frank J. Snyder, Pamela M. Allen, Kevin A. Makinson, David M. Hamby, Risk Analysis, 2012, 32(12): 2055–2070
Previous research has shown that people err when making decisions aided by probability information. Surprisingly, there has been little exploration into the accuracy of decisions made based on many commonly used probabilistic display methods. Two experiments examined the ability of a comprehensive set of such methods to effectively communicate critical information to a decision maker and influence confidence in decision making. The second experiment investigated the performance of these methods under time pressure, a situational factor known to exacerbate judgmental errors. Ten commonly used graphical display methods were randomly assigned to participants. Across eight scenarios in which a probabilistic outcome was described, participants were asked questions regarding graph interpretation (e.g., mean) and made behavioral choices (i.e., act; do not act) based on the provided information. Results indicated that decision-maker accuracy differed by graphical method: error bars and boxplots led to the greatest mean-estimation and behavioral-choice accuracy, whereas complementary cumulative probability distribution functions were associated with the highest probability-estimation accuracy. Under time pressure, participant performance decreased when making behavioral choices.
17.
The Environmental Leadership Model (ELM) is presented, which specifies four prominent antecedents of the organization's environmental strategy formulation process. These factors include: (1) moral norms and values for environmental responsibility, (2) the environmental attitudes of the CEO and top management, (3) stakeholder influences, and (4) perceived behavioral control of the firm by technological, financial, and regulatory constraints. It is proposed that the environmental strategy of an organization directs its pro-environmental behaviors. Finally, the model is initially investigated in an organization in the hazardous waste management industry using the case methodology advocated by Eisenhardt (1989). Future research directions for model development are discussed.
18.
Updating Uncertainty in an Integrated Risk Assessment: Conceptual Framework and Methods. Total citations: 1 (self-citations: 0; by others: 1)
Bayesian methods are presented for updating the uncertainty in the predictions of an integrated Environmental Health Risk Assessment (EHRA) model. The methods allow the estimation of posterior uncertainty distributions based on the observation of different model outputs along the chain of the linked assessment framework. Analytical equations are derived for the case of the multiplicative lognormal risk model where the sequential log outputs (log ambient concentration, log applied dose, log delivered dose, and log risk) are each normally distributed. Given observations of a log output made with a normally distributed measurement error, the posterior distributions of the log outputs remain normal, but with modified means and variances, and induced correlations between successive log outputs and log inputs. The analytical equations for forward and backward propagation of the updates are generally applicable to sums of normally distributed variables. The Bayesian Monte Carlo (BMC) procedure is presented to provide an approximate, but more broadly applicable, method for numerically updating uncertainty with concurrent backward and forward propagation. Illustrative examples, presented for the multiplicative lognormal model, demonstrate agreement between the analytical and BMC methods, and show how uncertainty updates can propagate through a linked EHRA. The Bayesian updating methods facilitate the pooling of knowledge encoded in predictive models with that transmitted by research outcomes (e.g., field measurements), and thereby support the practice of iterative risk assessment and value-of-information appraisals.
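The kind of analytical update referred to above rests on the standard conjugate normal-normal result; in generic notation (not necessarily the paper's), a normally distributed log output observed with normal measurement error has a posterior that stays normal with a modified mean and a reduced variance:

```latex
% Conjugate normal update of a log output Y observed with normal measurement error.
% Prior:        Y ~ N(mu, sigma^2)
% Observation:  Z | Y ~ N(Y, tau^2)
\[
  Y \mid Z = z \;\sim\;
  N\!\left(
    \frac{\tau^{2}\mu + \sigma^{2}z}{\sigma^{2} + \tau^{2}},\;
    \frac{\sigma^{2}\tau^{2}}{\sigma^{2} + \tau^{2}}
  \right)
\]
```

Applying this update along a chain of summed normal log variables is what produces the modified means, variances, and induced correlations described in the abstract.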
19.
Mitchell J. Small, Risk Analysis, 2008, 28(5): 1289–1308
The distributional approach for uncertainty analysis in cancer risk assessment is reviewed and extended. The method considers a combination of bioassay study results, targeted experiments, and expert judgment regarding biological mechanisms to predict a probability distribution for uncertain cancer risks. Probabilities are assigned to alternative model components, including the determination of human carcinogenicity, mode of action, the dosimetry measure for exposure, the mathematical form of the dose-response relationship, the experimental data set(s) used to fit the relationship, and the formula used for interspecies extrapolation. Alternative software platforms for implementing the method are considered, including Bayesian belief networks (BBNs) that facilitate assignment of prior probabilities, specification of relationships among model components, and identification of all output nodes on the probability tree. The method is demonstrated using the application of Evans, Sielken, and co-workers for predicting cancer risk from formaldehyde inhalation exposure. Uncertainty distributions are derived for maximum likelihood estimate (MLE) and 95th percentile upper confidence limit (UCL) unit cancer risk estimates, and the effects of resolving selected model uncertainties on these distributions are demonstrated, considering both perfect and partial information for these model components. A method for synthesizing the results of multiple mechanistic studies is introduced, considering the assessed sensitivities and selectivities of the studies for their targeted effects. A highly simplified example is presented illustrating assessment of genotoxicity based on studies of DNA damage response caused by naphthalene and its metabolites. The approach can provide a formal mechanism for synthesizing multiple sources of information using a transparent and replicable weight-of-evidence procedure.
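A toy sketch of the distributional idea, with entirely hypothetical branch probabilities and unit-risk distributions: probabilities are assigned to alternative model components (here, only carcinogenicity and dose-response form), and Monte Carlo draws over those branches mix the conditional risk distributions into a single uncertainty distribution. A BBN implementation would encode the same tree with explicit nodes and conditional probability tables.

```python
import numpy as np

rng = np.random.default_rng(7)
n = 100_000

# Probabilities assigned to alternative model components (all values hypothetical).
p_carcinogen = 0.8           # P(agent is a human carcinogen)
p_linear_given_carc = 0.6    # P(linear dose-response form | carcinogen)

def unit_risk(form, size):
    """Conditional unit-risk uncertainty distribution for a given dose-response form."""
    if form == "linear":
        return rng.lognormal(mean=np.log(1e-5), sigma=0.8, size=size)
    return rng.lognormal(mean=np.log(1e-6), sigma=1.2, size=size)

is_carc = rng.random(n) < p_carcinogen
is_linear = rng.random(n) < p_linear_given_carc

risk = np.zeros(n)                        # the non-carcinogen branch contributes zero risk
mask_lin = is_carc & is_linear
mask_thr = is_carc & ~is_linear
risk[mask_lin] = unit_risk("linear", mask_lin.sum())
risk[mask_thr] = unit_risk("threshold", mask_thr.sum())

print(f"mean unit risk:                     {risk.mean():.2e}")
print(f"95th percentile of the uncertainty: {np.percentile(risk, 95):.2e}")
```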
20.
Kenneth T. Bogen, Risk Analysis, 2014, 34(10): 1795–1806
The National Research Council 2009 "Silver Book" panel report included a recommendation that the U.S. Environmental Protection Agency (EPA) should increase all of its chemical carcinogen (CC) potency estimates by ~7-fold to adjust for a purported median-vs.-mean bias that I recently argued does not exist (Bogen KT. "Does EPA underestimate cancer risks by ignoring susceptibility differences?," Risk Analysis, 2014; 34(10):1780–1784). In this issue of the journal, my argument is critiqued for having flaws concerning: (1) intent, bias, and conservatism of EPA estimates of CC potency; (2) bias in potency estimates derived from epidemiology; and (3) human-animal CC-potency correlation. However, my argument remains valid, for the following reasons. (1) EPA's default approach to estimating CC risks has correctly focused on bounding average (not median) individual risk under a genotoxic mode-of-action (MOA) assumption, although pragmatically the approach leaves both interindividual variability in CC susceptibility, and widely varying CC-specific magnitudes of fundamental MOA uncertainty, unquantified. (2) CC risk estimates based on large epidemiology studies are not systematically biased downward due to limited sampling from broad, lognormal susceptibility distributions. (3) A good, quantitative correlation is exhibited between upper bounds on CC-specific potency estimated from human vs. animal studies (n = 24, r = 0.88, p = 2 × 10−8). It is concluded that protective upper-bound estimates of individual CC risk that account for heterogeneity in susceptibility, as well as risk comparisons informed by best predictions of average-individual and population risk that address CC-specific MOA uncertainty, should each be used as separate, complementary tools to improve regulatory decisions concerning low-level, environmental CC exposures.