Similar Literature
20 similar documents found.
1.
A simple and useful characterization of many predictive models is in terms of model structure and model parameters. Accordingly, uncertainties in model predictions arise from uncertainties in the values assumed by the model parameters (parameter uncertainty) and the uncertainties and errors associated with the structure of the model (model uncertainty). When assessing uncertainty, one is interested in identifying, at some level of confidence, the range of possible and then probable values of the unknown of interest. All sources of uncertainty and variability need to be considered. Although parameter uncertainty assessment has been extensively discussed in the literature, model uncertainty is a relatively new topic of discussion within the scientific community, despite often being the major contributor to the overall uncertainty. This article describes a Bayesian methodology for the assessment of model uncertainties, where models are treated as sources of information on the unknown of interest. The general framework is then specialized for the case where models provide point estimates about a single-valued unknown, and where information about the models is available in the form of homogeneous and nonhomogeneous performance data (pairs of experimental observations and model predictions). Several example applications for physical models used in fire risk analysis are also provided.
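
As a concrete (and purely illustrative) sketch of how homogeneous performance data can inform model uncertainty, the Python fragment below fits a lognormal error distribution to observation/prediction ratios and propagates it onto a new point prediction. The data values and the lognormal error assumption are invented; this is a stand-in for the Bayesian treatment described in the abstract, not the authors' actual formulation.

    import numpy as np

    # Hypothetical homogeneous performance data: (observation, prediction) pairs.
    obs  = np.array([120.0,  95.0, 210.0, 160.0,  80.0])
    pred = np.array([100.0, 110.0, 180.0, 150.0,  70.0])

    # Treat the log of the observation/prediction ratio as the model error.
    log_err = np.log(obs / pred)
    bias, spread = log_err.mean(), log_err.std(ddof=1)

    # Propagate the fitted error distribution onto a new point prediction to
    # obtain a distribution for the unknown of interest.
    new_prediction = 130.0
    rng = np.random.default_rng(0)
    samples = new_prediction * np.exp(rng.normal(bias, spread, size=10_000))
    print(f"bias factor ~ {np.exp(bias):.2f}, 90% interval: "
          f"{np.percentile(samples, 5):.0f}-{np.percentile(samples, 95):.0f}")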

2.
In a series of articles and a health-risk assessment report, scientists at the CIIT Hamner Institutes developed a model (CIIT model) for estimating respiratory cancer risk due to inhaled formaldehyde within a conceptual framework incorporating extensive mechanistic information and advanced computational methods at the toxicokinetic and toxicodynamic levels. Several regulatory bodies have utilized predictions from this model; the California EPA, however, decided against doing so after a detailed evaluation. In this article, we study the CIIT model to identify key biological and statistical uncertainties that need careful evaluation if such two-stage clonal expansion models are to be used for extrapolation of cancer risk from animal bioassays to human exposure. Broadly, these issues pertain to the use and interpretation of experimental labeling index and tumor data, the evaluation and biological interpretation of estimated parameters, and uncertainties in model specification, in particular that of initiated cells. We also identify key uncertainties in the scale-up of the CIIT model to humans, focusing on assumptions underlying model parameters for cell replication rates and formaldehyde-induced mutation. We discuss uncertainties in identifying parameter values in the model used to estimate and extrapolate DNA protein cross-link levels. The authors of the CIIT modeling endeavor characterized their human risk estimates as "conservative in the face of modeling uncertainties." The uncertainties discussed in this article indicate that such a claim is premature.

3.
The treatment of uncertainties associated with modeling and risk assessment has recently attracted significant attention. The methodology and guidance for dealing with parameter uncertainty have been fairly well developed, and quantitative tools such as Monte Carlo modeling are often recommended. However, the issue of model uncertainty is still rarely addressed in practical applications of risk assessment. The use of several alternative models to derive a range of model outputs or risks is one of the few available techniques. This article addresses the often-overlooked issue of what we call "modeler uncertainty," i.e., differences in problem formulation, model implementation, and parameter selection originating from subjective interpretation of the problem at hand. This study uses results from the Fruit Working Group, which was created under the International Atomic Energy Agency (IAEA) BIOMASS program (BIOsphere Modeling and ASSessment). Model-model and model-data intercomparisons reviewed in this study were conducted by the working group for a total of three different scenarios. The greatest uncertainty was found to result from modelers' interpretation of scenarios and approximations made by modelers. In scenarios that were unclear to modelers, the initial differences in model predictions were as high as seven orders of magnitude. Only after several meetings and discussions about specific assumptions did the predictions of the various models converge. Our study shows that parameter uncertainty (as evaluated by a probabilistic Monte Carlo assessment) may have contributed over one order of magnitude to the overall modeling uncertainty. The final model predictions ranged between one and three orders of magnitude, depending on the specific scenario. This study illustrates the importance of problem formulation and implementation of an analytic-deliberative process in risk characterization.

4.
Congress is currently considering adopting a mathematical formula to assign shares in cancer causation to specific doses of radiation, for use in establishing liability and compensation awards. The proposed formula, if it were sound, would allow difficult problems in tort law and public policy to be resolved by reference to tabulated "probabilities of causation." This article examines the statistical and conceptual bases for the proposed methodology. We find that the proposed formula is incorrect as an expression for the "probability of causation," that it implies hidden, debatable policy judgments in its treatment of factor interactions and uncertainties, and that it cannot in general be quantified with sufficient precision to be useful. Three generic sources of statistical uncertainty are identified (sampling variability, population heterogeneity, and error propagation) that prevent accurate quantification of "assigned shares." These uncertainties arise whenever aggregate epidemiological or risk data are used to draw causal inferences about individual cases.
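
For reference, the quantity usually tabulated as a "probability of causation" or assigned share is the attributable fraction among the exposed, PC = (RR - 1)/RR, where RR is the relative risk at the dose in question. The toy numbers below are illustrative only; they show how sampling uncertainty in RR translates directly into a wide interval for the assigned share, one of the difficulties raised in the article.

    def assigned_share(rr):
        """Textbook probability-of-causation expression: excess risk over total risk."""
        return (rr - 1.0) / rr

    # Illustrative relative risk estimate of 1.4 with a 95% CI of (1.1, 1.8).
    for rr in (1.1, 1.4, 1.8):
        print(f"RR = {rr:.1f}  ->  assigned share = {assigned_share(rr):.2f}")
    # The share swings from about 0.09 to 0.44, so a single tabulated value
    # hides most of the statistical uncertainty.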

5.
This paper proposes that idiosyncratic firm‐level shocks can explain an important part of aggregate movements and provide a microfoundation for aggregate shocks. Existing research has focused on using aggregate shocks to explain business cycles, arguing that individual firm shocks average out in the aggregate. I show that this argument breaks down if the distribution of firm sizes is fat‐tailed, as documented empirically. The idiosyncratic movements of the largest 100 firms in the United States appear to explain about one‐third of variations in output growth. This “granular” hypothesis suggests new directions for macroeconomic research, in particular that macroeconomic questions can be clarified by looking at the behavior of large firms. This paper's ideas and analytical results may also be useful for thinking about the fluctuations of other economic aggregates, such as exports or the trade balance.
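
A back-of-the-envelope simulation of the central mechanism: when firm sizes are fat-tailed, independent firm-level shocks do not average out at the 1/sqrt(N) rate. The sketch below compares the standard deviation of share-weighted aggregate growth under equal firm sizes and under Pareto-distributed sizes; the shock size and tail parameter are illustrative, not the paper's calibration.

    import numpy as np

    rng = np.random.default_rng(0)
    N, sigma_firm = 1_000_000, 0.12        # number of firms, firm-level shock s.d.

    def aggregate_sd(sizes):
        shares = sizes / sizes.sum()
        # Aggregate growth is the share-weighted sum of independent firm shocks,
        # so its s.d. is sigma_firm times the square root of the Herfindahl index.
        return sigma_firm * np.sqrt((shares ** 2).sum())

    equal = np.ones(N)
    fat_tailed = rng.pareto(1.1, size=N) + 1.0   # Pareto-distributed firm sizes

    print(f"equal sizes : aggregate growth s.d. = {aggregate_sd(equal):.6f}")
    print(f"fat-tailed  : aggregate growth s.d. = {aggregate_sd(fat_tailed):.6f}")
    # With equal sizes the s.d. is sigma_firm / sqrt(N); with a fat tail the
    # largest firms dominate the Herfindahl index and volatility stays large.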

6.
Risk analysis often depends on complex, computer-based models to describe links between policies (e.g., required emission-control equipment) and consequences (e.g., probabilities of adverse health effects). Appropriate specification of many model aspects is uncertain, including details of the model structure; transport, reaction-rate, and other parameters; and application-specific inputs such as pollutant-release rates. Because these uncertainties preclude calculation of the precise consequences of a policy, it is important to characterize the plausible range of effects. In principle, a probability distribution function for the effects can be constructed using Monte Carlo analysis, but the combinatorics of multiple uncertainties and the often high cost of model runs quickly exhaust available resources. This paper presents and applies a method to choose sets of input conditions (scenarios) that efficiently represent knowledge about the joint probability distribution of inputs. A simple score function approximately relating inputs to a policy-relevant output—in this case, globally averaged stratospheric ozone depletion—is developed. The probability density function for the score-function value is analytically derived from a subjective joint probability density for the inputs. Scenarios are defined by selected quantiles of the score function. Using this method, scenarios can be systematically selected in terms of the approximate probability distribution function for the output of concern, and probability intervals for the joint effect of the inputs can be readily constructed.
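
A stripped-down version of the scenario-selection idea: sample the uncertain inputs from their joint (subjective) distribution, rank the draws by a cheap score function that approximates the policy-relevant output, and keep the input vectors sitting at chosen quantiles of the score as the scenarios to run through the full model. The input names, distributions, and the multiplicative score below are invented for illustration.

    import numpy as np

    rng = np.random.default_rng(1)
    n = 50_000

    # Hypothetical uncertain inputs (e.g., release rate, reaction-rate multiplier,
    # transport parameter), each with a subjective probability distribution.
    release   = rng.lognormal(mean=0.0, sigma=0.5, size=n)
    reaction  = rng.normal(1.0, 0.2, size=n)
    transport = rng.uniform(0.5, 1.5, size=n)

    # Cheap score function, assumed approximately monotone in the model output.
    score = release * reaction * transport

    # Define scenarios by selected quantiles of the score distribution.
    order = np.argsort(score)
    for q in (0.05, 0.25, 0.50, 0.75, 0.95):
        i = order[int(q * (n - 1))]
        print(f"q = {q:.2f}: release = {release[i]:.2f}, "
              f"reaction = {reaction[i]:.2f}, transport = {transport[i]:.2f}")
    # Each printed input vector is a scenario carrying an approximate probability
    # interval for the output of concern.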

7.
This paper discusses the present state of the art in long range economic and manpower forecasting. The problem of comprehensive long range forecasting is viewed as one of integrating classical econometric forecasting models with detailed interindustry and manpower models. Major attention is given here to the problem of output conversion—translating aggregate econometric forecasts into industry detail—and the manner in which this is accomplished in the major U.S. forecasting models. Problems involved in projecting input-output coefficients and labor productivity are analyzed and some serious deficiencies in U.S. occupational data are identified.
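
The output-conversion step is, at its core, a Leontief input-output calculation: industry gross outputs consistent with a forecast of final demand are x = (I - A)^-1 d, and manpower detail then follows from projected labor productivity. The three-industry numbers below are invented purely to show the mechanics; they are not taken from any of the forecasting models discussed.

    import numpy as np

    # Hypothetical 3-industry matrix of input-output (technical) coefficients.
    A = np.array([[0.10, 0.20, 0.05],
                  [0.15, 0.05, 0.10],
                  [0.05, 0.10, 0.15]])

    # Final demand by industry implied by the aggregate econometric forecast;
    # allocating the aggregate across industries is part of output conversion.
    d = np.array([300.0, 200.0, 150.0])

    # Industry gross outputs consistent with the interindustry structure.
    x = np.linalg.solve(np.eye(3) - A, d)
    print("industry gross outputs :", np.round(x, 1))

    # Employment detail follows from projected labor productivity
    # (output per worker), itself a major source of uncertainty.
    productivity = np.array([8.0, 5.0, 6.5])      # illustrative output per worker
    print("implied employment     :", np.round(x / productivity, 1))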

8.
Pest risk maps can provide useful decision support in invasive species management, but most do not adequately consider the uncertainty associated with predicted risk values. This study explores how increased uncertainty in a risk model's numeric assumptions might affect the resultant risk map. We used a spatial stochastic model, integrating components for entry, establishment, and spread, to estimate the risks of invasion and their variation across a two-dimensional landscape for Sirex noctilio, a nonnative woodwasp recently detected in the United States and Canada. Here, we present a sensitivity analysis of the mapped risk estimates to variation in key model parameters. The tested parameter values were sampled from symmetric uniform distributions defined by a series of nested bounds (±5%, … , ±40%) around the parameters' initial values. The results suggest that the maximum annual spread distance, which governs long-distance dispersal, was by far the most sensitive parameter. At ±15% or larger variability bound increments for this parameter, there were noteworthy shifts in map risk values, but no other parameter had a major effect, even at wider bounds of variation. The methodology presented here is generic and can be used to assess the impact of uncertainties on the stability of pest risk maps as well as to identify geographic areas for which management decisions can be made confidently, regardless of uncertainty.
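
The nested-bounds design can be mimicked in a few lines: for each bound (±5% up to ±40%), one parameter at a time is drawn from a symmetric uniform interval around its baseline value and the spread in the model output is recorded. The toy risk function below is a placeholder standing in for the spatial entry/establishment/spread model; parameter names and values are invented.

    import numpy as np

    rng = np.random.default_rng(2)

    def risk_model(p):
        # Placeholder for the stochastic invasion-risk model; deliberately made
        # most sensitive to the spread-distance parameter.
        return p["entry_rate"] * p["establish_prob"] * p["spread_dist"] ** 2

    baseline = {"entry_rate": 0.3, "establish_prob": 0.5, "spread_dist": 40.0}

    for name in baseline:
        for b in (0.05, 0.10, 0.20, 0.40):
            draws = []
            for _ in range(2_000):
                p = dict(baseline)
                p[name] = baseline[name] * rng.uniform(1 - b, 1 + b)
                draws.append(risk_model(p))
            spread = np.percentile(draws, 95) - np.percentile(draws, 5)
            print(f"{name:>15s} ±{int(b * 100):2d}%: 5-95% output spread = {spread:8.1f}")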

9.
This article describes a simple model for quantifying the health impacts of toxic metal emissions. In contrast to most traditional models it calculates the expectation value of the total damage (summed over the total population and over all time) for typical emission sites, rather than "worst-case" estimates for specific sites or episodes. Such a model is needed for the evaluation of many environmental policy measures, e.g., the optimal level of pollution taxes or emission limits. Based on the methodology that has been developed by USEPA for the assessment of multimedia pathways, the equations and parameters are assembled for the assessment of As, Cd, Cr, Hg, Ni, and Pb, and some typical results are presented (the dose from seafood is not included and for Hg the results are extremely uncertain); the model is freely available on the web. The structure of the model is very simple because, as we show, if the parameters can be approximated by time-independent constants (the case for the USEPA methodology), the total impacts can be calculated with steady-state models even though the environment is never in steady state. The collective ingestion dose is found to be roughly 2 orders of magnitude larger than the collective dose via inhalation. The uncertainties are large, easily an order of magnitude, the main uncertainties arising from the parameter values of the model, in particular the transfer factors. Using linearized dose-response functions, estimates are provided for cancers due to As, Cd, Cr, and Ni as well as IQ loss due to Pb emissions in Europe.
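
Because the transfer factors are treated as time-independent constants, the expectation value of the total damage collapses into a chain of multiplications: collective dose = emission x intake fraction (per pathway), and health impact = collective dose x slope of the linearized dose-response function. The numbers below are placeholders chosen only to show the bookkeeping and the inhalation-versus-ingestion comparison; they are not values from the model.

    # Hypothetical bookkeeping for one metal and one emission source.
    emission_kg = 100.0            # emitted mass over the period considered
    intake_frac_inhalation = 2e-6  # kg inhaled per kg emitted (illustrative)
    intake_frac_ingestion  = 2e-4  # kg ingested per kg emitted (illustrative)
    slope_cases_per_kg     = 0.5   # linearized dose-response slope (illustrative)

    dose_inhalation = emission_kg * intake_frac_inhalation
    dose_ingestion  = emission_kg * intake_frac_ingestion
    print(f"ingestion/inhalation collective dose ratio: "
          f"{dose_ingestion / dose_inhalation:.0f}x")

    expected_cases = (dose_inhalation + dose_ingestion) * slope_cases_per_kg
    print(f"expected health impact: {expected_cases:.3g} cases")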

10.
11.
The tenfold "uncertainty" factor traditionally used to guard against human interindividual differences in susceptibility to toxicity is not based on human observations. To begin to build a basis for quantifying an important component of overall variability in susceptibility to toxicity, a database has been constructed of individual measurements of key pharmacokinetic parameters for specific substances (mostly drugs) in groups of at least five healthy adults. Of the 101 data sets studied, 72 were positively skewed, indicating that the distributions are generally closer to expectations for log-normal distributions than for normal distributions. Measurements of interindividual variability in elimination half-lives, maximal blood concentrations, and AUC (area under the curve of blood concentration by time) have median values of log10 geometric standard deviations in the range of 0.11-0.145. For the median chemical, therefore, a tenfold difference in these pharmacokinetic parameters would correspond to 7-9 standard deviations in populations of normal healthy adults. For one relatively lipophilic chemical, however, interindividual variability in maximal blood concentration and AUC was 0.4, implying that a tenfold difference would correspond to only about 2.5 standard deviations for those parameters in the human population. The parameters studied to date are only components of overall susceptibility to toxic agents, and do not include contributions from variability in exposure- and response-determining parameters. The current study also implicitly excludes most human interindividual variability from age and illness. When these other sources of variability are included in an overall analysis of variability in susceptibility, it is likely that a tenfold difference will correspond to fewer standard deviations in the overall population, and correspondingly greater numbers of people at risk of toxicity.
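
The link between the reported log10 geometric standard deviations and the number of standard deviations spanned by a tenfold difference is simply n = 1 / (log10 GSD), since a factor of 10 is one unit on the log10 scale. A quick check of the figures quoted above:

    for log10_gsd in (0.11, 0.145, 0.4):
        n_sd = 1.0 / log10_gsd
        print(f"log10 GSD = {log10_gsd:<5}: a tenfold difference spans "
              f"{n_sd:.1f} standard deviations")
    # -> roughly 9.1, 6.9, and 2.5 standard deviations, matching the 7-9 range
    #    for the median chemical and ~2.5 for the lipophilic outlier.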

12.
Traditionally, microbial risk assessors have used point estimates to evaluate the probability that an individual will become infected. We developed a quantitative approach that shifts the risk characterization perspective from point estimate to distributional estimate, and from the individual to the population. To this end, we first designed and implemented a dynamic model that tracks traditional epidemiological variables such as the numbers of susceptible, infected, diseased, and immune individuals, and environmental variables such as pathogen density. Second, we used a simulation methodology that explicitly acknowledges the uncertainty and variability associated with the data. Specifically, the approach consists of assigning probability distributions to each parameter, sampling from these distributions for Monte Carlo simulations, and using a binary classification to assess the output of each simulation. A case study is presented that explores the uncertainties in assessing the risk of giardiasis when swimming in a recreational impoundment using reclaimed water. Using literature-based information to assign parameter ranges, our analysis demonstrated that the parameter describing the shedding of pathogens by infected swimmers was the factor that contributed most to the uncertainty in risk. The importance of other parameters depended on reducing the a priori range of this shedding parameter. When the shedding parameter was constrained to its lower subrange, treatment efficiency was the parameter most important in predicting whether a simulation resulted in prevalences above or below non-outbreak levels, whereas parameters associated with human exposure were important when the shedding parameter was constrained to its higher subrange. This Monte Carlo simulation technique identified conditions in which outbreaks and/or nonoutbreaks are likely and identified the parameters that contributed most to the uncertainty associated with a risk prediction.
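
The two-step logic (sample parameter values from their distributions, run the dynamic model, then classify each run as outbreak or non-outbreak) can be sketched as follows. The compartment model here is a deliberately crude susceptible-infected loop with a pathogen pool fed by swimmer shedding and reduced by treatment; all parameter ranges and the outbreak threshold are invented for illustration, not taken from the giardiasis case study.

    import numpy as np

    rng = np.random.default_rng(3)

    def run_season(shedding, treatment_eff, exposure_rate, days=90, pop=10_000):
        S, I, pathogens = pop - 10.0, 10.0, 0.0
        peak_prev = 0.0
        for _ in range(days):
            # Pathogen pool: daily die-off plus shedding by infected swimmers,
            # reduced by treatment of the reclaimed water.
            pathogens = 0.7 * pathogens + shedding * I * (1 - treatment_eff)
            p_infect = 1 - np.exp(-exposure_rate * pathogens / pop)
            new_inf = p_infect * S
            S, I = S - new_inf, I + new_inf - 0.1 * I      # 10%/day recovery
            peak_prev = max(peak_prev, I / pop)
        return peak_prev

    n_sim, outbreaks = 2_000, 0
    for _ in range(n_sim):
        peak = run_season(shedding=rng.uniform(1e-3, 1e-1),
                          treatment_eff=rng.uniform(0.90, 0.999),
                          exposure_rate=rng.uniform(0.5, 20.0))
        outbreaks += peak > 0.01        # binary classification of each run
    print(f"fraction of simulations classified as outbreaks: {outbreaks / n_sim:.2f}")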

13.
A wide range of uncertainties will inevitably be introduced during the process of performing a safety assessment of engineering systems. The impact of all these uncertainties must be addressed if the analysis is to serve as a tool in the decision-making process. Uncertainties present in the model components (input parameters or basic events) are propagated to quantify their impact on the final results. Several methods are available in the literature, namely, the method of moments, discrete probability analysis, Monte Carlo simulation, fuzzy arithmetic, and Dempster-Shafer theory. The methods differ both in how uncertainty is characterized at the component level and in how it is propagated to the system level. Each has desirable and undesirable features, making it more or less useful in different situations. In the probabilistic framework, which is most widely used, probability distributions are used to characterize uncertainty. However, in situations in which one cannot specify (1) parameter values for input distributions, (2) precise probability distributions (shape), and (3) dependencies between input parameters, these methods have limitations and are not effective. In order to address some of these limitations, this article presents uncertainty analysis in the context of level-1 probabilistic safety assessment (PSA) based on a probability bounds (PB) approach. PB analysis combines probability theory and interval arithmetic to produce probability boxes (p-boxes), structures that allow comprehensive and rigorous propagation through calculations. A practical case study is also carried out with a code developed on the basis of the PB approach, and the results are compared with those of a two-phase Monte Carlo simulation.
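
As a heavily reduced illustration of the probability-bounds idea in a level-1 PSA setting, the sketch below propagates interval-valued basic-event probabilities through AND and OR gates (independence assumed) using interval arithmetic, giving bounds on the top-event probability without committing to precise input distributions. A full p-box implementation bounds entire CDFs in the same spirit; the gate structure and numbers here are invented.

    # Interval arithmetic on a toy fault tree: TOP = (A AND B) OR C,
    # with independence assumed between basic events.

    def and_gate(p, q):
        return (p[0] * q[0], p[1] * q[1])

    def or_gate(p, q):
        return (1 - (1 - p[0]) * (1 - q[0]), 1 - (1 - p[1]) * (1 - q[1]))

    # Imprecise basic-event probabilities, given only as (lower, upper) bounds.
    A = (1e-3, 5e-3)
    B = (2e-2, 8e-2)
    C = (1e-4, 4e-4)

    top = or_gate(and_gate(A, B), C)
    print(f"top-event probability bounds: [{top[0]:.2e}, {top[1]:.2e}]")
    # The interval width carries the epistemic uncertainty forward without
    # forcing a choice of distribution shape for the basic events.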

14.
A Systematic Uncertainty Analysis of an Evaluative Fate and Exposure Model
Multimedia fate and exposure models are widely used to regulate the release of toxic chemicals, to set cleanup standards for contaminated sites, and to evaluate emissions in life-cycle assessment. CalTOX, one of these models, is used to calculate the potential dose, an outcome that is combined with the toxicity of the chemical to determine the Human Toxicity Potential (HTP), used to aggregate and compare emissions. The comprehensive assessment of the uncertainty in the potential dose calculation in this article serves to provide the information necessary to evaluate the reliability of decisions based on the HTP. A framework for uncertainty analysis in multimedia risk assessment is proposed and evaluated with four types of uncertainty. Parameter uncertainty is assessed through Monte Carlo analysis. The variability in landscape parameters is assessed through a comparison of potential dose calculations for different regions in the United States. Decision rule uncertainty is explored through a comparison of the HTP values under open and closed system boundaries. Model uncertainty is evaluated through two case studies, one using alternative formulations for calculating the plant concentration and the other testing the steady-state assumption for wet deposition. This investigation shows that steady-state conditions for the removal of chemicals from the atmosphere are not appropriate and result in an underestimate of the potential dose for 25% of the 336 chemicals evaluated.

15.
"双一流"建设背景下,基于合作竞争的资源配置模式决定了中国必须提高高等教育投入产出效率。综合运用Window-Malmquist指数法和空间聚类方法测算分析中国31省(市)2004-2015年的高等教育投入产出效率及其演变规律和空间差异,结果显示:(1)总体而言,中国高等教育投入产出效率呈现DEA有效状态,技术进步是其提升的主要贡献因素,但高等教育"区域鸿沟"的存在却导致追赶效应拉低了效率值;(2)从时间演变规律看,中国高等教育投入产出效率相对稳定,但上升趋势并不明显,且存在两极分化或多极分化的可能性;(3)从空间差异情况看,31个省(市)中,江苏高等教育投入产出效率最高,东部沿海地区的高等教育投入产出效率具有"高高"特征,且效率值明显高于其他地区;(4)从空间集聚特征看,在技术进步的作用下不同区域的高等教育投入产出效率存在正向的空间关系,但追赶效应却降低了这种空间联系。  相似文献   

16.
In any model, the values of the estimates for the various parameters are obtained from different sources, each with its own level of uncertainty. When the probability distributions of the estimates are obtained, as opposed to point values only, the measurement uncertainties in the parameter estimates may be addressed. However, the sources used for obtaining the data and the models used to select appropriate distributions are of differing degrees of uncertainty. A hierarchy of different sources of uncertainty, based upon one's ability to validate data and models empirically, is presented. When model parameters are aggregated with different levels of the hierarchy represented, this implies distortion or degradation in the utility and validity of the models used. Means to identify and deal with such heterogeneous data sources are explored, and a number of approaches to addressing this problem are presented. One approach, using Range/Confidence Estimates coupled with an Information Value Analysis Process, is presented as an example.

17.
In quantitative uncertainty analysis, it is essential to define rigorously the end point or target of the assessment. Two distinctly different approaches using Monte Carlo methods are discussed: (1) the end point is a fixed but unknown value (e.g., the maximally exposed individual, the average individual, or a specific individual), or (2) the end point is an unknown distribution of values (e.g., the variability of exposures among unspecified individuals in the population). In the first case, values are sampled at random from distributions representing various "degrees of belief" about the unknown "fixed" values of the parameters to produce a distribution of model results. The distribution of model results represents a subjective confidence statement about the true but unknown assessment end point. The important input parameters are those that contribute most to the spread in the distribution of the model results. In the second case, Monte Carlo calculations are performed in two dimensions, producing numerous alternative representations of the true but unknown distribution. These alternative distributions permit subjective confidence statements to be made from two perspectives: (1) for the individual exposure occurring at a specified fractile of the distribution, or (2) for the fractile of the distribution associated with a specified level of individual exposure. The relative importance of input parameters will depend on the fractile or exposure level of interest. The quantification of uncertainty for the simulation of a true but unknown distribution of values represents the state of the art in assessment modeling.
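
The second case corresponds to two-dimensional Monte Carlo: an outer loop samples degree-of-belief distributions for the parameters of the unknown variability distribution, and an inner loop samples individuals from that distribution, producing a family of alternative exposure distributions. The lognormal exposure model and the hyperparameter ranges below are invented for illustration.

    import numpy as np

    rng = np.random.default_rng(4)
    n_outer, n_inner = 200, 1_000       # uncertainty loop, variability loop
    limit = 5.0                         # a specified exposure level of interest

    p95_exposure, frac_above = [], []
    for _ in range(n_outer):
        # Outer loop: uncertain parameters of the variability distribution.
        gm  = rng.lognormal(mean=0.0, sigma=0.3)   # geometric mean
        gsd = rng.uniform(1.5, 3.0)                # geometric standard deviation
        # Inner loop: variability of exposure among unspecified individuals.
        people = gm * np.exp(np.log(gsd) * rng.standard_normal(n_inner))
        p95_exposure.append(np.percentile(people, 95))
        frac_above.append((people > limit).mean())

    # Perspective 1: confidence statement for the exposure at a specified fractile.
    print("95th-percentile exposure, 90% confidence interval:",
          np.round(np.percentile(p95_exposure, [5, 95]), 2))
    # Perspective 2: confidence statement for the fractile at a specified exposure.
    print("fraction of individuals above the limit, 90% confidence interval:",
          np.round(np.percentile(frac_above, [5, 95]), 3))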

18.
19.
Typically, the uncertainty affecting the parameters of physiologically based pharmacokinetic (PBPK) models is ignored because it is not currently practical to adjust their values using classical parameter estimation techniques. The issue of parametric variability in a physiological model of benzene pharmacokinetics is addressed in this paper. Monte Carlo simulations were used to study the effects on the model output arising from variability in its parameters. The output was classified into two categories, depending on whether the output of the model on a particular run was judged to be generally consistent with published experimental data. Statistical techniques were used to examine sensitivity and interaction in the parameter space. The model was evaluated against the data from three different experiments in order to test for the structural adequacy of the model and the consistency of the experimental results. The regions of the parameter space associated with various inhalation and gavage experiments are distinct, and the model as presently structured cannot adequately represent the outcomes of all experiments. Our results suggest that further effort is required to discern between the structural adequacy of the model and the consistency of the experimental results. The impact of our results on the risk assessment process for benzene is also examined.

20.
Physiologically-based toxicokinetic (PBTK) models are widely used to quantify whole-body kinetics of various substances. However, since they attempt to reproduce anatomical structures and physiological events, they have a high number of parameters. Their identification from kinetic data alone is often impossible, and other information about the parameters is needed to render the model identifiable. The most commonly used approach consists of independently measuring, or taking from literature sources, some of the parameters, fixing them in the kinetic model, and then performing model identification on a reduced number of less certain parameters. This results in a substantial reduction of the degrees of freedom of the model. In this study, we show that this method results in final estimates of the free parameters whose precision is overestimated. We then compared this approach with an empirical Bayes approach, which takes into account not only the mean value, but also the error associated with the independently determined parameters. Blood and breath 2H8-toluene washout curves, obtained in 17 subjects, were analyzed with a previously presented PBTK model suitable for person-specific dosimetry. The model parameters with the greatest effect on predicted levels were the alveolar ventilation rate QPC, the fat tissue fraction VFC, the blood-air partition coefficient Kb, the fraction of cardiac output to fat Qa/co, and the rate of extrahepatic metabolism Vmax-p. Differences in the measured and Bayesian-fitted values of QPC, VFC, and Kb were significant (p < 0.05), and the precision of the fitted values of Vmax-p and Qa/co went from 11 ± 5% to 75 ± 170% (NS) and from 8 ± 2% to 9 ± 2% (p < 0.05), respectively. The empirical Bayes approach did not result in less reliable parameter estimates: rather, it pointed out that the precision of parameter estimates can be overly optimistic when other parameters in the model, either directly measured or taken from literature sources, are treated as known without error. In conclusion, an empirical Bayes approach to parameter estimation resulted in a better model fit, different final parameter estimates, and more realistic parameter precisions.
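
The contrast between fixing independently measured parameters and handling them in an empirical Bayes fashion can be illustrated with a maximum a posteriori fit in which the "measured" parameter enters the objective through a penalty reflecting its measurement error instead of being clamped. The one-compartment washout model and every number below are stand-ins for the PBTK model, chosen only to show the mechanics.

    import numpy as np
    from scipy.optimize import minimize

    rng = np.random.default_rng(5)

    # Toy washout data from a one-compartment model C(t) = C0 * exp(-k * t).
    t = np.linspace(0.5, 8.0, 12)
    true_C0, true_k = 10.0, 0.45
    y = true_C0 * np.exp(-true_k * t) * np.exp(0.05 * rng.standard_normal(t.size))

    C0_meas, C0_se = 9.0, 1.0    # independently "measured" C0 and its standard error
    sigma_y = 0.05               # assumed residual error on the log scale

    def neg_log_post(theta, use_prior):
        log_C0, k = theta
        resid = (np.log(y) - (log_C0 - k * t)) / sigma_y
        nll = 0.5 * np.sum(resid ** 2)
        if use_prior:            # empirical Bayes: penalize deviations, do not fix
            nll += 0.5 * ((np.exp(log_C0) - C0_meas) / C0_se) ** 2
        return nll

    # Approach 1: fix C0 at its measured value and fit only k.
    fit_fixed = minimize(lambda th: neg_log_post([np.log(C0_meas), th[0]], False),
                         x0=[0.3])
    # Approach 2: empirical Bayes-style fit of both parameters.
    fit_eb = minimize(neg_log_post, x0=[np.log(C0_meas), 0.3], args=(True,))

    print("fixed-C0 fit     : k =", round(float(fit_fixed.x[0]), 3))
    print("empirical Bayes  : C0 =", round(float(np.exp(fit_eb.x[0])), 2),
          ", k =", round(float(fit_eb.x[1]), 3))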

