Similar Articles: 20 results found
1.
In risk analysis, the treatment of the epistemic uncertainty associated with the probability of occurrence of an event is fundamental. Traditionally, probabilistic distributions have been used to characterize the epistemic uncertainty due to imprecise knowledge of the parameters in risk models. On the other hand, it has been argued that in certain instances such uncertainty may be best accounted for by fuzzy or possibilistic distributions. This seems to be the case in particular for parameters for which the available information is scarce and of qualitative nature. In practice, a risk model can be expected to contain some parameters affected by uncertainties that may be best represented by probability distributions and others that may be more properly described in terms of fuzzy or possibilistic distributions. In this article, a hybrid method that jointly propagates probabilistic and possibilistic uncertainties is considered and compared with pure probabilistic and pure fuzzy methods for uncertainty propagation. The analyses are carried out on a case study concerning the uncertainties in the probabilities of occurrence of accident sequences in an event tree analysis of a nuclear power plant.
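As a rough illustration of how such hybrid propagation can work, the sketch below Monte Carlo samples a probabilistic parameter and, for each sample, pushes the alpha-cuts of a possibilistic (triangular fuzzy) parameter through a toy monotone risk model. The model, all parameter values, and the alpha-averaged belief/plausibility summary are illustrative assumptions, not the article's case study.

```python
# A minimal sketch of hybrid Monte Carlo / fuzzy alpha-cut propagation,
# assuming a toy model risk = p * q with p probabilistic and q possibilistic.
import numpy as np

rng = np.random.default_rng(42)

def alpha_cut(low, mode, high, alpha):
    """Alpha-cut interval of a triangular possibility distribution."""
    return low + alpha * (mode - low), high - alpha * (high - mode)

n_mc, alphas = 1000, np.linspace(0.0, 1.0, 11)
p_samples = rng.lognormal(mean=np.log(1e-3), sigma=0.5, size=n_mc)

# For each probabilistic sample, propagate every alpha-cut of the fuzzy
# parameter through the (monotone) model to get an interval of outputs.
lower = np.empty((n_mc, alphas.size))
upper = np.empty((n_mc, alphas.size))
for i, p in enumerate(p_samples):
    for j, a in enumerate(alphas):
        q_lo, q_hi = alpha_cut(0.1, 0.3, 0.9, a)
        lower[i, j], upper[i, j] = p * q_lo, p * q_hi

# A simplified plausibility/belief-style summary of P(risk > threshold):
# average the interval indicators over alpha levels, then over MC samples.
threshold = 3e-4
plausibility = np.mean((upper > threshold).mean(axis=1))
belief = np.mean((lower > threshold).mean(axis=1))
print(f"P(risk > {threshold:g}) lies in [{belief:.3f}, {plausibility:.3f}]")
```

The output is an interval of probabilities rather than a single value, which is the practical difference between the hybrid method and a purely probabilistic propagation.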

2.
Risks from exposure to contaminated land are often assessed with the aid of mathematical models. The current probabilistic approach is a considerable improvement on previous deterministic risk assessment practices, in that it attempts to characterize uncertainty and variability. However, some inputs continue to be assigned as precise numbers, while others are characterized as precise probability distributions. Such precision is hard to justify, and we show in this article how rounding errors and distribution assumptions can affect an exposure assessment. The outcomes of traditional deterministic point estimates and Monte Carlo simulations were compared to probability bounds analyses. Assigning all scalars as imprecise numbers (intervals prescribed by significant digits) added uncertainty to the deterministic point estimate of about one order of magnitude. Similarly, representing probability distributions as probability boxes added several orders of magnitude to the uncertainty of the probabilistic estimate. This indicates that the size of the uncertainty in such assessments is actually much greater than currently reported. The article suggests that full disclosure of the uncertainty may facilitate decision making by opening up a negotiation window. In the risk analysis process, it is also an ethical obligation to clarify the boundary between the scientific and social domains.
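A minimal sketch of the significant-digit interval idea: each reported scalar is replaced by the interval of values that round to it, and the intervals are propagated by interval arithmetic through a toy exposure equation. The equation and parameter values are invented for illustration, not taken from the article.

```python
# Intervals implied by significant digits widen a deterministic exposure
# estimate; all inputs here are illustrative.
import math

def sig_interval(x, digits):
    """Interval consistent with a value reported to `digits` significant digits."""
    exponent = math.floor(math.log10(abs(x))) - digits + 1
    half = 0.5 * 10 ** exponent
    return (x - half, x + half)

def mul(a, b):  # interval multiplication (positive intervals only)
    return (a[0] * b[0], a[1] * b[1])

def div(a, b):  # interval division (positive intervals only)
    return (a[0] / b[1], a[1] / b[0])

conc = sig_interval(2.5, 2)       # soil concentration, mg/kg -> [2.45, 2.55]
intake = sig_interval(0.0001, 1)  # soil ingestion, kg/day -> [0.00005, 0.00015]
bw = sig_interval(70.0, 2)        # body weight, kg

dose = div(mul(conc, intake), bw)
print(f"dose interval: [{dose[0]:.3e}, {dose[1]:.3e}] mg/kg-day")
```

Even this tiny example spans roughly a factor of three, consistent with the article's point that rounding alone can contribute substantial unreported uncertainty.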

3.
A Probabilistic Framework for the Reference Dose (Probabilistic RfD)
Determining the probabilistic limits for the uncertainty factors used in the derivation of the Reference Dose (RfD) is an important step toward the goal of characterizing the risk of noncarcinogenic effects from exposure to environmental pollutants. If uncertainty factors are seen, individually, as "upper bounds" on the dose-scaling factor for sources of uncertainty, then determining comparable upper bounds for combinations of uncertainty factors can be accomplished by treating uncertainty factors as distributions, which can be combined by probabilistic techniques. This paper presents a conceptual approach to probabilistic uncertainty factors based on the definition and use of RfDs by the U.S. EPA. The approach does not attempt to distinguish one uncertainty factor from another based on empirical data or biological mechanisms but rather uses a simple displaced lognormal distribution as a generic representation of all uncertainty factors. Monte Carlo analyses show that the upper bounds for combinations of this distribution can vary by factors of two to four when compared to the fixed-value uncertainty factor approach. The probabilistic approach is demonstrated in the comparison of Hazard Quotients based on RfDs with differing numbers of uncertainty factors.
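The sketch below illustrates the core computation: each generic uncertainty factor is modeled as a displaced lognormal (1 + lognormal), products of one to four factors are formed by Monte Carlo, and the upper bounds are compared with the fixed 10-per-factor convention. The median and sigma are illustrative choices, not the paper's fitted values.

```python
# Combining generic displaced-lognormal uncertainty factors by Monte Carlo.
import numpy as np

rng = np.random.default_rng(0)

def sample_uf(n, median=3.16, sigma=0.7):
    """Displaced lognormal uncertainty factor: 1 + lognormal (illustrative)."""
    return 1.0 + rng.lognormal(mean=np.log(median - 1.0), sigma=sigma, size=n)

n = 100_000
for k in (1, 2, 3, 4):
    product = np.prod([sample_uf(n) for _ in range(k)], axis=0)
    fixed = 10.0 ** k  # conventional fixed-value approach
    p95 = np.percentile(product, 95)
    print(f"{k} UFs: 95th pct of product = {p95:8.1f} vs fixed {fixed:g}")
```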

4.
Variability and Uncertainty Meet Risk Management and Risk Communication
In the past decade, the use of probabilistic risk analysis techniques to quantitatively address variability and uncertainty in risks increased in popularity, as recommended in the National Research Council's 1994 report Science and Judgment in Risk Assessment. Under the 1996 Food Quality Protection Act, for example, the U.S. EPA supported the development of tools that produce distributions of risk demonstrating the variability and/or uncertainty in the results. This paradigm shift away from the use of point estimates creates new challenges for risk managers, who now struggle with decisions about how to use distributions in decision making. The challenges for risk communication, however, have only been minimally explored. This presentation uses case studies of variability in the risks of dying on the ground from a crashing airplane and from the deployment of motor vehicle airbags to demonstrate how better characterization of variability and uncertainty in the risk assessment leads to better risk communication. Analogies to food safety and environmental risks are also discussed. This presentation demonstrates that probabilistic risk assessment has an impact on both risk management and risk communication, and highlights remaining research issues associated with using improved sensitivity and uncertainty analyses in risk assessment.

5.
Deep uncertainty in future climatic and economic conditions complicates developing infrastructure designed to last several generations, such as water reservoirs. In response, analysts have developed multiple robust decision frameworks to help identify investments and policies that can withstand a wide range of future states. Although these frameworks are adept at supporting decisions where uncertainty cannot be represented probabilistically, analysts necessarily choose probabilistic bounds and distributions for uncertain variables to support exploratory modeling. The implications of these assumptions for the analytical outcomes of robust decision frameworks are rarely evaluated, and little guidance exists on how to select uncertain variable distributions. Here, we evaluate the impact of these choices by following the robust decision-making procedure, using four different assumptions about the probabilistic distribution of exogenous uncertainties in future climatic and economic states. We take a water reservoir system in Ethiopia as our case study, and sample climatic parameters from uniform, normal, extended uniform, and extended normal distributions; we similarly sample two economic parameters. We compute regret and satisficing robustness decision criteria for two performance measures, agricultural water demand coverage and net present value, and perform scenario discovery on the most robust reservoir alternative. We find lower robustness scores resulting from extended parameter distributions and demonstrate that parameter distributions can impact vulnerabilities identified through scenario discovery. Our results suggest that exploratory modeling within robust decision frameworks should sample from extended, uniform parameter distributions.
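The sketch below shows the mechanism behind the headline finding: a satisficing robustness score (fraction of sampled futures meeting a performance threshold) computed under the four sampling assumptions. The "performance model" is a one-line stand-in, not the Ethiopian reservoir model, and all numbers are invented.

```python
# How the sampled distribution of a deep uncertainty changes a satisficing
# robustness score; the performance function is a toy stand-in.
import numpy as np

rng = np.random.default_rng(7)

def performance(delta_temp):
    """Toy stand-in: demand coverage degrades as climate warms."""
    return 1.0 - 0.15 * delta_temp

def satisficing(samples, threshold=0.6):
    """Fraction of sampled futures in which performance meets the threshold."""
    return np.mean(performance(samples) >= threshold)

n = 50_000
samplers = {
    "uniform":          rng.uniform(0.0, 2.0, n),
    "normal":           rng.normal(1.0, 0.5, n),
    "extended uniform": rng.uniform(0.0, 4.0, n),  # wider bounds
    "extended normal":  rng.normal(1.0, 1.0, n),
}
for name, s in samplers.items():
    print(f"{name:>16}: satisficing robustness = {satisficing(s):.3f}")
```

The extended distributions place more weight on adverse futures and therefore return lower robustness scores, mirroring the article's result.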

6.
Quantifying safety goals is key to the regulation of activities that are beneficial on the whole but entail some risks in being performed. Determining compliance with safety goals involves dealing with uncertainties. A recent article by Bier(1) describes some of the difficulties encountered using measures with uncertainty to determine compliance with safety goals for nuclear reactors. This paper uses a hierarchical Bayes approach to address two practical modeling problems in determining safety goal compliance under uncertainty: (1) allowing some modeling assumptions to be relaxed, and (2) allowing data from previous related samples to be included in the analysis. The two issues affect each other to the extent that relaxing some assumptions allows the use of a broader range of data. The usefulness of these changes and their impact on assessing safety compliance for nuclear reactors are shown.
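One much-simplified way to let data from previous related samples inform a plant-specific estimate is a gamma-Poisson parametric empirical Bayes step, sketched below. This is a simplification of a full hierarchical Bayes analysis (it fixes the hyperparameters at point estimates rather than integrating over them), and all event counts, exposure times, and the safety goal are invented.

```python
# Pooling related plant data via gamma-Poisson parametric empirical Bayes.
import numpy as np
from scipy.stats import gamma

# (events, reactor-years) for previous related samples and the target plant
related = [(2, 40.0), (1, 35.0), (4, 60.0), (0, 25.0)]
target = (1, 30.0)

# Method-of-moments fit of a gamma prior to the related plants' rates.
rates = np.array([x / t for x, t in related])
mean, var = rates.mean(), rates.var(ddof=1)
alpha, beta = mean**2 / var, mean / var  # gamma shape and rate

# Conjugate update with the target plant's own data.
x, t = target
post_alpha, post_beta = alpha + x, beta + t

# Probability the plant meets a safety goal, e.g., rate < 0.1 per reactor-year.
goal = 0.1
print(f"P(rate < {goal}) = {gamma.cdf(goal, post_alpha, scale=1/post_beta):.3f}")
```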

7.
Slob, W., & Pieters, M. N. Risk Analysis, 1998, 18(6): 787-798
The use of uncertainty factors in the standard method for deriving acceptable intake or exposure limits for humans, such as the Reference Dose (RfD), may be viewed as a conservative method of taking various uncertainties into account. As an obvious alternative, the use of uncertainty distributions instead of uncertainty factors is gaining attention. This paper presents a comprehensive discussion of a general framework that quantifies both the uncertainties in the no-adverse-effect level in the animal (using a benchmark-like approach) and the uncertainties in the various extrapolation steps involved (using uncertainty distributions). This approach results in an uncertainty distribution for the no-adverse-effect level in the sensitive human subpopulation, reflecting the overall scientific uncertainty associated with that level. A lower percentile of this distribution may be regarded as an acceptable exposure limit (e.g., RfD) that takes account of the various uncertainties in a nonconservative fashion. The same methodology may also be used as a tool to derive a distribution for possible human health effects at a given exposure level. We argue that in a probabilistic approach the uncertainty in the estimated no-adverse-effect level in the animal should be explicitly taken into account. Not only is this source of uncertainty too large to be ignored, it also has repercussions for the quantification of the other uncertainty distributions.
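The sketch below shows the shape of this framework: an uncertain animal no-adverse-effect level is divided by lognormal extrapolation distributions, and a lower percentile of the resulting distribution serves as the candidate exposure limit. All distributional parameters are illustrative assumptions, not the paper's.

```python
# The distributional alternative to fixed uncertainty factors, as a sketch.
import numpy as np

rng = np.random.default_rng(1)
n = 100_000

# Uncertainty in the animal no-adverse-effect level (benchmark-like estimate).
naael_animal = rng.lognormal(mean=np.log(10.0), sigma=0.3, size=n)  # mg/kg-day

# Extrapolation steps as uncertainty distributions instead of fixed factors.
animal_to_human = rng.lognormal(np.log(4.0), 0.6, size=n)
human_variability = rng.lognormal(np.log(3.0), 0.5, size=n)

naael_sensitive_human = naael_animal / (animal_to_human * human_variability)

# A lower percentile of this distribution can serve as the exposure limit.
limit = np.percentile(naael_sensitive_human, 5)
print(f"5th percentile (candidate RfD): {limit:.3f} mg/kg-day")
```

Note how the animal-level uncertainty enters the product directly, which is the paper's argument for quantifying it explicitly rather than ignoring it.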

8.
Dose-response models are the essential link between exposure assessment and computed risk values in quantitative microbial risk assessment, yet the uncertainty that is inherent to computed risks because the dose-response model parameters are estimated using limited epidemiological data is rarely quantified. Second-order risk characterization approaches incorporating uncertainty in dose-response model parameters can provide more complete information to decision makers by separating variability and uncertainty to quantify the uncertainty in computed risks. Therefore, the objective of this work is to develop procedures to sample from posterior distributions describing uncertainty in the parameters of exponential and beta-Poisson dose-response models using Bayes's theorem and Markov Chain Monte Carlo (in OpenBUGS). The theoretical origins of the beta-Poisson dose-response model are used to identify a decomposed version of the model that enables Bayesian analysis without the need to evaluate Kummer confluent hypergeometric functions. Herein, it is also established that the beta distribution in the beta-Poisson dose-response model cannot address variation among individual pathogens, criteria to validate use of the conventional approximation to the beta-Poisson model are proposed, and simple algorithms to evaluate actual beta-Poisson probabilities of infection are investigated. The developed MCMC procedures are applied to analysis of a case study data set, and it is demonstrated that an important region of the posterior distribution of the beta-Poisson dose-response model parameters is attributable to the absence of low-dose data. This region includes beta-Poisson models for which the conventional approximation is especially invalid and in which many beta distributions have an extreme shape with questionable plausibility.
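The contrast between the exact beta-Poisson model and its conventional approximation is easy to see numerically. The sketch below uses the standard forms of both (the exact model via the Kummer confluent hypergeometric function); the alpha and beta values are illustrative choices in a regime where the approximation is suspect, not values fitted to the article's case study.

```python
# Exact vs approximate beta-Poisson dose-response probabilities of infection.
import numpy as np
from scipy.special import hyp1f1

def beta_poisson_exact(dose, alpha, beta):
    """Exact beta-Poisson: 1 - 1F1(alpha, alpha + beta, -dose)."""
    return 1.0 - hyp1f1(alpha, alpha + beta, -dose)

def beta_poisson_approx(dose, alpha, beta):
    """Conventional approximation, valid when beta >> alpha and beta >> 1."""
    return 1.0 - (1.0 + dose / beta) ** (-alpha)

def exponential(dose, r):
    """Exponential dose-response model for a single-valued infectivity r."""
    return 1.0 - np.exp(-r * dose)

alpha, beta = 0.3, 2.0  # beta is not >> 1, so the approximation is dubious
for dose in (0.1, 1.0, 10.0, 100.0):
    exact = beta_poisson_exact(dose, alpha, beta)
    approx = beta_poisson_approx(dose, alpha, beta)
    print(f"dose {dose:7.1f}: exact {exact:.4f}  approx {approx:.4f}")
```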

9.
Setting action levels or limits for health protection is complicated by uncertainty in the dose-response relation across a range of hazards and exposures. To address this issue, we consider the classic newsboy problem. The principles used to manage uncertainty for that case are applied to two stylized exposure examples, one for high-dose, high-dose-rate radiation and the other for ammonia. Both incorporate expert judgment on uncertainty quantification in the dose-response relationship. The mathematical technique of probabilistic inversion also plays a key role. We propose a coupled approach, whereby scientists quantify the dose-response uncertainty using techniques such as structured expert judgment with performance weights and probabilistic inversion, and stakeholders quantify associated loss rates.
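The newsboy (newsvendor) logic translates directly: with stakeholder loss rates for under- and over-protection and an expert-quantified uncertainty distribution for the true safe threshold, the optimal action level sits at the critical fractile. The sketch below shows this coupling; the loss rates and threshold distribution are invented for illustration.

```python
# Newsvendor critical fractile applied to setting an action level.
import numpy as np

rng = np.random.default_rng(3)

c_under = 9.0  # loss rate per unit of under-protection (health harm)
c_over = 1.0   # loss rate per unit of over-protection (compliance cost)

# Expert-quantified uncertainty in the "true" safe threshold dose.
threshold_samples = rng.lognormal(mean=np.log(50.0), sigma=0.4, size=100_000)

# Critical fractile: when under-protection is the costlier error, set the
# action level at a low percentile of the threshold distribution.
q = c_over / (c_under + c_over)
action_level = np.percentile(threshold_samples, 100 * q)
print(f"critical fractile = {q:.2f}; action level = {action_level:.1f}")
```

With a 9:1 loss ratio, the action level lands at the 10th percentile of the threshold distribution; more symmetric losses push it toward the median.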

10.
Biomagnification of organochlorine and other persistent organic contaminants by higher trophic level organisms represents one of the most significant sources of uncertainty and variability in evaluating potential risks associated with disposal of dredged materials. While it is important to distinguish between population variability (e.g., true population heterogeneity in fish weight and lipid content) and uncertainty (e.g., measurement error), they can be operationally difficult to define separately in probabilistic estimates of human health and ecological risk. We propose a disaggregation of uncertain and variable parameters based on: (1) availability of supporting data; (2) the specific management and regulatory context (in this case, the U.S. Army Corps of Engineers/U.S. Environmental Protection Agency tiered approach to dredged material management); and (3) professional judgment and experience in conducting probabilistic risk assessments. We describe and quantitatively evaluate several sources of uncertainty and variability in estimating risk to human health from trophic transfer of polychlorinated biphenyls (PCBs) using a case study of sediments obtained from the New York-New Jersey Harbor and being evaluated for disposal at an open-water, offshore disposal site within the northeast region. The estimates of PCB concentrations in fish and dietary doses of PCBs to humans ingesting fish are expressed as distributions of values, of which the arithmetic mean or mode represents a particular fractile. The distribution of risk values is obtained using a food chain biomagnification model developed by Gobas, by specifying distributions for input parameters disaggregated to represent either uncertainty or variability. Only those sources of uncertainty that could be quantified were included in the analysis. Results for several different two-dimensional Latin Hypercube analyses are provided to evaluate the influence of the uncertain versus variable disaggregation of model parameters. The analysis suggests that variability in human exposure parameters is greater than the uncertainty bounds on any particular fractile, given the described assumptions.
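The structure of such a two-dimensional analysis is sketched below: an outer loop over uncertainty (imperfectly known model inputs) and an inner loop over variability (true heterogeneity across fish and people), yielding uncertainty bands around each fractile of dose. The model is a one-line simplification, not the Gobas food chain model, and all distributions are invented.

```python
# Two-dimensional Monte Carlo separating uncertainty (outer) from
# variability (inner) for a fish-ingestion PCB dose; simple random sampling
# stands in for the Latin Hypercube design used in the article.
import numpy as np

rng = np.random.default_rng(5)
n_outer, n_inner = 200, 2000  # uncertainty x variability samples

fractile_doses = np.empty((n_outer, 3))
for i in range(n_outer):
    # Uncertainty: imperfectly known biomagnification factor, sediment level.
    bmf = rng.lognormal(np.log(2.0), 0.3)
    sediment_pcb = rng.normal(100.0, 10.0)  # ng/g, analytic uncertainty

    # Variability: true heterogeneity across fish and people.
    lipid = rng.lognormal(np.log(0.05), 0.2, n_inner)
    ingestion = rng.lognormal(np.log(20.0), 0.5, n_inner)  # g fish/day

    dose = sediment_pcb * bmf * lipid * ingestion / 70.0  # per 70-kg person
    fractile_doses[i] = np.percentile(dose, [50, 95, 99])

for p, col in zip((50, 95, 99), fractile_doses.T):
    lo, hi = np.percentile(col, 5), np.percentile(col, 95)
    print(f"{p}th pct dose: median {np.median(col):8.1f}, "
          f"90% uncertainty band [{lo:8.1f}, {hi:8.1f}]")
```

Comparing the spread across fractiles (variability) with the band around any one fractile (uncertainty) reproduces the kind of conclusion the abstract reports.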

11.
A limiting constraint of many management science techniques is that they do not consider inputs from the decision maker based upon experience, opinions, and intuition. For those models that do allow this type of input, it is assumed that such inputs can be accurately and precisely defined in a subjective probability distribution. Little attention, however, has been directed toward evaluating the techniques used to define these distributions in a management setting. This study investigates the relative merits of four of the most commonly used techniques for the quantification of subjective assessments. When these techniques were used by professionals whose jobs entail evaluation of uncertainty, a clear preference was shown. Additionally, some concluding observations concerning the selection and application of assessment techniques are presented.

12.
Human health risk assessments use point values to develop risk estimates and thus impart a deterministic character to risk, which, by definition, is a probability phenomenon. The risk estimates are calculated based on individuals and then, using uncertainty factors (UFs), are extrapolated to the population, which is characterized by variability. Regulatory agencies have recommended the quantification of the impact of variability in risk assessments through the application of probabilistic methods. In the present study, a framework that deals with the quantitative analysis of uncertainty (U) and variability (V) in target tissue dose in the population was developed by applying probabilistic analysis to physiologically-based toxicokinetic models. The mechanistic parameters that determine kinetics were described with probability density functions (PDFs). Since each PDF depicts the frequency of occurrence of all expected values of each parameter in the population, the combined effects of multiple sources of U/V were accounted for in the estimated distribution of tissue dose in the population, and a unified (adult and child) intraspecies toxicokinetic uncertainty factor UFH-TK was determined. The results show that the proposed framework accounts effectively for U/V in population toxicokinetics. The ratio of the 95th percentile to the 50th percentile of the annual average concentration of the chemical at the target tissue organ (i.e., the UFH-TK) varies with age. The ratio is equivalent to a unified intraspecies toxicokinetic UF, and it is one of the UFs by which the NOAEL can be divided to obtain the RfC/RfD. The 10-fold intraspecies UF is intended to account for uncertainty and variability in toxicokinetics (3.2x) and toxicodynamics (3.2x). This article deals exclusively with the toxicokinetic component of the UF. The framework provides an alternative to the default methodology and is advantageous in that the evaluation of toxicokinetic variability is based on the distribution of the effective target tissue dose, rather than applied dose. It allows for the replacement of the default adult and children intraspecies UF with toxicokinetic data-derived values and provides accurate chemical-specific estimates of their magnitude. It shows that proper application of probability and toxicokinetic theories can reduce uncertainties when establishing exposure limits for specific compounds and provide better assurance that established limits are adequately protective. It contributes to the development of a probabilistic noncancer risk assessment framework and will ultimately lead to the unification of cancer and noncancer risk assessment methodologies.
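The defining computation, the 95th/50th percentile ratio of target tissue dose, can be sketched with a one-compartment stand-in for a full PBTK model; the parameter PDFs below are illustrative assumptions, not fitted population data.

```python
# A data-derived intraspecies toxicokinetic UF as the 95th/50th percentile
# ratio of steady-state target tissue dose (one-compartment stand-in).
import numpy as np

rng = np.random.default_rng(11)
n = 100_000

intake = 1.0  # mg/day external exposure, held fixed across the population

# Population PDFs for the mechanistic determinants of kinetics (illustrative).
clearance = rng.lognormal(np.log(10.0), 0.4, n)  # L/day
partition = rng.lognormal(np.log(2.0), 0.25, n)  # tissue:blood partition

tissue_conc = partition * intake / clearance  # steady-state tissue dose

uf_h_tk = np.percentile(tissue_conc, 95) / np.percentile(tissue_conc, 50)
print(f"data-derived UFH-TK = {uf_h_tk:.2f} (default toxicokinetic half is 3.2)")
```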

13.
Monte Carlo simulations are commonplace in quantitative risk assessments (QRAs). Designed to propagate the variability and uncertainty associated with each individual exposure input parameter in a quantitative risk assessment, Monte Carlo methods statistically combine the individual parameter distributions to yield a single, overall distribution. Critical to such an assessment is the representativeness of each individual input distribution. The authors performed a literature review to collect and compare the distributions used in published QRAs for the parameters of body weight, food consumption, soil ingestion rates, breathing rates, and fluid intake. To provide a basis for comparison, all estimated exposure parameter distributions were evaluated with respect to four properties: consistency, accuracy, precision, and specificity. The results varied depending on the exposure parameter. Even where extensive, well-collected data exist, investigators used a variety of different distributional shapes to approximate these data. Where such data do not exist, investigators have collected their own data, often leading to substantial disparity in parameter estimates and subsequent choice of distribution. The present findings indicate that more attention must be paid to the data underlying these distributional choices. More emphasis should be placed on sensitivity analyses, quantifying the impact of assumptions, and on discussion of sources of variation as part of the presentation of any risk assessment results. If such practices and disclosures are followed, it is believed that Monte Carlo simulations can greatly enhance the accuracy and appropriateness of specific risk assessments. Without such disclosures, researchers will be increasing the size of the risk assessment "black box," a concern already raised by many critics of more traditional risk assessments.
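The sensitivity to distributional shape is easy to demonstrate: the sketch below moment-matches three common shapes to the same mean and standard deviation and compares their upper percentiles. The summary statistics and the exposure parameter are invented for illustration.

```python
# Same mean and SD, different shapes: the 95th percentile of an exposure
# input depends on the analyst's distributional choice.
import numpy as np
from scipy import stats

mean, sd = 30.0, 25.0  # summary statistics "supported by the data"

# Moment-matched candidate shapes an analyst might plausibly choose.
sigma = np.sqrt(np.log(1 + (sd / mean) ** 2))
mu = np.log(mean) - 0.5 * sigma ** 2
candidates = {
    "normal":    stats.norm(mean, sd),
    "lognormal": stats.lognorm(s=sigma, scale=np.exp(mu)),
    "gamma":     stats.gamma(a=(mean / sd) ** 2, scale=sd ** 2 / mean),
}
for name, dist in candidates.items():
    print(f"{name:>9}: 95th percentile = {dist.ppf(0.95):6.1f} g/day")
```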

14.
The Environmental Protection Agency's (EPA's) estimates of the benefits of improved air quality, especially from reduced mortality associated with reductions in fine particle concentrations, constitute the largest category of benefits from all federal regulation over the last decade. EPA develops such estimates, however, using an approach little changed since a 2002 report by the National Research Council (NRC), which was critical of EPA's methods and recommended a more comprehensive uncertainty analysis incorporating probability distributions for major sources of uncertainty. Consistent with the NRC's 2002 recommendations, we explore alternative assumptions and probability distributions for the major variables used to calculate the value of mortality benefits. For metropolitan Philadelphia, we show that uncertainty in air quality improvements and in baseline mortality have only modest effects on the distribution of estimated benefits. We analyze the effects of alternative assumptions regarding the value of reducing mortality risk, whether the toxicity is above or below the average for fine particles, and whether there is a threshold in the concentration-response relationship, and show these assumptions all have large effects on the distribution of benefits.
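The sketch below shows the generic shape of such a probabilistic benefits calculation: avoided deaths from a linearized concentration-response relationship, valued with an uncertain value of statistical life, with an optional threshold model mixed in. Every number here is illustrative, not EPA's or the Philadelphia case study's.

```python
# A probabilistic mortality-benefits calculation of the kind the abstract
# discusses; all inputs are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(21)
n = 100_000

population = 5.0e6
baseline_deaths = 0.008 * population          # annual all-cause deaths
delta_pm = rng.normal(2.0, 0.3, n)            # ug/m3 reduction (modest unc.)
beta = rng.normal(0.006, 0.002, n)            # linearized C-R slope per ug/m3
vsl = rng.lognormal(np.log(8.0e6), 0.5, n)    # value of a statistical life, $

# Optional threshold in the concentration-response relationship (50/50 weight).
threshold_active = rng.random(n) < 0.5
effective_delta = np.where(threshold_active,
                           np.maximum(delta_pm - 1.0, 0.0), delta_pm)

deaths_avoided = baseline_deaths * beta * effective_delta
benefits = deaths_avoided * vsl
print(f"benefits: median ${np.median(benefits)/1e9:.1f}B, "
      f"90% CI [${np.percentile(benefits, 5)/1e9:.1f}B, "
      f"${np.percentile(benefits, 95)/1e9:.1f}B]")
```

Toggling the VSL spread, the C-R slope, or the threshold weight moves the distribution far more than the delta_pm uncertainty does, which is the pattern the article reports.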

15.
The dose–response relationship between folate levels and cognitive impairment among individuals with vitamin B12 deficiency is an essential component of a risk-benefit analysis approach to regulatory and policy recommendations regarding folic acid fortification. Epidemiological studies provide data that are potentially useful for addressing this research question, but the lack of analysis and reporting of data in a manner suitable for dose–response purposes hinders the application of the traditional evidence synthesis process. This study aimed to estimate a quantitative dose–response relationship between folate exposure and the risk of cognitive impairment among older adults with vitamin B12 deficiency using "probabilistic meta-analysis," a novel approach for synthesizing data from observational studies. Second-order multistage regression was identified as the best-fit model for the association between the probability of cognitive impairment and serum folate levels based on data generated by randomly sampling probabilistic distributions with parameters estimated based on summarized information reported in relevant publications. The findings indicate a "J-shape" effect of serum folate levels on the occurrence of cognitive impairment. In particular, an excessive level of folate exposure is predicted to be associated with a higher risk of cognitive impairment, albeit with greater uncertainty than the association between low folate exposure and cognitive impairment. This study directly contributes to the development of a practical solution to synthesize observational evidence for dose–response assessment purposes, which will help strengthen future nutritional risk assessments for the purpose of informing decisions on nutrient fortification in food.

16.
The abandoned mine legacy is critical in many countries around the world, where mine cave-ins and surface subsidence disruptions are perpetual risks that can affect the population, infrastructure, historical legacies, land use, and the environment. This article establishes abandoned metal mine failure risk evaluation approaches and quantification techniques based on the Canadian mining experience. These utilize clear geomechanics considerations, such as failure mechanisms, which depend on well-defined rock mass parameters. Quantified risk is computed as the probability of failure (obtained probabilistically from limit-equilibrium factors of safety or from applicable numerical-modeling factor-of-safety quantifications) times a consequence impact value. Semi-quantified risk can be based on empirical data from failure case studies used in calculating the probability of failure, while personal experience can provide qualitative hazard and impact consequence assessments. The article provides outlines for land use and for the selection of remediation measures based on risk.
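The quantified-risk computation reduces to a short calculation: a probability of failure read off a factor-of-safety distribution, times a consequence impact value. The sketch below shows this with invented inputs.

```python
# Quantified risk = P(failure) x consequence, from a factor-of-safety
# distribution; all inputs are illustrative.
import numpy as np

rng = np.random.default_rng(13)
n = 100_000

# Limit-equilibrium factor of safety with rock mass parameter uncertainty
# propagated into FS (here summarized directly as a lognormal).
fs = rng.lognormal(mean=np.log(1.3), sigma=0.2, size=n)
prob_failure = np.mean(fs < 1.0)

consequence = 2.0e6  # impact value, e.g., dollars of land-use disruption
risk = prob_failure * consequence
print(f"P(FS < 1) = {prob_failure:.3f}; quantified risk = ${risk:,.0f}")
```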

17.
A method is proposed for integrated probabilistic risk assessment where exposure assessment and hazard characterization are both included in a probabilistic way. The aim is to specify the probability that a random individual from a defined (sub)population will have an exposure high enough to cause a particular health effect of a predefined magnitude, the critical effect size (CES). The exposure level that results in exactly that CES in a particular person is that person's individual critical effect dose (ICED). Individuals in a population typically show variation, both in their individual exposure (IEXP) and in their ICED. Both the variation in IEXP and the variation in ICED are quantified in the form of probability distributions. Assuming independence between both distributions, they are combined (by Monte Carlo) into a distribution of the individual margin of exposure (IMoE). The proportion of the IMoE distribution below unity is the probability of critical exposure (PoCE) in the particular (sub)population. Uncertainties involved in the overall risk assessment (i.e., regarding both exposure and effect assessment) are quantified using Monte Carlo and bootstrap methods. This results in an uncertainty distribution for any statistic of interest, such as the probability of critical exposure (PoCE). The method is illustrated based on data for the case of dietary exposure to the organophosphate acephate. We present plots that concisely summarize the probabilistic results, retaining the distinction between variability and uncertainty. We show how the relative contributions from the various sources of uncertainty involved may be quantified.
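The variability layer of the method is a compact Monte Carlo, sketched below with invented lognormal distributions (not the acephate case study's): IMoE = ICED / IEXP per individual, and PoCE is the fraction of the IMoE distribution below one.

```python
# The IMoE/PoCE computation: variability in individual exposure (IEXP) and
# individual critical effect dose (ICED) combined by Monte Carlo, assuming
# independence; parameter values are illustrative.
import numpy as np

rng = np.random.default_rng(17)
n = 1_000_000

iexp = rng.lognormal(np.log(0.002), 0.8, n)  # mg/kg-day, varies across people
iced = rng.lognormal(np.log(0.10), 0.6, n)   # dose giving the CES in a person

imoe = iced / iexp                           # individual margin of exposure
poce = np.mean(imoe < 1.0)                   # probability of critical exposure
print(f"PoCE = {poce:.2e}")
```

The uncertainty layer would then be added by bootstrapping the data behind each distribution's parameters and repeating this calculation, yielding an uncertainty distribution for PoCE itself.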

18.
The neurotoxic effects of chemical agents are often investigated in controlled studies on rodents, with binary and continuous multiple endpoints routinely collected. One goal is to conduct quantitative risk assessment to determine safe dose levels. Yu and Catalano (2005) describe a method for quantitative risk assessment for bivariate continuous outcomes by extending a univariate method of percentile regression. The model is likelihood based and allows for separate dose-response models for each outcome while accounting for the bivariate correlation. The approach to benchmark dose (BMD) estimation is analogous to that for quantal data without having to specify arbitrary cutoff values. In this article, we evaluate the behavior of the BMD relative to background rates, sample size, level of bivariate correlation, dose-response trend, and distributional assumptions. Using simulations, we explore the effects of these factors on the resulting BMD and BMDL distributions. In addition, we illustrate our method with data from a neurotoxicity study of parathion exposure in rats.

19.
To make the methodology of risk assessment more consistent with the realities of biological processes, a computer-based model of the carcinogenic process may be used. A previously developed probabilistic model, which is based on a two-stage theory of carcinogenesis, represents urinary bladder carcinogenesis at the cellular level with emphasis on quantification of cell dynamics: cell mitotic rates, cell loss and birth rates, and irreversible cellular transitions from normal to initiated to transformed states are explicitly accounted for. Analyses demonstrate the sensitivity of tumor incidence to the timing and magnitude of changes to these cellular variables. It is demonstrated that response in rats following administration of nongenotoxic compounds, such as sodium saccharin, can be explained entirely on the basis of cytotoxicity and consequent hyperplasia.
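The cell-dynamics bookkeeping behind such two-stage models can be sketched with the standard approximation in which the tumor hazard is the transformation rate times the expected initiated-cell count; the rates below are illustrative, not the model's calibrated values.

```python
# Two-stage cell kinetics: normal cells initiate at rate mu1, initiated cells
# divide/die (net rate b - d) and transform at rate mu2; the tumor hazard is
# approximated as mu2 times the expected initiated-cell count.
import numpy as np

mu1, mu2 = 1e-7, 1e-7   # initiation, transformation rates per cell per day
b, d = 0.012, 0.010     # initiated-cell birth/death rates per day
n_normal = 1e7          # target-tissue cells at risk
dt, t_end = 1.0, 730.0  # time step and horizon, days

x_init, cum_hazard = 0.0, 0.0
for _ in np.arange(0.0, t_end, dt):
    cum_hazard += mu2 * x_init * dt
    # Cytotoxicity-driven hyperplasia would act here, raising mu1 * n_normal
    # (more divisions of normal cells) and/or b (promotion of initiated clones).
    x_init += (mu1 * n_normal + (b - d) * x_init) * dt

print(f"P(tumor by day {t_end:.0f}) ~ {1 - np.exp(-cum_hazard):.4f}")
```

Raising the proliferation parameters without any change to the transition rates increases tumor probability, which is the mechanism by which cytotoxicity and hyperplasia alone can explain a nongenotoxic response.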

20.
Exposure guidelines for potentially toxic substances are often based on a reference dose (RfD) that is determined by dividing a no-observed-adverse-effect-level (NOAEL), lowest-observed-adverse-effect-level (LOAEL), or benchmark dose (BD) corresponding to a low level of risk, by a product of uncertainty factors. The uncertainty factors for animal to human extrapolation, variable sensitivities among humans, extrapolation from measured subchronic effects to unknown results for chronic exposures, and extrapolation from a LOAEL to a NOAEL can be thought of as random variables that vary from chemical to chemical. Selected databases are examined that provide distributions across chemicals of inter- and intraspecies effects, ratios of LOAELs to NOAELs, and differences in acute and chronic effects, to illustrate the determination of percentiles for uncertainty factors. The distributions of uncertainty factors tend to be approximately lognormally distributed. The logarithm of the product of independent uncertainty factors is approximately distributed as the sum of normally distributed variables, making it possible to estimate percentiles for the product. Hence, the size of the products of uncertainty factors can be selected to provide adequate safety for a large percentage (e.g., approximately 95%) of RfDs. For the databases used to describe the distributions of uncertainty factors, using values of 10 appears to be reasonable and conservative. For the databases examined, the following simple "Rule of 3s" is suggested, which exceeds the estimated 95th percentile of the product of uncertainty factors: if only a single uncertainty factor is required, use 33; for any two uncertainty factors, use 3 × 33 ≈ 100; for any three uncertainty factors, use a combined factor of 3 × 100 = 300; and if all four uncertainty factors are needed, use a total factor of 3 × 300 = 900. If approximately the 99th percentile is desired, use another factor of 3. An additional factor may be needed for inadequate data, or a modifying factor for other uncertainties (e.g., different routes of exposure) not covered above.
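The lognormal-product argument is easy to verify by simulation: logs of independent lognormal factors add, so percentiles of the product follow directly. The sketch below checks the Rule-of-3s totals against simulated 95th percentiles, using illustrative lognormal parameters chosen so that each individual factor has 10 as its 95th percentile (consistent with the abstract's remark that 10 is reasonable and conservative), not the databases' fitted values.

```python
# Products of lognormal uncertainty factors vs the "Rule of 3s" totals.
import numpy as np

rng = np.random.default_rng(29)
n = 1_000_000

gm = 3.0                                      # illustrative geometric mean
sigma = (np.log(10.0) - np.log(gm)) / 1.645   # so each UF's 95th pct is 10

rule_of_3s = {1: 33, 2: 100, 3: 300, 4: 900}
for k, rule in rule_of_3s.items():
    log_product = rng.normal(np.log(gm), sigma, (k, n)).sum(axis=0)
    p95 = np.exp(np.percentile(log_product, 95))
    print(f"{k} UFs: simulated 95th pct {p95:7.1f}  vs Rule-of-3s {rule}")
```

Under these assumptions the simulated 95th percentiles stay at or below the rule's values for one through four factors, illustrating how the rule builds in a margin over the estimated percentile.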
