Similar Articles
20 similar articles found (search time: 15 ms)
1.
Studies using open-ended response modes to elicit probabilistic beliefs have sometimes found an elevated frequency (or blip) at 50 in their response distributions. Our previous research suggests that this is caused by intrusion of the phrase "fifty-fifty," which represents epistemic uncertainty, rather than a true numeric probability of 50%. Such inappropriate responses pose a problem for decision analysts and others relying on probabilistic judgments. Using an explicit numeric probability scale (ranging from 0% to 100%) reduces thinking about uncertain events in verbal terms like "fifty-fifty" and, with it, exaggerated use of the 50 response. Here, we present two procedures for adjusting response distributions in data already collected with open-ended response modes and hence vulnerable to an exaggerated presence of 50%. Each procedure infers the prevalence of 50s had a numeric probability scale been used, then redistributes the excess. The two procedures are validated on some of our own existing data and then applied to judgments elicited from experts in groundwater pollution and bioremediation.
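The redistribution idea in this abstract can be illustrated with a minimal sketch. The `expected_rate` (the fraction of true 50% answers we would expect under a numeric scale) and the proportional redistribution rule are assumptions for illustration, not the authors' actual procedures:

```python
from collections import Counter

def deblip_fifty(responses, expected_rate):
    """Remove the excess mass at 50 from a set of probability judgments.

    expected_rate: assumed fraction of genuine 50% answers that would
    have been given had a numeric probability scale been used.
    """
    counts = Counter(responses)
    n = len(responses)
    observed_50 = counts[50]
    expected_50 = round(expected_rate * n)
    excess = max(observed_50 - expected_50, 0)
    if excess == 0:
        return dict(counts)
    # Redistribute the excess 50s proportionally over the other responses.
    others = {v: c for v, c in counts.items() if v != 50}
    total_others = sum(others.values())
    adjusted = {v: c + excess * c / total_others for v, c in others.items()}
    adjusted[50] = observed_50 - excess
    return adjusted

# Hypothetical open-ended elicitation data with a blip at 50.
data = [10, 50, 50, 50, 50, 90, 30, 70, 50, 50]
adj = deblip_fifty(data, expected_rate=0.2)
```

The total count is preserved; only the shape of the distribution changes.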

2.
The health-related damages associated with emissions from coal-fired power plants can vary greatly across facilities as a function of plant, site, and population characteristics, but the degree of variability and the contributing factors have not been formally evaluated. In this study, we modeled the monetized damages associated with 407 coal-fired power plants in the United States, focusing on premature mortality from fine particulate matter (PM2.5). We applied a reduced-form chemistry-transport model accounting for primary PM2.5 emissions and the influence of sulfur dioxide (SO2) and nitrogen oxide (NOx) emissions on secondary particulate formation. Outputs were linked with a concentration-response function for PM2.5-related mortality that incorporated nonlinearities and model uncertainty. We valued mortality with a value of statistical life approach, characterizing and propagating uncertainties in all model elements. At the median of the plant-specific uncertainty distributions, damages across plants ranged from $30,000 to $500,000 per ton of PM2.5, $6,000 to $50,000 per ton of SO2, $500 to $15,000 per ton of NOx, and $0.02 to $1.57 per kilowatt-hour of electricity generated. Variability in damages per ton of emissions was almost entirely explained by population exposure per unit emissions (intake fraction), which itself was related to atmospheric conditions and the population size at various distances from the power plant. Variability in damages per kilowatt-hour was highly correlated with SO2 emissions, related to fuel and control technology characteristics, but was also correlated with atmospheric conditions and population size at various distances. Our findings emphasize that control strategies that consider variability in damages across facilities would yield more efficient outcomes.
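The damage-per-ton logic follows an impact-pathway chain: emission, population intake, mortality, monetization. A heavily simplified linear sketch of that chain follows; every number (intake fraction, deaths per gram inhaled, value of statistical life) is hypothetical, and the study's actual model is far richer:

```python
def damages_per_ton(intake_fraction, deaths_per_gram_inhaled, vsl):
    """Monetized damage per metric ton emitted, via a linear
    impact-pathway chain (illustrative only)."""
    grams_inhaled = 1e6 * intake_fraction  # 1 metric ton = 1e6 g
    return grams_inhaled * deaths_per_gram_inhaled * vsl

# All inputs hypothetical: intake fraction 1e-5, toxicity 0.002
# deaths per gram inhaled, VSL $7.5 million.
d = damages_per_ton(1e-5, 0.002, 7.5e6)
# Doubling the intake fraction doubles the damage in this linear chain.
d2 = damages_per_ton(2e-5, 0.002, 7.5e6)
```

The chain makes clear why, as the abstract reports, intake fraction alone explains most of the across-plant variability in damages per ton.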

3.
Environmental tobacco smoke (ETS) is a major contributor to indoor human exposures to fine particulate matter of 2.5 μm or smaller (PM2.5). The Stochastic Human Exposure and Dose Simulation for Particulate Matter (SHEDS-PM) Model developed by the U.S. Environmental Protection Agency estimates distributions of outdoor and indoor PM2.5 exposure for a specified population based on ambient concentrations and indoor emissions sources. A critical assessment was conducted of the methodology and data used in SHEDS-PM for estimation of indoor exposure to ETS. For the residential microenvironment, SHEDS uses a mass-balance approach, which is comparable to best practices. The default inputs in SHEDS-PM were reviewed and more recent and extensive data sources were identified. Sensitivity analysis was used to determine which inputs should be prioritized for updating. Data regarding the proportion of smokers and "other smokers" and cigarette emission rate were found to be important. SHEDS-PM does not currently account for in-vehicle ETS exposure; however, in-vehicle ETS-related PM2.5 levels can exceed those in residential microenvironments by a factor of 10 or more. Therefore, a mass-balance-based methodology for estimating in-vehicle ETS PM2.5 concentration is evaluated. Recommendations are made regarding updating of input data and algorithms related to ETS exposure in the SHEDS-PM model. Interindividual variability for ETS exposure was quantified. Geographic variability in ETS exposure was quantified based on the varying prevalence of smokers in five selected locations in the United States.
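The mass-balance approach mentioned here has a standard steady-state form: indoor concentration balances penetration of outdoor particles against indoor emission, air exchange, and deposition. A minimal sketch, with all parameter values hypothetical:

```python
def indoor_pm25_steady_state(c_out, air_exchange, penetration,
                             deposition, emission_rate, volume):
    """Steady-state single-compartment mass balance.

    c_out          outdoor PM2.5 concentration (ug/m3)
    air_exchange   air exchange rate a (1/h)
    penetration    penetration factor P (unitless)
    deposition     deposition rate k (1/h)
    emission_rate  indoor source strength E (ug/h), e.g. from smoking
    volume         microenvironment volume V (m3)

    C_in = (P * a * C_out + E / V) / (a + k)
    """
    return (penetration * air_exchange * c_out + emission_rate / volume) / (
        air_exchange + deposition)

# Hypothetical residential values: with no indoor source the indoor
# level sits below the outdoor level; an ETS source raises it sharply.
c_no_smoke = indoor_pm25_steady_state(15.0, 0.5, 0.8, 0.2, 0.0, 300.0)
c_smoke = indoor_pm25_steady_state(15.0, 0.5, 0.8, 0.2, 14000.0, 300.0)
```

The same equation with a much smaller volume and higher air-exchange rate is the starting point for the in-vehicle extension discussed in the abstract.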

4.
In environmental risk management, there are often interests in maximizing public health benefits (efficiency) and addressing inequality in the distribution of health outcomes. However, both dimensions are not generally considered within a single analytical framework. In this study, we estimate both total population health benefits and changes in quantitative indicators of health inequality for a number of alternative spatial distributions of diesel particulate filter retrofits across half of an urban bus fleet in Boston, Massachusetts. We focus on the impact of emissions controls on primary fine particulate matter (PM2.5) emissions, modeling the effect on PM2.5 concentrations and premature mortality. Given spatial heterogeneity in baseline mortality rates, we apply the Atkinson index and other inequality indicators to quantify changes in the distribution of mortality risk. Across the different spatial distributions of control strategies, the public health benefits varied by more than a factor of two, related to factors such as mileage driven per day, population density near roadways, and baseline mortality rates in exposed populations. Changes in health inequality indicators varied across control strategies, with the subset of optimal strategies considering both efficiency and equality generally robust across different parametric assumptions and inequality indicators. Our analysis demonstrates the viability of formal analytical approaches to jointly address both efficiency and equality in risk assessment, providing a tool for decisionmakers who wish to consider both issues.
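The Atkinson index applied here is a standard inequality measure. A minimal sketch (for the common case of an inequality-aversion parameter epsilon != 1; the risk values are hypothetical):

```python
def atkinson_index(risks, epsilon=0.75):
    """Atkinson inequality index over a distribution of risks.

    epsilon is the inequality-aversion parameter: epsilon = 0 means
    indifference to inequality, and larger values weight the worst-off
    more heavily.  Assumes epsilon != 1.
    """
    n = len(risks)
    mean = sum(risks) / n
    # Equally-distributed-equivalent (EDE) level of risk.
    ede = (sum(r ** (1.0 - epsilon) for r in risks) / n) ** (1.0 / (1.0 - epsilon))
    return 1.0 - ede / mean

# Hypothetical mortality risks (per 10,000) across four neighborhoods.
equal = [2.0, 2.0, 2.0, 2.0]
unequal = [0.5, 1.0, 2.5, 4.0]
a_eq = atkinson_index(equal)
a_uneq = atkinson_index(unequal)
```

A perfectly equal distribution scores 0; more unequal distributions score closer to 1, which is what lets the study rank retrofit strategies on equality as well as efficiency.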

5.
Mixed Levels of Uncertainty in Complex Policy Models   (cited 3 times: 0 self-citations, 3 by others)
The characterization and treatment of uncertainty poses special challenges when modeling indeterminate or complex coupled systems such as those involved in the interactions between human activity, climate, and the ecosystem. Uncertainty about model structure may become as important as, or more important than, uncertainty about parameter values. When uncertainty grows so large that prediction or optimization no longer makes sense, it may still be possible to use the model as a behavioral test bed to examine the relative robustness of alternative observational and behavioral strategies. When models must be run into portions of their phase space that are not well understood, different submodels may become unreliable at different rates. A common example involves running a time-stepped model far into the future. Several strategies can be used to deal with such situations. The probability of model failure can be reported as a function of time. Possible alternative surprises can be assigned probabilities, modeled separately, and combined. Finally, through the use of subjective judgments, one may be able to combine, and over time shift between, models, moving from more detailed to progressively simpler order-of-magnitude models, and perhaps ultimately to simple bounding analysis.

6.
One of the common challenges for life cycle impact assessment and risk assessment is the need to estimate the population exposures associated with emissions. The concept of intake fraction (a unitless term representing the fraction of material or its precursor released from a source that is eventually inhaled or ingested) can be used when limited site data are available or the number of sources to model is large. Although studies have estimated intake fractions for some pollutant-source combinations, there is a need to quickly and accurately estimate intake fractions for sources and settings not previously evaluated. It would be expected that limited source or site information could be used to yield intake fraction estimates with reasonable accuracy. To test this theory, we developed regression models to predict intake fractions previously estimated for primary fine particles (PM2.5) and secondary sulfate and nitrate particles from power plants and mobile sources in the United States. Our regression models were able to predict pollutant-specific intake fractions with R2 between 0.53 and 0.86 and equations that reflected expected relationships (e.g., intake fraction increased with population density, stack height influenced the intake fraction of primary but not secondary particles). Further analysis would be needed to generalize beyond this case study and construct models applicable across source categories and settings, but our analysis demonstrates that inclusion of a limited number of parameters can significantly reduce the uncertainty in population-average exposure estimates.
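The regression idea can be sketched in its simplest form: a one-predictor ordinary least squares fit with an R2 diagnostic. The data pairs below (log population density vs. log intake fraction) are entirely hypothetical, and the study's actual models use several predictors:

```python
def fit_simple_ols(xs, ys):
    """Closed-form OLS for y = a + b*x, returning (a, b, R^2)."""
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    b = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
    a = my - b * mx
    # R^2: fraction of variance in y explained by the predictor.
    ss_res = sum((y - (a + b * x)) ** 2 for x, y in zip(xs, ys))
    ss_tot = sum((y - my) ** 2 for y in ys)
    return a, b, 1.0 - ss_res / ss_tot

# Hypothetical (log10 population density, log10 intake fraction) pairs.
density = [4.0, 5.0, 6.0, 7.0, 8.0]
log_if = [-7.1, -6.6, -6.2, -5.5, -5.1]
a, b, r2 = fit_simple_ols(density, log_if)
```

A positive slope `b` mirrors the expected relationship reported in the abstract: intake fraction increases with population density.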

7.
    
Risk Analysis, 2018, 38(1): 163-176
The U.S. Environmental Protection Agency (EPA) uses health risk assessment to help inform its decisions in setting national ambient air quality standards (NAAQS). EPA's standard approach is to make epidemiologically-based risk estimates based on a single statistical model selected from the scientific literature, called the "core" model. The uncertainty presented for "core" risk estimates reflects only the statistical uncertainty associated with that one model's concentration-response function parameter estimate(s). However, epidemiologically-based risk estimates are also subject to "model uncertainty," which is a lack of knowledge about which of many plausible model specifications and data sets best reflects the true relationship between health and ambient pollutant concentrations. In 2002, a National Academies of Sciences (NAS) committee recommended that model uncertainty be integrated into EPA's standard risk analysis approach. This article discusses how model uncertainty can be taken into account with an integrated uncertainty analysis (IUA) of health risk estimates. It provides an illustrative numerical example based on risk of premature death from respiratory mortality due to long-term exposures to ambient ozone, which is a health risk considered in the 2015 ozone NAAQS decision. This example demonstrates that use of IUA to quantitatively incorporate key model uncertainties into risk estimates produces a substantially altered understanding of the potential public health gain of a NAAQS policy decision, and that IUA can also produce more helpful insights to guide that decision, such as evidence of decreasing incremental health gains from progressive tightening of a NAAQS.
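One way to integrate model uncertainty of the kind described here is a two-stage Monte Carlo: first draw a model according to subjective weights, then draw a concentration-response slope from that model's statistical uncertainty. The models, weights, and slope distributions below are hypothetical placeholders, not the article's actual inputs:

```python
import random

def iua_risk_samples(models, n=10000, seed=7):
    """Two-stage Monte Carlo over risk-per-unit-concentration.

    models: list of (name, weight, slope_mean, slope_sd) tuples.
    Stage 1 samples a model (model uncertainty); stage 2 samples its
    slope (statistical uncertainty).  Slopes are truncated at zero.
    """
    rng = random.Random(seed)
    weights = [m[1] for m in models]
    out = []
    for _ in range(n):
        name, w, mean, sd = rng.choices(models, weights=weights)[0]
        out.append(max(rng.gauss(mean, sd), 0.0))
    return out

# Hypothetical alternative concentration-response specifications.
models = [
    ("linear-no-threshold", 0.5, 1.0, 0.2),
    ("threshold", 0.3, 0.4, 0.2),
    ("null", 0.2, 0.0, 0.0),
]
samples = iua_risk_samples(models)
```

The resulting distribution is typically wider, and often differently centered, than the single "core" model's confidence interval, which is the altered understanding the abstract describes.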

8.
Treatment of Uncertainty in Performance Assessments for Complex Systems   (cited 13 times: 0 self-citations, 13 by others)
When viewed at a high level, performance assessments (PAs) for complex systems involve two types of uncertainty: stochastic uncertainty, which arises because the system under study can behave in many different ways, and subjective uncertainty, which arises from a lack of knowledge about quantities required within the computational implementation of the PA. Stochastic uncertainty is typically incorporated into a PA with an experimental design based on importance sampling and leads to the final results of the PA being expressed as a complementary cumulative distribution function (CCDF). Subjective uncertainty is usually treated with Monte Carlo techniques and leads to a distribution of CCDFs. This presentation discusses the use of the Kaplan/Garrick ordered triple representation for risk in maintaining a distinction between stochastic and subjective uncertainty in PAs for complex systems. The topics discussed include (1) the definition of scenarios and the calculation of scenario probabilities and consequences, (2) the separation of subjective and stochastic uncertainties, (3) the construction of CCDFs required in comparisons with regulatory standards (e.g., 40 CFR Part 191, Subpart B for the disposal of radioactive waste), and (4) the performance of uncertainty and sensitivity studies. Results obtained in a preliminary PA for the Waste Isolation Pilot Plant, an uncertainty and sensitivity analysis of the MACCS reactor accident consequence analysis model, and the NUREG-1150 probabilistic risk assessments are used for illustration.
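The "distribution of CCDFs" structure can be sketched with a nested Monte Carlo loop: an outer loop over subjectively uncertain parameters and an inner loop over stochastic realizations, yielding one CCDF per outer draw. The exponential consequence model and parameter range below are purely illustrative:

```python
import random

def ccdf(values, thresholds):
    """Empirical complementary CDF: P(consequence > c) for each c."""
    n = len(values)
    return [sum(v > c for v in values) / n for c in thresholds]

def nested_uncertainty(n_epistemic=50, n_stochastic=400, seed=1):
    """Outer loop: subjective uncertainty about a rate parameter.
    Inner loop: stochastic variation in consequences.
    Returns one CCDF per epistemic sample."""
    rng = random.Random(seed)
    thresholds = [0.5, 1.0, 2.0, 4.0]
    curves = []
    for _ in range(n_epistemic):
        rate = rng.uniform(0.5, 2.0)            # imprecisely known mean
        draws = [rng.expovariate(1.0 / rate)    # aleatory variation
                 for _ in range(n_stochastic)]
        curves.append(ccdf(draws, thresholds))
    return thresholds, curves

thresholds, curves = nested_uncertainty()
```

Summarizing the family of curves (e.g., its pointwise mean and percentiles) is how such results are compared with regulatory standards.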

9.
In the analysis of the risk associated with rare events that may lead to catastrophic consequences with large uncertainty, it is questionable whether the knowledge and information available for the analysis can be reflected properly by probabilities. Approaches other than purely probabilistic have been suggested, for example, using interval probabilities, possibilistic measures, or qualitative methods. In this article, we look into the problem and identify a number of issues that are foundational for its treatment. The foundational issues addressed reflect on the position that "probability is perfect" and take into open consideration the need for an extended framework for risk assessment that reflects the separation that practically exists between analyst and decisionmaker.

10.
The conceptual and computational structure of a performance assessment (PA) for the Waste Isolation Pilot Plant (WIPP) is described. Important parts of this structure are (1) maintenance of a separation between stochastic (i.e., aleatory) and subjective (i.e., epistemic) uncertainty, with stochastic uncertainty arising from the many possible disruptions that could occur over the 10,000-year regulatory period that applies to the WIPP, and subjective uncertainty arising from the imprecision with which many of the quantities required in the analysis are known, (2) use of Latin hypercube sampling to incorporate the effects of subjective uncertainty, (3) use of Monte Carlo (i.e., random) sampling to incorporate the effects of stochastic uncertainty, and (4) efficient use of the necessarily limited number of mechanistic calculations that can be performed to support the analysis. The WIPP is under development by the U.S. Department of Energy (DOE) for the geologic (i.e., deep underground) disposal of transuranic (TRU) waste, with the indicated PA supporting a Compliance Certification Application (CCA) by the DOE to the U.S. Environmental Protection Agency (EPA) in October 1996 for the necessary certifications for the WIPP to begin operation. The EPA certified the WIPP for the disposal of TRU waste in May 1998, with the result that the WIPP will be the first operational facility in the United States for the geologic disposal of radioactive waste.
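Latin hypercube sampling, item (2) above, stratifies each uncertain variable's range so that even a small sample covers it evenly. A minimal sketch on the unit cube (mapping strata to actual parameter distributions is omitted):

```python
import random

def latin_hypercube(n_samples, n_vars, seed=0):
    """Latin hypercube sample on [0,1)^n_vars.

    Each variable's range is split into n_samples equal strata; each
    stratum is used exactly once per variable, and strata are paired
    at random across variables.
    """
    rng = random.Random(seed)
    columns = []
    for _ in range(n_vars):
        strata = list(range(n_samples))
        rng.shuffle(strata)
        # One uniform draw inside each stratum.
        columns.append([(s + rng.random()) / n_samples for s in strata])
    # Transpose: one row per sample point.
    return [[columns[j][i] for j in range(n_vars)] for i in range(n_samples)]

pts = latin_hypercube(10, 3)
```

With only 10 points, every tenth of every variable's range is represented, which is why LHS is favored when each model run is expensive, as in item (4).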

11.
    
This perspective presents empirical data to demonstrate the existence of different expert views on scientific policy advice on complex environmental health issues. These views are partly research-field specific. According to the scientific literature, experts differ in the way they provide policy advice on complex issues such as electromagnetic fields (EMF), particulate matter (PM), and antimicrobial resistance (AMR). Where some experts feel their primary task is to carry out fundamental research, others actively engage in the policy dialogue. Although the literature provides ideas about expert roles, little empirical underpinning exists. Our aim is to gather empirical evidence about expert roles. The results of an international study indicated that experts on EMF, PM, and AMR differ in the way they view their role in the policy dialogue. For example, experts differed in their views on the need for precaution and their motivation to initiate stakeholder cooperation. Moreover, most experts thought that their views on the risks of EMF/PM/AMR did not differ from those of colleagues. Considerable dissensus was found in views on the best ways of managing risks and uncertainties. In conclusion, the theoretical ideal-typical roles from the literature can be identified to a certain extent.

12.
Yifan Zhang, Risk Analysis, 2013, 33(1): 109-120
Expert judgment (or expert elicitation) is a formal process for eliciting judgments from subject-matter experts about the value of a decision-relevant quantity. Judgments in the form of subjective probability distributions are obtained from several experts, raising the question of how best to combine information from multiple experts. A number of algorithmic approaches have been proposed, of which the most commonly employed is the equal-weight combination (the average of the experts' distributions). We evaluate the properties of five combination methods (equal-weight, best-expert, performance, frequentist, and copula) using simulated expert-judgment data for which we know the process generating the experts' distributions. We examine cases in which two well-calibrated experts are of equal or unequal quality and their judgments are independent, positively or negatively dependent. In this setting, the copula, frequentist, and best-expert approaches perform better and the equal-weight combination method performs worse than the alternative approaches.
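The equal-weight combination is a linear opinion pool: at every point, average the experts' distribution functions. A minimal sketch with two hypothetical experts holding uniform-distribution judgments:

```python
def equal_weight_pool(expert_cdfs, x):
    """Linear opinion pool: the average of the experts' CDFs at x."""
    return sum(cdf(x) for cdf in expert_cdfs) / len(expert_cdfs)

def uniform_cdf(lo, hi):
    """CDF of a uniform distribution on [lo, hi]."""
    def cdf(x):
        return min(max((x - lo) / (hi - lo), 0.0), 1.0)
    return cdf

# Two hypothetical experts with overlapping but different judgments.
experts = [uniform_cdf(0.0, 10.0), uniform_cdf(5.0, 15.0)]
pooled_at_5 = equal_weight_pool(experts, 5.0)    # (0.5 + 0.0) / 2
pooled_at_10 = equal_weight_pool(experts, 10.0)  # (1.0 + 0.5) / 2
```

Because the pool ignores expert quality and dependence entirely, it is easy to see why, in the simulation setting of this abstract, methods that estimate those features can outperform it.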

13.
Safety systems are important components of high-consequence systems that are intended to prevent the unintended operation of the system and thus the potentially significant negative consequences that could result from such an operation. This presentation investigates and illustrates formal procedures for assessing the uncertainty in the probability that a safety system will fail to operate as intended in an accident environment. Probability theory and evidence theory are introduced as possible mathematical structures for the representation of the epistemic uncertainty associated with the performance of safety systems, and a representation of this type is illustrated with a hypothetical safety system involving one weak link and one strong link that is exposed to a high temperature fire environment. Topics considered include (1) the nature of diffuse uncertainty information involving a system and its environment, (2) the conversion of diffuse uncertainty information into the mathematical structures associated with probability theory and evidence theory, and (3) the propagation of these uncertainty structures through a model for a safety system to obtain representations in the context of probability theory and evidence theory of the uncertainty in the probability that the safety system will fail to operate as intended. The results suggest that evidence theory provides a potentially valuable representational tool for the display of the implications of significant epistemic uncertainty in inputs to complex analyses.
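Evidence theory replaces a single probability with a belief/plausibility interval computed from masses assigned to *sets* of outcomes. A minimal sketch; the focal sets and masses below are a hypothetical stand-in for the fire-environment evidence discussed in the abstract:

```python
def belief(masses, event):
    """Belief: total mass of focal sets wholly contained in the event."""
    return sum(m for focal, m in masses if focal <= event)

def plausibility(masses, event):
    """Plausibility: total mass of focal sets intersecting the event."""
    return sum(m for focal, m in masses if focal & event)

# Hypothetical evidence about which temperature regime the fire reaches.
# Focal sets are sets of regimes; masses sum to 1.
masses = [
    (frozenset({"low"}), 0.2),
    (frozenset({"low", "medium"}), 0.5),          # imprecise evidence
    (frozenset({"low", "medium", "high"}), 0.3),  # total ignorance
]
fails = frozenset({"medium", "high"})  # regimes where the link fails
bel = belief(masses, fails)
pl = plausibility(masses, fails)
```

The gap between `bel` and `pl` displays exactly the epistemic uncertainty that a single probability would hide, which is the representational advantage the abstract points to.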

14.
    
Air pollution has been linked to an increased risk of several respiratory diseases in children, especially respiratory tract infections. The present study aims to evaluate the association between pediatric emergency department (PED) presentations for bronchiolitis and air pollution. PED presentations due to bronchiolitis in children aged less than 1 year were retrospectively collected from 2007 to 2018 in Padova, Italy, together with daily environmental data. A conditional logistic regression based on a time-stratified case-crossover design was performed to evaluate the association between PED presentations and exposure to NO2, PM2.5, and PM10. Models were adjusted for temperature, relative humidity, atmospheric pressure, and public holidays. Delayed effects in time were evaluated using distributed lag non-linear models. Odds ratios for lagged exposures from 0 to 14 days were obtained. Overall, 2251 children presented to the PED for bronchiolitis. Infants' exposure to higher concentrations of PM10 and PM2.5 in the 5 days before the presentation to the PED increased the risk of accessing the PED by more than 10%, whereas high concentrations of NO2 between 2 and 12 days before the PED presentation were associated with an increased risk of up to 30%. The association between pollutants and infants who required hospitalization was even greater. A cumulative effect of NO2 among the 2 weeks preceding the presentation was also observed. In summary, PM and NO2 concentrations are associated with PED presentations and hospitalizations for bronchiolitis. Exposure of infants to air pollution could damage the respiratory tract mucosa, facilitating viral infections and exacerbating symptoms.

15.
Tim Bedford, Risk Analysis, 2013, 33(10): 1884-1898
Group risk is usually represented by FN curves showing the frequency of different accident sizes for a given activity. Many governments regulate group risk through FN criterion lines, which define the tolerable location of an FN curve. However, to compare different risk reduction alternatives, one must be able to rank FN curves. The two main problems in doing this are that the FN curve contains multiple frequencies, and that there are usually large epistemic uncertainties about the curve. Since the mid-1970s, a number of authors have used the concept of "disutility" to summarize FN curves, in which a family of disutility functions was defined with a single parameter controlling the degree of "risk aversion." Here, we show this family to be risk neutral, disaster averse, and insensitive to epistemic uncertainty on accident frequencies. A new approach is outlined that has a number of attractive properties. The formulation allows us to distinguish between risk aversion and disaster aversion, two concepts that have been confused in the literature until now. A two-parameter family of disutilities generalizing the previous approach is defined, where one parameter controls risk aversion and the other disaster aversion. The family is sensitive to epistemic uncertainties. Such disutilities may, for example, be used to compare the impact of system design changes on group risks, or might form the basis for valuing reductions in group risk in a cost-benefit analysis.
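The core idea of summarizing an FN curve with a disutility can be sketched with a deliberately simplified one-parameter form, D = sum of f(N) * N^alpha, where alpha > 1 penalizes large accidents. This is an illustrative stand-in for, not the paper's actual, two-parameter family:

```python
def fn_disutility(fn_pairs, alpha=1.2):
    """Disutility summary of an FN frequency spectrum.

    fn_pairs: (frequency per year, fatalities N) for each accident size.
    alpha = 1 reproduces expected fatalities; alpha > 1 weights large
    accidents more heavily (a stand-in for disaster aversion).
    """
    return sum(f * (n ** alpha) for f, n in fn_pairs)

# Two hypothetical activities with the SAME expected fatalities (0.1/yr)
# but different accident-size profiles.
many_small = [(0.01, 10)]     # frequent small accidents
one_large = [(0.0001, 1000)]  # rare catastrophe
d_small = fn_disutility(many_small, alpha=1.5)
d_large = fn_disutility(one_large, alpha=1.5)
```

With alpha = 1 the two activities are indistinguishable; with alpha > 1 the rare catastrophe scores worse, which is the ranking behavior a disaster-averse regulator wants an FN summary to deliver.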

16.
    
A better understanding of the uncertainty that exists in models used for seismic risk assessment is critical to improving risk-based decisions pertaining to earthquake safety. Current models estimating the probability of collapse of a building do not consider comprehensively the nature and impact of uncertainty. This article presents a model framework to enhance seismic risk assessment and thus gives decisionmakers a fuller understanding of the nature and limitations of the estimates. This can help ensure that risks are not over- or underestimated and the value of acquiring accurate data is appreciated fully. The methodology presented provides a novel treatment of uncertainties in input variables, their propagation through the model, and their effect on the results. The study presents ranges of possible annual collapse probabilities for different case studies on buildings in different parts of the world, exposed to different levels of seismicity, and with different vulnerabilities. A global sensitivity analysis was conducted to determine the significance of uncertain variables. Two key outcomes are (1) that the uncertainty in ground-motion conversion equations has the largest effect on the uncertainty in the calculation of annual collapse probability; and (2) the vulnerability of a building appears to have an effect on the range of annual collapse probabilities produced, i.e., the level of uncertainty in the estimate of annual collapse probability, with less vulnerable buildings having a smaller uncertainty.

17.
Demand for air travel is projected to increase in the upcoming years, with a corresponding influence on emissions, air quality, and public health. The trajectory of health impacts would be influenced by not just emissions growth, but also changes in nonaviation ambient concentrations that influence secondary fine particulate matter (PM2.5) formation, population growth and aging, and potential shifts in PM2.5 concentration-response functions (CRFs). However, studies to date have not systematically evaluated the individual and joint contributions of these factors to health risk trajectories. In this study, we simulated emissions during landing and takeoff from aircraft at 99 airports across the United States for 2005 and for a 2025 flight activity projection scenario. We applied the Community Multiscale Air Quality (CMAQ) model with the Speciated Modeled Attainment Test (SMAT) to determine the contributions of these emissions to ambient concentrations, including scenarios with 2025 aircraft emissions and 2005 nonaviation air quality. We combined CMAQ outputs with PM2.5 mortality CRFs and population projections, and evaluated the influence of changing emissions, nonaviation concentrations, and population factors. Given these scenarios, aviation-related health impacts would increase by a factor of 6.1 from 2005 to 2025, with a factor of 2.1 attributable to emissions, a factor of 1.3 attributable to population factors, and a factor of 2.3 attributable to changing nonaviation concentrations which enhance secondary PM2.5 formation. Our study emphasizes that the public health burden of aviation emissions would be significantly influenced by the joint effects of flight activity increases, nonaviation concentration changes, and population growth and aging.

18.
A sequence of linear, monotonic, and nonmonotonic test problems is used to illustrate sampling-based uncertainty and sensitivity analysis procedures. Uncertainty results obtained with replicated random and Latin hypercube samples are compared, with the Latin hypercube samples tending to produce more stable results than the random samples. Sensitivity results obtained with the following procedures and/or measures are illustrated and compared: correlation coefficients (CCs), rank correlation coefficients (RCCs), common means (CMNs), common locations (CLs), common medians (CMDs), statistical independence (SI), standardized regression coefficients (SRCs), partial correlation coefficients (PCCs), standardized rank regression coefficients (SRRCs), partial rank correlation coefficients (PRCCs), stepwise regression analysis with raw and rank-transformed data, and examination of scatter plots. The effectiveness of a given procedure and/or measure depends on the characteristics of the individual test problems, with (1) linear measures (i.e., CCs, PCCs, SRCs) performing well on the linear test problems, (2) measures based on rank transforms (i.e., RCCs, PRCCs, SRRCs) performing well on the monotonic test problems, and (3) measures predicated on searches for nonrandom patterns (i.e., CMNs, CLs, CMDs, SI) performing well on the nonmonotonic test problems.
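The contrast between linear and rank-based measures, points (1) and (2) above, is easy to demonstrate: a rank correlation (Pearson on rank-transformed data) recovers a perfect monotonic relationship that an ordinary correlation coefficient understates. A minimal sketch on hypothetical data with no ties:

```python
def pearson(xs, ys):
    """Ordinary (linear) correlation coefficient."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

def ranks(vs):
    """Rank transform (1-based; assumes no ties)."""
    order = sorted(range(len(vs)), key=lambda i: vs[i])
    r = [0] * len(vs)
    for rank, i in enumerate(order):
        r[i] = rank + 1
    return r

def spearman(xs, ys):
    """Rank correlation coefficient: Pearson on rank-transformed data."""
    return pearson(ranks(xs), ranks(ys))

xs = [1, 2, 3, 4, 5, 6]
ys = [x ** 5 for x in xs]  # strongly monotonic but very nonlinear
cc = pearson(xs, ys)
rcc = spearman(xs, ys)
```

The same rank transform underlies the SRRC and PRCC measures compared in the abstract.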

19.
In 2002, the U.S. Environmental Protection Agency (EPA) released an "Interim Policy on Genomics," stating a commitment to developing guidance on the inclusion of genetic information in regulatory decision making. This statement was followed in 2004 by a document exploring the potential implications. Genetic information can play a key role in understanding and quantifying human susceptibility, an essential step in many of the risk assessments used to shape policy. For example, the federal Clean Air Act (CAA) requires EPA to set National Ambient Air Quality Standards (NAAQS) for criteria pollutants at levels to protect even sensitive populations from adverse health effects with an adequate margin of safety. Asthmatics are generally regarded as a sensitive population, yet substantial research gaps in understanding genetic susceptibility and disease have hindered quantitative risk analysis. This case study assesses the potential role of genomic information regarding susceptible populations in the NAAQS process for fine particulate matter (PM2.5) under the CAA. In this initial assessment, we model the contribution of a single polymorphism to asthma risk and mortality risk; however, multiple polymorphisms and interactions (gene-gene and gene-environment) are known to play key roles in the disease process. We show that the impact of new information about susceptibility on estimates of population risk or average risk derived from large epidemiological studies depends on the circumstances. We also suggest that analysis of a single polymorphism, or other risk factor such as health status, may or may not change estimates of individual risk enough to alter a particular regulatory decision, but this depends on specific characteristics of the decision and risk information. We also show how new information about susceptibility in the context of the NAAQS for PM2.5 could have a large impact on the estimated distribution of individual risk. This would occur if a group were consequently identified (based on genetic and/or disease status) that accounted for a disproportionate share of observed effects. Our results highlight certain conditions under which genetic information is likely to have an impact on risk estimates and the balance of costs and benefits within groups, and highlight critical research needs. As future studies explore more fully the relationship between exposure, genetic makeup, and disease status, the opportunity for genetic information and disease status to play pivotal roles in regulation can only increase.
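The way a susceptible subgroup reshapes the distribution of individual risk without changing the population average can be illustrated with a simple decomposition. All inputs below (population relative risk, variant prevalence, subgroup risk ratio) are hypothetical, and real gene-environment structure is far more complex than this two-group split:

```python
def subgroup_risks(pop_rr, prevalence, rr_ratio):
    """Split a population-average relative risk into susceptible and
    non-susceptible subgroup relative risks.

    Solves: pop_excess = prevalence*excess_s + (1-prevalence)*excess_n,
    with excess_s = rr_ratio * excess_n (illustrative assumption).
    """
    pop_excess = pop_rr - 1.0
    excess_n = pop_excess / (prevalence * rr_ratio + (1.0 - prevalence))
    excess_s = rr_ratio * excess_n
    return 1.0 + excess_s, 1.0 + excess_n

# Hypothetical: 10% of the population carries a variant conferring
# 10x the excess risk, within a population-average RR of 1.06.
rr_s, rr_n = subgroup_risks(pop_rr=1.06, prevalence=0.10, rr_ratio=10.0)
```

The population-average risk is unchanged by construction, but the susceptible group's individual risk is several times the average, which is the kind of shift in the risk distribution the case study examines.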

20.
This paper suggests a behavioral definition of (subjective) ambiguity in an abstract setting where objects of choice are Savage-style acts. Then axioms are described that deliver probabilistic sophistication of preference on the set of unambiguous acts. In particular, both the domain and the values of the decision-maker's probability measure are derived from preference. It is argued that the noted result also provides a decision-theoretic foundation for the Knightian distinction between risk and ambiguity.
