Similar Literature
 Found 20 similar articles (search time: 312 ms)
1.
Elodie Adida. Risk Analysis, 2011, 31(10): 1622-1631
An effective nonpharmaceutical intervention for influenza interrupts an exposure route that contributes significantly to infection risk. Herein, we use uncertainty analysis (point‐interval method) and Monte Carlo simulation to explore the magnitude of infection risk and the predominant route of exposure. We used a previously published mathematical model of a susceptible person attending a bed‐ridden infectious person. Infection risk is sensitive to the magnitude of virus emission and contact rates. The contribution of droplet spray exposure to infection risk increases with cough frequency and decreases with virus concentration in cough particles. We consider two infectivity scenarios: greater infectivity of virus deposited in the upper respiratory tract than of virus inhaled in respirable aerosols, based on human studies; and equal infectivity in the two locations, based on studies in guinea pigs. When virus has an equal probability of causing infection throughout the respiratory tract, the mean overall infection risk is 9.8 × 10⁻² (95th percentile 0.78). However, when virus in the upper respiratory tract is less infectious than inhaled virus, the overall infection risk is several orders of magnitude lower. In this event, inhalation is a significant exposure route. Contact transmission is important in both infectivity scenarios. The presence of virus in only respirable particles increases the mean overall infection risk by 1–3 orders of magnitude, with inhalation contributing ≥99% of the infection risk. The analysis indicates that reducing uncertainties in the concentration of virus in expiratory particles of different sizes, expiratory event frequency, and infectivity at different sites in the respiratory tract will clarify the predominant exposure routes for influenza.
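To make the simulation approach concrete, the following is a minimal sketch of a Monte Carlo comparison of exposure routes under an exponential dose-response; every distribution and infectivity value in it is a hypothetical placeholder, not a parameter from the study.

```python
import numpy as np

rng = np.random.default_rng(42)
n = 100_000

# Hypothetical route-specific doses (virions), lognormally distributed.
dose_contact = rng.lognormal(mean=0.0, sigma=1.5, size=n)
dose_droplet = rng.lognormal(mean=-1.0, sigma=1.5, size=n)
dose_inhaled = rng.lognormal(mean=-2.0, sigma=1.5, size=n)

# Exponential dose-response with site-specific infectivity (hypothetical):
# contact and droplet spray deposit in the upper respiratory tract,
# inhaled respirable aerosols reach the lower respiratory tract.
r_upper, r_lower = 1e-3, 1e-2
p_infect = 1.0 - np.exp(-(r_upper * (dose_contact + dose_droplet)
                          + r_lower * dose_inhaled))

print(f"mean infection risk: {p_infect.mean():.3g}")
print(f"95th percentile:     {np.percentile(p_infect, 95):.3g}")
```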

2.
A probabilistic and interdisciplinary risk–benefit assessment (RBA) model integrating microbiological, nutritional, and chemical components was developed for infant milk, with the objective of predicting the health impact of different consumption scenarios. Infant feeding is a particular concern in RBA because breast milk and powdered infant formula have both been associated with risks and benefits related to chemicals, bacteria, and nutrients; hence the model considers all three facets. Cronobacter sakazakii, dioxin‐like polychlorinated biphenyls (dl‐PCB), and docosahexaenoic acid (DHA) were the three risk/benefit factors selected as key issues in microbiology, chemistry, and nutrition, respectively. The model was probabilistic, with variability and uncertainty separated using a second‐order Monte Carlo simulation process. In this study, the advantages and limitations of undertaking probabilistic and interdisciplinary RBA are discussed. In particular, the probabilistic technique proved powerful in dealing with missing data and in translating assumptions into quantitative inputs while taking uncertainty into account. In addition, separating variability from uncertainty strengthened the interpretation of the model outputs by enabling better distinction of natural heterogeneity from lack of knowledge. Interdisciplinary RBA is necessary to give more structured conclusions and to avoid contradictory messages to policymakers and consumers, leading to more decisive food recommendations. This assessment provides a conceptual development of the RBA methodology and a robust basis on which to build.
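As a sketch of the second-order Monte Carlo idea, the nested loop below keeps uncertainty (outer draws) separate from variability (inner draws); the dose-response slope and intake distribution are invented placeholders, not the model's actual inputs.

```python
import numpy as np

rng = np.random.default_rng(0)
n_unc, n_var = 200, 5_000       # outer (uncertainty) x inner (variability)

risks = np.empty((n_unc, n_var))
for i in range(n_unc):
    # Outer loop: one draw of an imprecisely known parameter
    # (e.g., a dose-response slope).
    slope = rng.lognormal(mean=np.log(1e-4), sigma=0.5)
    # Inner loop: variability across infants (e.g., daily intake).
    intake = rng.lognormal(mean=np.log(500.0), sigma=0.3, size=n_var)
    risks[i] = 1.0 - np.exp(-slope * intake)

# Percentiles within a row reflect variability; the spread across rows is
# the uncertainty about each variability percentile.
median_risk = np.median(risks, axis=1)
print("95% uncertainty interval on the median risk:",
      np.percentile(median_risk, [2.5, 97.5]))
```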

3.
The objectives of this study are to understand tradeoffs between forest carbon and timber values, and evaluate the impact of uncertainty in improved forest management (IFM) carbon offset projects to improve forest management decisions. The study uses probabilistic simulation of uncertainty in financial risk for three management scenarios (clearcutting in 45‐ and 65‐year rotations and no harvest) under three carbon price schemes (historic voluntary market prices, cap and trade, and carbon prices set to equal net present value (NPV) from timber‐oriented management). Uncertainty is modeled for value and amount of carbon credits and wood products, the accuracy of forest growth model forecasts, and four other variables relevant to American Carbon Registry methodology. Calculations use forest inventory data from a 1,740 ha forest in western Washington State, using the Forest Vegetation Simulator (FVS) growth model. Sensitivity analysis shows that FVS model uncertainty contributes more than 70% to overall NPV variance, followed in importance by variability in inventory sample (3–14%), and short‐term prices for timber products (8%), while variability in carbon credit price has little influence (1.1%). At regional average land‐holding costs, a no‐harvest management scenario would become revenue‐positive at a carbon credit break‐point price of $14.17/Mg carbon dioxide equivalent (CO2e). IFM carbon projects are associated with a greater chance of both large payouts and large losses to landowners. These results inform policymakers and forest owners of the carbon credit price necessary for IFM approaches to equal or better the business‐as‐usual strategy, while highlighting the magnitude of financial risk and reward through probabilistic simulation.
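A hedged sketch of the break-even logic: the carbon credit price at which a no-harvest scenario's credit revenue matches the timber-oriented NPV plus project costs. All monetary and credit-volume distributions below are invented, not the study's values.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 50_000

npv_timber = rng.normal(4.0e6, 6.0e5, n)  # timber-management NPV ($), invented
credits    = rng.normal(2.8e5, 4.0e4, n)  # saleable credits (Mg CO2e), invented
costs      = rng.normal(9.0e5, 1.0e5, n)  # project/holding costs ($), invented

# Price per Mg CO2e at which no-harvest revenue equals the timber NPV.
breakeven = (npv_timber + costs) / credits
print(f"median break-even price: ${np.median(breakeven):.2f}/Mg CO2e")
print("90% interval:", np.round(np.percentile(breakeven, [5, 95]), 2))
```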

4.
Probabilistic seismic risk analysis is a well‐established method in the insurance industry for modeling portfolio losses from earthquake events. In this context, precise exposure locations are often unknown. So far, however, location uncertainty has received relatively little research attention. In this article, we propose a novel framework for the treatment of location uncertainty. As a case study, a large number of synthetic portfolios resembling typical real‐world cases were created. We investigate the effect of portfolio characteristics such as value distribution, portfolio size, or proportion of risk items with unknown coordinates on the variability of loss frequency estimations. The results indicate that, due to loss aggregation effects and spatial hazard variability, location uncertainty in isolation and in conjunction with ground motion uncertainty can induce significant variability in probabilistic loss results, especially for portfolios with a small number of risks. After quantifying its effect, we conclude that location uncertainty should not be neglected when assessing probabilistic seismic risk, but should be treated stochastically, and the resulting variability should be visualized and interpreted carefully.

5.
Nanomaterials are finding application in many different environmentally relevant products and processes due to the enhanced catalytic, antimicrobial, and oxidative properties of materials at this scale. As the market share of nano‐functionalized products increases, so too does the potential for environmental exposure and contamination. This study presents exposure‐ranking methods that consider the potential surface water exposure and fate of metallic nanomaterials from nano‐functionalized products through a number of exposure pathways. These methods take into account the limited and disparate data currently available for metallic nanomaterials and apply variability and uncertainty principles, together with qualitative risk assessment principles, to develop a scientific ranking. Three exposure scenarios with three different nanomaterials were considered to demonstrate these assessment methods: photo‐catalytic exterior paint (nano‐scale TiO2), antimicrobial food packaging (nano‐scale Ag), and particulate‐reducing diesel fuel additives (nano‐scale CeO2). Data and hypotheses from the literature relating to metallic nanomaterial aquatic behavior (including the behavior of materials that may relate to nanomaterials in aquatic environments, e.g., metals, pesticides, surfactants) were used together with commercial nanomaterial characteristics and Irish natural aquatic environment characteristics to rank the potential concentrations, transport, and persistence behaviors within subjective categories. These methods, and the applied scenarios, reveal where data critical to estimating exposure and risk are lacking. As research into the behavior of metallic nanomaterials in different environments emerges, the influence of material and environmental characteristics on nanomaterial behavior within these exposure‐ and risk‐ranking methods may be redefined on a quantitative basis.

6.
The purpose of this article is to quantify the public health risk associated with inhalation of indoor airborne infection using a probabilistic transmission dynamic modeling approach. We used the Wells-Riley mathematical model to estimate (1) the CO2 exposure concentrations in indoor environments where cases of inhalation airborne infection occurred, based on reported epidemiological data and epidemic curves for influenza and severe acute respiratory syndrome (SARS); (2) the basic reproductive number, R0 (i.e., the expected number of secondary cases on the introduction of a single infected individual into a completely susceptible population), and its variability in a shared indoor airspace; and (3) the risk of infection in various exposure scenarios in a susceptible population for a range of R0. We also employ a standard susceptible-infectious-recovered (SIR) structure to relate the Wells-Riley-derived R0 to a transmission parameter, implicating the relationship between indoor carbon dioxide concentration and contact rate. We estimate that a single case of SARS will infect 2.6 secondary cases on average in a population through nosocomial transmission, whereas less than 1 secondary infection was generated per case among school children. We also obtained an estimate of the basic reproductive number for influenza in a commercial airliner: the median value is 10.4. We suggest that improving the building air cleaning rate to lower the critical rebreathed fraction of indoor air can decrease the transmission rate. Here, we show that organism virulence, infectious quantum generation rates (quanta/s produced by an infected person), and host factors determine the risk of inhalation of indoor airborne infection.
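The Wells-Riley relation itself is standard; a minimal implementation follows, with illustrative occupancy values that are not taken from the article.

```python
import math

def wells_riley(I, q, p, t, Q):
    """P(infection) = 1 - exp(-I*q*p*t/Q): I infectors, q quanta/h generated,
    p breathing rate (m^3/h), t exposure time (h), Q ventilation (m^3/h)."""
    return 1.0 - math.exp(-I * q * p * t / Q)

# Hypothetical classroom: 1 infector, 10 quanta/h, 0.5 m^3/h breathing,
# 6 h exposure, 500 m^3/h of outdoor/cleaned air supplied.
P = wells_riley(I=1, q=10.0, p=0.5, t=6.0, Q=500.0)
print(f"P(infection) = {P:.3f}")        # ~0.058
print(f"R0 estimate  = {P * 29:.2f}")   # 29 susceptible occupants, hypothetical
```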

7.
Jan F. Van Impe. Risk Analysis, 2011, 31(8): 1295-1307
The aim of quantitative microbiological risk assessment is to estimate the risk of illness caused by the presence of a pathogen in a food type, and to study the impact of interventions. Because of inherent variability and uncertainty, risk assessments are generally conducted stochastically, and where possible it is advised to characterize variability separately from uncertainty. Sensitivity analysis indicates to which input variables the outcome of a quantitative microbiological risk assessment is most sensitive. Although a number of methods exist to apply sensitivity analysis to a risk assessment with probabilistic input variables (such as contamination, storage temperature, or storage duration), it is challenging to perform sensitivity analysis when a risk assessment includes a separate characterization of variability and uncertainty of input variables. A procedure is proposed that focuses on the relation between risk estimates obtained by Monte Carlo simulation and the location of pseudo‐randomly sampled input variables within the uncertainty and variability distributions. Within this procedure, two methods—an ANOVA‐like model and Sobol sensitivity indices—are used to obtain and compare the impact of variability and of uncertainty of all input variables, and of model uncertainty and scenario uncertainty. As a case study, this methodology is applied to a risk assessment estimating the risk of contracting listeriosis from consumption of deli meats.
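As an illustration of the variance-based side of that comparison, the sketch below estimates first-order Sobol-style indices as Var(E[Y|X_i])/Var(Y) by conditioning on quantile bins; the toy growth model and its inputs are invented, not the case study's.

```python
import numpy as np

rng = np.random.default_rng(7)
n = 200_000
temp = rng.normal(7.0, 2.0, n)      # storage temperature, deg C (variability)
days = rng.uniform(1.0, 20.0, n)    # storage duration, days (variability)
log_count = 2.0 + 0.08 * temp * days   # toy growth model -> log10 CFU/g

def first_order_index(x, y, bins=50):
    """Estimate Var(E[Y|X])/Var(Y) via conditional means on quantile bins."""
    edges = np.quantile(x, np.linspace(0.0, 1.0, bins + 1))
    idx = np.clip(np.searchsorted(edges, x) - 1, 0, bins - 1)
    cond_means = np.array([y[idx == b].mean() for b in range(bins)])
    return cond_means.var() / y.var()

print("S_temperature ~", round(first_order_index(temp, log_count), 2))
print("S_duration    ~", round(first_order_index(days, log_count), 2))
```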

8.
Children may be more susceptible to toxicity from some environmental chemicals than adults. This susceptibility may occur during narrow age periods (windows), which can last from days to years depending on the toxicant. Breathing rates specific to narrow age periods are useful to assess inhalation dose during suspected windows of susceptibility. Because existing breathing rates used in risk assessment are typically for broad age ranges or are based on data not representative of the population, we derived daily breathing rates for narrow age ranges of children designed to be more representative of the current U.S. children's population. These rates were derived using the metabolic conversion method of Layton (1993) and energy intake data adjusted to represent the U.S. population from a relatively recent dietary survey (CSFII 1994–1996, 1998). We calculated conversion factors more specific to children than those previously used. Both nonnormalized (L/day) and normalized (L/kg-day) breathing rates were derived and found comparable to rates derived using energy estimates that are accurate for the individuals sampled but not representative of the population. Estimates of breathing rate variability within a population can be used with stochastic techniques to characterize the range of risk in the population from inhalation exposures. For each age and age-gender group, we present the mean, standard error of the mean, percentiles (50th, 90th, and 95th), geometric mean, standard deviation, 95th percentile, and best-fit parametric models of the breathing rate distributions. The standard errors characterize uncertainty in the parameter estimate, while the percentiles describe the combined interindividual and intra-individual variability of the sampled population. These breathing rates can be used for risk assessment of subchronic and chronic inhalation exposures of narrow age groups of children.
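The metabolic conversion named above follows Layton's (1993) relation VE = E × H × VQ. The sketch uses commonly cited values of H ≈ 0.05 L O2/kJ and VQ ≈ 27, and an invented child energy intake rather than CSFII data.

```python
def breathing_rate_l_per_day(energy_kcal_day, h=0.05, vq=27.0):
    """VE = E * H * VQ: E energy expenditure (kJ/day), H oxygen uptake per
    unit energy (L O2/kJ), VQ ventilatory equivalent (unitless)."""
    energy_kj = energy_kcal_day * 4.184      # kcal -> kJ
    return energy_kj * h * vq                # litres of air per day

kcal = 1_600                                 # invented intake for a 6-year-old
ve = breathing_rate_l_per_day(kcal)
print(f"{ve:.0f} L/day = {ve / 1000:.1f} m^3/day")
print(f"normalized: {ve / 20:.0f} L/kg-day at an assumed 20 kg body weight")
```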

9.
Roger Cooke. Risk Analysis, 2010, 30(3): 330-339
The practice of uncertainty factors as applied to noncancer endpoints in the IRIS database harkens back to traditional safety factors. In the era before risk quantification, these were used to build in a “margin of safety.” As risk quantification takes hold, the safety factor methods yield to quantitative risk calculations to guarantee safety. Many authors believe that uncertainty factors can be given a probabilistic interpretation as ratios of response rates, and that the reference values computed according to the IRIS methodology can thus be converted to random variables whose distributions can be computed with Monte Carlo methods, based on the distributions of the uncertainty factors. Recent proposals from the National Research Council echo this view. Based on probabilistic arguments, several authors claim that the current practice of uncertainty factors is overprotective. When interpreted probabilistically, uncertainty factors entail very strong assumptions on the underlying response rates. For example, the factor for extrapolating from animal to human is the same whether the dosage is chronic or subchronic. Together with independence assumptions, these assumptions entail that the covariance matrix of the logged response rates is singular. In other words, the accumulated assumptions entail a log‐linear dependence between the response rates. This in turn means that any uncertainty analysis based on these assumptions is ill‐conditioned; it effectively computes uncertainty conditional on a set of zero probability. The practice of uncertainty factors is due for a thorough review. Two directions are briefly sketched, one based on standard regression models, and one based on nonparametric continuous Bayesian belief nets.
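The probabilistic reading the article critiques can be sketched as a Monte Carlo product of lognormal uncertainty factors dividing a point of departure; the distributions below are invented for illustration, not values proposed by any methodology.

```python
import numpy as np

rng = np.random.default_rng(3)
n = 100_000
pod = 10.0                    # point of departure, mg/kg-day (invented)

uf_animal = rng.lognormal(np.log(10.0), 0.4, n)  # animal -> human
uf_human  = rng.lognormal(np.log(10.0), 0.4, n)  # sensitive subpopulations
uf_sub    = rng.lognormal(np.log(10.0), 0.4, n)  # subchronic -> chronic

rfd = pod / (uf_animal * uf_human * uf_sub)
print(f"deterministic reference value: {pod / 1000:.2e} mg/kg-day")
print(f"5th percentile, probabilistic: {np.percentile(rfd, 5):.2e} mg/kg-day")
```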

10.
Dose‐response models are the essential link between exposure assessment and computed risk values in quantitative microbial risk assessment, yet the uncertainty that is inherent to computed risks because the dose‐response model parameters are estimated using limited epidemiological data is rarely quantified. Second‐order risk characterization approaches incorporating uncertainty in dose‐response model parameters can provide more complete information to decisionmakers by separating variability and uncertainty to quantify the uncertainty in computed risks. Therefore, the objective of this work is to develop procedures to sample from posterior distributions describing uncertainty in the parameters of exponential and beta‐Poisson dose‐response models using Bayes's theorem and Markov Chain Monte Carlo (in OpenBUGS). The theoretical origins of the beta‐Poisson dose‐response model are used to identify a decomposed version of the model that enables Bayesian analysis without the need to evaluate Kummer confluent hypergeometric functions. Herein, it is also established that the beta distribution in the beta‐Poisson dose‐response model cannot address variation among individual pathogens, criteria to validate use of the conventional approximation to the beta‐Poisson model are proposed, and simple algorithms to evaluate actual beta‐Poisson probabilities of infection are investigated. The developed MCMC procedures are applied to analysis of a case study data set, and it is demonstrated that an important region of the posterior distribution of the beta‐Poisson dose‐response model parameters is attributable to the absence of low‐dose data. This region includes beta‐Poisson models for which the conventional approximation is especially invalid and in which many beta distributions have an extreme shape with questionable plausibility.
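The two dose-response forms under analysis are standard; a minimal sketch with illustrative parameter values follows.

```python
import numpy as np

def p_exponential(dose, r):
    """Exponential model: P(d) = 1 - exp(-r d)."""
    return 1.0 - np.exp(-r * dose)

def p_beta_poisson_approx(dose, alpha, beta):
    """Conventional beta-Poisson approximation: P(d) = 1 - (1 + d/beta)^(-alpha).
    The exact model requires the Kummer confluent hypergeometric function."""
    return 1.0 - (1.0 + dose / beta) ** (-alpha)

doses = np.logspace(0, 5, 6)     # 1 to 1e5 organisms
print(p_exponential(doses, r=1e-3))
print(p_beta_poisson_approx(doses, alpha=0.25, beta=50.0))
```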

11.
Over the last decade the health and environmental research communities have made significant progress in collecting and improving access to genomic, toxicology, exposure, health, and disease data useful to health risk assessment. One of the barriers to applying these growing volumes of information in fields such as risk assessment is the lack of informatics tools to organize, curate, and evaluate thousands of journal publications and hundreds of databases to provide new insights on relationships among exposure, hazard, and disease burden. Many fields are developing ontologies as a way of organizing and analyzing large amounts of complex information from multiple scientific disciplines. Ontologies include a vocabulary of terms and concepts with defined logical relationships to each other. Building from the recently published exposure ontology and other relevant health and environmental ontologies, this article proposes an ontology for health risk assessment (RsO) that provides a structural framework for organizing risk assessment information and methods. The RsO is anchored by eight major concepts that were identified either by exploratory curations of the risk literature or by the exposure‐ontology working group as key to describing the risk assessment domain. These concepts are: (1) stressor, (2) receptor, (3) outcome, (4) exposure event, (5) dose‐response approach, (6) dose‐response metric, (7) uncertainty, and (8) measure of risk. We illustrate the utility of these concepts for the RsO with example curations of published risk assessments for ionizing radiation, arsenic in drinking water, and persistent pollutants in salmon.

12.
Risk Analysis, 2018, 38(6): 1183-1201
In assessing environmental health risks, the risk characterization step synthesizes information gathered in evaluating exposures to stressors together with dose–response relationships, characteristics of the exposed population, and external environmental conditions. This article summarizes key steps of a cumulative risk assessment (CRA) followed by a discussion of considerations for characterizing cumulative risks. Cumulative risk characterizations differ considerably from single chemical‐ or single source‐based risk characterization. First, CRAs typically focus on a specific population instead of a pollutant or pollutant source and should include an evaluation of all relevant sources contributing to the exposures in the population and other factors that influence dose–response relationships. Second, CRAs may include influential environmental and population‐specific conditions, involving multiple chemical and nonchemical stressors. Third, a CRA could examine multiple health effects, reflecting joint toxicity and the potential for toxicological interactions. Fourth, the complexities often necessitate simplifying methods, including judgment‐based and semi‐quantitative indices that collapse disparate data into numerical scores. Fifth, because of the higher dimensionality and potentially large number of interactions, the information needed to quantify risk is typically incomplete, necessitating an uncertainty analysis. Three approaches that could be used for characterizing risks in a CRA are presented: the multiroute hazard index, stressor grouping by exposure and toxicity, and indices for screening multiple factors and conditions. Other key roles of the risk characterization in CRAs are also described, mainly the translational aspect of including a characterization summary for lay readers (in addition to the technical analysis), and placing the results in the context of the likely risk‐based decisions.
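Of the three approaches, the multiroute hazard index is the most mechanical: sum each route's hazard quotient (dose over its reference value). A minimal sketch with invented doses and reference values:

```python
# Route-specific doses and reference values (mg/kg-day), all invented.
routes = {
    "oral":       {"dose": 0.0020, "ref": 0.020},
    "inhalation": {"dose": 0.0010, "ref": 0.005},
    "dermal":     {"dose": 0.0004, "ref": 0.010},
}

hq = {route: v["dose"] / v["ref"] for route, v in routes.items()}
hi = sum(hq.values())
print(hq)
print(f"multiroute HI = {hi:.2f} ({'potential concern' if hi > 1 else 'below 1'})")
```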

13.
Biomarkers such as DNA adducts have significant potential to improve quantitative risk assessment by characterizing individual differences in metabolism of genotoxins and DNA repair, and by accounting for some of the factors that could affect interindividual variation in cancer risk. Inherent uncertainty in laboratory measurements and within-person variability of DNA adduct levels over time are putatively unrelated to cancer risk and should be subtracted from observed variation to better estimate interindividual variability of response to carcinogen exposure. A total of 41 volunteers, both smokers and nonsmokers, were asked to provide a peripheral blood sample every 3 weeks for several months in order to specifically assess intraindividual variability of polycyclic aromatic hydrocarbon (PAH)-DNA adduct levels. The intraindividual variance in PAH-DNA adduct levels, together with measurement uncertainty (laboratory variability and unaccounted-for differences in exposure), constituted roughly 30% of the overall variance. An estimated 70% of the total variance was contributed by interindividual variability and is probably representative of the true biologic variability of response to carcinogenic exposure in lymphocytes. The estimated interindividual variability in DNA damage after subtracting intraindividual variability and measurement uncertainty was 24-fold. Interindividual variance was higher (52-fold) in persons who constitutively lack the glutathione S-transferase M1 (GSTM1) gene, which is important in the detoxification pathway of PAH. Risk assessment models that do not consider the variability of susceptibility to DNA damage following carcinogen exposure may underestimate risks to the general population, especially for those people who are most vulnerable.
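A hedged arithmetic sketch of the variance partition described above, with invented log-scale variance components rather than the study's estimates:

```python
import math

var_total   = 1.00    # Var(log adduct level), invented units
var_intra   = 0.20    # within-person variability over time
var_measure = 0.10    # laboratory/measurement error
var_inter   = var_total - var_intra - var_measure   # 70% of the total

# Fold-range spanned by the central 95% of people, assuming lognormality.
fold = math.exp(2 * 1.96 * math.sqrt(var_inter))
print(f"interindividual share: {var_inter / var_total:.0%}")
print(f"~{fold:.0f}-fold range across the central 95% of individuals")
```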

14.
Biomagnification of organochlorine and other persistent organic contaminants by higher trophic level organisms represents one of the most significant sources of uncertainty and variability in evaluating potential risks associated with disposal of dredged materials. While it is important to distinguish between population variability (e.g., true population heterogeneity in fish weight and lipid content) and uncertainty (e.g., measurement error), they can be operationally difficult to define separately in probabilistic estimates of human health and ecological risk. We propose a disaggregation of uncertain and variable parameters based on: (1) availability of supporting data; (2) the specific management and regulatory context (in this case, of the U.S. Army Corps of Engineers/U.S. Environmental Protection Agency tiered approach to dredged material management); and (3) professional judgment and experience in conducting probabilistic risk assessments. We describe and quantitatively evaluate several sources of uncertainty and variability in estimating risk to human health from trophic transfer of polychlorinated biphenyls (PCBs) using a case study of sediments obtained from the New York-New Jersey Harbor and being evaluated for disposal at an open water off-shore disposal site within the northeast region. The estimates of PCB concentrations in fish and dietary doses of PCBs to humans ingesting fish are expressed as distributions of values, of which the arithmetic mean or mode represents a particular fractile. The distribution of risk values is obtained using a food-chain biomagnification model developed by Gobas, specifying distributions for input parameters disaggregated to represent either uncertainty or variability. Only those sources of uncertainty that could be quantified were included in the analysis. Results for several different two-dimensional Latin Hypercube analyses are provided to evaluate the influence of the uncertain versus variable disaggregation of model parameters. The analysis suggests that variability in human exposure parameters is greater than the uncertainty bounds on any particular fractile, given the described assumptions.

15.
Safety analysis of rare events with potentially catastrophic consequences is challenged by data scarcity and uncertainty. Traditional causation‐based approaches, such as fault trees and event trees (used to model rare events), suffer from a number of weaknesses. These include the static structure of the event causation, lack of event occurrence data, and the need for reliable prior information. In this study, a new hierarchical Bayesian modeling‐based technique is proposed to overcome these drawbacks. The proposed technique can be used as a flexible technique for risk analysis of major accidents. It enables both forward and backward analysis in quantitative reasoning and the treatment of interdependence among the model parameters. Source‐to‐source variability in data sources is also taken into account through a robust probabilistic safety analysis. The applicability of the proposed technique has been demonstrated through a case study in the marine and offshore industry.

16.
For safe innovation, knowledge on potential human health impacts is essential. Ideally, these impacts are considered within a larger life‐cycle‐based context to support sustainable development of new applications and products. A methodological framework that accounts for human health impacts caused by inhalation of engineered nanomaterials (ENMs) in an indoor air environment has been previously developed. The objectives of this study are as follows: (i) evaluate the feasibility of applying the characterization factor (CF) framework for nanoparticle (NP) exposure in the workplace based on currently available data; and (ii) supplement any resulting knowledge gaps with methods and data from the life cycle approach and human risk assessment (LICARA) project to develop a modified case‐specific version of the framework that will enable near‐term inclusion of NP human health impacts in life cycle assessment (LCA) using a case study involving nanoscale titanium dioxide (nanoTiO2). The intent is to enhance typical LCA with elements of regulatory risk assessment, including its more detailed measure of uncertainty. The proof‐of‐principle demonstration of the framework highlighted the lack of available data for both the workplace emissions and human health effects of ENMs that is needed to calculate generalizable characterization factors using common human health impact assessment practices in LCA. The alternative approach of using intake fractions derived from workplace air concentration measurements and effect factors based on best‐available toxicity data supported the current case‐by‐case approach for assessing the human health life cycle impacts of ENMs. Ultimately, the proposed framework and calculations demonstrate the potential utility of integrating elements of risk assessment with LCA for ENMs once the data are available.
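A sketch of the characterization-factor arithmetic the framework relies on, per common LCA practice (CF = intake fraction × effect factor); every number below is a hypothetical placeholder, not measured nanoTiO2 data.

```python
intake_fraction = 3e-4   # kg inhaled per kg emitted to workplace air (invented)
effect_factor   = 0.05   # health cases per kg inhaled (invented)
cf = intake_fraction * effect_factor      # cases per kg emitted

emission_kg = 12.0       # ENM emitted over the life cycle stage (invented)
print(f"CF = {cf:.2e} cases/kg emitted")
print(f"impact score = {cf * emission_kg:.2e} cases")
```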

17.
This article examines demand, manufacturing, and supply factors proposed to inhibit manufacturer delivery execution. Extant research proposes many factors expected to harm delivery performance. Prior cross‐sectional empirical research examines such factors at the plant level, generally finding factors arising from dynamic complexity to be significant, but factors arising from detail complexity to be insignificant. Little empirical research examines the factors using product‐level operating data, which arguably makes more sense for analyzing how supply chain complexity factors inhibit delivery. For purposes of research triangulation, we use longitudinal product‐level data from MRP systems to examine whether the factors inhibit internal manufacturing on‐time job rates and three customer‐oriented measures of delivery performance: product line item fill rates, average delivery lead times, and average tardiness. Our econometric models pool product line item data across division plants and within distinct product families, using a proprietary monthly dataset on over 100 product line items from the environmental controls manufacturing division of a Fortune 100 conglomerate. The data summarize customer ordering events of over 900 customers and supply chain activities of over 80 suppliers. The study contributes academically by finding significant detail complexity inhibitors of delivery that prior studies found insignificant. The findings demonstrate the need for empirical research using data disaggregated below the plant‐level unit of analysis, as they illustrate how some factors previously found insignificant indeed are significant when considered at the product‐level unit of analysis. Managers can use the findings to understand better which drivers and inhibitors of delivery performance are important.

18.
The environmental health goals of many Native American tribes are to restore natural resources and ensure that they are safe to harvest and consume in traditional subsistence quantities. Therefore, it is important to tribes to accurately estimate risks incurred through the consumption of subsistence foods. This article explores problems in conventional fish consumption survey methods used in widely cited tribal fish consumption reports. The problems arise because of the following: (1) widely cited reports do not clearly state what they intend to do with the data supporting these reports, (2) data collection methods are incongruent with community norms and protocols, (3) data analysis methods omit or obscure the highest consumer subset of the population, (4) lack of understanding or recognition of tribal health co‐risk factors, and (5) restrictive policies that do not allow inclusion of tribal values within state or federal actions. In particular, the data collection and analysis methods in current tribal fish consumption surveys result in the misunderstanding that tribal members are satisfied with eating lower contemporary amounts of fish and shellfish, rather than the subsistence amounts that their cultural heritage and aboriginal rights indicate. A community‐based interview method developed in collaboration with and used by the Swinomish Tribe is suggested as a way to gather more accurate information on contemporary consumption rates. For traditional subsistence rates, a multidisciplinary reconstruction method is recommended.

19.
Currently, there is a trend away from the use of single (often conservative) estimates of risk to summarize the results of risk analyses in favor of stochastic methods, which provide a more complete characterization of risk. The use of such stochastic methods leads to a distribution of possible values of risk, taking into account both uncertainty and variability in all of the factors affecting risk. In this article, we propose a general framework for the analysis of uncertainty and variability for use in the commonly encountered case of multiplicative risk models, in which risk may be expressed as a product of two or more risk factors. Our analytical methods facilitate the evaluation of overall uncertainty and variability in risk assessment, as well as the contributions of individual risk factors to both uncertainty and variability, which is cumbersome using Monte Carlo methods. The use of these methods is illustrated in the analysis of potential cancer risks due to the ingestion of radon in drinking water.
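For lognormal factors, the analytic propagation such a multiplicative framework permits is direct: log-risk means and variances add, so each factor's share of the total variance falls out without simulation. The GM/GSD values below are invented, not the article's.

```python
import numpy as np

factors = {   # geometric mean, geometric SD (all invented)
    "radon concentration (Bq/L)": (5.0, 2.5),
    "water intake (L/day)":       (1.2, 1.6),
    "risk per unit intake":       (1e-6, 3.0),
}

# For R = X1 * X2 * X3 with lognormal Xi, log R is normal with summed
# means and variances.
mu  = sum(np.log(gm) for gm, _ in factors.values())
var = sum(np.log(gsd) ** 2 for _, gsd in factors.values())

print(f"risk GM = {np.exp(mu):.2e}, GSD = {np.exp(np.sqrt(var)):.2f}")
for name, (_, gsd) in factors.items():
    print(f"  {name}: {np.log(gsd) ** 2 / var:.0%} of Var(log risk)")
```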

20.
Many environmental and risk management decisions are made jointly by technical experts and members of the public. Frequently, their task is to select from among management alternatives whose outcomes are subject to varying degrees of uncertainty. Although it is recognized that how this uncertainty is interpreted can significantly affect decision‐making processes and choices, little research has examined similarities and differences between expert and public understandings of uncertainty. We present results from a web‐based survey that directly compares expert and lay interpretations and understandings of different expressions of uncertainty in the context of evaluating the consequences of proposed environmental management actions. Participants responded to two hypothetical but realistic scenarios involving trade‐offs between environmental and other objectives and were asked a series of questions about their comprehension of the uncertainty information, their preferred choice among the alternatives, and the associated difficulty and amount of effort. Results demonstrate that experts and laypersons tend to use presentations of numerical ranges and evaluative labels differently; interestingly, the observed differences between the two groups were not explained by differences in numeracy or concerns for the predicted environmental losses. These findings question many of the usual presumptions about how uncertainty should be presented as part of deliberative risk‐ and environmental‐management processes.

