Similar Articles
20 similar articles found
1.
Risk-information framing can be a powerful tool for aiding the communication of risk and improving decision making. However, little work has investigated the extent to which these framing effects depend on the characteristics of the perceiver. In our study, we examine whether the effects of different risk-pricing formats on risky choices are the same for all individuals, regardless of their domain experience or cultural background, or whether there are interactions between these factors. Survey 1 revealed that three risk-pricing formats of the same choice problem led the same individuals to make different risky choices (preference reversal), suggesting that risk perception was distorted by the risk-pricing format manipulation. In Survey 2, the effects of the risk-pricing formats were shown to differ by the participants' cultural background (Asian vs. European) and the extent of their domain experience. The fact that there were no differences between the cultural or domain-experience groups in their overall tendency to select riskier (vs. safer) choices indicates that risk-behavior differences between groups are often closely linked to perceptual, rather than simply attitudinal, cognitive processes. The results demonstrate the complex, interactive cognitive processes used to encode risk information, involving the framing of the information and the cultural background and previous experiences of the individual. We conclude that it is important to consider the characteristics of the individual (e.g., culture, domain experience) when manipulating risk-information framing with the aim of improving risk communication.

2.
Communicating probability information about risks to the public is more difficult than might be expected. Many studies have examined this subject, and their resulting recommendations are scattered across various publications and diverse research fields and concern different presentation formats. An integration of empirical findings into one review is therefore useful for describing the evidence base for communication about probability information and for presenting the recommendations that can be made so far. We categorized the studies by the following presentation formats: frequencies; percentages; base rates and proportions; absolute and relative risk reduction; cumulative probabilities; verbal probability information; numerical versus verbal probability information; graphs; and risk ladders. We suggest several recommendations for these formats. Based on the results of our review, we show that the effects of presentation format depend not only on the type of format but also on the context in which the format is used. We therefore argue that the presentation format has the strongest effect when the receiver processes probability information heuristically rather than systematically. We conclude that future research and risk communication practitioners should concentrate not only on the presentation format of the probability information but also on the situation in which the message is presented, as this may predict how people process the information and how this may influence their interpretation of the risk.

3.
Researchers recommend the use of pictographs in medical risk communication to improve people's risk comprehension and decision making. However, it is not yet clear whether the iconicity used in pictographs to convey risk information influences individuals' information processing and comprehension. In an eye-tracking experiment with participants from the general population (N = 188), we examined whether specific types of pictograph icons influence the processing strategy viewers use to extract numerical information. In addition, we examined the effect of iconicity and numeracy on probability estimation, recall, and icon liking. The experiment used a 2 (iconicity: blocks vs. restroom icons) × 2 (scenario: medical vs. nonmedical) between-subjects design. Numeracy had a significant effect on information-processing strategy, but we found no effect of iconicity or scenario. Results indicated that both icon types enabled high and low numerates to use their default way of processing and to extract the gist of the message from the pictorial risk communication format: high numerates counted icons, whereas low numerates used large-area processing. There was no effect of iconicity on probability estimation. However, people who saw restroom icons were more likely to correctly recall the exact risk level. Iconicity had no effect on icon liking. Although the effects are small, our findings suggest that person-like restroom icons in pictographs have some advantages for risk communication. Specifically, in nonpersonalized prevention brochures, person-like restroom icons may help maintain reader motivation to process the risk information.

4.
This paper suggests a behavioral definition of (subjective) ambiguity in an abstract setting where the objects of choice are Savage-style acts. Axioms are then described that deliver probabilistic sophistication of preference on the set of unambiguous acts. In particular, both the domain and the values of the decision-maker's probability measure are derived from preference. It is argued that this result also provides a decision-theoretic foundation for the Knightian distinction between risk and ambiguity.

5.
The risk of an event generally relates to its expected severity and the perceived probability of its occurrence. In risk research, however, there is no standard measure for subjective probability estimates. In this study, we compared five commonly used measurement formats—two rating scales, a visual analog scale, and two numeric measures—in terms of their ability to assess subjective probability judgments when objective probabilities are available. We varied the probabilities (low vs. moderate) and severity (low vs. high) of the events to be judged, as well as the presentation mode of the objective probabilities (sequential presentation of singular events vs. graphical presentation of aggregated information). We employed two complementary goodness-of-fit criteria: the correlation between objective and subjective probabilities (sensitivity), and the root mean square deviation of subjective probabilities from objective values (accuracy). The numeric formats generally outperformed all other measures. The severity of events had no effect on performance. Generally, a rise in probability led to a decrease in performance. This effect, however, depended on how the objective probabilities were encoded: pictographs conveyed complete information, which improved goodness of fit for all formats and diminished the negative effect on performance. Differences in performance between scales are thus caused only in part by characteristics of the scales themselves—they also depend on the process of encoding. Consequently, researchers should take the source of probability information into account before selecting a measure.
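
The two goodness-of-fit criteria are straightforward to compute once objective and subjective probabilities have been paired. A minimal sketch (the paired values below are made up for illustration, not study data):

```python
import numpy as np

def sensitivity_and_accuracy(objective, subjective):
    """Pearson correlation (sensitivity) and root mean square deviation
    (accuracy) between objective probabilities and subjective estimates."""
    objective = np.asarray(objective, dtype=float)
    subjective = np.asarray(subjective, dtype=float)
    sensitivity = np.corrcoef(objective, subjective)[0, 1]   # higher is better
    accuracy = np.sqrt(np.mean((subjective - objective) ** 2))  # lower is better
    return sensitivity, accuracy

# Hypothetical responses collected with one measurement format
objective = [0.05, 0.10, 0.20, 0.40]
subjective = [0.08, 0.12, 0.25, 0.35]
r, rmsd = sensitivity_and_accuracy(objective, subjective)
print(f"sensitivity r = {r:.2f}, accuracy RMSD = {rmsd:.3f}")
```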

6.
Probability elicitation protocols are used to assess and incorporate subjective probabilities in risk and decision analysis. While most of these protocols use methods that focus on the precision of the elicited probabilities, the speed of the elicitation process has often been neglected. However, speed is also important, particularly when experts need to examine a large number of events on a recurrent basis. Furthermore, most existing elicitation methods are numerical in nature, but there are various reasons why an expert might refuse to give such precise ratio-scale estimates, even if highly numerate. This may occur, for instance, when there is a lack of sufficient hard evidence, when assessing very uncertain events (such as emergent threats), or when dealing with politicized topics (such as terrorism or disease outbreaks). In this article, we adopt an ordinal ranking approach from multicriteria decision analysis to provide a fast and nonnumerical probability elicitation process. Probabilities are subsequently approximated from the ranking by an algorithm based on the principle of maximum entropy, a rule compatible with the ordinal information provided by the expert. The method can elicit probabilities for a wide range of event types, including new ways of eliciting probabilities for stochastically independent events and low-probability events. We use a Monte Carlo simulation to test the accuracy of the approximated probabilities and apply the method in practice to a real-world risk analysis recently conducted for DEFRA (the U.K. Department for Environment, Food and Rural Affairs): the prioritization of animal health threats.
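
The sketch below is only an assumption about how a maximum-entropy approximation from an ordinal ranking might look: it enforces the expert's ordering with a small hypothetical separation margin `eps` (without some additional constraint the unconstrained maximum-entropy solution is simply uniform) and solves numerically with SciPy rather than with the authors' algorithm.

```python
import numpy as np
from scipy.optimize import minimize

def max_entropy_from_ranking(n_events, eps=1e-3):
    """Approximate probabilities for events ranked from most to least likely by
    maximizing Shannon entropy subject to p[0] >= p[1] + eps >= ... (eps is an
    illustrative margin, not part of the published method)."""
    def neg_entropy(p):
        p = np.clip(p, 1e-12, 1.0)
        return np.sum(p * np.log(p))          # minimizing this maximizes entropy

    constraints = [{"type": "eq", "fun": lambda p: p.sum() - 1.0}]
    for i in range(n_events - 1):
        constraints.append(
            {"type": "ineq", "fun": lambda p, i=i: p[i] - p[i + 1] - eps}
        )
    x0 = np.linspace(0.3, 0.05, n_events)
    x0 = x0 / x0.sum()                        # feasible decreasing start point
    res = minimize(neg_entropy, x0, bounds=[(0, 1)] * n_events,
                   constraints=constraints, method="SLSQP")
    return res.x

print(max_entropy_from_ranking(5))            # most-likely event first
```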

7.
The purpose of this article was to conduct a risk-based study linking experimental human influenza infections with fluctuation analysis of airway function, to assess whether influenza viral infection is a risk factor for exacerbations of chronic occupational asthma. We provide a comprehensive probabilistic analysis aimed at quantifying influenza-associated exacerbation risk for occupational asthmatics, based on a combination of published distributions of viral shedding and symptom scores and lung respiratory-system properties characterized by long-range peak expiratory flow (PEF) dynamics. Using a coupled detrended fluctuation analysis-experimental human influenza approach, we estimated the conditional probability of moderate or severe lung airway obstruction and hence the exacerbation risk of influenza-associated occupational asthma in individuals. The long-range correlation exponent (α) was used as a predictor of future exacerbation risk of influenza-associated asthma. For our illustrative distribution of PEF fluctuations and influenza-induced asthma exacerbation risk relations, we found that the probability of exacerbation can be kept below 50% by keeping α below 0.53. We also found that limiting wheeze scores to 0.56 yields a 75% probability of influenza-associated asthma exacerbation, and a limit of 0.34 yields a 50% probability, which may give a representative estimate of the distribution of chronic respiratory-system properties. This study indicates that influenza viral infection is an important risk factor for exacerbations of chronic occupational asthma.
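
The long-range correlation exponent α comes from detrended fluctuation analysis of the PEF time series. A minimal first-order DFA sketch (a standard implementation, not necessarily the authors' exact one; the white-noise test series is for illustration):

```python
import numpy as np

def dfa_alpha(series, scales=(4, 8, 16, 32, 64)):
    """First-order detrended fluctuation analysis: returns the scaling
    exponent alpha of the fluctuation function F(n) ~ n**alpha."""
    x = np.asarray(series, dtype=float)
    y = np.cumsum(x - x.mean())                       # integrated (profile) series
    fluctuations = []
    for n in scales:
        n_windows = len(y) // n
        f2 = []
        for w in range(n_windows):
            seg = y[w * n:(w + 1) * n]
            t = np.arange(n)
            trend = np.polyval(np.polyfit(t, seg, 1), t)   # linear detrending
            f2.append(np.mean((seg - trend) ** 2))
        fluctuations.append(np.sqrt(np.mean(f2)))
    alpha, _ = np.polyfit(np.log(scales), np.log(fluctuations), 1)
    return alpha

# Uncorrelated noise should give alpha close to 0.5
rng = np.random.default_rng(0)
print(dfa_alpha(rng.normal(size=2000)))
```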

8.
The Technical Research Centre of Finland (VTT) and Studsvik AB, Sweden, have simulated the decision making of the Swedish Nuclear Power Inspectorate and a power company by applying decision models in a benchmark study. Based on the experience from the benchmark study, a decision analysis framework to be used in safety-related problems is outlined. This framework would provide both power companies and safety authorities with a more rigorous, systematic approach to decision making. A decision-analytic approach provides a structure for identifying the information requirements of problem solving and could thus serve as a basis for discussion between the authorities and the utilities. In this context, probabilistic safety assessment (PSA) has the crucial role of expressing the plant safety status in terms of the probability of a reactor core damage accident and of the risk contributions from various accident precursors. However, a decision under uncertainty should not be based solely on probabilities, particularly when the event in question is a rare one and its probability of occurrence is estimated by means of various approximations.

9.
Currently, a binary alarm system is used in the United States to issue deterministic warning polygons for tornado events. To enhance the effectiveness of the weather information, a likelihood alarm system, which uses a tool called probabilistic hazard information (PHI), is being developed at the National Severe Storms Laboratory to issue probabilistic information about the threat. This study investigates the effects of providing uncertainty information about a tornado occurrence through the PHI's graphical swath on laypeople's concern, fear, and protective action, as compared with providing warning information with the deterministic polygon. Displays of color-coded swaths and deterministic polygons were shown to subjects. Some displays had a blue background denoting the probability of any tornado formation in the general area. Participants were asked to report their levels of concern, fear, and protective action at randomly chosen locations within each of seven designated levels on each display. Analysis of a three-stage nested design showed that providing uncertainty information via the PHI appropriately increased recipients' levels of concern, fear, and protective action in highly dangerous scenarios (more than a 60% chance of being affected by the threat), compared with deterministic polygons. The blue background and the color-coding type did not have a significant effect on participants' cognition of the threat or their reaction to it. This study shows that using a likelihood alarm system leads to more conscious decision making by weather-information recipients and enhances system safety.

10.
In the nuclear power industry, Level 3 probabilistic risk assessment (PRA) is used to estimate damage to public health and the environment if a severe accident leads to a large radiological release. Current Level 3 PRA does not explicitly include social factors, so it is not possible to rank the importance of social factors for risk-informing emergency preparedness, planning, and response (EPPR). This article offers a methodology for adapting the concept of social vulnerability, commonly used in natural hazard research, to the context of a severe nuclear power plant accident. The methodology has four steps: (1) calculating a hazard-independent social vulnerability index for the local population; (2) developing a location-specific representation of the maximum radiological hazard estimated from current Level 3 PRA in a geographic information system (GIS) environment; (3) developing a GIS-based socio-technical risk map by combining the social vulnerability index and the location-specific radiological hazard; and (4) conducting a risk importance measure analysis to rank the criticality of social factors based on their contribution to the socio-technical risk. The methodology is applied using results from the 2012 Surry Power Station state-of-the-art reactor consequence analysis. A radiological hazard model is generated from the MELCOR Accident Consequence Code System, translated into a GIS environment, and combined with the Centers for Disease Control and Prevention's social vulnerability index (SVI). This approach makes it possible to explicitly consider and rank the criticality of location-specific SVI themes based on their influence on risk, providing input for EPPR.
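
Step 3 combines the social vulnerability layer with the location-specific radiological hazard. The grid-based sketch below shows one plausible combination rule (a simple product of min-max-normalized layers); the article's actual aggregation and GIS workflow may differ.

```python
import numpy as np

def socio_technical_risk(svi, hazard):
    """Combine a social vulnerability layer and a radiological hazard layer
    (defined on the same grid) into a normalized socio-technical risk surface."""
    svi = np.asarray(svi, dtype=float)
    hazard = np.asarray(hazard, dtype=float)
    norm = lambda a: (a - a.min()) / (a.max() - a.min() + 1e-12)
    return norm(norm(svi) * norm(hazard))     # illustrative multiplicative rule

# Toy 3x3 grid of cells around a plant site (values are hypothetical)
svi = [[0.2, 0.5, 0.9], [0.3, 0.6, 0.8], [0.1, 0.4, 0.7]]
hazard = [[0.9, 0.6, 0.2], [0.8, 0.5, 0.3], [0.7, 0.4, 0.1]]
print(socio_technical_risk(svi, hazard))
```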

11.
Designing computational models for expert weights and attribute weights has become an important and widely studied research topic in recent years. For social network group decision-making problems in which the evaluation information takes the form of probabilistic linguistic trust functions, this paper proposes a probabilistic linguistic social network group decision-making model based on trust relationships and information measures. First, a probabilistic linguistic decision space based on trust relationships is constructed, a trust-propagation model between experts is developed, and expert weights are computed from the trust relationships among the experts. Second, the concepts of entropy and similarity for probabilistic linguistic trust functions are introduced, and trigonometric functions are used to design measures of the information entropy and similarity of probabilistic linguistic trust functions. Finally, a probabilistic linguistic social network group decision-making model based on trust relationships and information measures is constructed to obtain reasonable and reliable decision results. The proposed model is applied to an example of electric vehicle supplier selection, and comparative experiments verify its rationality and effectiveness.
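
The abstract derives expert weights from pairwise trust relations. The sketch below illustrates only one simple possibility (weights proportional to trust received after a single transitive-propagation step); it is not the paper's probabilistic linguistic formulation, and the trust matrix is hypothetical.

```python
import numpy as np

def expert_weights(trust):
    """Derive expert weights from a pairwise trust matrix trust[i, j]
    (how much expert i trusts expert j, in [0, 1]): one step of transitive
    propagation, then normalized column sums (trust received)."""
    T = np.asarray(trust, dtype=float)
    propagated = np.clip(T + T @ T, 0.0, 1.0)   # direct plus one indirect hop
    np.fill_diagonal(propagated, 0.0)           # ignore self-trust
    received = propagated.sum(axis=0)
    return received / received.sum()

trust = [[0.0, 0.8, 0.3],
         [0.6, 0.0, 0.5],
         [0.4, 0.7, 0.0]]
print(expert_weights(trust))
```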

12.
Human health risk assessments use point values to develop risk estimates and thus impart a deterministic character to risk, which, by definition, is a probabilistic phenomenon. The risk estimates are calculated for individuals and then, using uncertainty factors (UFs), are extrapolated to a population that is characterized by variability. Regulatory agencies have recommended quantifying the impact of variability in risk assessments through the application of probabilistic methods. In the present study, a framework for the quantitative analysis of uncertainty (U) and variability (V) in target tissue dose in the population was developed by applying probabilistic analysis to physiologically based toxicokinetic models. The mechanistic parameters that determine kinetics were described with probability density functions (PDFs). Since each PDF depicts the frequency of occurrence of all expected values of each parameter in the population, the combined effects of multiple sources of U/V were accounted for in the estimated distribution of tissue dose in the population, and a unified (adult and child) intraspecies toxicokinetic uncertainty factor, UFH-TK, was determined. The results show that the proposed framework accounts effectively for U/V in population toxicokinetics. The ratio of the 95th percentile to the 50th percentile of the annual average concentration of the chemical at the target tissue (i.e., the UFH-TK) varies with age. This ratio is equivalent to a unified intraspecies toxicokinetic UF, and it is one of the UFs by which the NOAEL can be divided to obtain the RfC/RfD. The 10-fold intraspecies UF is intended to account for uncertainty and variability in toxicokinetics (3.2x) and toxicodynamics (3.2x); this article deals exclusively with the toxicokinetic component. The framework provides an alternative to the default methodology and is advantageous in that the evaluation of toxicokinetic variability is based on the distribution of the effective target tissue dose rather than the applied dose. It allows the default adult and child intraspecies UF to be replaced with toxicokinetic data-derived values and provides accurate chemical-specific estimates of their magnitude. It shows that proper application of probability and toxicokinetic theories can reduce uncertainties when establishing exposure limits for specific compounds and provide better assurance that established limits are adequately protective. It contributes to the development of a probabilistic noncancer risk assessment framework and will ultimately lead to the unification of cancer and noncancer risk assessment methodologies.
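
The data-derived intraspecies toxicokinetic UF is the ratio of the 95th to the 50th percentile of the simulated target-tissue dose distribution. A minimal Monte Carlo sketch, with a toy one-compartment surrogate standing in for the full PBTK model and purely illustrative parameter distributions:

```python
import numpy as np

rng = np.random.default_rng(42)
n = 100_000

# Illustrative population PDFs for toy toxicokinetic parameters
intake = rng.lognormal(mean=np.log(1.0), sigma=0.3, size=n)      # mg/day
clearance = rng.lognormal(mean=np.log(10.0), sigma=0.4, size=n)  # L/day
partition = rng.normal(loc=2.0, scale=0.2, size=n)               # tissue/blood ratio

# Toy steady-state target-tissue dose surrogate (not a real PBTK model)
tissue_dose = partition * intake / clearance

uf_h_tk = np.percentile(tissue_dose, 95) / np.percentile(tissue_dose, 50)
print(f"data-derived intraspecies toxicokinetic UF (95th/50th) = {uf_h_tk:.2f}")
```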

13.
Low-probability, high-impact events are difficult to manage. Firms may underinvest in risk assessments for low-probability, high-impact events because it is not easy to link the direct and indirect benefits of doing so. Scholarly research on the effectiveness of programs aimed at reducing such events faces the same challenge. In this article, we draw on comprehensive industry-wide data from the U.S. nuclear power industry to explore the impact of conducting probabilistic risk assessment (PRA) on preventing safety-related disruptions. We examine this using data from over 25,000 monthly event reports across 101 U.S. nuclear reactors from 1985 to 1998. Using Poisson fixed-effects models with time trends, we find that the number of safety-related disruptions fell by between 8% and 27% per month in the periods after operators submitted their PRA in response to the Nuclear Regulatory Commission's Generic Letter 88-20, which required all operators to conduct a PRA. One possible mechanism is that the adoption of PRA may have increased learning rates, lowering the rate of recurring events by 42%. We find that operators that completed their PRA before Generic Letter 88-20 continued to experience safety improvements during 1990-1995, which suggests that revisiting a PRA or conducting it again can be beneficial. Our results suggest that even in an industry as safety-conscious as nuclear power, a more formal approach to quantifying risk has its benefits.
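
The count models described are Poisson regressions with reactor fixed effects, a post-PRA indicator, and a time trend. A minimal statsmodels sketch on synthetic monthly event counts (all variable names, effect sizes, and data here are illustrative, not the study's):

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(1)
reactors, months = 5, 60
df = pd.DataFrame({
    "reactor": np.repeat(np.arange(reactors), months),
    "month": np.tile(np.arange(months), reactors),
})
df["post_pra"] = (df["month"] > 30).astype(int)    # hypothetical PRA submission month
lam = np.exp(1.0 - 0.2 * df["post_pra"] - 0.005 * df["month"])
df["events"] = rng.poisson(lam)                    # simulated monthly event counts

# Reactor fixed effects via dummies, plus a linear time trend
X = pd.get_dummies(df["reactor"], prefix="r", drop_first=True, dtype=float)
X["post_pra"] = df["post_pra"]
X["month"] = df["month"]
X = sm.add_constant(X)

model = sm.GLM(df["events"], X, family=sm.families.Poisson()).fit()
coef = model.params["post_pra"]
print(f"estimated change in event rate after PRA: {np.exp(coef) - 1:.1%}")
```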

14.
Flood loss modeling is an important component of risk analyses and decision support in flood risk management. Commonly, flood loss models describe complex damaging processes with simple, deterministic approaches such as depth-damage functions and are associated with large uncertainty. To improve flood loss estimation and to provide quantitative information about the uncertainty associated with loss modeling, a probabilistic, multivariable Bagging decision Tree Flood Loss Estimation MOdel (BT-FLEMO) for residential buildings was developed. The application of BT-FLEMO provides a probability distribution of estimated losses to residential buildings per municipality. BT-FLEMO was applied and validated at the mesoscale in 19 municipalities affected by the 2002 flood of the River Mulde in Saxony, Germany. Validation was undertaken both through a comparison with six deterministic loss models, including depth-damage functions and multivariable models, and through a comparison with official loss data. BT-FLEMO outperforms the deterministic univariable and multivariable models with regard to model accuracy, although the prediction uncertainty remains high. An important advantage of BT-FLEMO is its quantification of prediction uncertainty: the probability distribution of loss estimates produced by BT-FLEMO represents well the variation range of the loss estimates of the other models in the case study.
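
The core idea of BT-FLEMO, a bagged ensemble of decision trees whose individual predictions form a loss distribution rather than a single point estimate, can be sketched with scikit-learn. The synthetic predictors below (water depth, building area, quality class) and the toy loss function are stand-ins for the published model's inputs:

```python
import numpy as np
from sklearn.ensemble import BaggingRegressor   # decision-tree base learners by default

rng = np.random.default_rng(7)
n = 500
depth = rng.uniform(0.1, 3.0, n)                # water depth [m]
area = rng.uniform(80, 250, n)                  # building area [m^2]
quality = rng.integers(1, 6, n)                 # building quality class
X = np.column_stack([depth, area, quality])
loss = 5000 * depth * (quality / 3) + 20 * area + rng.normal(0, 2000, n)  # toy loss [EUR]

model = BaggingRegressor(n_estimators=200, random_state=0).fit(X, loss)

# Per-tree predictions give an empirical loss distribution for a new building
x_new = np.array([[1.5, 140, 4]])
per_tree = np.array([tree.predict(x_new)[0] for tree in model.estimators_])
print(np.percentile(per_tree, [5, 50, 95]))     # loss quantiles, not a single point
```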

15.
This paper demonstrates a new methodology for probabilistic public health risk assessment using the first-order reliability method. The method provides the probability that incremental lifetime cancer risk exceeds a threshold level, together with a probabilistic sensitivity quantifying the relative impact of the uncertainty in each random variable on the exceedance probability. The approach is applied to a case study given by Thompson et al. (1) on cancer risk caused by ingestion of benzene-contaminated soil, and the results are compared with those of the Monte Carlo method. Parametric sensitivity analyses are conducted to assess the sensitivity of the probabilistic event with respect to the distribution parameters of the basic random variables, such as the mean and standard deviation. The technique is a novel approach to probabilistic risk assessment and can be used in situations where Monte Carlo analysis is computationally expensive, such as when the simulated risk lies in the tail of the risk probability distribution.
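
When the risk variable is a product of lognormal inputs, the first-order reliability result is exact: the limit state is linear in standard normal space and the exceedance probability reduces to Φ(-β). The sketch below uses illustrative parameters (not those of the benzene case study) and adds a Monte Carlo check, whose noisiness at small tail probabilities is exactly the motivation for the reliability approach:

```python
import numpy as np
from scipy.stats import norm

# Illustrative lognormal inputs: risk = concentration * intake_factor * slope_factor
mu = np.array([np.log(2.0), np.log(1e-6), np.log(0.03)])   # log-means (hypothetical)
sigma = np.array([0.5, 0.4, 0.3])                          # log-standard deviations
threshold = 1e-6                                            # acceptable incremental lifetime risk

# ln(risk) is normal with these moments, so the reliability index is closed-form
mu_ln, sigma_ln = mu.sum(), np.sqrt((sigma ** 2).sum())
beta = (np.log(threshold) - mu_ln) / sigma_ln               # reliability index
p_exceed_form = norm.cdf(-beta)

# Monte Carlo check (needs many samples because the exceedance is in the tail)
rng = np.random.default_rng(3)
samples = np.exp(rng.normal(mu, sigma, size=(1_000_000, 3))).prod(axis=1)
p_exceed_mc = (samples > threshold).mean()
print(f"FORM: {p_exceed_form:.2e}   Monte Carlo: {p_exceed_mc:.2e}")
```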

16.
17.
Various methods for risk characterization have been developed using probabilistic approaches. Data on Vietnamese farmers are available for comparing the outcomes of risk characterization using different probabilistic methods. This article addresses the health risk characterization of chlorpyrifos using epidemiological dose-response data and probabilistic techniques obtained from a case study with rice farmers in Vietnam. Urine samples were collected from farmers and analyzed for trichloropyridinol (TCP), which was converted into an absorbed daily dose of chlorpyrifos. Adverse health response doses due to chlorpyrifos exposure were collected from epidemiological studies to develop dose-adverse health response relationships. The health risk of chlorpyrifos was quantified using the hazard quotient (HQ), Monte Carlo simulation (MCS), and overall risk probability (ORP) methods. With baseline (prior to pesticide spraying) and lifetime exposure levels (over a lifetime of pesticide spraying events), the HQ ranged from 0.06 to 7.1; the MCS method indicated that less than 0.05% of the population would be affected, while the ORP method indicated that less than 1.5% of the population would be adversely affected. With postapplication exposure levels, the HQ ranged from 1 to 32.5; the risk calculated by the MCS method was that 29% of the population would be affected, and the risk calculated by the ORP method was 33%. The MCS and ORP methods have advantages in risk characterization because they use the full distributions of both exposure and dose response, whereas the HQ method uses only the exposure distribution. These evaluations indicate that single-event spraying is likely to have adverse effects on Vietnamese rice farmers.
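
The three characterizations differ in how much of the distributions they use. The sketch below illustrates the HQ and Monte Carlo pieces with placeholder distributions (not the Vietnamese survey data); the ORP method, which additionally integrates over an uncertain dose-response benchmark, is omitted here.

```python
import numpy as np

rng = np.random.default_rng(11)
n = 100_000

# Illustrative absorbed daily dose of chlorpyrifos (mg/kg/day) and a toxicity benchmark
dose = rng.lognormal(mean=np.log(0.002), sigma=0.8, size=n)
benchmark = 0.003   # hypothetical reference dose, mg/kg/day

# Deterministic-style hazard quotients at selected exposure percentiles
for q in (50, 95):
    print(f"HQ at {q}th percentile dose: {np.percentile(dose, q) / benchmark:.2f}")

# Monte Carlo style: fraction of the simulated population exceeding the benchmark
p_affected = (dose / benchmark > 1).mean()
print(f"fraction with HQ > 1: {p_affected:.1%}")
```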

18.
The rise in economic disparity presents significant risks to global social order and to the resilience of local communities. However, existing measurement science for economic disparity (e.g., the Gini coefficient) does not explicitly consider a probability distribution with the information, deficiencies, and uncertainties associated with the underlying income distribution. This article introduces the quantification of Shannon entropy for income inequality across scales, including national-, subnational-, and city-level data. The probabilistic principles of Shannon entropy provide a new interpretation of uncertainty and risk related to economic disparity. Entropy and information-based conflict rise as world incomes converge. High-entropy instances can resemble both happy, prosperous societies and socialist-communist social structures. Low entropy signals high-risk tipping points, enabling anomaly and conflict detection with higher confidence. Finally, spatial-temporal entropy maps for U.S. cities offer a city risk-profiling framework. The results show polarization of household incomes within and across Baltimore, Washington, DC, and San Francisco. Entropy produces reliable results at significantly lower computational cost than the Gini coefficient.
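
Both metrics can be computed from the same household income sample. A minimal sketch on synthetic incomes; the article's binning and scaling choices may differ:

```python
import numpy as np

def shannon_entropy(incomes, bins=20):
    """Shannon entropy (nats) of the distribution of households across income bins."""
    hist, _ = np.histogram(incomes, bins=bins)
    shares = hist / hist.sum()
    shares = shares[shares > 0]
    return -np.sum(shares * np.log(shares))

def gini(incomes):
    """Gini coefficient from the sorted income vector."""
    x = np.sort(np.asarray(incomes, dtype=float))
    n = len(x)
    return 2 * np.sum(np.arange(1, n + 1) * x) / (n * x.sum()) - (n + 1) / n

incomes = np.random.default_rng(5).lognormal(mean=10.5, sigma=0.8, size=10_000)
print("entropy:", shannon_entropy(incomes), "gini:", gini(incomes))
```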

19.
Information format can influence the extent to which target audiences understand and respond to risk-related information. This study examined four elements of risk information presentation format. Using printed materials, we examined target audience perceptions of: (a) reading level; (b) use of diagrams versus text; (c) commanding versus cajoling tone; and (d) use of qualitative versus quantitative information presented in a risk ladder. The risk communication topic was human health concerns related to eating noncommercial Great Lakes fish affected by chemical contaminants. Results from the comparisons of specific communication formats indicated that, for three of the four format types examined, multiple formats are required to meet the needs of a significant percentage of anglers. Advisory text should be reviewed to ensure the reading level is geared to the abilities of the target audience. For many audiences, a combination of qualitative and quantitative information, and a combination of diagrams and text, may be most effective. For most audiences, a cajoling rather than commanding tone better provides the information they need to make a decision about fish consumption. Segmenting audiences by information needs and communication formats may help clarify which approaches to take with each audience.

20.
Risk-informed decision making is often accompanied by the specification of an acceptable level of risk. Such a target level is compared against the value of a risk metric, usually computed through a probabilistic safety assessment model, to decide about the acceptability of a given design, the launch of a space mission, and so on. Importance measures complement the decision process with information about the risk/safety significance of events. However, importance measures do not tell us whether the occurrence of an event can change the overarching decision. By linking value of information and importance measures for probabilistic risk assessment models, this work obtains a value-of-information-based importance measure that brings together the risk metric, risk importance measures, and the risk threshold in one expression. The new importance measure does not impose additional computational burden because it can be calculated from knowledge of the risk achievement worth and risk reduction worth, and it complements the insights delivered by those importance measures. Several properties are discussed, including the joint decision worth of groups of basic events. An application to the large loss-of-coolant accident sequence of the Advanced Test Reactor illustrates the risk analysis insights.
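
With baseline risk R0, risk achievement worth RAW = R(event certain)/R0, and risk reduction worth RRW = R0/R(event impossible), one can check directly whether conditioning on a basic event flips the comparison against the acceptance threshold. The sketch below illustrates only that threshold check, not the authors' value-of-information expression; the numbers are hypothetical.

```python
def decision_flip(r0, raw, rrw, threshold):
    """Check whether conditioning on a basic event changes the accept/reject
    decision implied by comparing the risk metric against the threshold.

    r0: baseline risk metric; raw: risk achievement worth R(event=1)/r0;
    rrw: risk reduction worth r0/R(event=0); threshold: acceptable risk level."""
    baseline_ok = r0 <= threshold
    ok_if_event_occurs = (r0 * raw) <= threshold      # risk with the event certain
    ok_if_event_prevented = (r0 / rrw) <= threshold   # risk with the event impossible
    return {
        "baseline_acceptable": baseline_ok,
        "flip_if_event_occurs": baseline_ok != ok_if_event_occurs,
        "flip_if_event_prevented": baseline_ok != ok_if_event_prevented,
    }

print(decision_flip(r0=4e-5, raw=3.0, rrw=1.5, threshold=1e-4))
```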
