Similar Documents
20 similar documents found (search time: 15 ms)
1.
Pesticide application is increasing, and despite extensive educational programs, farmers continue to take high health and environmental risks when applying pesticides.
The structured mental model approach (SMMA) is a new method for risk perception analysis. It embeds farmers' risk perception into their livelihood system in the elaboration of a mental model (MM). Results from its first application are presented here. The study region is Vereda la Hoya (Colombia), an area characterized by subsistence farming, high use of pesticides, and a high incidence of health problems. Our hypothesis was that subsistence farmers were constrained by economic, environmental, and sociocultural factors, which consequently should influence their mental models.
Thirteen experts and 10 farmers were interviewed and their MMs of the extended pesticide system elicited. The interviews were open-ended with the questions structured in three parts: (i) definition and ranking of types of capital with respect to their importance for the sustainability of farmers' livelihood; (ii) understanding the system and its dynamics; and (iii) importance of the agents in the farmers' agent network. Following this structure, each part of the interview was analyzed qualitatively and statistically. Our analyses showed that the mental models of farmers and experts differed significantly from each other.
By applying the SMMA, we were also able to identify reasons for the divergence of experts' and farmers' MMs. Of major importance are the following factors: (i) culture and tradition; (ii) trust in the source of information; and (iii) feedback on knowledge.

2.
A GIS-Based Framework for Hazardous Materials Transport Risk Assessment
This article presents a methodology for assessment of the hazardous materials transport risk in a multicommodity, multiple origin-destination setting. The proposed risk assessment methodology was integrated with a Geographical Information System (GIS), which made large-scale implementation possible. A GIS-based model of the truck shipments of dangerous goods via the highway network of Quebec and Ontario was developed. Based on the origin and destination of each shipment, the risk associated with the routes that minimize (1) the transport distance, (2) the population exposure, (3) the expected number of people to be evacuated in case of an incident, and (4) the probability of an incident during transportation was evaluated. Using these assessments, a government agency can estimate the impact of alternative policies that could alter the carriers' route choices. A related issue is the spatial distribution of transport risk, because an unfair distribution is likely to cause public concern. Thus, an analysis of transport risk equity in the provinces of Quebec and Ontario is also provided.

3.
Cost-benefit analysis (CBA) is commonly applied as a tool for deciding on risk protection. With CBA, one can identify risk mitigation strategies that lead to an optimal tradeoff between the costs of the mitigation measures and the achieved risk reduction. In practical applications of CBA, the strategies are typically evaluated through efficiency indicators such as the benefit-cost ratio (BCR) and the marginal cost (MC) criterion. In many of these applications, the BCR is not consistently defined, which, as we demonstrate in this article, can lead to the identification of suboptimal solutions. This is of particular relevance when the overall budget for risk reduction measures is limited and an optimal allocation of resources among different subsystems is necessary. We show that this problem can be formulated as a hierarchical decision problem, where the general rules and decisions on the available budget are made at a central level (e.g., central government agency, top management), whereas the decisions on the specific measures are made at the subsystem level (e.g., local communities, company division). It is shown that the MC criterion provides optimal solutions in such hierarchical optimization. Since most practical applications only include a discrete set of possible risk protection measures, the MC criterion is extended to this situation. The findings are illustrated through a hypothetical numerical example. This study was prepared as part of our work on the optimal management of natural hazard risks, but its conclusions also apply to other fields of risk management.
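The failure mode this abstract describes, ranking by benefit-cost ratio under a limited budget and ending up with a suboptimal measure set, can be sketched numerically. The measures, costs, and risk reductions below are invented for illustration; they are not the article's own numerical example:

```python
from itertools import combinations

# Hypothetical risk-reduction measures: (name, cost, risk_reduction).
# Values are illustrative only.
measures = [("A", 1.0, 5.0), ("B", 10.0, 30.0)]
budget = 10.0

def bcr_greedy(measures, budget):
    """Fund measures in descending benefit-cost ratio until the budget runs out."""
    chosen, spent = [], 0.0
    for name, cost, benefit in sorted(measures, key=lambda m: m[2] / m[1], reverse=True):
        if spent + cost <= budget:
            chosen.append(name)
            spent += cost
    return chosen

def optimal(measures, budget):
    """Exhaustive search for the feasible subset maximizing total risk reduction."""
    best, best_benefit = [], 0.0
    for r in range(len(measures) + 1):
        for combo in combinations(measures, r):
            cost = sum(m[1] for m in combo)
            benefit = sum(m[2] for m in combo)
            if cost <= budget and benefit > best_benefit:
                best, best_benefit = [m[0] for m in combo], benefit
    return best

print(bcr_greedy(measures, budget))  # ['A']
print(optimal(measures, budget))     # ['B']
```

Greedy BCR ranking spends on the small high-ratio measure A (BCR 5) and can then no longer afford B, even though B alone delivers six times the total risk reduction within the same budget — the kind of budget-allocation failure the abstract warns about.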

4.
The study investigated the effects of incidence rates stated as a probability (e.g., .0006) and incidence rate information expressed in terms of frequency (e.g., 600 in 1,000,000) on risk-avoidant behavior. Subjects were informed about the risks associated with an old and a new, improved medication. They were asked how much they were willing to pay for the safer medicine. Risk information was given either in a frequency or a probability format. The second factor manipulated was the level of risk, either high or low. As expected, analysis of variance yielded a significant interaction. Subjects confronted with high risk in the frequency format were willing to pay the highest prices for the improved medication. The choice between frequency and probability format can be made according to the goal of the risk communication.

5.
This article develops a quantitative all-hazards framework for critical asset and portfolio risk analysis (CAPRA) that considers both natural and human-caused hazards. Following a discussion on the nature of security threats, the need for actionable risk assessments, and the distinction between asset and portfolio-level analysis, a general formula for all-hazards risk analysis is obtained that resembles the traditional model based on the notional product of consequence, vulnerability, and threat, though with clear meanings assigned to each parameter. Furthermore, a simple portfolio consequence model is presented that yields first-order estimates of interdependency effects following a successful attack on an asset. Moreover, depending on the needs of the decisions being made and available analytical resources, values for the parameters in this model can be obtained at a high level or through detailed systems analysis. Several illustrative examples of the CAPRA methodology are provided.

6.
Landslide Risk Models for Decision Making
This contribution presents a quantitative procedure for landslide risk analysis and zoning considering hazard, exposure (or value of elements at risk), and vulnerability. The method provides the means to obtain landslide risk models (expressing expected damage due to landslides on material elements and economic activities in monetary terms, according to different scenarios and periods) useful for identifying areas where mitigation efforts will be most cost effective. It allows identification of priority areas for the implementation of actions to reduce vulnerability (elements) or hazard (processes). The procedure proposed can also be used as a preventive tool, through its application to strategic environmental impact analysis (SEIA) of land-use plans. The underlying hypothesis is that reliable predictions about hazard and risk can be made using models based on a detailed analysis of past landslide occurrences in connection with conditioning factors and data on past damage. The results show that the approach proposed and the hypothesis formulated are essentially correct, providing estimates of the order of magnitude of expected losses for a given time period. Uncertainties, strengths, and shortcomings of the procedure and results obtained are discussed and potential lines of research to improve the models are indicated. Finally, comments and suggestions are provided to generalize this type of analysis.

7.
This guest editorial is a summary of the NCSU/USDA Workshop on Sensitivity Analysis held June 11–12, 2001 at North Carolina State University and sponsored by the U.S. Department of Agriculture's Office of Risk Assessment and Cost Benefit Analysis. The objective of the workshop was to learn across disciplines in identifying, evaluating, and recommending sensitivity analysis methods and practices for application to food-safety process risk models. The workshop included presentations regarding the Hazard Assessment and Critical Control Points (HACCP) framework used in food-safety risk assessment, a survey of sensitivity analysis methods, invited white papers on sensitivity analysis, and invited case studies regarding risk assessment of microbial pathogens in food. Based on the sharing of interdisciplinary information represented by the presentations, the workshop participants, divided into breakout sessions, responded to three trigger questions: What are the key criteria for sensitivity analysis methods applied to food-safety risk assessment? What sensitivity analysis methods are most promising for application to food safety and risk assessment? and What are the key needs for implementation and demonstration of such methods? The workshop produced agreement regarding key criteria for sensitivity analysis methods and the need to use two or more methods to try to obtain robust insights. Recommendations were made regarding a guideline document to assist practitioners in selecting, applying, interpreting, and reporting the results of sensitivity analysis.

8.
Adrian Kent, Risk Analysis, 2004, 24(1): 157-168
Recent articles by Busza et al. (BJSW) and Dar et al. (DDH) argue that astrophysical data can be used to establish small bounds on the risk of a "killer strangelet" catastrophe scenario in the RHIC and ALICE collider experiments. The case for the safety of the experiments set out by BJSW does not rely solely on these bounds, but on theoretical arguments, which BJSW find sufficiently compelling to firmly exclude any possibility of catastrophe. Nonetheless, DDH and other commentators (initially including BJSW) suggested that these empirical bounds alone do give sufficient reassurance. This seems unsupportable when the bounds are expressed in terms of expectation value, a good measure according to standard risk analysis arguments. For example, DDH's main bound, p(catastrophe) < 2 × 10⁻⁸, implies only that the expectation value of the number of deaths is bounded by 120; BJSW's most conservative bound implies the expectation value of the number of deaths is bounded by 60,000. This article reappraises the DDH and BJSW risk bounds by comparison with risk policy in other areas. For example, it is noted that, even if highly risk-tolerant assumptions are made and no value is placed on the lives of future generations, a catastrophe risk no higher than approximately 10⁻¹⁵ per year would be required for consistency with established policy for radiation hazard risk minimization. Allowing for risk aversion and for future lives, a respectable case can be made for requiring a bound many orders of magnitude smaller. In summary, the costs of small risks of catastrophe have been significantly underestimated by BJSW (initially), by DDH, and by other commentators.
Future policy on catastrophe risks would be more rational, and more deserving of public trust, if acceptable risk bounds were generally agreed upon ahead of time and if serious research on whether those bounds could indeed be guaranteed was carried out well in advance of any hypothetically risky experiment, with the relevant debates involving experts with no stake in the experiments under consideration.
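The expectation-value arithmetic behind the quoted figures is a single multiplication: a probability bound times the exposed population bounds the expected number of deaths. A minimal sketch, assuming the world population of roughly 6 billion implicit in the article's 120-death figure:

```python
# Bound on expected deaths implied by a catastrophe-probability bound,
# assuming a catastrophe would kill everyone (the worst case).
WORLD_POPULATION = 6e9  # assumed figure, implicit in the article's numbers

def expected_deaths_bound(p_catastrophe):
    """Expectation-value bound: probability bound times exposed population."""
    return p_catastrophe * WORLD_POPULATION

print(expected_deaths_bound(2e-8))   # DDH's main bound: roughly 120
print(expected_deaths_bound(1e-15))  # the policy-consistent annual bound: far below 1
```

The same multiplication run backwards shows why BJSW's 60,000-death figure corresponds to a probability bound around 10⁻⁵, orders of magnitude weaker than what radiation-hazard policy would require.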

9.
Since 1996, when bovine spongiform encephalopathy (BSE) was assessed as a disease possibly transmissible to humans in the form of a variant of Creutzfeldt-Jakob disease (vCJD), the French population entered a long period of fear and avoidance of beef and bovine byproducts, which produced an unprecedented collapse in the beef market. This article deals with the perceived risk of "mad cow disease" (MCD) in the French general population. Two surveys were conducted on a representative sample of the adult population, the first in 2000 during the peak of the crisis and the second 13 months later in a quieter period. Our main assumption was that changes in beef consumption are strongly related to the perceived risk of MCD, which we defined as people's cognitive and affective responses to the hazard. Our objective was to identify the determinants and consequences of this perceived risk and to compare them in different sociopolitical contexts. The results of bivariate and multivariate analyses show that: (i) the distribution of most of the variables significantly related to perceived risk identified in the first survey had changed in the second survey, in connection with the reduction of worry and the resumption of national beef consumption; (ii) the propensity for self-protection through avoiding or ceasing beef eating was more related to feelings of worry than to subjective vCJD risk assessments; and (iii) the main determinant of lower avoidance of beef products was a preference for beef, a feeling identified prior to the emergence of the MCD risk and unchanged across contexts.

10.
Renewed interest in precursor analysis has shown that the evaluation of near misses is an interdisciplinary effort, fundamental within the life of an organization for reducing operational risks and enabling accident prevention. The practice of precursor analysis has been a part of nuclear power plant regulation in the United States for over 25 years. During this time, the models used in the analysis have evolved from simple risk equations to quite complex probabilistic risk assessments. But one element that has remained constant over this time is that the focus of the analysis has been on modeling the scenario using the risk model (regardless of the model's sophistication) and then using the results of the model to determine the severity of the precursor incident. We believe that evaluating precursors in this fashion could be a shortcoming, since decision making during the incident is not formally investigated. Consequently, we present an evaluation procedure that integrates current practice with the evaluation of decisions made during the precursor event. The methodology borrows from techniques in both the risk analysis and the decision analysis realms. We demonstrate this new methodology via an evaluation of a U.S. precursor incident. Specifically, the course of the incident is represented by the integration of a probabilistic risk assessment model (i.e., the risk analysis tool) with an influence diagram and the corresponding decision tree (i.e., the decision analysis tools). The results and insights from the application of this new methodology are discussed.

11.
Risk Analysis, 2018, 38(1): 84-98
The emergence of the complexity characterizing our systems of systems (SoS) requires a reevaluation of the way we model, assess, manage, communicate, and analyze the risk thereto. Current models for risk analysis of emergent complex SoS are insufficient because too often they rely on the same risk functions and models used for single systems. These models commonly fail to incorporate the complexity derived from the networks of interdependencies and interconnectedness (I–I) characterizing SoS. There is a need to reevaluate currently practiced risk analysis to respond to this reality by examining, and thus comprehending, what makes emergent SoS complex. The key to evaluating the risk to SoS lies in understanding the genesis of characterizing I–I of systems manifested through shared states and other essential entities within and among the systems that constitute SoS. The term "essential entities" includes shared decisions, resources, functions, policies, decisionmakers, stakeholders, organizational setups, and others. This undertaking can be accomplished by building on state-space theory, which is fundamental to systems engineering and process control. This article presents a theoretical and analytical framework for modeling the risk to SoS with two case studies performed with the MITRE Corporation and demonstrates the pivotal contributions made by shared states and other essential entities to modeling and analysis of the risk to complex SoS. A third case study highlights the multifarious representations of SoS, which require harmonizing the risk analysis process currently applied to single systems when applied to complex SoS.

12.
The effect of specification of the target on risk evaluation was examined. A whole set of hazards, covering most of the domains, was considered: common individual hazards, outdoor activities, medical care, public transportation, energy production, pollutants, sex, deviance, and addictions. Three human targets were introduced: personal health risk (including personal risk of death), health risk for people in the country, and health risk for people in the world. The basic design was a between-subjects design. The first hypothesis was that risk judgments made in the "world" condition should be higher than risk judgments made in the "country" condition, and risk judgments made in this condition should be higher than risk judgments made in the "personal" condition. This is what was observed. The second hypothesis was that the target effect should differ as a function of the kind of hazards considered. This also is what was observed. In two domains (pollutants; and deviance, sex, and addictions), the target effect was important, corresponding to about one-tenth of the response scale. In the four remaining domains, the target effect was unimportant or absent.

13.
Many states in Latin America, Africa, and Asia lack the monopoly of violence, even though this was identified by Max Weber as the foundation of the state, and thus the capacity to govern effectively. In this paper we develop a new perspective on the establishment of the monopoly of violence. We build a model to explain the incentive of central states to eliminate nonstate armed actors (paramilitaries) in a democracy. The model is premised on the idea that paramilitaries may choose to and can influence elections. Since paramilitaries have preferences over policies, this reduces the incentives of the politicians they favor to eliminate them. We then investigate these ideas using data from Colombia between 1991 and 2006. We first present regression and case study evidence supporting our postulate that paramilitary groups can have significant effects on elections for the legislature and the executive. Next, we show that the evidence is also broadly consistent with the implication of the model that paramilitaries tend to persist to the extent that they deliver votes to candidates for the executive whose preferences are close to theirs and that this effect is larger in areas where the presidential candidate would have otherwise not done as well. Finally, we use roll-call votes to illustrate a possible "quid pro quo" between the executive and paramilitaries in Colombia.

14.
What's Wrong with Risk Matrices?
Risk matrices—tables mapping "frequency" and "severity" ratings to corresponding risk priority levels—are popular in applications as diverse as terrorism risk analysis, highway construction project management, office building risk analysis, climate change risk management, and enterprise risk management (ERM). National and international standards (e.g., Military Standard 882C and AS/NZS 4360:1999) have stimulated adoption of risk matrices by many organizations and risk consultants. However, little research rigorously validates their performance in actually improving risk management decisions. This article examines some mathematical properties of risk matrices and shows that they have the following limitations. (a) Poor resolution. Typical risk matrices can correctly and unambiguously compare only a small fraction (e.g., less than 10%) of randomly selected pairs of hazards. They can assign identical ratings to quantitatively very different risks ("range compression"). (b) Errors. Risk matrices can mistakenly assign higher qualitative ratings to quantitatively smaller risks. For risks with negatively correlated frequencies and severities, they can be "worse than useless," leading to worse-than-random decisions. (c) Suboptimal resource allocation. Effective allocation of resources to risk-reducing countermeasures cannot be based on the categories provided by risk matrices. (d) Ambiguous inputs and outputs. Categorizations of severity cannot be made objectively for uncertain consequences. Inputs to risk matrices (e.g., frequency and severity categorizations) and resulting outputs (i.e., risk ratings) require subjective interpretation, and different users may obtain opposite ratings of the same quantitative risks. These limitations suggest that risk matrices should be used with caution, and only with careful explanations of embedded judgments.
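The "worse than useless" case, negatively correlated frequencies and severities, is easy to reproduce with a toy two-level matrix. The thresholds and the two hazards below are invented for illustration; any similar categorization exhibits the same rank reversal:

```python
def category(value, threshold):
    """Two-level categorization: 1 = low, 2 = high."""
    return 2 if value >= threshold else 1

def matrix_rating(freq, sev, f_thresh=0.5, s_thresh=50.0):
    """A minimal 2x2 risk matrix: the rating is the product of category levels."""
    return category(freq, f_thresh) * category(sev, s_thresh)

def quantitative_risk(freq, sev):
    """Quantitative risk as expected severity: frequency times severity."""
    return freq * sev

# Two hypothetical hazards with negatively correlated frequency and severity.
x = (0.9, 10.0)   # frequent but mild
y = (0.4, 49.0)   # rarer but far more severe

print(quantitative_risk(*x), quantitative_risk(*y))  # y carries the larger risk
print(matrix_rating(*x), matrix_rating(*y))          # but the matrix ranks x higher
```

Hazard y has more than twice the quantitative risk of x, yet falls just below both thresholds and receives the lowest matrix rating, while x's high frequency alone earns it a higher rating — exactly the inversion described under limitation (b).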

15.
The experience sampling method (ESM) was used to collect data from 74 part-time students who described and assessed the risks involved in their current activities when interrupted at random moments by text messages. The major categories of perceived risk were short term in nature and involved "loss of time or materials" related to work and "physical damage" (e.g., from transportation). Using techniques of multilevel analysis, we demonstrate effects of gender, emotional state, and types of risk on assessments of risk. Specifically, females do not differ from males in assessing the potential severity of risks but they see these as more likely to occur. Also, participants assessed risks to be lower when in more positive self-reported emotional states. We further demonstrate the potential of ESM by showing that risk assessments associated with current actions exceed those made retrospectively. We conclude by noting advantages and disadvantages of ESM for collecting data about risk perceptions.

16.
The current article describes the economic evaluation of interventions to control Campylobacter on chicken meat by means of a cost-utility analysis. Apart from the methodology used, the main focus of this article is on data gaps and assumptions made, and their impact on results and conclusions. The direct intervention costs, the relative risk, the disease burden (expressed in disability-adjusted life years (DALYs)), and the costs of illness for the various interventions are necessary inputs for the cost-utility analysis. The cost-utility ratio (CUR), the measure of efficiency, is expressed in net costs per avoided DALY. Most data gaps were of a biological nature, but for some interventions, information on costs was also scarce. As a consequence, assumptions had to be made, which had some impact on the estimated CUR. A higher (lower) incidence of Campylobacter infections associated with chicken meat, higher (lower) effectiveness, and lower (higher) intervention costs, respectively, would result in better (worse) CUR estimates. By taking the perspective of all consumers of Dutch chicken meat, rather than Dutch society alone, better CUR estimates could be obtained. Indirect costs or a shift toward non-Dutch chicken meat would both result in higher CUR estimates. Despite the assumptions made, three interventions showed relatively favorable CUR estimates in most of the sensitivity analyses applied: limiting fecal leakage during processing, carcass decontamination by dipping in a chemical solution, and phage therapy. However, all three come with some caveats.
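The CUR itself reduces to a one-line computation, net costs (intervention costs minus illness costs avoided) per avoided DALY. The figures below are hypothetical and serve only to illustrate the convention, not to reproduce the article's estimates:

```python
def cost_utility_ratio(intervention_costs, illness_costs_avoided, dalys_avoided):
    """CUR: net costs per avoided DALY (lower values indicate a more
    efficient intervention)."""
    net_costs = intervention_costs - illness_costs_avoided
    return net_costs / dalys_avoided

# Hypothetical intervention: costs 1.0M, avoids 0.2M in illness costs
# and 40 DALYs of disease burden.
print(cost_utility_ratio(1_000_000, 200_000, 40))  # 20000.0 per avoided DALY
```

With this convention, the sensitivity directions the abstract lists follow immediately: higher incidence or effectiveness raises both the illness costs avoided and the DALYs avoided, shrinking the ratio, while higher intervention costs inflate it.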

17.
A risk assessment comparing the acute effects of mineral oil and PCB-askarel dielectric fluids in two transformer sites was performed. The first site has the installation characteristics for a PCB-askarel-filled transformer with sprinkler fire protection. A risk comparison is made between two types of transformers (PCB-askarel-filled and mineral oil-filled) for this site. The second site (a vault) has the installation characteristics for a mineral oil-filled transformer, and a risk comparison is made in a fashion similar to the first site. Risk is expressed in terms of frequencies of one or more acute injuries or fatalities per transformer year.

18.
This article responds to the call to advance risk science as an independent research field by introducing a conceptual model for risk analysis based on distributed sensemaking. Significant advances in recent decades have expanded the use of risk analysis to almost every organization globally. Continued improvements have been made to our understanding of risk, placing a wide range of contexts under organizational control. This article argues that four dimensions are central to how organizations make sense of uncertainty in their context and hence do risk analysis: the activities the organization engages in, their sensory systems, the role and competence of individuals, and the ability to coordinate information through organizational structures. The structure enables insight into the decision-making process and the dimensions contributing to how organizations perceive risks and uncertainty in a given context. Three examples from the Arctic context illustrate the network risk analysis model's practical application and how it can expose weaknesses in these organizations' risk analysis and decision-making processes. Finally, the article discusses sensemaking in network risk analysis and how such an approach supports organizations' ability to perceive, collect, process, and decide on changes in context.

19.
This paper extends prior research by jointly assessing the roles of risk attitude and tolerance for ambiguity in predicting choice. An experiment examined the effects of these variables on decisions made in four different scenarios. The four scenarios (treatment combinations) were generated by manipulating risk and ambiguity into two levels (high and low). The context was defined in terms of a sample size selection problem. The second issue explored was the effect of attitudes toward risk and ambiguity on decision confidence. The results indicate that (1) both risk attitude and ambiguity intolerance determined choice behavior, (2) the roles of these individual attitudes depend on the levels of the two treatment variables of risk and ambiguity, (3) the presence of ambiguity accentuates the perception of risk in individual subjects, and (4) decision makers who are less risk averse, and have more tolerance for ambiguity, display greater confidence in their choice. The paper discusses some of the managerial implications of the results.

20.
A risk analysis was performed to examine the effect of changes in the Dutch greenhouse sector on the probability of occurrence and magnitude of indemnities induced by catastrophic natural hazards. Historical indemnities, which included direct and consequential losses resulting from severe hail and windstorms, were analyzed and used as input in a stochastic simulation model. Applications of the stochastic simulation model were illustrated under alternative risk conditions. A comparison was made between the current structure of greenhouse production and the structure expected in the next decade, which differs with respect to spatial distribution and average size.


Copyright © Beijing Qinyun Technology Development Co., Ltd.  京ICP备09084417号