Similar articles (20 results)
1.
Losses due to natural hazard events can be extraordinarily high and difficult to cope with. Therefore, there is considerable interest in estimating the potential impact of current and future extreme events at all scales in as much detail as possible. As hazards typically spread over wider areas, risk assessment must take into account interrelations between regions. Neglecting such interdependencies can lead to a severe underestimation of potential losses, especially for extreme events. This underestimation of extreme risk can lead to the failure of risk management strategies when they are most needed, namely, in times of unprecedented events. In this article, we suggest a methodology to incorporate such interdependencies in risk via the use of copulas. We demonstrate that by coupling losses, dependencies can be incorporated in risk analysis, avoiding the underestimation of risk. Based on maximum discharge data of river basins and stream networks, we present and discuss different ways to couple loss distributions of basins while explicitly incorporating tail dependencies. We distinguish between coupling methods that require river structure data for the analysis and those that do not. For the latter approach we propose a minimax algorithm to choose coupled basin pairs so that the underestimation of risk is avoided and the use of river structure data is not needed. The proposed methodology is especially useful for large‐scale analysis and we motivate and apply our method using the case of Romania. The approach can be easily extended to other countries and natural hazards.
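The coupling idea can be made concrete with a minimal sketch (not the authors' algorithm): compare the tail of summed basin losses under the independence copula against the comonotone copula min(u, v), the extreme limiting case of full tail dependence. The Pareto loss marginals and all parameter values below are illustrative assumptions, not data from the article.

```python
import numpy as np

rng = np.random.default_rng(42)
n = 100_000

# Illustrative Pareto loss marginal for a hypothetical basin
# (alpha and scale are assumptions, not from the article).
def pareto_quantile(u, alpha=2.5, scale=10.0):
    """Inverse CDF of a Pareto-type loss distribution."""
    return scale * ((1.0 - u) ** (-1.0 / alpha) - 1.0)

u1 = rng.uniform(size=n)
u2_indep = rng.uniform(size=n)   # independence copula: no dependence at all
u2_comon = u1                    # comonotone copula min(u, v): full tail dependence

loss_indep = pareto_quantile(u1) + pareto_quantile(u2_indep)
loss_comon = pareto_quantile(u1) + pareto_quantile(u2_comon)

# The 99% quantile of aggregate loss is markedly higher once tail
# dependence is coupled in -- ignoring it underestimates extreme risk.
q99_indep = float(np.quantile(loss_indep, 0.99))
q99_comon = float(np.quantile(loss_comon, 0.99))
```

The means of the two aggregate losses are essentially equal; only the tail quantiles differ, which is exactly the abstract's point about extreme events.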

2.
Teun Terpstra, Risk Analysis, 2011, 31(10): 1658-1675
Despite the prognoses of the effects of global warming (e.g., rising sea levels, increasing river discharges), few international studies have addressed how flood preparedness should be stimulated among private citizens. This article aims to predict Dutch citizens’ flood preparedness intentions by testing a path model, including previous flood hazard experiences, trust in public flood protection, and flood risk perceptions (both affective and cognitive components). Data were collected through questionnaire surveys in two coastal communities (n = 169, n = 244) and in one river area community (n = 658). Causal relations were tested by means of structural equation modeling (SEM). Overall, the results indicate that both cognitive and affective mechanisms influence citizens’ preparedness intentions. First, a higher level of trust reduces citizens’ perceptions of flood likelihood, which in turn hampers their flood preparedness intentions (cognitive route). Second, trust also lessens the amount of dread evoked by flood risk, which in turn impedes flood preparedness intentions (affective route). Moreover, the affective route showed that levels of dread were especially influenced by citizens’ negative and positive emotions related to their previous flood hazard experiences. Negative emotions most often reflected fear and powerlessness, while positive emotions most frequently reflected feelings of solidarity. The results are consistent with the affect heuristic and the historical context of Dutch flood risk management. The great challenge for flood risk management is the accommodation of both cognitive and affective mechanisms in risk communications, especially when most people lack an emotional basis stemming from previous flood hazard events.

3.
Abstract

This paper investigates the reliability of the Occupational Stress Indicator (OSI). Data from a sample of university staff, drawn from all areas of an urban university, are used to reassess the apparently low reliabilities of many of the OSI subscales reported by Cooper et al. (1988). In addition, factor analysis results are reported for the first time for the sources of pressure data. The reliabilities reported here, while higher than those originally obtained, remain unacceptably low. A lack of stability in the device as it is currently formulated seems apparent. The paper also presents for the first time a detailed analysis of the sources of pressure scale, indicating a solution different from that proposed by Cooper et al. (1988). Certain areas of the OSI clearly need refinement; the inclusion of locus of control and type A behaviour as personality variables in particular is called into question. Observations regarding the particular strengths and weaknesses of this device, and suggestions for future refinements, are offered.

4.
This paper presents new identification conditions for the mixed proportional hazard model. In particular, the baseline hazard is assumed to be bounded away from 0 and ∞ near t = 0. These conditions ensure that the information matrix is nonsingular. The paper also presents an estimator for the mixed proportional hazard model that converges at rate N^(−1/2).

5.
Today there are more than 80,000 chemicals in commerce and the environment. The potential human health risks are unknown for the vast majority of these chemicals as they lack human health risk assessments, toxicity reference values, and risk screening values. We aim to use computational toxicology and quantitative high‐throughput screening (qHTS) technologies to fill these data gaps, and begin to prioritize these chemicals for additional assessment. In this pilot, we demonstrate how we were able to identify that benzo[k]fluoranthene may induce DNA damage and steatosis using qHTS data and two separate adverse outcome pathways (AOPs). We also demonstrate how bootstrap natural spline‐based meta‐regression can be used to integrate data across multiple assay replicates to generate a concentration–response curve. We used this analysis to calculate an in vitro point of departure of 0.751 μM and risk‐specific in vitro concentrations of 0.29 μM and 0.28 μM for 1:1,000 and 1:10,000 risk, respectively, for DNA damage. Based on the available evidence, and considering that only a single HSD17B4 assay is available, we have low overall confidence in the steatosis hazard identification. This case study suggests that coupling qHTS assays with AOPs and ontologies will facilitate hazard identification. Combining this with quantitative evidence integration methods, such as bootstrap meta‐regression, may allow risk assessors to identify points of departure and risk‐specific internal/in vitro concentrations. These results are sufficient to prioritize the chemicals; however, in the longer term we will need to estimate external doses for risk screening purposes, such as through margin of exposure methods.
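A toy version of the bootstrap meta-regression workflow can make the point-of-departure (POD) idea concrete. The sketch below assumes synthetic Hill-shaped qHTS replicates and uses a cubic polynomial in log-concentration as a stand-in for the article's natural splines; every name and number here is an illustrative assumption, not the article's benzo[k]fluoranthene data.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic qHTS-style data: 8 concentrations (micromolar), 3 assay replicates.
conc = np.logspace(-2, 1, 8)
def hill(c, top=100.0, ac50=0.75, h=1.5):
    return top * c**h / (ac50**h + c**h)
replicates = np.array([hill(conc) + rng.normal(0, 5, conc.size) for _ in range(3)])

def fit_pod(resp, benchmark=10.0):
    """Fit a smooth curve to averaged replicates and read off the
    concentration producing a 10%-of-maximum response (the POD)."""
    coef = np.polyfit(np.log10(conc), resp.mean(axis=0), 3)
    grid = np.linspace(np.log10(conc[0]), np.log10(conc[-1]), 500)
    fitted = np.polyval(coef, grid)
    idx = int(np.argmax(fitted >= benchmark))  # first crossing of the benchmark
    return 10.0 ** grid[idx]

# Bootstrap across replicates to get a distribution of PODs.
pods = []
for _ in range(200):
    sample = replicates[rng.integers(0, 3, size=3)]  # resample replicate rows
    pods.append(fit_pod(sample))
pod_median = float(np.median(pods))
```

The bootstrap spread of `pods` is what supports risk-specific concentration estimates; the real analysis integrates many assays rather than one synthetic curve.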

6.
The approximate solution of the two-stage clonal expansion model of cancer may substantially deviate from the exact solution, and may therefore lead to erroneous conclusions in particular applications. However, for time-varying parameters the exact solution (method of characteristics) is not easy to implement, hampering the accessibility of the model to nonmathematicians. Based on intuitive reasoning, Clewell et al. (1995) proposed an improved approximate solution that is easy to implement whatever time-varying behavior the parameters may have. Here we provide the mathematical foundation for the approximation suggested by Clewell et al. (1995) and show that, after a slight modification, it is in fact an exact solution for the case of time-constant parameters. We were not able to prove that it is an exact solution for time-varying parameters as well. However, several computer simulations showed that the numerical results do not differ from the exact solution as proposed by Moolgavkar and Luebeck (1990). The advantage of this alternative solution is that the hazard rate of the first malignant cell can be evaluated by numerically integrating a single differential equation.
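To make the single-ODE idea concrete, here is a minimal sketch using the crude expected-count approximation (dI/dt = νN + (α − β)I, with hazard h(t) ≈ μI(t)) — i.e., the simple approximation whose limitations motivate the article, not Clewell et al.'s improved solution. Parameter names and values are assumptions chosen only for illustration.

```python
# Two-stage clonal expansion sketch: normal cells N acquire a first hit at
# rate nu, intermediate cells I divide/die at rates alpha/beta and convert
# to malignant cells at rate mu.  All values are illustrative.
nu, alpha, beta, mu = 1e-7, 1.0, 0.9, 1e-7
N = 1e7  # normal cell pool, held constant here

def hazard(t_end, dt=0.01):
    """Approximate hazard of the first malignant cell at time t_end by
    forward-Euler integration of a single ODE for the expected
    intermediate-cell count:
        dI/dt = nu*N + (alpha - beta)*I,   h(t) ~= mu * I(t)
    Time-varying parameters would simply become functions of t inside
    the loop, which is the practical appeal of a single-ODE form."""
    I, t = 0.0, 0.0
    while t < t_end:
        I += dt * (nu * N + (alpha - beta) * I)
        t += dt
    return mu * I

h50 = hazard(50.0)  # hazard at an illustrative age of 50
```

The exact constant-parameter solution differs from this expected-count form; the sketch only shows why a one-ODE formulation is attractive for nonmathematicians.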

7.
Risk assessment is the process of estimating the likelihood that an adverse effect may result from exposure to a specific health hazard. The process traditionally involves hazard identification, dose-response assessment, exposure assessment, and risk characterization to answer “How many excess cases of disease A will occur in a population of size B due to exposure to agent C at dose level D?” For natural hazards, however, we modify the risk assessment paradigm to answer “How many excess cases of outcome Y will occur in a population of size B due to natural hazard event E of severity D?” Using a modified version involving hazard identification, risk factor characterization, exposure characterization, and risk characterization, we demonstrate that epidemiologic modeling and measures of risk can quantify the risks from natural hazard events. We further extend the paradigm to address mitigation, the equivalent of risk management, to answer “What is the risk for outcome Y in the presence of prevention intervention X relative to the risk for Y in the absence of X?” We use the preventable fraction to estimate the efficacy of mitigation, or reduction in adverse health outcomes as a result of a prevention strategy under ideal circumstances, and further estimate the effectiveness of mitigation, or reduction in adverse health outcomes under typical community-based settings. By relating socioeconomic costs of mitigation to measures of risk, we illustrate that prevention effectiveness is useful for developing cost-effective risk management options.
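The preventable fraction the abstract relies on has a simple closed form, PF = (Ru − Rp)/Ru, where Ru is the outcome risk without the intervention and Rp the risk with it. A worked example (the risk numbers are hypothetical):

```python
def preventable_fraction(risk_without, risk_with):
    """PF = (Ru - Rp) / Ru: the proportional reduction in adverse
    outcomes attributable to the prevention strategy."""
    return (risk_without - risk_with) / risk_without

# Hypothetical flood-mortality risks with and without an early-warning system.
pf = preventable_fraction(0.008, 0.002)

# Excess cases averted in a population of size B = 100,000:
cases_averted = 100_000 * (0.008 - 0.002)
```

Here the intervention prevents 75% of the outcome burden, i.e., 600 cases averted in a population of 100,000 under these assumed risks.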

8.
Self-efficacy is one of the strongest and most consistent drivers of private flood mitigation behavior; however, the factors influencing self-efficacy in the context of flooding remain unclear. The present study examines three potential antecedents of self-efficacy: personal and vicarious experiences of floods or building-related events, social norms for private flood preparedness, and personal competencies such as technical abilities and social skills. While controlling for other drivers in a protection motivation theory (PMT) framework, these antecedents are tested as precursors of self-efficacy and intentions to improve flood resilience. Structural equation modeling is applied to conduct mediation analyses with survey data of 381 flood-prone households in Austria. Contrary to theoretical expectations, personal and vicarious experiences do not predict self-efficacy, presumably because rare flood events and changing hazard characteristics do not facilitate generalizable performance accomplishments. Social norms strongly and consistently influence self-efficacy, especially for actions observable by others, and also directly influence protective responses. Personal competencies increase self-efficacy and support protective action, particularly with regard to preventive and structural measures. The strength and direction of the antecedents of self-efficacy as well as of other PMT determinants vary between general and specific protective responses. This study provides important insights for risk managers, suggesting that interventions involving social norms and personal competencies can be effective in stimulating self-efficacy and, in turn, private flood mitigation. Interventions and research should clearly differentiate between general intention and the implementation of specific measures, and should address cumulative, synergistic, or tradeoff interrelations between multiple measures.

9.
A significant majority of hazardous materials (hazmat) shipments are moved via the highway and railroad networks, wherein the latter mode is generally preferred for long distances. Although the characteristics of highway transportation make trucks the most dominant surface transportation mode, should it be preferred for hazmat whose accidental release can cause catastrophic consequences? We answer this question by first developing a novel and comprehensive assessment methodology—which incorporates the sequence of events leading to hazmat release from the derailed railcars and the resulting consequence—to measure rail transport risk, and second making use of the proposed assessment methodology to analyze hazmat transport risk resulting from meeting the demand for chlorine and ammonia in six distinct corridors in North America. We demonstrate that rail transport will reduce risk, irrespective of the risk measure and the transport corridor, and that every attempt must be made to use railroads to transport these shipments.

10.
Abstract

This paper describes a micro-analysis of the cognitive appraisal of daily stressful events in a sample of correctional officers (COs). More specifically, the authors examined whether three attribution dimensions mediated the relationship between the occurrence of stressful events and the ‘significance’ of these events, and whether the latter functioned as a mediator between the attribution dimensions on the one hand and negative affect (outcome variable) on the other. Convincing indications were found for the mediating role of the ‘significance’ of a stressful event, while weak indications were found for the mediating role of the attribution dimensions. Finally, the strengths and weaknesses of daily event-recording methods are discussed at length.

11.
The objective of the present study was to integrate the relative risk from mercury exposure to stream biota, groundwater, and humans in the Río Artiguas (Sucio) river basin, Nicaragua, where local gold mining occurs. A hazard quotient was used as a common exchange rate in probabilistic estimations of exposure and effects by means of Monte Carlo simulations. The endpoint for stream organisms was the lethal no‐observed‐effect concentration (NOEC), for groundwater the WHO guideline and the inhibitory Hg concentrations in bacteria (IC), and for humans the tolerable daily intake (TDI) and the benchmark dose level with an uncertainty factor of 10 (BMDL0.1). Macroinvertebrates and fish in the contaminated river face a higher risk from Hg exposure than humans eating contaminated fish or bacteria living in the groundwater. The river sediment is the most hazardous source for the macroinvertebrates, and macroinvertebrates pose the highest risk to fish. The distribution of body concentrations of Hg in fish in the mining areas of the basin may exceed the distribution of endpoint values with close to 100% probability. Similarly, the Hg concentration in cord blood of humans feeding on fish from the river was predicted to exceed the BMDL0.1 with about 10% probability. Most of the risk to the groundwater quality is confined to the vicinity of the gold refining plants and along the river, with a probability of about 20% to exceed the guideline value.
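The probabilistic hazard-quotient approach can be sketched in a few lines: draw exposure and effect-endpoint values from distributions, form HQ = exposure / endpoint, and report the probability that HQ exceeds 1. The lognormal distributions and all parameter values below are assumptions for illustration, not the Río Artiguas data.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 100_000

# Hypothetical distributions: Hg intake vs. a tolerable-daily-intake-style
# endpoint, both lognormal (values are illustrative assumptions).
exposure = rng.lognormal(mean=np.log(0.5), sigma=0.8, size=n)
endpoint = rng.lognormal(mean=np.log(1.0), sigma=0.3, size=n)

hq = exposure / endpoint                 # hazard quotient per Monte Carlo draw
p_exceed = float(np.mean(hq > 1.0))      # probability the endpoint is exceeded
```

Because HQ is dimensionless, the same exceedance-probability summary serves as a "common exchange rate" across receptors (biota, groundwater, humans), which is the integration trick the abstract describes.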

12.
Natural hazards, such as major flood events, are occurring with increasing frequency and inflicting increasing levels of financial damages upon affected communities. The experience of such major flood events has brought about a significant change in attitudes to flood‐risk management, with a shift away from built engineering solutions alone towards a more multifaceted approach. Europe's experience with damaging flood episodes provided the impetus for the introduction of the European Floods Directive, requiring the establishment of flood‐risk management plans at the river‐basin scale. The effectiveness of such plans, focusing on prevention, protection, and preparedness, is dependent on adequate flood awareness and preparedness, and this is related to perception of flood risk. This is an important factor in the design and assessment of flood‐risk management. Whilst there is a modern body of literature exploring flood perception issues, there have been few examples that explore its spatial manifestations. Previous literature has examined perceived and real distance to a hazard source (such as a river, nuclear facility, landfill, or incinerator, etc.), whereas this article advances the literature by including an objectively assessed measure of distance to a perceived flood zone, using a cognitive mapping methodology. The article finds that distance to the perceived flood zone (perceived flood exposure) is a crucial factor in determining flood‐risk perception, both the cognitive and affective components. Furthermore, we find an interesting phenomenon of misperception among respondents. The article concludes by discussing the implications for flood‐risk management.

13.
Landslide Risk Models for Decision Making
This contribution presents a quantitative procedure for landslide risk analysis and zoning considering hazard, exposure (or value of elements at risk), and vulnerability. The method provides the means to obtain landslide risk models (expressing expected damage due to landslides on material elements and economic activities in monetary terms, according to different scenarios and periods) useful to identify areas where mitigation efforts will be most cost effective. It allows the identification of priority areas for the implementation of actions to reduce vulnerability (elements) or hazard (processes). The procedure proposed can also be used as a preventive tool, through its application to strategic environmental impact analysis (SEIA) of land-use plans. The underlying hypothesis is that reliable predictions about hazard and risk can be made using models based on a detailed analysis of past landslide occurrences in connection with conditioning factors and data on past damage. The results show that the approach proposed and the hypothesis formulated are essentially correct, providing estimates of the order of magnitude of expected losses for a given time period. Uncertainties, strengths, and shortcomings of the procedure and results obtained are discussed and potential lines of research to improve the models are indicated. Finally, comments and suggestions are provided to generalize this type of analysis.
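The hazard–exposure–vulnerability decomposition reduces, in its simplest monetary form, to summing probability × exposed value × vulnerability over scenarios. A toy example with entirely hypothetical numbers:

```python
# Illustrative landslide risk model: expected annual loss =
# sum over scenarios of (annual probability x exposed value x vulnerability).
# The scenario list is hypothetical, not from the article.
scenarios = [
    # (annual probability, exposed value in EUR, vulnerability in [0, 1])
    (0.10,  2_000_000, 0.05),   # frequent shallow slides, light damage
    (0.01, 10_000_000, 0.40),   # rare large failure, heavy damage
]

expected_annual_loss = sum(p * value * vuln for p, value, vuln in scenarios)
```

Computing this per zone and ranking zones by expected annual loss is the kind of output that lets planners target mitigation where it is most cost effective.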

14.
Mortality effects of exposure to air pollution and other environmental hazards are often described by the estimated number of “premature” or “attributable” deaths and the economic value of a reduction in exposure as the product of an estimate of “statistical lives saved” and a “value per statistical life.” These terms can be misleading because the number of deaths advanced by exposure cannot be determined from mortality data alone, whether from epidemiology or randomized trials (it is not statistically identified). The fraction of deaths “attributed” to exposure is conventionally derived as the hazard fraction (R – 1)/R, where R is the relative risk of mortality between high and low exposure levels. The fraction of deaths advanced by exposure (the “etiologic” fraction) can be substantially larger or smaller: it can be as large as one and as small as 1/e (≈0.37) times the hazard fraction (if the association is causal and zero otherwise). Recent literature reveals misunderstanding about these concepts. Total life years lost in a population due to exposure can be estimated but cannot be disaggregated by age or cause of death. Economic valuation of a change in exposure-related mortality risk to a population is not affected by inability to know the fraction of deaths that are etiologic. When individuals facing larger or smaller changes in mortality risk cannot be identified, the mean change in population hazard is sufficient for valuation; otherwise, the economic value can depend on the distribution of risk reductions.
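A worked instance of the two fractions the abstract contrasts, using an illustrative relative risk:

```python
import math

def hazard_fraction(R):
    """Conventional attributable fraction among the exposed: (R - 1) / R."""
    return (R - 1.0) / R

R = 1.5                          # illustrative relative risk
hf = hazard_fraction(R)          # 1/3 of deaths conventionally "attributed"

# Per the abstract, the etiologic fraction (deaths actually advanced by
# exposure) is not identified by R alone; under a causal association it
# can range from (1/e) times the hazard fraction up to one.
ef_lower = hf / math.e           # about 0.123 here
ef_upper = 1.0
```

The width of that interval is precisely why counting "premature deaths" from mortality data alone is not statistically identified.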

15.
Pipeline damage by dropped objects from crane activities is a significant hazard for offshore platform installations. In this paper a probabilistic methodology is utilized for the estimation of the pipeline impact and rupture frequencies; this information is obtained both for the overall pipeline section exposed to the hazard and for a number of critical locations along the pipeline route. The presented algorithm has been implemented in a computer program that allows the analysis of a large number of possible drop points and pipeline target point locations. This methodology may be used in common risk analysis studies for evaluating the risk for platform personnel from dropped objects; however, the proposed technique may also be useful for other applications where engineering judgment has so far been the main driving criterion. In particular, two sample cases have been analyzed. The first one is the problem of selecting the best approaching route to a platform. By analyzing different route alternatives, a reduction of the impact frequency and therefore of the risk for the platform personnel may be achieved. The second application deals with the selection of the location for a safety valve at the riser base. The analysis may give useful information, such as the highest impact frequency location and the rupture frequencies upstream and downstream of the valve as a function of the valve position; this information, together with the transported medium inventory upstream of the valve, may give the designer a documented and justifiable rationale for selecting the best location for the valve from a safety point of view.

16.
Convex (concave) interaction weighting functions are combined with circular configurations of black and white sites to determine configurations that have minimum (maximum) weight. These configurations are called maximally even configurations. It is shown that for a given number of black and white sites, all maximally even configurations are equivalent under rotation and reflection, and a simple algorithm is constructed that generates these configurations. A number of equivalent conditions that determine a maximally even configuration are established. These equivalent conditions permit maximally even configurations to apply to a number of seemingly disparate problems including the dinner table and concentric circles problems, the one-dimensional antiferromagnetic Ising model, and musical scales. This paper is dedicated to the memory of John Clough (1928–2003). Without his seminal works in music theory and patient encouragement of others, this work and much of the work referenced herein would never have been started, much less completed. The field of mathematical music theory owes a great debt to John Clough. The authors are privileged to have known and worked with John Clough.  
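One standard formulation of the simple generating algorithm is the Clough–Douthett J-function: place the i-th black site at floor(i·n/k) for k black sites among n circular positions; every maximally even configuration is then a rotation or reflection of this one. A sketch (the musical reading is the classic example from this literature):

```python
from math import floor

def maximally_even(k, n):
    """One maximally even placement of k black sites among n circular
    positions (the J-function form); all other maximally even
    configurations are rotations/reflections of this one."""
    return [floor(i * n / k) for i in range(k)]

def circular_steps(sites, n):
    """Gaps between consecutive sites around the circle."""
    s = sorted(sites)
    return [(s[(i + 1) % len(s)] - s[i]) % n for i in range(len(s))]

diatonic = maximally_even(7, 12)   # 7 sites among 12: a diatonic scale shape
```

The resulting step pattern uses only two gap sizes (here five whole steps and two half steps), the hallmark of maximal evenness, which is what links musical scales to the antiferromagnetic ground-state problems mentioned in the abstract.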

17.
Manufacturing capability has often been viewed to be a major obstacle in achieving higher levels of customization. Companies follow various strategies ranging from equipment selection to order process management to cope with the challenges of increased customization. We examined how the customization process affects product performance and conformance in the context of a design‐to‐order (DTO) manufacturer of industrial components. Our competing risk hazard function model incorporates two thresholds, which we define as mismatch and manufacturing thresholds. Product performance was adversely affected when the degree of customization exceeded the mismatch threshold. Likewise, product conformance eroded when the degree of customization exceeded the manufacturing threshold. Relative sizes of the two thresholds have management implications for the subsequent investments to improve customization capabilities. Our research developed a rigorous framework to address two key questions relevant to the implementation of product customization: (1) what degrees of customization to offer, and (2) how to customize the product design process.

18.
Abstract

Web-based instruction, also called e-learning, is currently one of the most talked-about education and training media. To prepare courses for online delivery and to maintain their effectiveness, the designer must have an understanding of e-learning instructional design principles. Action learning is a proven, effective management development process that has not been implemented to date as an e-learning instructional methodology. The purpose of this exploratory case study was to examine the impact of the action learning process on the effectiveness of management level web-based instruction (WBI). A leader-led, management-level course using face-to-face delivery was converted to web-based instruction where action learning was the delivery methodology. Kirkpatrick's Four Levels of Evaluation served as the evaluation tool to determine effectiveness of the intervention. It was found that, though challenging to facilitate, the action learning online method is effective and yields changes in participants' knowledge. However, contrary to expectations, online learning communities did not form.

19.
Cluster‐based segmentation usually involves two sets of variables: (i) the needs‐based variables (referred to as the bases variables), which are used in developing the original segments to identify the value, and (ii) the classification or background variables, which are used to profile or target the customers. The managers’ goal is to utilize these two sets of variables in the most efficient manner. Pragmatic managerial interests recognize the underlying need to start shifting from methodologies that obtain highly precise value‐based segments but may be of limited practical use as they provide less targetable segments. Consequently, the imperative is to shift toward newer segmentation approaches that provide greater focus on targetable segments while maintaining homogeneity. This requires dual objective segmentation, which is a combinatorially difficult problem. Hence, we propose and examine a new evolutionary methodology based on genetic algorithms to address this problem. We show, based on a large‐scale Monte Carlo simulation and a case study, that the proposed approach consistently outperforms the existing methods for a wide variety of problem instances. We are able to obtain statistically significant and managerially important improvements in targetability with little diminution in the identifiability of value‐based segments. Moreover, the proposed methodology provides a set of good solutions, unlike existing methodologies that provide a single solution. We also show how these good solutions can be used to plot an efficient Pareto frontier. Finally, we present useful insights that would help managers in implementing the proposed solution approach effectively.

20.
This article presents an iterative six‐step risk analysis methodology based on hybrid Bayesian networks (BNs). In typical risk analysis, systems are usually modeled as discrete and Boolean variables with constant failure rates via fault trees. Nevertheless, in many cases, it is not possible to perform an efficient analysis using only discrete and Boolean variables. The approach put forward by the proposed methodology makes use of BNs and incorporates recent developments that facilitate the use of continuous variables whose values may have any probability distributions. Thus, this approach makes the methodology particularly useful in cases where the available data for quantification of hazardous events probabilities are scarce or nonexistent, there is dependence among events, or when nonbinary events are involved. The methodology is applied to the risk analysis of a regasification system of liquefied natural gas (LNG) on board an FSRU (floating, storage, and regasification unit). LNG is becoming an important energy source option and the world's capacity to produce LNG is surging. Large reserves of natural gas exist worldwide, particularly in areas where the resources exceed the demand. Thus, this natural gas is liquefied for shipping and the storage and regasification process usually occurs at onshore plants. However, a new option for LNG storage and regasification has been proposed: the FSRU. As very few FSRUs have been put into operation, relevant failure data on FSRU systems are scarce. The results show the usefulness of the proposed methodology for cases where the risk analysis must be performed under considerable uncertainty.
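What a "hybrid" network buys over a Boolean fault tree can be sketched without any BN library: let a continuous node with an arbitrary distribution drive the conditional probability of a discrete child, then combine with a Boolean root by forward Monte Carlo sampling. The network structure, distributions, and numbers below are illustrative assumptions, not the article's FSRU model.

```python
import numpy as np

rng = np.random.default_rng(7)
n = 200_000

# Hybrid-BN sketch: continuous parent (leak rate, lognormal) -> discrete
# child (ignition, probability depends on the parent's sampled value),
# combined with an independent Boolean root (detection failure).
leak_rate = rng.lognormal(mean=-2.0, sigma=1.0, size=n)     # kg/s, illustrative
p_ignition = 1.0 - np.exp(-0.5 * leak_rate)                 # CPD given the parent
ignition = rng.uniform(size=n) < p_ignition
detection_failed = rng.uniform(size=n) < 0.1                # Boolean root node

fire_event = ignition & detection_failed                    # top event
p_fire = float(fire_event.mean())
```

A constant-probability fault tree would have to collapse the whole leak-rate distribution into one number; here the dependence of ignition on leak severity is carried through sample by sample, which is the point of mixing continuous and discrete nodes.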
