Similar Articles
20 similar articles found (search time: 31 ms)
1.
This article develops a methodology for quantifying model risk in quantile risk estimates. The application of quantile estimates to risk assessment has become common practice in many disciplines, including hydrology, climate change, statistical process control, insurance and actuarial science, and the uncertainty surrounding these estimates has long been recognized. Our work is particularly important in finance, where quantile estimates (called Value‐at‐Risk) have been the cornerstone of banking risk management since the mid-1980s. A recent amendment to the Basel II Accord recommends additional market risk capital to cover all sources of "model risk" in the estimation of these quantiles. We provide a novel and elegant framework whereby quantile estimates are adjusted for model risk, relative to a benchmark which represents the state of knowledge of the authority that is responsible for model risk. A simulation experiment in which the degree of model risk is controlled illustrates how to quantify Value‐at‐Risk model risk and compute the required regulatory capital add‐on for banks. An empirical example based on real data shows how the methodology can be put into practice, using only two time series (daily Value‐at‐Risk and daily profit and loss) from a large bank. We conclude with a discussion of potential applications to nonfinancial risks.
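As a minimal sketch of the underlying idea (not the authors' actual framework), a historical-simulation Value-at-Risk is just an empirical quantile of daily P&L, and a benchmark-relative capital add-on can be read off as the gap to a more conservative benchmark VaR. The 10% benchmark markup below is an arbitrary assumption for illustration:

```python
import numpy as np

def historical_var(pnl, alpha=0.01):
    """Empirical alpha-quantile of daily P&L, reported as a positive loss figure."""
    return -np.quantile(pnl, alpha)

def model_risk_adjusted_var(model_var, benchmark_var):
    """Raise the model's VaR to the benchmark; the gap is a simple add-on."""
    add_on = max(benchmark_var - model_var, 0.0)
    return model_var + add_on, add_on

rng = np.random.default_rng(0)
pnl = rng.normal(0.0, 1.0, 10_000)           # simulated daily P&L series
var_model = historical_var(pnl, alpha=0.01)  # bank's internal 99% VaR estimate
var_bench = 1.10 * var_model                 # assumed stricter regulatory benchmark
adjusted, add_on = model_risk_adjusted_var(var_model, var_bench)
```

With only the two series the article mentions (daily VaR and daily P&L), the same comparison can be run day by day against a regulator's benchmark quantile.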

2.
Cost‐benefit analysis (CBA) is commonly applied as a tool for deciding on risk protection. With CBA, one can identify risk mitigation strategies that lead to an optimal tradeoff between the costs of the mitigation measures and the achieved risk reduction. In practical applications of CBA, the strategies are typically evaluated through efficiency indicators such as the benefit‐cost ratio (BCR) and the marginal cost (MC) criterion. In many of these applications, the BCR is not consistently defined, which, as we demonstrate in this article, can lead to the identification of suboptimal solutions. This is of particular relevance when the overall budget for risk reduction measures is limited and an optimal allocation of resources among different subsystems is necessary. We show that this problem can be formulated as a hierarchical decision problem, where the general rules and decisions on the available budget are made at a central level (e.g., central government agency, top management), whereas the decisions on the specific measures are made at the subsystem level (e.g., local communities, company division). It is shown that the MC criterion provides optimal solutions in such hierarchical optimization. Since most practical applications only include a discrete set of possible risk protection measures, the MC criterion is extended to this situation. The findings are illustrated through a hypothetical numerical example. This study was prepared as part of our work on the optimal management of natural hazard risks, but its conclusions also apply to other fields of risk management.
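A small hypothetical example (the measure names and numbers are invented, not taken from the article) shows the failure mode the abstract describes: greedily ranking discrete measures by BCR can miss the budget-feasible portfolio with the largest risk reduction:

```python
from itertools import combinations

# Hypothetical measures: (name, cost, achieved risk reduction), budget in same units.
measures = [("levee", 8.0, 12.0), ("warning", 3.0, 5.0), ("retrofit", 4.0, 5.8)]
budget = 8.0

def bcr_ranking(measures, budget):
    """Greedy selection in descending benefit-cost ratio (common practice)."""
    chosen, spent = [], 0.0
    for name, cost, benefit in sorted(measures, key=lambda m: m[2] / m[1], reverse=True):
        if spent + cost <= budget:
            chosen.append(name)
            spent += cost
    return chosen

def optimal_selection(measures, budget):
    """Exhaustive search for the feasible portfolio maximizing risk reduction
    (tractable here because the measure set is tiny)."""
    best, best_benefit = [], 0.0
    for r in range(len(measures) + 1):
        for combo in combinations(measures, r):
            cost = sum(m[1] for m in combo)
            benefit = sum(m[2] for m in combo)
            if cost <= budget and benefit > best_benefit:
                best, best_benefit = [m[0] for m in combo], benefit
    return best, best_benefit

greedy = bcr_ranking(measures, budget)            # picks "warning" then "retrofit"
best, best_benefit = optimal_selection(measures, budget)  # the levee alone does better
```

Here the greedy BCR portfolio achieves a risk reduction of 10.8, while the single high-cost levee achieves 12.0 within the same budget.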

3.
Groundwater leakage into subsurface constructions can cause reduction of pore pressure and subsidence in clay deposits, even at large distances from the location of the construction. The potential cost of damage is substantial, particularly in urban areas. The large‐scale process also implies heterogeneous soil conditions that cannot be described in complete detail, which causes a need for estimating uncertainty of subsidence with probabilistic methods. In this study, the risk of subsidence is estimated by coupling two probabilistic models, a geostatistics‐based soil stratification model with a subsidence model. Statistical analyses of stratification and soil properties are inputs into the models. The results include spatially explicit probabilistic estimates of subsidence magnitude and sensitivities of included model parameters. From these, areas with a significant risk of subsidence are distinguished from low‐risk areas. The efficiency and usefulness of this modeling approach as a tool for communication to stakeholders, decision support for prioritization of risk‐reducing measures, and identification of the need for further investigations and monitoring are demonstrated with a case study of a planned tunnel in Stockholm.

4.
This article estimates the value of a statistical life (VSL) for Chile under the hedonic wage method while accounting for individual risk preferences. Two alternative measures of risk aversion are used. First, risk aversion is directly measured using survey measures of preferences over hypothetical gambles; second, observed individual behaviors that may proxy for risk preferences, such as smoking status, are used. I reconcile the results with a theoretical model of economic behavior that predicts how the wage‐risk tradeoff changes as risk aversion differs across individuals. The VSL estimates range between 0.61 and 8.68 million dollars. The results using smoking behavior as a proxy for risk attitudes are consistent with previous findings. However, directly measuring risk aversion corrects the wage‐risk tradeoff estimation bias in the opposite direction. The results are robust to other observed measures of risk aversion such as drinking behavior and stock investments. Results suggest that, consistent with the literature that connects smoking behavior with labor market outcomes, smoking status could be capturing a poor-health productivity effect in addition to purely risk preferences.

5.
Risk Analysis, 2018, 38(6): 1258-1278
Although individual behavior plays a major role in community flood risk, traditional flood risk models generally do not capture information on how community policies and individual decisions impact the evolution of flood risk over time. The purpose of this study is to improve the understanding of the temporal aspects of flood risk through a combined analysis of the behavioral, engineering, and physical hazard aspects of flood risk. Additionally, the study aims to develop a new modeling approach for integrating behavior, policy, flood hazards, and engineering interventions. An agent‐based model (ABM) is used to analyze the influence of flood protection measures, individual behavior, and the occurrence of floods and near‐miss flood events on community flood risk. The ABM focuses on the following decisions and behaviors: dissemination of flood management information, installation of community flood protection, elevation of household mechanical equipment, and elevation of homes. The approach is place based, with a case study area in Fargo, North Dakota, but is focused on generalizable insights. Generally, community mitigation results in reduced future damage, and individual action, including mitigation and movement into and out of high‐risk areas, can have a significant influence on community flood risk. The results of this study provide useful insights into the interplay between individual and community actions and how it affects the evolution of flood risk. This study lends insight into priorities for future work, including the development of more in‐depth behavioral and decision rules at the individual and community level.

6.
The aim of this study was to develop a reliable and valid measure of hurricane risk perception. The utility of such a measure lies in the need to understand how people make decisions when facing an evacuation order. This study included participants located within a 15‐mile buffer of the Gulf and southeast Atlantic U.S. coasts. The study was executed as a three‐wave panel with mail surveys in 2010–2012 (T0 baseline N = 629, 56%; T1 retention N = 427, 75%; T2 retention N = 350, 89%). An inventory based on the psychometric model was developed to discriminate cognitive and affective perceptions of hurricane risk, and included open‐ended responses to solicit additional concepts in the T0 survey. Analysis of the T0 data modified the inventory and this revised item set was fielded at T1 and then replicated at T2. The resulting scales were assessed for validity against existing measures for perception of hurricane risk, dispositional optimism, and locus of control. A measure of evacuation expectation was also examined as a dependent variable, which was significantly predicted by the new measures. The resulting scale was found to be reliable, stable, and largely valid against the comparison measures. Despite limitations involving sample size, bias, and the strength of some reliabilities, it was concluded that the measure has potential to inform approaches to hurricane preparedness efforts and advance planning for evacuation messages, and that the measure has good promise to generalize to other contexts in natural hazards as well as other domains of risk.

7.
Waterborne disease is estimated to cause about 10% of all diseases worldwide. However, related risk perceptions are not well understood, particularly in the developing world where waterborne disease is an enormous problem. We focus on understanding risk perceptions related to these issues in a region within northern Mexico. Our findings show how waterborne disease problems and solutions are understood in eight small communities along a highly contaminated river system. We found major differences in risk perceptions between health professionals, government officials, and lay citizens. Health professionals believed that a high level of human‐waste‐related risk existed within the region. Few officials and lay citizens shared this belief. In addition, few officials and lay citizens were aware of poor wastewater‐management‐related disease outbreaks and water contamination. Finally, aside from health professionals, few interviewees understood the importance of basic hygiene and water treatment measures that could help to prevent disease. Our results add to the literature on environmentally‐related risk perceptions in the developing world. We discuss recommendations for improving future human‐wastewater‐related risk communication within the region.

8.
The Petroleum Safety Authority Norway (PSA‐N) has recently adopted a new definition of risk: “the consequences of an activity with the associated uncertainty.” The PSA‐N has also been using “deficient risk assessment” for some time as a basis for assigning nonconformities in audit reports. This creates an opportunity to study the link between risk perspective and risk assessment quality in a regulatory context, and, in the present article, we take a hard look at the term “deficient risk assessment” both normatively and empirically. First, we perform a conceptual analysis of how a risk assessment can be deficient in light of a particular risk perspective consistent with the new PSA‐N risk definition. Then, we examine the usages of the term “deficient” in relation to risk assessments in PSA‐N audit reports and classify these into a set of categories obtained from the conceptual analysis. At an overall level, we were able to identify on what aspects of the risk assessment the PSA‐N is focusing and where deficiencies are being identified in regulatory practice. A key observation is that there is a diversity in how the agency officials approach the risk assessments in audits. Hence, we argue that improving the conceptual clarity of what the authorities characterize as “deficient” in relation to the uncertainty‐based risk perspective may contribute to the development of supervisory practices and, eventually, potentially strengthen the learning outcome of the audit reports.

9.
Reacting to an emergency requires quick decisions under stressful and dynamic conditions. To react effectively, responders need to know the right actions to take given the risks posed by the emergency. While existing research on risk scales focuses primarily on decision making in static environments with known risks, these scales may be inappropriate for conditions where the decision maker's time and mental resources are limited and may be infeasible if the actual risk probabilities are unknown. In this article, we propose a method to develop context‐specific, scenario‐based risk scales designed for emergency response training. Emergency scenarios are used as scale points, reducing our dependence on known probabilities; these are drawn from the targeted emergency context, reducing the mental resources required to interpret the scale. The scale is developed by asking trainers/trainees to rank order a range of risk scenarios and then aggregating these orderings using a Kemeny ranking. We propose measures to assess this aggregated scale's internal consistency, reliability, and validity, and we discuss how to use the scale effectively. We demonstrate our process by developing a risk scale for subsurface coal mine emergencies and test the reliability of the scale by repeating the process, with some methodological variations, several months later.
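The Kemeny aggregation step mentioned above can be sketched directly: the consensus ordering is the one minimizing total pairwise disagreement (Kendall tau distance) with the individual rankings. The scenario names below are invented, and the brute-force search is only viable for the handful of scale points a training scale would use:

```python
from itertools import permutations

def kendall_tau_distance(ranking, other):
    """Count pairwise order disagreements between two rankings of the same items."""
    pos = {item: i for i, item in enumerate(other)}
    items = list(ranking)
    return sum(
        1
        for i in range(len(items))
        for j in range(i + 1, len(items))
        if pos[items[i]] > pos[items[j]]
    )

def kemeny_ranking(rankings):
    """Exhaustive Kemeny aggregation: the candidate ordering minimizing the
    summed Kendall tau distance to all submitted rankings."""
    items = rankings[0]
    return min(
        permutations(items),
        key=lambda cand: sum(kendall_tau_distance(cand, r) for r in rankings),
    )

# Three trainees rank three hypothetical mine-emergency scenarios by risk.
votes = [("fire", "gas", "roof"), ("fire", "roof", "gas"), ("gas", "fire", "roof")]
consensus = kemeny_ranking(votes)
```

Exact Kemeny aggregation is NP-hard in general, so for larger scenario sets one would switch to an integer-programming or heuristic formulation.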

10.
In recent years, the healthcare sector has adopted the use of operational risk assessment tools to help understand the systems issues that lead to patient safety incidents. But although these problem‐focused tools have improved the ability of healthcare organizations to identify hazards, they have not translated into measurable improvements in patient safety. One possible reason for this is a lack of support for the solution‐focused process of risk control. This article describes a content analysis of the risk management strategies, policies, and procedures at all acute (i.e., hospital), mental health, and ambulance trusts (health service organizations) in the East of England area of the British National Health Service. The primary goal was to determine what organizational‐level guidance exists to support risk control practice. A secondary goal was to examine the risk evaluation guidance provided by these trusts. With regard to risk control, we found an almost complete lack of useful guidance to promote good practice. With regard to risk evaluation, the trusts relied exclusively on risk matrices. A number of weaknesses were found in the use of this tool, especially related to the guidance for scoring an event's likelihood. We make a number of recommendations to address these concerns. The guidance assessed provides insufficient support for risk control and risk evaluation. This may present a significant barrier to the success of risk management approaches in improving patient safety.

11.
We developed a simulation model for quantifying the spatio‐temporal distribution of contaminants (e.g., xenobiotics) and assessing the risk of exposed populations at the landscape level. The model is a spatio‐temporal exposure‐hazard model based on (i) tools of stochastic geometry (marked polygon and point processes) for structuring the landscape and describing the exposed individuals, (ii) a dispersal kernel describing the dissemination of contaminants from polygon sources, and (iii) an (eco)toxicological equation describing the toxicokinetics and dynamics of contaminants in affected individuals. The model was implemented in the briskaR package (biological risk assessment with R) of the R software. This article presents the model background, the use of the package in an illustrative example, namely, the effect of genetically modified maize pollen on nontarget Lepidoptera, and typical comparisons of landscape configurations that can be carried out with our model (different configurations lead to different mortality rates in the treated example). In real case studies, parameters and parametric functions encountered in the model will have to be precisely specified to obtain realistic measures of risk and impact and accurate comparisons of landscape configurations. Our modeling framework could be applied to study other risks related to agriculture, for instance, pathogen spread in crops or livestock, and could be adapted to cope with other hazards such as toxic emissions from industrial areas having health effects on surrounding populations. Moreover, the R package has the potential to help risk managers in running quantitative risk assessments and testing management strategies.

12.
This paper extends the long‐term factorization of the stochastic discount factor introduced and studied by Alvarez and Jermann (2005) in discrete‐time ergodic environments and by Hansen and Scheinkman (2009) and Hansen (2012) in Markovian environments to general semimartingale environments. The transitory component discounts at the stochastic rate of return on the long bond and is factorized into discounting at the long‐term yield and a positive semimartingale that extends the principal eigenfunction of Hansen and Scheinkman (2009) to the semimartingale setting. The permanent component is a martingale that accomplishes a change of probabilities to the long forward measure, the limit of T‐forward measures. The change of probabilities from the data‐generating to the long forward measure absorbs the long‐term risk‐return trade‐off and interprets the latter as the long‐term risk‐neutral measure.

13.
The purpose of this article was to conduct a risk‐based study based on a linkage of experimental human influenza infections and fluctuation analysis of airway function to assess whether influenza viral infection was a risk factor for exacerbations of chronic occupational asthma. Here we provided a comprehensive probabilistic analysis aimed at quantifying influenza‐associated exacerbations risk for occupational asthmatics, based on a combination of published distributions of viral shedding and symptoms scores and lung respiratory system properties characterized by long‐range peak expiratory flow (PEF) dynamics. Using a coupled detrended fluctuation analysis‐experimental human influenza approach, we estimated the conditional probability of moderate or severe lung airway obstruction and hence the exacerbations risk of influenza‐associated occupational asthma in individuals. The long‐range correlation exponent (α) was used as a predictor of future exacerbations risk of influenza‐associated asthma. For our illustrative distribution of PEF fluctuations and influenza‐induced asthma exacerbations risk relations, we found that the probability of exacerbations risk can be limited to below 50% by keeping α below 0.53. This study also found that limiting wheeze scores to 0.56 yields a 75% probability of influenza‐associated asthma exacerbations risk and a limit of 0.34 yields a 50% probability that may give a representative estimate of the distribution of chronic respiratory system properties. This study indicates that influenza viral infection is an important risk factor for exacerbations of chronic occupational asthma.
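The long-range correlation exponent α that the abstract uses as a predictor comes from detrended fluctuation analysis (DFA). A generic DFA sketch (not the article's coupled DFA-influenza model) estimates α as the log-log slope of the segment-wise detrended fluctuation F(n) against window size n; white noise gives α ≈ 0.5, while a random walk gives α ≈ 1.5:

```python
import numpy as np

def dfa_exponent(x, scales=(8, 16, 32, 64, 128)):
    """First-order DFA: integrate the series, linearly detrend it in windows
    of size n, and fit the slope of log F(n) versus log n."""
    x = np.asarray(x, dtype=float)
    profile = np.cumsum(x - x.mean())       # integrated (profile) series
    flucts = []
    for n in scales:
        rms = []
        for i in range(len(profile) // n):
            seg = profile[i * n:(i + 1) * n]
            t = np.arange(n)
            coef = np.polyfit(t, seg, 1)    # linear trend within the window
            rms.append(np.sqrt(np.mean((seg - np.polyval(coef, t)) ** 2)))
        flucts.append(np.mean(rms))
    alpha, _ = np.polyfit(np.log(scales), np.log(flucts), 1)
    return alpha

rng = np.random.default_rng(1)
alpha_wn = dfa_exponent(rng.normal(size=4096))            # uncorrelated noise
alpha_rw = dfa_exponent(np.cumsum(rng.normal(size=4096))) # strongly persistent
```

Applied to a peak expiratory flow series, an estimated α below the article's threshold of about 0.53 would, in their framework, correspond to an exacerbation probability below 50%.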

14.
We examine the critical role of advance supply signals—such as suppliers’ financial health and production viability—in dynamic supply risk management. The firm operates an inventory system with multiple demand classes and multiple suppliers. The sales are discretionary and the suppliers are susceptible to both systematic and operational risks. We develop a hierarchical Markov model that captures the essential features of advance supply signals, and integrate it with procurement and selling decisions. We characterize the optimal procurement and selling policy, and the strategic relationship between signal‐based forecast, multi‐sourcing, and discretionary selling. We show that higher demand heterogeneity may reduce the value of discretionary selling, and that the mean value‐based forecast may outperform the stationary distribution‐based forecast. This work advances our understanding of when and how to use advance supply signals in dynamic risk management. Future supply risk erodes profitability but enhances the marginal value of current inventory. A signal of future supply shortage raises both base stock and demand rationing levels, thereby boosting the current production and tightening the current sales. Signal‐based dynamic forecast effectively guides the firm's procurement and selling decisions. Its value critically depends on supply volatility and scarcity. Ignoring advance supply signals can result in misleading recommendations and severe losses. Signal‐based dynamic supply forecast should be used when: (a) supply uncertainty is substantial, (b) supply‐demand ratio is moderate, (c) forecast precision is high, and (d) supplier heterogeneity is high.

15.
The development of catastrophe models in recent years allows for assessment of the flood hazard much more effectively than when the federally run National Flood Insurance Program (NFIP) was created in 1968. We propose and then demonstrate a methodological approach to determine pure premiums based on the entire distribution of possible flood events. We apply hazard, exposure, and vulnerability analyses to a sample of 300,000 single‐family residences in two counties in Texas (Travis and Galveston) using state‐of‐the‐art flood catastrophe models. Even in zones of similar flood risk classification by FEMA, there is substantial variation in exposure between coastal and inland flood risk. For instance, homes in the designated moderate‐risk X500/B zones in Galveston are exposed to a flood risk on average 2.5 times greater than residences in X500/B zones in Travis. The results also show very similar average annual loss (corrected for exposure) for a number of residences despite their being in different FEMA flood zones. We also find significant storm‐surge exposure outside of the FEMA designated storm‐surge risk zones. Taken together, these findings highlight the importance of a microanalysis of flood exposure. The process of aggregating risk at a flood zone level—as currently undertaken by FEMA—provides a false sense of uniformity. As our analysis indicates, the technology to delineate the flood risks exists today.
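The pure premium the abstract refers to is essentially the average annual loss (AAL) over a catastrophe model's full event set. A toy sketch with an invented two-event loss table (the rates and damages are illustrative assumptions, not the article's data):

```python
# Hypothetical event-loss table for one residence: annual occurrence rate
# and modeled damage for each simulated flood event.
events = [
    {"rate": 0.01, "loss": 150_000.0},   # roughly a 100-year flood
    {"rate": 0.002, "loss": 400_000.0},  # roughly a 500-year flood
]

def average_annual_loss(events):
    """Pure premium: expected loss per year, summed over all modeled events."""
    return sum(e["rate"] * e["loss"] for e in events)

def pure_premium_rate(events, exposure_value):
    """AAL per unit of exposure, which makes residences in different
    flood zones directly comparable."""
    return average_annual_loss(events) / exposure_value

aal = average_annual_loss(events)
rate = pure_premium_rate(events, exposure_value=230_000.0)
```

Rating each home from its own event-loss table, rather than from a shared flood-zone class, is what exposes the within-zone variation the article documents.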

16.
Mean‐deviation analysis, along with the existing theories of coherent risk measures and dual utility, is examined in the context of the theory of choice under uncertainty, which studies rational preference relations for random outcomes based on different sets of axioms such as transitivity, monotonicity, continuity, etc. An axiomatic foundation of the theory of coherent risk measures is obtained as a relaxation of the axioms of the dual utility theory, and a further relaxation of the axioms is shown to lead to the mean‐deviation analysis. Paradoxes arising from the sets of axioms corresponding to these theories and their possible resolutions are discussed, and application of the mean‐deviation analysis to optimal risk sharing and portfolio selection in the context of rational choice is considered.

17.
This article proposes an intertemporal risk‐value (IRV) model that integrates probability‐time tradeoff, time‐value tradeoff, and risk‐value tradeoff into one unified framework. We obtain a general probability‐time tradeoff, which yields a formal representation form to reflect the psychological distance of a decisionmaker in evaluating a temporal lottery. This intuition of probability‐time tradeoff is supported by robust empirical findings as well as by psychological theory. Through an explicit formalization of probability‐time tradeoff, an IRV model taking into account three fundamental dimensions, namely, value, probability, and time, is established. The object of evaluation in our framework is a complex lottery. We also give some insights into the structure of the IRV model using a wildcatter problem.

18.
Recent studies showed that climate change and socioeconomic trends are expected to increase flood risks in many regions. However, in these studies, human behavior is commonly assumed to be constant, which neglects interaction and feedback loops between human and environmental systems. This neglect of human adaptation leads to a misrepresentation of flood risk. This article presents an agent‐based model that incorporates human decision making in flood risk analysis. In particular, household investments in loss‐reducing measures are examined under three economic decision models: (1) expected utility theory, which is the traditional economic model of rational agents; (2) prospect theory, which takes account of bounded rationality; and (3) a prospect theory model, which accounts for changing risk perceptions and social interactions through a process of Bayesian updating. We show that neglecting human behavior in flood risk assessment studies can result in a considerable misestimation of future flood risk, which in our case study is an overestimation by a factor of two. Furthermore, we show how behavior models can support flood risk analysis under different behavioral assumptions, illustrating the need to include the dynamic adaptive human behavior of, for instance, households, insurers, and governments. The method presented here provides a solid basis for exploring human behavior and the resulting flood risk with respect to low‐probability/high‐impact risks.
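The contrast between decision models (1) and (2) can be sketched for a single household's insurance choice. The flood probability, loss, and premium below are invented numbers; the value-function and weighting parameters are the standard Tversky-Kahneman (1992) estimates, not values from this article:

```python
def prospect_value(x, alpha=0.88, beta=0.88, lam=2.25):
    """Prospect-theory value function: concave for gains, convex and
    steeper (loss aversion, lambda) for losses."""
    return x ** alpha if x >= 0 else -lam * (-x) ** beta

def weight(p, gamma=0.69):
    """Inverse-S probability weighting: small probabilities are overweighted."""
    return p ** gamma / (p ** gamma + (1 - p) ** gamma) ** (1 / gamma)

loss, p_flood, premium = -10_000.0, 0.01, 150.0

# (1) Risk-neutral expected utility: compare expected outcomes directly.
eu_no_insure = p_flood * loss       # -100
eu_insure = -premium                # -150: the EU agent declines insurance

# (2) Prospect theory: weighted value of the gamble versus the sure premium.
pt_no_insure = weight(p_flood) * prospect_value(loss)
pt_insure = prospect_value(-premium)  # the PT agent buys insurance
```

Because the prospect-theory agent overweights the 1% flood probability and is loss averse, the two decision models disagree on the same numbers, which is exactly why the aggregate flood-risk trajectory differs across the article's three behavioral assumptions.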

19.
Various foot‐and‐mouth disease (FMD) virus strains circulate in the Middle East, causing frequent episodes of FMD outbreaks among Israeli livestock. Since the virus is highly resistant in semen, artificial insemination with contaminated bull semen may lead to the infection of the receiver cow. As a non‐FMD‐free country with vaccination, Israel is currently engaged in trading bull semen only with countries of the same status. The purpose of this study was to assess the risk of release of FMD virus through export of bull semen in order to estimate the risk for FMD‐free countries considering purchasing Israeli bull semen. A stochastic risk assessment model was used to estimate this risk, defined as the annual likelihood of exporting at least one ejaculate of bull semen contaminated with viable FMD virus. A total of 45 scenarios were assessed to account for uncertainty and variability around specific parameter estimates and to evaluate the effect of various mitigation measures, such as performing a preexport test on semen ejaculates. Under the most plausible scenario, the annual likelihood of exporting bull semen contaminated with FMD virus had a median of 1.3 × 10⁻⁷ for an export of 100 ejaculates per year. This corresponds to one infected ejaculate exported every 7 million years. Under the worst‐case scenario, the median of the risk rose to 7.9 × 10⁻⁵, which is equivalent to the export of one infected ejaculate every 12,000 years. Sensitivity analysis indicated that the most influential parameter is the probability of viral excretion in infected bulls.
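The quantity being estimated, the annual likelihood of exporting at least one contaminated ejaculate, can be sketched with a simple Monte Carlo simulation. The per-ejaculate contamination probability used below (10⁻³) is deliberately much larger than the article's medians so the simulation converges quickly; it is purely illustrative, as is the `p_test_miss` mitigation knob:

```python
import random

def annual_release_probability(p_contaminated, n_ejaculates,
                               p_test_miss=1.0, n_sims=50_000, seed=7):
    """Monte Carlo estimate of P(at least one contaminated ejaculate exported
    in a year). p_test_miss=1.0 means no pre-export test; values below 1.0
    model a test that catches some contaminated ejaculates."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(n_sims):
        if any(rng.random() < p_contaminated * p_test_miss
               for _ in range(n_ejaculates)):
            hits += 1
    return hits / n_sims

p_mc = annual_release_probability(p_contaminated=1e-3, n_ejaculates=100)
p_analytic = 1 - (1 - 1e-3) ** 100   # closed form for independent ejaculates
```

For probabilities as small as 10⁻⁷, a plain Monte Carlo loop is hopeless and one would use the closed-form expression or rare-event techniques; the article's 45-scenario model additionally samples the input parameters from uncertainty distributions.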

20.
This article presents a new methodology to implement the concept of equity in regional earthquake risk mitigation programs using an optimization framework. It presents a framework that could be used by decisionmakers (government and authorities) to structure budget allocation strategy toward different seismic risk mitigation measures, i.e., structural retrofitting for different building structural types in different locations and planning horizons. A two‐stage stochastic model is developed here to seek optimal mitigation measures based on minimizing mitigation expenditures, reconstruction expenditures, and especially large losses in highly seismically active countries. To consider fairness in the distribution of financial resources among different groups of people, the equity concept is incorporated using constraints in model formulation. These constraints limit inequity to the user‐defined level to achieve the equity‐efficiency tradeoff in the decision‐making process. To present practical application of the proposed model, it is applied to a pilot area in Tehran, the capital city of Iran. Building stocks, structural vulnerability functions, and regional seismic hazard characteristics are incorporated to compile a probabilistic seismic risk model for the pilot area. Results illustrate the variation of mitigation expenditures by location and structural type for buildings. These expenditures are sensitive to the amount of available budget and equity consideration for the constant risk aversion. Most significantly, equity is more easily achieved if the budget is unlimited. Conversely, increasing equity where the budget is limited decreases the efficiency. The risk‐return tradeoff, equity‐reconstruction expenditures tradeoff, and variation of per‐capita expected earthquake loss in different income classes are also presented.


Copyright © 北京勤云科技发展有限公司 (Beijing Qinyun Technology Development Co., Ltd.)  京ICP备09084417号