Similar Literature
20 similar documents found.
1.
Distribution of the Number of Insurance Claims under Deductible and NCD Conditions   Cited by: 5 (self-citations: 4, other citations: 5)
This paper analyzes the effect of deductibles and NCD (no-claim discount) conditions on the distribution of the number of insurance claims. By comparing the difference between risk events and claim events, it introduces a class of claim-count distributions for a homogeneous portfolio of policies (the PG distribution), and gives the properties of the PG distribution along with methods for estimating its parameters. Two insurance examples demonstrate the goodness of fit to data.
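The distinction between risk events and claim events in entry 1 can be illustrated with a small simulation: under a deductible, only losses exceeding the threshold become claims, so the claim count is a thinned version of the risk-event count. A minimal sketch (not the paper's PG distribution; the Poisson rate, exponential losses, and deductible below are illustrative assumptions):

```python
import math
import random

def mean_claim_count(lam=2.0, deductible=1.0, mean_loss=1.0,
                     n_policies=200_000, seed=42):
    """Risk events per policy ~ Poisson(lam); each loss ~ Exponential(mean_loss).
    Only losses above the deductible are reported as claims, so the claim
    count is a thinned Poisson with rate lam * P(loss > deductible)."""
    rng = random.Random(seed)
    total = 0
    for _ in range(n_policies):
        # Poisson draw by inversion of the CDF
        k, u = 0, rng.random()
        p = math.exp(-lam)
        c = p
        while u > c:
            k += 1
            p *= lam / k
            c += p
        # thin: keep only losses exceeding the deductible
        total += sum(1 for _ in range(k)
                     if rng.expovariate(1.0 / mean_loss) > deductible)
    return total / n_policies

simulated = mean_claim_count()
theoretical = 2.0 * math.exp(-1.0)  # lam * P(Exp(1) loss > deductible of 1)
```

For Poisson risk events the thinned claim count is again Poisson with rate lam * P(loss > d), so the simulated mean should match the theoretical value.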

2.
In the United States, insurance against flood hazard (inland flooding or storm surge from hurricanes) has been provided mainly through the National Flood Insurance Program (NFIP) since 1968. The NFIP covers $1.23 trillion of assets today. This article provides the first analysis of flood insurance tenure ever undertaken: that is, the number of years that people keep their flood insurance policy before letting it lapse. Our analysis of the entire portfolio of the NFIP over the period 2001-2009 reveals that the median tenure of new policies during that time is between two and four years; it is also relatively stable over time and across levels of flood hazard. Prior flood experience can affect tenure: people who have experienced small flood claims tend to hold onto their insurance longer; people who have experienced large flood claims tend to let their insurance lapse sooner. To overcome the policy and governance challenges posed by homeowners' inadequate insurance coverage, we discuss policy recommendations, including strengthened requirements for banks and government-sponsored enterprises (GSEs) and the introduction of multiyear flood insurance contracts attached to the property, both of which are likely to provide more coverage stability and encourage investments in risk-reduction measures.

3.
This article compares two nonparametric tree-based models, quantile regression forests (QRF) and Bayesian additive regression trees (BART), for predicting storm outages on an electric distribution network in Connecticut, USA. We evaluated point estimates and prediction intervals of outage predictions for both models using high-resolution weather, infrastructure, and land use data for 89 storm events (including hurricanes, blizzards, and thunderstorms). We found that, spatially, BART predicted more accurate point estimates than QRF. However, QRF produced better prediction intervals at high spatial resolutions (2-km grid cells and towns), while BART predictions aggregated to coarser resolutions (divisions and service territory) more effectively. We also found that predictive accuracy depended on the season (e.g., tree-leaf condition, storm characteristics), and that the predictions were most accurate for winter storms. Given the merits of each individual model, we suggest that BART and QRF be implemented together to show the complete picture of a storm's potential impact on the electric distribution network, which would allow a utility to make better decisions about allocating prestorm resources.

4.
Projecting losses associated with hurricanes is a complex and difficult undertaking that is fraught with uncertainties. Hurricane Charley, which struck southwest Florida on August 13, 2004, illustrates the uncertainty of forecasting damages from these storms. Due to shifts in the track and the rapid intensification of the storm, real-time estimates grew from $2-3 billion in losses late on August 12 to a peak of $50 billion for a brief time as the storm appeared to be headed for the Tampa Bay area. The storm struck the resort areas of Charlotte Harbor and moved across the densely populated central part of the state, with early poststorm estimates in the $28-31 billion range, and final estimates converging at $15 billion as the actual intensity at landfall became apparent. The Florida Commission on Hurricane Loss Projection Methodology (FCHLPM) has a great appreciation for the role of computer models in projecting losses from hurricanes. The FCHLPM contracts with a professional team to perform onsite (confidential) audits of computer models developed by several different companies in the United States that seek to have their models approved for use in insurance rate filings in Florida. The team's members represent the fields of actuarial science, computer science, meteorology, statistics, and wind and structural engineering. An important part of the auditing process requires uncertainty and sensitivity analyses to be performed with the applicant's proprietary model. To influence future such analyses, an uncertainty and sensitivity analysis has been completed for loss projections arising from use of a sophisticated computer model based on the Holland wind field. Sensitivity analyses presented in this article utilize standardized regression coefficients to quantify the contribution of the computer input variables to the magnitude of the wind speed.
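The standardized regression coefficients mentioned above can be computed by regressing the standardized output on standardized inputs; when the response is near-linear, the squared coefficients approximate each input's share of the output variance. A toy sketch with synthetic stand-ins for wind-field inputs (the coefficients 3.0, 1.0, and 0.1 are arbitrary assumptions, not values from the FCHLPM audits):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 5000
# hypothetical model inputs (stand-ins for quantities such as pressure
# deficit, radius of maximum winds, and a far-field parameter)
x = rng.normal(size=(n, 3))
# toy wind-speed response with known input weights plus noise
y = 3.0 * x[:, 0] + 1.0 * x[:, 1] + 0.1 * x[:, 2] + rng.normal(scale=0.5, size=n)

# standardized regression coefficients: regress standardized y on standardized x
xs = (x - x.mean(axis=0)) / x.std(axis=0)
ys = (y - y.mean()) / y.std()
src, *_ = np.linalg.lstsq(xs, ys, rcond=None)
# sum of squared SRCs is close to 1 when the linear model explains most variance
explained = float((src ** 2).sum())
```

Ranking inputs by |SRC| reproduces the intended importance ordering of the three inputs.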

5.
Projecting losses associated with hurricanes is a complex and difficult undertaking that is fraught with uncertainties. Hurricane Charley, which struck southwest Florida on August 13, 2004, illustrates the uncertainty of forecasting damages from these storms. Due to shifts in the track and the rapid intensification of the storm, real-time estimates grew from $2-3 billion in losses late on August 12 to a peak of $50 billion for a brief time as the storm appeared to be headed for the Tampa Bay area. The storm hit the resort areas of Charlotte Harbor near Punta Gorda and then went on to Orlando in the central part of the state, with early poststorm estimates converging on a damage estimate in the $28-31 billion range. Comparable damage to central Florida had not been seen since Hurricane Donna in 1960. The Florida Commission on Hurricane Loss Projection Methodology (FCHLPM) has recognized the role of computer models in projecting losses from hurricanes. The FCHLPM established a professional team to perform onsite (confidential) audits of computer models developed by several different companies in the United States that seek to have their models approved for use in insurance rate filings in Florida. The team's members represent the fields of actuarial science, computer science, meteorology, statistics, and wind and structural engineering. An important part of the auditing process requires uncertainty and sensitivity analyses to be performed with the applicant's proprietary model. To influence future such analyses, an uncertainty and sensitivity analysis has been completed for loss projections arising from use of a Holland B parameter hurricane wind field model. Uncertainty analysis quantifies the expected percentage reduction in the uncertainty of wind speed and loss that is attributable to each of the input variables.

6.
New features of natural disasters have been observed over the last several years. The factors that influence the disasters' formation mechanisms, regularity of occurrence, and main characteristics have been revealed to be more complicated and diverse in nature than previously thought. As the uncertainty involved increases, the variables need to be examined further. This article discusses the importance of, and current shortage of, multivariate analysis of natural disasters and presents a method to estimate the joint probability of return periods and perform a risk analysis. Severe dust storms from 1990 to 2008 in Inner Mongolia, a recurring climatic phenomenon there, were used as a case study to test this new methodology. Based on the 79 investigated events and a bivariate definition of severe dust storms, the joint probability distribution of severe dust storms was established using observed data on maximum wind speed and duration. The joint return periods of severe dust storms were calculated, and the relevant risk was analyzed according to the joint probability. The copula function is able to simulate severe dust storm disasters accurately. The joint return periods generated are closer to those observed in reality than the univariate return periods and thus have more value in severe dust storm disaster mitigation, strategy making, program design, and improvement of risk management. This research may prove useful in risk-based decision making. The exploration of multivariate analysis methods can also lay the foundation for further applications in natural disaster risk analysis.
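The joint return period idea in entry 6 can be sketched with a Gumbel copula, a common choice for upper-tail-dependent hazard variables such as wind speed and duration (the abstract does not specify the copula family used, and the theta values below are illustrative assumptions):

```python
import math

def gumbel_copula(u, v, theta):
    """Gumbel copula CDF C(u, v); theta >= 1, with theta = 1 giving independence."""
    return math.exp(-(((-math.log(u)) ** theta
                       + (-math.log(v)) ** theta) ** (1.0 / theta)))

def joint_return_period_and(u, v, theta, mu=1.0):
    """Return period (mu = mean interarrival time in years) of the AND event
    {X > x, Y > y}, where u = F_X(x) and v = F_Y(y).
    By inclusion-exclusion: P(X > x, Y > y) = 1 - u - v + C(u, v)."""
    p_and = 1.0 - u - v + gumbel_copula(u, v, theta)
    return mu / p_and

# example: 10-year marginal quantiles of maximum wind speed and duration
u = v = 0.9
t_independent = joint_return_period_and(u, v, theta=1.0)  # independence
t_dependent = joint_return_period_and(u, v, theta=2.0)    # positive dependence
```

Under independence the AND-event return period is 100 years; positive dependence makes the joint extreme far more frequent, which is why univariate return periods can be misleading for compound hazards.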

7.
The main results of this paper are monotonicity statements about the risk measures value-at-risk (VaR) and tail value-at-risk (TVaR) with respect to the parameters of single and multi risk factor models, which are standard models for the quantification of credit and insurance risk. In the context of single risk factor models, non-Gaussian distributed latent risk factors are allowed. It is shown that the TVaR increases with increasing claim amounts, probabilities of claims, and correlations, whereas the VaR is in general not monotone in the correlation parameters. To compare the aggregated risks arising from single and multi risk factor models, the usual stochastic order and the increasing convex order are used in this paper, since these stochastic orders can be interpreted as being induced by the VaR concept and the TVaR concept, respectively. To derive monotonicity statements about these risk measures, properties of several further stochastic orders are used, and their relations to the usual stochastic order and to the increasing convex order are applied.
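Entry 7's two risk measures are easy to state empirically: VaR at level alpha is a quantile of the loss distribution, and TVaR is the mean loss at or beyond that quantile, so TVaR is always at least as large as VaR. A minimal sketch on simulated losses (the Exp(1) loss portfolio is an illustrative assumption):

```python
import random

def var_tvar(losses, alpha=0.95):
    """Empirical value-at-risk and tail value-at-risk at level alpha."""
    s = sorted(losses)
    k = int(alpha * len(s))   # index of the alpha-quantile in the sorted sample
    var = s[k]
    tail = s[k:]              # losses at or beyond the VaR
    return var, sum(tail) / len(tail)

rng = random.Random(1)
losses = [rng.expovariate(1.0) for _ in range(100_000)]
var95, tvar95 = var_tvar(losses)
# for Exp(1) losses: VaR_0.95 = ln 20 (about 3.0), TVaR_0.95 = ln 20 + 1 (about 4.0)
```

The exponential case has a closed form (TVaR = VaR + mean, by memorylessness), which makes it a convenient sanity check for the empirical estimator.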

8.
Weather and climate disasters pose an increasing risk to life and property in the United States. Managing this risk requires objective information about the nature of the threat and subjective information about how people perceive it. Meteorologists and climatologists have a relatively firm grasp of the historical objective risk. For example, we know which parts of the United States are most likely to experience drought, heat waves, flooding, snow or ice storms, tornadoes, and hurricanes. We know less about the geographic distribution of the perceived risks of meteorological events and trends. Do subjective perceptions align with exposure to weather risks? This question is difficult to answer because analysts have yet to develop a comprehensive and spatially consistent methodology for measuring risk perceptions across geographic areas in the United States. In this project, we propose a methodology that uses multilevel regression and poststratification to estimate extreme weather and climate risk perceptions by geographic area (i.e., region, state, forecast area, and county). Then we apply the methodology using data from three national surveys (n = 9,542). This enables us to measure, map, and compare perceptions of risk from multiple weather hazards in geographic areas across the country.
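The poststratification half of the MRP methodology reduces to weighting cell-level estimates by population shares within a geographic area. A minimal sketch (the demographic cells, perception estimates, and population counts are invented for illustration, not survey values):

```python
def poststratify(cell_estimates, cell_populations):
    """Poststratification step of MRP: weight per-cell risk-perception
    estimates by each cell's population share to get an area-level estimate."""
    total = sum(cell_populations.values())
    return sum(cell_estimates[c] * cell_populations[c] / total
               for c in cell_estimates)

# hypothetical demographic cells (age x education) within one county
estimates = {"18-34/college": 0.62, "18-34/no_college": 0.55,
             "35+/college": 0.48, "35+/no_college": 0.41}
population = {"18-34/college": 1200, "18-34/no_college": 1800,
              "35+/college": 2500, "35+/no_college": 4500}
county_estimate = poststratify(estimates, population)
```

In full MRP the cell estimates come from a multilevel regression fitted to pooled survey data, so small areas borrow strength from the national model before being reweighted.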

9.
Gim S. Seow. Decision Sciences, 1995, 26(2): 145-173
This study develops a contingent claims model for valuing the implicit market value of the pension claim associated with defined benefit pension plans. In this model, the firm issues pension, debt, and equity claims. These claims have joint access to two underlying portfolios: corporate and pension. The changes in the market values of these two portfolios are assumed to follow a joint lognormal diffusion process. By imposing terminal boundary conditions implied by Employee Retirement Income Security Act (ERISA) rules and the pension insurance provisions of the Pension Benefit Guaranty Corporation (PBGC) on the partial differential equation, a solution for the pension value is obtained. This quasi-market measure of the value of the pension claim may be represented by a portfolio consisting of four components: (1) a risk-free discount bond with face value equal to promised pension benefits; (2) a short put on pension assets with exercise price equal to pension benefits; (3) a long call on 30 percent of corporate assets with exercise price equal to the face value of secured corporate debt; and (4) a short call on 30 percent of corporate assets with a stochastic exercise price which depends on the terminal value of the pension fund. A numerical example using 1992 and 1993 financial statement data from six major U.S. corporations is provided. This example illustrates the usefulness of the model's prediction and the potential effect of theoretical pension values on corporate debt-equity ratios.

10.
Most automobile insurance databases contain a large number of policyholders with zero claims. This high frequency of zeros may reflect the fact that some insureds make little use of their vehicle, or that they do not wish to make a claim for small accidents in order to avoid an increase in their premium, but it might also be because of good driving. We analyze information on exposure to risk and driving habits using telematics data from a pay-as-you-drive sample of insureds. We include distance traveled per year as part of an offset in a zero-inflated Poisson model to predict the excess of zeros. We show the existence of a learning effect for large values of distance traveled, so that longer driving should result in higher premiums, but there should be a discount for drivers who accumulate longer distances over time due to the increased proportion of zero claims. We confirm that speed limit violations and driving in urban areas increase the expected number of accident claims. We discuss how telematics information can be used to design better insurance and to improve traffic safety.
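The zero-inflated Poisson with a distance offset described above mixes a point mass at zero (structural non-claimers) with a Poisson count whose mean scales with exposure. A hand-rolled sketch fitting this by a coarse grid search on simulated data (the true parameters, exposure range, and grid are all illustrative assumptions, not values from the study):

```python
import math
import random

def zip_loglik(pi, lam, counts, exposure):
    """Zero-inflated Poisson with multiplicative exposure offset e:
    P(Y=0) = pi + (1-pi) exp(-lam*e);  P(Y=k) = (1-pi) Poisson(lam*e)(k)."""
    ll = 0.0
    for y, e in zip(counts, exposure):
        mu = lam * e
        if y == 0:
            ll += math.log(pi + (1.0 - pi) * math.exp(-mu))
        else:
            ll += math.log(1.0 - pi) - mu + y * math.log(mu) - math.lgamma(y + 1)
    return ll

def poisson_draw(rng, mu):
    # Poisson sample by inversion of the CDF
    k, u = 0, rng.random()
    p = math.exp(-mu)
    c = p
    while u > c:
        k += 1
        p *= mu / k
        c += p
    return k

rng = random.Random(0)
true_pi, true_lam = 0.3, 0.2        # share of structural zeros; claims per unit distance
counts, exposure = [], []
for _ in range(5000):
    e = rng.uniform(5.0, 15.0)      # annual distance traveled (arbitrary units)
    y = 0 if rng.random() < true_pi else poisson_draw(rng, true_lam * e)
    counts.append(y)
    exposure.append(e)

# coarse grid search over (pi, lambda)
grid = [(p / 10, l / 10) for p in range(1, 6) for l in range(1, 6)]
best_pi, best_lam = max(grid, key=lambda t: zip_loglik(t[0], t[1], counts, exposure))
```

In practice one would maximize this likelihood with a proper optimizer (or a packaged zero-inflated model) rather than a grid, but the sketch shows how the exposure offset enters the Poisson mean.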

11.
Hurricane track and intensity can change rapidly in unexpected ways, thus making predictions of hurricanes and related hazards uncertain. This inherent uncertainty often translates into suboptimal decision-making outcomes, such as unnecessary evacuation. Representing this uncertainty is thus critical in evacuation planning and related activities. We describe a physics-based hazard modeling approach that (1) dynamically accounts for the physical interactions among hazard components and (2) captures hurricane evolution uncertainty using an ensemble method. This loosely coupled model system provides a framework for probabilistic water inundation and wind speed levels for a new, risk-based approach to evacuation modeling, described in a companion article in this issue. It combines the Weather Research and Forecasting (WRF) meteorological model, the Coupled Routing and Excess STorage (CREST) hydrologic model, and the ADvanced CIRCulation (ADCIRC) storm surge, tide, and wind-wave model to compute inundation levels and wind speeds for an ensemble of hurricane predictions. Perturbations to WRF's initial and boundary conditions and different model physics/parameterizations generate an ensemble of storm solutions, which are then used to drive the coupled hydrologic + hydrodynamic models. Hurricane Isabel (2003) is used as a case study to illustrate the ensemble-based approach. The inundation, river runoff, and wind hazard results are strongly dependent on the accuracy of the mesoscale meteorological simulations, which improves with decreasing lead time to hurricane landfall. The ensemble envelope brackets the observed behavior while providing “best-case” and “worst-case” scenarios for the subsequent risk-based evacuation model.

12.
Ruin probability is a core problem in non-life insurance risk theory. Compared with the classical Cramér-Lundberg model, the modern risk model established by Li Zehui et al. describes the main features of non-life insurance operations more accurately and has better explanatory power for actual insurance business. Based on the modern risk model, this paper considers the more realistic situation in which an insurer operates a mixture of several lines of business, and obtains an asymptotically equivalent estimate of the ruin probability under the condition that claim sizes follow regularly varying tailed distributions. We find that, when several lines with large-claim characteristics are mixed, the extreme claim risk faced by the company is determined by the lines whose claim-size distributions have the heaviest tails, while the influence of lines with relatively lighter tails is submerged. The validity of this conclusion is well verified by MATLAB numerical simulation. These results are an important extension of research on risk models and provide a basis for risk control and the determination of initial reserves for insurers operating mixed lines of business.
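The "heaviest tail dominates" conclusion of entry 12 can be checked numerically (here in Python rather than the paper's MATLAB): mix a regularly varying Pareto line with a light-tailed exponential line and compare the exceedance probability of the sum with that of the heavy line alone. The tail index, threshold, and line parameters below are illustrative assumptions:

```python
import random

def pareto(rng, alpha, xm=1.0):
    """Pareto(alpha) sample by inverse transform; tail P(X > x) = (xm/x)^alpha."""
    return xm / rng.random() ** (1.0 / alpha)

rng = random.Random(7)
n, x = 200_000, 50.0
exceed_sum = exceed_heavy = 0
for _ in range(n):
    heavy = pareto(rng, 1.5)        # heavy-tailed line (regularly varying)
    light = rng.expovariate(0.5)    # lighter-tailed line, mean 2
    if heavy + light > x:
        exceed_sum += 1
    if heavy > x:
        exceed_heavy += 1

# ratio -> 1 as the threshold grows: the heaviest tail drives extreme risk
ratio = exceed_sum / exceed_heavy
```

Because the light line's tail decays exponentially, essentially every exceedance of the aggregate is caused by a single large claim from the heavy line, matching the asymptotic equivalence described in the abstract.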

13.
Occupation-related mental stress has been associated with significant loss in terms of diminished productivity, higher absenteeism, and increased workers' compensation claims. The Liberty Mutual Group workers' compensation data were analysed for the years 1984-93 for mental stress-related claims. This represented over 7 million claims, over 17,000 of which were identified as mental stress-related. The proportion of all stress claims was estimated for each year. The proportion by gender, age and occupation (job classification code and occupation name) was also described. Stress claims increased during the late 1980s, peaking in 1991 at 0.48% of all claims and 1.69% of all claims costs, and have declined since. Even at their peak, mental stress claims were not a major portion of workers' compensation losses. However, they are expensive: the average cost of a stress claim in 1993 was about $13,000. The state of California accounts for 60% of the claims reported to Liberty Mutual over this 10-year period. In 1993 women accounted for 51% of the stress claims and about 30% of all claims. The mean age of workers with stress claims was 39.3 years, with most stress claims coming from 30-34 year-olds. High-risk occupations and industries include banks, insurance companies, general labourers, management, salespersons, and drivers. The current decline in stress claims can largely be explained by a combination of changes in unemployment, increasing litigation, and changes in law in California and other states that tightened the requirements for a mental stress claim to be considered work-related. While the data presented are helpful for comparing stress claims to all claims reported to Liberty Mutual and for identifying high-risk occupations and industries, because of the uniqueness of the stress claim selection algorithm and the uncertainties with cost estimates, the cost figures are not directly comparable to other claims reporting systems.

14.
This paper exploits dynamic features of insurance contracts in the empirical analysis of moral hazard. We first show that experience rating implies negative occurrence dependence under moral hazard: individual claim intensities decrease with the number of past claims. We then show that dynamic insurance data allow us to distinguish this moral-hazard effect from dynamic selection on unobservables. We develop nonparametric tests and estimate a flexible parametric model. We find no evidence of moral hazard in French car insurance. Our analysis contributes to a recent literature based on static data that has problems distinguishing between moral hazard and selection and dealing with dynamic features of actual insurance contracts. Methodologically, this paper builds on and extends the literature on state dependence and heterogeneity in event-history data. (JEL: D82, G22, C41, C14)
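The negative occurrence dependence described above (under experience rating, a past claim raises the price of future claims, so precaution increases and claim intensity falls) can be illustrated with a two-period toy simulation. The base intensity and "discipline" factor are invented for illustration; the paper's actual tests are nonparametric and also control for dynamic selection on unobservables, which this sketch ignores:

```python
import random

def simulate_claims(n_drivers=100_000, base=0.10, discipline=0.5, seed=3):
    """Two-period moral-hazard sketch: a driver who claimed in period 1
    faces experience-rated premiums, exerts more care, and has claim
    probability base * discipline in period 2 (negative occurrence
    dependence). Returns period-2 claim rates by period-1 history."""
    rng = random.Random(seed)
    after_claim, after_none = [], []
    for _ in range(n_drivers):
        claimed1 = rng.random() < base
        p2 = base * discipline if claimed1 else base
        claimed2 = rng.random() < p2
        (after_claim if claimed1 else after_none).append(claimed2)
    return (sum(after_claim) / len(after_claim),
            sum(after_none) / len(after_none))

r_claim, r_none = simulate_claims()
```

The empirical difficulty the paper addresses is that unobserved heterogeneity produces the opposite pattern (bad drivers claim repeatedly), so raw occurrence dependence confounds the two effects.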

15.
This study examines how organizations construct and manage risk objects as a duality of harm-benefit within their normal operations. It moves beyond the existing focus on accidents, disasters, and crises. We study the risk-transfer processes of 35 insurers as they navigate the tension between retaining risk in their insurance portfolio to increase the benefit of making profit and transferring risk to reinsurance to reduce the harm of paying claims. We show that organizations' constructions of risk are underpinned by everyday risk management practices of centralizing, calculating and diversifying. Through variation in these practices, not all organizations seek balance, and we in turn uncover the sensemaking processes of abstracting and localizing that enable organizations to prioritize harm or benefit. This contributes to the risk literature by illuminating the co-constitutive relationship between risk sensemaking processes and everyday risk management practices. Following the complex linkages involved in the construction of risk objects as sources of harm-benefit, our analysis also contributes to the literature on dualities. It shows that while immediate trade-offs between harm and benefit occur, prioritizing one element of the duality is ultimately a means for attaining the other. Thus, while initial imbalance is evident, prioritization can be an enabling approach to navigating duality.

16.
Floods are a natural hazard evolving in space and time according to meteorological and river basin dynamics, so that a single flood event can affect different regions over the event duration. This physical mechanism introduces spatio-temporal relationships between flood records and losses at different locations over a given time window that should be taken into account for an effective assessment of the collective flood risk. However, since extreme floods are rare events, the limited number of historical records usually prevents a reliable frequency analysis. To overcome this limit, we move from the analysis of extreme events to the modeling of continuous stream flow records, preserving the spatio-temporal correlation structures of the entire process and making a more efficient use of the information provided by continuous flow records. The approach is based on the dynamic copula framework, which allows for splitting the modeling of spatio-temporal properties by coupling suitable time series models accounting for temporal dynamics, and multivariate distributions describing spatial dependence. The model is applied to 490 stream flow sequences recorded across 10 of the largest river basins in central and eastern Europe (Danube, Rhine, Elbe, Oder, Weser, Meuse, Rhone, Seine, Loire, and Garonne). Using available proxy data to quantify local flood exposure and vulnerability, we show that the temporal dependence exerts a key role in reproducing interannual persistence, and thus the magnitude and frequency of annual proxy flood losses aggregated at a basin-wide scale, while copulas allow the preservation of the spatial dependence of losses at weekly and annual time scales.

17.
In August 2012, Hurricane Isaac, a Category 1 hurricane at landfall, caused extensive power outages in Louisiana. The storm brought high winds, storm surge, and flooding to Louisiana, and power outages were widespread and prolonged. Hourly power outage data for the state of Louisiana were collected during the storm and analyzed. This analysis included correlation of hourly power outage figures by zip code with storm conditions including wind, rainfall, and storm surge using a nonparametric ensemble data mining approach. Results were analyzed to understand how correlation of power outages with storm conditions differed geographically within the state. This analysis provided insight on how rainfall and storm surge, along with wind, contribute to power outages in hurricanes. By conducting a longitudinal study of outages at the zip code level, we were able to gain insight into the causal drivers of power outages during hurricanes. Our analysis showed that the statistical importance of storm characteristic covariates to power outages varies geographically. For Hurricane Isaac, wind speed, precipitation, and previous outages generally had high importance, whereas storm surge had lower importance, even in zip codes that experienced significant surge. The results of this analysis can inform the development of power outage forecasting models, which often focus strictly on wind-related covariates. Our study of Hurricane Isaac indicates that inclusion of other covariates, particularly precipitation, may improve model accuracy and robustness across a range of storm conditions and geography.

18.
This case study describes a behavioral management program to modify the performance of employees of independent appraisal firms involved in the processing of insurance claims. Delays in appraising and reporting automobile damages had produced a corresponding delay in the payment of claims. The latter slowed the productivity of the claim department and often led to customer dissatisfaction, complaints, and time-consuming telephone conversations to explain the delay. Although appraisers were required by contractual agreement to report financial estimates of damages within twenty-four hours after receiving a request, informal observations indicated that appraisal forms typically failed to meet this criterion.

19.
Coastal cities around the world have experienced large costs from major flooding events in recent years. Climate change is predicted to bring an increased likelihood of flooding due to sea level rise and more frequent severe storms. In order to plan future development and adaptation, cities must know the magnitude of losses associated with these events, and how they can be reduced. Often losses are calculated from insurance claims or surveying flood victims. However, this largely neglects the loss due to the disruption of economic activity. We use a forward-looking dynamic computable general equilibrium model to study how a local economy responds to a flood, focusing on the subsequent recovery/reconstruction. Initial damage is modeled as a shock to the capital stock and recovery requires rebuilding that stock. We apply the model to Vancouver, British Columbia by considering a flood scenario causing total capital damage of $14.6 billion spread across five municipalities. GDP loss relative to a no-flood scenario is relatively long-lasting. It is 2.0% ($2.2 billion) in the first year after the flood, 1.7% ($1.9 billion) in the second year, and 1.2% ($1.4 billion) in the fifth year.

20.
Decision makers need information about the anticipated future costs of maintaining polio eradication as a function of the policy options under consideration. Given the large portfolio of options, we reviewed and synthesized the existing cost data relevant to current policies to provide context for future policies. We model the expected future costs of different strategies for continued vaccination, surveillance, and other activities that require significant potential resource commitments. We estimate the costs of different potential policy portfolios for low-, middle-, and high-income countries to demonstrate the variability in these costs. We estimate that a global transition from routine immunization with oral poliovirus vaccine (OPV) to inactivated poliovirus vaccine (IPV) would increase the costs of managing polio globally, although routine IPV use remains less costly than routine OPV use with supplemental immunization activities. The costs of surveillance and a stockpile, while small compared to routine vaccination costs, represent important expenditures to ensure adequate response to potential outbreaks. The uncertainty and sensitivity analyses highlight important uncertainty in the aggregated costs and demonstrate that the discount rate and the uncertainty in the price and administration cost of IPV drive the expected incremental cost of routine IPV vs. OPV immunization.
