Similar Documents
20 similar documents found.
1.
To mitigate more effectively the adverse selection problem that undermines trading efficiency in insurance markets, this paper builds a two-period insurance contract model with rewards and penalties, covering both the case of two policyholder risk types and the case of multiple types, and proposes for the first time that rewards and penalties can be used to effectively screen policyholders' risk types. The model rewards or penalizes the policyholder in the second insurance period according to the claims made in the first. If a high-risk policyholder chooses the contract designed for low-risk policyholders, the probability of being penalized in the second period far exceeds the probability of being rewarded; that is, the higher a policyholder's risk, the more the penalty is feared, so the model satisfies the Spence-Mirrlees separation condition. In the two-period contract model with rewards and penalties, the insurer's expected profit remains zero, so policyholders bear no additional economic burden, yet the model achieves a strict Pareto improvement over simply repeating the traditional partial-insurance contract twice. A numerical example illustrates the effectiveness of the model.
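A minimal numeric sketch of the separation logic described above, with purely hypothetical claim probabilities and reward/penalty amounts (the paper's own parameterization is not reproduced here): the high-risk type's expected second-period transfer under the low-risk contract is far worse, so mimicry is deterred.

```python
# Toy check of the separation idea in a two-period contract with a reward R
# for a claim-free first period and a penalty F after a claim.
# All parameters are hypothetical illustrations, not from the paper.

def expected_second_period_transfer(p_claim, reward, penalty):
    """Expected transfer to the insured in period 2: reward with probability
    (1 - p_claim) of a claim-free first period, penalty otherwise."""
    return (1 - p_claim) * reward - p_claim * penalty

p_low, p_high = 0.1, 0.4   # claim probabilities of the two risk types
R, F = 50.0, 200.0         # reward and penalty attached to the low-risk contract

for label, p in [("low-risk type ", p_low), ("high-risk type", p_high)]:
    t = expected_second_period_transfer(p, R, F)
    print(f"{label}: expected period-2 transfer under low-risk contract = {t:+.1f}")

# Output: +25.0 for the low-risk type, -50.0 for the high-risk type. Mimicking
# the low-risk contract is unattractive to the high-risk type -- the
# single-crossing (separation) condition the model relies on.
```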

2.
Risk factor selection is very important in the insurance industry: it supports precise rate making and the study of the features of high‐quality insureds. Zero‐inflated data, such as claim frequency data, are common in insurance, and zero‐inflation makes the selection of risk factors quite difficult. In this article, we propose a new risk factor selection approach, EM adaptive LASSO, for a zero‐inflated Poisson regression model, which combines the EM algorithm and the adaptive LASSO penalty. Under some regularity conditions, we show that, with probability approaching 1, important factors are selected and redundant factors are excluded. We investigate the finite sample performance of the proposed method through a simulation study and an analysis of car insurance data from the SAS Enterprise Miner database.
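The following is a minimal EM skeleton for a zero-inflated Poisson regression on simulated data, meant only to illustrate the algorithmic structure; the adaptive LASSO penalty of the proposed method would be added to each M-step, which is left unpenalized here for brevity.

```python
# Minimal EM skeleton for a zero-inflated Poisson (ZIP) regression.
# Simulated data; not the paper's penalized estimator.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n, p = 2000, 3
X = sm.add_constant(rng.normal(size=(n, p)))
beta_true = np.array([0.5, 0.8, 0.0, -0.6])    # Poisson part (one redundant factor)
gamma_true = np.array([-1.0, 1.2, 0.0, 0.0])   # zero-inflation part

pi = 1 / (1 + np.exp(-X @ gamma_true))         # P(structural zero)
mu = np.exp(X @ beta_true)
y = np.where(rng.uniform(size=n) < pi, 0, rng.poisson(mu))

beta = np.zeros(p + 1)
gamma = np.zeros(p + 1)
for _ in range(50):
    # E-step: posterior probability that an observed zero is structural
    pi_hat = 1 / (1 + np.exp(-X @ gamma))
    mu_hat = np.exp(X @ beta)
    w = np.where(y == 0, pi_hat / (pi_hat + (1 - pi_hat) * np.exp(-mu_hat)), 0.0)
    # M-step: weighted Poisson GLM for the count part (an adaptive-LASSO
    # penalty would be added here in the paper's method) ...
    pois = sm.GLM(y, X, family=sm.families.Poisson(), freq_weights=1 - w).fit()
    # ... and a binomial GLM for the zero-inflation part (likewise penalized)
    logit = sm.GLM(w, X, family=sm.families.Binomial()).fit()
    beta, gamma = pois.params, logit.params

print("Poisson part:", np.round(beta, 2))
print("Zero part:   ", np.round(gamma, 2))
```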

3.
The transition to semiautonomous driving is set to considerably reduce road accident rates as human error is progressively removed from the driving task. Concurrently, autonomous capabilities will transform the transportation risk landscape and significantly disrupt the insurance industry. Semiautonomous vehicle (SAV) risks will begin to alternate between human error and technological susceptibilities. The evolving risk landscape will force a departure from traditional risk assessment approaches that rely on historical data to quantify insurable risks. This article investigates the risk structure of SAVs and employs a telematics‐based anomaly detection model to assess split risk profiles. An unsupervised multivariate Gaussian (MVG) based anomaly detection method is used to identify abnormal driving patterns based on accelerometer and GPS sensors of manually driven vehicles. Parameters are inferred for vehicles equipped with semiautonomous capabilities and the resulting split risk profile is determined. The MVG approach allows for the quantification of vehicle risks by the relative frequency and severity of observed anomalies and a location‐based risk analysis is performed for a more comprehensive assessment. This approach contributes to the challenge of quantifying SAV risks and the methods employed here can be applied to evolving data sources pertinent to SAVs. Utilizing the vast amounts of sensor‐generated data will enable insurers to proactively reassess the collective performances of both the artificial driving agent and human driver.
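As a rough illustration of the unsupervised MVG approach, the sketch below fits a multivariate Gaussian to simulated sensor features and flags low-density observations as anomalies; the feature set and the 1% threshold are assumptions for illustration, not the article's specification.

```python
# Unsupervised multivariate-Gaussian anomaly detection on trip features.
import numpy as np
from scipy.stats import multivariate_normal

rng = np.random.default_rng(1)
# columns: longitudinal accel., lateral accel., GPS speed (simulated)
X_train = rng.normal(loc=[0.0, 0.0, 50.0], scale=[0.3, 0.2, 15.0], size=(5000, 3))

mu = X_train.mean(axis=0)
cov = np.cov(X_train, rowvar=False)
mvg = multivariate_normal(mean=mu, cov=cov)

X_new = np.array([[0.1, -0.1, 55.0],    # ordinary driving
                  [2.5, 1.8, 110.0]])   # harsh manoeuvre at high speed
eps = np.quantile(mvg.pdf(X_train), 0.01)  # flag the rarest 1% as anomalous
print(mvg.pdf(X_new) < eps)                # -> [False  True]
```

The relative frequency and severity of such flags per vehicle then feed the risk quantification the abstract describes.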

4.
In the United States, insurance against flood hazard (inland flooding or storm surge from hurricanes) has been provided mainly through the National Flood Insurance Program (NFIP) since 1968. The NFIP covers $1.23 trillion of assets today. This article provides the first analysis ever undertaken of flood insurance tenure: that is, the number of years that people keep their flood insurance policy before letting it lapse. Our analysis of the entire portfolio of the NFIP over the period 2001-2009 reveals that the median tenure of new policies during that time is between two and four years; it is also relatively stable over time and across levels of flood hazard. Prior flood experience can affect tenure: people who have experienced small flood claims tend to hold onto their insurance longer; people who have experienced large flood claims tend to let their insurance lapse sooner. To overcome the policy and governance challenges posed by homeowners' inadequate insurance coverage, we discuss policy recommendations, including strengthened requirements by banks and government-sponsored enterprises (GSEs) and the introduction of multiyear flood insurance contracts attached to the property, both of which are likely to provide more coverage stability and encourage investments in risk-reduction measures.
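Tenure as defined here is a survival time, so one natural way to estimate median tenure, treating policies still in force as censored, is a Kaplan-Meier fit; the sketch below uses the lifelines package on synthetic data and is not the article's estimator.

```python
# Median policy tenure via a Kaplan-Meier curve on synthetic data;
# policies still in force at the end of the study window are censored.
import numpy as np
from lifelines import KaplanMeierFitter

rng = np.random.default_rng(2)
tenure_years = rng.exponential(scale=4.0, size=1000).round() + 1
lapsed = rng.uniform(size=1000) < 0.8          # ~20% still in force (censored)

kmf = KaplanMeierFitter()
kmf.fit(durations=tenure_years, event_observed=lapsed)
print("median tenure (years):", kmf.median_survival_time_)
```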

5.
Pesticide risk assessment for food products involves combining information from consumption and concentration data sets to estimate a distribution for the pesticide intake in a human population. Using this distribution one can obtain probabilities of individuals exceeding specified levels of pesticide intake. In this article, we present a probabilistic, Bayesian approach to modeling the daily intake of the pesticide Iprodione through multiple food products. Modeling data on food consumption and pesticide concentration poses a variety of problems, such as the large proportions of consumptions and concentrations that are recorded as zero, and correlation between the consumptions of different foods. We consider daily food consumption data from the Netherlands National Food Consumption Survey and concentration data collected by the Netherlands Ministry of Agriculture. We develop a multivariate latent‐Gaussian model for the consumption data that allows for correlated intakes between products. For the concentration data, we propose a univariate latent‐t model. We then combine predicted consumptions and concentrations from these models to obtain a distribution for individual daily Iprodione exposure. The latent‐variable models allow for both skewness and large numbers of zeros in the consumption and concentration data. The use of a probabilistic approach is intended to yield more robust estimates of high percentiles of the exposure distribution than an empirical approach. Bayesian inference is used to facilitate the treatment of data with a complex structure.
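A toy Monte Carlo version of the combining step: latent draws below a threshold map to zero consumption or concentration (mimicking the zero-inflation), correlated consumptions come from a latent Gaussian, concentrations from a latent t, and their products are summed into a daily exposure. All parameters are illustrative, not fitted to the Dutch data.

```python
# Toy combination of latent-Gaussian consumptions and latent-t concentrations
# into a daily-exposure distribution.
import numpy as np

rng = np.random.default_rng(3)
n_days, n_foods = 100_000, 3

# correlated latent consumptions (kg/day); sub-threshold draws -> non-consumption
L = np.linalg.cholesky(np.array([[1.0, 0.4, 0.2],
                                 [0.4, 1.0, 0.3],
                                 [0.2, 0.3, 1.0]]))
z = rng.normal(size=(n_days, n_foods)) @ L.T
consumption = np.maximum(z - 0.8, 0.0) * 0.15

# latent-t concentrations (mg/kg); sub-threshold draws -> residue below detection
c = rng.standard_t(df=4, size=(n_days, n_foods))
concentration = np.maximum(c - 1.5, 0.0) * 0.05

exposure = (consumption * concentration).sum(axis=1)
print("P(exposure > 0):", (exposure > 0).mean())
print("99.9th percentile exposure:", np.quantile(exposure, 0.999))
```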

6.
The relationship between the risk of motor vehicle injuries and the amount of travel has been only partially analyzed; the few individual exposure assessments relate to very specific subsets of the driving and traveling populations. This study analyzes the relationship between kilometers traveled and hospitalization due to motor vehicle injuries. A total of 12,369 Spanish university graduates from the Seguimiento Universidad de Navarra multipurpose cohort study were evaluated; none had been hospitalized due to motor vehicle injuries at baseline, and they were followed for up to eight years. Biannual questionnaires allowed for self‐reporting of kilometers traveled in motor vehicles, together with incidence of hospitalization. Covariates in the Cox regression models included age, gender, and baseline use of a safety belt while driving, driving a vehicle with a driver‐side airbag, driving a motorcycle, and drinking and driving. There were 49,766 participant‐years with an average yearly travel of 7,828 km per person‐year. Thirty‐six subjects reported a first hospitalization event during this time. The adjusted hazard ratio per additional kilometer traveled was 1.00005 (95% confidence interval 1.000013 to 1.000086). Even a modest reduction in the number of kilometers traveled (from an average of 3,250 km per year to 1,000) has a statistically significant protective effect on the likelihood of hospitalization due to motor vehicle injury (aHR 0.9, 95% CI 0.78 to 0.98). In light of current policies aimed at reducing motorized traffic due to environmental concerns, it may be appropriate to consider the additional health benefit of reduced injuries.
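A sketch of this design with the lifelines package on simulated data, plus a check of the reported hazard-ratio arithmetic; the column names, effect size, and baseline hazard are illustrative assumptions, not the study's data.

```python
# Cox regression of hospitalization on kilometers traveled (simulated).
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(4)
n = 5000
df = pd.DataFrame({
    "km_per_year": rng.gamma(shape=2.0, scale=4000.0, size=n),  # annual exposure
    "age": rng.integers(25, 70, size=n),
    "follow_up_years": rng.uniform(1.0, 8.0, size=n),
})
# simulate a small positive effect of exposure on the hospitalization hazard
hazard = 1e-3 * np.exp(0.00005 * df["km_per_year"])
df["hospitalized"] = (rng.uniform(size=n) < hazard * df["follow_up_years"]).astype(int)

cph = CoxPHFitter()
cph.fit(df, duration_col="follow_up_years", event_col="hospitalized")
print(cph.hazard_ratios_)

# Check of the reported arithmetic: with HR = 1.00005 per km, cutting annual
# travel from 3,250 km to 1,000 km scales the hazard by
print(1.00005 ** (1000 - 3250))   # ~0.894, i.e. the aHR of ~0.9 in the abstract
```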

7.
Probabilistic seismic risk analysis is a well‐established method in the insurance industry for modeling portfolio losses from earthquake events. In this context, precise exposure locations are often unknown. So far, however, location uncertainty has received little research attention. In this article, we propose a novel framework for the treatment of location uncertainty. As a case study, a large number of synthetic portfolios resembling typical real‐world cases were created. We investigate the effect of portfolio characteristics such as value distribution, portfolio size, or the proportion of risk items with unknown coordinates on the variability of loss frequency estimations. The results indicate that, due to loss aggregation effects and spatial hazard variability, location uncertainty in isolation and in conjunction with ground motion uncertainty can induce significant variability in probabilistic loss results, especially for portfolios with a small number of risks. After quantifying its effect, we conclude that location uncertainty should not be neglected when assessing probabilistic seismic risk, but should be treated stochastically, and the resulting variability should be visualized and interpreted carefully.
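A toy Monte Carlo showing the mechanism: when coordinates are unknown, each resampling of locations changes distances to the source and hence the aggregated loss, and the variability is largest for small portfolios. The attenuation and vulnerability functions below are crude placeholders, not a calibrated seismic model.

```python
# Location uncertainty as stochastic resampling of unknown risk coordinates.
import numpy as np

rng = np.random.default_rng(5)
n_risks, n_realizations = 20, 1000            # small portfolio, many draws
values = rng.uniform(1.0, 5.0, size=n_risks)  # insured values (M$)
epicenter = np.array([50.0, 50.0])

losses = []
for _ in range(n_realizations):
    # unknown coordinates: resample each risk uniformly in a 100x100 km region
    xy = rng.uniform(0, 100, size=(n_risks, 2))
    dist = np.linalg.norm(xy - epicenter, axis=1)
    shaking = 1.0 / (1.0 + 0.05 * dist)       # placeholder attenuation
    damage_ratio = np.clip(shaking - 0.3, 0, 1)
    losses.append((values * damage_ratio).sum())

losses = np.array(losses)
print(f"portfolio loss: mean={losses.mean():.2f}, cv={losses.std()/losses.mean():.2f}")
```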

8.
The impact of insurer competition on welfare, negotiated provider prices, and premiums in the U.S. private health care industry is theoretically ambiguous. Reduced competition may increase the premiums charged by insurers and their payments made to hospitals. However, it may also strengthen insurers' bargaining leverage when negotiating with hospitals, thereby generating offsetting cost decreases. To understand and measure this trade‐off, we estimate a model of employer‐insurer and hospital‐insurer bargaining over premiums and reimbursements, household demand for insurance, and individual demand for hospitals using detailed California admissions, claims, and enrollment data. We simulate the removal of both large and small insurers from consumers' choice sets. Although consumer welfare decreases and premiums typically increase, we find that premiums can fall upon the removal of a small insurer if an employer imposes effective premium constraints through negotiations with the remaining insurers. We also document substantial heterogeneity in hospital price adjustments upon the removal of an insurer, with renegotiated price increases and decreases of as much as 10% across markets.

9.
Both aristocratic privileges and constitutional constraints in traditional monarchies can be derived from a ruler's incentive to minimize expected costs of moral‐hazard rents for high officials. We consider a dynamic moral‐hazard model of governors serving a sovereign prince, who must deter them from rebellion and hidden corruption which could cause costly crises. To minimize costs, a governor's rewards for good performance should be deferred up to the maximal credit that the prince can be trusted to pay. In the long run, we find that high officials can become an entrenched aristocracy with low turnover and large claims on the ruler. Dismissals for bad performance should be randomized to avoid inciting rebellions, but the prince can profit from reselling vacant offices, and so his decisions to dismiss high officials require institutionalized monitoring. A soft budget constraint that forgives losses for low‐credit governors can become efficient when costs of corruption are low.

10.
According to E.U. regulations, the maximum allowable rate of adventitious transgene presence in non‐genetically modified (GM) crops is 0.9%. We compared four sampling methods for the detection of transgenic material in agricultural non‐GM maize fields: random sampling, stratified sampling, random sampling + ratio reweighting, random sampling + regression reweighting. Random sampling involves simply sampling maize grains from different locations selected at random from the field concerned. The stratified and reweighting sampling methods make use of an auxiliary variable corresponding to the output of a gene‐flow model (a zero‐inflated Poisson model) simulating cross‐pollination as a function of wind speed, wind direction, and distance to the closest GM maize field. With the stratified sampling method, an auxiliary variable is used to define several strata with contrasting transgene presence rates, and grains are then sampled at random from each stratum. With the two methods involving reweighting, grains are first sampled at random from various locations within the field, and the observations are then reweighted according to the auxiliary variable. Data collected from three maize fields were used to compare the four sampling methods, and the results were used to determine the extent to which transgene presence rate estimation was improved by the use of stratified and reweighting sampling methods. We found that transgene rate estimates were more accurate and that substantially smaller samples could be used with sampling strategies based on an auxiliary variable derived from a gene‐flow model.
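A minimal sketch of the ratio-reweighting idea on simulated grains, where the auxiliary variable stands in for the gene-flow model's predicted cross-pollination rate; the simulation design and rates are assumptions for illustration.

```python
# Ratio reweighting of a random sample using an auxiliary (model) variable.
import numpy as np

rng = np.random.default_rng(6)
N, n = 100_000, 500                      # grains in the field, sample size
aux = rng.gamma(shape=0.5, scale=0.04, size=N)     # model-predicted rate
true_rate = rng.poisson(lam=aux).clip(0, 1)        # zero-inflated presence

idx = rng.choice(N, size=n, replace=False)         # simple random sample
naive = true_rate[idx].mean()
# ratio estimator: sample mean rescaled by the known field-wide auxiliary mean
ratio = true_rate[idx].mean() / aux[idx].mean() * aux.mean()

print(f"true rate      : {true_rate.mean():.4%}")
print(f"random sample  : {naive:.4%}")
print(f"ratio-weighted : {ratio:.4%}")
```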

11.
For insurance companies, wind storms represent a main source of volatility, leading to potentially huge aggregated claim amounts. In this article, we compare different constructions of a storm index that allow us to assess the economic impact of storms on an insurance portfolio by exploiting information from historical wind speed data. In contrast to historical insurance portfolio data, meteorological variables show fewer nonstationarities between years and are easily available with long observation records; hence, they represent a valuable source of additional information for insurers if the relation between observations of claims and wind speeds can be revealed. Since standard correlation measures between raw wind speeds and insurance claims are weak, a storm index focusing on high wind speeds can provide better information. A storm index approach has been applied to yearly aggregated claim amounts in Germany with promising results. Using historical meteorological and insurance data, we assess the consistency of the proposed index constructions with respect to various parameters and weights. Moreover, we are able to place the major insurance events since 1998 on a broader horizon extending back more than 40 years. Our approach provides a meteorological justification for calculating the return periods of extreme‐storm‐related insurance events whose magnitude has rarely been reached.
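One widely used construction of this kind (a Klawa-Ulbrich-type cubic exceedance over the local 98th percentile of wind speed) is sketched below on simulated daily wind maxima; it is not necessarily the article's exact index, and the threshold, exponent, and weights are precisely the tunable choices such comparisons vary.

```python
# Storm index as cubic exceedance over the local high-wind threshold.
import numpy as np

rng = np.random.default_rng(7)
years, days = 40, 365
v = rng.weibull(2.0, size=(years, days)) * 12.0   # daily max wind speed (m/s)

v98 = np.quantile(v, 0.98)                        # local high-wind threshold
exceed = np.clip(v / v98 - 1.0, 0.0, None) ** 3   # cubic exceedance per day
index_by_year = exceed.sum(axis=1)                # yearly storm index

print("five largest yearly indices:", np.round(np.sort(index_by_year)[-5:], 4))
# the empirical return period of the largest value in a 40-year record is ~40 years
```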

12.
This paper investigates the impacts of competition and market uncertainty on airlines' network structures and capacity investment. The airlines choose their network structures and construct capacities while demands are unknown. After uncertainty is resolved, they determine the total number of seats to offer in each leg constrained by their capacities built earlier. We conclude that market uncertainty is the driving force of hub‐and‐spoke networks, whereas the market mean is the driving force of point‐to‐point networks. Which of the two countervailing forces dominates determines the equilibrium network structures. Moreover, we find that the airlines' total expected profits in the mixed equilibrium in which the airlines employ different networks are larger than in the pure hub‐and‐spoke network equilibrium in which each airline employs the hub‐and‐spoke network. However, the mixed equilibrium does not necessarily yield larger profits than the pure point‐to‐point equilibrium in which each airline employs the point‐to‐point network.

13.
Decision Sciences, 2017, 48(6): 1198–1227
We study two firms that compete on price and lead‐time decisions in a common market. We explore the impact of decentralizing these decisions, as made by the marketing and production departments, respectively, with either marketing or production as the leader. We compare scenarios in which none, one, or both of the firms are decentralized to see whether decentralization can be the equilibrium strategy. We find that under intense price competition, with intensity characterized by the underlying parameters of market demand, firms may suffer from a decentralized structure, particularly under high flexibility induced by high capacity, where revenue‐based sales incentives motivate sales/marketing to make aggressive price cuts that often erode profit margins. In contrast, under intense lead‐time competition, a decentralized strategy with marketing as the leader can not only result in significantly higher profits, but also be the equilibrium strategy. Moreover, decentralization may no longer lead to lower prices or longer lead‐times if the production department chooses capacity along with lead‐time.

14.
We define the class of two‐player zero‐sum games with payoffs having mild discontinuities, which in applications typically stem from how ties are resolved. For such games, we establish sufficient conditions for existence of a value of the game, maximin and minimax strategies for the players, and a Nash equilibrium. If all discontinuities favor one player, then a value exists and that player has a maximin strategy. A property called payoff approachability implies existence of an equilibrium, and that the resulting value is invariant: games with the same payoffs at points of continuity have the same value and ɛ‐equilibria. For voting games in which two candidates propose policies and a candidate wins election if a weighted majority of voters prefer his proposed policy, we provide tie‐breaking rules and assumptions about voters' preferences sufficient to imply payoff approachability. These assumptions are satisfied by generic preferences if the dimension of the space of policies exceeds the number of voters; or with no dimensional restriction, if the electorate is sufficiently large. Each Colonel Blotto game is a special case in which each candidate allocates a resource among several constituencies and a candidate gets votes from those allocated more than his opponent offers; in this case, for simple‐majority rule we prove existence of an equilibrium with zero probability of ties.

15.
We consider the situation where there is a large number of series, N, each with T observations, and each series has some predictive ability for some variable of interest. A methodology of growing interest is first to estimate common factors from the panel of data by the method of principal components and then to augment an otherwise standard regression with the estimated factors. In this paper, we show that the least squares estimates obtained from these factor‐augmented regressions are consistent and asymptotically normal if √T/N → 0. The conditional mean predicted by the estimated factors is consistent and asymptotically normal. Except when T/N goes to zero, inference should take into account the effect of “estimated regressors” on the estimated conditional mean. We present analytical formulas for prediction intervals that are valid regardless of the magnitude of N/T and that can also be used when the factors are nonstationary.
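A minimal factor-augmented regression on simulated data: principal-component factors are extracted from the N-series panel and then used as regressors. Estimated factors are identified only up to sign/rotation, so fitted coefficients match the true ones only up to such a transformation.

```python
# Factor-augmented regression: PCA factors from a panel, then OLS.
import numpy as np
import statsmodels.api as sm
from sklearn.decomposition import PCA

rng = np.random.default_rng(8)
T, N, r = 200, 100, 2
F = rng.normal(size=(T, r))                   # latent common factors
Lam = rng.normal(size=(N, r))                 # factor loadings
panel = F @ Lam.T + rng.normal(scale=0.5, size=(T, N))

F_hat = PCA(n_components=r).fit_transform(panel)   # estimated factors
y = 1.0 + F @ np.array([0.8, -0.5]) + rng.normal(scale=0.3, size=T)

res = sm.OLS(y, sm.add_constant(F_hat)).fit()
# coefficients on estimated factors (identified up to sign/rotation only)
print(res.params)
```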

16.
This study analyses the impact of two financially equivalent frames of a co-payment policy on the choice between a co-payment policy and one with full cost recovery. It also examines the incentive effect of the co-payment and the post-choice evaluation of the pay-offs in case of unexpected losses, using the example of health insurance. Two experimental studies form the context for an empirical investigation of the theoretical considerations. We examine the framing-related effects of a rebate frame compared to a premium reduction frame of a co-payment policy. The results confirm that a rebate frame has a positive effect on the intention to choose a co-payment policy. The intention to avoid claims is greater in the premium reduction frame than in the rebate frame. In case of unexpected losses due to high insurance claims, the rebate frame results in less dissatisfaction and causes fewer regret effects. The results support the theoretical considerations that insurance companies should account for these differences and design their co-payment policies according to their priorities, with either a rebate or a premium reduction frame.

17.
Coastal cities around the world have experienced large costs from major flooding events in recent years. Climate change is predicted to bring an increased likelihood of flooding due to sea level rise and more frequent severe storms. In order to plan future development and adaptation, cities must know the magnitude of losses associated with these events, and how they can be reduced. Often losses are calculated from insurance claims or surveying flood victims. However, this largely neglects the loss due to the disruption of economic activity. We use a forward‐looking dynamic computable general equilibrium model to study how a local economy responds to a flood, focusing on the subsequent recovery/reconstruction. Initial damage is modeled as a shock to the capital stock and recovery requires rebuilding that stock. We apply the model to Vancouver, British Columbia by considering a flood scenario causing total capital damage of $14.6 billion spread across five municipalities. GDP loss relative to a no‐flood scenario is relatively long‐lasting. It is 2.0% ($2.2 billion) in the first year after the flood, 1.7% ($1.9 billion) in the second year, and 1.2% ($1.4 billion) in the fifth year.
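As a toy stand-in for the dynamic model (not the paper's CGE, and it does not reproduce the paper's numbers), the sketch below shocks a capital stock, lets reconstruction investment close the gap gradually, and reports the implied GDP loss path under a Cobb-Douglas output assumption. All parameters are illustrative.

```python
# Toy capital-shock recovery path: flood destroys capital, rebuilding closes
# the gap, and GDP loss is implied by Y = A * K**alpha.
K_star, alpha, speed = 1000.0, 0.33, 0.35   # pre-flood capital, output elasticity,
                                            # reconstruction speed (all assumed)
K = K_star * (1 - 0.06)                     # flood destroys 6% of capital

for year in range(1, 6):
    K += speed * (K_star - K)               # reconstruction investment
    gdp_loss = 1 - (K / K_star) ** alpha
    print(f"year {year}: GDP loss vs no-flood = {gdp_loss:.2%}")
```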

18.
This study examines how organizations construct and manage risk objects as a duality of harm–benefit within their normal operations. It moves beyond the existing focus on accidents, disasters and crises. We study the risk‐transfer processes of 35 insurers where they navigate the tension of retaining risk in their insurance portfolio to increase the benefit of making profit and transferring risk to reinsurance to reduce the harm of paying claims. We show that organizations' constructions of risk are underpinned by everyday risk management practices of centralizing, calculating and diversifying. Through variation in these practices, not all organizations seek balance, and we in turn uncover the sensemaking processes of abstracting and localizing that enable organizations to prioritize harm or benefit. This contributes to the risk literature by illuminating the co‐constitutive relationship between risk sensemaking processes and everyday risk management practices. Following the complex linkages involved in the construction of risk objects as sources of harm–benefit, our analysis also contributes to the literature on dualities. It shows that while immediate trade‐offs between harm–benefit occur, prioritizing one element of the duality is ultimately a means for attaining the other. Thus, while initial imbalance is evident, prioritization can be an enabling approach to navigating duality.

19.
Robeson offers a number of options to employers to help reduce the impact of increasing health care costs. He points out that large organizations which employ hundreds of people have considerable market power which can be exerted to contain costs. It is suggested that the risk management departments assume responsibility for managing the effort to reduce the costs of medical care and of the health insurance programs of these organizations, since that staff is experienced at evaluating premiums and negotiating with third-party payors. The article examines a number of short-run strategies for firms to pursue to contain health care costs: (1) use of alternative delivery systems such as health maintenance organizations (HMOs), which have cost-cutting potential but require marketing efforts to persuade employees of their desirability; (2) contracts with third-party payors which require a second opinion (peer review), a practice which saved one labor union over $2 million from 1972 to 1976; (3) implementation of insurance coverage for less expensive outpatient care; and (4) the use of claims review. These strategies are compared in terms of four criteria: supply of and demand for health services; management effort; cost; and time necessary for realized savings. Robeson concludes that development of a management plan for containing health care costs requires an extensive analysis of alternatives, organizational objectives, existing policies, and resources, and offers a table summarizing the cost-containment strategies a firm should consider.

20.
Loan credit risk assessment is a core task in bank risk control. The number of days a loan is overdue, a common risk measure, exhibits typical zero-inflation. Traditional linear regression is no longer suitable for zero-inflated data, and two-part models are a standard alternative. Given the skewed distribution of loan data, this paper builds a quantile two-part model, the logit-quantile model, consisting of a logistic regression and a quantile regression; to select risk factors, a Lasso penalty is added to both regressions. The model is solved with an iterative algorithm combining coordinate descent and linear programming. Simulations show that, compared with stepwise selection and the common logit-linear two-part model, the new model achieves the best variable selection performance, and it remains best even with an 80% zero-inflation rate and in high-dimensional settings. Finally, an empirical analysis of loan data from a bank shows that the new model has a more parsimonious structure, and cross-validated prediction shows that it delivers the best predictive and classification performance.
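A rough sketch of the two parts using off-the-shelf L1-penalized estimators from scikit-learn (the paper's own coordinate-descent/linear-programming solver is not reproduced): a Lasso-penalized logistic regression for the zero part and a Lasso-penalized median regression for the positive part, on simulated loan data.

```python
# Logit-quantile two-part model with L1 penalties on simulated loans.
import numpy as np
from sklearn.linear_model import LogisticRegression, QuantileRegressor

rng = np.random.default_rng(9)
n, p = 3000, 10
X = rng.normal(size=(n, p))
prob_pos = 1 / (1 + np.exp(-(X[:, 0] - 1.5)))   # factor 0 drives any overdue
days = np.exp(1.0 + 0.8 * X[:, 1] + rng.normal(scale=0.5, size=n))
y = np.where(rng.uniform(size=n) < prob_pos, days, 0.0)   # overdue days

# part 1: P(overdue days > 0), L1-penalized logistic regression
part1 = LogisticRegression(penalty="l1", solver="liblinear", C=0.5)
part1.fit(X, y > 0)

# part 2: median regression on the positive observations, L1-penalized
pos = y > 0
part2 = QuantileRegressor(quantile=0.5, alpha=0.01, solver="highs")
part2.fit(X[pos], y[pos])

print("logit part nonzero coefs   :", np.flatnonzero(part1.coef_))
print("quantile part nonzero coefs:", np.flatnonzero(np.abs(part2.coef_) > 1e-6))
```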
