Similar Articles
20 similar articles found (search time: 15 ms)
1.
Using intraday high-frequency data on the Shanghai Composite Index, the Hang Seng Index, and the S&P 500 index from the day of the Lehman Brothers bankruptcy through the end of January 2009, this paper applies a jump significance test and an extended HAR model to study volatility jump behavior empirically. The results show that the Lehman crisis significantly increased stock market volatility, though the mainland Chinese market was the least affected; the Hong Kong market exhibited the most frequent and largest volatility jumps, which occurred mainly during the overnight closing period; and the crisis substantially reduced the forecasting accuracy of volatility models, making market risk harder to predict, a phenomenon that was more pronounced for emerging markets.
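The HAR framework underlying this study regresses daily realized volatility (RV) on averages of past daily, weekly, and monthly RV; the extended version adds a jump component as a further regressor. A minimal sketch of the baseline HAR-RV fit on synthetic data (illustrative only, not the paper's specification):

```python
import numpy as np

def har_design(rv):
    """Build HAR regressors: lagged daily, weekly (5-day), monthly (22-day) mean RV."""
    rows, y = [], []
    for t in range(22, len(rv)):
        d = rv[t - 1]
        w = rv[t - 5:t].mean()
        m = rv[t - 22:t].mean()
        rows.append([1.0, d, w, m])
        y.append(rv[t])
    return np.array(rows), np.array(y)

rng = np.random.default_rng(0)
# Synthetic persistent RV series (illustrative only, not market data)
rv = np.abs(np.cumsum(rng.normal(0, 0.1, 500))) + 1.0
X, y = har_design(rv)
beta, *_ = np.linalg.lstsq(X, y, rcond=None)  # OLS fit of RV_t on the HAR terms
print(beta)  # intercept and daily/weekly/monthly coefficients
```

The jump-augmented extension would simply append a lagged jump-component column to the design matrix.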

2.
Evidence that cell phone use while driving increases the risk of being involved in a motor vehicle crash has led policymakers to consider prohibitions on this practice. However, while restrictions would reduce property loss, injuries, and fatalities, consumers would lose the convenience of using these devices while driving. Quantifying the risks and benefits associated with cell phone use while driving is complicated by substantial uncertainty in the estimates of several important inputs, including the extent to which cell phone use increases a driver's risk of being involved in a crash, the amount of time drivers spend using cell phones (and hence their aggregate contribution to crashes, injuries, and fatalities), and the incremental value to users of being able to make calls while driving. Two prominent studies that have investigated cell phone use while driving have concluded that the practice should not be banned. One finds that the benefits of calls made while driving substantially exceed their costs, while the other finds that other interventions could reduce motor vehicle injuries and fatalities (measured in terms of quality-adjusted life years) at a lower cost. Another issue is that cell phone use imposes increased (involuntary) risks on other roadway users. This article revises the assumptions used in the two previous analyses to make them consistent and updates them using recent data. The result is a best estimate of zero for the net benefit of cell phone use while driving, a finding that differs substantially from the previous study. Our revised cost-effectiveness estimate for cell phone use while driving moves in the other direction: the cost per quality-adjusted life year increases modestly compared to the previous estimate. Both estimates are very uncertain.

3.
Kenneth Lloyd Rider. Omega, 1973, 1(5): 577-589
Using JW Forrester's Urban Dynamics city model as a starting point, an economic model of the housing market of New York City was constructed incorporating City data on housing stock, rent levels, operating expenses, and return on capital. Several possible housing policies were examined over a range of model parameters. It was found that, as Forrester found, artificially restricting new housing construction and increasing slum housing demolition would serve to drive the poor from the City by making adequate housing unavailable, but, in contrast to Forrester's conclusions, this would have little effect on upward mobility, the availability of jobs, or the influx of labor.

4.
Access management, which systematically limits opportunities for egress and ingress of vehicles to highway lanes, is critical to protect trillions of dollars of current investment in transportation. This article addresses allocating resources for access management with incomplete and partially relevant data on crash rates, travel speeds, and other factors. While access management can be effective to avoid crashes, reduce travel times, and increase route capacities, the literature suggests a need for performance metrics to guide investments in resource allocation across large corridor networks and several time horizons. In this article, we describe a quantitative decision model to support an access management program via risk-cost-benefit analysis under data uncertainties from diverse sources of data and expertise. The approach quantifies potential benefits, including safety improvement and travel time savings, and costs of access management through functional relationships of input parameters including crash rates, corridor access point densities, and traffic volumes. Parameter uncertainties, which vary across locales and experts, are addressed via numerical interval analyses. This approach is demonstrated at several geographic scales across 7,000 kilometers of highways in a geographic region and several subregions. The demonstration prioritizes route segments that would benefit from risk management, including (i) additional data or elicitation, (ii) right-of-way purchases, (iii) restriction or closing of access points, (iv) new alignments, (v) developer proffers, and (vi) other measures. The approach ought to be of wide interest to analysts, planners, policymakers, and stakeholders who rely on heterogeneous data and expertise for risk management.
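The numerical interval analyses mentioned above can be illustrated with elementary interval arithmetic: each uncertain input is an interval, and each operation propagates the worst and best cases. A minimal sketch with hypothetical benefit and cost figures (not the study's data):

```python
# Interval arithmetic for a benefit-cost ratio under parameter uncertainty.
# All numbers are hypothetical placeholders, not the study's inputs.

def i_add(a, b):
    return (a[0] + b[0], a[1] + b[1])

def i_div(a, b):
    assert b[0] > 0, "divisor interval must exclude zero"
    quotients = [a[0] / b[0], a[0] / b[1], a[1] / b[0], a[1] / b[1]]
    return (min(quotients), max(quotients))

safety_benefit = (0.8e6, 2.0e6)   # monetized crashes avoided, per year
time_benefit   = (0.3e6, 1.1e6)   # monetized travel time savings, per year
cost           = (1.0e6, 1.5e6)   # right-of-way purchases and closures

bcr = i_div(i_add(safety_benefit, time_benefit), cost)
print(bcr)  # interval containing the benefit-cost ratio
```

Segments whose entire ratio interval lies above 1 can be prioritized without further data; segments whose interval straddles 1 are candidates for additional data collection or expert elicitation.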

5.
杨威, 冯璐, 宋敏, 李春涛. 管理世界 (Management World), 2020, (1): 167-186, 241
Stock price overvaluation refers to a firm's market value exceeding its intrinsic value, but how to measure intrinsic value has long been contested. Drawing on the concept of the anchoring effect from the behavioral finance literature and on the particular features of China's capital market, this paper proposes a new overvaluation measure, the anchoring ratio (RPR). To validate the measure, we test it against subsequent stock price crash risk. The results show that: first, the anchoring ratio is positively associated with crash risk; second, less analyst coverage, more retail ownership, and higher stock liquidity all strengthen the effect of the anchoring ratio on crash risk; third, the results hold after controlling for commonly used overvaluation measures, the two types of agency problems, and managerial bad-news hoarding; fourth, using crash events, we confirm that the anchoring ratio amplifies the magnitude of price declines, and that over the long run prices exhibit "momentum" rather than "reversal." The findings indicate that investors anchor markedly on past price highs when making decisions, enriching the application of the anchoring effect in China's capital market. More importantly, the paper offers an overvaluation measure that may be better suited to China's capital market; the measure implies that imperfections in the market's pricing mechanism are an important cause of overvaluation and frequent crashes, with implications for improving pricing efficiency and reducing crash risk.

6.
Policy makers, vehicle manufacturers, and consumers have shown growing concern about the relative safety of sport utility vehicles (SUVs), vans, pickups, and cars. Empirical analysis of real-world crashes is complicated by the possibility that apparent relationships between vehicle type and safety may be confounded by other factors, such as driver behavior and crash circumstances. This study compares different vehicle types with respect to their crashworthiness (self-protection) and aggressivity (risk to others) in crashes between two passenger vehicles. The U.S. Crashworthiness Data System is used to analyze detailed information on 6,481 drivers involved in crashes during 1993-1999. Logistic regression analysis is used to model the risk of serious injury or death to a driver, conditional on a crash occurring. Covariates include the body type of each vehicle in the crash; the driver's age, gender, and restraint use; and the configuration of the crash. A unique feature of this study is the use of "delta-v" to represent the joint effects of vehicle mass and crash severity. While estimated effects are somewhat sensitive to the injury severity level used as the outcome variable, SUVs, vans, and pickups appear to be more aggressive and may be more crashworthy than cars. Effects of pickups are most pronounced. Drivers in pickups face less risk of serious injury than car drivers (odds ratio [OR], 0.35; 95% confidence interval [CI], 0.20-0.60), and drivers who collide with pickups experience more than twice the risk of those who collide with a car (OR, 2.18; 95% CI, 1.03-4.62). While vehicle mass and crash severity contribute to the apparent crashworthiness and aggressivity of passenger vehicles, other vehicle characteristics associated with body type (e.g., the stiffness and height of the underlying structure of the vehicle) also influence safety risks.
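The reported odds ratios come from logistic regression, where exp(β) for a vehicle-type indicator is the odds ratio. For intuition, a minimal sketch computing an odds ratio and Wald 95% confidence interval from a hypothetical 2×2 table (the study's estimates are covariate-adjusted, so they cannot be reproduced this way):

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio and Wald 95% CI from a 2x2 table:
       exposed:   a events, b non-events
       unexposed: c events, d non-events
    """
    or_ = (a * d) / (b * c)
    se = math.sqrt(1/a + 1/b + 1/c + 1/d)  # SE of log(OR), Woolf's method
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi

# Hypothetical counts: serious injury vs. not, pickup vs. car drivers
print(odds_ratio_ci(20, 180, 90, 300))
```

A CI that excludes 1 (as for both pickup estimates in the abstract) indicates a statistically significant difference in odds.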

7.
The objective of this article is to evaluate the performance of the COM-Poisson GLM for analyzing crash data exhibiting underdispersion (when conditional on the mean). The COM-Poisson distribution, originally developed in 1962, has recently been reintroduced by statisticians for analyzing count data subjected to either over- or underdispersion. Over the last year, the COM-Poisson GLM has been evaluated in the context of crash data analysis, and it has been shown that the model performs as well as the Poisson-gamma model for crash data exhibiting overdispersion. To accomplish the objective of this study, several COM-Poisson models were estimated using crash data collected at 162 railway-highway crossings in South Korea between 1998 and 2002. This data set has been shown to exhibit underdispersion when models linking crash data to various explanatory variables are estimated. The modeling results were compared to those produced from the Poisson and gamma probability models documented in a previously published study. The results of this research show that the COM-Poisson GLM can handle crash data when the modeling output shows signs of underdispersion. Finally, they also show that the model proposed in this study provides better statistical performance than the gamma probability and the traditional Poisson models, at least for this data set.
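The COM-Poisson distribution adds a dispersion parameter ν to the Poisson: P(Y = y) ∝ λ^y / (y!)^ν, with ν > 1 yielding underdispersion and ν = 1 recovering the Poisson. A toy sketch of the pmf and a crude maximum-likelihood grid search on hypothetical counts (the paper's GLM additionally links λ to covariates and uses a proper optimizer):

```python
import math

def com_poisson_pmf(y, lam, nu, terms=50):
    """COM-Poisson pmf: P(Y=y) = lam^y / (y!)^nu / Z(lam, nu),
    with the normalizing constant Z truncated at `terms` summands.
    nu > 1 gives underdispersion; nu = 1 recovers the Poisson."""
    z = sum(lam**j / math.factorial(j)**nu for j in range(terms))
    return lam**y / math.factorial(y)**nu / z

def loglik(data, lam, nu):
    return sum(math.log(com_poisson_pmf(y, lam, nu)) for y in data)

# Hypothetical underdispersed counts (e.g., crashes per crossing-year)
data = [1, 2, 2, 1, 2, 3, 2, 1, 2, 2]
# Crude grid search over (lam, nu); a real fit would use an optimizer
best = max(
    ((lam, nu) for lam in (1.5, 2.0, 2.5, 3.0, 4.0) for nu in (0.5, 1.0, 1.5, 2.0)),
    key=lambda p: loglik(data, *p),
)
print(best)  # an estimated nu above 1 signals underdispersion
```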

8.
This paper compares the properties and performance of three weight elicitation methods. It is in effect a "second round contest" in which the Bottomley et al. (2000) champion, direct rating (DR), locks horns with two new challengers. People using DR rate each attribute in turn on a scale of 0–100, whilst people using Max100 first assign to the most important attribute(s) a rating of 100, and then rate the other attributes relative to it/them. People using Min10 first assign the least important attribute(s) a rating of 10, and then rate the other attributes relative to it/them. The weights produced by Max100 were somewhat more test–retest reliable than DR. Both methods were considerably more reliable than Min10. Using people's test–retest data as attribute weights on simulated alternative values in a multi-attribute choice scenario, the same alternative would be chosen on 91% of occasions using Max100, 87% of occasions using DR, but only 75% of occasions using Min10. Moreover, the three methods are shown to have very distinct "signatures", that is, profiles relating weights to rank position. Finally, people actually preferred using Max100 and DR rather than Min10, an important pragmatic consideration.
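Whichever elicitation method is used, the raw ratings are typically normalized to weights summing to one, which is why the three methods can produce different weight profiles ("signatures") from similar rankings. A minimal sketch with hypothetical ratings for four attributes:

```python
def normalize_weights(ratings):
    """Convert raw attribute ratings into weights that sum to 1."""
    total = sum(ratings.values())
    return {k: v / total for k, v in ratings.items()}

# Hypothetical ratings under each elicitation method (not the study's data)
dr     = {"price": 80, "quality": 100, "service": 60, "brand": 40}   # free 0-100 scale
max100 = {"price": 75, "quality": 100, "service": 55, "brand": 35}   # most important pinned at 100
min10  = {"price": 40, "quality": 70,  "service": 25, "brand": 10}   # least important pinned at 10

for name, r in (("DR", dr), ("Max100", max100), ("Min10", min10)):
    print(name, normalize_weights(r))
```

Pinning the top (Max100) or bottom (Min10) of the scale changes how the remaining ratings spread out, and hence the shape of the normalized weight profile.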

9.
Hosay CK. Omega, 2001, 44(1): 57-76
Nursing home patients have a constitutional right to refuse treatment. The Patient Self-Determination Act confirmed that right. State laws address the obligations of health care providers and facilities to honor that right. The New York State law is more specific than those of many other states. It allows exemptions for "reasons of conscience" and imposes a number of requirements on nursing homes claiming such an exemption, including the transfer of a patient to a home that will honor an end-of-life wish. This study, conducted by FRIA, investigated the refusal of some nursing homes in New York City to carry out patients' end-of-life wishes because of conscience-based objections. The study also investigated the willingness of homes that did not have such policies to accept patients transferring from a home with such a policy so that the patient's end-of-life wishes would be honored. Implications for administrators, policy makers, and regulators are discussed.

10.
Worldwide data on terrorist incidents between 1968 and 2004 gathered by the RAND Corporation and the Oklahoma City National Memorial Institute for the Prevention of Terrorism (MIPT) were assessed for patterns and trends in morbidity/mortality. Adjusted data analyzed involve a total of 19,828 events, 7,401 "adverse" events (each causing ≥ 1 victim), and 86,568 "casualties" (injuries), of which 25,408 were fatal. Most terror-related adverse events, casualties, and deaths involved bombs and guns. Weapon-specific patterns and terror-related risk levels in Israel (IS) have differed markedly from those of all other regions combined (OR). IS had a fatal fraction of casualties about half that of OR, but has experienced relatively constant lifetime terror-related casualty risks on the order of 0.5%, a level 2 to 3 orders of magnitude higher than those experienced in OR, which increased approximately 100-fold over the same period. Individual event fatality has increased steadily, the median increasing from 14% to 50%. Lorenz curves obtained indicate substantial dispersion among victim/event rates: about half of all victims were caused by the top 2.5% (or 10%) of harm-ranked events in OR (or IS). Extreme values of victim/event rates were approximated fairly well by generalized Pareto models (typically fitted to data on forest fires, sea levels, earthquakes, etc.). These results were in turn used to forecast maximum OR- and IS-specific victim/event rates through 2080, illustrating empirically based methods that could be applied to improve strategies to assess, prevent, and manage terror-related risks and consequences.
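Extreme victim/event rates of the kind described above can be fitted with a generalized Pareto distribution. A minimal sketch using method-of-moments estimates on hypothetical threshold exceedances (a stand-in for the study's approach, which would use proper maximum likelihood on the actual data):

```python
import statistics

def gpd_moment_fit(exceedances):
    """Method-of-moments estimates for the generalized Pareto distribution
    fitted to threshold exceedances: returns (shape xi, scale sigma)."""
    m = statistics.fmean(exceedances)
    v = statistics.variance(exceedances)
    xi = 0.5 * (1 - m * m / v)
    sigma = 0.5 * m * (m * m / v + 1)
    return xi, sigma

# Hypothetical victims-per-event counts in excess of a chosen threshold
exceed = [2, 5, 1, 12, 3, 40, 7, 2, 18, 4, 1, 9]
xi, sigma = gpd_moment_fit(exceed)
print(xi, sigma)  # xi > 0 indicates a heavy (power-law-like) tail
```

A positive shape estimate is what justifies extrapolating maximum victim/event rates decades ahead, as the study does through 2080.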

11.
We describe a risk-based analytical framework for estimating traffic fatalities that combines the probability of a crash and the probability of fatality in the event of a crash. As an illustrative application, we use the methodology to explore the role of vehicle mix and vehicle prevalence on long-run fatality trends for a range of transportation growth scenarios that may be relevant to developing societies. We assume crash rates between different road users are proportional to their roadway use and estimate case fatality ratios (CFRs) for the different vehicle-vehicle and vehicle-pedestrian combinations. We find that in the absence of road safety interventions, the historical trend of initially rising and then falling fatalities observed in industrialized nations occurred only if motorization was through car ownership. In all other cases studied (scenarios dominated by scooter use, bus use, and mixed use), traffic fatalities rose monotonically. Fatalities per vehicle had a falling trend similar to that observed in historical data from industrialized nations. Regional adaptations of the model validated with local data can be used to evaluate the impacts of transportation planning and safety interventions, such as helmets, seat belts, and enforcement of traffic laws, on traffic fatalities.
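The framework's core identity, expected fatalities = Σ over collision types of (crashes) × (case fatality ratio), can be sketched directly. All figures below are hypothetical placeholders, not the paper's estimates:

```python
# Expected annual fatalities as a sum over collision types of
# (number of crashes) x (case fatality ratio). Hypothetical numbers.
crashes = {                    # crashes per year by collision type
    "car-car": 10000,
    "car-pedestrian": 2000,
    "scooter-car": 3000,
}
cfr = {                        # probability a crash of this type is fatal
    "car-car": 0.005,
    "car-pedestrian": 0.05,
    "scooter-car": 0.02,
}
fatalities = sum(crashes[k] * cfr[k] for k in crashes)
print(fatalities)  # 10000*0.005 + 2000*0.05 + 3000*0.02
```

Changing the vehicle mix shifts crashes between collision types with very different CFRs, which is how the scenarios in the abstract produce such different long-run fatality trends.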

12.
The tenfold "uncertainty" factor traditionally used to guard against human interindividual differences in susceptibility to toxicity is not based on human observations. To begin to build a basis for quantifying an important component of overall variability in susceptibility to toxicity, a database has been constructed of individual measurements of key pharmacokinetic parameters for specific substances (mostly drugs) in groups of at least five healthy adults. 72 of the 101 data sets studied were positively skewed, indicating that the distributions are generally closer to expectations for log-normal distributions than for normal distributions. Measurements of interindividual variability in elimination half-lives, maximal blood concentrations, and AUC (area under the curve of blood concentration by time) have median values of log10 geometric standard deviations in the range of 0.11-0.145. For the median chemical, therefore, a tenfold difference in these pharmacokinetic parameters would correspond to 7-9 standard deviations in populations of normal healthy adults. For one relatively lipophilic chemical, however, interindividual variability in maximal blood concentration and AUC was 0.4, implying that a tenfold difference would correspond to only about 2.5 standard deviations for those parameters in the human population. The parameters studied to date are only components of overall susceptibility to toxic agents, and do not include contributions from variability in exposure- and response-determining parameters. The current study also implicitly excludes most human interindividual variability from age and illness. When these other sources of variability are included in an overall analysis of variability in susceptibility, it is likely that a tenfold difference will correspond to fewer standard deviations in the overall population, and correspondingly greater numbers of people at risk of toxicity.
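The arithmetic linking the log10 geometric standard deviation (GSD) to these statements is simple: a tenfold difference is one unit on the log10 scale, so it spans 1 / (log10 GSD) standard deviations of the log-normal distribution:

```python
# Standard deviations spanned by a tenfold difference in a log-normally
# distributed pharmacokinetic parameter: a factor of 10 is 1 unit on the
# log10 scale, so it spans 1 / (log10 GSD) standard deviations.
for log10_gsd in (0.11, 0.13, 0.145, 0.4):
    n_sd = 1 / log10_gsd
    print(f"log10 GSD = {log10_gsd}: tenfold difference ~ {n_sd:.1f} SDs")
```

The 0.11-0.145 median range reproduces the abstract's 7-9 standard deviations, and 0.4 reproduces the roughly 2.5 standard deviations for the lipophilic chemical.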

13.
Adrian Kent. Risk Analysis, 2004, 24(1): 157-168
Recent articles by Busza et al. (BJSW) and Dar et al. (DDH) argue that astrophysical data can be used to establish small bounds on the risk of a "killer strangelet" catastrophe scenario in the RHIC and ALICE collider experiments. The case for the safety of the experiments set out by BJSW does not rely solely on these bounds, but on theoretical arguments, which BJSW find sufficiently compelling to firmly exclude any possibility of catastrophe. Nonetheless, DDH and other commentators (initially including BJSW) suggested that these empirical bounds alone do give sufficient reassurance. This seems unsupportable when the bounds are expressed in terms of expectation value, a good measure according to standard risk analysis arguments. For example, DDH's main bound, p(catastrophe) < 2 × 10^-8, implies only that the expectation value of the number of deaths is bounded by 120; BJSW's most conservative bound implies the expectation value of the number of deaths is bounded by 60,000. This article reappraises the DDH and BJSW risk bounds by comparing risk policy in other areas. For example, it is noted that, even if highly risk-tolerant assumptions are made and no value is placed on the lives of future generations, a catastrophe risk no higher than approximately 10^-15 per year would be required for consistency with established policy for radiation hazard risk minimization. Allowing for risk aversion and for future lives, a respectable case can be made for requiring a bound many orders of magnitude smaller. In summary, the costs of small risks of catastrophe have been significantly underestimated by BJSW (initially), by DDH, and by other commentators.
Future policy on catastrophe risks would be more rational, and more deserving of public trust, if acceptable risk bounds were generally agreed upon ahead of time and if serious research on whether those bounds could indeed be guaranteed was carried out well in advance of any hypothetically risky experiment, with the relevant debates involving experts with no stake in the experiments under consideration.
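The expectation-value arithmetic behind the quoted bounds is elementary: a probability bound on catastrophe multiplied by the exposed population bounds the expected number of deaths. A sketch using a population of roughly 6 billion, the approximate world population of the period:

```python
# Expected-deaths bound implied by a probability bound on catastrophe:
# E[deaths] <= p_catastrophe * exposed_population.
world_population = 6e9            # approximate figure for the early 2000s
for p_bound in (2e-8, 1e-5):      # DDH's main bound; a bound implying 60,000
    print(p_bound, "->", p_bound * world_population, "expected deaths")
```

This is why a probability bound that sounds reassuringly tiny (2 × 10^-8) still corresponds to an expectation of 120 deaths, the point the article presses against relying on the empirical bounds alone.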

14.
From the perspective of related-party transactions, this paper uses Chinese A-share listed family firms from 2008 to 2017 to examine the relationship between the degree of excess family control of board seats and stock price crash risk. We find that excess family board control is positively associated with both crash risk and the controlling family's related-party transactions; related-party transactions in turn have a significant positive effect on crash risk, and this effect is more pronounced in firms with a higher degree of excess board control. Further analysis shows that when institutional ownership is low, the CEO also chairs the board, or the board is small, the positive associations between excess board control and both crash risk and the scale of related-party transactions are stronger; the positive effect of related-party transactions on crash risk, and of their interaction with excess board control, is also more pronounced. The conclusions hold after controlling for potential endogeneity and a battery of robustness checks. The paper not only examines the capital market consequences of excess family board control from the related-party-transaction perspective but also offers additional theoretical explanations for the causes of crash risk in family firms.

15.
The hyper-Poisson distribution can handle both over- and underdispersion, and its generalized linear model formulation allows the dispersion of the distribution to be observation-specific and dependent on model covariates. This study's objective is to examine the potential applicability of a newly proposed generalized linear model framework for the hyper-Poisson distribution in analyzing motor vehicle crash count data. The hyper-Poisson generalized linear model was first fitted to intersection crash data from Toronto, characterized by overdispersion, and then to crash data from railway-highway crossings in Korea, characterized by underdispersion. The results of this study are promising. When fitted to the Toronto data set, the goodness-of-fit measures indicated that the hyper-Poisson model with a variable dispersion parameter provided a statistical fit as good as the traditional negative binomial model. The hyper-Poisson model was also successful in handling the underdispersed data from Korea; the model performed as well as the gamma probability model and the Conway-Maxwell-Poisson model previously developed for the same data set. The advantages of the hyper-Poisson model studied in this article are noteworthy. Unlike the negative binomial model, which has difficulties in handling underdispersed data, the hyper-Poisson model can handle both over- and underdispersed crash data. Although not a major issue for the Conway-Maxwell-Poisson model, the effect of each variable on the expected mean of crashes is easily interpretable in the case of this new model.
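The hyper-Poisson pmf is P(Y = y) = (λ^y / (γ)_y) / ₁F₁(1; γ; λ), where (γ)_y is the rising factorial and ₁F₁ is the confluent hypergeometric function; γ = 1 recovers the Poisson, γ > 1 gives overdispersion, and γ < 1 underdispersion. A toy sketch of the pmf (the article's GLM additionally makes γ observation-specific and covariate-dependent):

```python
import math

def rising(g, y):
    """Pochhammer rising factorial (g)_y = g (g+1) ... (g+y-1)."""
    out = 1.0
    for i in range(y):
        out *= g + i
    return out

def hyper_poisson_pmf(y, lam, gamma, terms=100):
    """Hyper-Poisson pmf: P(Y=y) = (lam^y / (gamma)_y) / 1F1(1; gamma; lam),
    with the hypergeometric normalizer approximated by a truncated series.
    gamma = 1 recovers the Poisson distribution."""
    norm = sum(lam**j / rising(gamma, j) for j in range(terms))  # 1F1(1; gamma; lam)
    return (lam**y / rising(gamma, y)) / norm

# Sanity check: gamma = 1 must match the Poisson pmf
p_hp = hyper_poisson_pmf(3, 2.0, 1.0)
p_pois = math.exp(-2.0) * 2.0**3 / math.factorial(3)
print(p_hp, p_pois)
```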

16.
In theory, corporate social responsibility (CSR) disclosure may either reduce or exacerbate stock price crash risk, and the news media, as a major channel for CSR disclosure, may either weaken or strengthen this effect. These questions have been a focus of recent corporate finance research, but findings remain contested. This paper first derives, theoretically, the two-way mechanism through which CSR disclosure affects crash risk, and then introduces the news media to study possible transmission channels. Based on panel data for all Chinese A-share listed companies from 2010 to 2018, we find that disclosing a CSR index significantly reduces crash risk; fulfilling CSR significantly increases the volume of media coverage, and increased coverage significantly curbs crash risk, i.e., media coverage mediates the effect of CSR on crash risk. Further analysis shows that, within the CSR index, shareholder responsibility has the largest effect on crash risk, while responsibilities toward suppliers, customers and consumers, and society are insignificant; and that, compared with neutral coverage, positive and negative media coverage exhibit stronger mediating effects.

17.
A crisis in the commercial fishing industry jeopardized the long-term survival of the Port of Portland, the second largest seaport in New England and a significant contributor to Maine's economic health. The State of Maine Department of Marine Resources and the City of Portland wanted to find a solution such that industry stakeholders would self-organize, define the industry's future, and create and implement a viable strategy. The consultant conceptualized the project using complexity theory because this theory reflected the deep, complex, community inter-relationships and also provided the opportunity to integrate activities and outcomes across all phases of the project.

18.
Because it is affected by many factors, mid-to-long-term power load forecasting is a nonlinear, multi-dimensional problem, and its accuracy affects decisions and planning in the power generation sector. To improve the accuracy and convergence of the single least squares support vector machine (LSSVM), this paper proposes an improved fruit fly optimization algorithm applied to a wavelet least squares support vector machine (IFOA-w-LSSVM). First, the Gaussian kernel function of the LSSVM is replaced by a wavelet kernel function to build the wavelet least squares support vector machine (w-LSSVM). Second, the ordinary fruit fly optimization algorithm (FOA) is improved in three respects: (1) dividing the fruit fly group into two parts; (2) improving the taste-detection function; and (3) applying Cauchy mutation to individual fruit flies. Finally, the w-LSSVM is optimized by IFOA to find the optimal parameters and achieve forecasting accuracy. Example verification shows that the proposed model outperforms the alternative methods and is effective and feasible for mid-to-long-term power load forecasting.
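The basic fruit fly optimization loop that IFOA modifies is easy to sketch: flies take random steps around the best-known position, and the swarm center moves to the best fly found. A minimal 1-D sketch on a toy objective (the paper's IFOA adds subgroup division, a modified smell-detection function, and Cauchy mutation, and optimizes the w-LSSVM hyperparameters rather than a toy function):

```python
import random

def foa_minimize(f, iters=200, pop=20, seed=0):
    """Minimal fruit fly optimization sketch: each generation, `pop` flies
    take uniform random steps around the best-known position; the swarm
    center moves to the best fly found so far."""
    rng = random.Random(seed)
    x_best = rng.uniform(-10, 10)
    f_best = f(x_best)
    for _ in range(iters):
        for _ in range(pop):
            x = x_best + rng.uniform(-1, 1)  # random flight around the center
            fx = f(x)
            if fx < f_best:                  # keep the best "smell" found
                x_best, f_best = x, fx
    return x_best, f_best

# Toy objective standing in for the LSSVM cross-validation error surface
x, fx = foa_minimize(lambda x: (x - 3.0) ** 2 + 1.0)
print(round(x, 3), round(fx, 4))
```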

19.
Several years ago I went to a lecture in which documentary filmmaker Rick Burns described his new film on the history of New York City. He mentioned that his biggest challenge was to identify a theme—an element that was a common thread in the history of the city which he could use as the focus for the documentary. (He finally decided that the theme would be "money.") To develop the theme for Philips Electronics' application for the World Environment Center's (WEC's) Gold Medal Award for Corporate Environmental Excellence, I encountered the same challenge as the filmmaker. What theme would I use for a company which is diverse, decentralized, and has a worldwide presence? While Philips was doing many worthwhile things and had significant accomplishments, how could I put that in a package which would be coherent, cohesive, and understandable to the independent judges from around the world?

20.
This study presents a tree-based logistic regression approach to assessing work zone casualty risk, which is defined as the probability of a vehicle occupant being killed or injured in a work zone crash. First, a decision tree approach is employed to determine the tree structure and interacting factors. Based on the Michigan M-94\I-94\I-94BL\I-94BR highway work zone crash data, an optimal tree comprising four leaf nodes is determined, and the interacting factors are found to be airbag, occupant identity (i.e., driver or passenger), and gender. The data are then split into four groups according to the tree structure. Finally, the logistic regression analysis is conducted separately for each group. The results show that the proposed approach outperforms the pure decision tree model because the former has the capability of examining the marginal effects of risk factors. Compared with the pure logistic regression method, the proposed approach avoids the variable interaction effects, so it significantly improves the prediction accuracy.
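The two-stage idea can be sketched as a tree stump on an interacting factor followed by a separate logistic fit per leaf. A minimal sketch on synthetic records, with plain gradient descent standing in for a proper solver (the study's tree has four leaves and more factors; everything below is illustrative):

```python
import math
import random

def fit_logistic(xs, ys, lr=0.1, epochs=2000):
    """Plain gradient-descent logistic regression: P(y=1) = sigmoid(a + b*x)."""
    a = b = 0.0
    for _ in range(epochs):
        ga = gb = 0.0
        for x, y in zip(xs, ys):
            p = 1 / (1 + math.exp(-(a + b * x)))
            ga += p - y
            gb += (p - y) * x
        a -= lr * ga / len(xs)
        b -= lr * gb / len(xs)
    return a, b

# Synthetic work-zone records: (airbag deployed, normalized speed, injured)
rng = random.Random(1)
records = []
for _ in range(400):
    airbag = rng.random() < 0.5
    speed = rng.random()
    # injury risk depends on speed differently within each airbag group,
    # i.e., an interaction the tree split removes from the logistic step
    p = 0.1 + 0.2 * speed if airbag else 0.2 + 0.6 * speed
    records.append((airbag, speed, int(rng.random() < p)))

# Tree step: split on the interacting factor; logistic step: one fit per leaf
for leaf in (False, True):
    xs = [s for a, s, y in records if a == leaf]
    ys = [y for a, s, y in records if a == leaf]
    print("airbag" if leaf else "no airbag", fit_logistic(xs, ys))
```

Splitting first means each leaf's logistic model only has to capture the marginal effect of speed, not the airbag-by-speed interaction.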


Copyright © 北京勤云科技发展有限公司 (Beijing Qinyun Technology Development Co., Ltd.)  京ICP备09084417号