Similar Documents
20 similar documents found (search time: 944 ms)
1.
Jump Regressions     
We develop econometric tools for studying jump dependence of two processes from high‐frequency observations on a fixed time interval. In this context, only segments of data around a few outlying observations are informative for the inference. We derive an asymptotically valid test for stability of a linear jump relation over regions of the jump size domain. The test has power against general forms of nonlinearity in the jump dependence as well as temporal instabilities. We further propose an efficient estimator for the linear jump regression model that is formed by optimally weighting the detected jumps with weights based on the diffusive volatility around the jump times. We derive the asymptotic limit of the estimator, a semiparametric lower efficiency bound for the linear jump regression, and show that our estimator attains the latter. The analysis covers both deterministic and random jump arrivals. In an empirical application, we use the developed inference techniques to test the temporal stability of market jump betas.
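A schematic statement of the linear jump relation and the weighted estimator described above (notation illustrative; the exact optimal weights are derived in the paper): at each detected jump time \(\tau\) of X,

\[
\Delta Y_\tau = \beta\,\Delta X_\tau, \qquad
\hat{\beta} = \frac{\sum_{\tau} w_\tau\, \Delta X_\tau\, \Delta Y_\tau}{\sum_{\tau} w_\tau\, (\Delta X_\tau)^2},
\]

where the weights \(w_\tau\) depend on the estimated diffusive (spot) volatility just before and after \(\tau\), so that jumps observed in noisier local environments receive less weight.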

2.
EXcess Idle Time     
We introduce a novel economic indicator, named excess idle time (EXIT), measuring the extent of sluggishness in financial prices. Under a null and an alternative hypothesis grounded in no‐arbitrage (the null) and market microstructure (the alternative) theories of price determination, we derive a limit theory for EXIT leading to formal tests for staleness in the price adjustments. Empirical implementation of the theory indicates that financial prices are often more sluggish than implied by the (ubiquitous, in frictionless continuous‐time asset pricing) semimartingale assumption. EXIT is interpretable as an illiquidity proxy and is easily implementable, for each trading day, using transaction prices only. By using EXIT, we show how to estimate structurally market microstructure models with asymmetric information.
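A minimal sketch of the raw ingredient behind EXIT, assuming one day of transaction prices; the function name and threshold rule are hypothetical, and the paper's statistic additionally compares this idle share with the level a frictionless semimartingale would imply:

```python
import numpy as np

def idle_time_share(prices, threshold):
    """Fraction of intra-day intervals in which the log price moves by less
    than `threshold` in absolute value -- an illustrative idle-time proxy,
    not the paper's exact EXIT statistic."""
    logp = np.log(np.asarray(prices, dtype=float))
    moves = np.abs(np.diff(logp))
    return np.mean(moves < threshold)
```

A large gap between this share and the share implied by the estimated local volatility would indicate "excess" idle time, i.e., stale price adjustment.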

3.
We propose a semiparametric two‐step inference procedure for a finite‐dimensional parameter based on moment conditions constructed from high‐frequency data. The population moment conditions take the form of temporally integrated functionals of state‐variable processes that include the latent stochastic volatility process of an asset. In the first step, we nonparametrically recover the volatility path from high‐frequency asset returns. The nonparametric volatility estimator is then used to form sample moment functions in the second‐step GMM estimation, which requires the correction of a high‐order nonlinearity bias from the first step. We show that the proposed estimator is consistent and asymptotically mixed Gaussian and propose a consistent estimator for the conditional asymptotic variance. We also construct a Bierens‐type consistent specification test. These infill asymptotic results are based on a novel empirical‐process‐type theory for general integrated functionals of noisy semimartingale processes.
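A hedged sketch of the two-step logic, assuming equispaced returns over a day normalized to [0, 1]; the function names are hypothetical and the high-order nonlinearity bias correction mentioned above is omitted:

```python
import numpy as np

def spot_variance_path(returns, window):
    """Step 1: local realized-variance estimate of the latent spot variance
    c_t from high-frequency returns (simple one-sided window)."""
    r = np.asarray(returns, dtype=float)
    n = r.size
    dt = 1.0 / n
    c = np.empty(n)
    for i in range(n):
        lo = max(0, i - window + 1)
        c[i] = np.sum(r[lo:i + 1] ** 2) / ((i + 1 - lo) * dt)
    return c

def integrated_moment(c_hat, g):
    """Step 2: Riemann-sum approximation of the integrated functional
    (1/T) * integral of g(c_t) dt used as a sample moment in the GMM step."""
    return np.sum(g(c_hat)) / c_hat.size
```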

4.
This paper studies regulated health insurance markets known as exchanges, motivated by the increasingly important role they play in both public and private insurance provision. We develop a framework that combines data on health outcomes and insurance plan choices for a population of insured individuals with a model of a competitive insurance exchange to predict outcomes under different exchange designs. We apply this framework to examine the effects of regulations that govern insurers' ability to use health status information in pricing. We investigate the welfare implications of these regulations with an emphasis on two potential sources of inefficiency: (i) adverse selection and (ii) premium reclassification risk. We find substantial adverse selection leading to full unraveling of our simulated exchange, even when age can be priced. While the welfare cost of adverse selection is substantial when health status cannot be priced, that of reclassification risk is five times larger when insurers can price based on some health status information. We investigate several extensions including (i) contract design regulation, (ii) self‐insurance through saving and borrowing, and (iii) insurer risk adjustment transfers.

5.
We develop an econometric methodology to infer the path of risk premia from a large unbalanced panel of individual stock returns. We estimate the time‐varying risk premia implied by conditional linear asset pricing models where the conditioning includes both instruments common to all assets and asset‐specific instruments. The estimator uses simple weighted two‐pass cross‐sectional regressions, and we show its consistency and asymptotic normality under increasing cross‐sectional and time series dimensions. We address consistent estimation of the asymptotic variance by hard thresholding, and testing for asset pricing restrictions induced by the no‐arbitrage assumption. We derive the restrictions given by a continuum of assets in a multi‐period economy under an approximate factor structure robust to asset repackaging. The empirical analysis on returns for about ten thousand U.S. stocks from July 1964 to December 2009 shows that risk premia are large and volatile in crisis periods. They exhibit large positive and negative strays from time‐invariant estimates, follow the macroeconomic cycles, and do not match risk premia estimates on standard sets of portfolios. The asset pricing restrictions are rejected for a conditional four‐factor model capturing market, size, value, and momentum effects.
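A simplified, Fama-MacBeth-style sketch of the weighted two-pass idea (illustrative only: the paper works with a large unbalanced panel, conditioning instruments, and a formal large-N, large-T theory, none of which appear here):

```python
import numpy as np

def two_pass_risk_premia(returns, factors, weights):
    """Pass 1: time-series regressions of each asset's returns on the factors
    to obtain betas. Pass 2: period-by-period weighted cross-sectional
    regressions of returns on those betas to recover the risk-premia path.
    returns: (T, N), factors: (T, K), weights: (N,) e.g. inverse
    idiosyncratic variances."""
    T, N = returns.shape
    X = np.column_stack([np.ones(T), factors])                  # (T, K+1)
    betas = np.linalg.lstsq(X, returns, rcond=None)[0][1:].T    # (N, K)
    B = np.column_stack([np.ones(N), betas])                    # (N, K+1)
    W = np.diag(weights)
    lam = np.empty((T, B.shape[1]))
    for t in range(T):
        lam[t] = np.linalg.solve(B.T @ W @ B, B.T @ W @ returns[t])
    return lam[:, 1:]   # time series of factor risk premia (drop intercept)
```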

6.
Upper and Lower Bounds on Option Prices under Uncertainty   Times cited: 1; self-citations: 1; cited by others: 1
Traditional option pricing theory rests on strict assumptions about the distribution of the underlying asset price and does not account for distributional uncertainty. This paper relaxes these assumptions and prices options when only the first two moments, or the first three moments, of the underlying asset price at maturity are known, without specifying the full distribution. Because information is incomplete and the distribution is uncertain, the resulting option price is an interval rather than a point. We formulate the problem of finding the upper and lower price bounds under such limited information as a mathematical programming model and solve it through its dual program. Comparing these bounds with the Black-Scholes price shows that the Black-Scholes price lies between them, and that the bounds derived from the first three moments are tighter than those derived from the first two moments. Time-series and cross-sectional analyses using Hong Kong Hang Seng Index warrant data confirm that market prices do lie between the bounds, and that the interval narrows as volatility and the remaining time to maturity decrease. The proposed approach requires no strict distributional assumptions on the asset price, which improves the robustness of the pricing model and helps investors combine the price bounds with their own judgment when making investment decisions.
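For intuition, a classical closed-form instance of such a bound, under the assumption that only the mean \(\mu\) and variance \(\sigma^2\) of \(S_T\) are known (this is the well-known two-moment bound, not necessarily the exact program solved in the paper):

\[
\mathbb{E}\big[(S_T - K)^+\big] \;\le\; \tfrac{1}{2}\Big[(\mu - K) + \sqrt{(\mu - K)^2 + \sigma^2}\Big],
\]

so the price bound follows after discounting by \(e^{-rT}\); adding third-moment information shrinks the set of admissible distributions and hence narrows the interval, consistent with the comparison reported above.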

7.
Motivated by the mean reversion and volatility clustering of the VIX index, as well as the jump self-excitation recently documented in empirical studies, this paper models VIX jumps with a self-exciting Hawkes process and builds an affine jump-diffusion model for VIX option pricing. We obtain the conditional characteristic function of the Hawkes jump-diffusion process and then derive a valuation expression for VIX options via Fourier transform methods within a risk-neutral pricing framework. Empirical results show that the model not only overcomes the large fitting errors of standard mean-reverting models but also generates a positive implied volatility skew and an implied volatility smile; moreover, by accounting for jump self-excitation, the model further improves the forecasting performance for VIX option prices relative to a mean-reverting model with Poisson jumps.
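A standard exponential-kernel specification of the self-exciting jump intensity used in such models (parameter names illustrative):

\[
\lambda_t \;=\; \lambda_\infty + \big(\lambda_0 - \lambda_\infty\big)e^{-\kappa t} + \sum_{t_i < t} \alpha\, e^{-\kappa (t - t_i)},
\]

where each past jump time \(t_i\) raises the current intensity by \(\alpha\) and the effect decays at rate \(\kappa\); this self-excitation generates jump clustering and, through the affine structure, yields a conditional characteristic function in exponential-affine form.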

8.
There is a sizable literature on the VIX time series and on VIX option pricing, but few studies adopt the calendar-time versus intrinsic-time perspective. Taking into account investors' sentiment about mean reversion, this paper uses Brownian motion and a tempered stable process to capture the volatility arising in calendar time and in intrinsic time, respectively, and develops a class of tempered-stable mean-reverting models for VIX option pricing. The results show that, once investor sentiment about the VIX is incorporated, the tempered-stable mean-reverting sentiment model fits the leptokurtic, heavy-tailed behavior of the VIX index well and captures stochastic features that standard structural models describe poorly, such as asymmetric jumps, skewness, and local mean reversion; the results also confirm that a more complex model is not necessarily a better one.

9.
Because the assumptions of the classical Black-Scholes option pricing model ignore the impact of sudden events on asset prices and the effect of the "volatility smile" on option values, its prices often deviate from reality. Improvements to the Black-Scholes model have therefore focused mainly on two strands, option pricing with jump-diffusion processes and option pricing with stochastic volatility, but studies combining the two are rare. Building on both strands, this paper first develops an American option pricing model featuring a jump-diffusion process and stochastic volatility simultaneously, and uses Itô's lemma to derive the partial differential equation satisfied by the asset price, the stochastic volatility, and the option. It then derives the distribution of the asset price using the characteristic function method and the Fourier transform, and obtains a numerical solution for the American option under jump diffusion and stochastic volatility via a Markov chain method. Finally, the model is applied in a case study valuing the patent rights of a high-tech enterprise's project investment as a real option, with sensitivity analyses of the jump intensity and stochastic volatility parameters. The results show that incorporating a jump-diffusion process for project returns and stochastic volatility for the market environment into the real-option valuation of patent rights effectively avoids overestimating the patent's option value.
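A Bates-type specification consistent with the description above (the paper's exact parameterization may differ), written under the risk-neutral measure:

\[
dS_t = (r - \lambda \bar{k})\,S_t\,dt + \sqrt{V_t}\,S_t\,dW_t^{(1)} + S_{t^-}\,dJ_t, \qquad
dV_t = \kappa(\theta - V_t)\,dt + \sigma_v \sqrt{V_t}\,dW_t^{(2)},
\]

with \(d\langle W^{(1)}, W^{(2)}\rangle_t = \rho\,dt\) and \(J\) a compound Poisson jump process with intensity \(\lambda\) and mean relative jump size \(\bar{k}\); the early-exercise (American) feature is then handled numerically, here via the Markov chain method mentioned above.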

10.
Stochastic discount factor (SDF) processes in dynamic economies admit a permanent‐transitory decomposition in which the permanent component characterizes pricing over long investment horizons. This paper introduces an empirical framework to analyze the permanent‐transitory decomposition of SDF processes. Specifically, we show how to estimate nonparametrically the solution to the Perron–Frobenius eigenfunction problem of Hansen and Scheinkman (2009). Our empirical framework allows researchers to (i) construct time series of the estimated permanent and transitory components and (ii) estimate the yield and the change of measure which characterize pricing over long investment horizons. We also introduce nonparametric estimators of the continuation value function in a class of models with recursive preferences by reinterpreting the value function recursion as a nonlinear Perron–Frobenius problem. We establish consistency and convergence rates of the eigenfunction estimators and asymptotic normality of the eigenvalue estimator and estimators of related functionals. As an application, we study an economy where the representative agent is endowed with recursive preferences, allowing for general (nonlinear) consumption and earnings growth dynamics.
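Roughly, in the spirit of Hansen and Scheinkman (2009) and with illustrative notation, the eigenfunction problem and the implied decomposition of the SDF process S are

\[
\mathbb{E}\!\left[\frac{S_{t+1}}{S_t}\,\phi(X_{t+1})\,\middle|\,X_t = x\right] = e^{-\rho}\,\phi(x),
\qquad
S_t = e^{-\rho t}\,\frac{\phi(X_0)}{\phi(X_t)}\,M_t,
\]

where \(M\) is a martingale (the permanent component), \(e^{-\rho}\) governs the long-horizon yield, and \(e^{-\rho t}\,\phi(X_0)/\phi(X_t)\) is the transitory component; the paper estimates \(\rho\) and \(\phi\) nonparametrically.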

11.
12.
In the setting of ‘affine’ jump‐diffusion state processes, this paper provides an analytical treatment of a class of transforms, including various Laplace and Fourier transforms as special cases, that allow an analytical treatment of a range of valuation and econometric problems. Example applications include fixed‐income pricing models, with a role for intensity‐based models of default, as well as a wide range of option‐pricing applications. An illustrative example examines the implications of stochastic volatility and jumps for option valuation. This example highlights the impact on option ‘smirks’ of the joint distribution of jumps in volatility and jumps in the underlying asset price, through both jump amplitude as well as jump timing.
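The transform in question has the exponential-affine form (notation follows the standard affine jump-diffusion setup; the Riccati equations and regularity conditions are in the paper):

\[
\psi(u;\,x,\,t,\,T) \;=\; \mathbb{E}\!\left[\exp\!\Big(-\!\int_t^T R(X_s)\,ds\Big)\, e^{u \cdot X_T} \,\middle|\, X_t = x\right] \;=\; e^{\alpha(t) + \beta(t)\cdot x},
\]

where \(\alpha\) and \(\beta\) solve ordinary differential equations of Riccati type determined by the affine drift, diffusion, jump, and discount-rate coefficients; bond and option prices then follow by evaluating or Fourier-inverting \(\psi\) in \(u\).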

13.
We extend Kyle's (1985) model of insider trading to the case where noise trading volatility follows a general stochastic process. We determine conditions under which, in equilibrium, price impact and price volatility are both stochastic, driven by shocks to uninformed volume even though the fundamental value is constant. The volatility of price volatility appears ‘excessive’ because insiders choose to trade more aggressively (and thus more information is revealed) when uninformed volume is higher and price impact is lower. This generates a positive relation between price volatility and trading volume, giving rise to an endogenous subordinate stochastic process for prices.
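For intuition, the static Kyle (1985) benchmark, a simplification of the dynamic model studied above: with fundamental value \(v \sim N(p_0, \sigma_v^2)\) and noise trade \(u \sim N(0, \sigma_u^2)\), the insider trades \(x = \beta (v - p_0)\) and the market maker sets \(p = p_0 + \lambda (x + u)\), with

\[
\lambda = \frac{\sigma_v}{2\,\sigma_u}, \qquad \beta = \frac{\sigma_u}{\sigma_v}.
\]

In the extension described above, \(\sigma_u\) follows a general stochastic process, so price impact and price volatility become stochastic even though the fundamental is constant, and both comove with uninformed volume.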

14.
The availability of high frequency financial data has generated a series of estimators based on intra‐day data, improving the quality of large areas of financial econometrics. However, estimating the standard error of these estimators is often challenging. The root of the problem is that traditionally, standard errors rely on estimating a theoretically derived asymptotic variance, and often this asymptotic variance involves substantially more complex quantities than the original parameter to be estimated. Standard errors are important: they are used to assess the precision of estimators in the form of confidence intervals, to create “feasible statistics” for testing, to build forecasting models based on, say, daily estimates, and also to optimize the tuning parameters. The contribution of this paper is to provide an alternative and general solution to this problem, which we call Observed Asymptotic Variance. It is a general nonparametric method for assessing asymptotic variance (AVAR). It provides consistent estimators of AVAR for a broad class of integrated parameters Θ = ∫ θ_t dt, where the spot parameter process θ can be a general semimartingale, with continuous and jump components. The observed AVAR is implemented with the help of a two‐scales method. Its construction works well in the presence of microstructure noise, and when the observation times are irregular or asynchronous in the multivariate case. The methodology is valid for a wide variety of estimators, including the standard ones for variance and covariance, and also for more complex estimators, such as, of leverage effects, high frequency betas, and semivariance.

15.
We consider forecasting with uncertainty about the choice of predictor variables. The researcher wants to select a model, estimate the parameters, and use the parameter estimates for forecasting. We investigate the distributional properties of a number of different schemes for model choice and parameter estimation, including: in‐sample model selection using the Akaike information criterion; out‐of‐sample model selection; and splitting the data into subsamples for model selection and parameter estimation. Using a weak‐predictor local asymptotic scheme, we provide a representation result that facilitates comparison of the distributional properties of the procedures and their associated forecast risks. This representation isolates the source of inefficiency in some of these procedures. We develop a simulation procedure that improves the accuracy of the out‐of‐sample and split‐sample methods uniformly over the local parameter space. We also examine how bootstrap aggregation (bagging) affects the local asymptotic risk of the estimators and their associated forecasts. Numerically, we find that for many values of the local parameter, the out‐of‐sample and split‐sample schemes perform poorly if implemented in the conventional way. But they perform well, if implemented in conjunction with our risk‐reduction method or bagging.

16.
Our paper provides a complete characterization of leverage and default in binomial economies with financial assets serving as collateral. Our Binomial No‐Default Theorem states that any equilibrium is equivalent (in real allocations and prices) to another equilibrium in which there is no default. Thus actual default is irrelevant, though the potential for default drives the equilibrium and limits borrowing. This result is valid with arbitrary preferences and endowments, contingent or noncontingent promises, many assets and consumption goods, production, and multiple periods. We also show that only no‐default equilibria would be selected if there were the slightest cost of using collateral or handling default. Our Binomial Leverage Theorem shows that equilibrium Loan to Value (LTV) for noncontingent debt contracts is the ratio of the worst‐case return of the asset to the riskless gross rate of interest. In binomial economies, leverage is determined by down risk and not by volatility.
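A hypothetical numerical illustration of the Binomial Leverage Theorem (numbers invented for illustration): if the collateral's worst-case (down-state) gross return is \(d = 0.80\) and the riskless gross rate is \(1 + r = 1.05\), then

\[
\mathrm{LTV} = \frac{d}{1+r} = \frac{0.80}{1.05} \approx 0.76,
\]

so the asset can be pledged for roughly 76 cents of debt per dollar of collateral value; only the down-state return enters, not the overall volatility of the asset.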

17.
This paper studies the impact of time‐varying idiosyncratic risk at the establishment level on unemployment fluctuations over 1972–2009. I build a tractable directed search model with firm dynamics and time‐varying idiosyncratic volatility. The model allows for endogenous separations, entry and exit, and job‐to‐job transitions. I show that the model can replicate salient features of the microeconomic behavior of firms and that the introduction of volatility improves the fit of the model for standard business cycle moments. In a series of counterfactual experiments, I show that time‐varying risk is important to account for the magnitude of fluctuations in aggregate unemployment for past U.S. recessions. Though the model can account for about 40% of the total increase in unemployment for the 2007–2009 recession, uncertainty alone is not sufficient to explain the magnitude and persistence of unemployment during that episode.

18.
We estimate demand for residential broadband using high‐frequency data from subscribers facing a three‐part tariff. The three‐part tariff makes data usage during the billing cycle a dynamic problem, thus generating variation in the (shadow) price of usage. We provide evidence that subscribers respond to this variation, and we use their dynamic decisions to estimate a flexible distribution of willingness to pay for different plan characteristics. Using the estimates, we simulate demand under alternative pricing and find that usage‐based pricing eliminates low‐value traffic. Furthermore, we show that the costs associated with investment in fiber‐optic networks are likely recoverable in some markets, but that there is a large gap between social and private incentives to invest.

19.
Most countries have automatic rules in their tax‐and‐transfer systems that are partly intended to stabilize economic fluctuations. This paper measures their effect on the dynamics of the business cycle. We put forward a model that merges the standard incomplete‐markets model of consumption and inequality with the new Keynesian model of nominal rigidities and business cycles, and that includes most of the main potential stabilizers in the U.S. data and the theoretical channels by which they may work. We find that the conventional argument that stabilizing disposable income will stabilize aggregate demand plays a negligible role in the dynamics of the business cycle, whereas tax‐and‐transfer programs that affect inequality and social insurance can have a larger effect on aggregate volatility. However, as currently designed, the set of stabilizers in place in the United States has had little effect on the volatility of aggregate output fluctuations or on their welfare costs despite stabilizing aggregate consumption. The stabilizers have a more important role when monetary policy is constrained by the zero lower bound, and they affect welfare significantly through the provision of social insurance.

20.
We develop an asymptotic theory for the pre‐averaging estimator when asset price jumps are weakly identified, here modeled as local to zero. The theory unifies the conventional asymptotic theory for continuous and discontinuous semimartingales as two polar cases with a continuum of local asymptotics, and explains the breakdown of the conventional procedures under weak identification. We propose simple bias‐corrected estimators for jump power variations, and construct robust confidence sets with valid asymptotic size in a uniform sense. The method is also robust to certain forms of microstructure noise.
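For reference, the standard pre-averaging construction that the asymptotic theory above builds on (weight function and window notation illustrative): with observation step \(\Delta_n\), window length \(k_n\), and a weight function \(g\) (e.g., \(g(x) = x \wedge (1 - x)\)),

\[
\bar{Y}_i^n \;=\; \sum_{j=1}^{k_n - 1} g\!\Big(\tfrac{j}{k_n}\Big)\,\big(Y_{(i+j)\Delta_n} - Y_{(i+j-1)\Delta_n}\big),
\]

so that microstructure noise is averaged out locally before jump power variations are formed from the \(\bar{Y}_i^n\); the paper's contribution is the local-to-zero (weakly identified jump) asymptotics and the bias corrections for statistics built from these quantities.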

