Similar Literature
20 similar articles retrieved.
1.
When genuine panel data samples are not available, repeated cross-sectional surveys can be used to form so-called pseudo panels. In this article, we investigate the properties of linear pseudo panel data estimators with a fixed number of cohorts and time observations. We extend the standard linear pseudo panel data setup to models with factor residuals by adapting the quasi-differencing approach developed for genuine panels. In a Monte Carlo study, we find that the proposed procedure has good finite sample properties in situations with endogeneity, cohort interactive effects, and near nonidentification. Finally, as an illustration, the proposed method is applied to data from Ecuador to study labor supply elasticity. Supplementary materials for this article are available online.
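As a rough sketch of how a pseudo panel is formed from repeated cross-sections (cohort-by-period cell means, not the authors' quasi-differencing estimator), consider the following; all variable names, cohort definitions, and data are illustrative assumptions.

```python
import numpy as np
import pandas as pd

rng = np.random.default_rng(0)

# Repeated cross-sections: each survey year samples *different* individuals.
n, years = 2000, [2010, 2011, 2012, 2013]
cs = pd.concat(
    [pd.DataFrame({
        "year": t,
        "birth_year": rng.integers(1950, 1990, n),   # used to define cohorts
        "x": rng.normal(size=n),                     # stand-ins for survey variables
        "y": rng.normal(size=n),
    }) for t in years],
    ignore_index=True,
)

# Pseudo panel: average individuals within birth-cohort x year cells.
cs["cohort"] = pd.cut(cs["birth_year"], bins=range(1950, 1995, 5))
pseudo_panel = (cs.groupby(["cohort", "year"], observed=True)[["y", "x"]]
                  .mean()
                  .reset_index())
print(pseudo_panel.head())
```

Standard panel estimators can then be applied to the cohort-level cells, keeping cell sizes in mind as a source of measurement error.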

2.
Expectile regression has become popular in recent years. It includes ordinary mean regression as a special case but is more general, as it also allows modelling of non-central parts of a distribution. Semi-parametric expectile models have recently been developed, and it is easy to perform flexible expectile estimation with modern software such as R. We extend the model class by allowing for panel observations, i.e. clustered data with repeated measurements taken on the same individual. A random (individual) effect is incorporated in the model, which accounts for the dependence structure in the data. We fit expectile sheets, meaning that a whole range of expectiles is estimated simultaneously rather than a single one. The presented model allows for multiple covariates, where a semi-parametric approach with penalized splines is pursued to fit smooth expectile curves. We apply our methods to panel data from the German Socio-Economic Panel.
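For intuition, a single linear expectile can be fitted by asymmetrically weighted least squares; the minimal sketch below is the unpenalized, non-panel version (no splines, no random effects), and all names, data, and tolerances are assumptions.

```python
import numpy as np

def expectile_regression(X, y, tau=0.5, n_iter=100, tol=1e-8):
    """Linear expectile regression by iteratively reweighted least squares."""
    beta = np.linalg.lstsq(X, y, rcond=None)[0]
    for _ in range(n_iter):
        resid = y - X @ beta
        w = np.where(resid >= 0, tau, 1.0 - tau)       # asymmetric weights
        WX = X * w[:, None]
        beta_new = np.linalg.solve(X.T @ WX, WX.T @ y)  # weighted least squares step
        if np.max(np.abs(beta_new - beta)) < tol:
            return beta_new
        beta = beta_new
    return beta

rng = np.random.default_rng(1)
x = rng.uniform(0, 1, 500)
y = 1.0 + 2.0 * x + rng.normal(scale=0.5 + x, size=500)  # heteroscedastic noise
X = np.column_stack([np.ones_like(x), x])
for tau in (0.1, 0.5, 0.9):
    print(tau, expectile_regression(X, y, tau).round(3))
```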

3.
In this paper, we suggest a similar unit root test statistic for dynamic panel data with fixed effects. The test is based on the LM, or score, principle and is derived under the assumption that the time dimension of the panel is fixed, which is typical in many panel data studies. It is shown that the limiting distribution of the test statistic is standard normal. The similarity of the test with respect to both the initial conditions of the panel and the fixed effects is achieved by allowing for a trend in the model using a parameterisation that has the same interpretation under both the null and alternative hypotheses. This parameterisation can be expected to increase the power of the test statistic. Simulation evidence suggests that the proposed test has empirical size that is very close to the nominal level and considerably more power than other panel unit root tests that assume that the time dimension of the panel is large. As an application of the test, we re-examine the stationarity of real stock prices and dividends using disaggregated panel data over a relatively short period of time. Our results suggest that while real stock prices contain a unit root, real dividends are trend stationary.
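For orientation, a generic fixed-effects dynamic panel with an individual trend, of the kind such tests are built around, can be written as follows (notation mine; the paper's specific parameterisation and LM statistic are not reproduced here):

```latex
y_{it} = \mu_i + \beta_i t + \rho\, y_{i,t-1} + \varepsilon_{it},
\qquad i = 1,\dots,N,\quad t = 1,\dots,T,
```

with the unit root null $H_0:\rho = 1$ tested against the stationary alternative $H_1:\rho < 1$, keeping $T$ fixed and letting $N \to \infty$.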

4.
This paper is the first to apply the Elastic Net, a penalization method designed for highly correlated variables, to Bayesian quantile regression for panel data. Based on the asymmetric Laplace prior distribution, the posterior distributions of all parameters are derived and a Gibbs sampler is constructed. To verify the effectiveness of the model, the Bayesian Elastic Net quantile regression for panel data (BQR.EN) is compared comprehensively, under a variety of settings, with Bayesian quantile regression (BQR), Bayesian Lasso quantile regression (BLQR), and Bayesian adaptive Lasso quantile regression (BALQR) for panel data. The results show that BQR.EN is well suited to data with high correlation, high dimensionality, and leptokurtic, heavy-tailed distributions. Further simulation comparisons under different error-term assumptions and sample sizes confirm the robustness and small-sample properties of the new method. Finally, the economic value added (EVA) of listed internet-finance companies is used as an empirical application to examine the parameter estimation and variable selection performance of the new method in a real problem, and the empirical results are in line with expectations.
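A hedged frequentist analogue of the idea, minimizing an elastic-net-penalized check (quantile) loss rather than running the Gibbs sampler described in the abstract, might look as follows; the function names, penalty values, and simulated data are all assumptions.

```python
import numpy as np
from scipy.optimize import minimize

def check_loss(u, tau):
    """Quantile (check) loss rho_tau(u)."""
    return np.where(u >= 0, tau * u, (tau - 1.0) * u)

def en_quantile_fit(X, y, tau=0.5, lam1=0.05, lam2=0.05):
    """Elastic-net-penalized quantile regression (frequentist analogue sketch)."""
    def objective(beta):
        u = y - X @ beta
        return (check_loss(u, tau).mean()
                + lam1 * np.abs(beta[1:]).sum()        # L1 part (intercept unpenalized)
                + lam2 * np.sum(beta[1:] ** 2))        # L2 part
    beta0 = np.zeros(X.shape[1])
    return minimize(objective, beta0, method="Nelder-Mead",
                    options={"maxiter": 20000, "xatol": 1e-6, "fatol": 1e-8}).x

rng = np.random.default_rng(2)
n = 300
X = np.column_stack([np.ones(n), rng.normal(size=(n, 3))])
beta_true = np.array([1.0, 2.0, 0.0, -1.5])
y = X @ beta_true + rng.standard_t(df=3, size=n)       # heavy-tailed errors
print(en_quantile_fit(X, y, tau=0.5).round(2))
```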

5.
高华川, 张晓峒. 《统计研究》2015, 32(12): 101-109
The basic role of the dynamic factor model (DFM) is to reduce the dimension of high-dimensional data, that is, to extract the co-movement information among variables from a high-dimensional data set. On the theoretical side, this paper systematically reviews the development and research frontier of DFM specification, estimation methods, and structural modelling techniques. On the applied side, it summarizes applications of the DFM in forecasting, in constructing business-cycle indicators and inflation indices, and in the analysis of economic structure. Finally, it outlines the research threads of DFM-based econometric analysis and directions for future work.
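As a minimal illustration of the dimension-reduction role of a DFM, the sketch below extracts common factors by principal components, the standard static estimator that many DFM variants build on; the data and function names are hypothetical.

```python
import numpy as np

def pca_factors(X, r):
    """Extract r common factors from a (T x N) panel by principal components."""
    Z = (X - X.mean(0)) / X.std(0)                 # standardize each series
    _, eigvec = np.linalg.eigh(Z.T @ Z / len(Z))   # eigenvalues in ascending order
    loadings = eigvec[:, ::-1][:, :r]              # leading eigenvectors
    factors = Z @ loadings                         # T x r estimated factors
    return factors, loadings

rng = np.random.default_rng(3)
T, N, r = 200, 50, 2
F = rng.normal(size=(T, r))
Lam = rng.normal(size=(N, r))
X = F @ Lam.T + 0.5 * rng.normal(size=(T, N))      # factor structure plus noise
factors, loadings = pca_factors(X, r)
print(factors.shape, loadings.shape)
```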

6.
Time series of counts occur in many different contexts, the counts usually being of certain events or objects in specified time intervals. In this paper we introduce a parameter-driven state-space model to analyse integer-valued time series data. A key property of such a model is that the observed counts are conditionally independent given the latent process, although they are marginally correlated. Our simulations show that the Monte Carlo Expectation Maximization (MCEM) algorithm and the particle method are useful for parameter estimation in the proposed model. In an application to Malaysian dengue data, our model fits better than several other models, including that of Yang et al. (2015).
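A bootstrap particle filter is one simple way to evaluate the likelihood of such a parameter-driven count model; the sketch below assumes a Poisson observation equation with a latent Gaussian AR(1) log-intensity and does not reproduce the paper's MCEM estimation (all parameter values are arbitrary).

```python
import numpy as np
from scipy.special import gammaln

rng = np.random.default_rng(4)

# Simulate a parameter-driven count series: the latent log-intensity follows a
# Gaussian AR(1), and counts are conditionally independent Poisson given it.
T, phi, sigma, beta0 = 200, 0.8, 0.3, 1.0
alpha = np.zeros(T)
alpha[0] = sigma / np.sqrt(1 - phi**2) * rng.normal()
for t in range(1, T):
    alpha[t] = phi * alpha[t - 1] + sigma * rng.normal()
y = rng.poisson(np.exp(beta0 + alpha))

def particle_loglik(y, phi, sigma, beta0, n_particles=2000):
    """Bootstrap particle filter estimate of the marginal log-likelihood."""
    a = rng.normal(scale=sigma / np.sqrt(1 - phi**2), size=n_particles)  # stationary init
    loglik = 0.0
    for t in range(len(y)):
        if t > 0:
            a = phi * a + sigma * rng.normal(size=n_particles)           # propagate
        lam = np.exp(beta0 + a)
        logw = y[t] * np.log(lam) - lam - gammaln(y[t] + 1.0)            # Poisson log-weights
        m = logw.max()
        w = np.exp(logw - m)
        loglik += m + np.log(w.mean())
        a = a[rng.choice(n_particles, size=n_particles, p=w / w.sum())]  # resample
    return loglik

print(particle_loglik(y, phi, sigma, beta0))
```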

7.
We propose a flexible generalized autoregressive conditional heteroscedasticity (GARCH)-type model for the prediction of volatility in financial time series. The approach relies on the idea of using multivariate B-splines of lagged observations and volatilities. Estimation of the B-spline basis expansion is carried out within the likelihood framework for non-Gaussian observations. As the dimension of the B-spline basis is large, i.e., there are many parameters, we use regularized and sparse model fitting with a boosting algorithm. Our method is computationally attractive and feasible for large dimensions. We demonstrate its strong predictive potential for financial volatility on simulated and real data, also in comparison with other approaches, and we present supporting asymptotic arguments.

8.
Cornwell, Schmidt, and Sickles (1990) and Kumbhakar (1990), among others, developed stochastic frontier production models which allow firm-specific inefficiency levels to change over time. These studies assumed arbitrary restrictions on the short-run dynamics of efficiency levels which have little theoretical justification. Further, the models are inappropriate for estimation of long-run efficiencies. We consider estimation of an alternative frontier model in which firm-specific technical inefficiency levels are autoregressive. This model is particularly useful for examining a potential dynamic link between technical innovations and production inefficiency levels. We apply our methodology to a panel of US airlines.
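Schematically, a frontier model with autoregressive inefficiency can be written as follows (notation mine; the distributional assumptions of the paper are not reproduced):

```latex
y_{it} = x_{it}'\beta + v_{it} - u_{it}, \qquad u_{it} \ge 0,
\qquad u_{it} = \rho\, u_{i,t-1} + \eta_{it},
```

so that long-run inefficiency is governed by the stationary level of the autoregressive process $u_{it}$ rather than by an arbitrarily restricted deterministic time path.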

9.
Estimation of long-run inefficiency levels: a dynamic frontier approach

10.
Spatio-temporal processes are often high-dimensional, exhibiting complicated variability across space and time. Traditional state-space model approaches to such processes in the presence of uncertain data have been shown to be useful. However, estimation of state-space models in this context is often problematic, since the parameter vectors and matrices are of high dimension and can have complicated dependence structures. We propose a spatio-temporal dynamic model formulation with parameter matrices restricted based on prior scientific knowledge and/or common spatial models. Estimation is carried out via the expectation–maximization (EM) algorithm or a generalized EM algorithm. Several parameterization strategies are proposed, and analytical or computational closed-form EM update equations are derived for each. We apply the methodology to a model based on an advection–diffusion partial differential equation in a simulation study and also to a dimension-reduced model for a Palmer Drought Severity Index (PDSI) data set.
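For orientation, the linear-Gaussian state-space skeleton underlying such models is (notation mine, not the paper's):

```latex
\boldsymbol{\alpha}_t = M(\boldsymbol{\theta})\,\boldsymbol{\alpha}_{t-1} + \boldsymbol{\eta}_t,
\quad \boldsymbol{\eta}_t \sim N(\mathbf{0}, Q), \qquad
\mathbf{y}_t = H\,\boldsymbol{\alpha}_t + \boldsymbol{\varepsilon}_t,
\quad \boldsymbol{\varepsilon}_t \sim N(\mathbf{0}, R),
```

with the EM algorithm alternating a Kalman smoother pass (E-step) with updates of the restricted parameter matrices $M(\boldsymbol{\theta})$, $Q$, and $R$ (M-step).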

11.
In this paper, we study maximum likelihood estimation of a model with mixed binary responses and censored observations. The model is very general and includes the Tobit model and the binary choice model as special cases. We show that, by using additional binary choice observations, our method is more efficient than the traditional Tobit model. Two iterative procedures are proposed to compute the maximum likelihood estimator (MLE) for the model, based on the EM algorithm (Dempster et al., 1977) and the Newton-Raphson method. The uniqueness of the MLE is proved. The simulation results show that the inconsistency and inefficiency can be significant when the Tobit method is applied to the present mixed model. The experimental results also suggest that the EM algorithm is much faster than the Newton-Raphson method for this model. The method also allows one to combine two data sets, a smaller data set with more detailed observations and a larger data set with less detailed binary choice observations, in order to improve the efficiency of estimation. This may entail substantial savings when one conducts surveys.
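A direct maximum-likelihood sketch of combining a censored (Tobit) sample with an additional binary sample sharing the same latent index is given below; it uses a generic numerical optimizer rather than the EM or Newton-Raphson iterations of the paper, and all names and simulated data are assumptions.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import norm

def neg_loglik(params, X_tob, y_tob, X_bin, d_bin):
    """Joint log-likelihood of a Tobit sample (y = max(0, x'b + e)) and an extra
    binary sample that only records d = 1{x'b + e > 0}, with e ~ N(0, s^2)."""
    beta, s = params[:-1], np.exp(params[-1])
    xb_t, xb_b = X_tob @ beta, X_bin @ beta
    cens = y_tob <= 0
    ll = norm.logpdf(y_tob[~cens], loc=xb_t[~cens], scale=s).sum()   # uncensored
    ll += norm.logcdf(-xb_t[cens] / s).sum()                         # censored at 0
    ll += np.where(d_bin == 1,
                   norm.logcdf(xb_b / s),
                   norm.logcdf(-xb_b / s)).sum()                     # binary sample
    return -ll

rng = np.random.default_rng(5)
n1, n2 = 400, 800
beta_true, s_true = np.array([0.5, 1.0]), 1.0
X1 = np.column_stack([np.ones(n1), rng.normal(size=n1)])
X2 = np.column_stack([np.ones(n2), rng.normal(size=n2)])
y1 = np.maximum(X1 @ beta_true + s_true * rng.normal(size=n1), 0.0)
d2 = (X2 @ beta_true + s_true * rng.normal(size=n2) > 0).astype(float)

res = minimize(neg_loglik, np.zeros(3), args=(X1, y1, X2, d2), method="BFGS")
print(res.x[:2].round(2), np.exp(res.x[-1]).round(2))
```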

12.
We study estimation and hypothesis testing in single-index panel data models with individual effects. By regressing the individual effects linearly on the covariates, we convert the estimation problem in single-index panel data models to that in partially linear single-index models. The conversion is valid regardless of whether the individual effects are random or fixed. We propose an estimating equation approach, which has a desirable double robustness property. We show that our method is applicable in single-index panel data models with heterogeneous link functions. We further design a chi-squared test to evaluate whether the individual effects are random or fixed. We conduct simulations to demonstrate the finite sample performance of the method and conduct a data analysis to illustrate its usefulness.

13.
Frequently in process monitoring, situations arise in which the order in which events occur cannot be determined, motivating the need to accommodate multiple observations occurring at the same time, or concurrent observations. The risk-adjusted Bernoulli cumulative sum (CUSUM) control chart can be used to monitor the rate of an adverse event by fitting a risk-adjustment model, followed by a likelihood ratio-based scoring method that produces a statistic that can be monitored. In this paper, we develop a risk-adjusted Bernoulli CUSUM control chart for concurrent observations. Furthermore, we adopt a novel approach that combines a mixture model with kernel density estimation to perform risk adjustment with respect to spatial location. Our proposed method allows for monitoring binary outcomes through time with multiple observations at each time point, where the chart is spatially adjusted for each Bernoulli observation's estimated probability of the adverse event. A simulation study is presented to assess the performance of the proposed monitoring scheme. We apply our method using data from Wayne County, Michigan between 2005 and 2014 to monitor the rate of foreclosure as a percentage of all housing transactions.
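For the non-spatial, one-observation-per-period special case, a risk-adjusted Bernoulli CUSUM can be scored with the usual likelihood-ratio weights, as sketched below; the odds-ratio shift and signal threshold are arbitrary choices, and the paper's mixture-model/kernel spatial adjustment and handling of concurrent observations are not reproduced.

```python
import numpy as np

def ra_bernoulli_cusum(y, p, odds_ratio=2.0, h=4.0):
    """Risk-adjusted Bernoulli CUSUM with likelihood-ratio scores.
    y: 0/1 outcomes, p: in-control event probabilities from a risk model."""
    s, path, signal = 0.0, [], None
    for t, (yt, pt) in enumerate(zip(y, p)):
        w = yt * np.log(odds_ratio) - np.log(1.0 - pt + odds_ratio * pt)  # LR score
        s = max(0.0, s + w)                                               # one-sided CUSUM
        path.append(s)
        if signal is None and s > h:
            signal = t
    return np.array(path), signal

rng = np.random.default_rng(6)
n = 500
p = rng.uniform(0.02, 0.20, n)               # fitted risk-model probabilities
odds = p / (1 - p)
odds[250:] *= 2.0                            # odds of the adverse event double after t = 250
y = rng.binomial(1, odds / (1 + odds))
path, signal = ra_bernoulli_cusum(y, p, odds_ratio=2.0, h=4.0)
print("first signal at t =", signal)
```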

14.
Time series with more than one time-dependent variable require a model in which the variables are related not only to each other but also to their own past values. Building on developments in sufficient dimension reduction, we investigate a new class of multiple time series models without parametric assumptions. First, for the dependent and independent time series, we use a univariate time series central subspace to estimate the autoregressive lags of the series. Second, we extract the successive directions to estimate the time series central subspace for the regressors, which include past lags of the dependent and independent series, in a mutual-information multiple-index time series. Lastly, we estimate a multiple time series model for the reduced directions. We propose a unified estimation method of minimal dimension using the Akaike information criterion for situations in which the dimension of the multiple regressors is unknown. We present an analysis of real data from a housing price index, showing that our approach is an alternative for multiple time series modeling. In addition, we check the accuracy of the multiple time series central subspace method using three simulated data sets.

15.
Andreas Artemiou. 《Statistics》2013, 47(5): 1037-1051
In this paper, we combine adaptively weighted large margin classifiers with Support Vector Machine (SVM)-based dimension reduction methods to create dimension reduction methods robust to the presence of extreme outliers. We discuss estimation and asymptotic properties of the algorithm. The good performance of the new algorithm is demonstrated through simulations and real data analysis.

16.
In this article, we propose an efficient and robust estimator for the semiparametric mixture model consisting of unknown location-shifted symmetric distributions. Our estimator is obtained by minimizing the profile Hellinger distance (MPHD) between the model and a nonparametric density estimate. We propose a simple and efficient algorithm to compute the MPHD estimate. A Monte Carlo simulation study is conducted to examine the finite sample performance of the proposed procedure and to compare it with existing methods. Based on our empirical studies, the newly proposed procedure performs very competitively relative to existing methods when the components are normal, and much better when they are not. More importantly, the proposed procedure is robust when the data are contaminated with outlying observations. A real data application is also provided to illustrate the proposed estimation procedure.

17.
Approximate Bayesian computation (ABC) methods permit approximate inference for intractable likelihoods when it is possible to simulate from the model. However, they perform poorly for high-dimensional data and in practice must usually be used in conjunction with dimension reduction methods, resulting in a loss of accuracy which is hard to quantify or control. We propose a new ABC method for high-dimensional data based on rare event methods which we refer to as RE-ABC. This uses a latent variable representation of the model. For a given parameter value, we estimate the probability of the rare event that the latent variables correspond to data roughly consistent with the observations. This is performed using sequential Monte Carlo and slice sampling to systematically search the space of latent variables. In contrast, standard ABC can be viewed as using a more naive Monte Carlo estimate. We use our rare event probability estimator as a likelihood estimate within the pseudo-marginal Metropolis–Hastings algorithm for parameter inference. We provide asymptotics showing that RE-ABC has a lower computational cost for high-dimensional data than standard ABC methods. We also illustrate our approach empirically, on a Gaussian distribution and an application in infectious disease modelling.
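For contrast with RE-ABC, the plain rejection-ABC baseline referred to in the abstract can be sketched in a few lines; the toy model, prior, summary statistic, and tolerance below are all assumptions.

```python
import numpy as np

def rejection_abc(y_obs, prior_sampler, simulator, summary, eps, n_draws=20000):
    """Plain rejection ABC: keep prior draws whose simulated summaries land
    within distance eps of the observed summaries."""
    s_obs = summary(y_obs)
    kept = []
    for _ in range(n_draws):
        theta = prior_sampler()
        if np.linalg.norm(summary(simulator(theta)) - s_obs) < eps:
            kept.append(theta)
    return np.array(kept)

rng = np.random.default_rng(7)
y_obs = rng.normal(loc=2.0, scale=1.0, size=100)            # "observed" data
post = rejection_abc(
    y_obs,
    prior_sampler=lambda: rng.normal(0.0, 5.0),             # N(0, 25) prior on the mean
    simulator=lambda th: rng.normal(th, 1.0, size=100),
    summary=lambda y: np.array([y.mean()]),
    eps=0.1,
)
print(len(post), post.mean().round(2))
```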

18.
Estimation of a general multi-index model comprises determining the number of linear combinations of predictors (the structural dimension) that are related to the response, estimating the loadings of each index vector, selecting the active predictors, and estimating the underlying link function. These objectives are often achieved sequentially at different stages of the estimation process. In this study, we propose a unified estimation approach under a semi-parametric model framework to attain these estimation goals simultaneously. The proposed estimation method is more efficient and stable than many existing methods, in which estimation error in the structural dimension may propagate to the index-vector estimation and variable selection stages. A detailed algorithm is provided to implement the proposed method. Comprehensive simulations and a real data analysis illustrate the effectiveness of the proposed method.

19.
We develop a hierarchical Gaussian process model for forecasting and inference of functional time series data. Unlike existing methods, our approach is especially suited for sparsely or irregularly sampled curves and for curves sampled with nonnegligible measurement error. The latent process is dynamically modeled as a functional autoregression (FAR) with Gaussian process innovations. We propose a fully nonparametric dynamic functional factor model for the dynamic innovation process, with broader applicability and improved computational efficiency over standard Gaussian process models. We prove finite-sample forecasting and interpolation optimality properties of the proposed model, which remain valid with the Gaussian assumption relaxed. An efficient Gibbs sampling algorithm is developed for estimation, inference, and forecasting, with extensions for FAR(p) models with model averaging over the lag p. Extensive simulations demonstrate substantial improvements in forecasting performance and recovery of the autoregressive surface over competing methods, especially under sparse designs. We apply the proposed methods to forecast nominal and real yield curves using daily U.S. data. Real yields are observed more sparsely than nominal yields, yet the proposed methods are highly competitive in both settings. Supplementary materials, including R code and the yield curve data, are available online.
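A stripped-down ingredient of such models is Gaussian-process interpolation of a sparsely observed curve; the sketch below uses a squared-exponential kernel with fixed hyperparameters and is not the hierarchical functional autoregression of the paper.

```python
import numpy as np

def gp_interpolate(t_obs, y_obs, t_new, length_scale=0.2, sig_f=1.0, sig_n=0.1):
    """Posterior mean and pointwise sd of a GP curve given sparse noisy observations."""
    def k(a, b):                                     # squared-exponential kernel
        return sig_f**2 * np.exp(-0.5 * (a[:, None] - b[None, :])**2 / length_scale**2)
    K = k(t_obs, t_obs) + sig_n**2 * np.eye(len(t_obs))
    Ks = k(t_new, t_obs)
    mean = Ks @ np.linalg.solve(K, y_obs)
    cov = k(t_new, t_new) - Ks @ np.linalg.solve(K, Ks.T)
    return mean, np.sqrt(np.clip(np.diag(cov), 0.0, None))

rng = np.random.default_rng(9)
t_obs = np.sort(rng.uniform(0, 1, 8))                # sparse, irregular design points
y_obs = np.sin(2 * np.pi * t_obs) + 0.1 * rng.normal(size=8)
t_new = np.linspace(0, 1, 50)
mean, sd = gp_interpolate(t_obs, y_obs, t_new)
print(mean[:5].round(2), sd[:5].round(2))
```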

20.
This article develops the adaptive elastic net generalized method of moments (GMM) estimator in large-dimensional models with potentially (locally) invalid moment conditions, where both the number of structural parameters and the number of moment conditions may increase with the sample size. The basic idea is to conduct standard GMM estimation combined with two penalty terms: the adaptively weighted lasso shrinkage and the quadratic regularization. It is a one-step procedure of valid moment condition selection, nonzero structural parameter selection (i.e., model selection), and consistent estimation of the nonzero parameters. The procedure achieves the standard GMM efficiency bound as if the valid moment conditions were known ex ante, for which the quadratic regularization is important. We also study the tuning parameter choice, with which we show that selection consistency still holds without assuming Gaussianity. We apply the new estimation procedure to dynamic panel data models, where both the time and cross-section dimensions are large. The new estimator is robust to possible serial correlations in the regression error terms.
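As a schematic of the penalized-GMM idea only (not the paper's large-dimensional procedure, moment selection, or tuning rules), one can add an adaptively weighted L1 term plus a ridge term to a linear GMM objective; all instruments, penalty values, and data below are illustrative.

```python
import numpy as np
from scipy.optimize import minimize

def en_gmm(y, X, Z, lam1=0.05, lam2=0.05):
    """Linear GMM with moments E[Z'(y - X b)] = 0 plus an elastic-net penalty
    (adaptively weighted L1 + ridge) on the structural coefficients."""
    n = len(y)
    W = np.linalg.inv(Z.T @ Z / n)                              # 2SLS-type weight matrix
    b_init = np.linalg.lstsq(Z.T @ X, Z.T @ y, rcond=None)[0]   # pilot IV estimate
    ada_w = 1.0 / np.maximum(np.abs(b_init), 1e-3)              # adaptive lasso weights
    def objective(b):
        g = Z.T @ (y - X @ b) / n                               # sample moments
        return g @ W @ g + lam1 * np.sum(ada_w * np.abs(b)) + lam2 * np.sum(b**2)
    return minimize(objective, b_init, method="Nelder-Mead",
                    options={"maxiter": 50000, "fatol": 1e-10}).x

rng = np.random.default_rng(8)
n, k = 500, 3
Z = rng.normal(size=(n, k + 1))                                 # over-identifying instruments
X = Z[:, :k] + 0.3 * rng.normal(size=(n, k))                    # toy regressors (illustration only)
b_true = np.array([1.0, 0.0, -0.5])
y = X @ b_true + rng.normal(size=n)
print(en_gmm(y, X, Z).round(2))
```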
