Similar articles (20 results)
1.
Generalized method of moments (GMM) estimation has become an important unifying framework for inference in econometrics in the last 20 years. It can be thought of as encompassing almost all of the common estimation methods, such as maximum likelihood, ordinary least squares, instrumental variables, and two-stage least squares, and nowadays is an important part of all advanced econometrics textbooks. The GMM approach links nicely to economic theory where orthogonality conditions that can serve as such moment functions often arise from optimizing behavior of agents. Much work has been done on these methods since the seminal article by Hansen, and much remains in progress. This article discusses some of the developments since Hansen's original work. In particular, it focuses on some of the recent work on empirical likelihood–type estimators, which circumvent the need for a first step in which the optimal weight matrix is estimated and have attractive information theoretic interpretations.
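For context, the classical two-step GMM procedure that empirical-likelihood-type estimators avoid can be sketched as follows; the linear IV data-generating process, instruments, and all parameter values below are hypothetical illustrations, not taken from the article:

```python
import numpy as np

rng = np.random.default_rng(0)

# Linear IV model y = x*beta + u with x endogenous and two instruments;
# all values here are hypothetical (beta_true = 0.5).
n, beta_true = 2000, 0.5
z = rng.normal(size=(n, 2))
u = rng.normal(size=n)
x = z @ np.array([1.0, 0.7]) + 0.8 * u + rng.normal(size=n)
y = x * beta_true + u

def gmm_moments(beta):
    # Moment functions g_i(beta) = z_i * (y_i - x_i*beta); E[g] = 0 at the truth.
    return z * (y - x * beta)[:, None]

def gmm_estimate(W):
    # For a linear model the GMM criterion gbar' W gbar has a closed-form minimizer.
    Zx, Zy = z.T @ x / n, z.T @ y / n
    return (Zx @ W @ Zy) / (Zx @ W @ Zx)

b1 = gmm_estimate(np.eye(2))          # step 1: arbitrary (identity) weight matrix
S = np.cov(gmm_moments(b1).T)         # covariance of moments at the first-step estimate
b2 = gmm_estimate(np.linalg.inv(S))   # step 2: estimated optimal weight matrix
```

The second step is exactly the estimation of the optimal weight matrix that empirical-likelihood-type methods sidestep.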

2.
Generalized method of moments (GMM) has been an important innovation in econometrics. Its usefulness has motivated a search for good inference procedures based on GMM. This article presents a novel method of bootstrapping for GMM based on resampling from the empirical likelihood distribution that imposes the moment restrictions. We show that this approach yields a large-sample improvement and is efficient, and give examples. We also discuss the development of GMM and other recent work on improved inference.
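A minimal sketch of resampling from an empirical likelihood distribution that imposes a moment restriction; the scalar moment, the data, and the value `mu0` standing in for a GMM estimate are all hypothetical illustrations:

```python
import numpy as np
from scipy.optimize import brentq

rng = np.random.default_rng(1)
x = rng.exponential(scale=1.0, size=300)  # hypothetical sample data

# Moment restriction to impose on the resampling distribution:
# E[g(X)] = 0 with g(X) = X - mu0, where mu0 stands in for a GMM estimate.
mu0 = 1.05
g = x - mu0

# Empirical likelihood weights p_i = 1 / (n * (1 + lam * g_i)), with lam
# solving sum_i g_i / (1 + lam * g_i) = 0 on the interval where all
# weights stay positive.
def score(lam):
    return np.sum(g / (1.0 + lam * g))

lam = brentq(score, -1.0 / g.max() + 1e-6, -1.0 / g.min() - 1e-6)
p = 1.0 / (len(x) * (1.0 + lam * g))

# The EL distribution satisfies the restriction exactly: weighted mean = mu0.
assert abs(np.sum(p * x) - mu0) < 1e-6

# Bootstrap by resampling with the EL weights instead of uniform weights.
boot_means = [rng.choice(x, size=len(x), p=p).mean() for _ in range(500)]
```

Resampling with probabilities `p` rather than uniform weights is what centers the bootstrap distribution on the imposed moment restriction.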

3.
In his 1999 article with Breusch, Qian, and Wyhowski in the Journal of Econometrics, Peter Schmidt introduced the concept of “redundant” moment conditions. Such conditions arise when estimation is based on moment conditions that are valid and can be divided into two subsets: one that identifies the parameters and another that provides no further information. Their framework highlights an important concept in the moment-based estimation literature, namely, that not all valid moment conditions need be informative about the parameters of interest. In this article, we demonstrate the empirical relevance of the concept in the context of the impact of government health expenditure on health outcomes in England. Using a simulation study calibrated to these data, we perform a comparative study of the finite-sample performance of inference procedures based on generalized method of moments (GMM) and info-metric (IM) estimators. The results indicate that the properties of GMM procedures deteriorate as the number of redundant moment conditions increases; in contrast, the IM methods provide reliable point estimators, but the performance of the associated inference techniques based on first-order asymptotic theory, such as confidence intervals and overidentifying restrictions tests, deteriorates as the number of redundant moment conditions increases. However, for IM methods, it is shown that bootstrap procedures can provide reliable inferences; we illustrate such methods when analysing the impact of government health expenditure on health outcomes in England.

4.
This article develops the adaptive elastic net generalized method of moments (GMM) estimator in large-dimensional models with potentially (locally) invalid moment conditions, where both the number of structural parameters and the number of moment conditions may increase with the sample size. The basic idea is to conduct the standard GMM estimation combined with two penalty terms: the adaptively weighted lasso shrinkage and the quadratic regularization. It is a one-step procedure of valid moment condition selection, nonzero structural parameter selection (i.e., model selection), and consistent estimation of the nonzero parameters. The procedure achieves the standard GMM efficiency bound as if we know the valid moment conditions ex ante, for which the quadratic regularization is important. We also study the tuning parameter choice, with which we show that selection consistency still holds without assuming Gaussianity. We apply the new estimation procedure to dynamic panel data models, where both the time and cross-section dimensions are large. The new estimator is robust to possible serial correlations in the regression error terms.

5.
This paper introduces a multiscale Gaussian convolution model of Gaussian mixture (MGC-GMM) via the convolution of the GMM and a multiscale Gaussian window function. It is found that the MGC-GMM is still a Gaussian mixture model, and its parameters can be mapped back to the parameters of the GMM. Meanwhile, the multiscale probability density function (MPDF) of the MGC-GMM can be viewed as the mathematical expectation of a random process induced by the Gaussian window function and the GMM, which can be directly estimated from sample data. Based on the estimated MPDF, a novel algorithm, denoted MGC, is proposed for model selection and parameter estimation of the GMM, where the component number and the means of the GMM are determined by the number and the locations of the maximum points of the MPDF, respectively, and numerical algorithms for the weight and variance parameters of the GMM are derived. The MGC is suitable for GMMs with diagonal covariance matrices. An MGC-EM algorithm is also presented for the generalized GMM, where the GMM is estimated using the EM algorithm with the estimates from the MGC taken as initial parameters. The proposed algorithms are tested on a series of sample sets simulated from given GMM models, and the results show that the proposed algorithms can effectively estimate the GMM.
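The core idea, smoothing the empirical distribution with a Gaussian window and reading the component number and means off the maxima of the resulting density, can be sketched with a kernel density estimate (the sample analogue of the Gaussian convolution). The mixture, bandwidth, and peak threshold below are illustrative assumptions, not the article's algorithm:

```python
import numpy as np
from scipy.stats import gaussian_kde
from scipy.signal import find_peaks

rng = np.random.default_rng(2)
# Sample from a two-component univariate Gaussian mixture (hypothetical truth).
x = np.concatenate([rng.normal(-2.0, 0.5, 400), rng.normal(3.0, 0.8, 600)])

# A Gaussian kernel density estimate is the sample analogue of convolving the
# empirical distribution with a Gaussian window at the chosen scale.
grid = np.linspace(x.min() - 1.0, x.max() + 1.0, 2000)
density = gaussian_kde(x, bw_method=0.2)(grid)

# The number and locations of the maxima of the smoothed density give the
# component count and mean estimates, which can then initialize an EM fit.
peaks, _ = find_peaks(density, prominence=0.01 * density.max())
means = grid[peaks]
```

With well-separated components, the two maxima land near the true means and supply the component count directly, which is the role the MGC estimates play as EM initial values.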

6.
The likelihood function of a general nonlinear, non-Gaussian state space model is a high-dimensional integral with no closed-form solution. In this article, I show how to calculate the likelihood function exactly for a large class of non-Gaussian state space models that include stochastic intensity, stochastic volatility, and stochastic duration models among others. The state variables in this class follow a nonnegative stochastic process that is popular in econometrics for modeling volatility and intensities. In addition to calculating the likelihood, I also show how to perform filtering and smoothing to estimate the latent variables in the model. The procedures in this article can be used for either Bayesian or frequentist estimation of the model’s unknown parameters as well as the latent state variables. Supplementary materials for this article are available online.

7.
This article introduces a robust indirect inference technique for slightly contaminated stochastic logistic population models. Based on discretely sampled data with a fixed time interval between consecutive observations, we not only construct the robust indirect inference generalized method of moments (GMM) estimator for the model parameters, but also propose a likelihood-ratio-type indirect statistic and a robust indirect GMM saddle-point statistic for testing the parameters of interest. In addition, we develop the robust exponential tilting estimator and the robust exponential tilting test to improve their small-sample performance. Finally, the finite-sample properties of these procedures are studied through Monte Carlo experiments.

8.
In economics and biological gene expression studies, where a large number of variables may be involved, the maximum sample correlation among predictors can be large when the dimension is high, even if the predictors are independent. Variable selection is a fundamental tool for such models. Ridge regression performs well when the predictors are highly correlated, and some nonconcave penalized thresholding estimators enjoy the nice oracle property. To provide a satisfactory solution to the collinearity problem, in this paper we propose the combined penalization (CP), which mixes a nonconcave penalty with a ridge penalty, allowing a diverging number of parameters. We show that the CP estimator with a diverging number of parameters can correctly select covariates with nonzero coefficients and estimate parameters simultaneously in the presence of multicollinearity. Simulation studies and a real data example demonstrate the good performance of the proposed method.

9.
This paper proposes a unit root test for short panels with serially correlated errors. The proposed test is based on the instrumental variables (IVs) and the generalized method of moments (GMM) estimators. An advantage of the new test over other tests is that it allows for an ARMA-type serial correlation. A Monte Carlo simulation shows that the new test has good finite sample properties. Several methods to estimate the lag orders of the ARMA structure are briefly discussed.

10.
This article considers the first-order autoregressive panel model, a simple dynamic panel data (DPD) model. The generalized method of moments (GMM) yields efficient estimators for these models, but this efficiency depends on the choice of the weighting matrix used in GMM estimation. Conventional GMM estimators use non-optimal weighting matrices, which leads to a loss of efficiency. We therefore present new GMM estimators based on optimal or suboptimal weighting matrices. A Monte Carlo study indicates that the new estimators are more reliable than the conventional ones in terms of bias and efficiency.

11.
In this article we examine the small-sample properties of generalized method of moments (GMM) estimation using Monte Carlo simulations. We assume that the generated time series describe the stochastic variance rate of a stock index, and we use a mean-reverting square-root process to simulate the dynamics of this instantaneous variance rate. The time series obtained are used to estimate the parameters of the assumed variance rate process by applying GMM. Our results are described and compared to estimates from empirical data, which consist of volatility as well as daily volume data for the German stock market. One of our main findings is that estimates of the mean-reversion parameter that are not significantly different from zero do not necessarily imply a rejection of the hypothesis of mean-reverting behavior in the underlying stochastic process.
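A hedged sketch of this simulation-and-estimation setup: simulate a mean-reverting square-root variance process and recover the mean-reversion parameters from the moment conditions given by the exact conditional mean (a just-identified moment estimator that reduces to an OLS regression). All parameter values are hypothetical:

```python
import numpy as np

rng = np.random.default_rng(3)

# Euler simulation of a mean-reverting square-root variance process
# dv = kappa*(theta - v) dt + sigma*sqrt(v) dW  (hypothetical parameters).
kappa, theta, sigma = 2.0, 0.04, 0.2
dt, n = 1.0 / 252, 50000
shocks = rng.normal(size=n - 1) * np.sqrt(dt)
v = np.empty(n)
v[0] = theta
for t in range(n - 1):
    drift = kappa * (theta - v[t]) * dt
    diffusion = sigma * np.sqrt(v[t]) * shocks[t]
    v[t + 1] = max(v[t] + drift + diffusion, 1e-10)

# The exact conditional mean E[v_{t+1} | v_t] = theta + exp(-kappa*dt)*(v_t - theta)
# supplies just-identified moment conditions; the estimator reduces to OLS of
# v_{t+1} on v_t.
slope, intercept = np.polyfit(v[:-1], v[1:], 1)
kappa_hat = -np.log(slope) / dt
theta_hat = intercept / (1.0 - slope)
```

Even with long simulated samples, `kappa_hat` is far noisier than `theta_hat`, which is consistent with the article's point that insignificant mean-reversion estimates need not reject mean-reverting behavior.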

12.
We develop a general approach to estimation and inference for income distributions using grouped or aggregate data that are typically available in the form of population shares and class mean incomes, with unknown group bounds. We derive generic moment conditions and an optimal weight matrix that can be used for generalized method-of-moments (GMM) estimation of any parametric income distribution. Our derivation of the weight matrix and its inverse allows us to express the seemingly complex GMM objective function in a relatively simple form that facilitates estimation. We show that our proposed approach, which incorporates information on class means as well as population proportions, is more efficient than maximum likelihood estimation of the multinomial distribution, which uses only population proportions. In contrast to the earlier work of Chotikapanich, Griffiths, and Rao, and Chotikapanich, Griffiths, Rao, and Valencia, which did not specify a formal GMM framework, did not provide methodology for obtaining standard errors, and restricted the analysis to the beta-2 distribution, we provide standard errors for estimated parameters and relevant functions of them, such as inequality and poverty measures, and we provide methodology for all distributions. A test statistic for testing the adequacy of a distribution is proposed. Using eight countries/regions for the year 2005, we show how the methodology can be applied to estimate the parameters of the generalized beta distribution of the second kind (GB2), and its special-case distributions, the beta-2, Singh–Maddala, Dagum, generalized gamma, and lognormal distributions. We test the adequacy of each distribution and compare predicted and actual income shares, where the number of groups used for prediction can differ from the number used in estimation. Estimates and standard errors for inequality and poverty measures are provided. Supplementary materials for this article are available online.

13.
The single index model is a useful regression model. In this paper, we propose a nonconcave penalized least squares method to estimate both the parameters and the link function of the single index model. Compared to other variable selection and estimation methods, the proposed method can estimate parameters and select variables simultaneously. When the dimension of parameters in the single index model is a fixed constant, under some regularity conditions, we demonstrate that the proposed estimators for parameters have the so-called oracle property, and furthermore we establish the asymptotic normality and develop a sandwich formula to estimate the standard deviations of the proposed estimators. Simulation studies and a real data analysis are presented to illustrate the proposed methods.

14.
Models incorporating “latent” variables have been commonplace in the financial, social, and behavioral sciences. The factor model, the most popular latent variable model, explains continuous observed variables through a smaller set of latent variables (factors) via a linear relationship. However, complex data often simultaneously display asymmetric dependence, asymptotic dependence, and positive (negative) dependence between random variables, features that linearity, Gaussian distributions, and many other extant distributions cannot capture. This article proposes a nonlinear factor model that can accommodate these dependence features while retaining a simple factor structure. The random variables, marginally distributed as unit Fréchet distributions, are decomposed into max-linear functions of underlying Fréchet idiosyncratic risks, transformed from a Gaussian copula, and independent shared external Fréchet risks. By allowing the random variables to share underlying (latent) pervasive risks with random impact parameters, various dependence structures are created. This provides a promising new technique for generating families of distributions with simple interpretations. We investigate the multivariate extreme-value properties of the proposed model and study maximum composite likelihood methods for the impact parameters of the latent risks. The estimates are shown to be consistent. The estimation schemes are illustrated on several sets of simulated data, where comparisons of performance are addressed. We employ a bootstrap method to obtain standard errors in real data analysis. A real application to financial data reveals inherent dependencies that previous work has not disclosed and demonstrates the model’s interpretability on real data. Supplementary materials for this article are available online.

15.
In this article we present a robust and efficient variable selection procedure by using modal regression for varying-coefficient models with longitudinal data. The new method is proposed based on basis function approximations and a group version of the adaptive LASSO penalty, which can select significant variables and estimate the non-zero smooth coefficient functions simultaneously. Under suitable conditions, we establish the consistency in variable selection and the oracle property in estimation. A simulation study and two real data examples are undertaken to assess the finite sample performance of the proposed variable selection procedure.

16.
Sliced regression is an effective dimension reduction method that replaces the original high-dimensional predictors with an appropriate low-dimensional projection. It is free from any probabilistic assumption and can exhaustively estimate the central subspace. In this article, we propose to incorporate shrinkage estimation into sliced regression so that variable selection can be achieved simultaneously with dimension reduction. The new method can improve estimation accuracy and achieve better interpretability for the reduced variables. The efficacy of the proposed method is shown through both simulation and real data analysis.

17.
In this article, partially linear covariate-adjusted regression models are considered, and a penalized least-squares procedure is proposed to simultaneously select variables and estimate the parametric components. The rate of convergence and the asymptotic normality of the resulting estimators are established under some regularity conditions. With proper choices of the penalty functions and tuning parameters, it is shown that the proposed procedure can be as efficient as the oracle estimators. Monte Carlo simulation studies and a real data application are carried out to assess the finite-sample performance of the proposed method.

18.
In this article, a new efficient iteration procedure based on quantile regression is developed for single-index varying-coefficient models. The proposed estimation scheme extends the full iteration procedure proposed by Carroll et al. and differs from the method adopted by Wu et al. for single-index models, in which a double-weighted summation is used. This distinction not only explains why undersmoothing is a necessary condition in our proposed procedure, but also may reduce the computational burden, especially for large sample sizes. The resulting estimators are shown to be robust to outliers as well as to varying errors. Moreover, to achieve sparsity when irrelevant variables are present among the index parameters, a variable selection procedure based on the adaptive LASSO penalty is developed to simultaneously select and estimate the significant parameters. Theoretical properties of the resulting estimators are established under some regularity conditions, and simulation studies with variously distributed errors are conducted to assess the finite-sample performance of the proposed method.

19.
We study partially linear models in which the linear covariates are endogenous and give rise to an over-identification problem. We propose combining the profile principle with local linear approximation and the generalized method of moments (GMM) to estimate the parameters of interest. We show that the profiled GMM estimators are root-n consistent and asymptotically normally distributed. By appropriately choosing the weight matrix, the estimators can attain the efficiency bound. We further consider variable selection, using the moment restrictions imposed on the endogenous variables, when the dimension of the covariates may diverge with the sample size, and propose a penalized GMM procedure, which is shown to have the sparsity property. We establish asymptotic normality of the resulting estimators of the nonzero parameters. Simulation studies are presented to assess the finite-sample performance of the proposed procedure.

20.
In many of the applied sciences, it is common that the forms of empirical relationships are almost completely unknown prior to study. Scatterplot smoothers used in nonparametric regression methods have considerable potential to ease the burden of model specification that a researcher would otherwise face in this situation. Occasionally the researcher will know the sign of the first or second derivatives, or both. This article develops a smoothing method that can incorporate this kind of information. I show that cubic regression splines with bounds on the coefficients offer a simple and effective approximation to monotonic, convex or concave transformations. I also discuss methods for testing whether the constraints should be imposed. Monte Carlo results indicate that this method, dubbed CoSmo, has a lower approximation error than either locally weighted regression or two other constrained smoothing methods. CoSmo has many potential applications and should be especially useful in applied econometrics. As an illustration, I apply CoSmo in a multivariate context to estimate a hedonic price function and to test for concavity in one of the variables.
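The idea of approximating a monotonic transformation by a regression spline with bounds on its coefficients can be sketched with cubic B-splines: a B-spline with nondecreasing coefficients is itself nondecreasing, so reparameterizing the coefficients as a free level plus nonnegative increments turns monotone smoothing into a bound-constrained least-squares problem. This is an illustrative construction in the spirit of the abstract, not the CoSmo implementation; the data and knot placement are hypothetical:

```python
import numpy as np
from scipy.interpolate import BSpline
from scipy.optimize import lsq_linear

rng = np.random.default_rng(4)
# Noisy observations of a monotone (concave) relationship; hypothetical data.
x = np.sort(rng.uniform(0.0, 1.0, 200))
y = np.sqrt(x) + rng.normal(0.0, 0.05, x.size)

# Cubic B-spline basis with a few interior knots on [0, 1].
k = 3
knots = np.concatenate([np.zeros(4), np.linspace(0.2, 0.8, 4), np.ones(4)])
ncoef = len(knots) - k - 1
design = np.column_stack(
    [BSpline(knots, np.eye(ncoef)[j], k)(x) for j in range(ncoef)]
)

# A B-spline with nondecreasing coefficients is nondecreasing, so write the
# coefficients as a free level plus nonnegative increments (c = L @ [c0, d])
# and bound the increments below by zero.
L = np.tril(np.ones((ncoef, ncoef)))
lower = np.r_[-np.inf, np.zeros(ncoef - 1)]
res = lsq_linear(design @ L, y, bounds=(lower, np.inf))
coef = L @ res.x
fit = design @ coef  # monotone (nondecreasing) smooth of y on x
```

Convexity or concavity constraints can be imposed analogously by bounding second differences of the coefficients.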
