Similar Articles (20 results)
1.
This article considers a simple test for the correct specification of linear spatial autoregressive models, assuming that the weight matrix Wn is correctly chosen. We derive the limiting distributions of the test under the null hypothesis of correct specification and under a sequence of local alternatives. We show that the test is asymptotically free of nuisance parameters under the null and prove its consistency. To improve the finite sample performance of the test, we also propose a residual-based wild bootstrap and justify its asymptotic validity. We conduct a small set of Monte Carlo simulations to investigate the finite sample properties of our tests. Finally, we apply the test to two empirical datasets: the vote cast and the economic growth rate. We reject the linear spatial autoregressive model in the vote cast example but fail to reject it in the economic growth rate example. Supplementary materials for this article are available online.
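The residual-based wild bootstrap mentioned in this abstract can be sketched in a generic setting. The illustration below applies Rademacher-weighted residual resampling to a plain linear regression rather than to the authors' spatial autoregressive model; the data-generating process, sample size, and coefficients are invented for the example:

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative linear model y = X b + e (not the paper's spatial model)
n = 200
X = np.column_stack([np.ones(n), rng.normal(size=n)])
beta = np.array([1.0, 2.0])
y = X @ beta + rng.normal(size=n)

# OLS fit and residuals
b_hat, *_ = np.linalg.lstsq(X, y, rcond=None)
resid = y - X @ b_hat

def wild_bootstrap_betas(X, b_hat, resid, B=500, rng=rng):
    """Wild bootstrap: perturb each residual with an independent
    Rademacher weight, rebuild the response, refit, and collect
    the bootstrap coefficient draws."""
    n, k = X.shape
    draws = np.empty((B, k))
    for b in range(B):
        w = rng.choice([-1.0, 1.0], size=n)   # Rademacher multipliers
        y_star = X @ b_hat + resid * w
        draws[b], *_ = np.linalg.lstsq(X, y_star, rcond=None)
    return draws

draws = wild_bootstrap_betas(X, b_hat, resid)
se_boot = draws.std(axis=0)   # bootstrap standard errors
```

The Rademacher weights preserve each residual's magnitude while randomizing its sign, which keeps the scheme valid under heteroskedasticity.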

2.
The traditional tests for rationality, the regression and volatility tests, have often rejected the hypothesis of rationality for survey data on expectations. It has been argued that these tests are not valid in the presence of unit roots and hence cointegration tests should be applied. The cointegration tests have often failed to reject the hypothesis of rationality. The present article argues that errors in variables affect tests of rationality. We use multiple sources of expectations to correct for the errors-in-variables bias but find that the hypothesis of rationality is rejected even after this correction. The article uses survey data on interest rates, stock prices, and exchange rates.

3.
《Econometric Reviews》2013,32(3):269-287

In many applications, a researcher must select an instrument vector from a candidate set of instruments. If the ultimate objective is to perform inference about the unknown parameters using conventional asymptotic theory, then we argue that it is desirable for the chosen instrument vector to satisfy four conditions which we refer to as orthogonality, identification, efficiency, and non-redundancy. It is impossible to verify a priori which elements of the candidate set satisfy these conditions; this can only be done using the data. However, once the data are used in this fashion, it is important that the selection process does not contaminate the limiting distribution of the parameter estimator. We refer to this requirement as the inference condition. In a recent paper, Andrews (1999, Econometrica 67:543–564) has proposed a method of moment selection based on an information criterion involving the overidentifying restrictions test. This method can be shown to select an instrument vector which satisfies the orthogonality condition with probability one in the limit. In this paper, we consider the problem of instrument selection based on a combination of the efficiency and non-redundancy conditions, which we refer to as the relevance condition. It is shown that, within a particular class of models, certain canonical correlations form the natural metric for relevancy, and this leads us to propose a canonical correlations information criterion (CCIC) for instrument selection. We establish conditions under which our method satisfies the inference condition. We also consider the properties of an instrument selection method based on the sequential application of Andrews's (1999) method and CCIC.

4.
Summary. The administrators of an office automation training programme in Italy enrolled applicants on the basis of their score in an attitudinal test, with low scoring subjects mandated out of the programme. Some of the applicants who were mandated out resorted to an alternative programme. To identify the effect of the programme by comparing participants with non-participants, we need to account properly both for the selection by the score and for the contamination of the comparison group by non-complying subjects. The estimand obtained by using the mandated status as an instrumental variable for the actual status identifies the effect of the programme on complying subjects whose attitudinal test score lies in the neighbourhood of the selection threshold. Simple nonparametric instrumental variable estimators based on the work of Robinson and of Hahn and co-workers reveal that the programme had no effect on the probability of being in work several months after its completion. Simulation results show that, despite the small sample size, the test of the no-impact hypothesis has non-negligible power even at small departures from the null hypothesis. As a side result, Robinson's test turns out to be appreciably more powerful than the other test.

5.
Tests for the cointegrating rank of a vector autoregressive process are considered that allow for possible exogenous shifts in the mean of the data-generation process. The break points are assumed to be known a priori. It is proposed to estimate and remove the deterministic terms such as mean, linear-trend term, and a shift in a first step. Then systems cointegration tests are applied to the adjusted series. The resulting tests are shown to have known limiting null distributions that are free of nuisance parameters and do not depend on the break point. The tests are applied for analyzing the number of cointegrating relations in two German money-demand systems.

6.
Recently, many standard families of distributions have been generalized by exponentiating their cumulative distribution function (CDF). In this paper, test statistics are constructed based on CDF-transformed observations and the corresponding moments of arbitrary positive order. Simulation results for generalized exponential distributions show that the proposed test compares well with standard methods based on the empirical distribution function.
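The idea of testing via moments of CDF-transformed observations can be sketched as follows. This is a generic illustration with a known-parameter exponential null, not the authors' exact statistic: if the hypothesized CDF F is correct, u = F(x) is Uniform(0,1), so its k-th moment is 1/(k+1) with a known variance, and a standardized sample moment is asymptotically standard normal.

```python
import numpy as np

def cdf_moment_stat(u, k=1):
    """Standardized k-th moment of CDF-transformed observations u.
    Under a correctly specified CDF with known parameters, u is
    Uniform(0,1), so E[u^k] = 1/(k+1) and
    Var(u^k) = 1/(2k+1) - 1/(k+1)^2; the statistic is then
    asymptotically N(0, 1)."""
    u = np.asarray(u)
    m = 1.0 / (k + 1)
    v = 1.0 / (2 * k + 1) - m**2
    return np.sqrt(len(u)) * ((u**k).mean() - m) / np.sqrt(v)

rng = np.random.default_rng(0)
x = rng.exponential(scale=2.0, size=1000)   # invented sample

# Transform through the hypothesized exponential CDF (scale treated as
# known here; estimating it would change the null distribution)
u = 1.0 - np.exp(-x / 2.0)
stat = cdf_moment_stat(u, k=2)
```

Large absolute values of the statistic indicate departure from the hypothesized distribution.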

7.
We investigate the small-sample properties of three alternative generalized method of moments (GMM) estimators of asset-pricing models. The estimators that we consider include ones in which the weighting matrix is iterated to convergence and ones in which the weighting matrix is changed with each choice of the parameters. Particular attention is devoted to assessing the performance of the asymptotic theory for making inferences based directly on the deterioration of GMM criterion functions.

8.
Dagenais (Econ Lett 63:19–21, 1999) and Lucchetti (Econ Lett 75:179–185, 2002) demonstrated that the naive GMM estimator of Grogger (Econ Lett 33:329–332, 1990) for the probit model with an endogenous regressor is not consistent. This paper completes their discussion by explaining the reason for the inconsistency and presenting a natural solution. Furthermore, the resulting GMM estimator is analyzed in a Monte Carlo simulation and compared with alternative estimators.

9.
Revisions of the early GNP estimates may contain elements of measurement errors as well as forecast errors. These types of error behave differently but need to satisfy a common set of criteria for well-behavedness. This article tests these criteria for U.S. GNP revisions. The tests are similar to tests of rationality and are based on the generalized method of moments estimator. The flash, 15-day, and 45-day estimates are found to be ill behaved, but the 75-day estimate satisfies the criteria for well-behavedness.

10.
Generalized method of moments (GMM) estimation has become an important unifying framework for inference in econometrics in the last 20 years. It can be thought of as encompassing almost all of the common estimation methods, such as maximum likelihood, ordinary least squares, instrumental variables, and two-stage least squares, and nowadays is an important part of all advanced econometrics textbooks. The GMM approach links nicely to economic theory where orthogonality conditions that can serve as such moment functions often arise from optimizing behavior of agents. Much work has been done on these methods since the seminal article by Hansen, and much remains in progress. This article discusses some of the developments since Hansen's original work. In particular, it focuses on some of the recent work on empirical likelihood–type estimators, which circumvent the need for a first step in which the optimal weight matrix is estimated and have attractive information theoretic interpretations.
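As a concrete illustration of the GMM framework this abstract surveys, here is a minimal two-step GMM sketch for a linear model with one endogenous regressor and two instruments. The data-generating process, instrument strengths, and true coefficient are all invented for the example:

```python
import numpy as np

rng = np.random.default_rng(1)

# Invented IV setup: x is endogenous (correlated with the error u),
# z holds two valid instruments; the true coefficient is 2.0
n = 2000
z = rng.normal(size=(n, 2))
u = rng.normal(size=n)
x = z @ np.array([1.0, 0.5]) + 0.5 * u + rng.normal(size=n)
y = 2.0 * x + u

def gmm_iv(y, x, z):
    """Two-step GMM for a scalar-coefficient linear model with
    moment conditions E[z (y - x*b)] = 0."""
    X = x[:, None]
    n = len(y)
    # Step 1: 2SLS-style weighting with W = (Z'Z/n)^(-1)
    W = np.linalg.inv(z.T @ z / n)
    b1 = np.linalg.solve(X.T @ z @ W @ z.T @ X,
                         X.T @ z @ W @ (z.T @ y))
    # Step 2: optimal weight from the first-step residuals
    e = y - x * b1[0]
    g = z * e[:, None]                  # moment contributions
    W2 = np.linalg.inv(g.T @ g / n)
    b2 = np.linalg.solve(X.T @ z @ W2 @ z.T @ X,
                         X.T @ z @ W2 @ (z.T @ y))
    return b2[0]

b_gmm = gmm_iv(y, x, z)
```

With two instruments for one parameter the model is overidentified, which is exactly the setting where the second-step optimal weighting matters.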

11.
The fitting of Lévy processes is an important field of interest in both option pricing and risk management. The literature contains a large number of fitting methods that require adequate initial values at the start of the optimization procedure. A so-called simplified method of moments (SMoM) generates these initial values for the Variance Gamma process by assuming a symmetric distribution, and the idea behind it transfers easily to the Normal Inverse Gaussian process. The characteristics of the Generalized Hyperbolic process, however, prevent such an easy adaptation. We therefore provide an SMoM for the Generalized Hyperbolic distribution by applying a Taylor series approximation for the modified Bessel function of the third kind, a Tschirnhaus transformation, and a symmetric distribution assumption. Our simulation study compares the results of our SMoM with those of maximum likelihood estimation. The results show that our proposed approach is an appropriate and useful way to estimate Generalized Hyperbolic process parameters and significantly reduces estimation time.

12.
Panel data with covariate measurement error appear frequently in various studies. Due to the sampling design and/or missing data, panel data are often unbalanced in the sense that panels have different sizes. For balanced panel data (i.e., panels having the same size), there exists a generalized method of moments (GMM) approach for adjusting covariate measurement error, which does not require additional validation data. This paper extends the GMM approach of adjusting covariate measurement error to unbalanced panel data. Two health related longitudinal surveys are used to illustrate the implementation of the proposed method.

13.
We evaluate alternative models of variances and correlations with an economic loss function. We construct portfolios to minimize predicted variance subject to a required return. It is shown that the realized volatility is smallest for the correctly specified covariance matrix for any vector of expected returns. A test of the relative performance of two covariance matrices is based on the work of Diebold and Mariano. The method is applied to stocks and bonds and then to highly correlated assets. On average, dynamically correct correlations are worth around 60 basis points in annualized terms, but on some days they may be worth hundreds.
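The minimum-variance-subject-to-required-return construction described here has a standard closed form via Lagrange multipliers: with A = 1'Σ⁻¹1, B = 1'Σ⁻¹μ, and C = μ'Σ⁻¹μ, the optimal weights are w = λΣ⁻¹μ + γΣ⁻¹1, where λ and γ solve the 2×2 system imposed by the two constraints. The sketch below uses invented expected returns and covariances:

```python
import numpy as np

def min_var_portfolio(mu, Sigma, r):
    """Weights minimizing w' Sigma w subject to w'mu = r and sum(w) = 1,
    via the classic Lagrangian solution."""
    ones = np.ones_like(mu)
    Si_mu = np.linalg.solve(Sigma, mu)    # Sigma^(-1) mu
    Si_1 = np.linalg.solve(Sigma, ones)   # Sigma^(-1) 1
    A = ones @ Si_1
    B = ones @ Si_mu
    C = mu @ Si_mu
    # The constraints w'mu = r and w'1 = 1 give a 2x2 linear system
    # in the multipliers (lam, gam)
    lam, gam = np.linalg.solve(np.array([[C, B], [B, A]]),
                               np.array([r, 1.0]))
    return lam * Si_mu + gam * Si_1

# Invented expected returns and covariance matrix (annualized)
mu = np.array([0.05, 0.07, 0.10])
Sigma = np.array([[0.04, 0.01, 0.00],
                  [0.01, 0.09, 0.02],
                  [0.00, 0.02, 0.16]])
w = min_var_portfolio(mu, Sigma, r=0.08)
```

Comparing realized variances of such portfolios built from competing covariance forecasts is the economic loss function the abstract refers to.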

14.
In this article, we propose a new class of semiparametric instrumental variable models with partially varying coefficients, in which the structural function has a partially linear form and the impact of endogenous structural variables can vary over different levels of some exogenous variables. We propose a three-step estimation procedure to estimate both functional and constant coefficients. The consistency and asymptotic normality of these proposed estimators are established. Moreover, a generalized F-test is developed to test whether the functional coefficients are of particular parametric forms with some underlying economic intuitions, and furthermore, the limiting distribution of the proposed generalized F-test statistic under the null hypothesis is established. Finally, we illustrate the finite sample performance of our approach with simulations and two real data examples in economics.

15.
Hall et al. (2007, Journal of Econometrics 138:488–512) propose a method for moment selection based on an information criterion that is a function of the entropy of the limiting distribution of the Generalized Method of Moments (GMM) estimator. They establish the consistency of the method subject to certain conditions that include the identification of the parameter vector by at least one of the moment conditions being considered. In this article, we examine the limiting behavior of this moment selection method when the parameter vector is weakly identified by all the moment conditions being considered. It is shown that the selected moment condition is random and hence not consistent in any meaningful sense. As a result, we propose a two-step procedure for moment selection in which identification is first tested using a statistic proposed by Stock and Yogo (2003), and only if this statistic indicates identification does the researcher proceed to the second step, in which the aforementioned information criterion is used to select moments. The properties of this two-step procedure are contrasted with those of strategies based on either using all available moments or using the information criterion without the identification pre-test. The performances of these strategies are compared via an evaluation of the finite sample behavior of various methods for inference about the parameter vector. The inference methods considered are based on the Wald statistic, Anderson and Rubin's (1949) statistic, Kleibergen's (2002) K statistic, and combinations thereof in which the choice is based on the outcome of the test for weak identification.

16.
In this article, we investigate the use of implied probabilities (Back and Brown, 1993) to improve estimation in unconditional moment condition models. Building on the seminal contributions of Bonnal and Renault (2001) and Antoine et al. (2007, Journal of Econometrics 138(2):461–487), we propose two three-step Euclidean empirical likelihood (3S-EEL) estimators for weakly dependent data. Both estimators make use of a control variates principle that can be interpreted in terms of implied probabilities in order to achieve higher-order improvements relative to the traditional two-step GMM estimator. A Monte Carlo study reveals that the finite and large sample properties of the three-step estimators compare favorably to the existing approaches: the two-step GMM and the continuous updating estimator.

17.
This article examines structural change tests based on generalized empirical likelihood (GEL) methods in the time series context, allowing for dependent data. Standard structural change tests for the generalized method of moments (GMM) are adapted to the GEL context. We show that when moment conditions are properly smoothed, these test statistics converge to the same asymptotic distribution as in the GMM, in cases with known and unknown breakpoints. New test statistics specific to GEL methods, and that are robust to weak identification, are also introduced. A simulation study examines the small sample properties of the tests and reveals that the GEL-based robust tests perform well, both in terms of the presence and location of a structural change and in terms of the nature of identification.

18.
Least-squares and quantile regressions are method of moments techniques that are typically used in isolation. A leading example where efficiency may be gained by combining least-squares and quantile regressions is one where some information on the error quantiles is available but the error distribution cannot be fully specified. This estimation problem may be cast in terms of solving an over-determined estimating equation (EE) system for which the generalized method of moments (GMM) and empirical likelihood (EL) are approaches of recognized importance. The major difficulty with implementing these techniques here is that the EEs associated with the quantiles are non-differentiable. In this paper, we develop a kernel-based smoothing technique for non-smooth EEs, and derive the asymptotic properties of the GMM and maximum smoothed EL (MSEL) estimators based on the smoothed EEs. Via a simulation study, we investigate the finite sample properties of the GMM and MSEL estimators that combine least-squares and quantile moment relationships. Applications to real datasets are also considered.
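The kernel smoothing of a non-differentiable quantile estimating equation can be sketched as follows: the indicator 1{y ≤ q} in the τ-quantile moment condition E[τ − 1{y ≤ q}] = 0 is replaced by a Gaussian CDF with bandwidth h, making the equation smooth (and monotone) in q. This is a generic one-dimensional illustration, not the authors' exact procedure; the bandwidth and data are invented:

```python
import math
import numpy as np

def phi_cdf(x):
    """Standard normal CDF, vectorized via math.erf."""
    return 0.5 * (1.0 + np.vectorize(math.erf)(x / math.sqrt(2.0)))

def smoothed_quantile(y, tau, h=0.1, iters=100):
    """Solve the smoothed estimating equation
        mean( tau - Phi((q - y)/h) ) = 0
    which replaces the non-smooth indicator 1{y <= q} with a kernel CDF.
    Bisection on q works because the smoothed EE is monotone in q."""
    lo, hi = y.min() - 1.0, y.max() + 1.0
    for _ in range(iters):
        q = 0.5 * (lo + hi)
        g = tau - phi_cdf((q - y) / h).mean()
        if g > 0:     # smoothed fraction below q is still < tau: move up
            lo = q
        else:
            hi = q
    return 0.5 * (lo + hi)

rng = np.random.default_rng(2)
y = rng.normal(size=5000)                 # invented sample
q_med = smoothed_quantile(y, tau=0.5)     # smoothed median estimate
```

In the paper's setting the same smoothing is applied to regression-type EEs so that GMM and smoothed EL machinery, which require differentiability, can be used.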

19.
The objectives of this article are threefold: (1) to test target-zone models using more efficient and direct econometric methodology than previous research, (2) to identify an implicit band, if one exists, from observed data and to test target-zone models based on the estimated implicit band rather than the stated official band, and (3) to examine whether the exchange rate can be modeled as a managed float with a central parity but no band. We find strong evidence that a model with intramarginal intervention and a narrower implicit (unofficial) band describes the dynamics of the French franc/Deutsche mark exchange rate from January 1, 1987, to July 30, 1993.

20.
A functional-form empirical likelihood method is proposed as an alternative to the empirical likelihood method. The proposed method has the same asymptotic properties as the empirical likelihood method but has more flexibility in choosing the weight construction. Because it enjoys the likelihood-based interpretation, the profile likelihood ratio test can easily be constructed with a chi-square limiting distribution. Some computational details are also discussed, and results from finite-sample simulation studies are presented.
