Similar articles
20 similar articles found
1.
We consider logistic regression with covariate measurement error. Most existing approaches require certain replicates of the error‐contaminated covariates, which may not be available in the data. We propose generalized method of moments (GMM) nonparametric correction approaches that use instrumental variables observed in a calibration subsample. The instrumental variable is related to the underlying true covariates through a general nonparametric model, and the probability of being in the calibration subsample may depend on the observed variables. We first take a simple approach adopting the inverse selection probability weighting technique using the calibration subsample. We then improve on this approach with a GMM estimator that uses the whole sample. The asymptotic properties are derived, and the finite sample performance is evaluated through simulation studies and an application to a real data set.
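As context for the inverse-selection-probability-weighting step mentioned in this abstract, the toy sketch below shows that step only; the data-generating choices and names (x, y, p_select, in_calib) are illustrative assumptions, not the authors' setup, and the GMM refinement is omitted.

```python
# Toy sketch of inverse-selection-probability weighting (IPW): fit a logistic
# regression on the calibration subsample only, weighting each subject by
# 1 / P(selected into the calibration subsample | observed data).
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 2000
x = rng.normal(size=(n, 1))                                   # covariate available in the calibration subsample
y = rng.binomial(1, 1.0 / (1.0 + np.exp(-(0.5 + x[:, 0]))))   # binary outcome
p_select = 1.0 / (1.0 + np.exp(-(-1.0 + 0.8 * y)))            # selection probability may depend on observables
in_calib = rng.binomial(1, p_select).astype(bool)             # calibration subsample indicator

ipw_fit = LogisticRegression(C=1e6)                           # large C: effectively no regularization
ipw_fit.fit(x[in_calib], y[in_calib],
            sample_weight=1.0 / p_select[in_calib])
print(ipw_fit.intercept_, ipw_fit.coef_)
```

Weighting by the inverse selection probability corrects for the calibration subsample being a non-random subset of the data; the article's GMM estimator then recovers additional efficiency by also using observations outside the subsample.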

2.
Griliches and Hausman (1986, J. Econometrics 32: 93–118) and Wansbeek (2001, J. Econometrics 104: 259–268) proposed using the generalized method of moments (GMM) to obtain consistent estimators in linear regression models for longitudinal data with measurement error in one covariate, without requiring additional validation or replicate data. For this methodology to be useful, it must be extended to the more realistic situation in which more than one covariate is measured with error. Such an extension is not straightforward, since measurement errors across different covariates may be correlated. By a careful construction of the measurement error correlation structure, we are able to extend Wansbeek's GMM and show that the extended Griliches and Hausman GMM is equivalent to the extended Wansbeek GMM. For illustration, we apply the extended GMM to data from two medical studies and compare it with the naive method and with the method that assumes only one covariate is measured with error.

3.
In this article, we propose a flexible parametric (FP) approach for adjusting for covariate measurement errors in regression that can accommodate replicated measurements on the surrogate (mismeasured) version of the unobserved true covariate, taken on all study subjects or on a sub-sample of them, as error assessment data. We utilize the general framework of the FP approach proposed by Hossain and Gustafson in 2009 for adjusting for covariate measurement errors in regression. The FP approach is first compared with existing non-parametric approaches when error assessment data are available on the entire sample of study subjects (complete error assessment data), considering covariate measurement error in a multiple logistic regression model. We then develop the FP approach for the case in which error assessment data are available on only a sub-sample of the study subjects (partial error assessment data) and investigate its performance using both simulated and real-life data. Simulation results reveal that, in comparable situations, the FP approach performs as well as or better than the competing non-parametric approaches in eliminating the bias in the estimated regression parameters that arises from covariate measurement errors, and it yields better efficiency of the estimated parameters. Finally, the FP approach is found to perform adequately in terms of bias correction, confidence coverage, and achieving appropriate statistical power under partial error assessment data.

4.
This paper addresses the measurement of technical efficiency of textile, clothing, and leather (TCL) industries in Tunisia through a panel data estimation of a dynamic translog production frontier. It provides a perspective on productivity and efficiency that should be instructive for a developing economy facing substantial competitive pressure along a gradual economic liberalisation process. The importance of TCL industries in the Tunisian manufacturing sector is a reason for obtaining more knowledge of productivity and efficiency in this key industry. Dynamics are introduced to reflect the production consequences of the adjustment costs associated with changes in factor inputs. A dynamic error-components model is estimated using the system generalized method of moments (GMM) estimator suggested by Arellano and Bover (1995, Another look at the instrumental-variable estimation of error-components models, J. Econometrics 68: 29–51) and Blundell and Bond (1998a, Initial conditions and moment restrictions in dynamic panel data models, J. Econometrics 87: 115–143; 1998b, GMM estimation with persistent panel data: an application to production functions, paper presented at the Eighth International Conference on Panel Data, Göteborg University). Our study evaluates the sensitivity of the results, particularly of the efficiency measures, to different specifications. Firm-specific time-invariant technical efficiency is obtained using the Schmidt and Sickles (1984, Production frontiers and panel data, J. Bus. Econ. Stat. 2: 367–374) approach after estimating the dynamic frontier. We stress the importance of allowing for lags in the adjustment of output to inputs and of controlling for time-invariant variables when estimating firm-specific efficiency. The results suggest that system GMM estimation of the dynamic specification produces the most accurate parameter estimates and technical efficiency measures. The mean efficiency score is 68%. Policy implications of the results are outlined.
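For reference, the system GMM estimator cited above combines moment conditions in first differences and in levels. A standard textbook statement for a simple AR(1) panel model (a deliberate simplification of the dynamic frontier in the abstract, in my notation) is:

```latex
% Model: y_{it} = \alpha\, y_{i,t-1} + \beta' x_{it} + \eta_i + \varepsilon_{it}
% Difference-equation moments (Arellano--Bover):
\mathbb{E}\!\left[\, y_{i,t-s}\,\Delta\varepsilon_{it} \,\right] = 0, \qquad s \ge 2
% Additional level-equation moments (Blundell--Bond), under mean stationarity:
\mathbb{E}\!\left[\, \Delta y_{i,t-1}\,(\eta_i + \varepsilon_{it}) \,\right] = 0
```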

5.
This article considers the first-order autoregressive panel model, a simple dynamic panel data (DPD) model. The generalized method of moments (GMM) gives efficient estimators for such models, but this efficiency depends on the choice of the weighting matrix used in GMM estimation. Conventional GMM estimators have relied on non-optimal weighting matrices, which leads to a loss of efficiency. We therefore present new GMM estimators based on optimal or suboptimal weighting matrices. A Monte Carlo study indicates that the bias and efficiency of the new estimators are more reliable than those of the conventional estimators.
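To illustrate how the weighting matrix enters a GMM estimator, the sketch below implements generic two-step GMM for a linear model with instruments; it is a textbook construction, not the dynamic-panel estimators proposed in the article, and all names are illustrative.

```python
# Two-step GMM for a linear model with instruments Z: first estimate with a
# simple weighting matrix, then re-estimate with the (estimated) optimal one.
import numpy as np

def gmm_linear(y, X, Z, W):
    """Closed-form minimizer of g(b)'W g(b) with g(b) = Z'(y - Xb)/n."""
    A = X.T @ Z @ W @ Z.T @ X
    return np.linalg.solve(A, X.T @ Z @ W @ Z.T @ y)

rng = np.random.default_rng(1)
n = 500
Z = rng.normal(size=(n, 3))                          # instruments
X = Z @ np.array([[1.0], [0.5], [0.2]]) + rng.normal(size=(n, 1))
y = (X @ np.array([2.0]) + rng.normal(size=n)).reshape(-1, 1)

W1 = np.linalg.inv(Z.T @ Z / n)                      # first-step (suboptimal) weighting
b1 = gmm_linear(y, X, Z, W1)
u = y - X @ b1                                       # first-step residuals
S = (Z * u).T @ (Z * u) / n                          # estimate of Var[z_i u_i]
W2 = np.linalg.inv(S)                                # optimal weighting matrix
b2 = gmm_linear(y, X, Z, W2)                         # efficient two-step GMM estimate
print(b1.ravel(), b2.ravel())
```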

6.
This article provides the large sample distribution of the iterated feasible generalized least-squares (IFGLS) estimator of an augmented dynamic panel data model. The regressors in the model include lagged values of the dependent variable and may include other explanatory variables that, while exogenous with respect to the time-varying error component, may be correlated with an unobserved time-invariant component. The article compares the finite sample properties of the IFGLS estimator to those of GMM estimators using both simulated and real data and finds that the IFGLS estimator compares favorably.

7.
Measurement-error modelling occurs when one cannot observe a covariate, but instead has possibly replicated surrogate versions of this covariate measured with error. The vast majority of the literature in measurement-error modelling assumes (typically with good reason) that, given the value of the true but unobserved (latent) covariate, the replicated surrogates are unbiased for the latent covariate and conditionally independent. In the area of nutritional epidemiology, there is some evidence from biomarker studies that this simple conditional independence model may break down due to two causes: (a) systematic biases depending on a person's body mass index, and (b) an additional random component of bias, so that the error structure is the same as a one-way random-effects model. We investigate this problem in the context of (1) estimating the distribution of usual nutrient intake, (2) estimating the correlation between a nutrient instrument and usual nutrient intake, and (3) estimating the true relative risk from an estimated relative risk based on the error-prone covariate. While systematic bias due to body mass index appears to have little effect, the additional random effect in the variance structure is shown to have a potentially important effect on overall results, both on corrections of relative risk estimates and on estimating the distribution of usual nutrient intake. However, the effect of dietary measurement error on both factors is shown via examples to depend strongly on the data set being used. Indeed, one of our data sets suggests that dietary measurement error may be masking a strong effect of fat on breast cancer risk, while for a second data set this masking is not so clear. Until further understanding of dietary measurement is available, measurement-error corrections must be done on a study-specific basis, sensitivity analyses should be conducted, and even then the results of nutritional epidemiology studies relating diet to disease risk should be interpreted cautiously.
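One common way to write an error structure with systematic bias plus a person-specific random bias, of the one-way random-effects form described above, is the following (the notation is mine and need not match the authors' exact model):

```latex
% Surrogate W_{ij} for the true long-term intake X_i of person i, replicate j:
W_{ij} = \beta_0 + \beta_1 X_i + r_i + \epsilon_{ij},
\qquad r_i \sim N(0,\sigma_r^2), \quad \epsilon_{ij} \sim N(0,\sigma_\epsilon^2),
% where (\beta_0,\beta_1) capture systematic bias (possibly depending on body
% mass index) and r_i is the person-specific random bias that breaks the usual
% unbiasedness and conditional-independence assumptions.
```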

8.
Least-squares and quantile regressions are method of moments techniques that are typically used in isolation. A leading example where efficiency may be gained by combining least-squares and quantile regressions is one where some information on the error quantiles is available but the error distribution cannot be fully specified. This estimation problem may be cast in terms of solving an over-determined estimating equation (EE) system for which the generalized method of moments (GMM) and empirical likelihood (EL) are approaches of recognized importance. The major difficulty with implementing these techniques here is that the EEs associated with the quantiles are non-differentiable. In this paper, we develop a kernel-based smoothing technique for non-smooth EEs, and derive the asymptotic properties of the GMM and maximum smoothed EL (MSEL) estimators based on the smoothed EEs. Via a simulation study, we investigate the finite sample properties of the GMM and MSEL estimators that combine least-squares and quantile moment relationships. Applications to real datasets are also considered.
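The sketch below shows one way to stack a least-squares moment with a kernel-smoothed quantile moment into a single sample moment vector; the Gaussian-CDF smoother, bandwidth, and all names are illustrative assumptions rather than the paper's exact choices.

```python
# Stacked moments combining E[x(y - x'b)] = 0 with a smoothed version of the
# tau-th quantile moment E[x(tau - 1{y - x'b <= q})] = 0.
import numpy as np
from scipy.stats import norm

def stacked_moments(theta, y, X, tau, h):
    """theta = (b..., q): regression coefficients plus the tau-th error quantile."""
    b, q = theta[:-1], theta[-1]
    resid = y - X @ b
    g_ls = X * resid[:, None]                        # least-squares moments
    smooth_ind = norm.cdf(-(resid - q) / h)          # smoothed indicator 1{resid <= q}
    g_q = X * (tau - smooth_ind)[:, None]            # smoothed quantile moments
    return np.hstack([g_ls, g_q]).mean(axis=0)       # sample moment vector g_n(theta)

# Evaluate the stacked moments at a candidate parameter value.
rng = np.random.default_rng(2)
n = 300
X = np.column_stack([np.ones(n), rng.normal(size=n)])
y = X @ np.array([1.0, 2.0]) + rng.normal(size=n)
print(stacked_moments(np.array([1.0, 2.0, 0.0]), y, X, tau=0.5, h=0.3))
```

A GMM estimator would then minimize the quadratic form g_n(theta)' W g_n(theta) over theta, which is where the smoothness of the moments matters for the asymptotic theory.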

9.
For multivariate survival data, we study the generalized method of moments (GMM) approach to estimation and inference based on the marginal additive hazards model. We propose an efficient iterative algorithm using closed‐form solutions, which dramatically reduces the computational burden. Asymptotic normality of the proposed estimators is established, and the corresponding variance–covariance matrix can be consistently estimated. Inference procedures are derived based on the asymptotic chi‐squared distribution of the GMM objective function. Simulation studies are conducted to empirically examine the finite sample performance of the proposed method, and a real data example from a dental study is used for illustration.

10.
Censored quantile regression serves as an important supplement to the Cox proportional hazards model in survival analysis. In addition to the response being subject to censoring, some covariates may be measured with error, which leads to substantially biased estimates if the error is not taken into account. The SIMulation–EXtrapolation (SIMEX) method is an effective tool for handling measurement error. We extend the SIMEX approach to censored quantile regression with covariate measurement error. The algorithm is assessed via extensive simulations, and a lung cancer study is analyzed to illustrate the proposed method.
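To show the simulate-then-extrapolate mechanics behind SIMEX, the toy below uses ordinary least squares rather than the article's censored quantile regression, and assumes a known error variance; the names and data-generating choices are illustrative only.

```python
# Minimal SIMEX sketch: add extra measurement noise at levels lambda, refit,
# then extrapolate the fitted coefficient back to lambda = -1 (no error).
import numpy as np

rng = np.random.default_rng(3)
n, sigma_u, beta = 1000, 0.8, 2.0
x = rng.normal(size=n)
w = x + sigma_u * rng.normal(size=n)          # covariate observed with error
y = beta * x + rng.normal(size=n)

def ols_slope(w, y):
    return np.cov(w, y, bias=True)[0, 1] / np.var(w)

lambdas = np.array([0.0, 0.5, 1.0, 1.5, 2.0])
est = []
for lam in lambdas:
    # Simulation step: add noise with variance lam * sigma_u^2, average over B draws.
    slopes = [ols_slope(w + np.sqrt(lam) * sigma_u * rng.normal(size=n), y)
              for _ in range(200)]
    est.append(np.mean(slopes))

# Extrapolation step: fit a quadratic in lambda and evaluate it at lambda = -1.
coef = np.polyfit(lambdas, est, deg=2)
beta_simex = np.polyval(coef, -1.0)
print("naive:", est[0], "SIMEX:", beta_simex, "true:", beta)
```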

11.
Measurement error is well known to cause bias in estimated regression coefficients and a loss of power for detecting associations. Methods commonly used to correct for bias often require auxiliary data. We develop a solution for investigating associations between the change in an imprecisely measured outcome and precisely measured predictors, adjusting for the baseline value of the outcome when auxiliary data are not available. We require the specification of ranges for the reliability or the measurement error variance. The solution allows one to investigate the associations for change and to assess the impact of the measurement error.
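As background for the role the reliability plays here, in the simple linear case classical measurement error attenuates the slope by the reliability ratio; this standard identity (a general fact, not a result specific to the article) is:

```latex
% Classical additive error W = X + U, with U independent of X, in simple linear regression:
\operatorname{plim}\,\hat\beta_{\text{naive}} = \lambda\,\beta,
\qquad
\lambda = \frac{\sigma_X^2}{\sigma_X^2 + \sigma_U^2},
% where \lambda is the reliability of W; specifying a range for \lambda (or for
% \sigma_U^2) therefore bounds the bias-corrected slope \hat\beta_{\text{naive}}/\lambda.
```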

12.
Nonresponse is a very common phenomenon in survey sampling. Nonignorable nonresponse – that is, a response mechanism that depends on the values of the variable subject to nonresponse – is the most difficult type of nonresponse to handle. This article develops a robust estimation approach for estimating equations (EEs) when some responses are nonignorably missing, by combining a model for the nonignorable missingness, the generalized method of moments (GMM), and imputation of the EEs via the observed data rather than via imputed missing values. Based on a particular semiparametric logistic model for the nonignorable missing response, the paper proposes modified EEs that compute the required conditional expectations under nonignorable missingness, and the GMM is then applied to infer the parameters. An advantage of the method is that it replaces non-parametric kernel smoothing with a parametric sampling importance resampling (SIR) procedure, avoiding the problems kernel smoothing faces with high-dimensional covariates. Simulations show the proposed method to be more robust than some current approaches.

13.
This article analyzes a growing group of fixed T dynamic panel data estimators with a multifactor error structure. We use a unified notational approach to describe these estimators and discuss their properties in terms of deviations from an underlying set of basic assumptions. Furthermore, we consider the extendability of these estimators to practical situations that may frequently arise, such as their ability to accommodate unbalanced panels and common observed factors. Using a large-scale simulation exercise, we consider scenarios that remain largely unexplored in the literature despite being of great empirical relevance. In particular, we examine (i) the effect of the presence of weakly exogenous covariates, (ii) the effect of changing the magnitude of the correlation between the factor loadings of the dependent variable and those of the covariates, (iii) the impact of the number of moment conditions on bias and size for GMM estimators, and finally (iv) the effect of sample size. We apply each of these estimators to a crime application using a panel data set of local government authorities in New South Wales, Australia; we find that the results bear substantially different policy implications relative to those potentially derived from standard dynamic panel GMM estimators. Thus, our study may serve as a useful guide to practitioners who wish to allow for multiplicative sources of unobserved heterogeneity in their model.

14.
Generalized method of moments (GMM) estimation has become an important unifying framework for inference in econometrics in the last 20 years. It can be thought of as encompassing almost all of the common estimation methods, such as maximum likelihood, ordinary least squares, instrumental variables, and two-stage least squares, and nowadays is an important part of all advanced econometrics textbooks. The GMM approach links nicely to economic theory where orthogonality conditions that can serve as such moment functions often arise from optimizing behavior of agents. Much work has been done on these methods since the seminal article by Hansen, and much remains in progress. This article discusses some of the developments since Hansen's original work. In particular, it focuses on some of the recent work on empirical likelihood–type estimators, which circumvent the need for a first step in which the optimal weight matrix is estimated and have attractive information theoretic interpretations.
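To make the "encompassing" claim concrete, the familiar estimators correspond to different choices of moment function in E[g(z_i, theta)] = 0; a standard illustration (my summary, not drawn from the article) is:

```latex
% GMM: choose \theta to make the sample analogue of E[g(z_i,\theta)] = 0 as close to zero as possible.
% Ordinary least squares:          g(z_i,\beta)  = x_i\,(y_i - x_i'\beta)
% Instrumental variables / 2SLS:   g(z_i,\beta)  = z_i\,(y_i - x_i'\beta)
% Maximum likelihood (score form): g(z_i,\theta) = \nabla_\theta \log f(z_i;\theta)
```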

15.
Generalized method of moments (GMM) is used to develop tests for discriminating discrete distributions among the two-parameter family of Katz distributions. Relationships involving moments are exploited to obtain identifying and over-identifying restrictions. The asymptotic relative efficiencies of tests based on GMM are analyzed using the local power approach and the approximate Bahadur efficiency. The paper also gives results of Monte Carlo experiments designed to check the validity of the theoretical findings and to shed light on the small sample properties of the proposed tests. Extensions of the results to compound Poisson alternative hypotheses are discussed.  相似文献   
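For readers unfamiliar with the Katz family, one common parameterization and the moment relations it implies are sketched below; the notation is mine and the paper's exact identifying and over-identifying restrictions may differ.

```latex
% Katz family: probabilities satisfy the recurrence
\frac{p_{k+1}}{p_k} = \frac{a + b\,k}{k+1}, \qquad k = 0,1,2,\dots,\; b < 1,
% which implies
\mu = \frac{a}{1-b}, \qquad \sigma^2 = \frac{a}{(1-b)^2}, \qquad \frac{\sigma^2}{\mu} = \frac{1}{1-b},
% so b = 0 corresponds to the Poisson, 0 < b < 1 to the negative binomial, and
% b < 0 to a binomial-type member; higher-order moments derived from the same
% recurrence supply over-identifying restrictions.
```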

16.
Calibration in macroeconomics involves choosing free parameters by matching certain moments of simulated models with those of the data. We formally examine this method by treating the process of calibration as an econometric estimator. A numerical version of the Mehra–Prescott (1985) economy is the setting for an evaluation of calibration estimators via Monte Carlo methods. While these estimators sometimes have reasonable finite-sample properties, they are not robust to mistakes in setting non-free parameters. In contrast, generalized method-of-moments (GMM) estimators have satisfactory finite-sample characteristics, quick convergence, and informational requirements less stringent than those of calibration estimators. For dynamic equilibrium models in which GMM is infeasible, we offer some suggestions for improving estimates based on the calibration methodology.

17.
We investigate the small-sample properties of three alternative generalized method of moments (GMM) estimators of asset-pricing models. The estimators that we consider include ones in which the weighting matrix is iterated to convergence and ones in which the weighting matrix is changed with each choice of the parameters. Particular attention is devoted to assessing the performance of the asymptotic theory for making inferences based directly on the deterioration of GMM criterion functions.

18.
This article examines structural change tests based on generalized empirical likelihood (GEL) methods in the time series context, allowing for dependent data. Standard structural change tests for the generalized method of moments (GMM) are adapted to the GEL context. We show that when the moment conditions are properly smoothed, these test statistics converge to the same asymptotic distribution as in the GMM case, both with known and with unknown breakpoints. New test statistics specific to GEL methods, which are robust to weak identification, are also introduced. A simulation study examines the small sample properties of the tests and reveals that the GEL-based robust tests perform well, both in terms of detecting the presence and location of a structural change and in terms of the nature of identification.

19.
This paper introduces a new class of M-estimators based on generalised empirical likelihood (GEL) estimation with some auxiliary information available in the sample. The resulting class of estimators is efficient in the sense that it achieves the same asymptotic lower bound as the efficient generalised method of moments (GMM) estimator with the same auxiliary information. The paper also shows that, in the case of smooth estimating equations, the proposed estimators enjoy a small second-order bias compared to both efficient GMM and full GEL estimators. Analytical formulae for obtaining bias-corrected estimators are also provided. Simulations show that, with correctly specified auxiliary information, the proposed estimators, and in particular those based on empirical likelihood, outperform standard M and efficient GMM estimators in terms of both finite sample bias and efficiency. On the other hand, with moderately misspecified auxiliary information, estimators based on the nonparametric tilting method typically exhibit the best finite sample properties.

20.
Generalized linear models (GLMs) with errors in covariates are useful in epidemiological research due to the ubiquity of non-normal response variables and inaccurate measurements. The link function in a GLM is chosen by the user depending on the type of response variable, frequently the canonical link function. When covariates are measured with error, incorrect inference can be made, compounded by an incorrect choice of link function. In this article we propose three flexible approaches for handling errors in covariates and estimating an unknown link simultaneously. The first approach uses a fully Bayesian (FB) hierarchical framework, treating the unobserved covariate as a latent variable to be integrated over. The second and third are approximate Bayesian approaches that use a Laplace approximation to marginalize the variables measured with error out of the likelihood. Our simulation results suggest that the FB approach is often a better choice than the approximate Bayesian approaches for adjusting for measurement error, particularly when the measurement error distribution is misspecified. These approaches are demonstrated on an application with a binary response.
