Similar Articles
20 similar articles found.
1.
There are relatively few discussions of measurement error in the accelerated failure time (AFT) model, particularly for the semiparametric AFT model. In this article, we propose an adjusted estimation procedure for the semiparametric AFT model with covariates subject to measurement error, based on the profile likelihood approach and the simulation-extrapolation (SIMEX) method. Simulation studies show that the proposed semiparametric SIMEX approach performs well. The proposed approach is applied to a coronary heart disease dataset from the Busselton Health Study for illustration.
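The abstract gives no implementation details, so the following is only a minimal sketch of the generic SIMEX idea rather than the authors' profile-likelihood AFT procedure: add extra simulated measurement error at increasing levels λ, refit a naive estimator at each level, and extrapolate the trend back to λ = -1. A simple log-linear least-squares fit stands in for the semiparametric AFT fit, censoring is ignored, and the measurement-error standard deviation `sigma_u` is assumed known; all names and tuning choices (grid of λ values, B replicates, quadratic extrapolant) are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

# synthetic data: log T = b0 + b1*X + error; W = X + U is observed instead of X
n, b0, b1, sigma_u = 500, 1.0, -0.8, 0.5
x = rng.normal(size=n)
logT = b0 + b1 * x + rng.normal(scale=0.3, size=n)
w = x + rng.normal(scale=sigma_u, size=n)           # error-prone covariate

def naive_fit(wcol, y):
    """Least-squares slope of log T on the (possibly remeasured) covariate."""
    X = np.column_stack([np.ones_like(wcol), wcol])
    return np.linalg.lstsq(X, y, rcond=None)[0][1]

# SIMEX step 1: add extra error at levels lambda, average over B replicates
lambdas = np.array([0.0, 0.5, 1.0, 1.5, 2.0])
B = 50
slopes = []
for lam in lambdas:
    reps = [naive_fit(w + np.sqrt(lam) * sigma_u * rng.normal(size=n), logT)
            for _ in range(B)]
    slopes.append(np.mean(reps))

# SIMEX step 2: extrapolate the slope trend back to lambda = -1 (quadratic fit)
coef = np.polyfit(lambdas, slopes, deg=2)
b1_simex = np.polyval(coef, -1.0)
print(f"naive slope: {slopes[0]:.3f}, SIMEX slope: {b1_simex:.3f}, true: {b1}")
```

The SIMEX slope should sit noticeably closer to the true value than the naive slope, which is attenuated by the added measurement error.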

2.
We will pursue a Bayesian nonparametric approach to the hierarchical mixture modelling of lifetime data in two situations: density estimation, when the distribution is a mixture of parametric densities with a nonparametric mixing measure, and accelerated failure time (AFT) regression modelling, when the same type of mixture is used for the distribution of the error term. The Dirichlet process is a popular choice for the mixing measure, yielding a Dirichlet process mixture model for the error; as an alternative, we also allow the mixing measure to be a normalized inverse-Gaussian prior, built from normalized inverse-Gaussian finite-dimensional distributions, as recently proposed in the literature. Markov chain Monte Carlo techniques will be used to estimate the predictive distribution of the survival time, along with the posterior distribution of the regression parameters. A comparison between the two models will be carried out on the grounds of their predictive power and their ability to identify the number of components in a given mixture density.
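As a rough illustration of the Dirichlet process mixture idea mentioned above (a sketch only, not the authors' AFT error model or the normalized inverse-Gaussian alternative), one can draw a random mixing measure by truncated stick-breaking and then sample error terms from the implied mixture of normals. The concentration parameter `alpha`, the truncation level `K`, and the normal kernels are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)

def sample_dp_mixture(n, alpha=1.0, K=50):
    """Draw n values from a DP mixture of normals via truncated stick-breaking.

    G = sum_k pi_k * delta_{mu_k}; given component k, the error is Normal(mu_k, 0.5).
    """
    v = rng.beta(1.0, alpha, size=K)                  # stick-breaking fractions
    pi = v * np.concatenate([[1.0], np.cumprod(1.0 - v[:-1])])
    pi /= pi.sum()                                    # renormalize after truncation
    mu = rng.normal(0.0, 2.0, size=K)                 # atoms drawn from the base measure
    comps = rng.choice(K, size=n, p=pi)               # component labels
    return rng.normal(mu[comps], 0.5)                 # mixture of normal kernels

errors = sample_dp_mixture(1000)
print(errors.mean(), errors.std())
```

In the AFT setting, draws like these would play the role of the regression error term, with the mixing measure updated by MCMC rather than fixed in advance.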

3.
In many biomedical studies, covariates are subject to measurement error. Although it is well known that regression coefficient estimators can be substantially biased if the measurement error is not accommodated, there has been little study of the effect of covariate measurement error on the estimation of the dependence between bivariate failure times. We show that the dependence parameter estimator in the Clayton–Oakes model can be considerably biased if the measurement error in the covariate is not accommodated. In contrast with the typical bias towards the null for marginal regression coefficients, the dependence parameter can be biased in either direction. We introduce a bias reduction technique for the bivariate survival function in copula models, assuming an additive measurement error model and replicated measurements for the covariates, and we study the large- and small-sample properties of the proposed dependence parameter estimator.
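For reference (standard facts about the Clayton–Oakes model, not taken from the article), the Clayton copula, its link to Kendall's tau, and the corresponding bivariate survivor function are

```latex
C_\theta(u, v) = \bigl(u^{-\theta} + v^{-\theta} - 1\bigr)^{-1/\theta}, \quad \theta > 0,
\qquad
\tau_{\text{Kendall}} = \frac{\theta}{\theta + 2},
\qquad
S(t_1, t_2) = C_\theta\bigl(S_1(t_1), S_2(t_2)\bigr),
```

so bias in the estimated dependence parameter θ translates directly into bias in the implied Kendall's tau between the two failure times.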

4.
Semiparametric accelerated failure time (AFT) models directly relate the expected failure times to covariates and are a useful alternative to models that work on the hazard function or the survival function. Much less development has been done with AFT models for case-cohort data. In addition to covariates being missing for controls outside the subcohort, the challenges of AFT model inference with a full cohort remain. The regression parameter estimator is hard to compute because the most widely used rank-based estimating equations are not smooth. Further, its variance depends on the unspecified error distribution, and most methods rely on a computationally intensive bootstrap to estimate it. We propose fast rank-based inference procedures for AFT models, applying recent methodological advances to the context of case-cohort data. Parameters are estimated with an induced smoothing approach that smooths the estimating functions and facilitates the numerical solution. Variance estimators are obtained through efficient resampling methods for nonsmooth estimating functions that avoid a full-blown bootstrap. Simulation studies suggest that the recommended procedure provides fast and valid inferences among several competing procedures. Application to a tumor study demonstrates the utility of the proposed method in routine data analysis.
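The induced-smoothing idea can be illustrated outside the case-cohort setting (a simplified sketch, not the authors' procedure): the Gehan-type rank estimating function contains indicators of residual orderings, and induced smoothing replaces each indicator with a normal CDF so the function becomes differentiable in the regression parameter. The smoothing covariance is taken here as I/n, a common starting value in the induced-smoothing literature; the exact choice in the paper may differ.

```python
import numpy as np
from scipy.stats import norm

def smoothed_gehan_score(beta, logT, delta, X):
    """Induced-smoothed Gehan-type estimating function for a semiparametric AFT model.

    Indicators I(e_j >= e_i) are replaced by Phi((e_j - e_i) / r_ij), with
    r_ij^2 = (x_i - x_j)' Sigma (x_i - x_j) and Sigma = I/n as a simple initial choice.
    """
    n, p = X.shape
    e = logT - X @ beta                               # AFT residuals
    ediff = e[None, :] - e[:, None]                   # e_j - e_i
    xdiff = X[:, None, :] - X[None, :, :]             # x_i - x_j
    r = np.sqrt(np.maximum((xdiff ** 2).sum(axis=2) / n, 1e-12))
    w = norm.cdf(ediff / r)                           # smoothed indicators
    # sum over uncensored i (delta_i = 1) and all j of (x_i - x_j) * Phi(...)
    return (delta[:, None, None] * xdiff * w[:, :, None]).sum(axis=(0, 1)) / n
```

In practice one would solve `smoothed_gehan_score(beta, ...) = 0` with a root finder such as `scipy.optimize.root`; the case-cohort version would additionally weight the contributions of subcohort controls.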

5.
Increasing attention is being given to problems involving binary outcomes with covariates subject to measurement error. Here, we consider the two-group normal discriminant model, where a subset of the continuous variates is subject to error and will typically be replaced by a vector of surrogates, perhaps of different dimension. Correcting for the measurement error is made possible by a double-sampling scheme in which the surrogates are collected on all units and true values are obtained on a random subset of units. Such a scheme allows us to consider a rich set of measurement error models which extend the traditional additive error model. Maximum likelihood estimators and their asymptotic properties are derived under a variety of models for the relationship between the true values and the surrogates. Specific attention is given to the coefficients in the resulting logistic regression model. Optimal allocations are derived which minimize the variance of the estimated slope subject to cost constraints for the case where there is a univariate covariate but a possibly multivariate surrogate.

6.
Nested error linear regression models using survey weights have been studied in small area estimation to obtain efficient model-based and design-consistent estimators of small area means. The covariates in these nested error linear regression models are not subject to measurement errors. In practical applications, however, there are many situations in which the covariates are subject to measurement errors. In this paper, we develop a nested error linear regression model with an area-level covariate subject to functional measurement error. In particular, we propose a pseudo-empirical Bayes (PEB) predictor to estimate small area means. This predictor borrows strength across areas through the model and makes use of the survey weights to preserve the design consistency as the area sample size increases. We also employ a jackknife method to estimate the mean squared prediction error (MSPE) of the PEB predictor. Finally, we report the results of a simulation study on the performance of our PEB predictor and associated jackknife MSPE estimator.

7.
Often in data arising from epidemiologic studies, covariates are subject to measurement error. In addition, ordinal responses may be misclassified into a category that does not reflect the true state of the respondents. The goal of the present work is to develop an ordered probit model that corrects for the classification errors in ordinal responses and/or measurement error in covariates. The maximum likelihood method of estimation is used. A simulation study reveals the effect of ignoring measurement error and/or classification errors on the estimates of the regression coefficients. The methodology developed is illustrated through a numerical example.
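One standard way to write such a correction (a generic formulation; the exact parameterization used by the authors may differ) is to pass the true ordered-probit category through a misclassification matrix:

```latex
P(Y = j \mid x) = \Phi\bigl(\alpha_j - x^{\top}\beta\bigr) - \Phi\bigl(\alpha_{j-1} - x^{\top}\beta\bigr),
\qquad
P(Y^{*} = k \mid x) = \sum_{j} \pi_{jk}\, P(Y = j \mid x),
```

where Y is the true ordinal response with ordered cutpoints α, Y* is the observed (possibly misclassified) response, and π_{jk} = P(Y* = k | Y = j). The likelihood is built from P(Y* = k | x), with a further integral over the measurement-error distribution when x itself is mismeasured.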

8.
Competing risks are common in clinical cancer research, as patients are subject to multiple potential failure outcomes, such as death from the cancer itself or from complications arising from the disease. In the analysis of competing risks, several regression methods are available for evaluating the relationship between covariates and cause-specific failures, many of which are based on Cox's proportional hazards model. Although a great deal of research has been conducted on estimating competing risks, less attention has been devoted to linear regression modeling, which is often referred to as the accelerated failure time (AFT) model in the survival literature. In this article, we address the use and interpretation of linear regression analysis with regard to the competing risks problem. We introduce two types of AFT modeling framework, in which the influence of a covariate can be evaluated in relation to either a cause-specific hazard function, referred to as cause-specific AFT (CS-AFT) modeling in this study, or the cumulative incidence function of a particular failure type, referred to as crude-risk AFT (CR-AFT) modeling. Simulation studies illustrate that, as in hazard-based competing risks analysis, these two models can produce substantially different effects, depending on the relationship between the covariates and both the failure type of principal interest and the competing failure types. We apply the AFT methods to data from non-Hodgkin lymphoma patients, where the dataset is characterized by two competing events, disease relapse and death without relapse, and by non-proportionality. We demonstrate how the data can be analyzed and interpreted using linear competing risks regression models.

9.
Censored quantile regression serves as an important supplement to the Cox proportional hazards model in survival analysis. In addition to censoring, some covariates may be subject to measurement error, which leads to substantially biased estimates if the error is not taken into account. The SIMulation-EXtrapolation (SIMEX) method is an effective tool for handling the measurement error issue. We extend the SIMEX approach to censored quantile regression with covariate measurement error. The algorithm is assessed via extensive simulations. A lung cancer study is analyzed to illustrate the validity of the proposed method.

10.
In this paper, we propose a bias-corrected estimator of the regression coefficient for the generalized probit regression model when the covariates are subject to measurement error and the responses are subject to interval censoring. The main improvement of our method is that it removes most of the bias present in the naive estimates. The great advantage of our method is that it is baseline and censoring distribution free, in the sense that the investigator does not need to calculate the baseline or the censoring distribution to obtain the estimator of the regression coefficient, an important property of the Cox regression model. A sandwich estimator for the variance is also proposed. Our procedure can be generalized to general measurement error distributions as long as the first four moments of the measurement error are known. The results of extensive simulations show that our approach is very effective in eliminating the bias when the measurement error is not too large relative to the error term of the regression model.

11.
There has been extensive interest in inference methods for survival data when some covariates are subject to measurement error. It is known that standard inferential procedures produce biased estimates if measurement error is not taken into account. For the Cox proportional hazards model, a number of methods have been proposed to correct the bias induced by measurement error, with attention centering on the partial likelihood function. It is also of interest to understand the impact of mismeasured covariates on estimation of the baseline hazard function. In this paper we employ a weakly parametric form for the baseline hazard function and propose simple unbiased estimating functions for estimation of parameters. The proposed method is easy to implement, and it reveals the connection between the naive method that ignores measurement error and the corrected method that accounts for it. Simulation studies are carried out to evaluate the performance of the estimators as well as the impact of ignoring measurement error in covariates. As an illustration, we apply the proposed methods to analyze a data set arising from the Busselton Health Study [Knuiman, M.W., Cullent, K.J., Bulsara, M.K., Welborn, T.A., Hobbs, M.S.T., 1994. Mortality trends, 1965 to 1989, in Busselton, the site of repeated health surveys and interventions. Austral. J. Public Health 18, 129–135].

12.
We propose a new method for fitting proportional hazards models with error-prone covariates. Regression coefficients are estimated by solving an estimating equation that is the average of the partial likelihood scores based on imputed true covariates. For the purpose of imputation, a linear spline model is assumed on the baseline hazard. We discuss consistency and asymptotic normality of the resulting estimators, and propose a stochastic approximation scheme to obtain the estimates. The algorithm is easy to implement, and reduces to the ordinary Cox partial likelihood approach when the measurement error has a degenerate distribution. Simulations indicate high efficiency and robustness. We consider the special case where error-prone replicates are available on the unobserved true covariates. As expected, increasing the number of replicates for the unobserved covariates increases efficiency and reduces bias. We illustrate the practical utility of the proposed method with an Eastern Cooperative Oncology Group clinical trial where a genetic marker, c-myc expression level, is subject to measurement error.

13.
Preterm birth, defined as delivery before 37 completed weeks' gestation, is a leading cause of infant morbidity and mortality. Identifying factors related to preterm delivery is an important goal of public health professionals who wish to identify etiologic pathways to target for prevention. Validation studies are often conducted in nutritional epidemiology in order to study measurement error in instruments that are generally less invasive or less expensive than "gold standard" instruments. Data from such studies are then used in adjusting estimates based on the full study sample. However, measurement error in nutritional epidemiology has recently been shown to be complicated by correlated error structures in the study-wide and validation instruments. Investigators of a study of preterm birth and dietary intake designed a validation study to assess measurement error in a food frequency questionnaire (FFQ) administered during pregnancy, with the secondary goal of assessing whether a single administration of the FFQ could be used to describe intake over the relatively short pregnancy period, in which energy intake typically increases. Here, we describe a likelihood-based method, fitted via Markov chain Monte Carlo, to estimate the regression coefficients in a generalized linear model relating preterm birth to covariates, where one of the covariates is measured with error and the multivariate measurement error model has correlated errors among contemporaneous instruments (i.e. FFQs, 24-hour recalls, and/or biomarkers). Because of constraints on the covariance parameters in our likelihood, identifiability of all the variance and covariance parameters is not guaranteed; we therefore derive the necessary and sufficient conditions to identify the variance and covariance parameters under our measurement error model and assumptions. We investigate the sensitivity of our likelihood-based model to distributional assumptions placed on the true folate intake by employing semiparametric Bayesian methods through the mixture of Dirichlet process priors framework. We exemplify our methods in a recent prospective cohort study of risk factors for preterm birth. We use long-term folate as our error-prone predictor of interest, the FFQ and 24-hour recall as two biased instruments, and the serum folate biomarker as the unbiased instrument. We found that folate intake, as measured by the FFQ, led to a conservative estimate of the odds ratio of preterm birth (0.76) when compared with the odds ratio estimate from our likelihood-based approach, which adjusts for the measurement error (0.63). We found that our parametric model led to conclusions similar to those of the semiparametric Bayesian model.

14.
The purpose of this paper is to examine the properties of several bias-corrected estimators for generalized linear measurement error models, along with the naive estimator, in some special settings. In particular, we consider logistic regression, Poisson regression and exponential-gamma models where the covariates are subject to measurement error. Monte Carlo experiments are conducted to compare the relative performance of the estimators in terms of several criteria. The results indicate that the naive estimator of the slope is biased towards zero by a factor increasing with the magnitude of the slope and the measurement error as well as the sample size. It is found that none of the bias-corrected estimators always outperforms the others, and that their small-sample properties typically depend on the underlying model assumptions.
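The attenuation of the naive slope toward zero is easy to reproduce in a small Monte Carlo experiment (an illustrative sketch of the phenomenon, not a reproduction of the authors' study design); here the naive logistic fit simply uses the error-prone covariate W = X + U in place of X, and all parameter values are arbitrary choices.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(2)
n, beta0, beta1, sigma_u = 2000, -0.5, 1.0, 0.8
reps = 200
naive_slopes = []
for _ in range(reps):
    x = rng.normal(size=n)
    p = 1.0 / (1.0 + np.exp(-(beta0 + beta1 * x)))    # true logistic model in X
    y = rng.binomial(1, p)
    w = x + rng.normal(scale=sigma_u, size=n)         # error-prone surrogate of X
    fit = LogisticRegression(C=1e6).fit(w.reshape(-1, 1), y)  # large C ~ no penalty
    naive_slopes.append(fit.coef_[0, 0])

print(f"true slope {beta1}, mean naive slope {np.mean(naive_slopes):.3f}")
# the naive slope is attenuated, roughly by the reliability ratio 1 / (1 + sigma_u^2)
```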

15.
We propose a Bayesian semiparametric methodology for quantile regression modelling. In particular, working with parametric quantile regression functions, we develop Dirichlet process mixture models for the error distribution in an additive quantile regression formulation. The proposed non-parametric prior probability models allow the shape of the error density to adapt to the data and thus provide more reliable predictive inference than models based on parametric error distributions. We consider extensions to quantile regression for data sets that include censored observations. Moreover, we employ dependent Dirichlet processes to develop quantile regression models that allow the error distribution to change non-parametrically with the covariates. Posterior inference is implemented using Markov chain Monte Carlo methods. We assess and compare the performance of our models using both simulated and real data sets.
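For context (standard background on quantile regression, not specific to this article), the parametric quantile regression fit at level τ minimizes the check loss, which is equivalent to maximum likelihood under an asymmetric Laplace error density; the Dirichlet process mixture above replaces that rigid parametric error law with a flexible nonparametric one:

```latex
\rho_\tau(u) = u\,\bigl(\tau - \mathbf{1}\{u < 0\}\bigr),
\qquad
\hat\beta(\tau) = \arg\min_{\beta} \sum_{i=1}^{n} \rho_\tau\!\bigl(y_i - x_i^{\top}\beta\bigr),
\qquad
f_{\text{AL}}(u) = \frac{\tau(1-\tau)}{\sigma}\exp\!\bigl(-\rho_\tau(u)/\sigma\bigr).
```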

16.
This article deals with parameter estimation in the Cox proportional hazards model when covariates are measured with error. We consider both the classical additive measurement error model and a more general model which represents the mis-measured version of the covariate as an arbitrary linear function of the true covariate plus random noise. Only moment conditions are imposed on the distributions of the covariates and measurement error. Under the assumption that the covariates are measured precisely for a validation set, we develop a class of estimating equations for the vector-valued regression parameter by correcting the partial likelihood score function. The resultant estimators are proven to be consistent and asymptotically normal with easily estimated variances. Furthermore, a corrected version of the Breslow estimator for the cumulative hazard function is developed, which is shown to be uniformly consistent and, upon proper normalization, converges weakly to a zero-mean Gaussian process. Simulation studies indicate that the asymptotic approximations work well for practical sample sizes. The situation in which replicate measurements (instead of a validation set) are available is also studied.

17.
The paper considers simultaneous estimation of finite population means for several strata. A model-based approach is taken, where the covariates in the super-population model are subject to measurement errors. Empirical Bayes (EB) estimators of the strata means are developed and an asymptotic expression for the MSE of the EB estimators is provided. It is shown that the proposed EB estimators are "first order optimal" in the sense of Robbins [1956. An empirical Bayes approach to statistics. In: Proceedings of the Third Berkeley Symposium on Mathematical Statistics and Probability, vol. 1, University of California Press, Berkeley, pp. 157–164], while the regular EB estimators which ignore the measurement error are not.

18.
We introduce a general class of semiparametric hazard regression models, called extended hazard (EH) models, that are designed to accommodate various survival schemes with time-dependent covariates. The EH model contains both the Cox model and the accelerated failure time (AFT) model as subclasses, so we can use this nested structure to perform model selection between the Cox model and the AFT model. A class of estimating equations using counting process and martingale techniques is developed to estimate the regression parameters of the proposed model. The performance of the estimating procedure and the impact of model misspecification are assessed through simulation studies. Two data examples, the Stanford heart transplant data and the Mediterranean fruit fly egg-laying data, are used to demonstrate the usefulness of the EH model.
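A common way to write an extended hazard model of this kind, for a time-fixed covariate vector Z (the article also allows time-dependent covariates, and its exact parameterization may differ), is

```latex
\lambda(t \mid Z) \;=\; \lambda_0\!\bigl(t\, e^{\beta^{\top} Z}\bigr)\, e^{\gamma^{\top} Z},
```

which reduces to the Cox proportional hazards model when β = 0 and to the AFT model when γ = β, so testing these parameter restrictions provides the nested model selection mentioned above.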

19.
In this paper, we consider ultrahigh-dimensional sufficient dimension reduction (SDR) for censored data with measurement error in covariates. We first propose a feature screening procedure based on censored data with covariates subject to measurement error. With suitable correction for the mismeasurement, the error-contaminated variables detected by the proposed feature screening procedure are the same as the truly important variables. Based on the selected active variables, we develop an SDR method to estimate the central subspace and the structural dimension with both censored data and measurement error incorporated. Theoretical results for the proposed method are established. Simulation studies are reported to assess the performance of the proposed method. The proposed method is also applied to the NKI breast cancer data.

20.
Regression parameter estimation in the Cox failure time model is considered when regression variables are subject to measurement error. Assuming that repeat regression vector measurements adhere to a classical measurement model, we can consider an ordinary regression calibration approach in which the unobserved covariates are replaced by an estimate of their conditional expectation given the available covariate measurements. However, since the rate of withdrawal from the risk set across the time axis, due to failure or censoring, will typically depend on covariates, we may improve the regression parameter estimator by recalibrating within each risk set. The asymptotic and small-sample properties of such a risk set regression calibration estimator are studied. A simple estimator based on a least squares calibration in each risk set appears able to eliminate much of the bias that attends the ordinary regression calibration estimator under extreme measurement error circumstances. Corresponding asymptotic distribution theory is developed, small-sample properties are studied using computer simulations, and an illustration is provided.
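Ordinary regression calibration, the baseline that the risk-set version refines, can be sketched as follows under stated assumptions: two replicate measurements W1, W2 of each unobserved covariate, a univariate classical additive error model, and a method-of-moments estimate of the best linear predictor E[X | W̄] that would then be substituted into the Cox fit. The risk-set version would re-estimate the calibration within each risk set, which is not shown here; all names are illustrative.

```python
import numpy as np

def regression_calibration(W1, W2):
    """Ordinary regression calibration from two replicates under the classical
    additive error model W_ij = X_i + U_ij (univariate case, method of moments).

    Returns an estimate of E[X | Wbar] for each subject, to be plugged into the
    Cox partial likelihood in place of the unobserved X.
    """
    Wbar = 0.5 * (W1 + W2)
    sigma_u2 = 0.5 * np.mean((W1 - W2) ** 2)          # Var(W1 - W2) = 2 * sigma_u^2
    mu = Wbar.mean()
    sigma_x2 = max(Wbar.var(ddof=1) - sigma_u2 / 2.0, 1e-12)
    shrink = sigma_x2 / (sigma_x2 + sigma_u2 / 2.0)   # shrinkage of Wbar toward mu
    return mu + shrink * (Wbar - mu)

# illustrative use with simulated replicates
rng = np.random.default_rng(3)
x = rng.normal(size=300)
W1 = x + rng.normal(scale=0.6, size=300)
W2 = x + rng.normal(scale=0.6, size=300)
x_hat = regression_calibration(W1, W2)
```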
