Similar articles
20 similar articles found (search time: 46 ms)
1.
The rejection sampling filter and smoother, proposed by Tanizaki (1996, 1999), Tanizaki and Mariano (1998) and Hürzeler and Künsch (1998), are computationally very demanding. The Markov chain Monte Carlo smoother, developed by Carlin, Polson and Stoffer (1992), Carter and Kohn (1994, 1996) and Geweke and Tanizaki (1999a, 1999b), can perform poorly under the root mean square error criterion, depending on the nonlinearity and non-normality of the system; the reason is the slow convergence of the Gibbs sampler. Taking these problems into account, we propose a nonlinear and non-Gaussian filter and smoother which carry a much smaller computational burden and give relatively better state estimates, although the proposed estimator does not yield the optimal state estimates in the sense of the minimum mean square error. The proposed filter and smoother are called the quasi-optimal filter and quasi-optimal smoother in this paper. Finally, through some Monte Carlo studies, the quasi-optimal filter and smoother are compared with the rejection sampling procedure and the Markov chain Monte Carlo procedure.

2.
As is well known, the ordinary least-squares estimator (OLSE) is unbiased and has the minimum variance among all linear unbiased estimators. Under multicollinearity, however, the estimator is generally unstable and poor in the sense that the variance of the regression coefficients may be inflated and the absolute values of the estimates may be too large. Several classes of biased estimators have been proposed in the statistical literature to reduce the effect of multicollinearity in the design matrix. Here, based on the Cholesky decomposition, we propose such an estimator, which slightly distorts the data. Exact risk expressions as well as the biases are derived for the proposed estimator. Some results demonstrating the superiority of the suggested estimator over OLSE are also obtained. Finally, a Monte Carlo simulation study and a real data application involving the acetylene data are presented to support our theoretical discussion.
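The abstract does not reproduce the Cholesky-based estimator itself, but the problem it addresses is easy to demonstrate. The sketch below uses plain ridge regression as a generic stand-in for a shrinkage-type biased estimator (not the paper's proposal) to show how near-collinear columns inflate OLS coefficients and how a small shrinkage constant stabilises them; the data and the constant `k` are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100
x1 = rng.normal(size=n)
x2 = x1 + 0.01 * rng.normal(size=n)        # nearly collinear with x1
X = np.column_stack([np.ones(n), x1, x2])
beta = np.array([1.0, 2.0, 3.0])
y = X @ beta + rng.normal(size=n)

# OLS: unbiased, but unstable when X'X is ill-conditioned.
ols = np.linalg.solve(X.T @ X, X.T @ y)

# Ridge (generic biased stand-in): shrinks every component in the
# eigenbasis of X'X, so the coefficient vector cannot blow up.
k = 0.1                                     # shrinkage constant (illustrative)
ridge = np.linalg.solve(X.T @ X + k * np.eye(3), X.T @ y)

# A large condition number of X'X signals multicollinearity.
cond = np.linalg.cond(X.T @ X)
print(cond, ols, ridge)
```

Because ridge scales each spectral component of the OLS solution by a factor strictly below one, the ridge coefficient vector always has smaller norm than the OLS vector on the same data.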

3.
Bootstrapping the conditional copula
This paper is concerned with inference about the dependence or association between two random variables conditionally upon the given value of a covariate. A way to describe such a conditional dependence is via a conditional copula function. Nonparametric estimators for a conditional copula then lead to nonparametric estimates of conditional association measures such as a conditional Kendall's tau. The limiting distributions of nonparametric conditional copula estimators are rather involved. In this paper we propose a bootstrap procedure for approximating these distributions and their characteristics, and establish its consistency. We apply the proposed bootstrap procedure for constructing confidence intervals for conditional association measures, such as a conditional Blomqvist beta and a conditional Kendall's tau. The performances of the proposed methods are investigated via a simulation study involving a variety of models, ranging from models in which the dependence (weak or strong) on the covariate is only through the copula and not through the marginals, to models in which this dependence appears in both the copula and the marginal distributions. As a conclusion we provide practical recommendations for constructing bootstrap-based confidence intervals for the discussed conditional association measures.
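As a rough illustration of the idea (not the paper's estimator, which smooths in the covariate), one can localise Kendall's tau to a covariate window and bootstrap the local sample for a percentile interval. The window width, covariate value, and data-generating model below are arbitrary choices.

```python
import numpy as np

def kendall_tau(u, v):
    """Sample Kendall's tau from concordant/discordant pair counts (O(n^2))."""
    n = len(u)
    s = 0.0
    for i in range(n):
        s += np.sum(np.sign(u[i] - u[i + 1:]) * np.sign(v[i] - v[i + 1:]))
    return 2.0 * s / (n * (n - 1))

rng = np.random.default_rng(1)
n = 300
x = rng.uniform(size=n)                       # covariate
z = rng.normal(size=n)                        # shared factor scaled by x:
y1 = x * z + rng.normal(size=n)               # dependence grows with x
y2 = x * z + rng.normal(size=n)

# Crude "conditional" estimate: restrict to a window around x0 = 0.8
# (a stand-in for the kernel smoothing used in the paper).
idx = np.where(np.abs(x - 0.8) < 0.15)[0]
tau_hat = kendall_tau(y1[idx], y2[idx])

# Nonparametric bootstrap of the local sample, percentile interval.
B = 500
taus = np.empty(B)
for b in range(B):
    bs = rng.choice(idx, size=len(idx), replace=True)
    taus[b] = kendall_tau(y1[bs], y2[bs])
lo, hi = np.quantile(taus, [0.025, 0.975])
print(tau_hat, (lo, hi))
```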

4.
The relative performances of improved ridge estimators and an empirical Bayes estimator are studied by means of Monte Carlo simulations. The empirical Bayes method is seen to perform consistently better, in terms of smaller MSE and more accurate empirical coverage, than any of the estimators considered here. A bootstrap method is proposed to obtain more reliable estimates of the MSE of ridge estimators. Some theorems on the bootstrap for the ridge estimators are also given, and they are used to provide an analytical understanding of the proposed bootstrap procedure. Empirical coverages of the ridge estimators based on the proposed procedure are generally closer to the nominal coverage than their earlier counterparts. In general, except for a few cases, these coverages are still less accurate than the empirical coverages of the empirical Bayes estimator.

5.
The two-parameter Gompertz distribution has recently been introduced as a lifetime model for reliability inference. In this paper, the Gompertz distribution is proposed for the baseline lifetimes of components in a composite system. In this system, failure of a component induces increased load on the surviving components and thus increases the component hazard rate via a power-trend process. Point estimates of the composite system parameters are obtained by the method of maximum likelihood. Interval estimates of the baseline survival function are obtained from the maximum-likelihood estimator via a bootstrap percentile method. Two parametric bootstrap procedures are proposed to test whether the hazard rate function changes with the number of failed components. Intensive simulations are carried out to evaluate the performance of the proposed estimation procedure.

6.
Bootstrap procedures are useful to obtain forecast densities for both returns and volatilities in the context of generalized autoregressive conditional heteroscedasticity models. In this paper, we analyse the effect of additive outliers on the finite sample properties of these bootstrap densities and show that, when obtained using maximum likelihood estimates of the parameters and standard filters for the volatilities, they are badly affected with dramatic consequences on the estimation of Value-at-Risk. We propose constructing bootstrap densities for returns and volatilities using a robust parameter estimator based on variance targeting implemented together with an adequate modification of the volatility filter. We show that the performance of the proposed procedure is adequate when compared with available robust alternatives. The results are illustrated with both simulated and real data.

7.
Variable selection is an important issue in all regression analyses, and in this paper we discuss it in the context of regression analysis of recurrent event data. Recurrent event data often occur in long-term studies in which individuals may experience the events of interest more than once, and their analysis has recently attracted a great deal of attention (Andersen et al., Statistical models based on counting processes, 1993; Cook and Lawless, Biometrics 52:1311–1323, 1996, The analysis of recurrent event data, 2007; Cook et al., Biometrics 52:557–571, 1996; Lawless and Nadeau, Technometrics 37:158–168, 1995; Lin et al., J R Stat Soc B 69:711–730, 2000). However, there seem to be no established approaches to variable selection for recurrent event data. For this problem, we adopt the idea behind the nonconcave penalized likelihood approach proposed in Fan and Li (J Am Stat Assoc 96:1348–1360, 2001) and develop a nonconcave penalized estimating function approach. The proposed approach selects variables and estimates regression coefficients simultaneously, and an algorithm is presented for this process. We show that the proposed approach performs as well as the oracle procedure in that it yields the estimates as if the correct submodel were known. Simulation studies conducted to assess the performance of the proposed approach suggest that it works well in practical situations. The proposed methodology is illustrated using data from a chronic granulomatous disease study.

8.
In this paper, we focus on resampling non-stationary weakly dependent point processes in two dimensions to make inference on the inhomogeneous K function (Baddeley et al., 2000). We provide theoretical results establishing consistency of the bootstrap estimates of the variance as the observation region and the resampling blocks increase in size. We present results of a simulation study that examines the performance of nominal 95% confidence intervals for the inhomogeneous K function obtained via our bootstrap procedure. The procedure is also applied to a rainforest dataset.

9.
This paper studies the covariance structure and the asymptotic properties of Yule–Walker (YW) type estimators for a bilinear time series model with periodically time-varying coefficients. We give necessary and sufficient conditions ensuring the existence of moments up to eighth order. Expressions for the second and third order joint moments, as well as the limiting covariance matrix of the sample moments, are given. Strong consistency and asymptotic normality of the YW estimator, as well as hypothesis testing via Wald's procedure, are derived. We use a residual bootstrap version to construct bootstrap estimators of the YW estimates. Simulation results demonstrate the large sample behavior of the bootstrap procedure.
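The residual bootstrap scheme is easier to see in a simpler setting. The sketch below applies it to Yule–Walker estimation of an AR(1) coefficient rather than to the periodic bilinear model of the paper; the sample size, true coefficient, and number of bootstrap replicates are illustrative.

```python
import numpy as np

def yw_ar1(y):
    """Yule-Walker estimate of the AR(1) coefficient (lag-1 autocorrelation)."""
    yc = y - y.mean()
    return np.sum(yc[1:] * yc[:-1]) / np.sum(yc * yc)

rng = np.random.default_rng(2)
n, phi = 400, 0.5
eps = rng.normal(size=n)
y = np.empty(n)
y[0] = eps[0]
for t in range(1, n):
    y[t] = phi * y[t - 1] + eps[t]

phi_hat = yw_ar1(y)
resid = y[1:] - phi_hat * y[:-1]
resid = resid - resid.mean()                  # centre the residuals

# Residual bootstrap: rebuild the series from resampled residuals using the
# fitted coefficient, then re-estimate on each bootstrap series.
B = 1000
boot = np.empty(B)
for b in range(B):
    e = rng.choice(resid, size=n, replace=True)
    ys = np.empty(n)
    ys[0] = y[0]
    for t in range(1, n):
        ys[t] = phi_hat * ys[t - 1] + e[t]
    boot[b] = yw_ar1(ys)
se = boot.std(ddof=1)                          # bootstrap standard error
print(phi_hat, se)
```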

10.
Bootstrap in functional linear regression
We consider the functional linear model with scalar response and functional explanatory variable. One of the most popular methodologies for estimating the model parameter is based on functional principal components analysis (FPCA). In recent literature, weak convergence for a wide class of FPCA-type estimates has been proved, and consequently asymptotic confidence sets can be built. In this paper, we propose an alternative approach to obtain pointwise confidence intervals by means of a bootstrap procedure, for which we establish asymptotic validity. In addition, a simulation study allows us to compare the practical behaviour of asymptotic and bootstrap confidence intervals in terms of coverage rates for different sample sizes.

11.
Missing data are a common problem in almost all areas of empirical research. Ignoring the missing data mechanism, especially when data are missing not at random (MNAR), can result in biased and/or inefficient inference. Because an MNAR mechanism is not verifiable from the observed data, sensitivity analysis is often used to assess it. Current sensitivity analysis methods primarily assume a model for the response mechanism in conjunction with a measurement model and examine sensitivity to the missing data mechanism via the parameters of the response model. Recently, Jamshidian and Mata (Post-modelling sensitivity analysis to detect the effect of missing data mechanism, Multivariate Behav. Res. 43 (2008), pp. 432–452) introduced a new method of sensitivity analysis that does not require the difficult task of modelling the missing data mechanism. In this method, a single measurement model is fitted to all of the data and to a sub-sample of the data. The discrepancy in the parameter estimates obtained from the two data sets is used as a measure of sensitivity to the missing data mechanism. Jamshidian and Mata describe their method mainly in the context of detecting data that are missing completely at random (MCAR). They used a bootstrap-type method, which relies on heuristic input from the researcher, to test for the discrepancy of the parameter estimates. Instead of using the bootstrap, the current article obtains confidence intervals for parameter differences on the two samples based on an asymptotic approximation. Because it does not use the bootstrap, the developed procedure avoids likely convergence problems with bootstrap methods. It does not require heuristic input from the researcher and can be readily implemented in statistical software. The article also discusses methods of obtaining sub-samples that may be used to test missing at random in addition to MCAR.
An application of the developed procedure to a real data set, from the first wave of an ongoing longitudinal study on aging, is presented. Simulation studies are performed as well, using two methods of missing data generation, which show promise for the proposed sensitivity method. One method of missing data generation is also new and interesting in its own right.

12.
A bootstrap-based method to construct 1−α simultaneous confidence intervals for relative effects in the one-way layout is presented. This procedure takes the stochastic correlation between the test statistics into account and results in narrower simultaneous confidence intervals than the application of the Bonferroni correction. Instead of using the bootstrap distribution of a maximum statistic, the coverage of the confidence intervals for the individual comparisons is adjusted iteratively until the overall confidence level is reached. Empirical coverage and power estimates of the introduced procedure for many-to-one comparisons are presented and compared with asymptotic procedures based on the multivariate normal distribution.
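The iterative adjustment can be sketched as follows, using bootstrap replicates of plain many-to-one mean differences as a stand-in for the relative effects treated in the paper: the per-comparison level is tuned by bisection, between the Bonferroni level α/k and the unadjusted level α, until the joint bootstrap coverage reaches 1 − α. All data are simulated.

```python
import numpy as np

rng = np.random.default_rng(3)
alpha = 0.05
k, n = 3, 50                                   # 3 treatments vs one control
control = rng.normal(0.0, 1.0, size=n)
treats = [rng.normal(d, 1.0, size=n) for d in (0.0, 0.3, 0.6)]
diff_hat = np.array([t.mean() - control.mean() for t in treats])

# Bootstrap replicates of the k difference statistics (all groups resampled).
B = 2000
reps = np.empty((B, k))
for b in range(B):
    c = rng.choice(control, size=n, replace=True)
    for j, t in enumerate(treats):
        reps[b, j] = rng.choice(t, size=n, replace=True).mean() - c.mean()

def joint_coverage(c_local):
    """Fraction of bootstrap replicates inside all k percentile intervals
    built at per-comparison level 1 - c_local."""
    lo = np.quantile(reps, c_local / 2, axis=0)
    hi = np.quantile(reps, 1 - c_local / 2, axis=0)
    inside = np.all((reps >= lo) & (reps <= hi), axis=1)
    return inside.mean()

# Bisection on the per-comparison alpha: shrink it until the joint bootstrap
# coverage reaches 1 - alpha; Bonferroni (alpha/k) is the conservative bound.
lo_a, hi_a = alpha / k, alpha
for _ in range(30):
    mid = 0.5 * (lo_a + hi_a)
    if joint_coverage(mid) >= 1 - alpha:
        lo_a = mid                             # can afford a wider local alpha
    else:
        hi_a = mid
local_alpha = lo_a
ci_lo = np.quantile(reps, local_alpha / 2, axis=0)
ci_hi = np.quantile(reps, 1 - local_alpha / 2, axis=0)
print(diff_hat, local_alpha, list(zip(ci_lo, ci_hi)))
```

Because the k statistics share the control resample, they are positively correlated, so the calibrated `local_alpha` typically lands above the Bonferroni level, giving narrower intervals.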

13.
Log-normal linear models are widely used in applications, and it is often of interest to predict the response variable, or to estimate the mean of the response variable, at the original scale for a new set of covariate values. In this paper we consider the problem of efficient estimation of the conditional mean of the response variable at the original scale for log-normal linear models. Several existing estimators are reviewed first, including the maximum likelihood (ML) estimator, the restricted ML (REML) estimator, the uniformly minimum variance unbiased (UMVU) estimator, and a bias-corrected REML estimator. We then propose two estimators that minimize the asymptotic mean squared error and the asymptotic bias, respectively. A parametric bootstrap procedure is also described for obtaining confidence intervals for the proposed estimators. Both the new estimators and the bootstrap procedure are very easy to implement. Comparisons of the estimators via simulation studies suggest that our estimators perform better than the existing ones, and that the bootstrap procedure yields confidence intervals with good coverage properties. A real application, estimating mean sediment discharge, is used to illustrate the methodology.
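A minimal sketch of the back-transformation problem: the naive estimator exp(ȳ) underestimates the original-scale mean, the classical correction exp(ȳ + s²/2) accounts for the log-normal variance term, and a parametric bootstrap gives a percentile interval. The paper's two optimal estimators are not reproduced here; all numbers below are simulated.

```python
import numpy as np

rng = np.random.default_rng(4)
mu, sigma = 1.0, 0.8
n = 200
y = rng.normal(mu, sigma, size=n)              # log-scale data
true_mean = np.exp(mu + sigma**2 / 2)          # E[exp(Y)] for Y ~ N(mu, sigma^2)

m, s2 = y.mean(), y.var(ddof=1)
naive = np.exp(m)                              # plain back-transform, biased low
corrected = np.exp(m + s2 / 2)                 # classical bias-corrected form

# Parametric bootstrap percentile CI for the original-scale mean: simulate
# log-scale samples from the fitted normal, recompute the corrected estimator.
B = 2000
boot = np.empty(B)
for b in range(B):
    yb = rng.normal(m, np.sqrt(s2), size=n)
    boot[b] = np.exp(yb.mean() + yb.var(ddof=1) / 2)
lo, hi = np.quantile(boot, [0.025, 0.975])
print(true_mean, naive, corrected, (lo, hi))
```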

14.
Recent work has shown that the presence of ties between an outcome event and the time that a binary covariate changes or jumps can lead to biased estimates of regression coefficients in the Cox proportional hazards model. One proposed solution is the Equally Weighted method. The coefficient estimate of the Equally Weighted method is defined to be the average of the coefficient estimates of the Jump Before Event method and the Jump After Event method, where these two methods assume that the jump always occurs before or after the event time, respectively. In previous work, the bootstrap method was used to estimate the standard error of the Equally Weighted coefficient estimate. However, the bootstrap approach was computationally intensive and resulted in overestimation. In this article, two new methods for the estimation of the Equally Weighted standard error are proposed. Three alternative methods for estimating both the regression coefficient and the corresponding standard error are also proposed. All the proposed methods are easy to implement. The five methods are investigated using a simulation study and are illustrated using two real datasets.

15.
Nuisance parameter elimination is a central problem in capture–recapture modelling. In this paper, we consider a closed-population capture–recapture model which assumes that the capture probabilities vary only with the sampling occasions. In this model, the capture probabilities are regarded as nuisance parameters and the unknown number of individuals is the parameter of interest. In order to eliminate the nuisance parameters, the likelihood function is integrated with respect to a weight function (uniform or Jeffreys) on the nuisance parameters, resulting in an integrated likelihood function depending only on the population size. For these integrated likelihood functions, analytical expressions for the maximum likelihood estimates are obtained, and it is proved that they are always finite and unique. Variance estimates of the proposed estimators are obtained via a parametric bootstrap resampling procedure. The proposed methods are illustrated on a real data set, and their frequentist properties are assessed by means of a simulation study.
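For the uniform weight function, the integral over each capture probability is a Beta integral with a closed form, so the integrated likelihood of this occasion-varying model can be written down directly. The sketch below maximises it by grid search over the population size; the data are invented, and the likelihood form is the standard one for a closed population with occasion-specific capture probabilities, written from the description above.

```python
import math

def integrated_loglik(N, counts, r):
    """Log integrated likelihood for a closed population with occasion-varying
    capture probabilities, each integrated out under a Uniform(0,1) weight:
    int p^n (1-p)^(N-n) dp = n! (N-n)! / (N+1)!."""
    ll = math.lgamma(N + 1) - math.lgamma(N - r + 1)
    for n_j in counts:
        ll += (math.lgamma(n_j + 1) + math.lgamma(N - n_j + 1)
               - math.lgamma(N + 2))
    return ll

# Toy data: captures per occasion and number of distinct animals seen.
counts = [30, 25, 28, 22]     # n_j: animals caught on occasion j
r = 60                        # distinct individuals observed overall

# Integer grid search for the maximiser over the population size N.
N_hat = max(range(r, 1000), key=lambda N: integrated_loglik(N, counts, r))
print(N_hat)
```

The log-likelihood behaves like (r − Σⱼ nⱼ − k)·log N for large N, which tends to −∞ here, so the maximiser is finite, consistent with the finiteness result stated in the abstract.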

16.
In the nonparametric setting, the standard bootstrap method is based on the empirical distribution function of a random sample. The author proposes, by means of the empirical likelihood technique, an alternative bootstrap procedure under a nonparametric model in which one has some auxiliary information about the population distribution. By proving the almost sure weak convergence of the modified bootstrapped empirical process, the validity of the proposed bootstrap procedure is established. This new result is used to obtain bootstrap confidence bands for the population distribution function and to perform the bootstrap Kolmogorov test in the presence of auxiliary information. Other applications include bootstrapping means and variances with auxiliary information. Three simulation studies are presented to demonstrate the performance of the proposed bootstrap procedure for small samples.

17.
Two new methods for improving prediction regions in the context of vector autoregressive (VAR) models are proposed. These methods, which are based on the bootstrap technique, take into account the uncertainty associated with the estimation of the model order and parameters. In particular, by exploiting an independence property of the prediction error, we introduce a bootstrap procedure that allows for better estimates of the forecasting distribution, in the sense that the variability of its quantile estimators is substantially reduced, without requiring additional bootstrap replications. The proposed methods perform well even when the disturbance distribution is not Gaussian. An application to a real data set is presented.
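A univariate sketch of the basic idea (an AR(1) in place of the paper's VAR, and without the variance-reduction refinement it introduces): the bootstrap forecast distribution combines parameter uncertainty, via re-estimation on residual-bootstrap series, with disturbance uncertainty, via a resampled innovation. All settings are illustrative.

```python
import numpy as np

rng = np.random.default_rng(5)
n, phi = 300, 0.6
eps = rng.normal(size=n + 1)
y = np.empty(n + 1)
y[0] = eps[0]
for t in range(1, n + 1):
    y[t] = phi * y[t - 1] + eps[t]
y, y_future = y[:n], y[n]                     # hold back one observation

# Least-squares AR(1) fit and centred residuals.
phi_hat = np.sum(y[1:] * y[:-1]) / np.sum(y[:-1] ** 2)
resid = y[1:] - phi_hat * y[:-1]
resid = resid - resid.mean()

# Bootstrap the one-step-ahead forecast: re-estimate phi on a residual-
# bootstrap series (parameter uncertainty), then add a resampled innovation
# (disturbance uncertainty).
B = 1000
fc = np.empty(B)
for b in range(B):
    e = rng.choice(resid, size=n, replace=True)
    ys = np.empty(n)
    ys[0] = y[0]
    for t in range(1, n):
        ys[t] = phi_hat * ys[t - 1] + e[t]
    phi_b = np.sum(ys[1:] * ys[:-1]) / np.sum(ys[:-1] ** 2)
    fc[b] = phi_b * y[-1] + rng.choice(resid)
lo, hi = np.quantile(fc, [0.025, 0.975])
print((lo, hi), y_future)
```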

18.
The present study deals with the estimation of the parameters of a k-component load-sharing parallel system model in which each component’s failure time distribution is assumed to be geometric. The maximum likelihood estimates of the load-share parameters, with their standard errors, are obtained. (1 − γ)100% joint, Bonferroni simultaneous and two bootstrap confidence intervals for the parameters are constructed. Further, recognizing that life-testing experiments are time consuming, it seems realistic to consider the load-share parameters to be random variables. Therefore, Bayes estimates, along with their standard errors, are obtained by assuming Jeffreys’ invariant and gamma priors for the unknown parameters. Since the Bayes estimators cannot be found in closed form, Tierney and Kadane’s approximation method has been used to compute the Bayes estimates and standard errors of the parameters. A Markov chain Monte Carlo technique, the Gibbs sampler, is also used to obtain Bayes estimates and highest posterior density credible intervals of the load-share parameters, with the Metropolis–Hastings algorithm used to generate samples from the posterior distributions of the unknown parameters.

19.
A version of the nonparametric bootstrap that resamples entire subjects from the original data, called the case bootstrap, has been increasingly used for estimating the uncertainty of parameters in mixed-effects models. It is usually applied to obtain more robust estimates of the parameters and more realistic confidence intervals (CIs). Alternative bootstrap methods, such as the residual bootstrap and the parametric bootstrap, which resample both random effects and residuals, have been proposed to better take into account the hierarchical structure of multi-level and longitudinal data. However, few studies have compared these different approaches. In this study, we used simulation to evaluate the bootstrap methods proposed for linear mixed-effects models. We also compared the results obtained by maximum likelihood (ML) and restricted maximum likelihood (REML). Our simulation studies showed the good performance of the case bootstrap as well as of the bootstraps of both random effects and residuals. On the other hand, the bootstrap methods that resample only the residuals, and the bootstraps combining cases and residuals, performed poorly. REML and ML provided similar bootstrap estimates of uncertainty, but with ML there was slightly more bias and a poorer coverage rate for variance parameters in the sparse design. We applied the proposed methods to a real dataset from a study investigating the natural evolution of Parkinson's disease and were able to confirm that the methods provide plausible estimates of uncertainty. Given that most real-life datasets tend to exhibit heterogeneity in sampling schedules, the residual bootstraps would be expected to perform better than the case bootstrap. Copyright © 2013 John Wiley & Sons, Ltd.

20.
The asymptotic variance plays an important role in inference using interval estimates of the attributable risk. This paper compares asymptotic variances of the attributable risk estimate obtained via the delta method and via the Fisher information matrix for a 2×2 case–control study, a design chosen for its practical relevance. The expressions of these two asymptotic variance estimates are shown to be equivalent. Because the asymptotic variance usually underestimates the standard error, the bootstrap standard error has also been utilized in constructing interval estimates of the attributable risk and compared with those using asymptotic estimates. A simulation study shows that the bootstrap interval estimate performs well in terms of coverage probability and confidence length. An exact test procedure for testing independence between the risk factor and the disease outcome using the attributable risk is proposed and justified, with real-life examples, for small-sample situations where inference based on the asymptotic variance may not be valid.
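A sketch of the bootstrap interval for a 2×2 case–control table. Here the attributable risk is computed with Miettinen's case-control formula AF = p_c(OR − 1)/OR, where p_c is the exposure prevalence among cases (the paper's exact estimator and variance expressions are not reproduced), and cases and controls are resampled separately, respecting the fixed margins of the design; the table entries are invented.

```python
import numpy as np

rng = np.random.default_rng(6)

def attr_risk(a, b, c, d):
    """Attributable risk for a 2x2 case-control table via Miettinen's formula
    AF = p_c * (OR - 1) / OR; a,b = exposed/unexposed cases,
    c,d = exposed/unexposed controls."""
    odds_ratio = (a * d) / (b * c)
    p_c = a / (a + b)
    return p_c * (odds_ratio - 1.0) / odds_ratio

a, b, c, d = 60, 40, 30, 70                    # illustrative table
ar_hat = attr_risk(a, b, c, d)

# Bootstrap: resample exposure status within cases and within controls
# separately, then take a percentile interval.
B = 2000
n_cases, n_ctrl = a + b, c + d
boot = np.empty(B)
for i in range(B):
    ab = rng.binomial(n_cases, a / n_cases)    # exposed cases in resample
    cd = rng.binomial(n_ctrl, c / n_ctrl)      # exposed controls in resample
    ab = min(max(ab, 1), n_cases - 1)          # guard against empty cells
    cd = min(max(cd, 1), n_ctrl - 1)
    boot[i] = attr_risk(ab, n_cases - ab, cd, n_ctrl - cd)
lo, hi = np.quantile(boot, [0.025, 0.975])
print(ar_hat, (lo, hi))
```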


Copyright © Beijing Qinyun Technology Development Co., Ltd. (北京勤云科技发展有限公司)  京ICP备09084417号