Similar Articles (20 results)
1.

This paper presents a new method to estimate the quantiles of generic statistics by combining the concept of random weighting with importance resampling. The method converts the problem of quantile estimation into a dual problem of tail probability estimation. Random weighting theories are established to calculate the optimal resampling weights for estimating tail probabilities via sequential variance minimization. The quantile estimate is then constructed from the obtained optimal resampling weights. Experimental results on real and simulated data sets demonstrate that the proposed random weighting method can effectively estimate the quantiles of generic statistics.

2.
In this paper, we extend the joint model of a longitudinal biomarker and a recurrent event via a copula function to account for the dependence between the two processes. The central idea in joining the separate processes is to allow the model-specific random effects to come from different distribution families. A main advantage of the proposed method is that the copula construction does not constrain the choice of marginal distributions for the random effects. Maximum likelihood estimation with an importance sampling technique, a simple and easily understood approach, is employed for model inference. To evaluate and verify the validity of the proposed joint model, a bootstrapping method based on model-based resampling is developed. The proposed joint model is also applied to pemphigus disease data to assess the effect of the biomarker trajectory on the risk of recurrence.

3.
This paper presents a goodness-of-fit test for a semiparametric random censorship model proposed by Dikta (1998). The test statistic is derived from a model-based process which is asymptotically Gaussian. In addition to being consistent, the proposed test can detect local alternatives that are distinct from the null hypothesis at rate n^{-1/2}. Due to the intractability of the asymptotic null distribution of the test statistic, we turn to two resampling approximations. We first use the well-known bootstrap method to approximate critical values of the test. We then introduce a so-called random symmetrization method for carrying out the test. Both methods perform very well with samples of moderate size. A simulation study shows that the latter possesses better empirical power and size for small samples.

4.
The sampling-importance resampling (SIR) algorithm aims at drawing a random sample from a target distribution π. First, a sample is drawn from a proposal distribution q, and then from this a smaller sample is drawn with sample probabilities proportional to the importance ratios π/q. We propose here a simple adjustment of the sample probabilities and show that this gives faster convergence. The results indicate that our version also converges better for small sample sizes. The SIR algorithms are compared with the Metropolis–Hastings (MH) algorithm with independent proposals. Although MH converges asymptotically faster, the results indicate that our improved SIR version is better than MH for small sample sizes. We also establish a connection between the SIR algorithms and importance sampling with normalized weights. We show that the use of adjusted SIR sample probabilities as importance weights reduces the bias of the importance sampling estimate.
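The basic SIR scheme described above (without the paper's proposed adjustment) can be sketched as follows; the target and proposal densities here are illustrative choices, not taken from the paper.

```python
import math
import random

def sir_sample(log_target, log_proposal, draw_proposal, n_prop, n_keep, rng):
    """Sampling-importance resampling: draw n_prop points from the
    proposal, then resample n_keep of them with probabilities
    proportional to the importance ratios pi/q."""
    draws = [draw_proposal(rng) for _ in range(n_prop)]
    log_w = [log_target(x) - log_proposal(x) for x in draws]
    m = max(log_w)
    w = [math.exp(lw - m) for lw in log_w]  # subtract max for stability
    total = sum(w)
    probs = [wi / total for wi in w]
    return rng.choices(draws, weights=probs, k=n_keep)

# Illustrative example: target N(0, 1), proposal N(0, 2^2).
rng = random.Random(1)
log_target = lambda x: -0.5 * x * x
log_proposal = lambda x: -0.5 * (x / 2.0) ** 2
sample = sir_sample(log_target, log_proposal,
                    lambda r: r.gauss(0.0, 2.0), 20000, 2000, rng)
mean = sum(sample) / len(sample)
```

The resampled points are approximately distributed according to the target; the quality of the approximation improves with the ratio of proposal draws to kept draws.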

5.
It is well known that when the ranked set sampling (RSS) scheme is employed to estimate the mean of a population, it is more efficient than simple random sampling (SRS) with the same sample size. One can use an RSS analog of the SRS regression estimator to estimate the population mean of Y using its concomitant variable X when the two are linearly related. Unfortunately, the variance of this estimator cannot be evaluated unless the distribution of X is known. We investigate the use of resampling methods to establish confidence intervals for regression estimation of the population mean. Simulation studies show that the proposed methods perform well in a variety of situations when the assumption of linearity holds, and reasonably well under mild non-linearity.

6.
Variance estimation under systematic sampling with probability proportional to size is known to be a difficult problem. We attempt to tackle this problem by the bootstrap resampling method. It is shown that the usual way to bootstrap fails to give satisfactory variance estimates. As a remedy, we propose a double bootstrap method which is based on certain working models and involves two levels of resampling. Unlike existing methods which deal exclusively with the Horvitz–Thompson estimator, the double bootstrap method can be used to estimate the variance of any statistic. We illustrate this within the context of both mean and median estimation. Empirical results based on five natural populations are encouraging.
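For context, an ordinary single-level bootstrap variance estimate for an arbitrary statistic such as the median can be sketched as below; this illustrates the generic resampling idea the abstract builds on, not the paper's double bootstrap or its PPS working models.

```python
import random
import statistics

def bootstrap_variance(data, statistic, n_boot, rng):
    """Estimate the variance of `statistic` by recomputing it on
    n_boot resamples drawn with replacement from the data."""
    reps = []
    for _ in range(n_boot):
        resample = [rng.choice(data) for _ in range(len(data))]
        reps.append(statistic(resample))
    return statistics.variance(reps)

rng = random.Random(0)
data = [rng.gauss(10.0, 2.0) for _ in range(100)]
# works for any statistic, not just the mean
var_mean = bootstrap_variance(data, statistics.mean, 1000, rng)
var_median = bootstrap_variance(data, statistics.median, 1000, rng)
```

Because the statistic is recomputed on each resample, the same routine applies to means, medians, or any other functional of the data, which is the flexibility the double bootstrap retains.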

7.
In treating dynamic systems, sequential Monte Carlo methods use discrete samples to represent a complicated probability distribution and use rejection sampling, importance sampling and weighted resampling to complete the on-line 'filtering' task. We propose a special sequential Monte Carlo method, the mixture Kalman filter, which uses a random mixture of the Gaussian distributions to approximate a target distribution. It is designed for on-line estimation and prediction of conditional and partial conditional dynamic linear models, which are themselves a class of widely used non-linear systems and also serve to approximate many others. Compared with a few available filtering methods including Monte Carlo methods, the gain in efficiency that is provided by the mixture Kalman filter can be very substantial. Another contribution of the paper is the formulation of many non-linear systems into conditional or partial conditional linear form, to which the mixture Kalman filter can be applied. Examples in target tracking and digital communications are given to demonstrate the procedures proposed.

8.
For estimating the distribution of a standardized statistic, the bootstrap estimate is known to be local asymptotic minimax. Various computational techniques have been developed to improve on the simulation efficiency of uniform resampling, the standard Monte Carlo approach to approximating the bootstrap estimate. Two new approaches are proposed which give accurate yet simple approximations to the bootstrap estimate. The second of the approaches even improves the convergence rate of the simulation error. A simulation study examines the performance of these two approaches in comparison with other modified bootstrap estimates.

9.
Resampling methods are proposed to estimate the distributions of sums of m-dependent, possibly differently distributed real-valued random variables. The random variables are allowed to have varying mean values. A nonparametric resampling method based on the moving blocks bootstrap is proposed for the case in which the mean values are smoothly varying or 'asymptotically equal'. The idea is to resample blocks in pairs. It is also confirmed that a 'circular' block resampling scheme can be used in the case where the mean values are 'asymptotically equal'. A central limit resampling theorem for each of the two cases is proved. The resampling methods have a potential application to time series analysis, to distinguish between two different forecasting models. This is illustrated with an example using Swedish export prices of coated paper products.
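The plain moving blocks bootstrap that the pair-resampling scheme builds on can be sketched as follows (without the paired-blocks modification proposed in the paper); the block length and series are illustrative.

```python
import random

def moving_blocks_bootstrap(series, block_len, rng):
    """Resample a series by concatenating randomly chosen overlapping
    blocks of length block_len, so that short-range dependence within
    each block is preserved in the resample."""
    n = len(series)
    n_blocks = (n + block_len - 1) // block_len  # ceiling division
    starts = list(range(n - block_len + 1))       # all overlapping blocks
    out = []
    for _ in range(n_blocks):
        s = rng.choice(starts)
        out.extend(series[s:s + block_len])
    return out[:n]  # trim to the original length

rng = random.Random(42)
series = [rng.gauss(0.0, 1.0) for _ in range(200)]
resampled = moving_blocks_bootstrap(series, block_len=10, rng=rng)
```

Resampling blocks rather than individual observations is what makes the method valid for m-dependent data, where the ordinary bootstrap fails.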

10.
Sequential Monte Carlo methods (also known as particle filters and smoothers) are used for filtering and smoothing in general state-space models. These methods are based on importance sampling. In practice, it is often difficult to find a suitable proposal which allows effective importance sampling. This article develops an original particle filter and an original particle smoother which employ nonparametric importance sampling. The basic idea is to use a nonparametric estimate of the marginally optimal proposal. The proposed algorithms provide a better approximation of the filtering and smoothing distributions than standard methods. The methods’ advantage is most distinct in severely nonlinear situations. In contrast to most existing methods, they allow the use of quasi-Monte Carlo (QMC) sampling. In addition, they do not suffer from weight degeneration rendering a resampling step unnecessary. For the estimation of model parameters, an efficient on-line maximum-likelihood (ML) estimation technique is proposed which is also based on nonparametric approximations. All suggested algorithms have almost linear complexity for low-dimensional state-spaces. This is an advantage over standard smoothing and ML procedures. Particularly, all existing sequential Monte Carlo methods that incorporate QMC sampling have quadratic complexity. As an application, stochastic volatility estimation for high-frequency financial data is considered, which is of great importance in practice. The computer code is partly available as supplemental material.
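For reference, the standard bootstrap particle filter that such methods improve on can be sketched as follows; the linear-Gaussian state-space model and its parameters are illustrative assumptions, not from the paper.

```python
import math
import random

def bootstrap_particle_filter(obs, n_particles, rng,
                              phi=0.9, sigma=0.5, tau=0.3):
    """Bootstrap particle filter for the illustrative model
    x_t = phi * x_{t-1} + N(0, sigma^2),  y_t = x_t + N(0, tau^2).
    The transition prior serves as the importance proposal, and
    multinomial resampling counters weight degeneracy."""
    particles = [rng.gauss(0.0, 1.0) for _ in range(n_particles)]
    means = []
    for y in obs:
        # propagate particles through the transition prior
        particles = [phi * x + rng.gauss(0.0, sigma) for x in particles]
        # weight by the observation likelihood (up to a constant)
        log_w = [-0.5 * ((y - x) / tau) ** 2 for x in particles]
        m = max(log_w)
        w = [math.exp(lw - m) for lw in log_w]
        total = sum(w)
        means.append(sum(x * wi for x, wi in zip(particles, w)) / total)
        # resample to equalize the weights
        particles = rng.choices(particles, weights=w, k=n_particles)
    return means

rng = random.Random(7)
# simulate a short series from the same model
x, obs = 0.0, []
for _ in range(50):
    x = 0.9 * x + rng.gauss(0.0, 0.5)
    obs.append(x + rng.gauss(0.0, 0.3))
filt = bootstrap_particle_filter(obs, 500, rng)
```

Using the transition prior as proposal is exactly the choice that can be poor in severely nonlinear models, which motivates the nonparametric proposal estimate described in the abstract.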

11.
Alternative methods of estimating properties of unknown distributions include the bootstrap and the smoothed bootstrap. In the standard bootstrap setting, Johns (1988) introduced an importance resampling procedure that yields a more accurate approximation to the bootstrap estimate of a distribution function or a quantile. With a suitable "exponential tilting" similar to that used by Johns, we derive a smoothed version of importance resampling in the framework of the smoothed bootstrap. Smoothed importance resampling procedures are developed for estimating the distribution functions of the Studentized mean, the Studentized variance, and the correlation coefficient. Implementation of these procedures is illustrated via simulation results, which concentrate on estimating the distribution functions of the Studentized mean and Studentized variance for different sample sizes and various pre-specified smoothing bandwidths for normal data. Additional simulations were conducted for estimating quantiles of the distribution of the Studentized mean under an optimal smoothing bandwidth when the original data were simulated from three different parent populations: lognormal, t(3) and t(10). These results suggest that in cases where it is advantageous to use the smoothed bootstrap rather than the standard bootstrap, the amount of resampling necessary may be substantially reduced by importance resampling methods, with efficiency gains that depend on the bandwidth used in the kernel density estimation.

12.
In this article, a non-iterative posterior sampling algorithm is proposed for the linear quantile regression model based on the asymmetric Laplace distribution. The algorithm combines the inverse Bayes formulae, sampling/importance resampling, and the expectation maximization (EM) algorithm to obtain approximately independent and identically distributed samples from the observed posterior distribution, which eliminates the convergence problems of iterative Gibbs sampling and overcomes the difficulty of evaluating standard errors in the EM algorithm. Numerical results from simulations and an application to the classical Engel data show that the non-iterative sampling algorithm is more effective than the Gibbs sampling and EM algorithms.

13.
Two new implementations of the EM algorithm are proposed for maximum likelihood fitting of generalized linear mixed models. Both methods use random (independent and identically distributed) sampling to construct Monte Carlo approximations at the E-step. One approach involves generating random samples from the exact conditional distribution of the random effects (given the data) by rejection sampling, using the marginal distribution as a candidate. The second method uses a multivariate t importance sampling approximation. In many applications the two methods are complementary. Rejection sampling is more efficient when sample sizes are small, whereas importance sampling is better with larger sample sizes. Monte Carlo approximation using random samples allows the Monte Carlo error at each iteration to be assessed by using standard central limit theory combined with Taylor series methods. Specifically, we construct a sandwich variance estimate for the maximizer at each approximate E-step. This suggests a rule for automatically increasing the Monte Carlo sample size after iterations in which the true EM step is swamped by Monte Carlo error. In contrast, techniques for assessing Monte Carlo error have not been developed for use with alternative implementations of Monte Carlo EM algorithms utilizing Markov chain Monte Carlo E-step approximations. Three different data sets, including the infamous salamander data of McCullagh and Nelder, are used to illustrate the techniques and to compare them with the alternatives. The results show that the methods proposed can be considerably more efficient than those based on Markov chain Monte Carlo algorithms. However, the methods proposed may break down when the intractable integrals in the likelihood function are of high dimension.

14.
Bootstrap methods are proposed for estimating sampling distributions and associated statistics for regression parameters in multivariate survival data. We use an Independence Working Model (IWM) approach, fitting margins independently, to obtain consistent estimates of the parameters in the marginal models. Resampling procedures, however, are applied to an appropriate joint distribution to estimate covariance matrices, make bias corrections, and construct confidence intervals. The proposed methods allow for fixed or random explanatory variables, the latter case using extensions of existing resampling schemes (Loughin, 1995), and they permit the possibility of random censoring. An application is shown for the viral positivity time data previously analyzed by Wei, Lin, and Weissfeld (1989). A simulation study of small-sample properties shows that the proposed bootstrap procedures provide substantial improvements in variance estimation over the robust variance estimator commonly used with the IWM. This revised version was published online in July 2006 with corrections to the Cover Date.

15.
The bootstrap is a computer-intensive method originally devoted mainly to estimating standard deviations, confidence intervals, and the bias of the statistic under study. The technique is useful in a wide variety of statistical procedures; however, its use for hypothesis testing when the data structure is complex is not straightforward, and each case must be treated individually. A general bootstrap method for hypothesis testing is studied. The method preserves the data structure of each group independently, and the null hypothesis is used only to compute the bootstrap statistic values (not in the resampling, as is usual). The asymptotic distribution is developed and several case studies are discussed.
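The idea of resampling each group separately and invoking the null hypothesis only when computing the bootstrap statistic can be sketched as below for a two-sample comparison of means; this is a generic illustration under that resampling principle, not the paper's general procedure.

```python
import random
import statistics

def bootstrap_two_sample_test(x, y, n_boot, rng):
    """Bootstrap test of equal means that resamples each group
    independently (preserving the group structure) and uses the null
    hypothesis only to center the bootstrap statistic."""
    obs = statistics.mean(x) - statistics.mean(y)
    count = 0
    for _ in range(n_boot):
        xb = [rng.choice(x) for _ in range(len(x))]
        yb = [rng.choice(y) for _ in range(len(y))]
        # center at the observed difference so the bootstrap statistic
        # mimics the null distribution of the test statistic
        stat = (statistics.mean(xb) - statistics.mean(yb)) - obs
        if abs(stat) >= abs(obs):
            count += 1
    return count / n_boot  # approximate two-sided p-value

rng = random.Random(5)
x = [rng.gauss(0.0, 1.0) for _ in range(60)]
y = [rng.gauss(1.0, 1.0) for _ in range(60)]
p_equal_means = bootstrap_two_sample_test(x, y, 2000, rng)
```

Note that the null hypothesis never alters how the groups are resampled, only how the bootstrap statistic is referenced against the observed one.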

16.
For m-dependent, identically distributed random observations, the bootstrap method provides inconsistent estimators of the distribution and variance of the sample mean. This paper proposes an alternative resampling procedure. For estimating the distribution and variance of a function of the sample mean, the proposed resampling estimators are shown to be strongly consistent.

17.
We consider variable acceptance sampling plans that control the lot or process fraction defective, where a specification limit defines acceptable quality. The problem is to find a sampling plan that fulfils some conditions, usually on the operating characteristic. Its calculation depends heavily on distributional properties that, in practice, might be doubtful. If prior data are already available, we propose to estimate the sampling plan by means of bootstrap methods. The bias and standard error of the estimated plan can easily be assessed by Monte Carlo approximation to the respective bootstrap moments. This resampling approach does not require strong assumptions and, furthermore, is a flexible method that can be extended to any statistic that might be informative for the fraction defective in a lot.

18.
This study investigates the influences of additive outliers on financial durations. An outlier test statistic and an outlier detection procedure are proposed to detect and estimate outlier effects for the logarithmic Autoregressive Conditional Duration (Log-ACD) model. The proposed test statistic has an exact sampling distribution and performs very well, in terms of size and power, in a series of Monte Carlo simulations. Furthermore, the test statistic is robust to several alternative distribution assumptions. An empirical application shows that parameter estimates without considering outliers tend to be biased.

19.
The Metropolis–Hastings algorithm is one of the most basic and well-studied Markov chain Monte Carlo methods. It generates a Markov chain whose limit distribution is the target distribution by simulating observations from a different proposal distribution. A proposed value is accepted with a particular probability; otherwise the previous value is repeated. As a consequence, the accepted values are repeated a positive number of times, and thus any resulting ergodic mean is, in fact, a weighted average. It turns out that this weighted average is an importance sampling-type estimator with random weights. By the standard theory of importance sampling, replacement of these random weights by their (conditional) expectations leads to more efficient estimators. In this paper we study the estimator arising from replacing the random weights with certain estimators of their conditional expectations. We illustrate by simulations that it is often more efficient than the original estimator, while in the case of the independence Metropolis–Hastings algorithm and for distributions with finite support we formally prove that it is even better than the "optimal" importance sampling estimator.
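The independence Metropolis–Hastings sampler discussed above can be sketched as follows; the target and proposal densities are illustrative choices, and the acceptance ratio reduces to a ratio of importance weights, which is what links the accepted-value repetitions to importance sampling.

```python
import math
import random

def independence_mh(log_target, log_proposal, draw_proposal, n_iter, rng):
    """Independence Metropolis-Hastings: proposals are drawn i.i.d.
    from a fixed distribution, and the acceptance ratio is the ratio
    of importance weights w = pi/q at the proposed and current values.
    Rejections repeat the current value, so ergodic means are in
    effect weighted averages."""
    x = draw_proposal(rng)
    lw = log_target(x) - log_proposal(x)  # log importance weight at x
    chain = []
    for _ in range(n_iter):
        y = draw_proposal(rng)
        lw_y = log_target(y) - log_proposal(y)
        if math.log(rng.random()) < lw_y - lw:  # accept with prob min(1, w_y/w_x)
            x, lw = y, lw_y
        chain.append(x)
    return chain

rng = random.Random(3)
# Illustrative: target N(1, 1), proposal N(0, 2^2)
chain = independence_mh(lambda v: -0.5 * (v - 1.0) ** 2,
                        lambda v: -0.5 * (v / 2.0) ** 2,
                        lambda r: r.gauss(0.0, 2.0), 20000, rng)
post_mean = sum(chain[2000:]) / len(chain[2000:])
```

Counting how many times each accepted value repeats recovers exactly the random weights that the abstract proposes to replace by their conditional expectations.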

20.
In this article, a non-iterative sampling algorithm is developed to obtain approximately independent and identically distributed samples from the posterior distribution of the parameters in the Laplace linear regression model. By combining the inverse Bayes formulae, sampling/importance resampling, and the expectation maximization algorithm, the algorithm eliminates the need for the convergence diagnostics required by iterative Gibbs sampling, and the samples it generates can be used for inference immediately. Simulations are conducted to illustrate the robustness and effectiveness of the algorithm. Finally, real data are studied to show the usefulness of the proposed methodology.
