Similar Documents
20 similar documents found (search time: 217 ms).
1.
This study considers the problem of testing for a parameter change in integer-valued time series models in which the conditional density of the current observations is assumed to follow a Poisson distribution. As a test, we consider the CUSUM of squares test based on the residuals from INGARCH models and find that the test converges weakly to the supremum of a Brownian bridge. A simulation study demonstrates its superiority to the residual and standardized residual-based CUSUM tests of Kang and Lee [Parameter change test for Poisson autoregressive models. Scand J Statist. 2014;41:1136–1152] and Lee and Lee [CUSUM tests for general nonlinear integer-valued GARCH models: comparison study. Ann Inst Stat Math. 2019;71:1033–1057], as well as to the CUSUM of squares test based on standardized residuals.
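For readers wanting to experiment, a minimal Python sketch of the residual-based CUSUM-of-squares statistic follows. This is an illustration, not the authors' code: residuals from a fitted Poisson INGARCH model are assumed to be supplied, and the normalizer `tau_hat` is the plain sample standard deviation of the squared residuals, which may differ from the estimator used in the paper. The 95% critical value 1.358 is the standard quantile of the supremum of a Brownian bridge.

```python
import numpy as np

def cusum_of_squares(resid, crit=1.358):
    """CUSUM-of-squares change-point statistic for a residual series.

    Computes max_k |S_k - (k/n) S_n| / (sqrt(n) * tau_hat), with S_k the
    partial sums of squared residuals; under no change this converges to
    the supremum of a Brownian bridge (95% critical value ~ 1.358).
    """
    sq = np.asarray(resid, dtype=float) ** 2
    n = sq.size
    partial = np.cumsum(sq)                              # S_k
    drift = np.arange(1, n + 1) / n * partial[-1]        # (k/n) * S_n
    tau_hat = sq.std()                                   # simplified iid normalizer
    stat = np.max(np.abs(partial - drift)) / (np.sqrt(n) * tau_hat)
    k_hat = int(np.argmax(np.abs(partial - drift))) + 1  # estimated change location
    return stat, k_hat, stat > crit
```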

2.
Scale mixtures of normal distributions form a class of symmetric thick-tailed distributions that includes the normal distribution as a special case. In this paper we consider local influence analysis for measurement error models (MEM) in which the random error and the unobserved value of the covariates jointly follow scale mixtures of normal distributions, providing an appealing robust alternative to the usual Gaussian process in measurement error models. To avoid difficulties in estimating the parameter of the mixing variable, we fix it in advance, as recommended by Lange et al. (J Am Stat Assoc 84:881–896, 1989) and Berkane et al. (Comput Stat Data Anal 18:255–267, 1994). The local influence method is used to assess the robustness of the parameter estimates under some usual perturbation schemes. However, since the observed log-likelihood associated with this model involves some integrals, Cook's well-known approach may be hard to apply to obtain measures of local influence. Instead, we develop local influence measures following the approach of Zhu and Lee (J R Stat Soc Ser B 63:121–126, 2001), which is based on the EM algorithm. Results obtained from a real data set are reported, illustrating the usefulness of the proposed methodology, its relative simplicity, adaptability, and practical usage.

3.
A Bayesian approach is considered for the interval estimation of a binomial proportion in doubly sampled data. The coverage probability and the expected width of the Bayesian confidence interval are compared with likelihood-related confidence intervals. It is shown that a hierarchical Bayesian approach provides relatively simple and effective confidence intervals. In addition, it is shown that the Agresti–Coull type confidence interval discussed by Lee and Choi (2009) can be justified within the Bayesian framework.
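The Agresti–Coull interval mentioned above is easy to state for the ordinary (singly sampled) binomial case: add z²/2 pseudo-successes and z²/2 pseudo-failures, then apply the Wald formula. A small Python sketch, assuming plain binomial data (the doubly sampled adjustment of Lee and Choi (2009) is not shown):

```python
import math
from scipy.stats import norm

def agresti_coull(x, n, conf=0.95):
    """Agresti-Coull interval for a binomial proportion (ordinary sampling).

    Adds z^2/2 pseudo-successes and z^2/2 pseudo-failures before applying
    the Wald formula; endpoints are clipped to [0, 1].
    """
    z = norm.ppf(1 - (1 - conf) / 2)
    n_tilde = n + z ** 2
    p_tilde = (x + z ** 2 / 2) / n_tilde
    half = z * math.sqrt(p_tilde * (1 - p_tilde) / n_tilde)
    return max(0.0, p_tilde - half), min(1.0, p_tilde + half)
```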

4.
For defining a Modified Maximum Likelihood Estimate of the scale parameter of the Rayleigh distribution, a hyperbolic approximation is used instead of a linear approximation for a function that appears in the maximum likelihood equation. This estimate is shown to perform better, in the sense of accuracy and simplicity of calculation, than the one based on a linear approximation of the same function. The resulting estimate of the scale parameter is also shown to be asymptotically unbiased. Numerical computations for random samples of different sizes from the Rayleigh distribution, using type II censoring, show the estimate to be better than that obtained by Lee et al. (1980).

5.
This paper describes procedures for estimating the mean number of individuals possessing a rare sensitive attribute using the Mangat (1992) randomized response device, when the population consists of clusters and, further, when the population is stratified with clusters within each stratum. Unbiased estimation procedures for the mean number of individuals are discussed, and their properties are described when the parameter of a rare unrelated attribute is assumed to be known and when it is unknown. An empirical study is carried out to show the dominance of the proposed estimator over the Lee et al. (2013) estimator.

6.
In this paper, we study robust estimation of the order of a hidden Markov model (HMM) based on a penalized minimum density power divergence estimator, which is obtained by utilizing the finite mixture marginal distribution of the HMM. For this task, we adopt the locally conic parametrization method used in [D. Dacunha-Castelle and E. Gassiat, Testing in locally conic models and application to mixture models, ESAIM Probab. Stat. (1997), pp. 285–317; D. Dacunha-Castelle and E. Gassiat, Testing the order of a model using locally conic parametrization: population mixtures and stationary ARMA processes, Ann. Statist. 27 (1999), pp. 1178–1209; T. Lee and S. Lee, Robust and consistent estimation of the order of finite mixture models based on minimizing a density power divergence estimator, Metrika 68 (2008), pp. 365–390] to avoid the difficulties that arise in handling mixture marginal models, such as the non-identifiability of the parameter space and the singularity problem with the asymptotic variance. We verify that the estimated order is consistent, and simulation results are provided for illustration.
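The density power divergence at the heart of this estimator has a simple empirical form: H_n(θ) = ∫ f_θ^{1+α} dx − (1 + 1/α) n^{-1} Σ_i f_θ(X_i)^α, with α → 0 recovering maximum likelihood. A hedged Python sketch of this criterion for a two-component normal mixture marginal follows; the order penalty and the locally conic parametrization used in the paper are not shown, and the function names are illustrative only.

```python
import numpy as np
from scipy.stats import norm
from scipy.integrate import quad

def dpd_objective(data, pdf, alpha=0.5):
    """Empirical density power divergence criterion H_n for a candidate
    density `pdf`; the MDPDE minimizes this over the model parameters.
    alpha > 0 trades efficiency for robustness (alpha -> 0 gives ML)."""
    integral, _ = quad(lambda x: pdf(x) ** (1 + alpha), -np.inf, np.inf)
    empirical = np.mean(pdf(np.asarray(data)) ** alpha)
    return integral - (1 + 1 / alpha) * empirical

def mixture_pdf(x, w=0.5, mu1=0.0, mu2=3.0, sd=1.0):
    """Two-component normal mixture, e.g. the marginal of a two-state HMM."""
    return w * norm.pdf(x, mu1, sd) + (1 - w) * norm.pdf(x, mu2, sd)

# e.g. dpd_objective(sample, lambda x: mixture_pdf(x, w, m1, m2, s), alpha=0.5)
```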

7.
The problem of testing for a parameter change has been a core issue in time series analysis. It is well known that the estimate-based CUSUM test often suffers from severe size distortions in general GARCH-type models. The residual-based CUSUM test has been used as an alternative, which, however, has the defect of being unable to detect ARMA parameter changes in ARMA–GARCH models. As a remedy, one can employ the score vector-based CUSUM test in ARMA–GARCH models, as in Oh and Lee (0000). However, it shows some size distortions for relatively small samples. Hence, we consider its bootstrap counterpart in order to obtain a more stable test. Focus is placed on verifying the weak consistency of the proposed test. An empirical study is presented for its evaluation.

8.
In a recent paper in this journal, Lee, Kapadia and Brock (1980) developed maximum likelihood (ML) methods for estimating the scale parameter of the Rayleigh distribution from doubly censored samples. They reported convergence difficulties in attempting to solve the nonlinear likelihood equation (LE) numerically. To mitigate these difficulties, they employed approximations to simplify the LE, but found that the solution of the resulting simplified equation can give rise to parameter estimates of erratic accuracy. We show that the use of approximations to simplify the LE is unnecessary. In fact, under a suitable parametric transformation, the log-likelihood function is strictly concave, and the ML estimate always exists and is unique and finite. Furthermore, the LE is easy to solve numerically. A numerical example is given to illustrate the computations involved.
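The point about numerical solution is easy to illustrate: with θ = log σ, the censored Rayleigh log-likelihood is a smooth one-dimensional function that a standard optimizer handles directly. A minimal sketch (my own illustration, not the paper's code, assuming type II double censoring with r lower and s upper observations censored):

```python
import numpy as np
from scipy.optimize import minimize_scalar

def rayleigh_mle_doubly_censored(obs, r, s):
    """Numerical MLE of the Rayleigh scale sigma from a doubly (type II)
    censored sample. `obs` holds the observed order statistics
    x_(r+1) <= ... <= x_(n-s); optimizing over theta = log(sigma)
    keeps sigma positive."""
    obs = np.sort(np.asarray(obs, dtype=float))
    lo, hi = obs[0], obs[-1]

    def negloglik(theta):
        sig2 = np.exp(2 * theta)
        # observed part: log f(x) = log x - log sigma^2 - x^2 / (2 sigma^2)
        ll = np.sum(np.log(obs) - np.log(sig2) - obs ** 2 / (2 * sig2))
        if r > 0:                              # r values censored below obs[0]
            ll += r * np.log1p(-np.exp(-lo ** 2 / (2 * sig2)))
        if s > 0:                              # s values censored above obs[-1]
            ll += s * (-hi ** 2 / (2 * sig2))  # Rayleigh log-survival
        return -ll

    res = minimize_scalar(negloglik)           # smooth 1-D problem
    return float(np.exp(res.x))
```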

9.
Lee and Carter proposed in 1992 the non-linear model $m_{x,t} = \exp(a_x + b_x k_t + \varepsilon_{x,t})$ for fitting and forecasting age-specific mortality rates at age $x$ and time $t$. For parameter estimation, they employed the singular value decomposition method to find a least squares solution. However, the singular value decomposition algorithm does not provide the standard errors of the estimated parameters, making it impossible to assess their accuracy. This article describes the Lee-Carter model and the technical procedures used to fit and extrapolate it. To estimate the precision of the parameter estimates of the Lee-Carter model, we propose a binomial framework in which point estimates are obtained by the maximum likelihood approach and interval estimates by a bootstrap approach. This model is used to fit mortality data in England and Wales from 1951 to 1990 and to forecast mortality change from 1991 to 2020. The Lee-Carter model fits these mortality data very well, with $R^2 = 0.9980$. The estimated overall age pattern of mortality $a_x$ is very robust, whereas there is considerable uncertainty in $b_x$ (changes in the age pattern over time) and $k_t$ (overall change in mortality). The fitted log age-specific mortality rates declined linearly from 1951 to 1990 at different paces, and the projected rates continue this decline over the 30-year prediction period.
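The SVD fitting step that the article describes is compact enough to sketch. Below is a minimal Python illustration of the first-order SVD fit under the usual identification constraints (Σ b_x = 1, Σ k_t = 0); the binomial/bootstrap machinery the authors propose for standard errors is not shown.

```python
import numpy as np

def fit_lee_carter(log_m):
    """First-order SVD fit of log m_{x,t} = a_x + b_x * k_t + eps_{x,t}.

    `log_m` is an (ages x years) array of log death rates. Row-centring
    makes k_t sum to zero; b_x is rescaled to sum to one.
    """
    a = log_m.mean(axis=1)                    # a_x: average log rate per age
    Z = log_m - a[:, None]                    # centre each age over time
    U, S, Vt = np.linalg.svd(Z, full_matrices=False)
    b, k = U[:, 0], S[0] * Vt[0, :]
    scale = b.sum()
    return a, b / scale, k * scale            # (a_x, b_x, k_t)

# k_t is then usually forecast with a random walk with drift:
# drift = (k[-1] - k[0]) / (len(k) - 1)
```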

10.
Meeden and Lee [More efficient inferences using ranking information obtained from judgment sampling. J Surv Stat Methodol. 2014;2:38–57] recently showed that one can improve upon the standard unbiased mean estimator for judgment post-stratification (JPS) by using the ordering information in the sample. We propose an alternative mean estimator that uses the same information. This estimator is far simpler to compute than that of Meeden and Lee (2014), and we show through simulations that it typically outperforms their estimator when the rankings are sufficiently good for JPS to be useful.
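For context, the standard unbiased JPS estimator that both papers improve upon averages the within-rank-class sample means. A minimal sketch (illustrative only; the proposed alternate estimator, which exploits the ordering of the class means, is not reproduced here):

```python
import numpy as np

def jps_mean(y, ranks, H):
    """Standard JPS mean estimator: the simple average of the sample means
    of the nonempty judgment rank classes 1..H."""
    y, ranks = np.asarray(y, dtype=float), np.asarray(ranks)
    class_means = [y[ranks == h].mean() for h in range(1, H + 1)
                   if np.any(ranks == h)]
    return float(np.mean(class_means))
```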

11.
In this paper, we study the Jarque–Bera (JB) normality test for the innovations of ARMA–GARCH models, where the test is constructed from the residuals. The validity of the JB test for ARMA–GARCH innovations should be investigated carefully before it is used in practice, since a residual-based test may behave differently depending on the structure of the time series model and the form of the test statistic (cf. Chen and Kuan, 2003; Hwang and Baek, 2009; Lee and Wei, 1999). To demonstrate the validity of the JB test, we prove that the asymptotic distribution of its original form is identical to that of the test statistic based on the true errors under mild conditions. Simulation results are provided for illustration.
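The test statistic in question has the familiar closed form JB = (n/6)(S² + (K − 3)²/4), with S and K the sample skewness and kurtosis of the residuals, and is asymptotically χ²(2) under normal innovations. A short Python sketch (applied to residuals supplied by any fitted ARMA–GARCH routine):

```python
import numpy as np
from scipy.stats import chi2

def jarque_bera(resid):
    """Jarque-Bera statistic JB = n/6 * (S^2 + (K - 3)^2 / 4) on a
    (standardized) residual series; returns (JB, asymptotic p-value)."""
    e = np.asarray(resid, dtype=float)
    e = e - e.mean()
    n, m2 = e.size, np.mean(e ** 2)
    S = np.mean(e ** 3) / m2 ** 1.5        # sample skewness
    K = np.mean(e ** 4) / m2 ** 2          # sample kurtosis
    jb = n / 6 * (S ** 2 + (K - 3) ** 2 / 4)
    return jb, chi2.sf(jb, df=2)
```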

12.
Lee et al. (2016) proposed a nonparametric estimator of the joint distribution of the gap time between transplant and the first infection and the subsequent gap times between consecutive infections. In this article, we propose an alternative estimator based on the inverse-probability weighted (IPW) approach. Asymptotic properties of the proposed estimator are established. Simulation results indicate that the IPW estimator performs as well as the estimator proposed by Lee et al. We also propose an IPW estimator of the joint distribution function of the gap times between consecutive recurrent events beyond the first episode.

13.
Inference, quantile forecasting and model comparison for an asymmetric double smooth transition heteroskedastic model are investigated. A Bayesian framework is employed, and an adaptive Markov chain Monte Carlo scheme is designed. A mixture prior is proposed that alleviates the usual identifiability problem as the speed-of-transition parameter tends to zero, and an informative prior for this parameter is suggested that allows for reliable inference and a proper posterior despite the non-integrability of the likelihood function. A formal Bayesian posterior model comparison procedure is employed to compare the proposed model with its two limiting cases: the double threshold GARCH and symmetric ARX GARCH models. The proposed methods are illustrated using both simulated and international stock market return series. Some illustrations of the advantages of an adaptive sampling scheme for these models are also provided. Finally, Bayesian forecasting methods are employed in a Value-at-Risk study of the international return series. The results generally favour the proposed smooth transition model and highlight explosive and smooth nonlinear behaviour in financial markets.

14.
Haibing (2009) proposed a procedure for successive comparisons between ordered treatment effects in a one-way layout and showed that it has greater power than the procedure of Lee and Spurrier (1995). The critical constants required for the procedure were estimated using Monte Carlo simulation, and only a few values of the constants were tabulated, which limits the procedure's applications. In this article, a numerical method using recursive integration is discussed for computing the critical constants; it works efficiently for a large number of treatments, and extensive values of the critical constants are tabulated for the use of practitioners. A power comparison of Haibing's and Lee and Spurrier's procedures is also provided.

15.
The conditional specification technique introduced by Arnold et al. (Conditional specification of statistical models. Springer Series in Statistics. Springer, New York, 1999) was used in Sarabia et al. (Astin Bull 34(1):85–98, 2004) to obtain bonus-malus premiums. A Poisson distribution whose parameter is a function of the classical structure parameter was used, and a new class of prior distributions appeared in a natural way. This model contains, as a particular case, the classical compound Poisson model. In the present paper, the Bayesian robustness of this new model is examined, and the model is found to be much more robust than the classical model of Gómez et al. (Insur Math Econ 31:105–113, 2002). For the present study, moment conditions on the prior distribution are required. Examples with real data are given to illustrate our ideas under the net and exponential premium principles.

16.
Every hedonic price index is an estimate of an unknown economic parameter. It depends, in practice, on one or more random samples of prices and characteristics of a certain good. Bootstrap resampling methods provide a tool for quantifying sampling errors. Following some general reflections on hedonic elementary price indices, this paper proposes a case-based, a model-based, and a wild bootstrap approach for estimating confidence intervals for hedonic price indices. Empirical results are obtained for a data set on used cars in Switzerland. A simple and an enhanced adaptive semi-logarithmic model are fit to monthly samples, and bootstrap confidence intervals are estimated for Jevons-type hedonic elementary price indices.
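A case-based bootstrap of the kind described resamples (price, characteristics) observations within each period and refits the hedonic regression each time. The sketch below assumes a simple time-dummy semi-logarithmic specification, where exp of the time-dummy coefficient plays the role of a Jevons-type hedonic index; the model-based and wild bootstrap variants are analogous but resample residuals instead.

```python
import numpy as np

def hedonic_index_ci(X0, y0, X1, y1, B=999, conf=0.95, seed=None):
    """Case-based bootstrap CI for a semi-log time-dummy hedonic index.

    X0, y0 / X1, y1: characteristics arrays and log prices for the base
    and comparison period; returns the point index and a percentile CI.
    """
    rng = np.random.default_rng(seed)

    def fit_index(A0, p0, A1, p1):
        n0, n1 = len(p0), len(p1)
        X = np.vstack([np.c_[A0, np.zeros(n0)], np.c_[A1, np.ones(n1)]])
        X = np.c_[np.ones(n0 + n1), X]                # add intercept
        beta, *_ = np.linalg.lstsq(X, np.r_[p0, p1], rcond=None)
        return np.exp(beta[-1])                       # exp(time-dummy coefficient)

    point = fit_index(X0, y0, X1, y1)
    boot = np.empty(B)
    for b in range(B):                                # resample cases per period
        i0 = rng.integers(0, len(y0), len(y0))
        i1 = rng.integers(0, len(y1), len(y1))
        boot[b] = fit_index(X0[i0], y0[i0], X1[i1], y1[i1])
    lo, hi = np.percentile(boot, [100 * (1 - conf) / 2, 100 * (1 + conf) / 2])
    return point, (lo, hi)
```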

17.
Lee and Krutchkoff [Biometrics, 36, 1980, 531–536] introduced partially truncated distributions and derived their mean and variance, with an application to a contaminant data set. Here, we derive moments of general order for partially truncated distributions. We revisit the data application of Lee and Krutchkoff to show that the conclusions can differ for different orders.

18.
The posterior predictive p value (ppp) was invented as a Bayesian counterpart to classical p values. The methodology can be applied to discrepancy measures involving both data and parameters and can, hence, be targeted to check various modeling assumptions. The interpretation can, however, be difficult, since the distribution of the ppp value under the modeling assumptions varies substantially between cases. A calibration procedure has been suggested, treating the ppp value as a test statistic in a prior predictive test. In this paper, we suggest that a prior predictive test may instead be based on the expected posterior discrepancy, which is somewhat simpler, both conceptually and computationally. Since both of these methods require the simulation of a large posterior parameter sample for each of an equally large prior predictive data sample, we furthermore suggest looking for ways to match the given discrepancy by a computation-saving conflict measure. This approach is also based on simulations but only requires sampling from two different distributions representing two contrasting information sources about a model parameter. The conflict measure methodology is also more flexible in that it handles non-informative priors without difficulty. We compare the different approaches theoretically in some simple models and in a more complex applied example.
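Operationally, the ppp value is the share of posterior draws for which the discrepancy of replicated data exceeds that of the observed data. A minimal generic sketch (the calibration, expected-posterior-discrepancy, and conflict-measure refinements discussed in the paper are not shown; `simulate` and `discrepancy` are user-supplied callables):

```python
import numpy as np

def ppp_value(y_obs, posterior_draws, simulate, discrepancy, seed=None):
    """Posterior predictive p-value: fraction of posterior draws theta for
    which D(y_rep, theta) >= D(y_obs, theta), with y_rep ~ p(y | theta)."""
    rng = np.random.default_rng(seed)
    exceed = 0
    for theta in posterior_draws:
        y_rep = simulate(theta, rng)          # replicate data under theta
        exceed += discrepancy(y_rep, theta) >= discrepancy(y_obs, theta)
    return exceed / len(posterior_draws)
```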

19.
A recent theorem by Hannig and Lee on consistency of their estimator of Kullback–Leibler discrepancy is re-proved under assumptions suitably modified to correct a fault in the original proof.

20.
In this article, we analyze the performance of five estimation methods for the long memory parameter d. The goal is to construct a wavelet estimate of the fractional differencing parameter in nonstationary long memory processes that dominates the well-known estimate of Shimotsu and Phillips (2005). The simulation results show that the wavelet estimation method of Lee (2005), combined with several tapering techniques, performs better in most nonstationary long memory cases. The comparison is based on the empirical root mean squared error of each estimate.
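A generic wavelet log-regression estimator of d (in the Abry–Veitch spirit, without the tapering refinements of Lee (2005)) regresses the log2 of the per-octave detail-coefficient energy on the octave index; for a spectrum behaving like |λ|^{−2d} near zero, the slope is roughly 2d. A hedged sketch using PyWavelets:

```python
import numpy as np
import pywt  # PyWavelets

def wavelet_d_estimate(x, wavelet="db4", jmin=2, jmax=None):
    """Log-regression wavelet estimate of the memory parameter d:
    slope of log2(mean d_{j,k}^2) against octave j, divided by 2."""
    coeffs = pywt.wavedec(np.asarray(x, dtype=float), wavelet)
    details = coeffs[1:][::-1]                 # details[0] = finest octave j=1
    jmax = jmax or len(details)
    js = list(range(jmin, jmax + 1))           # skip the finest octave(s)
    log_energy = [np.log2(np.mean(details[j - 1] ** 2)) for j in js]
    slope = np.polyfit(js, log_energy, 1)[0]
    return slope / 2.0
```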
