Similar Articles
20 similar articles retrieved.
1.
Given a fractionally integrated autoregressive moving average, ARFIMA(p, d, q), process, the short- and long-memory parameters can be estimated simultaneously by maximum likelihood. In this paper, following a two-step algorithm, the coefficients are estimated by combining maximum likelihood estimators with the general orthogonal decomposition of stochastic processes. In particular, the principal component analysis of stochastic processes is exploited to estimate the short-memory parameters, which are then plugged into the maximum likelihood function to obtain the fractional differencing parameter d.
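For reference, the ARFIMA(p, d, q) model referred to above is conventionally written (standard notation, not specific to this paper) as

    \phi(B)\,(1-B)^{d}\,X_t = \theta(B)\,\varepsilon_t,
    \qquad
    (1-B)^{d} = \sum_{k=0}^{\infty} \frac{\Gamma(k-d)}{\Gamma(k+1)\,\Gamma(-d)}\, B^{k},

where B is the backshift operator, \phi(B) and \theta(B) are the autoregressive and moving-average polynomials carrying the short-memory parameters, and d is the fractional differencing (long-memory) parameter recovered in the second step.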

2.
Some real-world phenomena in geo-science, micro-economy, and turbulence, to name a few, can be effectively modeled by a fractional Brownian motion indexed by a Hurst parameter (a regularity level) and a scaling parameter σ² (an energy level). This article discusses estimation of the scaling parameter σ² when the Hurst parameter is known. To estimate σ², we propose three approaches based on maximum likelihood estimation, moment matching, and concentration inequalities, respectively, and discuss the theoretical characteristics of the estimators and optimal-filtering guidelines. We also justify the improvement in the estimation of σ² when the Hurst parameter is known. Using the three approaches and a parametric bootstrap methodology in a simulation study, we compare the confidence intervals of σ² in terms of their lengths, coverage rates, and computational complexity, and discuss empirical attributes of the tested approaches. We found that the approach based on maximum likelihood estimation was optimal in terms of efficiency and accuracy, but computationally expensive. The moment-matching approach was found to be not only comparably efficient and accurate but also computationally fast and robust to deviations from the fractional Brownian motion model.
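As a rough illustration of the known-Hurst setting (a minimal numpy sketch of my own, not the authors' code), the increments of an fBm observed on an equally spaced grid are stationary Gaussian with variance σ²Δ^{2H}, which yields a closed-form moment-matching estimator and a one-line Gaussian maximum likelihood estimator for σ²:

import numpy as np

def fbm_increment_cov(n, H, delta=1.0):
    """Covariance matrix of n fBm increments with unit scale (sigma^2 = 1)."""
    k = np.abs(np.subtract.outer(np.arange(n), np.arange(n)))
    return 0.5 * delta**(2 * H) * ((k + 1)**(2 * H) + np.abs(k - 1)**(2 * H) - 2 * k**(2 * H))

def sigma2_moment(increments, H, delta=1.0):
    """Moment matching: E[Y^2] = sigma^2 * delta^(2H)."""
    return np.mean(increments**2) / delta**(2 * H)

def sigma2_mle(increments, H, delta=1.0):
    """Gaussian MLE with H known: sigma^2 = Y' G^{-1} Y / n."""
    n = len(increments)
    G = fbm_increment_cov(n, H, delta)
    return increments @ np.linalg.solve(G, increments) / n

Both estimators take the Hurst parameter as given, matching the setting of the article; neither is claimed to reproduce the authors' exact procedures.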

3.
The seasonal fractional ARIMA (ARFISMA) model with infinite-variance innovations is used in the analysis of seasonal long-memory time series with large fluctuations (heavy-tailed distributions). Two methods, the empirical characteristic function (ECF) procedure developed by Knight and Yu [The empirical characteristic function in time series estimation. Econometric Theory. 2002;18:691–721] and the two-step method (TSM), are proposed to estimate the parameters of the stable ARFISMA model. The ECF method estimates all the parameters simultaneously, while the TSM applies in a first step the Markov chain Monte Carlo–Whittle approach introduced by Ndongo et al. [Estimation of long-memory parameters for seasonal fractional ARIMA with stable innovations. Stat Methodol. 2010;7:141–151], combined in a second step with the maximum likelihood estimation method developed by Alvarez and Olivares [Méthodes d'estimation pour des lois stables avec des applications en finance. Journal de la Société Française de Statistique. 2005;1(4):23–54]. Monte Carlo simulations are also used to evaluate the finite-sample performance of these estimation techniques.
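The ECF idea can be illustrated on a much simpler problem than the stable ARFISMA model treated in the paper: fitting a centred symmetric α-stable law by matching the empirical characteristic function to the model characteristic function exp(-(γ|t|)^α) on a grid of points (a toy sketch; the grid and starting values are my own choices):

import numpy as np
from scipy.optimize import minimize

def ecf(x, t):
    """Empirical characteristic function of the sample x at the points t."""
    return np.mean(np.exp(1j * np.outer(t, x)), axis=1)

def fit_symmetric_stable_ecf(x, t_grid=None):
    """Estimate (alpha, gamma) of a centred symmetric alpha-stable law by
    minimising the squared distance between empirical and model CFs."""
    if t_grid is None:
        t_grid = np.linspace(0.1, 1.0, 10)
    phi_n = ecf(x, t_grid)

    def loss(par):
        alpha, gamma = par
        phi = np.exp(-(gamma * np.abs(t_grid))**alpha)   # model CF (real for the symmetric case)
        return np.sum(np.abs(phi_n - phi)**2)

    res = minimize(loss, x0=[1.5, 1.0], bounds=[(0.1, 2.0), (1e-6, None)])
    return res.x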

4.
We present a methodology for computing point and interval maximum likelihood parameter estimates for the two-parameter generalized Pareto distribution (GPD) with censored data. The basic idea underlying our method is a reduction of the two-dimensional numerical search for the zeros of the GPD log-likelihood gradient vector to a one-dimensional numerical search. We describe a computationally efficient algorithm that implements this approach. Two illustrative examples are presented. Simulation results indicate that the maximum likelihood estimates are more reliable than those obtained by the method of moments. An evaluation of the practical sample size requirements for asymptotic normality is also included.
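The dimension-reduction idea is easiest to see in the uncensored case (a hedged sketch; the paper's algorithm additionally handles censoring). Reparameterising by θ = ξ/σ gives the shape estimate ξ(θ) = mean(log(1 + θx)) in closed form, leaving a one-dimensional search over θ:

import numpy as np
from scipy.optimize import minimize_scalar

def gpd_profile_mle(x):
    """One-dimensional profile-likelihood search for the GPD(xi, sigma) MLE
    (uncensored data).  For fixed theta = xi/sigma the shape MLE is
    xi(theta) = mean(log(1 + theta*x)) and the profile log-likelihood is
    -n*(log(xi/theta) + xi + 1)."""
    x = np.asarray(x, dtype=float)
    n = len(x)

    def neg_profile(theta):
        if theta == 0.0 or 1.0 + theta * x.max() <= 0.0:
            return 1e10                      # outside the admissible region
        xi = np.mean(np.log1p(theta * x))
        if xi <= -1.0:                       # likelihood unbounded for xi <= -1
            return 1e10
        return n * (np.log(xi / theta) + xi + 1.0)

    lo = -1.0 / x.max() + 1e-8               # theta must exceed -1/max(x)
    hi = 10.0 / x.mean()                     # heuristic upper end of the search
    res = minimize_scalar(neg_profile, bounds=(lo, hi), method="bounded")
    theta = res.x
    xi = np.mean(np.log1p(theta * x))
    return xi, xi / theta                    # (shape, scale) estimates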

5.
Local maximum likelihood estimation is a nonparametric counterpart of the widely used parametric maximum likelihood technique. It extends the scope of the parametric maximum likelihood method to a much wider class of parametric spaces. Associated with this nonparametric estimation scheme are the issues of bandwidth selection and bias and variance assessment. This paper provides a unified approach to selecting a bandwidth and constructing confidence intervals in local maximum likelihood estimation. The approach is then applied to least squares nonparametric regression and to nonparametric logistic regression. Our experiences in these two settings show that the general idea outlined here is powerful and encouraging.
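In the least-squares (Gaussian) case mentioned above, local maximum likelihood reduces to kernel-weighted linear regression around each fitting point; a minimal sketch (the bandwidth h and Gaussian kernel are illustrative choices, not the paper's recommendations):

import numpy as np

def local_linear_fit(x, y, x0, h):
    """Local maximum likelihood under a Gaussian error model: maximise the
    kernel-weighted log-likelihood at x0, i.e. run a weighted regression of
    y on (x - x0) with Gaussian kernel weights and bandwidth h."""
    x = np.asarray(x, dtype=float)
    w = np.exp(-0.5 * ((x - x0) / h) ** 2)            # kernel weights
    X = np.column_stack([np.ones_like(x), x - x0])    # local linear basis
    W = np.diag(w)
    beta = np.linalg.solve(X.T @ W @ X, X.T @ W @ y)
    return beta[0]                                    # fitted value m(x0)

Choosing h is exactly the bandwidth-selection problem the paper addresses; the logistic-regression case replaces the weighted least-squares step with a weighted Newton iteration.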

6.
Coarse data is a general type of incomplete data that includes grouped data, censored data, and missing data. The likelihood-based estimation approach with coarse data is challenging because the likelihood function is in integral form. The Monte Carlo EM algorithm of Wei & Tanner [Wei & Tanner (1990). Journal of the American Statistical Association, 85, 699–704] is adapted to compute the maximum likelihood estimator in the presence of coarse data. Stochastic coarse data is also covered, and the computation can be implemented using the parametric fractional imputation method proposed by Kim [Kim (2011). Biometrika, 98, 119–132]. Results from a limited simulation study are presented. The proposed method is also applied to the Korean Longitudinal Study of Aging (KLoSA). The Canadian Journal of Statistics 40: 604–618; 2012 © 2012 Statistical Society of Canada
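A minimal sketch of the Monte Carlo EM idea for one simple kind of coarse data, exponential lifetimes with right censoring (my own toy example, not the paper's implementation): the E-step imputes each censored value with Monte Carlo draws from its conditional distribution given the current rate, and the M-step is the closed-form exponential MLE.

import numpy as np

rng = np.random.default_rng(0)

def mcem_exponential(obs, censored, m=500, n_iter=50, lam=1.0):
    """Monte Carlo EM for an exponential rate with right-censored data.
    obs[i] is the observed or censoring time; censored[i] flags censoring."""
    obs = np.asarray(obs, dtype=float)
    for _ in range(n_iter):
        total = 0.0
        for x, cens in zip(obs, censored):
            if cens:
                # by memorylessness, X | X > x is distributed as x + Exp(lam)
                draws = x + rng.exponential(1.0 / lam, size=m)
                total += draws.mean()          # Monte Carlo E-step
            else:
                total += x
        lam = len(obs) / total                 # closed-form M-step
    return lam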

7.
In this paper, we consider the problem of robust estimation of the fractional parameter, d, in long-memory autoregressive fractionally integrated moving average processes, when two types of outliers, i.e. additive and innovation, are taken into account without knowing their number, position or intensity. The proposed method is a weighted likelihood estimation (WLE) approach, for which the necessary definitions and algorithm are given. By an extensive Monte Carlo simulation study, we compare the performance of the WLE method with the performance of both the approximated maximum likelihood estimation (MLE) and the robust M-estimator proposed by Beran (Statistics for Long-Memory Processes, Chapman & Hall, London, 1994). We find that robustness against the two types of considered outliers can be achieved without loss of efficiency. Moreover, as a by-product of the procedure, we can classify the suspicious observations into different kinds of outliers. Finally, we apply the proposed methodology to the Nile River annual minima time series.

8.
We consider a stochastic differential equation involving standard and fractional Brownian motion with an unknown drift parameter to be estimated. We investigate the standard maximum likelihood estimate of the drift parameter, two non-standard estimates, and three estimates for sequential estimation. Strong consistency and some other properties are proved. The linear model and the Ornstein–Uhlenbeck model are studied in detail. As an auxiliary result, the asymptotic behaviour of the fractional derivative of the fractional Brownian motion is established.

9.
Parameter estimation with missing data is a frequently encountered problem in statistics. Imputation is often used to facilitate parameter estimation by simply applying complete-sample estimators to the imputed dataset. In this article, we consider the problem of parameter estimation with nonignorable missing data using the approach of parametric fractional imputation proposed by Kim (2011). Using the fractional weights, the E-step of the EM algorithm can be approximated by the weighted mean of the imputed-data likelihood, where the fractional weights are computed from the current value of the parameter estimates. Calibration fractional imputation is also considered as a way of improving the Monte Carlo approximation in fractional imputation. Variance estimation is also discussed. Results from two simulation studies are presented to compare the proposed method with existing methods. A real data example from the Korea Labor and Income Panel Survey (KLIPS) is also presented.

10.
Maximum likelihood estimation under constraints in the Wishart class of distributions is considered. It provides a unified approach to estimation in a variety of problems concerning covariance matrices. Virtually all covariance structures can be translated into constraints on the covariances. This includes covariance matrices with a given structure, such as linearly patterned covariance matrices, covariance matrices with zeros, independent covariance matrices, and structurally dependent covariance matrices. The methodology followed in this paper provides a useful and simple approach to obtaining the exact maximum likelihood estimates directly. These maximum likelihood estimates are obtained via an estimation procedure for the exponential class using constraints.

11.
The paper considers non-parametric maximum likelihood estimation of the failure time distribution for interval-censored data subject to misclassification. Such data can arise from two types of observation scheme: either observations continue until the first positive test result, or tests continue regardless of the test results. In the former case, the misclassification probabilities must be known, whereas in the latter case, joint estimation of the event-time distribution and the misclassification probabilities is possible. The regions to which the support of the maximum likelihood estimate must be confined are derived. Algorithms for computing the maximum likelihood estimate are investigated, and it is shown that algorithms appropriate for computing non-parametric mixing distributions perform better than an iterative convex minorant algorithm in terms of time to absolute convergence. A profile likelihood approach is proposed for joint estimation. The methods are illustrated on a data set relating to the onset of cardiac allograft vasculopathy in post-heart-transplantation patients.

12.
In this paper, we consider the problem of estimating semi-linear regression models. Using invariance arguments, Bhowmik and King [2007. Maximal invariant likelihood based testing of semi-linear models. Statist. Papers 48, 357–383] derived the probability density function of the maximal invariant statistic for the non-linear component of these models. Using this density function as a likelihood function allows us to estimate these models in a two-step process. First, the non-linear component parameters are estimated by maximising the maximal invariant likelihood function. Then the non-linear component, with the parameter values replaced by their estimates, is treated as a regressor, and ordinary least squares is used to estimate the remaining parameters. We report the results of a simulation study conducted to compare the accuracy of this approach with full maximum likelihood and maximum profile-marginal likelihood estimation. We find that maximising the maximal invariant likelihood function typically results in less biased and lower-variance estimates than those from full maximum likelihood.

13.
While much used in practice, latent variable models raise challenging estimation problems due to the intractability of their likelihood. Monte Carlo maximum likelihood (MCML), as proposed by Geyer & Thompson (1992), is a simulation-based approach to maximum likelihood approximation applicable to general latent variable models. MCML can be described as an importance sampling method in which the likelihood ratio is approximated by Monte Carlo averages of importance ratios simulated from the complete-data model corresponding to an arbitrary value of the unknown parameter. This paper studies the asymptotic (in the number of observations) performance of the MCML method in the case of latent variable models with independent observations. This is in contrast with previous works on the same topic, which only considered conditional convergence to the maximum likelihood estimator for a fixed set of observations. A first important result is that when this importance-sampling value is held fixed, the MCML method can only be consistent if the number of simulations grows exponentially fast with the number of observations. If, on the other hand, this value is obtained from a consistent sequence of estimates of the unknown parameter, then the requirements on the number of simulations are shown to be much weaker.
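A minimal sketch of the MCML importance-sampling approximation on a toy Gaussian latent-variable model (my own example, not from the paper): y_i | z_i ~ N(z_i, 1) with z_i ~ N(theta, 1), so the importance ratio f(y, z; theta)/f(y, z; psi) reduces to the ratio of the latent-variable densities, and the Monte Carlo log-likelihood ratio is maximised over theta.

import numpy as np
from scipy.optimize import minimize_scalar
from scipy.stats import norm

rng = np.random.default_rng(1)

def mcml_estimate(y, psi, m=2000):
    """MCML for the toy model y_i | z_i ~ N(z_i, 1), z_i ~ N(theta, 1).
    z is simulated once from f(z | y; psi), here N((y + psi)/2, 1/2)."""
    y = np.asarray(y, dtype=float)
    z = rng.normal((y[:, None] + psi) / 2.0, np.sqrt(0.5), size=(len(y), m))

    def neg_mc_loglik_ratio(theta):
        # importance ratio f(y,z;theta)/f(y,z;psi) = phi(z - theta)/phi(z - psi)
        log_ratio = norm.logpdf(z, theta, 1.0) - norm.logpdf(z, psi, 1.0)
        per_obs = np.log(np.mean(np.exp(log_ratio), axis=1))
        return -np.sum(per_obs)               # summed over independent observations

    res = minimize_scalar(neg_mc_loglik_ratio,
                          bounds=(y.min() - 5.0, y.max() + 5.0), method="bounded")
    return res.x

In this toy model the exact MLE is the sample mean, so the estimate should approach mean(y) as m grows; the abstract's point is how fast m must grow with the sample size when psi is fixed versus consistently estimated.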

14.
A parsimonious model for treated tumours is developed as a continuation of our previous work on regrowth curve theory. The statistical model belongs to the family of marginal non-linear models, since the only linear parameters of the model are tumour-specific and random, which facilitates parameter estimation. An important feature of the model is that it enables estimation of the fraction of cancer cells surviving the treatment in vivo from easy-to-obtain longitudinal measurements of tumour volume. We compare several methods of estimation, including Lindstrom–Bates, iterated reweighted least squares and maximum likelihood. The last two methods are computed via the total estimating equations approach and variance least squares. The theory is illustrated with a photodynamic tumour therapy example.

15.
The currently existing estimation methods and goodness-of-fit tests for the Cox model mainly deal with right-censored data, and they do not extend directly to other, more complicated types of censored data, such as doubly censored data, interval-censored data, partly interval-censored data, bivariate right-censored data, etc. In this article, we apply the empirical likelihood approach to the Cox model with a complete sample, derive the semiparametric maximum likelihood estimators (SPMLE) for the Cox regression parameter and the baseline distribution function, and establish the asymptotic consistency of the SPMLE. Via the functional plug-in method, these results are extended in a unified approach to doubly censored data, partly interval-censored data, and bivariate data under univariate or bivariate right censoring. For these types of censored data, the estimation procedures developed here naturally lead to Kolmogorov–Smirnov goodness-of-fit tests for the Cox model. Some simulation results are presented.

16.
The present article discusses alternative regression models and estimation methods for dealing with multivariate fractional response variables. Both conditional mean models, estimable by quasi-maximum likelihood, and fully parametric models (Dirichlet and Dirichlet-multinomial), estimable by maximum likelihood, are considered. A new parameterization is proposed for the parametric models, which accommodates the most common specifications for the conditional mean (e.g., multinomial logit, nested logit, random parameters logit, dogit). The article also discusses at some length the specification analysis of fractional regression models, proposing several tests that can be performed through artificial regressions. Finally, an extensive Monte Carlo study evaluates the finite-sample properties of most of the estimators and tests considered.
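One of the conditional mean models discussed above, a multinomial-logit mean estimated by quasi-maximum likelihood, can be sketched as follows (an illustrative implementation under my own simplifying choices, not the article's code):

import numpy as np
from scipy.optimize import minimize

def multinomial_logit_probs(X, B):
    """Multinomial-logit conditional means; the first category is the base."""
    eta = np.column_stack([np.zeros(len(X)), X @ B])
    eta -= eta.max(axis=1, keepdims=True)              # numerical stability
    e = np.exp(eta)
    return e / e.sum(axis=1, keepdims=True)

def fractional_qmle(X, Y):
    """Quasi-ML for multivariate fractional responses: rows of Y are
    non-negative shares summing to one, and the conditional mean is a
    multinomial logit in X; the multinomial quasi-log-likelihood is
    evaluated at the fractional shares."""
    n, k = X.shape
    J = Y.shape[1]

    def neg_qll(b):
        P = multinomial_logit_probs(X, b.reshape(k, J - 1))
        return -np.sum(Y * np.log(np.clip(P, 1e-12, None)))

    res = minimize(neg_qll, np.zeros(k * (J - 1)), method="BFGS")
    return res.x.reshape(k, J - 1)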

17.
Standard methods for maximum likelihood parameter estimation in latent variable models rely on the Expectation-Maximization algorithm and its Monte Carlo variants. Our approach is different and motivated by considerations similar to those behind simulated annealing; that is, we build a sequence of artificial distributions whose support concentrates on the set of maximum likelihood estimates. We sample from these distributions using a sequential Monte Carlo approach. We demonstrate state-of-the-art performance for several applications of the proposed approach.
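A simplified sketch of the general idea, likelihood tempering with reweight/resample/move steps on a toy Gaussian model (the schedule, move kernel, and example are my own assumptions, not the authors' algorithm): as the power on the likelihood increases, the particle cloud concentrates around the maximum likelihood estimate.

import numpy as np

rng = np.random.default_rng(2)

def smc_mle(loglik, particles, gammas, n_mh=5, step=0.2):
    """Sequential Monte Carlo targeting L(theta)^gamma_k for an increasing
    schedule gamma_k: reweight, resample, then apply random-walk Metropolis
    moves that leave the current tempered target invariant."""
    theta = np.array(particles, dtype=float)
    g_prev = 0.0
    for g in gammas:
        logw = (g - g_prev) * loglik(theta)                  # incremental weights
        w = np.exp(logw - logw.max())
        w /= w.sum()
        idx = rng.choice(len(theta), size=len(theta), p=w)   # resample
        theta = theta[idx]
        for _ in range(n_mh):                                # move step
            prop = theta + step * rng.standard_normal(len(theta))
            log_acc = g * (loglik(prop) - loglik(theta))
            accept = np.log(rng.uniform(size=len(theta))) < log_acc
            theta = np.where(accept, prop, theta)
        g_prev = g
    return theta

# toy example: N(theta, 1) data, for which the MLE is the sample mean
y = rng.normal(1.7, 1.0, size=100)
loglik = lambda th: -0.5 * ((y[None, :] - np.atleast_1d(th)[:, None]) ** 2).sum(axis=1)
cloud = smc_mle(loglik, rng.uniform(-10, 10, size=500), gammas=np.linspace(0.1, 20.0, 40))
print(cloud.mean(), y.mean())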

18.
This paper is concerned with the maximum likelihood estimation and the likelihood ratio test for hierarchical loglinear models of multidimensional contingency tables with missing data. The problems of estimation and testing for a high-dimensional contingency table can be reduced to those for a class of low-dimensional tables. In some cases, the incomplete data in the high-dimensional table become complete in the low-dimensional tables through the reduction, which can indicate how much the incomplete data contribute to the estimation and the test.

19.
Stochastic ordering is a useful concept in order-restricted inference. In this paper, we propose a new estimation technique for the parameters in two multinomial populations under stochastic orderings when missing data are present. In comparison with the traditional maximum likelihood estimation method, our new method guarantees the uniqueness of the maximum of the likelihood function. Furthermore, it does not depend on the choice of initial values for the parameters, in contrast to the EM algorithm. Finally, we give the asymptotic distributions of the likelihood ratio statistics based on the new estimation method.

20.
In this article we propose a penalized likelihood approach for the semiparametric density model with parametric and nonparametric components. An efficient iterative procedure is proposed for estimation. An approximate generalized maximum likelihood criterion is derived from a Bayesian point of view for selecting the smoothing parameter. The finite-sample performance of the proposed estimation approach is evaluated through simulation. Two real data examples, suicide study data and Old Faithful geyser data, are analyzed to demonstrate use of the proposed method.
