Similar Documents
 20 similar documents found (search time: 15 ms)
1.
We prove a self-normalized central limit theorem for the class of mixing processes introduced in Kacem M, Loisel S, Maume-Deschamps V. [Some mixing properties of conditionally independent processes. Commun Statist Theory Methods. 2016;45:1241–1259]. This class is larger than the classical class of strongly mixing processes, so our result generalizes those of [Peligrad M, Shao QM. Estimation of the variance of partial sums for ρ-mixing random variables. J Multivar Anal. 1995;52:140–157] and [Shi S. Estimation of the variance for strongly mixing sequences. Appl Math J Chinese Univ. 2000;15(1):45–54]. The fact that some conditionally independent processes satisfy this kind of mixing property motivated our study. We investigate the weak consistency as well as the asymptotic normality of the variance estimator that we propose.
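A minimal sketch of the self-normalized (studentized) CLT idea: standardize the partial sum of a mixing sequence by an estimate of the long-run variance. The AR(1) model, the simple non-overlapping block (batch-means) variance estimator, the block length, and the sample sizes below are illustrative assumptions, not the estimator proposed in the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def ar1(n, phi=0.5):
    """Simulate an AR(1) process, a standard example of a strongly mixing sequence."""
    x = np.empty(n)
    x[0] = rng.normal()
    for t in range(1, n):
        x[t] = phi * x[t - 1] + rng.normal()
    return x

def batch_means_variance(x, block_len):
    """Estimate the long-run variance sigma^2 = lim Var(S_n)/n via block means."""
    n_blocks = len(x) // block_len
    blocks = x[: n_blocks * block_len].reshape(n_blocks, block_len)
    block_means = blocks.mean(axis=1)
    # block_len * Var(block means) estimates sigma^2.
    return block_len * np.var(block_means, ddof=1)

n, reps, block_len = 5000, 500, 50
stats = []
for _ in range(reps):
    x = ar1(n)
    sigma2_hat = batch_means_variance(x, block_len)
    # Self-normalized partial sum: should be approximately N(0, 1).
    stats.append(np.sqrt(n) * x.mean() / np.sqrt(sigma2_hat))

print("mean ~ 0 :", np.mean(stats))
print("var  ~ 1 :", np.var(stats))
```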

2.
The strong mixing property holds for a broad class of linear and nonlinear time series models, such as autoregressive moving average (ARMA) and generalized autoregressive conditional heteroscedasticity (GARCH) models. In this article, we study the correlation structure of strong mixing sequences and present some asymptotic properties. We also propose a new method for detecting a change point in the correlation structure of strong mixing sequences, based on a nonparametric sequential cumulative sum (CUSUM) test statistic, and establish the asymptotic consistency of this test. The method is applied to data simulated from several linear and nonlinear models, and the power of the test is evaluated. For linear models, it is shown that this method outperforms that of Berkes, Gombay, and Horvath (2009) [Testing for changes in the covariance structure of linear processes. J. Stat. Plan. Inf. 139:2044–2063].
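A minimal sketch of a CUSUM-type statistic applied to the lag-1 correlation structure of a strongly mixing sequence. The exact statistic and critical values in the article differ; the AR(1) data-generating process with a mid-sample break, the choice of lag, and the Bartlett-kernel long-run variance estimator are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)

def ar1_with_break(n, phi1, phi2, tau=0.5):
    """AR(1) whose coefficient changes from phi1 to phi2 at time tau*n."""
    x = np.empty(n)
    x[0] = rng.normal()
    for t in range(1, n):
        phi = phi1 if t < tau * n else phi2
        x[t] = phi * x[t - 1] + rng.normal()
    return x

def long_run_variance(y, max_lag=20):
    """Bartlett-kernel estimator of the long-run variance of y."""
    y = y - y.mean()
    lrv = np.mean(y ** 2)
    for k in range(1, max_lag + 1):
        w = 1.0 - k / (max_lag + 1.0)
        lrv += 2.0 * w * np.mean(y[:-k] * y[k:])
    return lrv

def cusum_correlation(x, lag=1):
    """CUSUM statistic applied to the products x_t * x_{t+lag}."""
    y = x[:-lag] * x[lag:]
    m = len(y)
    s = np.cumsum(y)
    drift = np.arange(1, m + 1) / m * s[-1]
    return np.max(np.abs(s - drift)) / np.sqrt(m * long_run_variance(y))

x_null = ar1_with_break(2000, phi1=0.5, phi2=0.5)   # no change
x_alt = ar1_with_break(2000, phi1=0.2, phi2=0.7)    # change at mid-sample
print("no change :", cusum_correlation(x_null))
print("change    :", cusum_correlation(x_alt))
# Under the null the statistic behaves like sup|Brownian bridge| (95% point ~ 1.36).
```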

3.
We study minimum contrast estimation for parametric stationary determinantal point processes. These processes form a useful class of models for repulsive (or regular, or inhibitive) point patterns and are already used in numerous statistical applications. Our main focus is on minimum contrast methods based on Ripley's K-function or on the pair correlation function. Strong consistency and asymptotic normality of these procedures are proved under general conditions that concern only the existence of the process and its regularity with respect to the parameters. A key ingredient of the proofs is the recently established Brillinger mixing property of stationary determinantal point processes. This work may be viewed as a complement to the study of Y. Guan and M. Sherman, who establish the same kind of asymptotic properties for a large class of Cox processes, which in turn are models for clustering (or aggregation).
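A minimal sketch of minimum contrast estimation based on Ripley's K-function. The parametric family used here is the K-function of a determinantal point process with Gaussian kernel C(r) = ρ exp(-r²/α²), for which g(r) = 1 - exp(-2r²/α²) and K(r) = πr² - (πα²/2)(1 - exp(-2r²/α²)); the naive K estimator below ignores edge correction, and the hard-core-thinned pattern is only a stand-in for a genuinely determinantal sample. All of these are illustrative choices, not the article's setup.

```python
import numpy as np
from scipy.optimize import minimize_scalar
from scipy.spatial.distance import pdist

rng = np.random.default_rng(2)

def k_theoretical(r, alpha):
    """K-function of a stationary Gaussian-kernel determinantal point process."""
    return np.pi * r**2 - 0.5 * np.pi * alpha**2 * (1.0 - np.exp(-2.0 * r**2 / alpha**2))

def k_empirical(points, r, window_area=1.0):
    """Naive (edge-effect-ignoring) estimator of Ripley's K on the unit square."""
    n = len(points)
    d = pdist(points)                          # pairwise distances, each pair once
    return np.array([window_area * 2.0 * np.sum(d <= ri) / (n * (n - 1)) for ri in r])

# Crude repulsive pattern: thin a dense uniform pattern by a hard-core rule.
cand = rng.uniform(size=(400, 2))
kept = []
for p in cand:
    if all(np.linalg.norm(p - q) > 0.03 for q in kept):
        kept.append(p)
points = np.array(kept)

r_grid = np.linspace(0.01, 0.15, 30)
k_hat = k_empirical(points, r_grid)

def contrast(alpha):
    # Integrated squared discrepancy between K_hat and K_theta on the grid.
    return np.sum((k_hat - k_theoretical(r_grid, alpha)) ** 2)

result = minimize_scalar(contrast, bounds=(1e-3, 0.2), method="bounded")
print("minimum contrast estimate of alpha:", result.x)
```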

4.
5.
The lognormal and Weibull distributions are the most popular distributions for modeling lifetime data. In practical applications they usually fit the data at hand well, but their predictions may differ substantially. The main purpose of the present article is to investigate the impact of mis-specification between the lognormal and Weibull distributions on the interval estimation of the pth quantile for complete data. The coverage probabilities of the confidence intervals (CIs) under mis-specification are evaluated. The results indicate that, for both distributions, the coverage probabilities are significantly influenced by mis-specification, especially for small or large p on the lower or upper tail of the distribution. In addition, based on the coverage probabilities under correct specification and mis-specification, a max-min criterion is proposed for choosing between the two distributions. The numerical results indicate that for p ≤ 0.05 and 0.6 ≤ p ≤ 0.8 the Weibull distribution is suggested for evaluating CIs of the pth quantile, while for 0.2 ≤ p ≤ 0.5 and p = 0.99 the lognormal distribution is suggested. For p = 0.9 and 0.95 the lognormal distribution is suggested if the sample size is large enough, while for p = 0.1 the Weibull distribution is suggested if the sample size is large enough. Finally, a simulation study is conducted to evaluate the efficiency of the proposed method.
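A minimal sketch of one direction of this kind of coverage study: data are generated from a Weibull distribution, but the quantile CI is built under a lognormal assumption (the exact noncentral-t interval for a normal quantile applied to the log data). The sample size, p, Weibull shape, and number of replications are illustrative assumptions.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)

def lognormal_quantile_ci(x, p, level=0.95):
    """Exact CI for the p-th quantile assuming x is lognormal (log-data normal)."""
    y = np.log(x)
    n, ybar, s = len(y), y.mean(), y.std(ddof=1)
    zp = stats.norm.ppf(p)
    # sqrt(n)*(ybar - xi_p)/s follows a noncentral t with noncentrality -sqrt(n)*zp.
    q_lo, q_hi = stats.nct.ppf([(1 - level) / 2, (1 + level) / 2],
                               df=n - 1, nc=-np.sqrt(n) * zp)
    return np.exp(ybar - s * q_hi / np.sqrt(n)), np.exp(ybar - s * q_lo / np.sqrt(n))

shape, scale, p, n, reps = 1.5, 1.0, 0.05, 50, 2000
true_q = scale * (-np.log(1 - p)) ** (1 / shape)    # true Weibull quantile

covered = 0
for _ in range(reps):
    x = scale * rng.weibull(shape, size=n)          # data generated from Weibull
    lo, hi = lognormal_quantile_ci(x, p)            # CI built under lognormal model
    covered += (lo <= true_q <= hi)

print("nominal 0.95, actual coverage under mis-specification:", covered / reps)
```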

6.
In this study we are concerned with inference on the correlation parameter ρ of two Brownian motions, when only high-frequency observations from two one-dimensional continuous Itô semimartingales, driven by these Brownian motions, are available. Estimators for ρ are constructed in two situations: either both components are observed (at the same times), or only one component is observed while the other represents its volatility process and thus has to be estimated from the data as well. In the first case our estimator has the same asymptotic behaviour as the standard one for i.i.d. normal observations, whereas in the second framework a feasible estimator can still be defined, but with a slower rate of convergence.
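A minimal sketch of the first (both-components-observed) situation in its simplest form: the realized correlation of high-frequency increments. The constant volatilities and the grid size are illustrative assumptions; the article treats general continuous Itô semimartingales.

```python
import numpy as np

rng = np.random.default_rng(4)

n, rho = 10_000, 0.6
dt = 1.0 / n
sigma1, sigma2 = 1.5, 0.8

# Correlated Brownian increments: dW2 = rho*dW1 + sqrt(1-rho^2)*dB.
dw1 = rng.normal(scale=np.sqrt(dt), size=n)
dw2 = rho * dw1 + np.sqrt(1 - rho**2) * rng.normal(scale=np.sqrt(dt), size=n)

# Observed increments (here: constant-volatility continuous martingales).
dx = sigma1 * dw1
dy = sigma2 * dw2

# Realized correlation: realized covariance normalized by realized volatilities.
rho_hat = np.sum(dx * dy) / np.sqrt(np.sum(dx**2) * np.sum(dy**2))
print("true rho:", rho, " estimate:", rho_hat)
```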

7.
Rates of convergence in the central limit theorem and in the random central limit theorem are established for certain functions of U-statistics. The theorems concern the asymptotic behaviour of the sequence {g(Un), n ≥ 1}, where g belongs to the class of differentiable functions with g′ ∈ L(δ) and Un is a U-statistic.
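A minimal sketch of the delta-method idea behind limit theorems for g(Un): take the order-2 U-statistic with kernel h(x, y) = (x - y)²/2, whose expectation is the variance, and the smooth function g = sqrt, so g(Un) is the sample standard deviation. The normal data, kernel, and choice of g are illustrative assumptions, not those of the article.

```python
import numpy as np
from scipy.spatial.distance import pdist

rng = np.random.default_rng(5)

def u_statistic_variance(x):
    """Order-2 U-statistic with kernel (xi - xj)^2 / 2 (unbiased for the variance)."""
    d2 = pdist(x.reshape(-1, 1), metric="sqeuclidean")  # all (xi - xj)^2, i < j
    return np.mean(d2) / 2.0

n, reps, true_sd = 200, 1000, 2.0
standardized = []
for _ in range(reps):
    x = rng.normal(scale=true_sd, size=n)
    g_un = np.sqrt(u_statistic_variance(x))             # g(U_n)
    standardized.append(np.sqrt(n) * (g_un - true_sd))

# sqrt(n)*(g(U_n) - g(theta)) should look approximately normal; its spread
# depends on the asymptotic variance of U_n and on g'(theta).
print("mean:", np.mean(standardized), " sd:", np.std(standardized))
```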

8.
A reduced U-statistic is a U-statistic with its summands drawn from a restricted but balanced set of pairs. In this article, central limit theorems are derived for reduced U-statistics under mixing conditions, which significantly extends the work of Brown & Kildea in various respects. It is shown and illustrated that reduced U-statistics are quite useful for deriving test statistics in various nonparametric testing problems.

9.
In this article, we derive the likelihood ratio tests (LRTs) for simultaneously testing interval hypotheses for normal means with known and unknown variances, and also with unknown but equal variances. Special cases in which the interval hypotheses reduce to point hypotheses are also discussed. Remarks comparing the LRT with tests based on combinations of p-values are made, and several applications based on real data are mentioned.

10.
The purpose of this paper is to prove, through the analysis of the behaviour of a standard kernel density estimator, that the notion of weak dependence defined in a previous paper (cf. Doukhan & Louhichi, 1999) has sufficiently sharp properties to be used in various situations. More precisely, we investigate the asymptotics of higher-order losses, the asymptotic distributions, and the uniform almost sure behaviour of kernel density estimates. We prove that they are the same as for independent samples (with some restrictions for the a.s. behaviour). Recall, finally, that this weak dependence condition extends previously defined notions such as mixing and association, and it allows consideration of new classes such as weak shift processes based on independent sequences as well as some non-mixing Markov processes.
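A minimal sketch of the kernel density estimator whose asymptotics are discussed above, evaluated on a dependent (AR(1)) sample; under weak dependence its pointwise behaviour matches the i.i.d. case. The Gaussian kernel, the Silverman-type bandwidth, and the AR(1) model are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(6)

def ar1(n, phi=0.4):
    x = np.empty(n)
    x[0] = rng.normal()
    for t in range(1, n):
        x[t] = phi * x[t - 1] + rng.normal()
    return x

def kde(sample, grid, bandwidth):
    """Gaussian-kernel estimate f_hat(x) = (1/nh) * sum K((x - X_i)/h)."""
    z = (grid[:, None] - sample[None, :]) / bandwidth
    return np.exp(-0.5 * z**2).sum(axis=1) / (len(sample) * bandwidth * np.sqrt(2 * np.pi))

n, phi = 5000, 0.4
x = ar1(n, phi)
h = 1.06 * x.std() * n ** (-1 / 5)                   # rule-of-thumb bandwidth
grid = np.linspace(-4, 4, 9)

# Stationary density of this AR(1): normal with variance 1/(1 - phi^2).
true_sd = 1.0 / np.sqrt(1 - phi**2)
true_density = np.exp(-0.5 * (grid / true_sd) ** 2) / (true_sd * np.sqrt(2 * np.pi))

for g, est, tru in zip(grid, kde(x, grid, h), true_density):
    print(f"x = {g:+.1f}   f_hat = {est:.4f}   f = {tru:.4f}")
```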

11.
In this article, we analyze the Generalized Method of Moments (GMM) and Continuous Updating (CUE) estimators with strong, nearly-weak, and weak identification. We show that with this mixed system the limits of the estimators are nonstandard. In the subcase of the GMM estimator with only nearly-weak instruments, the correlation between the instruments and the first-order conditions declines at a rate slower than root-T. We find an important difference between the nearly-weak case and the weak case. Inference with point estimates is possible via the Wald, likelihood ratio (LR), and Lagrange multiplier (LM) tests for the GMM estimator when only nearly-weak instruments are present in the system; the limit is the standard χ2 limit. This is important from an applied perspective, since tests in the weak case depend on the true value and can only test a simple null. We also show that, in the more realistic case of a mixture of strong, weak, and nearly-weak instruments, tests of the Anderson and Rubin (1949) [Estimation of the parameters of a single equation in a complete system of stochastic equations. Annals of Mathematical Statistics 20:46–63] and Kleibergen (2005) [Testing parameters in GMM without assuming that they are identified. Econometrica] type are asymptotically pivotal and have a χ2 limit.

12.
Empirical likelihood ratio confidence regions based on the chi-square calibration suffer from an undercoverage problem: their actual coverage levels tend to be lower than the nominal levels. The finite-sample distribution of the empirical log-likelihood ratio is recognized to have a mixture structure, with a continuous component on [0, +∞) and a point mass at +∞. The undercoverage problem of the chi-square calibration is partly due to its use of the continuous chi-square distribution to approximate this mixture distribution. In this article, we propose two new methods of calibration that take advantage of the mixture structure; we construct two new mixture distributions, using the F and chi-square distributions, and use these to approximate the mixture distribution of the empirical log-likelihood ratio. The new methods of calibration are asymptotically equivalent to the chi-square calibration, but the new methods, in particular the F-mixture-based method, can be substantially more accurate for small and moderately large sample sizes. The new methods are also as easy to use as the chi-square calibration.
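A minimal sketch of the empirical log-likelihood ratio for a mean and its classical chi-square(1) calibration, the baseline that the mixture calibrations above aim to improve. The exponential data model, the small sample size, and the simple bracketed root search for the Lagrange multiplier are illustrative choices.

```python
import numpy as np
from scipy.optimize import brentq
from scipy.stats import chi2

rng = np.random.default_rng(7)

def empirical_loglik_ratio(x, mu0):
    """-2 log of the empirical likelihood ratio for H0: E[X] = mu0."""
    d = x - mu0
    if d.max() <= 0 or d.min() >= 0:
        return np.inf                              # mu0 outside the convex hull
    eps = 1e-10
    lo = -1.0 / d.max() + eps
    hi = -1.0 / d.min() - eps
    # Lagrange multiplier solves sum d_i / (1 + lam * d_i) = 0.
    lam = brentq(lambda l: np.sum(d / (1.0 + l * d)), lo, hi)
    return 2.0 * np.sum(np.log(1.0 + lam * d))

n, reps, mu0 = 20, 2000, 1.0
crit = chi2.ppf(0.95, df=1)
rejections = 0
for _ in range(reps):
    x = rng.exponential(scale=mu0, size=n)         # skewed data, true mean = mu0
    rejections += (empirical_loglik_ratio(x, mu0) > crit)

# Coverage of the chi-square-calibrated 95% region; typically below 0.95 for
# small n, which is the undercoverage problem discussed above.
print("coverage:", 1 - rejections / reps)
```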

13.
We derive likelihood ratio (LR) tests for the null hypothesis of equivalence, namely that the normal means fall into a practical indifference zone. The LR test can easily be constructed and applied to k ≥ 2 treatments. Simulation results indicate that the LR test might be slightly anticonservative, but when the sample sizes are large it attains the nominal level for mean configurations under the null hypothesis. More powerful than the studentized range test, the LR test is straightforward to apply, requiring only existing statistical tables and no complicated computations.

14.
In their recent work, Jiang and Yang studied six classical likelihood ratio test statistics in the high-dimensional setting. Assuming that a random sample of size n is observed from a p-dimensional normal population, they derived central limit theorems (CLTs) when p and n are proportional to each other; these differ from the classical chi-square limits obtained as n goes to infinity with p fixed. In this paper, by developing a new tool, we prove that the six CLTs hold in a more applicable setting: p goes to infinity, and p can be very close to n. This is an almost necessary and sufficient condition for the CLTs. Simulated histograms, comparisons of sizes and powers with those of the classical chi-square approximations, and discussions are presented afterwards.

15.
16.
We discuss the functional central limit theorem (FCLT) for the empirical process of a moving-average stationary sequence with long memory. The cases of one-sided and double-sided moving averages are discussed. In the case of a one-sided (causal) moving average, the FCLT is obtained under weak conditions on the smoothness of the distribution and the existence of a (2+δ)-moment of the i.i.d. innovations, by using the martingale difference decomposition of Ho and Hsing (1996, Ann. Statist. 24, 992–1014). In the case of a double-sided moving average, the proof of the FCLT is based on an asymptotic expansion of the bivariate probability density.

17.
Large-O and small-o approximations of the expected value of a class of functions (modified K-functionals and Lipschitz classes) of normalized partial sums of dependent random variables by the expectation of the corresponding functions of infinitely divisible random variables are established. As a special case, we obtain rates of convergence to stable limit laws and to weak laws of large numbers. The technique used is a conditional version of Trotter's operator method together with the Taylor expansion.

18.
The problems of estimating the mean and an upper percentile of a lognormal population with nonnegative values are considered. For estimating the mean of such a population based on data that include zeros, a simple confidence interval (CI), obtained by modifying Tian's generalized CI [Inferences on the mean of zero-inflated lognormal data: the generalized variable approach. Stat Med. 2005;24:3223–3232], is proposed. A fiducial upper confidence limit (UCL) and a closed-form approximate UCL for an upper percentile are developed. Our simulation studies indicate that the proposed methods are very satisfactory in terms of coverage probability and precision, and better than existing methods at maintaining balanced tail error rates. The proposed CI and UCL are simple and easy to calculate. All the methods considered are illustrated using samples of data involving airborne chlorine concentrations and data on diagnostic test costs.
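A minimal sketch of a Wald-type interval for the mean of zero-inflated lognormal data, working on the log scale via the delta method. This is a simple stand-in, not the modified generalized CI proposed above; the simulated parameters and the independence approximations in the variance formula are assumptions.

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(8)

def zi_lognormal_mean_ci(x, level=0.95):
    """Delta-method CI for E[X] = (1 - delta) * exp(mu + sigma^2/2), x >= 0."""
    n = len(x)
    pos = x[x > 0]
    n0, n1 = n - len(pos), len(pos)
    delta_hat = n0 / n
    y = np.log(pos)
    mu_hat, s2 = y.mean(), y.var(ddof=1)
    log_mean = np.log(1 - delta_hat) + mu_hat + s2 / 2.0
    # Approximate variance of the log of the estimated mean (independent pieces).
    var_log = (delta_hat / ((1 - delta_hat) * n)
               + s2 / n1
               + s2 ** 2 / (2.0 * (n1 - 1)))
    z = norm.ppf(0.5 + level / 2.0)
    half = z * np.sqrt(var_log)
    return np.exp(log_mean - half), np.exp(log_mean + half)

# Simulated example: 20% zeros, lognormal(mu=0, sigma=1) otherwise.
n = 200
zeros = rng.uniform(size=n) < 0.2
x = np.where(zeros, 0.0, rng.lognormal(mean=0.0, sigma=1.0, size=n))
true_mean = 0.8 * np.exp(0.5)
print("true mean:", true_mean, " 95% CI:", zi_lognormal_mean_ci(x))
```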

19.
This paper focuses on a novel method of developing one-sample confidence bands for survival functions from right-censored data. The approach is model-based, relying on a parametric model for the conditional expectation of the censoring indicator given the observed minimum, and derives its strength from easy access to a good-fitting model among the plethora of choices available for binary response data. The substantive methodological contribution is in exploiting a semiparametric estimator of the survival function to produce improved simultaneous confidence bands. To obtain critical values for computing the confidence bands, a two-stage bootstrap approach that combines the classical bootstrap with the more recent model-based regeneration of censoring indicators is proposed, and a justification of its asymptotic validity is provided. Several different confidence bands are studied using the proposed approach. Numerical studies, including robustness of the proposed bands to misspecification, are carried out to check efficacy. The method is illustrated using two lung cancer data sets.

20.
Comparative lifetime experiments are important when the object of a study is to determine the relative merits of two competing products in terms of duration of life. This study considers interval estimation for two Weibull populations when joint Type-II progressive censoring is implemented. We obtain the conditional maximum likelihood estimators of the two Weibull parameters under this scheme. Moreover, a simultaneous approximate confidence region based on the asymptotic normality of the maximum likelihood estimators is discussed and compared with two bootstrap confidence regions. We also examine the behavior of the probability-of-failure structure under different censoring schemes. A simulation study is performed and an illustrative example is given.
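A minimal sketch of Weibull maximum likelihood estimation under censoring, followed by percentile parametric-bootstrap intervals. It simplifies the joint progressive Type-II scheme above to a single ordinary Type-II censored sample (observe the first r of n ordered lifetimes); the sample sizes, true parameters, and bootstrap size are illustrative assumptions.

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(9)

def type2_censored_sample(n, r, shape, scale):
    """Smallest r order statistics from a Weibull(shape, scale) sample of size n."""
    return np.sort(scale * rng.weibull(shape, size=n))[:r]

def neg_loglik(params, obs, n):
    """Negative Weibull log-likelihood for Type-II censored data (r observed of n)."""
    k, lam = np.exp(params)                        # optimize on the log scale
    r = len(obs)
    loglik = (r * np.log(k / lam)
              + (k - 1) * np.sum(np.log(obs / lam))
              - np.sum((obs / lam) ** k)
              - (n - r) * (obs[-1] / lam) ** k)    # survivors beyond x_(r)
    return -loglik

def fit_weibull_type2(obs, n):
    res = minimize(neg_loglik, x0=np.log([1.0, np.mean(obs)]), args=(obs, n),
                   method="Nelder-Mead")
    return np.exp(res.x)

n, r, shape, scale = 40, 30, 2.0, 1.5
obs = type2_censored_sample(n, r, shape, scale)
k_hat, lam_hat = fit_weibull_type2(obs, n)

# Percentile parametric bootstrap for (shape, scale).
boot = np.array([fit_weibull_type2(type2_censored_sample(n, r, k_hat, lam_hat), n)
                 for _ in range(500)])
lo, hi = np.percentile(boot, [2.5, 97.5], axis=0)
print(f"shape: {k_hat:.3f}  95% bootstrap CI [{lo[0]:.3f}, {hi[0]:.3f}]")
print(f"scale: {lam_hat:.3f}  95% bootstrap CI [{lo[1]:.3f}, {hi[1]:.3f}]")
```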
