Similar Articles
20 similar articles found.
1.
In this article, we study the strong laws of large numbers and the asymptotic equipartition property (AEP) for mth-order asymptotic odd–even Markov chains indexed by an m-rooted Cayley tree. First, the definition of an mth-order asymptotic odd–even Markov chain indexed by an m-rooted Cayley tree is introduced, and the strong limit theorem for these Markov chains is established. Next, the strong laws of large numbers for the frequencies of ordered couples of states for mth-order asymptotic odd–even Markov chains indexed by an m-rooted Cayley tree are obtained. Finally, we prove the AEP for these Markov chains.
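As a rough companion to the tree-indexed laws of large numbers above, the following minimal sketch simulates a much simpler first-order homogeneous case (not the mth-order asymptotic odd–even chains of the article): a Markov chain indexed by a rooted Cayley tree in which every vertex has the same number of children, with empirical frequencies of ordered couples of states compared against π(i)P(i, j). The transition matrix, branching degree, and tree depth are illustrative assumptions.

```python
import numpy as np

# Minimal sketch: first-order homogeneous Markov chain indexed by a rooted
# Cayley tree (every vertex has `degree` children).  The empirical frequency
# of ordered couples (parent state, child state) over all edges is compared
# with pi[i] * P[i, j].  P, degree and depth are illustrative choices only.
rng = np.random.default_rng(0)
P = np.array([[0.7, 0.3],
              [0.4, 0.6]])            # hypothetical transition matrix
degree, depth = 3, 10                 # Cayley tree parameters

# stationary distribution of P (left eigenvector for eigenvalue 1)
w, v = np.linalg.eig(P.T)
pi = np.real(v[:, np.argmax(np.real(w))])
pi /= pi.sum()

pair_counts = np.zeros_like(P)
level = [rng.choice(2, p=pi)]         # root state drawn from pi
for _ in range(depth):
    next_level = []
    for parent in level:
        children = rng.choice(2, size=degree, p=P[parent])
        for child in children:
            pair_counts[parent, child] += 1
            next_level.append(child)
    level = next_level

print("empirical couple frequencies:\n", pair_counts / pair_counts.sum())
print("pi[i] * P[i, j]:\n", pi[:, None] * P)
```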

2.
In this article, we study the strong law of large numbers (LLN) and the Shannon-McMillan theorem for an mth-order nonhomogeneous Markov chain indexed by an m-rooted Cayley tree. This article generalizes the corresponding results for mth-order nonhomogeneous Markov chains indexed by an m-rooted Cayley tree.

3.
In this paper, we introduce a model of a second-order circular Markov chain indexed by a two-rooted Cayley tree and establish two strong laws of large numbers and the asymptotic equipartition property (AEP) for circular second-order finite Markov chains indexed by this homogeneous tree. In the proof, we apply a limit property for a sequence of multivariate functions of a nonhomogeneous Markov chain indexed by such a tree. As a corollary, we obtain the strong law of large numbers and the AEP for the second-order finite homogeneous Markov chain indexed by the two-rooted homogeneous tree.

4.
In this article, we introduce the notion of a countable asymptotic circular Markov chain and prove a strong law of large numbers. As a corollary, we generalize a well-known version of the strong law of large numbers for nonhomogeneous Markov chains, and we prove the Shannon-McMillan-Breiman theorem in this context, extending the result for the finite case.

5.
Hai-Bo Yu, Stochastic Models, 2017, 33(4): 551–571
Motivated by various applications in queueing theory, this article is devoted to the stochastic monotonicity and comparability of Markov chains with block-monotone transition matrices. First, we introduce the notion of block-increasing convex order for probability vectors and characterize block-monotone matrices in the sense of the block-increasing order and the block-increasing convex order. Second, we characterize the Markov chain with a general transition matrix via martingales and provide a stochastic comparison of two block-monotone Markov chains under the two block-monotone orders. Third, stochastic comparison results are given for the Markov chains corresponding to the discrete-time GI/G/1 queue with different service distributions under the two block-monotone orders, and lower and upper bounds for the Markov chain corresponding to the discrete-time GI/G/1 queue are found in the sense of the block-increasing convex order.

6.
This paper studies the asymptotic distribution of the largest eigenvalue of the sample covariance matrix. The multivariate distribution of the population is assumed to be elliptical with finite kurtosis 3κ. An expression, as an expectation, is obtained for the distribution function of the largest eigenvalue regardless of the multiplicity, m, of the population's largest eigenvalue. The asymptotic distribution function and density function are evaluated numerically for m = 2, 3, 4, 5. The bootstrap of the average of the m largest eigenvalues is shown to be consistent for any underlying distribution with finite fourth-order cumulants.
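The eigenvalue bootstrap mentioned above is easy to sketch: resample rows of the data matrix, recompute the sample covariance, and record the average of the m largest eigenvalues. The data-generating model (a population whose largest eigenvalue has multiplicity 2), the choice m = 2, and the number of replicates are illustrative assumptions, not the article's settings.

```python
import numpy as np

# Minimal sketch: bootstrap the average of the m largest eigenvalues of the
# sample covariance matrix.  The simulated population has a largest
# eigenvalue of multiplicity 2; n, p, m and B are illustrative assumptions.
rng = np.random.default_rng(1)
n, p, m, B = 500, 6, 2, 1000
X = rng.standard_normal((n, p)) * np.array([2.0, 2.0, 1.0, 1.0, 1.0, 1.0])

def top_m_eig_avg(data, m):
    """Average of the m largest eigenvalues of the sample covariance."""
    vals = np.linalg.eigvalsh(np.cov(data, rowvar=False))  # ascending order
    return vals[-m:].mean()

theta_hat = top_m_eig_avg(X, m)
boot = np.empty(B)
for b in range(B):
    idx = rng.integers(0, n, size=n)      # resample rows with replacement
    boot[b] = top_m_eig_avg(X[idx], m)

lo, hi = np.percentile(boot, [2.5, 97.5])
print(f"estimate {theta_hat:.3f}, 95% percentile bootstrap CI ({lo:.3f}, {hi:.3f})")
```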

7.
In this article, we introduce and study Markov systems on general spaces (MSGS) as a first step towards an entire theory on the subject. All the concepts and basic results needed for this purpose are also given and analyzed. This can be thought of as an extension of the theory of nonhomogeneous Markov systems (NHMS) and of nonhomogeneous semi-Markov systems (NHSMS) on countable spaces, which has seen interesting growth over the last thirty years. In addition, we study the asymptotic behaviour, or ergodicity, of Markov systems on general state spaces. The problem of the asymptotic behaviour of Markov chains has been central for finite or countable spaces since the foundation of the subject, and it has also been basic in the theory of NHMS and NHSMS. Two basic theorems are provided in answer to the important problem of the asymptotic distribution of the population of the memberships of a Markov system living in the general space (X, ℬ(X)). Finally, we study the total variability from the invariant measure of the Markov system, given that an asymptotic behaviour exists, and prove a theorem which states that the total variation is finite. This problem is also known as the coupling problem.

8.
In this article, a semi-Markovian random walk with delay and a discrete interference of chance (X(t)) is considered. It is assumed that the random variables ζ_n, n = 1, 2,…, which describe the discrete interference of chance, form an ergodic Markov chain whose ergodic distribution is a gamma distribution with parameters (α, λ). Under this assumption, asymptotic expansions for the first four moments of the ergodic distribution of the process X(t) are derived as λ → 0. Moreover, using the Riemann zeta function, the coefficients of these asymptotic expansions are expressed in terms of numerical characteristics of the summands when the process considered is a semi-Markovian Gaussian random walk with small drift β.

9.
This article addresses the detection of significant repeats in sequences. The case of self-overlapping leftmost repeats in large sequences generated by a homogeneous stationary Markov chain has not been treated in the literature. In this work, we are interested in approximating the distribution of the number of self-overlapping leftmost sufficiently long repeats in a homogeneous stationary Markov chain. Using the Chen–Stein method, we show that this distribution can be approximated by a Poisson distribution. Moreover, we show that the approximation extends to the case where the sequences are generated by an mth-order Markov chain.

10.
Although Markov chain Monte Carlo methods have been widely used in many disciplines, exact eigenanalysis for such generated chains has been rare. In this paper, a special Metropolis-Hastings algorithm, Metropolized independent sampling, first proposed in Hastings (1970), is studied in full detail. The eigenvalues and eigenvectors of the corresponding Markov chain, as well as a sharp bound for the total variation distance between the nth updated distribution and the target distribution, are provided. Furthermore, the relationship between this scheme, rejection sampling, and importance sampling is studied, with emphasis on their relative efficiencies. It is shown that Metropolized independent sampling is superior to rejection sampling in two respects: asymptotic efficiency and ease of computation.
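Metropolized independent sampling itself is short to write down: proposals come from a fixed density q, independently of the current state, and a move from x to y is accepted with probability min{1, w(y)/w(x)}, where w = π/q is the importance weight. The sketch below uses an illustrative standard normal target with a wider normal proposal; it is not tied to the eigenanalysis of the article.

```python
import numpy as np

# Minimal sketch of Metropolized independent sampling (independence
# Metropolis-Hastings): propose y ~ q independently of the current state x
# and accept with probability min(1, w(y)/w(x)), where w = pi/q.
# Target and proposal below are illustrative choices.
rng = np.random.default_rng(2)

def log_target(x):            # pi: standard normal, up to a constant
    return -0.5 * x * x

def log_proposal(x):          # q: normal with standard deviation 2
    return -0.5 * (x / 2.0) ** 2

def log_w(x):                 # log importance weight, log(pi/q)
    return log_target(x) - log_proposal(x)

n_iter, x, accepted = 20000, 0.0, 0
samples = np.empty(n_iter)
for t in range(n_iter):
    y = 2.0 * rng.standard_normal()            # draw from q, ignoring x
    if np.log(rng.random()) < log_w(y) - log_w(x):
        x, accepted = y, accepted + 1
    samples[t] = x

print(f"acceptance rate {accepted / n_iter:.2f}, "
      f"sample mean {samples.mean():.3f}, sample variance {samples.var():.3f}")
```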

11.
We address the problem of robust model selection for finite-memory stochastic processes. Consider m independent samples, most of them being realizations of the same stochastic process with law Q, which is the one we want to retrieve. We define the asymptotic breakdown point γ for a model selection procedure and devise a model selection procedure. We compute the value of γ, which is 0.5 when all the processes are Markovian. This result is valid for any family of finite-order Markov models, but for simplicity we focus on the family of variable-length Markov chains.

12.
Yang et al. (J. Math. Anal. Appl., 410 (2014), 179–189) obtained the strong law of large numbers and the asymptotic equipartition property for asymptotic even–odd Markov chains indexed by a homogeneous tree. In this article, we study the strong law of large numbers and the asymptotic equipartition property for a class of nonhomogeneous Markov chains indexed by a homogeneous tree, which generalize the above results. We also provide an example showing that our generalizations are not trivial.

13.
Extensions of some limit theorems are proved for tail probabilities of sums of independent identically distributed random variables satisfying the one-sided or two-sided Cramér condition. The large-deviation x-region under consideration is broader than in the classical Cramér theorem, and the estimate of the remainder is uniform with respect to x. The corresponding asymptotic expansion with arbitrarily many summands is also obtained.
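As a point of reference for the classical regime that these extensions refine, the following sketch compares the exact tail probability P(S_n ≥ na) for i.i.d. Bernoulli(p) summands with the leading-order Cramér estimate exp(−nI(a)), where I is the Legendre transform of the cumulant generating function. The choices p = 0.3, a = 0.5 and the values of n are illustrative.

```python
import math

# Minimal numerical check of the classical Cramér regime for Bernoulli(p)
# summands: -(1/n) log P(S_n >= n*a) should approach the rate function I(a).
# p, a and the sample sizes n are illustrative choices.
p, a = 0.3, 0.5

def rate(a, p):
    """Cramér rate function I(a) for Bernoulli(p), 0 < a < 1."""
    return a * math.log(a / p) + (1 - a) * math.log((1 - a) / (1 - p))

def exact_tail(n, a, p):
    """P(S_n >= n*a) computed exactly from the binomial distribution."""
    k0 = math.ceil(n * a)
    return sum(math.comb(n, j) * p ** j * (1 - p) ** (n - j)
               for j in range(k0, n + 1))

for n in (50, 200, 800):
    tail = exact_tail(n, a, p)
    print(f"n = {n:4d}:  -log P / n = {-math.log(tail) / n:.4f},"
          f"  I(a) = {rate(a, p):.4f}")
```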

14.
In this paper, we study the strong law of large numbers for the delayed sums of Markov chains indexed by a Cayley tree with a countable state space. First, we prove a strong limit theorem for the delayed sums of bivariate functions of Markov chains indexed by a Cayley tree. Second, the strong law of large numbers for the frequencies of occurrence of states in the delayed sums is obtained. As a corollary, we obtain the strong law of large numbers for the frequencies of occurrence of states for countable Markov chains indexed by a Cayley tree.

15.
Two-dimensionally indexed random coefficient autoregressive models (2D-RCAR) and the corresponding statistical inference are important tools for the analysis of spatial lattice data. The study of such models is motivated by their second-order properties, which are similar to those of 2D-(G)ARCH models, which play an important role in spatial econometrics. In this article, we study the asymptotic properties of the two-stage generalized method of moments (2S-GMM) under a general asymptotic framework for 2D-RCAR models. The efficiency, strong consistency, asymptotic normality, and hypothesis tests of the 2S-GMM estimator are derived. A simulation experiment is presented to highlight the theoretical results.

16.
17.
Consider a population of n individuals that move independently among a finite set {1, 2,…, k} of states in a sequence of trials t = 0, 1, 2,…, m, each according to a Markov chain with transition probability matrix P. This paper deals with the problem of estimating P on the basis of aggregate data which record only the numbers of individuals that occupy each of the k states at times t = 0, 1, 2,…, m. Estimation is accomplished using conditional least squares, and asymptotic results are verified for the case n → ∞. A weighted least-squares estimator is introduced and compared with previous estimators. Some comments are made on estimability questions that arise when only aggregate data are available.
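A bare-bones version of the aggregate-data estimation problem can be sketched as follows, using plain (unconstrained) least squares based on E[Y_{t+1} | Y_t] = PᵀY_t for the aggregate count vectors Y_t, followed by an ad hoc projection onto stochastic matrices. The article's conditional and weighted least-squares estimators are more refined; the transition matrix, population size, and number of trials below are illustrative assumptions.

```python
import numpy as np

# Minimal sketch: recover P from aggregate counts Y_t only, via ordinary
# least squares on Y_{t+1}^T ≈ Y_t^T P, then clip and renormalize rows.
# This is a simplification of the article's conditional/weighted least
# squares; P_true, n and m are illustrative assumptions.
rng = np.random.default_rng(3)
k, n, m = 3, 2000, 60
P_true = np.array([[0.8, 0.1, 0.1],
                   [0.2, 0.7, 0.1],
                   [0.1, 0.3, 0.6]])

# simulate n independent individuals, but record only the aggregate counts
states = rng.integers(0, k, size=n)
Y = np.empty((m + 1, k))
Y[0] = np.bincount(states, minlength=k)
for t in range(m):
    states = np.array([rng.choice(k, p=P_true[s]) for s in states])
    Y[t + 1] = np.bincount(states, minlength=k)

A, B = Y[:-1], Y[1:]                       # B ≈ A @ P, row-wise over t
P_hat, *_ = np.linalg.lstsq(A, B, rcond=None)
P_hat = np.clip(P_hat, 0.0, None)
P_hat /= P_hat.sum(axis=1, keepdims=True)  # force rows to sum to one
print(np.round(P_hat, 3))
```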

18.
Traditional resampling methods for estimating sampling distributions sometimes fail, and alternative approaches are then needed. For example, if the classical central limit theorem does not hold and the naïve bootstrap fails, the m/n bootstrap, based on smaller-sized resamples, may be used as an alternative. The sufficient bootstrap, which uses only the distinct observations in a bootstrap sample, is another recently proposed alternative to the naïve bootstrap, suggested to reduce the computational burden associated with bootstrapping. It works as long as the naïve bootstrap does; however, if the naïve bootstrap fails, so will the sufficient bootstrap. In this paper, we propose combining the sufficient bootstrap with the m/n bootstrap in order both to regain consistent estimation of sampling distributions and to reduce the computational burden of the bootstrap. We obtain necessary and sufficient conditions for the asymptotic normality of the proposed method and propose new values for the resample size m. We compare the proposed method with the naïve bootstrap, the sufficient bootstrap, and the m/n bootstrap by simulation.
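The proposed combination is simple to prototype: draw an m-out-of-n resample with replacement and evaluate the statistic on its distinct observations only, as in the sufficient bootstrap. The statistic (a sample mean), the rule m = ⌊n^0.7⌋, and the sample sizes below are illustrative assumptions, not the article's recommended choices of m.

```python
import numpy as np

# Minimal sketch of the m/n + sufficient bootstrap combination: resample
# m < n values with replacement, keep only the distinct observations, and
# evaluate the statistic on them.  The sample mean, m = int(n**0.7), n and
# B are illustrative choices.
rng = np.random.default_rng(4)

def mn_sufficient_bootstrap(data, m, stat, B=2000):
    """m/n bootstrap evaluated on the distinct values of each resample."""
    out = np.empty(B)
    for b in range(B):
        resample = rng.choice(data, size=m, replace=True)
        out[b] = stat(np.unique(resample))      # sufficient-bootstrap step
    return out

n = 1000
x = rng.standard_normal(n)
m = int(n ** 0.7)
boot = mn_sufficient_bootstrap(x, m, np.mean)
print(f"point estimate {x.mean():.3f}, bootstrap sd {boot.std(ddof=1):.3f}, "
      f"resample size m = {m}")
```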

19.
Stochastic Models, 2013, 29(2–3): 343–375
The purpose of this article is to present analytic methods for determining the asymptotic behaviour of the coefficients of power series that can be applied to homogeneous discrete quasi-birth-and-death processes. It turns out that there are in principle only three types of asymptotic behaviour: the process either converges to the stationary distribution, or it can be approximated in terms of a reflected Brownian motion, or by a Brownian motion. In terms of Markov chains, these cases correspond to positive recurrence, null recurrence, and non-recurrence. The same results hold in the continuous-time case as well.

20.
Stochastic Models, 2013, 29(1): 75–111
In this paper, we study the classification problem for discrete-time and continuous-time Markov processes with a tree structure. We first show some useful properties associated with the fixed points of a nondecreasing mapping; in particular, we find conditions for a fixed point to be the minimal fixed point, using fixed point theory and degree theory. We then use these results to identify conditions for Markov chains of M/G/1 type or GI/M/1 type with a tree structure to be positive recurrent, null recurrent, or transient. The results are generalized to Markov chains of matrix M/G/1 type with a tree structure. For all these cases, a relationship is established between a certain fixed point, the matrix of partial derivatives (the Jacobian) associated with that fixed point, and the classification of the Markov chain with a tree structure. More specifically, we show that the Perron-Frobenius eigenvalue of the matrix of partial derivatives associated with a certain fixed point provides the information needed for a complete classification of the Markov chains of interest.
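The fixed-point machinery described above can be illustrated in a classical low-dimensional analogue rather than in the tree-structured M/G/1 setting of the article: for a two-type branching process, iterating the (nondecreasing) vector of offspring generating functions from the zero vector yields its minimal fixed point, and the Perron-Frobenius eigenvalue of the Jacobian of that mapping at s = 1 (the mean offspring matrix) decides whether the minimal fixed point equals the vector of ones. The offspring distributions below are illustrative assumptions.

```python
import numpy as np

# Illustrative analogue only: a two-type branching process, not the
# M/G/1-type tree-structured chains of the article.  Iterating the
# nondecreasing mapping f (the vector of offspring pgfs) from 0 gives the
# minimal fixed point; the Perron-Frobenius eigenvalue of the mean matrix
# (the Jacobian of f at s = 1) tells whether that fixed point equals 1.
offspring = {                      # hypothetical offspring distributions
    0: {(0, 0): 0.2, (2, 0): 0.5, (0, 1): 0.3},
    1: {(0, 0): 0.4, (1, 1): 0.6},
}

def f(s):
    """Nondecreasing mapping on [0,1]^2: vector of offspring pgfs."""
    return np.array([sum(p * s[0] ** k1 * s[1] ** k2
                         for (k1, k2), p in offspring[i].items())
                     for i in range(2)])

q = np.zeros(2)                    # iterate from 0 -> minimal fixed point
for _ in range(10000):
    q_next = f(q)
    if np.max(np.abs(q_next - q)) < 1e-12:
        break
    q = q_next

# mean matrix M[i, j] = expected number of type-j children of a type-i parent
M = np.array([[sum(p * k[j] for k, p in offspring[i].items())
               for j in range(2)] for i in range(2)])
rho = max(abs(np.linalg.eigvals(M)))
print("minimal fixed point:", np.round(q, 4))
print("Perron-Frobenius eigenvalue:", round(rho, 4),
      "->", "minimal fixed point = 1" if rho <= 1 else "minimal fixed point < 1")
```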
