Similar Documents
 20 similar documents found (search time: 682 ms)
1.
Abstract

In this paper a new stochastic process is introduced by subordinating fractional Lévy stable motion (FLSM) with a gamma process. The new process incorporates stochastic volatility in the parent process FLSM. Fractional-order moments, tail asymptotics, codifference and persistence of signs of long-range dependence of the new process are discussed. A step-by-step procedure for simulating sample trajectories and estimating the parameters of the introduced process is given. Our study complements and generalizes, in various directions, the results available on the variance-gamma process and fractional Laplace motion, both well-studied processes in the literature.
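The subordination idea the abstract generalizes can be sketched for the simpler variance-gamma case: ordinary Brownian motion run on a random gamma clock. (Subordinating FLSM itself would replace the Gaussian increments with fractional Lévy stable ones; all parameter values below are illustrative, not from the paper.)

```python
import random

def variance_gamma_path(n, dt=0.01, nu=0.2, sigma=1.0, seed=1):
    """Sketch of a variance-gamma path: Brownian motion subordinated
    by a gamma process with mean increment dt and variance rate nu."""
    rng = random.Random(seed)
    x, path = 0.0, [0.0]
    for _ in range(n):
        g = rng.gammavariate(dt / nu, nu)      # gamma time increment, mean dt
        x += rng.gauss(0.0, sigma * g ** 0.5)  # BM evaluated on the random clock
        path.append(x)
    return path
```

The random clock makes increments heavier-tailed than Gaussian, which is the "stochastic volatility" effect the abstract refers to.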

2.
Stochastic models are of fundamental importance in many scientific and engineering applications. For example, stochastic models provide valuable insights into the causes and consequences of intra-cellular fluctuations and inter-cellular heterogeneity in molecular biology. The chemical master equation can be used to model intra-cellular stochasticity in living cells, but analytical solutions are rare and numerical simulations are computationally expensive. Inference of system trajectories and estimation of model parameters from observed data are important tasks and are even more challenging. Here, we consider the case where the observed data are aggregated over time. Aggregation of data over time is required in studies of single cell gene expression using a luciferase reporter, where the emitted light can be very faint and is therefore collected for several minutes for each observation. We show how an existing approach to inference based on the linear noise approximation (LNA) can be generalised to the case of temporally aggregated data. We provide a Kalman filter (KF) algorithm which can be combined with the LNA to carry out inference of system variable trajectories and estimation of model parameters. We apply and evaluate our method on both synthetic and real data scenarios and show that it is able to accurately infer the posterior distribution of model parameters in these examples. We demonstrate how applying standard KF inference to aggregated data without accounting for aggregation will tend to underestimate the process noise and can lead to biased parameter estimates.
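The standard KF recursion that the paper modifies for aggregated observations can be sketched in the scalar case. This is the textbook filter only, not the paper's aggregation-aware variant; model coefficients and data are hypothetical.

```python
def kalman_step(m, P, y, a, q, h, r):
    """One predict/update cycle for a scalar linear-Gaussian model:
    x_t = a*x_{t-1} + N(0, q),  y_t = h*x_t + N(0, r)."""
    # Predict
    m_pred = a * m
    P_pred = a * a * P + q
    # Update
    S = h * h * P_pred + r      # innovation variance
    K = P_pred * h / S          # Kalman gain
    m_new = m_pred + K * (y - h * m_pred)
    P_new = (1.0 - K * h) * P_pred
    return m_new, P_new

# Filter a short synthetic series with illustrative parameters
m, P = 0.0, 1.0
for y in [1.2, 0.9, 1.1]:
    m, P = kalman_step(m, P, y, a=0.95, q=0.1, h=1.0, r=0.25)
```

In the aggregated-data setting, the observation `y` is an integral of the state over the collection window, so `h` and `r` must be replaced by window-dependent quantities, which is the generalization the paper works out.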

3.
The polar plumes are very fine structures of the solar K-corona lying around the poles and visible during periods of minimum activity. These poorly known structures are linked with the solar magnetic field and with numerous coronal phenomena such as the fast solar wind and the coronal holes. The SOHO space mission has provided continuous observations at high cadence (one image every 10 min). From these observations, images of the K-corona have been derived and preprocessed with an adapted anisotropic filtering. Then a peculiar type of sinogram called the Time Intensity Diagram (TID), adapted to the evolution of polar plumes over time, has been built. A multiresolution wavelet analysis of the TID then revealed that both the spatial distribution of the polar plumes and their temporal evolution are fractal. The present study consists of simulating polar plumes by forward-modeling techniques in order to validate several assumptions concerning their nature and their temporal evolution. Our work involves two main steps. The first concerns the simulation of polar plumes and the computation of their corresponding TID. The second concerns the estimation of analysis criteria with which to compare the original TID and the simulated ones. Both static and dynamic models were used to confirm the fractal behavior of the temporal evolution of plumes. The most recent and promising model is based on a Hidden Markov Tree, which allows us to control the fractal parameters of the TID.

4.
This paper addresses the image modeling problem under the assumption that images can be represented by third-order, hidden Markov mesh random field models. The range of applications of the techniques described hereafter comprises the restoration of binary images, the modeling and compression of image data, as well as the segmentation of gray-level or multi-spectral images, and image sequences under the short-range motion hypothesis. We outline coherent approaches to both the problems of image modeling (pixel labeling) and estimation of model parameters (learning). We derive a real-time labeling algorithm, based on a maximum marginal a posteriori probability criterion, for a hidden third-order Markov mesh random field model. Our algorithm achieves minimum time and space complexities simultaneously, and we describe what we believe to be the most appropriate data structures to implement it. Critical aspects of the computer simulation of a real-time implementation are discussed, down to the computer code level. We develop an (unsupervised) learning technique by which the model parameters can be estimated without ground truth information. We lay bare the conditions under which our approach can be made time-adaptive in order to cope with short-range motion in dynamic image sequences. We present extensive experimental results for both static and dynamic images from a wide variety of sources, comprising standard, infra-red and aerial images, as well as a sequence of ultrasound images of a fetus and a series of frames from a motion picture sequence. These experiments demonstrate that the method is subjectively relevant to the problems of image restoration, segmentation and modeling.

5.
The generalized Laplacian distribution is considered. A new distribution called the geometric generalized Laplacian distribution is introduced and its properties are studied. First- and higher-order autoregressive processes with these stationary marginal distributions are developed and studied. Simulation studies are conducted and trajectories of the process are obtained for selected values of the parameters. Various areas of application of these models are discussed.

6.

The additive AR-2D model has been successfully applied to the modeling of satellite images, both optical and synthetic aperture radar. Given the errors introduced when an image is captured and quantized, robust estimation of the parameters of this model is an interesting problem. Moreover, robust methods for image models apply to important image-processing tasks such as texture-based segmentation and image restoration in the presence of outliers. This paper is concerned with the development and performance of the robust RA estimator proposed by Ojeda (1998) for estimating the parameters of contaminated AR-2D models. We implement this estimator and show, by a simulation study, that it outperforms the classical least squares estimator and the robust M and GM estimators on an image model contaminated with additive outliers.

7.
Objectives in many longitudinal studies of individuals infected with the human immunodeficiency virus (HIV) include the estimation of population average trajectories of HIV ribonucleic acid (RNA) over time and tests for differences in trajectory across subgroups. Special features that are often inherent in the underlying data include a tendency for some HIV RNA levels to be below an assay detection limit, and for individuals with high initial levels or high rates of change to drop out of the study early because of illness or death. We develop a likelihood for the observed data that incorporates both of these features. Informative drop-outs are handled by means of an approach previously published by Schluchter. Using data from the HIV Epidemiology Research Study, we implement a maximum likelihood procedure to estimate initial HIV RNA levels and slopes within a population, compare these parameters across subgroups of HIV-infected women and illustrate the importance of appropriate treatment of left censoring and informative drop-outs. We also assess model assumptions and consider the prediction of random intercepts and slopes in this setting. The results suggest that marked bias in estimates of fixed effects, variance components and standard errors in the analysis of HIV RNA data might be avoided by the use of methods like those illustrated.

8.
In partial step-stress accelerated life testing, models are needed that extrapolate data obtained under more severe conditions to infer the lifetime distribution under normal use conditions. Bhattacharyya (Invited paper for the 46th session of the ISI, 1987) proposed a tampered Brownian motion process model and later derived the probability distribution from a decay-process perspective without the linearity assumption. In this paper, the model is described and the features of the failure time distribution are discussed. The maximum likelihood estimates of the parameters in the model and their asymptotic properties are presented. An application of step-stress accelerated life test models to fields other than engineering is described and illustrated by applying the tampered Brownian motion process model to data taken from a clinical trial.

9.
M-estimation (robust estimation) for the parameters in nonlinear mixed effects models using the Fisher scoring method is investigated in this article; the resulting estimators share some features of the existing maximum likelihood estimators, namely consistency and asymptotic normality. Score tests for autocorrelation and random effects based on M-estimation, together with their asymptotic distributions, are also studied. The performance of the test statistics is evaluated via simulations and a real-data analysis of plasma concentration data.

10.
Laplace motion is a Lévy process built upon Laplace distributions. Non-Gaussian stochastic fields that are integrals with respect to this process are considered, and methods for fitting such models are discussed. The proposed procedures allow for inference about the parameters of the underlying Laplace distributions. The fit of the dependence structure is also addressed. The importance of a convenient parameterization that admits natural and consistent estimation for this class of models is emphasized. Several parameterizations are introduced and their advantages over one another are discussed. The proposed estimation method targets the standard characteristics: mean, variance, skewness and kurtosis. Their sample equivalents are matched as closely as the natural constraints within this class allow. A simulation study and an example of potential applications conclude the article.
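The moment-matching idea can be illustrated in its simplest form for a symmetric scalar Laplace(mu, b) law, where the mean is mu and the variance is 2*b**2. The article matches mean, variance, skewness and kurtosis for a richer (asymmetric, field-valued) class; this two-moment sketch only conveys the mechanism.

```python
import math

def fit_laplace_moments(xs):
    """Method-of-moments fit of a symmetric Laplace(mu, b) distribution:
    match the sample mean and variance, using Var = 2 * b**2."""
    n = len(xs)
    mu = sum(xs) / n                          # match the mean
    var = sum((x - mu) ** 2 for x in xs) / n  # match the variance
    b = math.sqrt(var / 2.0)                  # recover the Laplace scale
    return mu, b
```

Matching skewness and kurtosis as well, as the article does, requires the asymmetric generalized Laplace family and introduces the natural constraints mentioned in the abstract (e.g. excess kurtosis bounded below).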

11.
A new Bayesian state and parameter learning algorithm for multiple target tracking models with image observations is proposed. Specifically, a Markov chain Monte Carlo algorithm is designed to sample from the posterior distribution of the unknown time-varying number of targets, their birth and death times and states, as well as the model parameters, which constitutes the complete solution to the specific tracking problem we consider. The conventional approach is to pre-process the images to extract point observations and then perform tracking, i.e. infer the target trajectories. We model the image generation process directly to avoid any potential loss of information when extracting point observations in a pre-processing step that is decoupled from the inference algorithm. Numerical examples show that our algorithm has improved tracking performance over commonly used techniques, on both synthetic examples and real fluorescent microscopy data, especially in the case of dim targets with overlapping illuminated regions.

12.
Abstract. Parameter estimation in diffusion processes from discrete observations up to a first-passage time is clearly of practical relevance, but does not seem to have been studied so far. In neuroscience, many models for the membrane potential evolution involve the presence of an upper threshold. Data are modelled as discretely observed diffusions which are killed when the threshold is reached. Statistical inference is often based on a misspecified likelihood ignoring the presence of the threshold, causing severe bias; e.g. the bias incurred in the drift parameters of the Ornstein–Uhlenbeck model can be 25–100 per cent for biologically relevant parameters. We compute or approximate the likelihood function of the killed process. When estimating from a single trajectory, considerable bias may still be present, and the distribution of the estimates can be heavily skewed and have a large variance. Parametric bootstrap is effective in correcting the bias. Standard asymptotic results do not apply, but consistency and asymptotic normality may be recovered when multiple trajectories are observed, provided the mean first-passage time through the threshold is finite. Numerical examples illustrate the results, and an experimental data set of intracellular recordings of the membrane potential of a motoneuron is analysed.
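The parametric bootstrap bias correction mentioned above follows a generic recipe: simulate from the model at the fitted parameter, re-estimate on each simulated data set, and subtract the estimated bias. A minimal sketch, where `simulate` and `estimate` stand in for the model-specific killed-diffusion routines (hypothetical names, not from the paper):

```python
def bootstrap_bias_correct(theta_hat, simulate, estimate, B=200):
    """Parametric bootstrap bias correction of an estimator.
    simulate(theta) -> synthetic data set; estimate(data) -> estimate."""
    boot = [estimate(simulate(theta_hat)) for _ in range(B)]
    bias = sum(boot) / B - theta_hat   # estimated bias E[theta*_b] - theta_hat
    return theta_hat - bias            # bias-corrected estimate
```

For instance, if the estimator systematically overshoots by a constant, the correction removes that offset exactly; for the killed diffusions of the paper the bias is parameter-dependent, which is why simulation is needed.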

13.
We construct a mixture distribution including infant, exogenous and Gompertzian/non-Gompertzian senescent mortality. Using mortality data from Swedish females 1751–, we show that this outperforms models without these features, and compare its trends in cohort and period mortality over time. We find an almost complete disappearance of exogenous mortality within the last century of period mortality, with cohort mortality approaching the same limits. Both Gompertzian and non-Gompertzian senescent mortality are consistently present, with the estimated balance between them oscillating constantly. While the parameters of the latter appear to be trending over time, the parameters of the former do not.

14.
We study the maximum likelihood estimator of the drift parameters of a stochastic differential equation with drift and diffusion coefficients that are constant on the positive and negative axes, yet discontinuous at zero. This threshold diffusion is called drifted oscillating Brownian motion. For this continuously observed diffusion, the maximum likelihood estimator coincides with a quasi-likelihood estimator with constant diffusion term. We show that this estimator is the limit, as observations become dense in time, of the (quasi-)maximum likelihood estimator based on discrete observations. In the long-time regime, the asymptotic behavior of the positive and negative occupation times governs that of the estimators. Unlike most known results in the literature, we do not restrict ourselves to the ergodic framework: indeed, depending on the signs of the drift, the process may be ergodic, transient, or null recurrent. For each regime, we establish whether or not the estimators are consistent; if they are, we prove the convergence in long time of the properly rescaled difference of the estimators towards a normal or mixed normal distribution. These theoretical results are backed by numerical simulations.

15.
We propose a flexible functional approach for modelling generalized longitudinal data and survival time using principal components. In the proposed model the longitudinal observations can be continuous or categorical data, such as Gaussian, binomial or Poisson outcomes. We generalize the traditional joint models that treat categorical data as continuous data by using some transformations, such as CD4 counts. The proposed model is data-adaptive, which does not require pre-specified functional forms for longitudinal trajectories and automatically detects characteristic patterns. The longitudinal trajectories observed with measurement error or random error are represented by flexible basis functions through a possibly nonlinear link function, combining dimension reduction techniques resulting from functional principal component (FPC) analysis. The relationship between the longitudinal process and event history is assessed using a Cox regression model. Although the proposed model inherits the flexibility of non-parametric methods, the estimation procedure based on the EM algorithm is still parametric in computation, and thus simple and easy to implement. The computation is simplified by dimension reduction for random coefficients or FPC scores. An iterative selection procedure based on the Akaike information criterion (AIC) is proposed to choose the tuning parameters, such as the knots of the spline basis and the number of FPCs, so that an appropriate degree of smoothness and fluctuation can be achieved. The effectiveness of the proposed approach is illustrated through a simulation study, followed by an application to longitudinal CD4 counts and survival data which were collected in a recent clinical trial to compare the efficiency and safety of two antiretroviral drugs.

16.
This article deals with the problem of estimating all the unknown parameters in drifted fractional Brownian motion from discretely sampled data. The estimation procedure combines the variation method with ergodic theory. The strong consistency of these estimators is established. Moreover, our method and two existing approaches are compared, in terms of computational running time and estimation accuracy, via simulation studies. We also apply the proposed method to real high-frequency financial data within a 4-hour window of the trading day from the Chinese mainland stock market.
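A core ingredient of variation-based estimation for fractional Brownian motion is that mean squared increments scale as the lag to the power 2H, so comparing variations at two lags recovers the Hurst index. A hedged sketch of that one step (the article's full procedure also estimates the drift and scale, which this omits):

```python
import math

def hurst_from_variations(x):
    """Estimate H from a discretely sampled path using quadratic
    variations at lags 1 and 2:  E[(X_{t+d}-X_t)^2] ~ c * d^(2H),
    hence H ~ log2(V2 / V1) / 2."""
    v1 = sum((x[i + 1] - x[i]) ** 2 for i in range(len(x) - 1)) / (len(x) - 1)
    v2 = sum((x[i + 2] - x[i]) ** 2 for i in range(len(x) - 2)) / (len(x) - 2)
    return 0.5 * math.log(v2 / v1, 2)
```

As a sanity check, a perfectly smooth (linear) path gives V2/V1 = 4 and hence H = 1, the smooth-path limit, while independent increments give V2/V1 = 2 and H = 1/2.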

17.
Two families of distributions are introduced and studied within the framework of parametric survival analysis. The families are derived from a general linear form by specifying a function of the survival function with certain restrictions. Distributions within each family are generated by transformations of the survival time variable subject to certain restrictions. Two specific transformations were selected and, thus, four distributions are identified for further study. The distributions have one scale and two shape parameters and include as special cases the exponential, Weibull, log-logistic and Gompertz distributions. One of the new distributions, the modified Weibull, is studied in some detail.

The distributions are developed with an emphasis on those features that data analysts find especially useful for survivorship studies. A wide variety of hazard shapes is available. The survival, density and hazard functions may be written in simple algebraic forms. Parameter estimation is demonstrated using the least squares and maximum likelihood methods. Graphical techniques to assess goodness of fit are demonstrated. The models may be extended to include concomitant information.

18.
The iterative simulation of the Brownian bridge is well known. In this article, we present a vectorial simulation alternative based on Gaussian processes for machine learning regression that is suitable for implementation in interpreted programming languages. We extend the vectorial simulation of path-dependent trajectories to other Gaussian processes, namely sequences of Brownian bridges, geometric Brownian motion, fractional Brownian motion, and the Ornstein–Uhlenbeck mean-reversion process.
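For orientation, the Brownian bridge on [0, 1] from 0 to 0 can be simulated via the elementary representation B(t) = W(t) - t * W(1); the article's vectorial approach instead samples the whole path at once from the bridge covariance Cov(B_s, B_t) = min(s, t) - s*t, viewed as Gaussian-process regression. The sketch below uses the elementary route only; step count and seeding are illustrative.

```python
import random

def brownian_bridge(n, seed=0):
    """Simulate a Brownian bridge on [0, 1] pinned at 0 at both ends,
    on a grid of n steps, via B(t) = W(t) - t * W(1)."""
    rng = random.Random(seed)
    dt = 1.0 / n
    w, path = 0.0, [0.0]
    for _ in range(n):                 # build a standard Brownian path W
        w += rng.gauss(0.0, dt ** 0.5)
        path.append(w)
    w1 = path[-1]
    return [x - (i / n) * w1 for i, x in enumerate(path)]  # pin the endpoint
```

The vectorial alternative trades this O(n) loop for one draw from an n-dimensional Gaussian (a Cholesky factorization of the covariance matrix), which is the form that vectorizes well in interpreted languages.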

19.
Many records in environmental sciences exhibit asymmetric trajectories. The physical mechanisms behind these records may lead, for example, to sample paths with different characteristics at high and low levels (up–down asymmetries) or in the ascending and descending phases, leading to time irreversibility (front–back asymmetries). Such features are important for many applications, and there is a need for simple and tractable models that can reproduce them. In this paper, we explore original time-change models where the clock is a stochastic process that depends on the observed trajectory. The ergodicity of the proposed model is established under general conditions, and this result is used to develop nonparametric estimation procedures based on the joint distribution of the process and its derivative. The methodology is illustrated on meteorological and oceanographic data sets. We show that, combined with a marginal transformation, the proposed methodology is able to reproduce important characteristics of the data set such as marginal distributions, up-crossing intensity, and up–down and front–back asymmetries.

20.
When an appropriate parametric model and a prior distribution of its parameters are given to describe clinical time courses of a dynamic biological process, Bayesian approaches allow us to estimate the entire profiles from a few or even a single observation per subject. The goodness of the estimation depends on the measurement points at which the observations were made. The number of measurement points per subject is generally limited to one or two. The limited measurement points have to be selected carefully. This paper proposes an approach to the selection of the optimum measurement point for Bayesian estimations of clinical time courses. The selection is made among given candidates, based on the goodness of estimation evaluated by the Kullback-Leibler information. This information measures the discrepancy of an estimated time course from the true one specified by a given appropriate model. The proposed approach is applied to a pharmacokinetic analysis, which is a typical clinical example where the selection is required. The results of the present study strongly suggest that the proposed approach is applicable to pharmacokinetic data and has a wide range of clinical applications.  相似文献
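The selection criterion above scores candidate measurement points by the Kullback-Leibler discrepancy between the estimated and true time courses. For the scalar Gaussian case that discrepancy has a standard closed form; the sketch below states it (its use as the selection score follows the abstract, but the Gaussian reduction is our illustrative simplification).

```python
import math

def kl_gauss(mu0, s0, mu1, s1):
    """KL( N(mu0, s0^2) || N(mu1, s1^2) ), the standard closed form:
    log(s1/s0) + (s0^2 + (mu0 - mu1)^2) / (2 * s1^2) - 1/2."""
    return math.log(s1 / s0) + (s0 ** 2 + (mu0 - mu1) ** 2) / (2.0 * s1 ** 2) - 0.5
```

A candidate measurement point would then be preferred when the time course estimated from an observation at that point yields the smallest expected KL value against the model-specified true course.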


Copyright © Beijing Qinyun Technology Development Co., Ltd. (北京勤云科技发展有限公司)  京ICP备09084417号