Similar Documents
1.
In this paper we motivate solutions to the simultaneous estimation of multiple dynamic processes in situations where the correspondence between the set of measurements and the set of processes is uncertain, so that special modelling is required to accommodate the unclassified data. We derive the optimal Bayesian solution for nonlinear processes, which turns out to be computationally very demanding, and then suggest a quasi-Bayes approximation that removes the complication due to the uncertain measurement-process correspondence. Numerical illustrations are provided for the linear case.
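
A rough sketch of the idea in Python/numpy: two well-separated scalar random-walk processes are tracked while the two noisy measurements arrive unlabeled at each step. The posterior probabilities of the two possible pairings weight the Kalman updates, and the resulting mixture is collapsed by moment matching, in the spirit of a quasi-Bayes approximation. All model choices (random-walk dynamics, scalar states, the specific variances) are illustrative assumptions, not the paper's specification.

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(2)

q, r = 0.05, 0.5                     # state innovation and measurement variances (assumed)
x_true = np.array([-4.0, 4.0])       # two well-separated scalar processes
mean = np.array([-4.0, 4.0])         # filter means
var = np.array([1.0, 1.0])           # filter variances
pairings = [(0, 1), (1, 0)]          # hypothesis p: measurement y[p[i]] belongs to process i

for _ in range(100):
    x_true = x_true + rng.normal(0, np.sqrt(q), 2)
    y = x_true + rng.normal(0, np.sqrt(r), 2)
    y = y[rng.permutation(2)]        # correspondence between measurements and processes is lost

    var = var + q                    # prediction step (random-walk dynamics)

    # posterior probability of each possible measurement-process pairing
    lik = np.array([np.prod(norm.pdf(y[list(p)], loc=mean, scale=np.sqrt(var + r)))
                    for p in pairings])
    w = lik / lik.sum()

    # quasi-Bayes style collapse: moment-match the pairing-induced posterior mixture
    new_mean, new_var = np.empty(2), np.empty(2)
    for i in range(2):
        k = var[i] / (var[i] + r)                       # Kalman gain for process i
        post_means = np.array([mean[i] + k * (y[p[i]] - mean[i]) for p in pairings])
        post_var = (1 - k) * var[i]
        new_mean[i] = w @ post_means
        new_var[i] = w @ (post_var + (post_means - new_mean[i]) ** 2)
    mean, var = new_mean, new_var

print("true states:", np.round(x_true, 2), " estimates:", np.round(mean, 2))
```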

2.
The paper deals with the problem of sequential estimation for stochastic processes in the presence of a nuisance parameter. Using the approach to estimation through estimating equations, optimum estimating functions based on a random observation time are investigated in some models for processes appearing in reliability systems theory.

3.
Most macroeconomic data are uncertain: they are estimates rather than perfect measures of underlying economic variables. One symptom of that uncertainty is the propensity of statistical agencies to revise their estimates in the light of new information or methodological advances. This paper sets out an approach for extracting the signal from uncertain data. It describes a two-step estimation procedure in which the history of past revisions is first used to estimate the parameters of a measurement equation describing the official published estimates. These parameters are then imposed in a maximum likelihood estimation of a state space model for the macroeconomic variable.
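
A minimal sketch of the two-step procedure, assuming a scalar local-level model: the variance of past revisions serves as the measurement-equation parameter estimated in step one, and it is then imposed in a Kalman filter for the published series in step two. The data, the fixed signal variance, and the local-level form are illustrative assumptions; in the paper the remaining state-space parameters are estimated by maximum likelihood rather than fixed.

```python
import numpy as np

rng = np.random.default_rng(0)

# Step 1: estimate the measurement-equation parameter from the revision history.
# Here a revision is the mature (final) estimate minus the first published estimate.
revisions = rng.normal(0.0, 0.3, size=200)        # illustrative revision history
meas_var = revisions.var(ddof=1)                   # measurement-error variance

# Step 2: impose that variance in a state-space model for the published series.
# Local-level model:  x_t = x_{t-1} + w_t,  y_t = x_t + v_t,  Var(v_t) = meas_var.
state_var = 0.1 ** 2                               # signal innovation variance (fixed here;
                                                   # estimated by maximum likelihood in the paper)
truth = np.cumsum(rng.normal(0, np.sqrt(state_var), size=100))
y = truth + rng.normal(0, np.sqrt(meas_var), size=100)

x_hat, P = 0.0, 1.0                                # initial state mean and variance
filtered = []
for obs in y:
    P = P + state_var                              # predict
    K = P / (P + meas_var)                         # Kalman gain
    x_hat = x_hat + K * (obs - x_hat)              # update with the published estimate
    P = (1.0 - K) * P
    filtered.append(x_hat)

print("estimated measurement variance:", round(meas_var, 4))
print("RMSE of the filtered signal vs. truth:",
      round(float(np.sqrt(np.mean((np.array(filtered) - truth) ** 2))), 4))
```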

4.
Econometric Reviews, 2013, 32(4):385-424
This paper introduces nonlinear dynamic factor models for various applications related to risk analysis. Traditional factor models represent the dynamics of processes driven by movements of latent variables, called the factors. Our approach extends this setup by introducing factors defined as random dynamic parameters and stochastic autocorrelated simulators. This class of factor models can represent processes with time-varying conditional mean, variance, skewness and excess kurtosis. Applications discussed in the paper include dynamic risk analysis, such as risk in price variations (models with stochastic mean and volatility), extreme risks (models with stochastic tails), risk on asset liquidity (stochastic volatility duration models), and moral hazard in insurance analysis.

We propose estimation procedures for models in which the marginal density of the series and the factor dynamics are parameterized by distinct subsets of parameters. Such a partitioning of the parameter vector, found in many applications, allows statistical inference to be simplified considerably. We develop a two-stage Maximum Likelihood method, called the Finite Memory Maximum Likelihood, which is easy to implement in the presence of multiple factors. We also discuss simulation-based estimation, testing, prediction and filtering.

5.
Spatio-temporal processes are often high-dimensional, exhibiting complicated variability across space and time. Traditional state-space model approaches to such processes in the presence of uncertain data have been shown to be useful. However, estimation of state-space models in this context is often problematic since parameter vectors and matrices are of high dimension and can have complicated dependence structures. We propose a spatio-temporal dynamic model formulation with parameter matrices restricted based on prior scientific knowledge and/or common spatial models. Estimation is carried out via the expectation–maximization (EM) algorithm or a general EM algorithm. Several parameterization strategies are proposed and analytical or computational closed-form EM update equations are derived for each. We apply the methodology to a model based on an advection–diffusion partial differential equation in a simulation study and also to a dimension-reduced model for a Palmer Drought Severity Index (PDSI) data set.
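
As a toy illustration of restricting the parameter matrices with prior scientific knowledge, the sketch below (Python/numpy, assuming a one-dimensional grid, an explicit upwind/centred finite-difference scheme, and fixed boundary cells) builds the state propagator from a discretized advection–diffusion equation, so only two physical parameters, rather than every matrix entry, would need to be estimated inside an EM loop.

```python
import numpy as np

def advection_diffusion_propagator(n, dx, dt, speed, diff):
    """Transition matrix M with x_{t+1} ≈ M @ x_t for the PDE
    du/dt = -speed * du/ds + diff * d2u/ds2, discretized on n cells
    (explicit upwind advection, centred diffusion; boundary cells held fixed)."""
    a = speed * dt / dx            # advection (Courant) number
    d = diff * dt / dx ** 2        # diffusion number
    assert a >= 0 and a + 2 * d <= 1, "explicit scheme unstable for these parameter values"
    M = np.eye(n)
    for i in range(1, n - 1):
        M[i, i - 1] += a + d
        M[i, i] -= a + 2 * d
        M[i, i + 1] += d
    return M

# Two physical parameters define the whole high-dimensional propagator; in an EM
# loop only (speed, diff) and the noise variances would be updated, not every entry.
M = advection_diffusion_propagator(n=50, dx=1.0, dt=0.1, speed=2.0, diff=1.0)
x = np.zeros(50)
x[25] = 1.0                        # initial pulse
for _ in range(20):
    x = M @ x                      # deterministic part of the state equation
print("total mass after 20 steps:", round(float(x.sum()), 3))  # ≈ 1 while the pulse stays interior
```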

6.
We investigate the estimation of dynamic models of criminal activity, when there is significant under-recording of crime. We give a theoretical analysis and use simulation techniques to investigate the resulting biases in conventional regression estimates. We find the biases to be of little practical significance. We develop and apply a new simulated maximum likelihood procedure that estimates simultaneously the measurement error and crime processes, using extraneous survey data. This also confirms that measurement error biases are small. Our estimation results for data from England and Wales imply a significant response of crime to both the economic and the enforcement environment.

7.
This paper addresses the measurement of technical efficiency of textile, clothing, and leather (TCL) industries in Tunisia through a panel data estimation of a dynamic translog production frontier. It provides a perspective on productivity and efficiency that should be instructive for a developing economy facing substantial competitive pressure along the gradual economic liberalisation process. The importance of TCL industries in the Tunisian manufacturing sector is a reason for obtaining more knowledge of productivity and efficiency for this key industry. Dynamics are introduced to reflect the production consequences of adjustment costs, which are associated with changes in factor inputs. Estimation of a dynamic error-components model is considered using the system generalized method of moments (GMM) estimator suggested by Arellano and Bover (1995, Another look at the instrumental-variable estimation of error-components models, J. Econometrics 68:29-51) and Blundell and Bond (1998a, Initial conditions and moment restrictions in dynamic panel data models, J. Econometrics 87:115-143; 1998b, GMM estimation with persistent panel data: an application to production functions, paper presented at the Eighth International Conference on Panel Data, Goteborg University). Our study evaluates the sensitivity of the results, particularly of the efficiency measures, to different specifications. Firm-specific time-invariant technical efficiency is obtained using the Schmidt and Sickles (1984, Production frontiers and panel data, J. Bus. Econ. Stat. 2:367-374) approach after estimating the dynamic frontier. We stress the importance of allowing for lags in the adjustment of output to inputs and of controlling for time-invariant variables when estimating firm-specific efficiency. The results suggest that the system GMM estimation of the dynamic specification produces the most accurate parameter estimates and technical efficiency measures. The mean efficiency score is 68%. Policy implications of the results are outlined.

8.
In this paper, we consider inferences in a binary dynamic mixed model. The existing estimation approaches mainly estimate the regression effects and the dynamic dependence parameters either through the estimation of the random effects or by avoiding the random effects technically. Under the assumption that the random effects follow a Gaussian distribution, we propose a generalized quasilikelihood (GQL) approach for the estimation of the parameters of the dynamic mixed models. The proposed approach is computationally less cumbersome than the exact maximum likelihood (ML) approach. We also carry out the GQL estimation under two competing models, namely the probit and the logit mixed models, and discuss both the asymptotic and small-sample behaviour of their estimators.

9.
This paper deals with the problem of estimating all the unknown parameters of geometric fractional Brownian processes from discrete observations. The estimation procedure is built upon the marriage of the quadratic variation and the maximum likelihood approach. The asymptotic properties of the estimators are provided. Moreover, we compare our derived method with the approach proposed by Misiran et al. [Fractional Black-Scholes models: complete MLE with application to fractional option pricing. In: International conference on optimization and control; Guiyang, China; 2010. p. 573–586.], namely the complete maximum likelihood estimation. Simulation studies confirm theoretical findings and illustrate that our methodology is efficient and reliable. To show how to apply our approach in realistic contexts, an empirical study of the Chinese financial market is also presented.
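
A sketch of the quadratic-variation ingredient only, assuming evenly spaced log-prices: the change-of-frequency estimator below recovers the Hurst exponent and the volatility from E[(X_{t+kΔ} − X_t)²] = σ²(kΔ)^{2H}. The check uses ordinary Brownian motion (H = 0.5) because exact fBm simulation is not included; this illustrates the general quadratic-variation idea rather than the authors' full procedure, which also involves maximum likelihood for the remaining parameters.

```python
import numpy as np

def qv_estimates(log_prices, delta):
    """Change-of-frequency quadratic-variation estimators of (H, sigma) for a
    geometric fractional Brownian motion observed at spacing `delta`, using
    E[(X_{t+k*delta} - X_t)^2] = sigma^2 * (k*delta)^(2H)."""
    x = np.asarray(log_prices, dtype=float)
    v1 = np.mean(np.diff(x) ** 2)                  # mean squared increment, lag 1
    v2 = np.mean((x[2:] - x[:-2]) ** 2)            # mean squared increment, lag 2
    H = 0.5 * np.log2(v2 / v1)
    sigma = np.sqrt(v1 / delta ** (2 * H))
    return H, sigma

# Sanity check on ordinary Brownian motion (H = 0.5) with sigma = 0.2:
rng = np.random.default_rng(1)
delta = 1.0 / 252.0
log_p = np.log(100.0) + np.cumsum(0.2 * np.sqrt(delta) * rng.normal(size=10_000))
print(qv_estimates(log_p, delta))                  # should be close to (0.5, 0.2)
```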

10.
Spatiotemporal prediction for log-Gaussian Cox processes
Space–time point pattern data have become more widely available as a result of technological developments in areas such as geographic information systems. We describe a flexible class of space–time point processes. Our models are Cox processes whose stochastic intensity is a space–time Ornstein–Uhlenbeck process. We develop moment-based methods of parameter estimation, show how to predict the underlying intensity by using a Markov chain Monte Carlo approach and illustrate the performance of our methods on a synthetic data set.
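
A purely temporal (one-dimensional) analogue as a sketch: an Ornstein–Uhlenbeck process is simulated on a fine grid, exponentiated to give the stochastic intensity, and the Cox process is drawn by thinning. The parameter values are illustrative, and the moment-based estimation and MCMC prediction steps of the paper are not shown.

```python
import numpy as np

rng = np.random.default_rng(42)

# Ornstein-Uhlenbeck log-intensity on a fine time grid (exact discretization).
T, dt = 100.0, 0.01
n = int(T / dt)
mu, theta, sigma = 1.0, 0.5, 0.8              # OU mean, reversion rate, volatility
y = np.empty(n)
y[0] = mu
a = np.exp(-theta * dt)
sd = sigma * np.sqrt((1 - a ** 2) / (2 * theta))
for i in range(1, n):
    y[i] = mu + a * (y[i - 1] - mu) + sd * rng.normal()

# Cox process: stochastic intensity lambda(t) = exp(Y_t); draw events by thinning.
lam = np.exp(y)
lam_max = lam.max()
m = rng.poisson(lam_max * T)                          # candidates from a homogeneous process
cand = rng.uniform(0.0, T, size=m)
idx = np.minimum((cand / dt).astype(int), n - 1)      # grid cell of each candidate
events = np.sort(cand[rng.uniform(0.0, lam_max, size=m) < lam[idx]])

print(f"{events.size} events; empirical rate {events.size / T:.2f} per unit time, "
      f"average intensity {lam.mean():.2f}")
```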

11.
The main purpose of this article is to introduce the E-Bayesian approach to gain flexibility in reliability-availability system estimation. This approach is applied to series systems, parallel systems, and k-out-of-m systems, based on the exponential distribution under a squared error loss function, when time is continuous. We use three prior distributions to investigate their impact on the E-Bayesian approach; these prior distributions cover a broad spectrum of possibilities. We show, through real examples and simulations, how the procedure behaves. In the simulation study we also explore how the estimation approach is affected as the number of components in the system increases.
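
A sketch of the basic E-Bayesian mechanics for a single exponential component, assuming a Gamma(a, b) conjugate prior on the failure rate and a uniform hyperprior on b over (0, c), which is one common choice in the E-Bayesian literature; the system-level reliability-availability estimators of the paper are not reproduced, and the final line is only a plug-in illustration.

```python
import numpy as np
from scipy import integrate

# Data for one exponential component: n failures observed over total time on test T.
n, T = 12, 480.0

def bayes_rate(a, b):
    """Posterior-mean (squared-error loss) estimate of the failure rate with a
    Gamma(a, b) prior (shape a, rate b) and exponential lifetimes."""
    return (a + n) / (b + T)

# E-Bayesian estimate: average the Bayes estimate over the hyperprior on b.
a, c = 1.0, 10.0                                   # shape fixed, b ~ Uniform(0, c)
e_bayes_num, _ = integrate.quad(lambda b: bayes_rate(a, b) / c, 0.0, c)
e_bayes_closed = (a + n) / c * np.log((c + T) / T)
print("E-Bayes failure rate:", round(e_bayes_num, 5), " closed form:", round(e_bayes_closed, 5))

# Plug-in component reliability at mission time t (a series system of k identical
# components would multiply k such terms); shown only to connect to the system setting.
t = 100.0
print("plug-in component reliability at t = 100:", round(float(np.exp(-e_bayes_num * t)), 4))
```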

12.
Consider a machine that can start production off-target, where the initial offset is unknown and unobservable. The goal is to determine the optimal series of machine adjustments that minimizes the expected value of the sum of quadratic off-target costs and fixed adjustment costs. Apart from the unknown initial offset, the process is assumed to be in a state of statistical control, so the process model is applicable to discrete-part production processes. The process variance is also assumed unknown. We show, using a dynamic programming formulation based on Bayesian estimation of all unknown process parameters, that the optimal process adjustment policy has a deadband form in which the width of the deadband is time-varying and U-shaped. Computational results and implementation details are presented. The simpler case of a known process variance is also solved using a dynamic programming approach. It is shown that the solution to this case is a good approximation to the first case when the variance is actually unknown. The unknown-process-variance solution, however, is the most robust with respect to variation in the process parameters.
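
A simulation sketch of the deadband idea, assuming a known process variance, a fixed half-width band, and a conjugate normal prior on the unknown offset; the paper's optimal policy instead uses a time-varying, U-shaped band obtained by dynamic programming, so the numbers below are illustrative only.

```python
import numpy as np

rng = np.random.default_rng(7)

sigma = 1.0                          # process standard deviation (assumed known here)
offset = 3.0                         # unknown initial offset of the machine
adjust_cost, offtarget_cost = 5.0, 1.0
deadband = 1.5                       # fixed half-width (the optimal band is time-varying)

mu, tau2 = 0.0, 10.0 ** 2            # normal prior on the current offset
total_cost = 0.0
for part in range(50):
    y = offset + rng.normal(0.0, sigma)          # measured deviation of this part from target
    total_cost += offtarget_cost * y ** 2

    # conjugate normal update of the offset estimate
    prec = 1.0 / tau2 + 1.0 / sigma ** 2
    mu = (mu / tau2 + y / sigma ** 2) / prec
    tau2 = 1.0 / prec

    # adjust only when the estimated offset leaves the deadband
    if abs(mu) > deadband:
        offset -= mu                             # apply the adjustment
        total_cost += adjust_cost
        mu = 0.0                                 # belief re-centred after the adjustment

print("final residual offset:", round(offset, 2), " total cost:", round(total_cost, 1))
```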

13.
The dynamic generalized linear model and the dynamic discount Bayesian model have been used to describe processes involving time-varying parameters. This paper develops estimation algorithms for the multiprocess extension of these models. These algorithms have the same characteristics as Harrison-Stevens forecasting, namely insensitivity to outliers and quick reaction to real changes in the parameters.

14.
The modelling of discrete time series, such as binary time series, is not easy, unlike that of continuous time series. This is due to the fact that there is no unique way to model the correlation structure of repeated binary data. Some models may also provide a complicated correlation structure with narrow ranges for the correlations. In this paper, we consider a nonlinear dynamic binary time series model that provides a correlation structure which is easy to interpret and whose correlations satisfy the full -1 to 1 range. For the estimation of the parameters of this nonlinear model, we use a conditional generalized quasilikelihood (CGQL) approach which provides the same estimates as those of the well-known maximum likelihood approach. Furthermore, we consider a competing linear dynamic binary time series model and examine the performance of the CGQL approach through a simulation study in estimating the parameters of this linear model. The model mis-specification effects on estimation as well as forecasting are also examined through simulations.

15.
While large models based on a deterministic-reductionist philosophy have an important part to play in environmental research, it is advantageous to consider alternative modelling methodologies which overtly acknowledge the poorly defined and uncertain nature of most environmental systems. The paper discusses this topic and presents an integrated statistical modelling procedure which involves three main methodological tools: uncertainty and sensitivity studies based on Monte Carlo simulation techniques; dominant mode analysis using a new method of combined linearization and model-order reduction; and data-based mechanistic modelling. This novel approach is illustrated by two practical examples: modelling the global carbon cycle in relation to possible climate change; and modelling a horticultural glasshouse for the purposes of automatic climate control system design.

16.
Eden UT, Brown EN. Statistica Sinica, 2008, 18(4):1293-1310
Neural spike trains, the primary communication signals in the brain, can be accurately modeled as point processes. For many years, significant theoretical work has been done on the construction of exact and approximate filters for state estimation from point process observations in continuous time. We have previously developed approximate filters for state estimation from point process observations in discrete time and applied them in the study of neural systems. Here, we present a coherent framework for deriving continuous-time filters from their discrete-time counterparts. We present an accessible derivation of the well-known unnormalized conditional density equation for state evolution, construct a new continuous-time filter based on a Gaussian approximation, and propose a method for assessing the validity of the approximation following an approach by Brockett and Clark. We apply these methods to the problem of reconstructing arm reaching movements from simulated neural spiking activity from the primary motor cortex. This work makes explicit the connections between adaptive point process filters for analyzing neural spiking activity in continuous time and standard continuous-time filters for state estimation from continuous and point process observations.
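
A one-dimensional sketch of a discrete-time point-process filter with a Gaussian posterior approximation, assuming log-linear conditional intensities and a random-walk state; this mirrors the discrete-time construction from which the continuous-time filter is derived, not the paper's continuous-time equations, and all simulation settings are illustrative.

```python
import numpy as np

rng = np.random.default_rng(3)

# Simulated 1-D latent state (e.g. arm position) and spiking of C cells with
# log-linear conditional intensities lambda_c(x) = exp(alpha_c + beta_c * x).
dt, n_steps, n_cells = 0.001, 5000, 20
q = 1e-4                                        # state innovation variance per time bin
alpha = np.log(rng.uniform(5.0, 20.0, n_cells)) # baseline rates of 5-20 spikes/s
beta = rng.normal(0.0, 1.0, n_cells)            # tuning slopes

x = np.cumsum(rng.normal(0.0, np.sqrt(q), n_steps))
rates = np.exp(alpha + np.outer(x, beta))
spikes = rng.random((n_steps, n_cells)) < rates * dt     # Bernoulli spike indicators

# Discrete-time point-process filter with a Gaussian posterior approximation.
x_hat, W = 0.0, 1.0
est = np.empty(n_steps)
for k in range(n_steps):
    W = W + q                                            # one-step prediction (random walk)
    lam = np.exp(alpha + beta * x_hat) * dt              # expected counts at the prediction
    W = 1.0 / (1.0 / W + np.sum(beta ** 2 * lam))        # posterior variance
    x_hat = x_hat + W * np.sum(beta * (spikes[k] - lam)) # posterior mean
    est[k] = x_hat

print("decoding RMSE:", round(float(np.sqrt(np.mean((est - x) ** 2))), 4))
```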

17.
This article studies threshold autoregression analysis for self-exciting threshold binomial autoregressive processes. Point estimation and interval estimation of the parameters are considered via the empirical likelihood method. A new algorithm to estimate the threshold value of the threshold model is also given. A simulation study is conducted to evaluate the developed approach. An application to measles data is provided to show the applicability of the method.
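
A sketch of the threshold mechanics, assuming one standard two-regime binomial AR(1) parameterization with binomial thinning: the threshold is recovered by a conditional-least-squares grid search rather than the empirical likelihood inference used in the article, so this only illustrates the self-exciting structure.

```python
import numpy as np

rng = np.random.default_rng(11)

N = 10                              # counts take values 0..N
alpha = (0.7, 0.3)                  # survival (thinning) probability per regime
beta = (0.1, 0.4)                   # recruitment probability per regime
r_true = 4                          # threshold on X_{t-1} (regime 1 if X_{t-1} <= r)

def simulate(n):
    x = np.empty(n, dtype=int)
    x[0] = N // 2
    for t in range(1, n):
        reg = 0 if x[t - 1] <= r_true else 1
        x[t] = rng.binomial(x[t - 1], alpha[reg]) + rng.binomial(N - x[t - 1], beta[reg])
    return x

x = simulate(5000)

def cls_score(series, r):
    """Residual sum of squares of regime-wise least-squares fits of the
    conditional mean alpha*X_{t-1} + beta*(N - X_{t-1}), given threshold r."""
    ss = 0.0
    for mask in (series[:-1] <= r, series[:-1] > r):
        if mask.sum() < 5:                        # near-empty regime: discard this threshold
            return np.inf
        past, nxt = series[:-1][mask], series[1:][mask].astype(float)
        A = np.column_stack([past, N - past]).astype(float)
        coef, *_ = np.linalg.lstsq(A, nxt, rcond=None)
        ss += np.sum((nxt - A @ coef) ** 2)
    return ss

scores = {r: cls_score(x, r) for r in range(N)}
print("estimated threshold:", min(scores, key=scores.get), "(true value:", r_true, ")")
```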

18.
Through an appeal to asymptotic Gaussian representations of certain empirical stochastic processes, the techniques of continuous regression are applied to derive estimates for underlying parametric probability laws. This asymptotic regression approach yields estimates for a wide range of statistical problems, including estimation based on the empirical quantile function, Poisson process intensity estimation, and parametric density estimation.

19.
Complex biological processes are usually observed over time across a collection of individuals, so longitudinal data are available. The statistical challenge is to better understand the underlying biological mechanisms. A standard statistical approach is a mixed-effects model in which the regression function is highly developed to describe the biological processes precisely (solutions of multi-dimensional ordinary differential equations or of partial differential equations). A classical estimation method relies on coupling a stochastic version of the EM algorithm with a Markov chain Monte Carlo algorithm. This algorithm requires many evaluations of the regression function, which is clearly prohibitive when the solution is numerically approximated with a time-consuming solver. In this paper a meta-model relying on a Gaussian process emulator is proposed to approximate the regression function, leading to what is called a mixed meta-model. The uncertainty of the meta-model approximation can be incorporated in the model. A control on the distance between the maximum likelihood estimates of the mixed meta-model and those of the exact mixed model is guaranteed. Finally, numerical simulations are performed to illustrate the efficiency of this approach.
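
A toy sketch with scikit-learn and scipy, assuming the expensive regression function is the solution of a logistic growth ODE evaluated at a single observation time and emulated over its two parameters; the mixed-effects estimation loop (stochastic EM coupled with MCMC) that would call the emulator, and the control on the distance between estimates, are not shown.

```python
import numpy as np
from scipy.integrate import solve_ivp
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, ConstantKernel

def regression_function(theta, t_obs=5.0):
    """'Expensive' regression function: logistic growth y' = r*y*(1 - y/K),
    y(0) = 1, evaluated at time t_obs, as a function of theta = (r, K)."""
    r, K = theta
    sol = solve_ivp(lambda t, y: r * y * (1.0 - y / K), (0.0, t_obs), [1.0], t_eval=[t_obs])
    return sol.y[0, -1]

# Design: run the (slow) solver on a modest number of parameter points...
rng = np.random.default_rng(5)
design = np.column_stack([rng.uniform(0.1, 1.0, 60),     # growth rate r
                          rng.uniform(5.0, 20.0, 60)])   # carrying capacity K
evals = np.array([regression_function(th) for th in design])

# ...and fit a Gaussian-process emulator to stand in for the solver inside the
# (not shown) stochastic EM / MCMC estimation loop; its predictive standard
# deviation is the approximation uncertainty that can be fed back into the model.
kernel = ConstantKernel(1.0) * RBF(length_scale=[0.3, 5.0])
emulator = GaussianProcessRegressor(kernel=kernel, normalize_y=True).fit(design, evals)

theta_new = np.array([[0.55, 12.0]])
pred_mean, pred_std = emulator.predict(theta_new, return_std=True)
print("emulator:", round(float(pred_mean[0]), 3), "+/-", round(float(pred_std[0]), 3),
      " exact solver:", round(regression_function(theta_new[0]), 3))
```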
