Similar Documents
20 similar documents found.
1.
The paper introduces a new method for flexible spline fitting for copula density estimation. Spline coefficients are penalized to achieve a smooth fit. To weaken the curse of dimensionality, instead of a full tensor spline basis, a reduced tensor product based on so-called sparse grids (Notes Numer. Fluid Mech. Multidiscip. Des., 31, 1991, 241-251) is used. To achieve uniform margins of the copula density, linear constraints are placed on the spline coefficients, and quadratic programming is used to fit the model. Simulations and practical examples accompany the presentation.
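
The fitting step described here (a penalized quadratic objective under linear equality constraints) can be illustrated with a small quadratic-programming sketch solved through its KKT system. This is a generic stand-in, not the paper's copula-density criterion or its sparse-grid basis: the basis matrix B, penalty P, constraint A and response y below are all placeholders.

```python
import numpy as np

# Hypothetical illustration: penalized least squares with linear equality
# constraints, solved via the KKT system. The paper's actual objective is a
# copula density criterion on a sparse-grid tensor basis; B, y, P, A, b here
# are stand-ins that show only the constrained quadratic programming step.
rng = np.random.default_rng(0)
n, k = 200, 10
B = rng.random((n, k))                 # spline basis evaluated at the data
y = rng.random(n)                      # pseudo-response
D = np.diff(np.eye(k), 2, axis=0)      # second-difference operator
P = D.T @ D                            # smoothness penalty matrix
lam = 1.0                              # smoothing parameter
A = np.ones((1, k))                    # one linear constraint on coefficients
b = np.array([1.0])                    # (e.g. coefficients summing to one)

# KKT system: [2(B'B + lam*P)  A'] [c ]   [2 B'y]
#             [A               0 ] [mu] = [b    ]
H = 2 * (B.T @ B + lam * P)
KKT = np.block([[H, A.T], [A, np.zeros((A.shape[0], A.shape[0]))]])
rhs = np.concatenate([2 * B.T @ y, b])
c = np.linalg.solve(KKT, rhs)[:k]      # constrained, penalized coefficients
print(c)
```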

2.
Straightforward intermediate rank tensor product smoothing in mixed models
Tensor product smooths provide the natural way of representing smooth interaction terms in regression models because they are invariant to the units in which the covariates are measured, hence avoiding the need for arbitrary decisions about relative scaling of variables. They would also be the natural way to represent smooth interactions in mixed regression models, but for the fact that the tensor product constructions proposed to date are difficult or impossible to estimate using most standard mixed modelling software. This paper proposes a new approach to the construction of tensor product smooths, which allows the smooth to be written as the sum of some fixed effects and some sets of i.i.d. Gaussian random effects: no previously published construction achieves this. Because of the simplicity of this random effects structure, our construction is usable with almost any flexible mixed modelling software, allowing smooth interaction terms to be readily incorporated into any generalized linear mixed model. To achieve the computationally convenient separation of smoothing penalties, the construction differs from previous tensor product approaches in the penalties used to control smoothness, but the penalties have the advantage over several alternative approaches of being explicitly interpretable in terms of function shape. Like all tensor product smoothing methods, our approach builds up smooth functions of several variables from marginal smooths of lower dimension, but unlike much of the previous literature we treat the general case in which the marginal smooths can be any quadratically penalized basis expansion, and there can be any number of them. We also point out that the imposition of identifiability constraints on smoothers requires more care in the mixed model setting than it would in a simple additive model setting, and show how to deal with the issue. An interesting side effect of our construction is that an ANOVA decomposition of the smooth can be read off from the estimates, although this is not our primary focus. We were motivated to undertake this work by applied problems in the analysis of abundance survey data, and two examples of this are presented.
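
As a concrete illustration of building a multivariate smooth from marginal smooths, the sketch below constructs a generic tensor product model matrix as the row-wise Kronecker product of two marginal basis matrices. This shows only the standard construction step; the paper's specific reparameterisation into fixed effects plus i.i.d. Gaussian random effects, and its modified penalties, are not reproduced, and the marginal bases are random placeholders rather than B-splines.

```python
import numpy as np

def row_kron(B1, B2):
    """Row-wise Kronecker product: row i of the result is kron(B1[i], B2[i]).
    This is the standard way a tensor product smooth's model matrix is built
    from two marginal basis matrices evaluated at the same observations."""
    n = B1.shape[0]
    return (B1[:, :, None] * B2[:, None, :]).reshape(n, -1)

# Toy marginal bases (stand-ins for B-spline bases in x1 and x2)
rng = np.random.default_rng(1)
n, k1, k2 = 100, 5, 7
B1 = rng.random((n, k1))
B2 = rng.random((n, k2))
X = row_kron(B1, B2)      # (n, k1*k2) tensor product model matrix
print(X.shape)
```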

3.
The paper considers the modelling of time series using a generalized additive model with first-order Markov structure and mixed transition density having a discrete component at zero and a continuous component with positive sample space. Such models have application, for example, in modelling daily occurrence and intensity of rainfall, and in modelling numbers and sizes of insurance claims. The paper shows how these methods extend the usual sinusoidal seasonal assumption in standard chain-dependent models by assuming a general smooth pattern of occurrence and intensity over time. These models can be fitted using standard statistical software. The methods of Grunwald & Jones (2000) can be used to combine these separate occurrence and intensity models into a single model for amount. The models are used to investigate the relationship between the Southern Oscillation Index and Melbourne's rainfall, illustrated with 36 years of rainfall data from Melbourne, Australia.

4.
Many chronic medical conditions are manifested by alternating sojourns in symptom-free and symptomatic states. In many cases, in addition to their relapsing and remitting nature, these conditions lead to worsening disease patterns over time and may exhibit seasonal trends. We develop a mixed-effect two-state model for such disease processes in which covariate effects are modeled multiplicatively on transition intensities. The transition intensities, in turn, are functions of three time scales: the semi-Markov scale involving the backward recurrence time for the cyclical component, the Markov scale for the time trend component, and a seasonal time scale. Multiplicative bivariate log-normal random effects are introduced to accommodate heterogeneity in disease activity between subjects and to admit a possible negative correlation between the transition intensities. Maximum likelihood estimation is carried out using Gauss-Hermite integration and a standard Newton-Raphson procedure. Tests of homogeneity are presented based on score statistics. An application of the methodology to data from a multi-center clinical trial of chronic bronchitis is provided for illustrative purposes.
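
The Gauss-Hermite step can be sketched for the simplest case of a single normal random effect integrated out of a Poisson likelihood. This is only meant to illustrate the quadrature; the Poisson rate model, data and parameter values below are placeholders rather than the paper's bivariate log-normal transition-intensity model.

```python
import numpy as np
from scipy.stats import poisson

# Minimal sketch: marginal likelihood of counts y for one subject whose
# Poisson log-rate carries a normal random effect b ~ N(0, sigma^2),
# integrated out by Gauss-Hermite quadrature.
y = np.array([2, 0, 3, 1])                 # placeholder counts for one subject
mu, sigma = 1.0, 0.5                       # fixed-effect log-rate and SD of b
nodes, weights = np.polynomial.hermite.hermgauss(20)

# Change of variable b = sqrt(2)*sigma*x turns the N(0, sigma^2) integral into
# sum_k w_k / sqrt(pi) * f(y | b_k).
b = np.sqrt(2.0) * sigma * nodes
contrib = np.array([poisson.pmf(y, np.exp(mu + bk)).prod() for bk in b])
marginal_lik = (weights * contrib).sum() / np.sqrt(np.pi)
print(marginal_lik)
```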

5.
The basic structural model is a univariate time series model consisting of a slowly changing trend component, a slowly changing seasonal component, and a random irregular component. It is part of a class of models that have a number of advantages over the seasonal ARIMA models adopted by Box and Jenkins (1976). This article reports the results of an exercise in which the basic structural model was estimated for six U.K. macroeconomic time series and the forecasting performance compared with that of ARIMA models previously fitted by Prothero and Wallis (1976).
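
One way to fit a basic structural model (local linear trend plus stochastic seasonal plus irregular) today is statsmodels' UnobservedComponents class; the article itself predates this software, so the sketch below is only a modern illustration on a simulated quarterly series rather than the original UK macroeconomic data.

```python
import numpy as np
import statsmodels.api as sm

# Illustrative only: simulate a quarterly series with trend + seasonal + noise
# and fit a basic structural model to it.
rng = np.random.default_rng(2)
n, period = 120, 4
trend = np.cumsum(0.1 + rng.normal(0, 0.05, n))
seasonal = np.tile([1.0, -0.5, 0.2, -0.7], n // period)
y = trend + seasonal + rng.normal(0, 0.3, n)

model = sm.tsa.UnobservedComponents(y, level='local linear trend',
                                    seasonal=period)
res = model.fit(disp=False)
print(res.summary())
print(res.forecast(8))        # out-of-sample forecasts, as in the comparison
```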

6.
In long-term trials, not only are individual plot errors correlated over time but there is also a consistent underlying spatial variability in field conditions. The current study sought the most appropriate covariance structure of errors correlated in three dimensions for evaluating the productivity and time-trends in the barley yield data from the monocropping system established in northern Syria. The best spatial-temporal model found reflected the contribution of autocorrelations in spatial and temporal dimensions, with estimates varying with the yield variable and location. Compared with a control structure based on independent errors, this covariance structure improved the significance of the fertilizer effect and the interaction with year. Time-trends were estimated in two ways: by accounting for the contribution of seasonal variables to annual variability (Method 1), which is suitable for detecting significant trends in short data series; and by using the linear component of the orthogonal polynomial on time (year), which is appropriate for long series (Method 2). Method 1 strengthened time-trend detection compared with the method of Jones and Singh [J. Agri. Sci., Cambridge 135 (2000), pp. 251-259], which assumed independence of temporal errors. Most estimates of yield trends over time from fertilizer application were numerically greater than the corresponding linear trends estimated from orthogonal polynomials in time (Method 2), reflecting the effect of accounting for seasonal variables. Grain yield declined over time at the drier site in the absence of nitrogen or phosphorus application, but positive trends were observed fairly generally for straw yield and for grain yield under higher levels of fertilizer inputs. It is suggested that analyses of long-term trials on other crops and cropping systems in other agro-ecological zones could be improved by taking spatial and temporal variability into account in the data evaluation.

7.
This article shows how to compute the in-sample effect of exogenous inputs on the endogenous variables in any linear model written in a state–space form. Estimating this component may be of interest in itself, or a preliminary step before decomposing a time series into trend, cycle, seasonal and error components. The practical application and usefulness of this method is illustrated by estimating the effect of advertising on the monthly sales of Lydia Pinkham's vegetable compound.

8.
A stochastic model, which is well suited to capture space–time dependence of an infectious disease, was employed in this study to describe the underlying spatial and temporal pattern of measles in Barisal Division, Bangladesh. The model has two components: an endemic component and an epidemic component; weights are used in the epidemic component for better accounting of the disease spread into different geographical regions. We illustrate our findings using a data set of monthly measles counts in the six districts of Barisal, from January 2000 to August 2009, collected from the Expanded Program on Immunization, Bangladesh. The negative binomial model with both the seasonal and autoregressive components was found to be suitable for capturing space–time dependence of measles in Barisal. Analyses were done using general optimization routines, which provided the maximum likelihood estimates with the corresponding standard errors.
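
A minimal single-district version of an endemic-epidemic count model of this kind can be sketched as follows. The sinusoidal endemic terms, starting values and simulated counts are placeholders of my own, and the spatial weights linking neighbouring districts are omitted.

```python
import numpy as np
from scipy.stats import nbinom
from scipy.optimize import minimize

# Hypothetical single-district sketch:
# mean_t = exp(a0 + a1*sin(2*pi*t/12) + a2*cos(2*pi*t/12)) + lam * y_{t-1},
# with a negative binomial observation density (general-purpose optimization,
# as in the study, is used to maximize the likelihood).
def negloglik(theta, y):
    a0, a1, a2, log_lam, log_size = theta
    t = np.arange(1, len(y))
    endemic = np.exp(a0 + a1 * np.sin(2 * np.pi * t / 12)
                        + a2 * np.cos(2 * np.pi * t / 12))
    mu = endemic + np.exp(log_lam) * y[:-1]          # epidemic: past counts
    size = np.exp(log_size)                          # NB dispersion parameter
    p = size / (size + mu)                           # scipy's (n, p) form
    return -nbinom.logpmf(y[1:], size, p).sum()

rng = np.random.default_rng(3)
y = rng.poisson(5, 116)                              # placeholder monthly counts
fit = minimize(negloglik, x0=np.zeros(5), args=(y,), method='Nelder-Mead')
print(fit.x)
```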

9.
An algorithm is derived that develops measures of variability for the estimates of the nonseasonal component computed from a model-based seasonal adjustment procedure. The measures of variability are developed from signal extraction theory. Properties of components of the variance are developed, and the behavior of the variance is investigated for one popular time series model. The results are illustrated by using real data.

10.
A Bayesian analysis is presented of a time series which is the sum of a stationary component with a smooth spectral density and a deterministic component consisting of a linear combination of a trend and periodic terms. The periodic terms may have known or unknown frequencies. The advantage of our approach is that different features of the data (such as the regression parameters, the spectral density, unknown frequencies and missing observations) are combined in a hierarchical Bayesian framework and estimated simultaneously. A Bayesian test to detect deterministic components in the data is also constructed. By using an asymptotic approximation to the likelihood, the computation is carried out efficiently using the Markov chain Monte Carlo method in O(Mn) operations, where n is the sample size and M is the number of iterations. We show empirically that our approach works well on real and simulated samples.
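
The asymptotic likelihood approximation in this kind of spectral-domain analysis is commonly a Whittle-type likelihood built from the periodogram; whether this matches the paper's exact approximation is an assumption, and the AR(1) spectral density and simulated series below are only examples.

```python
import numpy as np

def whittle_loglik(x, spec_fun):
    """Whittle-type log-likelihood from the periodogram at interior
    Fourier frequencies: -sum_j [ log f(w_j) + I(w_j) / f(w_j) ]."""
    n = len(x)
    freqs = np.fft.rfftfreq(n)[1:-1]                     # drop 0 and Nyquist
    periodogram = np.abs(np.fft.rfft(x)[1:-1]) ** 2 / (2 * np.pi * n)
    f = spec_fun(2 * np.pi * freqs)
    return -np.sum(np.log(f) + periodogram / f)

def ar1_spec(w, phi=0.6, s2=1.0):
    # AR(1) spectral density: s2 / (2*pi*(1 - 2*phi*cos(w) + phi^2))
    return s2 / (2 * np.pi * (1 - 2 * phi * np.cos(w) + phi ** 2))

rng = np.random.default_rng(4)
x = np.zeros(512)
for t in range(1, 512):                                  # simulate an AR(1) path
    x[t] = 0.6 * x[t - 1] + rng.normal()
print(whittle_loglik(x, ar1_spec))
```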

11.
This paper considers a general model which allows for both deterministic and stochastic forms of seasonality, including fractional (stationary and nonstationary) seasonal orders of integration, and also incorporates endogenously determined structural breaks. Monte Carlo analysis shows that, in the case of a single break, the suggested procedure performs well even in small samples, accurately capturing the seasonal properties of the series and correctly detecting the break date. As an illustration, the model is estimated using four US series (output, consumption, imports and exports). The results suggest that the seasonal patterns of these variables have changed over time: specifically, in the second subsample the systematic component of seasonality becomes insignificant, whilst the degree of persistence increases.

12.
This paper considers spurious regression between two different types of seasonal time series: one with a deterministic seasonal component and the other with a stochastic seasonal component. When one type of seasonal time series is regressed on the other type and they are independent of each other, the phenomenon of spurious regression occurs. Asymptotic properties of the regression coefficient estimator and the associated regression ‘t-ratio’ are studied. A Monte Carlo simulation study is conducted to confirm the phenomenon of spurious regression and spurious rejection of seasonal cointegration for finite samples.
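
A small Monte Carlo experiment in the same spirit can be run in a few lines: regress a series with a deterministic seasonal component on an independent series with a stochastic (seasonal random walk) component and count how often the conventional t-ratio rejects a zero coefficient. The sample size, seasonal pattern and number of replications below are placeholders, not the paper's design.

```python
import numpy as np

rng = np.random.default_rng(5)
n, s, reps, reject = 200, 4, 1000, 0
season = np.tile([2.0, -1.0, 0.5, -1.5], n // s)     # deterministic seasonal

for _ in range(reps):
    y = season + rng.normal(size=n)                  # deterministic seasonality
    e = rng.normal(size=n)
    x = np.zeros(n)
    x[:s] = e[:s]
    for t in range(s, n):                            # independent seasonal random walk
        x[t] = x[t - s] + e[t]
    X = np.column_stack([np.ones(n), x])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    sigma2 = resid @ resid / (n - 2)
    se = np.sqrt(sigma2 * np.linalg.inv(X.T @ X)[1, 1])
    reject += abs(beta[1] / se) > 1.96               # nominal 5% test of beta = 0

print("spurious rejection rate:", reject / reps)
```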

13.
Time series seasonal extraction techniques are quite often applied in the context of a policy aimed at controlling the nonseasonal components of a time series. Monetary policies targeting the nonseasonal components of monetary aggregates are an example. Such policies can be studied as a quadratic optimal control model in which observations are contaminated by seasonal noise. Optimal extraction filters in such models do not correspond to univariate time series seasonal extraction filters. The linear quadratic control model components are nonorthogonal due to the presence of control feedback. This article presents the Kalman filter as a conceptual and computational device used to extract seasonal noise in the presence of feedback.

14.
Modeling cylindrical data, composed of a linear component and a directional component, can be done using Fourier series expansions if we consider the conditional distribution of the linear component given the angular component. This paper presents the second-order model, which is a natural extension of the Mardia and Sutton (1978) first-order model. This model can be parameterized either in polar or Cartesian coordinates, and allows for parameter estimation using standard multiple linear regression. Characteristics of the new model, a comparison of the adequacy of fit between the first- and second-order models, and an example involving wind direction and temperature are presented.
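
A minimal sketch of the second-order specification is an ordinary least squares regression of the linear variable on first- and second-harmonic terms of the angle; the simulated wind-direction and temperature data below are placeholders rather than the paper's example.

```python
import numpy as np

# Simulated placeholder data: temperature as a second-order Fourier function
# of wind direction plus noise.
rng = np.random.default_rng(6)
n = 300
theta = rng.uniform(0, 2 * np.pi, n)                      # wind direction
temp = (15 + 3 * np.cos(theta) + 1.0 * np.sin(theta)
        + 0.8 * np.cos(2 * theta) - 0.4 * np.sin(2 * theta)
        + rng.normal(0, 1, n))

# Second-order model fitted by standard multiple linear regression.
X = np.column_stack([np.ones(n),
                     np.cos(theta), np.sin(theta),            # first order
                     np.cos(2 * theta), np.sin(2 * theta)])   # second order
coef, *_ = np.linalg.lstsq(X, temp, rcond=None)
print(coef)   # intercept and the four harmonic coefficients
```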

15.
We investigate whether seasonal-adjustment procedures are, at least approximately, linear data transformations. This question was initially addressed by Young and is important with respect to many issues including estimation of regression models with seasonally adjusted data. We focus on the X-11 program and rely on simulation evidence, involving linear unobserved component autoregressive integrated moving average models. We define a set of properties for the adequacy of a linear approximation to a seasonal-adjustment filter. These properties are examined through statistical tests. Next, we study the effect of X-11 seasonal adjustment on regression statistics assessing the statistical significance of the relationship between economic variables. Several empirical results involving economic data are also reported.

16.
We present a smooth function that can be used as a regression curve for modeling growth phenomena requiring an increasing curvilinear concave asymptote. This model is obtained as the product of a concave asymptotic curve and the exponential model. In addition to its increasing character with a curvilinear asymptote, including a horizontal or linearly increasing asymptote, the resulting model provides curves with a single inflection point. Numerical examples are presented.

17.
This paper brings together two topics in the estimation of time series forecasting models: the use of the multistep-ahead error sum of squares as a criterion to be minimized and frequency domain methods for carrying out this minimization. The methods are developed for the wide class of time series models having a spectrum which is linear in unknown coefficients. This includes the IMA(1, 1) model, for which the common exponentially weighted moving average predictor is optimal, besides more general structural models for series exhibiting trends and seasonality. The method is extended to include the Box–Jenkins 'airline' model. The value of the multistep criterion is that it provides protection against using an incorrectly specified model. The value of frequency domain estimation is that the iteratively reweighted least squares scheme for fitting generalized linear models is readily extended to construct the parameter estimates and their standard errors. It also yields insight into the loss of efficiency when the model is correct and the robustness of the criterion against an incorrect model. A simple example is used to illustrate the method, and a real example demonstrates the extension to seasonal models. The discussion considers a diagnostic test statistic for indicating an incorrect model.
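
A toy time-domain version of the multistep criterion for the exponentially weighted moving average predictor is sketched below; the paper carries out the minimization in the frequency domain and covers a wider model class, neither of which is reproduced here, and the IMA(1,1) sample path and forecast horizon are placeholders.

```python
import numpy as np
from scipy.optimize import minimize_scalar

# Choose the EWMA smoothing weight by minimizing the sum of squared
# h-step-ahead errors over h = 1,...,H. For an IMA(1,1) process the EWMA
# forecast is flat in h, so every h-step error is y_{t+h} minus the current level.
def multistep_sse(alpha, y, H=4):
    level = y[0]
    sse = 0.0
    for t in range(len(y) - H):
        for h in range(1, H + 1):
            sse += (y[t + h] - level) ** 2
        level = level + alpha * (y[t + 1] - level)   # EWMA update
    return sse

rng = np.random.default_rng(7)
e = rng.normal(size=300)
y = np.cumsum(e - 0.6 * np.concatenate([[0.0], e[:-1]]))  # IMA(1,1) sample path

fit = minimize_scalar(lambda a: multistep_sse(a, y),
                      bounds=(0.01, 0.99), method='bounded')
print("estimated smoothing weight:", fit.x)
```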

18.
Estimation of a smooth function is considered when observations of this function, contaminated with additive Gaussian errors, are available. The problem is formulated as a general linear model, and a hierarchical Bayesian approach is then used to study it. Credible bands are also developed for the function. Sensitivity analysis is conducted to determine the influence of the choice of priors on hyperparameters. Finally, the methodology is illustrated using real and simulated examples, where it is compared with classical cubic splines. It is also shown that our approach provides a Bayesian solution to some problems in discrete time series.

19.
Hansen, Kooperberg and Sardy introduced a family of continuous, piecewise linear functions defined over adaptively selected triangulations of the plane as a general approach to statistical modelling of bivariate densities and regression and hazard functions. These triograms enjoy a natural affine equivariance that offers distinct advantages over competing tensor product methods that are more commonly used in statistical applications. Triograms employ basis functions consisting of linear 'tent functions' defined with respect to a triangulation of a given planar domain. As in knot selection for univariate splines, Hansen and colleagues adopted the regression spline approach of Stone. Vertices of the triangulation are introduced or removed sequentially in an effort to balance fidelity to the data and parsimony. We explore a smoothing spline variant of the triogram model based on a roughness penalty adapted to the piecewise linear structure of the triogram model. We show that the roughness penalty proposed may be interpreted as a total variation penalty on the gradient of the fitted function. The methods are illustrated with real and artificial examples, including an application to estimated quantile surfaces of land value in the Chicago metropolitan area.

20.
In this paper, we study a nonparametric additive regression model suitable for a wide range of time series applications. Our model includes a periodic component, a deterministic time trend, various component functions of stochastic explanatory variables, and an AR(p) error process that accounts for serial correlation in the regression error. We propose an estimation procedure for the nonparametric component functions and the parameters of the error process based on smooth backfitting and quasi-maximum likelihood methods. Our theory establishes convergence rates and the asymptotic normality of our estimators. Moreover, we are able to derive an oracle-type result for the estimators of the AR parameters: under fairly mild conditions, the limiting distribution of our parameter estimators is the same as when the nonparametric component functions are known. Finally, we illustrate our estimation procedure by applying it to a sample of climate and ozone data collected on the Antarctic Peninsula.
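
A bare-bones ordinary backfitting loop with a crude kernel smoother conveys the general idea of fitting an additive model iteratively; the paper's smooth backfitting estimator, its periodic and trend components, and the quasi-maximum likelihood step for the AR(p) errors are more involved and are not reproduced in this placeholder sketch.

```python
import numpy as np

def local_mean_smoother(x, r, h=0.3):
    """Crude Nadaraya-Watson smoother of residuals r against covariate x."""
    w = np.exp(-0.5 * ((x[:, None] - x[None, :]) / h) ** 2)
    return (w @ r) / w.sum(axis=1)

# Toy additive model y = f1(x1) + f2(x2) + error, fitted by ordinary backfitting.
rng = np.random.default_rng(8)
n = 400
x1, x2 = rng.uniform(-1, 1, n), rng.uniform(-1, 1, n)
y = np.sin(np.pi * x1) + x2 ** 2 + rng.normal(0, 0.3, n)

f1, f2 = np.zeros(n), np.zeros(n)
alpha = y.mean()
for _ in range(20):                                   # backfitting iterations
    f1 = local_mean_smoother(x1, y - alpha - f2)
    f1 -= f1.mean()                                   # centre for identifiability
    f2 = local_mean_smoother(x2, y - alpha - f1)
    f2 -= f2.mean()

print(np.corrcoef(f1, np.sin(np.pi * x1))[0, 1])      # f1 tracks the true component
```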
