Similar Articles
20 similar articles were retrieved.
1.
The authors consider the problem of testing the validity of the logistic regression model using a random sample. Given the values of the response variable, they observe that the sample actually consists of two independent subsets of observations whose density ratio has a known parametric form when the model is true. They are thus led to propose a generalized-moments specification test, which they develop in detail. In addition, they show that this test can be derived using Neyman's smooth tests for goodness of fit. They present simulation results and apply the methodology to the analysis of two real data sets.
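A small simulation sketch of the density-ratio fact the test builds on: when the covariates within the two response-defined subsamples are normal with equal variances, their density ratio is exponential-linear in x, so P(Y=1|x) follows a logistic model with slope (mu1 - mu0)/sigma^2. All numbers and the statsmodels calls below are illustrative; this is not the authors' generalized-moments statistic.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)

# Retrospective sampling: X | Y=0 ~ N(0, 1), X | Y=1 ~ N(1, 1).
# By Bayes' rule the log density ratio is linear in x, so P(Y=1 | X=x)
# follows a logistic regression with slope (mu1 - mu0) / sigma^2 = 1.
n0, n1 = 2000, 2000
x = np.r_[rng.normal(0.0, 1.0, n0), rng.normal(1.0, 1.0, n1)]
y = np.r_[np.zeros(n0), np.ones(n1)]

fit = sm.Logit(y, sm.add_constant(x)).fit(disp=0)
print("fitted slope:", round(fit.params[1], 3), "(theoretical value 1.0)")
```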

2.
We propose a method for specifying the distribution of random effects included in a model for cluster data. The class of models we consider includes mixed models and frailty models whose random effects and explanatory variables are constant within clusters. The method is based on cluster residuals obtained by assuming that the random effects are equal between clusters. We exhibit an asymptotic relationship between the cluster residuals and variations of the random effects as the number of observations increases and the variance of the random effects decreases. The asymptotic relationship is used to specify the random-effects distribution. The method is applied to a frailty model and a model used to describe the spread of plant diseases.
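A rough sketch of the cluster-residual idea under assumptions of my own (a simple random-intercept model, with a common mean fitted in place of the random effects); the paper covers more general mixed and frailty models. The Kolmogorov-Smirnov check of a candidate distribution is only illustrative.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
n_clusters, m = 200, 50                               # clusters, observations per cluster
b = rng.gamma(shape=2.0, scale=0.5, size=n_clusters)  # skewed random effects
b -= b.mean()                                         # centre them
y = 1.0 + b[:, None] + 0.3 * rng.standard_normal((n_clusters, m))

# Fit the model with the random effects forced to be equal across clusters
# (here simply a common mean), then form one residual per cluster.
mu_hat = y.mean()
cluster_resid = y.mean(axis=1) - mu_hat

# The cluster residuals should track the random effects; check a candidate
# random-effects distribution (here a normal) against them.
print("KS p-value vs normal:",
      stats.kstest(cluster_resid, "norm", args=(0.0, cluster_resid.std())).pvalue)
print("correlation with true effects:", np.corrcoef(cluster_resid, b)[0, 1])
```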

3.
Single index models are frequently used in econometrics and biometrics. Logit and Probit models are special cases with fixed link functions. In this paper we consider a bootstrap specification test that detects nonparametric deviations of the link function. The bootstrap is used with the aim of finding a more accurate distribution under the null than the normal approximation. We prove that the statistic and its bootstrapped version have the same asymptotic distribution. In a simulation study we show that the bootstrap is able to capture the negative bias and the skewness of the test statistic. It yields better approximations to the true critical values and consequently it has a more accurate level than the normal approximation.
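A simplified sketch of a bootstrap link-specification test in the logit case, assuming a statistic of my own (mean squared distance between a Nadaraya-Watson smoother of y on the fitted index and the parametric link) and a parametric bootstrap under the null; the bandwidth, helper names, and sample sizes are illustrative and this is not the authors' exact construction.

```python
import numpy as np
import statsmodels.api as sm
from scipy.special import expit

rng = np.random.default_rng(2)

def nw_smooth(index, y, h):
    """Nadaraya-Watson estimate of E[y | index] at the observed index values."""
    d = (index[:, None] - index[None, :]) / h
    w = np.exp(-0.5 * d ** 2)
    return w @ y / w.sum(axis=1)

def link_stat(x, y):
    fit = sm.Logit(y, x).fit(disp=0)
    idx = x @ fit.params
    return np.mean((nw_smooth(idx, y, h=0.4) - expit(idx)) ** 2), fit

# Data generated from a logit single-index model (null hypothesis true)
n = 400
x = np.column_stack([np.ones(n), rng.standard_normal(n)])
y = rng.binomial(1, expit(x @ np.array([0.2, 1.0])))

t_obs, fit0 = link_stat(x, y)

# Parametric bootstrap under the fitted logit model
p0 = expit(x @ fit0.params)
boot = [link_stat(x, rng.binomial(1, p0))[0] for _ in range(200)]
print("bootstrap p-value:", np.mean(np.array(boot) >= t_obs))
```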

4.
5.
Multi-stage time evolving models are common statistical models for biological systems, especially insect populations. In stage-duration distribution models, parameter estimation typically relies on the Laplace transform method. This method involves assumptions such as known constant shapes, known constant rates, or the same overall hazard rate for all stages. These assumptions are strong and restrictive. The main aim of this paper is to weaken these assumptions by using a Bayesian approach. In particular, a Metropolis-Hastings algorithm based on deterministic transformations is used to estimate parameters. We use two models: one with no hazard rates, and the other with stage-wise constant hazard rates. These methods are validated in simulation studies followed by a case study of cattle parasites. The results show that the proposed methods estimate the parameters about as well as the Laplace transform methods.
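The paper's Metropolis-Hastings scheme uses deterministic transformations for multi-stage models; the sketch below only shows the generic random-walk Metropolis-Hastings machinery on a single gamma-distributed stage duration with vague priors, so all model choices and tuning constants are assumptions of mine.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
durations = rng.gamma(shape=3.0, scale=2.0, size=150)   # synthetic stage durations

def log_post(theta):
    """Log posterior for (log shape, log scale) with vague normal priors."""
    shape, scale = np.exp(theta)
    loglik = stats.gamma.logpdf(durations, a=shape, scale=scale).sum()
    logprior = stats.norm.logpdf(theta, loc=0.0, scale=10.0).sum()
    return loglik + logprior

theta = np.zeros(2)                    # start at shape = scale = 1
lp = log_post(theta)
draws = []
for _ in range(20000):
    prop = theta + 0.1 * rng.standard_normal(2)   # random-walk proposal
    lp_prop = log_post(prop)
    if np.log(rng.uniform()) < lp_prop - lp:      # Metropolis acceptance step
        theta, lp = prop, lp_prop
    draws.append(np.exp(theta))

draws = np.array(draws[5000:])                    # drop burn-in
print("posterior means (shape, scale):", draws.mean(axis=0))
```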

6.
Closed form expressions are developed for the estimators of functions of the variance components in balanced, mixed, linear models. These estimators are averages of sample covariances (variances) which offer diagnostic information on the data and the model. The cause of negative estimates may be revealed. Examples illustrate the basic concepts.
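As a minimal numerical illustration of such closed-form estimators, the balanced one-way random-effects model gives sigma_e^2 = MSE and sigma_a^2 = (MSA - MSE)/n, and a negative estimate of sigma_a^2 flags that the between-group spread is smaller than the model implies. The paper treats general balanced mixed models; this is only the simplest special case, with made-up numbers.

```python
import numpy as np

rng = np.random.default_rng(4)
a, n = 8, 5                                  # groups, replicates per group
sigma_a, sigma_e = 0.0, 1.0                  # zero between-group variance: negative estimate likely
y = (1.0 + rng.normal(0, sigma_a, size=(a, 1))
         + rng.normal(0, sigma_e, size=(a, n)))

group_means = y.mean(axis=1)
grand_mean = y.mean()

msa = n * np.sum((group_means - grand_mean) ** 2) / (a - 1)      # between-group mean square
mse = np.sum((y - group_means[:, None]) ** 2) / (a * (n - 1))    # within-group mean square

sigma_e_hat = mse
sigma_a_hat = (msa - mse) / n     # can be negative, flagging a data/model issue
print("sigma_e^2 =", round(sigma_e_hat, 3), " sigma_a^2 =", round(sigma_a_hat, 3))
```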

7.
In this paper a specification strategy is proposed for the determination of the orders in ARMA models. The strategy is based on two newly defined concepts: the q-conditioned partial autoregressive function and the p-conditioned partial moving average function. These concepts are similar to the generalized partial autocorrelation function which has recently been suggested for order determination. The main difference is that they are defined and employed in connection with an asymptotically efficient estimation method instead of the rather inefficient generalized Yule-Walker method. The specification is performed by using sequential Wald-type tests. In contrast to the traditional testing of hypotheses, these tests use critical values which increase with the sample size at an appropriate rate.
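A rough sketch of the sequential Wald-type testing idea, assuming a simple rule of my own: fit candidate ARMA orders by maximum likelihood and test the highest-order AR coefficient against a critical value that grows like log n. The q-conditioned and p-conditioned functions defined in the paper are not reproduced here.

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.arima.model import ARIMA
from statsmodels.tsa.arima_process import arma_generate_sample

rng = np.random.default_rng(5)
n = 500
# True model: ARMA(1,1) with AR coefficient 0.6 and MA coefficient 0.4
y = pd.Series(arma_generate_sample(ar=[1, -0.6], ma=[1, 0.4], nsample=n,
                                   distrvs=rng.standard_normal))

def wald_highest_ar(p, q):
    """Wald statistic for the highest-order AR coefficient in an ARMA(p, q) fit."""
    res = ARIMA(y, order=(p, 0, q)).fit()
    name = f"ar.L{p}"
    return (res.params[name] / res.bse[name]) ** 2

crit = np.log(n)          # critical value increasing with the sample size
for p in (3, 2, 1):       # step down until the top AR coefficient is significant
    w = wald_highest_ar(p, q=1)
    print(f"ARMA({p},1): Wald for ar.L{p} = {w:.1f} (critical value {crit:.1f})")
    if w > crit:
        print("selected AR order:", p)
        break
```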

8.
Contingent valuation researchers are often interested in comparing the underlying willingness-to-pay distributions in two independent studies. Since willingness to pay is not observable, traditional testing procedures for comparing distributions cannot be applied directly. The paper proposes a permutation test for this sort of comparison. The main distinguishing characteristic of the proposed test is that it does not rely on asymptotic approximations and facilitates the introduction of covariates. The permutation test is illustrated with an application to investment projects for the improvement of two important Brazilian river basins.
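Since willingness to pay itself is latent, the sketch below only shows the bare permutation machinery on an observable response, shuffling study labels to compare two samples; covariates and the contingent-valuation response format are omitted, and all data are synthetic.

```python
import numpy as np

rng = np.random.default_rng(6)
wtp_a = rng.lognormal(mean=3.0, sigma=0.8, size=180)   # responses, study A
wtp_b = rng.lognormal(mean=3.2, sigma=0.8, size=150)   # responses, study B

pooled = np.concatenate([wtp_a, wtp_b])
labels = np.r_[np.zeros(len(wtp_a)), np.ones(len(wtp_b))]
t_obs = wtp_b.mean() - wtp_a.mean()

# Reference distribution: recompute the statistic under random relabelling
perm_stats = []
for _ in range(5000):
    perm = rng.permutation(labels)
    perm_stats.append(pooled[perm == 1].mean() - pooled[perm == 0].mean())

p_value = np.mean(np.abs(perm_stats) >= abs(t_obs))
print("permutation p-value:", p_value)
```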

9.
This paper surveys the state of the art of the analysis and application of large-scale structural simultaneous econometric models (SSEM). First, the importance of such models in empirical economics, and especially for economic policy analysis, is emphasized. We then focus on the methodological issues in the application of these models, such as questions about identification, nonstationarity of variables, adequate estimation of the parameters, and the inclusion of identities. In the light of the latest developments in econometrics, we identify the main unsolved problems in this area, recommend a combined data- and theory-driven procedure for the specification of such models, and give suggestions on how one could overcome some of the indicated problems.

10.
This paper develops an exact method of random permutations for testing both interaction and main effects in the two-way ANOVA model. The method can be regarded as a substantial improvement over those of previous studies such as Still and White (1981) and ter Braak (1992). We further conducted a simulation experiment in order to check the statistical performance of the proposed method. The proposed method works relatively well for small sample sizes compared with the existing methods. This work was supported by Korea Science and Engineering Foundation Grant (R14-2003-002-0100).
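A compact sketch of one residual-permutation scheme for the interaction term in a balanced two-way layout: permute the residuals from the additive (main-effects) fit and recompute the interaction F statistic. This follows the general idea of the cited literature rather than the exact method of the paper, and the layout dimensions and data are illustrative.

```python
import numpy as np

rng = np.random.default_rng(7)
a, b, r = 3, 4, 5                                 # factor levels and replicates
y = rng.standard_normal((a, b, r))                # data with no true interaction
y += np.arange(a)[:, None, None] * 0.5            # main effect of A only

def interaction_F(y):
    cell = y.mean(axis=2)
    row, col, grand = cell.mean(axis=1), cell.mean(axis=0), cell.mean()
    ss_ab = r * np.sum((cell - row[:, None] - col[None, :] + grand) ** 2)
    ss_e = np.sum((y - cell[..., None]) ** 2)
    return (ss_ab / ((a - 1) * (b - 1))) / (ss_e / (a * b * (r - 1)))

F_obs = interaction_F(y)

# Additive (main effects only) fit and its residuals
cell = y.mean(axis=2)
row, col, grand = cell.mean(axis=1), cell.mean(axis=0), cell.mean()
additive = (row[:, None] + col[None, :] - grand)[..., None]
resid = y - additive

F_perm = [interaction_F(additive + rng.permutation(resid.ravel()).reshape(y.shape))
          for _ in range(2000)]
print("permutation p-value for interaction:", np.mean(np.array(F_perm) >= F_obs))
```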

11.
In this article, we consider European option pricing for time-changed Brownian models using the Laplace transform. We obtain a general formula for the option price as the integral of a real-valued function involving the Laplace transform of the random time change. Unlike the usual Fourier transform technique, our method does not suffer from difficulties specific to complex integration, such as the evaluation of multiple-valued functions, and allows for a model-independent analysis of the truncation error. In the numerical analysis part, we compare option prices in the variance gamma (VG), normal inverse Gaussian (NIG), and generalized hyperbolic (GH) models obtained by the Laplace transform with those obtained by the Fourier transform method introduced by Carr and Madan in 1999. The results show that our method converges faster than the Fourier approach when the Laplace transforms of the subordinators decay exponentially, as in the NIG and GH models.
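As an independent cross-check of such prices (not the Laplace-inversion formula of the paper), a European call under the variance gamma model can be priced by Monte Carlo using its representation as Brownian motion with drift run on a gamma time change; the parameter values below are illustrative, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(8)
S0, K, r, T = 100.0, 100.0, 0.05, 1.0
sigma, nu, theta = 0.2, 0.2, -0.14        # VG parameters (illustrative values)

# Martingale correction so that E[S_T] = S0 * exp(r T)
omega = np.log(1.0 - theta * nu - 0.5 * sigma ** 2 * nu) / nu

n = 1_000_000
G = rng.gamma(shape=T / nu, scale=nu, size=n)                  # gamma time change
X = theta * G + sigma * np.sqrt(G) * rng.standard_normal(n)    # VG increment
ST = S0 * np.exp((r + omega) * T + X)

price = np.exp(-r * T) * np.maximum(ST - K, 0.0).mean()
print("VG European call (Monte Carlo):", round(price, 4))
```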

12.
In this paper, we consider a unified approach to stochastic comparisons of random vectors corresponding to two general multivariate mixture models. These stochastic comparisons are made with respect to multivariate hazard rate, reversed hazard rate and likelihood ratio orders. As an application, results are presented for stochastic comparisons of generalized multivariate frailty models.
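For reference, the multivariate likelihood ratio order used in this literature has the standard definition below (as in the stochastic-orders textbooks, not quoted from the paper); the multivariate hazard rate and reversed hazard rate orders are defined analogously through conditional hazard rates.

```latex
% X \le_{lr} Y (multivariate likelihood ratio order), with f, g the joint
% densities of X and Y and \wedge, \vee the componentwise minimum and maximum:
\[
  f(\mathbf{x})\,g(\mathbf{y}) \;\le\; f(\mathbf{x}\wedge\mathbf{y})\,g(\mathbf{x}\vee\mathbf{y})
  \qquad \text{for all } \mathbf{x},\mathbf{y}\in\mathbb{R}^n .
\]
```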

13.
Shuo Li, Econometric Reviews, 2019, 38(10): 1202–1215
This paper develops a testing procedure to simultaneously check (i) the independence between the error and the regressor(s), and (ii) the parametric specification in nonlinear regression models. This procedure generalizes the existing work of Sen and Sen ["Testing Independence and Goodness-of-fit in Linear Models," Biometrika, 101, 927–942] to a regression setting that allows any smooth parametric form of the regression function. We establish asymptotic theory for the test procedure under both conditionally homoscedastic and heteroscedastic errors. The derived tests are easily implementable, asymptotically normal, and consistent against a large class of fixed alternatives. In addition, the local power performance is investigated. To calibrate the finite-sample distribution of the test statistics, a smooth bootstrap procedure is proposed and found to work well in simulation studies. Finally, two real data examples are analyzed to illustrate the practical merit of our proposed tests.
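The Sen and Sen test is built on the distance covariance between residuals and covariates; the sketch below computes the sample distance covariance for a nonlinear least-squares fit and calibrates it by permuting residuals, which is a simplification of the smooth bootstrap used in the paper. Data, model, and helper names are mine.

```python
import numpy as np
from scipy.optimize import curve_fit

rng = np.random.default_rng(9)
n = 300
x = rng.uniform(0, 3, n)
y = 2.0 * np.exp(-0.7 * x) + 0.2 * rng.standard_normal(n)   # errors independent of x

def model(x, a, b):
    return a * np.exp(-b * x)

params, _ = curve_fit(model, x, y, p0=[1.0, 1.0])
resid = y - model(x, *params)

def dcov2(u, v):
    """Squared sample distance covariance between two 1-d samples."""
    A = np.abs(u[:, None] - u[None, :])
    B = np.abs(v[:, None] - v[None, :])
    A = A - A.mean(axis=0) - A.mean(axis=1)[:, None] + A.mean()
    B = B - B.mean(axis=0) - B.mean(axis=1)[:, None] + B.mean()
    return (A * B).mean()

t_obs = dcov2(x, resid)
perm = [dcov2(x, rng.permutation(resid)) for _ in range(500)]
print("permutation p-value:", np.mean(np.array(perm) >= t_obs))
```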

14.
The paper investigates diagnostic procedures for the specification of common hazard models in duration analysis. It is shown that under mixed hazard specifications the survival functions of different subgroups cannot cross. A nonparametric test for the crossing of two survival functions is provided and its applications in duration analysis are discussed. In particular, the proportional hazard model with unobserved heterogeneity (PHU) is investigated, and procedures are developed to test whether given data are consistent with the PHU model and whether they contain unobserved heterogeneity within the PHU specification. Examples in which crossing survival functions are of substantive concern are discussed, including the dynamics of infectious diseases and the demand for vaccination.
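A numpy-only sketch of the descriptive part of this diagnostic: estimate Kaplan-Meier curves for two subgroups and flag whether they cross, since a crossing is evidence against a mixed (PHU-type) specification. The formal nonparametric crossing test of the paper is not reproduced, and the synthetic data are chosen so that the curves do cross.

```python
import numpy as np

rng = np.random.default_rng(10)

def km(time, event, grid):
    """Kaplan-Meier survival estimate evaluated on a common time grid (no ties)."""
    order = np.argsort(time)
    time, event = time[order], event[order]
    n = len(time)
    surv_steps, s = [], 1.0
    for i in range(n):
        if event[i]:
            s *= 1.0 - 1.0 / (n - i)      # one subject leaves the risk set per step
        surv_steps.append(s)
    surv_steps = np.array(surv_steps)
    idx = np.searchsorted(time, grid, side="right") - 1
    return np.where(idx < 0, 1.0, surv_steps[np.clip(idx, 0, n - 1)])

# Two subgroups whose survival curves cross (decreasing vs. increasing hazard)
t1 = rng.weibull(0.7, 400) * 2.0
t2 = rng.weibull(2.0, 400) * 2.0
c1, c2 = rng.uniform(0, 6, 400), rng.uniform(0, 6, 400)   # random censoring
time1, event1 = np.minimum(t1, c1), (t1 <= c1)
time2, event2 = np.minimum(t2, c2), (t2 <= c2)

grid = np.linspace(0.05, 3.0, 200)
diff = km(time1, event1, grid) - km(time2, event2, grid)
print("estimated curves cross:", bool(np.any(diff > 0) and np.any(diff < 0)))
```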

15.
The problem of choosing optimal levels of the acceleration variable for accelerated testing is an important issue in reliability analysis. Most recommendations have focused on minimizing the variance of an estimator of a particular characteristic, such as a percentile, for a specific parametric model. In this paper, a general approach based on “locally penalized” D-optimality (LPD-optimality) is proposed, which simultaneously minimizes the variances of the model parameter estimators. Application of the method is illustrated for inverse Gaussian-accelerated test models fitted to carbon fiber tensile strength data, where the fiber length is the “acceleration variable”.
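The sketch below shows only the bare D-optimality computation such criteria build on: a grid search over two-point designs maximizing the determinant of the information matrix for a simple model with log-life linear in the stress and no censoring. The locally penalized criterion and the inverse Gaussian accelerated test model of the paper are not implemented, and all numbers are illustrative.

```python
import numpy as np

# Candidate (coded) acceleration-variable levels and a grid search over
# two-point designs: choose the lower level and the proportion of units
# allocated to it so that det(information matrix) is maximal.
s_hi = 1.0                                  # highest usable stress (fixed)
levels = np.linspace(-1.0, 0.9, 50)         # candidate lower stress levels
props = np.linspace(0.1, 0.9, 41)           # proportion of units at the lower level

best = (-np.inf, None)
for s_lo in levels:
    for p in props:
        X = np.array([[1.0, s_lo], [1.0, s_hi]])
        W = np.diag([p, 1.0 - p])
        info = X.T @ W @ X                  # Fisher information, up to a constant
        d = np.linalg.det(info)
        if d > best[0]:
            best = (d, (round(s_lo, 3), round(p, 3)))
print("D-optimal two-point design (lower level, proportion):", best[1])
```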

16.
There are several procedures for fitting generalized additive models, i.e. regression models for an exponential family response where the influence of each single covariate is assumed to have unknown, potentially non-linear shape. Simulated data are used to compare a smoothing parameter optimization approach for selection of smoothness and of covariates, a stepwise approach, a mixed model approach, and a procedure based on boosting techniques. In particular it is investigated how the performance of procedures is linked to amount of information, type of response, total number of covariates, number of influential covariates, and extent of non-linearity. Measures for comparison are prediction performance, identification of influential covariates, and smoothness of fitted functions. One result is that the mixed model approach returns sparse fits with frequently over-smoothed functions, while the functions are less smooth for the boosting approach and variable selection is less strict. The other approaches are in between with respect to these measures. The boosting procedure is seen to perform very well when little information is available and/or when a large number of covariates is to be investigated. It is somewhat surprising that in scenarios with low information the fitting of a linear model, even with stepwise variable selection, has little advantage over the fitting of an additive model when the true underlying structure is linear. In cases with more information the prediction performance of all procedures is very similar. So, in difficult data situations the boosting approach can be recommended; in others, the procedure can be chosen according to the aim of the analysis.
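The procedures compared in the paper are typically R-based; as a small Python stand-in, the sketch below contrasts a purely linear fit with a penalized spline additive fit by cross-validation, mirroring the linear-versus-additive comparison discussed above. Library choices, knot counts, and penalty grids are illustrative.

```python
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import SplineTransformer
from sklearn.linear_model import RidgeCV, LinearRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(11)
n, p = 300, 6
X = rng.standard_normal((n, p))
# Only two influential covariates, one of them non-linear
y = np.sin(2 * X[:, 0]) + 0.5 * X[:, 1] + 0.3 * rng.standard_normal(n)

linear = LinearRegression()
additive = make_pipeline(SplineTransformer(n_knots=8, degree=3),
                         RidgeCV(alphas=np.logspace(-3, 3, 13)))

for name, est in [("linear", linear), ("additive (splines)", additive)]:
    mse = -cross_val_score(est, X, y, cv=5,
                           scoring="neg_mean_squared_error").mean()
    print(f"{name}: CV mean squared error = {mse:.3f}")
```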

17.
A multivariate GARCH model is used to investigate Granger causality in the conditional variance of time series. Parametric restrictions for the hypothesis of noncausality in conditional variances between two groups of variables, when there are other variables in the system as well, are derived. These novel conditions are convenient for the analysis of potentially large systems of economic variables. To evaluate hypotheses of noncausality, a Bayesian testing procedure is proposed. It avoids the singularity problem that may appear in the Wald test, and it relaxes the assumption of the existence of higher-order moments of the residuals required in classical tests.
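A classical frequentist point of comparison for causality in variance (not the paper's Bayesian procedure in a multivariate GARCH) is the Cheung-Ng cross-correlation check on squared standardized residuals from univariate GARCH fits; the sketch below uses the arch package on synthetic returns in which the variance of series 1 reacts to lagged shocks of series 2.

```python
import numpy as np
from arch import arch_model
from scipy import stats

rng = np.random.default_rng(12)
n = 2000
e2 = rng.standard_normal(n)
h1 = np.ones(n)
e1 = np.empty(n)
e1[0] = rng.standard_normal()
for t in range(1, n):
    h1[t] = 0.05 + 0.85 * h1[t - 1] + 0.10 * e2[t - 1] ** 2   # variance spillover
    e1[t] = np.sqrt(h1[t]) * rng.standard_normal()

def std_resid(y):
    res = arch_model(y, mean="Constant", vol="GARCH", p=1, q=1).fit(disp="off")
    return res.resid / res.conditional_volatility

u1, u2 = std_resid(e1) ** 2, std_resid(e2) ** 2
M = 5
ccf = [np.corrcoef(u1[k:], u2[:-k])[0, 1] for k in range(1, M + 1)]
Q = n * np.sum(np.square(ccf))                  # Cheung-Ng type statistic, approx. chi2(M)
print("stat =", round(Q, 2), " p-value =", round(1 - stats.chi2.cdf(Q, M), 4))
```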

18.
This paper presents a new test statistic for dynamic or stochastic mis-specification for the dynamic demand or dynamic adjustment class of economic models. The test statistic is based on residual autocorrelations, is asymptotically χ², and is suspected to be of low power. The test is illustrated with an example from the recent econometric literature.
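A closely related, standard residual-autocorrelation check is the Ljung-Box portmanteau test, shown below on residuals from a deliberately underfitted AR model as a stand-in for a mis-specified dynamic equation; this is not the paper's exact statistic.

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.arima.model import ARIMA
from statsmodels.stats.diagnostic import acorr_ljungbox

rng = np.random.default_rng(13)
# Data from an AR(2) process deliberately (mis)fitted with an AR(1) model
n = 400
y = np.zeros(n)
for t in range(2, n):
    y[t] = 0.5 * y[t - 1] + 0.3 * y[t - 2] + rng.standard_normal()

res = ARIMA(pd.Series(y), order=(1, 0, 0)).fit()

# Portmanteau test on the residual autocorrelations; small p-values indicate
# remaining dynamics, i.e. dynamic mis-specification.
print(acorr_ljungbox(res.resid, lags=[5, 10]))
```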

19.
The problem of spurious observations has been dealt with from a Bayesian perspective by, among others, Box and Tiao (1968) and in several papers by Guttman with various co-authors, beginning with Guttman (1973). The main objective of these papers has been to obtain posterior distributions of parameters, and to base inference on these distributions. In the current paper, the Bayesian argument is carried one step further by deriving predictive distributions of future observations. Inferences are then based on these distributions. We obtain predictive results for several models. First, we consider the univariate normal case with one spurious observation. This is then generalized to several spurious observations. The multivariate normal situation is studied next. Finally, we consider the general linear model with normal errors.
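As a baseline for such predictive results, the sketch below gives the standard Bayesian predictive distribution for one future observation from a normal sample under the usual noninformative prior, namely a Student t with n-1 degrees of freedom, location equal to the sample mean and scale s*sqrt(1 + 1/n); the spurious-observation adjustments derived in the paper are not reproduced.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(14)
x = rng.normal(loc=10.0, scale=2.0, size=25)          # observed sample
n, xbar, s = len(x), x.mean(), x.std(ddof=1)

# Predictive distribution of a single future observation (noninformative prior):
# Student t with n-1 df, location xbar, scale s * sqrt(1 + 1/n)
pred = stats.t(df=n - 1, loc=xbar, scale=s * np.sqrt(1.0 + 1.0 / n))

lo, hi = pred.ppf([0.025, 0.975])
print(f"95% predictive interval for a new observation: ({lo:.2f}, {hi:.2f})")
```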

20.
We describe stationarity and ergodicity (SE) regions for a recently proposed class of score-driven dynamic correlation models. These models have important applications in empirical work. The regions are derived from sufficiency conditions in Bougerol (1993, Kalman filtering with random coefficients and contractions, SIAM Journal on Control and Optimization, 31(4): 942–959) and take a nonstandard form. We show that the nonstandard shape of the sufficiency regions cannot be avoided by reparameterizing the model or by rescaling the score steps in the transition equation for the correlation parameter. This makes the result markedly different from the volatility case. Observationally equivalent decompositions of the stochastic recurrence equation yield regions with different shapes and sizes. We use these results to establish the consistency and asymptotic normality of the maximum likelihood estimator. We illustrate our results with an analysis of time-varying correlations between U.K. and Greek equity indices. We find that, also in empirical applications, different decompositions can give rise to different conclusions regarding the stability of the estimated model.
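A rough numerical illustration of the kind of sufficiency condition involved: for a scalar score-driven correlation recursion f_{t+1} = omega + beta*f_t + alpha*s_t with rho_t = tanh(f_t), estimate E[log |df_{t+1}/df_t|] along a simulated path by finite differences and check that it is negative. The recursion, parameter values, and the finite-difference check are assumptions of mine; the SE regions in the paper are derived analytically and take a more complicated form.

```python
import numpy as np

rng = np.random.default_rng(15)

def score(f, y1, y2):
    """Score of the bivariate Gaussian log-likelihood w.r.t. f, with rho = tanh(f)."""
    r = np.tanh(f)
    dldr = (r * (1 - r ** 2) + y1 * y2 * (1 + r ** 2)
            - r * (y1 ** 2 + y2 ** 2)) / (1 - r ** 2) ** 2
    return dldr * (1 - r ** 2)               # chain rule: d rho / d f = 1 - tanh(f)^2

def step(f, y1, y2, omega=0.02, beta=0.95, alpha=0.05):
    return omega + beta * f + alpha * score(f, y1, y2)

# Simulate bivariate Gaussian data with a fixed correlation, run the recursion,
# and estimate E[log |d f_{t+1} / d f_t|] by finite differences along the path.
T, rho_true, eps = 50_000, 0.4, 1e-5
L = np.linalg.cholesky(np.array([[1.0, rho_true], [rho_true, 1.0]]))
f, logs = 0.0, []
for _ in range(T):
    y1, y2 = L @ rng.standard_normal(2)
    f_next = step(f, y1, y2)
    deriv = (step(f + eps, y1, y2) - f_next) / eps
    logs.append(np.log(abs(deriv)))
    f = f_next

print("estimated average-contraction bound:", np.mean(logs), "(negative suggests SE)")
```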

