Similar Articles
20 similar articles found (search time: 281 ms)
1.
Traditional factor analysis (FA) rests on the assumption of multivariate normality. However, in some practical situations the data do not meet this assumption, so statistical inference made from such data may be misleading. This paper aims to provide some new tools for the skew-normal (SN) FA model when missing values occur in the data. In such a model, the latent factors are assumed to follow a restricted version of the multivariate SN distribution with additional shape parameters for accommodating skewness. We develop an analytically feasible expectation conditional maximization algorithm for carrying out parameter estimation and imputation of missing values under missing-at-random mechanisms. The practical utility of the proposed methodology is illustrated with two real data examples, and the results are compared with those obtained from the traditional FA counterparts.

2.
Non-Gaussian factor analysis differs from ordinary factor analysis in its distributional assumption on the factors, which are modelled by univariate mixtures of Gaussians, thus relaxing the classical normality hypothesis. From this point of view, the model can be thought of as a generalization of ordinary factor analysis, and its estimation problem can still be solved via the maximum likelihood method. The focus of this work is to introduce, develop and explore a Bayesian analysis of the model in order to answer unresolved questions about the number of latent factors and, simultaneously, the number of mixture components used to model each factor. The effectiveness of the proposed method is explored in a simulation study and in a real example of international exchange rates.

3.
In this paper, we consider the estimated weights of the tangency portfolio. We derive analytical expressions for the higher-order non-central and central moments of these weights when the returns are assumed to be independently and multivariate normally distributed. Moreover, closed-form expressions for the mean, variance, skewness and kurtosis of the estimated weights are obtained. We then complement our results with a simulation study in which data from the multivariate normal and t-distributions are simulated and the first four moments of the estimated weights are computed by Monte Carlo experiment. It is noteworthy that the distributional assumption on returns is found to be important, especially for the first two moments. Finally, through an empirical illustration using returns of four financial indices listed on the NASDAQ stock exchange, we observe the presence of time dynamics in the higher moments.
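As a hedged sketch of the Monte Carlo exercise described above: the code below draws returns, forms the standard tangency weights w ∝ Σ⁻¹(μ − rf), and computes the first four moments of the estimated weights. The three-asset means, covariance matrix and risk-free rate are invented purely for illustration, not taken from the paper.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Illustrative (invented) market parameters: 3 assets.
mu = np.array([0.05, 0.07, 0.06])
Sigma = np.array([[0.04, 0.01, 0.00],
                  [0.01, 0.09, 0.02],
                  [0.00, 0.02, 0.06]])
rf = 0.02

def tangency_weights(mu_hat, Sigma_hat, rf):
    """Tangency portfolio: w proportional to Sigma^{-1}(mu - rf), normalized."""
    w = np.linalg.solve(Sigma_hat, mu_hat - rf)
    return w / w.sum()

# Monte Carlo distribution of the *estimated* weights from n observed returns.
n, reps = 500, 2000
draws = np.empty((reps, len(mu)))
for r in range(reps):
    X = rng.multivariate_normal(mu, Sigma, size=n)
    draws[r] = tangency_weights(X.mean(axis=0), np.cov(X, rowvar=False), rf)

print("mean:    ", draws.mean(axis=0))
print("variance:", draws.var(axis=0))
print("skewness:", stats.skew(draws, axis=0))
print("kurtosis:", stats.kurtosis(draws, axis=0))
```

Replacing the normal draws with (rescaled) multivariate t draws reproduces the paper's comparison of distributional assumptions; note that the normalized weights are a ratio statistic, so their sampling distribution can be heavy-tailed when n is small.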

4.
Econometric Reviews, 2013, 32(4): 385–424
This paper introduces nonlinear dynamic factor models for various applications related to risk analysis. Traditional factor models represent the dynamics of processes driven by movements of latent variables, called the factors. Our approach extends this setup by introducing factors defined as random dynamic parameters and stochastic autocorrelated simulators. This class of factor models can represent processes with time varying conditional mean, variance, skewness and excess kurtosis. Applications discussed in the paper include dynamic risk analysis, such as risk in price variations (models with stochastic mean and volatility), extreme risks (models with stochastic tails), risk on asset liquidity (stochastic volatility duration models), and moral hazard in insurance analysis.

We propose estimation procedures for models in which the marginal density of the series and the factor dynamics are parameterized by distinct subsets of parameters. Such a partitioning of the parameter vector, found in many applications, considerably simplifies statistical inference. We develop a two-stage maximum likelihood method, called the Finite Memory Maximum Likelihood, which is easy to implement in the presence of multiple factors. We also discuss simulation-based estimation, testing, prediction and filtering.

5.
In this article, we present the EM algorithm for performing maximum likelihood estimation of an asymmetric linear calibration model under the assumption of skew-normally distributed errors. A simulation study is conducted to evaluate the performance of the calibration estimator in interpolation and extrapolation situations. As an application to a real data set, we fit the model to a dimensional measurement method used for calculating testicular volume with a caliper, calibrated against ultrasonography as the standard method. With this methodology, we do not need to transform the variables to obtain symmetric errors. Another interesting aspect of the approach is that the transformation developed to make the information matrix nonsingular when the skewness parameter is near zero leaves the parameter of interest unchanged. Model fitting is implemented, and the best choice between the usual calibration model and the model proposed in this article is evaluated using the Akaike information criterion, Schwarz's Bayesian information criterion and the Hannan–Quinn criterion.

6.
The MaxEWMA chart has recently been introduced as an improvement over the standard EWMA chart for detecting changes in the mean and/or standard deviation of a normally distributed process. Although this chart was originally developed for normally distributed process data, its robustness to violations of the normality assumption is the central theme of this study. For data distributions with heavy tails or strong skewness, the in-control average run lengths (ARLs) of the MaxEWMA chart are shown to be significantly shorter than expected. On the other hand, out-of-control ARLs are comparable to normal-theory values for a variety of symmetric non-normal distributions. The MaxEWMA chart is not robust to skewness.
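A rough sketch of this kind of robustness check: the code below simulates in-control ARLs for a Max-EWMA-style statistic, built in the usual way as an EWMA of standardized subgroup means paired with an EWMA of normal-transformed subgroup variances. The smoothing constant, subgroup size and control limit are ad-hoc illustrative choices, not the calibrated values a real chart would use.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)

def max_ewma_arl(sampler, n=5, lam=0.25, h=2.7, reps=200, max_len=1000):
    """Simulated in-control ARL: z is an EWMA of standardized subgroup means,
    y an EWMA of normal-transformed subgroup variances; the chart signals when
    max(|z|, |y|) exceeds h asymptotic-EWMA standard deviations."""
    limit = h * np.sqrt(lam / (2.0 - lam))
    lengths = []
    for _ in range(reps):
        z = y = 0.0
        t = max_len
        for t in range(1, max_len + 1):
            x = sampler(n)
            u = np.sqrt(n) * x.mean()  # standard normal in control under normality
            w = stats.norm.ppf(stats.chi2.cdf((n - 1) * x.var(ddof=1), n - 1))
            z = (1 - lam) * z + lam * u
            y = (1 - lam) * y + lam * w
            if max(abs(z), abs(y)) > limit:
                break
        lengths.append(t)
    return float(np.mean(lengths))

arl_normal = max_ewma_arl(lambda n: rng.standard_normal(n))
arl_t4 = max_ewma_arl(lambda n: rng.standard_t(4, n) / np.sqrt(2.0))  # unit-variance t(4)
print("in-control ARL, normal data:", arl_normal)
print("in-control ARL, t(4) data:  ", arl_t4)
```

The heavy-tailed t(4) process, rescaled to unit variance, produces markedly shorter in-control ARLs, i.e., more false alarms, which is the robustness failure the abstract reports.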

7.
A Monte Carlo simulation study examines the statistical properties of the sample moments of STAR models. The results show that the sample mean, sample variance, sample skewness and sample kurtosis of a STAR model are all asymptotically normally distributed. Even when the data-generating process of a STAR model contains no constant term, the population mean need not be zero, in marked contrast to linear ARMA models; and even when the error term of the data-generating process is normally distributed, the data may still follow a skewed distribution.
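A minimal sketch of such a simulation, using an LSTAR(1) specification with an invented parameterization, no intercept and Gaussian errors; the two effects claimed above (nonzero mean without a constant term, skewness despite normal errors) can both be observed this way.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)

def simulate_lstar(T, phi1=0.3, phi2=-0.9, gamma=5.0, c=0.0, burn=200):
    """LSTAR(1) without intercept, Gaussian errors (parameters are illustrative):
    y_t = phi1*y_{t-1} + phi2*y_{t-1}*G(y_{t-1}) + eps_t,
    with logistic transition G(y) = 1/(1 + exp(-gamma*(y - c)))."""
    y = np.zeros(T + burn)
    eps = rng.standard_normal(T + burn)
    for t in range(1, T + burn):
        g = 1.0 / (1.0 + np.exp(-gamma * (y[t - 1] - c)))
        y[t] = phi1 * y[t - 1] + phi2 * y[t - 1] * g + eps[t]
    return y[burn:]

reps, T = 500, 1000
samples = [simulate_lstar(T) for _ in range(reps)]
means = np.array([s.mean() for s in samples])
skews = np.array([stats.skew(s) for s in samples])
print("average sample mean:    ", means.mean())   # nonzero despite no intercept
print("average sample skewness:", skews.mean())   # data can be skewed
```

The asymmetry arises because the autoregressive coefficient differs between the two regimes (0.3 versus 0.3 − 0.9 = −0.6 here), so positive and negative excursions behave differently even with symmetric shocks.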

8.
In this paper, a joint model for analyzing multivariate mixed ordinal and continuous responses, where the continuous outcomes may be skewed, is presented. For modeling the discrete ordinal responses, a continuous latent variable approach is considered, and for describing the continuous responses, a skew-normal mixed effects model is used. A Bayesian approach using Markov chain Monte Carlo (MCMC) is adopted for parameter estimation. Some simulation studies are performed to illustrate the proposed approach. Their results show that using separate models, or a normal distributional assumption for the shared random effects and within-subject errors of the continuous and ordinal variables, instead of joint modeling under a skew-normal distribution, leads to biased parameter estimates. The approach is used for analyzing part of the British Household Panel Survey (BHPS) data set. Annual income and life satisfaction are considered as the continuous and the ordinal longitudinal responses, respectively. The annual income variable is severely skewed; therefore, the normality assumption for the continuous response does not yield acceptable results. The data analysis shows that gender, marital status, educational level and the amount of money spent on leisure have a significant effect on annual income, while marital status has the highest impact on life satisfaction.

9.
Degradation analysis is a useful technique when life tests result in few or even no failures. The degradation measurements are recorded over time, and the estimation of the time-to-failure distribution plays a vital role in degradation analysis. The parametric method for estimating the time-to-failure distribution assumes a specific parametric model with known shape for the random-effects parameter. To avoid any assumption about the model shape, a nonparametric method can be used. In this paper, we suggest using the nonparametric fourth-order kernel method to estimate the time-to-failure distribution and its percentiles for the simple linear degradation model. The performance of the proposed method is investigated and compared with the classical kernel, maximum likelihood and ordinary least squares methods via simulation. The numerical results show the good performance of the fourth-order kernel method and demonstrate its superiority over the parametric method when there is no information about the shape of the random-effects parameter distribution.
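The estimator is straightforward to sketch with a fourth-order Gaussian kernel, K(u) = (3/2 − u²/2)·φ(u), applied to pseudo failure times from a simple linear degradation path. The degradation threshold, slope distribution and rule-of-thumb bandwidth below are invented for illustration; fourth-order kernels admit a faster n^(−1/9) bandwidth rate than the usual n^(−1/5).

```python
import numpy as np

rng = np.random.default_rng(3)

def k4(u):
    """Fourth-order Gaussian kernel: (3/2 - u^2/2) times the standard normal pdf."""
    return (1.5 - 0.5 * u**2) * np.exp(-0.5 * u**2) / np.sqrt(2.0 * np.pi)

def kde4(x_grid, data, h):
    """Fourth-order kernel density estimate evaluated on a grid."""
    u = (x_grid[:, None] - data[None, :]) / h
    return k4(u).mean(axis=1) / h

# Simple linear degradation y_i(t) = b_i * t with random slopes (illustrative):
# unit i crosses the failure threshold D at pseudo failure time T_i = D / b_i.
D = 10.0
slopes = rng.lognormal(mean=0.0, sigma=0.3, size=200)
fail_times = D / slopes

# Rule-of-thumb bandwidth at the n^(-1/9) rate appropriate for 4th-order kernels.
h = 1.06 * fail_times.std() * len(fail_times) ** (-1.0 / 9.0)
grid = np.linspace(fail_times.min(), fail_times.max(), 400)
dens = kde4(grid, fail_times, h)
print("approx. integral of the estimate:", dens.sum() * (grid[1] - grid[0]))
```

Unlike a second-order kernel, K here takes small negative values, which is the price paid for the bias reduction; percentile estimates would be read off the (monotonized) integrated estimate.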

10.
We develop a discrete-time affine stochastic volatility model with time-varying conditional skewness (SVS). Importantly, we disentangle the dynamics of conditional volatility and conditional skewness in a coherent way. Our approach allows current asset returns to be asymmetric conditional on current factors and past information, which we term contemporaneous asymmetry. Conditional skewness is an explicit combination of the conditional leverage effect and contemporaneous asymmetry. We derive analytical formulas for various return moments that are used for generalized method of moments (GMM) estimation. Applying our approach to S&P500 index daily returns and option data, we show that one- and two-factor SVS models provide a better fit for both the historical and the risk-neutral distribution of returns, compared to existing affine generalized autoregressive conditional heteroscedasticity (GARCH) and stochastic volatility with jumps (SVJ) models. Our results are not due to an overparameterization of the model: the one-factor SVS models have the same number of parameters as their one-factor GARCH competitors and fewer than the SVJ benchmark.

11.
Very often in psychometric research, as in educational assessment, it is necessary to analyze item responses from clustered respondents. The multiple group item response theory (IRT) model proposed by Bock and Zimowski [12] provides a useful framework for analyzing such data. In this model, the selected groups of respondents are of specific interest, so group-specific population distributions need to be defined. The usual assumption for parameter estimation in this model, that the latent traits are random variables following different symmetric normal distributions, has been questioned in many works in the IRT literature; when this assumption does not hold, misleading inference can result. In this paper, we assume that the latent traits for each group follow different skew-normal distributions under the centered parameterization, and we call the resulting model the skew multiple group IRT model. This modeling extends the works of Azevedo et al. [4], Bazán et al. [11] and Bock and Zimowski [12] concerning the latent trait distribution. Our approach ensures that the model is identifiable. We propose two Markov chain Monte Carlo (MCMC) algorithms for parameter estimation and compare their convergence behavior. A simulation study evaluates parameter recovery for the proposed model and the convergence of the selected algorithm; the results reveal that the proposed algorithm properly recovers all model parameters. Furthermore, we analyze a real data set that presents asymmetry in the latent trait distribution. The results obtained with our approach confirm the presence of negative asymmetry for some latent trait distributions.

12.
Skewed and fat-tailed distributions frequently occur in many applications. Models proposed to deal with skewness and kurtosis may be difficult to handle because the density function usually cannot be written in closed form and the moments might not exist. The log-Dagum distribution is a flexible and simple model obtained by a logarithmic transformation of the Dagum random variable. In this paper, some characteristics of the model are illustrated and the estimation of its parameters is considered. An application is given with the purpose of modeling the kurtosis and skewness that mark financial return distributions.

13.
Non-random sampling is a source of bias in empirical research. It is common for the outcomes of interest (e.g. the wage distribution) to be skewed in the source population. Sometimes, the outcomes are further subject to sample selection, a type of missing data, resulting in partial observability. Thus, complete-case methods for skewed data are inadequate for the analysis of such data, and a general sample selection model is required. Heckman proposed a full maximum likelihood estimation method under the normality assumption for sample selection problems, and parametric and non-parametric extensions have been proposed. We generalize the Heckman selection model to allow for underlying skew-normal distributions. The finite-sample performance of the maximum likelihood estimator of the model is studied via simulation. Applications illustrate the strength of the model in capturing spurious skewness in bounded scores, and in modelling data where a logarithm transformation could not mitigate the effect of inherent skewness in the outcome variable.
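For orientation, the normal-theory baseline that this paper generalizes, Heckman's two-step estimator, can be sketched on simulated data; all coefficients and the selection design below are invented, and z plays the role of the exclusion restriction.

```python
import numpy as np
from scipy import stats
from scipy.optimize import minimize

rng = np.random.default_rng(4)

# Simulated sample-selection model under joint normality (Heckman's baseline):
# selection: s = 1{0.5 + z + 0.8*x + u > 0};  outcome: y = 1 + 2*x + e,
# observed only when s = 1; corr(u, e) = 0.7 induces selection bias.
n = 20000
z, x = rng.standard_normal(n), rng.standard_normal(n)
u = rng.standard_normal(n)
e = 0.7 * u + np.sqrt(1 - 0.7**2) * rng.standard_normal(n)
s = (0.5 + z + 0.8 * x + u) > 0
y = 1.0 + 2.0 * x + e

# Step 1: probit of s on (1, z, x), then the inverse Mills ratio.
Z = np.column_stack([np.ones(n), z, x])
nll = lambda g: -np.where(s, stats.norm.logcdf(Z @ g),
                          stats.norm.logcdf(-(Z @ g))).sum()
g_hat = minimize(nll, np.zeros(3)).x
idx = Z @ g_hat
mills = stats.norm.pdf(idx) / stats.norm.cdf(idx)

# Step 2: OLS of y on (1, x, mills) over the selected sample only.
X2 = np.column_stack([np.ones(s.sum()), x[s], mills[s]])
coef, *_ = np.linalg.lstsq(X2, y[s], rcond=None)
naive, *_ = np.linalg.lstsq(X2[:, :2], y[s], rcond=None)
print("two-step slope:", coef[1], "  naive OLS slope:", naive[1])
```

The coefficient on the Mills ratio estimates ρσ (0.7 here); the naive complete-case OLS slope is biased because selection depends on x, which is exactly the failure mode the skew-normal generalization also has to guard against.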

14.
The exclusion restriction is usually assumed for identifying causal effects in true or natural randomized experiments with noncompliance. It requires that assignment to treatment has no direct causal effect on the outcome. Despite its importance, the restriction can often be unrealistic, especially in natural experiments. It is shown that, without the exclusion restriction, the parametric model is identified if the outcome distributions of the various compliance statuses are in the same parametric class and that class is a linearly independent set over the field of real numbers. However, relaxing the exclusion restriction yields a parametric model characterized by mixtures of distributions. This complicates likelihood-based estimation because it implies more than one maximum likelihood point. A two-step estimation procedure, based on selecting the root closest to the method-of-moments estimate of the parameter vector, is then proposed and analyzed in detail under normally distributed outcomes. An economic example with real data on returns to schooling concludes the paper.

15.
This paper studies estimation of a partially specified spatial panel data linear regression with random effects and spatially correlated error components. Under the assumption of an exogenous spatial weighting matrix and exogenous regressors, the unknown parameter is estimated by instrumental variable estimation. Under some sufficient conditions, the proposed estimator of the finite-dimensional parameters is shown to be root-N consistent and asymptotically normally distributed; the proposed estimator of the unknown function is shown to be consistent and asymptotically distributed as well, though at a rate slower than root-N. Consistent estimators of the asymptotic variance-covariance matrices of both the parametric and unknown components are provided. The Monte Carlo simulation results suggest that the approach has practical value.

16.
Product moments of the bivariate chi-square distribution have been derived in closed form, with finite expressions for product moments of integer orders. Marginal and conditional distributions, conditional moments, and the coefficients of skewness and kurtosis of the conditional distribution are also discussed. The Shannon entropy of the distribution is derived, and Bayesian estimation of a parameter of the distribution is considered. The results reduce to the independent case when the variables are uncorrelated.

17.
In this paper, a competing risks model is considered under an adaptive type-I progressive hybrid censoring scheme (AT-I PHCS). The latent failure times follow Weibull distributions with a common shape parameter. We investigate maximum likelihood estimation of the parameters. Bayes estimates of the parameters are obtained under squared error and LINEX loss functions, assuming independent gamma priors. We propose applying Markov chain Monte Carlo (MCMC) techniques to carry out the Bayesian estimation procedure and, in turn, to calculate credible intervals. To evaluate the performance of the estimators, a simulation study is carried out.

18.
In variety testing, as well as in psychological assessment, the situation occurs that in a two-way ANOVA-type model with only one replication per cell, the analysis is done under the assumption of no interaction between the two factors. Tests for this situation are known only for fixed factors and normally distributed outcomes. We present five additivity tests and apply them to fixed and mixed models and to quantitative as well as Bernoulli distributed data. We examine their performance via simulation studies with respect to type-I risk and power. Furthermore, two new approaches are presented: a modification of Tukey's test and a new experimental design for testing interactions.
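Of the additivity tests relevant to this single-replicate layout, Tukey's classical one-degree-of-freedom test is easy to sketch; the data below are simulated, and the paper's modified test and new experimental design are not reproduced here.

```python
import numpy as np
from scipy import stats

def tukey_additivity(y):
    """Tukey's one-degree-of-freedom test for non-additivity in a two-way
    layout with one observation per cell (y is a rows-by-columns array).
    Returns the F statistic and its p-value on (1, (a-1)(b-1)-1) df."""
    a, b = y.shape
    grand = y.mean()
    ra = y.mean(axis=1) - grand          # row effects
    cb = y.mean(axis=0) - grand          # column effects
    num = (ra[:, None] * cb[None, :] * y).sum() ** 2
    ss_tukey = num / ((ra**2).sum() * (cb**2).sum())
    ss_err = ((y - grand - ra[:, None] - cb[None, :])**2).sum()
    df2 = (a - 1) * (b - 1) - 1
    F = ss_tukey / ((ss_err - ss_tukey) / df2)
    return F, stats.f.sf(F, 1, df2)

rng = np.random.default_rng(5)
rows = rng.normal(0, 1, 6)[:, None]
cols = rng.normal(0, 1, 5)[None, :]
additive = 10 + rows + cols + rng.normal(0, 0.5, (6, 5))
multiplic = 10 + rows + cols + 1.5 * rows * cols + rng.normal(0, 0.5, (6, 5))
print("additive data,    p =", tukey_additivity(additive)[1])
print("interaction data, p =", tukey_additivity(multiplic)[1])
```

The test is powerful precisely against the multiplicative alternative γ·αᵢβⱼ built into the second data set, which is why it serves as the usual starting point for the modifications the abstract describes.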

19.
In this paper, we propose a new semiparametric heteroscedastic regression model allowing for positive and negative skewness and bimodal shapes, using a B-spline basis for the nonlinear effects. The proposed distribution is based on the generalized additive models for location, scale and shape framework, in which any or all parameters of the distribution can be modeled using parametric linear and/or nonparametric smooth functions of explanatory variables. We motivate the new model by means of Monte Carlo simulations showing that ignoring the skewness and bimodality of the random errors in semiparametric regression models may introduce biases in the parameter estimates and/or in the estimation of the associated variability measures. An iterative estimation process and some diagnostic methods are investigated. Applications to two real data sets are presented, and the method is compared to the usual regression methods.

20.
In this paper we examine maximum likelihood estimation procedures in multilevel models with two-level nesting structures. Usually, for the estimation of fixed effects and variance components, level-one error terms and random effects are assumed to be normally distributed. Nevertheless, in some circumstances this assumption might not be realistic, especially for the random effects. We therefore assume for the random effects the family of multivariate exponential power (MEP) distributions; subsequently, by means of Monte Carlo simulation, we study the robustness of maximum likelihood estimators under the normality assumption when the random effects are actually MEP distributed.


Copyright © Beijing Qinyun Technology Development Co., Ltd. (京ICP备09084417号)