ML estimation for factor analysis: EM or non-EM? |
| |
Authors: | J-H Zhao Philip L H Yu Qibao Jiang |
| |
Institution: | (1) Department of Statistics and Actuarial Science, The University of Hong Kong, Pokfulam, Hong Kong; (2) Department of Statistics, Yunnan University, Kunming, 650091, China; (3) Department of Mathematics, Southeast University, Nanjing, 210096, China |
| |
Abstract: | To obtain maximum likelihood (ML) estimates in factor analysis (FA), we propose in this paper a novel and fast conditional maximization (CM) algorithm, which has quadratic and monotone convergence and consists of a sequence of CM log-likelihood (CML) steps. The main contribution of this algorithm is that a closed-form expression for the parameter updated in each step can be obtained explicitly, without resorting to any numerical optimization method. In addition, a new ECME algorithm similar to that of Liu (Biometrika 81, 633–648, 1994) is obtained as a by-product; it turns out to be very close to the simple iteration algorithm proposed by Lawley (Proc. R. Soc. Edinb. 60, 64–82, 1940), but our algorithm is guaranteed to increase the log-likelihood at every iteration and hence to converge. Both algorithms inherit the simplicity and stability of EM, but their convergence behaviors differ markedly, as revealed in our extensive simulations: (1) in most situations, ECME and EM perform similarly; (2) CM outperforms EM and ECME substantially in all situations, whether assessed by CPU time or by the number of iterations. In particular, for cases close to the well-known Heywood case, CM accelerates EM by factors of around 100 or more. CM is also much less sensitive to the choice of starting values than EM and ECME. |
| |
Keywords: | CM; ECME; EM; Factor analysis; Maximum likelihood estimation |
This article is indexed in SpringerLink and other databases.
|
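For context, the plain EM iteration for ML factor analysis that the abstract's CM algorithm is compared against can be sketched as follows. This is a minimal sketch of the standard EM updates for the factor model x = Λz + ε with z ~ N(0, I) and diagonal Ψ (the classical Rubin–Thayer iteration), not the paper's CM or ECME algorithm; all function and variable names are illustrative.

```python
import numpy as np

def em_factor_analysis(S, q, n_iter=500, tol=1e-8, seed=0):
    """Plain EM for the factor model x = Lambda z + e, z ~ N(0, I_q),
    e ~ N(0, Psi) with Psi diagonal.  S is the p x p sample covariance
    matrix, q the number of factors.  Illustrative sketch only."""
    p = S.shape[0]
    rng = np.random.default_rng(seed)
    Lam = rng.standard_normal((p, q)) * 0.1   # initial loading matrix
    Psi = np.diag(S).copy()                   # initial uniquenesses
    prev_ll = -np.inf
    for _ in range(n_iter):
        Sigma = Lam @ Lam.T + np.diag(Psi)    # current model covariance
        beta = Lam.T @ np.linalg.inv(Sigma)   # E-step regression coefficients
        # average of E[zz' | x_i] over the sample
        Ezz = np.eye(q) - beta @ Lam + beta @ S @ beta.T
        Lam = S @ beta.T @ np.linalg.inv(Ezz) # M-step: update loadings
        Psi = np.diag(S - Lam @ beta @ S)     # M-step: update uniquenesses
        # monitor the log-likelihood (up to additive constants)
        _, logdet = np.linalg.slogdet(Sigma)
        ll = -0.5 * (logdet + np.trace(np.linalg.solve(Sigma, S)))
        if ll - prev_ll < tol:                # stop when the increase is tiny
            break
        prev_ll = ll
    return Lam, Psi
```

Each sweep increases the observed-data log-likelihood monotonically, which is the stability property both EM and the paper's algorithms share; the paper's point is that this iteration can be extremely slow near Heywood cases (where some uniqueness in Ψ approaches zero), whereas the proposed CM algorithm is not.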