A Study on SMO Algorithm for Solving ϵ-SVR with Non-PSD Kernels

Authors: Xiaojian Zhou, Yizhong Ma

Institution: 1. School of Economics and Management, Nanjing University of Posts and Telecommunications, Nanjing, China; 2. Department of Management Science and Technology, Nanjing University of Science and Technology, Nanjing, China

Abstract: The sequential minimal optimization (SMO) algorithm is effective in solving large-scale support vector machines (SVMs). Existing algorithms all assume that the kernel is positive definite (PD) or positive semi-definite (PSD) and satisfies the Mercer condition. Some kernels, however, such as the sigmoid kernel, which originates from neural networks and has since been extensively used in SVMs, are only conditionally PD under certain circumstances; in addition, in practice it is often difficult to prove whether a kernel is PD or PSD, except for a few well-known kernels. The applicability of existing SMO algorithms is therefore limited. Considering this deficiency of the traditional algorithms, an SMO algorithm for solving ϵ-SVR with non-positive semi-definite (non-PSD) kernels is proposed. Unlike existing algorithms, which must consider four Lagrange multipliers, the algorithm proposed in this article needs to consider only two Lagrange multipliers during implementation. The proposed algorithm simplifies the implementation by expanding the original dual program of ϵ-SVR and solving its KKT conditions, and can thus be easily applied to solving ϵ-SVR with non-PSD kernels. The presented algorithm is evaluated on five benchmark problems and one real-world problem. The results show that ϵ-SVR with a non-PSD kernel provides more accurate prediction than with a PD kernel.
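The abstract's claim that the sigmoid kernel need not be PSD can be checked numerically: a kernel is PSD only if every Gram matrix it produces has no negative eigenvalues. The sketch below (illustrative only; the sample points and the parameter choices gamma=1.0, coef0=-1.0 are assumptions, not taken from the paper) builds Gram matrices for the sigmoid kernel k(x, y) = tanh(gamma * <x, y> + coef0) and for the Gaussian RBF kernel, then inspects their smallest eigenvalues.

```python
import numpy as np

rng = np.random.default_rng(0)

# A handful of low-norm sample points (hypothetical data, for illustration only).
X = rng.uniform(-0.5, 0.5, size=(20, 3))

def sigmoid_kernel(X, gamma=1.0, coef0=-1.0):
    """Sigmoid (hyperbolic tangent) kernel: k(x, y) = tanh(gamma * <x, y> + coef0)."""
    return np.tanh(gamma * X @ X.T + coef0)

def rbf_kernel(X, gamma=1.0):
    """Gaussian RBF kernel, which is PSD for any input set."""
    sq = np.sum(X**2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2.0 * X @ X.T
    return np.exp(-gamma * d2)

# With coef0 < 0 and low-norm inputs, the sigmoid Gram matrix has negative
# diagonal entries, so it cannot be PSD; the RBF Gram matrix is always PSD.
K_sig = sigmoid_kernel(X)
K_rbf = rbf_kernel(X)

min_eig_sig = np.linalg.eigvalsh(K_sig).min()
min_eig_rbf = np.linalg.eigvalsh(K_rbf).min()

print(f"sigmoid kernel, smallest eigenvalue: {min_eig_sig:.4f}")  # negative -> non-PSD
print(f"RBF kernel,     smallest eigenvalue: {min_eig_rbf:.4f}")  # >= 0 (up to round-off) -> PSD
```

For other parameter settings (e.g. coef0 >= 0) the sigmoid Gram matrix may or may not be PSD, which is exactly the practical difficulty the abstract points to: outside a few well-known kernels, PSD-ness is hard to establish in advance.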

Keywords: Non-positive semi-definite kernel; Sequential minimal optimization algorithm; Support vector machine; ϵ-Support vector regression