SMO-based pruning methods for sparse least squares support vector machines
Zeng, Xiangyan ; Chen, Xue-wen
Abstract
Solutions of least squares support vector machines (LS-SVMs) are typically non-sparse. Sparseness is usually imposed by subsequently omitting the data points that introduce the smallest training errors and retraining on the remaining data. Such iterative retraining requires more intensive computation than training a single non-sparse LS-SVM. In this paper, we propose a new pruning algorithm for sparse LS-SVMs: the sequential minimal optimization (SMO) method is introduced into the pruning process, and, instead of selecting the pruning points by their training errors, we omit the data points whose removal introduces the smallest change to the dual objective function. This new criterion is computationally efficient. Numerical experiments demonstrate the effectiveness of the proposed method in terms of both computational cost and classification accuracy.
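For orientation, the sketch below illustrates the setting the abstract describes: LS-SVM training by solving the KKT linear system, followed by the classical error-based pruning loop (omit points with the smallest |alpha_i|, which are proportional to the training errors, then retrain). It is a minimal illustration under assumed choices (RBF kernel, pruning fraction, regularization constant), not the paper's method; the paper replaces the error criterion with the minimum change to the dual objective and uses SMO instead of repeatedly re-solving the full system.

    # Minimal sketch of LS-SVM training and classical error-based pruning.
    # Kernel, hyperparameters, and function names are illustrative assumptions.
    import numpy as np

    def rbf_kernel(X, Z, gamma_k=0.5):
        """Gaussian (RBF) kernel matrix between rows of X and Z."""
        d2 = np.sum(X**2, 1)[:, None] + np.sum(Z**2, 1)[None, :] - 2.0 * X @ Z.T
        return np.exp(-gamma_k * d2)

    def train_lssvm(X, y, gamma=10.0):
        """Solve the LS-SVM KKT linear system for (alpha, b):
           [ 0   y^T              ] [ b     ]   [ 0 ]
           [ y   Omega + I/gamma  ] [ alpha ] = [ 1 ]
           with Omega_ij = y_i * y_j * K(x_i, x_j)."""
        n = len(y)
        Omega = np.outer(y, y) * rbf_kernel(X, X)
        A = np.zeros((n + 1, n + 1))
        A[0, 1:] = y
        A[1:, 0] = y
        A[1:, 1:] = Omega + np.eye(n) / gamma
        rhs = np.concatenate(([0.0], np.ones(n)))
        sol = np.linalg.solve(A, rhs)
        return sol[1:], sol[0]  # alpha, b

    def prune_lssvm(X, y, keep_ratio=0.5, drop_per_step=5, gamma=10.0):
        """Classical pruning baseline: repeatedly omit the points with the
        smallest |alpha_i| (proportional to the training error in LS-SVMs)
        and retrain on the remaining data."""
        idx = np.arange(len(y))
        alpha, b = train_lssvm(X, y, gamma)
        while len(idx) > keep_ratio * len(y):
            worst = np.argsort(np.abs(alpha))[:drop_per_step]  # smallest-error points
            idx = np.delete(idx, worst)
            alpha, b = train_lssvm(X[idx], y[idx], gamma)      # full retraining step
        return idx, alpha, b

    if __name__ == "__main__":
        rng = np.random.default_rng(0)
        X = rng.normal(size=(200, 2))
        y = np.sign(X[:, 0] + X[:, 1])
        sv_idx, alpha, b = prune_lssvm(X, y)
        print(f"kept {len(sv_idx)} of {len(y)} training points as support vectors")

The repeated call to train_lssvm inside the loop is exactly the cost the paper targets: each retraining solves an (n+1)-by-(n+1) linear system, whereas the proposed SMO-based pruning updates the solution incrementally and picks pruning points by the smallest change to the dual objective rather than by training error.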
Date
2005-11
Publisher
IEEE (Institute of Electrical and Electronics Engineers)
Keywords
Least squares support vector machine, Pruning, Sequential minimal optimization, Sparseness
Citation
Zeng, X.; Chen, X.-W. SMO-based pruning methods for sparse least squares support vector machines. IEEE Transactions on Neural Networks, 16(6):1541-1546, November 2005.