SMO-based pruning methods for sparse least squares support vector machines

Issue Date: 2005-11
Authors: Zeng, Xiangyan; Chen, Xue-wen
Publisher: IEEE-INST ELECTRICAL ELECTRONICS ENGINEERS INC
Type: Article
Abstract
Solutions of least squares support vector machines (LS-SVMs) are typically nonsparse. Sparseness is usually imposed by omitting the data points that introduce the smallest training errors and retraining on the remaining data; this iterative retraining requires more intensive computation than training a single nonsparse LS-SVM. In this paper, we propose a new pruning algorithm for sparse LS-SVMs: the sequential minimal optimization (SMO) method is introduced into the pruning process, and, instead of selecting pruning points by their training errors, we omit the data points that introduce the smallest changes to the dual objective function. This new criterion is computationally efficient. Numerical experiments demonstrate the effectiveness of the proposed method in terms of computational cost and classification accuracy.
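The abstract contrasts the proposed SMO-based, objective-change criterion with the classic pruning scheme it improves on: repeatedly dropping the point with the smallest training error and retraining. The sketch below illustrates that baseline only (not the paper's SMO algorithm), using the standard LS-SVM classifier formulation in which training reduces to one linear system and each dual coefficient satisfies alpha_i = gamma * e_i, so the smallest |alpha_i| marks the smallest training error. The RBF kernel, hyperparameter values, and function names are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def rbf_kernel(X, Y, sigma=1.0):
    # Gaussian RBF kernel matrix between the rows of X and Y (illustrative choice).
    d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2.0 * sigma ** 2))

def train_ls_svm(X, y, gamma=1.0, sigma=1.0):
    # Standard LS-SVM classifier: solve the (n+1)x(n+1) linear system
    #   [ 0      y^T          ] [b]     [0]
    #   [ y   Omega + I/gamma ] [alpha] [1]
    # with Omega_kl = y_k * y_l * K(x_k, x_l).
    n = len(y)
    Omega = y[:, None] * y[None, :] * rbf_kernel(X, X, sigma)
    A = np.zeros((n + 1, n + 1))
    A[0, 1:] = y
    A[1:, 0] = y
    A[1:, 1:] = Omega + np.eye(n) / gamma
    rhs = np.concatenate(([0.0], np.ones(n)))
    sol = np.linalg.solve(A, rhs)
    return sol[0], sol[1:]  # bias b, dual coefficients alpha

def prune_ls_svm(X, y, keep_frac=0.5, gamma=1.0, sigma=1.0):
    # Error-based pruning baseline: since alpha_i = gamma * e_i,
    # the point with the smallest |alpha_i| contributes the smallest
    # training error; drop it and retrain on the rest, repeating
    # until only keep_frac of the data remains as support vectors.
    idx = np.arange(len(y))
    b, alpha = train_ls_svm(X, y, gamma, sigma)
    while len(idx) > max(2, int(keep_frac * len(y))):
        drop = np.argmin(np.abs(alpha))
        idx = np.delete(idx, drop)
        b, alpha = train_ls_svm(X[idx], y[idx], gamma, sigma)
    return idx, b, alpha
```

After pruning, predictions use only the retained points: `sign(rbf_kernel(X_new, X[idx]) @ (alpha * y[idx]) + b)`. Note the cost the abstract highlights: every pruning step re-solves a full linear system, which is exactly the overhead the paper's SMO-based approach and objective-change criterion aim to reduce.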
Citation
Zeng, X.; Chen, X.-W. SMO-based pruning methods for sparse least squares support vector machines. IEEE Transactions on Neural Networks, 16(6):1541-1546, November 2005.
Items in KU ScholarWorks are protected by copyright, with all rights reserved, unless otherwise indicated.