Title: SMO-based pruning methods for sparse least squares support vector machines
Authors: Zeng, Xiangyan; Chen, Xue-wen
Type: Article (open access)
Issued: November 2005
Deposited: 2007-05-03
Citation: Zeng, X. Y.; Chen, X. W. SMO-based pruning methods for sparse least squares support vector machines. IEEE Transactions on Neural Networks, 16(6): 1541-1546, November 2005.
DOI: 10.1109/TNN.2005.852239
Handle: https://hdl.handle.net/1808/1512
Language: en-US
Keywords: Least squares support vector machine; Pruning; Sequential minimal optimization; Sparseness

Abstract: Solutions of least squares support vector machines (LS-SVMs) are typically nonsparse. Sparseness is usually imposed by iteratively omitting the data points that introduce the smallest training errors and retraining on the remaining data. Such iterative retraining requires more intensive computation than training a single nonsparse LS-SVM. In this paper, we propose a new pruning algorithm for sparse LS-SVMs: the sequential minimal optimization (SMO) method is introduced into the pruning process, and instead of selecting pruning points by their training errors, we omit the data points whose removal introduces the minimum change to the dual objective function. This new criterion is computationally efficient. Numerical experiments demonstrate the effectiveness of the proposed method in terms of both computational cost and classification accuracy.
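
As a rough illustration of the pruning setting the abstract describes, the following minimal Python sketch (not from the paper) trains an LS-SVM classifier by solving its linear system with NumPy and iteratively prunes it using the classical smallest-|alpha| (i.e., smallest training error) criterion that the paper improves upon. The paper's actual contributions, SMO-based retraining and selecting pruning points by the minimum change to the dual objective, are not reproduced here; the RBF kernel, hyperparameters, and function names are illustrative assumptions.

import numpy as np


def rbf_kernel(X1, X2, sigma=1.0):
    # Gaussian (RBF) kernel matrix; sigma is an assumed hyperparameter.
    d2 = ((X1[:, None, :] - X2[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2.0 * sigma ** 2))


def train_lssvm(X, y, gamma=10.0, sigma=1.0):
    # Solve the standard LS-SVM linear system for (alpha, b).
    n = len(y)
    Omega = np.outer(y, y) * rbf_kernel(X, X, sigma)
    A = np.zeros((n + 1, n + 1))
    A[0, 1:] = y
    A[1:, 0] = y
    A[1:, 1:] = Omega + np.eye(n) / gamma
    rhs = np.concatenate(([0.0], np.ones(n)))
    sol = np.linalg.solve(A, rhs)
    return sol[1:], sol[0]  # alpha, b


def prune_lssvm(X, y, keep_ratio=0.5, gamma=10.0, sigma=1.0, step=0.05):
    # Iteratively drop a small fraction of points and retrain until the
    # desired sparsity is reached. Classical criterion: remove the points
    # with smallest |alpha_i|, since alpha_i = gamma * e_i in an LS-SVM.
    # (The paper replaces this criterion and the full retraining step.)
    idx = np.arange(len(y))
    alpha, b = train_lssvm(X, y, gamma, sigma)
    target = int(keep_ratio * len(y))
    while len(idx) > target:
        n_drop = max(1, int(step * len(idx)))
        keep = np.argsort(np.abs(alpha))[n_drop:]
        idx = idx[keep]
        alpha, b = train_lssvm(X[idx], y[idx], gamma, sigma)
    return idx, alpha, b

The point of the sketch is the cost structure the abstract refers to: each pruning step retrains on the surviving data by re-solving a full linear system, which is what makes iterative pruning more expensive than a single nonsparse training run and what motivates the SMO-based approach proposed in the paper.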