dc.contributor.author: Zeng, Xiangyan
dc.contributor.author: Chen, Xue-wen
dc.date.accessioned: 2007-05-03T23:45:48Z
dc.date.available: 2007-05-03T23:45:48Z
dc.date.issued: 2005-11
dc.identifier.citation: Zeng, XY; Chen, XW. SMO-based pruning methods for sparse least squares support vector machines. IEEE Transactions on Neural Networks. November 2005. 16(6): 1541-1546
dc.identifier.other: Digital Object Identifier 10.1109/TNN.2005.852239
dc.identifier.uri: http://hdl.handle.net/1808/1512
dc.description.abstract: Solutions of least squares support vector machines (LS-SVMs) are typically nonsparse. Sparseness is imposed by subsequently omitting the data that introduce the smallest training errors and retraining on the remaining data. Iterative retraining requires more intensive computation than training a single nonsparse LS-SVM. In this paper, we propose a new pruning algorithm for sparse LS-SVMs: the sequential minimal optimization (SMO) method is introduced into the pruning process; in addition, instead of determining the pruning points by errors, we omit the data points that will introduce minimum changes to a dual objective function. This new criterion is computationally efficient. The effectiveness of the proposed method in terms of computational cost and classification accuracy is demonstrated by numerical experiments.
dc.language.iso: en_US
dc.publisher: IEEE-INST ELECTRICAL ELECTRONICS ENGINEERS INC
dc.subject: Least squares support vector machine
dc.subject: Pruning
dc.subject: Sequential minimal optimization
dc.subject: Sparseness
dc.title: SMO-based pruning methods for sparse least squares support vector machines
dc.type: Article
dc.rights.accessrights: openAccess
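The abstract describes the baseline that the paper improves on: train an LS-SVM, repeatedly drop the points with the smallest training errors, and retrain on what remains. A minimal sketch of that classical error-based pruning loop (not the paper's SMO-based, dual-objective criterion) is below; since in an LS-SVM each multiplier satisfies alpha_i proportional to the training error e_i, the smallest |alpha_i| identify the points to prune. All function names, kernel choices, and parameters are illustrative, not from the paper.

```python
import numpy as np

def rbf_kernel(X, Z, width=1.0):
    """Gaussian RBF kernel matrix: exp(-width * ||x - z||^2)."""
    d2 = (np.sum(X**2, 1)[:, None] + np.sum(Z**2, 1)[None, :]
          - 2.0 * X @ Z.T)
    return np.exp(-width * d2)

def train_lssvm(X, y, C=10.0, width=1.0):
    """Solve the LS-SVM dual linear system:
       [0   y^T        ] [b    ]   [0]
       [y   Omega + I/C] [alpha] = [1]
       with Omega_ij = y_i y_j K(x_i, x_j)."""
    n = len(y)
    Omega = np.outer(y, y) * rbf_kernel(X, X, width) + np.eye(n) / C
    A = np.zeros((n + 1, n + 1))
    A[0, 1:] = y
    A[1:, 0] = y
    A[1:, 1:] = Omega
    rhs = np.zeros(n + 1)
    rhs[1:] = 1.0
    sol = np.linalg.solve(A, rhs)
    return sol[1:], sol[0]          # alpha, bias b

def predict(X_train, y_train, alpha, b, X, width=1.0):
    """Decision: sign(sum_i alpha_i y_i K(x_i, x) + b)."""
    K = rbf_kernel(X, X_train, width)
    return np.sign(K @ (alpha * y_train) + b)

def prune_lssvm(X, y, keep_frac=0.5, C=10.0, width=1.0, drop_frac=0.1):
    """Classical pruning: since alpha_i is proportional to the training
    error e_i, repeatedly drop the points with smallest |alpha_i| and
    retrain, until only keep_frac of the data remains."""
    idx = np.arange(len(y))
    alpha, b = train_lssvm(X[idx], y[idx], C, width)
    target = int(np.ceil(keep_frac * len(y)))
    while len(idx) > target:
        k = max(1, int(drop_frac * len(idx)))
        keep = np.argsort(-np.abs(alpha))[: len(idx) - k]
        idx = idx[np.sort(keep)]
        alpha, b = train_lssvm(X[idx], y[idx], C, width)  # full retrain
    return idx, alpha, b
```

The full retrain inside the loop is exactly the cost the paper targets: each pruning step re-solves an (n+1)-by-(n+1) linear system, which is why replacing retraining with SMO updates and a cheap dual-objective-change criterion pays off.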
