Sparse approximation using least squares support vector machines


Bibliographic Details
Main Authors: Suykens, J.A.K., Lukas, L., Vandewalle, J.
Format: Conference Proceeding
Language: English
Description
Summary: In least squares support vector machines (LS-SVMs) for function estimation, Vapnik's ε-insensitive loss function has been replaced by a cost function which corresponds to a form of ridge regression. In this way, nonlinear function estimation is done by solving a linear set of equations instead of solving a quadratic programming problem. The LS-SVM formulation also involves fewer tuning parameters. However, a drawback is that sparseness is lost in the LS-SVM case. In this paper we investigate imposing sparseness by pruning support values from the sorted support value spectrum which results from the solution to the linear system.
DOI:10.1109/ISCAS.2000.856439
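The LS-SVM formulation and the pruning idea described in the summary can be sketched as follows. This is a minimal illustration, not the paper's implementation: the RBF kernel, the regularization constant `gamma`, the kernel width `sigma`, and the fixed 50% pruning fraction are all assumptions made for the example.

```python
import numpy as np

def rbf_kernel(X1, X2, sigma=1.0):
    # Gaussian RBF kernel matrix (an assumed kernel choice)
    d2 = ((X1[:, None, :] - X2[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2 * sigma**2))

def lssvm_fit(X, y, gamma=10.0, sigma=1.0):
    # LS-SVM training reduces to one linear system instead of a QP:
    # [ 0   1^T          ] [ b     ]   [ 0 ]
    # [ 1   K + I/gamma  ] [ alpha ] = [ y ]
    n = len(y)
    A = np.zeros((n + 1, n + 1))
    A[0, 1:] = 1.0
    A[1:, 0] = 1.0
    A[1:, 1:] = rbf_kernel(X, X, sigma) + np.eye(n) / gamma
    sol = np.linalg.solve(A, np.concatenate(([0.0], y)))
    return sol[0], sol[1:]  # bias b, support values alpha

def prune(X, y, keep_frac=0.5, gamma=10.0, sigma=1.0):
    # Impose sparseness: drop points with the smallest |alpha|
    # (the low end of the sorted support value spectrum), then retrain
    # on the remaining points.
    b, alpha = lssvm_fit(X, y, gamma, sigma)
    keep = np.argsort(-np.abs(alpha))[: int(keep_frac * len(y))]
    return X[keep], y[keep], lssvm_fit(X[keep], y[keep], gamma, sigma)

# Hypothetical usage: noisy 1-D sinc regression
rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, (50, 1))
y = np.sinc(X[:, 0]) + 0.05 * rng.standard_normal(50)
Xs, ys, (b, alpha) = prune(X, y)
print(len(alpha))  # half the training points remain as support vectors
```

In the full LS-SVM solution every training point carries a nonzero support value; the pruning step above is what restores a sparse model, typically applied iteratively with a validation check rather than in a single fixed cut as shown here.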