Extended Least Squares LS–SVM

Authors: József Valyon, Gábor Horváth

Abstract:

Among neural models, Support Vector Machine (SVM) solutions are attracting increasing attention, mostly because they eliminate certain crucial problems involved in neural network construction. The main drawback of the standard SVM is its high computational complexity; to address this, a new technique, the Least Squares SVM (LS–SVM), has recently been introduced. In this paper we present an extended view of Least Squares Support Vector Regression (LS–SVR), which enables us to develop new formulations of and algorithms for this regression technique. Based on manipulating the linear equation set, which embodies all information used by the regression in the learning process, some new methods are introduced to simplify the formulations, speed up the calculations, and/or provide better results.
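
To make the role of this linear equation set concrete, the sketch below shows the standard LS–SVR training step as a single linear solve, following the formulation of Suykens et al. [2]. It is a minimal illustration, not the extended algorithms of this paper; the RBF kernel choice, the function names (lssvr_fit, lssvr_predict), and the hyperparameter values are assumptions made for the example.

import numpy as np

def rbf_kernel(A, B, sigma=1.0):
    # Gaussian (RBF) kernel matrix between the rows of A and B.
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(axis=2)
    return np.exp(-d2 / (2.0 * sigma ** 2))

def lssvr_fit(X, y, gamma=10.0, sigma=1.0):
    # Solve the (n+1) x (n+1) LS-SVR linear system
    #   [ 0   1^T         ] [ b     ]   [ 0 ]
    #   [ 1   K + I/gamma ] [ alpha ] = [ y ]
    # in which the entire training set enters through the kernel matrix K.
    n = X.shape[0]
    A = np.zeros((n + 1, n + 1))
    A[0, 1:] = 1.0
    A[1:, 0] = 1.0
    A[1:, 1:] = rbf_kernel(X, X, sigma) + np.eye(n) / gamma
    rhs = np.concatenate(([0.0], y))
    sol = np.linalg.solve(A, rhs)
    return sol[0], sol[1:]          # bias b and dual coefficients alpha

def lssvr_predict(X_train, b, alpha, X_new, sigma=1.0):
    # f(x) = sum_i alpha_i * k(x, x_i) + b
    return rbf_kernel(X_new, X_train, sigma) @ alpha + b

# Usage: regression of a noisy sinc function, a common LS-SVM test problem.
rng = np.random.default_rng(0)
X = np.linspace(-5.0, 5.0, 50).reshape(-1, 1)
y = np.sinc(X).ravel() + 0.05 * rng.standard_normal(50)
b, alpha = lssvr_fit(X, y)
print(lssvr_predict(X, b, alpha, np.array([[0.0]])))

Solving this dense system directly costs O(n^3) time in the number of training samples, which illustrates the kind of computational burden that the simplifications and speed-ups mentioned in the abstract aim to reduce.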

Keywords: Function estimation, Least Squares Support Vector Machines, Regression, System Modeling

Digital Object Identifier (DOI): 10.5281/zenodo.1071696


References:


[1] V. Vapnik, "The Nature of Statistical Learning Theory", New York: Springer-Verlag, 1995.
[2] J. A. K. Suykens, T. Van Gestel, J. De Brabanter, B. De Moor, and J. Vandewalle, "Least Squares Support Vector Machines", World Scientific, 2002.
[3] J. A. K. Suykens, L. Lukas, and J. Vandewalle, "Sparse approximation using least squares support vector machines", IEEE International Symposium on Circuits and Systems ISCAS'2000, 2000.
[4] J. A. K. Suykens, L. Lukas, and J. Vandewalle, "Sparse least squares support vector machine classifiers", ESANN'2000 European Symposium on Artificial Neural Networks, 2000, pp. 37-42.
[5] J. A. K. Suykens, J. De Brabanter, L. Lukas, and J. Vandewalle, "Weighted least squares support vector machines: robustness and sparse approximation", Neurocomputing, 2002, pp. 85-105.
[6] B. Schölkopf and A. Smola, "Learning with Kernels: Support Vector Machines, Regularization, Optimization, and Beyond", The MIT Press, Cambridge, MA, 2002.
[7] J. Valyon and G. Horváth, "A generalized LS-SVM", SYSID'2003, Rotterdam, 2003, pp. 827-832.
[8] J. Valyon and G. Horváth, "A Sparse Least Squares Support Vector Machine Classifier", Proceedings of the International Joint Conference on Neural Networks IJCNN 2004, 2004, pp. 543-548.
[9] P. W. Holland and R. E. Welsch, "Robust Regression Using Iteratively Reweighted Least-Squares", Communications in Statistics: Theory and Methods, A6, 1977, pp. 813-827.
[10] P. J. Huber, Robust Statistics, Wiley, 1981.
[11] B. Schölkopf, S. Mika, C. J. C. Burges, P. Knirsch, K.-R. Müller, G. Rätsch, and A. Smola, "Input space vs. feature space in kernel-based methods", IEEE Transactions on Neural Networks, vol. 10, no. 5, 1999, pp. 1000-1017.
[12] G. Baudat and F. Anouar, "Kernel-based methods and function approximation", International Joint Conference on Neural Networks, Washington DC, July 15-19, 2001, pp. 1244-1249.
[13] Yuh-Jye Lee and O. L. Mangasarian, "RSVM: Reduced Support Vector Machines", Proceedings of the First SIAM International Conference on Data Mining, Chicago, April 5-7, 2001.
[14] W. H. Press, S. A. Teukolsky, W. T. Vetterling, and B. P. Flannery, "Numerical Recipes in C", Cambridge University Press, Books On-Line, Available: www.nr.com, 2002.
[15] G. H. Golub and C. F. Van Loan, "Matrix Computations", Johns Hopkins University Press, 1989.