Approximation Incremental Training Algorithm Based on a Changeable Training Set

Authors: Yi-Fan Zhu, Wei Zhang, Xuan Zhou, Qun Li, Yong-Lin Lei

Abstract:

Quick training algorithms and the accurate solution procedure for incremental learning both aim to improve the efficiency of training support vector regression (SVR), but each has a drawback: the former fail to converge on a changeable training set, while the latter is inefficient on massive datasets. To address these problems, a new training algorithm for changeable training sets, named the Approximation Incremental Training Algorithm (AITA), is proposed. This paper analyzes the cause of the nonconvergence theoretically, discusses the realization of AITA, and finally demonstrates the benefits of AITA in terms of both precision and efficiency.
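
For readers unfamiliar with the setting, the following Python sketch illustrates what a "changeable training set" means for SVR: between training rounds some samples are removed and new ones are appended, and the naive response is a full retrain. It uses scikit-learn's SVR and an invented helper make_samples purely for illustration; it is not the paper's AITA, which is designed precisely to avoid the repeated full retraining shown here.

import numpy as np
from sklearn.svm import SVR

rng = np.random.default_rng(0)

def make_samples(n):
    # Noisy 1-D sinc data as a stand-in for any regression problem.
    X = rng.uniform(-3.0, 3.0, size=(n, 1))
    y = np.sinc(X).ravel() + rng.normal(scale=0.05, size=n)
    return X, y

# Initial training set and SVR model.
X, y = make_samples(200)
model = SVR(kernel="rbf", C=10.0, epsilon=0.01).fit(X, y)

# The training set changes: some old samples are removed, new ones arrive.
keep = rng.random(len(X)) > 0.2      # drop roughly 20% of existing samples
X_add, y_add = make_samples(50)      # 50 incoming samples
X = np.vstack([X[keep], X_add])
y = np.concatenate([y[keep], y_add])

# Baseline reaction to the change: retrain from scratch on the modified set.
# An exact incremental/decremental update, or an approximate scheme such as
# the paper's AITA, would instead reuse the previous solution.
model = SVR(kernel="rbf", C=10.0, epsilon=0.01).fit(X, y)

X_test, y_test = make_samples(100)
print("test MSE after retraining:", np.mean((model.predict(X_test) - y_test) ** 2))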

Keywords: support vector regression, incremental learning, changeable training set, quick training algorithm, accurate solution procedure

Digital Object Identifier (DOI): https://doi.org/10.5281/zenodo.1061870

