Comparative Studies of Support Vector Regression between Reproducing Kernel and Gaussian Kernel

Authors: Wei Zhang, Su-Yan Tang, Yi-Fan Zhu, Wei-Ping Wang

Abstract:

Support vector regression (SVR) has been regarded as a state-of-the-art method for approximation and regression. The importance of the kernel function, known in SVR as the admissible support vector kernel (SV kernel), has motivated many studies on its composition. The Gaussian kernel (RBF) is regarded as the "best" choice of SV kernel by non-experts in SVR, yet there is no evidence beyond its superior performance in some practical applications to support that claim. It is well known that the reproducing kernel (R.K) is also an SV kernel and possesses many important properties, e.g., positive definiteness, the reproducing property, and the ability to compose complex R.Ks from simpler ones. However, only a limited number of R.Ks have explicit forms, and consequently there have been few quantitative comparison studies in practice. In this paper, two R.Ks, i.e., SV kernels, composed from the sum and the product of a translation-invariant kernel in a Sobolev space, are proposed. An exploratory study of the performance of SVR based on a general R.K is presented through a systematic comparison with the RBF kernel using multiple criteria and synthetic problems. The results show that the R.K is an equivalent or even better SV kernel than the RBF kernel for problems with more input variables (more than 5, especially more than 10) and higher nonlinearity.
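
As a rough illustration of the comparison described in the abstract (a minimal sketch, not the authors' experimental setup), the snippet below fits scikit-learn's SVR once with the built-in Gaussian (RBF) kernel and once with reproducing kernels assembled, in the spirit of the paper's construction, from the sum and the product of a simple translation-invariant kernel whose RKHS is a Sobolev space: the first-order Sobolev kernel k(s, t) = exp(-|s - t|). The synthetic test function, hyperparameters, and error metric are illustrative assumptions.

# Minimal sketch: SVR with Gaussian (RBF) kernel vs. SVR with R.Ks built
# by sum and product composition of a translation-invariant Sobolev kernel.
# Kernel choices, test function, and hyperparameters are assumptions made
# for illustration; they are not the paper's exact setup.
import numpy as np
from sklearn.svm import SVR
from sklearn.metrics import mean_squared_error
from sklearn.model_selection import train_test_split

def product_rk(X, Y):
    """Product-composed R.K: prod_d exp(-|x_d - y_d|)."""
    # Pairwise |x_d - y_d| for all rows -> shape (n_X, n_Y, n_dim).
    diff = np.abs(X[:, None, :] - Y[None, :, :])
    # Product of exp(-|.|) over dimensions equals exp of the negative sum.
    return np.exp(-diff.sum(axis=2))

def sum_rk(X, Y):
    """Sum-composed R.K: sum_d exp(-|x_d - y_d|), also positive definite."""
    diff = np.abs(X[:, None, :] - Y[None, :, :])
    return np.exp(-diff).sum(axis=2)

# Synthetic nonlinear test problem with 10 input variables.
rng = np.random.default_rng(0)
X = rng.uniform(-1.0, 1.0, size=(400, 10))
y = np.sin(3 * X[:, 0]) + X[:, 1] * X[:, 2] + np.exp(-X.sum(axis=1) ** 2)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

for name, kernel in [("RBF", "rbf"), ("product R.K", product_rk), ("sum R.K", sum_rk)]:
    # A callable kernel returning the Gram matrix is an admissible SV kernel
    # as far as the solver is concerned.
    model = SVR(kernel=kernel, C=10.0, epsilon=0.01).fit(X_tr, y_tr)
    rmse = mean_squared_error(y_te, model.predict(X_te)) ** 0.5
    print(f"{name:12s} test RMSE: {rmse:.4f}")

Because scikit-learn accepts any callable that returns the Gram matrix, the same harness can be reused to compare other explicit R.Ks against the RBF baseline on problems of varying dimension and nonlinearity, mirroring the multi-criteria comparison the paper reports.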

Keywords: admissible support vector kernel, reproducing kernel, reproducing kernel Hilbert space, support vector regression.

Digital Object Identifier (DOI): https://doi.org/10.5281/zenodo.1071114

