Motivated Support Vector Regression with Structural Prior Knowledge

Authors: Wei Zhang, Yao-Yu Li, Yi-Fan Zhu, Qun Li, Wei-Ping Wang

Abstract:

It is known that incorporating prior knowledge into support vector regression (SVR) can improve approximation performance. Most research has focused on incorporating knowledge in the form of numerical relationships; little work, however, has addressed prior knowledge about the structural relationships among the variables (referred to as Structural Prior Knowledge, SPK). This paper explores the incorporation of SPK into SVR by constructing appropriate admissible support vector kernels (SV kernels) based on the properties of reproducing kernels (R.K.). Three levels of SPK specification are studied, together with the corresponding sub-levels of prior knowledge the method can accommodate: Hierarchical SPK (HSPK); Interactional SPK (ISPK), consisting of independence, global interaction, and local interaction; and Functional SPK (FSPK), composed of exterior-FSPK and interior-FSPK. A convenient tool for describing SPK, the SPK Description Matrix, is introduced. Subsequently, a new SVR, Motivated Support Vector Regression (MSVR), whose structure is motivated in part by SPK, is proposed. Synthetic examples show that a wide variety of SPK can be incorporated and that doing so improves approximation performance in complex cases. The benefits of MSVR are finally demonstrated on a real-life military application, air-to-ground battle simulation, which shows great potential for MSVR in complex military applications.
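The paper's specific kernel constructions are not reproduced here, but the general idea of encoding structural prior knowledge in an SVR kernel can be sketched as follows. This is an illustrative assumption, not the authors' method: a binary "description matrix" (a stand-in for the SPK Description Matrix) assigns input dimensions to groups, and an additive kernel built from per-group RBF sub-kernels encodes the prior that the groups do not interact. Since a sum of positive semi-definite kernels is positive semi-definite, the composite kernel remains admissible.

```python
import numpy as np
from sklearn.svm import SVR

# Hypothetical description matrix: rows = variable groups, columns = input
# dimensions; a 1 marks membership. Here dimensions {0, 1} form one group
# and {2} another, encoding the prior that the two groups are independent.
D = np.array([[1, 1, 0],
              [0, 0, 1]])

def rbf(A, B, gamma=1.0):
    # Standard RBF (Gaussian) kernel between the rows of A and B.
    sq_dists = ((A[:, None, :] - B[None, :, :]) ** 2).sum(axis=-1)
    return np.exp(-gamma * sq_dists)

def structured_kernel(A, B):
    # Additive composition: one RBF sub-kernel per variable group,
    # each seeing only its own group's dimensions.
    K = np.zeros((A.shape[0], B.shape[0]))
    for group in D:
        idx = group.astype(bool)
        K += rbf(A[:, idx], B[:, idx])
    return K

# Toy target that actually has the assumed additive structure:
# f(x) = g(x0, x1) + h(x2).
rng = np.random.default_rng(0)
X = rng.uniform(-1.0, 1.0, size=(200, 3))
y = np.sin(X[:, 0] * X[:, 1]) + X[:, 2] ** 2

# scikit-learn's SVR accepts a callable returning the Gram matrix.
model = SVR(kernel=structured_kernel, C=10.0).fit(X, y)
```

Interactional priors could be expressed analogously by multiplying sub-kernels over interacting groups instead of summing, since products of admissible kernels are also admissible.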

Keywords: admissible support vector kernel, reproducing kernel, structural prior knowledge, motivated support vector regression

Digital Object Identifier (DOI): https://doi.org/10.5281/zenodo.1078891

