Kernel’s Parameter Selection for Support Vector Domain Description

Authors: Mohamed EL Boujnouni, Mohamed Jedra, Noureddine Zahid

Abstract:

Support Vector Domain Description (SVDD) is one of the best-known one-class support vector learning methods. It uses a ball defined in feature space to distinguish a set of normal data from all other possible abnormal objects. As with all kernel-based learning algorithms, its performance depends heavily on the proper choice of the kernel parameter. This paper proposes a new approach to selecting the kernel parameter, based on maximizing the distance between the gravity centers of the normal and abnormal classes while simultaneously minimizing the variance within each class. The performance of the proposed algorithm is evaluated on several benchmarks, and the experimental results demonstrate the feasibility and effectiveness of the presented method.
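The abstract does not give the closed-form objective, but the criterion it describes can be computed entirely through kernel evaluations: the squared distance between the feature-space gravity centers of two samples is mean(K++) + mean(K−−) − 2·mean(K+−), and the within-class variance is trace(K)/n − mean(K). The following is a minimal Python sketch under the assumption of a Gaussian RBF kernel and a Fisher-like ratio (center distance over summed variances); the function names (separability, select_rbf_width) are illustrative and not taken from the paper.

```python
# Sketch of a kernel-width selection criterion of the kind the abstract
# describes; the paper's exact objective may differ. Assumes an RBF kernel.
import numpy as np


def rbf_kernel(A, B, sigma):
    """Gaussian RBF kernel matrix: K[i, j] = exp(-||a_i - b_j||^2 / (2 * sigma^2))."""
    sq_dists = ((A[:, None, :] - B[None, :, :]) ** 2).sum(axis=-1)
    return np.exp(-sq_dists / (2.0 * sigma ** 2))


def center_and_variance(K):
    """Squared norm of the class gravity center and the within-class variance,
    both obtained purely from kernel evaluations (kernel trick)."""
    n = K.shape[0]
    center_sq = K.mean()                    # ||m||^2 = (1/n^2) sum_ij k(x_i, x_j)
    variance = np.trace(K) / n - center_sq  # (1/n) sum_i ||phi(x_i) - m||^2
    return center_sq, variance


def separability(X_pos, X_neg, sigma):
    """Distance between the feature-space gravity centers of the two classes,
    penalized by the variance within each class (larger is better)."""
    m_pos_sq, var_pos = center_and_variance(rbf_kernel(X_pos, X_pos, sigma))
    m_neg_sq, var_neg = center_and_variance(rbf_kernel(X_neg, X_neg, sigma))
    cross = rbf_kernel(X_pos, X_neg, sigma).mean()  # <m_pos, m_neg>
    center_dist_sq = m_pos_sq + m_neg_sq - 2.0 * cross
    return center_dist_sq / (var_pos + var_neg + 1e-12)


def select_rbf_width(X_pos, X_neg, sigmas):
    """Grid search: return the kernel width maximizing the separability criterion."""
    scores = [separability(X_pos, X_neg, s) for s in sigmas]
    return sigmas[int(np.argmax(scores))]


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    normal = rng.normal(0.0, 1.0, size=(100, 2))   # "normal" (target) class
    abnormal = rng.normal(4.0, 1.5, size=(40, 2))  # "abnormal" (outlier) class
    grid = np.logspace(-2, 2, 40)                  # coarse logarithmic width grid
    print("selected sigma:", select_rbf_width(normal, abnormal, grid))
```

A coarse logarithmic grid of widths is a reasonable starting point; the selected width would then be used when training the SVDD itself.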

Keywords: Gravity centers, Kernel parameter, Support Vector Domain Description, Variance.

Digital Object Identifier (DOI): https://doi.org/10.5281/zenodo.1088902

