An Experimental Study of a Self-Supervised Classifier Ensemble

Authors: Neamat El Gayar

Abstract:

Learning from labeled and unlabeled data has received a considerable amount of attention in the machine learning community due to its potential for reducing the need for expensive labeled data. In this work, we present a new method for combining labeled and unlabeled data based on classifier ensembles. The proposed model assumes that each classifier in the ensemble observes the input through a different set of features. The classifiers are initially trained on a set of labeled samples. They then learn further by labeling the unknown patterns using a teaching signal generated from the decision of the classifier ensemble, i.e., the classifiers self-supervise each other. Experiments on a set of object images are presented, investigating different classifier models, different fusion techniques, different training set sizes, and different input features. The experimental results reveal that the proposed self-supervised ensemble learning approach reduces the classification error relative to both the single classifier and the traditional ensemble classifier approach.

Keywords: Multiple Classifier Systems, classifier ensembles, learning using labeled and unlabeled data, K-nearest neighbor classifier, Bayes classifier.
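
As a rough illustration of the self-supervision loop outlined in the abstract, the sketch below trains one classifier per feature view, fuses the per-view posteriors into a teaching signal, and feeds confidently labeled patterns back into the training set. It assumes scikit-learn's KNeighborsClassifier and GaussianNB as ensemble members, a random partition of the features into views, mean-rule fusion, and a 0.9 confidence threshold; these are illustrative choices, not the paper's exact configuration.

```python
# Minimal sketch of a self-supervised classifier ensemble (assumed setup:
# scikit-learn k-NN and naive Bayes members, random feature views,
# mean-rule fusion, 0.9 confidence threshold).
import numpy as np
from sklearn.naive_bayes import GaussianNB
from sklearn.neighbors import KNeighborsClassifier

def self_supervised_ensemble(X_lab, y_lab, X_unlab, n_views=3, n_rounds=5):
    """Train one classifier per feature view; the fused ensemble decision
    acts as a teaching signal that labels the unlabeled patterns."""
    rng = np.random.default_rng(0)
    # Each classifier observes the input through a different feature subset.
    views = np.array_split(rng.permutation(X_lab.shape[1]), n_views)
    clfs = [KNeighborsClassifier(n_neighbors=3) if i % 2 == 0 else GaussianNB()
            for i in range(n_views)]

    X_train, y_train = X_lab.copy(), y_lab.copy()
    pool = X_unlab.copy()
    for _ in range(n_rounds):
        # 1. Train every view-specific classifier on the current labeled set.
        for clf, view in zip(clfs, views):
            clf.fit(X_train[:, view], y_train)
        if len(pool) == 0:
            break
        # 2. Fuse the per-view posteriors (mean rule) into a teaching signal.
        probs = np.mean([clf.predict_proba(pool[:, view])
                         for clf, view in zip(clfs, views)], axis=0)
        pseudo = probs.argmax(axis=1)
        confident = probs.max(axis=1) > 0.9   # keep only confident labels
        if not confident.any():
            break
        # 3. Move confidently labeled patterns into the training set.
        X_train = np.vstack([X_train, pool[confident]])
        y_train = np.concatenate([y_train,
                                  clfs[0].classes_[pseudo[confident]]])
        pool = pool[~confident]

    # Final refit on the augmented labeled set.
    for clf, view in zip(clfs, views):
        clf.fit(X_train[:, view], y_train)
    return clfs, views
```

At prediction time, the same mean-rule fusion over the per-view posteriors yields the ensemble decision; majority voting or other fusion schemes could be substituted in the same place.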

Digital Object Identifier (DOI): https://doi.org/10.5281/zenodo.1085163

