A Kernel Based Rejection Method for Supervised Classification

Authors: Abdenour Bounsiar, Edith Grall, Pierre Beauseroy

Abstract:

In this paper we consider classification problems with a performance constraint on the error probability. In such problems, if the constraint cannot be satisfied, a rejection option is introduced. For binary classification, a number of SVM-based methods with a rejection option have been proposed in recent years, all of which apply two thresholds to the SVM output. However, in previous work we have shown on synthetic data that thresholding the output of the optimal SVM can yield poor results for classification tasks with a performance constraint. In this paper a new method for supervised classification with a rejection option is proposed. It consists of two classifiers jointly optimized to minimize the rejection probability subject to a given constraint on the error rate. The method relies on a new kernel-based linear learning machine that we recently introduced; its simplicity and high training speed make the simultaneous optimization of the two classifiers computationally tractable. The proposed method is compared with an SVM-based rejection method from the recent literature, and experiments show the superiority of the proposed method.
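The two-threshold rejection rule mentioned in the abstract, together with the idea of choosing thresholds that minimize the rejection rate under an error-rate constraint, can be sketched as follows. This is a minimal illustration with hypothetical function names and a naive grid search over a single classifier's scores; it is not the paper's actual joint optimization of two classifiers.

```python
import numpy as np

def classify_with_reject(scores, t_low, t_high):
    """Two-threshold reject rule on a real-valued classifier output
    (e.g. an SVM decision value): predict class -1 below t_low,
    class +1 above t_high, and reject (label 0) in between."""
    s = np.asarray(scores, dtype=float)
    return np.where(s <= t_low, -1, np.where(s >= t_high, 1, 0))

def pick_thresholds(scores, y, max_error):
    """Grid-search the threshold pair that minimizes the rejection rate
    subject to an error-rate constraint, both estimated on held-out data.
    Returns (t_low, t_high, rejection_rate)."""
    grid = np.unique(scores)
    best = (None, None, 1.0)
    for lo in grid:
        for hi in grid:
            if hi < lo:
                continue
            pred = classify_with_reject(scores, lo, hi)
            accepted = pred != 0
            if not accepted.any():
                continue
            err = np.mean(pred[accepted] != y[accepted])  # error on accepted points
            rej = 1.0 - accepted.mean()                   # fraction rejected
            if err <= max_error and rej < best[2]:
                best = (lo, hi, rej)
    return best
```

With a zero-error constraint, the search widens the reject band just enough to exclude the ambiguous points near the decision boundary, which is the error-reject trade-off (Chow's rule) the paper builds on.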

Keywords: rejection, Chow's rule, error-reject tradeoff, Support Vector Machine.

Digital Object Identifier (DOI): https://doi.org/10.5281/zenodo.1083045


References:


[1] A. Bounsiar, P. Beauseroy, and E. Grall, "A straightforward SVM approach for classification with constraints," in Proceedings of EUSIPCO-05, Antalya, Turkey, September 2005.
[2] E. Grall, P. Beauseroy, and A. Bounsiar, "Classification avec contraintes : problématique et apprentissage d'une règle de décision par SVM," in Proceedings of GRETSI-05, Louvain-la-Neuve, Belgium, 2005.
[3] T. Ha, "The optimum class-selective rejection rule," IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 19, no. 6, pp. 608-615, 1997.
[4] T. Horiuchi, "Class-selective rejection rule to minimize the maximum distance between selected classes," Pattern Recognition, vol. 31, no. 10, pp. 579-588, 1998.
[5] E. Grall, P. Beauseroy, and A. Bounsiar, "Multilabel classification rule with performance constraints," to appear in Proceedings of ICASSP-06, Toulouse, France, May 14-19, 2006.
[6] C. K. Chow, "An optimum character recognition system using decision functions," IEEE Transactions on Electronic Computers, vol. EC-6, pp. 247-254, December 1957.
[7] C. K. Chow, "On optimum error and reject trade-off," IEEE Transactions on Information Theory, vol. 16, pp. 41-46, 1970.
[8] G. Fumera, F. Roli, and G. Giacinto, "Reject option with multiple thresholds," Pattern Recognition, vol. 33, no. 12, pp. 2099-2101, 2000.
[9] G. Fumera and F. Roli, "Analysis of error-reject trade-off in linearly combined multiple classifiers," Pattern Recognition, vol. 37, no. 6, pp. 1245-1265, 2004.
[10] M. Golfarelli, D. Maio, and D. Maltoni, "On the error-reject trade-off in biometric verification systems," IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 19, no. 7, pp. 789-796, 1997.
[11] L. K. Hansen, C. Liisberg, and P. Salamon, "The error-reject tradeoff," Open Systems and Information Dynamics, vol. 4, pp. 159-185, 1997.
[12] G. Fumera and F. Roli, "Support vector machines with embedded rejection option," in: S. Lee and A. Verri (eds), Pattern Recognition with Support Vector Machines, Lecture Notes in Computer Science, vol. 2388, pp. 68-82, Springer, Berlin Heidelberg New York, 2002.
[13] S. Mukherjee, P. Tamayo, D. Slonim, A. Verri, T. Golub, J.P. Mesirov, and T. Poggio, "Support vector machine classification of microarray data," AI Memo 1677, Massachusetts Institute of Technology, 1999.
[14] J. C. Platt, "Probabilistic outputs for support vector machines and comparisons to regularized likelihood methods," in: A. J. Smola, P. L. Bartlett, B. Schölkopf, and D. Schuurmans (eds), Advances in Large Margin Classifiers, pp. 61-74, MIT Press, 2000.
[15] J.T. Kwok, "Moderating the outputs of support vector machine classifiers," IEEE Transactions on Neural Networks, vol. 10, pp. 1018-1031, 1999.
[16] F. Tortorella, "Reducing the classification cost of support vector classifiers through an ROC-based reject rule," Pattern Analysis and Applications, vol. 7, pp. 128-143, 2004.
[17] A. Bounsiar, E. Grall, and P. Beauseroy, "Using SVM for binary classification with first type error constraint," in Proceedings of ICSIT-05, Algiers, Algeria, July 2005, pp. 494-499.
[18] A. Bounsiar, P. Beauseroy, and E. Grall, "Fast training and efficient linear learning machine," to appear in Proceedings of ICASSP-06, Toulouse, France, May 14-19, 2006.
[19] V. Vapnik, The Nature of Statistical Learning Theory, Springer-Verlag, 1995.
[20] V. Vapnik, Statistical Learning Theory, Wiley, 1998.
[21] N. Cristianini and J. Shawe-Taylor, Support Vector Machines and Other Kernel-Based Learning Methods, Cambridge University Press, 2000.
[22] C. J. C. Burges, "A tutorial on support vector machines for pattern recognition," Data Mining and Knowledge Discovery, vol. 2, no. 2, pp. 121-167, 1998.
[23] C. Cortes and V. Vapnik, "Support-vector networks," Machine Learning, vol. 20, pp. 273-297, 1995.
[24] B. Schölkopf and A. J. Smola, Learning with Kernels, MIT Press, Cambridge, MA, 2002.
[25] C. J. Van Rijsbergen, Information Retrieval, Butterworths, London, 1979.
[26] J. A. Swets, Information Retrieval Systems, Bolt, Beranek and Newman. Cambridge, Massachusetts, 1967.
[27] C. Drummond and R. Holte, "What ROC curves can't do (and cost curves can)," in Proceedings of the ROC Analysis in Artificial Intelligence, 1st International Workshop, Valencia, Spain, August 22, 2004, pp. 19-26.
[28] K. Fukunaga, Introduction to Statistical Pattern Recognition, Academic Press, New York, 2nd edition, 1990.