The Labeled Classification and its Application

Authors: M. Nemissi, H. Seridi, H. Akdag

Abstract:

This paper presents and evaluates a new classification method that aims to improve classifier performance and speed up the training process. The proposed approach, called labeled classification, seeks to improve the convergence of the back-propagation (BP) algorithm by adding an extra feature (a label) to every training example. To classify a new example, tests are carried out on each label. The main advantage of this approach is its simplicity of implementation, since no modification of the training algorithms is required; it can therefore be combined with other acceleration and stabilization techniques. In this work, two models of labeled classification are proposed: the LMLP (Labeled Multi-Layer Perceptron) and the LNFC (Labeled Neuro-Fuzzy Classifier). These models are tested on the Iris, Wine, texture, and human-thigh databases to evaluate their performance.
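
The abstract leaves the mechanics of the per-label test implicit. The sketch below shows one plausible reading in Python: every training example is augmented with a candidate-label feature, an unmodified network learns to score whether the candidate matches the true class, and a new example is classified by testing each candidate label and keeping the best-scoring one. The helper names (make_labeled_set, predict_by_label_tests) and the scikit-learn MLP setup are illustrative assumptions, not the authors' implementation.

```python
# A minimal sketch of the labeled-classification idea, under the assumptions
# stated above; not the paper's actual LMLP/LNFC code.
import numpy as np
from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier

def make_labeled_set(X, y, n_classes):
    """Augment every example with each candidate label as an extra feature;
    the target is 1 when the candidate equals the true class, 0 otherwise."""
    Xs, ts = [], []
    for lbl in range(n_classes):
        Xs.append(np.hstack([X, np.full((len(X), 1), lbl)]))
        ts.append((y == lbl).astype(int))
    return np.vstack(Xs), np.concatenate(ts)

def predict_by_label_tests(net, X, n_classes):
    """Classify by testing each candidate label and keeping the one
    the network scores highest."""
    scores = np.column_stack([
        net.predict_proba(np.hstack([X, np.full((len(X), 1), lbl)]))[:, 1]
        for lbl in range(n_classes)
    ])
    return scores.argmax(axis=1)

X, y = load_iris(return_X_y=True)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

# Training itself is plain back-propagation on the augmented set; nothing
# in the learning algorithm is modified, which is the point of the approach.
X_aug, t_aug = make_labeled_set(X_tr, y_tr, n_classes=3)
net = MLPClassifier(hidden_layer_sizes=(10,), max_iter=2000, random_state=0)
net.fit(X_aug, t_aug)

print("accuracy:", (predict_by_label_tests(net, X_te, 3) == y_te).mean())
```

On this reading, the label feature turns a multi-class problem into a match/no-match problem, so any BP acceleration or stabilization technique can be applied unchanged.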

Keywords: Artificial neural networks, Fusion of neural network and fuzzy systems, Learning theory, Pattern recognition.

Digital Object Identifier (DOI): https://doi.org/10.5281/zenodo.1074885

