Improved Back Propagation Algorithm to Avoid Local Minima in Multiplicative Neuron Model

Authors: Kavita Burse, Manish Manoria, Vishnu P. S. Kirar

Abstract:

The back propagation algorithm calculates the weight changes of artificial neural networks, and a common approach is to use a training algorithm consisting of a learning rate and a momentum factor. The major drawbacks of this learning algorithm are the problem of local minima and slow convergence speed. The addition of an extra term, called the proportional factor, reduces the convergence time of the back propagation algorithm. We have applied the three-term back propagation to multiplicative neural network learning. The algorithm is tested on the XOR and parity problems and compared with the standard back propagation training algorithm.
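
To make the idea concrete, the following minimal sketch (Python/NumPy) trains a single multiplicative neuron, with aggregation u = prod_i(w_i * x_i + b_i), on XOR using a three-term weight update: a gradient term scaled by the learning rate, a momentum term, and a proportional term scaled by the output error. The sigmoid activation, squared-error cost, hyperparameter values, and initialization below are illustrative assumptions and are not taken from the paper.

import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# XOR training data
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
T = np.array([0.0, 1.0, 1.0, 0.0])

rng = np.random.default_rng(0)
w = rng.uniform(-0.5, 0.5, 2)   # multiplicative weights (one per input)
b = rng.uniform(-0.5, 0.5, 2)   # per-input biases

# Assumed hyperparameters: learning rate, momentum factor, proportional factor.
# These may need tuning; they are not the values used in the paper.
alpha, beta, gamma = 0.5, 0.7, 0.05
prev_dw = np.zeros_like(w)
prev_db = np.zeros_like(b)

for epoch in range(20000):
    for x, t in zip(X, T):
        factors = w * x + b
        u = np.prod(factors)            # multiplicative aggregation
        y = sigmoid(u)
        e = t - y                       # output error
        dy_du = y * (1.0 - y)
        # d(prod)/dw_i = x_i * product of the remaining factors
        grad_u_w = np.array([np.prod(np.delete(factors, i)) * x[i] for i in range(2)])
        grad_u_b = np.array([np.prod(np.delete(factors, i)) for i in range(2)])
        grad_w = -e * dy_du * grad_u_w  # dE/dw for E = 0.5 * e^2
        grad_b = -e * dy_du * grad_u_b
        # Three-term update: gradient descent + momentum + proportional (error) term
        dw = -alpha * grad_w + beta * prev_dw + gamma * e
        db = -alpha * grad_b + beta * prev_db + gamma * e
        w += dw
        b += db
        prev_dw, prev_db = dw, db

for x in X:
    print(x, sigmoid(np.prod(w * x + b)))

Setting gamma to zero recovers the standard two-term (learning rate plus momentum) update, which is the baseline the paper compares against.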

Keywords: Three-term back propagation, multiplicative neural network, proportional factor, local minima.

Digital Object Identifier (DOI): doi.org/10.5281/zenodo.1328734

