Performance Evaluation of Complex Valued Neural Networks Using Various Error Functions
Authors: Anita S. Gangal, P. K. Kalra, D. S. Chauhan
Abstract:
The backpropagation algorithm generally employs a quadratic error function; in fact, most problems that involve minimization use the quadratic error function. The performance of the optimization scheme can, however, be improved with alternative error functions, which help suppress the ill effects of outliers and have shown good robustness to noise. In this paper we evaluate and compare the relative performance of complex valued neural networks trained with different error functions. In the first simulation, for the complex XOR gate, it is observed that some error functions, such as the absolute and Cauchy error functions, can replace the quadratic error function. In the second simulation it is observed that for some error functions the performance of the complex valued neural network depends on the architecture of the network, whereas with a few other error functions the convergence speed is independent of the network architecture.
Keywords: Complex backpropagation algorithm, complex error functions, complex valued neural network, split activation function.
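The abstract does not give the exact formulations used in the paper, so the following is only a minimal Python sketch of the ideas it names: a split activation function applied separately to the real and imaginary parts of a complex signal, and alternative costs (absolute, Cauchy) evaluated on a complex error alongside the quadratic cost. The complex XOR encoding, the weight initialization, and the Cauchy scale parameter c are assumptions made purely for illustration.

import numpy as np

def split_sigmoid(z):
    # Split-type activation: real sigmoid applied separately to Re(z) and Im(z).
    sig = lambda x: 1.0 / (1.0 + np.exp(-x))
    return sig(z.real) + 1j * sig(z.imag)

def quadratic_error(e):
    # Standard quadratic cost on a complex error e = target - output.
    return 0.5 * np.sum(np.abs(e) ** 2)

def absolute_error(e):
    # Absolute-error cost: sum of error magnitudes.
    return np.sum(np.abs(e))

def cauchy_error(e, c=1.0):
    # Cauchy cost (scale c is an assumed value); it grows more slowly than the
    # quadratic cost, which tempers the influence of outliers.
    return np.sum(0.5 * c**2 * np.log(1.0 + (np.abs(e) / c) ** 2))

# Toy forward pass of a single complex-valued neuron on a hypothetical
# complex XOR encoding (inputs x, targets t are assumptions, not the paper's).
rng = np.random.default_rng(0)
x = np.array([0 + 0j, 0 + 1j, 1 + 0j, 1 + 1j])
t = np.array([0 + 0j, 1 + 1j, 1 + 1j, 0 + 0j])
w = rng.standard_normal() + 1j * rng.standard_normal()
b = 0.1 + 0.1j

y = split_sigmoid(w * x + b)
e = t - y
print("quadratic:", quadratic_error(e))
print("absolute: ", absolute_error(e))
print("cauchy:   ", cauchy_error(e))

Running the sketch simply compares the magnitudes the three costs assign to the same complex error vector; in a full training loop each cost would yield a different gradient and hence different convergence behaviour, which is what the paper evaluates.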
Digital Object Identifier (DOI): doi.org/10.5281/zenodo.1077728
References:
[1] B. Widrow, J. McCool, and M. Ball, "The Complex LMS algorithm," Proc. of the IEEE, April 1975.
[2] M. S. Kim and C. C. Guest, "Modification of backpropagation for complex-valued signal processing in frequency domain," Int. Joint Conf. on Neural Networks, San Diego, vol. 3, pp. 27-31, June 1990.
[3] G. M. Georgiou and C. Koutsougeras, "Complex domain backpropagation," IEEE Trans. on Circuits and Systems-II: Analog and Digital Signal Processing, vol. 39, no. 5, May 1992.
[4] A. Hirose, "Dynamics of fully complex-valued neural networks," Electronics Letters, vol. 28, no. 13, pp. 1492-1494.
[5] N. Benvenuto and F. Piazza, "On the complex backpropagation algorithm," IEEE Trans. on Signal Processing, vol. 40, no. 4, April 1992.
[6] J. Wang, "Recurrent neural networks for solving systems of complex valued linear equations," Electronics Letters, vol. 28, no. 18, pp. 1751-1753.
[7] Y. Deville, "A neural network implementation of complex activation function for digital VLSI neural networks," Microelectronics Journal, vol. 24, pp. 259-262.
[8] M. R. Smith and Y. Hui, "A data exploration algorithm using a complex domain neural network," IEEE Trans. on Circuits and Systems-II: Analog and Digital Signal Processing, vol. 22, no. 2.
[9] H. Leung and S. Haykin, "The complex backpropagation algorithm," IEEE Trans. on Signal Processing, vol. 39, no. 9, September 1991.
[10] T. Nitta, "An extension of the back-propagation algorithm to complex numbers," Neural Networks, vol. 10, no. 8, pp. 1391-1415, 1997.
[11] P. J. Werbos and J. Titus, "An empirical test of new forecasting methods derived from a theory of intelligence: The prediction of conflict in Latin America," IEEE Trans. on Systems, Man and Cybernetics, September 1978.
[12] P. E. Gill and M. H. Wright, "Practical optimization," Academic Press, 1981.
[13] W. J. J. Rey, "Introduction to robust and quasi-robust statistical methods," Springer-Verlag, Berlin.
[14] B. Fernandez, "Tools for artificial neural networks learning," Intelligent engineering systems through Artificial Neural Networks, ASME Press, New York, pp. 69-76, 1991.
[15] K. Matsuoka and J. Yi, "Backpropagation based on the logarithmic error function and elimination of local minima," in Proc. of the International Joint Conference on Neural Networks, Singapore, vol. 2, pp. 1117-1122, 1991.
[16] A. van Ooyen and B. Nienhuis, "Improving the convergence of the backpropagation algorithm," Neural Networks, vol. 5, pp. 465-571.
[17] A. Prashanth, "Investigation on complex variable based backpropagation algorithm and applications," Ph.D. Thesis, IIT, Kanpur, India, March 2003.