System Identification with General Dynamic Neural Networks and Network Pruning
Authors: Christian Endisch, Christoph Hackl, Dierk Schröder
Abstract:
This paper presents an exact pruning algorithm with an adaptive pruning interval for general dynamic neural networks (GDNNs). GDNNs are artificial neural networks with internal dynamics: all layers have time-delayed feedback connections to themselves and to all other layers. Because the structure of the plant is unknown, the identification process starts with a larger network architecture than necessary. During parameter optimization with the Levenberg-Marquardt (LM) algorithm, irrelevant weights of the dynamic neural network are deleted in order to find a model of the plant that is as simple as possible. The weights to be pruned are found by direct evaluation of the training data within a sliding time window. The influence of pruning on the identification system depends on the network architecture at pruning time and on the selected weight to be deleted. Since the architecture of the model changes drastically during the identification and pruning process, it is suggested to adapt the pruning interval online. Two system identification examples demonstrate the architecture selection ability of the proposed pruning approach.

Keywords: System identification, dynamic neural network, recurrent neural network, GDNN, optimization, Levenberg-Marquardt, real-time recurrent learning, network pruning, quasi-online learning.
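The abstract's pruning criterion can be illustrated with a short sketch: candidate weights are tested by direct evaluation of the training data in a time window, and the weight whose deletion degrades the windowed error least is removed. This is a minimal, hedged illustration only; the paper's GDNN dynamics, the LM optimization between prunings, and the exact adaptive-interval rule are simplified away, and all names (`window_error`, `prune_one_weight`) are illustrative, not from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy static two-layer network standing in for the oversized model:
# y = w2 @ tanh(w1 @ x). The paper's networks are dynamic (time-delayed
# feedback between all layers); that is omitted so the criterion stays visible.
w1 = rng.normal(size=(8, 1))
w2 = rng.normal(size=(1, 8))

def forward(x, a, b):
    """Network output for a single input sample x."""
    return b @ np.tanh(a @ x)

def window_error(X, Y, a, b):
    """Direct evaluation of the training data in a sliding time window:
    sum of squared output errors over the windowed samples."""
    return sum(float((forward(x, a, b) - y) ** 2) for x, y in zip(X, Y))

def prune_one_weight(X, Y, a, b):
    """Zero the single remaining weight whose deletion raises the
    windowed error the least (exact, brute-force evaluation)."""
    best = None
    for m, w in enumerate((a, b)):
        for idx in map(tuple, np.argwhere(w != 0.0)):
            saved = w[idx]
            w[idx] = 0.0                      # trial deletion
            err = window_error(X, Y, a, b)
            w[idx] = saved                    # restore
            if best is None or err < best[0]:
                best = (err, m, idx)
    err, m, idx = best
    (a, b)[m][idx] = 0.0                      # commit the cheapest deletion
    return err

# Demo: data from a plant simpler than the model, so weights are redundant.
X = [np.array([[v]]) for v in np.linspace(-1.0, 1.0, 25)]
Y = [np.tanh(2.0 * x) for x in X]

active = int(np.count_nonzero(w1) + np.count_nonzero(w2))
errs = []
while active > 4:
    errs.append(prune_one_weight(X, Y, w1, w2))
    active -= 1
```

In the paper, the network is re-optimized with LM between deletions, and the number of optimization steps between prunings (the pruning interval) is adapted online to the current architecture; here the deletions are simply applied back to back.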
Digital Object Identifier (DOI): https://doi.org/10.5281/zenodo.1080760
References:
[1] Y. Le Cun, J. S. Denker, S. A. Solla, "Optimal Brain Damage," in D. S. Touretzky, "Advances in Neural Information Processing Systems," Morgan Kaufmann, 1990, pp. 598-605.
[2] B. Hassibi, D. G. Stork, G. J. Wolff, "Optimal Brain Surgeon and General Network Pruning," IEEE International Conference on Neural Networks, vol. 1, pp. 293-299, April 1993.
[3] R. Reed, "Pruning Algorithms - A Survey," IEEE Transactions on Neural Networks, vol. 4, no. 5, pp. 740-747, September 1993.
[4] M. Attik, L. Bougrain, F. Alexandre, "Optimal Brain Surgeon Variants For Feature Selection," in IEEE Proceedings of the International Joint Conference on Neural Networks, pp. 1371-1374, 2004.
[5] O. De Jesús, M. Hagan, "Backpropagation Algorithms Through Time for a General Class of Recurrent Network," IEEE Int. Joint Conf. Neural Networks, Washington, 2001, pp. 2638-2643.
[6] O. De Jesús, M. Hagan, "Forward Perturbation Algorithm For a General Class of Recurrent Network," IEEE Int. Joint Conf. Neural Networks, Washington, 2001, pp. 2626-2631.
[7] O. De Jesús, "Training General Dynamic Neural Networks," Ph.D. dissertation, Oklahoma State University, Stillwater, OK, 2002.
[8] O. De Jesús, M. Hagan, "Backpropagation Algorithms for a Broad Class of Dynamic Networks," IEEE Transactions on Neural Networks, vol. 18, no. 1, pp. 14-27, January 2007.
[9] M. T. Hagan, M. B. Menhaj, "Training Feedforward Networks with the Marquardt Algorithm," IEEE Transactions on Neural Networks, vol. 5, no. 6, pp. 989-993, November 1994.
[10] L. S. H. Ngia, J. Sjöberg, "Efficient Training of Neural Nets for Nonlinear Adaptive Filtering Using a Recursive Levenberg-Marquardt Algorithm," IEEE Transactions on Signal Processing, vol. 48, no. 7, pp. 1915-1927, July 2000.
[11] P. J. Werbos, "Backpropagation Through Time: What It Does and How to Do It," Proc. IEEE, vol. 78, no. 10, pp. 1550-1560, 1990.
[12] R. J. Williams, D. Zipser, "A Learning Algorithm for Continually Running Fully Recurrent Neural Networks," Neural Computation, vol. 1, pp. 270-280, 1989.
[13] O. Nelles, Nonlinear System Identification. Berlin Heidelberg New York: Springer-Verlag, 2001.
[14] D. Schröder, Elektrische Antriebe - Regelung von Antriebssystemen. 2nd edn., Berlin Heidelberg New York: Springer-Verlag, 2001.
[15] K. S. Narendra, K. Parthasarathy, "Identification and Control of Dynamical Systems Using Neural Networks," IEEE Transactions on Neural Networks, vol. 1, no. 1, pp. 4-27, November 1990.