Avoiding Catastrophic Forgetting by a Dual-Network Memory Model Using a Chaotic Neural Network
Author: Motonobu Hattori
Abstract:
In neural networks, when new patterns are learned, the new information radically interferes with previously stored patterns. This drawback is called catastrophic forgetting or catastrophic interference. In this paper, we propose a biologically inspired neural network model that overcomes this problem. The proposed model consists of two distinct networks: a Hopfield-type chaotic associative memory and a multilayer neural network, which we regard as corresponding to the hippocampus and the neocortex of the brain, respectively. Incoming information is first stored in the hippocampal network by a fast learning algorithm. The stored information is then recalled through the chaotic behavior of the neurons in the hippocampal network, and finally it is consolidated in the neocortical network by means of pseudopatterns. Computer simulation results show that the proposed model avoids catastrophic forgetting far better than conventional models.
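The abstract outlines a three-stage procedure: fast Hebbian storage in a hippocampal associative memory, recall of the stored patterns through chaotic neuron dynamics, and consolidation of those patterns into a neocortical multilayer network with the help of pseudopatterns. The Python sketch below illustrates that data flow under simplifying assumptions: the class names, parameter values, the Aihara-style chaotic update, and the frequency-based selection of recalled patterns are illustrative choices, not the learning rules or settings used in the paper.

```python
import numpy as np
from collections import Counter

rng = np.random.default_rng(0)

class ChaoticHopfield:
    """Hippocampal network: Hopfield-type associative memory whose neurons
    follow an Aihara-style chaotic update (illustrative parameters)."""
    def __init__(self, n, k=0.9, alpha=10.0, a=0.5, eps=0.015):
        self.n, self.W = n, np.zeros((n, n))
        self.k, self.alpha, self.a, self.eps = k, alpha, a, eps

    def store(self, patterns):
        # Fast, one-shot Hebbian learning of +/-1 patterns.
        for p in patterns:
            self.W += np.outer(p, p)
        np.fill_diagonal(self.W, 0.0)
        self.W /= self.n

    def chaotic_recall(self, steps=300):
        # Recurrent feedback (xi) and refractoriness (eta) decay with factor k;
        # the resulting dynamics wander among the stored patterns.
        x = np.sign(rng.standard_normal(self.n))
        xi, eta, visited = np.zeros(self.n), np.zeros(self.n), []
        for _ in range(steps):
            xi = self.k * xi + self.W @ x
            eta = self.k * eta - self.alpha * x + self.a
            x = np.tanh((xi + eta) / self.eps)
            visited.append(np.sign(x))
        return visited

class NeocorticalMLP:
    """Neocortical network: small autoassociative multilayer perceptron
    trained by plain backpropagation (slow, interleaved learning)."""
    def __init__(self, n, hidden=30, lr=0.05):
        self.W1 = rng.normal(0.0, 0.1, (hidden, n)); self.b1 = np.zeros(hidden)
        self.W2 = rng.normal(0.0, 0.1, (n, hidden)); self.b2 = np.zeros(n)
        self.lr = lr

    def forward(self, x):
        h = np.tanh(self.W1 @ x + self.b1)
        return np.tanh(self.W2 @ h + self.b2), h

    def train(self, pairs, epochs=200):
        for _ in range(epochs):
            for x, t in pairs:
                y, h = self.forward(x)
                dy = (y - t) * (1.0 - y**2)
                dh = (self.W2.T @ dy) * (1.0 - h**2)
                self.W2 -= self.lr * np.outer(dy, h); self.b2 -= self.lr * dy
                self.W1 -= self.lr * np.outer(dh, x); self.b1 -= self.lr * dh

    def pseudopatterns(self, m):
        # Probe the current network with random inputs; the resulting
        # input-output pairs approximate what it has already learned.
        xs = np.sign(rng.standard_normal((m, self.W1.shape[1])))
        return [(x, self.forward(x)[0]) for x in xs]

def consolidate(neocortex, new_items, n_pseudo=20):
    # Interleave patterns recalled from the hippocampal network with
    # pseudopatterns from the neocortical network itself, so old knowledge
    # is rehearsed while the new items are absorbed.
    pairs = [(p, p) for p in new_items] + neocortex.pseudopatterns(n_pseudo)
    neocortex.train(pairs)

# Usage: store new patterns, recall them chaotically, then consolidate.
n = 32
new_patterns = [np.sign(rng.standard_normal(n)) for _ in range(3)]
hippocampus = ChaoticHopfield(n)
hippocampus.store(new_patterns)                  # fast hippocampal storage
itinerary = hippocampus.chaotic_recall()         # chaotic recall
counts = Counter(tuple(s) for s in itinerary)    # states visited most often
recalled = [np.array(s) for s, _ in counts.most_common(len(new_patterns))]
neocortex = NeocorticalMLP(n)
consolidate(neocortex, recalled)                 # pseudopattern consolidation
```

The sketch only mirrors the overall flow of the dual-network scheme; the actual hippocampal learning algorithm, the chaotic neural network dynamics, and the pseudopattern generation follow the procedures described in the paper and in the cited work of Aihara et al. and Ans and Rousset.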
Keywords: catastrophic forgetting, chaotic neural network, complementary learning systems, dual-network
Digital Object Identifier (DOI): doi.org/10.5281/zenodo.1084720
References:
[1] R. M. French, "Using semi-distributed representations to overcome catastrophic forgetting in connectionist networks," Proceedings of the 13th Annual Cognitive Science Society Conference, pp.173-178, 1991.
[2] R. M. French, "Dynamically constraining connectionist networks to produce distributed, orthogonal representations to reduce catastrophic interference," Proceedings of the 16th Annual Cognitive Science Society Conference, pp.335-340, 1994.
[3] R. M. French, "Pseudo-recurrent connectionist networks: An approach to the 'sensitivity-stability' dilemma," Connection Science, vol.9, no.4, pp.353-379, 1997.
[4] R. M. French, "Catastrophic forgetting in connectionist networks," Trends in Cognitive Sciences, vol.3, no.4, pp.128-135, 1999.
[5] R. M. French, B. Ans and S. Rousset, "Pseudopatterns and dual-network memory models: Advantages and shortcomings," In Connectionist Models of Learning, Development and Evolution (R. French and J. Sougné, eds.), London, Springer, pp.13-22, 2001.
[6] B. Ans and S. Rousset, "Avoiding catastrophic forgetting by coupling two reverberating neural networks," Académie des Sciences, Sciences de la vie, vol.320, pp.989-997, 1997.
[7] J. McClelland, B. McNaughton and R. O'Reilly, "Why there are complementary learning systems in the hippocampus and neocortex: Insights from the successes and failures of connectionist models of learning and memory," Psychological Review, vol.102, no.3, pp.419-457, 1995.
[8] K. Aihara, T. Takabe and M. Toyoda, "Chaotic Neural Networks," Physics Letters A, vol.144, no.6-7, pp.333-340, 1990.
[9] Y. Osana, M. Hattori and M. Hagiwara, "Chaotic Bidirectional Associative Memory," Proceedings of the International Joint Conference on Neural Networks, Washington D.C., vol.2, pp.816-821, 1996.
[10] G. Buzsáki, "Memory consolidation during sleep: a neurophysiological perspective," Journal of Sleep Research, vol.7, issue S1, pp.17-23, 1998.
[11] Y. Wakagi and M. Hattori, "A Model of Hippocampal Learning with Neuronal Turnover in Dentate Gyrus," International Journal of Mathematics and Computers in Simulation, vol.2, issue 2, pp.215-222, 2008.
[12] R. C. O'Reilly and J. W. Rudy, "Computational Principles of Learning in the Neocortex and Hippocampus," Hippocampus, vol.10, pp.389-397, 2000.
[13] K. A. Norman and R. C. O'Reilly, "Modeling Hippocampal and Neocortical Contributions to Recognition Memory: A Complementary-Learning-Systems Approach," Psychological Review, vol.110, no.4, pp.611-646, 2003.