Dual-Network Memory Model for Temporal Sequences

Authors: Motonobu Hattori, Rina Suzuki

Abstract:

In neural networks, when new patterns are learned, they radically interfere with previously stored patterns; this drawback is called catastrophic forgetting. We have previously proposed a biologically inspired dual-network memory model that greatly reduces this forgetting for static patterns. In this model, information is first stored in the hippocampal network and is thereafter transferred to the neocortical network using pseudopatterns. Because learning temporal sequences is more important than learning static patterns in the real world, in this study we improve our conventional dual-network memory model so that it can deal with temporal sequences without catastrophic forgetting. Computer simulation results show the effectiveness of the proposed dual-network memory model.
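
To make the transfer step concrete, the following is a minimal, illustrative sketch (not the authors' implementation) of pseudopattern-based consolidation for temporal sequences. It assumes an Elman-style recurrent network standing in for the neocortical network, a crude output-weight update rule, and a hippocampal store reduced to simply holding the newest sequence; all names, sizes, and parameters (N_IN, N_HID, ETA, make_pseudosequences, consolidate) are assumptions made for illustration only.

# Minimal sketch of pseudopattern-based consolidation for temporal sequences.
# Assumptions: Elman-style "neocortical" network, output-weight-only updates,
# illustrative sizes and learning rate.
import numpy as np

rng = np.random.default_rng(0)

N_IN, N_HID = 8, 16          # pattern size and hidden/context size (assumed)
ETA = 0.1                    # learning rate (assumed)

# Neocortical network: x(t) and context h(t-1) -> h(t) -> prediction of x(t+1)
W_xh = rng.normal(0, 0.1, (N_HID, N_IN))
W_hh = rng.normal(0, 0.1, (N_HID, N_HID))
W_hy = rng.normal(0, 0.1, (N_IN, N_HID))

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def step(x, h_prev):
    """One forward step of the Elman-style neocortical network."""
    h = np.tanh(W_xh @ x + W_hh @ h_prev)
    y = sigmoid(W_hy @ h)
    return h, y

def train_pair(x, h_prev, target):
    """One gradient step on the output weights only (deliberately crude,
    just enough to illustrate interleaved consolidation)."""
    global W_hy
    h, y = step(x, h_prev)
    W_hy += ETA * np.outer((target - y) * y * (1 - y), h)
    return h

def make_pseudosequences(n_seq, length):
    """Generate pseudo-sequences: drive the current neocortical network with
    random starting patterns and record its own outputs as targets."""
    pseudo = []
    for _ in range(n_seq):
        x = (rng.random(N_IN) > 0.5).astype(float)
        h = np.zeros(N_HID)
        seq = []
        for _ in range(length):
            h, y = step(x, h)
            nxt = (y > 0.5).astype(float)
            seq.append((x, nxt))
            x = nxt
        pseudo.append(seq)
    return pseudo

def consolidate(new_sequence, epochs=50, n_pseudo=5):
    """Transfer a sequence from the 'hippocampal' store to the neocortical
    network, interleaving it with pseudo-sequences so that previously
    consolidated sequences are rehearsed rather than overwritten."""
    pseudo = make_pseudosequences(n_pseudo, len(new_sequence))
    for _ in range(epochs):
        for seq in [list(zip(new_sequence[:-1], new_sequence[1:]))] + pseudo:
            h = np.zeros(N_HID)
            for x, target in seq:
                h = train_pair(x, h, target)

# Example: consolidate two binary temporal sequences one after the other.
seq_a = [(rng.random(N_IN) > 0.5).astype(float) for _ in range(5)]
seq_b = [(rng.random(N_IN) > 0.5).astype(float) for _ in range(5)]
consolidate(seq_a)   # first sequence is consolidated
consolidate(seq_b)   # pseudo-sequences rehearse what was learned from seq_a

The point mirrored here is that the pseudo-sequences are generated by the neocortical network itself, so interleaving them with the new sequence rehearses earlier memories instead of overwriting them.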

Keywords: Catastrophic forgetting, dual-network, temporal sequences.

Digital Object Identifier (DOI): https://doi.org/10.5281/zenodo.1336104

