	%0 Journal Article
	%A Guojian Cheng and Tianshi Liu and Jiaxin Han and Zheng Wang
	%D 2008
	%J International Journal of Computer and Information Engineering
	%B World Academy of Science, Engineering and Technology
	%I Open Science Index 22, 2008
	%T Towards Growing Self-Organizing Neural Networks with Fixed Dimensionality
	%U https://publications.waset.org/pdf/7656
	%V 22
	%X Competitive learning is an adaptive process in which the
neurons in a neural network gradually become sensitive to different
input pattern clusters. Competitive learning is the basic idea behind
Kohonen's Self-Organizing Feature Maps (SOFM). SOFM can generate
mappings from high-dimensional signal spaces to lower-dimensional
topological structures. The main features of such mappings are
topology preservation, feature mapping, and approximation of the
probability distribution of input patterns. To overcome some
limitations of SOFM, e.g., a fixed number of neural units and a
topology of fixed dimensionality, a Growing Self-Organizing Neural
Network (GSONN) can be used. A GSONN can change its topological
structure during learning: it grows by learning and shrinks by
forgetting. To speed up training and convergence, a new GSONN
variant, twin growing cell structures (TGCS), is presented here.
This paper first gives an introduction to competitive learning, SOFM,
and its variants. Then, we discuss GSONNs with fixed dimensionality,
including growing cell structures, their variants, and the authors'
model: TGCS. The paper ends with a comparison of test results and
conclusions.
	%P 3564 - 3568