WASET
	%0 Journal Article
	%A Yahya H. Zweiri
	%D 2007
	%J International Journal of Electrical and Computer Engineering
	%B World Academy of Science, Engineering and Technology
	%I Open Science Index 12, 2007
	%T Optimization of a Three-Term Backpropagation Algorithm Used for Neural Network Learning
	%U https://publications.waset.org/pdf/14541
	%V 12
	%X The back-propagation (BP) algorithm calculates the weight changes of an artificial neural network, and a two-term algorithm with a dynamically optimized learning rate and a momentum factor is commonly used. Recently, the addition of an extra term, called a proportional factor (PF), to the two-term BP algorithm was proposed. The third term increases the speed of the BP algorithm. However, the PF term can also degrade the convergence of the BP algorithm, and optimization approaches for evaluating the learning parameters are required to facilitate the application of the three-term BP algorithm. This paper considers the optimization of the new back-propagation algorithm using derivative information. A family of approaches exploiting the derivatives with respect to the learning rate, momentum factor, and proportional factor is presented. These approaches autonomously compute the derivatives in weight space using information gathered from the forward and backward procedures. The three-term BP algorithm and the optimization approaches are evaluated on the benchmark XOR problem.
	%P 1850 - 1855
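
For orientation, the three-term weight update described in the abstract above can be sketched in Python. This is a minimal illustration, not the paper's exact formulation: the symbols alpha (learning rate), beta (momentum factor), and gamma (proportional factor), and the use of the output error as the proportional-factor term, are assumptions drawn from the abstract's description. The paper's actual contribution, choosing these parameters from derivative information rather than fixing them, is not reproduced here.

    import numpy as np

    def three_term_update(w, grad, prev_delta, err, alpha=0.1, beta=0.5, gamma=0.01):
        """One illustrative step of a three-term BP update rule.

        w          : current weights (array)
        grad       : dE/dw obtained from the backward pass
        prev_delta : previous weight change (momentum term)
        err        : output-error term used as the proportional factor input
                     (assumed interpretation of the PF; shapes are kept equal
                     to w purely for illustration)
        alpha, beta, gamma : learning rate, momentum factor, proportional factor
        """
        delta = -alpha * grad + beta * prev_delta + gamma * err
        return w + delta, delta

    # Example call with illustrative shapes and values only.
    w = np.zeros(3)
    grad = np.array([0.2, -0.1, 0.05])
    prev_delta = np.zeros(3)
    err = np.full(3, 0.4)
    w, prev_delta = three_term_update(w, grad, prev_delta, err)

Setting gamma to zero recovers the familiar two-term update with learning rate and momentum; the extra gamma term is what the abstract refers to as the proportional factor.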