Monotonic Incrementation of Backpropagation Networks

Abstract

One of the most challenging problems in neural network research is finding methods which increase the approximation power of simple nets while maintaining their generalization capabilities. The approach pursued here employs methods of monotonic network incrementation, i.e., modifications which locally change the error surface but leave the network outputs unchanged.
A conservative splitting algorithm is presented which detects units stuck in local minima and replaces them monotonically in order to speed up learning. In a first evaluation on the well-known parity problem, the algorithm proved superior to standard backpropagation (BP) in terms of both speed and accuracy.
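The paper itself is not reproduced here, but the abstract's notion of an output-preserving (monotonic) unit replacement corresponds to a standard construction: a hidden unit is duplicated with identical incoming weights, and its outgoing weights are halved and shared between the two copies, so the network function is unchanged while the error surface around the unit changes. The sketch below illustrates that construction for a single hidden layer without bias terms; the function name, signature, and the symmetry-breaking noise term are illustrative assumptions, not taken from the paper.

    import numpy as np

    def split_unit(W_in, W_out, unit, noise=1e-3, rng=None):
        """Split hidden unit `unit` into two units while preserving the
        network's output exactly (up to optional symmetry-breaking noise).

        W_in  : (n_hidden, n_inputs)  incoming weight matrix
        W_out : (n_outputs, n_hidden) outgoing weight matrix
        """
        rng = np.random.default_rng() if rng is None else rng

        # Duplicate the incoming weights: both copies compute the same
        # activation as the original unit.
        W_in = np.vstack([W_in, W_in[unit][None, :].copy()])

        # Halve the outgoing weights and share them between the two
        # copies, so the summed contribution to every output is unchanged.
        W_out = np.hstack([W_out, W_out[:, unit:unit + 1].copy()])
        W_out[:, unit] *= 0.5
        W_out[:, -1] *= 0.5

        # Hypothetical symmetry-breaking step: a small perturbation lets
        # gradient descent drive the two copies apart, changing the error
        # surface locally while the current outputs stay (almost) intact.
        W_in[-1] += noise * rng.standard_normal(W_in.shape[1])
        return W_in, W_out

Because the two copies start with (nearly) identical weights, subsequent gradient steps can separate them, which is what would allow such a split to free a stuck unit without disturbing what the network has already learned.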

Keywords

Neural Networks, Backpropagation, Growing Networks, Meta-Algorithms for Neural Network Training

Reference

Ingo Glöckner. Monotonic Incrementation of Backpropagation Networks. In Proceedings of the International Conference on Artificial Neural Networks (ICANN '93), 1993.

If you wish to receive a copy of the manuscript, just contact me by email.


Ingo Glöckner, Ingo.Gloeckner@FernUni-Hagen.DE