Complex Systems

Hard Learning the Easy Way: Backpropagation with Deformation

Frank J. Śmieja
Gareth D. Richards
Department of Physics, University of Edinburgh,
Mayfield Road, Edinburgh, EH9 3JZ, United Kingdom

Abstract

The backpropagation algorithm for feed-forward layered neural networks is applied to a problem domain of variable difficulty. The algorithm in its basic form is shown to be very sensitive to the step-size and momentum parameters as problem difficulty increases. To counter this, we suggest a scheme for adapting these parameters during learning, yielding faster and more stable gradient descent. We then introduce a technique for deforming the error surface, which allows the algorithm to learn hard problem domains by gradually changing the shape of the error surface from a gentle initial form to its final craggy form. This deformation procedure is applied to a second problem domain and is shown to improve learning performance: by gradually increasing the difficulty of the problem domain, the net can build upon past experience rather than being confronted with a complicated set of associations from the start.
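
The abstract describes the deformation only at a high level. As a minimal sketch of the general idea, and not the authors' actual procedure, the following Python/NumPy fragment realizes one plausible reading: the training targets are interpolated from an easy problem (OR) toward the hard one (XOR) by a deformation parameter lam ramped from 0 to 1, so the error surface changes gradually rather than presenting its final craggy form from the start. The network shape, schedule, and the step-size and momentum values are illustrative assumptions.

    import numpy as np

    rng = np.random.default_rng(0)
    X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
    t_easy = np.array([[0], [1], [1], [1]], dtype=float)  # OR: linearly separable
    t_hard = np.array([[0], [1], [1], [0]], dtype=float)  # XOR: the hard problem

    def sigmoid(z):
        return 1.0 / (1.0 + np.exp(-z))

    # One hidden layer, trained by backpropagation with momentum.
    W1 = rng.normal(scale=0.5, size=(2, 3)); b1 = np.zeros(3)
    W2 = rng.normal(scale=0.5, size=(3, 1)); b2 = np.zeros(1)
    vW1 = np.zeros_like(W1); vb1 = np.zeros_like(b1)
    vW2 = np.zeros_like(W2); vb2 = np.zeros_like(b2)
    eta, alpha = 0.5, 0.9  # step size and momentum (illustrative values)

    epochs = 5000
    for epoch in range(epochs):
        # Deformation parameter: 0 (easy surface) -> 1 (final craggy surface),
        # ramped over the first half of training.
        lam = min(1.0, epoch / (0.5 * epochs))
        t = (1.0 - lam) * t_easy + lam * t_hard  # deformed targets

        # Forward pass.
        h = sigmoid(X @ W1 + b1)
        y = sigmoid(h @ W2 + b2)

        # Backward pass (squared-error loss, sigmoid derivatives).
        d2 = (y - t) * y * (1.0 - y)
        d1 = (d2 @ W2.T) * h * (1.0 - h)

        # Momentum updates.
        vW2 = alpha * vW2 - eta * h.T @ d2;  W2 += vW2
        vb2 = alpha * vb2 - eta * d2.sum(0); b2 += vb2
        vW1 = alpha * vW1 - eta * X.T @ d1;  W1 += vW1
        vb1 = alpha * vb1 - eta * d1.sum(0); b1 += vb1

    print(np.round(sigmoid(sigmoid(X @ W1 + b1) @ W2 + b2), 2))

Because only the targets move, the loss landscape E(w) deforms continuously while the weights track a nearby minimum, which is the intuition behind building on past experience instead of starting on the hard surface.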