Complex Systems

Evolving Optimal Neural Networks Using Genetic Algorithms with Occam's Razor

Byoung-Tak Zhang
Heinz Mühlenbein
Artificial Intelligence Research Division
German National Research Center for Computer Science (GMD)
Schloss Birlinghoven, D-53757 Sankt Augustin, Germany

Abstract

Genetic algorithms have had two primary applications to neural networks: optimizing the network architecture and training the weights of a fixed architecture. While most previous work focuses on only one of these options, this paper investigates an alternative evolutionary approach, breeder genetic programming (BGP), in which the architecture and the weights are optimized simultaneously. In this method, the genotype of each network is represented as a tree whose depth and width are dynamically adapted to the particular application by specifically defined genetic operators. The weights are trained by a next-ascent hillclimbing search. A new fitness function is proposed that quantifies the principle of Occam's razor; it makes an optimal trade-off between the error-fitting ability and the parsimony of the network. Simulation results on two benchmark problems of differing complexity suggest that the method finds minimal networks on clean data. Experiments on noisy data show that using Occam's razor not only improves generalization performance but also accelerates convergence.
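As a rough illustration of the two ingredients named above, the sketch below combines a parsimony-penalized fitness (smaller is better) with a next-ascent hillclimbing loop over the weights. All names and constants here (`occam_fitness`, `alpha`, the step size) are illustrative assumptions, not the paper's exact formulation.

```python
def occam_fitness(mse, n_weights, n_examples, alpha=0.01):
    """Occam-razor-style fitness (smaller is better): fitting error plus
    a parsimony penalty that grows with network size. `alpha` is a
    hypothetical trade-off coefficient, not a value from the paper."""
    return mse + alpha * n_weights / n_examples


def next_ascent_hillclimb(weights, evaluate, step=0.1, sweeps=100):
    """Next-ascent hillclimbing over the weight vector: accept the
    *first* single-weight perturbation that improves fitness, instead
    of scanning all neighbors for the steepest improvement."""
    best = evaluate(weights)
    for _ in range(sweeps):
        improved = False
        for i in range(len(weights)):
            for delta in (step, -step):
                candidate = list(weights)
                candidate[i] += delta
                score = evaluate(candidate)
                if score < best:  # first improving move is taken
                    weights, best = candidate, score
                    improved = True
                    break
        if not improved:  # no single-weight move helps: local optimum
            break
    return weights, best


# Toy usage: one weight, target value 3.0.
w, f = next_ascent_hillclimb([0.0], lambda ws: (ws[0] - 3.0) ** 2)
print(w, f)  # converges toward [3.0]
```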