Complex Systems

Dynamic Properties of Neural Learning in the Information-theoretic Plane

Perambur S. Neelakanta
Salahalddin Abusalah
Raghavan Sudhakar
Dolores De Groff
Valentine Aalo
Department of Electrical Engineering,
Florida Atlantic University,
Boca Raton, FL 33431, USA

Joseph C. Park
Atlantic Undersea Test and Evaluation Center (AUTEC),
West Palm Beach, FL 33401, USA


Learning in the real neural complex refers to progressive modifications occurring at the synaptic level of interconnected neurons. Inherently present intraneural disturbances, as well as extraneural noise in the input data or in the teacher values, may affect these synaptic modifications, which are specified by the set of weighting vectors of the interconnections. Translated to artificial neurons, such noise induces an offset in the convergence performance of a network striving to reach its goal (objective) value through the supervised learning procedure implemented. The dynamic response of a learning network when the target itself changes with time can be studied in the information-theoretic plane, and the relevant nonlinear (stochastic) dynamics of the learning process can be specified by the Fokker--Planck equation, in terms of a conditional-entropy-- (or mutual-information--) based error measure elucidated from the probabilities associated with the input and teacher (target) values. In this paper, the logistic growth (evolutionary aspects) and certain attractor features of the learning process are described and discussed in reference to neural manifolds, using the mathematical foundations of statistical dynamics. Computer simulation studies on a test multilayer perceptron are presented, and the asymptotic behavior of the accuracy and speed of learning vis-à-vis the convergence of the test error measure(s) is elucidated.
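The paper itself gives no code; purely as an illustrative sketch of the two ingredients named in the abstract, the snippet below computes an entropy-type (cross-entropy) error between network outputs and teacher values treated as probabilities, and a logistic growth curve for the evolution of learning accuracy over training epochs. The function names and the parameters `eps`, `rate`, and `midpoint` are assumptions for illustration, not the authors' formulation.

```python
import math

def entropic_error(outputs, targets, eps=1e-12):
    """Cross-entropy-style error between network outputs and teacher
    (target) values, both interpreted as probabilities in (0, 1).
    `eps` (an assumed guard value) avoids log(0)."""
    err = 0.0
    for y, t in zip(outputs, targets):
        y = min(max(y, eps), 1.0 - eps)  # clip output into (0, 1)
        err += -(t * math.log(y) + (1.0 - t) * math.log(1.0 - y))
    return err / len(outputs)

def logistic_growth(epoch, rate=0.1, midpoint=50.0):
    """Logistic (sigmoidal) evolution of learning accuracy with epochs;
    `rate` and `midpoint` are illustrative, not fitted values."""
    return 1.0 / (1.0 + math.exp(-rate * (epoch - midpoint)))
```

For example, `entropic_error([0.9], [1.0])` reduces to `-log(0.9)`, and the logistic curve passes through 0.5 at `epoch == midpoint`, rising asymptotically toward 1 as training proceeds, mirroring the asymptotic accuracy behavior the abstract describes.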