Application of gradient-like dynamics to neural networks

Research output: Contribution to conference › Paper › peer-review

Abstract

This paper reviews a formalism that enables the dynamics of a broad class of neural networks to be understood. This formalism is then applied to a specific network, and the predicted and simulated behaviors of the system are compared. A number of previous works have analyzed the Lyapunov stability of neural network models. This type of analysis shows that the excursion of the solutions from a stable point is bounded. The purpose of this work is to review and then utilize a model of the dynamics that also describes the phase-space behavior and structural stability of the system. This is achieved by writing the general equations of the neural network dynamics as a gradient-like system. In this paper it is demonstrated that a network with additive activation dynamics and Hebbian weight update dynamics can be expressed as a gradient-like system. An example of a three-layer network with feedback between adjacent layers is presented. It is shown that the process of weight learning is stable in this network when the learned weights are symmetric. Furthermore, the weight learning process is stable when the learned weights are asymmetric, provided that the activation is computed using only the symmetric part of the weights.
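To make the abstract's setup concrete, the following is a minimal numerical sketch of additive activation dynamics coupled with a Hebbian weight update, where the activations are driven only by the symmetric part of the weight matrix, as the abstract indicates for stability. The specific equations, time constants, decay terms, and network size below are illustrative assumptions, not the paper's exact formulation.

    # Minimal Euler-integration sketch (assumed formulation, not the paper's exact model).
    import numpy as np

    rng = np.random.default_rng(0)
    n = 8                      # number of units (illustrative size)
    dt, steps = 0.01, 2000     # Euler step size and horizon (assumed values)
    tau_x, tau_w = 1.0, 50.0   # activation and weight time constants (assumed)

    x = rng.standard_normal(n) * 0.1        # unit activations
    W = rng.standard_normal((n, n)) * 0.1   # possibly asymmetric weight matrix
    I = rng.standard_normal(n) * 0.5        # constant external input
    f = np.tanh                             # smooth, bounded activation function

    for _ in range(steps):
        W_sym = 0.5 * (W + W.T)             # only the symmetric part drives activation
        y = f(x)
        dx = (-x + W_sym @ y + I) / tau_x   # additive dynamics: leak + recurrent drive + input
        dW = (-W + np.outer(y, y)) / tau_w  # Hebbian update with decay
        x += dt * dx
        W += dt * dW

    print("final activations:", np.round(x, 3))
    print("weight asymmetry norm:", np.linalg.norm(W - W.T))

Under these assumptions the activations settle toward a fixed point rather than oscillating, illustrating the kind of stable weight-learning behavior the abstract describes when only the symmetric part of the weights is used.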

Original language: English (US)
Pages: 92-96
Number of pages: 5
DOIs
State: Published - 1994
Externally published: Yes
Event: Proceedings of the 1994 Southcon Conference - Orlando, FL, USA
Duration: Mar 29, 1994 - Mar 31, 1994

Conference

Conference: Proceedings of the 1994 Southcon Conference
City: Orlando, FL, USA
Period: 3/29/94 - 3/31/94

ASJC Scopus subject areas

  • Electrical and Electronic Engineering
  • Electronic, Optical and Magnetic Materials
