Correspondence: Computational-Complexity Reduction for Neural Network Algorithms

Allon Guez, Moshe Kam, James L. Eilbert

Research output: Contribution to journal › Article › peer-review

2 Scopus citations

Abstract

An important class of neural models is described as a set of coupled nonlinear differential equations whose state variables correspond to the axon hillock potentials of neurons. Through a nonlinear transformation, these models can be converted to an equivalent system of differential equations whose state variables correspond to firing rates. The new firing-rate formulation has certain computational advantages over the potential formulation of the model. The computational and storage burdens per simulation cycle are reduced, and the resulting equations become quasilinear in a large and significant subset of the state space. Moreover, the dynamic range of the state space is bounded, alleviating numerical-stability problems in network simulation. These advantages are demonstrated through an example, using the model for the “neural” solution to the traveling salesman problem proposed by Hopfield and Tank.
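The transformation the abstract describes can be sketched concretely. The following is a minimal illustration, not the paper's actual formulation: it assumes a standard Hopfield-style potential model tau*dv/dt = -v + W@g(v) + I with a logistic activation g, and a randomly chosen small network. Substituting u = g(v) yields firing-rate dynamics du/dt = g'(g⁻¹(u)) * (-g⁻¹(u) + W@u + I) / tau; for the logistic, g'(g⁻¹(u)) simplifies to u*(1-u), so the coupling term W@u uses the state directly and u stays inside the bounded box (0, 1).

```python
import numpy as np

# Logistic activation (an assumed choice; the paper's g may differ).
def g(v):
    return 1.0 / (1.0 + np.exp(-v))

def g_inv(u):
    return np.log(u / (1.0 - u))

rng = np.random.default_rng(0)
n = 5
W = rng.normal(scale=0.5, size=(n, n))
W = (W + W.T) / 2                      # symmetric coupling, Hopfield-style
I = rng.normal(scale=0.1, size=n)      # external input
tau, dt, steps = 1.0, 0.001, 2000

# Potential formulation: tau * dv/dt = -v + W @ g(v) + I
v = rng.normal(scale=0.1, size=n)
# Equivalent firing-rate formulation, with u = g(v).
# For the logistic, g'(g^{-1}(u)) = u * (1 - u), so no call to g is
# needed inside the coupling term, and u is confined to (0, 1).
u = g(v.copy())

for _ in range(steps):
    v = v + (dt / tau) * (-v + W @ g(v) + I)
    u = u + (dt / tau) * u * (1.0 - u) * (-g_inv(u) + W @ u + I)

# The two trajectories track each other up to integration error,
# and the firing-rate state remains bounded.
print(np.max(np.abs(g(v) - u)))
print(0.0 < u.min() and u.max() < 1.0)
```

Note also the quasilinearity the abstract mentions: wherever a unit saturates (u near 0 or 1), the factor u*(1-u) vanishes, so large regions of the bounded state space evolve almost linearly.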

Original language: English (US)
Pages (from-to): 409-414
Number of pages: 6
Journal: IEEE Transactions on Systems, Man and Cybernetics
Volume: 19
Issue number: 2
DOIs
State: Published - 1989
Externally published: Yes

All Science Journal Classification (ASJC) codes

  • Engineering(all)

