An alpha-rule, perceptron-based training algorithm for J. J. Hopfield's binary neural network (1982) is suggested. Unlike the sum-of-outer-products algorithm, the scheme partitions the memory's state space according to the relative importance of the stored patterns, as judged by the history of their occurrence in the environment. The algorithm is shown to exhibit features reminiscent of biological learning: gradual learning proves superior to massed practice; overlearning reduces the rate of forgetting; and a 'saving score' for past knowledge speeds the relearning of forgotten patterns. Compared with the sum-of-outer-products design, the proposed scheme assigns larger regions of attraction to the stored patterns and reduces the number of spurious attractors. A modification of the algorithm, based on the adaptive-delta-modulation procedure used in digital communications, is also suggested; the modified scheme converges faster without changing the basic memory partition among the impressed patterns.
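The contrast the abstract draws can be illustrated with a minimal sketch: a Hopfield-style binary network whose weight matrix is trained either by the classical sum-of-outer-products (Hebbian) rule or by a perceptron-style correction rule that nudges each neuron's weight row until every stored pattern is a fixed point. This is a generic perceptron-rule illustration under assumed details (learning rate `alpha`, zero self-coupling, sign threshold at zero); the paper's specific alpha-rule, its importance-weighted partitioning of state space, and the adaptive-delta-modulation variant are not reproduced here.

```python
import numpy as np

def outer_product_weights(patterns):
    """Classical sum-of-outer-products (Hebbian) rule, zero diagonal."""
    n = patterns.shape[1]
    W = sum(np.outer(p, p) for p in patterns) / n
    np.fill_diagonal(W, 0.0)
    return W

def perceptron_train(patterns, alpha=0.1, epochs=100):
    """Perceptron-style training: each neuron's weight row is corrected
    whenever that neuron misclassifies a stored pattern, until all
    stored patterns are fixed points of the synchronous update."""
    n = patterns.shape[1]
    W = np.zeros((n, n))
    for _ in range(epochs):
        stable = True
        for p in patterns:
            out = np.where(W @ p >= 0, 1, -1)   # synchronous update
            for i in range(n):
                if out[i] != p[i]:              # neuron i disagrees with the pattern
                    W[i] += alpha * p[i] * p    # perceptron correction for row i
                    W[i, i] = 0.0               # keep zero self-coupling
                    stable = False
        if stable:                              # every pattern is a fixed point
            break
    return W

def recall(W, x, steps=10):
    """Iterate the network from state x for a few synchronous steps."""
    for _ in range(steps):
        x = np.where(W @ x >= 0, 1, -1)
    return x
```

As a usage example, storing two orthogonal +/-1 patterns and checking that each is a fixed point of the trained network exercises the convergence criterion directly; the abstract's claim is that, relative to the Hebbian weights, perceptron-style training of this kind yields larger basins of attraction and fewer spurious attractors.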
|Original language||English (US)|
|Title of host publication||Unknown Host Publication Title|
|Editors||Maureen Caudill, Charles T. Butler, San Diego Adaptics|
|State||Published - Dec 1 1987|