ON THE PROBABILISTIC INTERPRETATION OF NEURAL NETWORK BEHAVIOR.

Moshe Kam, Allon Guez

Research output: Contribution to journal › Conference article › peer-review


Abstract

Recent probabilistic interpretations of neural network models have suggested formulating network operations in information-theoretic terms. In these interpretations, the neural network develops an assumed probability density function that represents its assumptions about the environment. Under a set of hypotheses, this probability density function is shown to maintain an exponential relationship with an energy-like function that the network tends to minimize. The authors obtain this probability density function through C. Shannon's derivation of the entropy measure (1948) and E. T. Jaynes' maximum entropy principle (1957). The main conclusion is that the neural network assumes the worst-case (i.e., most uncertain, or maximum-entropy) probability density function for the unknown environment.
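The exponential relationship described in the abstract is the familiar Gibbs/Boltzmann form: among all distributions with a fixed expected energy, the maximum-entropy choice is p(x) ∝ exp(−βE(x)). The sketch below is purely illustrative (the energy values and the inverse-temperature parameter β are invented for the example; the paper specifies neither):

```python
import math

# Illustrative "energy-like" values for a small set of network states.
# These numbers are assumptions for demonstration, not from the paper.
energies = [0.0, 1.0, 2.0, 4.0]
beta = 1.0  # inverse temperature; the Lagrange multiplier in the maxent derivation

# Maximum-entropy distribution under a mean-energy constraint:
# p_i proportional to exp(-beta * E_i)  (Gibbs/Boltzmann form)
weights = [math.exp(-beta * e) for e in energies]
Z = sum(weights)              # partition function (normalizer)
p = [w / Z for w in weights]

# Shannon entropy of the resulting distribution, in nats
H = -sum(pi * math.log(pi) for pi in p)

print(p, sum(p), H)
```

Note that lower-energy states receive exponentially higher probability, so minimizing the energy-like function and concentrating the assumed density are two views of the same operation.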

Original language: English (US)
Pages (from-to): 1968-1972
Number of pages: 5
Journal: Proceedings of the American Control Conference
State: Published - Dec 1 1987
Externally published: Yes

All Science Journal Classification (ASJC) codes

  • Electrical and Electronic Engineering
