Abstract
Recent probabilistic interpretations of neural network models have suggested formulating network operations in information-theoretic terms. In these interpretations, the neural network develops an assumed probability density function that represents its assumptions about the environment. Under a set of hypotheses, this probability density function is shown to maintain an exponential relationship with an energy-like function that the network tends to minimize. The authors obtain this probability density function through C. Shannon's derivation of the entropy measure (1948) and E. T. Jaynes' maximum entropy principle (1957). The main conclusion is that the neural network assumes the worst-case (i.e., most uncertain, or maximum-entropy) probability density function for the unknown environment.
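The exponential relationship described in the abstract can be sketched numerically: under Jaynes' maximum entropy principle, maximizing entropy subject to a fixed expected value of an energy-like function yields a density of Gibbs form, p_i ∝ exp(-βE_i). The state energies and the inverse-temperature-like parameter `beta` below are illustrative assumptions, not values from the paper.

```python
import numpy as np

# Hypothetical energies for four network states; both the values and
# beta are illustrative assumptions, not taken from the paper.
E = np.array([1.0, 2.0, 3.0, 4.0])
beta = 0.5

# Maximizing entropy subject to a fixed expected energy yields the
# exponential (Gibbs) form p_i ∝ exp(-beta * E_i).
p = np.exp(-beta * E)
p /= p.sum()

entropy = -np.sum(p * np.log(p))
mean_energy = np.sum(p * E)

# Sanity check: a perturbation that preserves both normalization and the
# mean-energy constraint should only lower the entropy.
d = np.array([1.0, -2.0, 1.0, 0.0])   # components sum to 0 and d is orthogonal to E
p2 = p + 0.01 * d
entropy_perturbed = -np.sum(p2 * np.log(p2))

print(entropy, entropy_perturbed)     # the Gibbs density has the larger entropy
```

The check reflects the "worst case" reading of the conclusion: among all densities consistent with the energy constraint, the exponential one is the most uncertain, so any admissible deviation from it has strictly lower entropy.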
Original language | English (US) |
---|---|
Pages (from-to) | 1968-1972 |
Number of pages | 5 |
Journal | Proceedings of the American Control Conference |
State | Published - 1987 |
Externally published | Yes |
All Science Journal Classification (ASJC) codes
- Electrical and Electronic Engineering