Abstract
Neural networks have been successfully applied to various pattern recognition and function approximation problems. However, the training process remains time-consuming and often gets stuck in a local minimum, and the optimal network size and topology are usually unknown. In this paper, we formulate the concept of extrema equivalence for estimating the complexity of a function. Based on this formulation, the optimal network size and topology can be selected according to the number of extrema. A mini-max initialization method is then proposed to select the initial values of the network weights; this method is proven to greatly speed up training. The superior performance of our method in terms of convergence and generalization is substantiated by experimental results.
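The abstract's central idea, selecting network size from the number of extrema of the target function, can be illustrated with a minimal sketch. The snippet below is an assumption-laden heuristic, not the paper's exact formulation: it counts local extrema of a 1-D function sampled on a grid via finite-difference sign changes, and sets the hidden-layer width to that count. The function names (`count_extrema`, `suggest_hidden_units`) are illustrative, not from the paper.

```python
import math

def count_extrema(ys):
    """Count interior local extrema of a sampled 1-D function.

    A point is an extremum when the slope changes sign there,
    i.e. the product of adjacent finite differences is negative.
    Assumes no flat (exactly equal) runs in the samples.
    """
    count = 0
    for i in range(1, len(ys) - 1):
        left = ys[i] - ys[i - 1]
        right = ys[i + 1] - ys[i]
        if left * right < 0:
            count += 1
    return count

def suggest_hidden_units(ys):
    """Heuristic (our assumption, not the paper's rule):
    use one hidden unit per detected extremum, with a floor of 1."""
    return max(1, count_extrema(ys))

# Example: sin(x) on [0, 4*pi] has 4 interior extrema
# (at pi/2, 3*pi/2, 5*pi/2, 7*pi/2).
xs = [4 * math.pi * j / 200 for j in range(201)]
ys = [math.sin(x) for x in xs]
print(count_extrema(ys))        # number of detected extrema
print(suggest_hidden_units(ys)) # suggested hidden-layer width
```

In practice the count is sensitive to sampling noise, so a real implementation would smooth the samples or ignore sub-threshold oscillations before counting.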
Original language | English (US) |
---|---|
Pages (from-to) | 389-409 |
Number of pages | 21 |
Journal | Neurocomputing |
Volume | 57 |
Issue number | 1-4 |
DOIs | |
State | Published - Mar 2004 |
All Science Journal Classification (ASJC) codes
- Computer Science Applications
- Cognitive Neuroscience
- Artificial Intelligence
Keywords
- Chessboard initialization
- Extrema equivalence
- Mini-max initialization
- Promising area
- Random initialization