Abstract
This paper introduces heuristics for random optimization that yield better performance than the basic random optimization method. A hybrid method is also demonstrated that combines standard back-propagation with heuristic random optimization; it outperforms both of its constituents, learning faster than either while avoiding entrapment in local minima. These results are demonstrated through extensive simulation experiments, which show that the heuristic and hybrid approaches are simple to implement, computationally efficient, and robust over wide ranges of parameter values.
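The abstract does not spell out the specific heuristics used, but the general idea of heuristic random optimization can be sketched as follows: perturb the current weight vector randomly, accept the move only if the error decreases, and add a simple heuristic bias such as trying the reversed perturbation when the forward one fails (a Solis-Wets-style refinement of Matyas-style random optimization). The function and variable names below are illustrative, and the toy quadratic stands in for a network's error surface; this is a minimal sketch of the technique, not the paper's exact algorithm.

```python
import random

def heuristic_random_optimization(f, x, step=0.5, iters=2000, seed=0):
    """Matyas-style random optimization with a reversal heuristic:
    if a random perturbation does not reduce f, try the opposite
    direction before drawing a new one. Illustrative only -- not
    necessarily the exact heuristic studied in the paper."""
    rng = random.Random(seed)
    fx = f(x)
    for _ in range(iters):
        d = [rng.gauss(0.0, step) for _ in x]
        cand = [xi + di for xi, di in zip(x, d)]
        fc = f(cand)
        if fc < fx:                     # forward step improved the error
            x, fx = cand, fc
            continue
        cand = [xi - di for xi, di in zip(x, d)]
        fc = f(cand)                    # reversal heuristic: try -d
        if fc < fx:
            x, fx = cand, fc
    return x, fx

# Toy quadratic "error function" with its minimum at (1, -2).
err = lambda w: (w[0] - 1.0) ** 2 + (w[1] + 2.0) ** 2
w_best, e_best = heuristic_random_optimization(err, [5.0, 5.0])
```

In a hybrid scheme of the kind the abstract describes, steps like these would be interleaved with gradient (back-propagation) updates, so the random component can carry the search out of local minima that would trap pure gradient descent.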
| Original language | English |
|---|---|
| Pages | 521-525 |
| Number of pages | 5 |
| State | Published - 1990 |
| Externally published | Yes |
| Event | Proceedings of the Twenty-First Annual Pittsburgh Conference Part 4 (of 5) - Pittsburgh, PA, USA Duration: May 3 1990 → May 4 1990 |
Conference
| Conference | Proceedings of the Twenty-First Annual Pittsburgh Conference Part 4 (of 5) |
|---|---|
| City | Pittsburgh, PA, USA |
| Period | 5/3/90 → 5/4/90 |
ASJC Scopus Subject Areas
- General Engineering
Fingerprint
Dive into the research topics of 'Heuristic and hybrid methods for finding the global minimum of the error function in artificial neural networks'. Together they form a unique fingerprint.