Optimization and global minimization methods suitable for neural networks

Wlodzislaw Duch,
Department of Informatics, Nicolaus Copernicus University,
Grudziadzka 5, 87-100 Torun, Poland.
E-mail: wduch@fizyka.umk.pl
and
Jerzy Korczak,
Laboratoire des Sciences de l'Image, de l'Informatique et de la Teledetection, CNRS, Universite Louis Pasteur,
Blvd. Sebastien Brant, 67400 Illkirch, France

Neural Computing Surveys (submitted, Dec. 1998)


 


Neural networks are usually trained using local, gradient-based procedures. Such methods frequently find suboptimal solutions because they become trapped in local minima. Optimization of neural structures and global minimization methods applied to network cost functions strongly influence all aspects of network performance. Recently, genetic algorithms have frequently been combined with neural methods to select the best architectures and to avoid the drawbacks of local minimization methods. Many other global minimization methods are also suitable for this purpose, although they are rarely used in this context. This paper surveys such global methods, including some aspects of genetic algorithms.
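The trapping problem described above can be illustrated with a minimal sketch (not from the paper itself): a hypothetical one-dimensional double-well cost function on which plain gradient descent, started in the wrong basin, converges to the suboptimal minimum, while a simple global strategy (random restarts of the local search) recovers the global one.

```python
import random

def f(x):
    # Hypothetical double-well cost: a local minimum near x = +1
    # and the global minimum near x = -1.
    return (x * x - 1.0) ** 2 + 0.3 * x

def df(x):
    # Analytic derivative of f.
    return 4.0 * x * (x * x - 1.0) + 0.3

def gradient_descent(x, lr=0.01, steps=2000):
    # Plain local, gradient-based minimization, as in standard
    # backpropagation training.
    for _ in range(steps):
        x -= lr * df(x)
    return x

def random_restarts(n=20, seed=0):
    # A minimal global strategy: restart the local search from random
    # initial points and keep the best result found.
    rng = random.Random(seed)
    candidates = (gradient_descent(rng.uniform(-2.0, 2.0)) for _ in range(n))
    return min(candidates, key=f)

x_local = gradient_descent(1.2)   # starts in the basin of the local minimum
x_global = random_restarts()      # finds the basin of the global minimum
```

Random restarts are only the simplest of the global schemes the survey covers; genetic algorithms, simulated annealing, and the other methods reviewed replace the blind restart with a more informed exploration of the cost surface.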

