Variable Step Search Training for Feedforward Neural Networks

Miroslaw Kordos (1) and Wlodzislaw Duch (2,3)

(1) Department of Biomedical Informatics, Children's Hospital Research Foundation, Cincinnati, Ohio, USA.
(2) Department of Informatics, Nicolaus Copernicus University, Grudziadzka 5, 87-100 Torun, Poland.
(3) School of Computer Engineering, Nanyang Technological University, Singapore.

Abstract.

A new class of search-based training algorithms for feedforward neural networks is introduced. These algorithms do not compute analytical gradients and do not rely on stochastic or genetic search techniques. Instead, a forward step is performed to calculate the error in response to localized weight changes, using systematic search techniques. One of the simplest variants of this type, the Variable Step Search (VSS) algorithm, is studied in detail. The VSS procedure changes one network parameter at a time and therefore imposes no restrictions on the network structure or the type of transfer functions. A rough approximation of the gradient direction and the determination of the optimal step along this direction towards the error minimum are performed simultaneously. Modifying the value of a single weight changes the signals only in a small fragment of the network, allowing for efficient calculation of error contributions. Several heuristics that increase the efficiency of the VSS algorithm are discussed. Tests on benchmark data show that VSS can outperform such renowned algorithms as Levenberg-Marquardt or scaled conjugate gradient.
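For illustration, below is a minimal Python/NumPy sketch of the coordinate-wise search idea described in the abstract. The names (train_vss, step_grow, step_shrink) are illustrative rather than taken from the paper, and for clarity the sketch recomputes the full network error after every trial move; the actual VSS algorithm exploits the fact that changing one weight alters signals only in a small fragment of the network, and adds further efficiency heuristics discussed in the paper.

import numpy as np

def mse(W1, W2, X, y):
    # Mean squared error of a one-hidden-layer tanh network.
    hidden = np.tanh(X @ W1)
    output = hidden @ W2
    return np.mean((output - y) ** 2)

def train_vss(X, y, n_hidden=5, epochs=50, step_init=0.1,
              step_grow=2.0, step_shrink=0.5, seed=0):
    # Illustrative coordinate-wise variable-step search (not the paper's code).
    rng = np.random.default_rng(seed)
    W1 = rng.normal(scale=0.5, size=(X.shape[1], n_hidden))
    W2 = rng.normal(scale=0.5, size=(n_hidden, 1))
    # One adaptive step per weight: the step keeps the sign of the last
    # successful move and is enlarged or shrunk multiplicatively.
    steps = [np.full(W.shape, step_init) for W in (W1, W2)]
    for _ in range(epochs):
        for W, step in zip((W1, W2), steps):
            for idx in np.ndindex(W.shape):
                base = mse(W1, W2, X, y)
                old = W[idx]
                for s in (step[idx], -step[idx]):
                    W[idx] = old + s               # trial move of a single weight
                    if mse(W1, W2, X, y) < base:
                        step[idx] = s * step_grow  # success: larger step, same sign
                        break
                    W[idx] = old                   # failure: restore the weight
                else:
                    # Both directions failed: shrink the step for this weight.
                    step[idx] = abs(step[idx]) * step_shrink
    return W1, W2

Calling train_vss(X, y) with targets y of shape (n_samples, 1) returns the trained weight matrices; each weight keeps its own signed step that grows after a successful move and shrinks when moves in both directions fail.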

Reference: Kordos M., Duch W., Variable Step Search Training for Feedforward Neural Networks. Neurocomputing 71(13-15), 2470-2480, 2008.
