Monte Carlo comparison of back-propagation, conjugate-gradient, and finite-difference training algorithms for multilayer perceptrons

Title: Monte Carlo comparison of back-propagation, conjugate-gradient, and finite-difference training algorithms for multilayer perceptrons
Author: Wehry, Stephen
Abstract: Monte Carlo simulation is used to compare the performance of the back-propagation, conjugate-gradient, and finite-difference algorithms when training simple multilayer perceptron networks on pattern-recognition and bit-counting problems. Twelve individual simulations are run for each training-algorithm/test-problem combination, for an overall total of 72 simulations. The random elements in each Monte Carlo simulation are the individual synaptic weights between layers, which are drawn from a uniform distribution. Two further factors, the size of the hidden layer and the exponent of the error function, are also varied within the simulation plan outlined above.
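The simulation plan in the abstract can be sketched in code. The following is a minimal illustration, not the author's implementation: the algorithm and problem names come from the abstract, while the layer sizes, the weight range [-1, 1], and all function names are hypothetical choices made for the example.

```python
import itertools
import random

# Design described in the abstract: three training algorithms crossed with
# two test problems, twelve independent Monte Carlo trials per combination.
ALGORITHMS = ["back-propagation", "conjugate-gradient", "finite-difference"]
PROBLEMS = ["pattern recognition", "bit counting"]
TRIALS_PER_COMBINATION = 12

def init_weights(n_inputs, n_hidden, n_outputs, rng):
    """Draw the synaptic weights between layers uniformly at random --
    the stochastic element of each trial. The layer sizes and the
    weight range [-1, 1] are illustrative assumptions."""
    w_hidden = [[rng.uniform(-1.0, 1.0) for _ in range(n_inputs)]
                for _ in range(n_hidden)]
    w_output = [[rng.uniform(-1.0, 1.0) for _ in range(n_hidden)]
                for _ in range(n_outputs)]
    return w_hidden, w_output

def simulation_plan():
    """Enumerate every (algorithm, problem, trial) cell of the design."""
    return [(alg, prob, trial)
            for alg, prob in itertools.product(ALGORITHMS, PROBLEMS)
            for trial in range(TRIALS_PER_COMBINATION)]

plan = simulation_plan()
print(len(plan))  # 6 combinations x 12 trials = 72 simulations
```

Each cell of the plan would then initialize a fresh set of uniform weights and run the corresponding training algorithm on the corresponding problem.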
Record URI: http://hdl.handle.net/1850/13681
Date: 2011

Files in this item

File: SWehryDissertation2011.pdf (1.202 MB, PDF)
