Using quasirandom numbers in neural networks


Title: Using quasirandom numbers in neural networks
Author: Anderson, Peter; Gaborski, Roger; Ge, Ming; Raghavendra, Sanjay; Lung, Mei-ling
Abstract: We present a novel training algorithm for a feedforward neural network with a single hidden layer of nodes (i.e., two layers of connection weights). Our algorithm is capable of training networks for hard problems, such as the classic two-spirals problem. The weights in the first layer are determined using a quasirandom number generator. These weights are frozen; they are never modified during the training process. The second layer of weights is trained as a simple linear discriminator using methods such as the pseudo-inverse, with possible iterations. We also study the problem of reducing the hidden layer: pruning low-weight nodes and a genetic algorithm search for good subsets.
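The two-stage scheme in the abstract can be sketched in a few lines of NumPy. This is a minimal illustration, not the paper's implementation: it assumes a Halton sequence as the quasirandom generator, tanh hidden units, and a toy XOR problem; the paper's exact generator, activation, and scaling may differ.

```python
import numpy as np

def halton(index, base):
    """Value of the Halton low-discrepancy sequence at a given index and prime base."""
    result, f = 0.0, 1.0
    while index > 0:
        f /= base
        result += f * (index % base)
        index //= base
    return result

def quasirandom_weights(n_hidden, n_inputs, scale=4.0):
    # One Halton dimension (one prime base) per input; the mapping of
    # [0, 1) onto [-scale/2, scale/2) is an illustrative assumption.
    primes = [2, 3, 5, 7, 11, 13][:n_inputs]
    W = np.array([[halton(i + 1, b) for b in primes]
                  for i in range(n_hidden)])
    return scale * (W - 0.5)

# Toy two-class problem (XOR), with +/-1 targets.
X = np.array([[0., 0.], [0., 1.], [1., 0.], [1., 1.]])
y = np.array([-1., 1., 1., -1.])

n_hidden = 20
W1 = quasirandom_weights(n_hidden, X.shape[1])        # frozen first layer
b1 = 4.0 * (np.array([halton(i + 1, 5)                # frozen biases
                      for i in range(n_hidden)]) - 0.5)
H = np.tanh(X @ W1.T + b1)                            # hidden activations

# Second layer: a linear discriminator fit by the pseudo-inverse,
# the only trained weights in the network.
w2 = np.linalg.pinv(H) @ y
pred = np.sign(H @ w2)
```

Because the first layer is never updated, "training" reduces to one least-squares solve; the iterative refinements and hidden-layer pruning mentioned in the abstract would operate on top of this basic step.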
Description: "Using Quasirandom Numbers in Neural Networks," Proceedings of the International ICSC Symposium on Fuzzy Logic. ICSC Academic Press. Held at the Swiss Federal Institute of Technology (ETH), Zurich, Switzerland, May 26-27, 1995.
Record URI: http://hdl.handle.net/1850/3067
Date: 1995-05-26

Files in this item

PAndersonConfProc05-26-1995.pdf (460.2 KB, PDF)
