Abstract:
The authors develop a mathematical model of the effects of synaptic arithmetic noise in multilayer perceptron training. Predictions are made regarding enhanced fault-tolerance and generalization ability and improved learning trajectory. These predictions are subsequently verified by simulation. The results are perfectly general and have profound implications for the accuracy requirements in multilayer perceptron (MLP) training, particularly in the analog domain.
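The core idea of the abstract can be illustrated with a minimal sketch: inject multiplicative Gaussian noise into the synaptic weights during each forward pass of training, mimicking imprecise (e.g. analog) arithmetic. The XOR task, network sizes, learning rate, and noise level below are illustrative assumptions, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: XOR, a classic MLP benchmark (illustrative choice).
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

# Small MLP: 2 -> 4 -> 1 with sigmoid activations.
W1 = rng.normal(0.0, 1.0, (2, 4))
W2 = rng.normal(0.0, 1.0, (4, 1))

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def noisy_forward(W1, W2, noise_std):
    # Multiplicative Gaussian noise on each synaptic weight models
    # limited-precision arithmetic during training.
    N1 = W1 * (1.0 + rng.normal(0.0, noise_std, W1.shape))
    N2 = W2 * (1.0 + rng.normal(0.0, noise_std, W2.shape))
    h = sigmoid(X @ N1)
    out = sigmoid(h @ N2)
    return h, out

lr, noise_std = 1.0, 0.05
losses = []
for step in range(3000):
    h, out = noisy_forward(W1, W2, noise_std)
    err = out - y
    losses.append(float(np.mean(err ** 2)))
    # Backprop through the noisy forward pass; gradients are applied
    # to the clean weights (a simplification for this sketch).
    d_out = err * out * (1.0 - out)
    d_h = (d_out @ W2.T) * h * (1.0 - h)
    W2 -= lr * (h.T @ d_out)
    W1 -= lr * (X.T @ d_h)

print(losses[0], losses[-1])
```

Despite the per-step weight perturbations, the mean-squared error falls over training; the paper's claim is that such noise can additionally improve fault-tolerance and generalization relative to noise-free training.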
Published in: IEEE Transactions on Neural Networks (Volume 4, Issue 4, July 1993)
DOI: 10.1109/72.238328