Abstract:
The author describes a neural network architecture and training procedure that provide an efficient means of modeling complicated surface functions. Essentially, the technique operates by constructing surfaces in a step-wise manner out of Gaussian-shaped bumps and depressions. The rationale behind the approach is explained with reference to a surface modeling interpretation of layered feedforward networks. This is followed by a description of the training procedure, using the modeling of a cowboy-hat-shaped surface as an example problem. The advantages of the technique are that it ensures convergence on a solution to within any tolerance for a set of training patterns, converges rapidly, and circumvents the issue of how many hidden neurons to incorporate in a network. The author also presents a demonstration of how to smooth the output produced by a network and thereby improve its powers of interpolation, this time using the problem of drawing a square as an example.
Date of Conference: 18-21 November 1991
Date Added to IEEE Xplore: 12 September 2019
Print ISBN: 0-7803-0227-3
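
The paper itself specifies the exact architecture and training procedure; the minimal Python sketch below only illustrates the general idea summarized in the abstract, namely building up a surface step-wise from Gaussian-shaped bumps and depressions until all training errors fall within a tolerance. The function names, bump width, greedy placement rule, and the sombrero-style stand-in for the cowboy-hat surface are all assumptions made here for illustration, not the author's method.

```python
import numpy as np

def gaussian_bump(x, center, width):
    """Gaussian bump over 2-D inputs x with shape (N, 2)."""
    d2 = np.sum((x - center) ** 2, axis=1)
    return np.exp(-d2 / (2.0 * width ** 2))

def fit_bumps(x_train, y_train, width=0.2, tol=1e-2, max_bumps=500):
    """Greedy step-wise construction (illustrative, not the paper's procedure):
    each new bump/depression is centered on the training point with the largest
    residual and sized to cancel that residual at its center."""
    bumps = []                              # list of (center, height) pairs
    residual = y_train.copy()
    for _ in range(max_bumps):
        worst = np.argmax(np.abs(residual))
        if np.abs(residual[worst]) < tol:
            break                           # every training error within tolerance
        center, height = x_train[worst], residual[worst]
        bumps.append((center, height))
        residual -= height * gaussian_bump(x_train, center, width)
    return bumps

def predict(bumps, x, width=0.2):
    """Evaluate the accumulated sum of bumps and depressions at points x."""
    y = np.zeros(len(x))
    for center, height in bumps:
        y += height * gaussian_bump(x, center, width)
    return y

# Example target: a sombrero-like surface, assumed here as a stand-in for the
# cowboy-hat-shaped example problem mentioned in the abstract.
rng = np.random.default_rng(0)
x_train = rng.uniform(-1.0, 1.0, size=(200, 2))
r = np.sqrt(np.sum(x_train ** 2, axis=1))
y_train = np.cos(4.0 * r) * np.exp(-r)

bumps = fit_bumps(x_train, y_train)
max_err = np.max(np.abs(y_train - predict(bumps, x_train)))
print(f"{len(bumps)} bumps added; max training error = {max_err:.4f}")
```

In this sketch, whether and how fast the residuals shrink depends on the bump width relative to the spacing of the training points; the paper's actual procedure is what provides the convergence guarantee described in the abstract, and it also addresses smoothing the resulting surface to improve interpolation.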