Gaussian activation functions using Markov chains


Abstract:

We extend, in two major ways, earlier work in which sigmoidal neural nonlinearities were implemented using stochastic counters. 1) We define the signal-to-noise limitations of unipolar and bipolar stochastic arithmetic and signal processing. 2) We generalize the use of stochastic counters to include neural transfer functions employed in Gaussian mixture models. The hardware advantages of (nonlinear) stochastic signal processing (SSP) may be offset by increased processing time; we quantify these issues. The ability to realize accurate Gaussian activation functions for neurons in pulsed digital networks using simple hardware with stochastic signals is also analyzed quantitatively.
Published in: IEEE Transactions on Neural Networks ( Volume: 13, Issue: 6, November 2002)
Page(s): 1465 - 1471
Date of Publication: 30 November 2002

PubMed ID: 18244541

I. Introduction

Signals in digital neural networks may be represented by the Bernoulli probabilities of binary random variables. These signals may be estimated by the frequency of 1s or pulses, i.e., by their pulse count distributions, taken over a sampling interval of multiple clock cycles. Signal values may be multiplied using simple logic gates and may be added or weight-averaged using (stochastic) multiplexers. Unlike the binary radix representations of conventional digital signals, the stochastic signals have unary representations, and their estimates are, therefore, relatively insensitive to imperfect pulse detection and noise. These are among the advantages of (nonlinear) stochastic signal processing (SSP), which is a method of reducing the power dissipation and the silicon area of digital circuit implementations of neural networks, while improving their error and fault tolerance and enabling variable-precision computations in fixed hardware.
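To make the arithmetic concrete, the following is a minimal Python sketch of the standard stochastic-computing constructions named above and in the abstract: unipolar multiplication with an AND gate, scaled addition with a multiplexer, and bipolar multiplication with an XNOR gate. The stream length, seed, helper names, and signal values are illustrative assumptions, not the specific circuits of this paper.

import numpy as np

rng = np.random.default_rng(0)

def unipolar_stream(p, n):
    # Bernoulli(p) bit stream of length n: the signal value p in [0, 1]
    # is encoded as the probability of a pulse (a 1) per clock cycle.
    return (rng.random(n) < p).astype(np.uint8)

def estimate(stream):
    # Recover the encoded value as the pulse frequency over the interval.
    return stream.mean()

n = 100_000                 # sampling interval in clock cycles (illustrative)
a, b = 0.6, 0.3             # unipolar signal values

sa, sb = unipolar_stream(a, n), unipolar_stream(b, n)

# Multiplication of independent unipolar streams is a bitwise AND gate:
# P(sa AND sb = 1) = a * b.
print(estimate(sa & sb))                      # ~0.18 = a * b

# Scaled addition is a multiplexer driven by a fair select stream:
# the output copies sa or sb with equal probability, giving (a + b) / 2.
sel = unipolar_stream(0.5, n)
print(estimate(np.where(sel == 1, sa, sb)))   # ~0.45 = (a + b) / 2

# Bipolar coding maps x in [-1, 1] to p = (x + 1) / 2; the product of two
# bipolar signals is then an XNOR gate, decoded by 2 * mean - 1.
x, y = 0.5, -0.4
bx = unipolar_stream((x + 1) / 2, n)
by = unipolar_stream((y + 1) / 2, n)
print(2 * estimate(1 - (bx ^ by)) - 1)        # ~-0.2 = x * y

Because each estimate is the mean of n Bernoulli samples, its standard deviation scales as sqrt(p(1 - p)/n): halving the estimation error quadruples the required sampling interval, which is the precision-versus-processing-time trade-off the abstract sets out to quantify.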

