
Simplified Hardware Implementation of the Softmax Activation Function



Abstract:

In this paper, a simplified hardware implementation of a CNN softmax layer is proposed. Initially, the softmax activation function is analyzed in terms of the required accuracy, and certain optimizations are proposed. Subsequently, the proposed hardware architecture is evaluated in terms of the introduced approximation error. Finally, the proposed circuits are synthesized in a 90-nm, 1.0 V CMOS standard-cell library using Synopsys Design Compiler. Comparisons reveal significant reductions in the area × delay product over prior art, up to 47% and 43% in certain cases. The area savings are achieved with no performance penalty.
Date of Conference: 13-15 May 2019
Date Added to IEEE Xplore: 20 June 2019
Conference Location: Thessaloniki, Greece
Electrical and Computer Engineering Dept., University of Patras, Patras, Greece
Electrical and Computer Engineering Dept., University of Patras, Patras, Greece

I. Introduction

Deep Neural Networks (DNNs) have emerged as a means to tackle complex problems such as image classification and speech recognition. Their achievements are attributed to the availability of big data, access to enormous computational power, and the introduction of novel algorithms that make training and inference efficient [1].
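
Since the paper centers on the softmax layer, the standard definition of the function is worth recalling; the max-subtracted rewrite shown alongside it is a commonly used range-reduction step in fixed-point implementations and is given here only as background, not as the specific optimization proposed by the authors.

% Softmax over an N-element input vector z, and its max-subtracted form,
% which keeps every exponent argument non-positive:
\[
  \sigma(z)_i \;=\; \frac{e^{z_i}}{\sum_{j=1}^{N} e^{z_j}}
  \;=\; \frac{e^{z_i - z_{\max}}}{\sum_{j=1}^{N} e^{z_j - z_{\max}}},
  \qquad z_{\max} = \max_{1 \le j \le N} z_j .
\]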

