Low Cost Hybrid Spin-CMOS Compressor for Stochastic Neural Networks

Proceedings of the 2019 on Great Lakes Symposium on VLSI (2019)

Abstract
With the expansion of neural network (NN) applications, lowering their hardware implementation cost has become an urgent task, especially in back-end applications where the power supply is limited. Stochastic computing (SC) is a promising approach to realizing low-cost hardware designs. The implementation of matrix multiplication has been a bottleneck in previous stochastic neural networks (SC-NNs). In this paper, we introduce spintronic components into the design of SC-NNs. A novel spin-CMOS matrix multiplier is proposed in which the stochastic multiplications are performed by CMOS AND gates, while the sum of products is implemented by spintronic compressor gates. The experimental results indicate that, compared with conventional binary implementations, the proposed hybrid spin-CMOS architecture achieves over 125x, 4.5x, and 43x reductions in power, energy, and area, respectively. Moreover, compared with previous CMOS-based SC-NNs, our design reduces power by 3.1x-7.3x, energy consumption by 3.1x-7.3x, and area by 1.4x-7.6x while maintaining similar recognition rates.
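
The abstract describes a unipolar stochastic-computing multiply-accumulate scheme: each operand in [0, 1] is encoded as a bitstream whose density of 1s equals its value, an AND gate applied to two independent streams yields a stream whose density approximates the product, and a compressor tree accumulates the 1s of the product streams to form the sum of products. The Python sketch below is a rough behavioral illustration of that idea only, not the paper's spin-CMOS hardware; the function names to_bitstream and sc_dot_product are hypothetical.

```python
# Behavioral sketch of unipolar stochastic computing (illustration only,
# not the spin-CMOS design from the paper).
import random

def to_bitstream(x, length, rng):
    """Encode x in [0, 1] as a random bitstream whose density of 1s is ~x."""
    return [1 if rng.random() < x else 0 for _ in range(length)]

def sc_dot_product(xs, ws, length=1024, seed=0):
    """Approximate sum_i xs[i]*ws[i] with AND-gate multiplies and bit counting.

    The AND of two independent bitstreams has a 1-density of about x*w
    (the SC multiply); counting the 1s across all product streams stands in
    for the compressor tree that accumulates partial sums in hardware.
    """
    rng = random.Random(seed)
    total_ones = 0
    for x, w in zip(xs, ws):
        xb = to_bitstream(x, length, rng)
        wb = to_bitstream(w, length, rng)
        product_bits = [a & b for a, b in zip(xb, wb)]  # one AND gate per bit
        total_ones += sum(product_bits)                 # compressor-style count
    return total_ones / length

if __name__ == "__main__":
    xs = [0.2, 0.5, 0.8]
    ws = [0.9, 0.4, 0.1]
    print("exact:", sum(x * w for x, w in zip(xs, ws)))
    print("stochastic estimate:", sc_dot_product(xs, ws))
```

Longer bitstreams reduce the estimation error at the cost of latency, which is the usual accuracy/cost trade-off in stochastic computing.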
Key words
compressor, neural network, spintronic, stochastic computing