A Weight Importance Analysis Technique for Area- and Power-Efficient Binary Weight Neural Network Processor Design

COGNITIVE COMPUTATION (2021)

Abstract
Recently, binary weight neural network (BWNN) processor design has attracted much attention due to its low computational complexity and memory demand. In a BWNN processor, emerging memory technologies such as RRAM can replace conventional SRAM to save area and access power. However, RRAM is prone to bit errors, which degrade classification accuracy. Combining BWNN and RRAM to reduce area overhead and power consumption while maintaining high classification accuracy is therefore a significant research challenge. In this work, we propose an automatic weight importance analysis technique and a mixed weight storage scheme to address this issue. For demonstration, we applied the proposed techniques to two typical BWNNs. The experimental results show that more than 78% (40%) area saving and 57% (30%) power saving can be achieved with less than 1% accuracy loss. The proposed techniques are applicable to resource- and power-constrained neural network processor design and show significant potential for AI-based Internet-of-Things (IoT) devices, which usually have limited computational and storage resources.
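To illustrate the general idea behind weight importance analysis and mixed weight storage (the abstract does not detail the paper's exact procedure), the sketch below scores each binary weight by the loss increase caused by flipping its sign, a proxy for its sensitivity to a worst-case RRAM bit error, and then assigns the most sensitive weights to reliable SRAM while the rest go to dense but error-prone RRAM. The toy classifier, the flip-based importance score, and the sram_fraction split are illustrative assumptions, not values or methods taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data and a single binary-weight linear classifier (illustrative only;
# the paper's actual networks and analysis procedure are not reproduced here).
X = rng.normal(size=(256, 32))
w_real = rng.normal(size=32)
y = (X @ w_real > 0).astype(float)

w_bin = np.sign(rng.normal(size=32))  # binary weights in {-1, +1}

def loss(w):
    # Logistic loss of the binary-weight classifier on the toy data.
    p = 1.0 / (1.0 + np.exp(-(X @ w)))
    eps = 1e-9
    return -np.mean(y * np.log(p + eps) + (1 - y) * np.log(1 - p + eps))

# Weight importance: loss increase when a single binary weight flips sign,
# i.e., the impact of an RRAM bit error on that particular weight.
base = loss(w_bin)
importance = np.empty_like(w_bin)
for i in range(len(w_bin)):
    w_flip = w_bin.copy()
    w_flip[i] = -w_flip[i]
    importance[i] = loss(w_flip) - base

# Mixed weight storage: keep the most error-sensitive weights in reliable
# SRAM and move the remainder to area- and power-efficient RRAM.
sram_fraction = 0.25  # hypothetical split ratio, not a value from the paper
k = int(sram_fraction * len(w_bin))
order = np.argsort(importance)
sram_idx = order[-k:]   # most important weights -> SRAM
rram_idx = order[:-k]   # remaining weights -> RRAM
print(f"SRAM-protected weights: {len(sram_idx)}, RRAM weights: {len(rram_idx)}")
```

Under this kind of split, bit errors in the RRAM partition only affect weights whose sign flips barely change the loss, which is how a mixed scheme can preserve accuracy while most of the storage moves to the cheaper memory.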
Keywords
Neural network processor, Binary weight, RRAM, Area-efficient, Power-efficient