EMAT: an efficient multi-task architecture for transfer learning using ReRAM.
ICCAD (2018)
Abstract
Transfer learning has recently demonstrated great success in general supervised learning by mitigating expensive training efforts. However, existing neural network accelerators have proven inefficient at executing transfer learning because they fail to accommodate the layer-wise heterogeneity in computation and memory requirements. In this work, we propose EMAT, an efficient multi-task architecture for transfer learning built on resistive memory (ReRAM) technology. EMAT exploits the energy efficiency of ReRAM arrays for matrix-vector multiplication and realizes a hierarchical, reconfigurable design with heterogeneous computation components to accommodate the data patterns in transfer learning. Compared to a GPU platform, EMAT achieves an average 120× performance speedup and 87× energy saving. EMAT also obtains a 2.5× speedup over the state-of-the-art CMOS accelerator.
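The abstract's key primitive is matrix-vector multiplication performed inside ReRAM crossbar arrays. A minimal numerical sketch of that idea follows; the array sizes and function name are hypothetical illustrations, not EMAT's actual design. Each weight is stored as a cell conductance G[i][j]; driving input voltages V on the rows produces column currents I = Gᵀ·V by Ohm's and Kirchhoff's laws, so one MVM completes in a single analog step.

```python
import numpy as np

def crossbar_mvm(conductance, voltages):
    """Ideal ReRAM crossbar: column current I[j] = sum_i G[i][j] * V[i]."""
    return conductance.T @ voltages

# Hypothetical 3x2 crossbar: 3 input rows, 2 output columns.
G = np.array([[0.1, 0.2],
              [0.3, 0.4],
              [0.5, 0.6]])          # cell conductances (the stored weights)
V = np.array([1.0, 0.5, 0.2])      # input voltages applied to the rows

I = crossbar_mvm(G, V)             # analog output currents, one per column
# → [0.35, 0.52]
```

In a real device the currents would then be digitized by ADCs; this sketch only captures the analog compute step that makes ReRAM arrays attractive for energy-efficient inference.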
Keywords
ReRAM, transfer learning, accelerator