Compute-MLROM: Compute-in-Multi Level Read Only Memory for Energy Efficient Edge AI Inference Engines

ESSCIRC 2023 - IEEE 49th European Solid State Circuits Conference (ESSCIRC), 2023

Abstract
The first-ever Read-Only-Memory (ROM)-based compute-in-memory (CIM) design for energy-efficient, high-density edge AI inference is demonstrated, featuring 1) a Multi-Level Cell (MLC) built from standard via-programmed ROM bitcells; 2) source-line PMOS current driving; 3) auto-scaling diode-connected current-to-voltage (I2V) converters; 4) a 3-b ADC with ROM-based VREF generation; and 5) early termination of the convolution operation for power savings. A 65nm CMOS prototype achieves a state-of-the-art energy efficiency of ~66.21 to 1324.26 TOPS/W and an area efficiency of ~230.2 to 4604.5 GOPS/mm2 on the CIFAR-10 dataset with the ResNet-20 model, advancing the defined FoM by ~2.7X to 3.2X over prior CIM designs.
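
The abstract lists the circuit techniques but does not detail the analog signal chain or the early-termination criterion. The following is a minimal behavioral sketch under stated assumptions: a linearized I2V gain in place of the auto-scaling diode-connected converter, a uniform 3-b quantizer standing in for the ROM-referenced VREF ladder, and a ReLU sign-prediction rule for early termination. All constants and function names here are illustrative, not taken from the paper.

```python
import numpy as np

# Behavioral sketch only (assumed model, not the paper's circuit): one ROM-CIM
# column sums multi-level cell currents on a bitline, converts the current to a
# voltage, and digitizes it with a 3-b ADC. All device values are illustrative.

MLC_LEVELS = 4      # assumed 2-b via-programmed multi-level ROM cell
ADC_BITS = 3        # 3-b ADC, as stated in the abstract
I_UNIT = 1e-6       # assumed per-level cell current (A)
R_I2V = 5e3         # assumed linearized I2V gain (ohm); the real design auto-scales

def cim_column_mac(weights, activations):
    """Analog MAC of one column: weights in {0..MLC_LEVELS-1}, activations in {0,1}."""
    i_bl = I_UNIT * np.sum(weights * activations)   # summed bitline current
    return i_bl * R_I2V                             # diode-connected I2V, linearized

def adc_3b(v_in, v_full_scale):
    """Uniform 3-b quantizer standing in for the ROM-based VREF generation."""
    lsb = v_full_scale / 2**ADC_BITS
    return int(np.clip(v_in // lsb, 0, 2**ADC_BITS - 1))

def conv_with_early_exit(partial_sums, max_abs_partial):
    """Digitally accumulate signed per-tile partial sums; if the running total
    plus the best case of all remaining tiles is still negative, a following
    ReLU must output 0, so the rest of the convolution can be skipped."""
    total = 0
    for k, p in enumerate(partial_sums):
        total += p
        remaining = len(partial_sums) - (k + 1)
        if total + remaining * max_abs_partial < 0:
            return 0, k + 1                         # early termination
    return max(total, 0), len(partial_sums)

# Example: one column MAC followed by 3-b quantization, then an early-exit run.
w = np.random.randint(0, MLC_LEVELS, size=64)
x = np.random.randint(0, 2, size=64)
code = adc_3b(cim_column_mac(w, x), v_full_scale=I_UNIT * R_I2V * 3 * 64)
out, tiles_used = conv_with_early_exit([-30, 5, -4, 2], max_abs_partial=7)
print(code, out, tiles_used)   # the second run stops after the first tile
```

The early-exit rule shown here is one common way such schemes are realized (stop once the ReLU output is provably zero); the paper's actual termination condition and power-gating mechanism may differ.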