Pipelined Memristive Analog-to-Digital Converter With Self-Adaptive Weight Tuning

IEEE Journal on Emerging and Selected Topics in Circuits and Systems (2022)

Abstract
Benefiting from their area and power efficiency, memristors enable neural-network analog-to-digital converters (ADCs) that break through the limitations of conventional ADCs. Although several memristive ADC (mADC) architectures have been proposed recently, this research is still at an early stage: it relies mainly on simulation and requires numerous target labels to train the synaptic weights. In this paper, we propose a pipelined Hopfield neural network mADC architecture and experimentally demonstrate that such an mADC is capable of self-adaptive weight tuning. The proposed training algorithm is an unsupervised method derived from the random weight change (RWC) algorithm, modified to reduce the complexity of the error-feedback circuit and make it more hardware friendly. The synapse matrix can be mapped onto a 1T1R crossbar array. In simulation, an 8-bit two-stage pipelined mADC with the proposed architecture achieves a 7.69 fJ/conv FOM, 7.90 ENOB, 0.1 LSB INL, and 0.1 LSB DNL. The experimental performance reaches 1.56 pJ/conv FOM, 7.59 ENOB, 0.21 LSB INL, and 0.29 LSB DNL, limited mainly by the comparator's switching time.
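The abstract does not give the exact update rule of the modified RWC training, so the following is only a minimal sketch of the generic random weight change idea it builds on: keep perturbing the weights in the same random direction while the conversion error keeps falling, and draw a new random direction otherwise. The function names and the scalar error_fn are hypothetical placeholders, not the paper's circuit-level implementation.

```python
import numpy as np

def rwc_train(error_fn, w, delta=0.01, steps=1000, rng=None):
    """Generic random weight change (RWC) sketch.

    error_fn(w): assumed to return a scalar conversion error for weights w
                 (e.g., measured from the mADC output code vs. the input).
    w:           initial synaptic weight array.
    delta:       fixed magnitude of each random weight perturbation.
    """
    rng = np.random.default_rng() if rng is None else rng
    # Start with a random +/-delta perturbation per weight.
    dw = delta * rng.choice([-1.0, 1.0], size=w.shape)
    e_prev = error_fn(w)
    for _ in range(steps):
        w = w + dw                 # apply the current perturbation
        e = error_fn(w)
        if e >= e_prev:
            # Error did not improve: pick a fresh random direction.
            dw = delta * rng.choice([-1.0, 1.0], size=w.shape)
        # Otherwise keep the same direction and continue descending.
        e_prev = e
    return w
```

Because the update needs only the sign of the error change, an error-feedback circuit for this rule can be simple, which is consistent with the hardware-friendliness argument in the abstract.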
Keywords
Memristor, Hopfield neural network, analog-to-digital converter, random weight change, gradient descent algorithm, edge computing