Generalized Key-Value Memory to Flexibly Adjust Redundancy in Memory-Augmented Networks

IEEE Transactions on Neural Networks and Learning Systems (2023)

Abstract
Memory-augmented neural networks enhance a neural network with an external key-value (KV) memory whose complexity is typically dominated by the number of support vectors in the key memory. We propose a generalized KV memory that decouples its dimension from the number of support vectors by introducing a free parameter that can arbitrarily add or remove redundancy to the key memory representation. In effect, it provides an additional degree of freedom to flexibly control the tradeoff between robustness and the resources required to store and compute the generalized KV memory. This is particularly useful for realizing the key memory on in-memory computing hardware where it exploits nonideal, but extremely efficient nonvolatile memory devices for dense storage and computation. Experimental results show that adapting this parameter on demand effectively mitigates up to 44% nonidealities, at equal accuracy and number of devices, without any need for neural network retraining.
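The abstract describes a key-value memory whose key dimension is decoupled from the number of support vectors by a free parameter that adds or removes redundancy. The following minimal numpy sketch illustrates that idea under stated assumptions: it re-encodes the keys into an arbitrarily chosen dimension `d` via a random bipolar projection (an illustrative stand-in for the paper's redundancy parameter, not its exact construction) and reads out with a standard similarity-softmax attention over the values.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy support set: n support vectors, original key dimension m.
n, m, d_value = 5, 16, 8
keys = rng.standard_normal((n, m))
values = rng.standard_normal((n, d_value))

def project_keys(keys, d, rng):
    """Re-encode keys into a d-dimensional space with a random bipolar
    projection. d is a free parameter, independent of n: d > m adds
    redundancy (robustness), d < m removes it (less storage/compute).
    This projection is an assumption for illustration, not the paper's
    exact generalized KV construction."""
    m = keys.shape[1]
    G = rng.choice([-1.0, 1.0], size=(m, d)) / np.sqrt(d)
    return keys @ G, G

def read(memory_keys, values, query, beta=5.0):
    """Soft attention readout: similarities -> softmax -> weighted value sum."""
    sims = memory_keys @ query
    w = np.exp(beta * (sims - sims.max()))
    w /= w.sum()
    return w @ values

# Pick a larger key dimension than the original to add redundancy.
d = 64
proj_keys, G = project_keys(keys, d, rng)
query = keys[2] @ G  # query for support item 2, in the projected space

out = read(proj_keys, values, query)
```

Because the random projection approximately preserves inner products, the projected similarities still peak at the matching support vector, and `d` can be tuned on demand, e.g. enlarged to tolerate noisy in-memory computing hardware, without retraining the network that produced the keys.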
Keywords
Support vector machines,Nonvolatile memory,Computer architecture,Manganese,Task analysis,Organizations,Training,Hyperdimensional computing,in-memory computing,key-value (KV) memory,linear distributed memories with associations,memory-augmented neural networks (MANNs),nonvolatile memory (NVM),phase-change memory (PCM),vector symbolic architectures