
Sparse Distributed Memory is a Continual Learner

ICLR 2023

Abstract
Continual learning is a problem for artificial neural networks that their biological counterparts are adept at solving. Building on work using Sparse Distributed Memory (SDM) to connect a core neural circuit with the powerful Transformer model, we create a modified Multi-Layered Perceptron (MLP) that is a strong continual learner. We find that every component of our MLP variant translated from biology is necessary for continual learning. Our solution is also free from any memory replay or task information, and introduces novel methods to train sparse networks that may be broadly applicable.
Keywords
Sparse Distributed Memory, Sparsity, Top-K Activation, Continual Learning, Biologically Inspired