
Memory degradation induced by attention in recurrent neural architectures

Neurocomputing (2022)

Abstract
• The work presents an empirical analysis of memory degradation in recurrent neural networks (RNNs).
• Attention-based architectures tend not to use the RNN memory.
• Direct input attention allows the RNN to work without attention interference.
• The conjectures were tested on eight different problems.
Key words
Long short-term memory networks, Attention mechanisms, Recurrence, Gate activations, Forget gate
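
The highlights contrast attention computed over the recurrent hidden states with attention applied directly to the inputs. The paper's own implementation is not reproduced here; the sketch below is a minimal, assumed PyTorch illustration of the two wirings, with class names, the dot-product and linear scoring forms, and all sizes chosen purely for demonstration.

```python
import torch
import torch.nn as nn


class HiddenStateAttentionLSTM(nn.Module):
    """LSTM followed by dot-product attention over its hidden states.

    Illustrative only: in this wiring the attention context, not the final
    cell state, feeds the classifier, which is the arrangement the highlights
    suggest leaves the LSTM memory and forget gate underused.
    """

    def __init__(self, input_size: int, hidden_size: int, num_classes: int):
        super().__init__()
        self.lstm = nn.LSTM(input_size, hidden_size, batch_first=True)
        self.query = nn.Parameter(torch.randn(hidden_size))
        self.out = nn.Linear(hidden_size, num_classes)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        h, _ = self.lstm(x)                             # (batch, time, hidden)
        scores = torch.softmax(h @ self.query, dim=1)   # attention over time
        context = (scores.unsqueeze(-1) * h).sum(dim=1)
        return self.out(context)


class DirectInputAttentionLSTM(nn.Module):
    """Attention applied directly to the inputs, then fed to the LSTM.

    Illustrative only: here attention merely reweights the inputs, so the
    LSTM must still carry information across time itself -- one plausible
    reading of the "direct input attention" named in the highlights.
    """

    def __init__(self, input_size: int, hidden_size: int, num_classes: int):
        super().__init__()
        self.score = nn.Linear(input_size, 1)
        self.lstm = nn.LSTM(input_size, hidden_size, batch_first=True)
        self.out = nn.Linear(hidden_size, num_classes)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        weights = torch.softmax(self.score(x), dim=1)   # (batch, time, 1)
        h, _ = self.lstm(weights * x)                   # reweighted inputs
        return self.out(h[:, -1])                       # last hidden state


if __name__ == "__main__":
    x = torch.randn(4, 20, 8)  # 4 sequences, 20 steps, 8 features
    for model in (HiddenStateAttentionLSTM(8, 16, 3),
                  DirectInputAttentionLSTM(8, 16, 3)):
        print(model.__class__.__name__, model(x).shape)
```

In the first model the recurrence can be largely bypassed by the attention readout; in the second, attention only gates the inputs, so any long-range dependence still has to pass through the LSTM's cell state and forget gate, which is the distinction the abstract's conjectures address.
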