Learning Simpler Language Models with the Differential State Framework.
Neural Computation (2017)
Abstract
Learning useful information across long time lags is a critical and difficult problem for temporal neural models in tasks such as language modeling. Existing architectures that address the issue are often complex and costly to train. The differential state framework (DSF) is a simple and high-performing design that unifies previously introduced gated neural models. DSF models maintain longer-term ...
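The abstract is cut off here. As an illustration of the kind of gated state-mixing a DSF-style model performs, the sketch below shows a minimal interpolation-gated recurrent step in NumPy. This is a hypothetical simplification for intuition only, not the paper's exact equations: a candidate ("fast") state `z` is blended with the previous ("slow") state `h_prev` through an interpolation gate `r`, so information can persist across time steps when `r` is close to 1.

```python
import numpy as np

def delta_step(x, h_prev, W, V, b, r):
    """One step of a simple interpolation-gated recurrent cell.

    Hypothetical DSF-style sketch: compute a candidate state z from the
    current input and previous state, then mix it with the previous state
    through a fixed interpolation gate r in [0, 1].
    """
    z = np.tanh(W @ x + V @ h_prev + b)   # candidate ("fast") state
    return (1.0 - r) * z + r * h_prev     # gated mix with the slow state

# Tiny usage example with random weights (illustrative only).
rng = np.random.default_rng(0)
n_in, n_hid = 4, 8
W = rng.standard_normal((n_hid, n_in)) * 0.1
V = rng.standard_normal((n_hid, n_hid)) * 0.1
b = np.zeros(n_hid)

h = np.zeros(n_hid)
for t in range(5):
    h = delta_step(rng.standard_normal(n_in), h, W, V, b, r=0.5)
print(h.shape)  # (8,)
```

Because `z` is bounded by `tanh` and the update is a convex combination, the state stays bounded; a larger `r` makes the slow state decay more gently, which is one simple way to carry information over longer time lags.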