Improving The State Space Organization Of Untrained Recurrent Networks

ICONIP'08: Proceedings of the 15th International Conference on Advances in Neuro-Information Processing, Volume Part I (2009)

Abstract
Recurrent neural networks are frequently used in the cognitive science community for modeling linguistic structures. A more or less intensive training process is usually performed, but several works have shown that untrained recurrent networks initialized with small weights can also be used successfully for this type of task. In this work we demonstrate that the state space organization of an untrained recurrent neural network can be significantly improved by choosing appropriate input representations. We support this claim experimentally on several linguistic time series.
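The setup described in the abstract can be sketched as follows. This is a minimal illustration, not the authors' implementation: it assumes an Elman-style recurrent network left untrained, with small random recurrent weights and one-hot input representations (all names, sizes, and weight ranges below are illustrative assumptions).

```python
import numpy as np

rng = np.random.default_rng(0)
N_SYMBOLS, N_HIDDEN = 4, 16

# Randomly initialized, never trained; the recurrent weights are kept
# small so the network dynamics are contractive.
W_in = rng.uniform(-0.5, 0.5, size=(N_HIDDEN, N_SYMBOLS))
W_rec = rng.uniform(-0.1, 0.1, size=(N_HIDDEN, N_HIDDEN))

def final_state(sequence):
    """Drive the untrained network with a symbol sequence; return the last state."""
    h = np.zeros(N_HIDDEN)
    for s in sequence:
        x = np.eye(N_SYMBOLS)[s]           # one-hot input representation
        h = np.tanh(W_in @ x + W_rec @ h)  # standard Elman update, no training
    return h

# With small recurrent weights, sequences sharing a recent suffix end up
# close together: the state space organizes itself by recent history.
d_short = np.linalg.norm(final_state([0, 1, 2]) - final_state([3, 3, 2]))
d_long = np.linalg.norm(final_state([0, 1, 2, 2]) - final_state([3, 3, 2, 2]))
print(d_long < d_short)  # the longer shared suffix pulls the states together
```

Because the recurrent weight matrix has spectral norm well below one and `tanh` is 1-Lipschitz, each shared input symbol contracts the distance between the two trajectories, which is the Markovian state-space organization that such untrained networks exhibit; choosing a different input representation changes how strongly inputs separate these clusters.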
Keywords
recurrent neural network,untrained recurrent network,untrained recurrent neural network,linguistic structure,linguistic time series,appropriate input representation,cognitive science community,intensive training process,small weight,state space organization