A Concept of Spatiotemporal Attractors

Advances in Cognitive Neurodynamics (V), 2016

Abstract
The memory neural network is organized as an attractor space by both bottom-up (sensory) and top-down (contextual) information. This paper presents a possible mechanism for spatiotemporal attractors in a one-layer neural network, based on experimental data and theoretical models of learning and memory. The model rests on the following key concepts. First, a sequence of sensory events (bottom-up information) carried by gamma waves is consolidated in the synaptic weight space by a spatiotemporal learning rule (a local, non-Hebbian rule). In this process, the learning rule plays an important role in the pattern discrimination of spatiotemporal sequences [1, 2]. Second, contextual (top-down) information carried by theta waves is consolidated in the same weight space by a Hebbian learning rule. Integration of the two consolidated synaptic weight spaces forms an attractor with pattern completion, which is defined as a spatiotemporal attractor.
Keywords
Bottom-up, Top-down, Spatiotemporal attractor
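The abstract describes two weight-consolidation processes (a local, non-Hebbian spatiotemporal rule for bottom-up sequences and a Hebbian rule for top-down context) whose integration forms the attractor. Since no equations are given here, the sketch below uses generic textbook forms of both rules purely for illustration; the variable names, the specific update equations, and the additive integration of the two weight matrices are all assumptions, not the authors' model.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 8                      # neurons in the single layer (illustrative size)
w_hebb = np.zeros((n, n))  # weights shaped by top-down (theta) input
w_st = np.zeros((n, n))    # weights shaped by bottom-up (gamma) sequence

def hebb_update(w, pre, post, eta=0.1):
    """Classic Hebbian rule: strengthen weights between co-active pairs."""
    return w + eta * np.outer(post, pre)

def spatiotemporal_update(w, pre_prev, pre_now, eta=0.1):
    """Hypothetical local, non-Hebbian rule: a weight grows when activity
    at time t-1 is followed by activity at time t, so the temporal order
    of the input sequence is written into the weight space."""
    return w + eta * np.outer(pre_now, pre_prev)

# A short bottom-up sequence of binary sensory patterns (one per gamma cycle)
seq = [rng.integers(0, 2, n).astype(float) for _ in range(4)]
for prev, now in zip(seq[:-1], seq[1:]):
    w_st = spatiotemporal_update(w_st, prev, now)

# A top-down contextual pattern consolidated by the Hebbian rule
ctx = rng.integers(0, 2, n).astype(float)
w_hebb = hebb_update(w_hebb, ctx, ctx)

# Integrating the two consolidated weight spaces into one recurrent matrix;
# in the paper's terms, its stable states would play the role of the
# spatiotemporal attractor supporting pattern completion.
w_total = w_hebb + w_st
```

The asymmetric `w_st` term biases recall toward replaying the stored order, while the symmetric Hebbian term pulls noisy cues toward the contextual pattern, which is one common way such an integrated attractor is pictured.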