Mental compression of binary sequences in a language of thought

Crossref (2020)

Abstract
The capacity to store information in working memory strongly depends upon the ability to recode the information in a compressed form. Here, we tested the theory that human adults encode binary sequences of stimuli in memory using a recursive compression algorithm, akin to a “language of thought”, that captures nested patterns of repetitions and alternations. In five experiments, we probed memory for auditory or visual sequences using both subjective and objective measures. We used a sequence violation paradigm in which participants detected occasional violations in an otherwise fixed sequence. Both subjective complexity ratings and objective violation detection rates were well predicted by minimal description length (also known as Kolmogorov complexity) in the binary version of the “language of geometry”, a formal language previously found to account for the human encoding of complex spatial sequences. We contrasted the language model with a model based solely on surprise given the stimulus transition probabilities. While both models accounted for variance in the data, the language model dominated over the transition probability model for long sequences (with a number of elements far exceeding the limits of working memory). We used model comparison to show that minimal description length in a recursive language provides a better fit than a variety of previous models of sequence encoding. The data support the hypothesis that, beyond the extraction of statistical knowledge, human sequence coding relies on an internal compression using language-like nested structures.
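To make the contrast between the two classes of models concrete, the sketch below computes two toy complexity measures for a binary sequence: a first-order transition-probability surprise and a crude run-length-based description length. This is an illustrative sketch only, not the authors' models: the actual paper uses the binary version of the "language of geometry" (with nested repetition and alternation primitives) for minimal description length, and the transition-probability estimates here (fitted from the sequence itself) are an assumption for the example. All function names are hypothetical.

```python
# Hypothetical illustration (not the paper's code): two toy complexity measures
# for a binary sequence of 'A'/'B' items. The "description length" below is a
# simplified run-length proxy, standing in for the richer recursive
# "language of geometry" grammar used in the paper.

import math
from collections import Counter


def transition_surprise(seq):
    """Mean surprise (-log2 p) of each item given first-order transition
    probabilities estimated from the sequence itself (an assumption here)."""
    transitions = Counter(zip(seq, seq[1:]))
    totals = Counter(seq[:-1])
    surprises = []
    for prev, nxt in zip(seq, seq[1:]):
        p = transitions[(prev, nxt)] / totals[prev]
        surprises.append(-math.log2(p))
    return sum(surprises) / len(surprises)


def toy_description_length(seq):
    """Crude compressed-description length: cost of encoding maximal runs of
    identical items, a rough proxy for pattern compressibility."""
    runs = []
    current, count = seq[0], 1
    for item in seq[1:]:
        if item == current:
            count += 1
        else:
            runs.append((current, count))
            current, count = item, 1
    runs.append((current, count))
    # Cost per run: 1 symbol plus log2 of the repetition count.
    return sum(1 + math.log2(n) for _, n in runs)


if __name__ == "__main__":
    regular = list("AAAABBBBAAAABBBB")    # repeated blocks -> shorter toy description
    irregular = list("ABBBABAABABBBAAB")  # weaker structure -> longer toy description
    for s in (regular, irregular):
        print("".join(s),
              f"surprise={transition_surprise(s):.2f}",
              f"toy MDL={toy_description_length(s):.1f}")
```

In this toy setting, the regular sequence receives a shorter description than the irregular one of the same length, mirroring the abstract's claim that compressibility in a structured code, rather than transition statistics alone, tracks how well long sequences are remembered.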