Comparing the cognitive load of gesture and action production: a dual-task study

Language and Cognition (2023)

Abstract
Speech-accompanying gestures have been shown to reduce cognitive load on a secondary task compared to speaking without gestures. In the current study, we investigate whether this benefit of speech-accompanying gestures is shared by speech-accompanying actions (i.e., movements that leave a lasting trace in the physical world). In two experiments, participants attempted to retain verbal and spatial information from a grid while describing a pattern, either gesturing, making the pattern, or keeping their hands still as they spoke. Producing gestures reduced verbal load compared to keeping hands still when the pattern being described was visually present (Experiment 1), a benefit that was not shared by making the pattern. However, when the pattern being described was not visually present (Experiment 2), making the pattern reduced verbal load compared to keeping hands still. Neither experiment revealed a significant difference between gesture and action. Taken together, the findings suggest that moving the hands in meaningful ways can benefit verbal load.
Keywords
cognitive load, action production, gesture, dual-task