Visual-spatial dynamics drive adaptive social learning in immersive environments

bioRxiv (Cold Spring Harbor Laboratory)(2024)

Abstract
Human cognition is distinguished by our ability to adapt to different environments and circumstances. Yet the mechanisms driving adaptive behavior have predominantly been studied in separate asocial and social contexts, with an integrated framework remaining elusive. Here, we use a collective foraging task in a virtual Minecraft environment to unify these two fields, by leveraging automated transcriptions of visual field data combined with high-resolution spatial trajectories. Our behavioral analyses capture both the structure and temporal dynamics of social interactions, which are then directly tested using computational models sequentially predicting each foraging decision. These results reveal that individual performance (rather than social cues) drives adaptation of asocial foraging strategies, while also modulating the influence and selectivity of social learning. These findings not only unify theories across asocial and social domains, but also provide key insights into the adaptability of human decision-making in complex and dynamic social landscapes.

Competing Interest Statement

The authors have declared no competing interest.
Keywords

adaptive social learning, visual-spatial dynamics