Understanding the complexity of computational models through optimization and sloppy parameter analyses: The case of the Connectionist Dual-Process Model

Journal of Memory and Language (2024)

Abstract
A major strength of computational cognitive models is their capacity to accurately predict empirical data. However, challenges in understanding how complex models work, together with the risk of overfitting, have often been addressed by trading predictive accuracy for model simplification. Here, we introduce state-of-the-art model analysis techniques to show how a large number of parameters in a cognitive model can be reduced to a smaller set that is simpler to understand and supports more constrained predictions. As a test case, we created different versions of the Connectionist Dual-Process model (CDP) of reading aloud whose parameters were optimized on seven different databases. The results showed that CDP was not overfit and could predict a large amount of variance across those databases. Indeed, the quantitative performance of CDP was higher than that of previous models in this area. Moreover, sloppy parameter analysis, a mathematical technique used to quantify the effects of different parameters on model performance, revealed that many of the parameters in CDP have very little effect on its performance. This shows that the dynamics of CDP are much simpler than its relatively large number of parameters might suggest. Overall, our study shows that cognitive models with large numbers of parameters do not necessarily overfit the empirical data, and that understanding the behavior of complex models becomes more tractable with appropriate mathematical tools. The same techniques could be applied to many other complex cognitive models whenever appropriate datasets for model optimization exist.
Keywords
Reading, Optimization, Sloppy parameters, Computational modelling
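The sloppy parameter analysis mentioned in the abstract rests on a standard idea: compute the Hessian of the fitting cost at the best-fit parameters and inspect its eigenvalue spectrum. Large eigenvalues mark "stiff" parameter directions that tightly constrain model behavior; tiny eigenvalues mark "sloppy" directions that barely affect performance. A minimal illustrative sketch follows, using a toy two-exponential model (a classic sloppy system) rather than CDP itself; the model, parameters, and data here are assumptions for illustration only.

```python
import numpy as np

# Toy "model": y(t) = a*exp(-k1*t) + b*exp(-k2*t).
# Sums of exponentials are a textbook sloppy system: many parameter
# combinations fit the data almost equally well.
def model(theta, t):
    a, b, k1, k2 = theta
    return a * np.exp(-k1 * t) + b * np.exp(-k2 * t)

# Synthetic data from known parameters (the paper instead optimizes CDP
# against seven reading-aloud databases).
t = np.linspace(0.1, 5.0, 50)
theta_best = np.array([1.0, 0.5, 1.0, 0.3])
data = model(theta_best, t)

def cost(theta):
    r = model(theta, t) - data
    return 0.5 * np.dot(r, r)

# Numerical Hessian of the cost at the best-fit point (central differences).
def hessian(f, x, h=1e-4):
    n = len(x)
    H = np.zeros((n, n))
    for i in range(n):
        for j in range(n):
            ei = np.zeros(n); ei[i] = h
            ej = np.zeros(n); ej[j] = h
            H[i, j] = (f(x + ei + ej) - f(x + ei - ej)
                       - f(x - ei + ej) + f(x - ei - ej)) / (4 * h * h)
    return H

H = hessian(cost, theta_best)
eigvals = np.sort(np.abs(np.linalg.eigvalsh(H)))[::-1]
# The spectrum typically spans several orders of magnitude; the tiny
# eigenvalues are the "sloppy" directions that have little effect on fit.
print(eigvals)
print("spread (orders of magnitude):", np.log10(eigvals[0] / eigvals[-1]))
```

In a real analysis the same diagnostic is read off the full model's cost surface: the number of stiff directions, not the raw parameter count, indicates the effective complexity of the model.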