MILEAGE: An Automated Optimal Clause Search Paradigm for Tsetlin Machines

2022 International Symposium on the Tsetlin Machine (ISTM), 2022

Abstract
The logic-based underpinning of the Tsetlin Machine (TM) offers substantial benefits for compute-efficient inference. The caveat is the TM's large memory footprint, which grows substantially as problem complexity increases. This paper presents a novel automated approach to finding an optimal model complexity, in terms of the number of clauses, for a given application. It proposes a run-time pruning system that interfaces with the TM model between training cycles and identifies inconsequential clauses. These clauses are iteratively removed until accuracy begins to degrade, at which point the next training cycle commences. Our approach retains the important clause propositions developed at a higher model complexity and translates them into a smaller, clause-sparse model while preserving learning efficacy. We validate our approach on MNIST, Fashion-MNIST, Kuzushiji-MNIST and KWS6, demonstrating that accuracy comparable to a large model can be achieved with far fewer clauses, resulting in significantly shorter training time. We demonstrate up to 63% compression in model size for the chosen training parameters.
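The pruning loop described above can be sketched in a few lines. The toy below is a hypothetical illustration, not the paper's implementation: clauses are stood in for by importance scores, and `mock_accuracy` is an assumed placeholder for validation accuracy in which only clauses above a threshold contribute. The actual MILEAGE criterion for ranking clauses, and the TM internals, are not specified in the abstract.

```python
def mock_accuracy(clauses):
    # Placeholder for validation accuracy: clauses with importance
    # >= 0.5 are treated as "consequential"; removing one lowers
    # the score. (Assumed for illustration only.)
    return sum(w for w in clauses if w >= 0.5)

def prune(clauses):
    """Between training cycles, iteratively drop the least important
    clause until accuracy would degrade, then stop."""
    baseline = mock_accuracy(clauses)
    while clauses:
        candidate = min(clauses)          # least consequential clause
        trial = list(clauses)
        trial.remove(candidate)
        if mock_accuracy(trial) < baseline:
            break                         # removal hurts accuracy: stop
        clauses = trial
    return clauses

pruned = prune([0.9, 0.1, 0.7, 0.05, 0.6])
print(pruned)  # -> [0.9, 0.7, 0.6]
```

In this sketch, the two low-importance clauses are removed without affecting the mock accuracy, after which the loop halts and training would resume on the smaller model.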
Keywords
Machine Learning, Clause Pruning, Logic Propositions, Tsetlin Machines, Compression, MNIST