Evolutionary Multi-Objective Quantization of Randomization-Based Neural Networks.

2023 IEEE Symposium Series on Computational Intelligence (SSCI), 2023

Abstract
The deployment of Machine Learning models on hardware devices has motivated a notable research activity around different strategies to alleviate their complexity and size. This is the case of neural architecture search or pruning in Deep Learning. This work places its focus on simplifying randomization-based neural networks by discovering fixed-point quantization policies that optimally balance the trade-off between performance and complexity reduction featured by these models. Specifically, we propose a combinatorial formulation of this problem, which we show to be efficiently solvable by multi-objective evolutionary algorithms. A benchmark for time series forecasting with Echo State Networks over 400 datasets reveals that high compression ratios can be achieved at practically admissible levels of performance degradation, showcasing the utility of the proposed problem formulation to deploy reservoir computing models on resource-constrained hardware devices.
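To make the idea of a fixed-point quantization policy concrete, the sketch below quantizes a stand-in reservoir weight matrix to signed fixed-point at several bit widths and reports the resulting rounding error. The function name, bit-width choices, and the random matrix are illustrative assumptions, not the authors' implementation; in the paper, the per-matrix bit widths would be the decision variables of the multi-objective evolutionary search.

```python
import numpy as np

def quantize_fixed_point(w, total_bits, frac_bits):
    """Quantize an array to signed fixed-point with `total_bits` bits,
    `frac_bits` of which are fractional (two's-complement range)."""
    scale = 2.0 ** frac_bits
    qmin = -(2 ** (total_bits - 1))
    qmax = 2 ** (total_bits - 1) - 1
    q = np.clip(np.round(w * scale), qmin, qmax)
    return q / scale

# Stand-in for an Echo State Network reservoir weight matrix (hypothetical).
rng = np.random.default_rng(0)
W = rng.uniform(-1.0, 1.0, size=(100, 100))

for bits in (16, 8, 4):
    Wq = quantize_fixed_point(W, total_bits=bits, frac_bits=bits - 2)
    max_err = np.max(np.abs(W - Wq))
    print(f"{bits}-bit fixed point: max abs error = {max_err:.6f}")
```

Lower bit widths shrink the model (higher compression ratio) but increase quantization error, which is exactly the performance-vs-complexity trade-off the evolutionary algorithm is tasked with balancing.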
Keywords
Randomization-based neural networks, model quantization, multi-objective optimization, fixed-point arithmetic