Modular grammatical evolution for the generation of artificial neural networks (Hot-off-the-Press track at GECCO 2022)

Annual Conference on Genetic and Evolutionary Computation (GECCO), 2022

Abstract
This paper proposes a NeuroEvolution algorithm, Modular Grammatical Evolution (MGE), that evolves both the topology and the weights of neural networks for challenging classification benchmarks such as MNIST and Letter, which have 10 and 26 classes, respectively. The success of MGE is mainly due to (1) restricting the solution space to regular network topologies with a special form of modularity, and (2) improving the search properties of state-of-the-art GE methods through higher mapping locality and better representation scalability. We define and evaluate five forms of structural constraint and observe that restricting the solution space to single-layer modules helps in finding smaller and more efficient neural networks faster. Our experimental evaluations on ten well-known classification benchmarks demonstrate that MGE-generated neural networks achieve better classification accuracy than other NeuroEvolution methods. Finally, our experimental results indicate that MGE outperforms other GE methods in terms of locality and scalability.
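To illustrate the grammatical-evolution idea the abstract refers to, the sketch below shows a toy genotype-to-phenotype mapping in which an integer genome selects grammar productions to build a description of single-layer modules. This is a minimal illustrative sketch, not the authors' MGE implementation: the grammar rules, codon handling, and module encoding here are simplified assumptions for exposition only.

```python
# Toy grammatical-evolution mapping (illustrative only, not the MGE algorithm):
# integer codons choose productions from a small grammar that describes a
# network as a sequence of single-layer modules (neuron count + activation).
import random

GRAMMAR = {
    "<network>": [["<module>"], ["<module>", "<network>"]],
    "<module>": [["<neurons>", "<activation>"]],
    "<neurons>": [["4"], ["8"], ["16"]],
    "<activation>": [["relu"], ["tanh"], ["sigmoid"]],
}


def map_genotype(genome, symbol="<network>", max_depth=10):
    """Expand `symbol` by consuming codons from `genome` (wrapping around)."""
    output, index = [], 0

    def expand(sym, depth):
        nonlocal index
        if sym not in GRAMMAR:  # terminal symbol: emit it directly
            output.append(sym)
            return
        rules = GRAMMAR[sym]
        # Near the depth limit, force the first (non-recursive) production.
        choice = 0 if depth >= max_depth else genome[index % len(genome)] % len(rules)
        index += 1
        for s in rules[choice]:
            expand(s, depth + 1)

    expand(symbol, 0)
    return output


if __name__ == "__main__":
    random.seed(0)
    genome = [random.randint(0, 255) for _ in range(20)]  # integer codons
    # Example output: ['8', 'relu', '16', 'tanh'] -> two single-layer modules
    print(map_genotype(genome))
```

In this simplified view, the modular restriction corresponds to the grammar only producing sequences of single-layer blocks; mapping locality and scalability concern how small genome changes and longer genomes affect the derived phenotype.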
Keywords
Modular Grammatical Evolution, NeuroEvolution, Representation, Evolutionary Learning