An adaptive fitness evolutionary algorithm for sparse large-scale multi-objective optimization problems

Ge Zhang, Ni Wu, Chaonan Shen, Kai Zhang

2022 IEEE Symposium Series on Computational Intelligence (SSCI)

Abstract
Recently, sparse large-scale multi-objective optimization problems have attracted increasing attention from researchers. Unlike general large-scale multi-objective optimization problems, most of the decision variables (Decs) of sparse large-scale multi-objective optimization problems are equal to zero. Among existing sparse large-scale evolutionary algorithms, SparseEA and SparseEA2 dynamically mask some of the real decision variables to zero by maintaining a binary Mask, which accelerates convergence. When determining Mask updates, both SparseEA and SparseEA2 use a static fitness. However, static fitness is constrained by the number of iterations, so it struggles to capture global information. To address this issue, we design an adaptive fitness for Mask updates. Moreover, we scale the number of decision variables flipped per operation with the total number of decision variables, and gradually decrease the flip probability as the number of iterations increases. When crossing the real Decs, we only cross those real Decs whose Mask values agree between the two parents. We conduct experiments on eight benchmark problems and three real-world application problems, and the simulation results show that our algorithm is significantly more efficient than four other existing sparse large-scale multi-objective algorithms.
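The abstract describes a mask-based encoding in which each solution is the element-wise product of real decision variables and a binary Mask, together with an iteration-dependent flip probability and a Mask-restricted crossover. The sketch below is a minimal illustration of these ideas under assumed details: the function names (flip_mask, crossover_dec), the linear decay schedule, and the flip-count ratio are hypothetical choices for illustration, not the authors' implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

def decode(dec, mask):
    """Effective solution: real decision variables masked to zero where Mask is 0."""
    return dec * mask

def flip_mask(mask, gen, max_gen, base_ratio=0.01):
    """Flip a few Mask bits; the flip probability shrinks as generations pass.

    The number of candidate flips grows with the number of decision variables,
    and the per-bit flip probability decays (here, linearly) with the generation
    counter. Both choices are assumptions for this sketch.
    """
    d = mask.size
    n_flip = max(1, int(base_ratio * d))          # more candidates for larger problems
    p_flip = 1.0 - gen / max_gen                  # assumed decay schedule
    child = mask.copy()
    idx = rng.choice(d, size=n_flip, replace=False)
    flip = rng.random(n_flip) < p_flip
    child[idx[flip]] ^= 1
    return child

def crossover_dec(dec1, dec2, mask1, mask2):
    """Uniform crossover of real Decs, restricted to positions where both parents' Masks agree."""
    same = mask1 == mask2
    swap = same & (rng.random(dec1.size) < 0.5)
    c1, c2 = dec1.copy(), dec2.copy()
    c1[swap], c2[swap] = dec2[swap], dec1[swap]
    return c1, c2

if __name__ == "__main__":
    d, max_gen = 100, 50
    dec1, dec2 = rng.uniform(-1, 1, d), rng.uniform(-1, 1, d)
    mask1 = (rng.random(d) < 0.1).astype(int)     # sparse: roughly 10% nonzero
    mask2 = (rng.random(d) < 0.1).astype(int)
    child_mask = flip_mask(mask1, gen=10, max_gen=max_gen)
    c1, c2 = crossover_dec(dec1, dec2, mask1, mask2)
    print("nonzero before/after flip:", mask1.sum(), child_mask.sum())
    print("effective solution sparsity:", np.count_nonzero(decode(c1, mask1)))
```

The key point illustrated is that convergence toward sparse Pareto optimal solutions is driven by evolving the Mask (which variables are nonzero) separately from the real values, with exploration of the Mask tapering off over time.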
Keywords
Large-scale multi-objective optimization, Sparse Pareto optimal solutions, Evolutionary algorithm