Hybrid Optimization Method Based on Coupling Local Gradient Information and Global Evolution Mechanism

Mathematics (2024)

Abstract
Multi-objective evolutionary algorithms (MOEAs) have attracted much attention because of their good global exploration ability; however, their local search ability near the optimum is weak, and for large-scale decision-variable optimization problems the population size and number of iterations they require are very large, so their optimization efficiency is low. Gradient-based optimization algorithms can overcome these difficulties well, but gradient search methods are difficult to apply directly to multi-objective optimization problems (MOPs). To this end, this paper introduces a stochastic weighting function based on the weighted average gradient and proposes two multi-objective stochastic gradient operators. Building on these, two efficient evolutionary algorithms, MOGBA and HMOEA, are developed. By applying different offspring update strategies to different subpopulations, their local search capability is greatly enhanced while their good global exploration capability is retained. Numerical experiments show that HMOEA captures Pareto fronts of various shapes well and readily solves multi-objective optimization problems with many objectives, improving efficiency by a factor of 5 to 10 compared with typical multi-objective evolutionary algorithms. HMOEA is further applied to the multi-objective aerodynamic optimization design of the RAE2822 airfoil, where the ideal Pareto front is obtained, indicating that HMOEA is an efficient optimization algorithm with potential applications in aerodynamic design.
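The abstract describes combining per-objective gradient information through a stochastic weighting of the weighted average gradient. The paper's actual operators (and MOGBA/HMOEA themselves) are not reproduced here; the snippet below is only a minimal sketch of the general idea, assuming a hypothetical setup in which each objective exposes a gradient callable and the weights are drawn uniformly from the probability simplex.

```python
import numpy as np

def stochastic_weighted_gradient_step(x, objective_grads, step_size=0.01, rng=None):
    """One descent step along a randomly weighted average of objective gradients.

    Illustrative sketch only: `objective_grads` is assumed to be a list of
    callables, one per objective, each returning the gradient at x. The
    paper's stochastic weighting function may differ from the Dirichlet
    (uniform-simplex) sampling used here.
    """
    rng = np.random.default_rng() if rng is None else rng
    m = len(objective_grads)
    # Random convex weights over the m objectives (uniform on the simplex).
    w = rng.dirichlet(np.ones(m))
    # Weighted-average gradient: the randomized weights let repeated steps
    # spread search over different trade-off directions on the Pareto front.
    g = sum(wi * grad(x) for wi, grad in zip(w, objective_grads))
    return x - step_size * g

# Toy bi-objective usage (assumed example, not from the paper):
# f1(x) = ||x||^2, f2(x) = ||x - 1||^2, whose Pareto set is the segment [0, 1].
grads = [lambda x: 2 * x, lambda x: 2 * (x - 1.0)]
x = np.array([0.8, 0.3])
for _ in range(100):
    x = stochastic_weighted_gradient_step(x, grads, step_size=0.05)
```

In a hybrid scheme such as the one the abstract outlines, a step of this kind would serve as a local refinement operator for one subpopulation, while other subpopulations keep standard evolutionary variation for global exploration.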
Key words
aerodynamic optimization, multi-objective optimization, evolutionary algorithm, gradient optimization