Generative Multiform Bayesian Optimization.

IEEE Transactions on Cybernetics (2023)

Abstract
Many real-world problems, such as airfoil design, involve optimizing an expensive black-box objective function over a complex-structured input space (e.g., a discrete or non-Euclidean space). By mapping the complex-structured input space into a latent space of dozens of variables, a two-stage procedure referred to in this article as generative model-based optimization (GMO) shows promise for solving such problems. However, the latent dimension of GMO is hard to determine, which can create a conflict between solution accuracy and convergence rate. To address this issue, we propose a multiform GMO approach, named generative multiform optimization (GMFoO), which conducts optimization over multiple latent spaces simultaneously so that they complement each other. More specifically, we devise a generative model that promotes a positive correlation between latent spaces to facilitate effective knowledge transfer in GMFoO. Furthermore, using Bayesian optimization (BO) as the optimizer, we propose two strategies for continuously exchanging information between these latent spaces. Experimental results on airfoil and corbel design problems, as well as an area maximization problem, demonstrate that the proposed GMFoO converges to better designs within a limited computational budget.
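The core GMO idea, optimizing an expensive black-box objective through a learned latent space with BO, can be sketched as below. This is a minimal illustration under assumptions, not the authors' implementation: `decode` and `objective` are hypothetical stand-ins for a trained generative model's decoder and the expensive simulator, and the loop is a generic single-latent-space BO with expected improvement, not the paper's multiform exchange strategies.

```python
# Minimal sketch of latent-space Bayesian optimization (GMO-style), assuming a
# hypothetical decoder and objective. Not the GMFoO algorithm from the paper.
import numpy as np
from scipy.stats import norm
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import Matern

def decode(z):
    """Hypothetical decoder: maps a latent vector to a design (identity here)."""
    return z

def objective(design):
    """Hypothetical expensive black-box objective to maximize."""
    return -np.sum((design - 0.3) ** 2)

def expected_improvement(gp, Z_cand, y_best):
    """Standard EI acquisition over a pool of candidate latent points."""
    mu, sigma = gp.predict(Z_cand, return_std=True)
    sigma = np.maximum(sigma, 1e-9)
    gamma = (mu - y_best) / sigma
    return sigma * (gamma * norm.cdf(gamma) + norm.pdf(gamma))

def latent_space_bo(dim=4, n_init=5, n_iter=20, seed=0):
    rng = np.random.default_rng(seed)
    Z = rng.uniform(-1.0, 1.0, size=(n_init, dim))      # initial latent samples
    y = np.array([objective(decode(z)) for z in Z])     # expensive evaluations
    gp = GaussianProcessRegressor(kernel=Matern(nu=2.5), normalize_y=True)
    for _ in range(n_iter):
        gp.fit(Z, y)                                    # refit the GP surrogate
        cand = rng.uniform(-1.0, 1.0, size=(256, dim))  # random candidate pool
        z_next = cand[np.argmax(expected_improvement(gp, cand, y.max()))]
        y_next = objective(decode(z_next))              # evaluate the decoded design
        Z = np.vstack([Z, z_next])
        y = np.append(y, y_next)
    return decode(Z[np.argmax(y)]), y.max()

if __name__ == "__main__":
    best_design, best_value = latent_space_bo()
    print("best value:", best_value)
```

In the multiform setting described in the abstract, several such loops would run over latent spaces of different dimensions and exchange information during optimization; the specific exchange strategies are detailed in the paper itself.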
Keywords
Optimization, Training, Convergence, Task analysis, Linear programming, Bayes methods, Generators, Bayesian optimization (BO), generative model-based optimization (GMO), multiform optimization (MFoO), transfer optimization (TO)