Enhanced Kriging leave-one-out cross-validation in improving model estimation and optimization

Computer Methods in Applied Mechanics and Engineering (2023)

Abstract
Leave-one-out cross-validation (LOOCV) is a widely used technique in model estimation and selection of the Kriging surrogate model for engineering problems, such as structural optimization and reliability analysis. However, the traditional LOOCV method has some disadvantages in terms of accuracy and efficiency. This paper proposes an enhanced-LOOCV method that incorporates hyperparameters from the Kriging model based on the complete training dataset (i.e., the complete Kriging model) into the LOOCV error calculation. By keeping the model hyperparameters in LOOCV consistent with the complete Kriging model, it reduces the number of hyperparameter optimizations and significantly increases the accuracy and efficiency of the LOOCV process. Additionally, a decremental calculation is proposed to reduce the computational cost of the correlation matrix inversion without sacrificing accuracy, improving the time complexity of the traditional LOOCV from O(n^4) to O(n^3). Experimental results with thirty test functions verify that the enhanced-LOOCV has better estimation performance than the Kriging model with significantly higher efficiency compared to the traditional LOOCV. Numerical experiments and an engineering case in optimization demonstrate that the enhanced-LOOCV can reduce the number of infill samples needed by the Kriging model, and it is more suitable for expensive optimizations in engineering. © 2023 Elsevier B.V. All rights reserved.
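
The efficiency argument in the abstract, reusing the hyperparameters of the complete Kriging model so that all leave-one-out errors follow from a single factorization of the correlation matrix, can be illustrated with the standard closed-form LOO identity for a zero-mean (simple) Kriging model. The sketch below is an assumption-laden illustration, not the paper's implementation: the function names, the Gaussian correlation choice, and the use of the identity e_i = (R^-1 y)_i / (R^-1)_ii in place of the paper's decremental matrix update are illustrative assumptions. It only shows how holding the hyperparameters fixed turns the n LOO errors into one O(n^3) computation instead of n separate model refits.

import numpy as np

def gaussian_corr(X, theta):
    # Gaussian correlation: R[i, j] = exp(-sum_k theta_k * (x_ik - x_jk)^2)
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2 * theta).sum(axis=-1)
    return np.exp(-d2)

def loo_residuals_fixed_theta(X, y, theta, nugget=1e-10):
    # Leave-one-out residuals of a zero-mean (simple) Kriging model with the
    # hyperparameters theta held fixed at the values fitted on the full data.
    # Closed-form identity: e_i = (R^-1 y)_i / (R^-1)_ii, so all n residuals
    # cost one O(n^3) inversion instead of n model refits.
    n = len(y)
    R = gaussian_corr(X, np.asarray(theta, dtype=float)) + nugget * np.eye(n)
    Rinv = np.linalg.inv(R)
    return (Rinv @ y) / np.diag(Rinv)

# Toy usage (hypothetical 1-D test function, not from the paper)
rng = np.random.default_rng(0)
X = rng.uniform(0.0, 1.0, size=(20, 1))
y = np.sin(6.0 * X[:, 0])
e_loo = loo_residuals_fixed_theta(X, y, theta=[10.0])
print("LOOCV RMSE estimate:", float(np.sqrt(np.mean(e_loo ** 2))))

Under these assumptions the printed RMSE plays the role of the LOOCV error estimate; the paper's decremental update of the correlation-matrix inverse targets the same overall O(n^3) cost without repeating full inversions for each left-out sample.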
Keywords
Kriging surrogate model, Leave-one-out cross-validation, Decremental calculation, Expensive optimization, Structural optimization