Data-augmentation for graph neural network learning of the relaxed energies of unrelaxed structures

npj Computational Materials (2022)

Abstract
Computational materials discovery has grown in utility over the past decade due to advances in computing power and crystal structure prediction algorithms (CSPA). However, the computational cost of the ab initio calculations required by CSPA limits their utility to small unit cells, reducing the compositional and structural space the algorithms can explore. Past studies have bypassed unneeded ab initio calculations by using machine learning to predict the stability of a material. Specifically, graph neural networks trained on large datasets of relaxed structures display high fidelity in predicting formation energy. Unfortunately, the geometries of structures produced by CSPA deviate from the relaxed state, which leads to poor predictions, hindering the model's ability to filter out unstable materials. To remedy this behavior, we propose a simple, physically motivated, computationally efficient perturbation technique that augments training data, improving predictions on unrelaxed structures by 66%. Finally, we show how this error reduction can accelerate CSPA.
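The augmentation idea described above can be sketched as follows: copy a relaxed structure, perturb its atomic coordinates by small random displacements, and keep the relaxed formation energy as the training label, so the model learns to map unrelaxed-looking geometries to relaxed energies. This is a minimal illustration under assumed details; the displacement distribution, its scale (`sigma`), the number of copies, and the plain-array structure representation are assumptions, not the paper's exact scheme.

```python
import numpy as np

def perturb_structure(positions, sigma=0.1, rng=None):
    """Return a copy of `positions` (an N x 3 array of Cartesian atomic
    coordinates) with i.i.d. Gaussian noise of scale `sigma` added to
    every coordinate. (Gaussian noise is an assumed choice here.)"""
    rng = np.random.default_rng() if rng is None else rng
    return positions + rng.normal(0.0, sigma, size=positions.shape)

def augment(dataset, n_copies=4, sigma=0.1, seed=0):
    """For each (positions, relaxed_energy) pair, emit the original plus
    `n_copies` perturbed geometries that all share the same relaxed-energy
    label, mimicking the unrelaxed geometries CSPA produces."""
    rng = np.random.default_rng(seed)
    augmented = []
    for positions, energy in dataset:
        augmented.append((positions, energy))
        for _ in range(n_copies):
            augmented.append((perturb_structure(positions, sigma, rng), energy))
    return augmented
```

In practice the augmented pairs would be converted to graphs and fed to the GNN in place of (or alongside) the relaxed-only training set; that conversion is framework-specific and omitted here.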
Keywords
relaxed energies, data augmentation, graph neural network learning