Generalization of graph-based active learning relaxation strategies across materials

Machine Learning: Science and Technology (2024)

Abstract
Although density functional theory (DFT) has helped accelerate the discovery of new materials, such calculations are computationally expensive, especially for high-throughput efforts. This has prompted extensive exploration of machine learning (ML)-assisted techniques to improve the computational efficiency of DFT. In this study, we present a comprehensive investigation of the broader application of Finetuna, an active learning framework that accelerates structural relaxations in DFT using prior information from graph neural networks pretrained on the Open Catalyst Project dataset. We explore the challenges associated with out-of-domain systems: larger adsorbates such as alcohols (C2 and larger) on metal surfaces, metal oxides with spin polarization, and three-dimensional (3D) structures such as zeolites and metal-organic frameworks. By pre-training ML models on large datasets and fine-tuning the model on the fly during the simulation, we demonstrate the framework's ability to conduct relaxations with fewer DFT calculations. A more conservative querying strategy is applied when the test systems are less similar to the training systems. Our best-performing Finetuna strategy reduces the number of DFT single-point calculations by 80% for alcohols and 3D structures, and 42% for oxide systems.
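The workflow the abstract describes, relaxing with an ML surrogate, querying a DFT single point whenever the surrogate looks unreliable, and fine-tuning the surrogate on the fly, can be sketched in a few lines. The following is a minimal toy illustration under stated assumptions, not the actual Finetuna API: the 1-D analytic "DFT" potential, the quadratic Surrogate class, the query_tol disagreement threshold, and the steepest-descent step are all hypothetical stand-ins chosen only to show the query-and-finetune loop.

```python
import numpy as np

# Toy stand-in for an expensive DFT single point (here a cheap 1-D potential).
def dft_energy_and_forces(x):
    energy = (x - 1.0) ** 2 + 0.1 * np.sin(5 * x)
    force = -(2 * (x - 1.0) + 0.5 * np.cos(5 * x))
    return energy, force

class Surrogate:
    """Toy stand-in for a pretrained ML potential fine-tuned on queried DFT data."""
    def __init__(self):
        self.xs, self.es, self.coef = [], [], None

    def finetune(self, x, e):
        # "Fine-tuning" here is just refitting a low-order polynomial
        # to every DFT point collected so far.
        self.xs.append(x)
        self.es.append(e)
        if len(self.xs) >= 2:
            deg = min(2, len(self.xs) - 1)
            self.coef = np.polyfit(self.xs, self.es, deg)

    def force(self, x):
        if self.coef is None:
            return 0.0
        return -np.polyval(np.polyder(self.coef), x)

def active_learning_relaxation(x0, fmax=0.01, query_tol=0.2, max_steps=100):
    surrogate = Surrogate()
    x, f_dft, dft_calls = x0, None, 0
    for _ in range(max_steps):
        f_ml = surrogate.force(x)
        # Conservative querying strategy: call DFT whenever the surrogate is
        # untrained or its force disagrees with the last trusted DFT force.
        if f_dft is None or abs(f_ml - f_dft) > query_tol:
            e, f_dft = dft_energy_and_forces(x)
            surrogate.finetune(x, e)   # fine-tune on the fly
            dft_calls += 1
            f_step = f_dft
        else:
            f_step = f_ml              # trust the surrogate for this step
        if abs(f_step) < fmax:
            break                      # relaxation converged
        x = x + 0.1 * f_step           # simple steepest-descent relaxation step
    return x, dft_calls

if __name__ == "__main__":
    x_relaxed, n_dft = active_learning_relaxation(x0=0.0)
    print(f"relaxed x = {x_relaxed:.3f} using {n_dft} DFT single points")
```

Tightening query_tol mimics the more conservative querying used for out-of-domain systems: more steps fall back to DFT, trading speedup for reliability.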
Key words
machine learned potential, geometry optimization, graph neural network