
Transformer-powered surrogates close the ICF simulation-experiment gap with extremely limited data

Machine Learning: Science and Technology (2024)

Abstract
Recent advances in machine learning, specifically the transformer architecture, have driven significant progress in commercial domains. These powerful models learn complex relationships and often generalize better to new data and problems than earlier architectures. This paper presents a novel transformer-powered approach for improving prediction accuracy in multi-modal output settings where sparse experimental data are supplemented with simulation data. The approach integrates a transformer-based architecture with a novel graph-based hyper-parameter optimization technique. The resulting system not only effectively reduces simulation bias but also achieves higher prediction accuracy than the prior method. We demonstrate the efficacy of our approach on inertial confinement fusion experiments, for which only 10 shots of real-world data are available, as well as on synthetic versions of these experiments.
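The abstract describes a two-stage workflow: a surrogate is trained on plentiful simulation data, then sparse experimental shots are used to correct the simulation's systematic bias. The sketch below illustrates that workflow only; it is not the paper's method. A least-squares linear model stands in for the transformer surrogate, the additive bias correction is the simplest possible calibration, and all data here are synthetic assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(42)

# --- Stage 1: pretrain a surrogate on plentiful simulation data ---
# Hypothetical setup: the true physics is y = 3x + 1, but the simulator
# is systematically biased and produces y = 3x + 0.4.
X_sim = rng.uniform(0.0, 1.0, size=(500, 1))
y_sim = 3.0 * X_sim[:, 0] + 0.4

A_sim = np.hstack([X_sim, np.ones((len(X_sim), 1))])  # add intercept column
w_sim, *_ = np.linalg.lstsq(A_sim, y_sim, rcond=None)  # surrogate stand-in

# --- Stage 2: correct simulation bias with 10 experimental shots ---
X_exp = rng.uniform(0.0, 1.0, size=(10, 1))
y_exp = 3.0 * X_exp[:, 0] + 1.0  # sparse real-world measurements

A_exp = np.hstack([X_exp, np.ones((10, 1))])
residual = y_exp - A_exp @ w_sim  # simulation-to-experiment gap per shot
bias = residual.mean()            # simplest additive correction

def predict(X):
    """Surrogate prediction with the experiment-derived bias correction."""
    A = np.hstack([X, np.ones((len(X), 1))])
    return A @ w_sim + bias

# Evaluate: the corrected surrogate should track experiment far better
# than the raw simulation-trained surrogate does.
X_test = rng.uniform(0.0, 1.0, size=(50, 1))
y_true = 3.0 * X_test[:, 0] + 1.0
A_test = np.hstack([X_test, np.ones((50, 1))])
err_raw = np.abs(A_test @ w_sim - y_true).mean()
err_corrected = np.abs(predict(X_test) - y_true).mean()
print(f"raw surrogate error: {err_raw:.3f}")
print(f"bias-corrected error: {err_corrected:.3f}")
```

The paper's contribution replaces both placeholders with far more capable components (a transformer surrogate and graph-based hyper-parameter optimization), but the division of labor between abundant simulation data and a handful of experimental shots is the same.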
Key words
simulation, machine learning, deep learning, inertial confinement fusion, hyper-parameter optimization