Efficient data acquisition and training of collisional-radiative model artificial neural network surrogates through adaptive parameter space sampling
Published in: Machine Learning: Science and Technology, 2022-10, Vol. 3 (4)
Main Authors:
Format: Article
Language: English
Summary:
Effective plasma transport modeling of magnetically confined fusion devices relies on an accurate understanding of the ion composition and radiative power losses of the plasma. Generally, these quantities can be obtained from solutions of a collisional-radiative (CR) model at each time step within a plasma transport simulation. However, even compact, approximate CR models can be computationally onerous to evaluate, and in situ evaluation of these models within a larger plasma transport code can create a rigid bottleneck. As a way to bypass this bottleneck, we propose deploying artificial neural network surrogates that allow rapid evaluation of the necessary plasma quantities. However, one issue with training an accurate artificial neural network surrogate is its reliance on a sufficiently large and representative training and validation data set, which can be time-consuming to generate. In this work we explore a data-driven active learning and training routine that autonomously and adaptively samples the problem parameter space, ensuring a sufficiently large and meaningful set of training data is assembled for the network training. As a result, we demonstrate approximately order-of-magnitude savings in the number of training samples required to produce an accurate surrogate.
ISSN: 2632-2153
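The abstract describes an active learning loop in which the partially trained surrogate itself decides where new CR-model samples should be drawn in parameter space. The paper's actual acquisition criterion, network architecture, and CR-model interface are not given in this record, so the sketch below is only a minimal illustration of such a loop under assumed choices: an ensemble-disagreement acquisition criterion, a cheap stand-in function (`toy_cr_model`) in place of the real CR model, and hyperparameters picked purely for demonstration.

```python
# Minimal active-learning sketch for surrogate training with adaptive
# parameter-space sampling. The expensive CR model is replaced by a cheap
# stand-in function, and ensemble disagreement serves as the acquisition
# criterion; all names and settings here are illustrative assumptions.
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)


def toy_cr_model(x):
    """Stand-in for an expensive CR-model evaluation: maps 2D inputs
    (e.g. scaled log T_e, log n_e) to a scalar plasma quantity."""
    return np.exp(-x[:, 0] ** 2) * np.sin(3.0 * x[:, 1]) + 0.1 * x[:, 0]


def train_ensemble(X, y, n_members=5):
    """Train a small committee of MLP surrogates on bootstrap resamples."""
    members = []
    for i in range(n_members):
        idx = rng.integers(0, len(X), size=len(X))
        net = MLPRegressor(hidden_layer_sizes=(64, 64), max_iter=2000,
                           random_state=i)
        net.fit(X[idx], y[idx])
        members.append(net)
    return members


def acquire(members, candidates, n_new):
    """Select the candidate points where the committee disagrees most."""
    preds = np.stack([m.predict(candidates) for m in members])
    disagreement = preds.std(axis=0)
    return candidates[np.argsort(disagreement)[-n_new:]]


# Initial coarse sample of the 2D parameter space.
X_train = rng.uniform(-1.0, 1.0, size=(20, 2))
y_train = toy_cr_model(X_train)

for round_ in range(5):
    ensemble = train_ensemble(X_train, y_train)
    pool = rng.uniform(-1.0, 1.0, size=(2000, 2))   # cheap candidate pool
    X_new = acquire(ensemble, pool, n_new=20)
    # Query the (expensive) model only at the selected points.
    X_train = np.vstack([X_train, X_new])
    y_train = np.concatenate([y_train, toy_cr_model(X_new)])
    print(f"round {round_}: training set size = {len(X_train)}")
```

In each round, new expensive model evaluations are requested only where the committee of surrogates disagrees most, which is the mechanism by which adaptive sampling can reduce the total number of training samples relative to a fixed dense grid.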