Prediction of magnetization dynamics in a reduced dimensional feature space setting utilizing a low-rank kernel method

Bibliographic Details
Published in: Journal of Computational Physics, 2021-11, Vol. 444, p. 110586, Article 110586
Main Authors: Exl, Lukas, Mauser, Norbert J., Schaffer, Sebastian, Schrefl, Thomas, Suess, Dieter
Format: Article
Language: English
Description
Summary:
Highlights:
• A machine learning model to predict the dynamics described by the LLG equation, with the external field as a parameter, is established.
• We introduce low-rank kernel principal component analysis and low-rank kernel ridge regression for larger training sets.
• The model is trained entirely in a reduced-dimensional feature space obtained from unsupervised learning.

We establish a machine learning model for the prediction of magnetization dynamics as a function of the external field, as described by the Landau-Lifschitz-Gilbert equation, the partial differential equation of motion in micromagnetism. The model allows fast and accurate determination of the response to an external field, which is illustrated on a thin-film standard problem. The data-driven method internally reduces the dimensionality of the problem by means of nonlinear model reduction for unsupervised learning. This not only makes accurate prediction of the time steps possible, but also decisively reduces the complexity of the learning process, in which magnetization states from simulated micromagnetic dynamics associated with different external fields are used as input data. We use a truncated representation of kernel principal components to describe the states between time predictions. The method is capable of handling large training sample sets owing to a low-rank approximation of the kernel matrix and an associated low-rank extension of kernel principal component analysis and kernel ridge regression. The approach shifts computations entirely into a reduced-dimensional setting, breaking down the problem dimension from thousands to tens.
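
The record only summarizes the method, so the following is a minimal sketch of the ingredients named in the abstract, not the authors' implementation: a Nystroem-style low-rank approximation of the kernel matrix, a truncated kernel PCA built on top of it, and a ridge regression that predicts the next reduced state from the current state and the external field. The kernel choice (Gaussian), random landmark selection, and all parameter names (`gamma`, `n_landmarks`, `n_components`, `alpha`) are assumptions for illustration.

```python
import numpy as np

def rbf_kernel(A, B, gamma=1e-2):
    """Gaussian kernel matrix between row-wise sample sets A and B (assumed kernel)."""
    d2 = np.sum(A**2, axis=1)[:, None] + np.sum(B**2, axis=1)[None, :] - 2.0 * A @ B.T
    return np.exp(-gamma * d2)

def nystroem_kpca(X, n_landmarks=200, n_components=10, gamma=1e-2, seed=0):
    """Low-rank kernel PCA: a Nystroem feature map followed by linear PCA.

    X holds flattened magnetization states (one state per row). Returns the
    reduced coordinates Z and a `transform` closure mapping new states into
    the same reduced feature space.
    """
    rng = np.random.default_rng(seed)
    idx = rng.choice(len(X), size=n_landmarks, replace=False)
    L = X[idx]                                   # landmark states
    lam, U = np.linalg.eigh(rbf_kernel(L, L, gamma))
    W = U / np.sqrt(np.clip(lam, 1e-12, None))   # K_mm^{-1/2}
    Phi = rbf_kernel(X, L, gamma) @ W            # approximate kernel features
    mu = Phi.mean(axis=0)
    _, _, Vt = np.linalg.svd(Phi - mu, full_matrices=False)
    P = Vt[:n_components].T                      # leading principal directions
    transform = lambda Y: (rbf_kernel(Y, L, gamma) @ W - mu) @ P
    return (Phi - mu) @ P, transform

def fit_time_stepper(Z_t, H, Z_next, alpha=1e-6):
    """Ridge regression from (reduced state, applied field) to the next reduced
    state.  Since the inputs already live in the approximate kernel feature
    space, this plays the role of a low-rank kernel ridge regression; the
    whole supervised step stays in the reduced dimension."""
    F = np.hstack([Z_t, H, np.ones((len(Z_t), 1))])      # features + bias
    A = F.T @ F + alpha * np.eye(F.shape[1])
    W = np.linalg.solve(A, F.T @ Z_next)
    return lambda z, h: np.hstack([z, h, np.ones((len(z), 1))]) @ W
```

In this sketch, a trajectory would be predicted autoregressively in the reduced space, z_{t+1} = step(z_t, h), for a given field h; recovering full magnetization states from the reduced coordinates additionally requires a pre-image map, which is not shown here.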
ISSN: 0021-9991, 1090-2716
DOI: 10.1016/j.jcp.2021.110586