Self-supervised representations and node embedding graph neural networks for accurate and multi-scale analysis of materials
Published in: Machine learning: science and technology, 2024-09, Vol. 5 (3), p. 35018
Format: Article
Language: English
Summary: Supervised machine learning algorithms such as graph neural networks (GNNs) have successfully predicted material properties. However, the superior performance of GNNs usually relies on end-to-end learning on large materials datasets, which may lose the physical insight of multi-scale information about materials. Moreover, labeling data consumes substantial resources and inevitably introduces errors, which constrains prediction accuracy. We propose to train the GNN model by self-supervised learning on the node and edge information of the crystal graph. Compared with popular manually constructed material descriptors, the self-supervised atomic representation achieves better prediction performance on material properties; it may also provide physical insight by tuning the range of information it encodes. Applying the self-supervised atomic representation to magnetic-moment datasets, we show how it can extract rules and information from magnetic materials. To incorporate rich physical information into the GNN model, we develop the node embedding graph neural network (NEGNN) framework and show significant improvements in prediction performance. The self-supervised material representation and the NEGNN framework can extract in-depth information from materials and can be applied to small datasets with increased prediction accuracy.
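The self-supervised objective described in the summary, training on the node and edge information of the crystal graph rather than on property labels, can be illustrated with a minimal sketch. This is not the authors' code: the graph, feature sizes, encoder, and decoder below are all illustrative assumptions, showing only the general idea of masking an atom's features and reconstructing them from its neighbours via message passing.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy "crystal graph": 4 atoms, bonds given as an adjacency matrix.
A = np.array([[0, 1, 1, 0],
              [1, 0, 1, 1],
              [1, 1, 0, 1],
              [0, 1, 1, 0]], dtype=float)
X = rng.normal(size=(4, 8))        # per-atom feature vectors (hypothetical)

W = rng.normal(size=(8, 8)) * 0.1  # weights of a one-layer encoder

def encode(A, X, W):
    """One round of mean-aggregation message passing over the graph."""
    deg = A.sum(axis=1, keepdims=True)
    H = (A @ X) / deg              # average the features of each atom's neighbours
    return np.tanh(H @ W)          # node embeddings

# Self-supervised task: hide atom 0's features, reconstruct them from context.
X_masked = X.copy()
X_masked[0] = 0.0
Z = encode(A, X_masked, W)

# A linear decoder (again hypothetical) maps embeddings back to feature space;
# the reconstruction error is the self-supervised training loss.
D = rng.normal(size=(8, 8)) * 0.1
loss = np.mean((Z[0] @ D - X[0]) ** 2)
print(loss)
```

In practice the encoder would be a multi-layer GNN trained by gradient descent on many such masked graphs, and the resulting node embeddings, rather than hand-built descriptors, would feed the downstream property predictor.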
ISSN: 2632-2153
DOI: 10.1088/2632-2153/ad612b