AsPINN: Adaptive symmetry-recomposition physics-informed neural networks
Published in: Computer Methods in Applied Mechanics and Engineering, 2024-12, Vol. 432, Article 117405
Format: Article
Language: English
Summary: Physics-informed neural networks (PINNs) have shown promise for solving partial differential equations (PDEs). However, the PINN loss, built from soft regularization terms, can only guarantee that predictions conform to the physical constraints in an average sense, so PINNs cannot strictly adhere to implied physical laws such as conservation laws and symmetries. This limits their optimization speed and accuracy. Although some feature-enhanced PINNs attempt to address this issue by adding explicit constraints, their generality is limited by problem-specific settings. To overcome this limitation, our study proposes the adaptive symmetry-recomposition PINN (AsPINN). By analyzing the parameter-sharing patterns of fully connected PINNs, specific network structures are developed whose predictions satisfy strict symmetry constraints. These structures are incorporated into diverse subnetworks that provide constrained intermediate outputs, and a specialized multi-head attention mechanism then evaluates and adaptively combines them into the final prediction. Thus, AsPINN maintains exact constraints while overcoming the limited generality of any individual structural subnetwork. The method is applied to several physically significant PDEs, including both forward and inverse problems. The numerical results demonstrate AsPINN's mathematical consistency and generality, offering advantages in optimization speed and accuracy with a reduced number of trainable parameters. The results also show that AsPINN mitigates the impact of ill-conditioned data.
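To make the architectural idea in the summary concrete, here is a minimal sketch, not the authors' implementation: all names (MLP, SymmetricSubnet, AsPINNSketch, residual_loss) are hypothetical, and the symmetry is hard-coded by symmetrizing a shared MLP in x rather than by the parameter-sharing patterns the paper actually derives. It only illustrates the general recipe the summary describes: subnetworks that satisfy a symmetry exactly by construction, combined by a learned softmax attention over their intermediate outputs, trained against a standard PINN residual loss.

```python
# Illustrative sketch only (assumed names and architecture, not AsPINN itself).
import torch
import torch.nn as nn


class MLP(nn.Module):
    def __init__(self, in_dim=2, hidden=32, out_dim=1):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(in_dim, hidden), nn.Tanh(),
            nn.Linear(hidden, hidden), nn.Tanh(),
            nn.Linear(hidden, out_dim),
        )

    def forward(self, xt):
        return self.net(xt)


class SymmetricSubnet(nn.Module):
    """Subnetwork whose output is exactly even or odd in x by construction."""

    def __init__(self, parity=+1):
        super().__init__()
        self.base = MLP()
        self.parity = parity  # +1: even in x, -1: odd in x

    def forward(self, x, t):
        xt = torch.cat([x, t], dim=-1)
        xt_ref = torch.cat([-x, t], dim=-1)  # reflected input
        # (u(x,t) + parity * u(-x,t)) / 2 satisfies the symmetry identically.
        return 0.5 * (self.base(xt) + self.parity * self.base(xt_ref))


class AsPINNSketch(nn.Module):
    """Combine symmetry-constrained intermediate outputs with learned weights."""

    def __init__(self):
        super().__init__()
        self.subnets = nn.ModuleList([SymmetricSubnet(+1), SymmetricSubnet(-1)])
        # A small scoring network stands in for the attention mechanism:
        # it maps (x, t) to one score per subnetwork, softmaxed into weights.
        self.scorer = MLP(in_dim=2, hidden=16, out_dim=len(self.subnets))

    def forward(self, x, t):
        outs = torch.cat([net(x, t) for net in self.subnets], dim=-1)      # (N, S)
        weights = torch.softmax(self.scorer(torch.cat([x, t], dim=-1)), -1)  # (N, S)
        return (weights * outs).sum(dim=-1, keepdim=True)


def residual_loss(model, x, t):
    """Usual PINN residual loss, here for the heat equation u_t - u_xx = 0."""
    x = x.requires_grad_(True)
    t = t.requires_grad_(True)
    u = model(x, t)
    u_t = torch.autograd.grad(u, t, torch.ones_like(u), create_graph=True)[0]
    u_x = torch.autograd.grad(u, x, torch.ones_like(u), create_graph=True)[0]
    u_xx = torch.autograd.grad(u_x, x, torch.ones_like(u_x), create_graph=True)[0]
    return ((u_t - u_xx) ** 2).mean()


model = AsPINNSketch()
x = torch.rand(128, 1) * 2 - 1  # collocation points in x
t = torch.rand(128, 1)          # collocation points in t
loss = residual_loss(model, x, t)
loss.backward()
```

Because each subnetwork satisfies its symmetry identically rather than only on average, the residual loss is spent purely on fitting the equation and data, which mirrors the motivation stated in the summary; the attention weights decide, point by point, how much each constrained subnetwork contributes.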
ISSN: 0045-7825
DOI: 10.1016/j.cma.2024.117405