
Variational Physics-Informed Neural Networks For Solving Partial Differential Equations

Bibliographic Details
Published in: arXiv.org, 2019-11
Main Authors: Kharazmi, E.; Zhang, Z.; Karniadakis, G. E.
Format: Article
Language: English
Description
Summary: Physics-informed neural networks (PINNs) [31] use automatic differentiation to solve partial differential equations (PDEs) by penalizing the PDE in the loss function at a random set of points in the domain of interest. Here, we develop a Petrov-Galerkin version of PINNs based on the nonlinear approximation of deep neural networks (DNNs) by selecting the trial space to be the space of neural networks and the test space to be the space of Legendre polynomials. We formulate the variational residual of the PDE using the DNN approximation by incorporating the variational form of the problem into the loss function of the network, and construct a variational physics-informed neural network (VPINN). By integrating by parts the integrand in the variational form, we lower the order of the differential operators represented by the neural networks, hence effectively reducing the training cost in VPINNs while increasing their accuracy compared to PINNs, which essentially employ delta test functions. For shallow networks with one hidden layer, we analytically obtain explicit forms of the variational residual. We demonstrate the performance of the new formulation for several examples that show clear advantages of VPINNs over PINNs in terms of both accuracy and speed.
ISSN: 2331-8422
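
The summary above outlines the VPINN construction: the trial space is a deep network u_theta, the test space is spanned by Legendre polynomials, and one integration by parts shifts a derivative from the trial function onto the test functions, so only first-order derivatives of the network are needed for a second-order problem. The following PyTorch sketch illustrates that idea for a 1D Poisson problem; the manufactured solution, network size, quadrature rule, boundary penalty weight, and the particular test functions v_k = P_{k+1} - P_{k-1} (chosen so that boundary terms vanish) are illustrative assumptions and not the authors' implementation.

```python
import numpy as np
import torch

# Hypothetical 1D test problem: -u''(x) = f(x) on (-1, 1), u(-1) = u(1) = 0,
# with manufactured solution u(x) = sin(pi*x), i.e. f(x) = pi^2 * sin(pi*x).
def f(x):
    return np.pi ** 2 * torch.sin(np.pi * x)

# Trial space: a small fully connected network u_theta(x).
net = torch.nn.Sequential(
    torch.nn.Linear(1, 20), torch.nn.Tanh(),
    torch.nn.Linear(20, 20), torch.nn.Tanh(),
    torch.nn.Linear(20, 1),
)

# Gauss-Legendre quadrature for the integrals in the weak form.
nodes, weights = np.polynomial.legendre.leggauss(50)
x = torch.tensor(nodes, dtype=torch.float32).reshape(-1, 1).requires_grad_(True)
w = torch.tensor(weights, dtype=torch.float32).reshape(-1, 1)

# Test space: v_k = P_{k+1} - P_{k-1}, combinations of Legendre polynomials that
# vanish at x = +-1, so the boundary term from integration by parts drops out
# (a simplifying choice made here, not taken from the paper).
K = 5
V, dV = [], []
for k in range(1, K + 1):
    c = np.zeros(k + 2)
    c[k + 1], c[k - 1] = 1.0, -1.0
    V.append(np.polynomial.legendre.legval(nodes, c))
    dV.append(np.polynomial.legendre.legval(nodes, np.polynomial.legendre.legder(c)))
V = torch.tensor(np.stack(V, axis=1), dtype=torch.float32)    # shape (Q, K)
dV = torch.tensor(np.stack(dV, axis=1), dtype=torch.float32)  # shape (Q, K)

xb = torch.tensor([[-1.0], [1.0]])  # boundary points for the penalty term
opt = torch.optim.Adam(net.parameters(), lr=1e-3)

for step in range(5000):
    opt.zero_grad()
    u = net(x)
    # Only first derivatives of u_theta are needed after integration by parts.
    du = torch.autograd.grad(u, x, torch.ones_like(u), create_graph=True)[0]
    # Variational residual R_k = int u' v_k' dx - int f v_k dx, by quadrature.
    R = (w * du * dV).sum(dim=0) - (w * f(x) * V).sum(dim=0)
    loss = (R ** 2).mean() + 10.0 * (net(xb) ** 2).mean()  # weak-form + boundary loss
    loss.backward()
    opt.step()
```

Because the residual is integrated against smooth test functions, the network is differentiated only once per residual evaluation rather than to the full order of the PDE, which is the source of the training-cost reduction described in the summary.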