A neuron-wise subspace correction method for the finite neuron method

Bibliographic Details
Published in: arXiv.org, 2024-03
Main Authors: Park, Jongho, Xu, Jinchao, Xu, Xiaofeng
Format: Article
Language: English
Description
Summary: In this paper, we propose a novel algorithm, the Neuron-wise Parallel Subspace Correction Method (NPSC), for the finite neuron method, which approximates numerical solutions of partial differential equations (PDEs) using neural network functions. Despite extensive research on applying neural networks to numerical PDEs, there is still a serious lack of training algorithms that achieve adequate accuracy, even for one-dimensional problems. Based on recent results on the spectral properties of linear layers and landscape analysis of single-neuron problems, we develop a special type of subspace correction method that optimizes the linear layer and each neuron in the nonlinear layer separately. For one-dimensional problems, we present an optimal preconditioner that resolves the ill-conditioning of the linear layer, so that the linear layer is trained in a number of iterations that is uniform with respect to the number of neurons. In each single-neuron problem, a good local minimum that avoids flat energy regions is found by a superlinearly convergent algorithm. Numerical experiments on function approximation problems and PDEs demonstrate that the proposed method outperforms other gradient-based methods.
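
The abstract describes a two-stage iteration: the output-layer (linear) coefficients are updated by a preconditioned linear solve, and each hidden neuron is then optimized separately with all other parameters frozen. The following is a minimal, hypothetical sketch of such a neuron-wise alternating scheme for one-dimensional least-squares fitting with a shallow ReLU network. The function names and solver choices are illustrative assumptions, not the authors' NPSC implementation: an unpreconditioned least-squares solve stands in for the paper's optimal preconditioner, a bounded one-dimensional scalar minimization stands in for the superlinearly convergent single-neuron solver, and the neuron slopes are fixed to 1 for simplicity.

```python
# Hypothetical sketch of a neuron-wise alternating scheme in the spirit of NPSC,
# fitting a 1D target function with a shallow ReLU network in the least-squares sense.
import numpy as np
from scipy.optimize import minimize_scalar


def relu(t):
    return np.maximum(t, 0.0)


def fit_neuronwise(x, f, n_neurons=30, n_outer=10, rng=np.random.default_rng(0)):
    """Alternate between (a) a linear solve for the output-layer coefficients and
    (b) a separate 1D optimization of each hidden neuron's bias, others frozen."""
    # Hidden neurons: slopes fixed to 1 (a simplifying assumption), trainable biases b_i.
    b = np.sort(rng.uniform(x.min(), x.max(), n_neurons))
    c = np.zeros(n_neurons)  # output-layer (linear) coefficients

    def basis(biases):
        # Columns are the hidden-neuron outputs relu(x - b_i); shape (n_points, n_neurons).
        return relu(x[:, None] - biases[None, :])

    for _ in range(n_outer):
        # (a) Linear-layer subproblem: least-squares solve for c with the neurons fixed.
        #     (The paper applies an optimal preconditioner here; lstsq is a stand-in.)
        A = basis(b)
        c, *_ = np.linalg.lstsq(A, f, rcond=None)

        # (b) Neuron-wise subproblems: optimize each bias separately,
        #     holding all other neurons and the linear coefficients fixed.
        for i in range(n_neurons):
            residual_wo_i = f - basis(b) @ c + c[i] * relu(x - b[i])

            def loss_i(bi):
                return np.sum((residual_wo_i - c[i] * relu(x - bi)) ** 2)

            res = minimize_scalar(loss_i, bounds=(x.min(), x.max()), method="bounded")
            b[i] = res.x

    return b, c, basis


if __name__ == "__main__":
    x = np.linspace(0.0, 1.0, 400)
    f = np.sin(2 * np.pi * x)
    b, c, basis = fit_neuronwise(x, f)
    print(f"max approximation error: {np.max(np.abs(basis(b) @ c - f)):.3e}")
```

In this sketch the bias updates sweep over the neurons sequentially, in a Gauss-Seidel fashion; the paper's method treats the neuron subproblems in parallel, which the sketch does not attempt to reproduce.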
ISSN: 2331-8422
DOI: 10.48550/arxiv.2211.12031