
Deep learning architectures for nonlinear operator functions and nonlinear inverse problems

Bibliographic Details
Published in: Mathematical Statistics and Learning (Online), 2022-02, Vol. 4 (1), pp. 1-86
Main Authors: de Hoop, Maarten V.; Lassas, Matti; Wong, Christopher A.
Format: Article
Language: English
Description
Summary: We develop a theoretical analysis for special neural network architectures, termed operator recurrent neural networks, for approximating nonlinear functions whose inputs are linear operators. Such functions commonly arise in solution algorithms for inverse boundary value problems. Traditional neural networks treat input data as vectors, and thus they do not effectively capture the multiplicative structure associated with the linear operators that correspond to the data in such inverse problems. We therefore introduce a new family that resembles a standard neural network architecture, but where the input data acts multiplicatively on vectors. Motivated by compact operators appearing in boundary control and the analysis of inverse boundary value problems for the wave equation, we promote structure and sparsity in selected weight matrices in the network. After describing this architecture, we study its representation properties as well as its approximation properties. We furthermore show that an explicit regularization can be introduced that can be derived from the mathematical analysis of the mentioned inverse problems, and which leads to certain guarantees on the generalization properties. We observe that the sparsity of the weight matrices improves the generalization estimates. Lastly, we discuss how operator recurrent networks can be viewed as a deep learning analogue to deterministic algorithms such as boundary control for reconstructing the unknown wave speed in the acoustic wave equation from boundary measurements.
ISSN: 2520-2316; 2520-2324
DOI: 10.4171/msl/28
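
To make the "multiplicative" structure described in the summary concrete, the following is a minimal sketch of an operator recurrent layer in NumPy, assuming a layer update of the form h ← σ(W h + B A h + b) in which the input operator A acts on the hidden vector at every layer. The update rule, weight names, and the l1 penalty standing in for the paper's explicit, analysis-derived regularization are illustrative assumptions, not the authors' exact definitions.

```python
import numpy as np

def relu(x):
    """Elementwise ReLU nonlinearity."""
    return np.maximum(x, 0.0)

class OperatorRecurrentNet:
    """Sketch of a network whose input is a linear operator A (a matrix),
    applied multiplicatively to the hidden state rather than flattened
    into a feature vector."""

    def __init__(self, dim, depth, seed=0):
        rng = np.random.default_rng(seed)
        scale = 1.0 / np.sqrt(dim)
        # W acts on the hidden state directly; B acts on A @ h, so the
        # input operator A enters every layer multiplicatively.
        self.W = [rng.normal(scale=scale, size=(dim, dim)) for _ in range(depth)]
        self.B = [rng.normal(scale=scale, size=(dim, dim)) for _ in range(depth)]
        self.b = [np.zeros(dim) for _ in range(depth)]

    def forward(self, A, h):
        """A: (dim, dim) input operator; h: (dim,) initial vector."""
        for W, B, b in zip(self.W, self.B, self.b):
            h = relu(W @ h + B @ (A @ h) + b)
        return h

def sparsity_penalty(net, lam=1e-3):
    """Illustrative stand-in for explicit regularization: an l1 penalty
    promoting sparsity in the weight matrices."""
    return lam * sum(np.abs(M).sum() for M in net.W + net.B)

# Usage: evaluate the network on a random "input operator" A.
dim = 8
net = OperatorRecurrentNet(dim, depth=4)
A = np.random.default_rng(1).normal(size=(dim, dim)) / dim
h0 = np.ones(dim)
print(net.forward(A, h0))
print(sparsity_penalty(net))
```

In contrast to a standard feed-forward network applied to the flattened entries of A, the hidden state here is repeatedly transformed by A itself, which is the structural feature the abstract highlights for data arising as operators in inverse boundary value problems.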