Refinement and Universal Approximation via Sparsely Connected ReLU Convolution Nets
Published in: IEEE Signal Processing Letters, 2020, Vol. 27, pp. 1175-1179
Main Authors:
Format: Article
Language: English
Subjects:
Summary: We construct a highly regular and simple structured class of sparsely connected convolutional neural networks with rectifier activations that provide universal function approximation in a coarse-to-fine manner with increasing number of layers. The networks are localized in the sense that local changes in the function to be approximated only require local changes in the final layer of weights. At the core of the construction lies the fact that the characteristic function can be derived from a convolution of characteristic functions at the next coarser resolution via a rectifier passing. The latter refinement result holds for all higher order univariate B-splines.
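The refinement idea the summary alludes to can be illustrated with the classical two-scale relation for the first-order B-spline (hat function), which is itself expressible as a linear combination of rectifiers. The sketch below is illustrative only and uses names of my own choosing; it is not the paper's construction, just a numerical check of the standard relation hat(x) = 0.5·hat(2x) + hat(2x-1) + 0.5·hat(2x-2).

```python
import numpy as np

def relu(x):
    """Rectifier activation max(x, 0)."""
    return np.maximum(x, 0.0)

def hat(x):
    # First-order B-spline (hat function) supported on [0, 2],
    # written as a linear combination of three ReLUs.
    return relu(x) - 2.0 * relu(x - 1.0) + relu(x - 2.0)

# Classical two-scale (refinement) relation: the hat function is an
# average of dilated/shifted copies of itself at the next finer scale.
x = np.linspace(-1.0, 3.0, 1001)
refined = 0.5 * hat(2 * x) + hat(2 * x - 1.0) + 0.5 * hat(2 * x - 2.0)
print("two-scale relation holds:", np.allclose(hat(x), refined))
```

Higher-order B-splines satisfy analogous two-scale relations (with binomial coefficients), which is the direction the article generalizes in.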
ISSN: 1070-9908, 1558-2361
DOI: 10.1109/LSP.2020.3005051