Learning curves for overparametrized deep neural networks: A field theory perspective

Bibliographic Details
Published in: Physical Review Research, 2021-04, Vol. 3 (2), p. 023034, Article 023034
Main Authors: Cohen, Omry; Malka, Or; Ringel, Zohar
Format: Article
Language:English
Description
Summary: In the past decade, deep neural networks (DNNs) came to the fore as the leading machine-learning algorithms for a variety of tasks. Their rise was founded on market needs and engineering craftsmanship, the latter based more on trial and error than on theory. While still far behind the application forefront, the theoretical study of DNNs has recently made important advances in analyzing the highly overparametrized regime, where some exact results have been obtained. Leveraging these ideas and adopting a more physics-like approach, here we construct a versatile field theory formalism for supervised deep learning, involving the renormalization group, Feynman diagrams, and replicas. In particular, we show that our approach leads to highly accurate predictions of the learning curves of truly deep DNNs trained on polynomial regression problems. It also explains concretely why DNNs generalize well despite being highly overparametrized: this stems from an entropic bias toward simple functions, which, for fully connected DNNs with data sampled on the hypersphere, are low-order polynomials in the input vector. Since a DNN is a complex interacting system of artificial neurons, we believe that such tools and methodologies borrowed from condensed-matter physics will prove essential for obtaining an accurate quantitative understanding of deep learning.
ISSN: 2643-1564
DOI: 10.1103/PhysRevResearch.3.023034