An unfeasibility view of neural network learning

Bibliographic Details
Published in: Journal of Complexity 2023-04, Vol. 75, p. 101710, Article 101710
Main Authors: Heintz, Joos; Pardo, Luis Miguel; Segura, Enrique Carlos; Ocar, Hvara; Rojas Paredes, Andrés
Format: Article
Language: English
Description
Summary: We define the notion of a continuously differentiable perfect learning algorithm for multilayer neural network architectures and show that such algorithms do not exist provided that the length of the data set exceeds the number of involved parameters and the activation functions are logistic, tanh or sin.
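
One way to read the claim (a sketch in our own notation, with scalar outputs assumed; the article's precise definitions are not reproduced in this record): write f_w for the network function of a fixed multilayer architecture with parameters w in R^p, and call a map Phi a perfect learning algorithm if it sends every admissible data set D = ((x_1, y_1), ..., (x_N, y_N)) to parameters that interpolate it exactly,

\[
\Phi : (\mathbb{R}^n \times \mathbb{R})^N \longrightarrow \mathbb{R}^p,
\qquad
f_{\Phi(D)}(x_i) = y_i \quad \text{for } i = 1, \dots, N .
\]

On this reading, the stated result is that no such \(\Phi\) of class \(C^1\) exists whenever \(N > p\) and the activation functions are logistic, \(\tanh\) or \(\sin\).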
ISSN: 0885-064X, 1090-2708
DOI: 10.1016/j.jco.2022.101710