An unfeasibility view of neural network learning
Published in: Journal of Complexity, 2023-04, Vol. 75, p. 101710 (Article 101710)
Format: Article
Language: English
Summary: We define the notion of a continuously differentiable perfect learning algorithm for multilayer neural network architectures and show that such algorithms do not exist provided that the length of the data set exceeds the number of involved parameters and the activation functions are logistic, tanh or sin.
ISSN: 0885-064X, 1090-2708
DOI: 10.1016/j.jco.2022.101710
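As a rough illustration only (the notation below is ours, not the paper's, and the paper's precise definitions may differ), the summarized result can be read as a statement about interpolating parameter maps:

```latex
% Hedged sketch of the claim; all symbols are our own assumptions.
% Let D = ((x_i, y_i))_{i=1}^{N} be a data set, and let F(w, x) denote a
% multilayer network with parameter vector w \in \mathbb{R}^p whose
% activation functions are logistic, tanh or sin. We read a "perfect
% learning algorithm" as a map A taking D to parameters that fit D exactly.
\[
  N > p \;\Longrightarrow\;
  \text{there is no continuously differentiable map } A \text{ with }
  F\bigl(A(D),\, x_i\bigr) = y_i
  \quad (i = 1, \dots, N) \text{ for every data set } D.
\]
```

This is a sketch of the abstract's condition that "the length of the data set exceeds the number of involved parameters" ($N > p$); consult the article itself for the exact definitions and hypotheses.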