
SparseConnect: regularising CNNs on fully connected layers


Bibliographic Details
Published in: Electronics Letters 2017-08, Vol. 53 (18), p. 1246-1248
Main Authors: Xu, Qi, Pan, Gang
Format: Article
Language:English
Description
Summary: Deep convolutional neural networks (CNNs) have achieved unprecedented success in many domains. Their numerous parameters allow CNNs to learn complex features, but also tend to hinder generalisation by over-fitting the training data. Despite many previously proposed regularisation methods, over-fitting remains one of the main obstacles to training a robust CNN. Among the many factors that may lead to over-fitting, the numerous parameters of the fully connected layers (FCLs) of a typical CNN are a notable contributor. The authors propose SparseConnect, which alleviates over-fitting by sparsifying connections to FCLs. Experimental results on the benchmark datasets MNIST and CIFAR10 show that SparseConnect outperforms several state-of-the-art regularisation methods.
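The abstract does not spell out how SparseConnect sparsifies the connections to FCLs, so the following is only a generic sketch of one common way to achieve such sparsity: an L1 penalty on the FC weight matrix applied via a proximal (soft-thresholding) update, which drives many individual connection weights to exactly zero. All function names here are illustrative, not the authors' method.

```python
import numpy as np

# Hedged illustration only: sparsifying a fully connected layer's
# weight matrix with an L1 penalty, one plausible route to the kind
# of FCL sparsity the abstract describes.

rng = np.random.default_rng(0)

def fc_forward(x, W, b):
    """Plain fully connected layer: y = x @ W + b."""
    return x @ W + b

def l1_prox_step(W, grad_W, lr=0.1, lam=0.05):
    """One proximal gradient step with an L1 penalty on the weights.

    After the ordinary gradient step, the proximal operator of the
    L1 norm shrinks every weight toward zero and clips weights whose
    magnitude falls below lr * lam to exactly zero.
    """
    W = W - lr * grad_W  # ordinary gradient step on the data loss
    return np.sign(W) * np.maximum(np.abs(W) - lr * lam, 0.0)

# Toy demo: with a zero data gradient, repeated proximal steps prune
# most connections of a small 4-input / 3-output FC layer.
W = rng.normal(size=(4, 3))
b = np.zeros(3)
for _ in range(400):
    W = l1_prox_step(W, np.zeros_like(W))

sparsity = float(np.mean(W == 0.0))    # fraction of pruned connections
y = fc_forward(np.ones((1, 4)), W, b)  # layer remains usable after pruning
```

The soft-thresholding form is what distinguishes this from plain weight decay: an L2 penalty only shrinks weights, while the L1 proximal update sets small weights to exactly zero, removing those connections outright.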
ISSN: 0013-5194
1350-911X
DOI: 10.1049/el.2017.2621