Does Removing Pooling Layers from Convolutional Neural Networks Improve Results?

Bibliographic Details
Published in: SN Computer Science, 2020-09, Vol. 1 (5), p. 275, Article 275
Main Authors: Santos, Claudio Filipi Goncalves dos, Moreira, Thierry Pinheiro, Colombo, Danilo, Papa, João Paulo
Format: Article
Language: English
Description
Summary: Due to their number of parameters, convolutional neural networks are known to take long training periods and extended inference time. Learning may take so much computational power that it requires a costly machine and, sometimes, weeks of training. In this context, there is a trend already in motion to replace convolutional pooling layers with a stride operation in the preceding layer to save time. In this work, we evaluate the speedup of such an approach and how it trades off against accuracy loss across multiple computer vision domains, deep neural architectures, and datasets. The results showed significant acceleration with a negligible loss in accuracy, if any, which is a further indication that convolutional pooling in deep learning performs redundant calculations.
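The replacement the abstract describes — dropping a pooling layer and instead giving the preceding convolution a stride of 2 — can be illustrated with a back-of-the-envelope cost comparison. The sketch below is not from the paper; the layer sizes and channel counts are hypothetical, chosen only to show why the strided variant is cheaper while producing the same feature-map size:

```python
# Compare "conv stride 1 + 2x2 max pooling" against a single conv with
# stride 2. Shapes and channel counts are hypothetical illustrations.

def conv_out(size, kernel, stride, pad=0):
    """Spatial output size of a convolution (or pooling) window."""
    return (size + 2 * pad - kernel) // stride + 1

def macs(size, kernel, stride, c_in, c_out, pad=0):
    """Multiply-accumulate count for one conv layer on a size x size input."""
    out = conv_out(size, kernel, stride, pad)
    return out * out * kernel * kernel * c_in * c_out

size, k, c_in, c_out = 32, 3, 64, 128   # hypothetical layer configuration

# Baseline: 3x3 conv with stride 1 (pad 1), followed by 2x2 pooling (stride 2).
conv1 = conv_out(size, k, 1, pad=1)     # 32
pooled = conv_out(conv1, 2, 2)          # 16
cost_baseline = macs(size, k, 1, c_in, c_out, pad=1)

# Replacement: the same 3x3 conv with stride 2 and no pooling layer.
strided = conv_out(size, k, 2, pad=1)   # 16, same output size as above
cost_strided = macs(size, k, 2, c_in, c_out, pad=1)

print(pooled == strided)                # identical feature-map size
print(cost_baseline // cost_strided)    # the strided conv does ~4x fewer MACs
```

The 4x saving comes from the convolution being evaluated at a quarter of the spatial positions; the accuracy trade-off of discarding the pooled responses is what the paper measures empirically.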
ISSN: 2662-995X, 2661-8907
DOI: 10.1007/s42979-020-00295-9