
Pre-rotation Only at Inference-Stage: A Way to Rotation Invariance of Convolutional Neural Networks


Bibliographic Details
Published in: International Journal of Computational Intelligence Systems, 2024-04, Vol. 17 (1), p. 1-18, Article 94
Main Authors: Fan, Yue, Zhang, Peng, Han, Jingqi, Liu, Dandan, Tang, Jinsong, Zhang, Guoping
Format: Article
Language: English
Description
Summary: Popular convolutional neural networks (CNNs) require data augmentation to achieve rotation invariance. We propose an alternative mechanism, Pre-Rotation Only at Inference stage (PROAI), to make CNNs rotation invariant. The overall idea is inspired by how the human brain observes images. At the training stage, PROAI trains a CNN with a small number of parameters using images at only one orientation. At the inference stage, PROAI introduces a pre-rotation operation that rotates each test image into all possible orientations and computes classification scores with the trained CNN. The maximum of these classification scores simultaneously estimates both the category and the orientation of each test image. The benefits of PROAI have been evaluated on rotated image recognition tasks. The results show that PROAI improves both classification and orientation estimation performance while greatly reducing the number of parameters and the training time. Codes and datasets are publicly available at https://github.com/automlresearch/FRPRF .
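The inference-stage procedure described in the summary can be illustrated with a minimal sketch: rotate the test image through a set of candidate angles, score each rotated copy with the CNN trained on upright images only, and take the best-scoring (class, angle) pair. The function name, angle step, and model interface below are illustrative assumptions, not the authors' released implementation (see the linked repository for that).

# Minimal sketch of inference-stage pre-rotation (assumed interface, PyTorch).
import torch
import torchvision.transforms.functional as TF

def proai_predict(model, image, angles=range(0, 360, 15)):
    """Classify one image by scoring all pre-rotated copies.

    model  : trained CNN returning class logits, trained on one orientation only
    image  : tensor of shape (C, H, W)
    angles : candidate orientations in degrees tried at inference time
    Returns the predicted class index and the estimated orientation.
    """
    model.eval()
    best_score, best_class, best_angle = float("-inf"), None, None
    with torch.no_grad():
        for angle in angles:
            # Rotate the test image toward the canonical (training) orientation.
            rotated = TF.rotate(image.unsqueeze(0), angle)
            logits = model(rotated)            # shape (1, num_classes)
            score, cls = logits.max(dim=1)
            if score.item() > best_score:      # keep the best (class, angle) pair
                best_score = score.item()
                best_class, best_angle = cls.item(), angle
    return best_class, best_angle

Taking the maximum over all rotated copies is what lets a single upright-trained CNN provide both the label and an orientation estimate; the angular resolution of the estimate is set by the step size of the candidate angle set.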
ISSN: 1875-6883
DOI: 10.1007/s44196-024-00490-z