Brain Cognition-Inspired Dual-Pathway CNN Architecture for Image Classification
| Published in: | IEEE Transactions on Neural Networks and Learning Systems, 2024-07, Vol. 35(7), pp. 9900-9914 |
|---|---|
| Main Authors: | , , , , , , , |
| Format: | Article |
| Language: | English |
| Summary: | Inspired by the global-local information processing mechanism in the human visual system, we propose a novel convolutional neural network (CNN) architecture named cognition-inspired network (CogNet) that consists of a global pathway, a local pathway, and a top-down modulator. We first use a common CNN block to form the local pathway, which aims to extract fine local features of the input image. Then, we use a transformer encoder to form the global pathway, which captures global structural and contextual information among local parts of the input image. Finally, we construct a learnable top-down modulator in which the fine local features from the local pathway are modulated by the global representations from the global pathway. For ease of use, we encapsulate the dual-pathway computation and modulation process into a building block called the global-local block (GL block), and a CogNet of any depth can be constructed by stacking the required number of GL blocks one after another. Extensive experimental evaluations show that the proposed CogNets achieve state-of-the-art accuracy on all six benchmark datasets and are very effective at overcoming the "texture bias" and "semantic confusion" problems faced by many CNN models. (An illustrative sketch of one GL block follows this record.) |
| ISSN: | 2162-237X, 2162-2388 |
| DOI: | 10.1109/TNNLS.2023.3237962 |
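
Below is a minimal PyTorch sketch of one GL block as described in the abstract: a conv block for the local pathway, a transformer encoder layer for the global pathway, and a learnable top-down modulator. The specific layer sizes, the mean-pooling of global tokens, the sigmoid gating form of the modulator, and the names `GLBlock` and `gate` are illustrative assumptions, not the authors' implementation.

```python
# Illustrative sketch only: the gating form and layer choices are
# assumptions about the paper's GL block, not its published code.
import torch
import torch.nn as nn


class GLBlock(nn.Module):
    """Dual-pathway block: local CNN features modulated by global context."""

    def __init__(self, channels: int, num_heads: int = 4):
        super().__init__()
        # Local pathway: an ordinary conv block for fine local features.
        self.local = nn.Sequential(
            nn.Conv2d(channels, channels, kernel_size=3, padding=1),
            nn.BatchNorm2d(channels),
            nn.ReLU(inplace=True),
        )
        # Global pathway: a transformer encoder layer over spatial
        # positions, capturing structure and context among local parts.
        self.global_enc = nn.TransformerEncoderLayer(
            d_model=channels, nhead=num_heads, batch_first=True
        )
        # Top-down modulator: a global representation gates local features.
        self.gate = nn.Linear(channels, channels)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        b, c, h, w = x.shape
        local = self.local(x)                          # (B, C, H, W)
        tokens = x.flatten(2).transpose(1, 2)          # (B, H*W, C)
        ctx = self.global_enc(tokens)                  # (B, H*W, C)
        # Pool global tokens into one context vector per image, then
        # modulate each local channel with a learned sigmoid gate.
        g = torch.sigmoid(self.gate(ctx.mean(dim=1)))  # (B, C)
        return local * g.view(b, c, 1, 1)


# A CogNet of any depth is then a stack of GL blocks:
net = nn.Sequential(*[GLBlock(64) for _ in range(4)])
out = net(torch.randn(2, 64, 32, 32))                  # -> (2, 64, 32, 32)
```

Stacking blocks with `nn.Sequential` mirrors the abstract's claim that a CogNet of any depth is built by repeating GL blocks; in practice a real backbone would also change channel counts and spatial resolution between stages.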