Attentional Feature Fusion for End-to-End Blind Image Quality Assessment

Bibliographic Details
Published in: IEEE Transactions on Broadcasting, 2023-03, Vol. 69 (1), p. 144-152
Main Authors: Zhou, Mingliang, Lang, Shujun, Zhang, Taiping, Liao, Xingran, Shang, Zhaowei, Xiang, Tao, Fang, Bin
Format: Article
Language:English
Summary: In this paper, an end-to-end blind image quality assessment (BIQA) model based on feature fusion with an attention mechanism is proposed. We extract multilayer features from the image and fuse them with an attention mechanism; the fused features are then mapped to a quality score, realizing image quality assessment without a reference image. First, because the human visual perception system processes input information hierarchically from local to global, we use three different neural networks to extract physically meaningful image features: a modified VGG19 and a modified VGG16 extract the underlying texture information and the local edge information, respectively, while a ResNet-50 extracts high-level global semantic information. Second, to take full advantage of the multilevel features and avoid simple addition in hierarchical feature fusion, we adopt an attention-based feature fusion mechanism that combines the global and local contexts of the features and assigns different weights to the features being fused, so that the model can perceive a richer range of distortion types. Experimental results on six standard databases show that our approach yields improved performance.
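
To make the fusion idea concrete, below is a minimal, hypothetical PyTorch sketch based only on the abstract above; it is not the authors' implementation. The backbone assignments follow the text (VGG16 for local edge information, VGG19 for texture, ResNet-50 for global semantics), but the common feature dimension, the exact form of the local/global attention weights, and the regression head are illustrative assumptions.

```python
# Hypothetical sketch of attention-based multi-level feature fusion for BIQA.
# Backbones follow the abstract; all other details (channel sizes, fusion form,
# score head) are assumptions for illustration, not the paper's exact design.
import torch
import torch.nn as nn
from torchvision import models


class AttentionalFusion(nn.Module):
    """Fuse two feature maps with weights from local and global context."""

    def __init__(self, channels):
        super().__init__()
        # Local context: point-wise convolutions over each spatial position.
        self.local_att = nn.Sequential(
            nn.Conv2d(channels, channels // 4, kernel_size=1),
            nn.ReLU(inplace=True),
            nn.Conv2d(channels // 4, channels, kernel_size=1),
        )
        # Global context: pool to a channel descriptor, then re-expand.
        self.global_att = nn.Sequential(
            nn.AdaptiveAvgPool2d(1),
            nn.Conv2d(channels, channels // 4, kernel_size=1),
            nn.ReLU(inplace=True),
            nn.Conv2d(channels // 4, channels, kernel_size=1),
        )

    def forward(self, x, y):
        # Weights are learned from both contexts, so the combination is a
        # weighted blend rather than a plain element-wise addition.
        s = x + y
        w = torch.sigmoid(self.local_att(s) + self.global_att(s))
        return w * x + (1.0 - w) * y


class FusionBIQA(nn.Module):
    """Three backbones -> common projection -> attentional fusion -> score."""

    def __init__(self, dim=256):
        super().__init__()
        # In practice pretrained weights would be loaded; omitted here.
        self.branch_edge = models.vgg16(weights=None).features      # local edges
        self.branch_texture = models.vgg19(weights=None).features   # texture
        resnet = models.resnet50(weights=None)                      # semantics
        self.branch_semantic = nn.Sequential(*list(resnet.children())[:-2])
        # Project each branch to a common channel dimension.
        self.proj_edge = nn.Conv2d(512, dim, kernel_size=1)
        self.proj_texture = nn.Conv2d(512, dim, kernel_size=1)
        self.proj_semantic = nn.Conv2d(2048, dim, kernel_size=1)
        self.fuse1 = AttentionalFusion(dim)
        self.fuse2 = AttentionalFusion(dim)
        self.head = nn.Sequential(
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
            nn.Linear(dim, 64), nn.ReLU(inplace=True), nn.Linear(64, 1),
        )

    def forward(self, img):
        size = (7, 7)  # common spatial grid before fusion
        f_e = nn.functional.adaptive_avg_pool2d(
            self.proj_edge(self.branch_edge(img)), size)
        f_t = nn.functional.adaptive_avg_pool2d(
            self.proj_texture(self.branch_texture(img)), size)
        f_s = nn.functional.adaptive_avg_pool2d(
            self.proj_semantic(self.branch_semantic(img)), size)
        fused = self.fuse2(self.fuse1(f_e, f_t), f_s)
        return self.head(fused).squeeze(-1)  # predicted quality score per image


if __name__ == "__main__":
    model = FusionBIQA()
    scores = model(torch.randn(2, 3, 224, 224))
    print(scores.shape)  # torch.Size([2])
```

In this sketch the fusion weights are computed from both a point-wise (local) branch and a globally pooled (channel-wise) branch, so the blend of two feature maps is learned rather than fixed, which mirrors the abstract's motivation for replacing plain addition in hierarchical feature fusion.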
ISSN:0018-9316
1557-9611
DOI:10.1109/TBC.2022.3204235