No-Reference Quality Assessment for Stereoscopic Images Based on Binocular Quality Perception

Bibliographic Details
Published in: IEEE Transactions on Circuits and Systems for Video Technology, 2014-04, Vol. 24 (4), p. 591-602
Main Authors: Ryu, Seungchul; Sohn, Kwanghoon
Format: Article
Language:English
Summary: Quality perception of 3-D images is one of the most important factors in accelerating advances in 3-D imaging fields. Despite active research in recent years toward understanding the quality perception of 3-D images, binocular quality perception of asymmetric distortions in stereoscopic images is not thoroughly understood. In this paper, we explore the relationship between the perceptual quality of stereoscopic images and visual information, and introduce a model for binocular quality perception. Based on this model, a no-reference quality metric for stereoscopic images is proposed. The proposed metric is a top-down method that models the binocular quality perception of the human visual system with respect to blurriness and blockiness. Perceptual blurriness and blockiness scores of the left and right images are computed using local blurriness, local blockiness, and visual saliency information, and are then combined into an overall quality index using the binocular quality perception model. Experiments on image and video databases show that the proposed metric correlates consistently with subjective quality scores. The results also show that the proposed metric outperforms existing full-reference methods even though it is a no-reference approach.
ISSN: 1051-8215
EISSN: 1558-2205
DOI: 10.1109/TCSVT.2013.2279971
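
As an illustration of the pipeline the abstract describes (per-view blurriness and blockiness pooled with visual saliency, then combined binocularly), here is a minimal Python sketch. Every ingredient is a simplified stand-in assumed for illustration: the gradient-based blur proxy, the 8x8 boundary blockiness heuristic, the contrast-based saliency map, and the distortion-weighted binocular combination are not the authors' published formulations.

import numpy as np
from scipy import ndimage

def local_blurriness(img):
    # Proxy for local blur: low gradient energy -> high blurriness.
    # (Assumed stand-in, not the paper's measure.)
    grad = np.hypot(ndimage.sobel(img, axis=1), ndimage.sobel(img, axis=0))
    return 1.0 / (1.0 + grad)

def local_blockiness(img, block=8):
    # Proxy for blockiness: mean luminance jump across 8x8 block
    # boundaries, a common heuristic for block-coded images.
    h = np.abs(np.diff(img, axis=1))[:, block - 1::block]
    v = np.abs(np.diff(img, axis=0))[block - 1::block, :]
    return h.mean() + v.mean()

def saliency_map(img):
    # Placeholder saliency: local contrast against a blurred background,
    # normalized to sum to one so it acts as a pooling weight.
    s = np.abs(img - ndimage.gaussian_filter(img, sigma=3))
    return s / (s.sum() + 1e-12)

def monocular_distortion(img):
    # Saliency-weighted pooling of local blurriness, plus blockiness.
    # Higher value = more visible distortion.
    blur = float((local_blurriness(img) * saliency_map(img)).sum())
    return blur + local_blockiness(img)

def binocular_quality_index(left, right):
    # Asymmetric combination: the more distorted view receives the larger
    # weight, a crude stand-in for the binocular quality perception model.
    dl, dr = monocular_distortion(left), monocular_distortion(right)
    wl = dl / (dl + dr + 1e-12)
    return wl * dl + (1.0 - wl) * dr

# Example on a synthetic asymmetric pair: the right view is blurred,
# so it dominates the combined distortion score.
left = np.random.rand(240, 320)
right = ndimage.gaussian_filter(left, sigma=2.0)
print(binocular_quality_index(left, right))

In the paper itself, the per-view scores and the combination weights are derived from the proposed perception model and validated against subjective scores; this sketch only mirrors the structure of that computation.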