Quality assessment of 3D synthesized images via disoccluded region discovery

Bibliographic Details
Main Authors: Yu Zhou, Leida Li, Ke Gu, Yuming Fang, Weisi Lin
Format: Conference Proceeding
Language: English
Description
Summary: Depth-Image-Based Rendering (DIBR) is fundamental to free-viewpoint 3D video and has been widely used to generate synthesized views from multi-view images. Most DIBR algorithms produce disoccluded regions: areas that are invisible in the original views but emerge in the synthesized views. The quality of synthesized images is degraded mainly by distortions in these disoccluded regions. Unfortunately, traditional image quality metrics are not effective for synthesized images because they are sensitive to geometric distortions. To address this problem, this paper proposes an objective quality evaluation method for 3D Synthesized images via Disoccluded Region Discovery (SDRD). A self-adaptive scale transform model is first adopted to preprocess the images in order to account for the impact of viewing distance. Disoccluded regions are then detected from the absolute difference between the preprocessed synthesized image and the warped version of the preprocessed reference image. These regions are further weighted by a proposed weighting function that accounts for the varying sensitivity of human eyes to the size of disoccluded regions. Experiments conducted on the IRCCyN/IVC DIBR image database demonstrate that the proposed SDRD method remarkably outperforms traditional 2D metrics and existing DIBR-related quality metrics.
ISSN: 2381-8549
DOI: 10.1109/ICIP.2016.7532510
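
Note: The summary describes a three-step pipeline (scale-transform preprocessing, disocclusion detection from absolute differences against a warped reference, and size-aware weighting). The Python sketch below illustrates only the detection and size-weighting ideas for grayscale images given as 2D NumPy arrays; the difference threshold and the power-law exponent are assumed, illustrative values, and the paper's self-adaptive scale transform and exact weighting function are not reproduced here.

    import numpy as np
    from scipy import ndimage

    def detect_disoccluded_regions(synth, warped_ref, diff_threshold=25.0):
        # Disocclusions show up as large absolute differences between the
        # synthesized image and the reference image warped to the same viewpoint.
        # diff_threshold is an assumed value, not taken from the paper.
        diff = np.abs(synth.astype(np.float64) - warped_ref.astype(np.float64))
        return diff > diff_threshold  # boolean disocclusion mask

    def size_weighted_score(mask):
        # Label connected disoccluded regions and penalize each one by its
        # relative size, reflecting the idea that larger holes are more
        # visible to the human eye (illustrative power-law weight).
        labels, num_regions = ndimage.label(mask)
        if num_regions == 0:
            return 1.0  # no disocclusion detected -> best score by this convention
        total_pixels = mask.size
        penalty = 0.0
        for region_id in range(1, num_regions + 1):
            size = np.count_nonzero(labels == region_id)
            penalty += (size / total_pixels) ** 0.5  # assumed exponent
        return 1.0 / (1.0 + penalty)  # map accumulated penalty to a (0, 1] score

A typical use would compute the mask from the preprocessed synthesized image and the warped preprocessed reference, then report size_weighted_score(mask) as a rough quality estimate; higher values indicate fewer or smaller disoccluded regions.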