
No reference quality assessment for Thangka color image based on superpixel

Bibliographic Details
Published in: Journal of Visual Communication and Image Representation, 2019-02, Vol. 59, p. 407-414
Main Authors: Hu, Wenjin; Ye, Yuqi; Meng, Jiahao; Zeng, Fuliang
Format: Article
Language:English
Description
Summary: Because a large number of Thangka images have lost part of their color information to time and environmental factors, existing image evaluation methods are inconsistent with the results of subjective evaluation. This paper addresses the evaluation of damaged Thangka color images and proposes a new image quality evaluation method based on superpixels and color entropy. The algorithm exploits the color uniformity of Thangka images to extract color features in the CIE 1976 L*a*b* (CIELAB) color space over superpixels, so the loss of color information in complex regions of Thangka images is handled well. Color entropy is used to quantify the color distribution and structural characteristics of each superpixel, yielding a preliminary evaluation score. Finally, a large amount of additional data is generated through operations such as image deformation and rotation using Generative Adversarial Nets (GANs), which makes the final evaluation score more reliable. Experimental results show that the method achieves good consistency with subjective results: its Spearman rank-order correlation coefficient (SROCC) and Pearson linear correlation coefficient (PLCC) both exceed 0.9.
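The per-superpixel color-entropy idea in the summary can be sketched in a few lines. This is a minimal illustration, not the authors' implementation: it substitutes a fixed grid of tiles for true superpixels (the paper uses content-adaptive superpixels in CIELAB space), works on raw color channels, and the function names and parameters (`color_entropy`, `quality_score`, `block`, `bins`) are hypothetical.

```python
import numpy as np

def color_entropy(values, bins=16):
    """Shannon entropy (bits) of a quantized color channel within one region."""
    hist, _ = np.histogram(values, bins=bins, range=(0.0, 1.0))
    p = hist / hist.sum()
    p = p[p > 0]  # 0 * log(0) is defined as 0, so drop empty bins
    return float(-np.sum(p * np.log2(p)))

def quality_score(image, block=8, bins=16):
    """Average per-region color entropy as a crude no-reference score.

    `image` is an H x W x 3 float array in [0, 1]. Each block x block tile
    stands in for a superpixel; real superpixels would follow image content.
    Regions with lost color information have flatter histograms per region
    boundary, shifting the aggregate entropy and hence the score.
    """
    h, w, _ = image.shape
    scores = []
    for y in range(0, h - block + 1, block):
        for x in range(0, w - block + 1, block):
            tile = image[y:y + block, x:x + block]
            # entropy of each channel, averaged over the three channels
            scores.append(np.mean([color_entropy(tile[..., c], bins)
                                   for c in range(3)]))
    return float(np.mean(scores))
```

A uniformly colored tile has zero entropy, while richly varied regions score higher, which is the contrast the preliminary score relies on before the GAN-augmented refinement step.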
ISSN: 1047-3203, 1095-9076
DOI: 10.1016/j.jvcir.2019.01.039