Contrastive structure and texture fusion for image inpainting
Published in: Neurocomputing (Amsterdam), 2023-06, Vol. 536, p. 1-12
Main Authors:
Format: Article
Language: English
Summary: Most recent U-Net based models have shown promising results on challenging tasks in the image inpainting field. However, they often generate content with blurred textures and distorted structures due to the lack of semantic consistency and texture continuity in the missing regions. In this paper, we propose to restore the missing areas at both the structural and the textural level. Our method is built upon a U-Net structure, which repairs images by extracting semantic information from high to low resolution and then decoding it back to the original image. Specifically, we utilize the high-level semantic features learned in the encoder to guide the inpainting of structure-aware features in the adjacent low-level feature map. Meanwhile, low-level feature maps have clearer texture than high-level ones and can serve as a prior for the textural repair of high-level feature maps. Subsequently, a module fuses the two repaired feature maps (i.e., the structure-aware and texture-aware features) and produces a feature map with consistent semantics. Moreover, in order to learn more representative high-level semantic features, we design the model as a siamese network for contrastive learning. Experiments on practical data show that our method outperforms other state-of-the-art methods.
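The record does not specify the contrastive objective used by the siamese branch; a common choice in siamese contrastive setups is an InfoNCE-style loss, where each anchor embedding is pulled toward its matching (e.g., augmented) view and pushed away from the other samples in the batch. A minimal NumPy sketch under that assumption (all names hypothetical, not the paper's implementation):

```python
import numpy as np

def info_nce_loss(anchors, positives, temperature=0.1):
    """InfoNCE loss between anchor and positive embeddings.

    One positive per anchor (matched by row index); the remaining
    rows in the batch serve as in-batch negatives.
    """
    # L2-normalize so dot products become cosine similarities.
    a = anchors / np.linalg.norm(anchors, axis=1, keepdims=True)
    p = positives / np.linalg.norm(positives, axis=1, keepdims=True)
    logits = a @ p.T / temperature               # (N, N) similarity matrix
    logits -= logits.max(axis=1, keepdims=True)  # numerical stability
    log_prob = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))
    # The matching pair sits on the diagonal; minimize its negative log-prob.
    return -np.mean(np.diag(log_prob))

rng = np.random.default_rng(0)
z = rng.standard_normal((8, 16))
noisy = z + 0.05 * rng.standard_normal((8, 16))   # lightly perturbed view
loss_matched = info_nce_loss(z, noisy)
loss_random = info_nce_loss(z, rng.standard_normal((8, 16)))
```

With matched views the diagonal dominates the similarity matrix, so the loss is much lower than with unrelated positives, which is the signal that drives the encoder toward representative high-level semantics.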
ISSN: 0925-2312, 1872-8286
DOI: 10.1016/j.neucom.2023.03.014