Effective background removal method based on generative adversary networks
Published in: Journal of Electronic Imaging, 2020-09, Vol. 29 (5), p. 053014-053014
Main Authors: , , ,
Format: Article
Language: English
Summary: Removing cluttered backgrounds from hand gesture images is a challenging problem. The popular approach, semantic image segmentation, still cannot handle fine-grained background removal well because training samples are insufficient. We are the first to propose a background removal method based on a conditional generative adversarial network (CGAN). With the CGAN, our method translates images with backgrounds into images without backgrounds. Rather than relying on the traditional, complex image-to-semantics pipeline, the proposed method performs an image-to-image task: a generator produces the image without background, and a discriminator decides whether backgrounds remain in the output images. By iteratively training the generator and discriminator, two goals are fulfilled: (i) improving the discriminator's ability to recognize whether the generated images contain backgrounds and (ii) enhancing the generator's ability to remove backgrounds. For our study, a large number of gesture images were collected and simulated for the experiments. The results demonstrate that the proposed method achieves remarkable performance in background removal across different gesture images. The training is robust, and the simple network generalizes among different hand gestures.
ISSN: 1017-9909, 1560-229X
DOI: 10.1117/1.JEI.29.5.053014
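The summary describes alternating training of a generator (which removes backgrounds) and a discriminator (which judges whether a background remains). A minimal sketch of the two loss terms such a pix2pix-style CGAN typically optimizes is below; the article does not publish its code, so the function names, the L1 reconstruction term, and the weight `l1_weight=100.0` are assumptions for illustration, not the authors' implementation.

```python
import numpy as np

def discriminator_loss(d_real, d_fake):
    """Binary cross-entropy for the discriminator.

    d_real: discriminator scores on real background-free images (should -> 1).
    d_fake: discriminator scores on generator outputs (should -> 0).
    """
    return -(np.log(d_real) + np.log(1.0 - d_fake)).mean()

def generator_loss(d_fake, target, generated, l1_weight=100.0):
    """Generator objective: fool the discriminator, stay close to the target.

    Adversarial term pushes d_fake toward 1; the L1 term (an assumption,
    common in image-to-image CGANs) keeps the output near the ground-truth
    background-free image.
    """
    adversarial = -np.log(d_fake).mean()
    reconstruction = np.abs(target - generated).mean()
    return adversarial + l1_weight * reconstruction

# In training, the two losses would be minimized in alternation:
# one step lowering discriminator_loss, one step lowering generator_loss.
```

A perfect discriminator (scoring real images 1.0 and fakes 0.0) drives `discriminator_loss` to zero, while a generator whose outputs both fool the discriminator and match the target drives `generator_loss` to zero, which is the two-goal iteration the summary outlines.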