Mutually activated residual linear modeling GAN for pose-guided person image generation
Published in: Neurocomputing (Amsterdam), 2022-12, Vol. 514, pp. 451-463
Format: Article
Language: English
Summary: Translating the pose of a given person into another desired pose is a popular task in computer vision applications. However, previous works usually used pose information directly to guide appearance information during generation, without deeply considering the interaction between these two kinds of information. Moreover, the global long-range relations that exist in both kinds of data have not been well modeled, owing to the local design of convolutional filters. In this paper, a novel Mutually Activated Residual Linear Modeling Generative Adversarial Network (MARLM-GAN) is proposed to address these two challenges. MARLM-GAN consists of T cascaded MARLM modules that progressively learn the latent transformation from both appearance and pose codes. Each MARLM module contains two mutually activated residual linear modeling blocks, one for the appearance pathway and one for the pose pathway. In addition, an information update strategy is developed that lets the latent appearance and pose representations benefit from each other interactively. Experiments on two challenging datasets demonstrate that the proposed MARLM-GAN achieves competitive results in terms of objective evaluation metrics and subjective visual realism compared with recent state-of-the-art methods.
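The abstract describes a cascade of T modules in which appearance and pose features activate each other and both latent codes are updated interactively. The paper's concrete block design is not reproduced in this record, so the following is only a minimal PyTorch sketch of that cascaded mutual-activation pattern; the class names, tensor shapes, and the use of sigmoid gating as the "mutual activation" are assumptions for illustration, not the authors' implementation.

```python
import torch
import torch.nn as nn

class MARLMBlock(nn.Module):
    """Hypothetical residual linear modeling block: a linear layer over
    the feature dimension provides global (non-convolutional) mixing,
    gated by the other pathway's features. Details are assumed."""
    def __init__(self, dim):
        super().__init__()
        self.linear = nn.Linear(dim, dim)  # long-range linear modeling
        self.gate = nn.Linear(dim, dim)    # gate driven by the other pathway
        self.norm = nn.LayerNorm(dim)

    def forward(self, x, other):
        # The other pathway "activates" this one; a residual connection
        # preserves the incoming representation.
        g = torch.sigmoid(self.gate(other))
        return x + g * self.linear(self.norm(x))

class MARLMModule(nn.Module):
    """One cascaded stage: appearance and pose pathways activate each
    other, then both codes are passed on (the information update)."""
    def __init__(self, dim):
        super().__init__()
        self.app_block = MARLMBlock(dim)
        self.pose_block = MARLMBlock(dim)

    def forward(self, app, pose):
        app_new = self.app_block(app, pose)    # pose activates appearance
        pose_new = self.pose_block(pose, app)  # appearance activates pose
        return app_new, pose_new

class MARLMCascade(nn.Module):
    """T cascaded MARLM modules learning the latent transformation
    progressively from appearance and pose codes."""
    def __init__(self, dim, T=4):
        super().__init__()
        self.stages = nn.ModuleList(MARLMModule(dim) for _ in range(T))

    def forward(self, app, pose):
        for stage in self.stages:
            app, pose = stage(app, pose)
        return app, pose

# Usage sketch: latent appearance and pose codes as token sequences.
app = torch.randn(2, 64, 256)   # (batch, tokens, dim) -- shapes assumed
pose = torch.randn(2, 64, 256)
out_app, out_pose = MARLMCascade(dim=256, T=4)(app, pose)
```

The key point the sketch captures is structural: each stage updates both representations using the other, so information flows between the appearance and pose pathways at every level of the cascade rather than pose conditioning appearance only once.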
ISSN: 0925-2312, 1872-8286
DOI: 10.1016/j.neucom.2022.09.089