AUPOD: End-to-End Automatic Poster Design by Self-Supervision
Published in: IEEE Access, 2022, Vol. 10, pp. 47348-47360
Format: Article
Language: English
Summary: Automatic design has become a popular topic in the application of computer vision technologies. Previous methods for automatic design are mostly saliency-based, relying on an off-the-shelf model for saliency map detection and hand-crafted aesthetic rules to rank multiple proposals. We argue that multi-stage generation and excessive reliance on saliency maps have hindered progress toward better automatic design solutions. In this work, we explore the possibility of a saliency-free solution in a representative scenario: automatic poster design. We propose a novel end-to-end framework that divides the automatic poster design problem into two sub-tasks, layout prediction and attribute identification. We design a neural network based on multi-modality feature extraction to learn the two sub-tasks jointly, and we train it with supervision automatically extracted from semi-structured posters, bypassing a large amount of otherwise required manual labor. Both qualitative and quantitative results show the strong performance of our end-to-end approach after discarding the explicit saliency detection module. Our self-supervised system performs well on automatic design by learning aesthetic constraints implicitly in the neural network.
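The abstract describes a shared multi-modal encoder feeding two jointly trained heads: layout prediction (box coordinates) and attribute identification (class probabilities), with targets extracted automatically from semi-structured posters. The sketch below illustrates that joint two-head objective in pure Python; all dimensions, weights, and targets are hypothetical placeholders, not values from the paper.

```python
import math
import random

random.seed(0)

# Hypothetical sizes -- illustrative only, not from the paper.
FEAT = 16    # shared multi-modal feature size
N_BOX = 4    # layout head output: one box (x, y, w, h) in [0, 1]
N_ATTR = 5   # attribute head output: e.g. font/colour classes

def rand_matrix(rows, cols):
    return [[random.gauss(0, 0.1) for _ in range(cols)] for _ in range(rows)]

def matvec(m, v):
    return [sum(w * x for w, x in zip(row, v)) for row in m]

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def softmax(xs):
    hi = max(xs)
    exps = [math.exp(x - hi) for x in xs]
    total = sum(exps)
    return [e / total for e in exps]

# Shared encoder output for one poster (stand-in for the fused
# image + text features a multi-modality extractor would produce).
feat = [random.gauss(0, 1) for _ in range(FEAT)]

# Two task-specific linear heads on top of the same shared feature.
W_layout = rand_matrix(N_BOX, FEAT)
W_attr = rand_matrix(N_ATTR, FEAT)

# Layout prediction: box coordinates squashed into [0, 1].
box = [sigmoid(z) for z in matvec(W_layout, feat)]

# Attribute identification: a distribution over attribute classes.
attr_probs = softmax(matvec(W_attr, feat))

# Self-supervised targets that would be parsed from a
# semi-structured poster file (placeholder values here).
box_target = [0.1, 0.7, 0.8, 0.2]
attr_target = 3

# Joint objective: regression loss + classification loss, so both
# sub-tasks back-propagate through the same shared encoder.
layout_loss = sum((p - t) ** 2 for p, t in zip(box, box_target)) / N_BOX
attr_loss = -math.log(attr_probs[attr_target])
joint_loss = layout_loss + attr_loss
```

Summing the two losses is the simplest way to couple the sub-tasks; in practice a weighting between them would be a tunable hyperparameter.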
ISSN: 2169-3536
DOI: 10.1109/ACCESS.2022.3171033