
TET-GAN: Text Effects Transfer via Stylization and Destylization

Bibliographic Details
Published in: arXiv.org, 2018-12
Main Authors: Yang, Shuai; Liu, Jiaying; Wang, Wenjing; Guo, Zongming
Format: Article
Language: English
Subjects: Representations; State of the art; Technology transfer
EISSN: 2331-8422
Publisher: Cornell University Library, arXiv.org (Ithaca)

Description: Text effects transfer technology automatically makes text dramatically more impressive. However, previous style transfer methods either learn models for general styles, which cannot handle the highly structured text effects that follow the glyph, or require the manual design of subtle matching criteria for text effects. In this paper, we focus on exploiting the powerful representation abilities of deep neural features for text effects transfer. To this end, we propose a novel Texture Effects Transfer GAN (TET-GAN), which consists of a stylization subnetwork and a destylization subnetwork. The key idea is to train the network to accomplish both style transfer and style removal, so that it learns to disentangle and recombine the content and style features of text effects images. To support the training of the network, we propose a new text effects dataset with 64 professionally designed styles on 837 characters. We show that the disentangled feature representations enable us to transfer or remove all of these styles on arbitrary glyphs using a single network. Furthermore, the flexible network design allows TET-GAN to be efficiently extended to a new text style via one-shot learning, where only one example is required. We demonstrate the superiority of the proposed method over the state-of-the-art methods in generating high-quality stylized text.
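
As a rough illustration of the training scheme the abstract describes (joint stylization and destylization through shared content features), here is a minimal PyTorch sketch. Everything in it is an illustrative assumption rather than the paper's actual architecture: the module shapes, the plain L1 losses, and names such as `train_step` are made up for this sketch, and the real TET-GAN additionally trains adversarial discriminators, which are omitted here.

```python
# Minimal sketch of the TET-GAN training idea: one network trained for both
# stylization and destylization so content and style features disentangle.
# All architectures and losses below are illustrative assumptions.
import torch
import torch.nn as nn

def conv_block(cin, cout):
    return nn.Sequential(
        nn.Conv2d(cin, cout, 4, stride=2, padding=1),
        nn.InstanceNorm2d(cout),
        nn.ReLU(inplace=True),
    )

def deconv_block(cin, cout):
    return nn.Sequential(
        nn.ConvTranspose2d(cin, cout, 4, stride=2, padding=1),
        nn.InstanceNorm2d(cout),
        nn.ReLU(inplace=True),
    )

class Encoder(nn.Module):
    """Maps a 3x256x256 image to a 256-channel feature map."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            conv_block(3, 64), conv_block(64, 128), conv_block(128, 256))
    def forward(self, img):
        return self.net(img)

class Decoder(nn.Module):
    """Maps a feature map back to a 3-channel image in [-1, 1]."""
    def __init__(self, cin):
        super().__init__()
        self.net = nn.Sequential(
            deconv_block(cin, 128), deconv_block(128, 64),
            nn.ConvTranspose2d(64, 3, 4, stride=2, padding=1), nn.Tanh())
    def forward(self, feat):
        return self.net(feat)

content_enc = Encoder()    # shared content encoder
style_enc = Encoder()      # style encoder for text effects images
destylizer = Decoder(256)  # content features -> plain glyph
stylizer = Decoder(512)    # content + style features -> stylized glyph

l1 = nn.L1Loss()
opt = torch.optim.Adam(
    [p for m in (content_enc, style_enc, destylizer, stylizer)
     for p in m.parameters()], lr=2e-4)

def train_step(glyph, styled, style_ref):
    """glyph: plain text image; styled: the same glyph with the target
    effect; style_ref: a *different* glyph rendered in the same effect."""
    opt.zero_grad()
    # Destylization objective: strip the effect, recover the raw glyph.
    content = content_enc(styled)
    loss_desty = l1(destylizer(content), glyph)
    # Stylization objective: re-apply the effect taken from another example.
    fused = torch.cat([content_enc(glyph), style_enc(style_ref)], dim=1)
    loss_sty = l1(stylizer(fused), styled)
    loss = loss_desty + loss_sty
    loss.backward()
    opt.step()
    return loss.item()

# Smoke test with random tensors standing in for dataset images.
x = torch.randn(2, 3, 256, 256)
print(train_step(x, torch.randn_like(x), torch.randn_like(x)))
```

Training both directions through the shared content encoder is what would force the disentanglement: the content code must suffice to reconstruct the plain glyph on its own, yet also combine with a style code extracted from a different character to reconstruct the effect. The one-shot extension the abstract mentions would then amount to briefly fine-tuning on a single new (glyph, styled) pair.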