Few-shot Controllable Style Transfer for Low-Resource Multilingual Settings
Published in: | arXiv.org, 2022-03 |
---|---|
Main Authors: | Krishna, Kalpesh; Nathani, Deepak; Garcia, Xavier; Samanta, Bidisha; Talukdar, Partha |
Format: | Article |
Language: | English |
Subjects: | Annotations; Datasets; Languages; Stability |
Online Access: | Get full text |
container_title | arXiv.org |
---|---|
creator | Krishna, Kalpesh; Nathani, Deepak; Garcia, Xavier; Samanta, Bidisha; Talukdar, Partha |
description | Style transfer is the task of rewriting a sentence into a target style while approximately preserving content. While most prior literature assumes access to a large style-labelled corpus, recent work (Riley et al. 2021) has attempted "few-shot" style transfer using only 3-10 sentences at inference for style extraction. In this work we study a relevant low-resource setting: style transfer for languages where no style-labelled corpora are available. We notice that existing few-shot methods perform this task poorly, often copying inputs verbatim. We push the state-of-the-art for few-shot style transfer with a new method modeling the stylistic difference between paraphrases. When compared to prior work, our model achieves 2-3x better performance in formality transfer and code-mixing addition across seven languages. Moreover, our method is better at controlling the style transfer magnitude using an input scalar knob. We report promising qualitative results for several attribute transfer tasks (sentiment transfer, simplification, gender neutralization, text anonymization) all without retraining the model. Finally, we find model evaluation to be difficult due to the lack of datasets and metrics for many languages. To facilitate future research we crowdsource formality annotations for 4000 sentence pairs in four Indic languages, and use this data to design our automatic evaluations. |
format | article |
fulltext | fulltext |
identifier | EISSN: 2331-8422 |
ispartof | arXiv.org, 2022-03 |
issn | 2331-8422 |
language | eng |
recordid | cdi_proquest_journals_2582282372 |
source | Publicly Available Content Database |
subjects | Annotations; Datasets; Languages; Stability |
title | Few-shot Controllable Style Transfer for Low-Resource Multilingual Settings |