
Translating with Bilingual Topic Knowledge for Neural Machine Translation

The dominant neural machine translation (NMT) models, based on the encoder-decoder architecture, have recently achieved state-of-the-art performance. Traditionally, NMT models depend only on the representations learned during training to map a source sentence into the target domain. However, the learned representations are often implicit and inadequately informed. In this paper, we propose a novel bilingual topic enhanced NMT (BLT-NMT) model that improves translation performance by incorporating bilingual topic knowledge into NMT. Specifically, the bilingual topic knowledge is incorporated into the hidden states of both the encoder and the decoder, as well as into the attention mechanism. With this new setting, the proposed BLT-NMT has access to the background knowledge implied in bilingual topics, which lies beyond the sequential context, and the attention mechanism can attend to topic-level information when generating target words during translation. Experimental results show that the proposed model consistently outperforms the traditional RNNsearch and previous topic-informed NMT on Chinese-English and English-German translation tasks. We also introduce the bilingual topic knowledge into the newly emerged Transformer base model on English-German translation and achieve a notable improvement.

Bibliographic Details
Main Authors: Wei, Xiangpeng; Hu, Yue; Xing, Luxi; Wang, Yipeng; Gao, Li
Format: Conference Proceeding
Language: English
Published in: Proceedings of the ... AAAI Conference on Artificial Intelligence, 2019, Vol. 33 (1), p. 7257-7264
Published: 2019-07-17
DOI: 10.1609/aaai.v33i01.33017257
ISSN: 2159-5399
EISSN: 2374-3468
Source: Freely Accessible Journals
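
The abstract describes feeding bilingual topic knowledge into the encoder and decoder hidden states and into the attention mechanism, but the record contains no equations or code. The NumPy sketch below is a hypothetical illustration of one way a topic vector can enter additive (RNNsearch-style) attention; the function name, the weight matrices `W_s`, `W_h`, `W_t`, the vector `v`, and all dimensions are assumptions, not the authors' implementation.

```python
import numpy as np

def softmax(x):
    # Numerically stable softmax over a 1-D score vector.
    e = np.exp(x - x.max())
    return e / e.sum()

def topic_enhanced_attention(dec_state, enc_states, topic_vec, W_s, W_h, W_t, v):
    """Additive attention with an extra bilingual-topic term (illustrative).

    Each source position is scored from the decoder state, the encoder
    state, and a shared topic vector, so alignment can draw on background
    topic knowledge beyond the sequential context.
    """
    scores = np.array([
        v @ np.tanh(W_s @ dec_state + W_h @ h + W_t @ topic_vec)
        for h in enc_states            # one score per source position
    ])
    weights = softmax(scores)          # topic-aware attention weights
    context = weights @ enc_states     # weighted sum of encoder states
    return context, weights

# Toy usage with random parameters (all dimensions are illustrative only).
rng = np.random.default_rng(0)
d, k, a, T = 8, 4, 6, 5                # hidden, topic, attention, source length
context, weights = topic_enhanced_attention(
    rng.normal(size=d),                # decoder hidden state
    rng.normal(size=(T, d)),           # encoder hidden states
    rng.normal(size=k),                # bilingual topic vector
    rng.normal(size=(a, d)), rng.normal(size=(a, d)),
    rng.normal(size=(a, k)), rng.normal(size=a))
print(weights.round(3), context.shape)
```

Without the `W_t @ topic_vec` term this reduces to standard Bahdanau-style additive attention. Note that the paper also injects topic knowledge into the encoder and decoder hidden states themselves, which this sketch omits.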