
Natural Language Generation Using Dependency Tree Decoding for Spoken Dialog Systems

Bibliographic Details
Published in: IEEE Access, 2019, Vol. 7, pp. 7250-7258
Main Authors: Park, Youngmin; Kang, Sangwoo
Format: Article
Language:English
Description
Summary: In this paper, we propose a new natural language generation (NLG) method for spoken dialog systems and demonstrate its capacity. Studies on NLG often employ sequence decoding, which generates the words of a sentence in sequential order, feeding each generated word as input to the next step. In contrast, we propose a decoding method that generates words by traversing a dependency tree, feeding the decoder the pair consisting of a parent and a sibling in the tree. As a result, the most important words are generated first, enabling the words most relevant to each prediction to be fed into the process. At prediction time, our model generates dependency trees and converts them into sentences. The proposed decoding method was evaluated by re-implementing a semantically controlled long short-term memory (LSTM) structure for NLG, with the input and predicted sequences converted to allow dependency tree decoding. The experimental results indicate that the proposed approach, dependency tree decoding, substantially improves the BLEU score and naturalness. Furthermore, when generating n-best sentences with dependency tree decoding, the word diversity of the output sentences increased by approximately 6%, yielding more diverse sentence patterns.
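
The decoding order described in the summary can be illustrated with a small sketch. Everything below (the Node class, the toy_decoder lookup table, and the example sentence) is a hypothetical stand-in for the paper's semantically controlled LSTM; only the traversal logic, in which each dependent is predicted from its parent and previous sibling, heads are generated before their modifiers, and an in-order walk linearizes the tree, reflects what the abstract describes.

```python
# A minimal sketch of dependency tree decoding, assuming a toy lookup table
# in place of the paper's neural decoder. A real system would score
# vocabulary items with an LSTM conditioned on the same
# (parent, previous-sibling) pair.
from dataclasses import dataclass, field
from typing import List, Optional, Tuple

@dataclass
class Node:
    word: str
    left: List["Node"] = field(default_factory=list)    # dependents to the left
    right: List["Node"] = field(default_factory=list)   # dependents to the right

def toy_decoder(parent: str, sibling: Optional[str]) -> Optional[Tuple[str, str]]:
    """Hypothetical next-dependent predictor: given the parent word and the
    previously generated sibling, return (word, side) or None to stop."""
    table = {
        ("<root>", None): ("booked", "right"),   # the head word comes first
        ("booked", None): ("I", "left"),
        ("booked", "I"): ("table", "right"),
        ("table", None): ("a", "left"),
    }
    return table.get((parent, sibling))

def expand(node: Node) -> None:
    """Generate the dependents of `node`, feeding the decoder the pair
    (parent word, previous sibling word)."""
    sibling = None
    while True:
        prediction = toy_decoder(node.word, sibling)
        if prediction is None:                   # decoder emitted a stop symbol
            break
        word, side = prediction
        child = Node(word)
        (node.left if side == "left" else node.right).append(child)
        expand(child)        # heads are generated before their own modifiers
        sibling = word

def linearize(node: Node) -> List[str]:
    """Convert the finished dependency tree into a word sequence (in-order)."""
    words: List[str] = []
    for child in node.left:
        words.extend(linearize(child))
    words.append(node.word)
    for child in node.right:
        words.extend(linearize(child))
    return words

head, _ = toy_decoder("<root>", None)
root = Node(head)
expand(root)
print(" ".join(linearize(root)))                 # -> "I booked a table"
```

Note that the generation order here ("booked", "I", "table", "a") differs from the surface order: the head word is produced first, which is the property the abstract credits with feeding more relevant context into each prediction.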
ISSN: 2169-3536
DOI: 10.1109/ACCESS.2018.2889556