
Latent semantic modeling for slot filling in conversational understanding


Bibliographic Details
Main Authors: Tur, Gokhan, Celikyilmaz, Asli, Hakkani-Tur, Dilek
Format: Conference Proceeding
Language: English
Description
Summary: In this paper, we propose a new framework for semantic template filling in a conversational understanding (CU) system. Our method decomposes the task into two steps: latent n-gram clustering using a semi-supervised latent Dirichlet allocation (LDA) and sequence tagging for learning semantic structures in a CU system. Latent semantic modeling has been investigated to improve many natural language processing tasks such as syntactic parsing or topic tracking. However, due to several complexity problems caused by issues involving utterance length or dialog corpus size, it has not been analyzed directly for semantic parsing tasks. In this paper, we propose extending the LDA by introducing prior knowledge we obtain from semantic knowledge bases. Then, the topic posteriors obtained from the new LDA model are used as additional constraints to a sequence learning model for the semantic template filling task. The experimental results show significant performance gains on semantic slot filling models when features from latent semantic models are used in a conditional random field (CRF).
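The pipeline the summary describes, fitting an LDA topic model over n-grams and feeding the resulting topic posteriors into a sequence tagger as extra features, can be sketched as follows. This is a minimal illustrative approximation, not the paper's method: it uses scikit-learn's plain (unsupervised) LDA rather than the semi-supervised variant with knowledge-base priors, and the utterances, feature names, and topic count are all invented for the example. The feature dicts produced are the kind of input a CRF toolkit (e.g. sklearn-crfsuite) would consume.

```python
# Hypothetical sketch: derive per-utterance LDA topic posteriors and attach
# them as real-valued features for each token of a CRF slot-filling tagger.
# The corpus, topic count, and feature names below are illustrative only.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.decomposition import LatentDirichletAllocation

utterances = [
    "show me flights from boston to denver",
    "i need a flight to san francisco tomorrow",
    "list morning flights from denver to boston",
]

# Bag-of-n-grams representation (unigrams + bigrams) of whole utterances.
vectorizer = CountVectorizer(ngram_range=(1, 2))
X = vectorizer.fit_transform(utterances)

# Plain LDA stand-in; the paper's model is semi-supervised, with priors
# injected from semantic knowledge bases, which this sketch omits.
lda = LatentDirichletAllocation(n_components=3, random_state=0)
lda.fit(X)

def token_features(tokens, i, topic_posterior):
    """Build a CRF feature dict for token i, augmented with topic posteriors."""
    feats = {
        "word": tokens[i],
        "prev": tokens[i - 1] if i > 0 else "<s>",
        "next": tokens[i + 1] if i < len(tokens) - 1 else "</s>",
    }
    # Each topic's posterior probability becomes an extra real-valued feature.
    for k, p in enumerate(topic_posterior):
        feats[f"topic_{k}"] = float(p)
    return feats

sent = utterances[0].split()
posterior = lda.transform(vectorizer.transform([utterances[0]]))[0]
features = [token_features(sent, i, posterior) for i in range(len(sent))]
print(features[0])
```

In this toy version the same utterance-level posterior is copied onto every token; the paper instead clusters n-grams, so a finer-grained, token-local posterior would be used in practice.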
ISSN: 1520-6149, 2379-190X
DOI: 10.1109/ICASSP.2013.6639285