Learning Semantic Textual Similarity from Conversations

Bibliographic Details
Published in: arXiv.org, 2018-04
Main Authors: Yang, Yinfei; Yuan, Steve; Cer, Daniel; Kong, Sheng-yi; Constant, Noah; Pilar, Petr; Ge, Heming; Sung, Yun-Hsuan; Strope, Brian; Kurzweil, Ray
Format: Article
Language: English
Description
Summary: We present a novel approach to learn representations for sentence-level semantic similarity using conversational data. Our method trains an unsupervised model to predict conversational input-response pairs. The resulting sentence embeddings perform well on the semantic textual similarity (STS) benchmark and SemEval 2017's Community Question Answering (CQA) question similarity subtask. Performance is further improved by introducing multitask training combining the conversational input-response prediction task and a natural language inference task. Extensive experiments show the proposed model achieves the best performance among all neural models on the STS benchmark and is competitive with the state-of-the-art feature engineered and mixed systems in both tasks.
ISSN: 2331-8422
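
The summary describes training a model to predict conversational input-response pairs. One common way to realize such an objective is a dual encoder scored with dot products and trained against in-batch negatives; the sketch below illustrates that setup only. The toy SentenceEncoder, the dimensions, and the random data are illustrative assumptions, not the paper's actual architecture or hyperparameters.

```python
# Hypothetical sketch of an input-response prediction objective: a shared sentence
# encoder scores each conversational input against every response in the batch, and
# training maximizes the score of the true (input, response) pair (in-batch negatives).
# SentenceEncoder and all sizes here are illustrative, not taken from the paper.
import torch
import torch.nn as nn
import torch.nn.functional as F

class SentenceEncoder(nn.Module):
    """Toy bag-of-embeddings encoder standing in for the paper's sentence encoder."""
    def __init__(self, vocab_size: int, embed_dim: int = 128):
        super().__init__()
        self.embedding = nn.EmbeddingBag(vocab_size, embed_dim, mode="mean")
        self.proj = nn.Linear(embed_dim, embed_dim)

    def forward(self, token_ids: torch.Tensor) -> torch.Tensor:
        # token_ids: (batch, seq_len) -> (batch, embed_dim), L2-normalized
        return F.normalize(self.proj(self.embedding(token_ids)), dim=-1)

def input_response_loss(encoder: SentenceEncoder,
                        inputs: torch.Tensor,
                        responses: torch.Tensor) -> torch.Tensor:
    """Softmax ranking loss: each input should score its own paired response
    highest among all responses in the batch."""
    u = encoder(inputs)        # (batch, dim) input embeddings
    v = encoder(responses)     # (batch, dim) response embeddings
    scores = u @ v.T           # (batch, batch) dot-product similarities
    targets = torch.arange(scores.size(0))  # diagonal entries are the true pairs
    return F.cross_entropy(scores, targets)

# Usage with random toy data:
enc = SentenceEncoder(vocab_size=10_000)
inputs = torch.randint(0, 10_000, (8, 12))     # 8 conversational inputs
responses = torch.randint(0, 10_000, (8, 12))  # their paired responses
loss = input_response_loss(enc, inputs, responses)
loss.backward()
```

The multitask variant mentioned in the summary would additionally share this encoder with a natural language inference task; that extension is omitted from the sketch.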