Pre-trained Language Embedding-based Contextual Summary and Multi-scale Transmission Network for Aspect Extraction
Published in: Procedia Computer Science, 2020, Vol. 174, pp. 40–49
Main Authors:
Format: Article
Language: English
Summary: With the development of IoT and 5G technology, people's demand for information acquisition increasingly favors accuracy, intelligence, and timeliness. How to help designers obtain real-time information about specific products from massive volumes of online consumer reviews, and to update design strategies accordingly, has become a hot research topic. In this paper, we define the problem as an aspect extraction task and propose a novel deep learning model comprising three modules: pre-trained language model embedding, a multi-scale transmission network, and a contextual summary, which together provide an end-to-end solution without any additional supervision. To this end, we adopt BERT to overcome the disadvantage of traditional embedding methods, which cannot incorporate contextual information. The multi-scale transmission network integrates a Bi-GRU with a group of CNNs to extract sequential and local word features, respectively. The contextual summary is a tailor-made representation distilled from the input sentence, conditioned on each current word, which assists aspect prediction. Experimental results on three benchmark SemEval datasets show that our model achieves state-of-the-art performance.
ISSN: 1877-0509
DOI: 10.1016/j.procs.2020.06.054
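
The abstract above names three modules (BERT embeddings, a multi-scale transmission network combining a Bi-GRU with a group of CNNs, and a per-word contextual summary) wired into an end-to-end tagger. The following is a minimal PyTorch sketch of one plausible wiring, not the authors' implementation: the kernel sizes, the concatenation-based fusion, the bilinear attention form of the contextual summary, and the `bert-base-uncased` checkpoint are all assumptions.

```python
import torch
import torch.nn as nn
from transformers import BertModel


class MultiScaleTransmission(nn.Module):
    """Bi-GRU for sequential features plus parallel CNNs with several kernel
    sizes for local features, as the abstract describes. Fusing the two
    streams by concatenation is an assumption."""

    def __init__(self, in_dim, hidden, kernel_sizes=(1, 3, 5)):
        super().__init__()
        self.bigru = nn.GRU(in_dim, hidden, batch_first=True, bidirectional=True)
        self.convs = nn.ModuleList(
            nn.Conv1d(in_dim, hidden, k, padding=k // 2) for k in kernel_sizes
        )

    def forward(self, x):                           # x: (B, T, in_dim)
        seq, _ = self.bigru(x)                      # (B, T, 2*hidden)
        c = x.transpose(1, 2)                       # (B, in_dim, T)
        local = torch.cat([torch.relu(conv(c)) for conv in self.convs], dim=1)
        return torch.cat([seq, local.transpose(1, 2)], dim=-1)


class ContextualSummary(nn.Module):
    """Per-word summary of the whole sentence: attention over all positions,
    conditioned on the current word (one plausible reading of the abstract's
    'contextual summary'; the bilinear scoring function is an assumption)."""

    def __init__(self, dim):
        super().__init__()
        self.W = nn.Parameter(torch.randn(dim, dim) / dim ** 0.5)

    def forward(self, h):                           # h: (B, T, D)
        scores = torch.einsum("bid,de,bje->bij", h, self.W, h)
        att = torch.softmax(scores, dim=-1)         # (B, T, T)
        return att @ h                              # per-word summary: (B, T, D)


class AspectExtractor(nn.Module):
    """End-to-end model: BERT embeddings -> multi-scale transmission ->
    contextual summary -> per-token tag logits (e.g. BIO aspect tags)."""

    def __init__(self, num_tags=3, hidden=128):
        super().__init__()
        self.bert = BertModel.from_pretrained("bert-base-uncased")
        d = self.bert.config.hidden_size            # 768 for bert-base
        self.mst = MultiScaleTransmission(d, hidden)
        feat = 2 * hidden + 3 * hidden              # Bi-GRU + three CNN scales
        self.summary = ContextualSummary(feat)
        self.classifier = nn.Linear(2 * feat, num_tags)

    def forward(self, input_ids, attention_mask):
        emb = self.bert(input_ids, attention_mask=attention_mask).last_hidden_state
        h = self.mst(emb)                           # per-token word features
        s = self.summary(h)                         # sentence summary per word
        return self.classifier(torch.cat([h, s], dim=-1))
```

Used as a token classifier, the model yields one aspect tag per word piece; a hypothetical invocation, assuming the matching HuggingFace tokenizer:

```python
from transformers import BertTokenizerFast

tok = BertTokenizerFast.from_pretrained("bert-base-uncased")
batch = tok(["The battery life is great"], return_tensors="pt")
logits = AspectExtractor()(batch["input_ids"], batch["attention_mask"])
tags = logits.argmax(-1)   # one predicted BIO tag per word-piece token
```

Concatenating each word's own features with its sentence-level summary, rather than replacing one with the other, is one simple way to let the classifier weigh local evidence against global context; the paper may fuse them differently.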