Prior and contextual emotion of words in sentential context
Published in: Computer Speech & Language, 2014-01, Vol. 28 (1), p. 76-92
Main Authors:
Format: Article
Language: English
Summary:
• Words have prior emotion and emotion in context.
• We automate this distinction and explore features useful in this task.
• We assign words, by emotion or lack thereof, to seven classes, and we group our features.
• Our features work best when all groups are combined.
A set of words labeled with their prior emotion is an obvious place to start on the automatic discovery of the emotion of a sentence, but it is clear that context must also be considered. It may be that no simple function of the labels on the individual words captures the overall emotion of the sentence; words are interrelated and they mutually influence their affect-related interpretation. Quite often a word which invokes emotion appears in a neutral sentence, or a sentence with no emotional word carries an emotion; similar mismatches also occur between different emotion classes. The goal of this work is to distinguish automatically between prior and contextual emotion, with a focus on exploring the features important to this task. We present a set of features which let us take into account the contextual emotion of a word and the syntactic structure of the sentence when assigning sentences to emotion classes. The evaluation assesses the performance of different feature sets across multiple classification methods. We identify a set of features and a promising learning method which together significantly outperform two reasonable baselines. We group our features by the similarity of their nature, so another facet of our evaluation is to consider each group of features separately and investigate how much it contributes to the result. The experiments show that every group of features contributes to the result, but it is the combination of all of them that gives the best performance.
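To make the setup described in the summary concrete, here is a minimal sketch, not the authors' implementation, of combining separate feature groups for multi-class sentence emotion classification. The prior-emotion lexicon, the two feature groups (a lexicon-count group and a bag-of-words context group), the toy sentences, and the use of scikit-learn's FeatureUnion and LogisticRegression are all illustrative assumptions; the paper's actual features also exploit the contextual emotion of words and the syntactic structure of the sentence, which this sketch omits.

# Sketch only: seven-class sentence emotion classification from grouped features.
# All lexicon entries, feature groups, and toy data below are hypothetical.
import numpy as np
from sklearn.base import BaseEstimator, TransformerMixin
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.pipeline import Pipeline, FeatureUnion
from sklearn.linear_model import LogisticRegression

# Hypothetical prior-emotion lexicon: word -> prior emotion label.
PRIOR_LEXICON = {"joy": "happiness", "grief": "sadness", "furious": "anger",
                 "terrified": "fear", "disgusting": "disgust", "amazing": "surprise"}

# Six emotions plus "neutral" give the seven classes mentioned in the summary.
EMOTIONS = ["happiness", "sadness", "anger", "fear", "disgust", "surprise", "neutral"]

class PriorEmotionCounts(BaseEstimator, TransformerMixin):
    """Feature group 1: counts of lexicon words with each prior emotion."""
    def fit(self, X, y=None):
        return self
    def transform(self, X):
        rows = []
        for sentence in X:
            counts = dict.fromkeys(EMOTIONS, 0)
            for token in sentence.lower().split():
                label = PRIOR_LEXICON.get(token)
                if label:
                    counts[label] += 1
            rows.append([counts[e] for e in EMOTIONS])
        return np.asarray(rows, dtype=float)

# Feature group 2: plain bag-of-words context features.
context_features = CountVectorizer(lowercase=True)

# Combine the groups; in a paper-style evaluation one would also train on
# each group alone and compare it against the combination.
model = Pipeline([
    ("features", FeatureUnion([
        ("prior_emotion", PriorEmotionCounts()),
        ("context_bow", context_features),
    ])),
    ("classifier", LogisticRegression(max_iter=1000)),
])

# Tiny toy corpus: sentence -> contextual (sentence-level) emotion class.
sentences = ["the news filled her with joy",
             "he was furious about the delay",
             "nothing interesting happened today",
             "the grief never really left him"]
labels = ["happiness", "anger", "neutral", "sadness"]

model.fit(sentences, labels)
print(model.predict(["what an amazing and joyful surprise"]))

The same pipeline can be refit with only one of the two FeatureUnion entries to mimic the per-group evaluation the abstract describes.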
ISSN: 0885-2308, 1095-8363
DOI: 10.1016/j.csl.2013.04.009