Loading…
A Comparative Study on Feature Selection in Unbalance Text Classification
Main Author: | |
Format: | Conference Proceeding |
Language: | English |
Subjects: | |
Online Access: | Request full text |
Summary: | Feature selection plays an important role in text classification. Unbalanced text classification is a special kind of classification problem that arises widely in practice, yet there had been no systematic study of how feature selection methods perform on it. This paper is a comparative study of feature selection methods for this problem, with a focus on aggressive dimensionality reduction. Experiments were run on both Chinese and English corpora. Seven methods were evaluated: term selection based on document frequency (DF), information gain (IG), the chi-square statistic (CHI), mutual information (MI), expected cross entropy (ECE), the weight of evidence for text (WET), and odds ratio (ODD). ODD and WET were found most effective in the two-class classification task; by contrast, IG and CHI performed relatively poorly there, owing to their bias toward rare terms and their sensitivity to probability-estimation errors. In the multi-class task, however, IG and CHI performed better, while MI performed poorly. |
ISSN: | 2160-1283 |
DOI: | 10.1109/ISISE.2012.19 |
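Of the measures named in the abstract, odds ratio (ODD) is the one reported as most effective for the two-class task. A minimal sketch of odds-ratio term scoring is shown below; this is an illustrative implementation, not the authors' code — the function name, the token-set document representation, and the add-one smoothing scheme are all assumptions:

```python
import math

def odds_ratio(docs, labels, term, positive):
    """Odds-ratio score of `term` for the class `positive`.

    docs:   list of token sets, one per document
    labels: parallel list of class labels
    Add-one smoothing keeps both probabilities strictly inside (0, 1),
    so the log argument is always finite and positive.
    """
    pos = [d for d, y in zip(docs, labels) if y == positive]
    neg = [d for d, y in zip(docs, labels) if y != positive]
    p_pos = (sum(term in d for d in pos) + 1) / (len(pos) + 2)
    p_neg = (sum(term in d for d in neg) + 1) / (len(neg) + 2)
    return math.log((p_pos * (1 - p_neg)) / ((1 - p_pos) * p_neg))

# Toy two-class corpus: label 1 = spam, label 0 = news
docs = [{"spam", "buy"}, {"spam", "cheap"}, {"news", "sport"}, {"news", "buy"}]
labels = [1, 1, 0, 0]
print(odds_ratio(docs, labels, "spam", 1))  # > 0: indicative of class 1
print(odds_ratio(docs, labels, "news", 1))  # < 0: indicative of class 0
```

A positive score marks a term as evidence for the target class and a negative score as evidence against it, so ranking terms by score and keeping the extremes yields the aggressive dimensionality reduction the study focuses on.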