Hybrid Filter–Wrapper Feature Selection Method for Sentiment Classification
Published in: Arabian Journal for Science and Engineering, 2019-11, Vol. 44 (11), pp. 9191–9208
Main Authors: , ,
Format: Article
Language: English
Subjects:
Summary: Feature selection (FS) is a central challenge in sentiment classification. Filter- and wrapper-based FS methods are applied in this domain to reduce feature-set size and increase classifier accuracy. This paper proposes a hybrid filter–wrapper method for selecting relevant features. A feature subset is first selected from the original feature set using computationally fast rank-based FS methods. The selected features are then refined using two wrapper approaches: in the first, recursive feature elimination (RFE) selects an optimal feature set; in the second, an evolutionary method based on binary particle swarm optimization finalizes the feature subset. The two proposed techniques are compared on five datasets from different domains used in sentiment analysis. Simple, efficient machine-learning algorithms (Naïve Bayes, support vector machine, and logistic regression) are used to evaluate the performance of the hybrid FS techniques. Finally, the proposed hybrid FS technique is assessed against state-of-the-art methods. The results reveal that the proposed method achieves better accuracy with fewer features.
ISSN: 2193-567X, 1319-8025, 2191-4281
DOI: 10.1007/s13369-019-04064-6
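The two-stage idea described in the summary can be sketched in a few lines of scikit-learn. This is not the authors' exact pipeline: the synthetic data, the chi-squared filter, the feature counts, and the choice of logistic regression for both the RFE wrapper and the final classifier are illustrative assumptions; the paper's second wrapper (binary particle swarm optimization) is not shown here.

```python
from sklearn.datasets import make_classification
from sklearn.feature_selection import SelectKBest, chi2, RFE
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

# Synthetic stand-in for a bag-of-words sentiment matrix: many features,
# few of them informative. chi2 requires non-negative values, so shift.
X, y = make_classification(n_samples=500, n_features=1000,
                           n_informative=20, random_state=0)
X = X - X.min()

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

# Stage 1 (filter): fast rank-based selection keeps the top 200 features
# by chi-squared score.
filt = SelectKBest(chi2, k=200).fit(X_tr, y_tr)
X_tr_f, X_te_f = filt.transform(X_tr), filt.transform(X_te)

# Stage 2 (wrapper): recursive feature elimination with a linear model
# refines the filtered subset down to 50 features.
rfe = RFE(LogisticRegression(max_iter=1000),
          n_features_to_select=50).fit(X_tr_f, y_tr)

clf = LogisticRegression(max_iter=1000).fit(rfe.transform(X_tr_f), y_tr)
acc = accuracy_score(y_te, clf.predict(rfe.transform(X_te_f)))
print(f"features: 1000 -> 200 -> 50, accuracy: {acc:.3f}")
```

The filter stage is cheap because it scores each feature independently; the wrapper stage is costlier but accounts for feature interactions, which is the trade-off the hybrid design exploits.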