Theoretical evaluation of feature selection methods based on mutual information

Bibliographic Details
Published in: Neurocomputing (Amsterdam), 2017-02, Vol. 226, p. 168-181
Main Authors: Pascoal, Cláudia, Oliveira, M. Rosário, Pacheco, António, Valadas, Rui
Format: Article
Language:English
Summary: Feature selection methods are usually evaluated by wrapping specific classifiers and datasets into the evaluation process, which very often results in unfair comparisons between methods. In this work, we develop a theoretical framework for obtaining the true feature ordering of two-dimensional sequential forward feature selection methods based on mutual information. The framework is independent of entropy or mutual information estimation methods, classifiers, and datasets, and therefore leads to an unambiguous comparison of the methods. Moreover, it unveils problems intrinsic to some methods that are otherwise difficult to detect, namely inconsistencies in the construction of the objective function used to select candidate features, caused by various types of indeterminations and by the possibility that the entropy of continuous random variables takes null or negative values.

Highlights:
• Sequential forward feature selection methods are compared theoretically.
• The true feature ordering is obtained using a theoretical framework.
• Several inconsistencies in the objective functions of the methods are unveiled.
ISSN: 0925-2312, 1872-8286
DOI: 10.1016/j.neucom.2016.11.047
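
The abstract above refers to sequential forward feature selection methods whose objective functions are built from mutual information (MI) estimates, and to the fact that the differential entropy of a continuous random variable can be null or negative (for example, h(X) = ln a for X uniform on (0, a), which is non-positive whenever a <= 1), one of the sources of indetermination discussed. As a rough, non-authoritative illustration of this family of methods only, the sketch below implements a generic mRMR-style relevance-minus-redundancy criterion with scikit-learn's MI estimators; the function name, the particular criterion, and the use of scikit-learn are assumptions for illustration and do not reproduce the specific objective functions analysed in the paper.

# A minimal sketch of sequential forward feature selection driven by
# mutual information estimates (assumption: an mRMR-style criterion,
# not the exact objective functions studied in the paper).
import numpy as np
from sklearn.feature_selection import mutual_info_classif, mutual_info_regression


def forward_mi_selection(X, y, k, random_state=0):
    """Greedily order k features by estimated relevance minus redundancy."""
    n_features = X.shape[1]
    # Relevance: estimated I(X_i; Y) for every candidate feature.
    relevance = mutual_info_classif(X, y, random_state=random_state)
    selected, remaining = [], list(range(n_features))
    while len(selected) < k and remaining:
        scores = []
        for i in remaining:
            if selected:
                # Redundancy: mean estimated I(X_i; X_j) over the already selected j.
                redundancy = np.mean([
                    mutual_info_regression(X[:, [j]], X[:, i],
                                           random_state=random_state)[0]
                    for j in selected
                ])
            else:
                redundancy = 0.0
            scores.append(relevance[i] - redundancy)
        # Add the candidate feature that maximises the criterion.
        best = remaining[int(np.argmax(scores))]
        selected.append(best)
        remaining.remove(best)
    return selected


if __name__ == "__main__":
    # Tiny synthetic example: order 3 out of 8 features.
    from sklearn.datasets import make_classification
    X, y = make_classification(n_samples=300, n_features=8,
                               n_informative=3, random_state=0)
    print(forward_mi_selection(X, y, k=3))

Note that the ordering produced by such a sketch depends on the chosen MI estimator; the paper's theoretical framework is precisely aimed at obtaining the true feature ordering independently of entropy or MI estimation methods, classifiers, and datasets.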