
A Kernel Partial Least Square based feature selection method

Bibliographic Details
Published in: Pattern Recognition 2018-11, Vol.83, p.91-106
Main Authors: Talukdar, Upasana; Hazarika, Shyamanta M.; Gan, John Q.
Format: Article
Language:English
Description
Summary:
•The paper proposes a Kernel Partial Least Square (KPLS) based feature selection method aiming for easy computation and improved classification accuracy on high-dimensional data.
•The proposed method uses KPLS regression coefficients to identify an optimal set of features, thus avoiding non-linear optimisation.
•Experiments were carried out on seven real-life datasets with four different classifiers: SVM, LDA, Random Forest and Naïve Bayes.
•Experimental results highlight the advantage of the proposed method over several competing feature selection techniques.
Maximum relevance and minimum redundancy (mRMR) has been well recognised as one of the best feature selection methods. This paper proposes a Kernel Partial Least Square (KPLS) based mRMR method, aiming for easy computation and improved classification accuracy on high-dimensional data. Experiments with this approach were conducted on seven real-life datasets of varied dimensionality and number of instances, with performance measured using four different classifiers: Naïve Bayes, Linear Discriminant Analysis, Random Forest and Support Vector Machine. The experimental results demonstrate the advantage of the proposed method over several competing feature selection techniques.
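To illustrate the idea sketched in the abstract, the Python snippet below ranks features by the magnitude of PLS regression coefficients (relevance) and greedily penalises correlation with already selected features (redundancy), in the spirit of mRMR. It is a minimal sketch only: it uses scikit-learn's linear PLSRegression as a stand-in for kernel PLS, and the function name pls_mrmr_select, the coefficient-based relevance score and the correlation-based redundancy term are illustrative assumptions, not the authors' exact algorithm.

import numpy as np
from sklearn.cross_decomposition import PLSRegression

def pls_mrmr_select(X, y, n_select=10, n_components=5):
    # Relevance: magnitude of each feature's PLS regression coefficient.
    pls = PLSRegression(n_components=min(n_components, X.shape[1]))
    pls.fit(X, y)
    relevance = np.abs(pls.coef_).ravel()

    # Redundancy: absolute pairwise correlation between features.
    redundancy = np.abs(np.corrcoef(X, rowvar=False))

    # Greedy mRMR-style selection: start with the most relevant feature,
    # then repeatedly add the feature with the best relevance-minus-redundancy score.
    selected = [int(np.argmax(relevance))]
    while len(selected) < min(n_select, X.shape[1]):
        remaining = [j for j in range(X.shape[1]) if j not in selected]
        scores = [relevance[j] - redundancy[j, selected].mean() for j in remaining]
        selected.append(remaining[int(np.argmax(scores))])
    return selected

# Toy usage: features 3 and 17 drive the target and should rank near the top.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 50))
y = X[:, 3] + 0.5 * X[:, 17] + 0.1 * rng.normal(size=200)
print(pls_mrmr_select(X, y, n_select=5))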
ISSN: 0031-3203, 1873-5142
DOI: 10.1016/j.patcog.2018.05.012