A new approach for reduction of attributes based on stripped quotient sets
Published in: Pattern Recognition, 2020-01, Vol. 97, p. 106999, Article 106999
Main Authors: ,
Format: Article
Language: English
Summary:
• We propose an original method for attribute reduction based on stripped quotient sets.
• The proposed method is supported by a solid theoretical foundation and efficient algorithms.
• The proposed method achieves superior results compared with state-of-the-art methods.
Attribute reduction is a key problem in many areas such as data mining, pattern recognition, and machine learning. The problems of finding all reducts, as well as finding a minimal reduct, in a given data table have been proved to be NP-hard. To overcome this difficulty, many heuristic attribute reduction methods have been developed in recent years. In heuristic attribute reduction, accelerating the calculation of attribute significance is very important, especially in big data settings. In this paper, we first propose attribute significance measures based on stripped quotient sets. Using these measures, we then design efficient algorithms for computing the core and a reduct, and analyze their time complexity in detail. We also give properties that allow the attribute significance to be computed efficiently and that significantly reduce the data size during the calculation. From both theoretical and experimental viewpoints, we show that our method performs efficiently on large-scale data sets.
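As a rough illustration of the general heuristic the abstract describes, the sketch below greedily grows a reduct using quotient sets (partitions into equivalence classes) and a classical rough-set dependency measure. This is an assumption-laden simplification, not the paper's stripped-quotient-set algorithm: the data table, function names, and the positive-region significance measure are all illustrative choices.

```python
# Minimal sketch of greedy heuristic attribute reduction over quotient
# sets in rough set theory. NOT the paper's stripped-quotient-set
# method; the significance measure here is the plain positive-region
# (dependency) count, chosen for illustration.

def quotient_set(rows, attrs):
    """Partition row indices into equivalence classes by values on attrs."""
    classes = {}
    for i, row in enumerate(rows):
        key = tuple(row[a] for a in attrs)
        classes.setdefault(key, []).append(i)
    return list(classes.values())

def positive_region_size(rows, attrs, decision):
    """Count rows whose equivalence class is consistent on the decision."""
    total = 0
    for block in quotient_set(rows, attrs):
        labels = {rows[i][decision] for i in block}
        if len(labels) == 1:          # block lies entirely in the positive region
            total += len(block)
    return total

def greedy_reduct(rows, cond_attrs, decision):
    """Repeatedly add the most significant attribute until the reduct
    yields the same positive region as the full attribute set."""
    target = positive_region_size(rows, cond_attrs, decision)
    reduct = []
    while positive_region_size(rows, reduct, decision) < target:
        best = max(
            (a for a in cond_attrs if a not in reduct),
            key=lambda a: positive_region_size(rows, reduct + [a], decision),
        )
        reduct.append(best)
    return reduct

# Hypothetical toy decision table: "temp" alone already determines "flu".
rows = [
    {"headache": 1, "temp": "high",   "flu": 1},
    {"headache": 1, "temp": "normal", "flu": 0},
    {"headache": 0, "temp": "high",   "flu": 1},
    {"headache": 0, "temp": "normal", "flu": 0},
]
print(greedy_reduct(rows, ["headache", "temp"], "flu"))  # → ['temp']
```

The greedy loop mirrors the usual add-one-attribute-at-a-time heuristic; the paper's contribution is, per the abstract, in making the significance computation and the data it touches much cheaper, which this naive version does not attempt.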
ISSN: 0031-3203, 1873-5142
DOI: 10.1016/j.patcog.2019.106999