A label compression method for online multi-label classification
Published in: Pattern Recognition Letters, August 2018, Vol. 111, pp. 64-71
Main Authors:
Format: Article
Language: English
Summary:
• We propose a new label space reduction method for online multi-label streams.
• Different variants of the method are investigated.
• Experiments show the effectiveness of the method in comparison to other online problem transformation methods.
• Experiments show the method is competitive with PLST, a well-known offline label space reduction method.
Many modern applications deal with multi-label data, such as functional categorization of genes, image labeling, and text categorization. Classifying such data with a large number of labels and latent dependencies among them is a challenging task, and it becomes even more challenging when the data arrives online and in chunks. Many current multi-label classification methods require considerable time and memory, which makes them infeasible for practical real-world applications. In this paper, we propose a fast linear label space dimension reduction method that transforms the labels into a reduced encoded space and trains models on the obtained pseudo labels. Additionally, it provides an analytical method to update the decoding matrix, which maps the labels back into the original space and is used during the test phase. Experimental results show the effectiveness of this approach in terms of running time and prediction performance across different measures.
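The encode-train-decode pipeline the abstract describes can be sketched in a few lines. This is a minimal illustration of the generic label-compression workflow (a random encoding matrix, ridge regression on the pseudo labels, and a least-squares decoding matrix), not the paper's exact algorithm or its analytical online update; all variable names, dimensions, and the synthetic data are invented for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy multi-label data: n samples, d features, L binary labels,
# compressed to k pseudo labels (k << L in realistic settings).
n, d, L, k = 200, 20, 15, 5
X = rng.normal(size=(n, d))
W_true = rng.normal(size=(d, L))
Y = (X @ W_true + rng.normal(scale=0.5, size=(n, L)) > 0).astype(float)

# 1) Encode: project the L-dimensional labels into a k-dimensional
#    pseudo-label space (here a fixed random projection).
E = rng.normal(size=(L, k)) / np.sqrt(L)   # encoding matrix
Z = Y @ E                                  # pseudo labels, shape (n, k)

# 2) Train a linear model on the pseudo labels (ridge regression).
lam = 1e-2
W = np.linalg.solve(X.T @ X + lam * np.eye(d), X.T @ Z)  # shape (d, k)

# 3) Decode: least-squares matrix mapping pseudo labels back to the
#    original label space (the paper updates this matrix analytically
#    as new chunks arrive; here it is fit once on the batch).
D = np.linalg.lstsq(Z, Y, rcond=None)[0]   # shape (k, L)

# Test phase: predict pseudo labels, decode, threshold at 0.5.
Y_hat = ((X @ W) @ D > 0.5).astype(float)
accuracy = (Y_hat == Y).mean()
```

The appeal of this family of methods is that only k models (or one k-output model) are trained instead of L, while decoding remains a single matrix multiplication at test time.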
ISSN: 0167-8655, 1872-7344
DOI: 10.1016/j.patrec.2018.04.015