Recycling: Semi-Supervised Learning With Noisy Labels in Deep Neural Networks
Published in: IEEE Access, 2019, Vol. 7, pp. 66998-67005
Main Authors:
Format: Article
Language: English
Summary: As noisy (erroneous) labels become increasingly ubiquitous, it is important to be able to learn from them. This paper presents a method that uses semi-supervised learning to make efficient use of all training data. The proposed algorithm comprises two stages. In the first stage, a small portion of the training data, regarded as clean samples, is selected per mini-batch by the small-loss criterion with a proposed rejection ratio; this rejection ratio is determined by the number of classes, regardless of the noise ratio. In the second stage, semi-supervised learning updates the network parameters using the selected clean samples (labeled data) and the remaining noisy samples (unlabeled data), with Rényi entropy regularization for low-density separation among classes. The proposed algorithm can train deep neural networks robustly against noisy labels even when the noise ratio is unknown. Experimental results on the MNIST and CIFAR-10 datasets confirm that the proposed algorithm achieves the best accuracy among sample-selection methods.
ISSN: 2169-3536
DOI: 10.1109/ACCESS.2019.2918794
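
The abstract above describes a two-stage, per-mini-batch procedure: small-loss selection of clean samples, followed by training on the selected samples with a Rényi entropy regularizer on the rejected (treated-as-unlabeled) samples. The sketch below is a minimal illustration of that idea, not the authors' implementation: the `recycling_step` function and the `reject_ratio`, `lam`, and `alpha` parameters are assumptions, since the abstract does not give the exact formulas (in particular, the paper ties the rejection ratio to the number of classes, but that formula is not stated in this record).

```python
import torch
import torch.nn.functional as F


def renyi_entropy(probs, alpha=2.0, eps=1e-8):
    # Rényi entropy of order alpha for each row of class probabilities:
    # H_alpha(p) = 1 / (1 - alpha) * log(sum_i p_i^alpha)
    return torch.log((probs ** alpha).sum(dim=1) + eps) / (1.0 - alpha)


def recycling_step(model, optimizer, images, labels,
                   reject_ratio, lam=0.1, alpha=2.0):
    """One mini-batch update: small-loss selection + entropy regularization.

    reject_ratio: fraction of the batch treated as noisy (assumed knob; the
    paper derives it from the number of classes rather than the noise ratio).
    lam: weight of the Rényi entropy term on rejected samples (assumed).
    """
    model.train()
    logits = model(images)

    # Stage 1: rank samples by per-sample cross-entropy and keep the
    # smallest-loss fraction as "clean" (labeled) data.
    per_sample_loss = F.cross_entropy(logits, labels, reduction="none")
    batch_size = images.size(0)
    num_keep = max(1, int(round(batch_size * (1.0 - reject_ratio))))
    order = torch.argsort(per_sample_loss)
    keep_idx, reject_idx = order[:num_keep], order[num_keep:]

    # Stage 2: supervised loss on the clean subset, plus a Rényi entropy
    # term on the rejected samples; minimizing it encourages confident
    # predictions, i.e. low-density separation among classes.
    clean_loss = per_sample_loss[keep_idx].mean()
    if reject_idx.numel() > 0:
        probs = F.softmax(logits[reject_idx], dim=1)
        reg = renyi_entropy(probs, alpha).mean()
    else:
        reg = torch.zeros((), device=images.device)

    loss = clean_loss + lam * reg
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()
```

Because selection happens independently in every mini-batch and the rejected samples still contribute through the unsupervised term, no training example is discarded outright, which matches the "utilizing all training data" claim in the abstract.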