
Training extreme learning machine via regularized correntropy criterion

Bibliographic Details
Published in: Neural Computing & Applications, 2013-12, Vol. 23 (7-8), pp. 1977-1986
Main Authors: Xing, Hong-Jie; Wang, Xin-Mei
Format: Article
Language:English
Description
Summary: In this paper, a regularized correntropy criterion (RCC) for the extreme learning machine (ELM) is proposed to handle training sets contaminated by noise or outliers. In RCC, the Gaussian kernel function is used in place of the Euclidean norm of the mean square error (MSE) criterion; replacing MSE with RCC enhances the noise robustness of ELM. Moreover, the optimal weights connecting the hidden and output layers, together with the optimal bias terms, can be obtained efficiently by the half-quadratic (HQ) optimization technique in an iterative manner. Experimental results on four synthetic data sets and fourteen benchmark data sets demonstrate that the proposed method is superior to both the traditional ELM and the regularized ELM trained with the MSE criterion.
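The summary describes an ELM whose output weights are fitted under a correntropy-based loss, solved by half-quadratic optimization. Since the paper's exact update equations are not given in this record, the following is only a minimal sketch of how such an HQ loop is commonly realized as iteratively reweighted regularized least squares, with Gaussian-kernel weights on the residuals. All function names, parameter choices, and the warm start are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def train_elm_rcc(X, T, n_hidden=50, sigma=1.0, lam=1e-3, n_iter=20, seed=0):
    """Sketch: ELM output weights fitted with a correntropy-style criterion
    via half-quadratic (iteratively reweighted least squares) updates.
    Names and defaults are illustrative, not taken from the paper."""
    rng = np.random.default_rng(seed)
    n, d = X.shape
    # Standard ELM: random input weights and biases, kept fixed after sampling.
    W_in = rng.standard_normal((d, n_hidden))
    b = rng.standard_normal(n_hidden)
    H = np.tanh(X @ W_in + b)          # hidden-layer output matrix (n x n_hidden)

    # Warm start with the usual regularized MSE solution (assumption).
    beta = np.linalg.solve(H.T @ H + lam * np.eye(n_hidden), H.T @ T)

    for _ in range(n_iter):
        # HQ auxiliary variables: Gaussian-kernel weight of each sample's residual.
        residuals = T - H @ beta
        p = np.exp(-np.sum(residuals ** 2, axis=1) / (2 * sigma ** 2))
        # Weighted, regularized least-squares update of the output weights.
        HP = H.T * p                   # equals H^T diag(p)
        beta = np.linalg.solve(HP @ H + lam * np.eye(n_hidden), HP @ T)
    return W_in, b, beta

def predict(X, W_in, b, beta):
    return np.tanh(X @ W_in + b) @ beta
```

In this sketch, samples with large residuals receive small Gaussian-kernel weights and thus little influence on the update, which is the intuition behind the claimed robustness to noise and outliers.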
ISSN: 0941-0643; 1433-3058
DOI: 10.1007/s00521-012-1184-y