Non-negative Radial Basis Function Neural Network in Polynomial Feature Space

Bibliographic Details
Published in: Journal of Physics: Conference Series, 2019-02, Vol. 1168 (6), p. 062005
Main Authors: Wang, Huiyang, Zhao, Yang, Pei, Jihong, Zeng, Dehuai, Liu, Meijuan
Format: Article
Language: English
Summary: Radial basis function neural networks (RBFNNs) are effective nonlinear learning models with strong nonlinear fitting capability. The hidden neurons and the weights play important roles in the network. In existing research, the hidden neurons are computed as a rigid linear combination, and the weight matrix is difficult to solve. To address these issues, this paper proposes a novel neural network, the non-negative radial basis function neural network (NRBFNN). The core idea of non-negative matrix factorization (NMF) is used to train the parameters of the RBFNN. Following the structure of the network, the label information of the samples is decomposed into a weight matrix and the features mapped by the activation functions in polynomial feature space, and the proposed method obtains the weight matrix and the hidden neurons implied in the activation functions iteratively. Furthermore, the proposed NRBFNN improves the representability of the hidden neurons, and the iterative update formulas for the weights ensure their solvability and interpretability. The ORL, Yale, and Caltech 101 face databases are selected for evaluation. Experimental results show that the proposed algorithm outperforms several related algorithms.
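The abstract describes decomposing the label matrix into a non-negative weight matrix times RBF-mapped features, with the weights found by iterative updates in the spirit of NMF. The sketch below is an illustration only, not the paper's algorithm: it fixes the RBF centers (the paper also learns the hidden neurons iteratively, and works in a polynomial feature space) and applies the standard NMF multiplicative update to fit a non-negative weight matrix W so that Y ≈ W Φᵀ. All function names, the toy data, and parameter values (`gamma`, `n_iter`) are assumptions for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

def rbf_features(X, centers, gamma=1.0):
    # Gaussian RBF activations: phi[i, j] = exp(-gamma * ||x_i - c_j||^2)
    d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2)
    return np.exp(-gamma * d2)

def nmf_weight_update(Y, Phi, n_iter=200, eps=1e-9):
    # Standard NMF multiplicative update for W in Y ~= W H, H fixed.
    # It keeps W non-negative while non-increasing ||Y - W H||_F.
    # Y: (n_classes, n_samples) one-hot labels; Phi: (n_samples, n_hidden)
    H = Phi.T                              # (n_hidden, n_samples)
    W = rng.random((Y.shape[0], H.shape[0]))
    for _ in range(n_iter):
        W *= (Y @ H.T) / (W @ H @ H.T + eps)
    return W

# Toy two-class problem: labels depend on the first coordinate.
X = rng.random((40, 2))
y = (X[:, 0] > 0.5).astype(int)
Y = np.eye(2)[y].T                         # one-hot labels, shape (2, 40)
centers = X[rng.choice(40, size=6, replace=False)]  # fixed hidden neurons
Phi = rbf_features(X, centers, gamma=4.0)  # (40, 6)
W = nmf_weight_update(Y, Phi)              # non-negative output weights
pred = np.argmax(W @ Phi.T, axis=0)        # predict by largest output unit
```

The multiplicative form `W *= numerator / denominator` is what guarantees non-negativity: starting from a positive `W`, every factor is non-negative, so no entry can ever cross zero, which is the property the paper's non-negative weight matrix relies on.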
ISSN:1742-6588
1742-6596
DOI:10.1088/1742-6596/1168/6/062005