Investigation on Performance of Neural Networks Using Quadratic Relative Error Cost Function

Bibliographic Details
Published in: IEEE Access, 2019, Vol. 7, p. 106642-106652
Main Authors: Zhang, Ning, Shen, Shui-Long, Zhou, Annan, Xu, Ye-Shuang
Format: Article
Language:English
Description
Summary: The performance of neural networks with the quadratic cost function (MSE cost function) is analyzed, in terms of the weight-adjustment rate and performance on multi-magnitude data, using a qualitative mathematical method based on mean squared error. Neural networks using the quadratic cost function exhibit slow weight-updating rates and inconsistent performance across data of different magnitudes. This paper investigates the performance of neural networks using a quadratic relative error cost function (REMSE cost function). Two-node-to-one-node models are built to compare the REMSE and MSE cost functions in terms of weight-adjustment rate and multi-magnitude data processing. A three-layer neural network is employed to compare the training and prediction performance of the REMSE and MSE cost functions. Three LSTM networks are used to evaluate the differences among REMSE, MSE, and Logcosh in a practical application: learning the stress-strain behavior of soil. The results indicate that the REMSE cost function notably accelerates the adjustment rate of weights and improves the performance of the neural network in small-magnitude data regression. Applications of the REMSE cost function are also discussed.
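To illustrate the contrast the summary draws between the two cost functions, here is a minimal NumPy sketch assuming REMSE is the mean of squared errors normalized by the target magnitude; the paper's exact formulation (and the `eps` guard used here) may differ from this illustration:

```python
import numpy as np

def mse(y_true, y_pred):
    """Standard quadratic (MSE) cost: mean of squared absolute errors."""
    return np.mean((y_pred - y_true) ** 2)

def remse(y_true, y_pred, eps=1e-8):
    """Quadratic relative error cost (illustrative REMSE sketch):
    squared errors are divided by the target magnitude, so
    small-magnitude targets contribute on an equal footing.
    eps guards against division by zero and is an assumption."""
    return np.mean(((y_pred - y_true) / (y_true + eps)) ** 2)

# Multi-magnitude targets: one large sample, one small sample.
y_true = np.array([100.0, 0.01])
y_pred = np.array([110.0, 0.02])

# MSE is dominated by the large-magnitude sample (error 10 vs 0.01),
# while REMSE is dominated by the small sample, which is off by 100%
# in relative terms. This mirrors the summary's point that relative
# error improves regression on small-magnitude data.
print(mse(y_true, y_pred))    # dominated by the 100-scale sample
print(remse(y_true, y_pred))  # dominated by the 0.01-scale sample
```

This is only a sketch of the general relative-error idea, not the authors' implementation; in a training loop the cost's gradient with respect to the weights is what drives the differing adjustment rates the summary describes.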
ISSN: 2169-3536
DOI: 10.1109/ACCESS.2019.2930520