
Optimization of the weights and asymmetric activation function family of neural network for time series forecasting

Bibliographic Details
Published in:Expert systems with applications 2013-11, Vol.40 (16), p.6438-6446
Main Authors: Gomes, Gecynalda S. da S., Ludermir, Teresa B.
Format: Article
Language:English
Description
Summary:•We present a method for optimizing the activation functions of artificial neural networks (ANNs).•The proposed optimization method uses simulated annealing and tabu search.•The proposed method performs well when forecasting distinct time series. The use of neural network models for time series forecasting has been motivated by experimental results that indicate a high capacity for function approximation with good accuracy. Generally, these models use activation functions with fixed parameters. However, it is known that the choice of activation function strongly influences the complexity and performance of the neural network, and that only a limited number of activation functions is used in general. We describe the use of an asymmetric activation function family with a free parameter for neural networks. We prove that the activation function family defined satisfies the requirements of the universal approximation theorem. We present a methodology for global optimization of the activation function family with a free parameter and of the connections between the processing units of the neural network. The main idea is to optimize, simultaneously, the weights and the activation function used in a multilayer perceptron (MLP) through an approach that combines the advantages of simulated annealing, tabu search, and a local learning algorithm. We have chosen two local learning algorithms: backpropagation with momentum (BPM) and Levenberg–Marquardt (LM). The overall purpose is to improve performance in time series forecasting.
ISSN:0957-4174
1873-6793
DOI:10.1016/j.eswa.2013.05.053
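The abstract's core idea, jointly searching over the MLP weights and the activation function's free parameter, can be sketched with a minimal simulated-annealing loop. This is an illustration only: the parametric asymmetric activation `asym_act` below is a hypothetical Richards-type sigmoid, not the family defined in the paper, and the tabu list and local learning steps (BPM, LM) of the full hybrid method are omitted.

```python
import numpy as np

def asym_act(x, a):
    # Hypothetical asymmetric sigmoid with free parameter a > 0
    # (a = 1 recovers the ordinary logistic function); the paper's
    # exact activation family is not reproduced here.
    return 1.0 / (1.0 + np.exp(-x)) ** a

def mlp_forecast(params, a, X, n_hidden):
    # Unpack a flat parameter vector into one-hidden-layer MLP weights.
    n_in = X.shape[1]
    k = n_in * n_hidden
    W1 = params[:k].reshape(n_in, n_hidden)
    b1 = params[k:k + n_hidden]
    W2 = params[k + n_hidden:k + 2 * n_hidden]
    b2 = params[-1]
    return asym_act(X @ W1 + b1, a) @ W2 + b2

def mse(params, a, X, y, n_hidden):
    return np.mean((mlp_forecast(params, a, X, n_hidden) - y) ** 2)

def anneal(X, y, n_hidden=4, iters=2000, t0=1.0, cooling=0.995, seed=0):
    # Simulated annealing over (weights, activation parameter) jointly.
    rng = np.random.default_rng(seed)
    n_params = X.shape[1] * n_hidden + 2 * n_hidden + 1
    params = rng.normal(scale=0.5, size=n_params)
    a = 1.0
    cur = mse(params, a, X, y, n_hidden)
    best, best_state = cur, (params.copy(), a)
    t = t0
    for _ in range(iters):
        # Perturb weights and the free parameter simultaneously.
        cand_p = params + rng.normal(scale=0.1, size=n_params)
        cand_a = max(0.1, a + rng.normal(scale=0.05))  # keep a positive
        e = mse(cand_p, cand_a, X, y, n_hidden)
        # Metropolis acceptance: always take improvements, sometimes worse moves.
        if e < cur or rng.random() < np.exp((cur - e) / t):
            params, a, cur = cand_p, cand_a, e
            if cur < best:
                best, best_state = cur, (params.copy(), a)
        t *= cooling  # geometric cooling schedule
    return best_state, best
```

A typical use on a lag-embedded series: build `X` from three lagged copies of the signal and `y` from the next value, then call `anneal(X, y)`; the returned state holds both the best weight vector and the best activation parameter found.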