
NOA-LSTM: An efficient LSTM cell architecture for time series forecasting


Bibliographic Details
Published in: Expert Systems with Applications, 2024-03, Vol. 238, p. 122333, Article 122333
Main Authors: Yadav, Hemant, Thakkar, Amit
Format: Article
Language:English
Description
Summary: The application of machine learning and deep learning techniques for time series forecasting has gained significant attention in recent years. Numerous endeavors have been devoted to automating forecasting through the utilization of cutting-edge neural networks. Notably, the Long Short-Term Memory (LSTM) recurrent neural network has emerged as a central concept in most research endeavors. Although LSTM was initially introduced in 1997 for sequence modeling, subsequent updates have primarily focused on language learning tasks. These updates have introduced various computational mechanisms within the LSTM cell, including the forget gate, input gate, and output gate. In this study, we investigate the impact of each computational component in isolation to analyze their effects on time series forecasting tasks. Our experiments utilize the Jena weather dataset and the Appliance Energy Usage time series for evaluation. The experimental results reveal that variations of the LSTM model outperform the most popular LSTM cell format in terms of error rate and training time. Specifically, the variations identified in this paper demonstrate superior generalization capabilities and yield reduced forecasting errors.
ISSN:0957-4174
DOI:10.1016/j.eswa.2023.122333
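
The abstract refers to the computational components inside a standard LSTM cell: the forget gate, input gate, and output gate. As background, a minimal sketch of one forward step of a conventional LSTM cell (not the paper's NOA-LSTM variant, whose exact modifications are not given in this record) might look like the following; the function and variable names are illustrative assumptions:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def lstm_step(x, h_prev, c_prev, W, U, b):
    """One forward step of a standard LSTM cell.

    Gate order in the stacked weight matrices (an assumed convention):
    input gate (i), forget gate (f), candidate update (g), output gate (o).
    W: (4n, d) input weights, U: (4n, n) recurrent weights, b: (4n,) bias.
    """
    n = h_prev.shape[0]
    z = W @ x + U @ h_prev + b            # all four gate pre-activations, shape (4n,)
    i = sigmoid(z[0:n])                   # input gate: how much new info to admit
    f = sigmoid(z[n:2 * n])               # forget gate: how much old state to keep
    g = np.tanh(z[2 * n:3 * n])           # candidate cell update
    o = sigmoid(z[3 * n:4 * n])           # output gate: how much state to expose
    c = f * c_prev + i * g                # new cell state
    h = o * np.tanh(c)                    # new hidden state (the cell's output)
    return h, c
```

Ablating one of these components, as the study describes, would correspond to fixing the relevant gate to a constant (e.g. setting `f` to all-ones removes the forget gate's effect), which is what makes the per-gate cost and error comparisons possible.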