Traffic flow prediction using LSTM with feature enhancement

Bibliographic Details
Published in: Neurocomputing (Amsterdam), 2019-03, Vol. 332, pp. 320-327
Main Authors: Yang, Bailin, Sun, Shulin, Li, Jianyuan, Lin, Xianxuan, Tian, Yan
Format: Article
Language: English
Description
Summary: Long short-term memory (LSTM) networks are widely used to process and predict time-series events, but they have difficulty capturing exceedingly long-term dependencies, possibly because LSTM errors grow as the sequence length increases. Recently, researchers have noted that adding features on multiple time scales can help improve the long-term dependency of an RNN. Inspired by the attention mechanism, and considering the need for historical data in traffic flow prediction, we propose an improved approach that connects the high-impact values of very distant time steps to the current time step; these high-impact traffic flow values are captured with the attention mechanism. In addition, we smooth out data beyond the normal range to obtain better prediction results. Experimental results show that the proposed model is competitive for short-term traffic flow prediction.
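
The record gives only the abstract, so the following is a minimal sketch rather than the authors' implementation: a PyTorch LSTM whose prediction attends over all encoded historical time steps, so that high-impact values far back in the sequence can influence the current output, preceded by a simple outlier-clipping step standing in for the paper's smoothing. The names (AttnLSTMPredictor, clip_outliers), the hidden size, and the window length are assumptions.

    # Minimal sketch, assuming a single-layer LSTM with additive attention
    # over the encoded history; not the authors' exact architecture.
    import torch
    import torch.nn as nn


    def clip_outliers(x, num_std=3.0):
        """Smooth values beyond the normal range by clipping to mean +/- k*std."""
        mean, std = x.mean(), x.std()
        return x.clamp(min=(mean - num_std * std).item(),
                       max=(mean + num_std * std).item())


    class AttnLSTMPredictor(nn.Module):
        def __init__(self, input_size=1, hidden_size=64):
            super().__init__()
            self.lstm = nn.LSTM(input_size, hidden_size, batch_first=True)
            self.attn = nn.Linear(hidden_size, 1)      # scores each past step
            self.out = nn.Linear(hidden_size * 2, 1)

        def forward(self, x):
            # x: (batch, seq_len, input_size) of past traffic-flow readings
            h_all, (h_last, _) = self.lstm(x)           # h_all: (B, T, H)
            scores = self.attn(h_all).squeeze(-1)       # (B, T)
            weights = torch.softmax(scores, dim=1)      # attention over history
            context = (weights.unsqueeze(-1) * h_all).sum(dim=1)  # (B, H)
            # combine attended history with the current hidden state
            return self.out(torch.cat([context, h_last[-1]], dim=-1))


    # Usage: predict the next flow value from a 288-step (one day at 5-minute
    # intervals, an assumed resolution) window of smoothed readings.
    model = AttnLSTMPredictor()
    window = clip_outliers(torch.rand(8, 288, 1))   # batch of input windows
    next_flow = model(window)                       # (8, 1) predicted values

The attention weights let the model pick out a few high-impact historical readings (for example, the same hour on a previous day) instead of relying solely on the last hidden state, which is the gap in long-term dependency that the abstract describes.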
ISSN: 0925-2312, 1872-8286
DOI: 10.1016/j.neucom.2018.12.016