
The Prediction of Multistep Traffic Flow Based on AST-GCN-LSTM

Bibliographic Details
Published in: Journal of Advanced Transportation, 2021-12, Vol. 2021, pp. 1-10
Main Authors: Hou, Fan, Zhang, Yue, Fu, Xinli, Jiao, Lele, Zheng, Wen
Format: Article
Language:English
Description
Summary: Addressing the traffic flow prediction problem on road networks, this paper proposes a multistep traffic flow prediction model based on an attention-based spatial-temporal graph convolutional network combined with a long short-term memory network (AST-GCN-LSTM). The model captures the complex spatial dependence of road nodes on the road network and uses LSGC (local spectral graph convolution) to extract spatial correlation features from the K-order local neighbors of road-segment nodes. Replacing the single-hop adjacency matrix with K-order local neighborhoods expands the receptive field of the graph convolution and extracts neighbor-node information more accurately: the high-order neighborhoods of road nodes are fully considered rather than features being drawn only from first-order neighbors. In addition, an external attribute enhancement unit is designed to incorporate external factors that affect traffic flow (weather, points of interest, time, etc.) and improve the model's prediction accuracy. The experimental results show that the model performs well when considering static attributes, dynamic attributes, and their combination, respectively: RMSE of 4.0406, 4.0362, and 4.0234; MAE of 2.7184, 2.7044, and 2.7030; and accuracy of 0.7132, 0.7190, and 0.7223.
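The K-order local neighborhood idea described in the summary (widening the graph convolution's receptive field beyond single-hop neighbors) can be sketched as below. This is an illustrative reconstruction, not the authors' code: the function names, the boolean-reachability construction, and the symmetric normalization are all assumptions made for the sketch.

```python
import numpy as np

def k_order_neighborhood(A, K):
    """Build a K-order local neighborhood matrix from adjacency A.

    All nodes reachable within K hops (including the node itself)
    are connected, replacing the single-hop adjacency matrix and
    expanding the receptive field of the graph convolution.
    """
    n = A.shape[0]
    reach = np.eye(n, dtype=bool)   # 0 hops: each node reaches itself
    hop = np.eye(n, dtype=bool)
    for _ in range(K):
        hop = (hop @ (A > 0)) > 0   # extend reachability by one hop
        reach |= hop
    return reach.astype(float)

def graph_conv(A_k, X, W):
    """One spectral-style graph convolution layer using a
    symmetrically normalized K-order neighborhood matrix."""
    d = A_k.sum(axis=1)                         # node degrees
    D_inv_sqrt = np.diag(1.0 / np.sqrt(d))      # D^{-1/2}
    A_hat = D_inv_sqrt @ A_k @ D_inv_sqrt       # normalized neighborhood
    return np.tanh(A_hat @ X @ W)               # propagate and transform
```

For example, on a 4-node path graph 0-1-2-3, `k_order_neighborhood(A, 1)` leaves nodes 0 and 2 unconnected, while `K=2` connects them, so a single convolution layer can already mix features from two-hop neighbors.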
ISSN: 0197-6729, 2042-3195
DOI: 10.1155/2021/9513170