
Spatial–temporal combination and multi-head flow-attention network for traffic flow prediction

Bibliographic Details
Published in: Scientific Reports, 2024-04, Vol. 14 (1), p. 9604, Article 9604
Main Authors: Yu, Lianfei, Liu, Wenbo, Wu, Dong, Xie, Dongmei, Cai, Chuang, Qu, Zhijian, Li, Panjing
Format: Article
Language:English
Subjects:
Citations: Items that this one cites
Description
Summary: Traffic flow prediction based on spatial–temporal data plays a vital role in traffic management. However, it still faces serious challenges due to the complex spatial–temporal correlations in nonlinear spatial–temporal data. Some previous methods have limited ability to capture these correlations and ignore the quadratic complexity of the traditional attention mechanism. To this end, we propose a novel spatial–temporal combination and multi-head flow-attention network (STCMFA) to model the spatial–temporal correlations in road networks. First, we design a temporal sequence multi-head flow attention (TS-MFA) module, whose source competition and sink allocation mechanisms prevent attention degradation without relying on inductive biases. Second, we replace the linear layers of traditional attention with a GRU to map the input sequence, which further strengthens the model's temporal modeling ability. Finally, we combine a GCN with the TS-MFA module to capture spatial–temporal correlations, and introduce a residual mechanism and a feature-aggregation strategy to further improve the performance of STCMFA. Extensive experiments on four real-world traffic datasets show that our model performs excellently and consistently outperforms the other baselines.
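To make the abstract's quadratic-complexity point concrete: flow attention replaces the n×n attention matrix with flow-conservation quantities, so aggregation costs O(n·d²) instead of O(n²·d). The sketch below is a minimal single-head NumPy illustration of the general source-competition / sink-allocation idea; it is a hypothetical simplification for exposition, not the authors' exact TS-MFA module (which additionally uses GRU projections and GCN spatial modeling).

```python
import numpy as np

def flow_attention(Q, K, V, eps=1e-6):
    """Minimal single-head flow-attention sketch with linear complexity.

    Hypothetical simplification of the competition/allocation idea in the
    abstract: sigmoid feature maps, softmax competition among sources,
    sigmoid allocation gates at sinks. Q, K, V have shape (n, d).
    """
    sig = lambda x: 1.0 / (1.0 + np.exp(-x))
    Qp, Kp = sig(Q), sig(K)                # non-negative feature maps
    incoming = Qp @ Kp.sum(axis=0) + eps   # total flow into each sink,   shape (n,)
    outgoing = Kp @ Qp.sum(axis=0) + eps   # total flow out of each source, shape (n,)
    # Source competition: sources compete for a fixed total outgoing flow.
    comp = np.exp(outgoing - outgoing.max())
    comp = comp / comp.sum()               # softmax over sources
    V_comp = V * comp[:, None] * len(V)    # rescale so the mean weight is 1
    # Linear aggregation: (d, d) inner product avoids the (n, n) matrix.
    agg = (Qp / incoming[:, None]) @ (Kp.T @ V_comp)
    # Sink allocation: each sink gates how much aggregated flow it keeps.
    return sig(incoming)[:, None] * agg
```

Because the n×n attention map is never materialized, memory and time grow linearly in the sequence length n, which is what lets this family of mechanisms scale to long traffic time series.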
ISSN:2045-2322
DOI:10.1038/s41598-024-60337-7