
Fast Trajectory Prediction Method With Attention Enhanced SRU

Bibliographic Details
Published in: IEEE Access, 2020, Vol. 8, pp. 206614-206621
Main Authors: Li, Yadong; Liu, Bailong; Zhang, Lei; Yang, Susong; Shao, Changxing; Song, Dan
Format: Article
Language:English
Description
Summary: LSTM (Long Short-Term Memory) is an effective method for trajectory prediction. However, computing each hidden-layer state requires the state of the previous unit, which leads to long training and prediction times. To solve this problem, we propose the Fast Trajectory Prediction method with Attention-enhanced SRU (FTP-AS). Firstly, we devise an SRU (Simple Recurrent Units) based trajectory prediction method. It removes the dependency on the hidden-layer state at the previous moment and allows the model to be computed largely in parallel, speeding up training and prediction. However, each SRU unit computes its state at each moment independently, ignoring the temporal relationship between track points and reducing accuracy. Secondly, we introduce an attention mechanism to enhance the SRU: influence weights for selective learning are obtained by calculating the matching degree of the hidden-layer state at each moment, which improves prediction accuracy. Finally, experimental results on the MTA bus data set and the Porto taxi data set show that FTP-AS is 3.4 times faster and about 1.7% more accurate than the traditional LSTM method.
ISSN: 2169-3536
DOI: 10.1109/ACCESS.2020.3035704
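
Illustrative sketch: the summary above describes two mechanisms, an SRU layer whose matrix multiplications depend only on the inputs (so they can be evaluated for all time steps at once, with only a cheap element-wise recurrence left sequential) and an attention step that re-weights the per-step hidden states by their matching degree. The minimal NumPy sketch below shows both ideas under stated assumptions; the gate formulation, dot-product scoring, shared input/hidden size, and toy data are assumptions for illustration, not the paper's exact design.

    import numpy as np

    def sigmoid(x):
        return 1.0 / (1.0 + np.exp(-x))

    def sru_layer(X, W, Wf, bf, Wr, br):
        # X: (T, d) embedded track points; returns per-step hidden states H: (T, d).
        # The heavy matrix products below use only the inputs, so they can be
        # computed for every time step at once -- this is the property that
        # removes the dependence on the previous hidden-layer state.
        X_tilde = X @ W              # candidate states, (T, d)
        F = sigmoid(X @ Wf + bf)     # forget gates,     (T, d)
        R = sigmoid(X @ Wr + br)     # reset gates,      (T, d)
        T_len, d = X.shape
        c = np.zeros(d)
        H = np.zeros((T_len, d))
        for t in range(T_len):       # only a cheap element-wise recurrence is sequential
            c = F[t] * c + (1.0 - F[t]) * X_tilde[t]
            H[t] = R[t] * np.tanh(c) + (1.0 - R[t]) * X[t]  # highway term; assumes d_in == d_hid
        return H

    def attention_pool(H, q):
        # Score each hidden state against a query vector (dot-product "matching
        # degree" -- an assumption), softmax over time, return the weighted sum.
        scores = H @ q
        weights = np.exp(scores - scores.max())
        weights /= weights.sum()
        return weights @ H           # context vector fed to an output layer

    # Toy usage on random data (not the MTA / Porto data sets).
    rng = np.random.default_rng(0)
    d, T_len = 8, 16
    X = rng.normal(size=(T_len, d))
    W, Wf, Wr = (rng.normal(scale=0.1, size=(d, d)) for _ in range(3))
    bf = np.zeros(d)
    br = np.zeros(d)
    H = sru_layer(X, W, Wf, bf, Wr, br)
    context = attention_pool(H, q=H[-1])   # match every step against the last state
    print(context.shape)                   # (8,)

In practice the context vector would be passed to an output layer that regresses the next track point; the sketch only shows why the SRU's sequential work is limited to element-wise operations and how attention selectively weights the hidden states.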