Behavior Prediction for Unmanned Driving Based on Dual Fusions of Feature and Decision

Bibliographic Details
Published in: IEEE Transactions on Intelligent Transportation Systems, 2021-06, Vol. 22 (6), p. 3687-3696
Main Authors: Zhong, Shan, Wei, Meng, Gong, Shengrong, Xia, Kaijian, Fu, Yuchen, Fu, Qiming, Yin, Hongsheng
Format: Article
Language:English
Description
Summary: Behavioral decision systems may suffer from poor performance because they fail to capture the vibrations of environmental information. To better capture such vibrations and thereby make more accurate predictions, a parallel deep neural network based on dual fusions of feature and decision, called DFFD-Net, is proposed. DFFD-Net is composed of two parts: the feature fusion network and the driving data network. The feature fusion network adopts two different operations, deconvolution and linear weighting, to fuse local features and global features, respectively. Deconvolution is applied between the convolutional layers, while linear weighting is applied to the outputs of the SPP and LSTM layers. To further improve prediction accuracy, the decisions generated by the two networks are weighted to obtain the final decision. Experimentally, DFFD-Net is evaluated on the BDDV and TORCS benchmarks, and the results show that the final performance benefits from both feature fusion and decision fusion. The comparison shows that DFFD-Net achieves state-of-the-art results in both perplexity and precision while using only images captured from the front-facing camera together with a small amount of sensing data.
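
The summary describes the DFFD-Net architecture only at a high level. The following PyTorch sketch illustrates the dual-fusion idea under stated assumptions: all layer sizes, the adaptive-pooling stand-in for SPP, the five-way action head, and the learned fusion weights alpha and beta are illustrative choices, not the authors' implementation.

import torch
import torch.nn as nn


class FeatureFusionNet(nn.Module):
    """Vision branch: local fusion via deconvolution between conv layers,
    global fusion via a learned linear weighting of SPP-style and LSTM
    features. All layer sizes are illustrative assumptions."""

    def __init__(self, num_actions: int = 5, hidden: int = 256):
        super().__init__()
        self.conv1 = nn.Conv2d(3, 32, kernel_size=5, stride=2, padding=2)
        self.conv2 = nn.Conv2d(32, 64, kernel_size=5, stride=2, padding=2)
        # Deconvolution upsamples the deeper feature map back to the conv1
        # resolution so the two levels can be added element-wise (local fusion).
        self.deconv = nn.ConvTranspose2d(64, 32, kernel_size=4, stride=2, padding=1)
        self.pool = nn.AdaptiveAvgPool2d((4, 4))       # stand-in for the SPP block
        self.fc_spp = nn.Linear(32 * 4 * 4, hidden)
        self.lstm = nn.LSTM(input_size=32 * 4 * 4, hidden_size=hidden, batch_first=True)
        self.alpha = nn.Parameter(torch.tensor(0.5))   # linear-weighting coefficient
        self.head = nn.Linear(hidden, num_actions)

    def forward(self, frames: torch.Tensor) -> torch.Tensor:
        # frames: (batch, time, 3, H, W) with H and W divisible by 4
        b, t = frames.shape[:2]
        x = frames.flatten(0, 1)
        f1 = torch.relu(self.conv1(x))
        f2 = torch.relu(self.conv2(f1))
        fused = f1 + self.deconv(f2)                   # local feature fusion
        pooled = self.pool(fused).flatten(1)           # (b*t, 512)
        spp_feat = self.fc_spp(pooled).reshape(b, t, -1)[:, -1]
        lstm_out, _ = self.lstm(pooled.reshape(b, t, -1))
        lstm_feat = lstm_out[:, -1]
        # Global feature fusion: linear weighting of SPP and LSTM outputs.
        global_feat = self.alpha * spp_feat + (1 - self.alpha) * lstm_feat
        return self.head(global_feat)                  # per-action scores


class DrivingDataNet(nn.Module):
    """Sensor branch: a small MLP over the vehicle's sensing data."""

    def __init__(self, num_sensors: int = 10, num_actions: int = 5):
        super().__init__()
        self.mlp = nn.Sequential(
            nn.Linear(num_sensors, 64), nn.ReLU(), nn.Linear(64, num_actions)
        )

    def forward(self, sensors: torch.Tensor) -> torch.Tensor:
        return self.mlp(sensors)                       # sensors: (batch, num_sensors)


class DFFDNet(nn.Module):
    """Decision fusion: the two branch decisions are combined with a learned weight."""

    def __init__(self):
        super().__init__()
        self.vision = FeatureFusionNet()
        self.driving = DrivingDataNet()
        self.beta = nn.Parameter(torch.tensor(0.5))

    def forward(self, frames: torch.Tensor, sensors: torch.Tensor) -> torch.Tensor:
        return self.beta * self.vision(frames) + (1 - self.beta) * self.driving(sensors)


# Usage (shapes only): scores = DFFDNet()(torch.randn(2, 4, 3, 64, 64), torch.randn(2, 10))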
ISSN:1524-9050
1558-0016
DOI:10.1109/TITS.2020.3037926