Temporal Information Fusion Network for Driving Behavior Prediction

Bibliographic Details
Published in: IEEE Transactions on Intelligent Transportation Systems, 2023-09, Vol. 24 (9), p. 1-10
Main Authors: Guo, Chenghao; Liu, Haizhuang; Chen, Jiansheng; Ma, Huimin
Format: Article
Language: English
Description
Summary: Traffic crashes cause enormous losses every year, making safe driving a central topic in transportation research. Technologies related to Advanced Driver Assistance Systems (ADAS) are evolving rapidly, but without an adequate understanding of driving intention, an ADAS typically cannot help the driver prepare for danger in advance. This paper focuses on the fusion of driver and environment information and proposes a lightweight end-to-end model, the temporal information fusion network (TIFN). Driving behavior is the interactive result of the driver and the external world. To better capture the driver's intention, a state update cell (STU) is proposed that introduces environment information into the modeling of the driver's state, inspired by the selective attention of human cognition. Meanwhile, semantic segmentation features are extracted to provide explicit cues about what draws driver attention, in place of optical-flow motion images and binary value vectors. Finally, the driver's intention and the environment state are combined to make a joint prediction. Experiments on Brain4cars and IESDD show that the proposed approach outperforms other approaches that use only camera data.
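The abstract describes the STU only at a high level; the paper's actual equations are not reproduced in this record. As a rough, hypothetical sketch of the idea it names (environment features gating the recurrent update of the driver's state), one might write something like the following in PyTorch. The class and layer names (STUCellSketch, env_gate, candidate) and the specific gating form are assumptions for illustration, not the authors' definitions.

```python
import torch
import torch.nn as nn


class STUCellSketch(nn.Module):
    """Hypothetical state-update cell: the driver's hidden state is
    refreshed each time step, with an environment-conditioned gate
    deciding how strongly new evidence overwrites the old state."""

    def __init__(self, driver_dim: int, env_dim: int, hidden_dim: int):
        super().__init__()
        # Gate computed from environment features and the previous state,
        # loosely mirroring "selective attention" to the outside world.
        self.env_gate = nn.Sequential(
            nn.Linear(env_dim + hidden_dim, hidden_dim), nn.Sigmoid()
        )
        # Candidate driver state from in-cabin features and previous state.
        self.candidate = nn.Sequential(
            nn.Linear(driver_dim + hidden_dim, hidden_dim), nn.Tanh()
        )

    def forward(self, driver_feat, env_feat, h_prev):
        g = self.env_gate(torch.cat([env_feat, h_prev], dim=-1))
        c = self.candidate(torch.cat([driver_feat, h_prev], dim=-1))
        # Environment-gated interpolation between old and new state.
        return (1.0 - g) * h_prev + g * c


# Toy usage: one update step with random features.
cell = STUCellSketch(driver_dim=128, env_dim=64, hidden_dim=256)
h = torch.zeros(1, 256)
h = cell(torch.randn(1, 128), torch.randn(1, 64), h)
```

In the pipeline the abstract outlines, env_feat would presumably come from semantic-segmentation features of the road scene and driver_feat from in-cabin camera frames, with a final head combining the driver state and environment state for the joint prediction.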
ISSN: 1524-9050, 1558-0016
DOI: 10.1109/TITS.2023.3267150