A Fatigue Driving Detection Algorithm Based on Facial Motion Information Entropy
Published in: Journal of Advanced Transportation, 2020, Vol. 2020, p. 1-17
Main Authors:
Format: Article
Language: English
Summary: Research on machine vision-based driver fatigue detection algorithms has improved traffic safety significantly. However, many algorithms assess the driving state from a limited number of video frames, which introduces inaccuracy. We propose a real-time detection algorithm based on information entropy that analyzes a sufficient number of consecutive video frames. First, we introduce an improved YOLOv3-tiny convolutional neural network to capture facial regions under complex driving conditions, eliminating the inaccuracy and interference caused by artificial feature extraction. Second, we construct a geometric area called the Face Feature Triangle (FFT) using the Dlib toolkit together with the landmarks and coordinates of the facial regions; we then create a Face Feature Vector (FFV) that contains the area and centroid of each FFT. We use the FFV as an indicator of whether the driver is in a fatigue state. Finally, we design a sliding window to obtain the facial information entropy. Comparative experiments show that our algorithm outperforms current ones in both accuracy and real-time performance. In simulated driving applications, the proposed algorithm detects the fatigue state at over 20 fps with an accuracy of 94.32%.
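The summary names the geometric ingredients of the method (the area and centroid of a Face Feature Triangle, and a sliding-window entropy over the resulting features) without giving formulas. The following Python sketch illustrates those pieces under stated assumptions: triangle area via the shoelace formula, centroid as the mean of the vertices, and Shannon entropy of discretized feature states within a sliding window. The function names, window length, and state discretization are illustrative assumptions, not the authors' definitions.

```python
import math
from collections import Counter

def triangle_area_centroid(p1, p2, p3):
    """Area (shoelace formula) and centroid (vertex mean) of a landmark triangle."""
    (x1, y1), (x2, y2), (x3, y3) = p1, p2, p3
    area = abs(x1 * (y2 - y3) + x2 * (y3 - y1) + x3 * (y1 - y2)) / 2.0
    centroid = ((x1 + x2 + x3) / 3.0, (y1 + y2 + y3) / 3.0)
    return area, centroid

def window_entropy(states, window=5):
    """Shannon entropy of discretized feature states in each sliding window.

    `states` is a sequence of hashable labels (e.g. binned FFV values);
    a low-entropy window indicates little facial motion over those frames.
    """
    entropies = []
    for i in range(len(states) - window + 1):
        counts = Counter(states[i:i + window])
        h = -sum((c / window) * math.log2(c / window) for c in counts.values())
        entropies.append(h)
    return entropies
```

For example, a window in which the discretized state never changes yields entropy 0, while a window that alternates between two states yields the maximum entropy of 1 bit; a threshold on this value could then flag low-motion (potentially fatigued) intervals.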
ISSN: 0197-6729, 2042-3195
DOI: 10.1155/2020/8851485