
Detecting distraction of drivers using Convolutional Neural Network

Bibliographic Details
Published in: Pattern Recognition Letters, 2020-11, Vol. 139, pp. 79-85
Main Authors: Masood, Sarfaraz, Rai, Abhinav, Aggarwal, Aakash, Doja, M.N., Ahmad, Musheer
Format: Article
Language: English
Description
Summary:
• A model to detect driver distraction.
• Utilizes Convolutional Neural Networks to detect the activity being performed by the driver.
• The model can also differentiate between the types of distraction.
• High detection accuracy of 99% achieved over a large dataset.
• The proposed model takes significantly less training time and delivers high test accuracy.

With the spread of social media and internet technology, people are becoming increasingly careless and distracted while driving, which has a severe detrimental effect on the safety of the driver and fellow passengers. To provide an effective solution, this paper puts forward a machine learning model using Convolutional Neural Networks that not only detects a distracted driver but also identifies the cause of the distraction by analyzing images obtained from a camera module installed inside the vehicle. Convolutional Neural Networks are known to learn spatial features from images, which can then be further examined by fully connected layers. The experimental results show a 99% average accuracy in distraction recognition and hence strongly support that the proposed Convolutional Neural Network model can be used to identify distraction among drivers.
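The record describes the approach only at a high level: convolutional layers learn spatial features from the in-vehicle camera images, and fully connected layers map those features to a distraction class. A minimal sketch of such a classifier, assuming a generic architecture, a 224x224 RGB input, and 10 distraction categories (none of these values are stated in this record), might look like the following; it is an illustration, not the configuration reported in the paper.

import tensorflow as tf

NUM_CLASSES = 10              # assumed number of distraction categories (not stated in this record)
INPUT_SHAPE = (224, 224, 3)   # assumed input image size

def build_distraction_cnn():
    """Small CNN: convolutional blocks learn spatial features,
    fully connected layers map them to a distraction class."""
    model = tf.keras.Sequential([
        tf.keras.layers.Input(shape=INPUT_SHAPE),
        tf.keras.layers.Conv2D(32, (3, 3), activation="relu"),
        tf.keras.layers.MaxPooling2D((2, 2)),
        tf.keras.layers.Conv2D(64, (3, 3), activation="relu"),
        tf.keras.layers.MaxPooling2D((2, 2)),
        tf.keras.layers.Conv2D(128, (3, 3), activation="relu"),
        tf.keras.layers.MaxPooling2D((2, 2)),
        tf.keras.layers.Flatten(),
        tf.keras.layers.Dense(256, activation="relu"),
        tf.keras.layers.Dropout(0.5),
        tf.keras.layers.Dense(NUM_CLASSES, activation="softmax"),
    ])
    model.compile(optimizer="adam",
                  loss="categorical_crossentropy",
                  metrics=["accuracy"])
    return model

if __name__ == "__main__":
    model = build_distraction_cnn()
    model.summary()  # prints layer shapes; training requires a labelled image dataset

In practice, frames from the in-vehicle camera would be batched through an image pipeline (e.g. tf.keras.utils.image_dataset_from_directory) and the model trained with model.fit; the layer sizes above are placeholders chosen for illustration only.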
ISSN: 0167-8655
eISSN: 1872-7344
DOI: 10.1016/j.patrec.2017.12.023