Driver Fatigue Detection Based on Convolutional Neural Networks Using EM-CNN
With a focus on fatigue driving detection research, a fully automated driver fatigue status detection algorithm using driving images is proposed. In the proposed algorithm, the multitask cascaded convolutional network (MTCNN) architecture is employed in face detection and feature point location, and the region of interest (ROI) is extracted using feature points. A convolutional neural network, named EM-CNN, is proposed to detect the states of the eyes and mouth from the ROI images. The percentage of eyelid closure over the pupil over time (PERCLOS) and mouth opening degree (POM) are two parameters used for fatigue detection. Experimental results demonstrate that the proposed EM-CNN can efficiently detect driver fatigue status using driving images. The proposed algorithm EM-CNN outperforms other CNN-based methods, i.e., AlexNet, VGG-16, GoogLeNet, and ResNet50, showing accuracy and sensitivity rates of 93.623% and 93.643%, respectively.
Published in: | Computational intelligence and neuroscience, 2020-11, Vol.2020 (2020), p.1-11 |
Main Authors: | Xu, Yi; Yan, Hualin; Zhang, Lan; Zhou, Nana; Zhao, Zuopeng; Zhang, Zhongxin |
Format: | Article |
Language: | English |
Subjects: | Algorithms; Artificial neural networks; Discriminant analysis; Driver fatigue; Eyelid; Face recognition; Fatigue; Feature extraction; Mouth; Neural networks; Physiology; Vision systems |
Online Access: | Get full text |
cited_by | cdi_FETCH-LOGICAL-c476t-ca26b14db8195d6f54097ea39fed61e56284fe0c7721627402932b8f41f9e9313 |
cites | cdi_FETCH-LOGICAL-c476t-ca26b14db8195d6f54097ea39fed61e56284fe0c7721627402932b8f41f9e9313 |
container_end_page | 11 |
container_issue | 2020 |
container_start_page | 1 |
container_title | Computational intelligence and neuroscience |
container_volume | 2020 |
creator | Xu, Yi; Yan, Hualin; Zhang, Lan; Zhou, Nana; Zhao, Zuopeng; Zhang, Zhongxin |
description | With a focus on fatigue driving detection research, a fully automated driver fatigue status detection algorithm using driving images is proposed. In the proposed algorithm, the multitask cascaded convolutional network (MTCNN) architecture is employed in face detection and feature point location, and the region of interest (ROI) is extracted using feature points. A convolutional neural network, named EM-CNN, is proposed to detect the states of the eyes and mouth from the ROI images. The percentage of eyelid closure over the pupil over time (PERCLOS) and mouth opening degree (POM) are two parameters used for fatigue detection. Experimental results demonstrate that the proposed EM-CNN can efficiently detect driver fatigue status using driving images. The proposed algorithm EM-CNN outperforms other CNN-based methods, i.e., AlexNet, VGG-16, GoogLeNet, and ResNet50, showing accuracy and sensitivity rates of 93.623% and 93.643%, respectively. |
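The description states that fatigue is decided from two per-frame measurements: PERCLOS (fraction of frames with closed eyes over a time window) and POM (mouth opening degree, here treated as the fraction of open-mouth frames). The sketch below illustrates that sliding-window decision rule only; it is not the paper's implementation, and the window size and thresholds are illustrative assumptions, as is the `FatigueMonitor` class name. The per-frame eye/mouth labels would come from a classifier such as the paper's EM-CNN.

```python
from collections import deque

class FatigueMonitor:
    """Sliding-window fatigue decision from per-frame eye/mouth states.

    Each frame, an eye/mouth state classifier labels the eyes as
    closed (1) or open (0) and the mouth as open (1) or closed (0).
    PERCLOS is the fraction of closed-eye frames in the window; POM is
    modeled as the fraction of open-mouth frames. All thresholds are
    illustrative, not the paper's values.
    """

    def __init__(self, window=30, perclos_thresh=0.25, pom_thresh=0.5):
        self.eyes = deque(maxlen=window)    # 1 = eyes closed this frame
        self.mouth = deque(maxlen=window)   # 1 = mouth open this frame
        self.perclos_thresh = perclos_thresh
        self.pom_thresh = pom_thresh

    def update(self, eye_closed: int, mouth_open: int) -> None:
        self.eyes.append(eye_closed)
        self.mouth.append(mouth_open)

    def perclos(self) -> float:
        return sum(self.eyes) / len(self.eyes) if self.eyes else 0.0

    def pom(self) -> float:
        return sum(self.mouth) / len(self.mouth) if self.mouth else 0.0

    def is_fatigued(self) -> bool:
        # Flag fatigue when either metric exceeds its threshold.
        return (self.perclos() > self.perclos_thresh
                or self.pom() > self.pom_thresh)

# Example: 30 frames, eyes closed in 12 of them, mouth always closed.
monitor = FatigueMonitor(window=30)
for i in range(30):
    monitor.update(eye_closed=1 if i % 5 < 2 else 0, mouth_open=0)
# PERCLOS = 12/30 = 0.4 > 0.25, so the driver is flagged as fatigued.
```

The `deque(maxlen=...)` gives a rolling window for free: old frames fall off as new ones arrive, so the metrics always reflect the most recent `window` frames.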
doi_str_mv | 10.1155/2020/7251280 |
format | article |
fullrecord | Title: Driver Fatigue Detection Based on Convolutional Neural Networks Using EM-CNN. Authors: Xu, Yi; Yan, Hualin; Zhang, Lan; Zhou, Nana; Zhao, Zuopeng; Zhang, Zhongxin. Contributor: Doulamis, Anastasios D. Journal: Computational intelligence and neuroscience, 2020-11-18, Vol.2020 (2020), p.1-11. Publisher: Hindawi Publishing Corporation, Cairo, Egypt. ISSN: 1687-5265; EISSN: 1687-5273. DOI: 10.1155/2020/7251280. PMID: 33293943. ORCID: 0000-0001-7320-0669; 0000-0003-3352-1315. Rights: Copyright © 2020 Zuopeng Zhao et al.; open-access article distributed under the Creative Commons Attribution License (CC BY 4.0). Sources: Wiley Online Library; Publicly Available Content Database. Subjects: Algorithms; Artificial neural networks; Discriminant analysis; Driver fatigue; Eyelid; Face recognition; Fatigue; Feature extraction; Mouth; Neural networks; Physiology; Vision systems. Record type: article; peer reviewed; free for read. ProQuest ID: 2465234239. Gale ID: A697129308. |
fulltext | fulltext |
identifier | ISSN: 1687-5265 |
ispartof | Computational intelligence and neuroscience, 2020-11, Vol.2020 (2020), p.1-11 |
issn | 1687-5265; 1687-5273 |
language | eng |
recordid | cdi_pubmedcentral_primary_oai_pubmedcentral_nih_gov_7688374 |
source | Wiley Online Library; Publicly Available Content Database |
subjects | Algorithms; Artificial neural networks; Discriminant analysis; Driver fatigue; Eyelid; Face recognition; Fatigue; Feature extraction; Mouth; Neural networks; Physiology; Vision systems |
title | Driver Fatigue Detection Based on Convolutional Neural Networks Using EM-CNN |
url | http://sfxeu10.hosted.exlibrisgroup.com/loughborough?ctx_ver=Z39.88-2004&ctx_enc=info:ofi/enc:UTF-8&ctx_tim=2025-01-16T00%3A48%3A10IST&url_ver=Z39.88-2004&url_ctx_fmt=infofi/fmt:kev:mtx:ctx&rfr_id=info:sid/primo.exlibrisgroup.com:primo3-Article-gale_pubme&rft_val_fmt=info:ofi/fmt:kev:mtx:journal&rft.genre=article&rft.atitle=Driver%20Fatigue%20Detection%20Based%20on%20Convolutional%20Neural%20Networks%20Using%20EM-CNN&rft.jtitle=Computational%20intelligence%20and%20neuroscience&rft.au=Xu,%20Yi&rft.date=2020-11-18&rft.volume=2020&rft.issue=2020&rft.spage=1&rft.epage=11&rft.pages=1-11&rft.issn=1687-5265&rft.eissn=1687-5273&rft_id=info:doi/10.1155/2020/7251280&rft_dat=%3Cgale_pubme%3EA697129308%3C/gale_pubme%3E%3Cgrp_id%3Ecdi_FETCH-LOGICAL-c476t-ca26b14db8195d6f54097ea39fed61e56284fe0c7721627402932b8f41f9e9313%3C/grp_id%3E%3Coa%3E%3C/oa%3E%3Curl%3E%3C/url%3E&rft_id=info:oai/&rft_pqid=2465234239&rft_id=info:pmid/33293943&rft_galeid=A697129308&rfr_iscdi=true |