
Machine learning-based vehicle detection and tracking based on headlight extraction and GMM clustering under low illumination conditions

Advanced traffic flow management and control systems aimed at continuously monitoring vehicles are quite popular due to video camera affordability and their wide applicability in the intelligent transportation domain. To date, vehicle object detection and tracking at nighttime built upon RGB color sensors, in particular monocular cameras, remains a challenging task for researchers and engineers. Light reflections coming from vehicle lights, road signs, road surface markers, and headlight dazzle may significantly distort the camera image and, eventually, degrade the performance of vehicle surveillance systems. To overcome these issues, we employ convolutional neural networks to extract the headlights of each vehicle and detect their locations at the same time. Once the position of every headlight is known, we apply a newly developed adaptive method built on top of Gaussian Mixture Model clustering, based on prior knowledge and assumptions, to pair up the extracted single headlights belonging to the same vehicle and to track the vehicle objects sequentially frame by frame in a video. The trajectories of the vehicles can be further analyzed to generate traffic volume and speed parameters for traffic flow control. Experiments conducted in different environmental conditions show stable and robust results for the developed approach. Our proposed method achieved the best overall performance, with averages of 97.4 % IDF1, 96.8 % IDR, 94.7 % Recall, 104 MT, 398 FN, and 90.6 % MOTA on a public headlight detection and tracking dataset.
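
To illustrate the headlight-pairing step described in the abstract, the sketch below groups detected headlight centroids into per-vehicle clusters with a Gaussian Mixture Model. It is an illustrative approximation only, not the authors' adaptive method: it uses scikit-learn's GaussianMixture, a hypothetical pair_headlights helper, and the naive assumption that each vehicle contributes exactly two detected headlights.

# Illustrative sketch only: pairing detected headlights into vehicles with a
# Gaussian Mixture Model, roughly in the spirit of the abstract. The paper's
# adaptive, prior-knowledge-driven pairing is not reproduced here; the number
# of vehicles is simply assumed to be half the number of detected headlights.
import numpy as np
from sklearn.mixture import GaussianMixture

def pair_headlights(centroids: np.ndarray) -> list[tuple[int, ...]]:
    """Group headlight centroids (N x 2 array of image coordinates, in pixels)
    into per-vehicle clusters via GMM clustering."""
    n_vehicles = max(1, len(centroids) // 2)          # naive assumption
    gmm = GaussianMixture(n_components=n_vehicles,
                          covariance_type="full",
                          random_state=0)
    labels = gmm.fit_predict(centroids)
    # Collect the indices of the headlights assigned to each mixture component.
    return [tuple(np.flatnonzero(labels == k)) for k in range(n_vehicles)]

if __name__ == "__main__":
    # Hypothetical headlight detections from one video frame (x, y in pixels).
    detections = np.array([[120, 340], [180, 342],    # left/right lights, car A
                           [420, 310], [470, 312]])   # left/right lights, car B
    print(pair_headlights(detections))

The paper's method additionally incorporates prior knowledge and assumptions when forming pairs and tracks the paired vehicles frame by frame; this generic clustering call does not attempt to reproduce those steps.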

Bibliographic Details
Published in: Expert systems with applications, 2025-04, Vol. 267, p. 126240, Article 126240
Main Authors: Lashkov, Igor; Yuan, Runze; Zhang, Guohui
Format: Article
Language:English
Subjects: Headlight extraction; Nighttime conditions; Traffic flow monitoring; Vehicle detection; Vehicle tracking
ISSN: 0957-4174
DOI: 10.1016/j.eswa.2024.126240
Publisher: Elsevier Ltd
Source: ScienceDirect Freedom Collection