
Multistage Framework for Automatic Face Mask Detection Using Deep Learning

Bibliographic Details
Published in: Computational Intelligence and Neuroscience, 2022-08, Vol. 2022, p. 1500047-12
Main Authors: K. N., Sowmya; P. M., Rekha; Kumari, Trishala; Debtera, Baru
Format: Article
Language:English
Description
Summary: The whole world is fighting as one against a deadly virus. COVID-19 cases arrive in waves, with each subsequent wave turning out to be worse than the previous one. Scores of human lives have been lost, while post-COVID-19 complications are on the rise. Monitoring the behaviour of people in public places and offices is necessary to mitigate the transmission of COVID-19 among humans. In this work, a low-cost, lightweight two-stage face mask detection model is proposed. In the first stage, the model checks whether a face mask is worn. In the second stage, it determines whether the mask is worn appropriately, by classifying and labelling the detected faces. The proposed models are trained to detect faces with and without masks in varied inputs such as images, recorded videos, and live streaming video, and they can efficiently detect multiple faces at once. The efficacy of the proposed approach is tested against conventional datasets as well as our proposed dataset, which covers three classes: no mask, surgical mask, and nonsurgical mask. Multiple CNN models, namely MobileNetV2, ResNet50V2, and InceptionV3, are trained and evaluated using transfer learning. We adopt MobileNetV2 as the backbone model since it achieves an accuracy of 98.44%.
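
The summary mentions transfer learning with a MobileNetV2 backbone for the mask classification stage, but this record gives no implementation details. The listing below is only a minimal sketch, assuming TensorFlow/Keras: a MobileNetV2 backbone with frozen ImageNet weights plus a small classification head for the three classes named in the abstract (no mask, surgical mask, nonsurgical mask). The image size, dropout rate, and optimizer settings are illustrative assumptions, not values from the paper.

# Illustrative sketch only: MobileNetV2-based mask classifier via transfer
# learning in TensorFlow/Keras. Image size, dropout, and learning rate are
# assumptions, not taken from the paper.
import tensorflow as tf
from tensorflow.keras import layers, models

NUM_CLASSES = 3          # assumed labels: no mask, surgical mask, nonsurgical mask
IMG_SIZE = (224, 224)    # assumed input resolution

def build_mask_classifier(num_classes: int = NUM_CLASSES) -> tf.keras.Model:
    """Build a classifier with a frozen MobileNetV2 backbone (ImageNet weights)."""
    backbone = tf.keras.applications.MobileNetV2(
        input_shape=IMG_SIZE + (3,),
        include_top=False,
        weights="imagenet",
    )
    backbone.trainable = False  # transfer learning: train only the new head

    inputs = layers.Input(shape=IMG_SIZE + (3,))
    x = tf.keras.applications.mobilenet_v2.preprocess_input(inputs)
    x = backbone(x, training=False)
    x = layers.GlobalAveragePooling2D()(x)
    x = layers.Dropout(0.3)(x)
    outputs = layers.Dense(num_classes, activation="softmax")(x)
    return models.Model(inputs, outputs)

model = build_mask_classifier()
model.compile(
    optimizer=tf.keras.optimizers.Adam(learning_rate=1e-4),
    loss="categorical_crossentropy",
    metrics=["accuracy"],
)

In a two-stage pipeline like the one described, a separate face detector would first locate faces in a frame; each cropped face would then be passed to a classifier of this kind to decide whether and how a mask is worn.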
ISSN: 1687-5265
1687-5273
DOI: 10.1155/2022/1500047