Hybrid features and exponential moth-flame optimization based deep belief network for face recognition
Published in: Computer Methods in Biomechanics and Biomedical Engineering, 2020-11, Vol. 8 (6), p. 581-594
Main Authors: ,
Format: Article
Language: English
Summary: Face recognition finds application in various areas, such as biometrics, person identification through identity cards, Closed-Circuit Television (CCTV) surveillance, and so on. Among biometric traits such as fingerprint, palm print, and iris, the face plays an important role. Accordingly, earlier work has developed face recognition techniques with contributions in the feature extraction and classification phases. In this paper, an approach to face recognition is developed using the proposed Exponential Moth-Flame Optimisation (Exponential MFO, or EMFO) based Deep Belief Network (DBN). Initially, the images in the database undergo feature extraction, where K-SIFT and m-Co-HOG features, along with Active Appearance Model (AAM) features, are extracted from each image. Classification is then performed using the proposed EMFO-DBN, in which the Exponential Weighted Moving Average (EWMA) is integrated into the update process of the Moth-Flame Optimisation (MFO) algorithm. Experiments on the CVL database show that the proposed method outperforms other state-of-the-art techniques, achieving an accuracy of 0.98, a False Acceptance Rate (FAR) of 0.0073, and a False Rejection Rate (FRR) of 0.0083.
ISSN: 2168-1163; 2168-1171
DOI: 10.1080/21681163.2020.1761454
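
The summary above describes the EMFO update only at a high level (an EWMA term folded into the MFO position update), and this record does not reproduce the authors' equations. The following is a minimal sketch in Python of how such a smoothed update might look; the blending coefficient `alpha`, the spiral constant `b`, the `sphere` fitness function, and the function name `emfo_sketch` are illustrative assumptions rather than the paper's actual formulation.

```python
import numpy as np


def sphere(x):
    """Placeholder fitness function (lower is better) -- stands in for the DBN training error."""
    return np.sum(x ** 2)


def emfo_sketch(fitness, dim=10, n_moths=20, n_iter=50, alpha=0.3, b=1.0, seed=0):
    """Sketch of MFO with an EWMA-smoothed position update (assumed form, not the paper's)."""
    rng = np.random.default_rng(seed)
    moths = rng.uniform(-5.0, 5.0, size=(n_moths, dim))
    ewma = moths.copy()                      # running EWMA of each moth's position

    for it in range(n_iter):
        scores = np.apply_along_axis(fitness, 1, moths)
        flames = moths[np.argsort(scores)]   # simplified: best current moths act as flames

        # Number of flames shrinks linearly over the iterations, as in standard MFO.
        n_flames = max(1, round(n_moths - it * (n_moths - 1) / n_iter))

        for i in range(n_moths):
            flame = flames[min(i, n_flames - 1)]
            dist = np.abs(flame - moths[i])
            t = rng.uniform(-1.0, 1.0, size=dim)
            # Standard MFO logarithmic-spiral move toward the flame.
            spiral = dist * np.exp(b * t) * np.cos(2 * np.pi * t) + flame

            # Assumed "exponential" component: EWMA smoothing of successive updates.
            ewma[i] = alpha * spiral + (1 - alpha) * ewma[i]
            moths[i] = ewma[i]

    scores = np.apply_along_axis(fitness, 1, moths)
    best = np.argmin(scores)
    return moths[best], scores[best]


if __name__ == "__main__":
    best_x, best_f = emfo_sketch(sphere)
    print("best fitness:", best_f)
```

In the paper's setting, the vector being optimised would encode DBN weights and the fitness would be the recognition error; the sketch uses a toy objective only to keep the update logic self-contained.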