Semi-Supervised Cross-Spectral Face Recognition with Small Datasets
| Main Authors: | |
|---|---|
| Format: | Conference Proceeding |
| Language: | English |
| Subjects: | |
| Summary: | While systems based on deep neural networks have produced remarkable performance on many tasks such as face/object detection and recognition, they also require large amounts of labeled training data. However, there are many applications where collecting a relatively large labeled training dataset may not be feasible due to time and/or financial constraints. Training deep networks on such small datasets in the standard manner usually leads to serious overfitting and poor generalization. In this work, we explore how a state-of-the-art deep learning pipeline for unconstrained visual face identification and verification can be adapted to domains with scarce data/label availability using semi-supervised learning. The rationale for system adaptation and the experiments are set in the following context: given a pretrained network (trained on a large training dataset in the source domain), adapt it to generalize to a target domain using a relatively small labeled training dataset (typically one hundred to ten thousand times smaller) and an unlabeled training dataset. We present algorithms and results of extensive experiments with varying training dataset sizes, compositions, and model architectures, using the IARPA JANUS Benchmark Multi-domain Face dataset for training and evaluation, with the visible and short-wave infrared domains as the source and target domains, respectively. |
| ISSN: | 2690-621X |
| DOI: | 10.1109/WACVW60836.2024.00069 |
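
The paper's own algorithms are not included in this record. As a rough illustration of the setting the summary describes (adapting a network pretrained on a large visible-spectrum source dataset to a target domain with only a small labeled set and an unlabeled set), the sketch below shows generic pseudo-label fine-tuning in PyTorch. The pseudo-labeling scheme and every name in it (`adapt`, `pretrained_backbone`, `classifier`, the loaders, `conf_threshold`) are illustrative assumptions, not the authors' actual method.

```python
# Illustrative sketch only (not the paper's algorithm): generic pseudo-label
# fine-tuning of a pretrained face-recognition backbone on a scarce target
# domain (e.g. adapting a visible-spectrum model to short-wave infrared).
# All names here (adapt, pretrained_backbone, classifier, the loaders) are
# hypothetical placeholders.
import torch
import torch.nn as nn
import torch.nn.functional as F


def adapt(pretrained_backbone, classifier, labeled_loader, unlabeled_loader,
          epochs=10, conf_threshold=0.9, lr=1e-4, device="cpu"):
    """Fine-tune a source-domain model on a small labeled target set plus an
    unlabeled target set, using confident predictions as pseudo-labels.

    labeled_loader yields (images, labels); unlabeled_loader yields images.
    """
    model = nn.Sequential(pretrained_backbone, classifier).to(device)
    optimizer = torch.optim.Adam(model.parameters(), lr=lr)

    for _ in range(epochs):
        for (x_l, y_l), x_u in zip(labeled_loader, unlabeled_loader):
            x_l, y_l, x_u = x_l.to(device), y_l.to(device), x_u.to(device)

            # Supervised loss on the small labeled target-domain batch.
            loss = F.cross_entropy(model(x_l), y_l)

            # Pseudo-labels: keep only high-confidence unlabeled predictions.
            with torch.no_grad():
                probs = F.softmax(model(x_u), dim=1)
                conf, pseudo = probs.max(dim=1)
                mask = conf > conf_threshold
            if mask.any():
                loss = loss + F.cross_entropy(model(x_u[mask]), pseudo[mask])

            optimizer.zero_grad()
            loss.backward()
            optimizer.step()

    return model
```

In this kind of scheme, the small labeled target set supplies the supervised term while confident predictions on the unlabeled set add a pseudo-label term; the confidence threshold trades pseudo-label coverage against label noise.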