Robust CNN-based Camera Model Identification against Fixed L∞-norm Adversarial Examples
Main Authors:
Format: Conference Proceeding
Language: English
Subjects:
Online Access: Request full text
Summary: Various machine-learning and deep-learning models misclassify adversarial examples, which are formed by adding small perturbations to the original image. Image forensics is concerned with either source camera identification or the identification of an image's editing history. Convolutional neural network (CNN) based forensic methods are widely used because they can learn the traces left in an image that relate to both the camera model and the manipulation. However, these CNNs are vulnerable to adversarial examples. One of the most effective defenses is to train the classifier on adversarially perturbed images, but adversarial training takes a very long time because of the many gradient computations with respect to the pixel values. Steganalysis, which aims to unearth an image's hidden message, is a natural choice for adversarial detection, since both tasks detect manipulations of the input. This paper proposes a robust method that applies a weighted steganalysis algorithm to the image before passing it to a camera-model classifier network trained with free adversarial training. Five camera models from the VISION dataset were used, with DenseNet as the CNN classifier. The robust accuracy on the original images was 91.6%, and on adversarial examples it was 43.8%. With the proposed filter, the robust accuracy on adversarial examples increases to 58.1%; however, the accuracy on the original images drops to 89.6%.
ISSN: 2325-9418
DOI: 10.1109/INDICON56171.2022.10040104
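The title fixes an L∞-norm perturbation budget for the attacker. As background, a minimal sketch of how such a fixed-budget adversarial example is typically crafted, using one-step FGSM against a PyTorch classifier; the budget `eps` and the specific attack used in the paper are assumptions, since the record does not state them:

```python
import torch
import torch.nn.functional as F

def fgsm_linf(model, images, labels, eps=2 / 255):
    """Craft adversarial examples with a fixed L-infinity budget via FGSM.

    `model` can be any camera-model classifier (e.g. a DenseNet); `eps`
    is an illustrative budget, not the value used in the paper.
    """
    images = images.clone().detach().requires_grad_(True)
    loss = F.cross_entropy(model(images), labels)
    loss.backward()
    # Step in the direction of the sign of the loss gradient; the sign
    # step makes the perturbation's L-inf norm exactly eps.
    adv = images + eps * images.grad.sign()
    # Keep pixel values in the valid [0, 1] range.
    return adv.clamp(0.0, 1.0).detach()
```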
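The summary motivates "free" adversarial training as a way around the cost of many per-pixel gradient computations. A sketch of one common formulation (after Shafahi et al., 2019), in which each minibatch is replayed several times and a single backward pass per replay updates both the model weights and a persistent L∞ perturbation; all hyperparameters here are illustrative, not the paper's:

```python
import torch
import torch.nn.functional as F

def free_adversarial_training(model, loader, epochs=10, eps=2 / 255,
                              replays=4, lr=0.1):
    """'Free' adversarial training sketch: robustness at roughly the cost
    of natural training, since gradients w.r.t. the input are reused from
    the same backward pass that updates the weights.

    Assumes a fixed batch size (e.g. DataLoader(..., drop_last=True)).
    """
    opt = torch.optim.SGD(model.parameters(), lr=lr)
    delta = None  # persistent perturbation, carried across minibatches
    for _ in range(epochs):
        for x, y in loader:
            if delta is None:
                delta = torch.zeros_like(x)
            for _ in range(replays):
                delta.requires_grad_(True)
                loss = F.cross_entropy(model(x + delta), y)
                opt.zero_grad()
                loss.backward()  # one pass: grads for both weights and delta
                opt.step()       # descend on the model weights
                # Ascend on the input perturbation and project back onto
                # the L-inf ball of radius eps.
                delta = (delta + eps * delta.grad.sign()).clamp(-eps, eps).detach()
```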