
PRNU based source camera attribution for image sets anonymized with patch-match algorithm

Bibliographic Details
Published in: Digital Investigation, 2019-09, Vol. 30, pp. 43-51
Main Authors: Karaküçük, Ahmet, Dirik, A. Emir
Format: Article
Language:English
Description
Summary: Patch-Match is an efficient algorithm for structural image editing, available as a tool in popular commercial photo-editing software. The tool allows users to insert or remove objects in photos using information from similar scene content. Recently, a modified version of this algorithm was proposed as a counter-measure against Photo-Response Non-Uniformity (PRNU) based Source Camera Identification (SCI). The algorithm provides anonymity at a high rate (97%) and impedes PRNU-based SCI without requiring any other information, leaving no known recourse for PRNU-based SCI. In this paper, we propose a method to identify the sources of Patch-Match-applied images by using randomized subsets of images together with traditional PRNU-based SCI methods. We evaluate the proposed method in two forensic scenarios in which an adversary uses the Patch-Match algorithm to distort the PRNU noise pattern in the incriminating images taken with their camera. Our results show that it is possible to link sets of Patch-Match-applied images back to their source camera, even in the presence of images from unknown cameras. To the best of our knowledge, the proposed method is the first counter-measure against the use of Patch-Match in the digital forensics literature.
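The PRNU-based SCI test underlying this line of work can be illustrated with a minimal NumPy sketch: estimate a camera fingerprint by averaging noise residuals over a set of images, then correlate a query image's residual against it. This is a simplified illustration under stated assumptions, not the authors' pipeline: real systems use wavelet denoising and Peak-to-Correlation-Energy (PCE) rather than the 3x3 box filter and plain normalized correlation used here, and all function names are hypothetical.

```python
import numpy as np

def box_denoise(img):
    """3x3 local-mean smoothing (a stand-in for the wavelet
    denoisers used in real PRNU pipelines)."""
    p = np.pad(img, 1, mode="edge")
    h, w = img.shape
    return sum(p[i:i + h, j:j + w]
               for i in range(3) for j in range(3)) / 9.0

def noise_residual(img):
    """Noise residual W = I - denoise(I); carries the PRNU trace."""
    return img - box_denoise(img)

def fingerprint(images):
    """Camera fingerprint estimate: average residual over a set of
    images (in the paper's method, a randomized subset)."""
    return np.mean([noise_residual(im) for im in images], axis=0)

def ncc(a, b):
    """Normalized cross-correlation between two residual patterns."""
    a = a - a.mean()
    b = b - b.mean()
    return float((a * b).sum() / (np.linalg.norm(a) * np.linalg.norm(b)))
```

A query image is attributed to a camera when `ncc(noise_residual(query), fingerprint(subset))` exceeds a decision threshold; the paper's contribution is that drawing many randomized subsets of the Patch-Match-applied set lets this test succeed even though each individual image's PRNU pattern has been distorted.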
ISSN: 1742-2876, 1873-202X
DOI: 10.1016/j.diin.2019.06.001