A Similarity-Based Positional Attention-Aided Deep Learning Model for Copy-Move Forgery Detection
Published in: IEEE Transactions on Artificial Intelligence, 2024-09, Vol. 5 (9), pp. 4354-4363
Main Authors: , ,
Format: Article
Language: English
Summary: The process of modifying digital images has been made significantly easier by the wide availability of image editing software. However, in a variety of contexts, including journalism, judicial processes, and historical documentation, the authenticity of images is of utmost importance. Copy-move forgery is a distinct type of image manipulation in which a portion of an image is copied and pasted into another area of the same image, creating a fictitious or altered version of the original. In this research, we present a lightweight MultiResUnet architecture with a similarity-based positional attention module (SPAM) for copy-move forgery detection (CMFD). By computing a similarity measure across patches of the feature maps, this attention module identifies the patches in which a forged region is present. The lightweight network also enables resource-efficient training and makes the model usable in real time. We have employed four commonly used but extremely challenging CMFD datasets, namely CoMoFoD, COVERAGE, CASIA v2, and MICC-F600, to assess the effectiveness of our model. The proposed model significantly lowers false positives, thereby improving the pixel-level accuracy and dependability of CMFD tools.
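The summary describes SPAM only at a high level: patch descriptors are compared by a similarity measure, and patches with strong matches elsewhere in the image (the hallmark of a copied region) are emphasized. The following is a minimal, hypothetical PyTorch sketch of that idea, not the authors' implementation; the class name, the patch size, the 1x1 projection, and the sigmoid gating are all assumptions made for illustration.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class SimilarityPositionalAttention(nn.Module):
    """Illustrative sketch of a similarity-based patch attention block.

    Hypothetical reading of SPAM: score pairwise cosine similarity
    between feature patches and boost patches that have a strong
    near-duplicate at a *different* position in the same image.
    """

    def __init__(self, channels: int, patch: int = 8):
        super().__init__()
        self.patch = patch
        self.proj = nn.Conv2d(channels, channels, kernel_size=1)  # assumed projection

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        b, c, h, w = x.shape
        p = self.patch
        # Split the feature map into non-overlapping p x p patches and
        # flatten each patch into a descriptor vector: (b, c*p*p, n).
        feats = F.unfold(self.proj(x), kernel_size=p, stride=p)
        feats = F.normalize(feats, dim=1)            # unit-length descriptors
        sim = feats.transpose(1, 2) @ feats          # (b, n, n) cosine similarity
        # Mask self-similarity; a copied patch should match a different
        # location, so keep only the best off-diagonal match per patch.
        n = sim.size(-1)
        eye = torch.eye(n, device=x.device, dtype=torch.bool)
        score = sim.masked_fill(eye, float("-inf")).max(dim=-1).values  # (b, n)
        # Turn match scores into a per-patch attention map, upsample it
        # to the feature grid, and reweight the input features.
        attn = torch.sigmoid(score).view(b, 1, h // p, w // p)
        attn = F.interpolate(attn, size=(h, w), mode="nearest")
        return x * attn

# Usage on a hypothetical 64-channel encoder feature map:
x = torch.randn(2, 64, 64, 64)
out = SimilarityPositionalAttention(64)(x)
print(out.shape)  # torch.Size([2, 64, 64, 64])
```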
ISSN: 2691-4581
DOI: 10.1109/TAI.2024.3379941