
Pornographic image detection utilizing deep convolutional neural networks


Bibliographic Details
Published in: Neurocomputing (Amsterdam), 2016-10, Vol. 210, p. 283-293
Main Authors: Nian, Fudong, Li, Teng, Wang, Yan, Xu, Mingliang, Wu, Jun
Format: Article
Language:English
Description
Summary: Many internet users are potential victims of pornographic images, and a large portion of them are underage children. Content-based pornographic image detection is therefore an important task in computer vision and multimedia research. Previous solutions usually rely on hand-engineered visual features that are hard to analyze and select. In this paper, a novel scheme utilizing deep convolutional neural networks (CNN) is proposed to detect pornographic images of any style accurately and efficiently with a single model. The training data are collected from the internet and processed with an improved sliding-window method and several novel data augmentation approaches. A highly efficient training algorithm is then proposed based on two strategies: the first is a non-fixed fine-tuning strategy for pre-trained mid-level representations; the second adjusts the training data at appropriate times according to the performance of the proposed network on the validation set. Furthermore, a fast image scanning method, also based on the sliding-window approach, is introduced at test time, together with a fast forward-pass method based on the "fixed-point algorithm", so the CNN can detect images at all scales with a single forward pass. The effectiveness of the proposed method is demonstrated in experiments on the proposed dataset, and the comparative results show that the method achieves state-of-the-art detection performance.
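The abstract names two ingredients: fine-tuning a pre-trained CNN whose mid-level layers are not fixed, and scoring test images with a sliding window. The sketch below is a minimal illustration of those two ideas only, not the authors' code or architecture; the ResNet-18 backbone, the 224-pixel crop size, the window/stride values, and the learning-rate split are all assumptions made for the example.

```python
# Minimal sketch (assumed setup, not the paper's network): fine-tune a
# pre-trained CNN with trainable ("non-fixed") mid-level layers for binary
# pornographic-vs-benign classification, then score an image by sliding a
# window over it and taking the maximum crop probability.
import torch
import torch.nn as nn
from torchvision import models, transforms
from PIL import Image

# Pre-trained backbone; replace the final layer with a 2-class head.
model = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
model.fc = nn.Linear(model.fc.in_features, 2)

# "Non-fixed" fine-tuning: mid-level blocks stay trainable at a small
# learning rate, the new head trains faster; earlier layers receive no
# updates because they are not passed to the optimizer (an assumed split).
optimizer = torch.optim.SGD(
    [
        {"params": model.layer3.parameters(), "lr": 1e-4},
        {"params": model.layer4.parameters(), "lr": 1e-4},
        {"params": model.fc.parameters(), "lr": 1e-3},
    ],
    momentum=0.9,
)

preprocess = transforms.Compose([
    transforms.Resize((224, 224)),
    transforms.ToTensor(),
    transforms.Normalize(mean=[0.485, 0.456, 0.406],
                         std=[0.229, 0.224, 0.225]),
])

@torch.no_grad()
def sliding_window_score(image: Image.Image, window=256, stride=128) -> float:
    """Score an image by the maximum 'pornographic' probability over crops."""
    model.eval()
    w, h = image.size
    best = 0.0
    for top in range(0, max(h - window, 0) + 1, stride):
        for left in range(0, max(w - window, 0) + 1, stride):
            crop = image.crop((left, top, left + window, top + window))
            logits = model(preprocess(crop).unsqueeze(0))
            prob = torch.softmax(logits, dim=1)[0, 1].item()
            best = max(best, prob)
    return best
```

The paper's own fast scanning replaces this naive per-crop loop with a single forward pass over the full image; the loop above is only meant to show what a basic sliding-window score computes.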
ISSN: 0925-2312
1872-8286
DOI: 10.1016/j.neucom.2015.09.135