Color-based object segmentation method using artificial neural network

Bibliographic Details
Published in: Simulation Modelling Practice and Theory, 2016-05, Vol. 64, p. 3-17
Main Authors: Hassanat, Ahmad B.A., Alkasassbeh, Mouhammd, Al-awadi, Mouhammd, Alhasanat, Esra'a A.A.
Format: Article
Language:English
Summary: This paper presents a color-based technique for object segmentation in colored digital images. Principally, we make use of several color spaces to classify pixels as either objects of interest or non-objects using artificial neural networks (ANN). This study clearly shows how a novel method for fusing the existing color spaces produces better results in practice than individual color spaces. The segmented objects include lips, faces, hands, fingers and tree leaves. Using several databases to represent these problems, the ANN was trained on the color of each pixel and its surrounding 8 neighbors to be an object or non-object; in test mode, the trained network was used to classify the 9 pixels in the test image as object or non-object. The feature vector used for training and testing resulted from the fusion of different types of color information drawn from different color models of the targeted pixel. Several experiments were conducted on different databases and objects to evaluate the proposed method; significant results were recorded, showing the expressive power of color and some texture information for the object segmentation problem.
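The abstract describes two ideas that combine into the per-pixel feature vector: fusing several color spaces (the record does not name which ones, so RGB, HSV, and YCbCr are assumed here for illustration) and including the 3×3 neighborhood of each pixel. A minimal sketch of that feature construction, under those assumptions and not the authors' exact pipeline, might look like:

```python
import colorsys
import numpy as np

def fuse_color_spaces(rgb):
    """Stack RGB, HSV, and YCbCr channels for an HxWx3 float image in [0, 1].

    The choice of these three color models is an assumption for illustration;
    the paper fuses several color spaces but the record does not list them.
    """
    h, w, _ = rgb.shape
    # colorsys works per pixel on floats in [0, 1]
    hsv = np.array([colorsys.rgb_to_hsv(*px) for px in rgb.reshape(-1, 3)])
    hsv = hsv.reshape(h, w, 3)
    # ITU-R BT.601 YCbCr transform, kept in the [0, 1] range
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    y  = 0.299 * r + 0.587 * g + 0.114 * b
    cb = 0.5 - 0.168736 * r - 0.331264 * g + 0.5 * b
    cr = 0.5 + 0.5 * r - 0.418688 * g - 0.081312 * b
    ycbcr = np.stack([y, cb, cr], axis=-1)
    return np.concatenate([rgb, hsv, ycbcr], axis=-1)  # H x W x 9 channels

def neighborhood_features(fused):
    """Per-pixel feature: the fused channels of the pixel and its 8 neighbors."""
    padded = np.pad(fused, ((1, 1), (1, 1), (0, 0)), mode="edge")
    h, w, c = fused.shape
    feats = np.empty((h, w, 9 * c))
    for dy in range(3):          # slide the 3x3 window via shifted views
        for dx in range(3):
            idx = (dy * 3 + dx) * c
            feats[..., idx:idx + c] = padded[dy:dy + h, dx:dx + w, :]
    return feats.reshape(h * w, 9 * c)

img = np.random.default_rng(0).random((8, 8, 3))
X = neighborhood_features(fuse_color_spaces(img))
print(X.shape)  # (64, 81): one 81-dim fused feature vector per pixel
```

Each row of `X` could then be fed to a small feed-forward ANN with a binary object/non-object label, in the spirit of the training scheme the abstract describes.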
ISSN:1569-190X
1878-1462
DOI:10.1016/j.simpat.2016.02.011