A benchmark for interactive image segmentation algorithms
Main Authors: , , , ,
Format: Conference Proceeding
Language: English; Japanese
Online Access: Request full text
Summary: This paper proposes a general benchmark for interactive segmentation algorithms. The main contributions can be summarized as follows: (I) A new dataset of fifty images is released. These images are categorized into five groups: animal, artifact, human, building, and plant. They cover several major challenges for the interactive image segmentation task, including fuzzy boundaries, complex texture, cluttered backgrounds, shading effects, sharp corners, and overlapping colors. (II) We propose two schemes, point-process and boundary-process, to generate user scribbles automatically. The point-process simulates the human interaction process in which users incrementally draw scribbles on the major components of the image. The boundary-process simulates the refinement process in which users place additional scribbles near segment boundaries to refine the details of the resulting segments. (III) We then apply two precision measures to quantitatively evaluate the resulting segments of the different algorithms. The region precision measures how many pixels are correctly classified, and the boundary precision measures how close the segment boundary is to the true boundary. This benchmark offers a tentative way to guarantee evaluation fairness for person-oriented tasks. Based on the benchmark, five state-of-the-art interactive segmentation algorithms are evaluated. All the images, synthesized user scribbles, and running results are publicly available on the webpage.
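The two precision measures described above can be sketched in a few lines of NumPy. This is a minimal illustration, not the paper's exact formulation: the summary does not specify the boundary definition or the matching tolerance, so the 4-connected boundary extraction and the `tol` parameter below are assumptions.

```python
import numpy as np

def region_precision(pred, gt):
    """Fraction of pixels whose predicted label matches the ground truth."""
    pred = np.asarray(pred, dtype=bool)
    gt = np.asarray(gt, dtype=bool)
    return float((pred == gt).mean())

def boundary_pixels(mask):
    """Boundary pixels of a binary mask: foreground pixels with at least
    one 4-connected background neighbor (an assumed boundary definition)."""
    m = np.asarray(mask, dtype=bool)
    padded = np.pad(m, 1, constant_values=False)
    # A pixel is interior iff all four of its neighbors are foreground.
    interior = (padded[:-2, 1:-1] & padded[2:, 1:-1] &
                padded[1:-1, :-2] & padded[1:-1, 2:])
    return m & ~interior

def boundary_precision(pred, gt, tol=2):
    """Fraction of predicted boundary pixels lying within `tol` pixels
    (Chebyshev distance) of some ground-truth boundary pixel."""
    pb = np.argwhere(boundary_pixels(pred))
    gb = np.argwhere(boundary_pixels(gt))
    if len(pb) == 0 or len(gb) == 0:
        return 0.0
    # Distance from each predicted boundary pixel to the nearest GT one.
    d = np.abs(pb[:, None, :] - gb[None, :, :]).max(axis=2).min(axis=1)
    return float((d <= tol).mean())
```

A perfect segmentation scores 1.0 on both measures; region precision degrades with every misclassified pixel anywhere in the image, while boundary precision penalizes only predicted boundaries that stray from the true boundary.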
DOI: 10.1109/POV.2011.5712366