Geodesic graph cut for interactive image segmentation
Main Authors:
Format: Conference Proceeding
Language: English
Summary: Interactive segmentation is useful for selecting objects of interest in images and continues to be a topic of much study. Methods that grow regions from foreground/background seeds, such as the recent geodesic segmentation approach, avoid the boundary-length bias of graph-cut methods but have their own bias towards minimizing paths to the seeds, resulting in increased sensitivity to seed placement. The lack of edge modeling in geodesic or similar approaches limits their ability to precisely localize object boundaries, something at which graph-cut methods generally excel. This paper presents a method for combining geodesic-distance information with edge information in a graph-cut optimization framework, leveraging the complementary strengths of each. Rather than a fixed combination, we use the distinctiveness of the foreground/background color models to predict the effectiveness of the geodesic distance term and adjust the weighting accordingly. We also introduce a spatially varying weighting that decreases the potential for shortcutting in object interiors while transferring greater control to the edge term for better localization near object boundaries. Results show our method is less prone to shortcutting than typical graph-cut methods while being less sensitive to seed placement and better at edge localization than geodesic methods. This leads to increased segmentation accuracy and reduced effort on the part of the user.
ISSN: 1063-6919
DOI: 10.1109/CVPR.2010.5540079
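To make the idea in the summary concrete, the energy being optimized can be pictured as a standard graph-cut objective whose data term is built from geodesic distances and whose weight is adapted per pixel. The following is a minimal sketch only, not the paper's exact formulation; the symbols $D_{\mathrm{geo}}$, $B_{p,q}$, $w(p)$, and $\lambda$ are assumed names introduced here for illustration.

```latex
% Sketch only: a generic graph-cut energy with a geodesic-distance data term.
% D_geo, B_{p,q}, w(p), and \lambda are assumed notation, not the paper's.
\[
  E(L) \;=\; \sum_{p} w(p)\, D_{\mathrm{geo}}\!\left(p, L_p\right)
        \;+\; \lambda \sum_{(p,q) \in \mathcal{N}} B_{p,q}\, \big[L_p \neq L_q\big]
\]
```

Here $L$ is a binary foreground/background labeling of the pixels, $D_{\mathrm{geo}}(p,\ell)$ is the geodesic distance from pixel $p$ to the seeds of label $\ell$, $B_{p,q}$ is a contrast-sensitive edge term over neighboring pixels $\mathcal{N}$, and minimizing an energy of this shape with an s/t min-cut yields the segmentation. The points emphasized in the summary map onto $w(p)$: it is predicted from how distinctive the foreground/background color models are and varied spatially, so the geodesic term dominates in object interiors (reducing shortcutting) while the edge term takes over near object boundaries for better localization.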