Sparse Multi-Bending Snakes
Published in: IEEE Transactions on Image Processing, 2019-08, Vol. 28 (8), pp. 3898-3909
Main Authors:
Format: Article
Language: English
Summary: Active contour models are among the most emblematic algorithms of computer vision. Their strong theoretical foundations and high degree of user interactivity have turned them into a reference approach for object segmentation and tracking tasks. Many modifications have already been proposed to overcome the known problems of traditional snakes, such as initialization dependence and poor convergence to concavities. In this paper, we address the scenario where the user wants to segment an object that has multiple dynamic regions, some of which do not correspond to the true object boundary. We propose a novel parametric active contour model, the Sparse Multi-Bending snake, which is capable of dividing the contour into a set of contiguous regions with different bending properties. We derive a new energy function that induces this behavior and present a group optimization strategy that can be used to find the optimal bending-resistance parameter for each point of the contour. We show the flexibility of our model on a set of synthetic images. In addition, we consider two real applications, lung segmentation in Computerized Tomography data and hand segmentation in depth images. We show how the proposed method improves the segmentations obtained in both applications when compared with other active contour models.
ISSN: 1057-7149, 1941-0042
DOI: 10.1109/TIP.2019.2902832
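The abstract describes a parametric snake whose bending resistance varies along the contour. The sketch below is only a minimal illustration of that general idea: a classical Kass-style internal energy with per-point elasticity alpha_i and bending weight beta_i, optimized with a semi-implicit update. The function names, the toy external force, and the two-region beta profile are illustrative assumptions; this is not the paper's Sparse Multi-Bending energy or its group optimization strategy.

```python
import numpy as np

def build_internal_matrix(n, alpha, beta):
    """Internal-energy matrix A for a closed snake with n points, so that
    E_int(v) = 0.5 * v^T A v (applied to the x and y coordinates separately).

    alpha, beta: length-n arrays of per-point elasticity and bending-resistance
    weights. Letting beta vary along the contour is the generic idea behind
    region-wise bending; it is not the paper's exact formulation.
    """
    I = np.eye(n)
    D1 = np.roll(I, 1, axis=1) - I                               # (D1 v)_i = v_{i+1} - v_i
    D2 = np.roll(I, 1, axis=1) - 2 * I + np.roll(I, -1, axis=1)  # (D2 v)_i = v_{i+1} - 2 v_i + v_{i-1}
    return D1.T @ np.diag(alpha) @ D1 + D2.T @ np.diag(beta) @ D2

def snake_step(v, A, external_force, tau=0.1):
    """One semi-implicit update: (I + tau * A) v_new = v + tau * f_ext(v)."""
    n = len(v)
    f = external_force(v)                       # (n, 2) external force at each contour point
    return np.linalg.solve(np.eye(n) + tau * A, v + tau * f)

# Toy usage (assumed setup): a circle shrinking under a force pulling it toward
# the origin, stiff (large beta) on one half of the contour, flexible on the other.
n = 100
theta = np.linspace(0.0, 2.0 * np.pi, n, endpoint=False)
v = np.stack([np.cos(theta), np.sin(theta)], axis=1)
alpha = np.full(n, 0.5)
beta = np.where(theta < np.pi, 5.0, 0.01)       # two contiguous bending regions
A = build_internal_matrix(n, alpha, beta)
for _ in range(50):
    v = snake_step(v, A, lambda p: -0.1 * p)
```

In this toy run the half of the contour with large beta resists bending and stays smooth, while the flexible half deforms more freely, which is the qualitative behavior the abstract attributes to contour regions with different bending properties.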