Accurate structure from motion using consistent cluster merging
Published in: Multimedia Tools and Applications, 2022-07, Vol. 81 (17), p. 24913-24935
Main Authors: , ,
Format: Article
Language: English
Summary: The incremental Structure-from-Motion approach is widely used for scene reconstruction, as it is robust to outliers. However, the method suffers from two major limitations: error accumulation and heavy time consumption. To alleviate these problems, we propose a redundant cluster merging approach that is both effective and efficient. Unlike previous clustering methods, in which each cluster has only one overlapping adjacent cluster, each of the sub-clusters produced by our approach has several overlapping adjacent cluster candidates. These candidate clusters are then verified to determine whether they are suitable for merging; by selecting only correctly estimated clusters, cluster merging achieves more accurate results. The verification builds on the fact that correctly estimated clusters of the same scene yield a consistent point cloud and consistent extrinsic camera parameters for each shared image, which we formulate as two constraints. In addition, we introduce a feature-matching consistency constraint to eliminate falsely matched feature pairs; the gain in matching accuracy leads to better estimates within each cluster. Experiments were performed on three public datasets. The reconstruction results show that our method outperforms state-of-the-art SfM approaches in terms of both efficiency and accuracy.
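As a rough illustration of the verification idea described in the abstract (not the authors' implementation), the sketch below aligns a candidate cluster to a reference cluster with a similarity transform estimated from the camera centers of their shared images, then accepts the merge only if both the aligned camera centers and the shared 3D points agree within tolerances. All function names and threshold values here are hypothetical.

```python
# Hypothetical sketch of cluster verification for merging: two clusters that
# correctly reconstruct the same scene should agree, after a similarity
# alignment, on the extrinsics of their shared cameras and on their shared
# 3D points. Names and tolerances are illustrative, not the paper's code.
import numpy as np

def umeyama_similarity(src, dst):
    """Least-squares similarity transform (s, R, t) mapping src -> dst.
    src, dst: (N, 3) arrays of corresponding 3D points, N >= 3."""
    mu_s, mu_d = src.mean(0), dst.mean(0)
    src_c, dst_c = src - mu_s, dst - mu_d
    cov = dst_c.T @ src_c / len(src)          # cross-covariance matrix
    U, D, Vt = np.linalg.svd(cov)
    S = np.eye(3)
    if np.linalg.det(U) * np.linalg.det(Vt) < 0:
        S[2, 2] = -1                          # avoid reflections
    R = U @ S @ Vt
    s = np.trace(np.diag(D) @ S) / src_c.var(0).sum()
    t = mu_d - s * R @ mu_s
    return s, R, t

def clusters_consistent(centers_a, centers_b, points_a, points_b,
                        cam_tol=0.05, pt_tol=0.10):
    """Verify a merge candidate: align cluster B to cluster A using the
    camera centers of their shared images, then check that (1) the aligned
    camera centers and (2) the corresponding shared 3D points agree within
    tolerances (both thresholds are made-up placeholders)."""
    s, R, t = umeyama_similarity(centers_b, centers_a)
    cam_err = np.linalg.norm(centers_a - (s * (R @ centers_b.T).T + t), axis=1)
    pt_err = np.linalg.norm(points_a - (s * (R @ points_b.T).T + t), axis=1)
    return cam_err.mean() < cam_tol and np.median(pt_err) < pt_tol
```

A similarity transform, rather than a rigid one, is needed here because independently reconstructed clusters differ by an arbitrary global scale in addition to rotation and translation.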
ISSN: 1380-7501, 1573-7721
DOI: 10.1007/s11042-022-12202-w