Efficient Region-Based 3-D Urban Building Reconstruction From TomoSAR Images

Bibliographic Details
Published in: IEEE Transactions on Geoscience and Remote Sensing, 2024, Vol. 62, pp. 1-14
Main Authors: Wang, Wei; Yu, Liankun; Dong, Qiulei; Hu, Zhanyi
Format: Article
Language: English
Description
Summary: The tomographic synthetic aperture radar (TomoSAR) technique has been gaining attention because it can retrieve the 3-D structures of urban buildings by synthesizing apertures along the elevation direction. However, most existing TomoSAR methods in the literature calculate elevations pixel by pixel and overlook the correlation between elevations, leading to low accuracy and efficiency. To solve these problems, this study introduces an efficient region-based 3-D urban building reconstruction method that incorporates different geometric primitives (i.e., points, planes, and models). Specifically, the proposed method, under the constraints constructed by different geometric primitives, follows three steps to reconstruct the box-like models of urban buildings: 1) it detects double-bounce regions and reconstructs box-like models based on plane sweeping and region growing; 2) it reconstructs box-like models based on multiplane fitting and optimization for non-double-bounce regions; and 3) it regularizes box-like models based on building layout priors (e.g., collinearity and proximity). The experimental results on two datasets show that the proposed method can efficiently produce reliable results and outperforms several existing methods both qualitatively and quantitatively.
ISSN: 0196-2892, 1558-0644
DOI: 10.1109/TGRS.2024.3417948
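
The summary names a multi-step pipeline (plane sweeping, region growing, multiplane fitting and optimization, layout-prior regularization) without implementation detail. As a rough illustration of the multiplane-fitting ingredient mentioned in step 2, the sketch below runs a generic sequential-RANSAC plane extraction over a synthetic 3-D point set. It is not the authors' implementation: NumPy, the synthetic roof/facade scene, the thresholds, and the function names (fit_plane, ransac_planes) are all assumptions made for illustration.

# Illustrative sketch only: generic sequential-RANSAC multiplane fitting on a
# synthetic 3-D point set, NOT the method of Wang et al. All parameters and
# the synthetic scene are arbitrary assumptions.
import numpy as np

def fit_plane(points: np.ndarray) -> tuple[np.ndarray, float]:
    """Least-squares plane through `points` (N x 3): returns a unit normal n
    and offset d such that n . x + d ~ 0 for points on the plane."""
    centroid = points.mean(axis=0)
    # The normal is the right singular vector with the smallest singular value
    # of the centered point matrix.
    _, _, vt = np.linalg.svd(points - centroid, full_matrices=False)
    normal = vt[-1]
    return normal, -float(normal @ centroid)

def ransac_planes(points, n_planes=2, iters=300, tol=0.15, min_inliers=50, seed=0):
    """Greedy sequential RANSAC: repeatedly extract the plane with the most
    inliers, remove those inliers, and continue on the remaining points."""
    rng = np.random.default_rng(seed)
    remaining = points.copy()
    planes = []
    for _ in range(n_planes):
        best_mask, best_count = None, 0
        for _ in range(iters):
            sample = remaining[rng.choice(len(remaining), 3, replace=False)]
            normal, d = fit_plane(sample)
            mask = np.abs(remaining @ normal + d) < tol
            if mask.sum() > best_count:
                best_mask, best_count = mask, int(mask.sum())
        if best_count < min_inliers:
            break
        # Refine the plane on all of its inliers, then drop them.
        normal, d = fit_plane(remaining[best_mask])
        planes.append((normal, d, best_count))
        remaining = remaining[~best_mask]
    return planes

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    # Synthetic stand-in for reconstructed 3-D scatterers: a "roof" (horizontal
    # plane at z = 20), a "facade" (vertical plane at x = 5), plus clutter.
    roof = np.column_stack([rng.uniform(0, 10, 400), rng.uniform(0, 10, 400),
                            20 + rng.normal(0, 0.05, 400)])
    facade = np.column_stack([5 + rng.normal(0, 0.05, 400),
                              rng.uniform(0, 10, 400), rng.uniform(0, 20, 400)])
    clutter = rng.uniform([0, 0, 0], [10, 10, 25], size=(100, 3))
    cloud = np.vstack([roof, facade, clutter])

    for i, (n, d, count) in enumerate(ransac_planes(cloud), start=1):
        print(f"plane {i}: normal={np.round(n, 3)}, offset={d:.2f}, inliers={count}")

In an actual TomoSAR workflow the input points would come from per-pixel elevation estimates rather than a synthetic scene, and the fitted planes would feed the box-model construction and layout-prior regularization stages described in the abstract.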