SEMANTIC URBAN MESH ENHANCEMENT UTILIZING A HYBRID MODEL
Published in: ISPRS Annals of the Photogrammetry, Remote Sensing and Spatial Information Sciences, 2019-09, Vol. IV-2/W7, p. 175-182
Main Authors:
Format: Article
Language: English
Summary: We propose a feature-based approach for semantic mesh segmentation in an urban scenario using real-world training data. Only a few works so far deal with the semantic interpretation of urban triangle meshes; most 3D classifications operate on point clouds. However, we claim that point clouds are an intermediate product in the photogrammetric pipeline. For this reason, we explore the capabilities of a Convolutional Neural Network (CNN) based approach to semantically enrich textured urban triangle meshes as generated from LiDAR or Multi-View Stereo (MVS). For each face within a mesh, a feature vector is computed and fed into a multi-branch 1D CNN. Ordinarily, CNNs are an end-to-end learning approach operating on regularly structured input data. Meshes, however, are not regularly structured. By calculating feature vectors, we enable the CNN to process mesh data. In this way, we combine explicit feature calculation and feature learning (hybrid model). Our model achieves close to 80% Overall Accuracy (OA) on dedicated test meshes. Additionally, we compare our results with a default Random Forest (RF) classifier, which performs slightly worse. Besides its slightly better accuracy, the 1D CNN also trains and infers faster than the RF.
ISSN: 2194-9042, 2194-9050
DOI: 10.5194/isprs-annals-IV-2-W7-175-2019
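
The summary describes the hybrid model as computing a handcrafted feature vector for each mesh face and feeding it into a multi-branch 1D CNN. The following is a minimal sketch of that idea, assuming PyTorch; the split into geometric and radiometric branches, the layer sizes, the feature dimensions, and the class count are illustrative assumptions and do not reproduce the authors' exact architecture.

```python
# Sketch: per-face feature vectors classified by a multi-branch 1D CNN.
# Branch layout, dimensions and class count are hypothetical.
import torch
import torch.nn as nn

class MultiBranch1DCNN(nn.Module):
    def __init__(self, geom_dim=24, radio_dim=12, num_classes=8):
        super().__init__()
        # One 1D-convolutional branch per feature group (assumed split).
        self.geom_branch = nn.Sequential(
            nn.Conv1d(1, 16, kernel_size=3, padding=1), nn.ReLU(),
            nn.Conv1d(16, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.AdaptiveMaxPool1d(1),
        )
        self.radio_branch = nn.Sequential(
            nn.Conv1d(1, 16, kernel_size=3, padding=1), nn.ReLU(),
            nn.Conv1d(16, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.AdaptiveMaxPool1d(1),
        )
        # Concatenated branch outputs feed a small per-face classifier.
        self.classifier = nn.Sequential(
            nn.Linear(64, 64), nn.ReLU(), nn.Linear(64, num_classes)
        )

    def forward(self, geom_feats, radio_feats):
        # Inputs: (num_faces, feature_dim) handcrafted feature vectors.
        g = self.geom_branch(geom_feats.unsqueeze(1)).squeeze(-1)    # (N, 32)
        r = self.radio_branch(radio_feats.unsqueeze(1)).squeeze(-1)  # (N, 32)
        return self.classifier(torch.cat([g, r], dim=1))             # (N, classes)

# Usage: predict a semantic class for each of 1024 mesh faces.
model = MultiBranch1DCNN()
logits = model(torch.randn(1024, 24), torch.randn(1024, 12))
labels = logits.argmax(dim=1)  # one class label per face
```

Because the convolution operates on the ordered feature vector rather than on the mesh connectivity itself, this kind of model sidesteps the irregular structure of triangle meshes, which is the point the summary makes about combining explicit feature calculation with feature learning.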