Large-Scale Synthetic Urban Dataset for Aerial Scene Understanding
Published in: | IEEE Access, 2020, Vol.8, p.42131-42140 |
---|---|
Main Authors: | , , |
Format: | Article |
Language: | English |
Subjects: | |
Summary: | Geometric extraction and semantic understanding from a bird's-eye view play an important role in cyber-physical-social systems (CPSS), because they help humans or intelligent agents (IAs) perceive a larger range of the environment. However, due to the lack of a comprehensive dataset from an oblique perspective, deep learning algorithms for this purpose remain largely unexplored. In this paper, we propose a novel method to generate a synthetic large-scale dataset for geometric and semantic urban scene understanding from a bird's-eye view. The process involves two main steps, modeling and rendering, carried out in CityEngine and UnrealEngine4 respectively. In this way, aligned synthetic multi-modal data are obtained efficiently, including spectral images, semantic labels, depth maps, and normal maps. Specifically, terrain elevation, the street graph, building styles, and tree distributions are all randomly generated according to realistic conditions; a small set of handcrafted semantic labels, annotated by colors, is spread throughout the scene; and virtual cameras move along realistic trajectories of unmanned aerial vehicles (UAVs). To evaluate the practicability of our dataset, we manually labeled tens of aerial images downloaded from the internet. The experimental results show that, in both pure and combined modes, the dataset improves performance significantly. |
---|---|
ISSN: | 2169-3536 |
DOI: | 10.1109/ACCESS.2020.2976686 |
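The abstract describes semantic labels annotated by colors. A minimal sketch of how such color-coded label images are typically decoded into class-index maps for training is shown below; the color palette and class names here are assumptions for illustration, since the paper's actual palette is not given in the record.

```python
import numpy as np

# Hypothetical color-to-class palette (NOT the paper's actual mapping).
COLOR_TO_CLASS = {
    (128, 64, 128): 0,   # road (assumed)
    (70, 70, 70): 1,     # building (assumed)
    (107, 142, 35): 2,   # vegetation (assumed)
}

def decode_semantic_label(rgb_label, ignore_index=255):
    """Convert an HxWx3 color-coded label image to an HxW class-index map.

    Pixels whose color is not in the palette are assigned ignore_index.
    """
    h, w, _ = rgb_label.shape
    class_map = np.full((h, w), ignore_index, dtype=np.uint8)
    for color, class_id in COLOR_TO_CLASS.items():
        # Match all pixels whose RGB triple equals this palette entry.
        mask = np.all(rgb_label == np.array(color, dtype=rgb_label.dtype), axis=-1)
        class_map[mask] = class_id
    return class_map

# Tiny synthetic example: a 1x2 image with one road and one building pixel.
img = np.array([[[128, 64, 128], [70, 70, 70]]], dtype=np.uint8)
print(decode_semantic_label(img))  # [[0 1]]
```

The inverse mapping (class index back to color) is useful when visualizing predictions against the rendered scene.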