Towards Enabling Deep Learning-Based Question-Answering for 3D Lidar Point Clouds

Bibliographic Details
Main Authors: Shinde, Rajat C., Durbha, Surya S., Potnis, Abhishek V., Talreja, Pratyush, Singh, Gaganpreet
Format: Conference Proceeding
Language: English
Description
Summary: Remote sensing lidar point cloud datasets embed inherent 3D topological, topographical, and complex geometrical information, which holds immense potential for applications involving machine-understandable 3D perception. Unlike images, lidar point clouds are unstructured and hence challenging to process. In this work, we explore the possibility of deep learning-based question-answering on 3D lidar point clouds. We propose a parallel deep CNN-RNN architecture that learns lidar point cloud features and word embeddings from the questions, and fuses them into a feature mapping for generating answers. We restrict our experiments to the urban domain and present preliminary results for binary (yes/no) question-answering on urban lidar point clouds, using perplexity, edit distance, evaluation loss, and sequence accuracy as performance metrics. To the best of our knowledge, our proposed hypothesis of lidar question-answering is the first such attempt, and we envisage that this novel work could serve as a foundation for using lidar point clouds for enhanced 3D perception in urban environments. We further envisage that the proposed lidar question-answering could be extended to machine comprehension-based applications such as rendering lidar scene descriptions and content-based 3D scene retrieval.
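The fusion idea described in the summary can be sketched as follows. This is an illustrative, untrained toy in numpy, not the authors' implementation: a PointNet-style shared MLP with max-pooling stands in for the CNN branch over lidar points, a mean of word embeddings stands in for the RNN branch over the question, and the two feature vectors are concatenated and mapped to a binary yes/no answer. All weights and shapes here are assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

def encode_point_cloud(points, w):
    # Shared per-point linear layer + ReLU, then max-pooling over points:
    # a permutation-invariant summary of the unstructured point set.
    h = np.maximum(points @ w, 0.0)           # (N, d) per-point features
    return h.max(axis=0)                       # (d,) scene feature

def encode_question(token_ids, embedding):
    # Mean of word embeddings as a simple stand-in for an RNN encoder.
    return embedding[token_ids].mean(axis=0)   # (d,) question feature

def answer_yes_no(points, token_ids, params):
    # Fuse the two branches by concatenation, then a linear readout
    # whose sign decides the binary answer.
    w_pc, emb, w_out = params
    fused = np.concatenate([encode_point_cloud(points, w_pc),
                            encode_question(token_ids, emb)])
    return "yes" if fused @ w_out > 0 else "no"

d = 16
params = (rng.normal(size=(3, d)),       # point-feature weights (x, y, z -> d)
          rng.normal(size=(100, d)),     # word-embedding table (vocab of 100)
          rng.normal(size=2 * d))        # fusion-to-answer weights

cloud = rng.normal(size=(1024, 3))       # synthetic lidar scene (x, y, z)
question = np.array([4, 17, 3])          # synthetic question token ids
print(answer_yes_no(cloud, question, params))  # prints "yes" or "no"
```

In the paper's setting the two encoders would be trained jointly so that the fused representation is predictive of the answer; the sketch only shows the parallel-branch, feature-fusion data flow.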
ISSN: 2153-7003
DOI: 10.1109/IGARSS47720.2021.9553785