LiDeNeRF: Neural radiance field reconstruction with depth prior provided by LiDAR point cloud
Published in: ISPRS Journal of Photogrammetry and Remote Sensing, 2024-02, Vol. 208, pp. 296–307
Main Authors:
Format: Article
Language: English
Summary: Neural Radiance Fields (NeRF) is a technique for reconstructing real-world scenes from multiple views. However, existing methods mostly focus on the visual quality of scene reconstruction while neglecting geometric accuracy, which is crucial in photogrammetry and remote sensing. In this paper, we propose a method called LiDeNeRF, which uses a LiDAR point cloud to provide depth priors for NeRF reconstruction. The goal of LiDeNeRF is to achieve real-time rendering of NeRF scenes and 3D reconstruction results with high geometric accuracy. In this method, first, the LiDAR point cloud is projected onto the images to generate a sparse depth map. Then, by triangulating the sparse depth and applying a multi-view image depth propagation method, a dense, highly accurate depth map is quickly generated as the depth prior for NeRF. Finally, a new depth correction module is designed and embedded into the NeRF rendering pipeline to improve the accuracy of scene depth estimation. The experimental results show that our method achieves state-of-the-art performance in both novel view synthesis and 3D reconstruction tasks. Our source code is available at https://github.com/WPC-WHU/LiDeNeRF.
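The first stage described in the abstract, projecting the LiDAR point cloud onto the images to obtain a sparse depth map, can be sketched with a standard pinhole camera model. This is a minimal NumPy illustration rather than the paper's implementation; the function name, the world-to-camera convention (rotation R, translation t), and the nearest-return rule for pixels hit by several points are all assumptions.

```python
import numpy as np

def project_lidar_to_depth(points, K, R, t, h, w):
    """Project LiDAR points (N, 3, world frame) into a sparse depth map.

    K: 3x3 camera intrinsics; R, t: world-to-camera rotation/translation.
    Returns an (h, w) array where 0 marks pixels with no LiDAR return.
    """
    cam = points @ R.T + t                 # world -> camera coordinates
    z = cam[:, 2]
    valid = z > 1e-6                       # keep points in front of the camera
    cam, z = cam[valid], z[valid]
    uv = cam @ K.T                         # perspective projection (homogeneous)
    u = np.round(uv[:, 0] / z).astype(int)
    v = np.round(uv[:, 1] / z).astype(int)
    inside = (u >= 0) & (u < w) & (v >= 0) & (v < h)
    u, v, z = u[inside], v[inside], z[inside]
    depth = np.zeros((h, w))
    # When several points land on one pixel, keep the nearest return:
    # write far points first so near points overwrite them.
    order = np.argsort(-z)
    depth[v[order], u[order]] = z[order]
    return depth
```

In practice, occluded points would also need to be filtered before use; the resulting sparse map is exactly what the paper's triangulation and multi-view propagation steps then densify into the depth prior.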
ISSN: 0924-2716; 1872-8235
DOI: 10.1016/j.isprsjprs.2024.01.017