Neural Radiance Field with LiDAR Maps

Download: PDF.

“Neural Radiance Field with LiDAR Maps” by M.-F. Chang, A. Sharma, M. Kaess, and S. Lucey. In Proc. Intl. Conf. on Computer Vision, ICCV, (Paris, France), Oct. 2023.

Abstract

We address outdoor Neural Radiance Fields (NeRF) [23] with real-world camera views and LiDAR maps. Existing methods usually require densely-sampled source views and do not perform well on open-source camera-LiDAR datasets. In this paper, our design leverages 1) LiDAR sensors for strong 3D geometry priors that significantly improve the ray sampling locality, and 2) Conditional Adversarial Networks (cGANs) [15] to recover image details, since aggregating embeddings from imperfect LiDAR maps causes artifacts. Our experiments show that while NeRF baselines produce either noisy or blurry results on Argoverse 2 [42], our system not only outperforms the baselines in image quality metrics under both clean and noisy conditions, but also yields Detectron2 [43] results closer to those on the ground-truth images. Furthermore, the system can be used for data augmentation when training a pose regression network [3] and for multi-season view synthesis. We hope this work serves as a new LiDAR-based NeRF baseline that pushes this research direction forward (released here).
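
To illustrate the idea of using a LiDAR depth prior to improve ray sampling locality, the following Python sketch concentrates sample depths around the LiDAR-measured surface instead of sampling uniformly over the full near-far range. This is a minimal illustration under assumed conventions, not the paper's exact sampling scheme; the function name lidar_guided_samples and the noise parameter sigma are hypothetical.

import numpy as np

def lidar_guided_samples(ray_o, ray_d, lidar_depth, n_samples=32,
                         sigma=0.5, near=0.5, far=100.0, rng=None):
    """Sample points along a ray, concentrated near a LiDAR depth prior.

    Rather than stratified sampling over [near, far], most sample depths
    are drawn from a Gaussian centered at the LiDAR surface depth, which
    tightens the ray sampling locality. sigma (meters) loosely models
    LiDAR map noise; it is an illustrative value, not one from the paper.
    """
    rng = rng or np.random.default_rng()
    # Gaussian depth samples around the LiDAR surface, clipped to range.
    t = rng.normal(loc=lidar_depth, scale=sigma, size=n_samples)
    t = np.clip(np.sort(t), near, far)
    # 3D sample positions along the ray: x = o + t * d.
    pts = ray_o[None, :] + t[:, None] * ray_d[None, :]
    return t, pts

# Example: one ray through a pixel whose LiDAR map depth is 12.3 m.
t_vals, points = lidar_guided_samples(
    ray_o=np.zeros(3), ray_d=np.array([0.0, 0.0, 1.0]), lidar_depth=12.3)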


BibTeX entry:

@inproceedings{Chang23iccv,
   author = {M.-F. Chang and A. Sharma and M. Kaess and S. Lucey},
   title = {Neural Radiance Field with LiDAR Maps},
   booktitle = {Proc. Intl. Conf. on Computer Vision, ICCV},
   address = {Paris, France},
   month = oct,
   year = {2023}
}
Last updated: March 21, 2023