Dynamic LiDAR Re-simulation using Compositional Neural Fields
CoRR (2023)
Abstract
We introduce DyNFL, a novel neural field-based approach for high-fidelity
re-simulation of LiDAR scans in dynamic driving scenes. DyNFL processes LiDAR
measurements from dynamic environments, accompanied by bounding boxes of moving
objects, to construct an editable neural field. This field, comprising
separately reconstructed static backgrounds and dynamic objects, allows users
to modify viewpoints, adjust object positions, and seamlessly add or remove
objects in the re-simulated scene. A key innovation of our method is the neural
field composition technique, which effectively integrates reconstructed neural
assets from various scenes through a ray drop test, accounting for occlusions
and transparent surfaces. Our evaluation with both synthetic and real-world
environments demonstrates that DyNFL substantially improves dynamic scene
simulation based on LiDAR scans, offering a combination of physical fidelity
and flexible editing capabilities.
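The composition step described above — combining separately reconstructed neural assets per ray while respecting occlusion and ray drop — can be illustrated with a minimal sketch. This is not the paper's implementation; the function name, array layout, and the simplification of one return per asset per ray are all assumptions made for illustration. Each asset's field predicts a hit distance and a drop probability for every ray; dropped returns are discarded, and the closest surviving surface along each ray wins:

```python
import numpy as np

def compose_lidar_returns(depths, drop_probs, rng):
    """Compose per-ray LiDAR returns from several neural assets.

    depths:     (n_assets, n_rays) predicted hit distance per asset,
                np.inf where the asset is not intersected by the ray
    drop_probs: (n_assets, n_rays) probability each return is dropped
                (e.g. a transparent or weakly reflective surface)
    Returns the composed depth per ray (np.inf means no return).
    """
    depths = np.asarray(depths, dtype=float)
    drop_probs = np.asarray(drop_probs, dtype=float)
    # ray drop test: stochastically discard returns per asset
    dropped = rng.random(depths.shape) < drop_probs
    surviving = np.where(dropped, np.inf, depths)
    # occlusion: the nearest surviving surface wins along each ray
    return surviving.min(axis=0)

rng = np.random.default_rng(0)
# one ray: static background at 10 m, inserted object at 4 m
composed = compose_lidar_returns([[10.0], [4.0]], [[0.0], [0.0]], rng)
print(composed)  # the object occludes the background
```

Editing operations then reduce to adding, removing, or transforming rows in `depths` before composition, which is what makes the scene representation editable.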