Dynamic LiDAR Re-simulation using Compositional Neural Fields
CoRR (2023)
Abstract
We introduce DyNFL, a novel neural field-based approach for high-fidelity
re-simulation of LiDAR scans in dynamic driving scenes. DyNFL processes LiDAR
measurements from dynamic environments, accompanied by bounding boxes of moving
objects, to construct an editable neural field. This field, comprising
separately reconstructed static backgrounds and dynamic objects, allows users
to modify viewpoints, adjust object positions, and seamlessly add or remove
objects in the re-simulated scene. A key innovation of our method is the neural
field composition technique, which effectively integrates reconstructed neural
assets from various scenes through a ray drop test, accounting for occlusions
and transparent surfaces. Our evaluation with both synthetic and real-world
environments demonstrates that DyNFL substantially improves dynamic scene
simulation based on LiDAR scans, offering a combination of physical fidelity
and flexible editing capabilities.
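The composition step described above can be illustrated with a minimal sketch. The abstract states only that neural assets are merged through a ray drop test that accounts for occlusions; the function below is a hypothetical simplification assuming each field predicts a hit distance and a ray-drop probability per LiDAR ray, with the nearest surviving hit winning. All names and the threshold are illustrative assumptions, not the paper's implementation.

```python
import numpy as np

def composite_lidar_returns(depths, drop_probs, drop_threshold=0.5):
    """Composite per-field LiDAR ray returns into one scan (illustrative sketch).

    Assumption: each neural field predicts, per ray, a hit distance and a
    ray-drop probability. Rays whose drop probability exceeds
    `drop_threshold` return no hit for that field; among the remaining
    fields, the closest surface wins, which models occlusion.

    depths:     (n_fields, n_rays) predicted hit distances
    drop_probs: (n_fields, n_rays) predicted ray-drop probabilities
    Returns (n_rays,) composited depths, np.inf where every field drops the ray.
    """
    depths = np.asarray(depths, dtype=float)
    drop_probs = np.asarray(drop_probs, dtype=float)
    # Push dropped rays to infinity so they never win the minimum.
    masked = np.where(drop_probs < drop_threshold, depths, np.inf)
    # Nearest surviving hit along each ray.
    return masked.min(axis=0)

# Two fields (static background + one dynamic object) over three rays:
static_d = [10.0, 8.0, 12.0]
object_d = [4.0, 9.0, 6.0]
static_p = [0.1, 0.1, 0.9]   # background drops ray 2
object_p = [0.2, 0.8, 0.2]   # object drops ray 1
out = composite_lidar_returns([static_d, object_d], [static_p, object_p])
# out -> [4.0, 8.0, 6.0]
```

Editing operations such as adding or removing an object then amount to including or excluding that object's rows before compositing.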