Outdoor real-time RGBD sensor fusion of stereo camera and sparse lidar

Journal of Physics: Conference Series (2022)

Abstract
Autonomous driving relies on high-resolution, high-performance visual sensing. As the two most widely used sensors in the industry, Lidar and stereo cameras play important roles in perception, detection, control, and planning. Lidar currently provides the more accurate depth measurements, but its cost rises sharply with resolution, which prevents large-scale market deployment. Computer-vision depth estimation, in contrast, offers high resolution but much lower accuracy. To address the practical limitations of both sensors, this paper proposes a new fusion method for a stereo camera and a low-resolution Lidar that achieves high resolution, high performance, and low cost. The approach combines a new sensor design, multi-sensor calibration, classification and selection of Lidar point-cloud features, large-scale and efficient stereo matching with depth-map computation, and a point-cloud-segmentation-based method for filling in missing depth. To verify the effectiveness of the method, a high-resolution Lidar was used as ground truth for comparison. The results show that the fusion method improves accuracy by an average of 30% within a close range of 30 meters while achieving 98% resolution. In addition, the paper presents a scheme for visualizing multi-sensor image fusion and packages five modules (multi-sensor calibration, large-scale stereo depth computation, low-resolution Lidar simulation, sensor data fusion, and visualization of the fused image and its error) to facilitate future secondary development.
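The abstract outlines a pipeline of calibrated projection of sparse Lidar into the camera frame, dense stereo depth computation, and fusion of the two depth sources. As a rough illustration only, and not the authors' implementation, the sketch below shows one common way such a fusion could be wired up with OpenCV and NumPy; the function names (stereo_depth, project_lidar, fuse), the local-rescaling fusion rule, and the assumed inputs (rectified images, intrinsics K, focal length fx, baseline, and Lidar points already transformed into the camera frame) are all illustrative assumptions rather than details taken from the paper.

```python
# Minimal sketch of stereo + sparse-lidar depth fusion (illustrative, not the paper's method).
# Assumes a rectified stereo pair, known intrinsics/baseline, and lidar points already
# transformed into the left-camera frame by an extrinsic calibration.

import numpy as np
import cv2


def stereo_depth(left_gray, right_gray, fx, baseline_m, num_disp=128, block=5):
    """Dense depth from semi-global block matching; invalid pixels stay 0."""
    matcher = cv2.StereoSGBM_create(
        minDisparity=0,
        numDisparities=num_disp,          # must be divisible by 16
        blockSize=block,
        P1=8 * block * block,
        P2=32 * block * block,
    )
    disp = matcher.compute(left_gray, right_gray).astype(np.float32) / 16.0
    depth = np.zeros_like(disp)
    valid = disp > 0
    depth[valid] = fx * baseline_m / disp[valid]
    return depth


def project_lidar(points_cam, K, image_shape):
    """Project lidar points (N x 3, camera frame, z forward) into a sparse depth image."""
    h, w = image_shape
    sparse = np.zeros((h, w), dtype=np.float32)
    pts = points_cam[points_cam[:, 2] > 0]
    uv = (K @ pts.T).T
    uv = uv[:, :2] / uv[:, 2:3]
    u, v = np.round(uv[:, 0]).astype(int), np.round(uv[:, 1]).astype(int)
    ok = (u >= 0) & (u < w) & (v >= 0) & (v < h)
    sparse[v[ok], u[ok]] = pts[ok, 2]     # keep lidar range as a depth prior
    return sparse


def fuse(stereo_d, lidar_d, radius=4):
    """Trust lidar where available; locally rescale stereo depth around each lidar sample."""
    fused = stereo_d.copy()
    ys, xs = np.nonzero(lidar_d)
    for y, x in zip(ys, xs):
        s = stereo_d[y, x]
        if s > 0:
            scale = lidar_d[y, x] / s
            y0, y1 = max(0, y - radius), y + radius + 1
            x0, x1 = max(0, x - radius), x + radius + 1
            patch = fused[y0:y1, x0:x1]
            patch[patch > 0] *= scale     # propagate the correction to the neighbourhood
        fused[y, x] = lidar_d[y, x]       # lidar sample overrides stereo at that pixel
    return fused
```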
Keywords
stereo camera, lidar, fusion, real-time