An Event-based Stereo 3D Mapping and Tracking Pipeline for Autonomous Vehicles.

Anass El Moudni, Fabio Morbidi, Sébastien Kramm, Rémi Boutteau

2023 IEEE 26th International Conference on Intelligent Transportation Systems (ITSC), 2023

Abstract
Event cameras are bio-inspired, motion-activated sensors that generate asynchronous events instead of intensity images at a fixed rate. These sensors have been shown to outperform traditional frame-based cameras by large margins in the case of high-speed motion and scenes with high dynamic range. Next-generation intelligent vehicles are expected to greatly benefit from these novel cameras, especially in adverse lighting conditions, and their potential is still largely untapped. In the last decade, the continuous stream of events produced by an event camera has been exploited in numerous 3D perception tasks (depth estimation, 6-DoF tracking, visual-inertial odometry, etc.). In this paper, we propose an event-based stereo pipeline for simultaneous 3D mapping and tracking. The mapping module relies on DSI (Disparity Space Image) fusion, and the tracking module makes use of time surfaces as anisotropic distance fields to estimate the pose of the stereo camera. Numerical experiments with a publicly available event dataset, recorded by a car in different urban environments, show the effectiveness of the proposed architecture.
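The time surfaces mentioned in the abstract are a standard representation in the event-camera literature: each pixel stores an exponentially decayed function of the timestamp of the most recent event at that location, so recently active pixels (typically edges) stand out. The following is a minimal sketch of this construction, not the authors' implementation; the event format `(t, x, y, polarity)` and the decay constant `tau` are illustrative assumptions.

```python
import numpy as np

def time_surface(events, height, width, t_ref, tau=0.05):
    """Build a time surface at reference time t_ref.

    events: iterable of (t, x, y, polarity) tuples with t in seconds
            (format is an assumption for illustration).
    tau:    exponential decay constant in seconds (assumed value).
    Returns an (height, width) array in [0, 1]; pixels with more
    recent events are closer to 1, never-fired pixels are 0.
    """
    # Per-pixel timestamp of the most recent event (-inf if none).
    last_t = np.full((height, width), -np.inf)
    for t, x, y, p in events:
        if t <= t_ref:
            last_t[y, x] = max(last_t[y, x], t)
    # Exponential decay of event recency; exp(-inf) evaluates to 0,
    # so pixels that never fired naturally map to 0.
    return np.exp(-(t_ref - last_t) / tau)
```

In tracking pipelines such surfaces are often treated as smooth distance-like fields: the decayed values fall off away from the most recent edges, which gives a differentiable quantity to align the projected map against when estimating camera pose.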
Key words
Event camera, Stereo depth estimation, Visual odometry, Disparity space image, Intelligent vehicle