Real-Time Sketching of Harshly Lit Driving Environment Perception by Neuromorphic Sensing

IEEE Transactions on Intelligent Vehicles (2024)

Abstract
Visual sensors are indispensable for autonomous vehicles to achieve comprehensive environmental perception for navigation, but their deteriorated performance under harsh illumination largely sets back the practical use of autonomous driving technologies. A promising solution is the bio-inspired event sensor, which asynchronously records intensity changes with high sensitivity, fast response, and large dynamic range, assisting the situational awareness of moving vehicles in harshly lit scenarios. However, the output of event sensors is noisy and sparse, owing to either severe photon starvation or limited acquisition bandwidth. In this paper, we propose an approach for real-time sketching of the harshly lit driving environment (RIDE), which outlines the driving surroundings from noisy, sporadic measurements. We address the associated challenges as follows: (i) map the raw event signals into a low-dimensional space and cluster the features to depict the spatio-temporal correlation within raw events; (ii) design a general inference network that constructs continuous motion fields of the scene from the encoded features of noisy, sporadic raw measurements; (iii) construct pseudo-ground-truth via unsupervised motion compensation as labels for training the above network, achieving real-time inference. Our approach is experimentally validated on real traffic data and displays high-fidelity perception capability in extremely dark scenes and scenarios with high dynamic range. We also investigate RIDE's effectiveness in a downstream task, the detection of traffic participants. In a nutshell, the proposed RIDE provides high-fidelity sensing of harshly lit environments and lays the foundation for all-day visual navigation of autonomous vehicles.
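
The abstract does not spell out how the pseudo-ground-truth of step (iii) is produced. As a rough, hedged illustration of unsupervised motion compensation in general (the standard contrast-maximization idea, not the authors' specific implementation), the sketch below warps events along candidate flow vectors and keeps the flow that yields the sharpest image of warped events. All names (warp_events, estimate_flow) and the brute-force grid search are assumptions made for illustration.

import numpy as np

def warp_events(xs, ys, ts, flow, t_ref, shape):
    """Warp events to time t_ref along a candidate flow (vx, vy)
    and accumulate them into an image of warped events (IWE)."""
    vx, vy = flow
    wx = np.round(xs - vx * (ts - t_ref)).astype(int)
    wy = np.round(ys - vy * (ts - t_ref)).astype(int)
    valid = (wx >= 0) & (wx < shape[1]) & (wy >= 0) & (wy < shape[0])
    iwe = np.zeros(shape, dtype=np.float64)
    np.add.at(iwe, (wy[valid], wx[valid]), 1.0)  # count events per pixel
    return iwe

def contrast(iwe):
    """Variance of the IWE: well-compensated motion gives a sharper,
    higher-contrast image."""
    return np.var(iwe)

def estimate_flow(xs, ys, ts, shape, candidates):
    """Grid-search candidate flows and return the one maximizing contrast."""
    t_ref = ts.min()
    best_flow, best_score = None, -np.inf
    for flow in candidates:
        score = contrast(warp_events(xs, ys, ts, flow, t_ref, shape))
        if score > best_score:
            best_flow, best_score = flow, score
    return best_flow, best_score

# Toy usage: synthetic events from a point moving ~20 px/s horizontally.
rng = np.random.default_rng(0)
ts = np.sort(rng.uniform(0.0, 1.0, 500))
xs = 10.0 + 20.0 * ts + rng.normal(0, 0.3, 500)
ys = np.full(500, 32.0) + rng.normal(0, 0.3, 500)
cands = [(vx, 0.0) for vx in np.linspace(0.0, 40.0, 41)]
flow, _ = estimate_flow(xs, ys, ts, (64, 64), cands)  # expected near (20, 0)

In a full pipeline such as the one described here, the compensated flow recovered this way could serve as the training label for a fast inference network, so that the costly optimization is only needed offline.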
Key words
Harshly lit driving perception, neuromorphic sensing, autonomous driving, computer vision, robot vision