Robust Target Recognition and Tracking of Self-Driving Cars With Radar and Camera Information Fusion Under Severe Weather Conditions

Ze Liu, Yingfeng Cai, Hai Wang, Long Chen, Hongbo Gao, Yunyi Jia, Yicheng Li

IEEE Transactions on Intelligent Transportation Systems (2022)

Abstract
Radar and camera information fusion sensing methods are used to overcome the inherent shortcomings of single sensors in severe weather. Our fusion scheme uses radar as the primary sensor and the camera as the auxiliary sensor. The Mahalanobis distance is used to match observations across the target sequences, and data fusion is performed with a joint probability function method. The algorithm was tested on real sensor data collected from a vehicle while performing real-time environment perception. The test results show that the radar-camera fusion algorithm outperforms single-sensor environmental perception in severe weather and effectively reduces the missed detection rate of autonomous vehicle environment perception under such conditions. The fusion algorithm improves the robustness of the environment perception system and provides accurate environment perception information to the decision-making and control systems of autonomous vehicles.
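The abstract only sketches the matching step. The snippet below is a minimal illustration, not the authors' implementation, of how Mahalanobis-distance gating between radar track predictions and camera detections might be set up; all names, the measurement dimensionality, and the gate threshold are assumptions made for illustration.

```python
# Minimal sketch (assumed, not the paper's code): associate camera detections
# with radar tracks by Mahalanobis distance, then solve the assignment with
# the Hungarian algorithm.
import numpy as np
from scipy.optimize import linear_sum_assignment

GATE_THRESHOLD = 9.21  # chi-square 99% gate for a 2-D measurement (assumed value)

def mahalanobis_sq(z, x_pred, S):
    """Squared Mahalanobis distance between a camera measurement z and a
    radar track prediction x_pred with innovation covariance S."""
    innov = z - x_pred
    return float(innov.T @ np.linalg.inv(S) @ innov)

def associate(camera_dets, radar_preds, radar_covs):
    """Return (track_idx, detection_idx) pairs whose distance passes the gate.

    camera_dets : list of 2-D measurement vectors (e.g. positions in a common frame)
    radar_preds : list of 2-D predicted positions from radar tracks
    radar_covs  : list of 2x2 innovation covariance matrices
    """
    cost = np.full((len(radar_preds), len(camera_dets)), 1e6)
    for i, (x_pred, S) in enumerate(zip(radar_preds, radar_covs)):
        for j, z in enumerate(camera_dets):
            d2 = mahalanobis_sq(z, x_pred, S)
            if d2 < GATE_THRESHOLD:          # gating: reject implausible pairs
                cost[i, j] = d2
    rows, cols = linear_sum_assignment(cost)  # globally optimal assignment
    return [(i, j) for i, j in zip(rows, cols) if cost[i, j] < GATE_THRESHOLD]
```

In a full pipeline, the gated track-detection pairs would then feed the joint-probability fusion step described in the paper to produce the fused object estimates.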
Key words
Multi-sensor fusion, radar camera fusion, severe weather conditions, self-driving cars