
Tightly-coupled Vision-Gyro-Wheel Odometry for Ground Vehicle with Online Extrinsic Calibration

2020 3rd International Conference on Intelligent Autonomous Systems (ICoIAS), 2020

Abstract
In the absence of GPS or other sensors that provide absolute pose, it is difficult to continuously estimate an accurate pose for ground vehicles. Estimation methods that combine visual measurements with inertial data (VINS) can achieve high accuracy and have gradually attracted the attention of researchers. However, it has been proved that VINS has an additional unobservable direction, scale, under constant velocity or constant acceleration. For ground vehicles, such motions occur frequently, which degrades localization performance. Furthermore, accelerometer measurements are strongly affected by noise, and a good estimate of the accelerometer bias is needed for accurate integration. In this paper, we present a method that fuses measurements from a stereo camera, a gyroscope, and a wheel odometer. Compared to an accelerometer, a wheel odometer provides more reliable translation information, and replacing the accelerometer with the wheel odometer removes the additional unobservability of scale. We also prove that, in online optimization, the z-axis component of the camera-to-wheel-odometer extrinsic translation is unobservable under planar motion; hence, an additional constraint on this component is needed. Through real-world experiments in indoor industrial environments, we validate the better performance of the proposed system compared to other state-of-the-art approaches.
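
The key design choice described above is to take translation from the wheel odometer and rotation from the gyroscope instead of integrating accelerometer data. The following Python snippet is a minimal sketch of the planar dead-reckoning propagation that this choice implies; the function name, the midpoint integration scheme, and all numeric values are illustrative assumptions, not the paper's actual formulation, which tightly couples these measurements with stereo vision in an optimization.

import numpy as np

def propagate_planar_pose(x, y, yaw, gyro_z, wheel_speed, dt):
    """Dead-reckon a planar pose using the yaw rate from a gyroscope and the
    forward speed from a wheel odometer (midpoint integration).

    Illustrative sketch only: the paper's method additionally fuses stereo
    vision and estimates extrinsics online, which is not reproduced here.
    """
    yaw_mid = yaw + 0.5 * gyro_z * dt           # heading at the interval midpoint
    x_new = x + wheel_speed * np.cos(yaw_mid) * dt
    y_new = y + wheel_speed * np.sin(yaw_mid) * dt
    yaw_new = yaw + gyro_z * dt
    return x_new, y_new, yaw_new

if __name__ == "__main__":
    # Example: drive a gentle arc at 1 m/s with a constant 0.1 rad/s yaw rate.
    x, y, yaw = 0.0, 0.0, 0.0
    dt = 0.01
    for _ in range(1000):                        # 10 s of motion
        x, y, yaw = propagate_planar_pose(x, y, yaw,
                                          gyro_z=0.1, wheel_speed=1.0, dt=dt)
    print(f"pose after 10 s: x={x:.2f} m, y={y:.2f} m, yaw={yaw:.2f} rad")

Because the wheel odometer measures translation metrically, the scale of this propagation does not depend on excited accelerations, which is the motivation for the sensor substitution stated in the abstract.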
Key words
Simultaneous localization and mapping, sensor fusion, wheel encoder, observability