Precise localization of the mobile wheeled robot using sensor fusion of odometry, visual artificial landmarks and inertial sensors.

Robotics and Autonomous Systems (2019)

Abstract
This article proposes a method for sensor fusion of odometers, a gyroscope, an accelerometer, a magnetometer, and a visual landmark localization system. The method is designed to estimate all 6 degrees of freedom (both translation and attitude) of a wheeled robot moving over uneven terrain. The fusion is based on continuous estimation of the mean square error of each estimated value and allows each sensor to operate at a different sampling rate. Owing to its simple implementation, it is suitable for real-time processing on low-cost hardware. To evaluate the precision of the estimated position, stochastic models of the sensors (with parameters matching real hardware sensors) were used and random trajectories were simulated. The virtual experiments showed that the method is resistant to the failure of any sensor except the odometers; however, each sensor improves the resultant precision.
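The abstract describes fusion weighted by the continuously estimated mean square error of each source. The sketch below illustrates that general principle only, not the paper's actual algorithm: two independent estimates of the same quantity are combined with weights inversely proportional to their variances, which is the standard minimum-MSE fusion rule. The function name and the numeric values are hypothetical.

```python
def fuse(estimate_a, var_a, estimate_b, var_b):
    """Fuse two independent estimates of the same quantity by weighting
    each with the inverse of its mean square error (variance).

    Returns the fused estimate and its variance, which is never larger
    than the smaller of the two input variances.
    """
    w_a = var_b / (var_a + var_b)   # weight of estimate a
    w_b = var_a / (var_a + var_b)   # weight of estimate b
    fused = w_a * estimate_a + w_b * estimate_b
    fused_var = (var_a * var_b) / (var_a + var_b)
    return fused, fused_var


# Hypothetical example: odometry gives x = 1.02 m (variance 0.04 m^2),
# a landmark observation gives x = 0.95 m (variance 0.01 m^2).
x, v = fuse(1.02, 0.04, 0.95, 0.01)
print(f"fused x = {x:.3f} m, variance = {v:.4f} m^2")
```

Because the weights depend only on the current variance estimates, such a rule can accept measurements arriving asynchronously at different rates, which is consistent with the multi-rate fusion the abstract mentions.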
Keywords
Localization, Sensor fusion, Odometry, Landmarks, Inertial sensors, Mean square error