Chained fusion of discrete and continuous epipolar geometry with odometry for long-term localization of mobile robots

Control Applications (2011)

Abstract
This paper presents a sensor fusion implementation to improve the accuracy of robot localization by combining multiple visual odometry approaches with wheel and IMU odometry. Discrete and continuous homography matrices are used to recover robot pose and velocity from image sequences of tracked feature points. The camera's limited field of view is addressed by chaining vision-based motion estimates: as feature points leave the field of view, new features are acquired and tracked, and each time a new set of points is needed the motion estimate is reinitialized and chained to the previous state estimate. An extended Kalman filter fuses measurements from the robot's wheel encoders with those from the visual and inertial measurement systems. Time-varying matrices in the extended Kalman filter compensate for known changes in sensor accuracy, including periods when visual features cannot be reliably tracked. Experiments are performed to validate the approach.
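The abstract itself contains no code; the following is a minimal, illustrative Python sketch (not the authors' implementation) of the two mechanisms it describes: chaining vision-based pose increments across feature re-initializations, and an EKF whose measurement covariance is inflated when feature tracking degrades. The class name, the `quality` weighting, and the planar unicycle state [x, y, theta] are assumptions for illustration; recovery of the pose increment from homography decomposition is abstracted away.

```python
"""Minimal sketch of chained visual-odometry fusion with a time-varying-R EKF."""
import numpy as np


def wrap(a):
    """Wrap an angle to (-pi, pi]."""
    return (a + np.pi) % (2.0 * np.pi) - np.pi


class ChainedVisionEKF:
    def __init__(self, x0, P0):
        self.x = np.asarray(x0, dtype=float)   # state [x, y, theta]
        self.P = np.asarray(P0, dtype=float)
        # Anchor pose: filter estimate at the moment the current feature
        # set was initialized; vision reports motion relative to it.
        self.anchor = self.x.copy()

    def predict(self, v, w, dt, Q):
        """Propagate with wheel/IMU odometry (unicycle model)."""
        x, y, th = self.x
        self.x = np.array([x + v * dt * np.cos(th),
                           y + v * dt * np.sin(th),
                           wrap(th + w * dt)])
        F = np.array([[1.0, 0.0, -v * dt * np.sin(th)],
                      [0.0, 1.0,  v * dt * np.cos(th)],
                      [0.0, 0.0,  1.0]])
        self.P = F @ self.P @ F.T + Q

    def reinitialize_features(self):
        """Chain: when a new feature set is acquired, store the current
        estimate as the reference for subsequent vision measurements."""
        self.anchor = self.x.copy()

    def update_vision(self, dpose, R_nominal, quality):
        """Fuse a vision-derived pose increment measured since the anchor.

        `quality` in (0, 1]; low values inflate R to de-weight frames
        where features cannot be tracked reliably (time-varying R).
        """
        ax, ay, ath = self.anchor
        dx, dy, dth = dpose
        # Chained absolute measurement: anchor pose composed with increment.
        z = np.array([ax + np.cos(ath) * dx - np.sin(ath) * dy,
                      ay + np.sin(ath) * dx + np.cos(ath) * dy,
                      wrap(ath + dth)])
        R = np.asarray(R_nominal, dtype=float) / max(quality, 1e-3)
        H = np.eye(3)                      # measurement is the full pose
        innov = z - H @ self.x
        innov[2] = wrap(innov[2])
        S = H @ self.P @ H.T + R
        K = self.P @ H.T @ np.linalg.inv(S)
        self.x = self.x + K @ innov
        self.x[2] = wrap(self.x[2])
        self.P = (np.eye(3) - K @ H) @ self.P


if __name__ == "__main__":
    ekf = ChainedVisionEKF(x0=[0.0, 0.0, 0.0], P0=np.eye(3) * 0.01)
    Q = np.diag([1e-4, 1e-4, 1e-5])
    R = np.diag([1e-3, 1e-3, 1e-4])
    for k in range(100):
        ekf.predict(v=0.5, w=0.1, dt=0.1, Q=Q)           # odometry step
        if k % 25 == 0:
            ekf.reinitialize_features()                   # feature set replaced
        # Synthetic vision increment; degraded quality every 10th frame.
        ekf.update_vision(dpose=[0.05, 0.0, 0.01], R_nominal=R,
                          quality=0.2 if k % 10 == 0 else 1.0)
    print("pose estimate:", ekf.x)
```

In this sketch the chaining is handled by composing each vision-derived increment with the stored anchor pose, so the filter always receives an absolute measurement even after the feature set is replaced, and the scalar `quality` stands in for whatever tracking-reliability signal drives the paper's time-varying filter matrices.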
Keywords
Kalman filters, distance measurement, geometry, image sequences, mobile robots, sensor fusion, continuous epipolar geometry, discrete epipolar geometry, extended Kalman filter, long-term localization, odometry, robot localization, sensor fusion implementation, vision-based motion estimates