Real-Time Dense Surface Reconstruction For Aerial Manipulation
2016 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS)
Abstract
With robotic systems reaching considerable maturity in basic self-localization and environment mapping, new research avenues open up, pushing for interaction of a robot with its surroundings for added autonomy. However, the transition from traditionally sparse, feature-based maps to the dense and accurate scene estimation imperative for realistic manipulation is not straightforward. Moreover, achieving this level of scene perception in real time on a computationally constrained, highly shaky and agile platform, such as a small Unmanned Aerial Vehicle (UAV), is perhaps the most challenging scenario for perception for manipulation. Drawing inspiration from otherwise computationally demanding Computer Vision techniques, we present a system combining visual, inertial and depth information to achieve dense, local scene reconstruction of high precision in real time. Our evaluation testbed provides ground truth not only for the pose of the sensor suite, but also for the scene reconstruction, using a highly accurate laser scanner, offering unprecedented comparisons of scene estimation to ground truth using real sensor data. Given the lack of any real, ground-truth datasets for environment reconstruction, our V4RL Dense Surface Reconstruction dataset is publicly available(1).
Keywords
V4RL dense surface reconstruction dataset, ground-truth datasets, scene estimation, local scene reconstruction, computer vision techniques, UAV, unmanned aerial vehicle, agile platform, scene perception, realistic manipulation, sparse feature-based maps, research avenues, environment mapping, basic self-localization, robotic systems, aerial manipulation, real-time dense surface reconstruction