A Resource-aware Vision-aided Inertial Navigation System for Wearable and Portable Computers

Semantic Scholar (2014)

Abstract
In this paper, we address the problem of deploying Vision-aided Inertial Navigation Systems (VINS) on resource-constrained platforms such as cell phones and wearable computers. In particular, we consider the case of a sliding-window extended Kalman filter (EKF)-based estimator and focus on optimizing its use of the available processing resources. This is achieved by first classifying visual observations based on their feature-track length and then assigning different portions of the CPU budget to processing subsets of the observations belonging to each class. Moreover, we introduce a processing strategy in which "spare" CPU cycles are used for (re)processing all or a subset of the observations corresponding to the same feature, across multiple, overlapping sliding windows. This way, feature observations are used by the estimator more than once to improve the state estimates, while consistency is ensured by marginalizing each feature only once (i.e., when it moves outside the camera's field of view). The ability of the proposed feature classification and processing approach to adjust to the availability of processing resources is demonstrated experimentally on a Samsung S4 cell phone and on Google Glass, where VINS operates in real time while occupying only half of the CPU cycles of one of the ARM processor's cores.
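The classification-and-budgeting idea in the abstract can be sketched as follows. This is a minimal illustrative sketch, not the paper's implementation: the class boundaries, the per-observation cost model, and the budget shares are all hypothetical placeholders standing in for whatever the actual estimator uses.

```python
# Hypothetical sketch of the budgeted feature-selection step the abstract
# describes: feature tracks are bucketed by track length, and each bucket
# receives a fixed share of a per-frame CPU budget. All names, thresholds,
# and costs below are illustrative assumptions, not values from the paper.

def classify_tracks(tracks, short_max=3, medium_max=8):
    """Bucket feature tracks by length (number of frames the feature was seen in)."""
    classes = {"short": [], "medium": [], "long": []}
    for track in tracks:
        n = len(track)
        if n <= short_max:
            classes["short"].append(track)
        elif n <= medium_max:
            classes["medium"].append(track)
        else:
            classes["long"].append(track)
    return classes

def select_within_budget(classes, budget_ms, shares, cost_per_obs_ms=0.05):
    """Greedily pick tracks per class until that class's CPU share is exhausted."""
    selected = []
    for name, share in shares.items():
        class_budget = budget_ms * share
        for track in classes[name]:
            cost = len(track) * cost_per_obs_ms  # assumed linear cost model
            if cost > class_budget:
                break  # this class's share of the budget is used up
            class_budget -= cost
            selected.append(track)
    return selected

if __name__ == "__main__":
    # Each track is a list of (dummy) per-frame observations.
    tracks = [[0] * 2, [0] * 5, [0] * 10, [0] * 4]
    classes = classify_tracks(tracks)
    chosen = select_within_budget(
        classes, budget_ms=1.0,
        shares={"short": 0.2, "medium": 0.3, "long": 0.5},
    )
    print(len(chosen))  # number of tracks that fit in the budget
```

A real system would replace the linear cost model with measured EKF-update timings and tune the shares online, but the control flow (classify, then spend each class's budget) mirrors the strategy the abstract outlines.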