Efficient Continuous-Time Ego-Motion Estimation for Asynchronous Event-based Data Associations
CoRR (2024)
Abstract
Event cameras are bio-inspired vision sensors that asynchronously measure
per-pixel brightness changes. The high temporal resolution and asynchronicity
of event cameras offer great potential for estimating the robot motion state.
Recent works have adopted continuous-time ego-motion estimation methods to
exploit this inherent nature of event cameras. However, most of these
methods have poor real-time performance. To address this, a lightweight
Gaussian Process (GP)-based estimation framework is proposed to efficiently
estimate the motion trajectory from asynchronous event-driven data associations.
Concretely, an asynchronous front-end pipeline is designed to adapt
event-driven feature trackers and generate feature trajectories from event
streams; a parallel dynamic sliding-window back-end is presented within the
framework of sparse GP regression on SE(3). Notably, a specially designed state
marginalization strategy is employed to ensure the consistency and sparsity of
this GP regression. Experiments conducted on synthetic and real-world datasets
demonstrate that the proposed method achieves competitive precision and
superior robustness compared to the state of the art. Furthermore,
evaluations on three 60 s trajectories show that the proposed method
outperforms the ISAM2-based method in computational efficiency by factors of
2.64, 4.22, and 11.70, respectively.
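The sparse GP regression underlying the back-end permits querying the continuous-time trajectory at any event timestamp from only the two neighboring estimation states, which is what makes asynchronous event-driven measurements tractable. The paper formulates this on SE(3); as a rough illustration only, the sketch below shows the analogous interpolation in a plain vector space under a standard white-noise-on-acceleration GP prior (the function names and the choice of a scalar position/velocity state are illustrative, not the paper's implementation):

```python
import numpy as np

def transition(dt):
    """State transition of a white-noise-on-acceleration (WNOA) prior,
    per trajectory dimension; the state is [position, velocity]."""
    return np.array([[1.0, dt],
                     [0.0, 1.0]])

def process_noise(dt, qc=1.0):
    """Covariance accumulated by the WNOA prior over an interval dt."""
    return qc * np.array([[dt**3 / 3.0, dt**2 / 2.0],
                          [dt**2 / 2.0, dt]])

def gp_interpolate(x1, x2, t1, t2, tau, qc=1.0):
    """Posterior-mean query of the trajectory at tau in [t1, t2],
    using only the two boundary states x1, x2 (sparse GP interpolation)."""
    Phi_t2_t1 = transition(t2 - t1)
    Phi_tau_t1 = transition(tau - t1)
    Phi_t2_tau = transition(t2 - tau)
    # Interpolation matrices: x(tau) = Lam @ x1 + Psi @ x2
    Psi = process_noise(tau - t1, qc) @ Phi_t2_tau.T \
          @ np.linalg.inv(process_noise(t2 - t1, qc))
    Lam = Phi_tau_t1 - Psi @ Phi_t2_t1
    return Lam @ x1 + Psi @ x2

# Two states consistent with constant velocity 1.0; the GP query at the
# midpoint recovers the expected position and velocity exactly.
x_mid = gp_interpolate(np.array([0.0, 1.0]), np.array([1.0, 1.0]),
                       0.0, 1.0, 0.5)
# → [0.5, 1.0]
```

Because the WNOA prior has a Markovian state, each query touches only two states, keeping the per-event cost constant regardless of window length; this locality is also what the marginalization strategy must preserve to keep the GP regression sparse.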