Anipose: a toolkit for robust markerless 3D pose estimation

bioRxiv (2020)

Cited by 84
Abstract
Quantifying movement is critical for understanding animal behavior. Advances in computer vision now enable markerless tracking from 2D video, but most animals live and move in 3D. Here, we introduce Anipose, a Python toolkit for robust markerless 3D pose estimation. Anipose consists of four components: (1) a 3D calibration module, (2) filters to resolve 2D tracking errors, (3) a triangulation module that integrates temporal and spatial constraints, and (4) a pipeline for processing large numbers of videos. We evaluate Anipose on four datasets: a moving calibration board, fruit flies walking on a treadmill, mice reaching for a pellet, and humans performing various actions. Because Anipose is built on popular 2D tracking methods (e.g., DeepLabCut), users can expand their existing experimental setups to incorporate robust 3D tracking. We hope this open-source software and accompanying tutorials (www.anipose.org) will facilitate the analysis of 3D animal behavior and the biology that underlies it.
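The triangulation step described above can be illustrated with a minimal sketch of multi-view triangulation via the Direct Linear Transform (DLT). This is a generic illustration, not Anipose's actual implementation, which additionally applies temporal and spatial regularization; the function name and synthetic cameras below are hypothetical.

```python
import numpy as np

def triangulate_dlt(proj_mats, points_2d):
    """Triangulate one 3D point from N camera views via the DLT:
    stack the linear constraints x * P[2] - P[0] = 0 and
    y * P[2] - P[1] = 0 for each view, then take the SVD
    null vector as the homogeneous 3D point."""
    rows = []
    for P, (x, y) in zip(proj_mats, points_2d):
        rows.append(x * P[2] - P[0])
        rows.append(y * P[2] - P[1])
    A = np.asarray(rows)
    _, _, vt = np.linalg.svd(A)
    X = vt[-1]
    return X[:3] / X[3]  # dehomogenize

# Two synthetic pinhole cameras (identity intrinsics, second shifted on x)
P1 = np.hstack([np.eye(3), np.zeros((3, 1))])
P2 = np.hstack([np.eye(3), np.array([[-1.0], [0.0], [0.0]])])

# Project a known 3D point into both views, then recover it
X_true = np.array([1.0, 2.0, 10.0, 1.0])
pts = [(P @ X_true)[:2] / (P @ X_true)[2] for P in (P1, P2)]
X_est = triangulate_dlt([P1, P2], pts)
print(X_est)
```

With noise-free projections the DLT recovers the original point exactly; with noisy 2D detections it returns a least-squares estimate, which is where filtering and constraints, as in Anipose, become important.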
Keywords
pose estimation, robust tracking, markerless tracking, behavior, 3D, deep learning