Analysis of accuracy of pedestrian inertial data obtained from camera’s images

2018 IEEE Biennial Congress of Argentina (ARGENCON)

Abstract
Wearable devices have inertial sensors that provide useful information to estimate and predict pedestrian motion and intention, which is of fundamental importance in ITS applications. These devices are usually worn on the limbs, such as the wrists, ankles, and feet, and they provide rotation rate and acceleration information. This information is essential for the successful development of systems capable of inferring pedestrian intentions. Unfortunately, these devices cannot broadcast their information to all vehicles in proximity, and doing so would require every pedestrian to be retrofitted with such a capability. This is the fundamental reason why all existing approaches are based on sensing installed directly in the vehicles. Intelligent vehicles have different types of sensors to perceive the surrounding environment, the most common being cameras. This work demonstrates that camera vision is capable of obtaining pedestrian dynamics with accuracy similar to that of wearable devices. It compares rotation rates and accelerations obtained with wearables worn on pedestrians' wrists with the corresponding information obtained by vision. The vision-based dynamic information is obtained using robust methods that combine a skeleton representation with semantic information. The experimental results presented demonstrate a strong correlation between the wearable-measured and vision-observed rotation rates and accelerations. The outcomes of this work will enable the solution of one of the fundamental issues in pedestrian safety: the inference of intent.
Keywords
inertial data, pedestrian, accuracy, cameras
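The comparison the abstract describes can be illustrated with a minimal sketch: deriving a wrist acceleration signal from a tracked skeleton keypoint by finite differencing and correlating it against an IMU trace. This is not the authors' implementation; the function names, the synthetic data, and the assumptions of calibrated (metric) keypoint positions and a known frame rate are hypothetical.

import numpy as np

def acceleration_from_keypoints(wrist_xy, fps):
    """Second finite difference of a wrist keypoint trajectory.

    wrist_xy: (N, 2) array of wrist positions per frame (assumed metric,
              i.e. after camera calibration / scale recovery).
    fps:      camera frame rate in Hz.
    Returns an (N-2,) array of acceleration magnitudes.
    """
    dt = 1.0 / fps
    vel = np.diff(wrist_xy, axis=0) / dt   # (N-1, 2) per-frame velocities
    acc = np.diff(vel, axis=0) / dt        # (N-2, 2) per-frame accelerations
    return np.linalg.norm(acc, axis=1)     # acceleration magnitude

def correlate_with_imu(vision_acc, imu_acc):
    """Pearson correlation between vision-derived and IMU acceleration
    magnitudes, after resampling the IMU signal to the vision sample count."""
    imu_resampled = np.interp(
        np.linspace(0.0, 1.0, len(vision_acc)),
        np.linspace(0.0, 1.0, len(imu_acc)),
        imu_acc,
    )
    return np.corrcoef(vision_acc, imu_resampled)[0, 1]

if __name__ == "__main__":
    # Synthetic example: a 1 Hz arm swing observed at 30 fps, with the
    # analytical acceleration standing in for the wrist-worn IMU trace.
    fps = 30.0
    t = np.arange(0.0, 5.0, 1.0 / fps)
    wrist_xy = np.stack(
        [0.3 * np.sin(2 * np.pi * t), 0.1 * np.cos(2 * np.pi * t)], axis=1
    )
    vision_acc = acceleration_from_keypoints(wrist_xy, fps)
    imu_acc = np.linalg.norm(
        np.stack(
            [-0.3 * (2 * np.pi) ** 2 * np.sin(2 * np.pi * t),
             -0.1 * (2 * np.pi) ** 2 * np.cos(2 * np.pi * t)], axis=1
        ),
        axis=1,
    )
    print("correlation:", correlate_with_imu(vision_acc, imu_acc))

With clean synthetic data the correlation is close to 1; the paper's contribution is showing that a comparably strong correlation holds for real camera-derived skeletons against real wrist-worn sensors.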