
A Comprehensive Multisensor Dataset Employing RGBD Camera, Inertial Sensor and Web Camera

2019 20th Asia-Pacific Network Operations and Management Symposium (APNOMS)

Abstract
Over the past decades, fitness activities and extreme endurance events have been expanding throughout the world. The number of publicly available skeletal repositories and recognition/evaluation benchmarks has grown rapidly since Microsoft released the Kinect motion-sensing device. Kinect RGBD data has become a very useful representation of an indoor scene for activity/fitness recognition problems. Another sensor widely used in this area is the wearable inertial measurement unit (IMU). With numerous advanced sensors reaching mass adoption, this technology offers a possible path beyond current activity recognition and evaluation solutions. Nevertheless, only a limited number of publicly available datasets capture depth camera, inertial sensor, and RGB image data at the same time. In this paper, we introduce NCTU-MFD (National Chiao Tung University Multisensor Fitness Dataset), a comprehensive, diverse multisensor dataset collected using a Kinect RGBD sensor, wearable inertial sensors, and web cameras. The dataset contains 47,131 RGB images, 47,131 depth images, and 100 CSV files holding 47,131 skeleton frames (25 joints each) collected from the Kinect sensor. In addition, the dataset contains acceleration and gyroscope data from the IMU sensors, and 94,262 RGB images (47,131 from each of the two web cameras). To demonstrate a possible use of the dataset, we conduct an experiment on the evaluation of depth maps.
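
The abstract describes the dataset's modalities (Kinect RGB, depth, 25-joint skeletons, IMU acceleration/gyroscope, two web cameras) but not its on-disk layout. The sketch below shows how one synchronized frame of such a multisensor dataset might be loaded in Python; every directory and file name (kinect_rgb, kinect_depth, skeleton.csv, imu.csv, webcam1, webcam2) is an assumption for illustration, not the paper's actual format.

# Hypothetical loader for one synchronized frame of a multisensor fitness
# dataset such as NCTU-MFD. Directory and file names are assumptions; the
# paper does not specify the on-disk organization.
from pathlib import Path

import cv2          # pip install opencv-python
import pandas as pd


def load_frame(root: Path, frame_id: int):
    """Return Kinect RGB, depth, skeleton row, IMU row, and web-camera images."""
    rgb = cv2.imread(str(root / "kinect_rgb" / f"{frame_id:06d}.png"))
    # Depth maps are commonly stored as 16-bit PNGs; read them unchanged.
    depth = cv2.imread(str(root / "kinect_depth" / f"{frame_id:06d}.png"),
                       cv2.IMREAD_UNCHANGED)
    # Assumed skeleton CSV: one row per frame, 25 joints x (x, y, z) columns.
    skeleton = pd.read_csv(root / "skeleton.csv").iloc[frame_id]
    # Assumed IMU CSV: per-frame acceleration and gyroscope channels.
    imu = pd.read_csv(root / "imu.csv").iloc[frame_id]
    # Two web cameras (94,262 images = 2 x 47,131).
    webcams = [cv2.imread(str(root / f"webcam{i}" / f"{frame_id:06d}.jpg"))
               for i in (1, 2)]
    return rgb, depth, skeleton, imu, webcams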
Key words
Wearable sensors, Kinect, web camera, fitness, dataset