
Evaluation of Different Radar Placements for Food Intake Monitoring Using Deep Learning

2023 IEEE RADAR CONFERENCE, RADARCONF23(2023)

Abstract
Automated food intake monitoring has drawn significant attention due to its potential applications in the healthcare domain. Numerous approaches, including wrist-worn IMU-based and camera-based methods, have emerged to detect food intake activity passively and objectively. Recently, researchers have explored radar for food intake monitoring because of its contactless and privacy-preserving characteristics. In this study, we deploy a Frequency Modulated Continuous Wave (FMCW) radar in three different positions (front, side, and overhead) to investigate the performance of each position in automated eating gesture detection. Fifteen participants are recruited to have three meals each (45 meals, 641 min in total), with the radar deployed in a different position for each meal. A 3D Temporal Convolutional Network (3D-TCN) is used to process the range-Doppler cube (RD cube) of each recording. Leave-One-Subject-Out (LOSO) validation shows that placing the radar in the front position obtains the best performance, with segmental F1-scores of 0.786 and 0.825 for eating and drinking gestures, respectively.
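The LOSO validation mentioned above evaluates generalization to unseen people: each participant's meals are held out in turn while the model trains on everyone else. A minimal sketch of such a splitter (the function name and data layout are illustrative, not taken from the paper):

```python
def loso_splits(subject_ids):
    """Leave-One-Subject-Out cross-validation splitter.

    Given one subject label per sample, yield (held_out, train_idx, test_idx)
    tuples, where test_idx covers exactly the held-out subject's samples and
    train_idx covers everyone else's. Each subject is held out once.
    """
    for held_out in sorted(set(subject_ids)):
        train_idx = [i for i, s in enumerate(subject_ids) if s != held_out]
        test_idx = [i for i, s in enumerate(subject_ids) if s == held_out]
        yield held_out, train_idx, test_idx
```

With 15 participants and three meals each, this yields 15 folds; per-fold scores are then aggregated (e.g. averaged) into the reported segmental F1-scores.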
Key words
Eating gesture detection, food intake monitoring, FMCW radar, human activity recognition, deep learning