Multimodal Human Activity Recognition for Smart Healthcare Applications

2022 IEEE International Conference on Systems, Man, and Cybernetics (SMC)

Abstract
Human Activity Recognition (HAR) has emerged as a prominent research topic for smart healthcare owing to the rapid growth of wearable and smart devices in recent years. Significant applications of HAR in ambient assisted living environments include monitoring the daily activities of elderly and cognitively impaired individuals and assisting them by tracking their health status. In this research, we present a deep learning-based fusion approach for multimodal HAR that fuses different modalities of data to obtain robust outcomes. Convolutional Neural Networks (CNNs) extract high-level features from the image data, while a Convolutional Long Short-Term Memory (ConvLSTM) network captures significant patterns in the multi-sensor data. The extracted features from the two modalities are then fused through a self-attention mechanism that enhances the relevant activity information and suppresses superfluous, potentially confusing information by measuring their compatibility. Finally, extensive experiments were performed to measure the efficiency and robustness of the developed fusion approach on the UP-Fall detection dataset. The experimental findings show that the proposed fusion technique outperforms existing state-of-the-art methods.
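To make the described pipeline concrete, below is a minimal PyTorch sketch of the fusion idea: a CNN branch for camera frames, a temporal branch for sensor windows, and self-attention over the per-modality embeddings. All layer sizes and module names are illustrative assumptions, as is the Conv1d+LSTM stand-in for ConvLSTM (a 1-D ConvLSTM cell is not part of core PyTorch); this is not the authors' implementation.

```python
# Minimal sketch of the described multimodal fusion pipeline.
# All layer sizes, names, and fusion details are illustrative assumptions.
import torch
import torch.nn as nn


class ImageBranch(nn.Module):
    """Small CNN that extracts high-level features from camera frames."""
    def __init__(self, feat_dim=128):
        super().__init__()
        self.conv = nn.Sequential(
            nn.Conv2d(3, 16, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),
        )
        self.proj = nn.Linear(32, feat_dim)

    def forward(self, x):            # x: (batch, 3, H, W)
        h = self.conv(x).flatten(1)  # (batch, 32)
        return self.proj(h)          # (batch, feat_dim)


class SensorBranch(nn.Module):
    """Temporal encoder for wearable-sensor windows; Conv1d + LSTM is
    used here as a simple approximation of a ConvLSTM."""
    def __init__(self, n_channels=9, feat_dim=128):
        super().__init__()
        self.conv = nn.Conv1d(n_channels, 32, kernel_size=5, padding=2)
        self.lstm = nn.LSTM(32, feat_dim, batch_first=True)

    def forward(self, x):            # x: (batch, time, n_channels)
        h = self.conv(x.transpose(1, 2)).transpose(1, 2)  # (batch, time, 32)
        _, (h_n, _) = self.lstm(h)
        return h_n[-1]               # (batch, feat_dim)


class AttentionFusionHAR(nn.Module):
    """Fuses the two modality embeddings with self-attention,
    then classifies the activity."""
    def __init__(self, feat_dim=128, n_classes=11):
        super().__init__()
        self.image_branch = ImageBranch(feat_dim)
        self.sensor_branch = SensorBranch(feat_dim=feat_dim)
        self.attn = nn.MultiheadAttention(feat_dim, num_heads=4,
                                          batch_first=True)
        self.classifier = nn.Linear(feat_dim, n_classes)

    def forward(self, image, sensors):
        # Stack the per-modality embeddings as a two-token sequence and
        # let self-attention weigh each modality's relevance per sample.
        tokens = torch.stack([self.image_branch(image),
                              self.sensor_branch(sensors)], dim=1)
        fused, _ = self.attn(tokens, tokens, tokens)  # (batch, 2, feat_dim)
        return self.classifier(fused.mean(dim=1))     # (batch, n_classes)


if __name__ == "__main__":
    model = AttentionFusionHAR(n_classes=11)  # UP-Fall defines 11 activities
    img = torch.randn(4, 3, 64, 64)           # dummy camera frames
    sen = torch.randn(4, 50, 9)               # 50 timesteps, 9 sensor channels
    print(model(img, sen).shape)              # torch.Size([4, 11])
```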
Key words
Human Activity Recognition, Convolutional Neural Network, Convolutional Long Short-Term Memory, Self-Attention, Smart Healthcare