
Publication

Unimodal and Multimodal Sensor Fusion for Wearable Activity Recognition

Hymalai Bello
In: Ph.D. Forum, 22nd IEEE International Conference on Pervasive Computing and Communications (PerCom 2024), March 11-15, 2024, Biarritz, France. IEEE, March 2024.

Abstract

Combining different sensing modalities at multiple body positions helps form a unified perception and understanding of complex situations such as human behavior. Human activity recognition (HAR) therefore benefits from fusing redundant and complementary information, whether unimodal or multimodal. Even so, this is not an easy task: it requires a multidisciplinary approach, including expertise in sensor technologies, signal processing, data fusion algorithms, and domain-specific knowledge. This Ph.D. work employs sensing modalities such as inertial, pressure (audio and atmospheric pressure), and textile capacitive sensing for HAR. The scenarios explored are gesture and hand position tracking, facial and head pattern recognition, and body posture and gesture recognition. The selected wearable devices and sensing modalities are fully integrated with machine learning-based algorithms, some of which run on the embedded device at the edge and are tested in real time.
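
The feature-level fusion idea the abstract describes can be illustrated with a minimal, hypothetical sketch: per-window statistical features are extracted from each modality (here, a 6-channel IMU and a 1-channel pressure sensor), concatenated into one vector, and fed to a lightweight classifier. All data, channel counts, and window sizes below are illustrative assumptions, not details taken from the publication.

```python
# Minimal sketch of feature-level (early) fusion for HAR, assuming
# synchronized, windowed streams from an IMU and a pressure sensor.
# Synthetic data throughout; names and sizes are illustrative only.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(0)

N_WINDOWS = 400   # number of sliding windows
WIN_LEN = 100     # samples per window (e.g., 1 s at 100 Hz)
N_CLASSES = 3     # e.g., three gestures

# Synthetic raw windows: 6 IMU channels (accel + gyro), 1 pressure channel.
imu = rng.normal(size=(N_WINDOWS, WIN_LEN, 6))
pressure = rng.normal(size=(N_WINDOWS, WIN_LEN, 1))
labels = rng.integers(0, N_CLASSES, size=N_WINDOWS)

def window_features(x):
    """Simple per-channel statistics (mean, std, min, max) per window."""
    return np.concatenate(
        [x.mean(axis=1), x.std(axis=1), x.min(axis=1), x.max(axis=1)],
        axis=1,
    )

# Early fusion: concatenate the per-modality feature vectors.
features = np.concatenate(
    [window_features(imu), window_features(pressure)], axis=1
)

X_train, X_test, y_train, y_test = train_test_split(
    features, labels, test_size=0.25, random_state=0
)
clf = RandomForestClassifier(n_estimators=100, random_state=0)
clf.fit(X_train, y_train)
print("accuracy:", accuracy_score(y_test, clf.predict(X_test)))
```

A compact feature-plus-classifier pipeline like this is one common way to keep the model small enough to run on an embedded device at the edge; the thesis itself may use different features, fusion stages, or models.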
