
Project | VidGenSense

Duration:

Methods for Generating Synthetic Wearable and Ubiquitous Sensor Training Data

As wearable sensors have become increasingly ubiquitous, one would expect human activity recognition (HAR) systems to move out of the lab into long-term, real-life scenarios. While the availability of sensor data as such is rapidly increasing with the spread of sensor-enabled personal devices, labelling such data remains a challenge. Recently, the idea of using labelled videos to generate "synthetic" sensor data has been proposed as a solution to the HAR labelled-data problem. When developing a new HAR application, it would then no longer be necessary to go through the difficult process of recording large amounts of real sensor data from users actually wearing sensors and annotating that data. Instead, one could first collect videos related to the activities to be recognised from appropriate online sources. If they are not already assigned to the respective activities (which is often the case with online videos), the videos would then be labelled (which, as described above, is much easier than labelling sensor data) and converted to synthetic sensor data that can be used to train the system.
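
To illustrate the core video-to-sensor idea, the following is a minimal sketch of one common approach: extract a 3D body-keypoint trajectory from the video with a pose estimator, double-differentiate it to obtain linear acceleration, then add gravity and noise to approximate an accelerometer reading. This is an assumed simplification for illustration, not the project's actual pipeline; the sinusoidal wrist trajectory in the usage example stands in for real pose-estimator output.

import numpy as np

def synthetic_accel_from_keypoints(positions, fps, noise_std=0.05):
    """positions: (T, 3) trajectory of one body keypoint in metres,
    as produced by any 3D pose estimator run on the labelled video.
    Returns a synthetic 3-axis accelerometer signal of shape (T, 3)."""
    dt = 1.0 / fps
    velocity = np.gradient(positions, dt, axis=0)  # first derivative: velocity
    accel = np.gradient(velocity, dt, axis=0)      # second derivative: acceleration
    accel = accel + np.array([0.0, 0.0, 9.81])     # accelerometers also measure gravity
    accel += np.random.normal(0.0, noise_std, size=accel.shape)  # crude sensor noise
    return accel

# Usage: a 2 s sinusoidal wrist trajectory at 30 fps stands in for real pose output.
t = np.linspace(0.0, 2.0, 60)
wrist = np.stack([0.3 * np.sin(2 * np.pi * t),
                  np.zeros_like(t),
                  np.zeros_like(t)], axis=1)
imu_signal = synthetic_accel_from_keypoints(wrist, fps=30)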

In this project we aim to make a systematic and focused effort to ease access to labelled training data for wearable and ubiquitous HAR systems by:

  - improving the initial methods for the generation of IMU data from online videos;
  - extending the methods to
    - leverage multimodal data (video, sound, textual descriptions),
    - incorporate physical models of the sensor and the relevant activity (see the sketch after this list),
    - explore the use of additional semantic information, either extracted from the videos or given by background knowledge;
  - investigating the transferability of the resulting methods to other HAR sensing modalities such as our HeadSense systems, textile pressure sensor mats, or auditory scene analysis.
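
To make the "physical models of the sensor" item concrete, here is a hypothetical sketch of a simple per-axis error model for a low-cost MEMS accelerometer, covering bias, scale error, white noise, measurement-range saturation, and ADC quantisation. All parameter values are illustrative assumptions, not measured properties of any particular device or of the project's models.

import numpy as np

def apply_imu_error_model(accel, bias=(0.05, -0.03, 0.08),
                          scale=(1.02, 0.98, 1.01),
                          noise_std=0.02, range_g=8.0, bits=12):
    """Degrade an ideal acceleration signal (T, 3, in m/s^2) with a
    simple per-axis error model of a low-cost MEMS accelerometer."""
    g = 9.81
    out = accel * np.asarray(scale) + np.asarray(bias)        # scale error and bias
    out += np.random.normal(0.0, noise_std, size=out.shape)   # additive white noise
    limit = range_g * g
    out = np.clip(out, -limit, limit)                         # range saturation (+/- 8 g)
    lsb = 2.0 * limit / (2 ** bits)                           # ADC quantisation step
    return np.round(out / lsb) * lsb

Such a model could, for example, be applied to the output of the keypoint-based sketch above (apply_imu_error_model(imu_signal)) so that the synthetic training data exhibits the imperfections of the real target sensor.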

Publications related to the project

  1. Unimodal and Multimodal Sensor Fusion for Wearable Activity Recognition

    Hymalai Bello

    In: Ph.D. Forum, IEEE International Conference on Pervasive Computing and Communications (PerCom 2024), March 11-15, Biarritz, France, IEEE, 3/2024.
  2. iMove: Exploring Bio-Impedance Sensing for Fitness Activity Recognition

    M. Liu; V. Rey; Y. Zhang; L. Swarup Ray; B. Zhou; P. Lukowicz

    In: 2024 IEEE International Conference on Pervasive Computing and Communications (PerCom 2024), Los Alamitos, CA, USA, pages 194-205, IEEE Computer Society, 3/2024.

Funding agency

BMBF - Bundesministerium für Bildung und Forschung (German Federal Ministry of Education and Research)

01IW21003
