
Publication

Learning from the Best: Contrastive Representations Learning Across Sensor Locations for Wearable Activity Recognition

Vitor Fortes Rey; Sungho Suh; Paul Lukowicz
In: ACM International Symposium on Wearable Computers (ISWC 2022), ACM, September 2022.

Abstract

We address a well-known problem in wearable activity recognition: having to work with sensors that are non-optimal in terms of the information they provide but that must be used due to wearability and usability concerns (e.g., the need to rely on wrist-worn IMUs because they are embedded in most smartwatches). To mitigate this problem, we propose a method that exploits information from sensors that are present only during training and unavailable during the later use of the system. The method transfers information from the source sensors to the latent representation of the target sensor data through a contrastive loss that is combined with the classification loss during joint training (Fig. 1). We evaluate the method on the well-known PAMAP2 and Opportunity benchmarks for different combinations of source and target sensors, showing average F1 score improvements (over all activities) of between 5% and 13%; on individual activities particularly well suited to benefit from the additional information, the improvement rises to between 20% and 40%.
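The abstract does not spell out the exact form of the loss, so the following is only an illustrative sketch: it assumes an InfoNCE-style contrastive term that pulls the target-sensor embedding of each time window toward the source-sensor embedding of the same window, added to a standard cross-entropy classification loss. All function names, the temperature, and the weighting factor `alpha` are hypothetical choices, not taken from the paper.

```python
import numpy as np

def contrastive_loss(z_target, z_source, temperature=0.5):
    """InfoNCE-style loss: each target-sensor embedding is pulled toward
    the source-sensor embedding of the same window (the positive pair,
    on the diagonal) and pushed away from other windows in the batch."""
    zt = z_target / np.linalg.norm(z_target, axis=1, keepdims=True)
    zs = z_source / np.linalg.norm(z_source, axis=1, keepdims=True)
    logits = zt @ zs.T / temperature                 # pairwise similarities
    logits -= logits.max(axis=1, keepdims=True)      # numerical stability
    log_probs = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))
    return -np.mean(np.diag(log_probs))              # positives on diagonal

def classification_loss(class_logits, labels):
    """Standard cross-entropy on the target-sensor classifier head."""
    logits = class_logits - class_logits.max(axis=1, keepdims=True)
    log_probs = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))
    return -np.mean(log_probs[np.arange(len(labels)), labels])

def joint_loss(z_target, z_source, class_logits, labels, alpha=1.0):
    """Joint objective: classification loss plus a weighted contrastive
    term that aligns the target latent space with the source sensors."""
    return (classification_loss(class_logits, labels)
            + alpha * contrastive_loss(z_target, z_source))
```

At inference time only the target-sensor encoder and classifier head are used; the source sensors (and hence the contrastive term) are needed only during training.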
