Publication
Wearable Eye Tracking for Multisensor Physical Activity Recognition
Peter Hevesi; Jamie Ward; Orkhan Amiraslanov; Gerald Pirkl; Paul Lukowicz
In: International Journal On Advances in Life Sciences, Vol. 10, No. 1&2, Pages 103-116, IARIA XPS Press, 2018.
Abstract
This paper explores the use of wearable eye-tracking to detect physical activities and location information during assembly and construction tasks involving small groups of up to four people. Large physical activities, like carrying heavy items and walking, are analysed alongside more precise hand-tool activities, like using a drill or a screwdriver. In a first analysis, gaze-invariant features from the eye-tracker are classified (using Naive Bayes) alongside features obtained from wrist-worn accelerometers and microphones. An evaluation is presented using data from an 8-person dataset containing over 600 physical activity events, performed under real-world (noisy) conditions. Despite the challenges of working with complex, and sometimes unreliable, data, we show that event-based precision and recall of 0.66 and 0.81 respectively can be achieved by combining all three sensing modalities (using experiment-independent training, and temporal smoothing). In a further analysis, we apply state-of-the-art computer vision methods like object recognition, scene recognition, and face detection, to generate features from the eye-trackers' egocentric videos. Activity recognition trained on the output of an object recognition model (e.g., VGG16 trained on ImageNet) could predict Precise activities with an (overall average) f-measure of 0.45. The location of participants was similarly obtained using visual scene recognition, with average precision and recall of 0.58 and 0.56.
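The first analysis described above fuses per-event features from three modalities and classifies them with Naive Bayes. A minimal sketch of that pipeline, using synthetic placeholder data and scikit-learn's GaussianNB (the feature dimensions and data here are illustrative assumptions, not the authors' actual dataset or feature set):

```python
# Hedged sketch: early fusion of multimodal features + Naive Bayes,
# loosely following the paper's first analysis. All data is synthetic.
import numpy as np
from sklearn.naive_bayes import GaussianNB
from sklearn.metrics import precision_score, recall_score

rng = np.random.default_rng(0)
n = 200  # number of activity events (placeholder)

# Synthetic per-event feature vectors for each sensing modality
# (dimensions are illustrative assumptions, not from the paper).
eye = rng.normal(size=(n, 4))    # gaze-invariant eye-tracker features
accel = rng.normal(size=(n, 6))  # wrist-worn accelerometer features
audio = rng.normal(size=(n, 3))  # microphone features

# Early fusion: concatenate modality features for each event.
X = np.hstack([eye, accel, audio])

# Two synthetic activity classes, made separable by a mean shift.
y = rng.integers(0, 2, size=n)
X[y == 1] += 1.0

# Train on the first 150 events, evaluate on the remaining 50.
clf = GaussianNB().fit(X[:150], y[:150])
pred = clf.predict(X[150:])
precision = precision_score(y[150:], pred)
recall = recall_score(y[150:], pred)
print("precision:", round(precision, 2), "recall:", round(recall, 2))
```

In the paper itself, evaluation is event-based and uses experiment-independent training with temporal smoothing; this sketch only illustrates the fusion-and-classify step.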