

SensCogAR: Cognitive Load Estimation Via Movement Data in Assembly Tasks

Javier Melo; Leyla Akinci; Ko Watanabe; Nicolas Großmann; Shoya Ishimaru; Andreas Dengel
In: International Journal of Activity and Behavior Computing (IJABC), Vol. 2026, No. 1, Pages 1-20, 2026.

Abstract

Understanding cognitive load in Augmented Reality (AR) applications has become increasingly relevant, especially within assembly instruction systems. To address this challenge, we designed an experiment around a task that shares key cognitive and motor characteristics with industrial assembly processes and can be manipulated to induce low and high levels of cognitive load. Participants completed tangram puzzles across two sessions representing these load levels. We collected physiological and movement data from wearable devices, including the Microsoft HoloLens 2 and Empatica E4, alongside NASA Task Load Index (NASA–TLX) scores and task completion times. Machine learning models trained on movement data (head, hand, and eye tracking from the HoloLens) achieved the highest classification F1 score of 0.886, outperforming models trained on combined sensor data or on physiological data alone. NASA–TLX scores and task completion times validated the experimental manipulation of cognitive load. Our findings provide evidence that movement data captured via AR headsets can effectively detect cognitive load, with implications for adaptive AR assembly systems.
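The evaluation described above is a binary classification task (low vs. high cognitive load) scored with F1. The following is a minimal illustrative sketch of such a pipeline, not the authors' implementation: the feature windows, their dimensionality, and the random-forest classifier are all assumptions, and the data here is synthetic.

```python
# Illustrative sketch (assumptions, not the paper's pipeline): classify
# low vs. high cognitive load from windowed movement features and score
# the result with F1, as in the reported evaluation.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import f1_score

rng = np.random.default_rng(0)

# Hypothetical per-window features (e.g. mean head-rotation speed, hand
# path length, fixation duration); 6 features is an arbitrary choice.
n = 400
X_low = rng.normal(0.0, 1.0, size=(n // 2, 6))    # low-load windows
X_high = rng.normal(0.8, 1.2, size=(n // 2, 6))   # high load shifts features
X = np.vstack([X_low, X_high])
y = np.array([0] * (n // 2) + [1] * (n // 2))     # 0 = low, 1 = high load

X_tr, X_te, y_tr, y_te = train_test_split(
    X, y, test_size=0.25, random_state=0, stratify=y)
clf = RandomForestClassifier(n_estimators=100, random_state=0)
clf.fit(X_tr, y_tr)
f1 = f1_score(y_te, clf.predict(X_te))
print(f"F1 = {f1:.3f}")
```

In practice the paper's models would be trained on real HoloLens head, hand, and eye-tracking streams, with a subject-aware split rather than a random one.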
