
Publication

Concentration Estimation in Online Video Lecture Using Multimodal Sensors

Noriyuki Tanaka; Ko Watanabe; Shoya Ishimaru; Andreas Dengel; Shingo Ata; Manato Fujimoto
In: Companion of the 2024 ACM International Joint Conference on Pervasive and Ubiquitous Computing (UbiComp Companion '24). International Joint Conference on Pervasive and Ubiquitous Computing (UbiComp-2024), October 5-9, Melbourne, Australia, ISBN 979-8-4007-1058-2/24/10, Association for Computing Machinery, 10/2024.

Abstract

Online video lectures are one of the key technological challenges in education. They offer the advantage of letting anyone participate from anywhere in the world, but understanding students' concentration remotely remains difficult. In this paper, we evaluate multimodal sensors for estimating students' concentration levels during online video lectures. We collected multimodal sensor data, including accelerometer, gyroscope, heart rate, facial orientation, and eye gaze readings, in experiments with 13 university students in Japan. The results of our study, with an average accuracy of 74.4% for user-dependent cross-validation and 66.3% for user-independent cross-validation, have significant implications for understanding and improving student engagement in online learning environments. Most interestingly, we found that facial orientation is the most informative modality for user-dependent classification and eye gaze for user-independent classification.
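The distinction between the two evaluation schemes in the abstract is worth making concrete: user-dependent cross-validation splits each participant's data across training and test sets, while user-independent (leave-one-user-out) cross-validation holds out all of one participant's data at a time. A minimal sketch of the two split strategies, using hypothetical user IDs and window indices (not data from the paper), might look like this:

```python
# Hypothetical sketch contrasting user-dependent and user-independent
# cross-validation splits; user IDs and window counts are illustrative.
from itertools import groupby

# Each sample: (user_id, feature_window_index). Labels omitted for brevity.
samples = [(u, w) for u in ["u1", "u2", "u3"] for w in range(4)]

def user_independent_folds(samples):
    """Leave-one-user-out: each fold tests on one user's data and
    trains on everyone else's, so no user appears in both sets."""
    users = sorted({u for u, _ in samples})
    for held_out in users:
        train = [s for s in samples if s[0] != held_out]
        test = [s for s in samples if s[0] == held_out]
        yield train, test

def user_dependent_folds(samples, k=2):
    """Within-user k-fold: each user's windows are split across the
    train and test sets, so the model has seen every user."""
    folds = [([], []) for _ in range(k)]
    for user, group in groupby(sorted(samples), key=lambda s: s[0]):
        for i, s in enumerate(group):
            for f in range(k):
                (folds[f][1] if i % k == f else folds[f][0]).append(s)
    return folds

# User-independent: the held-out user never appears in training.
train, test = next(user_independent_folds(samples))
assert {u for u, _ in train}.isdisjoint({u for u, _ in test})
```

User-independent evaluation is the harder setting because the model must generalize to an unseen person, which is consistent with the lower accuracy (66.3% vs. 74.4%) reported above.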
