

Analyzing Learners’ Emotion from an HRI Experiment Using Facial Expression Recognition Systems

Hae Seon Yun; Heiko Hübert; Johann Chevalère; Niels Pinkwart; Verena V. Hafner; Rebecca Lazarides
In: Learning and Collaboration Technologies - Proceedings of the 25th International Conference on Human-Computer Interaction (HCII), July 23-28, Copenhagen, Denmark, Vol. 14041, pp. 396-407, Springer Nature Switzerland AG, June 2023.

Abstract

In this study, we used a Facial Expression Recognition (FER) system to analyze the facial expressions of participants in an experiment in which 63 students learned about climate change with Betty's Brain, half with robotic agents and the other half with on-screen agents. After reviewing existing offline, open-source FER solutions, we chose HyperExtended LightFace (better known as deepface) to extract emotions from the participants' facial expressions. We then compared the extracted emotions between the two groups to investigate whether human-robot interaction leads to differences in displayed emotion. This first analysis shows that learners who interacted with robotic agents expressed fear, happiness, and neutral emotion more often than learners who interacted with on-screen agents: neutral emotion was detected more in the robotic condition with a large effect size, fear with a moderate effect size, and happiness with a small effect size. Fear and happiness were correlated at 60%, which indicates that the FER system may not distinguish well between these two emotions. Future work entails filtering and validating face-detection results in case of false detections, applying different pre-processing methods to detect faces, and using different FER algorithms.
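The two statistics the abstract reports — effect sizes for the between-group comparison and a correlation between fear and happiness scores — could be computed along the following lines. This is only an illustrative sketch with synthetic numbers, not the study's actual data or pipeline; the group sizes and score values below are invented for the example.

```python
import math
import random

def cohens_d(a, b):
    """Standardized mean difference between two independent samples,
    using the pooled standard deviation."""
    ma, mb = sum(a) / len(a), sum(b) / len(b)
    va = sum((x - ma) ** 2 for x in a) / (len(a) - 1)
    vb = sum((x - mb) ** 2 for x in b) / (len(b) - 1)
    pooled = math.sqrt(((len(a) - 1) * va + (len(b) - 1) * vb)
                       / (len(a) + len(b) - 2))
    return (ma - mb) / pooled

def pearson_r(x, y):
    """Pearson correlation between two equal-length score series."""
    mx, my = sum(x) / len(x), sum(y) / len(y)
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

random.seed(0)
# Hypothetical per-participant proportions of frames labeled "neutral",
# one value per learner in each condition (numbers are made up).
robot = [random.gauss(0.55, 0.10) for _ in range(31)]
screen = [random.gauss(0.45, 0.10) for _ in range(32)]
print(f"Cohen's d (neutral, robot vs. screen): {cohens_d(robot, screen):.2f}")

# Hypothetical per-frame fear and happiness scores from the FER output;
# a strong correlation would suggest the two labels are being conflated.
fear = [random.random() for _ in range(200)]
happy = [0.6 * f + 0.4 * random.random() for f in fear]
print(f"fear-happiness correlation: {pearson_r(fear, happy):.2f}")
```

A conventional reading of Cohen's d (roughly 0.2 small, 0.5 moderate, 0.8 large) matches the abstract's qualitative labels, though the paper itself should be consulted for the exact statistics used.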