Publication
Eye Movement in a Controlled Dialogue Setting
David Dembinsky; Ko Watanabe; Andreas Dengel; Shoya Ishimaru
In: Proceedings of the 2024 Symposium on Eye Tracking Research & Applications. Symposium on Eye Tracking Research & Applications (ETRA-2024), June 4-7, Glasgow, United Kingdom, Association for Computing Machinery, 6/2024.
Abstract
Designing realistic eye movements for animated avatars poses a challenge, as gaze behavior is predominantly unconscious. Accurately modeling these movements is crucial to avoiding the Uncanny Valley. The human gaze exhibits different characteristics in conversation, depending on whether a person is speaking or listening. Although these distinctions are known, data for synthesizing eye movement models suitable for avatars are scarce. This research introduces a novel dataset of human gaze behavior during remote screen conversations. The data were collected from 19 participants, offering 4 hours of gaze data labeled as Speaking and Listening. Our data analysis substantiates prior knowledge of gaze behavior while providing new insights through higher precision. Furthermore, we demonstrate the dataset's suitability for machine learning algorithms by training a classifier that achieves 88.1% binary classification accuracy.