
Publication

SynthoGestures: A Multi-Camera Framework for Generating Synthetic Dynamic Hand Gestures for Enhanced Vehicle Interaction

Amr Gomaa; Robin Zitt; Guillermo Reyes; Antonio Krüger
In: 2024 IEEE Intelligent Vehicles Symposium (IV), June 2-5, 2024, Jeju Island, Republic of Korea, pages 3297-3303, IEEE, June 2024.

Abstract

Dynamic hand gesture recognition is crucial for human-machine interfaces in the automotive domain. However, creating a diverse and comprehensive dataset of hand gestures can be challenging and time-consuming, especially in dynamic dual-task situations like driving. To address these challenges, we propose using synthetic gesture datasets generated from virtual 3D models as an alternative. Our framework synthesizes realistic hand gestures using a combination of 3D models and animation software, in particular Unreal Engine. This approach enables the creation of diverse and customizable gesture datasets, reducing the risk of overfitting and improving the model's generalizability. Specifically, our framework generates natural-looking dynamic hand gestures with multiple variants, including gesture speed, performance style, and hand shape. Moreover, we simulate various camera locations, such as above the driver and behind the wheel, and different camera types, such as RGB, infrared, and depth cameras, without incurring the additional time and cost of acquiring these cameras. Our experiments demonstrate that our proposed framework, SynthoGestures (available at https://github.com/amrgomaaelhady/SynthoGestures), can augment or even replace existing real-hand datasets while further improving gesture recognition accuracy. Our tool for generating synthetic static and dynamic hand gestures saves time and effort in creating large datasets, facilitating the faster development of gesture recognition systems for automotive applications.
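The augmentation idea in the abstract can be sketched as follows: synthetic gesture samples (e.g., rendered from the virtual 3D models) are mixed into a real-hand training set, with the synthetic fraction kept under a chosen cap. This is a minimal illustrative sketch only; the function name `augment_dataset`, the sample tuples, and the ratio parameter are assumptions for illustration, not the paper's actual pipeline.

```python
import random

def augment_dataset(real_samples, synthetic_samples, synthetic_ratio=0.5, seed=42):
    """Mix synthetic samples into a real dataset.

    The number of synthetic samples added is capped so that they make up
    at most `synthetic_ratio` of the combined training set. Names and
    defaults here are hypothetical, not from the paper.
    """
    # Largest synthetic count s such that s / (len(real) + s) <= synthetic_ratio
    max_synthetic = int(len(real_samples) * synthetic_ratio / (1.0 - synthetic_ratio))
    rng = random.Random(seed)
    chosen = rng.sample(synthetic_samples, min(max_synthetic, len(synthetic_samples)))
    combined = list(real_samples) + chosen
    rng.shuffle(combined)  # interleave real and synthetic samples
    return combined

# Toy usage: 100 real samples, a pool of 500 synthetic ones, 50/50 cap.
real = [("real", i) for i in range(100)]
synth = [("synth", i) for i in range(500)]
train = augment_dataset(real, synth, synthetic_ratio=0.5)
```

With a 0.5 ratio and 100 real samples, at most 100 synthetic samples are drawn, yielding a balanced 200-sample training set; lowering the ratio biases training toward real data.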
