Publication

Human-Robot Interaction Through Egocentric Hand Gesture Recognition

Snehal Walunj; Nazanin Mashhaditafreshi; Parsha Pahlevannejad; Achim Wagner; Martin Ruskowski
In: Kosmas Alexopoulos; Sotiris Makris; Panagiotis Stavropoulos (Eds.). Advances in Artificial Intelligence in Manufacturing II. European Symposium on Artificial Intelligence in Manufacturing (ESAIM-2024), Cham, Pages 125-133, ISBN 978-3-031-86489-6, Springer Nature Switzerland, 2025.

Abstract

Recognition of human hand gestures in industrial environments is gaining popularity, especially in the context of assistance systems, thanks to advances in deep learning-based vision methods. Head-worn camera devices are also becoming more common, particularly for smart assistance using Extended Reality (XR) technology, including in industrial use cases. Employing the sensors of head-worn devices such as the HoloLens enhances communication between human and robot by enabling interaction through egocentric vision. This study investigates human-robot interaction via egocentric hand gesture recognition for commanding robots. A pipeline is developed to collect HoloLens video frames and to detect hand landmarks on them using Google's MediaPipe library. A Long Short-Term Memory (LSTM) network was then developed that classifies the hand gesture from the detected landmarks in near real-time, so that the recognized gesture can be translated into robot commands. We also present results on the network's performance and the implementation pipeline.
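The abstract's pipeline (per-frame MediaPipe hand landmarks fed to an LSTM classifier) can be sketched as follows. This is a minimal illustration, not the paper's implementation: the layer sizes, sequence length, and number of gesture classes are assumptions, and only the input dimensionality (21 hand landmarks with x, y, z coordinates, as produced by MediaPipe Hands) comes from the described setup.

```python
import torch
import torch.nn as nn

class GestureLSTM(nn.Module):
    """Sketch of an LSTM gesture classifier over MediaPipe hand landmarks.

    Hidden size and class count are hypothetical; the paper does not
    specify them in the abstract.
    """
    def __init__(self, n_landmarks=21, n_coords=3, hidden=64, n_gestures=5):
        super().__init__()
        self.lstm = nn.LSTM(n_landmarks * n_coords, hidden, batch_first=True)
        self.head = nn.Linear(hidden, n_gestures)

    def forward(self, x):            # x: (batch, frames, 21 * 3)
        _, (h, _) = self.lstm(x)     # h: (num_layers, batch, hidden)
        return self.head(h[-1])      # gesture logits: (batch, n_gestures)

model = GestureLSTM()
# One stand-in clip: 30 frames of flattened landmark coordinates.
seq = torch.randn(1, 30, 63)
logits = model(seq)
print(logits.shape)                  # torch.Size([1, 5])
```

In deployment, each frame's landmarks would come from MediaPipe's hand tracker on the HoloLens video stream, and the argmax over the logits would be mapped to a robot command.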
