Publication
Detection and recognition of human manipulation building blocks
Lisa Gutzeit
PhD thesis, Universität Bremen, March 2023.
Abstract
New research challenges in robotics, arising from efforts to move robots out of isolated environments and into human spaces with close human-robot interaction, demand new methods to gain a deeper understanding of human behavior and intentions. Especially in Learning from Demonstration, which provides an intuitive way to teach robotic systems new behavior from examples of human movement, approaches are needed that recognize relevant movement segments in human motion data. In this way, building blocks of robotic motions can be learned and combined to generate a wide range of behaviors.
This thesis introduces algorithms to detect and annotate human building block movements in recordings of manipulation movements. The velocity-based Multiple Change-point Inference algorithm is presented, which identifies building blocks with a bell-shaped velocity profile using Bayesian inference. To annotate the detected building blocks, a 1-Nearest Neighbor-based movement classification is proposed, which labels the movements quickly and reliably even when only a small number of labeled movement examples is available for training. To detect basic movements as well as their concatenation into more complex, labeled actions, the velocity-based Hierarchical Movement Segmentation algorithm is presented, which analyses human movements hierarchically. The developed methods are evaluated on movement recordings of several subjects and are applied both in a framework that learns new robotic behavior from human demonstration and to teleoperated movements recorded with an exoskeleton.
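To make the two central steps more concrete, the sketch below illustrates the general idea in Python: splitting a trajectory at minima of its speed profile as a simplified stand-in for the Bayesian velocity-based Multiple Change-point Inference, and labeling the resulting segments with a 1-Nearest Neighbor classifier on length-normalized trajectories. This is not the thesis's implementation; the function names, the heuristic minima detection, and parameters such as min_gap and n_samples are illustrative assumptions.

```python
import numpy as np
from scipy.signal import argrelextrema


def segment_by_velocity_minima(positions, min_gap=10):
    """Split a trajectory (N x D array) at local minima of its speed profile.

    Simplified stand-in for velocity-based change-point inference: building
    blocks with bell-shaped velocity profiles tend to be separated by
    velocity minima.
    """
    speed = np.linalg.norm(np.diff(positions, axis=0), axis=1)
    minima = argrelextrema(speed, np.less, order=min_gap)[0]
    bounds = [0, *minima.tolist(), len(positions) - 1]
    return [(bounds[i], bounds[i + 1]) for i in range(len(bounds) - 1)]


def resample(segment, n_samples=50):
    """Interpolate a segment to a fixed length so segments of different
    durations can be compared directly."""
    t_old = np.linspace(0.0, 1.0, len(segment))
    t_new = np.linspace(0.0, 1.0, n_samples)
    return np.column_stack(
        [np.interp(t_new, t_old, segment[:, d]) for d in range(segment.shape[1])]
    )


def classify_1nn(segment, labeled_examples):
    """Return the label of the nearest labeled example, using Euclidean
    distance on length-normalized trajectories (1-Nearest Neighbor)."""
    query = resample(segment).ravel()
    distances = [
        (np.linalg.norm(query - resample(example).ravel()), label)
        for example, label in labeled_examples
    ]
    return min(distances, key=lambda d: d[0])[1]
```

In this simplified form, the heuristic segmentation only needs a distance-based notion of velocity minima, and the 1-Nearest Neighbor step requires only a handful of labeled example segments, which reflects why such a classifier is attractive when little annotated training data is available.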
With the presented algorithms, the imitation of human behavior by robotic systems becomes more intuitive, more automated, and more generally applicable. Furthermore, the hierarchical movement segmentation approach opens the door to hierarchical learning approaches that build on human demonstrations to learn complex robotic behavior more effectively.
Projects
- BesMan - Behaviours for Mobile Manipulation
- TransFit - Flexible interaction for infrastructure construction by means of teleoperation and direct collaboration, and transfer into Industry 4.0
- KiMMI SF - Adaptive software framework for context-dependent, intuitive human-machine interaction