Publication
Towards Adaptive User-Centered Neuro-Symbolic Learning for Multimodal Interaction with Autonomous Systems
Amr Gomaa; Michael Feld
In: Proceedings of the 25th International Conference on Multimodal Interaction (ICMI '23), Paris, France, ISBN 9798400700552, Association for Computing Machinery, 10/2023.
Abstract
Recent advances in deep learning and data-driven approaches have enabled autonomous systems to perceive objects and their environments at a subsymbolic, perceptual level, allowing them to perform tasks such as object detection, sensor data fusion, and language understanding. However, there is an increasing demand to enhance these systems further toward a conceptual, symbolic understanding of objects, so that they capture the reasoning underlying the tasks they learn. Achieving this level of artificial intelligence requires considering both explicit teaching provided by humans (e.g., explaining how to act) and implicit teaching obtained by observing human behavior (e.g., through system sensors). It is therefore imperative to combine symbolic and subsymbolic learning approaches so as to support both implicit and explicit interaction models, an integration that gives the system multimodal input and output capabilities. In this Blue Sky paper, we argue for considering these input types, together with human-in-the-loop and incremental learning techniques, to advance the field of artificial intelligence and enable autonomous systems to learn the way humans do. We propose several hypotheses and design guidelines aimed at achieving this objective.
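To make the proposed integration concrete, the following is a minimal, hypothetical Python sketch, not taken from the paper, of an agent that combines a symbolic rule store fed by explicit human teaching with a subsymbolic classifier updated incrementally from observed (implicit) human behavior. All class, method, and parameter names (NeuroSymbolicAgent, teach_rule, observe, act) are illustrative assumptions.

import random

class NeuroSymbolicAgent:
    def __init__(self, n_features, actions):
        self.actions = actions
        # Subsymbolic part: one linear scorer per action (a tiny perceptron).
        self.weights = {a: [0.0] * n_features for a in actions}
        # Symbolic part: human-provided rules mapping a predicate to an action.
        self.rules = []  # list of (predicate, action) pairs

    def teach_rule(self, predicate, action):
        # Explicit teaching: the human states how to act ("explaining how to act").
        self.rules.append((predicate, action))

    def observe(self, features, human_action, lr=0.1):
        # Implicit teaching: incremental perceptron update from observed behavior.
        predicted = self._score_best(features)
        if predicted != human_action:
            for i, x in enumerate(features):
                self.weights[human_action][i] += lr * x  # reinforce observed action
                self.weights[predicted][i] -= lr * x     # penalize wrong prediction

    def act(self, features):
        # Symbolic rules take precedence; otherwise fall back to the learned scorer.
        for predicate, action in self.rules:
            if predicate(features):
                return action
        return self._score_best(features)

    def _score_best(self, features):
        return max(self.actions,
                   key=lambda a: sum(w * x for w, x in zip(self.weights[a], features)))

# Usage: a toy transfer-of-control decision with one explicit rule and
# incremental learning from 200 observed human decisions.
agent = NeuroSymbolicAgent(n_features=2, actions=["brake", "continue"])
agent.teach_rule(lambda f: f[0] > 0.9, "brake")  # explicit: brake if obstacle very close
for _ in range(200):
    f = [random.random(), random.random()]
    agent.observe(f, "brake" if f[0] > 0.5 else "continue")  # implicit observation
print(agent.act([0.95, 0.1]))  # "brake", resolved by the symbolic rule

Letting symbolic rules override the learned scorer mirrors the paper's argument that explicit human teaching should constrain what the subsymbolic component learns, while incremental updates keep the system adapting to observed behavior.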
Projects
CAMELOT - Continuous Adaptive Machine-Learning of Transfer of Control Situations
SC_Gomaa - TeachTAM: Machine Teaching with Hybrid Neurosymbolic Reinforcement Learning; The Apprenticeship Model