The aim of the project is to create new technical and scientific foundations for real-world social and physical AI, in particular for improved social interaction with digital characters. To this end, an immersive, shared performance environment will be developed, set either in a virtual space or in a mixed-reality reconstruction.
To realise this, dance was chosen as a vehicle: this deeply human activity combines thinking, feeling, and sensing with physical movement, and thereby has strong positive effects on physiological and psychological well-being. Carousel+ therefore aims to develop AI-controlled characters that interact autonomously with individuals or groups in meaningful ways. Dancing combines physical activity with heightened sensory perception, cognitive abilities, creativity, interpersonal contact, and emotional expression. The project's first use cases will thus include modern free-form dance styles in groups and couples, folk dances, and partner dances such as tango. A particular challenge lies in the complex interdependence between movement, music, tactile contact, and the dancers' feelings.
Carousel+ belongs to the emerging field of "Real-World Social and Physical AI", which opens up many new application areas such as social interaction, entertainment, health, education, security, peace-making, emergency handling, and autonomous driving.
The DFKI research area Agents and Simulated Reality (ASR), led by Prof. Dr. Philipp Slusallek, is developing the technology for intelligent simulation and control of physically plausible virtual characters. To create high-quality animated content, signals from multiple people are interpreted in real time, yielding a seamless and engaging visual experience during the dance. A character controller is generated from the current scene and user analysis to animate the virtual character, and dedicated machine learning models can be used to develop different types of movements for different dances. To increase realism, the characters should be intrinsically motivated and learn from their experiences. For teaching scenarios in particular, visual and haptic feedback is integrated into the agent controller.
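As a rough, hypothetical sketch of the pipeline described above: per-frame signals from tracked dancers are interpreted and fed to a controller that produces the character's next pose. All names here (UserSignal, CharacterController, step) are illustrative assumptions, not the project's actual components, and the placeholder blending logic stands in for the learned motion models mentioned above.

    # Minimal sketch of the animation loop: user signals are interpreted
    # each frame and a controller maps the current scene state to the
    # next pose of the virtual character. Names are illustrative only.
    from dataclasses import dataclass
    from typing import List

    @dataclass
    class UserSignal:
        """Per-frame observation of one tracked dancer."""
        joints: List[float]   # flattened 3D joint coordinates
        contact: bool         # tactile contact with the character?

    @dataclass
    class Pose:
        joints: List[float]

    class CharacterController:
        """Maps the interpreted scene state to the character's next pose.

        A real implementation would wrap a learned motion model (one per
        dance style); this placeholder simply eases the character toward
        the lead dancer's configuration.
        """
        def __init__(self, pose_dim: int, blend: float = 0.1):
            self.pose = Pose(joints=[0.0] * pose_dim)
            self.blend = blend

        def step(self, users: List[UserSignal]) -> Pose:
            if not users:
                return self.pose
            lead = users[0]  # e.g. nearest or most active dancer
            # Blend toward the lead dancer; a learned model would
            # generate stylistically richer, dance-specific motion.
            self.pose.joints = [
                (1 - self.blend) * c + self.blend * u
                for c, u in zip(self.pose.joints, lead.joints)
            ]
            return self.pose

    # Per-frame loop: interpret signals, update the character, render.
    controller = CharacterController(pose_dim=6)
    frame = [UserSignal(joints=[1.0, 0.5, 0.0, 0.2, 0.1, 0.0], contact=False)]
    for _ in range(3):
        pose = controller.step(frame)
        print(pose.joints)

In this sketch the controller is regenerated purely from the current frame's analysis; extending it with memory of past interactions would correspond to the intrinsically motivated, experience-driven learning the paragraph describes.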
For more information, visit the project website.
Partners
Grassroots Arts and Research, Edinburgh Napier University, Aalto University, VIVITnet