
Publication

Multisensor-Pipeline: A Lightweight, Flexible, and Extensible Framework for Building Multimodal-Multisensor Interfaces

Michael Barz; Omair Shahzad Bhatti; Bengt Lüers; Alexander Prange; Daniel Sonntag
In: Companion Publication of the 2021 International Conference on Multimodal Interaction. ACM International Conference on Multimodal Interaction (ICMI-2021), October 18-22, Montréal, QC, Canada, Pages 13-18, ISBN 9781450384711, Association for Computing Machinery, New York, NY, USA, 2021.

Abstract

We present the multisensor-pipeline (MSP), a lightweight, flexible, and extensible framework for prototyping multimodal-multisensor interfaces based on real-time sensor input. Our open-source framework (available on GitHub) enables researchers and developers to easily integrate multiple sensors or other data streams via source modules, to add stream and event processing capabilities via processor modules, and to connect user interfaces or databases via sink modules in a graph-based processing pipeline. Our framework is implemented in Python with few dependencies, which enables a quick setup, execution across multiple operating systems, and direct access to cutting-edge machine learning libraries and models. We showcase the functionality and capabilities of MSP through a sample application that connects a mobile eye tracker, classifies image patches surrounding the user’s fixation points, and visualizes the classification results in real time.
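
The abstract describes a graph-based pipeline assembled from source, processor, and sink modules. The following is a minimal, self-contained Python sketch of that pattern only; it does not use the multisensor-pipeline API, and all class and method names are illustrative assumptions introduced for this example.

```python
"""Sketch of a source -> processor -> sink pipeline (illustrative, not the MSP API)."""

import queue
import random
import threading
import time


class RandomNumberSource(threading.Thread):
    """Hypothetical source module: emits a random sample every 100 ms."""

    def __init__(self, out_queue, n_samples=10):
        super().__init__()
        self.out_queue = out_queue
        self.n_samples = n_samples

    def run(self):
        for _ in range(self.n_samples):
            self.out_queue.put(random.random())  # push a sample downstream
            time.sleep(0.1)
        self.out_queue.put(None)  # sentinel: end of stream


class ThresholdProcessor(threading.Thread):
    """Hypothetical processor module: turns raw samples into labeled events."""

    def __init__(self, in_queue, out_queue, threshold=0.5):
        super().__init__()
        self.in_queue = in_queue
        self.out_queue = out_queue
        self.threshold = threshold

    def run(self):
        while True:
            sample = self.in_queue.get()
            if sample is None:               # forward the end-of-stream sentinel
                self.out_queue.put(None)
                break
            label = "high" if sample >= self.threshold else "low"
            self.out_queue.put((sample, label))


class PrintSink(threading.Thread):
    """Hypothetical sink module: consumes events, e.g. for a UI or database."""

    def __init__(self, in_queue):
        super().__init__()
        self.in_queue = in_queue

    def run(self):
        while True:
            event = self.in_queue.get()
            if event is None:
                break
            sample, label = event
            print(f"sample={sample:.3f} label={label}")


if __name__ == "__main__":
    # Wire the three modules into a linear source -> processor -> sink graph.
    q1, q2 = queue.Queue(), queue.Queue()
    modules = [RandomNumberSource(q1), ThresholdProcessor(q1, q2), PrintSink(q2)]
    for m in modules:
        m.start()
    for m in modules:
        m.join()
```

In the real framework, a source such as a mobile eye tracker and a processor such as an image-patch classifier would take the place of these toy modules, with a sink visualizing the results.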
