Publication
Tactile MNIST: Benchmarking Active Tactile Perception
Tim Schneider; Guillaume Duret; Cristiana de Farias; Roberto Calandra; Liming Chen; Jan Peters
In: Computing Research Repository (CoRR), Vol. abs/2506.06361, Pages 1-39, arXiv, 2025.
Abstract
Tactile perception has the potential to significantly enhance dexterous robotic
manipulation by providing rich local information that can complement or substitute
for other sensory modalities such as vision. However, because tactile sensing is
inherently local, it is not well-suited for tasks that require broad spatial awareness
or global scene understanding on its own. A human-inspired strategy for addressing
this limitation is active perception: actively guiding sensors toward regions with
more informative or significant features and integrating this information over time
to understand a scene or complete a task. Both active perception and various tactile
sensing methods have received significant attention in recent years, yet despite
these advances, both fields still lack standardized benchmarks. To bridge this gap,
we introduce the Tactile MNIST
Benchmark Suite, an open-source, Gymnasium-compatible benchmark specifically
designed for active tactile perception tasks, including localization, classification,
and volume estimation. Our benchmark suite offers diverse simulation scenarios,
from simple toy environments all the way to complex tactile perception tasks using
vision-based tactile sensors. Furthermore, we offer a comprehensive dataset
comprising 13,500 synthetic 3D MNIST digit models and 153,600 real-world
tactile samples collected from 600 3D printed digits. Using this dataset, we train a
CycleGAN for realistic tactile simulation rendering. By providing standardized
protocols and reproducible evaluation frameworks, our benchmark suite facilitates
systematic progress in the fields of tactile sensing and active perception.
Project page: https://sites.google.com/robot-learning.de/tactile-mnist
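
To make the Gymnasium-compatible interface concrete, below is a minimal usage
sketch. The environment ID and the package import are assumptions made for
illustration; only Gymnasium compatibility is stated in the abstract, and the
actual names are documented on the project page.

    import gymnasium as gym

    # import tactile_mnist  # hypothetical package name; importing the benchmark
    #                       # package would normally register its environments

    # "TactileMNIST-Classification-v0" is an assumed ID, not necessarily the
    # one shipped by the benchmark suite.
    env = gym.make("TactileMNIST-Classification-v0")

    obs, info = env.reset(seed=0)
    terminated = truncated = False
    while not (terminated or truncated):
        # A random policy stands in for an active perception agent that would
        # decide where to place the tactile sensor next.
        action = env.action_space.sample()
        obs, reward, terminated, truncated, info = env.step(action)
    env.close()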

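The CycleGAN mentioned in the abstract translates simulated tactile renderings
into realistic sensor images. As a rough illustration of the general technique
rather than the paper's implementation, the following sketch shows the
generator-side CycleGAN objective for unpaired sim-to-real translation; the
network architectures, loss weights, and all names are assumptions.

    import torch
    import torch.nn as nn

    def cyclegan_generator_loss(G, F, D_real, D_sim, x_sim, x_real, lambda_cyc=10.0):
        """Generator-side CycleGAN objective (discriminator updates omitted).

        G: sim -> real generator, F: real -> sim generator,
        D_real / D_sim: discriminators for the two domains,
        x_sim / x_real: unpaired batches of simulated and real tactile images.
        All of these are placeholders; the paper's exact setup may differ.
        """
        mse = nn.MSELoss()  # least-squares GAN loss
        l1 = nn.L1Loss()    # cycle-consistency loss

        fake_real = G(x_sim)   # simulated touch image rendered realistically
        fake_sim = F(x_real)   # real touch image mapped into the simulated domain

        # Adversarial terms: each generator tries to make its discriminator output 1.
        loss_gan = (
            mse(D_real(fake_real), torch.ones_like(D_real(fake_real)))
            + mse(D_sim(fake_sim), torch.ones_like(D_sim(fake_sim)))
        )

        # Cycle consistency: mapping to the other domain and back should
        # approximately reconstruct the original image.
        loss_cyc = l1(F(fake_real), x_sim) + l1(G(fake_sim), x_real)

        return loss_gan + lambda_cyc * loss_cyc

In standard CycleGAN training, this generator objective is complemented by
separate discriminator updates and often an identity loss term.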