Publication
SonarCloud: A Simulated Forward Looking Sonar Dataset for Underwater Perception Tasks
Nael Jaber; Qais Al-Ramahi; Leif Christensen; Bilal Wehbe
In: Jessica Daignault; Katie Skinner; Ed Verhamme (Eds.). OCEANS MTS/IEEE Conference (OCEANS-2025), September 29 - October 2, Chicago, Illinois, USA, IEEE, 2025.
Abstract
With over 80% of the world's oceans remaining unmapped and unexplored, the advancement of robust underwater perception technologies is more important than ever. This vast frontier cannot be reliably observed with optical cameras mounted on Autonomous Underwater Vehicles (AUVs), which struggle in the turbid, low-light conditions common to marine environments. Acoustic sensors like the Forward-Looking Sonar (FLS) are essential alternatives, yet progress in the field is significantly hampered by a profound scarcity of large-scale, publicly available sonar datasets. To address this critical gap, we introduce the SonarCloud Dataset, a comprehensive synthetic dataset generated to accelerate research in underwater perception. Our dataset consists of FLS and depth imagery of 19 distinct objects, totaling approximately 500,000 images, along with a 3D point cloud for each object generated from the corresponding depth maps in various orientations. As technical validation, we selected object detection and 3D reconstruction to evaluate the effectiveness of our dataset. We demonstrate that state-of-the-art models trained solely on simulated data from our dataset can successfully detect objects in real-world sonar images and reconstruct their 3D shapes. The SonarCloud Dataset is presented as a valuable tool for the research community and is publicly available at: https://doi.org/10.5281/zenodo.16645568.
