Underwater 3D mapping


by Sabine Hauert
13 May 2011




We saw the need for capable underwater robots during the Deepwater Horizon spill last summer. In such scenarios, a remote operator controls a robot equipped with a camera and sensors that build a 2D map of the environment. However, if you want your robot to inspect non-trivial structures such as oil and gas production and transport equipment, or if you want it to operate more autonomously in challenging environments, 3D mapping is essential.

As seen in previous posts, a ground robot can build a 3D map using a laser range finder. No comparable sensor works underwater, however, leaving researchers to cope with low-resolution, noisy measurements. To address this, Bülow et al. propose a new method to combine sensory information from noisy 3D sonar scans that partially overlap. The general idea is that the robot scans the environment, moves a little, and then scans again so that the two scans overlap. By comparing the overlapping scans, the researchers can estimate how the robot moved, and from that infer where each scan was taken. This removes the need for the expensive motion sensors required by other state-of-the-art strategies, such as Inertial Navigation Systems and Doppler Velocity Logs.
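The paper's own registration technique is not reproduced here, but the pairwise idea (estimate each scan-to-scan motion, then chain those motions into scan poses) can be sketched in a few lines of Python. In this illustrative sketch, the SVD-based alignment and the assumption that point correspondences between scans are already known are simplifications, not the authors' method; matching points in real sonar data is precisely the hard part their approach handles.

```python
import numpy as np

def register_scans(src, dst):
    """Estimate the rigid transform (R, t) mapping src onto dst (Kabsch / SVD).
    src, dst: (N, 3) arrays of already-corresponding 3D points."""
    src_c, dst_c = src.mean(axis=0), dst.mean(axis=0)
    H = (src - src_c).T @ (dst - dst_c)        # 3x3 cross-covariance
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:                   # guard against a reflection
        Vt[-1] *= -1
        R = Vt.T @ U.T
    t = dst_c - R @ src_c
    return R, t

def chain_poses(scans):
    """Place every scan in the frame of the first one by chaining pairwise motions."""
    poses = [np.eye(4)]                        # pose of scan 0 is the world origin
    for prev, new in zip(scans[:-1], scans[1:]):
        R, t = register_scans(new, prev)       # new-scan frame -> previous-scan frame
        T = np.eye(4)
        T[:3, :3], T[:3, 3] = R, t
        poses.append(poses[-1] @ T)            # compose with the previous pose
    return poses
```

Each scan's pose is obtained purely by composing the estimated scan-to-scan motions, which is what lets the robot do without dedicated motion sensors.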

The approach was first tested in simulation on virtual images with controllable levels of noise. Results show that the method is computationally inexpensive, copes with large spatial distances between scans, and is very robust to noise. The authors then plunged a Tritech Eclipse sonar into a river in Germany to generate 18 scans of the Lesumer Sperrwerk, a river flood gate. Results from that experiment, shown in the video below, compared well to other approaches described in the literature.



In the future, Bülow et al. hope to combine this approach with SLAM to avoid the accumulation of relative localization errors.
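To see why those errors matter, here is a toy illustration (hypothetical numbers, not from the paper) of how small, independent errors in each pairwise motion estimate compound into drift when the estimates are chained; this is the drift that SLAM, with loop closures, is meant to correct.

```python
import numpy as np

# Toy drift illustration (hypothetical numbers): 2 cm of noise per pairwise
# estimate, chained over 100 scan-to-scan steps of 1 m each.
rng = np.random.default_rng(0)
true_step = np.array([1.0, 0.0, 0.0])
pose = np.zeros(3)
for k in range(1, 101):
    pose += true_step + rng.normal(0.0, 0.02, 3)   # noisy relative motion estimate
    error = np.linalg.norm(pose - true_step * k)
print(f"position error after 100 chained steps: {error:.2f} m")  # grows roughly with sqrt(steps)
```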





Sabine Hauert is President of Robohub and Associate Professor at the Bristol Robotics Laboratory






 
