Robohub.org
 

Underwater 3D mapping


by Sabine Hauert
13 May 2011




We saw the need for good underwater robots during the Deepwater Horizon spill last summer. In such scenarios, a remote operator controls a robot equipped with a camera and the means to build a 2D map of the environment. However, if you want your robot to inspect non-trivial structures, such as oil- and gas-production and transport equipment, or if you want it to operate more autonomously in challenging environments, 3D mapping is essential.

As seen in previous posts, a ground robot can build a 3D map using a laser range finder. However, no comparable sensors are available underwater, and researchers are left coping with low-resolution, noisy measurement systems. To address this problem, Bülow et al. propose a new method for combining sensory information from noisy 3D sonar scans that partially overlap. The general idea is that the robot scans the environment, moves a little, and then scans again so that the two scans overlap. By comparing the scans, the researchers can figure out how the robot moved, and from that infer where each scan was taken. This removes the need for the expensive motion sensors typically required by other state-of-the-art strategies (Inertial Navigation Systems and Doppler Velocity Logs).
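To illustrate the core idea of recovering motion from overlapping scans: if corresponding points can be identified in two scans, the rigid transform (rotation and translation) between the scanning positions can be estimated in closed form. The sketch below uses the standard SVD-based Kabsch/Umeyama solution with known correspondences; note that this is a generic illustration, not the authors' actual registration method, which is designed to work on noisy sonar data without explicit point correspondences.

```python
import numpy as np

def estimate_rigid_transform(P, Q):
    """Estimate rotation R and translation t such that R @ P + t ≈ Q.

    P, Q: (3, N) arrays of corresponding 3D points from two
    overlapping scans. Closed-form Kabsch/Umeyama solution.
    """
    # Center both point sets on their centroids.
    p_mean = P.mean(axis=1, keepdims=True)
    q_mean = Q.mean(axis=1, keepdims=True)

    # Cross-covariance between the centered sets.
    H = (P - p_mean) @ (Q - q_mean).T

    # SVD gives the optimal rotation (least-squares sense).
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))
    D = np.diag([1.0, 1.0, d])  # guard against reflections
    R = Vt.T @ D @ U.T
    t = q_mean - R @ p_mean
    return R, t
```

Given the estimated transform between consecutive scans, each scan can be placed in a common frame by chaining the transforms, which is how a 3D map is assembled without dedicated motion sensors.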

The approach was first tested in simulation on virtual images with controllable levels of noise. Results show that the method is computationally inexpensive, can deal with large spatial distances between scans, and is very robust to noise. The authors then plunged a Tritech Eclipse sonar into a river in Germany to generate 18 scans of the Lesumer Sperrwerk, a river flood gate. Results from that experiment, shown in the video below, compared well to other approaches described in the literature.



In the future, Bülow et al. hope to combine this approach with SLAM (Simultaneous Localization and Mapping) to avoid the accumulation of relative localization errors.
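Why accumulation is a problem: each scan-to-scan estimate carries a small error, and chaining many estimates (dead reckoning) lets those errors compound. A minimal simulation of this effect, with hypothetical noise parameters chosen purely for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulate 100 chained relative pose estimates in 2D.
# Each true step moves 1 m along x; each estimated step
# carries small zero-mean noise (assumed std. dev. 0.05 m).
true_pos = np.zeros(2)
est_pos = np.zeros(2)
drift = []
for _ in range(100):
    step = np.array([1.0, 0.0])
    true_pos = true_pos + step
    est_pos = est_pos + step + rng.normal(0.0, 0.05, size=2)
    # Track how far the estimate has wandered from the truth.
    drift.append(float(np.linalg.norm(est_pos - true_pos)))
```

With uncorrelated noise the drift grows roughly like the square root of the number of steps; SLAM techniques counter this by recognizing previously mapped areas (loop closures) and correcting the whole trajectory.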



Sabine Hauert is President of Robohub and Associate Professor at the Bristol Robotics Laboratory





