 

Augmented reality interface for telepresence robots


by Wolfgang Heller
22 February 2013




Telepresence robots are remote-controlled cameras on wheels, connected via Wi-Fi to a user’s computer, mobile phone or tablet. Through a “human-scale” robot, the user can communicate with other people and move around a remote workplace or home.

Researcher Giovanni Mosiello from the Department of Technology at Örebro University has investigated how to enhance depth perception when using a telepresence robot. In his experiments he found that users often have no experience driving a robot and, in most cases, are not even familiar with the movement commands typical of 3D gaming. Such users do not feel present in the remote environment where the robot is driving. This sense of presence is critical, however, because it allows users to estimate the distance between objects and avoid collisions. The main goal of the project was to provide a user interface that improves the user’s depth perception through 2D visual feedback; a secondary goal was to help non-expert users become familiar with the robot control interface. In his thesis, “Telepresence robot: an effective drive interface using the augmented reality”, Mosiello describes how a more user-friendly interface can reduce the effort needed to drive the robot properly, especially for non-expert users.
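The interface itself is described in full in the thesis; as a rough illustration of the general idea behind this kind of 2D augmented-reality feedback, the sketch below projects the drive arc implied by the current motion command onto the camera image, much like the guide lines in a car's parking-assist display. Everything here is a hypothetical assumption rather than a value or API from the thesis: the pinhole camera model, the intrinsics (FX, FY, CX, CY), and the mounting height and tilt.

import numpy as np

# All camera and mounting parameters below are illustrative assumptions,
# not values from Mosiello's thesis.
FX, FY = 525.0, 525.0          # focal lengths in pixels
CX, CY = 320.0, 240.0          # principal point of a 640x480 image
CAM_HEIGHT = 1.4               # camera height above the floor (m)
CAM_TILT = np.radians(20.0)    # downward tilt of the optical axis

def predicted_arc(v, omega, horizon=3.0, steps=30):
    """Floor points (x forward, y left, metres) the robot would trace over
    `horizon` seconds at constant linear velocity v (m/s) and angular
    velocity omega (rad/s)."""
    t = np.linspace(0.1, horizon, steps)
    if abs(omega) < 1e-6:                         # driving straight
        return np.column_stack([v * t, np.zeros_like(t)])
    r = v / omega                                 # turning radius
    return np.column_stack([r * np.sin(omega * t),
                            r * (1.0 - np.cos(omega * t))])

def project_to_pixels(ground_xy):
    """Pinhole projection of floor points into pixel coordinates for a
    forward-facing camera at CAM_HEIGHT, tilted down by CAM_TILT.
    Camera frame: x right, y down, z along the optical axis."""
    X, Y = ground_xy[:, 0], ground_xy[:, 1]
    x_cam = -Y
    y_cam = -X * np.sin(CAM_TILT) + CAM_HEIGHT * np.cos(CAM_TILT)
    z_cam = X * np.cos(CAM_TILT) + CAM_HEIGHT * np.sin(CAM_TILT)
    keep = z_cam > 0.1                            # only points in front of the lens
    u = CX + FX * x_cam[keep] / z_cam[keep]
    w = CY + FY * y_cam[keep] / z_cam[keep]
    return np.column_stack([u, w])

if __name__ == "__main__":
    # Overlay for "drive at 0.5 m/s while turning left at 0.3 rad/s".
    arc = predicted_arc(v=0.5, omega=0.3)
    for (x, _), (u, w) in zip(arc, project_to_pixels(arc)):
        print(f"{x:5.2f} m ahead -> pixel ({u:6.1f}, {w:6.1f})")

Drawing the projected points as a polyline over each video frame (for example with OpenCV's cv2.polylines) gives the user a direct visual cue of where the robot will go and how far away obstacles are, which is the kind of depth cue the thesis argues non-expert drivers lack.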









