Robohub.org
 

Augmented reality interface for telepresence robots


by Wolfgang Heller
22 February 2013




The Giraff telepresence robot.

Telepresence robots are remote-controlled cameras on wheels, connected via Wi-Fi to a user’s computer, mobile phone or tablet. Through a “human-scale” robot, the user can communicate with other people and move around a workplace or home.

Researcher Giovanni Mosiello from the Department of Technology at Örebro University has investigated how to enhance depth perception when using a telepresence robot. In his experiments he found that users often have no experience driving a robot, and in most cases are not even familiar with the movement commands typical of 3D gaming. As a result, users do not feel present in the remote environment the robot is moving through. This sense of presence is critical, because it allows users to estimate distances between objects and avoid collisions. The main goal of the project was to provide a user interface that improves depth perception through 2D visual feedback; a secondary goal was to help non-expert users become familiar with the robot control interface. In his thesis, “Telepresence robot: an effective drive interface using the augmented reality”, Mosiello describes how a more user-friendly interface can reduce the effort needed to drive the robot properly, especially for non-expert users.
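The thesis itself does not publish code here, but one common way such 2D visual feedback works is to overlay fixed-distance floor markers on the robot's camera feed, so the operator can read distances directly off the image. The sketch below illustrates the idea with a simple pinhole camera model; the camera height, focal length, and principal point are hypothetical values, not parameters from Mosiello's system.

```python
def floor_marker_row(distance_m, cam_height_m=1.2, focal_px=600.0, cy_px=240.0):
    """Image row (pixel v-coordinate) where a floor point at the given
    forward distance projects, assuming a level, forward-facing camera.

    Pinhole model: a point on the floor at forward distance d sits
    cam_height_m below the optical axis, so it projects to
    v = cy + f * h / d. All parameter values here are assumptions
    for illustration, not from the thesis.
    """
    if distance_m <= 0:
        raise ValueError("distance must be positive")
    return cy_px + focal_px * cam_height_m / distance_m

# Guide lines for 1 m, 2 m and 4 m ahead of the robot: farther markers
# sit higher in the image (closer to the horizon row cy_px).
marker_rows = {d: floor_marker_row(d) for d in (1.0, 2.0, 4.0)}
```

Drawing horizontal lines at these rows (labelled "1 m", "2 m", "4 m") over the live video gives the operator an immediate distance scale, which is one plausible mechanism for the improved depth perception the thesis targets.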





Wolfgang Heller









©2026.02 - Association for the Understanding of Artificial Intelligence