Robohub.org
 

Telepresence has come a long way, where do we go from here?


by Michael Savoie
03 March 2016




A modular telepresence system. Source: Wikipedia

In our final post for this series, we’ll talk about the future of telepresence. For most robots, telepresence is just the beginning, a means to an end. Many of the robots in this series are meant for office, medical, or home use, but it should not be forgotten that this same technology can easily be applied to the dull, dirty, and dangerous work typically assigned to robots. Letting human operators teach robots through telepresence creates a synergy between direct control and higher levels of autonomy.

New robots help navigate the world

Take Henry Evans and Robots for Humanity. In 2002, a brain stem stroke left Henry quadriplegic. Telepresence has given him back abilities that help him live his life. Whether using a Beam for a TED talk (see below), flying a Parrot drone around his garden at home, or folding laundry with a PR2, Henry has been able to take back control of his life through robotics, despite a condition that would otherwise have left him bedridden.

Consider your audience, and who’s behind the robot

Knowing who operates the robot is another matter. When Willow Garage launched the Heaphy Project back in 2012, it outsourced a PR2’s dish-washing duties to anonymous operators around the world via Amazon’s Mechanical Turk. While using a human anywhere in the world to control a PR2 was technologically advanced, the team failed to consider how the people on-site would feel about an anonymously operated robot, which turned out to be rather creepy for the crew (even one used to working with robots). Though the project had promise, it lasted only a month.

Human-like telecommunication

If anonymously operated robots are unnerving, maybe Nanyang Technological University’s EDGAR is more comforting. Rather than showing the operator on a flat monitor, EDGAR projects the operator’s face onto its own and mimics the operator’s gestures, which are captured with a Microsoft Kinect. The university also developed ‘Nadine’, a social robot that operates like Siri or Cortana. Combining the technology behind EDGAR and Nadine would let you nearly duplicate yourself, at least from the waist up.

Better vision display

One obstacle in telepresence systems is the lack of peripheral vision, but several groups are working to solve it. One example is DORA, created by a team of students at the University of Pennsylvania. The operator wears an Oculus Rift, and DORA mimics every head movement, giving users the ability to look around and a far better sense of the environment. It’s the difference between watching something on TV and actually being there. While this may not matter much at a weekly staff meeting, that increased visual awareness is vital in an emergency response situation.
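The core of a DORA-style system is simple in principle: stream the headset’s orientation each frame and drive the robot’s camera mount to match. A minimal sketch of that mapping is below; the servo limits and function names are illustrative assumptions, not DORA’s actual implementation.

```python
import math

# Assumed mechanical limits of a hobby pan/tilt camera mount (degrees);
# real hardware limits vary by platform.
PAN_LIMITS = (-90.0, 90.0)
TILT_LIMITS = (-45.0, 45.0)

def clamp(value, lo, hi):
    """Keep a command within the mount's mechanical range."""
    return max(lo, min(hi, value))

def head_pose_to_servo(yaw_rad, pitch_rad):
    """Map headset yaw/pitch (radians) to pan/tilt angles (degrees).

    A head-tracked telepresence rig streams the operator's head
    orientation every frame; the remote camera follows, so the view
    turns as the operator's head does.
    """
    pan = clamp(math.degrees(yaw_rad), *PAN_LIMITS)
    tilt = clamp(math.degrees(pitch_rad), *TILT_LIMITS)
    return pan, tilt

# Example: operator looks 30 degrees left and 10 degrees down.
pan, tilt = head_pose_to_servo(math.radians(30), math.radians(-10))
```

In a real system this runs in a tight loop (ideally 60+ Hz) with smoothing, since any lag between head motion and the returned video is immediately noticeable, and is a major cause of simulator sickness.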

Rise of the personal assistant


Designing Jibo.

And finally, there is the rise of the personal assistant. While there are home service robots like Buddy, don’t disregard products like Jibo or Amazon’s Alexa. Jibo and Alexa are stationary right now, but my hunch is that they won’t be for long. I would also be on the lookout for an entrant from Google, Apple, or Microsoft making its way into this scene by means other than a smartphone.

While we are still a long way from seeing ‘Rosie’ enter our homes as a fully capable service robot, telepresence will lead the way for most of them. The technology has come a long way since Pebbles was first introduced, but we’re about to see an even bigger change in the coming years.





Michael Savoie Michael is the founder and Chief Robot Wizard at Frostbyte Technologies, a start-up aimed at developing autonomous outdoor mobile robots.

