In our final post for this series, we’ll talk about the future of telepresence. For most robots, telepresence is just the beginning, a means to an end. Many of the robots in this series are meant for office, medical, or home settings, but the same technology can just as easily be applied to the dull, dirty, and dangerous work robots have traditionally handled. Letting human operators teach robots through telepresence also pairs naturally with higher levels of autonomy: the operator demonstrates a task, and the robot gradually takes over more of it.
New robots help people navigate the world
Take Henry Evans and Robots for Humanity. In 2002, a brain stem stroke left Henry quadriplegic. Telepresence has given him back much of his independence. Whether using a Beam for a TED talk (see below), a Parrot drone to fly around his garden at home, or a PR2 to fold laundry, Henry has been able to take back control of his life through robotics, despite a condition that would otherwise have left him bedridden.
Consider who’s on the other end
Knowing who is operating a robot is one thing; sharing a kitchen with an anonymous stranger is another. When Willow Garage launched the Heaphy Project back in 2012, it outsourced the office’s dish washing duties to the world via a PR2 controlled through Amazon’s Mechanical Turk. Using a human anywhere in the world to control a PR2 was technologically impressive, but the team failed to consider how it would feel for the crew on site, even people used to working with robots, to have a stranger washing dishes beside them. It turned out to be rather creepy, and although the project showed promise, it lasted only a month.
Human-like telecommunication
If anonymously operated robots are unnerving, maybe Nanyang Technological University’s EDGAR is more comforting. Operators project their own face onto EDGAR’s head rather than a flat monitor, and the robot mimics their gestures, tracked by a Microsoft Kinect. The university also developed ‘Nadine’, a social robot that operates like Siri or Cortana. Combining the technology behind EDGAR and Nadine would let you nearly duplicate yourself, at least from the waist up.
Better vision displays
One obstacle in telepresence systems is the lack of peripheral vision, and several teams are working to solve it. DORA, created by a team of students at the University of Pennsylvania, pairs the robot with an Oculus Rift: every head movement the operator makes is mimicked by DORA, so users can simply look around and get a far better sense of the environment. It’s the difference between watching something on TV and actually being there. That may not matter much at a weekly staff meeting, but in an emergency response situation that extra visual awareness is vital.
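To make the idea concrete, here’s a minimal sketch of the kind of head-tracking loop such a system might use: read the operator’s head orientation from a headset and clamp it to the limits of a pan/tilt camera mount. The names, ranges, and data source are illustrative assumptions, not DORA’s actual code.

```python
# Hypothetical sketch: mapping an operator's head orientation (e.g. from a
# VR headset) onto a remote pan/tilt camera mount. All names and ranges are
# assumptions for illustration only.

from dataclasses import dataclass


@dataclass
class HeadPose:
    yaw: float    # degrees, left/right
    pitch: float  # degrees, up/down


def clamp(value: float, low: float, high: float) -> float:
    return max(low, min(high, value))


def pose_to_servo_angles(pose: HeadPose) -> tuple[float, float]:
    """Clamp the operator's head angles to the mount's mechanical limits."""
    pan = clamp(pose.yaw, -90.0, 90.0)
    tilt = clamp(pose.pitch, -45.0, 45.0)
    return pan, tilt


if __name__ == "__main__":
    # Stand-in for a stream of headset readings.
    for pose in [HeadPose(yaw=15.0, pitch=-5.0), HeadPose(yaw=120.0, pitch=60.0)]:
        pan, tilt = pose_to_servo_angles(pose)
        print(f"head {pose} -> pan {pan:.1f} deg, tilt {tilt:.1f} deg")
```

The key point is the tight loop between the operator’s head and the remote camera: because the view follows the head directly, looking around feels natural instead of like steering a camera with a joystick.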
Rise of the personal assistant
And finally, there is the rise of the personal assistant. Home service robots like Buddy get the attention, but don’t disregard products like Jibo or Amazon’s Alexa. They’re stationary right now, but my hunch is that they won’t be for long. I would also be on the lookout for Google, Apple, or Microsoft to enter this scene by some means other than the smartphone.
While we are still a long way from seeing ‘Rosie’ enter our homes as a fully capable service robot, telepresence will lead the way for most of what comes next. The technology has come a long way since Pebbles was first introduced, and we’re about to see an even bigger change in the coming years.