Robohub.org
 

What do teachers mean when they say ‘do it like me’?

17 February 2014




This post is part of our ongoing efforts to make the latest papers in robotics accessible to a general audience.

Teaching robots to do tasks is useful, and teaching them in a way that is easy and not time-intensive is even more useful. The TRIC algorithm, presented in the latest paper in Autonomous Robots, lets a robot observe a few motions from a human teacher, extract the essence of the demonstration, and then repeat and adapt it to new situations.

Robots need to learn to move and perform useful tasks if they are to be helpful to humans. However, tasks that are easy for a human, like grasping a glass, are far from obvious for a machine, and programming a robot by hand takes time and effort. What if, instead, the robot could watch a human and learn what was done, how, and why?

This is something we humans do all the time. Imagine you are learning tennis and the teacher says ‘do the forehand like me’ and then shows an example. How should the student interpret this? Should they move their fingers, or their elbow? Should they watch the ball, the racket, the ground, or the net? All of these possible reference points can be described with numbers. The algorithm presented in this paper, called Task Space Retrieval Using Inverse Feedback Control (TRIC), helps a robot learn which aspects of a demonstrated motion actually matter. Afterwards, the robot should be able to reproduce the movement like an expert, even if the task changes slightly.
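
To make the intuition concrete, here is a minimal toy sketch in Python. It is not the TRIC algorithm itself (which is based on inverse feedback control over learned task spaces); it only illustrates the underlying idea that, among many candidate reference points, the ones the teacher actively controls tend to end up at the same value in every demonstration. The feature names, the data, and the scoring rule are invented for this example.

```python
# Illustrative sketch only: a toy heuristic for spotting which candidate
# "reference points" (task-space features) a teacher seems to care about.
# This is NOT the TRIC algorithm; feature names and the scoring rule are
# invented for the example.
import numpy as np

def rank_relevant_features(demonstrations):
    """demonstrations: list of dicts mapping feature name -> 1-D numpy
    array of that feature's value over one demonstrated motion."""
    scores = {}
    for name in demonstrations[0]:
        finals = np.array([demo[name][-1] for demo in demonstrations])
        # A feature the teacher controls tends to end at a consistent
        # value in every demonstration, so lower spread = more relevant.
        scores[name] = np.var(finals)
    return sorted(scores, key=scores.get)

# Toy data: three forehand demonstrations described by two candidate features.
rng = np.random.default_rng(0)
demos = [
    {
        # racket-to-ball distance shrinks to ~0 in every demo (relevant)
        "racket_ball_distance": np.linspace(1.0, 0.0, 50)
                                + 0.01 * rng.standard_normal(50),
        # elbow angle ends wherever it happens to end (irrelevant here)
        "elbow_angle": np.linspace(0.3, rng.uniform(0.5, 2.0), 50),
    }
    for _ in range(3)
]

print(rank_relevant_features(demos))  # 'racket_ball_distance' should rank first
```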

The algorithm was successfully tested in simulation on various grasping and manipulation tasks. In one of them, a robot hand must approach a box and open its cover. The robot was shown 10 sets of trajectories from a simulated teacher. After training, it was asked to open a series of boxes that were moved, rotated, or of a different size. Overall, TRIC handled these scenarios well, succeeding in 24 out of 25 trials.

For more information, you can read the paper Discovering relevant task spaces using inverse feedback control (N. Jetchev and M. Toussaint, Autonomous Robots – Springer US, Feb 2014) or ask questions below!





Autonomous Robots Blog: Latest publications in the journal Autonomous Robots (Springer).




