Robohub.org
 

Applying direct transcription methods to robot motion planning


08 March 2016




Hardware experiments on motion planning for the ballbot Rezero using direct transcription. Source: ADRLab ETH/YouTube

When you walk across a room or down a path, your brain makes thousands of decisions about how best to move: how to use your weight, where obstacles or uneven surfaces lie, and how rigid (or soft) your limbs and joints should be. Teaching a robot to make the same kinds of decisions is an ongoing challenge in robotics, and a team from the ADRL at ETH Zurich and NCCR Robotics has been evaluating direct transcription methods for trajectory optimisation applied to robot motion planning.

Rezero, the dancing ballbot. Source: ETH Zurich.

Direct transcription is a method of control in which a complex trajectory optimisation problem is broken down into many smaller pieces: the robot's motion is described at a finite set of points in time, and the whole plan is then solved as a single numerical optimisation problem. The team uses this approach to enable an inherently unstable ball-balancing robot to perform a series of tasks of increasing complexity. The common issue with direct optimisation methods, which allow robots to achieve more natural movements, is that they require computers to solve large optimisation problems over and over, meaning that planning a path in real time, the way the human brain does, has not yet been achieved. Simply put, the computers working online with a robot are nowhere near as fast, efficient and robust as your brain, and that's before considering how heavy such a computer might need to be, or how much bandwidth its communication with the robot requires.
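
To make the idea concrete, below is a minimal sketch of direct transcription in Python for a toy 2D point mass (a double integrator), not the ballbot model used by the team. The trajectory is chopped into N knot points, the discretised dynamics become equality constraints between neighbouring knots, and a circular obstacle becomes an inequality constraint, so the whole motion plan is solved as a single nonlinear program with an off-the-shelf solver (scipy.optimize.minimize). All parameters and the model itself are illustrative assumptions.

import numpy as np
from scipy.optimize import minimize

# Illustrative problem data (assumed values, not from the paper).
N, dt = 20, 0.1                              # knot points and time step
start, goal = np.array([0.0, 0.0]), np.array([2.0, 1.0])
obs_c, obs_r = np.array([1.0, 0.5]), 0.3     # circular obstacle: centre, radius

def unpack(z):
    # Decision vector packs, per knot: position (2), velocity (2), control (2).
    z = z.reshape(N, 6)
    return z[:, 0:2], z[:, 2:4], z[:, 4:6]

def cost(z):
    # Minimise control effort over the whole trajectory.
    _, _, u = unpack(z)
    return dt * float(np.sum(u ** 2))

def dynamics_defects(z):
    # Forward-Euler defects between consecutive knots; must all be zero.
    p, v, u = unpack(z)
    dp = p[1:] - (p[:-1] + dt * v[:-1])
    dv = v[1:] - (v[:-1] + dt * u[:-1])
    return np.concatenate([dp.ravel(), dv.ravel()])

def boundary(z):
    # Start at rest at 'start', end at rest at 'goal'.
    p, v, _ = unpack(z)
    return np.concatenate([p[0] - start, v[0], p[-1] - goal, v[-1]])

def obstacle(z):
    # Keep every knot outside the obstacle: squared distance minus r^2 >= 0.
    p, _, _ = unpack(z)
    return np.sum((p - obs_c) ** 2, axis=1) - obs_r ** 2

# Straight-line initial guess for the positions, zeros elsewhere.
z0 = np.zeros((N, 6))
z0[:, 0:2] = np.linspace(start, goal, N)

res = minimize(cost, z0.ravel(), method="SLSQP",
               constraints=[{"type": "eq", "fun": dynamics_defects},
                            {"type": "eq", "fun": boundary},
                            {"type": "ineq", "fun": obstacle}],
               options={"maxiter": 500})
print("solved:", res.success)

Real systems like Rezero replace the toy dynamics with the robot's full model and use far more capable solvers, but the structure of the problem is the same: one vector of decision variables, a cost, and a stack of constraints.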

First, using computer models, the team tested the unstable ball-balancing robot (see video below) with three variations of a simple task in which the robot had to move from one location to another while avoiding fixed obstacles. By allowing the solver to reuse the best solution found for previous tasks, coupled with a feedback controller to stabilise the system, the simulated robot was able to find a path through two obstacles in under a second. When the experiments were repeated on the real robot, it followed the same paths and trajectories, reaching the planned destination safely and in the same time as the simulated robot, thus validating the approach.
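
The reuse of earlier solutions mentioned above is often called warm-starting: instead of planning from scratch, the solver is seeded with the trajectory found for a previous, similar task, which is what makes sub-second re-planning plausible. Continuing the toy sketch above (again an illustrative assumption, not the team's implementation), a warm start is simply the previous solution vector used as the new initial guess:

# Warm start: reuse the previous solution as the initial guess for a
# new, slightly different task (here, a shifted goal position).
goal = np.array([2.0, -1.0])    # 'boundary' above reads this module-level goal
res2 = minimize(cost, res.x, method="SLSQP",
                constraints=[{"type": "eq", "fun": dynamics_defects},
                             {"type": "eq", "fun": boundary},
                             {"type": "ineq", "fun": obstacle}],
                options={"maxiter": 500})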

The speed with which the robot can assess its situation and follow a path it has planned for itself, without falling, is a positive step forward that can be carried over to more complex robots (such as quadrupeds) operating in more uneven environments.

If a quadrupedal robot such as HyQ or StarlETH can recognise obstacles in its path and successfully avoid them, or modify its movement to accommodate them, for example by softening its joints when walking over rocks, then robots will be one step closer to being regularly sent into disaster zones to locate victims and save more lives.

Reference: 
D. Pardo, L. Möller, M. Neunert, A. W. Winkler and J. Buchli, “Evaluating direct transcription and nonlinear optimization methods for robot motion planning”, IEEE RA-L, 2016.




NCCR Robotics




