Robohub.org
 

Adaptive bipedal walking on slopes


by
06 September 2010



Imagine walking blindfolded on a flat surface. If the slope beneath your feet changes, you will most likely adjust your posture to keep moving. An idea from the 1950s explains how: the brain can predict the sensation that a motor command sent by the central nervous system will produce. This lets us distinguish sensations caused by our own motion from sensations caused by external stimuli. When the expected sensation doesn't match the actual sensory input, we change our behavior to compensate.

In work by Schröder-Schetelig et al., a robotic walker uses this idea to stay on its two feet. More precisely, a first neural network controller sends commands to the robot's hip-joint and knee-joint motors so that it can walk on flat terrain. These motor commands are then copied (the efference copy) and fed to a second neural network that implements the robot's internal model: it predicts the acceleration the robot should feel given its motor commands and current state. If the measured acceleration is larger than predicted, the robot is probably going downhill and should lean back to slow down; likewise, if it is lower than predicted, the robot is going uphill and should lean forward. Leaning backward and forward is performed by shifting a mass that represents the robot's upper body, driven by a third neural network that takes as inputs the predicted acceleration and the acceleration measured by an accelerometer.
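The core comparison above can be sketched in a few lines of code. This is only an illustrative toy, not Runbot's actual neural-network controllers: the gain, deadband, and the stand-in internal model are hypothetical values chosen for the example.

```python
def predicted_acceleration(motor_command, state):
    """Stand-in for the internal-model network (hypothetical linear guess):
    in the real robot this prediction is learned by a neural network."""
    return 0.5 * motor_command + 0.1 * state

def lean_adjustment(predicted, measured, gain=1.0, deadband=0.05):
    """Positive output = lean forward (uphill); negative = lean back (downhill)."""
    error = measured - predicted
    if abs(error) < deadband:
        return 0.0  # expectation matches sensation: keep current posture
    # measured > predicted -> speeding up (downhill) -> lean back (negative)
    # measured < predicted -> slowing down (uphill)  -> lean forward (positive)
    return -gain * error
```

For example, a measured acceleration above the prediction yields a negative adjustment (lean back for a downhill slope), while a measurement below the prediction yields a positive one (lean forward for an uphill slope).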

Experiments shown in the video below were conducted on Runbot, a 23 cm bipedal robot that is physically constrained to a circular path of 1 m radius and cannot perform sideways movements. Results show the robot successfully climbing a changing slope.

In the future, Schröder-Schetelig et al. hope to refine Runbot's internal model, make it climb even steeper slopes, and have it adapt to new and unforeseen environments.




Sabine Hauert is President of Robohub and Associate Professor at the Bristol Robotics Laboratory




