Robohub.org
 

Adaptive bipedal walking on slopes


by Sabine Hauert, 06 September 2010




Imagine walking blindfolded on a flat surface. If the slope beneath your feet changes, you will most likely adjust your posture to keep moving. To explain this, an idea from the 1950s proposes that we predict the sensation that will be produced by a motor command sent by our central nervous system. This lets us distinguish sensations caused by our own motion from those caused by external stimuli. When the expected sensation doesn’t match the sensory input, we change our behavior to compensate.

In work by Schröder-Schetelig et al., a robotic walker uses this idea to stay on its two feet. More precisely, the robot uses a neural network (a type of controller) to send commands to hip-joint and knee-joint motors so that it can walk on flat terrain. These motor commands are then copied (the efference copy) and fed to a second neural network that serves as the robot’s internal model. This model predicts the acceleration the robot should feel given its motor command and current state. If the measured acceleration is larger than predicted, the robot is probably going downhill and should lean back to slow down. Likewise, if it is lower, the robot is going uphill and should lean forward. Leaning backward and forward is performed by moving a mass that represents the robot’s upper body, controlled by a third neural network that takes as inputs the predicted acceleration and the acceleration measured by an accelerometer.
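The prediction-versus-measurement loop can be sketched in a few lines. This is a minimal illustrative sketch, not the actual Runbot controller: the function names, the placeholder linear "internal model", and the gain value are all assumptions standing in for the trained neural networks described above.

```python
# Hypothetical sketch of the slope-adaptation loop. In the real system the
# internal model and the lean controller are trained neural networks; here
# both are replaced by trivial placeholders for illustration.

def predict_acceleration(motor_command, speed):
    """Stand-in for the internal (forward) model: the acceleration the
    robot expects to feel on flat ground, given its efference copy."""
    # Placeholder linear model; coefficients are made up.
    return 0.5 * motor_command + 0.1 * speed

def lean_adjustment(predicted_acc, measured_acc, gain=1.0):
    """Positive output = lean backward (measured > predicted, downhill);
    negative output = lean forward (measured < predicted, uphill);
    zero when the prediction matches the measurement."""
    return gain * (measured_acc - predicted_acc)

# Example: the accelerometer reads more acceleration than the internal
# model predicts, so the robot infers a downhill slope and leans back.
predicted = predict_acceleration(motor_command=1.0, speed=0.4)
adjustment = lean_adjustment(predicted, measured_acc=0.9)
```

The key design point is that the controller never senses the slope directly: it only compares an expected sensation against a measured one, which is exactly the efference-copy idea from the opening paragraph.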

Experiments shown in the video below were conducted on Runbot, a 23 cm bipedal robot that is physically constrained to a circular path of 1 m radius and cannot perform sideways movements. Results show the robot successfully climbing a changing slope.

In the future, Schröder-Schetelig et al. hope to refine Runbot’s internal model, make it climb even steeper slopes, and have it adapt to unforeseen environments.




Sabine Hauert is President of Robohub and Associate Professor at the Bristol Robotics Laboratory






