Robohub.org
 

Adaptive bipedal walking on slopes


06 September 2010




Imagine walking blindfolded on a flat surface. If the slope under your feet changes, you will most likely adjust your posture to keep moving. To explain this, an idea dating from the 1950s proposes that we predict the sensation that will be produced by a motor command sent by our central nervous system. This lets us distinguish sensations caused by our own motion from sensations caused by external stimuli. When the expected sensation doesn’t match the sensory input, we change our behavior to compensate.

In work by Schröder-Schetelig et al., a robotic walker uses this idea to stay on its two feet. More precisely, the robot uses a neural network (acting as a controller) to send commands to its hip-joint and knee-joint motors so that it can walk on flat terrain. These motor commands are then copied (the efference copy) and fed to a second neural network that implements the robot’s internal model. This model predicts the acceleration the robot should feel given its motor command and current state. If the measured acceleration is larger than predicted, the robot is probably going downhill and should lean back to slow down. Likewise, if the acceleration is lower than predicted, the robot is going uphill and should lean forward. Leaning backward and forward is performed by moving a mass that represents the robot’s upper body and is controlled by a third neural network, which takes as inputs the predicted acceleration and the acceleration measured by an accelerometer.

Experiments shown in the video below were conducted on Runbot, a 23 cm bipedal robot that is physically constrained to a circular path of 1 m radius and cannot perform sideways movements. Results show the robot successfully climbing a changing slope.

In the future, Schröder-Schetelig et al. hope to refine Runbot’s internal model, make it climb even steeper slopes, and have it adapt to new and unforeseen environments.




Sabine Hauert is President of Robohub and Associate Professor at the Bristol Robotics Laboratory




