
Bipedal robot uses high-speed vision to run


by Masatoshi Ishikawa
15 October 2014




We have developed a visually controlled bipedal running robot named ACHIRES: Actively Coordinated High-speed Image-processing Running Experiment System. The robot has a leg length of 14 cm and six degrees of freedom, and can run in the sagittal plane at 4.2 km/h. Its key technologies are high-speed vision, which recognizes the robot's posture at 600 fps, and high-speed actuation for realizing fast motion. The combination of these two technologies is what allows the robot to run stably at high speed.

In our laboratory we develop various types of high-speed vision hardware and algorithms that implement image processing with sampling times from 10 ms down to 1 ms. High-speed vision can therefore provide control data at the same sampling rate as the servo controller driving the robot's actuators, which means vision can close the control loop just like any other sensor, e.g. an encoder. Although the camera is currently located off-board, it will be mounted on the robot's body in future iterations.
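To make the idea concrete, here is a minimal sketch (illustrative Python, not the ACHIRES control code) of a 1 kHz loop in which a posture estimate from high-speed vision is used exactly like an encoder reading in a simple PD law. The functions read_posture_from_camera and send_to_servo are hypothetical placeholders, and the gains are made up:

```python
import math
import time

DT = 0.001  # 1 ms loop period, matching the servo controller's sampling rate

def read_posture_from_camera(t):
    # Hypothetical stand-in for the 600 fps vision pipeline: returns the
    # body pitch angle (rad) estimated from the latest image. Here we fake
    # a small oscillation so the sketch runs without any hardware.
    return 0.05 * math.sin(2.0 * math.pi * 3.0 * t)

def send_to_servo(torque):
    # Hypothetical actuator interface; on the real robot this would write
    # a command to the joint servo at the same 1 kHz rate.
    pass

# Because vision updates as fast as the servo loop, its output can drive
# the actuators directly, just like an encoder signal would.
KP, KD = 40.0, 1.5            # made-up PD gains
prev_pitch, t = 0.0, 0.0
for _ in range(1000):         # one second of simulated control
    pitch = read_posture_from_camera(t)
    pitch_rate = (pitch - prev_pitch) / DT        # finite-difference velocity
    send_to_servo(-KP * pitch - KD * pitch_rate)  # regulate pitch to zero
    prev_pitch, t = pitch, t + DT
    time.sleep(DT)
```

The point of the sketch is the sampling rate: once the camera delivers estimates as fast as the servo loop runs, there is no architectural difference between a visual sensor and a conventional one.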

In addition, we developed a lightweight, high-power actuator for high-speed motion. Its torque-to-weight ratio is 3.5 times higher than that of comparable previous actuators.

These technologies have been used in various demonstrations of our robots.

The running algorithm used in ACHIRES differs from those typically used in other running robots. While most running robots maintain a stable, balanced posture using methods based on the ZMP (zero moment point) criterion, we introduced a very simple algorithm that exploits the high-speed performance of the sensory-motor system and does not use the ZMP criterion at all. During the aerial phase, the robot's posture is corrected using high-speed visual feedback to compensate for deviations from the stable trajectory, as sketched below.
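The following sketch (again illustrative Python, with made-up gains and angles rather than the published controller) shows the flavor of such a correction: the deviation of the body pitch from the reference trajectory, as measured by vision during flight, directly shifts the swing leg's touchdown angle, and no ZMP computation appears anywhere:

```python
def corrected_touchdown_angle(pitch, pitch_ref, leg_angle_ref, gain=0.8):
    # During the aerial phase, shift the swing-leg touchdown angle in
    # proportion to the measured deviation from the reference pitch.
    #   pitch:         body pitch from high-speed vision (rad)
    #   pitch_ref:     pitch on the nominal stable trajectory (rad)
    #   leg_angle_ref: nominal touchdown angle of the swing leg (rad)
    #   gain:          proportional correction gain (hypothetical value)
    deviation = pitch - pitch_ref
    return leg_angle_ref + gain * deviation

# Example: mid-flight the robot pitches 0.1 rad ahead of the reference,
# so the swing leg is placed slightly further forward before touchdown.
target = corrected_touchdown_angle(pitch=0.10, pitch_ref=0.0,
                                   leg_angle_ref=0.35)
print(f"corrected touchdown angle: {target:.3f} rad")  # 0.430 rad
```

The appeal of this kind of approach is that the fast sensing rate substitutes for model complexity: small errors are caught and corrected every frame, before they can grow large enough to destabilize the gait.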


It took four years to develop ACHIRES, in part because analyzing robot dynamics that are faster than video capture rates requires high-speed video analysis. You can see how the abilities of the robot have evolved since the project started in 2009.

Although ACHIRES is currently a research platform with no direct application, the combination of high-speed vision and actuation could be applied to many kinds of high-speed intelligent systems, including high-speed robots, manufacturing systems, aircraft, microscope image control for bio/medical applications, and human-machine interfaces. We believe it will open a new era of visual feedback systems.

More info:
Project Website
YouTube channel

Reference: T. Tamada, W. Ikarashi, D. Yoneyama, K. Tanaka, Y. Yamakawa, T. Senoo and M. Ishikawa: "High Speed Bipedal Robot Running Using High Speed Visual Feedback," Proceedings of the 32nd Annual Conference of the Robotics Society of Japan (RSJ2014), Fukuoka, 2014, 1B2-03.

 





Masatoshi Ishikawa is a professor at the University of Tokyo.




