Robohub.org
 

Bipedal robot uses high-speed vision to run


by Masatoshi Ishikawa
15 October 2014




We have developed a visually controlled bipedal running robot named ACHIRES: Actively Coordinated High-speed Image-processing Running Experiment System. The robot has a leg length of 14 cm and 6 degrees of freedom, and can run in the sagittal plane at 4.2 km/h. Its key technologies are high-speed vision, which recognizes the robot's posture at 600 fps, and high-speed actuation for realizing high-speed motion. The combination of these technologies is central to the robot's ability to run stably at high speeds.

In our laboratory we develop various types of high-speed vision hardware and algorithms that implement image processing with sampling times from 10 ms down to 1 ms. High-speed vision can provide control data at the same sampling rate as the servo controller driving the robot's actuators. This means that vision can control the actuators just like any other sensor, e.g. an encoder. Although at present the camera is located off board the robot, it will be mounted on the body in future iterations.
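To make this idea concrete, here is a minimal sketch of a vision-in-the-loop controller running at 600 Hz, in which the camera takes the role normally played by an encoder. The object and function names (`camera.grab`, `estimate_posture`, `controller.update`, `servo.apply`) are hypothetical placeholders, not the actual ACHIRES software interface.

```python
import time

SAMPLE_RATE_HZ = 600          # vision and servo loop share the same rate
DT = 1.0 / SAMPLE_RATE_HZ     # ~1.67 ms control period

def control_loop(camera, servo, controller, estimate_posture, duration_s=5.0):
    """Vision-in-the-loop control: each cycle, one camera frame is
    processed into a posture estimate and fed to the servo controller,
    exactly as an encoder reading would be."""
    t_end = time.perf_counter() + duration_s
    while time.perf_counter() < t_end:
        t0 = time.perf_counter()

        frame = camera.grab()                 # capture one frame (hypothetical API)
        posture = estimate_posture(frame)     # e.g. joint angles / body pitch from the image
        command = controller.update(posture)  # feedback law computes actuator commands
        servo.apply(command)                  # send commands to the actuators

        # sleep out the remainder of the 1/600 s period
        elapsed = time.perf_counter() - t0
        if elapsed < DT:
            time.sleep(DT - elapsed)
```

The key point of the sketch is simply that the image-processing step fits inside the servo period, so no separate low-rate "vision layer" is needed.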

In addition, we developed a lightweight, high-power actuator for high-speed motion. Its torque-to-weight ratio is 3.5 times higher than that of comparable earlier actuators.

These technologies are used in various demonstrations of our robots.

The running algorithm used in ACHIRES differs from those typically used in other running robots. While most running robots maintain a stable, balanced posture using methods based on the ZMP (zero moment point) criterion, we introduced a very simple algorithm that exploits the high-speed performance of the sensory-motor system without the ZMP criterion. During the aerial phase, the posture is corrected to compensate for deviations from the stable trajectory using high-speed visual feedback.
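The sketch below illustrates one way such an aerial-phase correction could look: a simple proportional adjustment of the leg touchdown angle based on the visually measured deviation of the body pitch and position from a reference trajectory. The gains, variable names, and the proportional form itself are illustrative assumptions, not the published ACHIRES control law.

```python
# Illustrative only: a proportional aerial-phase correction, not the
# actual ACHIRES controller.
K_PITCH = 0.8   # assumed gain on body-pitch error (dimensionless)
K_POS   = 0.3   # assumed gain on horizontal position error (rad per metre)

def aerial_correction(pitch_meas, pitch_ref, x_meas, x_ref, touchdown_angle_ref):
    """Adjust the planned leg touchdown angle using visual feedback.

    pitch_meas / pitch_ref : measured and reference body pitch (rad)
    x_meas / x_ref         : measured and reference horizontal body position (m)
    touchdown_angle_ref    : nominal touchdown angle from the stable trajectory (rad)
    """
    pitch_error = pitch_meas - pitch_ref
    pos_error = x_meas - x_ref
    # Lean the swing leg forward or backward to cancel the deviation
    # before the next ground contact.
    return touchdown_angle_ref + K_PITCH * pitch_error + K_POS * pos_error
```

Because the posture estimate arrives at 600 fps, a correction of this kind can be recomputed many times within a single flight phase rather than once per stride.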


It took four years to develop ACHIRES, in part because analyzing robot dynamics that are faster than video capture rates requires high-speed video analysis. The videos on our project website and YouTube channel (linked below) show how the robot's abilities have evolved since the project started in 2009.

Although ACHIRES is currently a research platform with no direct application, the combination of high-speed vision and actuation could be applied to various types of high-speed intelligent systems, including high-speed robots, manufacturing systems, aircraft, microscope image control for bio/medical applications, and human-machine interfaces. We believe it will open a new era of visual feedback systems.

More info:
Project Website
YouTube channel

Reference: T. Tamada, W. Ikarashi, D. Yoneyama, K. Tanaka, Y. Yamakawa, T. Senoo, M. Ishikawa: "High Speed Bipedal Robot Running Using High Speed Visual Feedback," The 32nd Annual Conference of the Robotics Society of Japan (RSJ2014), Fukuoka, 2014, 1B2-03.

 





Masatoshi Ishikawa is a professor at the University of Tokyo.




