Robohub.org
 

Robotic wheelchair from Chiba Tech turns wheels into legs and climbs over steps


15 October 2012




At the Chiba Institute of Technology, a group led by Shuro Nakajima has developed a robot wheelchair that can climb over steps.

This robot can make a variety of movements, using its four-wheel drive and five axes. Normally, it rolls along on its wheels, but if there’s a step or ditch, it can get over the obstacle by using its wheels as legs. All the user needs to do is tell it which direction to go, using a joystick. The robot automatically assesses the surrounding terrain and moves appropriately.

Also, when moving on uneven ground, the robot controls the seat to make sure it remains level.
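The seat-leveling idea can be sketched as a simple geometric compensation: measure the chassis roll and pitch, then extend or retract each leg to cancel the tilt. The geometry, leg names, and sign conventions below are illustrative assumptions, not the actual Chiba Tech control code.

```python
import math

# Hypothetical sketch: keep the seat level by adjusting each leg's
# extension to cancel the measured chassis tilt. Dimensions and leg
# names are assumptions for illustration only.

HALF_WHEELBASE = 0.40   # m, front-to-back half distance (assumed)
HALF_TRACK = 0.30       # m, left-to-right half distance (assumed)

def leg_offsets(roll_rad, pitch_rad):
    """Return extension offsets (m) for the FL, FR, RL, RR legs that
    cancel the given chassis roll (left/right) and pitch (front/back)."""
    offsets = {}
    for name, x, y in [("FL", +HALF_WHEELBASE, +HALF_TRACK),
                       ("FR", +HALF_WHEELBASE, -HALF_TRACK),
                       ("RL", -HALF_WHEELBASE, +HALF_TRACK),
                       ("RR", -HALF_WHEELBASE, -HALF_TRACK)]:
        # Legs on the downhill side extend; legs on the uphill side retract.
        offsets[name] = -(x * math.tan(pitch_rad) + y * math.tan(roll_rad))
    return offsets

# Example: chassis pitched 5 degrees nose-down -> front legs extend.
print(leg_offsets(0.0, math.radians(-5)))
```

In a real controller this would run in a feedback loop against an IMU reading rather than as a one-shot calculation.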

“The robot has sensors on its feet to detect anything nearby and to measure its distance from a step. It actually has various sensors, and it uses them in combination to assess how big a step is. Even if the sensors are in error and a wheel touches an obstacle, the resulting change in wheel torque can serve as a backup signal. In this way, the robot can detect the road surface reliably.”
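The fusion strategy described in the quote, primary range sensing with a torque spike as a backup, can be sketched as follows. The thresholds and function names are illustrative assumptions, not the robot's actual parameters.

```python
# Hypothetical sketch of the sensor-fusion idea: combine a range
# sensor's step-height estimate with a torque-spike check as a backup.
# Thresholds are illustrative assumptions.

STEP_HEIGHT_THRESHOLD = 0.02   # m: treat anything taller as a step (assumed)
TORQUE_SPIKE_RATIO = 1.5       # sudden torque rise suggests wheel contact (assumed)

def detect_step(range_height_m, torque_now, torque_baseline):
    """Return True if a step is detected either by the range sensors
    or, as a backup, by an unexpected rise in wheel drive torque."""
    seen_by_range = range_height_m is not None and range_height_m > STEP_HEIGHT_THRESHOLD
    seen_by_torque = torque_now > TORQUE_SPIKE_RATIO * torque_baseline
    return seen_by_range or seen_by_torque

print(detect_step(0.10, 1.0, 1.0))   # range sensor sees a 10 cm step -> True
print(detect_step(None, 2.0, 1.0))   # sensors missed it, but torque spiked -> True
print(detect_step(None, 1.0, 1.0))   # flat ground, nominal torque -> False
```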

“If a sensor detects a step, the robot calculates whether it can lift that leg. It can’t raise its wheels right away, so the steering system at the rear makes preparatory motions to gain stability. When the wheels can be raised stably, the robot lifts its legs.”
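The sequence in this quote, detect a step, make a preparatory stabilizing motion, then lift only once stable, has the shape of a small state machine. The state names and the stability check below are assumptions made for illustration.

```python
# Illustrative state machine for the climbing sequence described above:
# the robot cannot lift a wheel immediately, so it first makes a
# preparatory stabilizing motion before raising the leg.

def next_state(state, step_ahead, stable):
    """One transition of a simplified climbing sequence (names assumed)."""
    if state == "DRIVE" and step_ahead:
        return "PREPARE"                        # rear steering shifts for stability
    if state == "PREPARE":
        return "LIFT" if stable else "PREPARE"  # wait until the lift is stable
    if state == "LIFT":
        return "DRIVE"                          # leg placed on the step; resume
    return state

state = "DRIVE"
for step_ahead, stable in [(True, False), (False, False), (False, True), (False, True)]:
    state = next_state(state, step_ahead, stable)
    print(state)
# DRIVE -> PREPARE -> PREPARE -> LIFT -> DRIVE
```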

In addition, this robot can line its wheels up and extend stabilizers to the left and right, enabling it to turn in a circle on the spot. This makes it easy to reverse direction, even in a narrow space.

“We were particular about using wheels, because this kind of vehicle will mostly move on ordinary paved surfaces. The most efficient way of getting around on paved surfaces is to use wheels, like a car. So, this robot mainly uses wheels, but the wheels can become legs.”

“For now, we’re presenting this system and form as a concept, and the motion has mostly been worked out. So, we’re at the stage where we can show this robot to the world. In the next phase, we’ll get a variety of people to try it, so we can fine-tune the user experience.”




DigInfo TV is a Tokyo-based online video news platform dedicated to producing original coverage of cutting edge technology, research and products from Japan.


©2026.02 - Association for the Understanding of Artificial Intelligence