Robohub.org
 

Toyota Partner Robot provides everyday assistance for people with disabilities


04 October 2012




Toyota has developed the Partner Robot to provide everyday assistance for people with disabilities. The robot has a compact, cylindrical body that lets it turn around in small spaces, and folding arms that can perform tasks such as fetching objects and opening curtains. It is easily controlled through a smartphone touch interface or by speech recognition, and it can also be operated remotely by a caregiver while communicating with the user.

“For robots to operate in ordinary living spaces, the most important factor is their size. So, in developing this one, we’ve prioritized making it compact.”

“The picture from the robot’s camera is shown on the user’s tablet. We’ve achieved a capability where, if there’s a dropped object in the picture, the user can tap it, and the robot automatically picks the object up. As for fetching things, it’s currently very difficult for robots to find and bring back objects in ordinary environments. So, for now, we’ve achieved a system where the user puts their favorite things in specific boxes, and registers the boxes, so the robot can automatically fetch things from there.”

The robot’s height can vary between 83 cm and 1.3 m, so it can reach things in high places. When picking something up, it can also use a suction mechanism, allowing it to handle thin objects such as sheets of paper.

“Regarding capabilities, we actually surveyed people with disabilities, together with the Japan Service Dog Association, to find out what capabilities users want a robot to have. The results showed that there’s a strong need for robots that can pick up dropped objects, fetch wanted objects, and communicate remotely, to report an emergency, for example. So the first thing we’ve done in developing this robot is, we’ve given it those three capabilities.”

“We’ve done a trial with Yokohama Rehabilitation Center, and we’ve already found some issues that need thinking about. So, the first thing we’ll do now is, we’ll implement capabilities to handle those. For example, people would like the robot to operate switches and open doors. We’ll be testing capabilities like that in our trials.”





DigInfo TV is a Tokyo-based online video news platform dedicated to producing original coverage of cutting edge technology, research and products from Japan.

            AUAI is supported by:



Subscribe to Robohub newsletter on substack




 















©2026.02 - Association for the Understanding of Artificial Intelligence