Robohub.org
 

AR app from docomo translates menus and signs in real time


10 October 2012




On October 11, NTT Docomo will start the Utsushite Honyaku service, which instantly translates foreign-language restaurant menus when you point a smartphone’s camera at them.

Utsushite Honyaku is the commercial version of a service that had previously been available as a trial. Besides menus, it can now also handle signs. It works between Japanese and four languages: English, Korean, simplified Chinese and traditional Chinese.

“For example, suppose you visit Korea, and you can’t read signs in Korean at all. You can start up this app, select the Korean dictionary, and use it just by pointing your smartphone’s camera at the writing you can’t read. The translation is shown over the Korean text, so when you use this app, it feels as if you’re looking at a sign in Japanese.”

“This service doesn’t use cloud translation; instead, the app itself and the dictionary are downloaded to the phone. This means it can be used for free, without having to access the mobile network.”

The user can choose between two translation modes: in one, the translation is superimposed on the live camera image; in the other, it is shown line by line in a separate window.
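To make the description concrete, here is a minimal sketch of how an offline overlay-translation pipeline like this might be structured. All names are illustrative assumptions; Docomo has not published the app's internals, and the dictionary here is a tiny Korean-to-English stand-in for the downloadable language packs.

```python
# Hypothetical sketch of an on-device (offline) overlay-translation pipeline,
# loosely modeled on the article's description. Nothing here reflects the
# actual Utsushite Honyaku implementation.
from dataclasses import dataclass

@dataclass
class TextRegion:
    text: str    # text recognized in the camera frame (OCR output)
    bbox: tuple  # (x, y, w, h) position of the text in the frame

# Stand-in for the downloaded offline dictionary -- no network access needed.
OFFLINE_DICT = {
    "출구": "exit",
    "입구": "entrance",
    "화장실": "restroom",
}

def translate_region(region: TextRegion) -> TextRegion:
    """Look up recognized text in the local dictionary; fall back to the source."""
    translated = OFFLINE_DICT.get(region.text, region.text)
    return TextRegion(text=translated, bbox=region.bbox)

def overlay_mode(regions):
    """Mode 1: each translation is drawn over the original text's position."""
    return [(r.bbox, translate_region(r).text) for r in regions]

def list_mode(regions):
    """Mode 2: translations are listed line by line in a separate window."""
    return [translate_region(r).text for r in regions]
```

For example, `overlay_mode([TextRegion("출구", (10, 20, 80, 30))])` pairs the translation "exit" with the region's original bounding box, which is what lets the overlay mode render the translated text in place, as if the sign itself were written in the user's language.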

This service can also translate from Japanese into the four languages. So, it’s helpful not only for Japanese Docomo users, but also for foreign Android smartphone users visiting Japan.




DigInfo TV is a Tokyo-based online video news platform dedicated to producing original coverage of cutting edge technology, research and products from Japan.






 
