Feel the edges and contours of virtual objects with this haptic system from NHK


03 June 2013




With the aim of creating a television service that lets viewers touch the objects shown on screen, NHK is developing a tactile system that applies stimuli to five points on a single finger, making virtual objects feel more real than previous systems could.

“This device assists people with visual disabilities through what’s called the tactile, or kinesthetic, sense. It communicates the 3D shape or 2D graph of an object shown on TV, such as a work of art, as a sensation felt by the hand.”

“The system detects spatial position data for the fingertip in 1/1000 of a second. The finger moves freely if there isn’t anything present, but when it approaches something, it becomes unable to go into the object. By tracing with your finger like this, you can feel that there’s a continuous rough surface there. Also, the jagged feeling of the tooth makes the sensation similar to that of touching an actual object.”
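What is described here resembles penalty-based haptic rendering with a 1 kHz update loop: the fingertip position is sampled every millisecond, and as soon as it crosses a virtual surface, a restoring force proportional to the penetration depth pushes it back out. The Python sketch below illustrates that general idea only; it is not NHK's implementation, and the device functions named in the comments are hypothetical.

```python
import numpy as np

# Assumed surface stiffness (N/m) and update rate; values are illustrative.
STIFFNESS = 800.0
UPDATE_RATE_HZ = 1000   # fingertip position read once per 1/1000 s

def render_force(finger_pos, surface_point, surface_normal, stiffness=STIFFNESS):
    """Penalty force for a single contact point.

    finger_pos:     3D position of the tracked point on the fingertip.
    surface_point:  closest point on the virtual object's surface.
    surface_normal: outward unit normal at that point.
    """
    penetration = np.dot(surface_point - finger_pos, surface_normal)
    if penetration <= 0.0:
        return np.zeros(3)                      # free space: finger moves freely
    return stiffness * penetration * surface_normal  # push back out of the object

# Main loop, one iteration per millisecond (device API is hypothetical):
# while running:
#     pos = read_fingertip_position()
#     f = render_force(pos, closest_surface_point(pos), surface_normal_at(pos))
#     send_force_to_actuators(f)
```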

In previous systems, a single force was applied to the entire fingertip, so when angular parts of an object were presented, the fingertip tended to slip off the virtual object, making it hard to grasp its shape. By generating forces at five separate points, this device makes the corners of an object, in particular, feel natural.
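One plausible way to get this behavior, building on the render_force helper sketched above, is to evaluate contact independently at each of the five stimulus points instead of once for the whole fingertip: an edge or corner then presses only on the points that have actually crossed the surface, which keeps it feeling sharp. The helper below is a hypothetical illustration of that idea.

```python
def five_point_forces(point_positions, closest_surface_point, surface_normal_at):
    """One feedback force per stimulus point (five points in this system).

    point_positions:       list of five 3D positions on the fingertip.
    closest_surface_point,
    surface_normal_at:     hypothetical queries against the virtual object.
    """
    return [
        render_force(p, closest_surface_point(p), surface_normal_at(p))
        for p in point_positions
    ]
```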

“Now, Domo-kun’s body is one object, while the tooth is another object. The texture can be changed for each object. Here, the tooth alone has been made slightly hard. In that case, if we create a different object using CG, we can do this kind of thing.”
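The per-object texture change described in the quote can be modeled, for instance, by giving each virtual object its own material parameters and looking them up whenever a stimulus point touches that object. The values and object names below are purely illustrative.

```python
# Illustrative per-object stiffness values (made up for this sketch).
MATERIALS = {
    "body":  {"stiffness": 300.0},    # soft, compliant body
    "tooth": {"stiffness": 1500.0},   # noticeably harder tooth
}

def force_for_object(obj_name, finger_pos, surface_point, surface_normal):
    """Apply the stiffness assigned to whichever object is being touched."""
    k = MATERIALS[obj_name]["stiffness"]
    return render_force(finger_pos, surface_point, surface_normal, stiffness=k)
```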

“This is slightly different from a texture that’s actually felt by the skin. For instance, if you hold a pen, you can perceive textures through the pen. This system reproduces that kind of sensation.”

“From now on, we’d like to simplify this device. We want to extend the range of movement of the fingers so that curved surfaces are also recognizable, while keeping the advantages of the current version.”





DigInfo TV is a Tokyo-based online video news platform dedicated to producing original coverage of cutting edge technology, research and products from Japan.




