Feel the edges and contours of virtual objects with this haptic system from NHK

03 June 2013




With the aim of creating a television service that lets viewers touch virtual objects, NHK is developing a tactile system that applies stimuli to five points on a single finger, making objects feel more real than earlier systems could.

“This device assists people with visual disabilities through what’s called the tactile, or kinesthetic, sense. It communicates the 3D shape of an object shown on a TV, such as a work of art, or a 2D graph, as a sensation felt by the hand.”

“The system detects spatial position data for the fingertip every 1/1000 of a second. The finger moves freely when there isn’t anything present, but when it approaches something, it becomes unable to push into the object. By tracing with your finger like this, you can feel that there’s a continuous rough surface there. Also, the jagged feel of the tooth makes the sensation similar to that of touching an actual object.”
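NHK’s exact rendering algorithm isn’t described here, but the behavior in the quote, sampling the fingertip every millisecond and stopping it at surfaces, matches the standard penalty-force approach in haptic rendering: in free space no force is applied, and on contact a spring force proportional to penetration depth pushes the finger back out. A minimal sketch in Python, assuming a flat surface at height zero and a hypothetical `device` API (both are illustrative, not NHK’s):

```python
import numpy as np

STIFFNESS = 800.0   # N/m, assumed spring constant for the penalty force
RATE_HZ = 1000      # the 1/1000 s update interval mentioned in the article

def penalty_force(position, surface_height=0.0):
    """Push the fingertip back out when it penetrates the surface.

    In free space the force is zero; inside the object, a spring force
    proportional to penetration depth acts along the surface normal.
    """
    penetration = surface_height - position[2]   # z-axis points out of the surface
    if penetration <= 0.0:
        return np.zeros(3)                       # finger moves freely
    normal = np.array([0.0, 0.0, 1.0])
    return STIFFNESS * penetration * normal

# Hypothetical 1 kHz control loop (the device API is an assumption):
# while True:
#     pos = device.read_fingertip_position()     # sampled every 1 ms
#     device.command_force(penalty_force(pos))
```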

In previous systems, a force was applied to the entire fingertip, so when angular parts of an object were presented, the fingertip tended to deviate from the virtual object’s surface, making its shape hard to understand. By generating a force at five points instead, this device makes even the corners of an object feel natural.
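The article doesn’t detail how the five stimulus points are driven, but the benefit at corners follows naturally if each point is rendered independently: points on opposite sides of an edge see different surface normals, so the finger pad feels the discontinuity rather than one averaged force. A sketch under that assumption, where `nearest_surface` is a hypothetical query, not a published NHK interface:

```python
import numpy as np

STIFFNESS = 800.0  # N/m, same assumed spring constant as above

def five_point_forces(pad_points, nearest_surface):
    """Render an independent penalty force at each of the five pad points.

    `nearest_surface(p)` is assumed to return (penetration_depth,
    outward_normal) for the object surface closest to point p.
    """
    forces = []
    for p in pad_points:
        depth, normal = nearest_surface(p)
        if depth > 0.0:
            # Each point is pushed out along its own local normal, so points
            # on either side of an edge receive different force directions.
            forces.append(STIFFNESS * depth * np.asarray(normal))
        else:
            forces.append(np.zeros(3))  # this point is in free space
    return forces
```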

“Now, Domo-kun’s body is one object, while the tooth is another object. The texture can be changed for each object. Here, the tooth alone has been made slightly hard. In that case, if we create a different object using CG, we can do this kind of thing.”
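Changing the texture per object, as described for Domo-kun’s soft body and hard tooth, amounts to looking up material parameters for whichever object is being touched before computing the contact force. A minimal sketch; the object names and stiffness values are illustrative, not NHK’s:

```python
import numpy as np

# Illustrative per-object materials; a stiffer spring feels harder.
MATERIALS = {
    "body":  {"stiffness": 300.0},   # Domo-kun's body: slightly soft
    "tooth": {"stiffness": 1500.0},  # the tooth alone: slightly hard
}

def contact_force(obj_name, penetration, normal):
    """Scale the penalty force by the touched object's own stiffness."""
    if penetration <= 0.0:
        return np.zeros(3)
    k = MATERIALS[obj_name]["stiffness"]
    return k * penetration * np.asarray(normal)
```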

“This is slightly different from a texture that’s actually felt by the skin. For instance, if you hold a pen, you can perceive textures through the pen. This system reproduces that kind of sensation.”

“From now on, we’d like to simplify this device. We want to extend the range of movement of the fingers so that curved surfaces are also recognizable, while keeping the advantages of the current version.”





DigInfo TV is a Tokyo-based online video news platform dedicated to producing original coverage of cutting-edge technology, research and products from Japan.





