Feel the edges and contours of virtual objects with this haptic system from NHK

03 June 2013




With the aim of implementing a television service that lets viewers touch virtual objects, NHK is developing a tactile system that applies stimuli to five points on a single finger, making objects feel more realistic than previous systems did.

“This device assists people with a visual disability, through what’s called the tactile or kinesthetic sense. It communicates a 3D shape or 2D graph of an object shown on a TV, such as a work of art, as a sensation felt by the hand.”

“The system detects spatial position data for the fingertip in 1/1000 of a second. The finger moves freely if there isn’t anything present, but when it approaches something, it becomes unable to go into the object. By tracing with your finger like this, you can feel that there’s a continuous rough surface there. Also, the jagged feeling of the tooth makes the sensation similar to that of touching an actual object.”
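
The behaviour described in this quote resembles a standard penalty-based haptic rendering loop: the fingertip position is sampled roughly every millisecond, and whenever it penetrates a virtual surface a restoring force proportional to the penetration depth is sent to the actuators. The sketch below illustrates that general idea in Python; the function names, flat-floor surface, and stiffness value are illustrative assumptions, not NHK's actual implementation.

```python
import time

STIFFNESS = 800.0   # N/m, assumed spring constant for the virtual surface
LOOP_DT = 0.001     # 1 ms update period, matching the 1/1000 s sampling quoted above


def penetration_depth(fingertip_z, surface_z=0.0):
    """Depth by which the fingertip has sunk below a flat virtual surface at z = surface_z."""
    return max(0.0, surface_z - fingertip_z)


def haptic_loop(read_fingertip_position, apply_force):
    """Minimal penalty-based rendering loop (illustrative only).

    read_fingertip_position() -> (x, y, z) in metres   -- hypothetical device call
    apply_force((fx, fy, fz))                          -- hypothetical device call
    """
    while True:
        x, y, z = read_fingertip_position()
        depth = penetration_depth(z)
        # Free space: zero force. In contact: spring force pushing the finger back out.
        apply_force((0.0, 0.0, STIFFNESS * depth))
        time.sleep(LOOP_DT)
```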

In previous systems, a force was applied to the entire fingertip, so when angular parts of an object were presented, the fingertip tended to deviate from the virtual object, making its shape hard to perceive. By generating forces at five separate points, the new device makes the corners of an object in particular feel natural, as sketched below.
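
One way to picture the five-point approach is to evaluate the contact force independently at five points spread across the fingertip, so that an edge or corner loads some points while leaving others free. The sketch below assumes a hypothetical collision query `depth_at(point)` and made-up offsets; it is not NHK's implementation.

```python
# Illustrative offsets (in metres) of five stimulation points relative to the fingertip centre.
CONTACT_OFFSETS = [
    (0.000, 0.000, 0.0),   # centre of the finger pad
    (0.004, 0.000, 0.0),   # right edge
    (-0.004, 0.000, 0.0),  # left edge
    (0.000, 0.004, 0.0),   # toward the fingertip
    (0.000, -0.004, 0.0),  # toward the first joint
]


def five_point_forces(fingertip_pos, depth_at, stiffness=800.0):
    """Compute one force per stimulation point.

    depth_at((x, y, z)) -> penetration depth into the nearest virtual surface
    (a hypothetical collision query). Points lying outside the object get zero
    force, so at a corner only some of the five points are loaded -- which is
    what keeps the edge perceptually sharp.
    """
    forces = []
    for ox, oy, oz in CONTACT_OFFSETS:
        p = (fingertip_pos[0] + ox, fingertip_pos[1] + oy, fingertip_pos[2] + oz)
        # Simplified: push back along +z; a real renderer would use the local surface normal.
        forces.append((0.0, 0.0, stiffness * depth_at(p)))
    return forces
```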

“Now, Domo-kun’s body is one object, while the tooth is another object. The texture can be changed for each object. Here, the tooth alone has been made slightly hard. In that case, if we create a different object using CG, we can do this kind of thing.”
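
The per-object "texture" described in the demo can be thought of as material parameters attached to each object in the CG scene and looked up whenever a contact point lands on that object. The table and values below are purely illustrative.

```python
# Illustrative material table: a larger spring constant makes an object feel harder.
MATERIALS = {
    "body": {"stiffness": 400.0},    # Domo-kun's body: relatively soft
    "tooth": {"stiffness": 1500.0},  # the tooth, made "slightly hard" in the demo
}


def contact_force(object_name, depth):
    """Spring force for a contact point, scaled by the touched object's material."""
    return MATERIALS[object_name]["stiffness"] * depth
```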

“This is slightly different from a texture that’s actually felt by the skin. For instance, if you hold a pen, you can perceive textures through the pen. This system reproduces that kind of sensation.”

“From now on, we’d like to simplify this device. We want to extend the range of movement of the fingers so that curved surfaces are also recognizable, while keeping the advantages of the current version.”





DigInfo TV is a Tokyo-based online video news platform dedicated to producing original coverage of cutting edge technology, research and products from Japan.




