See and feel virtual water with this immersive crossmodal perception system from Solidray

by DigInfo TV
07 December 2012




Solidray, a virtual reality production company, has released an immersive crossmodal system that combines visual and tactile feedback, letting users see and feel flowing water in a virtual space.

“When you put on the 3D glasses, the scene appears to be coming towards you. You’re looking at a virtual world created in the computer. The most important thing is, things appear life-sized, so the female character appears life-sized before the user’s eyes. So, it looks as if she is really in front of you. Also, water is flowing out of the 3D scene. When the user takes a cup, and places it against the water, vibration is transmitted to the cup, making it feel as if water is pouring into the cup.”

The glasses carry a magnetic sensor that precisely tracks the user's line of sight in 3D. This lets the system dynamically shift the 3D viewpoint to match the viewing position, so the user can look into the scene from any direction.
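A minimal sketch of what such a head-tracked rendering loop might look like, assuming the magnetic sensor reports the head position once per frame; the sensor-reading function and scene centre below are hypothetical stand-ins, not Solidray's actual API:

import numpy as np

def look_at(eye, target, up=np.array([0.0, 1.0, 0.0])):
    # Build a 4x4 view matrix whose camera sits at `eye` (the tracked head)
    # and looks toward `target` (the centre of the virtual scene).
    forward = target - eye
    forward = forward / np.linalg.norm(forward)
    right = np.cross(forward, up)
    right = right / np.linalg.norm(right)
    true_up = np.cross(right, forward)
    view = np.eye(4)
    view[0, :3], view[1, :3], view[2, :3] = right, true_up, -forward
    view[:3, 3] = -view[:3, :3] @ eye
    return view

def render_frame(read_head_position, scene_center):
    # read_head_position() stands in for the magnetic sensor on the glasses.
    eye = read_head_position()
    view = look_at(eye, scene_center)
    # ... hand `view` (offset per eye) to the stereo renderer ...
    return view

Because the view matrix is rebuilt from the tracked head position every frame, the perspective on the life-sized scene changes as the viewer moves, which is what lets the user peer around the virtual water and character.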

The tactile element uses the TECHTILE toolkit, a haptic recording and playback tool developed by a research group at Keio University. The sensation of water being poured is recorded with a microphone in advance, and the recording is played back as vibration whenever the cup's position overlaps the parabolic arc of the water stream. The cup's position is tracked with an infrared camera.
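The trigger logic can be pictured as a proximity test between the tracked cup and a parabolic jet, followed by playback of the recorded waveform. The sketch below is an illustrative assumption, not the TECHTILE toolkit's actual API; the tracking and playback functions are hypothetical:

import numpy as np

G = 9.81  # gravitational acceleration, m/s^2

def stream_point(t, spout_pos, spout_vel):
    # Position of a water particle t seconds after leaving the spout:
    # straight-line motion plus gravity gives the parabolic arc.
    p = spout_pos + spout_vel * t
    p[1] -= 0.5 * G * t ** 2
    return p

def cup_in_stream(cup_pos, spout_pos, spout_vel, radius=0.05, t_max=1.0, steps=100):
    # True when the cup opening comes within `radius` metres of the arc.
    spout_pos = np.asarray(spout_pos, dtype=float)
    spout_vel = np.asarray(spout_vel, dtype=float)
    for t in np.linspace(0.0, t_max, steps):
        if np.linalg.norm(stream_point(t, spout_pos, spout_vel) - cup_pos) < radius:
            return True
    return False

def haptic_frame(read_cup_position, play_vibration, spout_pos, spout_vel):
    # read_cup_position() stands in for the infrared camera; play_vibration()
    # replays the waveform recorded from real pouring water with a microphone.
    cup_pos = read_cup_position()
    if cup_in_stream(cup_pos, spout_pos, spout_vel):
        play_vibration("pouring_recording.wav")

Replaying a recorded waveform rather than synthesising one is the key design choice here: the vibration carries the fine texture of real pouring water, which is what lets the brain fill in sensations, such as coldness or weight, that are never actually delivered.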

“Here, we’ve added tactile as well as visual sensations. Taking things that far makes other sensations arise in the brain. You can really feel that you’ve gone into a virtual space. All we’re doing is making the cup vibrate, but some users even say it feels cold or heavy.”

“We’re researching how to make users feel sensations that aren’t being delivered. We’d like to use that in promotions. For example, this system uses a cute character. Cute characters are said to be two-dimensional, but they can become three-dimensional. We think it’s more fun to look at a life-sized character than a little figure. So, we think business utilizing that may emerge.”



DigInfo TV is a Tokyo-based online video news platform dedicated to producing original coverage of cutting edge technology, research and products from Japan.




