
Capturing and processing camera and sensor data, recognizing various shapes, and determining a set of robotic actions is conceptually easy. Yet when Amazon challenged the industry to perform a selecting and picking task robotically, 28 teams from around the world rose to it.
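Conceptually, the loop behind such a task reduces to three stages: sense, recognize, act. The Python sketch below illustrates that loop in the abstract; every class and function name is a hypothetical placeholder, and the centroid-based "recognition" is a deliberate toy, not the approach used by any of the competing teams.

```python
from dataclasses import dataclass
from typing import List, Tuple

# Hypothetical, simplified stand-ins for a real perception/actuation stack.
@dataclass
class DetectedObject:
    label: str
    centroid: Tuple[float, float, float]  # x, y, z in the robot's frame (metres)

def capture_point_cloud() -> List[Tuple[float, float, float]]:
    """Stand-in for reading a depth camera; returns a few fake 3D points."""
    return [(0.40, 0.02, 0.15), (0.41, 0.03, 0.15), (0.39, 0.01, 0.16)]

def recognize_objects(points: List[Tuple[float, float, float]]) -> List[DetectedObject]:
    """Toy 'recognition': treat the whole cloud as one object at its centroid."""
    n = len(points)
    cx = sum(p[0] for p in points) / n
    cy = sum(p[1] for p in points) / n
    cz = sum(p[2] for p in points) / n
    return [DetectedObject(label="item", centroid=(cx, cy, cz))]

def plan_pick(obj: DetectedObject) -> List[str]:
    """Turn a detection into a purely symbolic pick-and-place action sequence."""
    x, y, z = obj.centroid
    return [
        f"move gripper above ({x:.2f}, {y:.2f}, {z + 0.10:.2f})",
        f"descend to ({x:.2f}, {y:.2f}, {z:.2f})",
        "close gripper",
        "lift and place in tote",
    ]

if __name__ == "__main__":
    points = capture_point_cloud()
    for obj in recognize_objects(points):
        for action in plan_pick(obj):
            print(action)
```

The gap between this conceptual loop and a robot that picks reliably from real, cluttered shelves is exactly what the challenge set out to probe.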
With the aim of implementing a television service that allows viewers to touch virtual objects, NHK is developing a tactile system that applies stimuli to five points on a single finger, making objects feel more real than previous systems did.
“This device assists people with a visual disability, through what’s called the tactile or kinesthetic sense. It communicates a 3D shape or 2D graph of an object shown on a TV, such as a work of art, as a sensation felt by the hand.”
Professor Takashi Kawai’s lab at Waseda University’s School of Fundamental Science and Engineering is researching a cross-modal perception technology based on multi-sensory integration, in which participants perceive tactile sensation from visual stimulation. In a prototype visual-evoked “minute tactile sensation” presentation system, a very simple mechanism lets viewers perceive a tactile sensation simply by watching a video image, despite the absence of any physical contact.