
Video: Giving robots and prostheses a sense of touch

April 7, 2015
Video: Haptic exploration of fingertip-sized geometric features using a multimodal tactile sensor (YouTube). The UCLA Biomechatronics Lab develops a language of touch that can be "felt" by computers and humans alike.

Research engineers and students in the University of California, Los Angeles (UCLA) Biomechatronics Lab are designing artificial limbs to be more sensational, with the emphasis on sensation.

With support from the National Science Foundation (NSF), the team, led by mechanical engineer Veronica J. Santos, is constructing a language of touch that both a computer and a human can understand. The researchers are quantifying this with mechanical touch sensors that interact with objects of various shapes, sizes and textures. Using an array of instrumentation, Santos’ team is able to translate that interaction into data a computer can understand.

The data is used to create a formula, or algorithm, that lets the computer recognize patterns, comparing something it has never felt before against the items in its library of prior experiences. This research will help the team develop artificial haptic intelligence, which is, essentially, giving robots, as well as prostheses, the "human touch."
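To make the idea concrete, here is a minimal sketch of that comparison step. It is not the UCLA lab's actual method; the feature vectors, object labels, and novelty threshold are all hypothetical stand-ins for whatever the tactile sensors actually measure (for example, contact pressure or vibration energy).

```python
import numpy as np

# Hypothetical library of previously "felt" objects: label -> tactile feature vector.
library = {
    "smooth_sphere": np.array([0.2, 0.1, 0.8]),
    "ridged_cube":   np.array([0.7, 0.6, 0.3]),
    "soft_foam":     np.array([0.1, 0.4, 0.2]),
}

NOVELTY_THRESHOLD = 0.3  # assumed cutoff; larger distances mean "never felt before"

def identify(features: np.ndarray) -> str:
    """Return the closest known object, or 'unknown' if nothing is similar enough."""
    best_label, best_dist = None, float("inf")
    for label, known in library.items():
        dist = np.linalg.norm(features - known)
        if dist < best_dist:
            best_label, best_dist = label, dist
    return best_label if best_dist <= NOVELTY_THRESHOLD else "unknown"

# A reading close to the ridged cube, and one unlike anything in the library.
print(identify(np.array([0.68, 0.58, 0.32])))  # -> "ridged_cube"
print(identify(np.array([0.9, 0.9, 0.9])))     # -> "unknown"
```

The sketch uses a simple nearest-neighbor comparison with a distance cutoff; the real system would rely on far richer sensor data and learned models, but the basic pattern, match against known experiences or flag as novel, is the same.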

Guest author: The National Science Foundation (NSF) is an independent US federal agency created to promote the progress of science.
