Robohub.org podcast, episode 036

Active touch with Tony Prescott and Elio Tuci

09 October 2009

In today’s show we’ll be dabbing at the subject of active touch. Our first guest, Tony Prescott from the University of Sheffield in the UK, has been looking at how rats actively use their whiskers to sense their environment, and at how this insight can be used in robotics or to help understand the brain. Our second guest, Elio Tuci, evolved a robot arm that touches an object and then figures out what it is, a first step towards understanding language in humans.

Tony Prescott

Tony Prescott is Professor of Cognitive Neuroscience at the University of Sheffield, co-director of the University’s Adaptive Behaviour Research Group and Director of the Active Touch Laboratory. Within several large European projects, such as BIOTACT and ICEA, he has been frisking the whiskers of rats to study how they are used to actively interact with the environment and how the signals from these sensors tap into the brain. To test the models he has inferred from high-speed footage of real rats, Prescott has been working with a rat-like robot called SCRATCHbot, developed in collaboration with the Bristol Robotics Lab. SCRATCHbot is equipped with an active 18-whisker array and a non-actuated micro-vibrissae array located on the “nose”. Its head is connected to the body by a 3-degree-of-freedom neck, and the body is driven by three independently steerable motor drive units.

More generally, whiskers have real potential in robotics for their ability to detect and categorize objects and surface textures while only lightly touching the things they interact with. Touch remains a largely untapped sensing modality that could be strapped to robot arms, cleaning robots and maybe even your LEGO robot. To this end, Prescott is looking at creating an off-the-shelf version of the rat’s whisker system.
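To make the idea of whisker-based texture sensing a little more concrete, here is a minimal, hypothetical sketch of how deflection readings from a single artificial whisker could be reduced to a couple of features and matched against known textures. This is not Prescott’s or BIOTACT’s actual processing pipeline; the feature choices, centroid values and function names are assumptions made purely for illustration.

```python
import numpy as np

# Hypothetical illustration: classify a surface texture from the deflection
# signal of a single artificial whisker. Features and thresholds are invented
# for this sketch; they are not taken from SCRATCHbot or BIOTACT.

def whisker_features(deflection: np.ndarray, sample_rate: float) -> np.ndarray:
    """Summarise a whisker deflection time series with two simple features:
    overall vibration energy and dominant vibration frequency."""
    centred = deflection - deflection.mean()
    energy = float(np.mean(centred ** 2))
    spectrum = np.abs(np.fft.rfft(centred))
    freqs = np.fft.rfftfreq(len(centred), d=1.0 / sample_rate)
    dominant = float(freqs[np.argmax(spectrum[1:]) + 1])  # skip the DC bin
    return np.array([energy, dominant])

def nearest_centroid(features: np.ndarray, centroids: dict) -> str:
    """Label a feature vector with the closest pre-computed texture centroid."""
    return min(centroids, key=lambda label: np.linalg.norm(features - centroids[label]))

# Example use with made-up training centroids for "smooth" and "rough" textures.
centroids = {"smooth": np.array([0.01, 20.0]), "rough": np.array([0.08, 90.0])}
signal = np.random.default_rng(0).normal(scale=0.1, size=2000)  # stand-in for sensor data
print(nearest_centroid(whisker_features(signal, sample_rate=1000.0), centroids))
```

A real system such as SCRATCHbot combines signals from many actuated whiskers and far richer processing, but the sketch shows the general flavour of turning contact vibrations into a category label.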

Elio Tuci

Elio Tuci is a researcher at the Institute of Cognitive Sciences and Technologies of the Italian National Research Council and a member of the Laboratory of Autonomous Robotics and Artificial Life. Tuci is currently working on the ITALK project, which studies the various aspects of language and how humans learn to speak. He tells us how active perception is an integral part of how we learn to categorize objects, a necessary prerequisite for developing language. In particular, he speaks about his recent work on a robot arm that evolved to discriminate between shapes such as ellipses and circles using active touch.
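As a rough illustration of the evolutionary flavour of this kind of work, the sketch below evolves the weights of a tiny linear read-out that labels a simulated sequence of contact readings as coming from a circle or an ellipse. The contact model, read-out size and (1+λ) evolution-strategy settings are invented for this example and are not taken from Tuci’s experiments.

```python
import numpy as np

# Toy sketch, in spirit only: evolve a linear read-out that labels a sequence
# of simulated touch readings as "circle" or "ellipse". Everything here
# (contact model, population size, mutation scale) is an assumption.

rng = np.random.default_rng(1)

def simulated_contacts(is_circle: bool, n: int = 16) -> np.ndarray:
    """Fake contact profile: constant curvature for a circle,
    varying curvature for an ellipse, plus sensor noise."""
    t = np.linspace(0.0, 2 * np.pi, n)
    curvature = np.ones(n) if is_circle else 1.0 + 0.5 * np.cos(2 * t)
    return curvature + rng.normal(scale=0.05, size=n)

def fitness(weights: np.ndarray, trials: int = 40) -> float:
    """Fraction of trials in which the read-out labels the shape correctly."""
    correct = 0
    for _ in range(trials):
        is_circle = bool(rng.integers(2))
        output = float(simulated_contacts(is_circle) @ weights[:-1] + weights[-1])
        correct += int((output > 0.0) == is_circle)
    return correct / trials

# (1+λ) evolution strategy: keep the parent unless a mutated child does better.
parent = rng.normal(size=17)  # 16 contact inputs + bias
for _ in range(200):
    children = parent + rng.normal(scale=0.1, size=(8, parent.size))
    parent = max(list(children) + [parent], key=fitness)
print("final discrimination accuracy:", fitness(parent))
```

In Tuci’s actual studies the evolved controller also drives the arm’s movements, so the categorisation emerges from the interaction between motion and sensing rather than from a fixed contact sequence as in this toy example.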

Links:


Latest News:

For videos of Japan’s new Gigantor robot statue, Nissan’s EPORO car robots and Panasonic’s new Power Loader exoskeleton, visit the Robots forum!

View and post comments on this episode in the forum





Podcast team: The ROBOTS Podcast brings you the latest news and views in robotics through its bi-weekly interviews with leaders in the field.






©2021 - ROBOTS Association