Robohub Podcast, episode 036

Active touch with Tony Prescott and Elio Tuci


09 October 2009




In today’s show we’ll be dabbing at the subject of active touch. Our first guest, Tony Prescott from the University of Sheffield in the UK, has been looking at how rats actively use their whiskers to sense their environment, and at how this can inform robotics and our understanding of the brain. Our second guest, Elio Tuci, evolved a controller for a robot arm that touches an object and then works out what the object is, as a first step towards understanding language in humans.

Tony Prescott

Tony Prescott is Professor of Cognitive Neuroscience at the University of Sheffield, co-director of the University’s Adaptive Behaviour Research Group and Director of the Active Touch Laboratory. Within several large European projects, such as BIOTACT and ICEA, he’s been frisking the whiskers of rats to study how rats use them to actively explore their environment and how the signals from these sensors feed into the brain. To test the models he has inferred from high-speed footage of real rats, Prescott has been working with a rat-like robot called SCRATCHbot, developed in collaboration with the Bristol Robotics Lab. SCRATCHbot is equipped with an actively swept 18-whisker array and a non-actuated micro-vibrissae array located on the “nose”. Its head is connected to the body by a 3-degree-of-freedom neck, and the body is driven by three independently steerable motor drive units.
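To make that hardware layout concrete, here is a minimal configuration sketch in Python. It only encodes the numbers given above (18 actuated macro-whiskers, a passive micro-vibrissae array on the nose, a 3-degree-of-freedom neck, three drive units); all names are hypothetical, and the micro-vibrissae count is an assumption, since it is not stated here.

from dataclasses import dataclass, field
from typing import List

@dataclass
class Whisker:
    name: str
    actuated: bool  # macro-whiskers sweep actively; nose vibrissae do not

@dataclass
class ScratchbotConfig:
    # 18 actively swept macro-whiskers, as on the real robot
    macro_whiskers: List[Whisker] = field(default_factory=lambda: [
        Whisker(f"macro_{i}", actuated=True) for i in range(18)])
    # Passive micro-vibrissae on the "nose" (count here is an assumption)
    micro_vibrissae: List[Whisker] = field(default_factory=lambda: [
        Whisker(f"micro_{i}", actuated=False) for i in range(6)])
    neck_dof: int = 3     # degrees of freedom connecting head to body
    drive_units: int = 3  # independently steerable motor drive units

cfg = ScratchbotConfig()
assert len(cfg.macro_whiskers) == 18 and cfg.neck_dof == 3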

More generally, whiskers have real potential in robotics for their ability to detect and categorize objects and surface textures while only lightly touching whatever they interact with. Touch is still a largely untapped sensing modality that could be strapped to robot arms, cleaning robots and maybe even your LEGO robot. With this in mind, Prescott is looking at creating an off-the-shelf version of the rat’s whisker system.
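As an illustration of what such light-touch texture categorization can look like in software, here is a hedged sketch in plain Python. It is not Prescott’s actual pipeline: it reduces one whisker deflection trace to two crude vibration features and picks the nearest texture centroid; the feature choices, labels and centroid values are all assumptions made for the example.

import math

def texture_features(deflections):
    # RMS amplitude plus mean absolute sample-to-sample change ("roughness")
    n = len(deflections)
    rms = math.sqrt(sum(d * d for d in deflections) / n)
    roughness = sum(abs(b - a) for a, b in zip(deflections, deflections[1:])) / (n - 1)
    return (rms, roughness)

# Hypothetical per-texture centroids, e.g. learned from labelled whisks
CENTROIDS = {"smooth": (0.02, 0.001), "rough": (0.05, 0.01)}

def classify(deflections):
    # Nearest-centroid label for a single whisk past a surface
    f = texture_features(deflections)
    return min(CENTROIDS,
               key=lambda t: sum((a - b) ** 2 for a, b in zip(f, CENTROIDS[t])))

print(classify([0.04 * (1 if i % 2 else -1) for i in range(100)]))  # "rough"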

Elio Tuci

Elio Tuci is a researcher at the Institute of Cognitive Sciences and Technologies of the Italian National Research Council and a member of the Laboratory of Autonomous Robotics and Artificial Life. Tuci is currently working on the ITALK project, which studies various aspects of language and how humans learn to speak. He tells us how active perception is an integral part of how we learn to categorize objects, a necessary prerequisite for developing language. In particular, he discusses his recent work on a robot arm whose controller was evolved to discriminate between objects such as ellipses and circles using active touch.
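To make the “evolved” part concrete, here is a minimal evolutionary-algorithm sketch in Python. This is a generic evolutionary robotics loop rather than Tuci’s actual experiment: a population of controller weight vectors is ranked by a fitness function that, in the real setup, would run the simulated arm and reward correct ellipse-versus-circle judgements; the toy surrogate objective and all parameters below are assumptions.

import random

random.seed(0)
N_WEIGHTS = 20      # size of one controller's weight vector (assumption)
POP, GENS = 30, 50  # population size and generation count (assumptions)

def fitness(weights):
    # Stand-in objective; the real one would score discrimination accuracy
    return -sum((w - 0.5) ** 2 for w in weights)

def mutate(weights, sigma=0.1):
    return [w + random.gauss(0, sigma) for w in weights]

population = [[random.uniform(-1, 1) for _ in range(N_WEIGHTS)]
              for _ in range(POP)]
for gen in range(GENS):
    population.sort(key=fitness, reverse=True)
    elite = population[: POP // 5]  # keep the best fifth unchanged
    population = elite + [mutate(random.choice(elite))
                          for _ in range(POP - len(elite))]

population.sort(key=fitness, reverse=True)
print("best fitness:", round(fitness(population[0]), 4))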



Latest News:

For videos of Japan’s new Gigantor robot statue, Nissan’s EPORO car robots and Panasonic’s new Power Loader exoskeleton, visit the Robots forum!

View and post comments on this episode in the forum.





Podcast team: The ROBOTS Podcast brings you the latest news and views in robotics through its bi-weekly interviews with leaders in the field.
