Nature inspires new generation of robot brains


11 May 2016



Systems based on animal neurons could help robots understand the world. Image credit: CC0

by Rex Merrifield

Animals have evolved sophisticated ways of processing sensory data to make sense of their surroundings. Now, robotics researchers are drawing inspiration from biological processes to improve the way machines handle information, perceive their surroundings, and react to stimuli.

While the human brain is often seen as the ultimate model for robotic intelligence, scientists are also learning plenty from the neurobiological structures and processes of more humble creatures, from fruit flies to rodents.

Take the fruit fly – or rather, the maggot that grows up to be a fruit fly. Drosophila fruit fly larvae have fewer than 10 000 neurons – compared to about 100 billion in the human brain. But they display a range of complex orientation and learning behaviours that computational theory does not adequately explain at present.

By studying how the larvae change their response to stimuli such as smells when these are associated with reward or punishment, the EU-funded MINIMAL project aims to unpick the exact mechanism underlying learning processes.

‘We think it is quite important in any biological system to look at how the way the animal is acting and what it does has an effect on what it senses and what it learns, and how that influences its behaviour the next time,’ said Professor Barbara Webb, professor of biorobotics at the University of Edinburgh, UK, and coordinator of MINIMAL.

‘Animals have to learn while they survive – they can’t separate those two things. For many robot applications it may be equally important that they can learn rapidly while on the job,’ she added.

Using genetic methods that allow specific Drosophila neurons to be tagged, the researchers can watch and manipulate these processes in real time, identifying the critical neural circuits involved and gaining insight into a fundamental form of learning, which they are then applying to robotics.
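
To make the learning loop concrete, here is a minimal sketch, in Python, of the kind of value-updating rule this style of associative learning suggests: the value an animal assigns to a stimulus is nudged toward the outcome that followed it. The learning rate and outcome values are illustrative assumptions, not parameters from the MINIMAL project.

```python
# Minimal sketch of reward-based associative learning of the kind the
# article describes: an odour's learned value is updated each time it is
# paired with reward or punishment. This is a generic Rescorla-Wagner-style
# rule with made-up parameters, not MINIMAL's actual model.

def update_value(value: float, outcome: float, learning_rate: float = 0.2) -> float:
    """Nudge the stored value for a stimulus toward the outcome it predicted."""
    return value + learning_rate * (outcome - value)

odour_value = 0.0  # naive animal: the odour predicts nothing yet
for _ in range(10):
    odour_value = update_value(odour_value, outcome=1.0)   # odour paired with reward
print(f"value after reward training: {odour_value:.2f}")   # approaches 1.0

for _ in range(5):
    odour_value = update_value(odour_value, outcome=-1.0)  # same odour now punished
print(f"value after punishment: {odour_value:.2f}")        # swings negative
```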

Being able to process sensory data in real time and react quickly is crucial to animals’ survival, and the REAL-TIME ASoC project is drawing on biological processes to develop algorithms that make robots more efficient at perception and search.

Project researcher Dr Francisco Barranco, of the University of Granada in Spain, is working with electronic sensors that only transmit information from pixels that detect changes, a far more efficient process than transmitting the full picture continuously.
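
As a rough software analogue of that principle, the sketch below compares two frames and emits only the pixels whose brightness changed beyond a threshold, the way a change-driven sensor would. The threshold and frame size are made-up values, not the specifications of the actual sensor hardware.

```python
import numpy as np

# Toy emulation of a change-driven sensor: instead of transmitting every
# pixel of every frame, emit only pixels whose brightness changed beyond a
# threshold. Threshold and frame size are illustrative, not hardware specs.

def events_from_frames(prev: np.ndarray, curr: np.ndarray, threshold: int = 10):
    """Return (row, col, polarity) for pixels that changed significantly."""
    diff = curr.astype(np.int16) - prev.astype(np.int16)
    rows, cols = np.nonzero(np.abs(diff) > threshold)
    polarity = np.sign(diff[rows, cols])  # +1 got brighter, -1 got darker
    return list(zip(rows.tolist(), cols.tolist(), polarity.tolist()))

prev = np.zeros((4, 4), dtype=np.uint8)
curr = prev.copy()
curr[1, 2] = 200  # a single pixel brightens between frames
print(events_from_frames(prev, curr))  # [(1, 2, 1)]: one event instead of 16 pixels
```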

Human eye

The robot vision processor provides a kind of ‘bottom-up’ filter along the lines of the human eye – where information from 100 million sensors in the retina needs to be conveyed to the brain through the optic nerve, which has only about 1 million ‘cables’ to carry the information. Efficient filtering at the retina is common even in animals such as insects.

The system could have many different applications, such as in interactive television, but there is a long way to go to get the technology from the robotics laboratory to a marketable product.

‘We have developed a low-level vision engine for biologically inspired vision sensors,’ said Professor Eduardo Ros, also at the University of Granada, who supervised the EU-funded research. He said the research is also examining more sophisticated stages of the vision process typically found in mammals, such as motion processing, depth estimation and attention.

‘We want a robot to be able to select the important information,’ he said. ‘It needs to be able to discard all the other irrelevant sensations early on, otherwise it becomes overwhelmed.’

Prof. Ros said that visual perception in mammals is as much about a part of the brain called the cortex as it is about information coming through the eye – a feature the researchers are trying to replicate in robots.

Searching for objects

When searching for an object, such as a moving red car, the robot needs to have a clear idea of what it must find, to limit the possibilities.

So REAL-TIME ASoC gives the robot prior information, such as the shape, size, colour or texture that it can work from. It also has to be able to identify how an object may be moving around. By combining these two modes of processing – bottom-up and top-down – the robot is able to find the target objects.
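
The sketch below shows that combination schematically: bottom-up processing proposes candidate regions, and the top-down prior discards those that cannot be the target. The fields and thresholds are illustrative placeholders, not the project’s actual pipeline.

```python
from dataclasses import dataclass

# Schematic of combining bottom-up candidates (regions that moved or stand
# out) with a top-down prior (what we were told to look for). All fields
# and thresholds are illustrative, not REAL-TIME ASoC's actual pipeline.

@dataclass
class Candidate:
    colour: str
    size: float   # apparent area in pixels
    moving: bool

def matches_prior(c: Candidate, target_colour: str, min_size: float) -> bool:
    """Top-down filter: keep only candidates consistent with the target."""
    return c.moving and c.colour == target_colour and c.size >= min_size

# A bottom-up stage (e.g. change detection) would supply these candidates:
candidates = [
    Candidate("red", 120.0, True),   # plausible moving red car
    Candidate("red", 4.0, True),     # too small: a tail light, not a car
    Candidate("blue", 150.0, True),  # wrong colour
]
print([c for c in candidates if matches_prior(c, "red", min_size=50.0)])
```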

Crucial to successful physical interaction between robots and objects is accurate spatial perception, and the EU-funded GRIDMAP project is drawing on studies of brain anatomy, animal behaviour, and mathematical modelling to engineer a robotic system that can perceive, memorise, and react to changes in its environment in a similar way to rats.

The researchers are studying rats to make computer models of grid cells and place cells – the neurons in specific regions of the mammal brain that help us to perceive and retain an idea of the space in which we live and move. In 2014, GRIDMAP’s coordinator, Professor Edvard I. Moser, based at the Norwegian University of Science and Technology in Trondheim, was awarded the Nobel Prize in Physiology or Medicine jointly with two other neuroscientists for work they had done previously in identifying this ‘inner GPS’ system in rats.
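
A common textbook idealisation of a grid cell’s firing map, used in many computational models of this system, sums three cosine gratings at 60-degree offsets to produce a hexagonal lattice of firing fields. The sketch below implements that idealisation; the spacing and orientation parameters are illustrative, not values fitted to the Trondheim recordings.

```python
import numpy as np

# Idealised grid-cell firing map: three cosine gratings at 60-degree
# offsets interfere to form a hexagonal lattice of firing fields.
# Spacing and orientation are illustrative parameters.

def grid_cell_rate(x, y, spacing=0.5, orientation=0.0):
    """Firing rate of an idealised grid cell at location (x, y), in metres."""
    k = 4 * np.pi / (np.sqrt(3) * spacing)  # wave number giving the field spacing
    angles = orientation + np.array([0.0, np.pi / 3, 2 * np.pi / 3])
    rate = sum(np.cos(k * (x * np.cos(a) + y * np.sin(a))) for a in angles)
    return np.maximum(rate, 0.0)  # rectify: firing rates cannot be negative

# Sample the map over a 1 m x 1 m arena; the peaks form a hexagonal grid.
xs, ys = np.meshgrid(np.linspace(0, 1, 100), np.linspace(0, 1, 100))
rates = grid_cell_rate(xs, ys)
print(rates.shape, float(rates.max()))  # (100, 100) and a peak rate of 3.0
```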

The GRIDMAP researchers now want to understand how these cells form and build up a spatial representation, starting from single cells and building to regular, observable patterns of activity, said Professor Jörg Conradt, professor of neuroengineering at the Technical University of Munich, Germany, who helps coordinate the project.

While the group in Trondheim is observing how the rat brain’s spatial representation changes, readjusts and adapts as its immediate environment is made more and more complicated, Prof. Conradt and his colleagues in Munich have built robots that learn to navigate like rats.

‘In the robot experiments, they explore the arena, they observe and they build a representation, so the software running on our computers builds spatial representations just as we believe biology builds them up over time,’ Prof. Conradt said.

By using cameras and computer controls, the robots learn to find their way through changing mazes as these are made more and more complicated.

‘So it is a rat’s world, adapted for the robot’s senses,’ Prof. Conradt said.
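
A toy occupancy grid gives a flavour of what building a spatial representation while exploring can mean in software: cells the robot drives through are marked free, and a collision marks a wall. The grid size and update scheme are illustrative assumptions, not the Munich group’s actual mapping system.

```python
import numpy as np

# Toy occupancy grid built up during exploration: 0.5 = unknown,
# 0 = observed free, 1 = observed wall. Sizes and the update scheme are
# illustrative, not the Munich robots' mapping software.

grid = np.full((8, 8), 0.5)

def observe(cell, blocked: bool):
    """Record one observation made while exploring."""
    row, col = cell
    grid[row, col] = 1.0 if blocked else 0.0

# A short exploration run: the robot drives along a corridor, then bumps a wall.
for cell in [(0, 0), (0, 1), (0, 2)]:
    observe(cell, blocked=False)
observe((0, 3), blocked=True)
print(grid[0, :5])  # [0.  0.  0.  1.  0.5] - mapped corridor, wall, then unknown
```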

Understanding the fundamental principles behind grid cells and place cells could help in constructing new systems capable of generating their own software – machines that can program themselves.





Horizon Magazine brings you the latest news and features about thought-provoking science and innovative research projects funded by the EU.






 
