Nature inspires new generation of robot brains


11 May 2016



Systems based on animal neurons could help robots understand the world. Image credit: CC0

by Rex Merrifield

Animals have evolved sophisticated ways of processing sensory data to make sense of their surroundings. Now, robotics researchers are drawing inspiration from biological processes to improve the way machines handle information, perceive their surroundings, and react to stimuli.

While the human brain is often seen as the ultimate model for robotic intelligence, scientists are also learning plenty from the neurobiological structures and processes of more humble creatures, from fruit flies to rodents.

Take the fruit fly – or rather, the maggot that grows up to be a fruit fly. Drosophila larvae have fewer than 10 000 neurons, compared with about 100 billion in the human brain, yet they display a range of complex orientation and learning behaviours that computational theory cannot at present adequately explain.

By studying how the larvae change their response to stimuli such as smells when these are associated with reward or punishment, the EU-funded MINIMAL project aims to unpick the exact mechanism underlying learning processes.

‘We think it is quite important in any biological system to look at how the way the animal is acting and what it does has an effect on what it senses and what it learns, and how that influences its behaviour the next time,’ said Professor Barbara Webb, professor of biorobotics at the University of Edinburgh, UK, and coordinator of MINIMAL.

‘Animals have to learn while they survive – they can’t separate those two things. For many robot applications it may be equally important that they can learn rapidly while on the job,’ she added.

Using genetic methods that allow specific Drosophila neurons to be tagged, the researchers can watch and manipulate these learning processes in real time, identifying the critical neural circuits involved. The insight this gives into a quite fundamental way of learning is then applied to robotics.
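
MINIMAL’s models are grounded in the larva’s actual circuitry, but the core idea – strengthening an odour-response association when it is paired with reward, and weakening it when paired with punishment – can be sketched with a classic Rescorla-Wagner-style update. The snippet below is an illustrative toy, not the project’s code:

```python
# Minimal sketch of reward-driven associative learning, in the spirit of
# the larval experiments described above. Illustrative toy only -- not
# the MINIMAL project's actual model.

def update_association(weight, reward, learning_rate=0.1):
    """Rescorla-Wagner-style update: nudge the odour->approach weight
    towards the reward actually received on this trial."""
    return weight + learning_rate * (reward - weight)

# An odour repeatedly paired with reward (+1) becomes attractive;
# pairing it with punishment (-1) then reverses the association.
w = 0.0
for _ in range(10):
    w = update_association(w, reward=1.0)
print(f"after reward pairings:     w = {w:.2f}")   # climbs towards +1
for _ in range(10):
    w = update_association(w, reward=-1.0)
print(f"after punishment pairings: w = {w:.2f}")   # swings towards -1
```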

Being able to process sensory data in real time and react quickly is crucial to animals’ survival, and the REAL-TIME ASoC project is drawing on biological processes to develop algorithms that make robots more efficient at perception and search.

Project researcher Dr Francisco Barranco, of the University of Granada in Spain, is working with electronic sensors that only transmit information from pixels that detect changes, a far more efficient process than transmitting the full picture continuously.
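
A rough software sketch of that principle compares successive frames and emits data only for pixels whose brightness changed beyond a threshold. The real devices are asynchronous silicon sensors; everything below is a simplified stand-in for illustration:

```python
import numpy as np

def pixel_events(prev_frame, frame, threshold=15):
    """Emulate an event-based sensor: return (row, col, polarity) tuples
    only for pixels whose brightness changed by more than `threshold`,
    instead of retransmitting the whole frame."""
    diff = frame.astype(np.int16) - prev_frame.astype(np.int16)
    rows, cols = np.nonzero(np.abs(diff) > threshold)
    # polarity: +1 for a pixel that got brighter, -1 for one that got darker
    return [(int(r), int(c), int(np.sign(diff[r, c]))) for r, c in zip(rows, cols)]

# A static scene produces no events; one changed pixel produces one event.
prev = np.zeros((4, 4), dtype=np.uint8)
curr = prev.copy()
curr[2, 1] = 200
print(pixel_events(prev, curr))  # [(2, 1, 1)]
```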

Human eye

The robot vision processor provides a kind of ‘bottom-up’ filter along the lines of the human eye – where information from 100 million sensors in the retina needs to be conveyed to the brain through the optic nerve, which has only about 1 million ‘cables’ to carry the information. Efficient filtering at the retina is common even in animals such as insects.

The system could have many different applications, such as in interactive television, but there is a long way to go to get the technology from the robotics laboratory to a marketable product.

‘We have developed a low-level vision engine for biologically inspired vision sensors,’ said Professor Eduardo Ros, also at the University of Granada, who supervised the EU-funded research. He said the research is also looking at more sophisticated stages of the vision process typically found in mammals, such as motion processing, depth estimation and attention.

‘We want a robot to be able to select the important information,’ he said. ‘It needs to be able to discard all the other irrelevant sensations early on, otherwise it becomes overwhelmed.’

Prof. Ros said that visual perception in mammals is as much about a part of the brain called the cortex as it is about information coming through the eye, a feature they are trying to replicate in robots.

Searching for objects

When searching for an object, such as a moving red car, the robot needs to have a clear idea of what it must find, to limit the possibilities.

So REAL-TIME ASoC gives the robot prior information – such as the target’s shape, size, colour or texture – to work from. It also has to be able to identify how an object may be moving around. By combining these two modes of processing – bottom-up and top-down – the robot is able to find the target objects.
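
A toy version of that combination might multiply a bottom-up motion-saliency map by a top-down map scoring how well each image region matches the target’s known colour, so that only regions strong on both cues stand out. The maps and scoring below are assumptions for illustration, not the project’s pipeline:

```python
import numpy as np

def search_score(motion_map, colour_match_map):
    """Combine a bottom-up cue (where things are moving) with a top-down
    prior (how well each region matches the target's known colour).
    Regions that score highly on both are candidate detections."""
    return motion_map * colour_match_map

# 3x3 grid of image regions: only one region is both moving and red.
motion = np.array([[0.1, 0.0, 0.9],
                   [0.0, 0.2, 0.8],
                   [0.0, 0.0, 0.1]])
redness = np.array([[0.0, 0.1, 0.0],
                    [0.1, 0.0, 0.95],
                    [0.2, 0.1, 0.0]])
score = search_score(motion, redness)
print(np.unravel_index(np.argmax(score), score.shape))  # row 1, col 2 wins
```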

Crucial to successful physical interaction between robots and objects is accurate spatial perception, and the EU-funded GRIDMAP project is drawing on studies of brain anatomy, animal behaviour, and mathematical modelling to engineer a robotic system that can perceive, memorise, and react to changes in its environment in a similar way to rats.

The researchers are studying rats to make computer models of grid cells and place cells – the neurons in specific regions of the mammalian brain that help us to perceive and retain an idea of the space in which we live and move. In 2014, GRIDMAP’s coordinator, Professor Edvard I. Moser, based at the Norwegian University of Science and Technology in Trondheim, was awarded the Nobel Prize in Physiology or Medicine jointly with two other neuroscientists for work they had done previously in identifying this ‘inner GPS’ system in rats.
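
A standard textbook idealisation of a grid cell’s hexagonal firing pattern – not GRIDMAP’s own code – sums three cosine gratings whose wave vectors sit 60 degrees apart, which reproduces the lattice of firing fields recorded in rat entorhinal cortex:

```python
import numpy as np

def grid_cell_rate(x, y, spacing=0.5, orientation=0.0):
    """Idealised grid-cell firing rate at position (x, y): the rectified
    sum of three cosine gratings 60 degrees apart, giving a hexagonal
    lattice of firing fields with the given spacing (in metres)."""
    k = 4 * np.pi / (np.sqrt(3) * spacing)  # wavenumber for that spacing
    rate = 0.0
    for angle in (orientation, orientation + np.pi / 3, orientation + 2 * np.pi / 3):
        rate += np.cos(k * (x * np.cos(angle) + y * np.sin(angle)))
    return max(float(rate), 0.0)  # firing rates cannot be negative

print(grid_cell_rate(0.0, 0.0))    # peak of a firing field: 3.0
print(grid_cell_rate(0.25, 0.25))  # off-peak position: lower rate
```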

The GRIDMAP researchers now want to understand how these cells form and build up a spatial representation, from single cells to regular, observable patterns of activity, said Professor Jörg Conradt, professor of neuroengineering at the Technical University of Munich, Germany, who helps coordinate the project.

While the group in Trondheim is observing how the rat brain’s spatial representation changes, readjusts and adapts as its immediate environment is made more and more complicated, Prof. Conradt and his colleagues in Munich have built robots that learn to navigate like rats.

‘In the robots experiment, they explore the arena, they observe and they build a representation, so the software running on our computers builds spatial representations just as we believe biology builds up spatial representations over time,’ Prof. Conradt said.

Using cameras and computer controls, the robots learn to find their way through changing mazes of increasing complexity.

‘So it is a rat’s world, adapted for the robot’s senses,’ Prof. Conradt said.
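
In software terms, ‘building a representation’ while exploring can be as simple as an occupancy grid that fills in as maze cells are visited or found to be blocked. This is a minimal illustrative stand-in, not the Munich group’s actual software:

```python
import numpy as np

UNKNOWN, FREE, WALL = -1, 0, 1

class OccupancyGrid:
    """Toy spatial memory: a grid of maze cells marked unknown, free or
    blocked, filled in as the robot explores."""

    def __init__(self, size):
        self.grid = np.full((size, size), UNKNOWN)

    def observe(self, row, col, blocked):
        # Record one sensor reading about cell (row, col).
        self.grid[row, col] = WALL if blocked else FREE

    def explored_fraction(self):
        return float(np.mean(self.grid != UNKNOWN))

arena = OccupancyGrid(size=8)
arena.observe(0, 0, blocked=False)  # the robot's starting cell is free
arena.observe(0, 1, blocked=True)   # a wall it detected next door
print(f"{arena.explored_fraction():.1%} of the arena mapped so far")
```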

Understanding the fundamental principles of grid cells and place cells could help in building new systems that generate their own software – machines that can programme themselves.





Horizon Magazine brings you the latest news and features about thought-provoking science and innovative research projects funded by the EU.









