
Autonomous robot evolution: from cradle to grave

by Alan Winfield
31 July 2018




A few weeks ago we had the kick-off meeting, in York, of our new four-year EPSRC-funded project Autonomous Robot Evolution (ARE): cradle to grave. We – Andy Tyrrell and Jon Timmis (York), Emma Hart (Edinburgh Napier), Gusti Eiben (Free University of Amsterdam) and myself – are all super excited. We’ve been trying to win support for this project for five years or so, and have only now succeeded. This is a project that we’ve been thinking and writing about for a long time – so to have the opportunity to try out our ideas for real is wonderful.

In ARE we aim to investigate the artificial evolution of robots for unknown or extreme environments. In a radical new approach we will co-evolve robot bodies and brains in real time and real space. Using techniques from 3D printing, new robot designs will literally be printed, before being trained in a nursery, then fitness tested in a target environment (a mock nuclear plant). The genomes of the fittest robots will then be combined to create the next generation of ‘child’ robots, so that – over successive generations – we will breed new robot designs in a process that mirrors the way farmers have artificially selected new varieties of plants and animals for thousands of years. Because evolving real robots is slow and resource-hungry, we will run a parallel process of simulated evolution in a virtual environment, in which the real-world environment is used to calibrate the virtual world and reduce the reality gap*. A hybrid real-virtual process under the control of an ecosystem manager will allow real and virtual robots to mate, and the child robots to be printed and tested in either the virtual or real environments.
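To make the generational process concrete, here is a minimal sketch of the kind of select-and-mate loop described above. The function names (fitness_fn, crossover, mutate) and the truncation-style selection are illustrative assumptions for the sketch, not the project's actual algorithm.

```python
import random

def evolve(population, n_generations, fitness_fn, crossover, mutate):
    """Illustrative generational loop; all names here are assumptions,
    not ARE's real implementation. Assumes a population of at least two."""
    for _ in range(n_generations):
        # Fitness-test every genome (in ARE: train the printed robot in
        # the nursery, then test it in the mock nuclear plant).
        scored = sorted(population, key=fitness_fn, reverse=True)

        # Keep the fittest half as parents for the next generation.
        parents = scored[: max(2, len(scored) // 2)]

        # Mate pairs of parents, then mutate, to create 'child' genomes.
        children = []
        while len(children) < len(population):
            mum, dad = random.sample(parents, 2)
            children.append(mutate(crossover(mum, dad)))
        population = children
    return population
```

In ARE the same loop would span both worlds: a genome's fitness could be evaluated either on a printed robot or in simulation, with the ecosystem manager deciding which.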

The project will be divided into five work packages, each led by a different partner: WP1 Evolution (York), WP2 Physical Environment (UWE), WP3 Virtual Environment (York), WP4 Ecosystem Manager (Napier) and WP5 Integration and Demonstration (UWE).

Here in the Bristol Robotics Lab we will focus on work packages 2 and 5. The goal of WP2 is the development of a purpose-designed 3D printing system – which we call a birth clinic – capable of printing small mobile robots, according to a specification determined by a genome designed in WP1. The birth clinic will need to pick and place a number of pre-designed and fabricated electronic, sensing and actuation modules (the robot’s ‘organs’) into the printing work area, which will then be over-printed with hot plastic to form the complete robot. The goal of WP5 will be to integrate all components, including the real-world birth clinic, nursery, and mock nuclear environment with the virtual environment (WP3) and the ecosystem manager (WP4), into a working demonstrator, and to undertake evaluation and analysis.
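As a rough illustration of what a printable specification might contain, the sketch below pairs an evolved body shape with a list of organs, each with a position and orientation for the pick-and-place stage. The field names and types are hypothetical, not the WP1 genome design.

```python
from dataclasses import dataclass

@dataclass
class Organ:
    """A pre-fabricated module for the birth clinic to pick and place.
    All fields are hypothetical, for illustration only."""
    kind: str                             # e.g. 'wheel motor', 'IR sensor'
    position: tuple[float, float, float]  # placement in the print area (mm)
    orientation: float                    # rotation about the vertical axis (degrees)

@dataclass
class RobotGenome:
    """An evolved robot design, as the birth clinic might consume it."""
    body_mesh_id: str    # identifier of the evolved body shape to print
    organs: list[Organ]  # which modules to embed, where, and how oriented
```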

You can see an impression of what the birth clinic might look like above.

One of the most interesting aspects of the project is that we have no idea what the robots we breed will look like. The evolutionary process could come up with almost any body shape and structure (morphology). The same process will also determine which and how many organs (sensors, actuators, etc.) are selected, and their positions and orientations within the body. Our evolved robot bodies could be very surprising indeed.

And who knows – maybe we can take a step towards Walterian Creatures?


*Anyone who uses simulation as a tool to develop robots is well aware that a robot design which appears to work perfectly well in a simulated virtual world often doesn’t work very well at all when tested on the real robot. This problem is especially acute when we are artificially evolving those robots. The reason is that the simulation’s model of the real world, and of the robot(s) within it, is only an approximation. The Reality Gap refers to this less-than-perfect fidelity of the simulation; a better (higher-fidelity) simulator would reduce the reality gap.
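One simple way to use real-world measurements to calibrate a simulator, in the spirit described above, is to search for the simulator parameters that minimise the discrepancy between simulated and measured behaviour. A minimal sketch, in which simulate, candidate_params and the measurement format are all assumptions:

```python
def calibrate(simulate, real_measurements, candidate_params):
    """Pick the simulator parameters whose predictions best match
    real-world measurements, shrinking the reality gap.
    All names here are illustrative assumptions."""
    def gap(params):
        predicted = simulate(params)  # e.g. a list of sensor readings
        return sum((p - r) ** 2 for p, r in zip(predicted, real_measurements))

    return min(candidate_params, key=gap)  # smallest squared-error gap
```

Real calibration pipelines are more sophisticated than this grid search, but the principle is the same: data from the physical environment constrains the virtual one.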

Related materials:
Article in de Volkskrant (in Dutch): De robotevolutie kan beginnen. Hoe? Moeder Natuur vervangen door virtuele kraamkamer (The robot evolution can begin. How? By replacing Mother Nature with a virtual nursery), May 2018.
Eiben and Smith (2015) From evolutionary computing to the evolution of things, Nature.
Winfield and Timmis (2015) Evolvable Robot Hardware, in Evolvable Hardware, Springer.
Eiben et al. (2013) The Triangle of Life, European Conference on Artificial Life (ECAL 2013).




Alan Winfield is Professor in robotics at UWE Bristol. He communicates about science on his personal blog.




