Robohub.org
 

Autonomous robot evolution: from cradle to grave


by Alan Winfield
31 July 2018




A few weeks ago we had the kick-off meeting, in York, of our new four-year EPSRC-funded project Autonomous Robot Evolution (ARE): cradle to grave. We – Andy Tyrrell and Jon Timmis (York), Emma Hart (Edinburgh Napier), Gusti Eiben (Free University of Amsterdam) and myself – are all super excited. We’ve been trying to win support for this project for five years or so, and have only now succeeded. This is a project that we’ve been thinking, and writing, about for a long time – so to have the opportunity to try out our ideas for real is wonderful.

In ARE we aim to investigate the artificial evolution of robots for unknown or extreme environments. In a radical new approach we will co-evolve robot bodies and brains in real time and real space. Using 3D printing techniques, new robot designs will literally be printed, before being trained in a nursery and then fitness-tested in a target environment (a mock nuclear plant). The genomes of the fittest robots will then be combined to create the next generation of ‘child’ robots, so that – over successive generations – we will breed new robot designs in a process that mirrors the way farmers have artificially selected new varieties of plants and animals for thousands of years. Because evolving real robots is slow and resource-hungry, we will run a parallel process of simulated evolution in a virtual environment, in which the real-world environment is used to calibrate the virtual world and reduce the reality gap*. A hybrid real-virtual process under the control of an ecosystem manager will allow real and virtual robots to mate, and the child robots to be printed and tested in either the virtual or real environments.
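The generational loop described above – test fitness, select the fittest, combine parent genomes, mutate – can be sketched in a few lines. This is a minimal illustration only, assuming a genome is simply a list of floats and a stand-in fitness function; all names and parameters here are hypothetical, not the ARE project's actual implementation.

```python
import random

POP_SIZE = 10       # illustrative population size
GENOME_LEN = 8      # illustrative genome length
MUTATION_RATE = 0.1

def random_genome():
    return [random.uniform(-1, 1) for _ in range(GENOME_LEN)]

def evaluate_fitness(genome):
    # Stand-in for printing a robot and testing it in the target
    # environment; here we just score closeness to an arbitrary target.
    return -sum((g - 0.5) ** 2 for g in genome)

def crossover(a, b):
    # Combine two parent genomes at a random cut point.
    cut = random.randrange(1, GENOME_LEN)
    return a[:cut] + b[cut:]

def mutate(genome):
    # Perturb each gene with small probability.
    return [g + random.gauss(0, 0.1) if random.random() < MUTATION_RATE else g
            for g in genome]

population = [random_genome() for _ in range(POP_SIZE)]
for generation in range(20):
    scored = sorted(population, key=evaluate_fitness, reverse=True)
    parents = scored[:POP_SIZE // 2]   # select the fittest half
    population = [mutate(crossover(random.choice(parents),
                                   random.choice(parents)))
                  for _ in range(POP_SIZE)]
```

In ARE the fitness evaluation would of course involve physically printing and testing a robot (or its virtual twin), which is exactly why the hybrid real-virtual process matters.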

The project will be divided into five work packages, each led by a different partner: WP1 Evolution (York), WP2 Physical Environment (UWE), WP3 Virtual Environment (York), WP4 Ecosystem Manager (Napier) and WP5 Integration and Demonstration (UWE).

Here in the Bristol Robotics Lab we will focus on work packages 2 and 5. The goal of WP2 is the development of a purpose-designed 3D printing system – which we call a birth clinic – capable of printing small mobile robots according to a specification determined by a genome designed in WP1. The birth clinic will need to pick and place a number of pre-designed and fabricated electronics, sensing and actuation modules (the robot’s ‘organs’) into the printing work area, which will then be overprinted with hot plastic to form the complete robot. The goal of WP5 will be to integrate all of the components – the real-world birth clinic, nursery and mock nuclear environment, the virtual environment (WP3) and the ecosystem manager (WP4) – into a working demonstrator, then undertake evaluation and analysis.

You can see an impression of what the birth clinic might look like above.

One of the most interesting aspects of the project is that we have no idea what the robots we breed will look like. The evolutionary process could come up with almost any body shape and structure (morphology). The same process will also determine which and how many organs (sensors, actuators, etc) are selected, and their positions and orientation within the body. Our evolved robot bodies could be very surprising indeed.
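To make the idea of a genome that encodes both morphology and organ placement concrete, here is one hypothetical encoding: body-shape parameters plus a list of organs, each with a position and orientation inside the printed body. The field names and structure are my own illustrative assumptions, not the ARE project's genome format.

```python
from dataclasses import dataclass, field

@dataclass
class Organ:
    kind: str            # e.g. "wheel", "camera", "battery"
    position: tuple      # (x, y, z) placement within the printed body
    orientation: tuple   # (roll, pitch, yaw)

@dataclass
class RobotGenome:
    body_shape_params: list                 # parameters shaping the printed body
    organs: list = field(default_factory=list)

# An example individual: evolution decides both the body parameters and
# which organs appear, how many, and where they sit.
genome = RobotGenome(
    body_shape_params=[0.3, 1.2, 0.8],
    organs=[Organ("wheel", (0.1, 0.0, 0.0), (0.0, 0.0, 0.0)),
            Organ("camera", (0.0, 0.05, 0.1), (0.0, 0.3, 0.0))],
)
```

Because the organ list itself is evolvable, two genomes need not have the same length or structure – which is precisely why the evolved bodies could be so surprising.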

And who knows – maybe we can take a step towards Walterian Creatures?


*Anyone who uses simulation as a tool to develop robots is well aware that designs which appear to work perfectly well in a simulated virtual world often don’t work well at all when tested on the real robot. This problem is especially acute when we are artificially evolving those robots. These problems arise because the simulation’s model of the real world, and of the robot(s) in it, is an approximation. The Reality Gap refers to this less-than-perfect fidelity of the simulation; a better (higher-fidelity) simulator would reduce the reality gap.
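Calibrating the virtual world against real-world measurements, as the project proposes, can be illustrated with a toy example: fit a single simulator parameter (here, a friction coefficient) so that simulated travel distances match those measured on a real robot. The numbers and the stand-in simulator are purely illustrative assumptions.

```python
# Distances (in metres) measured on the real robot over repeated trials.
real_distances = [0.92, 0.88, 0.95]

def simulate_distance(friction):
    # Stand-in simulator: travelled distance shrinks as friction grows.
    return 1.2 / (1.0 + friction)

# Grid-search the friction value whose simulated distance best matches
# the mean of the real measurements.
target = sum(real_distances) / len(real_distances)
best_friction = min((abs(simulate_distance(f / 100.0) - target), f / 100.0)
                    for f in range(1, 100))[1]
```

A real calibration would tune many parameters against richer sensor data, but the principle is the same: the real environment supplies the ground truth that narrows the reality gap.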

Related materials:
Article in de Volkskrant (in Dutch) De robotevolutie kan beginnen. Hoe? Moeder Natuur vervangen door virtuele kraamkamer (The robot evolution can begin. How? Replacing Mother Nature with virtual nursery), May 2018.
Eiben and Smith (2015) From evolutionary computing to the evolution of things, Nature.
Winfield and Timmis (2015) Evolvable Robot Hardware, in Evolvable Hardware, Springer.
Eiben et al. (2013) The Triangle of Life, European Conference on Artificial Life (ECAL 2013).




Alan Winfield is Professor in robotics at UWE Bristol. He communicates about science on his personal blog.
