Robohub.org
 

The Year of CoCoRo Video #30/52: Combined scenario number one


by Thomas Schmickl
29 July 2015




The EU-funded Collective Cognitive Robotics (CoCoRo) project has built a swarm of 41 autonomous underwater vehicles (AUVs) that show collective cognition. Throughout 2015 – The Year of CoCoRo – we’ll be uploading a new weekly video detailing the latest stage in its development. This week there are two videos. The first shows a computer animation of our “combined scenario #1.” The second shows how this scenario was performed by real robots in the water.

In CoCoRo, the search for a sunken metallic object on the waterbed was dealt with in several scenarios. The first video is a computer animation illustrating the phases of the most complex, “scenario #1”:

  1. The base station at the water surface moves into the habitat, with docked Jeff robots and a swarm of Lily robots confined to it.
  2. The Jeff robots are released. They sink to the ground and search the habitat.
  3. As soon as the first Jeff robot has found the target, it recruits other Jeff robots to the location via blue-light LED signals, leading to a positive feedback loop that attracts more and more Jeff robots to the area.
  4. Meanwhile, the Lily robots build a chain that connects the surface station to the aggregated Jeff robots on the ground.
  5. Through this “relay chain,” information (blink signals or other data) can be sent from the ground swarm to the surface station (and human operators) and vice versa.
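The search-and-recruitment phases above (random walk, target detection, blue-light recruitment with positive feedback) can be caricatured in a few lines of code. This is a hypothetical 1-D sketch, not the actual CoCoRo implementation: the world size, target position, blink radius and sense radius are all made-up parameters chosen for illustration.

```python
import random

WORLD = 100          # habitat length (arbitrary units)
TARGET = 70          # position of the sunken object (assumption)
BLINK_RADIUS = 15    # range of a blue-light blink (assumption)
SENSE_RADIUS = 3     # range at which a robot detects the target (assumption)

def search_and_recruit(pos, steps=500, seed=0):
    """Return how many robots end up aggregated at the target."""
    rng = random.Random(seed)
    found = [False] * len(pos)
    for _ in range(steps):
        for i in range(len(pos)):
            if found[i]:
                continue                      # aggregated robots stay and blink
            if abs(pos[i] - TARGET) <= SENSE_RADIUS:
                found[i] = True               # phase 3: target found, start blinking
                continue
            # recruitment: head toward the nearest blinking robot in range;
            # every recruit becomes a blinker itself, giving positive feedback
            blinkers = [pos[j] for j in range(len(pos))
                        if found[j] and abs(pos[j] - pos[i]) <= BLINK_RADIUS]
            if blinkers:
                nearest = min(blinkers, key=lambda p: abs(p - pos[i]))
                pos[i] += 1 if nearest > pos[i] else -1
            else:
                # phase 2: random walk while searching the habitat
                pos[i] = max(0, min(WORLD - 1, pos[i] + rng.choice((-1, 1))))
    return sum(found)

print(search_and_recruit([70, 60, 10]))
```

Note how no robot knows the global picture: each reacts only to the target within sensing range or to blinks within blink range, yet the swarm aggregates, which is the positive feedback loop of phase 3 in miniature.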

Scenario #1 combines many of the individual algorithms and functionalities shown in previous videos in “The Year of CoCoRo.”

The second video shows how this scenario was performed by real robots in the water. It is noteworthy that all the phases of the scenario were executed in a single run, not split into separate experiments. It was important for us to demonstrate that, using the modular algorithmic design of CoCoRo, it is possible to build up complex behavioral programs to solve composite tasks.

To date, bio-inspired and swarm robotics has offered either simple algorithms that let robots perform basic tasks, or complex scenarios achieved with very complicated, purpose-written software. To our knowledge, this is the first time in swarm robotics that several very simple signal-exchange patterns and behavioral responses, triggered by simple signals within the environment, have been combined by an elementary piece of code that allows the robot swarm to perform a very sophisticated behavioral program. There is no recourse to global knowledge, mapping, ego-positioning, image tracking or computer vision within this autonomous swarm. There are only a few blue-light blinks, photoreceptors to receive them, random-walk programs, communication-free shoaling behavior for the relay chain, radio-frequency pulsing and a simple internal compass. This is how swarm robotics should be: simple, robust, flexible, scalable and – most importantly – it works!
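The relay chain of phase 5 illustrates the same philosophy: information travels from the ground swarm to the surface purely hop by hop, with each Lily robot re-emitting what its neighbor signaled. The sketch below is a hypothetical toy model; the chain positions and blink range are invented for illustration and do not reflect the real robots' parameters.

```python
BLINK_RANGE = 12  # maximum neighbor distance for a blink to be seen (assumption)

def relay(chain, message):
    """Forward a message along a chain of robot positions, hop by hop.

    Returns the list of emissions (one per robot) if every neighboring
    pair is within blink range, or None if the chain is broken.
    """
    emissions = [message]                 # the ground swarm emits first
    for a, b in zip(chain, chain[1:]):
        if abs(a - b) > BLINK_RANGE:
            return None                   # gap too wide: signal cannot cross
        emissions.append(message)         # next robot re-emits the same signal
    return emissions

# made-up depths: ground swarm, three Lily robots, surface station
print(relay([0, 10, 20, 30, 40], "target-found"))
```

Each robot needs only local perception of its neighbor's blinks; no routing tables or global addressing are involved, in keeping with the "simple signals only" design described above.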





Thomas Schmickl is an Associate Professor at Karl-Franzens University, Graz, Austria, and a lecturer at the University for Applied Sciences in St. Pölten, Austria.




