 

EU’s future cyber-farms to utilise drones, robots and sensors


25 August 2017




Farmers could protect the environment and cut down on fertiliser use with swarms of drones. Image credit – ‘Aerial View – Landschaft Markgräflerland’ by Taxiarchos228 is licensed under CC 3.0 Unported

by Anthony King

Bee-based maths is helping teach swarms of drones to find weeds, while robotic mowers keep hedgerows in shape.

‘We observe the behaviour of bees. We gain knowledge of how the bees solve problems and with this we obtain rules of interaction that can be adapted to tell us how the robot swarms should work together,’ said Vito Trianni at the Institute of Cognitive Sciences and Technologies of the Italian National Research Council.

Honeybees, for example, run on an algorithm to allow them to choose the best nest site, even though no bee knows the full picture.
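To give a flavour of how such bee-inspired rules can look in software, here is a minimal sketch (not the SAGA algorithm itself) of a decentralised ‘best-of-n’ choice: each simulated agent knows the quality of only the option it currently favours, yet quality-weighted recruitment pulls the whole group towards the best site. The site names and quality scores are invented for the example.

```python
import random

# Minimal illustration (not the SAGA algorithm): a decentralised "best-of-n"
# choice in the spirit of honeybee nest-site selection. Each agent knows the
# quality of only its own current option; agents recruit each other with a
# probability proportional to that quality, so the swarm converges on the
# best site without any individual seeing the full picture.

SITE_QUALITY = {"A": 0.4, "B": 0.9, "C": 0.6}   # hypothetical site scores

def step(opinions):
    """One round of pairwise recruitment between random agents."""
    new = opinions[:]
    for i in range(len(opinions)):
        j = random.randrange(len(opinions))      # listen to a random peer
        # The peer 'dances' for its site with probability equal to its
        # quality; if it does, agent i adopts that opinion.
        if random.random() < SITE_QUALITY[opinions[j]]:
            new[i] = opinions[j]
    return new

opinions = [random.choice(list(SITE_QUALITY)) for _ in range(100)]
for _ in range(50):
    opinions = step(opinions)

print({site: opinions.count(site) for site in SITE_QUALITY})  # typically dominated by "B"
```

Because better options recruit more strongly, the consensus usually settles on the best site even though no single agent ever compares all three.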

Trianni runs an EU-funded research project known as SAGA, which is using the power of robotic groupthink to keep crops weed free.

‘We can use low-cost robots and low-cost cameras. They can even be prone to error, but thanks to the cooperation they will be able to generate precise maps at centimetre scales,’ said Trianni.

‘They will initially spread over the field to inspect it at low resolution, but will then decide on areas that require more focus,’ said Trianni. ‘They can gather together in small groups closer to the ground.’

Importantly, the drones make these decisions themselves, as a group.

Next spring, a swarm of the quadcopters will be released over a sugar beet field. They will stay in radio contact with each other and use algorithms learnt from the bees to cooperate and put together a map of weeds. This will then allow for targeted spraying of weeds or their mechanical removal on organic farms.
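As a hypothetical sketch of the cooperative mapping idea described above (the grid cells, thresholds and likelihood values are invented for illustration, not taken from SAGA): each drone reports weed-likelihood estimates for the cells it has overflown, the reports are averaged into a shared map, and cells that remain ambiguous are queued for a closer, low-altitude look.

```python
from collections import defaultdict

# Hypothetical sketch of the cooperative mapping idea: each drone reports a
# weed-likelihood estimate for the grid cells it has overflown, the swarm
# averages the reports into a shared map, and cells whose estimates are
# still uncertain are queued for a closer, low-altitude inspection.

def merge_reports(reports):
    """reports: list of (cell, weed_likelihood) tuples from all drones."""
    per_cell = defaultdict(list)
    for cell, likelihood in reports:
        per_cell[cell].append(likelihood)
    return {cell: sum(vals) / len(vals) for cell, vals in per_cell.items()}

def needs_close_inspection(shared_map, low=0.3, high=0.7):
    """Cells that are neither clearly clean nor clearly weedy."""
    return [cell for cell, p in shared_map.items() if low <= p <= high]

reports = [((0, 0), 0.1), ((0, 0), 0.2),   # two drones agree: clean
           ((3, 4), 0.5), ((3, 4), 0.6),   # ambiguous: revisit at low altitude
           ((7, 2), 0.9)]                  # clearly weedy: spray or remove
shared = merge_reports(reports)
print(needs_close_inspection(shared))      # [(3, 4)]
```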

Today, the most common way to control weeds is to spray entire fields with herbicide chemicals. Smarter spraying will save farmers money, but it will also lower the risk of resistance developing to the agrichemicals. And there will be an environmental benefit from spraying less herbicide.

Co-ops

Swarms of drones that map crop fields could be offered to farmers as a service, while farm co-ops could even buy swarms themselves.

‘There is no need to fly them every day over your field, so it is possible to share the technology between multiple farmers,’ said Trianni. A co-op might buy 20 to 30 drones, but adjust the size of the swarm to the farm.

The drones weigh 1.5 kilos and fly for around 20-30 minutes. For large fields, the drone swarms could operate in relay teams, with drones landing and being replaced by others.
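A back-of-envelope sketch of the relay idea: the 20-30 minute endurance and the 20-30 drone swarm size come from the article, while the coverage rate and field size below are assumptions chosen purely for illustration.

```python
import math

# Back-of-envelope sketch of relay coverage. Endurance and swarm size are
# taken from the article; coverage rate and field size are assumed figures.

endurance_min = 25          # minutes of flight per battery (article: 20-30)
coverage_ha_per_min = 0.5   # assumed hectares one drone surveys per minute
field_ha = 200              # assumed field size in hectares
swarm_size = 25             # article: a co-op might buy 20 to 30 drones

ha_per_sortie = swarm_size * endurance_min * coverage_ha_per_min
relay_shifts = math.ceil(field_ha / ha_per_sortie)
print(f"{relay_shifts} relay shift(s) to cover {field_ha} ha")  # 1 shift with these figures
```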

It’s the kind of technology that is ideally suited to today’s large-scale farms, as is another remote technology that combines on-the-ground sensor information with satellite data to tell farmers how much nitrogen or water their fields need.

Wheat harvested from a field in Boigneville, 100 km south of Paris, France, in August this year will have been grown with the benefit of this data, as part of a pilot being run by an EU-funded project known as IOF2020, which involves over 70 partners and around 200 researchers.

‘Sensors are costing less and less, so at the end of the project we hope to have something farmers or farm cooperatives can deploy in their fields,’ explained Florence Leprince, a plant scientist at Arvalis – Institut du végétal, the French arable farming institute which is running the wheat experiment.

‘This will allow farmers to be more precise and not overuse nitrogen or water.’ – Florence Leprince, Arvalis – Institut du végétal, France

Adding too much nitrogen to a crop field costs farmers money, but it also has a negative environmental impact. Surplus nitrogen leaches from soils and into rivers and lakes, causing pollution.

The sensor data is needed because satellite pictures can indicate how much nitrogen is in a crop, but not how much is in the soil. The sensors will help add detail, though in a way that farmers will find easy to use.
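As a simplified illustration of why both data sources matter (this is not the IOF2020 model, and every figure below is invented for the example), a basic nitrogen balance subtracts what the soil can already supply, measured by ground sensors, from what the crop is expected to need, estimated partly from satellite imagery of the canopy.

```python
# Simplified illustration (not the IOF2020 model) of combining ground sensor
# readings with satellite-derived canopy estimates. All numbers are invented.

def nitrogen_recommendation(expected_yield_t_ha, crop_n_uptake_kg_per_t,
                            soil_mineral_n_kg_ha, canopy_n_kg_ha):
    """Return fertiliser N to apply, in kg/ha (never negative)."""
    total_need = expected_yield_t_ha * crop_n_uptake_kg_per_t
    already_available = soil_mineral_n_kg_ha + canopy_n_kg_ha
    return max(0.0, total_need - already_available)

rate = nitrogen_recommendation(expected_yield_t_ha=8.0,
                               crop_n_uptake_kg_per_t=30.0,  # rough figure for wheat
                               soil_mineral_n_kg_ha=60.0,    # from soil sensors
                               canopy_n_kg_ha=40.0)          # from satellite imagery
print(f"Apply about {rate:.0f} kg N/ha")                     # ~140 kg N/ha here
```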

It’s a similar story for the robotic hedge trimmer being developed by a separate group of researchers. All the farmer or groundskeeper needs to do is mark which hedge needs trimming.

‘The user will sketch the garden, though not too accurately,’ said Bob Fisher, computer vision scientist at Edinburgh University, UK, and coordinator of the EU-funded TrimBot2020 project. ‘The robot will go into the garden and come back with a tidied-up sketch map. At that point, the user can say go trim that hedge, or mark what’s needed on the map.’
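A hypothetical sketch of that interaction (not the actual TrimBot2020 interface): the robot returns a tidied map listing the hedges it recognised, and the user simply flags which ones to trim.

```python
from dataclasses import dataclass, field

# Hypothetical illustration of the workflow described above, not the real
# TrimBot2020 software: the robot's tidied-up map lists recognised hedges,
# and the user marks the ones that need trimming.

@dataclass
class Hedge:
    name: str
    outline: list          # rough polygon of the hedge footprint
    trim: bool = False     # set by the user on the tidied map

@dataclass
class GardenMap:
    hedges: list = field(default_factory=list)

    def mark_for_trimming(self, name):
        for hedge in self.hedges:
            if hedge.name == name:
                hedge.trim = True

    def trim_tasks(self):
        return [h.name for h in self.hedges if h.trim]

# The robot's tidied-up map after its survey run (coordinates invented):
garden = GardenMap(hedges=[Hedge("north boundary", [(0, 0), (10, 0)]),
                           Hedge("rose hedge", [(2, 5), (6, 5)])])
garden.mark_for_trimming("rose hedge")
print(garden.trim_tasks())   # ['rose hedge']
```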

This autumn will see the arm and the robot base assembled, while the self-driving bot will be set off around the garden next spring.

More info:
SAGA (part of ECHORD Plus Plus)
IOF2020
TrimBot2020



Horizon Magazine brings you the latest news and features about thought-provoking science and innovative research projects funded by the EU.




