
Biological Agriculture for Roboticists, Part 6


by John Payne
30 May 2016




In a previous installment, I said that identifying weeds by what's left standing after a patch of ground has been grazed won't catch low-growing plants, using goatheads as an example.

To begin with, what one type of herbivore (cattle) finds distasteful, another (goats) may find delectable, so not everything left standing by a single species is useless. It's a good idea to run cattle, which strongly prefer grass, together with or immediately followed by another, less picky herbivore, like goats.

Secondly, being unpalatable doesn't automatically make a plant a weed. Weeds are plants that move aggressively into disturbed ground, smother or chemically inhibit other plant life, and/or put most of their energy into producing above-ground growth and seed rather than roots. They are typically annuals or biennials (producing seed in their second year). If a plant does none of these things and is not toxic to livestock or wildlife, it's probably not accurate to describe it as a weed. Even so, if livestock won't eat a plant, and it is not itself rare, threatened, or endangered, nor vital to some animal that is, you probably don't want it taking up ground that could be producing something more useful in your pasture. So what's left standing after grazing isn't such a bad indication, but, as already mentioned, this test won't catch low-growing plants.
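To make the triage explicit, the logic of the last two paragraphs can be reduced to a simple decision rule. This is only a sketch: the trait names below are invented for illustration, and in practice each one is itself a hard identification problem.

```python
def triage(plant):
    """Sketch of the weed-triage logic described above. `plant` is
    assumed to be a dict of boolean traits (the field names are
    invented), e.g. looked up from a plant-identification database."""
    # Rare, threatened, or endangered plants, and plants vital to such
    # animals, are off-limits regardless of palatability.
    if plant["protected_species"] or plant["vital_to_protected_animal"]:
        return "protect"
    # The defining weed behaviors: aggressive colonization of disturbed
    # ground, smothering or chemical inhibition of other plants, energy
    # into tops and seed rather than roots.
    weedy = (plant["colonizes_disturbed_ground"]
             or plant["smothers_or_inhibits_others"]
             or plant["energy_into_tops_and_seed_not_roots"])
    if weedy or plant["toxic_to_livestock_or_wildlife"]:
        return "remove"
    # Not a weed, but if nothing on the pasture will eat it, it's
    # occupying ground that could grow something more useful.
    if not plant["palatable_to_some_herbivore"]:
        return "remove"
    return "leave"  # useful forage; let the grazers handle it
```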

So, how to deal with those low-growing plants? Good question, and a good subject for further research. First you have to be able to detect their presence and distinguish them from the grass stubble left behind by grazing. Then there's the matter of locating the main stem and the point where it connects to the root system. If a plant is lying on the ground, supported by it and not swaying in the breeze, the approach I referenced earlier, modeling a plant's branching structure from video of its motion, won't work. One option might be a vacuum that moves enough air to pick up the vining tendrils and suck them in; if you have a serious infestation of this sort of weed, such equipment might be a reasonable choice. Another might be a pincer-like manipulator with cylindrical, counter-rotating rotary rasps for fingers: pinch the vine at any point, determine which direction to rotate by trial and error, then use the resulting tension to guide the manipulator to the main stem so it can be uprooted.
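To make that second idea concrete, here is a minimal control-loop sketch in Python. Everything in it is hypothetical: `SimGripper` and its methods are a toy simulation standing in for whatever hardware interface such a manipulator would actually expose.

```python
import random

class SimGripper:
    """Toy stand-in for the rasp-fingered gripper: winding the vine in
    the right direction raises tension, and each tension-guided step
    moves the gripper closer to the main stem."""
    def __init__(self):
        self._winding = random.choice((+1, -1))  # unknown a priori
        self._tension = 0.0
        self._to_stem = 0.30  # metres of vine left to traverse

    def pinch(self, point):
        print(f"pinched vine at {point}")

    def rotate(self, direction, duration_s):
        delta = 0.5 * duration_s
        self._tension += delta if direction == self._winding else -delta
        self._tension = max(self._tension, 0.0)

    def tension(self):
        return self._tension

    def follow_tension(self, step_m):
        self._to_stem -= step_m  # the pull leads us along the vine

    def at_main_stem(self):
        return self._to_stem <= 0.0

    def uproot(self):
        print("main stem reached; uprooting")

def uproot_vine(g, start_point, step_m=0.02, probe_s=0.5):
    g.pinch(start_point)
    # Trial and error: briefly rotate each way; the direction that
    # winds the vine onto the rasps is the one that raises tension.
    gains = {}
    for d in (+1, -1):
        before = g.tension()
        g.rotate(d, probe_s)
        gains[d] = g.tension() - before
        g.rotate(-d, probe_s)  # undo the probe
    winding = max(gains, key=gains.get)
    # Keep winding, letting the tension vector lead the manipulator
    # back along the vine until it reaches the main stem.
    while not g.at_main_stem():
        g.rotate(winding, probe_s)
        g.follow_tension(step_m)
    g.uproot()

uproot_vine(SimGripper(), (1.2, 3.4))
```

The appealing design point here is that the gripper never needs to see the whole vine; local tension measurements alone tell it which way to rotate and which way to move.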

Such a manipulator might be generally better at uprooting than a simple grasping manipulator, since the rotation of the fingers would replace retracting the robotic arm, potentially making the overall operation more efficient. A variation on the theme that might prove more generally useful would have low points on each finger matched by shallow indentations on the other, at the end furthest from the motors driving finger rotation, progressing to protruding hooks matched by deep indentations at the end nearest the motors. This would allow the same attachment to be used both for ordinary uprooting and for gathering up something like goatheads, simply by adjusting where along the length of the rotating fingers it grasps the plant.
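A rough sketch of that geometry, with all dimensions invented for illustration: the depth of the matched indentations tapers linearly along the fingers, and the task determines where the plant is grasped.

```python
def indentation_depth(x_m, finger_len=0.15, shallow=0.002, deep=0.015):
    """Depth (m) of the matched indentations at distance x_m from the
    fingertip (the end furthest from the motors): a linear taper from
    shallow pockets at the tip to deep, hooked pockets near the motors.
    All dimensions are illustrative guesses, not from the article."""
    return shallow + (deep - shallow) * (x_m / finger_len)

def grasp_offset(task, finger_len=0.15):
    """Where along the fingers to take hold: near the tip for plain
    uprooting, near the hooked motor end for raking up burred seed
    pods such as goatheads."""
    return 0.1 * finger_len if task == "uproot" else 0.9 * finger_len

for task in ("uproot", "gather"):
    x = grasp_offset(task)
    print(task, f"grasp at {x:.3f} m, pocket depth "
          f"{indentation_depth(x) * 1000:.1f} mm")
```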


I also promised to get back to the use of sound, in the context of fauna management and pest control. This by itself could easily be the subject of a lengthy book. Information about the environment can be gleaned from ambient sounds as well as from active sonar, and a robot might also emit sounds for the effects they can produce.

Sonar is already widely used in robotics as a way of detecting and determining the distance to obstacles. While thus far more sophisticated technologies, such as synthetic aperture sonar, have primarily been developed for underwater use, a large market for autonomous robots operating at modest ground speeds in uncontrolled environments might prove incentive enough to justify developing versions for use in air.
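Whatever the transducer technology, the ranging arithmetic itself is simple. A minimal sketch for airborne sonar, using the standard first-order temperature correction for the speed of sound in dry air:

```python
def sonar_range_m(echo_delay_s, temp_c=20.0):
    """Distance to an obstacle from a round-trip echo delay.
    The speed of sound in dry air is roughly 331.3 + 0.606 * T m/s
    (T in Celsius); the pulse travels out and back, hence the
    division by two."""
    speed_m_s = 331.3 + 0.606 * temp_c
    return speed_m_s * echo_delay_s / 2.0

# e.g. a 10 ms echo at 20 C puts the obstacle about 1.72 m away
print(round(sonar_range_m(0.010), 2))
```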

Meanwhile, there is a wealth of information available from simple microphones. From tiny arthropods to passing ungulates, many animals produce characteristic sounds, with familiar examples including crickets, frogs, and all types of birds and mammals. These sounds can help identify not only what species are present but where they are and what they are doing.
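As a hint of what that processing might look like, here is a minimal sketch of a single spectral feature: the fraction of acoustic energy falling in a species' characteristic band (many field crickets call at roughly 4-5 kHz). A practical detector would be trained on labelled field recordings; the synthetic signal below merely stands in for one.

```python
import numpy as np
from scipy import signal

def band_energy_ratio(audio, fs, band=(4000, 6000)):
    """Fraction of spectral energy inside a candidate species' calling
    band: the simplest possible presence feature, not a real detector."""
    freqs, _, spec = signal.spectrogram(audio, fs=fs)
    in_band = (freqs >= band[0]) & (freqs <= band[1])
    return spec[in_band].sum() / spec.sum()

# Synthetic stand-in for a field recording: a 4.8 kHz call, gated on
# and off at 30 Hz to mimic chirping, buried in background noise.
fs = 22050
t = np.arange(fs) / fs  # one second of samples
call = 0.5 * np.sin(2 * np.pi * 4800 * t) * (np.sin(2 * np.pi * 30 * t) > 0)
audio = call + 0.1 * np.random.randn(fs)
print(band_energy_ratio(audio, fs))  # high when the call is present
```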

Sound can also be used to affect the behavior of animals, for example discouraging deer from spending too much time browsing on your vegetable garden or keeping chickens from venturing too far afield. Through sound, a robot might signal the presence of a predator, or food, or a potential mate.
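On the output side, a deterrent sound can be as simple as a synthesized sweep played through the robot's speaker. The frequencies below are arbitrary; in practice such sounds need to be varied, since animals habituate quickly to anything repetitive.

```python
import numpy as np
from scipy.io import wavfile

def make_sweep(f0=2000.0, f1=8000.0, seconds=2.0, fs=44100):
    """Synthesize a linear frequency sweep from f0 to f1 Hz. The phase
    is the integral of the instantaneous frequency
    f(t) = f0 + (f1 - f0) * t / T."""
    t = np.arange(int(seconds * fs)) / fs
    phase = 2 * np.pi * (f0 * t + (f1 - f0) * t**2 / (2 * seconds))
    samples = (0.8 * np.sin(phase) * 32767).astype(np.int16)
    return samples, fs

tone, fs = make_sweep()
wavfile.write("deterrent.wav", fs, tone)  # ready for the robot's speaker
```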

But it’s not just animals; even plants produce sounds. A tree that has sustained wind damage, introducing cracks into its trunk, will sound different from one which has not. A plant with wilted leaves sounds different from one that is fully turgid, and one from which the leaves have fallen sounds different yet.

So far as I’m aware, all such potential uses of sound remain largely unexplored areas of research, so it’s hard to know just how much a machine might be able to learn about its biological environment by listening and processing what it hears, or in what ways it might use sound to exert some control over that environment.


I’ve concentrated on tying up loose ends here because I’m eager to get on to the series on Robotics for Gardeners and Farmers. That’s not to say this will be the last installment in this series; after all, I’ve yet to address planting, pruning, pest control, harvest, and dealing with the plant matter left behind after harvest, as well as animal husbandry. Whether I eventually get to all of these remains to be seen. Touching on every such topic probably isn’t as important as conveying the nature of the opportunities presented by applying robotics to methods founded in horticulture rather than in conventional agriculture, with an eye to then making those methods scalable.

John Payne




