
Biological Agriculture for Roboticists, Part 6

by John Payne
30 May 2016




In a previous installment, I noted that identifying weeds by what’s left standing after a patch of ground has been grazed won’t catch low-growing plants, using goatheads as an example.

To begin with, what one type of herbivore (cattle) finds distasteful, another (goats) may find delectable, so not everything left standing by a single species is useless. It’s a good idea to run cattle, which strongly prefer grass, together with or immediately followed by a less picky herbivore, such as goats.

Secondly, being unpalatable doesn’t automatically make a plant a weed. Weeds are plants that move aggressively into disturbed ground, smother or chemically inhibit other plant life, and/or put most of their energy into producing above-ground growth and seeds rather than roots. They are typically annuals or biennials (producing seed in their second year). If a plant does none of these things and is not toxic to livestock or wildlife, it’s probably not accurate to describe it as a weed. Even so, if livestock won’t eat it, it isn’t a candidate for protection as a rare, endangered, or threatened species, and it isn’t vital to some such animal, you probably don’t want it taking up ground that could be producing something more useful in your pasture. So what’s left standing after grazing isn’t such a bad indication, though, as already mentioned, this test won’t catch low-growing plants.
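To make those criteria concrete, here’s a minimal decision sketch in Python. The PlantObservation record and its attribute names are hypothetical stand-ins for whatever a survey robot’s species-identification pipeline and botanical database would actually provide.

```python
# Hedged sketch of the weed-removal decision criteria described above.
# All attribute names are invented for illustration.
from dataclasses import dataclass

@dataclass
class PlantObservation:
    left_standing_after_grazing: bool     # untouched by every herbivore species run on the pasture
    colonizes_disturbed_ground: bool      # moves aggressively into bare soil
    smothers_or_inhibits_neighbors: bool  # including chemical (allelopathic) inhibition
    annual_or_biennial: bool              # energy into top growth and seed rather than roots
    toxic_to_livestock_or_wildlife: bool
    protected_species: bool               # rare, endangered, or threatened
    supports_protected_fauna: bool        # vital to some rare or endangered animal

def is_removal_candidate(p: PlantObservation) -> bool:
    """Never target protected plants or plants vital to protected animals;
    otherwise target anything weedy, toxic, or simply left uneaten."""
    if p.protected_species or p.supports_protected_fauna:
        return False
    weedy = (p.colonizes_disturbed_ground
             or p.smothers_or_inhibits_neighbors
             or p.annual_or_biennial)
    return weedy or p.toxic_to_livestock_or_wildlife or p.left_standing_after_grazing
```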

So, how to deal with those low-growing plants? Good question, and a good subject for further research. First you have to be able to detect their presence and distinguish them from the grass stubble left behind by grazing. Then there’s the matter of locating the main stem and the point where it and the root system connect. If a plant is lying on the ground, supported by it and not swaying in the breeze, the approach I referenced earlier, modeling a plant’s branching structure from video of its motion, won’t work. One option might be a vacuum that pulls in a large enough volume of air to pick up the vining tendrils and suck them in; if you have a serious infestation of this sort of weed, such equipment might be a reasonable choice. Another might be a pincer-like manipulator with cylindrical, counter-rotating rotary rasps for fingers: pinch the vine at any point, determine which direction to rotate by trial and error, then use the resulting tension to guide the manipulator to the main stem so it can be uprooted.
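As a thought experiment, here’s what that trial-and-error loop might look like in Python. Everything here is invented for illustration, including the SimulatedPincer stand-in; a real implementation would read actual tension sensors and drive actual motors.

```python
import random

class SimulatedPincer:
    """Fake manipulator, just enough behavior to exercise the control logic."""
    def __init__(self):
        self._winding = random.choice((+1, -1))  # unknown to the controller
        self._tension = 0.0
        self._distance_to_stem = 10              # arbitrary units along the vine

    def close_on_vine(self):
        self._tension = 0.0

    def rotate_fingers(self, direction):
        # Rotating with the vine's winding reels it in and builds tension;
        # rotating against it feeds the vine back out.
        self._tension = 1.0 if direction == self._winding else 0.0
        if self._tension > 0:
            self._distance_to_stem -= 1

    def tension(self):
        return self._tension

    def at_main_stem(self):
        return self._distance_to_stem <= 0

    def uproot(self):
        print("uprooting at main stem")

def trace_and_uproot(pincer, min_tension=0.5):
    """Pinch anywhere, find the reel-in direction by trial and error,
    then let the resulting tension guide the gripper to the main stem."""
    pincer.close_on_vine()
    direction = +1
    pincer.rotate_fingers(direction)
    if pincer.tension() < min_tension:  # wrong guess: feeding out, not reeling in
        direction = -1
    while not pincer.at_main_stem():
        pincer.rotate_fingers(direction)
    pincer.uproot()

trace_and_uproot(SimulatedPincer())
```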

Such a manipulator might be generally better at uprooting than a simple grasping manipulator, since the rotation of the fingers would replace retracting the robotic arm, potentially making the overall operation more efficient. A variation on the theme which might prove more generally useful would have low points on each finger matched by shallow indentations on the other finger, at the end furthest from the motors driving finger rotation, progressing to protruding hooks matched by deep indentations at the end nearest the motors. This would allow the same attachment to be used both for ordinary uprooting and for gathering up something like goatheads, simply by adjusting where along the length of the rotating fingers it grasped the plant.


I also promised to get back to the use of sound, in the context of fauna management and pest control. This by itself could easily be the subject of a lengthy book. Information about the environment can be gleaned from ambient sounds as well as from active sonar, and a robot might also emit sounds for the effects they can produce.

Sonar is already widely used in robotics as a way of detecting and determining the distance to obstacles. While more sophisticated technologies, such as synthetic aperture sonar, have thus far been developed primarily for underwater use, a large market for autonomous robots operating at modest ground speeds in uncontrolled environments might prove incentive enough to justify developing versions for use in air.
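For reference, the ranging arithmetic itself is simple: halve the round-trip time of the ping and multiply by the speed of sound, which varies with air temperature. A minimal sketch:

```python
def speed_of_sound_in_air(celsius: float) -> float:
    """Standard approximation for dry air near room temperature (m/s)."""
    return 331.3 + 0.606 * celsius

def echo_distance(round_trip_seconds: float, celsius: float = 20.0) -> float:
    """Distance to an obstacle from a sonar ping's round-trip time.
    The pulse travels out and back, hence the division by two."""
    return speed_of_sound_in_air(celsius) * round_trip_seconds / 2.0

# A 10 ms round trip at 20 degrees C corresponds to about 1.72 m.
print(round(echo_distance(0.010), 2))
```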

Meanwhile, there is a wealth of information available from simple microphones. From tiny arthropods to passing ungulates, many animals produce characteristic sounds, with familiar examples including crickets, frogs, and all types of birds and mammals. These sounds can help identify not only what species are present but where they are and what they are doing.
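As a toy illustration of how such identification might work, here’s a sketch that matches crude spectral fingerprints against a library of templates. The “calls” here are pure tones standing in for real recordings; a practical system would use classifiers trained on large sets of labeled field data.

```python
# Hypothetical species identification by spectral template matching.
# The templates and test clip are synthetic stand-ins, not real calls.
import numpy as np

RATE = 16000  # samples per second

def band_energies(signal: np.ndarray, n_bands: int = 32) -> np.ndarray:
    """Crude fingerprint: normalized energy in equal-width frequency bands."""
    spectrum = np.abs(np.fft.rfft(signal)) ** 2
    bands = np.array([b.sum() for b in np.array_split(spectrum, n_bands)])
    return bands / bands.sum()

# Invented reference library: one fingerprint per species, built from pure tones.
templates = {
    "cricket": band_energies(np.sin(2 * np.pi * 4500 * np.arange(RATE) / RATE)),
    "frog":    band_energies(np.sin(2 * np.pi * 600  * np.arange(RATE) / RATE)),
}

def identify(clip: np.ndarray) -> str:
    """Label a clip with the nearest template by Euclidean distance."""
    fp = band_energies(clip)
    return min(templates, key=lambda s: np.linalg.norm(fp - templates[s]))

# A noisy 4.5 kHz tone should match the "cricket" template.
test = np.sin(2 * np.pi * 4500 * np.arange(RATE) / RATE) + 0.1 * np.random.randn(RATE)
print(identify(test))
```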

Sound can also be used to affect the behavior of animals, for example discouraging deer from spending too much time browsing on your vegetable garden or keeping chickens from venturing too far afield. Through sound, a robot might signal the presence of a predator, or food, or a potential mate.
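A minimal sketch of how detection might be tied to playback follows; the species-to-sound mappings and the play() stub are invented for illustration.

```python
# Hypothetical mappings from detected species to a response sound.
DETERRENTS = {"deer": "predator_call.wav"}   # scare browsers out of the garden
ATTRACTANTS = {"chicken": "feed_call.wav"}   # recall birds straying too far afield

def play(sound_file: str) -> None:
    """Stand-in for a real audio output interface."""
    print(f"playing {sound_file}")

def respond(species: str, inside_boundary: bool) -> None:
    """Deter unwelcome visitors found inside the managed boundary (e.g. a
    garden fence); attract livestock found outside it (e.g. a range limit)."""
    if species in DETERRENTS and inside_boundary:
        play(DETERRENTS[species])
    elif species in ATTRACTANTS and not inside_boundary:
        play(ATTRACTANTS[species])

respond("deer", inside_boundary=True)        # playing predator_call.wav
respond("chicken", inside_boundary=False)    # playing feed_call.wav
```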

But it’s not just animals; even plants produce sounds. A tree that has sustained wind damage, introducing cracks into its trunk, will sound different from one that has not. A plant with wilted leaves sounds different from one that is fully turgid, and one from which the leaves have fallen sounds different still.

So far as I’m aware, all such potential uses of sound represent largely unexplored areas of research, so it’s hard to know just how much a machine might learn about its biological environment by listening and processing what it hears, or in what ways it might use sound to exert some control over that environment.


I’ve concentrated on tying up loose ends here because I’m eager to get on to the series on Robotics for Gardeners and Farmers. That’s not to say that this will be the last installment in this series; after all, I’ve yet to address planting, pruning, pest control, harvest, or dealing with the plant matter left behind after harvest, as well as animal husbandry. Whether I eventually get to all of these remains to be seen. Touching on every such topic probably isn’t as important as conveying the nature of the opportunities presented by applying robotics to methods founded in horticulture rather than in conventional agriculture, with an eye to then making them scalable.
