Biological Agriculture for Roboticists, Part 6

30 May 2016


In a previous installment, I noted, using goatheads as an example, that identifying weeds by what’s left standing after a patch of ground has been grazed won’t catch low-growing plants.

To begin with, what one type of herbivore (cattle) finds distasteful, another (goats) may find delectable, so not everything left standing by a single species is useless. It’s a good idea to run cattle, which strongly prefer grass, together with or immediately followed by a less picky herbivore, such as goats.

Secondly, being unpalatable doesn’t automatically make a plant a weed. Weeds are plants that move aggressively into disturbed ground, smother or chemically inhibit other plant life, and/or put most of their energy into producing above-ground growth and seed rather than roots. They are typically annuals or biennials (which produce seed in their second year). If a plant does none of these things and is not toxic to livestock or wildlife, it’s probably not accurate to call it a weed. Even so, if livestock won’t eat it, it isn’t a candidate for protection as a rare, endangered, or threatened species, and it isn’t vital to some such animal, you probably don’t want it taking up ground that could be producing something more useful in your pasture. So what’s left standing after grazing isn’t such a bad indication, but, as already mentioned, this test won’t catch low-growing plants.
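These criteria can be sketched as a simple rule-based check. Everything below is illustrative: the trait names and the goathead and grass entries are assumptions for the sake of the example, not data from any real plant database.

```python
# A minimal sketch of the weed profile described above: a short-lived plant
# showing at least one aggressive trait. All trait names are hypothetical.

def is_probable_weed(traits):
    """True if the observed traits match the weed profile sketched in the text."""
    aggressive = any((
        traits.get("colonizes_disturbed_ground", False),
        traits.get("smothers_or_inhibits_neighbors", False),
        traits.get("energy_into_seed_over_roots", False),
    ))
    # Weeds are typically annuals or biennials rather than perennials.
    short_lived = traits.get("life_cycle") in ("annual", "biennial")
    return aggressive and short_lived

# Illustrative entries, not field data.
goathead = {"colonizes_disturbed_ground": True,
            "energy_into_seed_over_roots": True,
            "life_cycle": "annual"}
pasture_grass = {"life_cycle": "perennial"}

print(is_probable_weed(goathead), is_probable_weed(pasture_grass))
```

A real system would of course weigh continuous measurements rather than booleans, but the decision structure would look much the same.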

So, how to deal with those low-growing plants? Good question, and a good subject for further research. First you have to detect their presence and distinguish them from the grass stubble left behind by grazing. Then there’s the matter of locating the main stem and the point where it joins the root system. If a plant is lying on the ground, supported by it rather than swaying in the breeze, the video-based modeling of branching structure from plant motion that I referenced earlier won’t work. One way to locate the stem might be to use a vacuum that pulls in a large enough volume of air to pick up the vining tendrils and suck them in; if you have a serious infestation of this sort of weed, such equipment might be a reasonable choice. Another might be a pincer-like manipulator, with cylindrical counter-rotating rotary rasps for fingers, that pinches the vine at any point, determines which direction to rotate by trial and error, then uses the resulting tension to guide the manipulator to the main stem so it can be uprooted.
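The trial-and-error step might look something like the toy sketch below: spin the fingers briefly in one direction, watch the measured vine tension, and reverse if the vine is being fed out rather than drawn in. The sensor and actuator interfaces here are entirely hypothetical stand-ins.

```python
# A toy sketch of the trial-and-error direction test described above.
# measure_tension and spin are hypothetical hardware interfaces.

def choose_rotation_direction(measure_tension, spin):
    """Pick the finger rotation direction that draws the vine taut."""
    baseline = measure_tension()
    spin(direction=+1, duration_s=0.2)   # short exploratory pulse
    if measure_tension() > baseline:
        return +1                        # tension rose: pulling toward the root
    spin(direction=-1, duration_s=0.4)   # undo the pulse and go the other way
    return -1

# Simulated hardware for illustration: tension rises when spinning direction -1.
state = {"tension": 1.0}

def fake_tension():
    return state["tension"]

def fake_spin(direction, duration_s):
    state["tension"] += -0.5 * direction * duration_s

print(choose_rotation_direction(fake_tension, fake_spin))
```

In practice the exploratory pulse would need to be gentle enough not to snap the vine, and the tension signal filtered against wind-induced noise.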

Such a manipulator might be generally better at uprooting than a simple grasping manipulator, since the rotation of the fingers would replace retracting the robotic arm, potentially making the overall operation more efficient. A variation on the theme that might prove more generally useful would have low points on each finger matched by shallow indentations on the other finger, at the end furthest from the motors driving finger rotation, progressing to protruding hooks matched by deep indentations at the end nearest the motors. This would allow the same attachment to be used both for ordinary uprooting and for gathering up something like goatheads, simply by adjusting where along the length of the rotating fingers it grasped the plant.

I also promised to get back to the use of sound, in the context of fauna management and pest control. This by itself could easily be the subject of a lengthy book. Information about the environment can be gleaned from ambient sounds as well as from active sonar, and a robot might also emit sounds for the effects they can produce.

Sonar is already widely used in robotics as a way of detecting and determining the distance to obstacles. While thus far more sophisticated technologies, such as synthetic aperture sonar, have primarily been developed for underwater use, a large market for autonomous robots operating at modest ground speeds in uncontrolled environments might prove incentive enough to justify developing versions for use in air.
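The ranging principle underneath all of this is simple: distance follows directly from the round-trip time of an echo. A minimal sketch, assuming air at roughly 20 °C:

```python
# Basic time-of-flight ranging, the core of the sonar obstacle detection
# mentioned above. 343 m/s is the approximate speed of sound in air at 20 °C;
# a real sensor would compensate for temperature and humidity.

SPEED_OF_SOUND_AIR = 343.0  # m/s, approximate

def echo_distance(round_trip_s):
    """Distance to a reflector, in metres, from the round-trip echo time."""
    return SPEED_OF_SOUND_AIR * round_trip_s / 2.0

print(echo_distance(0.01))  # a 10 ms round trip is roughly 1.7 m
```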

Meanwhile, there is a wealth of information available from simple microphones. From tiny arthropods to passing ungulates, many animals produce characteristic sounds, with familiar examples including crickets, frogs, and all types of birds and mammals. These sounds can help identify not only what species are present but where they are and what they are doing.
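As a toy illustration of pulling one acoustic feature out of a microphone signal, the sketch below estimates the dominant pitch of a roughly periodic sound by counting zero crossings. Real species identification would need richer spectral features and a trained classifier, and the 4000 Hz "chirp" here is a synthetic round-number stand-in, not field data.

```python
# Estimate the pitch of a roughly periodic signal by counting upward
# zero crossings, one per completed cycle. A crude but cheap feature.

import math

def zero_crossing_pitch(samples, sample_rate):
    """Rough frequency estimate, in Hz, for a periodic signal."""
    crossings = sum(1 for a, b in zip(samples, samples[1:]) if a < 0.0 <= b)
    return crossings * sample_rate / len(samples)

# Synthesize one second of a 4000 Hz tone standing in for a cricket chirp.
rate = 44100
chirp = [math.sin(2 * math.pi * 4000 * n / rate) for n in range(rate)]
print(round(zero_crossing_pitch(chirp, rate)))  # close to 4000
```

Even a feature this crude could separate, say, a cricket’s chirp from a bird call; combining several such features is where the interesting research lies.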

Sound can also be used to affect the behavior of animals, for example discouraging deer from spending too much time browsing on your vegetable garden or keeping chickens from venturing too far afield. Through sound, a robot might signal the presence of a predator, or food, or a potential mate.

But it’s not just animals; even plants produce sounds. A tree that has sustained wind damage, introducing cracks into its trunk, will sound different from one which has not. A plant with wilted leaves sounds different from one that is fully turgid, and one from which the leaves have fallen sounds different yet.

So far as I’m aware, all such potential uses of sound represent largely unexplored areas of research, so it’s hard to know how much a machine might be able to learn about its biological environment just by listening and processing what it hears, or in what ways it might use sound to exert some control over that environment.

I’ve concentrated on tying up loose ends here because I’m eager to get on to the series on Robotics for Gardeners and Farmers. That’s not to say this will be the last installment in this series; after all, I have yet to address planting, pruning, pest control, harvest, dealing with the plant matter left behind after harvest, and animal husbandry. Whether I eventually get to all of these remains to be seen. Touching on every such topic probably isn’t as important as conveying the nature of the opportunities presented by applying robotics to methods founded in horticulture rather than in conventional agriculture, with an eye to then making them scalable.


John Payne

©2021 - ROBOTS Association