How AR technology can help farmers stay relevant

19 January 2017

Image: Wheat Genetics and Germplasm Improvement

I’ve long believed that Augmented Reality (AR) and robotics are closely related. Both model their environments to some degree. Robotics uses that model to guide the behavior of a machine, whereas AR uses it to provide an enhanced sensory experience to a human.

The exact nature of that enhanced experience is bounded only by the available sensing, computational, and display (audio, haptic, …) hardware, and by how usefully the gathered data can be transformed into overlays that augment the user's natural perception. What is useful is a function of both the content of those overlays and the latency: the lag time introduced by the computations involved in generating them. Faster computational hardware can produce more detailed overlays at the same latency, or the same overlays at lower latency, than slower hardware.

One important application for AR is making it safer and easier for a human to work in collaboration with robotic hardware. For example, a robot might broadcast the path it intends to follow and the 3D volume through which it intends to pass, and an AR display could convert that information into highlighting of anything occupying that space. Or perhaps a machine wants to direct the attention of its human counterpart to some particular element of the environment, say one specific plant; that too could be highlighted in the display.
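To make the first example concrete, here is a minimal sketch of how an AR display might flag objects inside a robot's announced travel corridor. The corridor format (a list of axis-aligned boxes), the point data, and the function names are illustrative assumptions, not drawn from any real AR or robotics API:

```python
# Hypothetical sketch: the robot announces its intended swept volume as a
# list of axis-aligned 3D boxes; the AR display highlights any detected
# environment point that falls inside that volume.

from dataclasses import dataclass

@dataclass
class Box:
    """Axis-aligned 3D box from corner `lo` to corner `hi`."""
    lo: tuple
    hi: tuple

    def contains(self, point):
        return all(l <= p <= h for l, p, h in zip(self.lo, point, self.hi))

def points_to_highlight(corridor, environment_points):
    """Return points inside any box of the robot's announced corridor,
    i.e. candidates for highlighting in the AR overlay."""
    return [p for p in environment_points
            if any(box.contains(p) for box in corridor)]

# The robot's planned path, approximated as two boxes.
corridor = [Box((0, 0, 0), (2, 1, 2)), Box((2, 0, 0), (4, 1, 2))]

# Points detected by the headset's own sensors (e.g. a bystander's feet).
detected = [(1.0, 0.5, 1.0), (5.0, 0.5, 1.0)]

print(points_to_highlight(corridor, detected))  # → [(1.0, 0.5, 1.0)]
```

A production system would use swept meshes rather than boxes, but the idea is the same: the robot shares intent, and the display turns intent into perception.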

While these examples only scratch the surface of what is possible, they do serve to illustrate that the content of the AR overlays need not be generated entirely from data gathered by sensors attached to the display itself, but can be provided by other sources, including but not limited to other nearby devices. Those sources might include aerial or satellite imagery and information from databases. In the farming context, they might include 3D soil maps produced from core samples.
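The multi-source idea can be sketched as a simple per-grid-cell fusion step; the source names and field keys below are hypothetical, not a real farm-data schema:

```python
# Hypothetical sketch: merge overlay data from several sources (the
# headset's sensors, satellite imagery, a soil-core database) into one
# record per field grid cell, ready for rendering as an overlay.

def fuse_layers(*sources):
    """Each source maps grid cell -> partial readings; later sources
    fill in fields the earlier ones lack."""
    fused = {}
    for source in sources:
        for cell, readings in source.items():
            fused.setdefault(cell, {}).update(readings)
    return fused

headset_thermal = {(0, 0): {"temp_c": 31.2}, (0, 1): {"temp_c": 29.8}}
satellite_ndvi  = {(0, 0): {"ndvi": 0.61},  (0, 1): {"ndvi": 0.42}}
soil_core_db    = {(0, 0): {"moisture": 0.18}}

layers = fuse_layers(headset_thermal, satellite_ndvi, soil_core_db)
print(layers[(0, 0)])  # → {'temp_c': 31.2, 'ndvi': 0.61, 'moisture': 0.18}
```

Notice that nothing here cares whether a layer came from a sensor on the display, a satellite, or a database: once keyed to the same grid, all sources are equal inputs to the overlay.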

Examples of overlays that might be useful for a farmer include: thermal imagery; current soil moisture content; soil surface porosity and water-absorption capacity; exaggerated vertical relief, with expected runoff and erosion under various precipitation scenarios; highlighting of all plants of a particular species; highlighting of all plants exhibiting nutrient deficiencies or other trauma; highlighting of bare soil (no mulch or plant cover); and the presence, activity, and impact of various types of animals. This list could go on and on.

Machines may be better at doing particular manipulations of data, finding correlations, and even at answering well-specified questions, but they’re not so good at asking meaningful questions, much less at thinking outside the box. For this reason, the combination of human and machine is more powerful than either alone.

It’s still very early days in AR, and there’s a great deal of room for improvement. One development that is likely to occur sooner rather than later is voice operation, enabling hands-free control of the AR experience, including which overlays are active and how they are combined. With voice control, a farmer should be able to walk through a field, say what he wants to see, modify the plan controlling the robotic machinery that actually operates the farm, or issue commands for execution by the first available machine. For most, this will be a more intimate and far richer connection to their land than what they currently experience.
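The "first available machine" behavior amounts to a simple dispatch queue, sketched below; the machine names and command strings are purely illustrative:

```python
# Hypothetical sketch: voice commands enter a queue, machines report when
# they are idle, and each pending command is handed to the first free
# machine in arrival order.

from collections import deque

class Dispatcher:
    def __init__(self):
        self.pending = deque()      # commands awaiting a machine
        self.idle = deque()         # machines awaiting a command
        self.assignments = []       # (machine, command) pairs made so far

    def issue(self, command):
        """A (voice) command arrives."""
        self.pending.append(command)
        self._match()

    def machine_ready(self, name):
        """A machine reports it is free for work."""
        self.idle.append(name)
        self._match()

    def _match(self):
        # Pair commands with machines while both queues are non-empty.
        while self.pending and self.idle:
            self.assignments.append(
                (self.idle.popleft(), self.pending.popleft()))

d = Dispatcher()
d.machine_ready("weeder-1")
d.issue("mulch the bare soil in row 7")
d.issue("resample moisture near the gate")
d.machine_ready("weeder-2")
print(d.assignments)
```

Either side can arrive first, a command or a machine; the pairing logic is the same in both directions.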


John Payne


©2021 - ROBOTS Association

