How AR technology can help farmers stay relevant

by John Payne
19 January 2017



Image: Wheat Genetics and Germplasm Improvement

I’ve long believed that Augmented Reality (AR) and robotics are closely related. Both model their environments to some degree. Robotics uses that model to guide the behavior of a machine, whereas AR uses it to provide an enhanced sensory experience to a human.

The exact nature of that enhanced experience is bounded only by the available sensing, computational, and display (visual, audio, haptic, …) hardware, and by how usefully the gathered data can be transformed into overlays that augment the natural perception of the human user. Usefulness is a function of both the content of those overlays and their latency: the lag introduced by the computations involved in generating them. Faster computational hardware can produce more detailed overlays at the same latency, or the same overlays at lower latency, than slower hardware can.
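To make that trade-off concrete, here is a minimal sketch of a renderer that degrades overlay detail to stay within a latency budget. The detail levels, the 20 ms frame budget, and the placeholder rendering function are all invented for illustration:

```python
import time

# Increasing order of rendering cost; names are invented for illustration.
DETAIL_LEVELS = ["outline", "solid", "textured"]
FRAME_BUDGET_S = 0.020  # ~20 ms per frame keeps overlays feeling "live"

def render_overlay(scene, detail):
    """Placeholder for a real overlay-generation pipeline."""
    return {"detail": detail, "scene": scene}

def render_within_budget(scene):
    """Try the richest overlay first; fall back until one fits the budget."""
    for detail in reversed(DETAIL_LEVELS):
        start = time.perf_counter()
        overlay = render_overlay(scene, detail)
        if time.perf_counter() - start <= FRAME_BUDGET_S:
            return overlay
    # Nothing fit; settle for the cheapest overlay.
    return render_overlay(scene, DETAIL_LEVELS[0])

print(render_within_budget(scene="field-of-view data")["detail"])
```

A real pipeline would profile each detail level ahead of time rather than measuring on the fly, but the budget-versus-detail logic is the same.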

One important application for AR is making it safer and easier for a human to work in collaboration with robotic hardware. For example, a robot might publish the path it intends to follow and the 3D space through which it intends to pass, and an AR display might convert that information into highlighting of anything occupying that space. Or perhaps a machine wants to direct the attention of its human counterpart to some particular element of the environment, say one specific plant. That too could be highlighted in the display.
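As a rough illustration, suppose the robot publishes its planned corridor as a sequence of spheres along its path (a hypothetical format; real planners differ). The AR side then needs only a point-in-volume test to decide what to highlight:

```python
import math

def inside_corridor(point, corridor):
    """True if a 3D point lies within any (center, radius) sphere of the plan."""
    return any(math.dist(point, center) <= radius for center, radius in corridor)

def points_to_highlight(scene_points, corridor):
    """The AR display would tint these as 'in the robot's intended path'."""
    return [p for p in scene_points if inside_corridor(p, corridor)]

# A straight corridor along the x-axis, and two objects in the scene.
corridor = [((float(x), 0.0, 0.0), 0.5) for x in range(6)]
scene = [(1.2, 0.1, 0.0), (3.0, 2.0, 0.0)]
print(points_to_highlight(scene, corridor))  # -> [(1.2, 0.1, 0.0)]
```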

While these examples only scratch the surface of what is possible, they do serve to illustrate that the content of the AR overlays need not be generated entirely from data gathered by sensors attached to the display itself, but can be provided by other sources, including but not limited to other nearby devices. Those sources might include aerial or satellite imagery and information from databases. In the farming context, they might include 3D soil maps produced from core samples.
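A sketch of that idea: the renderer asks each registered source for whatever it knows about a location and merges the answers into one bundle for display. The source functions and their return values below are invented placeholders:

```python
def headset_sensors(lat, lon):
    return {"thermal_c": 24.1}       # from cameras on the display itself

def soil_core_map(lat, lon):
    return {"clay_fraction": 0.32}   # from a 3D soil-map database

def satellite_imagery(lat, lon):
    return {"ndvi": 0.71}            # from recent aerial/satellite passes

SOURCES = [headset_sensors, soil_core_map, satellite_imagery]

def overlay_data(lat, lon):
    """Merge every available source into one dict for the AR renderer."""
    merged = {}
    for source in SOURCES:
        merged.update(source(lat, lon))
    return merged

print(overlay_data(39.19, -96.58))
# -> {'thermal_c': 24.1, 'clay_fraction': 0.32, 'ndvi': 0.71}
```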

Examples of overlays that might be useful to a farmer include:

- thermal imagery;
- current soil moisture content;
- soil surface porosity and water-absorption capacity;
- exaggerated vertical relief, with expected runoff and erosion under various precipitation scenarios;
- highlighting of all plants of a particular species;
- highlighting of all plants exhibiting nutrient deficiencies or other trauma;
- highlighting of bare soil (no mulch or plant cover);
- the presence, activity, and impact of various types of animals.

The list could go on and on.
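One way to organize such a menu is a simple registry keyed by overlay name, from which the display composes whichever layers are active. Everything here, the names and the placeholder compute functions alike, is illustrative:

```python
def thermal(view):        return {"thermal": "heat map pixels"}
def soil_moisture(view):  return {"soil_moisture": "moisture contours"}
def bare_soil(view):      return {"bare_soil": "uncovered-ground mask"}

OVERLAYS = {
    "thermal": thermal,
    "soil moisture": soil_moisture,
    "bare soil": bare_soil,
}

def compose(view, active):
    """Run each active overlay and merge the results into one frame."""
    frame = {}
    for name in active:
        frame.update(OVERLAYS[name](view))
    return frame

print(sorted(compose(view=None, active={"thermal", "bare soil"})))
# -> ['bare_soil', 'thermal']
```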

Machines may be better at doing particular manipulations of data, finding correlations, and even at answering well-specified questions, but they’re not so good at asking meaningful questions, much less at thinking outside the box. For this reason, the combination of human and machine is more powerful than either alone.

It’s still very early days for AR, and there’s a great deal of room for improvement. One development likely to occur sooner rather than later is voice operation, enabling hands-free control of the AR experience, including which overlays are active and how they are combined. With voice control, a farmer should be able to walk through a field, say what he wants to see, modify the plan controlling the robotic machinery that actually operates the farm, or issue commands for execution by the first available machine. For most, this will be a more intimate and far richer connection to their land than what they currently experience.
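Once a speech-to-text engine has produced a transcript (not shown here), the voice layer itself can be quite simple. A minimal sketch, using an invented "show/hide/clear" grammar:

```python
def handle_command(text, active_overlays):
    """Toggle overlay layers based on a transcribed voice command."""
    words = text.lower().strip()
    if words.startswith("show "):
        active_overlays.add(words[len("show "):])
    elif words.startswith("hide "):
        active_overlays.discard(words[len("hide "):])
    elif words == "clear":
        active_overlays.clear()
    return active_overlays

active = set()
handle_command("show soil moisture", active)
handle_command("show thermal", active)
handle_command("hide thermal", active)
print(active)  # -> {'soil moisture'}
```

A production system would need fuzzy matching against the overlay registry and confirmation prompts for commands that move machinery, but the control surface is just this: phrases in, state changes out.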



