How AR technology can help farmers stay relevant

by John Payne
19 January 2017



Image: Wheat Genetics and Germplasm Improvement

I’ve long believed that Augmented Reality (AR) and robotics are closely related. Both model their environments to some degree. Robotics uses that model to guide the behavior of a machine, whereas AR uses it to provide an enhanced sensory experience to a human.

The exact nature of that enhanced experience is bounded only by the available sensory, computational, and output (audio, haptic, …) hardware, and by how the gathered data can be usefully transformed into overlays that augment the natural perception of the human user. What is useful is a function of both the content of those overlays and their latency, meaning the lag introduced by the computations involved in generating them. Faster computational hardware can produce more detailed overlays at the same latency, or the same overlays at lower latency, than slower hardware can.
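As a rough illustration of that tradeoff, consider a toy per-frame latency budget check. The pipeline stages and timings below are invented for the example, and a real AR system would measure them rather than hard-code them.

```python
# Toy latency-budget check for an AR overlay pipeline.
# Stage names and timings are hypothetical, not taken from any real system.

FRAME_BUDGET_MS = 1000 / 60  # ~16.7 ms per frame on a 60 Hz display

pipeline_ms = {
    "sensor_capture": 4.0,
    "pose_tracking": 3.0,
    "overlay_generation": 6.0,  # the stage that grows with overlay detail
    "render_and_display": 3.0,
}

total = sum(pipeline_ms.values())
verdict = "within" if total <= FRAME_BUDGET_MS else "exceeds"
print(f"total latency {total:.1f} ms {verdict} the {FRAME_BUDGET_MS:.1f} ms frame budget")
```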

One important application for AR is making it safer and easier for a human to work in collaboration with robotic hardware. For example, a robot might provide the path it intends to follow and the 3D space through which it intends to pass, and that information might be converted in an AR display into highlighting of anything occupying that space. Or perhaps a machine wants to direct the attention of its human counterpart to some particular element of the environment, say one specific plant. That too could be highlighted in the display.
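Here is a minimal sketch of that first example, assuming the robot publishes an axis-aligned bounding box for the volume it intends to sweep and the AR client tests tracked objects against it. The types and scene data are hypothetical.

```python
# Minimal sketch: highlight any tracked object inside the robot's intended path.
# Box type, object names, and coordinates are all hypothetical.

from dataclasses import dataclass

@dataclass
class Box:
    """Axis-aligned bounding box (min/max corners in metres)."""
    min_xyz: tuple
    max_xyz: tuple

    def intersects(self, other: "Box") -> bool:
        # Standard AABB overlap test, axis by axis.
        return all(a_min <= b_max and b_min <= a_max
                   for a_min, a_max, b_min, b_max
                   in zip(self.min_xyz, self.max_xyz, other.min_xyz, other.max_xyz))

def objects_to_highlight(robot_swept_volume: Box, tracked_objects: dict) -> list:
    """Return names of tracked objects occupying the robot's intended path."""
    return [name for name, box in tracked_objects.items()
            if robot_swept_volume.intersects(box)]

# Example: the robot plans to pass through a 1 m-wide corridor.
swept = Box((0.0, 0.0, 0.0), (5.0, 1.0, 2.0))
scene = {"crate": Box((2.0, 0.2, 0.0), (2.5, 0.8, 1.0)),
         "worker": Box((6.0, 0.0, 0.0), (6.5, 0.6, 1.8))}
print(objects_to_highlight(swept, scene))  # -> ['crate']
```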

While these examples only scratch the surface of what is possible, they do serve to illustrate that the content of the AR overlays need not be generated entirely from data gathered by sensors attached to the display itself, but can be provided by other sources, including but not limited to other nearby devices. Those sources might include aerial or satellite imagery and information from databases. In the farming context, they might include 3D soil maps produced from core samples.
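As a minimal sketch of that idea, imagine overlay data for a given field position being assembled from several registered providers rather than only from the headset's own sensors. The provider names, interface, and values below are all hypothetical.

```python
# Sketch: overlay content assembled from multiple registered data sources.
# Providers, interface, and returned values are invented for illustration.

from typing import Callable, Dict, Tuple

# Each provider maps a (lat, lon) position to some overlay data.
OverlaySource = Callable[[Tuple[float, float]], dict]

def satellite_imagery(pos):
    # e.g. fetched from an external imagery service
    return {"ndvi": 0.62}

def soil_core_database(pos):
    # e.g. a 3D soil map produced from core samples
    return {"clay_fraction": 0.31, "depth_to_hardpan_m": 0.9}

sources: Dict[str, OverlaySource] = {
    "satellite": satellite_imagery,
    "soil_map": soil_core_database,
}

def overlay_data(pos: Tuple[float, float]) -> dict:
    """Merge data from every registered source for this position."""
    return {name: source(pos) for name, source in sources.items()}

print(overlay_data((38.95, -95.25)))
```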

Examples of overlays that might be useful for a farmer include:

- thermal imagery;
- current soil moisture content;
- soil surface porosity and water-absorption capacity;
- exaggerated vertical relief, together with the expected runoff and resulting erosion under various precipitation scenarios;
- highlighting of all plants of a particular species;
- highlighting of all plants exhibiting nutrient deficiencies or other trauma;
- highlighting of bare soil (no mulch or plant cover);
- the presence, activity, and impact of various types of animals.

This list could go on and on.
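As a toy illustration of one such overlay rule, the sketch below flags map cells with low soil moisture or bare soil. The threshold, field names, and values are invented.

```python
# Toy overlay rule: flag field cells with low soil moisture or bare soil.
# Threshold and cell data are hypothetical.

MOISTURE_THRESHOLD = 0.18  # volumetric water content; assumed crop-specific

cells = [
    {"id": "A1", "soil_moisture": 0.25, "cover": "mulch"},
    {"id": "A2", "soil_moisture": 0.12, "cover": "bare"},
    {"id": "A3", "soil_moisture": 0.16, "cover": "wheat"},
]

for cell in cells:
    flags = []
    if cell["soil_moisture"] < MOISTURE_THRESHOLD:
        flags.append("low moisture")
    if cell["cover"] == "bare":
        flags.append("bare soil")  # the 'no mulch or plant cover' overlay above
    if flags:
        print(cell["id"], "->", ", ".join(flags))
```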

Machines may be better at doing particular manipulations of data, finding correlations, and even at answering well-specified questions, but they’re not so good at asking meaningful questions, much less at thinking outside the box. For this reason, the combination of human and machine is more powerful than either alone.

It’s still very early days in AR, and there’s a great deal of room for improvement. One development that is likely to occur sooner rather than later is voice operation, enabling hands-free control of the AR experience, including which overlays are active and how they are combined. With voice control, a farmer should be able to walk through a field, say what he wants to see, and either modify the plan controlling the robotic machinery that actually operates the farm or issue commands for execution by the first available machine. For most, this will be a more intimate and far richer connection to their land than what they currently experience.
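To make that concrete, here is a toy sketch of voice-driven overlay control, with recognised utterances toggling named overlay layers. A real system would sit behind a speech recogniser and a proper command grammar; the "show"/"hide" format here is invented for illustration.

```python
# Toy sketch: map simple voice commands to AR overlay toggles.
# The command format and overlay names are hypothetical.

active_overlays: set = set()

def handle_utterance(utterance: str) -> None:
    """Interpret 'show <overlay>' / 'hide <overlay>' commands."""
    words = utterance.lower().split()
    verb, layer = words[0], " ".join(words[1:])
    if verb == "show":
        active_overlays.add(layer)
    elif verb == "hide":
        active_overlays.discard(layer)
    print("active overlays:", ", ".join(sorted(active_overlays)) or "none")

handle_utterance("show soil moisture")    # -> soil moisture
handle_utterance("show thermal imagery")  # -> soil moisture, thermal imagery
handle_utterance("hide soil moisture")    # -> thermal imagery
```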




