Robohub.org
 

How AR technology can help farmers stay relevant


19 January 2017



Image: Wheat Genetics and Germplasm Improvement


I’ve long believed that Augmented Reality (AR) and robotics are closely related. Both model their environments to some degree. Robotics uses that model to guide the behavior of a machine, whereas AR uses it to provide an enhanced sensory experience to a human.

The exact nature of that enhanced experience is bounded only by the available sensory, computational, and display (audio, haptic, …) hardware, and by how usefully the gathered data can be transformed into overlays that augment the user's natural perception. What is useful is a function of both the content of those overlays and their latency: the lag introduced by the computations that generate them. Faster hardware can produce more detailed overlays at the same latency, or the same overlays at lower latency, than slower hardware.
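The latency constraint amounts to simple arithmetic: every stage of the overlay pipeline has to fit inside the time one display frame allows. A minimal sketch, with entirely hypothetical stage timings:

```python
# Illustrative frame-time budget for an AR overlay pipeline.
# All stage timings below are made-up numbers for the sake of the example.

def per_frame_budget_ms(refresh_hz: float) -> float:
    """Milliseconds available per frame at a given display refresh rate."""
    return 1000.0 / refresh_hz

def fits_budget(stage_times_ms: list[float], refresh_hz: float) -> bool:
    """True if the summed pipeline stage times fit within one frame."""
    return sum(stage_times_ms) <= per_frame_budget_ms(refresh_hz)

# A 60 Hz display allows roughly 16.7 ms per frame.
budget = per_frame_budget_ms(60)
ok = fits_budget([4.0, 6.0, 5.0], 60)        # 15 ms of work: fits
too_slow = fits_budget([4.0, 6.0, 9.0], 60)  # 19 ms of work: does not fit
```

Faster hardware shrinks the stage times, which is exactly the trade-off described above: spend the freed budget on richer overlays, or keep the overlays and enjoy lower lag.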

One important application for AR is making it safer and easier for a human to work in collaboration with robotic hardware. For example, a robot might provide the path it intends to follow and the 3D space through which it intends to pass, and that information might be converted in an AR display into highlighting of anything occupying that space. Or perhaps a machine wants to direct the attention of its human counterpart to some particular element of the environment, say one specific plant. That too could be highlighted in the display.
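The path-highlighting idea above can be sketched in a few lines. This is a minimal illustration (all names are hypothetical, and real systems would use richer geometry than axis-aligned boxes): the robot publishes the 3D volume it intends to pass through, and the AR display flags any tracked object whose bounding box intersects that volume.

```python
# Sketch: highlight tracked objects that occupy a robot's intended 3D corridor.
from dataclasses import dataclass

@dataclass
class Box:
    """Axis-aligned bounding box: (min_x, min_y, min_z) .. (max_x, max_y, max_z)."""
    min_pt: tuple[float, float, float]
    max_pt: tuple[float, float, float]

def boxes_intersect(a: Box, b: Box) -> bool:
    """Standard AABB test: boxes intersect iff they overlap on every axis."""
    return all(a.min_pt[i] <= b.max_pt[i] and b.min_pt[i] <= a.max_pt[i]
               for i in range(3))

def objects_to_highlight(robot_corridor: Box, tracked: dict[str, Box]) -> list[str]:
    """Names of tracked objects occupying the robot's intended path."""
    return [name for name, box in tracked.items()
            if boxes_intersect(robot_corridor, box)]

corridor = Box((0, 0, 0), (10, 2, 2))
scene = {
    "fence_post": Box((4, 1, 0), (5, 1.5, 1.5)),  # inside the corridor
    "water_tank": Box((20, 0, 0), (22, 2, 2)),    # well clear of it
}
print(objects_to_highlight(corridor, scene))  # -> ['fence_post']
```

The display would then render the returned objects with a warning highlight, so the human sees at a glance what stands in the machine's way.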

While these examples only scratch the surface of what is possible, they do serve to illustrate that the content of the AR overlays need not be generated entirely from data gathered by sensors attached to the display itself, but can be provided by other sources, including but not limited to other nearby devices. Those sources might include aerial or satellite imagery and information from databases. In the farming context, they might include 3D soil maps produced from core samples.

Examples of overlays that might be useful to a farmer include:

- thermal imagery;
- current soil moisture content;
- soil surface porosity and water-absorption capacity;
- exaggerated vertical relief, with expected runoff and resulting erosion under various precipitation scenarios;
- highlighting of all plants of a particular species;
- highlighting of all plants exhibiting nutrient deficiencies or other trauma;
- highlighting of bare soil (no mulch or plant cover);
- the presence, activity, and impact of various types of animals.

This list could go on and on.
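One of these overlays, highlighting plants showing nutrient deficiency or other stress, is commonly approximated with a vegetation index. The sketch below (the threshold and band values are illustrative assumptions, not calibrated figures) computes NDVI from red and near-infrared reflectance and masks pixels whose index falls below a "healthy" cutoff:

```python
# Sketch: flag stressed vegetation via NDVI thresholding.
# Healthy plants reflect strongly in near-infrared (NIR) and absorb red,
# giving a high NDVI; stressed plants score lower.

def ndvi(nir: float, red: float) -> float:
    """Normalized Difference Vegetation Index for one pixel."""
    denom = nir + red
    return 0.0 if denom == 0 else (nir - red) / denom

def stress_mask(nir_band: list[float], red_band: list[float],
                threshold: float = 0.4) -> list[bool]:
    """True where vegetation looks stressed (NDVI below threshold)."""
    return [ndvi(n, r) < threshold for n, r in zip(nir_band, red_band)]

nir = [0.70, 0.45, 0.80]
red = [0.10, 0.40, 0.08]
print(stress_mask(nir, red))  # -> [False, True, False]
```

In an AR display, the resulting mask would be rendered as a colored wash over the stressed plants in the farmer's field of view.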

Machines may be better at doing particular manipulations of data, finding correlations, and even at answering well-specified questions, but they’re not so good at asking meaningful questions, much less at thinking outside the box. For this reason, the combination of human and machine is more powerful than either alone.

It’s still very early days in AR, and there’s a great deal of room for improvement. One development that is likely to occur sooner rather than later is voice operation, enabling hands-free control of the AR experience, including which overlays are active and how they are combined. With voice control, a farmer should be able to walk through a field, say what he wants to see, and make modifications to the plan controlling the robotic machinery that actually operates the farm, or issue commands for execution by the first available machine. For most, this will be a more intimate and far richer connection to their land than they currently experience.
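The hands-free control loop can be sketched as a mapping from recognized speech to overlay toggles. Everything here is hypothetical (the command vocabulary, the overlay names, and the assumption that speech has already been transcribed to text by a recognizer):

```python
# Sketch: map recognized voice commands to the set of active AR overlays.
# Grammar is deliberately tiny: "show <overlay>" / "hide <overlay>".

def apply_voice_command(command: str, active_overlays: set[str]) -> set[str]:
    """Return the new set of active overlays after applying one command."""
    overlays = set(active_overlays)  # don't mutate the caller's set
    verb, _, target = command.strip().lower().partition(" ")
    target = target.replace(" ", "_")
    if verb == "show":
        overlays.add(target)
    elif verb == "hide":
        overlays.discard(target)
    return overlays

state: set[str] = set()
state = apply_voice_command("show soil moisture", state)
state = apply_voice_command("show thermal", state)
state = apply_voice_command("hide thermal", state)
print(sorted(state))  # -> ['soil_moisture']
```

A real system would sit atop a speech recognizer and a far richer grammar, but the core loop, recognize, interpret, update the display state, is this simple.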







John Payne








 

©2025.05 - Association for the Understanding of Artificial Intelligence


 











