Drones, volcanoes and the ‘computerisation’ of the Earth


29 December 2017




The Mount Agung volcano spews smoke, as seen from Karangasem, Bali. EPA-EFE/MADE NAGI


By Adam Fish

The eruption of the Agung volcano in Bali, Indonesia, has been devastating, particularly for the 55,000 local people who have had to leave their homes and move into shelters. It has also played havoc with flights in and out of the island, leaving people stranded while experts try to work out what the volcano will do next.

But this has been a fascinating time for scholars like me who investigate the use of drones in social justice, environmental activism and crisis preparedness. The use of drones in this context is just the latest example of the “computerisation of nature” and raises questions about how reality is increasingly being constructed by software.

Amazon is developing drone delivery in the UK, drones are delivering blood in Rwanda, and in Indonesia people are using drones to monitor orangutan populations, map the expansion of palm oil plantations and gather information that might help us predict when volcanoes such as Agung will next erupt with devastating impact.

In Bali, I have the pleasure of working with a remarkable group of drone professionals, inventors and hackers who work for Aeroterrascan, a drone company from Bandung, on the Indonesian island of Java. As part of their corporate social responsibility, they have donated their time and technologies to the Balinese emergency and crisis response teams. It’s been fascinating to participate in a project that flies remote sensing systems high in the air in order to better understand dangerous forces deep in the Earth.

I’ve been involved in two different drone volcano missions, and a third will begin in a few days. In the first, we used drones to create a highly detailed 3D map of the volcano, accurate to within 20cm. With this information, we could see whether the volcano was actually growing in size – key evidence that it was about to erupt.
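To make the idea concrete, here is a minimal sketch, in Python, of how two photogrammetry-derived elevation models might be compared to flag ground deformation between surveys. The grids, values and 20cm noise floor are illustrative placeholders, not the team's actual pipeline.

```python
import numpy as np

# Hypothetical example: compare two digital elevation models (DEMs)
# built from successive drone photogrammetry surveys of the same area.
# Grid size, elevations and the 0.2 m noise floor are illustrative only.

def detect_uplift(dem_before, dem_after, noise_floor=0.2):
    """Return a mask of cells that rose by more than the survey's
    vertical accuracy (here ~20 cm), plus the mean elevation change."""
    diff = dem_after - dem_before      # per-cell elevation change (metres)
    uplift = diff > noise_floor        # ignore changes within survey noise
    return uplift, float(diff.mean())

# Fake 100x100 grids standing in for real survey rasters.
rng = np.random.default_rng(0)
before = rng.normal(2500.0, 1.0, (100, 100))  # summit area, metres a.s.l.
after = before + 0.05                         # pretend 5 cm of inflation
after[40:60, 40:60] += 0.5                    # a localised 50 cm bulge

mask, mean_change = detect_uplift(before, after)
print(f"cells above noise floor: {mask.sum()}, mean change: {mean_change:.3f} m")
```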

The second mission involved flying a sensor that “smells” carbon dioxide and sulphur dioxide through the plume. An increase in these gases can tell us whether an eruption looms. The sensor detected a high concentration of carbon dioxide, which led the government to raise the threat warning to the highest level.
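Again purely as an illustration, here is a simple sketch of how averaged plume readings might be turned into an advisory. The field names and thresholds are hypothetical; real alert decisions rest on calibrated instruments and expert judgment.

```python
# Hypothetical sketch of converting plume gas readings into an alert
# recommendation. Thresholds and field names are illustrative, not the
# mission's actual calibration.

def assess_plume(readings):
    """readings: list of dicts with 'co2' and 'so2' concentrations in ppm."""
    avg_co2 = sum(r["co2"] for r in readings) / len(readings)
    avg_so2 = sum(r["so2"] for r in readings) / len(readings)
    # CO2 exsolves from magma at greater depth than SO2, so a rising
    # CO2/SO2 ratio is one recognised sign of fresh magma ascending.
    ratio = avg_co2 / avg_so2 if avg_so2 else float("inf")
    if avg_co2 > 1000 or ratio > 20:   # illustrative thresholds
        return "recommend raising alert level"
    return "continue monitoring"

samples = [{"co2": 1200, "so2": 45}, {"co2": 1350, "so2": 50}]
print(assess_plume(samples))
```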

In the forthcoming third mission, we will use drones to see if anyone is still in the exclusion zone so they can be found and rescued.

What is interesting to me as an anthropologist is how scientists and engineers use technologies to better understand distant processes in the atmosphere and below the Earth’s surface. It has been a difficult task, flying a drone 3,000 metres to the summit of an erupting volcano. Several different groups have tried, and a few expensive drones have been lost – sacrifices to what the Balinese Hindus consider a sacred mountain.

More philosophically, I am interested in better understanding the implications of having sensor systems such as drones flying about in the air, under the seas, or over volcanic craters – basically everywhere. These tools may help us evacuate people before a crisis, but doing so also entails transforming organic signals into computer code. We’ve long interpreted nature through technologies that augment our senses, particularly sight: microscopes, telescopes and binoculars have been great assets for chemistry, astronomy and biology.

The internet of nature

But the sensorification of the elements is something different. It has been called the computationalisation of the Earth. We’ve heard a lot about the internet of things, but this is the internet of nature – the surveillance state turned onto biology. The present proliferation of drones is the latest step in wiring everything on the planet, in this case the air itself, to better understand the guts of a volcano.

These flying sensors, it is hoped, will give volcanologists what anthropologist Stefan Helmreich called abduction – a predictive and prophetic “argument from the future”.

But the drones, sensors and software we use provide a particular and partial worldview. Looking back at today from the future, what will be the impact of the increasing datafication of nature: better crop yields, emergency preparation, endangered species monitoring? Or will this quantification of the elements result in a reduction of nature to computer logic?

There is something not fully comprehended – or more ominously not comprehensible – about how flying robots and self-driving cars equipped with remote sensing systems filter the world through big data crunching algorithms capable of generating and responding to their own artificial intelligence.

These non-human others react to the world not as ecological, social or geological processes but as functions and feature sets in databases. I am concerned by what this software view of nature will exclude and, as these systems remake the world in their database image, what the implications of those exclusions might be for planetary sustainability and human autonomy.

In this future world, there may be less of a difference between engineering towards nature and the engineering of nature.

Adam Fish, Senior Lecturer in Sociology and Media Studies, Lancaster University

This article was originally published on The Conversation. Read the original article.




The Conversation is an independent source of news and views, sourced from the academic and research community and delivered direct to the public.




