
Relax, we’re not living in a computer simulation

by Alan Winfield
13 July 2016




Ever since Elon Musk’s recent admission that he’s a simulationist, several people have asked me what I think of the proposition that we are living inside a simulation. My view, very firmly, is that the Universe we are experiencing right now is real. Here are my reasons.

First, Occam’s razor: the principle of explanatory parsimony. The problem with the simulation argument is that it is a fantastically complicated explanation for the universe we experience; it is about as implausible as the idea that some omnipotent being created the universe. No. The simplest and most elegant explanation is that the universe we see and touch, both first-hand and through our telescopes, LIGOs and Large Hadron Colliders, is the real universe and not an artifact of some massive computer simulation.

Second is the problem of the Reality Gap. Anyone who uses simulation as a tool to develop robots is well aware that robots which appear to work perfectly well in a simulated virtual world often don’t work well at all when the same design is tested on the real robot. The problem is especially acute when we are artificially evolving those robots. The reason is that the simulation’s model of the real world, and of the robot(s) within it, is an approximation. The Reality Gap refers to this less-than-perfect fidelity; a better (higher-fidelity) simulator narrows the gap.

Anyone who has actually coded a simulator is painfully aware that the cost, both computational and in coding effort, of improving a simulation’s fidelity even a little is very high indeed. My long experience of both coding and using computer simulations teaches me that there is a law of diminishing returns: each additional 1% of simulator fidelity costs far more than 1% of additional effort. I rather suspect that the computational and coding cost of a simulator with 100% fidelity is infinite. Rather as in hi-fi audio, the amount of money you would need to spend to perfectly reproduce the sound of a Stradivarius ends up higher than the cost of hiring a real Strad, and a world-class violinist to play it for you.

At this point the simulationists might argue that the simulation we are living in doesn’t need to be perfect, just good enough. Good enough to do what, exactly? To fool us into believing the simulated world is real, or good enough to run on a finite computer (i.e. one with finite computational power, running at a finite speed)? The problem with this argument is that every time we look deeper into the universe we see more: more galaxies, more sub-atomic particles and, in short, more detail. The Voyager 1 spacecraft has left the Solar System without crashing, like Truman, into the edge of the simulation. There are no glitches like the déjà vu in The Matrix.

My third argument is about the computational effort, and therefore the energy cost, of simulation. I conjecture that non-trivially simulating a complex system x (a human, say) requires more energy than the real x consumes. Written as an inequality: E_sim(x) > E_real(x); how much greater depends on how high the fidelity of the simulation is.

Let me explain. The average human burns around 2,000 Calories a day, or about 8,400 kilojoules (kJ) of energy. How much energy would a computer simulation of a human require, one capable of doing all the same stuff (even in a virtual world) that you can do in your day? That is impossible to estimate, because we cannot yet simulate complete human brains (let alone the rest of a human). But here’s one illustration. Lee Sedol played AlphaGo a few months ago. In a single 2-hour match he burned about 170 Calories, the amount of energy you’d get from an egg sandwich. In the same 2 hours the AlphaGo machine consumed around 50,000 times more energy.
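As a back-of-the-envelope check, the Python sketch below takes the figures quoted above (170 Calories for Lee Sedol, and the roughly 50,000x multiple for AlphaGo) and works out what they imply; only the standard unit conversions are added.

```python
# Back-of-the-envelope check of the human vs AlphaGo energy comparison.
# The 170 Calorie figure and the ~50,000x multiple are taken from the text
# above; only the standard unit conversions are added here.

KCAL_TO_KJ = 4.184            # 1 food Calorie (kcal) = 4.184 kJ

human_kcal = 170              # energy Lee Sedol burned in a 2-hour match
match_hours = 2
alphago_multiple = 50_000     # AlphaGo's energy use relative to Lee Sedol

human_kj = human_kcal * KCAL_TO_KJ                 # ~711 kJ
alphago_kj = human_kj * alphago_multiple           # ~3.6e7 kJ over the match
alphago_kw = alphago_kj / (match_hours * 3600)     # implied average power draw

print(f"Lee Sedol: {human_kj:,.0f} kJ over {match_hours} h")
print(f"AlphaGo:   {alphago_kj:,.0f} kJ over {match_hours} h "
      f"(an average draw of roughly {alphago_kw:,.0f} kW)")
```

On those figures the machine’s implied average draw comes out at several megawatts, while the player runs on roughly the power of a light bulb; the precise numbers matter less than the orders of magnitude between them.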

What can we simulate? The most complex organism that we have been able to simulate so far is the nematode worm C. elegans. I previously estimated that the energy cost of simulating the nervous system of a C. elegans is (optimistically) about 9 J/hr, which is roughly 2,000 times more than the real nematode consumes (about 0.004 J/hr).
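For what it’s worth, that ratio is easy to reproduce from the two estimates (both are the figures quoted above, not new measurements):

```python
# Quick check of the C. elegans figures quoted above.
sim_j_per_hr = 9.0      # estimated cost of simulating the worm's nervous system
real_j_per_hr = 0.004   # estimated metabolic cost of the real nematode
print(f"Simulation overhead: ~{sim_j_per_hr / real_j_per_hr:,.0f}x")  # ~2,250x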

I think there are lots of good reasons why simulating a complex system on a computer costs more energy than the same system consumes in the real world, so for now I’ll ask you to take my word for it (I’ll write about it another time). What’s more, the relationship between energy cost and mass follows a power law, Kleiber’s Law, and I strongly suspect the same kind of law applies to scaling up computational effort, as I wrote here. Thus, if the complexity of an organism o is C, then following Kleiber’s Law the energy cost of simulating that organism, e, will be e ∝ C^X.

Furthermore, the exponent X (which in Kleiber’s Law is reckoned to be between 0.66 and 0.75 for animals, and 1 for plants) will itself be a function of the fidelity of the simulation, hence X(F), where F is a measure of fidelity.

By using the number of synapses as a proxy for complexity, and making some guesses about the values of X and F, we could probably estimate the energy cost of simulating every human on the planet (much harder would be estimating the energy cost of simulating every living thing on the planet). It would be a very big number indeed, but that’s not really the point I’m making here.
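To make the shape of such an estimate concrete, here is a purely illustrative Python sketch. The exponent, the human synapse count and the population figure are assumptions chosen for illustration, and the constant of proportionality is calibrated against the C. elegans estimate quoted earlier; none of these are the article’s own numbers.

```python
# Illustrative sketch of the scaling argument: assume the energy cost of
# simulating an organism, e, scales with its complexity C (here, synapse
# count) as e = k * C**X.  The constant k is calibrated from the C. elegans
# estimate quoted above; the exponent, human synapse count and population
# figure are assumptions made purely for illustration.

def sim_energy_j_per_hr(synapses: float, k: float, exponent: float) -> float:
    """Assumed energy cost (J/hr) of simulating an organism of given complexity."""
    return k * synapses ** exponent

# Calibration point: the 9 J/hr estimate for simulating the C. elegans
# nervous system is from the text; ~7,000 synapses is the commonly cited count.
CELEGANS_SYNAPSES = 7_000
CELEGANS_SIM_J_PER_HR = 9.0

X = 0.75                                    # assumed Kleiber-like exponent
k = CELEGANS_SIM_J_PER_HR / CELEGANS_SYNAPSES ** X

HUMAN_SYNAPSES = 1.5e14                     # commonly cited order of magnitude
POPULATION = 7.5e9                          # rough world population (2016)

per_human = sim_energy_j_per_hr(HUMAN_SYNAPSES, k, X)
print(f"One human:  ~{per_human:.3g} J/hr")
print(f"All humans: ~{per_human * POPULATION:.3g} J/hr")
```

The specific output is not meaningful in itself; the sketch only shows how the pieces (a complexity proxy, an exponent, a calibration point) would combine, and that any such estimate comes out very large.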

The fundamental issue is this: if my conjecture that simulating a complex system x requires more energy than the real x consumes is correct, then simulating the base-level universe would require more energy than that universe contains, which is clearly impossible. Thus we could not, even in principle, simulate the whole of our own observable universe to a level of fidelity sufficient for our conscious experience. And, for the same reason, neither could our super-advanced descendants create a simulation of a duplicate ancestor universe for us to (virtually) live in. Hence, we are not living in such a simulation.





Alan Winfield is Professor in robotics at UWE Bristol. He communicates about science on his personal blog.




