Relax, we’re not living in a computer simulation

by Alan Winfield
13 July 2016




Ever since Elon Musk’s recent admission that he’s a simulationist, several people have asked me what I think of the proposition that we are living inside a simulation. My view is very firmly that the Universe we are right now experiencing is real. Here are my reasons.

Firstly, Occam’s razor: the principle of explanatory parsimony. The problem with the simulation argument is that it is a fantastically complicated explanation for the universe we experience. It’s about as implausible as the idea that some omnipotent being created the universe. No. The simplest and most elegant explanation is that the universe we see and touch, both first hand and through our telescopes, LIGOs and Large Hadron Colliders, is the real universe and not an artifact of some massive computer simulation.

Second is the problem of the Reality Gap. Anyone who uses simulation as a tool to develop robots is well aware that robots which appear to work perfectly well in a simulated virtual world often don’t work very well at all when the same design is tested on a real robot. This problem is especially acute when we are artificially evolving those robots. The reason for these problems is that the simulation’s model of the real world, and of the robot(s) within it, is an approximation. The Reality Gap refers to the less-than-perfect fidelity of the simulation; a better (higher fidelity) simulator would narrow the gap.

Anyone who has actually coded a simulator is painfully aware that the cost of improving the fidelity of a simulation, not just the computational cost but also the cost in coding effort, is very high indeed, even for a small improvement. My long experience of both coding and using computer simulations teaches me that there is a law of diminishing returns, i.e. each additional 1% of simulator fidelity costs far more than an additional 1% of effort. I rather suspect that the computational and coding cost of a simulator with 100% fidelity is infinite. Rather as in hi-fi audio, the amount of money you would need to spend to perfectly reproduce the sound of a Stradivarius ends up higher than the cost of hiring a real Strad and a world-class violinist to play it for you.

At this point the simulationists might argue that the simulation we are living in doesn’t need to be perfect, just good enough. Good enough to do what, exactly? Good enough to fool us into believing we’re not living in a simulation, or good enough to run on a finite computer (i.e. one with finite computational power running at a finite speed)? The problem with this argument is that every time we look deeper into the universe we see more: more galaxies, more sub-atomic particles, and so on. In short, we see more detail. The Voyager 1 spacecraft has left the Solar System without crashing, like Truman, into the edge of the simulation. There are no glitches like the déjà vu in The Matrix.

My third argument is about the computational effort, and therefore the energy cost, of simulation. I conjecture that non-trivially simulating a complex system x (e.g. a human) requires more energy than the real x consumes. An equation to express this inequality is simply E_sim(x) > E(x); how much greater the left-hand side is depends on the fidelity of the simulation.

Let me explain. The average human burns around 2,000 Calories a day, or about 8,400 kilojoules of energy. How much energy would a computer simulation of a human require, capable of doing all the same stuff (even in a virtual world) that you can do in your day? Well, that’s impossible to estimate, because we can’t yet simulate complete human brains (let alone the rest of a human). But here’s one illustration. Lee Sedol played AlphaGo a few months ago. In a single two-hour match he burned about 170 Calories, roughly the amount of energy you’d get from an egg sandwich. In the same two hours the AlphaGo machine consumed around 50,000 times more energy.
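As a quick sanity check, here is a back-of-envelope sketch (in Python) of what those figures imply. The 2,000 Calorie, 170 Calorie and 50,000x numbers are taken from the text above; the wattages are simply what those figures work out to, not independent measurements of AlphaGo’s hardware.

KCAL_TO_KJ = 4.184                             # 1 food Calorie (kcal) = 4.184 kJ

daily_kj = 2000 * KCAL_TO_KJ                   # average human: ~8,400 kJ per day
match_seconds = 2 * 3600                       # the two-hour match
sedol_kj = 170 * KCAL_TO_KJ                    # ~711 kJ burned by Lee Sedol
sedol_watts = sedol_kj * 1000 / match_seconds  # ~99 W average

# AlphaGo, taking the "around 50,000 times more energy" figure at face value
alphago_kj = 50_000 * sedol_kj
alphago_mw = alphago_kj * 1000 / match_seconds / 1e6  # ~4.9 MW average

print(f"Average human: ~{daily_kj:.0f} kJ per day")
print(f"Lee Sedol: ~{sedol_kj:.0f} kJ over the match (~{sedol_watts:.0f} W)")
print(f"AlphaGo: ~{alphago_kj:.1e} kJ over the match (~{alphago_mw:.1f} MW)")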

What can we simulate? The most complex organism that we have been able to simulate so far is the nematode worm C. elegans. I previously estimated that the energy cost of simulating the nervous system of a C. elegans is (optimistically) about 9 J/hour, which is about 2,000 times greater than that of the real nematode (0.004 J/hour).
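The same kind of check on the nematode figures (both values are the estimates quoted above) gives a ratio consistent with the conjecture that simulation costs more than reality:

sim_j_per_hr = 9.0     # estimated cost of simulating the C. elegans nervous system
real_j_per_hr = 0.004  # approximate metabolic rate of the real worm
print(f"simulated / real energy ratio: ~{sim_j_per_hr / real_j_per_hr:.0f}x")  # ~2250x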

I think there are lots of good reasons why simulating a complex system on a computer costs more energy than the same system consumes in the real world, so I’ll ask you to take my word for it (I’ll write about it another time). What’s more, the relationship between energy cost and mass follows a power law, Kleiber’s Law, and I strongly suspect that the same law applies to scaling up computational effort, as I wrote here. Thus, if the complexity of an organism o is C, then following Kleiber’s Law the energy cost of simulating that organism, e, will be e ∝ C^X.

Furthermore, the exponent X (which in Kleiber’s law is reckoned to be between 0.66 and 0.75 for animals, and close to 1 for plants) will itself be a function of the fidelity of the simulation, hence X(F), where F is a measure of fidelity; the scaling then becomes e ∝ C^X(F).

By using the number of synapses as a proxy for complexity and making some guesses about the values of X and F we could probably estimate the energy cost of simulating all humans on the planet (much harder would be estimating the energy cost of simulating every living thing on the planet). It would be a very big number indeed, but that’s not really the point I’m making here.
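Purely to illustrate how such an estimate might be set up, here is a sketch that plugs guessed numbers into the e ∝ C^X(F) scaling above, using synapse count as the complexity proxy. The cost scale k, the assumed form X(F) = x0 + slope*F, and the chosen fidelity are all hypothetical placeholders; only the ~10^14 synapse count (a commonly quoted order of magnitude) and the ~8,400 kJ/day human figure are not part of the guesswork.

def sim_energy_j_per_hr(complexity, fidelity, k=1e-6, x0=0.75, slope=0.25):
    """Guessed energy per hour to simulate one organism of complexity C.

    Assumes the exponent rises with fidelity, X(F) = x0 + slope * F, with F in
    [0, 1]; k is an arbitrary cost scale chosen purely for illustration."""
    return k * complexity ** (x0 + slope * fidelity)

human_synapses = 1e14               # ~10^14 synapses per human brain
population = 7.5e9                  # roughly the 2016 world population
real_human_j_per_hr = 8400e3 / 24   # ~3.5e5 J/hr, from the ~8,400 kJ/day above

per_human = sim_energy_j_per_hr(human_synapses, fidelity=0.9)
print(f"Simulated human: ~{per_human:.1e} J/hr "
      f"(~{per_human / real_human_j_per_hr:.0f}x a real human)")
print(f"All humans: ~{per_human * population:.1e} J/hr")

With these particular guesses the total comes out at around 10^17 joules per hour; the value itself means nothing, but the structure of the estimate is the point.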

The fundamental issue is this: if my conjecture that simulating a complex system x requires more energy than the real x consumes is correct, then simulating the base-level universe would require more energy than that universe contains, which is clearly impossible. Thus we could not, even in principle, simulate the whole of our own observable universe to a level of fidelity sufficient for our conscious experience. And, for the same reason, neither could our super-advanced descendants create a simulation of a duplicate ancestor universe for us to (virtually) live in. Hence, we are not living in such a simulation.





Alan Winfield is Professor in robotics at UWE Bristol. He communicates about science on his personal blog.




