Could we experience the workings of our own brains?

08 May 2013


One of the oft-quoted paradoxes of consciousness is that we are unable to observe or experience our own conscious minds at work; that we cannot be conscious of the workings of consciousness. I’ve always been puzzled about why this is a puzzle. After all, we don’t think it odd that word processors have no insight into their inner workings (although that’s a bad example, because we might conceivably code a future self-aware WP and arrange for it to access its inner machinery).

Perhaps a better example is this: The act of picking up a cup of hot coffee and bringing it to your lips appears, on the face of it, to be perfectly observable. No mystery at all. We can see the joints and muscles at work, ‘feel’ the tactile sensing of the coffee cup, and its weight as we begin to lift it. We can even build mathematical models of the kinetics and dynamics, and (with somewhat more difficulty) make robot arms to pick up cups of coffee. But – I contend – we are kidding ourselves if we think we know what’s going on in the complex sensory and neurological processes that appear so effortless to perform. The fact that we can observe and even feel ourselves lifting a coffee cup gives very little real insight. And the mathematical models – and robots – are not really models of the human neurological and physiological processes at all; they are models of idealised abstractions of limbs, joints and hands.


I would argue that we have no greater insight into the workings of this (apparently straightforward) physical act, than we do of thinking itself. But again this is not surprising. The additional cognitive machinery to be able to access or experience the inner workings of any process, whether mental or physical, would be huge and (biologically) expensive. And with no apparent survival value (except perhaps for philosophers of mind), it’s not surprising that such mechanisms have not evolved. They would of course require not just extra grey matter, but sensing too. It’s interesting that there are no pain receptors within our brains – that’s why it’s perfectly possible to have brain surgery while wide awake.

But this got me thinking. Imagine that at some future time we have nanoscale sensors capable of positioning themselves throughout our brains in order to provide a very large sensor network. If each sensor is monitoring the activity of key neurons, or axons, and able to transmit its readings in real time to an external device, then we would have the data to provide ourselves with a real-time activity image of our own brains. It could be presented visually, or perhaps sonically (or via multimedia). It might be fun for a while, but this personal brain imaging technology (let’s call it iBrain) probably wouldn’t provide us with much more insight into, or experience of, our own thought processes.

[Image: Diffusion spectrum imaging of the human brain, developed by neuroscientist Van Wedeen at Massachusetts General Hospital. Learn more: MIT Technology Review]

But let’s assume that by the time we have the nanotechnology for harmlessly inserting millions of brain nanosensors we will have also figured out the major architectural structures of the brain – crucially linking the neural scale to the macro scale. Actually, if we believe that the recently announced European and US human brain Grand Challenges will achieve what they are promising in terms of modelling and mapping human brain activity, then such an understanding should only be a few decades away. So now build those maps and structures into the personal iBrain, and we will be presented not with a vast and bewildering cloud of colours, as in the beautiful image above, but a simpler image with major highways and structures highlighted. Still complex of course, but then so are street maps of cities or countries. So the iBrain would allow you to zoom into certain regions and really see what’s going on while you (say) listen to Bach (the very thing I’m doing right now).
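The zooming idea above amounts to a change of scale: collapsing millions of neuron-level sensor readings into a handful of values for the brain's major structures, using an atlas that links the neural scale to the macro scale. A minimal sketch of that aggregation step, with entirely hypothetical sensor IDs, region names, and firing rates (none of this reflects any real iBrain design):

```python
from collections import defaultdict
from statistics import mean

# Hypothetical nanosensor readings: sensor id -> firing-rate estimate (Hz).
readings = {
    "s001": 12.0, "s002": 15.5, "s003": 3.2,
    "s004": 4.1, "s005": 40.0, "s006": 38.5,
}

# Hypothetical atlas mapping each sensor to the macro-scale structure it sits in.
atlas = {
    "s001": "auditory cortex", "s002": "auditory cortex",
    "s003": "motor cortex", "s004": "motor cortex",
    "s005": "hippocampus", "s006": "hippocampus",
}

def region_activity(readings, atlas):
    """Collapse neuron-level readings into one mean activity value per region."""
    by_region = defaultdict(list)
    for sensor, rate in readings.items():
        by_region[atlas[sensor]].append(rate)
    return {region: mean(rates) for region, rates in by_region.items()}

# A display built on this would highlight the busiest 'highways' first.
for region, level in sorted(region_activity(readings, atlas).items(),
                            key=lambda kv: -kv[1]):
    print(f"{region:16s} {level:5.1f} Hz")
```

The point of the sketch is only that, once an atlas exists, the bewildering cloud of raw readings reduces to a small, legible summary at whatever scale you choose to zoom to.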

Then we really would be able to observe our own brains at work and, just perhaps, experience the connection between brain and thought.

This post originally appeared on Alan Winfield’s Web Log on Feb. 20, 2013.


Alan Winfield is Professor in robotics at UWE Bristol. He communicates about science on his personal blog.


©2021 - ROBOTS Association