 

The new Westworld: Humanizing the un-human, or dehumanizing humankind?


by Michael Szollosy
18 October 2016



[Image: The Vitruvian Man]

HBO’s latest offering (on Sky Atlantic here in the UK) is an update of Michael Crichton’s 1973 film Westworld, this time brought to us as a ten-part television series by sci-fi re-booter extraordinaire J.J. Abrams and screenwriter Jonathan Nolan (yes, brother and frequent collaborator of Christopher). The new Westworld comes with much hope (for HBO, as a potential ‘new Game of Thrones’) and hype, understandably so, given that the talent behind it has delivered so much terrific science fiction of late, including Star Wars: The Force Awakens and Interstellar.

As with Channel 4’s series Humans, broadcast last June (news on forthcoming series 2 here), Dreaming Robots offered a live Twitter commentary while the show was being broadcast in the UK, and I’ll take some time afterwards to write some reflective pieces on what we see in the show. (The Twitter ‘Superguide’ from @ShefRobotics can be seen here; my own @DreamingRobots Superguide can be seen here.)

Unsurprisingly, many of the themes in the Westworld reboot could also be seen in Humans. It seems, for example, that both shows express a certain anxiety: that as our machines become more human, we humans seem to become less human, or humane. But this isn’t a new idea original to either show – this anxiety has been around as long as robots themselves, from the very invention of the term robot in the 1920s. And if we trace the history of robots in science fiction, we see a history of monsters that reflect this same fear, time and again, in slightly different contexts. Because the robot – which, remember, was invented in the popular imagination long before robots were built in labs – is above all else exactly that: a perfect way of expressing this fear.

So, what are we looking at in this new, improved Westworld? (Because, frankly, the original film lacked the depth of even the first hour of this series, being just a very traditional Frankenstein narrative and a rough draft for Crichton’s Jurassic Park, made 20 years later.) First – as this nifty graphic on the right illustrates – we do see the robots in Westworld becoming more human. The programme starts with a voice asking the humanoid (very humanoid) robot, Dolores (Evan Rachel Wood), ‘Have you ever questioned the nature of your reality?’ The questioner is exploring whether Dolores has become sentient, that is, aware of her own existence. The echo of Descartes here is clear. We all know about cogito, ergo sum – I think, therefore I am. But Descartes’s proposition isn’t just about thinking; it is about doubt. He begins with the proposition that the act of doubting itself means that we cannot doubt, at least, our own existence. So we would better understand Descartes’s proposition as dubito, ergo cogito, ergo sum: I doubt, therefore I think, therefore I am. If Dolores is found to be questioning the nature of her reality, then that would be evidence for self-awareness and being, according to the Cartesian model.

The robots in Westworld are depicted as falling in love, maintaining strong family bonds, appreciating beauty, and considering the world in philosophical, reflective ways. How much of this is merely programming, and how much exceeds the limits imposed upon them by their human masters, is the key question the show teases its audience with, I suspect; certainly, it occupies much of our attention in the first few hours. But if these moments – described as ‘reveries’ – are in any way genuine moments of creativity, then this is another category, beyond the notion of the cogito, in which we might say the robots are becoming ‘alive’.

For many thinkers in the second half of the twentieth century (for example, the post-Freudian psychoanalyst D. W. Winnicott), it is only in such moments of creativity, or in enjoying genuine moments of spontaneous being, that we truly discover the self and come alive. (The Freudian and post-Freudian influences on the narrative become even more apparent in subsequent episodes.) As a response to late industrial capitalism (and the shock of fascism to the human self-conception as a rational animal), this idea emerged of human beings coming ‘alive’ only when we are not acting compliantly – that is, when we are acting spontaneously, or creatively, and not according to the laws or dictates (or ‘programming’) of another individual, organisation or group consciousness. We see this trend not only in post-Freudian psychotherapy (e.g. Erich Fromm and the Frankfurt School, R. D. Laing) and other philosophical writings but also in popular post-war subculture and media, including advertising – the sort satirised in the ad for Westworld that opens the original film.

Other perspectives give us a glimpse into the robots’ moments of becoming human. Peter Abernathy, looking at a picture that offers a peek at the world outside, says: “I have a question, a question you’re not supposed to ask.” This is an allusion to Adam and Eve and the forbidden fruit of knowledge, when humankind became aware of the difference between right and wrong. (Peter, after this, is consumed with rage at how he and his daughter have been treated.) And like Walter, the robot that goes on a psychotic killing spree, pouring milk over his victims, Peter is determined to ‘go off script’ and reclaim for himself a degree of agency and self-determination, acting according to his own, new-found consciousness instead of according to what others have programmed for him.

Meanwhile, the human beings (‘newcomers’) in Westworld seem less ‘humane’ than their robot counterparts. The newcomers are shown to be sadistic, misogynistic and psychopathic in the indulgence of their fantasies. One could argue that this behaviour is morally justifiable in an unreal world designed solely for the benefit of paying customers – that a ‘rape’, for example, in Westworld isn’t really ‘rape’ if it is done to a robot (who, by definition, can neither give nor deny consent) – but this is clearly not how the audience is being invited to see these actions.

That human beings are becoming more like machines is an anxiety for which there is a long history of evidence, one that even pre-dates the cultural invention of robots in the 1920s. We can see this anxiety in the Romantic unease with the consequences of the Enlightenment, which gave birth to the new, rational man, and of the industrial revolution, which was turning humans into nothing more than cogs in the steam-powered machines that so transformed the economy. This has been addressed in the Gothic tale of Frankenstein, still the basis for so many narratives involving robots, including the original Westworld film and, more recently, even our most contemporary stories such as Ex_Machina and, to a lesser extent, this manifestation of Westworld (which will be the subject of a future post). (I have written and spoken on this theme myself many times, for example here and here.)

So in Westworld, we meet Dr Ford – the mad scientist who creates the machines that will, inevitably, be set loose upon the world. Dr Ford immediately reminds us of another Ford, the man whose name is synonymous with the assembly lines and modes of production of the late industrial revolution that have done so much to dehumanise modern workforces. We see these methods of production, and these workers, in Metropolis, the iconic film that was contemporary with Henry Ford’s factories. (Though, as we shall see, this Ford is rather more complex…)

This fear reflects, too, the idea that as post-Enlightenment humans become more rational, they become more like machines, acting in predictable, programmed ways, having lost the spontaneity and creativity of an earlier age. The humans of Westworld are exaggerations of the humans of our ‘Western’ world of rationalism, science and alienation. We don’t have to agree with this Romantic notion – that rationalism and science are negative forces in our world – to accept that there is a great deal of anxiety about how rationalism and science are transforming individual human beings and our societies.

Rational dehumanisation is personified in the actions of the corporation, which has replaced the mad scientist as the frequent villain of the sci-fi Frankenstein-robot twist (more to come on this in a future post), and we see hints in Episode 1 of what is to follow in Westworld, along the lines of films such as 2013’s The Machine, where the slightly misguided and naive actions of a scientist are only made monstrous when appropriated by a thoroughly evil, inhumane military-industrial complex.

This theme is addressed most succinctly in Ridley Scott’s Blade Runner, a major influence on the new Westworld, where the Tyrell Corporation boasts that its replicants are ‘More Human Than Human’. And in Blade Runner, too, we see humanoid robots behaving more humanely than the people who ruthlessly, rationally hunt down the machines. It is unclear from the Tyrell slogan, however, whether the robots are more human than human because the technology has become so sophisticated, or because humans have fallen so low.

On Westworld as a whole, it is too early to tell whether it will maintain its initial promise and be as monumentally successful as Game of Thrones, or as iconic as Blade Runner. But this first episode has already given us more to think about than the original, and undoubtedly the success or failure of the programme will be instructive.







Michael Szollosy is a research fellow at Sheffield Robotics, at the University of Sheffield, and is attached to the Department of Psychology.




