Robohub.org
 

Robotics has a new kind of Cartesian Dualism, and it’s just as unhelpful


by Alan Winfield
22 July 2013




I believe robotics has re-invented mind-body dualism.

At the excellent European Robotics Forum earlier this year, I attended a workshop called AI meets Robotics. The thinking behind the workshop was:

The fields of Artificial Intelligence (AI) and Robotics were strongly connected in the early days of AI, but became mostly disconnected later on. While there are several attempts at tackling them together, these attempts remain isolated points in a landscape whose overall structure and extent is not clear. Recently, it was suggested that even the otherwise successful EC program “Cognitive systems and robotics” was not entirely effective in putting together the two sides of cognitive systems and of robotics.

I couldn’t agree more. Actually I would go further and suggest that robotics has a much bigger problem than we think. It’s a new kind of dualism which parallels Cartesian brain-mind dualism, except in robotics, it’s hardware-software dualism. And like Cartesian dualism it could prove just as unhelpful, both conceptually, and practically – in our quest to build intelligent robots.

While sitting in the workshop last week I realised rather sheepishly that I’m guilty of the same kind of dualistic thinking. In my Introduction to Robotics one of the (three) ways I define a robot is: an embodied Artificial Intelligence. And I go on to explain:

…a robot is an Artificial Intelligence (AI) with a physical body. The AI is the thing that provides the robot with its purposefulness of action, its cognition; without the AI the robot would just be a useless mechanical shell. A robot’s body is made of mechanical and electronic parts, including a microcomputer, and the AI is made by the software running in the microcomputer. The robot analogue of mind/body is software/hardware. A robot’s software – its programming – is the thing that determines how intelligently it behaves, or whether it behaves at all.

But, as I said in the workshop, we must stop thinking of cognitive robots as either “a robot body with added AI”, or “an AI with added motors and sensors”. Instead we need a new kind of holistic approach that explicitly seeks to avoid this lazy “with added” thinking.

[This post originally appeared on Alan Winfield’s blog on March 24, 2013.]





Alan Winfield is Professor in robotics at UWE Bristol. He communicates about science on his personal blog.

