
Why intelligence requires both body and brain


by Colin Allen and Footnote
18 February 2014




Western philosophy has traditionally separated mind from matter and brain from body. In recent years, however, cognitive scientists have turned on its head the assumption that we can study the mind based on the brain alone. Many now view the brain as part of a larger “mind” – a cognitive system embodied in the organism’s physical structure and embedded in its surrounding environment. These twin concepts of embodied and embedded cognition are challenging the way we understand human intelligence, and in the process transforming neuroscience, robotics, philosophy, and a host of other fields concerned with the mind.

When we think of intelligence, we often focus on skills specific to humans, such as mathematical reasoning and critical thinking. But the ability of animals (including humans) to navigate their environment and process sensory information is an equally impressive feat – one that has in many ways proven harder to understand and replicate than more intellectual endeavors such as playing chess.

Tackling these more basic abilities has required cognitive scientists to rethink their assumptions about how organisms engage in movement and behavior. Walking, for example, is not a matter of the brain computing all the positions a leg must pass through and then giving the command to take the next step forward. It requires exquisite timing across the entire organism, as nerves and muscles coordinate to exploit the physical properties of legs that function as pendulums.

(a)  Cognitive scientists have long debated how the brain is able to instantly interpret an influx of incoming visual information with myriad properties (color, shape, size, etc.) as a cohesive scene, particularly when the same sensory input can often correspond to more than one object. There is further debate over how we then use this information to infer more complex properties, such as where an object is located and how we can position ourselves to pick it up.

The brains of more advanced organisms have evolved over time to exert precise control over such motions, but this ability is still built on a foundation of information and feedback from across the body.[1] The brain exploits large quantities of information continuously streaming through the senses.(a) It took millions of years of evolution to get to the point where a bipedal organism could coordinate its vision, its sense of balance, and its sense of bodily position so that it only rarely fell over when chasing dinner.

Exploitation of the physical properties of bodies and environments is an important way in which animals reduce the computational load on the nervous system. An animal’s nervous system does not have to compute or represent exact positions and speeds for its legs before initiating an action; it can rely on the physics of the swinging leg to provide the sensory feedback needed to make small motor corrections on the fly.
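To make this offloading concrete, here is a minimal numerical sketch (my own illustration, not drawn from the article or its cited studies) of a leg modeled as a passive pendulum. Its swing period falls out of gravity and limb length rather than out of any explicit trajectory computation; the leg length and friction values below are arbitrary assumptions.

```python
# Illustrative sketch (not from the article): a leg modeled as a passive
# pendulum swings with a period set by its physical parameters alone --
# no controller computes the trajectory. Parameters are made up.
import math

g = 9.81       # gravity (m/s^2)
L = 0.9        # leg length (m), assumed
damping = 0.1  # joint friction coefficient, assumed

theta = 0.3    # initial hip angle (rad)
omega = 0.0    # angular velocity (rad/s)
dt = 0.001

t, last_sign = 0.0, 1.0
crossings = []
while t < 5.0:
    # Passive dynamics: gravity and friction only, no motor torque.
    alpha = -(g / L) * math.sin(theta) - damping * omega
    omega += alpha * dt
    theta += omega * dt
    t += dt
    # Record zero crossings of the angle to estimate the swing period.
    if theta * last_sign < 0:
        crossings.append(t)
        last_sign = -last_sign

if len(crossings) >= 3:
    period = crossings[2] - crossings[0]
    print(f"Observed swing period: {period:.2f} s")
print(f"Small-angle prediction 2*pi*sqrt(L/g): {2*math.pi*math.sqrt(L/g):.2f} s")
```

The printed period depends only on leg length and gravity; nothing in the loop plans where the leg should be at each moment, which is the sense in which the computation has been offloaded onto the body.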

In a similar way, crickets use information from unique bodily structures to facilitate hearing. Biorobotics professor Barbara Webb and her colleagues have shown how the physical transmission of sound through tracheal tubes in a cricket’s body simplifies the task of orienting toward a sound source.[2] Sound waves entering through different portals in the tracheal tubes will be amplified or cancelled depending on the location of the sound source. Without this physical system, the cricket’s neurons would have to determine direction using only the small timing differences of sound waves reaching the eardrums on different sides of the head.
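For intuition, here is a toy numerical sketch (my own simplification, not Webb’s model or the silicon cricket’s circuitry) of the interference idea: each eardrum is driven by the difference between the direct sound and a tube-delayed copy arriving from the opposite side, so which ear is driven harder already encodes the direction of the source. The frequency, ear separation, and tube delay below are rough assumptions.

```python
# Illustrative toy model (assumed, not Webb's implementation): interference
# between the direct sound and a tube-delayed copy turns source direction
# into a loudness difference at the two eardrums.
import math

c = 343.0          # speed of sound (m/s)
f = 4700.0         # roughly a cricket calling-song frequency (Hz), assumed
w = 2 * math.pi * f
d = 0.015          # separation of the sound inlets (m), assumed
tau = 1 / (4 * f)  # internal tube delay, assumed tuned to a quarter period

def eardrum_amplitudes(azimuth_deg):
    """Return (left, right) drive amplitudes; positive azimuth = source to the left."""
    delta = (d / c) * math.sin(math.radians(azimuth_deg))  # external arrival-time difference
    # Each eardrum is driven by the external sound minus the tube-delayed
    # sound from the opposite side; the difference of two equal-amplitude
    # tones has amplitude 2*|sin(w * relative_delay / 2)|.
    left = 2 * abs(math.sin(w * (delta + tau) / 2))
    right = 2 * abs(math.sin(w * (tau - delta) / 2))
    return left, right

for az in (-60, -30, 0, 30, 60):
    l, r = eardrum_amplitudes(az)
    side = "left" if l > r else "right" if r > l else "center"
    print(f"source at {az:+4d} deg -> louder ear: {side}  (L={l:.2f}, R={r:.2f})")
```

Running the sketch shows the drive amplitude growing on whichever side faces the source – a directional cue delivered by the physics of the tubes before any neuron has to resolve microsecond timing differences.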

(b)  One outgrowth of the insight that physically embodied experiences influence our understanding of language is a theory of conceptual metaphor, according to which expressions such as “he hit a stumbling block” or “she bit off more than she could chew” would be incomprehensible without certain bodily experiences.

The embodied nature of cognition applies not only to basic animal activities like walking and hearing, but also to higher-level capacities of human intelligence. Findings about how the physical environment can affect thought offer a fascinating look at this phenomenon. Recent studies[3] have shown the impact of wearing a white coat on decision making and the effect of holding a heavier clipboard on rating the “weight” of certain decisions.(b)

Intriguing as such studies may be, they focus on only one direction of the mind-body loop: how physical conditions (i.e. what you wear or what you are holding) have an effect on human thought and behavior. More significant for understanding the physically embodied and embedded nature of cognition are the various ways in which cognitive problems are offloaded onto physical processes, making two-way bodily interactions with the environment essential to cognitive success.

An early example of this kind of research was the finding that experienced Tetris players don’t just imagine the shapes rotating and fitting into the available space; they actually rotate the pieces on the screen as part of figuring out where they should go, and they solve the problem more quickly and efficiently by doing so.[4] Cognition is also facilitated by directing attention around one’s spatial environment. Studies of eye movements show that cognitive success is partly due to learning complex gaze patterns governing when and where to focus visual attention, such as where to look on a counter full of ingredients when cooking, so as to obtain information “just in time.”[5]

Similarly, scientists are slowly beginning to understand how teachers and learners rely on gestures to scaffold their understanding of arithmetic and algebra. Children begin to learn math by counting and manipulating objects and observing how quantities change when objects are added or taken away. More experienced mathematical reasoners exploit the physical characteristics of symbolic notation systems to simplify problems in ways not explicitly taught. For example, when adding or subtracting a value on both sides of an algebra equation so that it disappears on one side and appears on the other, we may imagine physically moving the number across the equal sign.
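As a concrete instance (my own illustration, not one drawn from the cited studies), consider solving x + 3 = 10. Formally we subtract 3 from both sides; experienced reasoners often perceive the same step as the 3 hopping across the equal sign and flipping its sign:

x + 3 = 10   →   x + 3 − 3 = 10 − 3   →   x = 7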

(c)  Plato’s famous cave allegory depicts thinkers trapped in a cave, trying to decipher the outside world by interpreting its shadows on the wall as representations of reality. One shortcoming of this analogy is that the mind is not a “cave” sealed off from the external environment.

These findings show the limitations of the classical separation between the inner mental world and outer physical world. In this traditional view, the inner representations accessible to the human mind are only weakly connected to the physical world through fallible senses.(c) The latest developments in cognitive science are demonstrating that the mind is not so decoupled from the world it seeks to understand and that brain, physical body, and environment work in tandem to produce thought.

In my next article, I will explore how this growing understanding of the embodied nature of cognition is influencing the quest to develop artificial intelligence.

Colin Allen is Provost Professor of Cognitive Science and of History & Philosophy of Science in the College of Arts and Sciences at Indiana University, Bloomington, where he has been a faculty member since 2004. He also holds an adjunct appointment in the Department of Philosophy and is a faculty member of IU’s Center for the Integrative Study of Animal Behavior and Program for Neuroscience. He became director of IU’s Cognitive Science program in July 2011. Allen’s main area of research is on the philosophical foundations of cognitive science, particularly with respect to nonhuman animals.
This article was originally published on Footnote, a website that shares academic research in a format that mainstream readers can understand and engage with. It is part of Footnote’s multidisciplinary series on robots and their impact on society. Please contact Footnote for permission before republishing this article. 

ENDNOTES

  1. Esther Thelen and Linda B. Smith (1996) A Dynamic Systems Approach to the Development of Cognition and Action, Cambridge, Mass.: MIT Press.
  2. R. Reeve, A. van Schaik, C. Jin, T. Hamilton, B. Torben-Nielsen, and B. Webb (2006) “Directional Hearing in a Silicon Cricket,” Biosystems, 87: 307-313.
  3. Hajo Adam and Adam Galinsky (2012) “Enclothed Cognition,” Journal of Experimental Social Psychology, 48(4): 918-925. Nils B. Jostmann, Daniël Lakens, and Thomas W. Schubert (2009) “Weight as an Embodiment of Importance,” Psychological Science, 20(9): 1169-1174.
  4. David Kirsh and Paul Maglio (1994) “On Distinguishing Epistemic from Pragmatic Action,” Cognitive Science, 18(4): 513-549.
  5. For the role of gesture in mathematical learning, see Susan Wagner Cook, Terina Kuangyi Yip, and Susan Goldin-Meadow (2012) “Gestures, but not Meaningless Movements, Lighten Working Memory Load when Explaining Math,” Language and Cognitive Processes, 27(4): 594-610. For embodied symbolic reasoning, see David H. Landy, Colin Allen, and Carlos Zednik (in press) “A Perceptual Account of Symbolic Reasoning.” For embodied visual attention, see Mary Hayhoe and Dana Ballard (2005) “Eye Movements in Natural Behavior,” Trends in Cognitive Sciences, 9(4): 188-194.
