Robohub.org
 

ShanghAI Lectures 2012: Lecture 8 “Where is human memory?”


16 April 2013



In this 8th part of the ShanghAI Lecture series, Rolf Pfeifer examines the differences between human and computer memory and presents several types of "memories". In the first guest lecture, Vera Zabotkina (Russian State University for the Humanities) talks about cognitive modeling in linguistics; in the second guest lecture, José del R. Millán (EPFL) demonstrates a brain-computer interface.

The ShanghAI Lectures are a videoconference-based lecture series on Embodied Intelligence run by Rolf Pfeifer and organized by me and partners around the world.

 

Vera Zabotkina: Cognitive modeling in linguistics: conceptual metaphors

The concepts that govern our thought are not just matters of the intellect. They also govern our everyday functioning, down to the most mundane details. Our concepts structure what we perceive, how we get around in the world, and how we relate to other people. Our conceptual system thus plays a central role in defining our everyday realities. If we are right in suggesting that our conceptual system is largely metaphorical, then the way we think, what we experience, and what we do every day is very much a matter of metaphor… (Lakoff & Johnson, 1980)

In this lecture, Vera addresses the integration challenge facing cognitive science as an interdisciplinary endeavor. She highlights the interconnection between AI and Linguistics and discusses conceptual metaphors.

 

José del R. Millán: Brain-Computer Interfacing
In this lecture, José del R. Millán (EPFL) demonstrates the use of human brain signals to control devices, such as wheelchairs, and interact with our environment.

http://www.youtube.com/watch?v=8Z6HRD9KAnY



Nathan Labhart has been co-organizing the ShanghAI Lectures since 2009.





