Robohub.org
 

ShanghAI Lectures: Lutz Jäncke “On the neuropsychology of avatars”


by Nathan Labhart
17 October 2013




The ShanghAI Lectures have brought us a treasure trove of guest lectures by experts in robotics. You can find the whole 2012 series here. Now we're bringing you the guest lectures you haven't yet seen from previous years, starting with the first lectures from 2009 and releasing a new guest lecture every Thursday until all the series are complete. Enjoy!

Guest talk in the ShanghAI Lectures, 2009-10-22

In my talk I will demonstrate why virtual reality (VR) is interesting for neuroscientists and cognitive neuroscientists. A major question in cognitive neuroscience is whether VR environments evoke the same experience as real environments; if so, the brain states evoked by VR and real environments should be the same. In our experiments we have used several virtual scenarios to examine the influence of VR environments on brain activation and subjective experience. First, we used virtual roller coaster rides as stimuli in fMRI experiments. These experiments showed that the typical brain areas are involved in processing this kind of VR scenario. Most interesting was the finding that the dorsolateral prefrontal cortex (DLPFC) is strongly involved in controlling the subjective experience of presence. While this area controls the presence experience in adults, it does not in children; in children, we instead found stronger activations in emotional brain areas. Evidently, children cannot control their emotional reactions via the DLPFC as adults do, which might have consequences for how they behave in VR environments. In a further set of experiments we are interested in how humans respond to virtual persons (avatars), and how they interact with avatars suffering from pain (a virtual experiment). Altogether, these experiments demonstrate that the human brain reacts to VR stimuli in much the same way as it does to real stimuli.

The ShanghAI Lectures are a videoconference-based lecture series on Embodied Intelligence run by Rolf Pfeifer and organized by me and partners around the world.

https://www.youtube.com/watch?v=tHzb446u_00

Lutz Jäncke has been professor of Neuropsychology at the University of Zurich since 2002. His main research interests focus on the question of how the human brain is shaped by experience. To understand the effects of learning and experience on the human brain, he uses modern brain imaging techniques such as fMRI and EEG; he is also strongly grounded in cognitive psychology. One of his specialties is using professional musicians as a model for neuroplasticity. He has a special interest in virtual reality as well, because he wants to know whether VR environments evoke the same experiences in our brain as real environments do. In addition, he explores the possibilities of VR for neurological and neuropsychological rehabilitation.





Nathan Labhart Co-organizing the ShanghAI Lectures since 2009.




 















©2026.02 - Association for the Understanding of Artificial Intelligence