ShanghAI Lectures 2012: Lecture 10 “How the body shapes the way we think”


by Nathan Labhart
06 May 2013




This concludes the 2012 ShanghAI Lecture series. After a wrap-up of the class, we announce the winners of the EmbedIT and NAO competitions and end with an outlook on the future of the ShanghAI Lectures.

Then there are three guest lectures: Tamás Haidegger (Budapest University of Technology and Economics) on surgical robots, Aude Billard (EPFL) on how the body shapes the way we move (and how humans can shape the way robots move), and Jamie Paik (EPFL) on soft robotics.

The ShanghAI Lectures are a videoconference-based lecture series on Embodied Intelligence run by Rolf Pfeifer and organized by me and partners around the world.

 

Tamás Haidegger: Human Skills for Robots: Transferring Human Knowledge and Capabilities to Robotic Task Execution in Surgery

Almost 90 years ago, the idea of telesurgery was born, along with the initial concept of robots. From the early 1970s, researchers focused on robotic telepresence to empower surgeons to treat patients at a distance. The first systems appeared over 20 years ago, and robotic surgery has quickly become a standard of care for certain procedures, at least in the USA. Over the decades, the control concept has remained the same: a human surgeon guides the robotic tools based on real-time sensory feedback. From the beginning of development, however, the more exciting (and sometimes frightening) questions have been linked to machine learning, AI and automated surgery. In the true sense of automation, there have been only unclear reports of a single robotically planned and executed surgery so far, despite the fact that many research groups are working on the problem. This talk introduces the major efforts currently under way in centers of excellence around the globe to transfer the incredibly diverse and versatile human cognition into the domain of surgical robotics.
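The control concept described above, a surgeon steering robotic instruments from real-time sensory feedback, is essentially a teleoperation loop. Below is a minimal, illustrative Python sketch of such a loop with motion scaling and simple tremor filtering; the function names and parameters are assumptions for illustration only and do not correspond to any real surgical-robot API.

import numpy as np

def teleop_step(master_pos, prev_cmd, scale=0.2, alpha=0.1):
    """Map one master-side hand position to a slave-side tool command.

    scale -- motion-scaling factor (large hand motions become small tool motions)
    alpha -- low-pass filter coefficient (crude tremor suppression)
    """
    scaled = scale * np.asarray(master_pos, dtype=float)
    # First-order low-pass filter: blend the new target with the previous command.
    return (1.0 - alpha) * np.asarray(prev_cmd, dtype=float) + alpha * scaled

if __name__ == "__main__":
    cmd = np.zeros(3)
    # Simulated noisy hand trajectory (metres) standing in for real sensor input.
    for t in np.linspace(0.0, 1.0, 50):
        hand = np.array([0.1 * t, 0.05 * np.sin(5.0 * t), 0.0])
        hand += 0.002 * np.random.randn(3)   # hand tremor
        cmd = teleop_step(hand, cmd)         # command passed on to the tool controller
    print("final tool command:", cmd)

Real systems add force feedback, safety interlocks and far higher control rates, but scaling plus filtering of the surgeon's motion is the core of the idea.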

References

  • P. Kazanzides, G. Fichtinger, G. D. Hager, A. M. Okamura, L. L. Whitcomb, and R. H. Taylor, “Surgical and Interventional Robotics: part I,” IEEE Robotics and Automation Magazine (RAM), vol. 15, no. 2, pp. 122–130, 2008.
  • G. Fichtinger, P. Kazanzides, G. D. Hager, A. M. Okamura, L. L. Whitcomb, and R. H. Taylor, “Surgical and Interventional Robotics: part II,” IEEE Robotics and Automation Magazine (RAM), vol. 15, no. 3, pp. 94–102, 2008.
  • G. Hager, A. Okamura, P. Kazanzides, L. Whitcomb, G. Fichtinger, and R. Taylor, “Surgical and Interventional Robotics: part III,” IEEE Robotics and Automation Magazine (RAM), vol. 15, no. 4, pp. 84–93, 2008.
  • C. E. Reiley, H. C. Lin, D. D. Yuh, G. D. Hager. “A Review of Methods for Objective Surgical Skill Evaluation,” Surgical Endoscopy, vol. 25, no. 2, pp. 356–366, 2011.

 

Aude Billard: How the body shapes the way we move and how humans can shape the way robots move

In this lecture Aude Billard argues that it is advantageous for robots to move with dynamics that resemble those of natural bodies, even if the robots do not resemble humans in physical appearance (e.g. industrial robots). This makes their motion more predictable for humans and hence makes interaction safer. She then briefly presents current approaches to modeling the dynamics of human motion in robots.

A survey of issues on robot learning from human demonstration can be found at:
http://www.scholarpedia.org/article/Robot_learning_by_demonstration
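
One family of approaches to modeling motion dynamics, in the spirit of the lecture, encodes a movement as a time-invariant dynamical system with a stable attractor at the target, so the trajectory is regenerated from the current state at every step and recovers smoothly from perturbations. The sketch below uses a hand-written stable linear system purely for illustration; actual learning-from-demonstration methods estimate such dynamics from recorded human trajectories.

import numpy as np

# Stable linear dynamics x_dot = A @ (x - target); the eigenvalues of A have
# negative real part, so every trajectory converges to the target.
A = np.array([[-2.0,  0.5],
              [-0.5, -2.0]])
target = np.array([1.0, 0.5])   # goal position in an illustrative planar task space

def generate_trajectory(x0, dt=0.01, steps=500):
    """Integrate the dynamical system forward from the start point x0."""
    x = np.asarray(x0, dtype=float)
    traj = [x.copy()]
    for _ in range(steps):
        x = x + dt * (A @ (x - target))   # explicit Euler step
        traj.append(x.copy())
    return np.array(traj)

if __name__ == "__main__":
    path = generate_trajectory([0.0, 0.0])
    # Perturb the state mid-motion; the same dynamics steer back to the goal.
    recovered = generate_trajectory(path[250] + np.array([0.3, -0.2]))
    print("end points:", path[-1], recovered[-1])

Because the motion is defined by where the robot currently is rather than by a fixed time-indexed plan, a push or obstacle simply shifts the state and the same dynamics carry it back to the goal, which is what makes this representation attractive for safe human-robot interaction.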

 

Jamie Paik: SOFT Robot Challenge and Robogamis





Nathan Labhart has been co-organizing the ShanghAI Lectures since 2009.





