
ShanghAI Lectures 2012: Lecture 10 “How the body shapes the way we think”


by Nathan Labhart
06 May 2013




This concludes the ShanghAI Lecture series of 2012. After a wrap-up of the class, we announce the winners of the EmbedIT and NAO competitions and end with an outlook on the future of the ShanghAI Lectures.

This is followed by three guest lectures: Tamás Haidegger (Budapest University of Technology and Economics) on surgical robots, Aude Billard (EPFL) on how the body shapes the way we move (and how humans can shape the way robots move), and Jamie Paik (EPFL) on soft robotics.

The ShanghAI Lectures are a videoconference-based lecture series on Embodied Intelligence run by Rolf Pfeifer and organized by me and partners around the world.

 

Tamás Haidegger: Human Skills for Robots: Transferring Human Knowledge and Capabilities to Robotic Task Execution in Surgery

Almost 90 years ago, the idea of telesurgery was born, along with the initial concept of robots. From the early 1970s, researchers focused on robotic telepresence to empower surgeons to treat patients at a distance. The first systems appeared over 20 years ago, and robotic surgery has quickly become a standard of care for certain procedures, at least in the USA. Over the decades, the control concept has remained the same: a human surgeon guides the robotic tools based on real-time sensory feedback. From the beginning of this development, however, the more exciting (and sometimes frightening) questions have been linked to machine learning, AI and automated surgery. In the true sense of automation, there have so far been only unconfirmed reports of a single robotically planned and executed surgery, despite the fact that many research groups are working on the problem. This talk introduces the major efforts currently underway in centers of excellence around the globe to transfer the incredibly diverse and versatile human cognition into the domain of surgical robotics.

References

  • P. Kazanzides, G. Fichtinger, G. D. Hager, A. M. Okamura, L. L. Whitcomb, and R. H. Taylor, “Surgical and Interventional Robotics: part I,” IEEE Robotics and Automation Magazine (RAM), vol. 15, no. 2, pp. 122–130, 2008.
  • G. Fichtinger, P. Kazanzides, G. D. Hager, A. M. Okamura, L. L. Whitcomb, and R. H. Taylor, “Surgical and Interventional Robotics: part II,” IEEE Robotics and Automation Magazine (RAM), vol. 15, no. 3, pp. 94–102, 2008.
  • G. D. Hager, A. M. Okamura, P. Kazanzides, L. L. Whitcomb, G. Fichtinger, and R. H. Taylor, “Surgical and Interventional Robotics: part III,” IEEE Robotics and Automation Magazine (RAM), vol. 15, no. 4, pp. 84–93, 2008.
  • C. E. Reiley, H. C. Lin, D. D. Yuh, G. D. Hager. “A Review of Methods for Objective Surgical Skill Evaluation,” Surgical Endoscopy, vol. 25, no. 2, pp. 356–366, 2011.
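
To make the teleoperation concept from the abstract above concrete, here is a deliberately simplified sketch (not the control software of any particular surgical system): a surgeon's incremental hand motion is scaled down and low-pass filtered to suppress tremor before being sent as a tool command. The scaling factor, filter constant and synthetic "hand motion" are all assumptions chosen for illustration.

# Illustrative teleoperation loop: scale down and tremor-filter the master
# (surgeon hand) motion before commanding the robotic tool. All constants
# and the synthetic hand motion are assumptions for illustration only.
import numpy as np

MOTION_SCALE = 0.2     # 5:1 motion scaling: 1 cm at the master -> 2 mm at the tool
FILTER_ALPHA = 0.02    # exponential low-pass weight (cutoff of a few Hz at 1 kHz)
dt = 0.001             # 1 kHz control loop

t = np.arange(0.0, 2.0, dt)
# Synthetic master motion: a slow deliberate movement plus 10 Hz hand tremor
master = 0.05 * np.sin(0.5 * np.pi * t) + 0.002 * np.sin(2 * np.pi * 10.0 * t)

tool = 0.0             # commanded tool position (1-D for simplicity)
smoothed_delta = 0.0
tool_trace = []
for k in range(1, len(t)):
    delta = master[k] - master[k - 1]                 # incremental hand motion
    smoothed_delta = (1 - FILTER_ALPHA) * smoothed_delta + FILTER_ALPHA * delta
    tool += MOTION_SCALE * smoothed_delta             # scaled, smoothed command
    tool_trace.append(tool)

print("master motion range [m]:", round(master.max() - master.min(), 4))
print("tool motion range   [m]:", round(max(tool_trace) - min(tool_trace), 4))

In real systems the same loop structure runs on full multi-degree-of-freedom poses and is combined with safety constraints, but the idea of a human in the loop shaping every tool motion is the same.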

 

Aude Billard: How the body shapes the way we move and how humans can shape the way robots move

In this lecture Aude Billard argues that it is advantageous for robots to move with dynamics that resemble those of natural bodies, even if the robots do not resemble humans in their physical appearance (e.g. industrial robots). This makes their motion more predictable for humans and hence makes the interaction safer. She then briefly presents current approaches to modeling the dynamics of human motion in robots.

A survey of issues in robot learning from human demonstration can be found at:
http://www.scholarpedia.org/article/Robot_learning_by_demonstration
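
As a concrete, deliberately simplified illustration of the dynamical-systems view of motion generation touched on above, the sketch below fits a linear dynamical system to a few synthetic "demonstration" trajectories and replays it from an unseen start point. The linear model, the synthetic data and all constants are assumptions made for illustration; this does not reproduce the specific methods presented in the lecture.

# Minimal sketch, assuming a linear time-invariant model and synthetic data:
# learn x_dot = A (x - x_target) from a few demonstrated trajectories,
# then replay the learned motion from a new start point.
import numpy as np

x_target = np.array([0.0, 0.0])              # goal position of the motion
dt = 0.01

# "Demonstrations": noisy trajectories of a stable spiral converging to the goal
A_true = np.array([[-1.0, -2.0],
                   [ 2.0, -1.0]])
rng = np.random.default_rng(0)
demos = []
for start in ([1.0, 1.0], [-1.0, 0.5], [0.5, -1.2]):
    x = np.array(start)
    traj = [x.copy()]
    for _ in range(400):
        x = x + dt * (A_true @ (x - x_target)) + 0.002 * rng.standard_normal(2)
        traj.append(x.copy())
    demos.append(np.array(traj))

# Fit A by least squares on finite-difference velocity estimates
X = np.vstack([d[:-1] for d in demos]) - x_target        # positions
V = np.vstack([(d[1:] - d[:-1]) / dt for d in demos])    # velocities
B, *_ = np.linalg.lstsq(X, V, rcond=None)
A_hat = B.T                                              # so that v ≈ A_hat @ (x - x_target)

# Replay the learned dynamics from an unseen start point: it converges to the goal
x = np.array([1.5, -0.8])
for _ in range(600):
    x = x + dt * (A_hat @ (x - x_target))
print("estimated A:\n", np.round(A_hat, 2))
print("position after rollout:", np.round(x, 3))

The appealing property of such time-invariant flows toward a goal, which is also central to the approaches discussed in the lecture, is that a perturbation of the robot mid-motion simply produces a new, still-converging trajectory rather than an error relative to a fixed time-indexed plan.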

 

Jamie Paik: Soft Robot Challenge and Robogamis





Nathan Labhart has been co-organizing the ShanghAI Lectures since 2009.




