Robohub.org
 

ShanghAI Lectures: Tamim Asfour “Robots think with their hands”


13 March 2014




Guest talk in the ShanghAI Lectures, 2010-12-16

Cognitive, situated robots that are able to learn to operate in the real world and to interact and communicate with humans must model and reflectively reason about their perceptions and actions in order to learn, act, predict and react appropriately. Such capabilities can only be attained by embodied agents through physical interaction with and exploration of the real world, and require the simultaneous consideration of perception and action. Representations built from such interactions are much better adapted to guiding behaviour than human-crafted rules, and allow embodied agents to gradually extend their cognitive horizon.

In the first part of the talk, I present the concept of Object-Action Complexes (OAC, pronounced “oak”), introduced by the European project PACO-PLUS (www.paco-plus.org) to emphasize the notion that objects and actions are inseparably intertwined, and that categories are therefore determined (and also limited) by the actions an agent can perform and by the attributes of the world it can perceive. Entities (things) in the world of a robot (or human) only become semantically useful objects through the actions the agent can and will perform on them. The second part of the talk presents current results toward the implementation of integrated 24/7 humanoid robots able to 1) perform complex grasping and manipulation tasks in a kitchen environment, 2) autonomously acquire object knowledge through visual and haptic exploration, and 3) learn actions from human observation. The developed capabilities are demonstrated on the humanoid robots ARMAR-IIIa and ARMAR-IIIb.

https://www.youtube.com/watch?v=tJLzV418pF4

Slides

Tamim Asfour is a senior research scientist and leader of the Humanoid Research Group at the Humanoids and Intelligence Systems Lab, Institute for Anthropomatics, Karlsruhe Institute of Technology (KIT).

His major research interest is humanoid robotics. In particular, his research topics include action learning from human observation, goal-directed imitation learning, dexterous grasping and manipulation, active vision and active touch, whole-body motion planning, cognitive control architectures, system integration, robot software and hardware control architectures, motor control, and mechatronics.

He is leading the system integration tasks and the development team of the humanoid robot series ARMAR in the German Humanoid Robotics Project (SFB 588) funded by the German Research Foundation (DFG). He is currently involved in the following projects funded by the European Commission: PACO-PLUS, GRASP and Xperience.

Tamim Asfour is a member of the Editorial Board of the IEEE Transactions on Robotics and European Chair of the IEEE-RAS Technical Committee on Humanoid Robots. He is a member of the Executive Board of the German Association of Robotics (DGR: Deutsche Gesellschaft für Robotik), and serves on several program committees and review panels.

He received his diploma degree in Electrical Engineering (Dipl.-Ing.) in 1994 and his PhD in Computer Science (Dr.-Ing.) in 2003 from the University of Karlsruhe. In 2003 he was awarded the Research Center for Information Technology (FZI) prize for his outstanding PhD thesis on sensorimotor control in humanoid robotics and the development of the humanoid robot ARMAR. Since September 2010 he has held an Adjunct Professor position at the Georgia Institute of Technology (Georgia Tech), College of Computing, Interactive Computing.

The ShanghAI Lectures are a videoconference-based lecture series on Embodied Intelligence, run and organized by Rolf Pfeifer (from 2009 until 2012), Fabio Bonsignorio (since 2013), and me, with partners around the world.

The ShanghAI Lectures have brought us a treasure trove of guest lectures by experts in robotics. You can find the whole series from 2012 here. Now, we’re bringing you the guest lectures you haven’t yet seen from previous years, starting with the first lectures from 2009 and releasing a new guest lecture every Thursday until all the series are complete. Enjoy!





Nathan Labhart Co-organizing the ShanghAI Lectures since 2009.




