
ShanghAI Lectures: Tamim Asfour “Robots think with their hands”

by Nathan Labhart
13 March 2014




Guest talk in the ShanghAI Lectures, 2010-12-16

Cognitive situated robots that are able to learn to operate in the real world and to interact and communicate with humans must model and reflectively reason about their perceptions and actions in order to learn, act, predict and react appropriately. Such capabilities can only be attained by embodied agents through physical interaction with and exploration of the real world, and they require the simultaneous consideration of perception and action. Representations built from such interactions are much better adapted to guiding behaviour than human-crafted rules, and they allow embodied agents to gradually extend their cognitive horizon.

In the first part of the talk, I present the concept of Object-Action Complexes (OAC, pronounced “oak”), introduced by the European project PACO-PLUS (www.paco-plus.org) to emphasize that objects and actions are inseparably intertwined, and that categories are therefore determined (and also limited) by the actions an agent can perform and by the attributes of the world it can perceive. Entities (things) in the world of a robot (or human) become semantically useful objects only through the actions the agent can or will perform on them. The second part of the talk presents current results toward the implementation of integrated 24/7 humanoid robots able to 1) perform complex grasping and manipulation tasks in a kitchen environment, 2) autonomously acquire object knowledge through visual and haptic exploration, and 3) learn actions from human observation. The developed capabilities are demonstrated on the humanoid robots ARMAR-IIIa and ARMAR-IIIb.
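The OAC formalism itself is beyond the scope of this post, but the intuition can be sketched in a few lines of Python. The sketch below is purely illustrative and is not the PACO-PLUS formalization: the State map, the execute/predict callables, and the reliability measure are assumed names, chosen only to show how an action, its predicted effect, and a learned measure of that prediction's reliability might be bundled into a single unit.

```python
from dataclasses import dataclass
from typing import Callable, Dict

# A perceived world state as a flat attribute map (a deliberate simplification).
State = Dict[str, object]

@dataclass
class OAC:
    """Illustrative sketch of an Object-Action Complex: an action bundled
    with the agent's prediction of its effect and a running reliability
    estimate (all names here are assumptions, not the PACO-PLUS definitions)."""
    name: str
    execute: Callable[[State], State]  # motor program (simulated here)
    predict: Callable[[State], State]  # the agent's forward model
    successes: int = 0
    trials: int = 0

    def run(self, state: State) -> State:
        """Act, then score the prediction against the observed outcome."""
        predicted = self.predict(state)
        observed = self.execute(state)
        self.trials += 1
        if predicted == observed:
            self.successes += 1
        return observed

    @property
    def reliability(self) -> float:
        return self.successes / self.trials if self.trials else 0.0

# A "thing" only becomes a graspable *object* via the action performed on it.
grasp_cup = OAC(
    name="grasp-cup",
    execute=lambda s: {**s, "in_hand": True},
    predict=lambda s: {**s, "in_hand": True},
)

state: State = {"thing": "cup", "in_hand": False}
state = grasp_cup.run(state)
print(state, grasp_cup.reliability)  # {'thing': 'cup', 'in_hand': True} 1.0
```

In this toy version, repeated execution lets the agent track how trustworthy each action-prediction pair is, which mirrors the idea that object categories are grounded in, and limited by, what the agent can actually do and perceive.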

Slides

Tamim Asfour is a senior research scientist and leader of the Humanoid Research Group at the Humanoids and Intelligence Systems Lab, Institute for Anthropomatics, Karlsruhe Institute of Technology (KIT).

His major research interest is humanoid robotics. In particular, his research topics include action learning from human observation, goal-directed imitation learning, dexterous grasping and manipulation, active vision and active touch, whole-body motion planning, cognitive control architectures, system integration, robot software and hardware control architecture, motor control and mechatronics.

He leads the system integration tasks and the development team of the ARMAR humanoid robot series in the German Humanoid Robotics Project (SFB 588), funded by the German Research Foundation (DFG). He is currently involved in the following projects funded by the European Commission: PACO-PLUS, GRASP and Xperience.

Tamim Asfour is a member of the Editorial Board of the IEEE Transactions on Robotics and European Chair of the IEEE-RAS Technical Committee on Humanoid Robots. He is a member of the Executive Board of the German Association of Robotics (DGR: Deutsche Gesellschaft für Robotik) and serves on several program committees and review panels.

He received his diploma degree in Electrical Engineering (Dipl.-Ing.) in 1994 and his PhD in Computer Science (Dr.-Ing.) in 2003 from the University of Karlsruhe. In 2003 he was awarded the Research Center for Information Technology (FZI) prize for his outstanding PhD thesis on sensorimotor control in humanoid robotics and the development of the humanoid robot ARMAR. Since September 2010 he has held an Adjunct Professor position at the Georgia Institute of Technology (Georgia Tech), College of Computing, Interactive Computing.

The ShanghAI Lectures are a videoconference-based lecture series on Embodied Intelligence, run and organized by Rolf Pfeifer (from 2009 to 2012), Fabio Bonsignorio (since 2013), and me, together with partners around the world.

The ShanghAI Lectures have brought us a treasure trove of guest lectures by experts in robotics. You can find the whole series from 2012 here. Now, we’re bringing you the guest lectures you haven’t yet seen from previous years, starting with the first lectures from 2009 and releasing a new guest lecture every Thursday until all the series are complete. Enjoy!





Nathan Labhart Co-organizing the ShanghAI Lectures since 2009.




