Robohub.org
 

Grasping with robots – which object is in reach?


17 August 2014



This post is part of our ongoing efforts to make the latest papers in robotics accessible to a general audience.

Imagine a robot reaching for a mug on the table, only to realize that it is too far away, or that it would need to bend an arm joint backwards to get there. Understanding which objects are within reach, and how to grasp them, is an essential requirement if robots are to operate in our everyday environments. To solve this problem, a recent Autonomous Robots paper by Vahrenkamp et al. proposes a new approach for building a comprehensive representation of a robot's reaching and grasping capabilities.

The “manipulability” representation shown below lets the robot know where it can reach in 6D with its right arm: it knows which (x, y, z) positions it can reach, as well as which orientations of the robot hand are best for manipulation there. The representation takes the joint limits of the arm into account. Manipulability is encoded by color (blue: low, red: high).
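To give a flavor of how such a 6D reachability representation can be built, here is a minimal sketch of the common sampling approach: draw random joint configurations within the joint limits, run forward kinematics, and mark every workspace cell that gets hit as reachable. The 2-link planar arm, its link lengths, and the `fk_2link`/`build_reachability_map` names are illustrative assumptions, not the ARMAR kinematics or the authors' actual algorithm (which performs a more sophisticated constrained analysis).

```python
import numpy as np

def fk_2link(q, l1=1.0, l2=1.0):
    """Forward kinematics of a toy planar 2-link arm: end-effector (x, y)."""
    x = l1 * np.cos(q[0]) + l2 * np.cos(q[0] + q[1])
    y = l1 * np.sin(q[0]) + l2 * np.sin(q[0] + q[1])
    return x, y

def build_reachability_map(n_samples=20000, res=0.25, seed=0,
                           q_limits=((-np.pi, np.pi), (0.0, np.pi))):
    """Sample random joint configurations within joint limits and mark
    every workspace voxel that is hit as reachable."""
    rng = np.random.default_rng(seed)
    reachable = set()
    for _ in range(n_samples):
        q = [rng.uniform(lo, hi) for lo, hi in q_limits]
        x, y = fk_2link(q)
        reachable.add((round(x / res), round(y / res)))  # voxel index
    return reachable

rmap = build_reachability_map()
# Cells near the base (within the arm's 2.0 m reach) end up in rmap;
# cells beyond maximum reach never do.
```

A real system would store per-voxel scores (how well the hand can be oriented there) rather than a binary reachable/unreachable flag, which is what the colored visualizations below encode.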

[Figure: 6D manipulability representation for the right arm of ARMAR-4]

A cut through one of these vector clouds looks like this.

[Figure: cross-section of the manipulability representation for ARMAR]
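The low-to-high color coding in these plots reflects a manipulability score at each pose. One classical ingredient of such scores is Yoshikawa's manipulability measure, sqrt(det(J Jᵀ)) of the arm Jacobian, which drops to zero at singular configurations such as a fully stretched arm. The sketch below computes it for the same hypothetical 2-link planar arm; this illustrates the general concept, not the paper's constrained formulation.

```python
import numpy as np

def jacobian_2link(q, l1=1.0, l2=1.0):
    """Analytic Jacobian of a planar 2-link arm (end-effector x, y)."""
    s1, c1 = np.sin(q[0]), np.cos(q[0])
    s12, c12 = np.sin(q[0] + q[1]), np.cos(q[0] + q[1])
    return np.array([
        [-l1 * s1 - l2 * s12, -l2 * s12],
        [ l1 * c1 + l2 * c12,  l2 * c12],
    ])

def yoshikawa_manipulability(J):
    """Yoshikawa's measure sqrt(det(J J^T)); zero at singularities."""
    return np.sqrt(max(np.linalg.det(J @ J.T), 0.0))

# A nearly stretched arm (elbow ~ 0) is close to singular, so its
# manipulability is low; a bent elbow scores much higher.
low = yoshikawa_manipulability(jacobian_2link([0.0, 0.01]))
high = yoshikawa_manipulability(jacobian_2link([0.0, np.pi / 2]))
```

For this toy arm the measure reduces to l1·l2·|sin(elbow angle)|, which makes the stretched-arm singularity easy to see.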

In addition to single-handed grasping, the authors discuss how the approach can be extended to grasping with two arms. Experiments were run in simulation on the humanoid robots ARMAR-III and ARMAR-IV.

And in case you want to try this at home, there is an open source version of this work here.

For more information, you can read the paper Representing the robot’s workspace through constrained manipulability analysis (Nikolaus Vahrenkamp and Tamim Asfour, Autonomous Robots – Springer US, July 2014) or ask questions below!



Autonomous Robots Blog Latest publications in the journal Autonomous Robots (Springer).






