Internal models of robot bodies


by Sabine Hauert
06 July 2011




Robots that have an internal model of their body can use it to predict how a motor action will affect their position and to work out what sequence of actions will bring them to a desired configuration (inverse kinematics). Knowing the state of a robot's body is also useful for merging sensor readings, for example to determine the position of an arm using a head-mounted camera and joint-angle sensors.
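The prediction side of such a body model is essentially forward kinematics: given the joint angles, the model tells the robot where its hand will end up, so a candidate motor action can be evaluated before it is executed. The snippet below is a minimal illustrative sketch for a planar arm (segment lengths and angles are assumptions for the example, not values from the work described here).

```python
import numpy as np

# Illustrative forward model of a planar three-segment arm:
# joint angles -> predicted end-effector position.
SEGMENTS = [0.3, 0.25, 0.15]  # segment lengths in metres (assumed)

def predict_hand_position(joint_angles):
    """Predict the (x, y) position of the hand from joint angles."""
    position = np.zeros(2)
    heading = 0.0
    for length, angle in zip(SEGMENTS, joint_angles):
        heading += angle  # joint angles accumulate along the chain
        position += length * np.array([np.cos(heading), np.sin(heading)])
    return position

# Predict the effect of a candidate motor action before executing it.
current_angles = np.array([0.2, 0.4, -0.1])
action = np.array([0.05, 0.0, 0.1])  # proposed change in joint angles
print(predict_hand_position(current_angles + action))
```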

Ideally the model should be able to simulate all movements that are physically possible for a given robot body. For this purpose, Malte Schilling uses a special type of recurrent neural network called a “Mean of Multiple Computation” (MMC) network. The model can be used for the tasks described earlier (predictions, inverse kinematics and sensor fusion) simply by changing the values that are fed as input to the network. However, work so far using MMC networks has been limited to 2D or simple 3D scenarios. For more general 3D models, Schilling introduces dual quaternions as a suitable representation of the kinematics of a body.
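The MMC principle can be sketched in a few lines: every variable of the kinematic chain (segment vectors, diagonals, end-effector vector) appears in several redundant equations, and the network relaxes by repeatedly replacing each variable with a damped mean of all the ways it can be computed. Clamping the end-effector vector to a target makes the same network solve inverse kinematics. The following planar three-segment example is an illustrative sketch; the damping factor, segment lengths and normalization step are assumptions, not the exact formulation used by Schilling.

```python
import numpy as np

# Illustrative MMC relaxation for a planar three-segment arm.
# State: segment vectors L1, L2, L3, diagonals D1 = L1+L2, D2 = L2+L3,
# and the end-effector vector R = L1+L2+L3.
SEG_LEN = np.array([1.0, 1.0, 1.0])  # fixed segment lengths (assumed)
DAMP = 2.0                           # self-recurrence weight (assumed)

def mmc_step(L1, L2, L3, D1, D2, R, target):
    R = target.copy()  # clamp the end effector to the target (inverse kinematics)
    # Each new value is the mean of all equations that yield it,
    # plus DAMP copies of its old value (damping).
    nL1 = (R - D2 + D1 - L2 + DAMP * L1) / (2 + DAMP)
    nL2 = (D1 - L1 + D2 - L3 + DAMP * L2) / (2 + DAMP)
    nL3 = (R - D1 + D2 - L2 + DAMP * L3) / (2 + DAMP)
    nD1 = (L1 + L2 + R - L3 + DAMP * D1) / (2 + DAMP)
    nD2 = (L2 + L3 + R - L1 + DAMP * D2) / (2 + DAMP)
    # Enforce the geometric constraint of constant segment length.
    for i, seg in enumerate((nL1, nL2, nL3)):
        norm = np.linalg.norm(seg)
        if norm > 1e-9:
            seg *= SEG_LEN[i] / norm
    return nL1, nL2, nL3, nD1, nD2, nL1 + nL2 + nL3

# Relax the network towards a reachable target.
L1 = np.array([1.0, 0.0]); L2 = np.array([1.0, 0.0]); L3 = np.array([1.0, 0.0])
D1, D2, R = L1 + L2, L2 + L3, L1 + L2 + L3
target = np.array([1.5, 1.5])
for _ in range(100):
    L1, L2, L3, D1, D2, R = mmc_step(L1, L2, L3, D1, D2, R, target)
print("end effector:", L1 + L2 + L3)  # approaches the target
```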

The robot's task is to reach all the target points in the 3D environment.

Experiments were done in simulation using a three-segment arm. The task was to reach for targets in 3D space, starting from a predefined initial position. The results in the figure below show the arm successfully moving to a target using this model. Unlike other models in the literature, the MMC network does not require the complete movement to be precomputed, it can deal with extra degrees of freedom, and it can accommodate external constraints.

Movement of a robot arm reaching for target 6 (previous figure) controlled by the MMC network.
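The 3D kinematics behind these experiments rest on the dual-quaternion representation mentioned above, which packs a rotation and a translation into a single eight-number object that composes by multiplication. The sketch below is a minimal illustrative implementation using the standard construction, not code from the paper.

```python
import numpy as np

def quat_mul(a, b):
    """Hamilton product of two quaternions (w, x, y, z)."""
    aw, ax, ay, az = a
    bw, bx, by, bz = b
    return np.array([
        aw*bw - ax*bx - ay*by - az*bz,
        aw*bx + ax*bw + ay*bz - az*by,
        aw*by - ax*bz + ay*bw + az*bx,
        aw*bz + ax*by - ay*bx + az*bw,
    ])

def dual_quat(axis, angle, translation):
    """Dual quaternion (real, dual) for a rotation about `axis` by `angle`
    followed by `translation`: q = r + eps * 0.5 * t * r."""
    axis = np.asarray(axis, float) / np.linalg.norm(axis)
    real = np.concatenate(([np.cos(angle / 2)], np.sin(angle / 2) * axis))
    t = np.concatenate(([0.0], translation))
    return real, 0.5 * quat_mul(t, real)

def dq_mul(a, b):
    """Compose two rigid transforms given as dual quaternions."""
    ar, ad = a
    br, bd = b
    return quat_mul(ar, br), quat_mul(ar, bd) + quat_mul(ad, br)

# Chain two joint transforms of a kinematic chain, as a 3D body model would.
joint1 = dual_quat([0, 0, 1], np.pi / 4, [0.3, 0.0, 0.0])
joint2 = dual_quat([0, 1, 0], np.pi / 6, [0.25, 0.0, 0.0])
print(dq_mul(joint1, joint2))
```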

In the future, the authors hope to build a network that can represent a complete body, for example that of a hexapod walker with 18 joints, and to use this body model for planning ahead.




Sabine Hauert is President of Robohub and Associate Professor at the Bristol Robotics Laboratory.




