RFID-based global positioning


by Sabine Hauert
14 August 2010




Having a robot figure out its global position is required in many real-world applications, and it is also one of the biggest challenges in robotics.

The easiest approach is to have a robot blindly keep track of its movements (odometry) from a known starting position. Odometry alone, however, quickly accumulates errors, which soon makes the localization estimate unusable.
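To see why dead reckoning drifts, here is a minimal Python sketch of odometry integration with noisy step measurements. The step sizes and noise levels are illustrative assumptions, not values from the paper.

```python
import math
import random

def integrate_odometry(pose, d_dist, d_theta, noise_std=(0.01, 0.005)):
    """Dead-reckon one step from noisy wheel-encoder measurements.

    pose: (x, y, heading) in metres / radians.
    d_dist, d_theta: true translation and rotation for this step; noise is
    added here to simulate imperfect encoder measurements.
    noise_std: illustrative standard deviations for the two measurements.
    """
    x, y, theta = pose
    d_dist += random.gauss(0.0, noise_std[0])    # translational measurement error
    d_theta += random.gauss(0.0, noise_std[1])   # rotational measurement error
    theta += d_theta
    return (x + d_dist * math.cos(theta),
            y + d_dist * math.sin(theta),
            theta)

# Drive a 10 m straight line in 1000 small steps: the per-step noise compounds,
# so the integrated estimate usually ends up noticeably far from the true (10, 0, 0).
pose = (0.0, 0.0, 0.0)
for _ in range(1000):
    pose = integrate_odometry(pose, d_dist=0.01, d_theta=0.0)
print(pose)
```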

To help the robot along the way, Boccadoro et al. propose to place passive Radio-Frequency IDentification (RFID) tags at known positions in the environment. These smart tags are attractive because they are typically low cost and require no on-board power source. A robot equipped with an RFID reader can detect a tag within a range of about 1 m, although the detections are very noisy. Algorithms are then needed to combine the robot's own sensing, in this case odometry, with the noisy RFID readings to precisely estimate its global position.
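As an illustration of this kind of sensor fusion, the sketch below implements a basic particle filter that propagates pose hypotheses with odometry and weights them against binary RFID detections. The tag map, the roughly 1 m read range, and the detection probabilities are assumptions made for the example; the paper's exact measurement model is not described in this post.

```python
import math
import random

# Hypothetical tag map and detection model -- illustrative values only.
TAG_POSITIONS = {1: (2.0, 0.0), 2: (4.0, 0.0)}  # tag id -> known (x, y) in metres
READ_RANGE = 1.0      # approximate detection range mentioned in the article
P_DETECT_IN = 0.8     # assumed probability of reading a tag that is in range
P_DETECT_OUT = 0.05   # assumed probability of a spurious read when out of range

def predict(particles, d_dist, d_theta):
    """Propagate every particle with the odometry step plus sampled noise."""
    moved = []
    for x, y, th in particles:
        th2 = th + d_theta + random.gauss(0.0, 0.01)
        d = d_dist + random.gauss(0.0, 0.02)
        moved.append((x + d * math.cos(th2), y + d * math.sin(th2), th2))
    return moved

def weight(particle, detected_tags):
    """Likelihood of the set of detected tag ids given the particle's pose."""
    x, y, _ = particle
    w = 1.0
    for tag_id, (tx, ty) in TAG_POSITIONS.items():
        in_range = math.hypot(tx - x, ty - y) <= READ_RANGE
        p_read = P_DETECT_IN if in_range else P_DETECT_OUT
        w *= p_read if tag_id in detected_tags else (1.0 - p_read)
    return w

def resample(particles, weights):
    """Draw a new particle set with probability proportional to the weights."""
    return random.choices(particles, weights=weights, k=len(particles))

# One filter step: odometry prediction, then an update from a read of tag 1.
particles = [(0.0, 0.0, 0.0)] * 200
particles = predict(particles, d_dist=0.05, d_theta=0.0)
weights = [weight(p, detected_tags={1}) for p in particles]
particles = resample(particles, weights)
```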

For this purpose, the authors implement two types of Kalman filter and compare them to a particle filter method, which typically has a much larger computational cost. Experiments were conducted using a Pioneer P3-DX robot driving around a corridor equipped with 6 RFID tags.
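The post does not detail the two Kalman filter formulations, but a common simplification is to treat a detected tag's known position as a coarse observation of the robot's position. The sketch below shows such a linear Kalman update, with the measurement noise derived from the roughly 1 m read range; it is an assumed stand-in for illustration, not the authors' method.

```python
import numpy as np

def kalman_position_update(mean, cov, tag_xy, read_range=1.0):
    """Linear Kalman update treating a detected tag's known position as a
    coarse, direct observation of the robot's (x, y) position.

    mean: length-2 state estimate; cov: 2x2 covariance (metres).
    The measurement noise R is chosen from the ~1 m read range and is an
    illustrative assumption, not the formulation used by the authors.
    """
    H = np.eye(2)                               # observation model: position observed directly
    R = (read_range / 2.0) ** 2 * np.eye(2)     # coarse measurement noise covariance
    z = np.asarray(tag_xy, dtype=float)         # "measurement": the tag's known position
    S = H @ cov @ H.T + R                       # innovation covariance
    K = cov @ H.T @ np.linalg.inv(S)            # Kalman gain
    new_mean = mean + K @ (z - H @ mean)
    new_cov = (np.eye(2) - K @ H) @ cov
    return new_mean, new_cov

# Example: an uncertain estimate at the origin is pulled towards a tag read at (2, 0).
mean, cov = np.array([0.0, 0.0]), np.eye(2) * 0.5
mean, cov = kalman_position_update(mean, cov, tag_xy=(2.0, 0.0))
print(mean, np.diag(cov))
```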

Results show that the first Kalman filter is fast but imprecise when tags are sparse (figure left). The second has higher computational requirements than the first, but with few tags it obtains estimates as good as those of the particle filter method (figure right).

The path reconstructed by the various methods proposed: the red line shows the estimate for the second loop of the robot's path, the green line the estimate for the last loop, and the blue line the ground truth.

In the future, the authors hope to investigate the optimal placement of RFID tags to achieve even better position estimates.




Sabine Hauert is President of Robohub and Associate Professor at the Bristol Robotics Laboratory

