Robohub.org
 

Learning Bayesian filters without precise ground truth


by Sabine Hauert
21 January 2011




As seen in a previous post, Bayes filters such as the Kalman filter can be used to estimate the state of a robot. Usually, this requires two models: a measurement model relating the robot’s sensor readings to the state you want to observe, and a motion model predicting how the robot’s control inputs change its state. When little is known about these models, machine learning techniques can help find them automatically.
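For intuition, here is a minimal one-dimensional Kalman filter that makes the two models concrete: a motion model predicts the next state from a control input, and a measurement model corrects that prediction with a noisy sensor reading. All constants below are illustrative, not taken from the paper.

```python
def kalman_1d(x, P, u, z, q=0.1, r=0.5):
    """One predict-correct cycle of a 1-D Kalman filter.

    x, P : current state estimate and its variance
    u    : control input (toy motion model: x' = x + u)
    z    : noisy sensor measurement of the state
    q, r : process and measurement noise variances (arbitrary values)
    """
    # Predict: apply the motion model and grow the uncertainty.
    x_pred = x + u
    P_pred = P + q
    # Correct: blend prediction and measurement via the Kalman gain.
    K = P_pred / (P_pred + r)
    x_new = x_pred + K * (z - x_pred)
    P_new = (1 - K) * P_pred
    return x_new, P_new

# Track a state that advances by 1.0 per step while measured noisily.
x, P = 0.0, 1.0
for z in [1.2, 1.9, 3.1]:
    x, P = kalman_1d(x, P, u=1.0, z=z)
```

After three steps the estimate settles near the true state (about 3.0) and the variance shrinks well below its initial value, which is the essential behavior a Bayes filter provides.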

In the scenario imagined by Ko et al., the goal is to infer a car’s position (state) on a race track based on remote control commands and measurements from an Inertial Measurement Unit (IMU, see red box in the picture below) that provides turn rates in roll, pitch, and yaw, and accelerations in three dimensions. The car is controlled only in speed, since a rail in the track steers it. As a start, Ko et al. collect data by driving the car around the track while recording its remote control commands, IMU measurements, and position (ground truth) estimated using an overhead camera. These data are then used to train probabilistic models (Gaussian Process models) that are finally integrated into a Bayes filter.
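Concretely, such a learned observation model is a regression from state to expected sensor reading, with an uncertainty estimate attached. As a toy illustration (plain NumPy, a squared-exponential kernel with arbitrary hyperparameters, and a made-up target function, not the paper’s models), bare-bones Gaussian Process regression looks like this:

```python
import numpy as np

def rbf(a, b, length=1.0):
    """Squared-exponential kernel between 1-D input arrays a and b."""
    d = a[:, None] - b[None, :]
    return np.exp(-0.5 * (d / length) ** 2)

def gp_predict(X_train, y_train, X_test, noise=1e-2):
    """GP regression: posterior mean and variance at the test points."""
    K = rbf(X_train, X_train) + noise * np.eye(len(X_train))
    K_s = rbf(X_test, X_train)
    K_ss = rbf(X_test, X_test)
    # Posterior mean interpolates the training data; the posterior
    # variance shrinks near observed inputs.
    alpha = np.linalg.solve(K, y_train)
    mean = K_s @ alpha
    cov = K_ss - K_s @ np.linalg.solve(K, K_s.T)
    return mean, np.diag(cov)

# Toy "measurement model": learn sin(state) from five noisy-free samples.
X = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
y = np.sin(X)
mean, var = gp_predict(X, y, np.array([1.5]))
```

The predictive variance is what makes GP models a natural fit for a Bayes filter: the filter can weigh the learned model’s output by how confident it is, just as it weighs a physical sensor by its noise level.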

However, the need for ground truth requires extra effort and additional hardware, such as the overhead camera. To overcome this problem, Ko et al. extend their method to situations where no ground truth, only sparse ground truth, or only noisy ground truth is available to train the models.

Results show the successful tracking of the car with better performance than state-of-the-art approaches, even when no ground truth is given. The authors also show how the developed method can be used to allow the car or a robot arm to replay trajectories based on expert demonstrations.




Sabine Hauert is President of Robohub and Associate Professor at the Bristol Robotics Laboratory








 

©2025.05 - Association for the Understanding of Artificial Intelligence