
Learning Bayesian filters without precise ground truth

by Sabine Hauert
21 January 2011




As seen in a previous post, Bayes filters such as Kalman filters can be used to estimate the state of a robot. Usually, this requires a model of how the robot’s sensor measurements relate to the state you want to estimate, and a model of how the robot’s control inputs change that state. When little is known about these models, machine learning techniques can help discover them automatically from data.
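To make the role of these two models concrete, here is a minimal sketch of a single predict/update step of a Kalman filter in Python. The one-dimensional state, the noise values, and the functions g and h are invented for illustration; they simply stand in for the motion and measurement models discussed in this post.

```python
# Minimal sketch of one Bayes (Kalman) filter step, assuming a 1-D state
# (position along the track), a speed command as control input, and a noisy
# position measurement. The functions g and h are placeholders for the motion
# and measurement models; the noise values q and r are made up.
import numpy as np

def g(x, u, dt=0.1):
    """Motion model: predict the next position from the speed command."""
    return x + u * dt

def h(x):
    """Measurement model: map the state to an expected sensor reading."""
    return x

def kalman_step(mu, sigma2, u, z, q=0.01, r=0.1, dt=0.1):
    # Prediction: propagate the belief through the motion model.
    mu_pred = g(mu, u, dt)
    sigma2_pred = sigma2 + q              # q: motion (process) noise

    # Update: correct the belief using the measurement model.
    k = sigma2_pred / (sigma2_pred + r)   # r: measurement noise
    mu_new = mu_pred + k * (z - h(mu_pred))
    sigma2_new = (1 - k) * sigma2_pred
    return mu_new, sigma2_new

mu, sigma2 = 0.0, 1.0
mu, sigma2 = kalman_step(mu, sigma2, u=1.0, z=0.12)
```

In the work described below, it is precisely these two functions, together with their noise, that are learned from data rather than specified by hand.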

In the scenario considered by Ko et al., the goal is to infer a car’s position (its state) on a race track from remote-control commands and measurements from an Inertial Measurement Unit (IMU, see red box in the picture below), which provides turn rates in roll, pitch, and yaw and accelerations along three axes. The car is controlled only in speed, since a rail in the track steers it. As a starting point, Ko et al. collect data by driving the car around the track while recording its remote-control commands, IMU measurements, and position (ground truth) estimated by an overhead camera. These data are then used to train probabilistic models (Gaussian Process models), which are finally integrated into a Bayes filter.
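The snippet below sketches what learning such models can look like, using Gaussian Process regression from scikit-learn on placeholder data. The array names, shapes, and random data are assumptions for illustration only; the paper’s actual state representation, features, and GP machinery differ.

```python
# Minimal sketch of learning a motion model and a measurement model from
# logged data with Gaussian Process regression, in the spirit of GP-based
# Bayes filters. All data below are random placeholders.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

rng = np.random.default_rng(0)

# Logged training data (placeholders): ground-truth states from the overhead
# camera, remote-control commands, and IMU readings.
states = rng.normal(size=(200, 3))      # e.g. position/heading on the track
controls = rng.normal(size=(200, 1))    # speed commands
imu = rng.normal(size=(200, 6))         # 3 turn rates + 3 accelerations

kernel = RBF(length_scale=1.0) + WhiteKernel(noise_level=0.1)

# Motion model: predict the next state from the current state and control.
X_motion = np.hstack([states[:-1], controls[:-1]])
y_motion = states[1:]
motion_gp = GaussianProcessRegressor(kernel=kernel).fit(X_motion, y_motion)

# Measurement model: predict the expected IMU reading from the state.
meas_gp = GaussianProcessRegressor(kernel=kernel).fit(states, imu)

# Both models return a mean and an uncertainty, which is exactly what a
# Bayes filter needs for its prediction and update steps.
mean, std = motion_gp.predict(X_motion[:1], return_std=True)
```

One appealing property of Gaussian Processes here is that they provide not just a prediction but also a measure of its uncertainty, which plugs naturally into the probabilistic machinery of a Bayes filter.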

However, the need for ground truth requires extra effort and additional hardware, such as the overhead camera. To overcome this limitation, Ko et al. extend their method to cope with situations where the ground truth available for training is noisy, sparse, or missing altogether.

Results show successful tracking of the car, with better performance than state-of-the-art approaches even when no ground truth is given. The authors also show how the method can be used to let the car, or a robot arm, replay trajectories demonstrated by an expert.




Sabine Hauert is President of Robohub and Associate Professor at the Bristol Robotics Laboratory




