Learning Bayesian filters without precise ground truth

21 January 2011


As seen in a previous post, Bayes filters such as Kalman filters can be used to estimate the state of a robot. Usually, this requires having a model of how the robot’s sensor measurements relate to the state you want to observe and a model to predict how the robot’s control inputs impact its state. When little is known about these models, machine learning techniques can help find them automatically.
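To make the two models concrete, here is a minimal 1-D Kalman filter sketch (illustrative only, not the authors' code): the motion model predicts the next state from a control input, and the measurement model corrects that prediction with a noisy sensor reading. All variable names and noise values are assumptions for illustration.

```python
import numpy as np

def kalman_step(x, P, u, z, q=0.1, r=0.5, dt=1.0):
    """One predict/update cycle of a scalar Kalman filter.
    x, P : current state estimate and its variance
    u    : control input (e.g. a velocity command)
    z    : noisy position measurement
    q, r : process and measurement noise variances (illustrative values)
    """
    # Predict step: motion model x' = x + u*dt, uncertainty grows
    x_pred = x + u * dt
    P_pred = P + q
    # Update step: measurement model z = x + noise
    K = P_pred / (P_pred + r)          # Kalman gain
    x_new = x_pred + K * (z - x_pred)  # blend prediction with measurement
    P_new = (1 - K) * P_pred           # uncertainty shrinks after update
    return x_new, P_new

# Usage: track a position under a constant speed command
x, P = 0.0, 1.0
for z in [1.1, 2.0, 2.9]:  # noisy measurements near the true path
    x, P = kalman_step(x, P, u=1.0, z=z)
```

Learning the models, as in the work below, amounts to replacing the hand-written prediction and measurement equations with functions fitted from data.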

In the scenario imagined by Ko et al. the goal is to infer a car’s position (state) on a race track based on remote control commands and measurements from an Inertial Measurement Unit (IMU, see red box in picture below) that provides turn rates in roll, pitch, and yaw and accelerations in 3 dimensions. The car is controlled only in speed, since a “rail” in the track steers it. As a start, Ko et al. collect data by driving the car around the track while recording its remote control commands, IMU measurements, and position (ground truth) estimated using an overhead camera. The data is then used to train probabilistic models (Gaussian Process models) that are finally integrated into a Bayes filter.
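The sketch below shows the flavor of this model-learning step with a self-contained Gaussian Process regression, in the spirit of (but much simpler than) the authors' approach: logged control inputs are mapped to observed state changes, and the GP returns both a prediction and an uncertainty that a Bayes filter can consume. The toy data, kernel choice, and hyperparameters are all assumptions for illustration.

```python
import numpy as np

def rbf_kernel(A, B, length_scale=1.0):
    """Squared-exponential kernel between the row vectors of A and B."""
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-0.5 * d2 / length_scale**2)

def gp_predict(X_train, y_train, X_test, noise=1e-2):
    """GP posterior mean and variance at X_test (standard GP regression)."""
    K = rbf_kernel(X_train, X_train) + noise * np.eye(len(X_train))
    Ks = rbf_kernel(X_test, X_train)
    alpha = np.linalg.solve(K, y_train)
    mean = Ks @ alpha
    v = np.linalg.solve(K, Ks.T)
    var = rbf_kernel(X_test, X_test).diagonal() - np.einsum('ij,ji->i', Ks, v)
    return mean, var

# Toy training log: speed command u -> measured position change dx
rng = np.random.default_rng(0)
u = np.linspace(0, 2, 20)[:, None]
dx = 0.8 * u[:, 0] + 0.02 * rng.standard_normal(20)  # noisy linear motion

# Predict the state change for an unseen command, with uncertainty
mean, var = gp_predict(u, dx, np.array([[1.0]]))
```

The predicted variance is what makes GP models a natural fit for Bayes filters: it plugs directly into the filter's process-noise term instead of a hand-tuned constant.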

However, the need for ground truth requires extra effort and additional hardware such as the overhead camera. To overcome this problem, Ko et al. extend their method to deal with situations where no, little, or only noisy ground truth is available to train the models.

Results show the successful tracking of the car with better performance than state-of-the-art approaches, even when no ground truth is given. The authors also show how the developed method can be used to allow the car or a robot arm to replay trajectories based on expert demonstrations.

Sabine Hauert is President of Robohub and Associate Professor at the Bristol Robotics Laboratory


©2021 - ROBOTS Association
