
Learning Bayesian filters without precise ground truth

by Sabine Hauert
21 January 2011




As seen in a previous post, Bayes filters such as the Kalman filter can be used to estimate the state of a robot. Usually, this requires a model of how the robot's sensor measurements relate to the state you want to observe, and a model that predicts how the robot's control inputs affect its state. When little is known about these models, machine learning techniques can help find them automatically.
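
To make these two models concrete, here is a minimal sketch of a one-dimensional Kalman filter with a hand-built motion model and measurement model. All variable names and noise values are illustrative and are not taken from the paper:

```python
import numpy as np

# Minimal 1-D Kalman filter sketch: track a position x from noisy
# measurements z, given a control input u (commanded velocity).
# All models and noise values here are illustrative, not from the paper.

dt = 0.1          # time step [s]
Q = 0.05          # process noise variance (trust in the motion model)
R = 0.5           # measurement noise variance (trust in the sensor)

x, P = 0.0, 1.0   # initial state estimate and its variance

def kalman_step(x, P, u, z):
    # Predict: motion model x' = x + u*dt. This is the kind of model
    # that machine learning can replace when no hand-built one exists.
    x_pred = x + u * dt
    P_pred = P + Q

    # Update: measurement model z = x + noise
    K = P_pred / (P_pred + R)          # Kalman gain
    x_new = x_pred + K * (z - x_pred)  # blend prediction and measurement
    P_new = (1 - K) * P_pred
    return x_new, P_new

# Example: constant commanded velocity, noisy position readings
rng = np.random.default_rng(0)
true_x = 0.0
for _ in range(50):
    true_x += 1.0 * dt
    z = true_x + rng.normal(0, np.sqrt(R))
    x, P = kalman_step(x, P, u=1.0, z=z)
print(f"true={true_x:.2f}, estimate={x:.2f}")
```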

In the scenario considered by Ko et al., the goal is to infer a car's position (state) on a race track from remote-control commands and measurements from an Inertial Measurement Unit (IMU) that provides turn rates in roll, pitch, and yaw, and accelerations along three axes. The car is controlled only in speed, since a rail in the track makes it turn. As a first step, Ko et al. collect data by driving the car around the track while recording its remote-control commands, IMU measurements, and position (ground truth) estimated using an overhead camera. These data are then used to train probabilistic models (Gaussian Process models) that are finally integrated into a Bayes filter.
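
The general idea of replacing a hand-built model with one learned from logged data can be sketched as follows. This uses scikit-learn's GP regressor on synthetic data as a stand-in for the paper's Gaussian Process models trained on the car's driving logs; the 1-D dynamics and all parameters are assumptions for illustration:

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

# Sketch: learn a 1-D motion model "next state = f(state, control)"
# from logged (state, control, next state) triples. The training data
# here is synthetic; the paper trains on real ground-truth trajectories
# recorded by an overhead camera.

rng = np.random.default_rng(1)
states = rng.uniform(0, 10, size=200)
controls = rng.uniform(-1, 1, size=200)
next_states = states + 0.5 * controls + rng.normal(0, 0.05, size=200)

X_train = np.column_stack([states, controls])
gp = GaussianProcessRegressor(
    kernel=RBF(length_scale=1.0) + WhiteKernel(noise_level=0.01),
    normalize_y=True,
)
gp.fit(X_train, next_states)

# The GP gives a predictive mean *and* variance, which is exactly what
# a Bayes filter needs: the variance plugs in as state-dependent
# process noise in the prediction step.
x_query = np.array([[5.0, 0.3]])
mean, std = gp.predict(x_query, return_std=True)
print(f"predicted next state: {mean[0]:.3f} +/- {std[0]:.3f}")
```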

However, the need for ground truth requires extra effort and additional hardware, such as the overhead camera. To overcome this problem, Ko et al. extend their method to handle situations where the models are trained with noisy, sparse, or even no ground truth.
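
One way to get an intuition for training without ground truth is the Gaussian Process latent variable model (GPLVM), which treats the unobserved states as free parameters optimized jointly with the GP hyperparameters. The sketch below is a conceptual stand-in rather than the paper's exact formulation, and it assumes the GPy library is available:

```python
import numpy as np
import GPy  # pip install GPy

# Sketch of the no-ground-truth idea: when positions are unobserved,
# treat the latent states as parameters and optimize them together with
# the GP (a GP latent variable model). Conceptual stand-in only; the
# observations here are synthetic, not IMU data.

rng = np.random.default_rng(2)
# "Sensor-like" 2-D observations generated from a hidden 1-D state
latent_true = np.linspace(0, 2 * np.pi, 100)[:, None]
Y = np.column_stack([np.sin(latent_true), np.cos(latent_true)])
Y += rng.normal(0, 0.05, size=Y.shape)

# GPLVM: recover a 1-D latent state from the observations alone
m = GPy.models.GPLVM(Y, input_dim=1)
m.optimize(messages=False)

# m.X now holds the learned latent states (up to scale/reflection),
# playing the role of the missing ground-truth trajectory.
print(m.X[:5])
```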

Results show that the filter tracks the car more accurately than state-of-the-art approaches, even when no ground truth is given. The authors also show how the method allows the car, or a robot arm, to replay trajectories from expert demonstrations.




Sabine Hauert is President of Robohub and Associate Professor at the Bristol Robotics Laboratory




