
Dr Rustam Stolkin and robots that learn: Nuclear robotics meets machine learning

by The Royal Society and Susannah Odell
27 April 2017




Heavy Water Components Test Reactor decommissioning.

How can we create robots that can carry out important tasks in dangerous environments? Machine learning is supporting advances in the field of robotics. To find out more, we talked to Dr Rustam Stolkin, Royal Society Industry Fellow for Nuclear Robotics, Professor of Robotics at the University of Birmingham, and Director at A.R.M Robotics Ltd, about his work combining machine learning and robotics to create practical solutions to nuclear problems.


What drew you to engineering and robotics?

There are many definitions of engineering, but the one I like is “the creation of artefacts for the benefit of mankind”. Engineering is a way of using science to be creative and to create novel technologies which can bring major societal and economic benefit.

Dr Rustam Stolkin. Source: University of Birmingham/YouTube

Robotics, in particular, attracted me because robots are inherently inter-disciplinary: they use mechanics and electronics, as well as computational elements, such as machine learning and artificial intelligence. This has allowed me to become an “expert generalist”; I specialise in being extremely cross-disciplinary. At the University of Birmingham, I have research collaborations spanning eight different schools across three colleges on the campus.


Can you tell us about your research?

My work focuses on developing advanced robotics technologies for nuclear decommissioning: demolishing legacy nuclear sites, and safely packaging, storing and monitoring any radiological or contaminated materials. The UK contains 4.9 million tonnes of legacy nuclear waste, mainly by-products of weapons production during the Cold War, alongside a smaller contribution from civilian nuclear electricity generation. Cleaning up the UK’s legacy waste is the biggest environmental remediation project in Europe; it could take over 100 years and cost up to £220 billion.

Factories are highly structured environments, where robots can be pre-programmed to make repetitive motions. This has supported the revolution in robotics that we’ve seen since the 1980s. In contrast, there has been surprisingly little use of robots in the nuclear industry, aside from a small number of robotic vehicles deployed after the Fukushima disaster, and a few cases of bespoke robotic manipulators controlled in rudimentary ways by joystick teleoperation.

This is in part because legacy nuclear sites, some of them over 60 years old, are highly complex and unknown environments, such as contaminated buildings that have been sealed off for decades. They have an enormous diversity of scenes, materials, and objects that nuclear decommissioning robots must interact with in complex ways, such as by grasping and cutting objects.

We have only recently discovered how to use state-of-the-art machine learning methods to help robots learn adaptive and transferable skills which they can apply in new environments. This allows them to perceive and handle new kinds of objects which they have not seen before. For example, using machine learning, a robot can be trained to grasp and pick up one object, and then use this knowledge to adapt the grasp for a new object with a different shape.

Recent “deep learning” approaches might offer part of the solution for navigating these complex environments. However, such learning approaches can rely on large amounts of “training data” – for example, labelled images of objects – used to train the robot so that it can recognise similar objects in future. It is still very unclear how to generate sufficient training data to enable robots to learn how to behave in useful ways in complex real environments.
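As a rough illustration of the transfer idea described above (this is not Dr Stolkin's actual pipeline; the network, shapes and data below are hypothetical), a grasp-quality model pretrained on familiar objects could be adapted to an unseen object from only a handful of extra labelled examples, by freezing the feature extractor and fine-tuning a small scoring head. A minimal PyTorch sketch:

# Minimal sketch (not the authors' code): adapting a pretrained grasp-quality
# network to a novel object using only a handful of new labelled examples.
import torch
import torch.nn as nn

# Hypothetical backbone: maps a depth-image patch to a feature vector.
backbone = nn.Sequential(
    nn.Conv2d(1, 16, 5, stride=2), nn.ReLU(),
    nn.Conv2d(16, 32, 5, stride=2), nn.ReLU(),
    nn.AdaptiveAvgPool2d(1), nn.Flatten(),
)
# Head scores a candidate grasp (1 = likely to succeed).
head = nn.Linear(32, 1)

# Pretend the backbone was pretrained on many known objects; freeze it and
# fine-tune only the small head on the few examples gathered for a new object.
for p in backbone.parameters():
    p.requires_grad = False

optimiser = torch.optim.Adam(head.parameters(), lr=1e-3)
loss_fn = nn.BCEWithLogitsLoss()

# Synthetic stand-in for a handful of labelled grasps on the unseen object:
# 64x64 depth patches and success/failure labels.
patches = torch.randn(8, 1, 64, 64)
labels = torch.randint(0, 2, (8, 1)).float()

for epoch in range(50):
    optimiser.zero_grad()
    scores = head(backbone(patches))
    loss = loss_fn(scores, labels)
    loss.backward()
    optimiser.step()

# The fine-tuned head can now rank candidate grasps on the new object.
with torch.no_grad():
    best = torch.sigmoid(head(backbone(patches))).argmax()

In practice, the few new labels might come from a short period of teleoperated or simulated grasp attempts rather than the synthetic tensors used here.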


Why do we need robots to carry out nuclear decommissioning?

Contaminated materials are currently decommissioned by human workers in air-fed plastic suits, which is not only unsustainable but can be dangerous. Using robots means humans do not need to enter highly hazardous work environments, and allows the decommissioning of zones with gamma radiation so high that humans cannot enter them even in protective suits. Using robots also reduces “secondary nuclear waste”: for every container filled with actual primary nuclear waste, more than ten containers become filled with contaminated plastic suits, respirators, rubber gloves, and other “secondary waste” from human entries.


How have you used machine learning to develop robots capable of nuclear decommissioning?

My lab has used robots to perform fully autonomous grasping and manipulation of nuclear waste-like objects and grasping of moving objects.

We also use state-of-the-art deep-learning neural networks to do real-time 3D reconstruction of a room, and simultaneously recognise and label materials such as concrete, metal, wood and fabric. This helps us develop ways for robots to characterise nuclear environments and nuclear waste objects in real-time. We are extending these methods to also combine visual camera information with other kinds of data from radiation, thermal and other sensors.
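The article does not detail how the visual and radiation data are fused; as a hedged, minimal sketch of the general idea (the arrays, class names and units here are invented for illustration), one could resample the radiation readings into the camera frame and summarise them per material class produced by the segmentation network:

# Minimal sketch (not the lab's pipeline): combining a per-pixel material
# segmentation with a co-registered radiation intensity map, so that each
# material class in the scene gets an estimated contamination level.
import numpy as np

MATERIALS = {0: "concrete", 1: "metal", 2: "wood", 3: "fabric"}

# Hypothetical inputs for one camera frame:
#   labels    - output of a semantic segmentation network, one class id per pixel
#   radiation - radiation intensity resampled into the same image frame (counts/s)
h, w = 120, 160
labels = np.random.randint(0, 4, size=(h, w))
radiation = np.random.gamma(shape=2.0, scale=5.0, size=(h, w))

def dose_by_material(labels, radiation):
    """Mean radiation reading over the pixels assigned to each material."""
    summary = {}
    for class_id, name in MATERIALS.items():
        mask = labels == class_id
        if mask.any():
            summary[name] = float(radiation[mask].mean())
    return summary

print(dose_by_material(labels, radiation))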


Where do you see machine learning going within your field of research?

Robotics engineers often argue that machine learning has only been used on relatively simple problems so far. Achieving machine learning of real robotic actions, with real objects in real scenes, is a significantly more complicated challenge that is only just starting to be attempted.

Innovative approaches for enabling useful robotic behaviours in real environments will emerge over the next few years. We will see an interesting fusion of conventional approaches, such as hand-crafted robot control algorithms, with new approaches, such as machine learning of robot behaviours.

Human-robot interaction, and human-AI collaborative working, will also become increasingly important. What we are working towards now is “shared control”, in which artificial intelligence and humans collaborate to control a remote robot. For example, recent research from the French institute CNRS shows human operators controlling the motion of a robot arm and gripper relative to an object, while the AI automatically aligns the gripper with the object. Simultaneously, the AI automatically controls a second robot arm to move a camera so that it always offers the best viewpoint. Thus, human and AI collaborate to control a pair of arms for a remote handling task. In the future, this kind of human-machine collaboration could transform how we interact with nuclear and hazardous environments.
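As a hedged illustration of shared control (this is not the CNRS system; the gain, state variables and interface below are assumptions), the commanded motion can be split so that the operator supplies translation while an automatic term corrects the gripper's orientation towards the perceived object:

# Minimal sketch (hypothetical, not the CNRS system): "shared control" in which
# the human commands the gripper's translation while an assistive controller
# adds a rotation that keeps the gripper aligned with the target object.
import numpy as np

def shared_control_step(human_linear_vel, gripper_yaw, object_yaw, k_align=1.5):
    """Return the commanded (linear, angular) velocity for one control step.

    human_linear_vel : 3-vector from the operator's joystick (m/s)
    gripper_yaw      : current gripper orientation about the approach axis (rad)
    object_yaw       : estimated object orientation from perception (rad)
    k_align          : proportional gain for the automatic alignment term
    """
    # Wrap the orientation error into [-pi, pi] before applying the gain.
    error = np.arctan2(np.sin(object_yaw - gripper_yaw),
                       np.cos(object_yaw - gripper_yaw))
    angular_vel = k_align * error          # AI handles orientation
    return np.asarray(human_linear_vel), angular_vel  # human handles translation

# Example: the operator pushes the gripper forward while the controller
# rotates it towards the object's estimated orientation.
linear, angular = shared_control_step([0.05, 0.0, 0.0],
                                      gripper_yaw=0.1, object_yaw=0.5)
print(linear, angular)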


The Royal Society is currently carrying out a major policy project on machine learning and its impact on the UK economy and society. To find out more, please visit our project page.

This post was originally published by the Royal Society.





The Royal Society is a Fellowship of many of the world's most eminent scientists and is the oldest scientific academy in continuous existence.

Susannah Odell works as a policy adviser at the Royal Society...




