Robotic vehicles have been used in dangerous environments for decades, from decommissioning the Fukushima nuclear power plant to inspecting underwater energy infrastructure in the North Sea. More recently, autonomous vehicles, from boats to grocery delivery carts, have made a gentle transition from research centres into the real world with very few hiccups. Yet the promised arrival of self-driving cars has not progressed beyond the testing stage.
In this technical talk, Amanda Prorok, Assistant Professor in the Department of Computer Science and Technology at Cambridge University, and a Fellow of Pembroke College, discusses her team’s latest research on what, how and when information needs to be shared among agents that aim to solve cooperative tasks.
Since 2007, two professors at TU Delft have been researching ways to harvest energy from the wind using a kite. The robotic kite looks set to make its debut in the energy sector, but inventions are often used in unexpected ways. In this series of articles, we take robot innovations out of their test labs and bring them to a randomly selected workplace in the outside world. From kindergarten teacher Fransien, we learn that big kites could also be child’s play, quite literally.
Talking Robotics is a series of virtual seminars about Robotics and its interaction with other relevant fields, such as Artificial Intelligence, Machine Learning, Design Research, and Human-Robot Interaction. The series aims to promote reflection and dialogue, and to offer a place to network. In this compilation, we bring you 7 talks (and a half?) from current roboticists for your enjoyment.
In recent years, robots have gained artificial vision, touch, and even smell. “Researchers have been giving robots human-like perception,” says MIT Associate Professor Fadel Adib. In a new paper, Adib’s team is pushing the technology a step further. “We’re trying to give robots superhuman perception,” he says.
Mapping is an essential task in many robotics applications. To build a map, it is frequently assumed that the positions of the robots are a priori unknown and need to be estimated during operation. Multi-robot SLAM is a research direction that addresses the collective exploration and mapping of unknown environments by multi-robot systems. Yet most results so far have been achieved with small groups of robots. Multi-robot SLAM is still a growing field, and a number of research directions are yet to be explored. Among them, swarm SLAM is a promising alternative approach that takes advantage of the characteristics of robot swarms.
There are some tasks that traditional robots — the rigid and metallic kind — simply aren’t cut out for. Soft-bodied robots, on the other hand, may be able to interact with people more safely or slip into tight spaces with ease. But for robots to reliably complete their programmed duties, they need to know the whereabouts of all their body parts. That’s a tall task for a soft robot that can deform in a virtually infinite number of ways.
Having reached this point, I needed a robot – and a way of communicating with it – so that I could both write getRobotData(spec) and test the EBB. But how to do this? I’m working from home during lockdown, and my e-puck robots are all in the lab. Then I remembered that the excellent robot simulator V-REP (now called CoppeliaSim) has a pretty good e-puck model and some nice demo scenes.
From swallowing pills to injecting insulin, patients frequently administer their own medication. But they don’t always get it right. Improper adherence to doctors’ orders is commonplace, accounting for thousands of deaths and billions of dollars in medical costs annually. MIT researchers have developed a system to reduce those numbers for some types of medications.
Nearly all real-world applications of reinforcement learning involve some degree of shift between the training environment and the testing environment. However, prior work has observed that even small shifts in the environment cause most RL algorithms to perform markedly worse. As we aim to scale reinforcement learning algorithms and apply them in the real world, it is increasingly important to learn policies that are robust to changes in the environment.
While modern cameras provide machines with a very well-developed sense of vision, robots still lack such a comprehensive solution for their sense of touch. At ETH Zurich, in the group led by Prof. Raffaello D’Andrea at the Institute for Dynamic Systems and Control, we have developed a tactile sensing principle that allows robots to retrieve rich contact feedback from their interactions with the environment. I recently described our approach in a talk at TEDxZurich. The talk features a tech demo that introduces this novel tactile sensing technology, targeting the next generation of soft robotic skins.
If you’ve ever swatted a mosquito away from your face, only to have it return again (and again and again), you know that insects can be remarkably acrobatic and resilient in flight. Those traits help them navigate the aerial world, with all of its wind gusts, obstacles, and general uncertainty. Such traits are also hard to build into flying robots, but MIT Assistant Professor Kevin Yufeng Chen has built a system that approaches insects’ agility.
European Digital Innovation Hubs are one-stop shops where companies and public sector organisations can access and test digital innovations, gain the required digital skills, get advice on financing support and, ultimately, accomplish their digital transformation. They operate in the context of the twin green and digital transition that is at the core of European industrial policy.
The ability of humans to generalize their knowledge and experiences to new situations is remarkable, yet poorly understood. For example, imagine a human driver who has only ever driven around their city in clear weather. Even though they have never encountered true diversity in driving conditions, they have acquired the fundamental skill of driving, and can adapt reasonably quickly to driving in neighboring cities, in rainy or windy weather, or even to driving a different car, without much practice or additional driving lessons. While humans excel at adaptation, building intelligent systems with common-sense knowledge and the ability to quickly adapt to new situations is a long-standing problem in artificial intelligence.
My coding project is to start building an ethical black box (EBB), or, to be more accurate, a module that will allow a software EBB to be incorporated into a robot. Conceptually, the EBB is very simple: it is a data logger – the robot equivalent of an aircraft Flight Data Recorder, or an automotive Event Data Recorder.
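To make the data-logger idea concrete, here is a minimal Python sketch of what such a module might look like. This is purely illustrative: the class name `EthicalBlackBox` and its methods are my own invention, not the actual EBB code. Like a flight data recorder, it keeps only the most recent window of timestamped records in a fixed-size ring buffer, so that the moments leading up to an incident can be reconstructed afterwards.

```python
import time
from collections import deque


class EthicalBlackBox:
    """Minimal data-logger sketch: retains the most recent `capacity` records."""

    def __init__(self, capacity=1000):
        # deque with maxlen silently discards the oldest record when full
        self.records = deque(maxlen=capacity)

    def log(self, data):
        # Timestamp each record so the sequence of events can be reconstructed
        self.records.append({"t": time.time(), "data": data})

    def dump(self):
        # Return the stored records, oldest first, for post-hoc analysis
        return list(self.records)


# Usage: log whatever the robot reports on each control cycle
ebb = EthicalBlackBox(capacity=3)
for reading in ({"x": 0.0}, {"x": 0.1}, {"x": 0.2}, {"x": 0.3}):
    ebb.log(reading)

# Only the three most recent readings survive; the oldest was discarded
print([r["data"]["x"] for r in ebb.dump()])
```

In a real robot, `log()` would be called with the record returned by something like getRobotData(spec) on every control cycle; the fixed capacity keeps memory use bounded over arbitrarily long runs.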