As head of the Laboratory of Intelligent Systems at the École Polytechnique Fédérale de Lausanne (EPFL) in Switzerland, Dario Floreano knows a thing or two about flying robots. In the first segment of a two-part conversation, Waypoint caught up with Floreano to discuss his team's drone research, the potential applications of such innovations, and more.
Could you start by giving our readers a little background on the Laboratory of Intelligent Systems (LIS) at EPFL?
Yes, of course. I started the laboratory in the year 2000 and today we have around 20 researchers. senseFly's CEO [Jean-Christophe Zufferey] was actually one of my first PhD students.
I had always been interested in bio-inspired artificial intelligence and robotics. I had two interests in particular. The first was designing systems with animal characteristics: robust systems capable of solving problems that couldn't be predicted in a changing environment. The second, by contrast, was using these tools to better understand biology.
At LIS we work closely with biologists to extract principles from animals, such as how their vision works, then we translate these principles into control algorithms so that a robot performs as the corresponding animal does. In other words, we use biology to design better robots and we use robots to understand more about biology.
Where does this bio-inspiration lead you in practice? What are some key themes?
Our activities focus on three core themes: flying robots, soft robots and evolutionary robots.
Since Waypoint is all about the professional use of flying robots, or drones, could we dig down into this area some more? Why flying robots? What’s the specific interest there?
Why flying robots? Because they’re very challenging!
They need to be very lightweight, but at the same time you have to include energy, computing power, sensors… this is a really challenging job. That challenge is why I thought bio-inspired solutions would provide an advantage over traditional approaches. It's the point we started from when Jean-Christophe Zufferey was originally working in my lab.
So what has your drone research typically looked into, and how do you see this work evolving in future?
If you want to have robots capable of flying in cluttered environments, they need to be able to ‘see’ in these spaces. So right now we’re looking at different sensor modalities: vision, in particular compound vision, which insects use, and acoustic sensing. The latter is mainly used for coordinating the flight of multiple drones, which we call swarming. We also do research in drone communication for coordinated flight and aerial telecommunication networks, but this work is less bio-inspired.
The other thing one needs to look at is how the body of a robot is designed. Drones need to be very agile to fly in cluttered environments, so we're looking at robots with adaptive morphology: drones that can change their shape to transition between fixed-wing and rotary formats, each of which has its own pros and cons in terms of flight time and agility.
On this morphology front, you’ve been looking into folding drones?
Yes. With morphology we're also looking at size. You have two choices when looking to carry larger payloads: either fly faster (or spin the propellers faster), or increase the area of the aerodynamic surfaces. What people typically do is increase the size of the props or the wings. This results in larger drones, which are harder to transport and harder to fly through narrow passages. So we've been looking into the concept of foldability, using origami technology to make drones that you can easily carry in your pocket. There's also the issue then of controlling such systems; how to make such drones stable after they open.
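[Editor's note: to make this trade-off concrete, textbook aerodynamics (not something cited in the interview) gives the lift generated by a surface as

L = ½ ρ v² S C_L

where ρ is the air density, v the airspeed, S the lifting surface area and C_L the lift coefficient. A heavier payload needs more lift L, which can come either from a higher v (flying faster, spinning the propellers faster) or from a larger S (bigger wings or props); it is the second route that produces the bulky drones foldability is meant to tame.]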
We’re also studying artificial muscles made of soft actuators, which could be used one day for foldable drones.
When you fly in really difficult situations, with things like cables around, sooner or later there will be contact. This raises the question: how do you protect the robot, the environment and the people who come into contact with it?
One very simple way is to add a cage or a gimbal (such as Flyability's Gimball; Flyability is another spin-off of my laboratory). Another way we're now looking at is using new types of materials: materials that minimise damage, absorb collisions and then revert to their original shape.
I’d like to go back to perception. You mentioned working on two sensor modalities: vision and acoustic sensing. Can we explore the first a little further?
Sure. When it comes to a robot's perception, we've been working on vision for a long time. Most recently we've been designing new types of vision sensors that mimic the compound eyes of insects.
Conventional cameras are designed on the same principles as the vertebrate eye, whereby an image is perceived via a very dense array of photoreceptors under a single lens. The advantage of this conventional approach is very fine picture quality, but on the downside you have a very limited field of view.
Now, when you have a drone, you want a very wide field of view because you need to see in many more directions than a terrestrial animal. But if you simply increase the number of cameras to achieve this, you increase the weight. Insects, on the other hand, with their compound eyes cover a very wide field of view by using hundreds of simpler miniature eyes, with just a few photoreceptors under each lens pointing in the desired directions. Their perception is the mosaic image that results from these tiny eyes.
So we decided to design new miniature sensors made from very simple cameras, each with between one and three photoreceptors and a lens with a very narrow field of view. Only instead of three or four of these cameras we have hundreds of them. This insect-like approach gives a very wide field of view, and the insect-inspired photoreceptors give us a very fast reaction time; we can extract optic flow three times faster than insects can! The drawback, however, is that the images are not very fine, so these cameras are for navigation, not for taking pictures. (Read the paper: Miniature curved artificial compound eyes, or see the Robohub article.)
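[Editor's note: the interview doesn't go into the underlying algorithms, but the classic bio-inspired building block for extracting motion from pairs of adjacent photoreceptors is the Hassenstein-Reichardt elementary motion detector (EMD). The Python sketch below is purely illustrative, with all names and parameters our own rather than LIS's actual implementation; it shows how two neighbouring photoreceptor signals can yield a directional motion estimate.

```python
import numpy as np

def lowpass(x, alpha=0.1):
    """First-order low-pass filter; plays the role of the EMD's delay line."""
    y = np.zeros_like(x)
    for t in range(1, len(x)):
        y[t] = y[t - 1] + alpha * (x[t] - y[t - 1])
    return y

def reichardt_emd(a, b, alpha=0.1):
    """Hassenstein-Reichardt elementary motion detector.

    a, b: intensity time series from two adjacent photoreceptors.
    Returns a signed response: positive for motion from a towards b,
    negative for motion the other way.
    """
    return lowpass(a, alpha) * b - lowpass(b, alpha) * a

# Toy stimulus: a drifting sinusoidal pattern seen by two photoreceptors
# whose viewing directions are offset, so b sees the pattern slightly later.
t = np.linspace(0.0, 10.0, 1000)
a = np.sin(2.0 * np.pi * t)
b = np.sin(2.0 * np.pi * t - 0.5)  # phase lag: pattern moves from a to b

print(f"mean EMD response: {reichardt_emd(a, b).mean():+.4f}")
# A positive mean indicates motion in the a -> b direction.
```

Tiling hundreds of such detector pairs across a curved substrate yields a directional optic-flow field over a very wide field of view, without ever reconstructing a fine image.]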
What stage of development are you at with these multi-camera, insect-vision robots?
We have produced a few first prototypes. The cameras themselves are less than a millimetre thick and each weighs less than a milligram. We can also arrange them on fully flexible substrates, which I call “vision tapes”. These are packed full of sensors and you can wrap them around parts of the robot.
The challenge with our current prototype is that it has a very limited vision range; it can only see up to a maximum of 50 cm. So we're working on designing a new type of sensor, one with a longer range that is still very simple to produce.
Thank you very much for your time!
You’re very welcome.
===========================
About Dario Floreano
Dario Floreano (M.A. 1988; M.S. 1992; Ph.D. 1995) is the director of the Laboratory of Intelligent Systems and director of the Swiss National Center of Competence in Robotics. He is co-founder of the company senseFly S.A., of the International Society for Artificial Life, and founder of the popular robotics podcast series Talking Robots (which later became RobotsPodcast).
He has been on the advisory board of the Future and Emerging Technologies division of the European Commission, vice-chair of the Global Agenda Council on Robotics and Smart Devices of the World Economic Forum, and on the board of governors of the International Society for Neural Networks. Prof. Floreano is a senior member of the IEEE and received several national and international awards for professional and cultural achievements. Prof. Floreano has published four books, more than 300 technical articles, and more than 10 patents in drone technologies.