

by   -   November 3, 2014
A new foldable actuator has been successfully used to fly a MAV.

Traditionally, many key robot components (including sensors and actuators) are rigid, which makes it difficult for researchers and industry to build robots that are truly compliant with their surroundings. It’s with this problem in mind that a team from NCCR Robotics, spanning the Laboratory of Intelligent Systems and the Microsystems for Space Technologies Laboratory, both at EPFL in Switzerland, has developed a new soft actuator that enables robots to fold.

Jonathan Cheseaux, a master’s student from Switzerland, has developed a system that can detect and localize any WiFi device by sniffing its WiFi packets.
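As a rough illustration of how sniffed packets can yield a position: the received signal strength (RSSI) of each packet can be converted into an approximate range using a log-distance path-loss model, and ranges from several receivers can then be combined by trilateration. The sketch below is generic, not Cheseaux’s actual system; `tx_power_dbm` (the expected RSSI at 1 m) and the path-loss exponent are assumed calibration constants.

```python
def rssi_to_distance(rssi_dbm, tx_power_dbm=-40.0, path_loss_exp=2.5):
    """Estimate range (in metres) from RSSI via the log-distance model.

    rssi_dbm      -- measured signal strength of a sniffed packet
    tx_power_dbm  -- expected RSSI at 1 m (must be calibrated per device)
    path_loss_exp -- environment-dependent exponent (~2 free space, ~3 indoors)
    """
    return 10 ** ((tx_power_dbm - rssi_dbm) / (10 * path_loss_exp))
```

For example, with these assumed constants a packet received at -65 dBm maps to roughly 10 m; in practice RSSI is noisy, so real systems average many packets and fuse ranges from multiple antennas.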


VIDEO UPDATE 06/13: It’s June 2014 and all eyes are on Brazil. If you’re a football fan, June 12th is the day you’ve been waiting for, but eagle-eyed technophiles are likely to have noticed one very exciting addition to the opening ceremony.

by   -   December 20, 2013

From the LASA Lab at EPFL:
The LASA robots secretly team up with Santa to organize the Christmas gifts! Happy Holidays from the Learning Algorithms and Systems Lab!

by   -   November 1, 2013

Animal locomotion control is in large part based on central pattern generators (CPGs), which are neural networks capable of producing complex rhythmic patterns while being activated and modulated by relatively simple control signals. These networks are located in the spinal cord in vertebrate animals. In this talk, I will present how mathematical models and robots can be used as tools to get a better understanding of the functioning of these circuits. In particular, I will present how we model CPGs of lower vertebrates (lamprey and salamander) using systems of coupled oscillators, and how we test the CPG models on board amphibious robots, such as a new salamander-like robot capable of swimming and walking. I will also show how the concept of CPGs implemented as coupled oscillators can be a useful control paradigm for various types of articulated robots, from snake to humanoid robots.
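The coupled-oscillator idea behind these CPG models can be sketched in a few lines. Below is a minimal illustration (not the lab’s actual salamander controller, which is far more elaborate): a chain of phase oscillators with nearest-neighbour coupling settles into a travelling wave with a fixed phase lag between segments, the kind of rhythmic pattern that can drive undulatory swimming. All parameter values are illustrative.

```python
import math

def simulate_cpg(n=4, freq=1.0, coupling=2.0, phase_lag=math.pi / 4,
                 dt=0.01, steps=2000):
    """Integrate a chain of coupled phase oscillators (Euler method).

    Each oscillator advances at 2*pi*freq rad/s and is pulled toward a
    fixed phase offset (phase_lag) relative to its neighbours, producing
    a travelling wave along the chain.
    """
    theta = [0.1 * i for i in range(n)]  # slightly desynchronised start
    outputs = []
    for _ in range(steps):
        dtheta = []
        for i in range(n):
            d = 2 * math.pi * freq
            if i > 0:      # coupling to the previous segment
                d += coupling * math.sin(theta[i - 1] - theta[i] - phase_lag)
            if i < n - 1:  # coupling to the next segment
                d += coupling * math.sin(theta[i + 1] - theta[i] + phase_lag)
            dtheta.append(d)
        theta = [t + dt * d for t, d in zip(theta, dtheta)]
        outputs.append([math.sin(t) for t in theta])  # rhythmic motor signals
    return theta, outputs
```

After a few seconds of simulated time the phase difference between adjacent oscillators converges to `phase_lag`, regardless of the initial conditions; changing the simple inputs `freq` and `phase_lag` then smoothly changes the speed and wavelength of the gait, which is exactly what makes CPGs attractive as a control paradigm.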

by   -   October 30, 2013


GimBall is a flying robot that survives collisions. It weighs just 370 g and measures 34 cm in diameter. Photo credit: A. Herzog, EPFL.

Generally, flying robots are programmed to avoid obstacles, which is far from easy in cluttered environments. At the Laboratory of Intelligent Systems, we think that flying robots should be able to physically interact with their surroundings. Take insects: they often collide with obstacles and continue flying afterwards. We thus designed GimBall, a flying robot that can collide with objects seamlessly. Thanks to a passively rotating spherical cage, it remains stable even after taking hits from all sides. This approach enables GimBall to fly in the most difficult places without complex sensors.

The Deployable Air Land Exploration Robot (DALER) uses its own wings to crawl and roll over a variety of terrains. Using a self-adjusting structure to transform its wings into rotating arms, the robot is able to flip, rotate and navigate its way around and over obstacles on the ground. Sharing the wings across different modes of locomotion reduces the amount of infrastructure and weight the robot must carry, thus improving flight performance. The ability to adapt to a variety of environments is important in search and rescue operations, where both air and ground searching may be required.


Flies have small brains that would not be able to process high-resolution images such as those that we see with our own eyes. Instead, they’ve perfected the use of compound eyes, composed of a dense mosaic of tiny eye-like structures called ommatidia. Each ommatidium consists of a microlens that focuses light from a specific section of the insect’s field of view onto an independent set of photoreceptors. Think of it as having many low-resolution cameras pointing in different directions. The result is a vision system with low spatial resolution (i.e. it can’t see details), but a wide field of view (i.e. it can see all around). By comparing information across the different ommatidia, flies can extract temporal information useful for detecting motion. This motion information, also called optic flow, is what allows flies to navigate, take off, land and avoid obstacles while using very little processing power.
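The comparison across neighbouring ommatidia described above is classically modelled as a Hassenstein–Reichardt elementary motion detector (EMD): the delayed signal from one photoreceptor is correlated with the undelayed signal from its neighbour, and subtracting the mirror-image correlation yields an output whose sign encodes motion direction. The toy sketch below illustrates the principle only; it is not the biological circuit, and the one-sample delay stands in for the neural low-pass filter.

```python
def emd_response(left, right, delay=1):
    """Hassenstein-Reichardt elementary motion detector (toy version).

    left, right -- brightness samples from two neighbouring photoreceptors
    delay       -- delay (in samples) applied to one arm of the correlator

    Returns a per-timestep output: positive when the stimulus moves from
    the 'left' receptor toward the 'right' one, negative for the reverse.
    """
    out = []
    for t in range(delay, len(left)):
        # correlate delayed left with current right, minus the mirror term
        out.append(left[t - delay] * right[t] - right[t - delay] * left[t])
    return out
```

Feeding in a brightness pulse that passes first over one receptor and then the other gives a net positive response in one direction and a net negative response in the other, which is the direction-selective optic-flow signal the text describes, obtained with almost no computation.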


The Airburr, a light-weight flying robot from the Laboratory of Intelligent Systems (my PhD lab) at EPFL, was designed to fly in cluttered environments. Unlike most flying robots, which avoid contact at all costs, the Airburr interacts with its environment to navigate. Just like you might trail your hand along a wall to find your way in the dark, the robot can bounce off walls or follow them without crashing to the ground.


This concludes the ShanghAI Lecture series of 2012. After a wrap-up of the class, we announce the winners of the EmbedIT and NAO competitions and end with an outlook of the future of the ShanghAI Lectures.

Then there are three guest lectures: Tamás Haidegger (Budapest University of Technology and Economics) on surgical robots, Aude Billard (EPFL) on how the body shapes the way we move (and how humans can shape the way robots move), and Jamie Paik (EPFL) on soft robotics.

by   -   April 24, 2013


The 6th Annual Festival Robotique in Lausanne, Switzerland drew a record number of visitors this past Saturday, making it one of the largest public science events in the country.
by   -   April 16, 2013


In this 8th part of the ShanghAI Lecture series, Rolf Pfeifer looks into differences between human and computer memory and shows several types of “memories”. In the first guest lecture, Vera Zabotkina (Russian State University for the Humanities) talks about cognitive modeling in linguistics; in the second guest lecture, José del R. Millán (EPFL) demonstrates a brain-computer interface.

by   -   March 30, 2013


In this sixth part of the ShanghAI Lecture series, Rolf Pfeifer introduces the topic “Artificial Evolution” and gives examples of evolutionary processes in artificial intelligence. The first guest lecture, by Francesco Mondada (EPFL), is about the use of robots in daily life; in the second guest lecture, Robert Riener (ETH Zürich) talks about rehabilitation robots.

