Robohub.org
 

Plug-and-play artificial compound eye for robotic applications

by Sabine Hauert
20 May 2013




Flies have small brains that would not be able to process high-resolution images such as those that we see with our own eyes. Instead, they've perfected the use of compound eyes, composed of a dense mosaic of tiny eye-like structures called ommatidia. Each ommatidium consists of a microlens that focuses light from a specific section of the insect's field of view onto an independent set of photoreceptors. Think of it as having many low-resolution cameras pointing in different directions. The result is a vision system with low spatial resolution (i.e. it can't see details) but a wide field of view (i.e. it can see all around). By comparing signals across neighboring ommatidia, flies can extract temporal information useful for detecting motion. This motion information, also called optic flow, is what allows flies to navigate, take off, land and avoid obstacles while using very little processing power.
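The classic model of how neighboring ommatidia are compared to detect motion is the Hassenstein-Reichardt correlator: the signal of one photoreceptor is delayed and multiplied with its neighbor's current signal, and the mirror-image product is subtracted. The sketch below is a toy illustration of that principle, not the circuit used in the fly or in any particular sensor; the function name and signals are invented for the example.

```python
def reichardt_motion(signal_a, signal_b, delay=1):
    """Toy Hassenstein-Reichardt correlator over two neighboring
    photoreceptor signals sampled at discrete time steps.
    A positive net output indicates motion from A towards B;
    a negative one indicates the opposite direction."""
    out = []
    for t in range(delay, len(signal_a)):
        # Correlate the delayed signal of one receptor with the
        # current signal of its neighbor, in both directions.
        out.append(signal_a[t - delay] * signal_b[t]
                   - signal_b[t - delay] * signal_a[t])
    return out

# A brightness edge that crosses receptor A first, then B one step later,
# yields a net positive response (motion from A to B).
a = [0, 0, 1, 1, 0, 0, 0]
b = [0, 0, 0, 1, 1, 0, 0]
print(sum(reichardt_motion(a, b)))
```

Because the two multiplicative branches cancel for a stationary scene, the detector responds only to change over time, which is why such circuits need so little processing power.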

Inspired by the fly’s vision system, the Curved Artificial Compound Eye (CurvACE) published today in the prestigious journal PNAS can enable a large range of applications that require motion detection using a small plug-and-play device. As shown in the video below, you could use these sensors to control small robots navigating an environment, even in the dark, or equip a small autonomous flying robot with limited payload. Other applications include home automation, surveillance, medical instruments, prosthetic devices, and smart clothing.

The artificial compound eye features a panoramic, hemispherical field of view with a resolution identical to that of the fruit fly, in a package less than 1 mm thick. Additionally, it can extract images three times faster than a fruit fly, and includes neuromorphic photoreceptors that allow motion perception in a wide range of environments, from a sunny day to moonlight. To build the sensors, the researchers align an array of microlenses, an array of photodetectors, and a flexible PCB that mechanically supports and electrically connects the ensemble. The panoramic field of view is obtained by dicing the rigid parts between the ommatidia, which allows the sensor to be mechanically bent. The components needed for signal readout and processing are embedded within the curvature of the sensor.
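The reason such photoreceptors work from sunlight down to moonlight is that they respond to contrast, i.e. the ratio of a stimulus to the local adaptation level, rather than to absolute intensity. The snippet below is a minimal sketch of that logarithmic-adaptation principle, with an invented function name; it is not the actual CurvACE pixel circuit.

```python
import math

def photoreceptor_response(intensity, adaptation_level):
    """Toy logarithmic photoreceptor: the output depends on the ratio
    of the stimulus to the level the receptor has adapted to, so the
    same relative brightness change gives the same signal regardless
    of absolute light level."""
    return math.log(intensity / adaptation_level)

# A 2x brightness step produces the same response in bright sunlight
# and in moonlight, ten orders of magnitude dimmer.
sunny = photoreceptor_response(2e5, 1e5)
moonlit = photoreceptor_response(2e-5, 1e-5)
print(sunny, moonlit)  # both equal log(2), roughly 0.693
```

This contrast invariance is what lets a single sensor design report the same optic-flow signal across the huge range of illumination conditions mentioned above.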

CurvACE is a European project bringing together the Laboratory of Intelligent Systems at EPFL (Switzerland), the Laboratory of Biorobotics at the University of Aix-Marseille (France), the Fraunhofer Institute for Applied Optics and Precision Engineering (Germany), and the Laboratory of Cognitive Sciences at the University of Tübingen (Germany).

Don’t miss the next ROBOTS podcast for my interview with the researchers behind this new artificial compound eye.





Sabine Hauert is President of Robohub and Associate Professor at the Bristol Robotics Laboratory





©2021 - ROBOTS Association