
Inroads in perception


by Frank Tobe
22 August 2014



Google’s recent acquisition of Emu Messenger is just one of many recent news items about improvements in perception and artificial intelligence (AI). Emu, not to be confused with the Australian ostrich-like bird or the European Monetary Union, is a small Palo Alto start-up made up of serious software talent with experience in machine learning, natural language processing, and mashing up different data, databases and systems at Siri, Apple, AOL and Google. Perception in this case means having your words and intentions understood and acted upon. No financial details were disclosed about the acquisition; however, the Emu app will be shut down next week.

Another form of perception, computer eyesight (AKA machine vision), took a giant step forward when the winners of this year’s Large Scale Visual Recognition Challenge were announced. The NY Times reported the winners: the National University of Singapore, the University of Oxford, Adobe Systems, and the Center for Intelligent Perception and Computing at the Chinese Academy of Sciences, as well as Google in two separate categories.

Fei-Fei Li, director of the Stanford Artificial Intelligence Laboratory, said, “What really excites us is that performance has taken a huge leap.”

Machine vision has long been a challenge in automation and robotics, but in recent years it has become integral to countless applications, including computer gaming, medical diagnosis and factory robotics. Carmakers, too, have added the ability to recognize pedestrians and bicyclists and trigger safety actions. Factory robots need much-improved perception systems in order to monitor the progress of their own tasks and the tasks of those around them. [A quick Google Images search produces a collage of various uses, from milking to adaptive cruise control.]

Enhanced algorithms, larger data libraries, and faster, cheaper computing are all contributing to the increased accuracy and speed with which these systems recognize objects and identify them by type and in 3D space. Nevertheless, even at their best they are still no match for human vision and perception.
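To make the object-recognition idea concrete, here is a rough, present-day sketch (my own illustration, not one of the systems that competed in the challenge) of classifying a single image with an off-the-shelf pretrained convolutional network from PyTorch’s torchvision library. The image file name is a placeholder.

```python
# Minimal sketch: classify one image with a pretrained ImageNet network.
# Assumes torch and torchvision are installed; "street_scene.jpg" is a placeholder.
import torch
from torchvision import models
from torchvision.io import read_image

weights = models.ResNet50_Weights.DEFAULT        # ImageNet-trained weights
model = models.resnet50(weights=weights).eval()  # inference mode
preprocess = weights.transforms()                # resize, crop, normalize

img = read_image("street_scene.jpg")             # C x H x W uint8 tensor
batch = preprocess(img).unsqueeze(0)             # add batch dimension

with torch.no_grad():
    probs = model(batch).softmax(dim=1)

top_prob, top_class = probs.max(dim=1)
label = weights.meta["categories"][top_class.item()]
print(f"{label}: {top_prob.item():.2%}")
```

The same pattern (pretrained network, standardized preprocessing, a softmax over object categories) underlies many of the recognition systems mentioned above, though production systems add detection, tracking and 3D localization on top of classification.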





Frank Tobe is the owner and publisher of The Robot Report, and is also a panel member for Robohub's Robotics by Invitation series.
