Robohub.org
 

Inroads in perception

by Frank Tobe
22 August 2014




Google’s recent acquisition of Emu Messenger is just one of many items in recent news about improvements in perception and artificial intelligence (AI). Emu, not to be confused with the Australian ostrich-like bird or the European Monetary Union, is a small Palo Alto start-up made up of serious software talent, with experience in machine learning, natural language processing, and mashing up different data, databases and systems gained at Siri, Apple, AOL and Google. Perception in this case is the feeling that your words and intentions are understood and acted upon. No financial details were disclosed about the acquisition; however, the Emu app will be shut down next week.

Another form of perception, computer eyesight (also known as machine vision), took a giant step forward when the winners were announced for this year’s Large Scale Visual Recognition Challenge. The New York Times reported the winners: the National University of Singapore, the University of Oxford, Adobe Systems, the Center for Intelligent Perception and Computing at the Chinese Academy of Sciences, and Google, which won in two separate categories.

Fei-Fei Li, director of the Stanford Artificial Intelligence Laboratory, said, “What really excites us is that performance has taken a huge leap.”

Machine vision has long been a challenge in automation and robotics, but in recent years it has become integral to countless applications, including computer gaming, medical diagnosis and factory robotics. Carmakers, too, have added the ability to recognize pedestrians and bicyclists and trigger safety actions. Factory robots need much-improved perception systems in order to monitor the progress of their own tasks and of the tasks of those around them. [A quick Google Images search produces a collage of various uses, from milking to adaptive cruise control.]

Enhanced algorithms, larger data libraries, and faster, cheaper computing are all contributing to the increased accuracy and speed with which these systems recognize objects and identify them by type and in 3D space. Nevertheless, even at their best, they are still no match for human vision and perception.
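To give a concrete sense of the object-recognition task that the challenge evaluates, here is a minimal, illustrative sketch that classifies a single image with a network pretrained on ImageNet, the dataset behind the Large Scale Visual Recognition Challenge. It assumes the modern torchvision library and a hypothetical input file, street_scene.jpg; it is only a sketch of the general technique, not the systems the competition winners actually built.

# Minimal sketch: classify one image with a pretrained ImageNet model.
# Assumes torchvision is installed; "street_scene.jpg" is a hypothetical input.
import torch
from torchvision import models, transforms
from PIL import Image

# Load a ResNet-50 pretrained on ImageNet and put it in inference mode.
weights = models.ResNet50_Weights.IMAGENET1K_V1
model = models.resnet50(weights=weights)
model.eval()

# Standard ImageNet preprocessing: resize, center-crop, normalize.
preprocess = transforms.Compose([
    transforms.Resize(256),
    transforms.CenterCrop(224),
    transforms.ToTensor(),
    transforms.Normalize(mean=[0.485, 0.456, 0.406],
                         std=[0.229, 0.224, 0.225]),
])

image = Image.open("street_scene.jpg").convert("RGB")
batch = preprocess(image).unsqueeze(0)  # add a batch dimension

with torch.no_grad():
    scores = model(batch)
    probs = torch.nn.functional.softmax(scores[0], dim=0)

# Report the five most likely object categories with their probabilities.
top_probs, top_ids = probs.topk(5)
labels = weights.meta["categories"]
for p, idx in zip(top_probs, top_ids):
    print(f"{labels[idx]}: {p.item():.2%}")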





Frank Tobe is the owner and publisher of The Robot Report, and is also a panel member for Robohub's Robotics by Invitation series.





