Vision sensor capable of detecting spots 0.05mm in size across a 1.4m field of vision from a distance of 2m

26 November 2012


Technos has introduced the Super5000K 7K Model neuro-visual sensor. This is the world’s highest-precision visual inspection system: it can detect spots 0.05mm in size at a distance of 2m, across a 1.4m field of vision. This is 1,000 times the resolving power of a conventional 4,000-pixel CCD line-sensor camera, and 4,000 times that of a full-HD camera.
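As a quick sanity check on these figures (a sketch using only the numbers quoted above; the vendor does not say how it defines resolving power), a 0.05mm spot across a 1.4m field implies 28,000 resolvable points, and such a spot subtends roughly 5 arcseconds at 2m:

```python
# Back-of-envelope check of the article's stated figures
# (illustrative only; definitions are assumptions, not vendor specs).
import math

spot_size_m = 0.05e-3    # 0.05 mm minimum detectable spot
field_of_view_m = 1.4    # stated field of vision
distance_m = 2.0         # stated working distance

# Number of 0.05 mm spots that fit across the 1.4 m field.
resolvable_points = field_of_view_m / spot_size_m    # 28,000

# Angular size of one spot at 2 m, converted to arcseconds.
angle_rad = spot_size_m / distance_m
angle_arcsec = math.degrees(angle_rad) * 3600

print(f"{resolvable_points:.0f} resolvable points across the field")
print(f"{angle_arcsec:.1f} arcseconds per spot")
```

Note that 28,000 points is only 7 times the pixel count of a 4,000-pixel line sensor, so the quoted 1,000-fold figure presumably refers to detection capability gained from the vibration technique described below rather than to raw pixel count.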

“For industrial applications, this product has recently been used in many automotive-related situations, but it is also used in the steel, electronics, and flat-panel display industries. Typical manufacturing processes end with a visual inspection; we can fully automate that. Our sensor has high speed and high precision. It operates on the same principles as the human eye, but with 100 times the precision. It can perform automatic inspections with 1,000 times the precision of conventional CCDs. In principle, it emulates the movement of cells in the human eye; your eye vibrates up and down 80 times per second, and we emulate that vibration using electronic circuits.”
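The vibration principle described in the quote can be illustrated with a toy model (a hypothetical sketch, not Technos's actual circuit design): a sensor with coarse pixels, dithered by known sub-pixel offsets, can locate a feature far more precisely than its pixel pitch, because each offset changes which pixel the feature lands in.

```python
# Toy 1-D model of sub-pixel localization via sensor dithering,
# loosely analogous to the eye-tremor principle quoted above.
# (Hypothetical illustration -- not the actual Technos design.)
import random

PIXEL_PITCH = 1.0        # coarse pixel width (arbitrary units)
spot_position = 3.37     # true spot location, between pixel centers

def pixel_hit(spot, offset):
    """Index of the pixel the spot lands in when the sensor is shifted."""
    return int((spot - offset) // PIXEL_PITCH)

# Static sensor: best guess is the center of the pixel that was hit.
static_estimate = pixel_hit(spot_position, 0.0) + 0.5

# Vibrating sensor: average many readings taken at known offsets.
random.seed(0)
samples = []
for _ in range(20000):
    d = random.random() * PIXEL_PITCH          # known dither offset
    samples.append(pixel_hit(spot_position, d) + d + 0.5)
dithered_estimate = sum(samples) / len(samples)

print(f"static error:   {abs(static_estimate - spot_position):.3f}")
print(f"dithered error: {abs(dithered_estimate - spot_position):.3f}")
```

The static estimate is off by 0.13 pixel widths, while the dithered average converges on the true position, which is the essence of gaining resolution from controlled vibration.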

Conventional CCD-based equipment has had difficulty detecting color variations. By applying the principles of human vision, Technos has achieved a sensor with 100 times the precision of human eyes, making it possible to detect color variation.

“Color variation is a problem in a variety of settings; this sensor is used in industrial applications, but is also used in maintenance applications such as inspecting highways or oil storage tanks. This technology for rapidly picking up small details will be even more widely used in the future, and we plan to develop those applications.”

Pricing starts at the equivalent of $240,000 for the minimum configuration and runs up to between $470,000 and $730,000, depending on the particular specifications. Technos estimates that about 240 companies listed on the first section of the Tokyo Stock Exchange will need equipment with this level of precision. The company aims to sell about 10 units per year to the automobile, steel, semiconductor, and liquid crystal industries. Technos has been awarded patents in 14 countries around the world. With inquiries coming from foreign countries, it is looking to expand overseas as well.


DigInfo TV is a Tokyo-based online video news platform dedicated to producing original coverage of cutting edge technology, research and products from Japan.


©2021 - ROBOTS Association

