AUTOMATICA 2016, held in June in Munich at the massive Messe trade fair facility, is a gigantic show focused on automation, mechatronics, robotics and emerging technologies as they relate to the industrial and manufacturing sectors.
To give you an idea of how enormous the Messe is and the trade show was, the facility has two dedicated subway stops: one on the east side and one on the west. Moving walkways carry visitors from one side to the other because the distances are so immense. AUTOMATICA took up the east half, and a European solar trade show took up the west.
AUTOMATICA used six exhibition halls of about 120,000 sq ft each, so the show covered over 720,000 sq ft of space. There were 839 exhibitors from 47 countries, and the show was attended by 45,000 visitors, a 30% increase from 2014 (it’s a biennial show), one-third of whom came from abroad. Between the halls are outdoor grassy areas with food stalls, park benches and tables. It’s a terrific venue.
Managing Director Falk Senger described the themes of this year’s show, two of which are examined below:
“AUTOMATICA showed the future of production with deep insights into the possibilities of digitalization, human-robot collaboration and professional service robotics.”
In the not-too-distant future, billions of intelligent devices and machines – connected through network communications – will generate massive amounts of data. Turning that data into value is the key promise of digitalization.
Falling sensor prices, the emergence of the Internet of Things (IoT), and the consequent availability of streaming data from sensors of almost every type have made digitalization an important driver of AI. Some high-value agricultural crops, such as grapes, are already beginning to be remotely monitored, vine by vine, through the use of these sensors.
FANUC, the world’s largest maker of industrial robots, plans to start connecting 400,000 of its installed CNC and robot systems by the end of this year. The goal is to collect data about their operations and, through the use of deep learning, improve performance and reduce downtime. FANUC is also testing reinforcement learning as a way to train industrial robots in new tasks such as grasping unfamiliar objects. A robot tries picking up objects while capturing video footage of the process. Each time it succeeds or fails, it remembers how the object looked, knowledge that is used to refine the deep learning model (a large neural network) that controls its actions. After eight or more hours of practice it exceeds 90% accuracy, close to the performance of a robot programmed by hand. FANUC and other researchers are testing reinforcement learning as a way to simplify and speed up the programming of robots that do factory work. Google recently published details of its own research on using reinforcement learning to teach robots how to grasp objects.
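FANUC has not published its training code, so the following is only a minimal sketch of the trial-and-error loop described above, with all poses, probabilities and parameters invented for illustration; a simple per-pose success estimate stands in for the deep network, and a simulated grasp stands in for the physical arm.

```python
import random

random.seed(0)

# Simulated world: hidden success probability of each grasp pose.
TRUE_SUCCESS = {"top": 0.9, "side": 0.4, "angled": 0.6}

# Learned model: running success estimate per pose (the "refinement"
# target, standing in for the neural network's parameters).
estimates = {pose: 0.5 for pose in TRUE_SUCCESS}
counts = {pose: 1 for pose in TRUE_SUCCESS}

def try_grasp(pose):
    """Simulated grasp attempt; a real system would move the arm."""
    return random.random() < TRUE_SUCCESS[pose]

for attempt in range(2000):
    # Epsilon-greedy: mostly exploit the best-known pose, sometimes explore.
    if random.random() < 0.1:
        pose = random.choice(list(estimates))
    else:
        pose = max(estimates, key=estimates.get)
    success = try_grasp(pose)
    counts[pose] += 1
    # Incremental mean update: refine the model after every attempt.
    estimates[pose] += (success - estimates[pose]) / counts[pose]

best = max(estimates, key=estimates.get)
print(best, round(estimates[best], 2))
```

After a few thousand simulated attempts the learner's estimate for the best pose converges toward its true success rate, mirroring the article's description of accuracy climbing with hours of practice.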
When many robots work in parallel, the required training time is reduced accordingly. Similarly, Kuka is building a deep-learning AI network for its industrial robots, and ABB, in its power division, is working with Microsoft Cloud Services to enable ABB chargers to stream data for analysis, a precursor of a more extensive cloud analytics service for ABB’s full range of products.
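To illustrate why parallel robots cut training time, here is a hypothetical sketch of experience sharing: each robot reports its tally of grasp successes and attempts, and pooling the tallies gives the shared model several robots' worth of data per unit time (all names and numbers are invented).

```python
def merge_experience(batches):
    """Combine per-robot (successes, attempts) tallies per grasp pose."""
    merged = {}
    for batch in batches:
        for pose, (succ, att) in batch.items():
            s, a = merged.get(pose, (0, 0))
            merged[pose] = (s + succ, a + att)
    return merged

# Three robots report one hour of trials each:
robot_a = {"top": (45, 50), "side": (20, 50)}
robot_b = {"top": (43, 50), "side": (18, 50)}
robot_c = {"top": (44, 50), "side": (22, 50)}

shared = merge_experience([robot_a, robot_b, robot_c])
for pose, (succ, att) in shared.items():
    print(pose, succ / att)  # pooled success-rate estimate
```

One hour of parallel operation yields 150 attempts per pose instead of 50, so the shared estimate converges roughly three times faster, which is the "reduced accordingly" effect described above.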
“Moving away from having to program robots by hand by endowing robots to learn autonomously is a key element for the future of robotics,” says Jens Kober, an expert on robot learning at Delft University of Technology in the Netherlands. “Having robots share the information they have learned will be crucial.”
The most visible deep learning efforts are in Silicon Valley, which has seen widespread investment of money and talent in AI research and start-ups from nearly every university, think tank, car company and telecom giant, plus Baidu, Alibaba and, most recently, Toyota, whose $1 billion investment established the Silicon Valley-based Toyota Research Institute headed by Gill Pratt (of DARPA Robotics Challenge fame). Apple, IBM, Google and Facebook have led investment in more advanced uses, but practical deep learning systems such as Aethon’s, FANUC’s and Kuka’s are also becoming prevalent in the industrial sector.
The black box concept, i.e., the storing of streamed sensor data for analysis and learning, is already a valuable tool in air safety and may soon become a mainstay in autonomously driven transportation, mobile robots and robotics in general. That data, combined with super-fast computer processing, is enabling deep learning engines to find and build patterns that can make these devices safer, more productive, more cost effective and quicker to program. But these very same characteristics, and those created by the new era of IoT, create security and standardization problems. There is no standard architecture for streaming data from different devices (each manufacturer has its own), and the very act of streaming opens up networks to security breaches. Consequently, a lot needs to happen before we’ll see any widespread benefits from digitalization.
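The standardization problem can be made concrete with a hypothetical sketch: two vendors stream the same reading (a joint temperature) in different shapes and units, so an analytics service needs a per-vendor adapter just to reach one common record (all field names and payload shapes below are invented).

```python
def normalize(vendor, payload):
    """Map a vendor-specific payload to one common record (names invented)."""
    if vendor == "vendor_a":
        # Vendor A: flat payload, Celsius.
        return {"sensor": payload["id"], "temp_c": payload["tC"],
                "ts": payload["timestamp"]}
    if vendor == "vendor_b":
        # Vendor B: nested payload, Fahrenheit; convert to Celsius.
        return {"sensor": payload["meta"]["sensor"],
                "temp_c": round((payload["reading"]["F"] - 32) * 5 / 9, 1),
                "ts": payload["meta"]["time"]}
    raise ValueError(f"no adapter for {vendor}")

a = normalize("vendor_a", {"id": "joint3", "tC": 41.5,
                           "timestamp": "2016-06-21T10:00:00Z"})
b = normalize("vendor_b", {"meta": {"sensor": "joint3",
                                    "time": "2016-06-21T10:00:05Z"},
                           "reading": {"F": 106.7}})
print(a)
print(b)
```

Every new vendor means another hand-written adapter; a shared streaming standard would make this translation layer unnecessary, which is exactly the gap the paragraph above describes.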
The second theme, human-robot collaboration, is broader than just the emerging collaborative robots market spearheaded by Universal Robots and Rethink Robotics. In the US and EU, small and medium-sized enterprises (SMEs) represent a large untapped marketplace for robotics: a marketplace for enhancing worker productivity rather than replacing labor.
Human-robot collaboration also includes the movement toward the Internet of Things (IoT) described above. The telecom industry has already set up what it calls the Robolliance Program to help standardize machine-to-machine communications, the autonomous methods of connecting people, places and things, but as of today it hasn’t developed a solution. Apple just announced that iOS 10 will have a HomeKit protocol that puts control of all compatible devices in one place. But Kuka has its own protocol, as do ABB, FANUC and Yaskawa. There is no standard, and each major player is offering its own as that standard. This will have to change before any true benefits can be achieved.
Every robot manufacturer at the show was showcasing safety features enabling its robots to work safely alongside humans. But much of it was semantics rather than practical capability. Hence the need to differentiate the term “human-robot collaboration” from “collaborative robot” (also known as a “co-bot” or “cobot”), the domain of products made by Universal Robots and Rethink Robotics.
A driver of this interest in human-robot collaboration is a conclusion the auto companies have recently reached: customized products and shorter delivery times are better served by humans than by robots, because such changes require more flexibility than existing robots can provide. Human productivity can be augmented by robotics, and that’s why this topic was one of the themes at AUTOMATICA: humans and robots working together to increase productivity and flexibility by each doing what they do best. Humans can see, select and grasp; robots can handle, move, lift and repetitively process. The car companies have already come to this conclusion and are hiring workers and replacing old-style inflexible robots with either humans or humans augmented with co-bots.
Henrik Schunk, Managing Partner of Schunk, the well-known German gripper and arm maker, said:
“We currently see two central trends in assembly and handling technology: First, the trend towards digitalization in the course of Industry 4.0 and consequently towards the mechanization of assembly and handling systems. And second, the trend towards human-robot collaboration, which will allow new automation scenarios.”