Many researchers think that robot calibration is an issue that was successfully resolved decades ago, but they are wrong. While the underlying theory is well established, its practical application is still in its infancy. This is the first in a series of posts providing evidence that it is still very hard for a user to get an accurate industrial robot. So hard, indeed, that a company in New Zealand didn’t hesitate to ask one of my postdocs to fly in for help.
In the past decade, robot simulation and off-line programming software packages have become very popular. However, many users are apparently still unaware that industrial robots are repeatable but not accurate, at least not “by default.” When these users start machining with their robots, they suddenly realize that the actual results are quite different from the simulations.
Industrial robots have (unidirectional) position repeatability that is usually better than 0.1 mm (sometimes as good as 0.01 mm). This is due to the use of zero-backlash gearheads and high-resolution encoders. However, because it would be too expensive to machine and assemble robot parts precisely enough and because gearheads are not perfectly rigid, there are important differences between the robot’s mathematical model that is implemented in its controller and the real robot. Because of these differences, if you program a robot off-line to go to a certain position, you may observe position errors of up to a few millimetres in the real robot.
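To get a feel for how tiny model errors become millimetre-scale tip errors, here is a toy example (a planar two-link arm, not any real robot's kinematics; all numbers are invented for illustration): the controller assumes 1000 mm links and perfect joint zeros, while the "real" arm has links 0.5 mm longer and 0.05° joint-zero offsets.

```python
import numpy as np

def fk_2link(q, l1, l2, off1=0.0, off2=0.0):
    """Forward kinematics of a planar 2-link arm.
    q: joint angles in rad; lengths in mm; off1/off2: joint-zero offsets in rad."""
    a1 = q[0] + off1
    a2 = a1 + q[1] + off2
    return np.array([l1 * np.cos(a1) + l2 * np.cos(a2),
                     l1 * np.sin(a1) + l2 * np.sin(a2)])

q = np.deg2rad([30.0, 45.0])           # commanded joint angles
nominal = fk_2link(q, 1000.0, 1000.0)  # where the controller thinks the tip is
# "Real" arm: links 0.5 mm longer, 0.05 deg joint-zero offsets
real = fk_2link(q, 1000.5, 1000.5, np.deg2rad(0.05), np.deg2rad(0.05))

err = np.linalg.norm(real - nominal)
print(f"tip position error: {err:.2f} mm")  # a few millimetres
```

Despite sub-0.1 mm repeatability, such a robot would miss every off-line-programmed target by a few millimetres, simply because the controller's model does not match the hardware.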
The process of finding a new mathematical model that represents the real robot more closely, and thus leads to smaller position errors, is called robot calibration. To identify the parameters of the new model, sophisticated coordinate measurement equipment is needed, such as laser trackers, which are extremely expensive (more than $100,000). The cost of the required measurement equipment is probably one of the main reasons why robot calibration is still in its infancy in practice.
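Conceptually, the identification step is a nonlinear least-squares fit: find the model parameters that best explain the measured positions. Here is a minimal sketch using the same toy planar two-link arm with synthetic "tracker" measurements (the parameter values and noise level are invented for illustration, not from any real device):

```python
import numpy as np
from scipy.optimize import least_squares

def fk(q, p):
    """Planar 2-link FK; p = [l1, l2, off1, off2] (mm, mm, rad, rad)."""
    a1 = q[:, 0] + p[2]
    a2 = a1 + q[:, 1] + p[3]
    return np.stack([p[0] * np.cos(a1) + p[1] * np.cos(a2),
                     p[0] * np.sin(a1) + p[1] * np.sin(a2)], axis=1)

rng = np.random.default_rng(0)
q_meas = rng.uniform(-np.pi / 2, np.pi / 2, size=(50, 2))  # 50 measured poses
p_true = np.array([1000.5, 999.6, 8.7e-4, -1.2e-3])        # unknown "real" robot
p_nom = np.array([1000.0, 1000.0, 0.0, 0.0])               # controller's model
meas = fk(q_meas, p_true) + rng.normal(0, 0.02, (50, 2))   # measurement noise

# Identification: find the parameters that best fit the measurements
res = least_squares(lambda p: (fk(q_meas, p) - meas).ravel(), p_nom)

err_before = np.linalg.norm(fk(q_meas, p_nom) - meas, axis=1).mean()
err_after = np.linalg.norm(fk(q_meas, res.x) - meas, axis=1).mean()
print(f"mean position error: {err_before:.3f} mm -> {err_after:.3f} mm")
```

A real 6-axis robot involves a few dozen parameters (link geometry, joint offsets, sometimes joint and link compliance) rather than four, but the structure of the problem is the same.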
Robot manufacturers (at least ABB, FANUC, KUKA and MOTOMAN) offer factory robot calibration as a relatively inexpensive option. I’ll examine the performance of these options in another post, so let’s focus now on those users who forget to purchase such an option.
This happened to a company based in Auckland, New Zealand, a two-hour drive from Hobbiton (the famous village from The Lord of the Rings movies). The company manufactures marine products through robot milling but observed significant errors in the surface quality of its machined parts. It uses an ABB IRB 6640-130/3.2 robot, which has a 130 kg payload, a 3.2 m reach, 0.050 mm repeatability, and… was not factory calibrated. On top of that, the robot is mounted on a linear track, which introduces yet another source of errors.
After weighing all options (such as shipping the robot back to Sweden), the company called us and asked for help. Not only was local expertise in robot calibration unavailable, but even a laser tracker was nowhere to be found nearby. And so, last January, one of my Ph.D. graduates, Albert Nubiola (pictured above in Hobbiton), armed with a pair of flip-flops and a C-Track, headed to New Zealand.
The C-Track is a photogrammetry-based optical CMM developed by Creaform. It is a bit less accurate than a laser tracker (volumetric accuracy is 0.065 mm) and has a smaller measurement volume, but is way more compact and less than half the price of a laser tracker. Furthermore, it can measure the pose of the robot’s end-effector (not just the position) and (if properly used) is not influenced by vibrations or air currents.
The robot calibration solution was delivered using RoboDK, a software tool for robot simulation and off-line programming (OLP) recently launched as a spin-off from our lab. This highly versatile platform can deploy the calibration methods developed in our lab and currently supports more than 200 robots from 10 different manufacturers.
The robot calibration can be performed in four steps:
1. Modeling: choose an error model, typically the nominal kinematic model augmented with joint offsets and link-geometry corrections.
2. Measurement: move the robot through a set of measurement configurations and record the position (or pose) of its end-effector with an external measurement device.
3. Identification: find the model parameters that best fit the measurements, typically through least squares.
4. Validation and compensation: verify the accuracy improvement at configurations not used for identification, then use the identified model to correct robot programs.
The measurement acquisition is fully automated by RoboDK and the calibration can be performed in less than an hour using the C-Track or a FARO laser tracker (as in this demonstration, where MATLAB was used instead).
The ABB robot’s mean position error was reduced from 3.443 mm to 0.765 mm, as validated at 315 arbitrary robot configurations. The robot’s position accuracy was also validated by milling a series of equally spaced slots at different tool orientations. After calibration, the distance errors between machined slots were reduced by half or better.
Once the robot is calibrated, RoboDK takes the parameters of the new mathematical model into account and directly generates accurate robot programs for milling. Besides filtering existing robot programs, RoboDK can automatically generate accurate milling programs from a CAM file (such as G-code or APT). Its path solver automatically manages 6-axis robots to make them behave like a 5-axis CNC for milling. This methodology can also be used for welding and painting applications. It is also possible to simulate and generate robot programs through Python; RoboDK automatically handles brand-specific syntax.
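To illustrate what "filtering" a program means, here is a conceptual sketch (not the RoboDK API) on the same toy planar two-link arm: instead of solving the inverse kinematics with the nominal model, solve it with the calibrated model, and send the corrected joint angles to the robot.

```python
import numpy as np

def fk_2link(q, p):
    """Planar 2-link forward kinematics; p = [l1, l2, off1, off2]."""
    l1, l2, off1, off2 = p
    a1 = q[0] + off1
    a2 = a1 + q[1] + off2
    return np.array([l1 * np.cos(a1) + l2 * np.cos(a2),
                     l1 * np.sin(a1) + l2 * np.sin(a2)])

def ik_2link(target, p):
    """Closed-form inverse kinematics (elbow-up) for the same arm."""
    l1, l2, off1, off2 = p
    x, y = target
    c2 = (x**2 + y**2 - l1**2 - l2**2) / (2 * l1 * l2)
    e = np.arctan2(np.sqrt(1 - c2**2), c2)  # elbow angle
    a1 = np.arctan2(y, x) - np.arctan2(l2 * np.sin(e), l1 + l2 * np.cos(e))
    return np.array([a1 - off1, e - off2])

p_nom = [1000.0, 1000.0, 0.0, 0.0]        # controller's nominal model
p_cal = [1000.5, 999.6, 8.7e-4, -1.2e-3]  # identified (calibrated) model
target = np.array([1200.0, 900.0])        # programmed position (mm)

q_naive = ik_2link(target, p_nom)  # joints from the nominal model
q_filt = ik_2link(target, p_cal)   # "filtered" joints from the calibrated model

# The real robot behaves like the calibrated model:
err_naive = np.linalg.norm(fk_2link(q_naive, p_cal) - target)
err_filt = np.linalg.norm(fk_2link(q_filt, p_cal) - target)
print(f"error without filtering: {err_naive:.3f} mm, with filtering: {err_filt:.6f} mm")
```

The naive targets miss by roughly a millimetre on this toy arm, while the filtered targets land on the programmed position; on a real 6-axis robot the correction is done numerically over the full identified model, but the principle is the same.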
Of course, our lab is not the only organization offering robot calibration services. Dynalog, for example, has been operating exclusively in this field for 25 years. However, our specialty is the development of robot calibration methods based on novel low-cost, high-accuracy devices. Perhaps our most innovative device and method is the one based on the use of a single telescoping ballbar from Renishaw. This method has already been implemented at AV&R Aerospace for FANUC robots.
Do you still think your robots are accurate?