Assembly Robots 101
| By: Joe Campbell, Chairman, RIA Membership Committee (2006-2008)
While robots are among the most technologically advanced products on the manufacturing floor, they are essentially just another tool. Twenty years ago, cam-actuated devices were the commonly accepted solution; then came pneumatic devices, and today it's robots.
To the uninitiated, robots appear complex: They use complicated programming languages, controllers, and advanced sensors such as machine vision. They can also carry a big price tag and require a lot of engineering to design and install the cells.
But today most of that perception is incorrect. Complex programming languages may still be used by advanced integrators and OEMs, but most systems can also be set up using point-and-click graphical user interfaces. Controllers have been radically simplified and have shrunk from the size of a commercial washing machine to the size of a shoebox. Machine vision has gotten so powerful that setup is literally one click, and advanced applications, such as using vision to pick parts off moving conveyor belts, are routine. Finally, prices have dropped dramatically.
Why use robots? Product lifecycles are shortening, while product customization options are increasing, and flexible automation supports the routine changes. Products are also shrinking, often beyond the ability of manual labor to do the assembly. Under these conditions, flexible controlled automation is often the best way to ensure that quality controls are met.
Once considered exotic and extremely difficult to implement, vision guidance, in particular, is becoming a core technology for flexible manufacturing. A few leading vision and robot vendors have responded by tightly integrating vision and motion functions at the system and application software level, and by developing more powerful and easier-to-use tools to locate parts. But in spite of 'one-shot training' and 'point-and-click programming', the project engineer still needs to understand the application issues and how to avoid the common mistakes that cripple some applications.
Ten Classic Pitfalls
Do a tolerance map for the entire process. It's common practice to look at the tolerance stack-up on the production part, but this is typically a small problem in the overall process. Map all tolerances for parts, cameras, lenses, grippers, tooling, conveyors, and encoders. Each of these elements contributes to the machine's ability to locate a part, acquire it, and place it. And if the tolerance interaction isn't apparent, model it, test it, and research it before starting the project.
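A tolerance map like this is often reduced to a worst-case sum and a statistical (root-sum-square) estimate across the chain. The sketch below uses made-up, illustrative 3-sigma values for each element; the element names and numbers are assumptions, not figures from any real cell.

```python
import math

# Hypothetical 3-sigma contributions (mm) from each element in the chain.
# These values are illustrative only, not from any real workcell.
tolerances = {
    "part": 0.05,
    "camera_calibration": 0.03,
    "lens_distortion": 0.02,
    "gripper_centering": 0.10,
    "conveyor_encoder": 0.04,
    "fixture": 0.03,
}

# Worst case: every error at its limit, all in the same direction.
worst_case = sum(tolerances.values())

# Root-sum-square: the usual statistical estimate for independent errors.
rss = math.sqrt(sum(t**2 for t in tolerances.values()))

print(f"worst case: {worst_case:.3f} mm")   # 0.270 mm
print(f"RSS:        {rss:.3f} mm")          # 0.128 mm
```

Even this toy example shows why a single dominant contributor (here, gripper centering) deserves attention before the project starts: it dwarfs everything else in the RSS total.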
Determine how the parts and assemblies will be presented to the robot, especially if a moving conveyor is involved. The most advanced systems can fully integrate robot motion control with vision guidance and conveyor tracking to allow parts to be accurately picked from moving conveyor belts. This is an attractive feature, as it minimizes the investment in inflexible, dedicated tooling. But it is necessary to match conveyor speed with the belt encoder precision and the robot controller's processing bandwidth. Vision processing is the most CPU intensive operation in robotics, especially when coupled with the constant recalculation of setpoints and trajectories based on encoder input from a moving belt.
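The core of conveyor tracking is continuously re-predicting where a vision-located part will be when the robot actually arrives, from belt encoder counts plus the robot's move time. This is a minimal sketch of that arithmetic; the function name, encoder scale, and all numbers are illustrative assumptions, not any vendor's API.

```python
# Belt travel per encoder count, established during calibration (assumed value).
MM_PER_COUNT = 0.02

def predicted_pick_x(x_at_image_mm, counts_at_image, counts_now,
                     robot_move_time_s, belt_speed_mm_s):
    """Belt-direction position of the part when the robot reaches it."""
    # Distance the belt has moved since the image was taken.
    travelled = (counts_now - counts_at_image) * MM_PER_COUNT
    # The part keeps moving while the robot executes its approach move.
    in_flight = belt_speed_mm_s * robot_move_time_s
    return x_at_image_mm + travelled + in_flight

# Part seen at x = 100 mm; 5000 counts elapse before the move is planned;
# the approach takes 0.25 s at a belt speed of 200 mm/s.
x = predicted_pick_x(100.0, 12000, 17000, 0.25, 200.0)
print(f"{x:.1f} mm")  # 250.0 mm
```

A real controller repeats this prediction every servo cycle, which is why the text's point about CPU bandwidth matters: vision processing and trajectory recalculation are competing for the same processor.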
Remember that the shortest distance between two points may not yield the fastest cycle time. Each kinematic style, and each manufacturer's individual design and tuning strategies, will result in different performance throughout the work envelope.
Consider calibration. And that means everything: the robot, the vision system, and the conveyor belt. Calibration requires access and clearances in and around workcells. Be certain you understand the whole process of calibration, including what happens after servicing a camera, the robot, or a motor.
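One common piece of the calibration puzzle is mapping camera pixels into robot coordinates, typically by having the robot touch a grid of known points and fitting a transform. The sketch below fits a 2D affine transform by least squares; the point coordinates are invented for illustration, and a real system would also handle lens distortion.

```python
import numpy as np

# Corresponding points: where a calibration target appears in the image
# (pixels) and where the robot actually found it (mm). Values are made up.
pixels = np.array([[100, 100], [500, 100], [100, 400], [500, 400]], float)
robot  = np.array([[10.0, 50.0], [50.0, 50.0], [10.0, 80.0], [50.0, 80.0]])

# Solve robot = [px, py, 1] @ A in the least-squares sense (A is 3x2).
P = np.hstack([pixels, np.ones((len(pixels), 1))])
A, *_ = np.linalg.lstsq(P, robot, rcond=None)

def pixel_to_robot(px, py):
    """Map an image coordinate to a robot coordinate using the fitted affine."""
    return np.array([px, py, 1.0]) @ A

print(pixel_to_robot(300, 250))  # center of the grid -> [30. 65.]
```

Note the maintenance point the text raises: swap a camera or a motor and every number in `A` is stale, so the recalibration procedure must be planned into the cell, not bolted on later.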
Model the gripper, in detail, and understand how it will interact with the overall part tolerances. There are a number of ways to design a gripper for an individual part, with widely varying levels of precision and ability to center and secure a part after pickup.
Don't scrimp on lighting and optics. And if you don't understand the engineering issues and physics behind the applications issues, find someone who does. There is nothing more frustrating than reviewing a precision assembly application with low-cost, low-quality lenses and inconsistent lighting. If the application allows backlighting, do it. In spite of the advances in adaptive gray-scale processing, backlighting remains more robust and more accurate than top lighting.
Understand the special lighting issues for conveyor belts. Moving parts blur during the image-acquisition cycle, and precision guidance applications therefore require strobe lighting or shuttered cameras. That means understanding the dynamics involved and calculating the strobe or shutter duration.
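The calculation itself is a back-of-envelope one: blur in pixels is belt speed times exposure time divided by the camera's resolution at the belt. The numbers below (belt speed, pixel scale, blur budget) are illustrative assumptions.

```python
# Illustrative numbers for a motion-blur check on a moving belt.
belt_speed_mm_s = 300.0   # belt speed
mm_per_pixel = 0.1        # camera resolution at the belt surface
max_blur_px = 0.5         # blur budget for sub-pixel part location

# Longest exposure (or strobe flash) that keeps blur within budget:
#   blur_px = belt_speed * exposure / mm_per_pixel
max_exposure_s = max_blur_px * mm_per_pixel / belt_speed_mm_s
print(f"{max_exposure_s * 1e6:.0f} us")  # 167 us
```

At these assumed values the exposure must stay under roughly 170 microseconds, which is why ordinary continuous lighting is rarely enough for precision guidance on a fast belt.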
Select the robot carefully. Payload and reach are obvious, but make sure it has cycle-time headroom to deal with the inevitable surprises in the application. Studies show that 50% of application failures are cycle-time related. Also be sure to consider the robot's design life. Light-duty mechanisms with small bearings, undersized harmonic drives, and overworked motors may run for a while, but a straightforward engineering review will show they will fail early and often. Finally, select the controller properly, based on how well integrated the various technologies are for setup, performance, and support.
Mount the robot properly. This sure seems obvious, but inexpensive mounting hardware flexes under high power moves, and that destroys accuracy. Mounting flex is maddeningly difficult to find once an application is running, since manual or slow cycles run perfectly well, but high speed cycles fail for no apparent reason.
Consider the safety implications early in the project. Meeting CE and ANSI safety standards now requires far more than bolting on a few light curtains. The entire cell layout, access, logic, power and E-Stop circuitry must be engineered up front, with a clear understanding of the requirements and a robot controller that can fully meet the requirements.