
RIA has transformed into the Association for Advancing Automation, the leading global automation trade association of the robotics, machine vision, motion control, and industrial AI industries.

Companies Tweak Vision Software to Robotic Purposes

POSTED 11/01/2002  | By: Winn Hardin, Contributing Editor

Whether it is for part inspection, robot guidance or as a robot cell programming aid, robots are increasingly using automated vision systems to expand their application base thanks to efforts to (1) simplify the robot/vision human-machine interfaces (HMI) and (2) improve communication.

“Machine vision is much easier to use today, more accurate and precise than in the past, so we’ve been making more use of it, switching to vision from hard sensors for various applications. In the past, slight changes in lighting could drastically affect your vision systems, but improvements to the algorithms and overall vision systems have made it less of a factor,” explained Steven Krotzer, applications engineering manager for Staubli Robotics (Faverges, France).

Growing Inspection and Guidance
As each manufacturing line expands to perform a greater variety of assembly processes with a commensurate number of parts, part orientation relative to the robot, and its impact on line speed, is helping to push the need for intelligent, flexible robot guidance.

“We’re seeing an increase in the use of vision for guidance as well as inspection,” said Robin Schmidt, engineering specialist at Nachi Robotics (Novi, MI). “DaimlerChrysler is our biggest customer. They’re using vision increasingly in stamping facilities for incoming part qualification and guidance in order to pick up loosely fixtured parts and enter them into the manufacturing process…. In a stamping environment, you might have a rack of 200 parts nested very tightly. At the beginning of the rack, parts are consistent, but at the back of the rack, they move, slide and twist. You need 3D guidance to pick up the part and move it.”

According to Schmidt, Nachi’s original success at DaimlerChrysler has prompted “50 to 60” other opportunities in stamping and similar applications. A big part of that success, Schmidt said, is due to the simplified HMI in Shafi Inc.’s (Brighton, MI) RELIABOT software. Nachi uses Cognex (Natick, MA) In-Sight vision sensors and the PatMax software library with the RELIABOT front end.

“We’ve seen a jump in vision use after going to a nice user front end that allows shop floor people to maintain the vision systems and identify problems in them. Historically, it’s been very complicated. You had to have an engineer do the maintenance on the system, but Shafi’s user interface is more convenient for the operators,” Schmidt said. Shafi’s RELIABOT software has been ported to Cognex vision systems and to Adept and Staubli equipment.

Reducing the Programming Concern
Simplified HMIs that bring together vision and robot controls require specialized knowledge of both disciplines, something that not all integrators possess. Epson Robotics’ (Carson, CA) Vision Guide 3.0 robot guidance software uses Matrox (Dorval, Quebec, Canada) image processing boards and the MIL image processing library.

“So we’re using the best hardware with our GUI for ease of use,” said Epson’s application engineering manager for robotics, Phil Baratti. “With our system, you spend time doing the application, not trying to program a vision system.”

In the past, Baratti said, integrators would build a one-off system, spending most of their time writing code to interface and handshake between the robot’s motion control system and the vision package. The process was complicated by a lack of a common frame of reference: vision systems work in a flat, pixel-based coordinate system, while robots use a physical, real-world coordinate system. SCARA robots illustrate the problem because their servo-driven joints move in circular arcs, while vision systems think in X, Y coordinates. “Doing that seamlessly is the real key to a total guidance package,” Baratti said.
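The bridge between the two coordinate systems is a calibrated transform that maps camera pixels to robot world coordinates. The article does not describe Epson’s actual method, but a minimal sketch of the general idea, assuming a planar work surface and a handful of calibration points touched by the robot, could look like this:

```python
import numpy as np

# Hypothetical calibration data: pixel coordinates of fiducials seen by the
# camera, and the robot's world coordinates (mm) at the same fiducials.
pixel_pts = np.array([[100.0, 120.0], [400.0, 118.0], [102.0, 380.0]])
world_pts = np.array([[250.0, 50.0], [310.0, 50.5], [250.5, 102.0]])

# Solve for a 2D affine transform, world = [px, py, 1] @ A, by least squares.
ones = np.ones((len(pixel_pts), 1))
P = np.hstack([pixel_pts, ones])                    # shape (N, 3)
A, *_ = np.linalg.lstsq(P, world_pts, rcond=None)   # shape (3, 2)

def pixel_to_world(px, py):
    """Map a camera pixel coordinate to a robot world coordinate (mm)."""
    return np.array([px, py, 1.0]) @ A

# A part located by the vision system can now be handed to the robot
# controller in the robot's own coordinate frame.
target = pixel_to_world(250.0, 250.0)
```

With more than three calibration points, the least-squares fit also averages out small measurement errors; a full guidance package would extend this with lens-distortion correction and, for the 3D stamping case described earlier, a full camera pose estimate.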

After the systems are fully integrated behind the HMI, Baratti added, customers can take full advantage of machine vision’s capabilities, adding inspection tasks to a robot procedure for additional quality checks.

“We have a new geometric pattern matching tool that not only works for guidance, but … can check to make sure the part is viable and properly packaged… Customers can use the tool to check the parts as they come to the robot and see how the parts change from one vendor to another,” Baratti said, indicating this gives the customer greater control over its supply chain and quality control.

Granularity is Key
To Braintech Inc. (North Vancouver, BC, Canada), finding the right mix of granular software components and high-level routines is critical to building a successful vision-guided robot (VGR) platform.

“We used to go through the process of developing one-off software solutions [for robot guidance], but it was very expensive and error prone because each application was different,” explained Braintech’s president, Babak Habibi. “Changes to the process or the conditions on the [production] line were very difficult without destabilizing the entire application. Training was a problem for everyone – integrators and customers – because each application would have a different interface and behavior. Documentation was the same headache.”

Instead of using Visual Basic to call a proprietary executable file, Braintech combines granular image processing functions such as binarization, thresholding and edge detection with robot motion control routines to build modular software components specific to robot guidance.
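The article does not show Braintech’s actual component code, but the idea of composing small, reusable image processing functions into one guidance-specific module can be sketched as follows (function names and the toy edge detector are illustrative, not Braintech’s):

```python
import numpy as np

def binarize(img, threshold):
    """Granular step: threshold a grayscale image into a 0/1 foreground mask."""
    return (img >= threshold).astype(np.uint8)

def edges(mask):
    """Granular step: crude edge map from horizontal/vertical differences."""
    dx = np.abs(np.diff(mask.astype(np.int16), axis=1)).astype(np.uint8)
    dy = np.abs(np.diff(mask.astype(np.int16), axis=0)).astype(np.uint8)
    e = np.zeros_like(mask)
    e[:, 1:] |= dx
    e[1:, :] |= dy
    return e

def centroid(mask):
    """Granular step: centroid of foreground pixels, usable as a pick point."""
    ys, xs = np.nonzero(mask)
    return float(xs.mean()), float(ys.mean())

def locate_part(img, threshold=128):
    """Modular guidance component assembled from the granular steps above.

    Returns the (x, y) pixel location of the part, ready to be converted to
    robot coordinates and passed to the motion control routines.
    """
    mask = binarize(img, threshold)
    _ = edges(mask)   # an edge-based matcher would consume this here
    return centroid(mask)
```

Because each step is its own function, a change to one stage of the process (say, a different thresholding strategy) swaps in without destabilizing the rest of the component, which is exactly the fragility Habibi describes in one-off solutions.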

“It incorporates not just machine vision, but principles and algorithms for 3D geometry, manufacturing process principles and other disciplines. Contrast that with general-purpose machine vision, which is primarily focused on image processing,” Habibi said.

Details Guarantee Success
Braintech’s E Vision Factory (EVF), a communication protocol that lets a company or robot integrator essentially run its own call center, is another example of the way vision is changing robot applications. EVF allows integrators to create their own installation-specific knowledge bases and to remotely access and control the robot cell without opening special holes in the end user’s firewall.

“IT departments don’t want to open holes in their security because they really don’t have as much stake in the manufacturing side of the operation. We were lucky we had a lot of people working in the networking area when we started this development, and they managed to crack the problem of sending and receiving data through corporate firewalls without having to open ports in addition to the standard port 80 for TCP traffic… using standard Microsoft Web and XML standards,” Habibi said.
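The trick Habibi describes is to carry robot-cell data as ordinary XML web traffic on port 80, so a firewall sees nothing but standard HTTP. EVF’s actual message schema is not public; a minimal sketch of the general approach, with illustrative element names and a hypothetical server, might look like this:

```python
import xml.etree.ElementTree as ET

def build_status_request(cell_id, command):
    """Build an XML payload that could be POSTed to a vendor's support server
    over standard port 80, so no extra firewall ports need to be opened.

    Element names and headers here are illustrative assumptions, not
    Braintech's actual protocol.
    """
    root = ET.Element("robotCellMessage")
    ET.SubElement(root, "cellId").text = cell_id
    ET.SubElement(root, "command").text = command
    body = ET.tostring(root, encoding="unicode")
    headers = {"Content-Type": "text/xml"}
    return headers, body

# Sending is ordinary HTTP, indistinguishable from web traffic to a firewall:
#   import http.client
#   conn = http.client.HTTPConnection("support.example.com", 80)
#   conn.request("POST", "/evf", body, headers)

headers, body = build_status_request("stamping-07", "getStatus")
```

Responses travel back the same way, as XML in the HTTP reply, which is why no inbound ports beyond the standard web port ever have to be opened.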

Details like communication and intuitive HMIs that make training machines and people easier will prove the value of a new technology to customers more than an extra 5 cycles or 5 kg ever will. With the ubiquity and power of desktop computing, vision systems and robotic controllers have become far more accessible to a growing number of applications. Whether it’s a completely new software class for vision-guided robotics, remote access software that improves customer relations, or something as small as Staubli replacing photodetector sensors with automated cameras to speed the programming of a robotic cell, the repeated success of vision and robotics will deliver more converts to both technologies.

Braintech's unique Single Camera 3D technology uses a single CCD camera to guide the robot to transfer and position the engine head on the engine block.