


Content Filed Under:

Food & Beverage and Robotics


Vision-Guided Robotics Takes Evolutionary Steps with Revolutionary Benefits

POSTED 07/27/2016  | By: Winn Hardin, Contributing Editor

When it comes to the latest developments in integrating machine vision into vision-guided robotics (VGR), think more in terms of evolution than revolution. But before the disappointment sets in, don’t forget that evolution — making systems more robust while easing system design — is one of the most powerful forces on the planet.

“With the different vision systems and robot controllers, what has become easier to integrate is communications,” says Nick Tebeau, Manager of Vision Solutions at LEONI Engineering Products & Services (Lake Orion, Michigan). In the past, integrators had to deal with DeviceNet or serial communications, both of which could prove cumbersome when it came time to link vision systems with robotics. “But with EtherNet/IP and Profinet both being Ethernet-based protocols, and with virtually every smart camera and PC-based machine vision system able to support them, it opens the floodgates for the myriad [of automation] technologies that can now marry with robots.”
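Ethernet-based links like the ones Tebeau describes reduce the vision-to-robot handshake to ordinary network I/O. As a rough illustration (using plain TCP and a made-up newline-terminated JSON message, not an actual EtherNet/IP or Profinet stack), a controller-side request for a part offset might look like this sketch:

```python
import json
import socket

def request_offset(host: str, port: int, timeout: float = 2.0) -> dict:
    """Trigger a vision system over TCP and read back one part offset.

    Hypothetical wire format: the camera replies with one line of JSON,
    e.g. {"x": 12.3, "y": -4.5, "theta": 1.2} (millimetres / degrees).
    """
    with socket.create_connection((host, port), timeout=timeout) as sock:
        sock.sendall(b"TRIGGER\n")              # request one acquisition
        reply = sock.makefile("r").readline()   # blocking read of one result line
    return json.loads(reply)
```

Real fieldbus protocols add device profiles, cyclic I/O, and diagnostics on top of this, but the core exchange — trigger, then a structured offset back — is the same shape.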

The benefits of better communications can be felt across the VGR marketplace, according to David Dechow, Staff Engineer for Intelligent Robotics/Machine Vision with FANUC America Corp. (Rochester Hills, Michigan). “It means that most robots play very nicely with almost everyone’s vision system,” he says. Whether it’s a third-party private-label system offered by a robot manufacturer or one that’s developed and tightly integrated by the robot manufacturer for low-complexity applications, it matters not.

The more complex the task, the clearer the value proposition for a plug-in solution, says Dechow. The tighter the integration between robot and vision system, the easier it is for designers to correlate the three relevant coordinate systems: robot, vision, and real world. “And we could expand the ease-of-use improvements to interfacing and robot programming too. Whether the vision system was made by the robot manufacturer or a third party, the line separating machine vision guidance from the robotic system is nearly invisible [to the user],” Dechow says. “We can, for example, use the camera as a tool and guide the robot based on the center of the camera image versus using trained tool points, which is the normal way you guide a probe or program robot movement.”
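The coordinate correlation Dechow mentions ultimately comes down to calibrated transforms between frames. As a simplified sketch (the function name and the 2D similarity model are illustrative; real systems calibrate against a target and often need full 3D transforms), mapping a camera pixel into the robot base frame can be written as:

```python
import math

def camera_to_robot(px: float, py: float,
                    scale: float, theta: float,
                    tx: float, ty: float) -> tuple:
    """Map a pixel (px, py) into the robot base frame with a 2D
    similarity transform: scale mm-per-pixel, rotation theta (radians),
    and translation (tx, ty). Parameter values would normally come from
    a calibration routine, not be typed in by hand.
    """
    c, s = math.cos(theta), math.sin(theta)
    rx = scale * (c * px - s * py) + tx
    ry = scale * (s * px + c * py) + ty
    return rx, ry

# e.g. 0.5 mm/pixel, no rotation, camera origin at (10, 20) mm:
pose = camera_to_robot(100, 0, 0.5, 0.0, 10, 20)   # → (60.0, 20.0)
```

An integrated system maintains these transforms (and their 3D equivalents) internally, which is exactly the bookkeeping that disappears from the user's view.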

Communications is just one of the evolutionary enhancements that have made VGR easier to design and operate, according to Steve Wardell, Director of Imaging at ATS Automation (Kitchener, Ontario, Canada). “And that’s great,” Wardell says, “because it’s always been a big integration job to take a robot and vision system and tie it all together. Successful integration is vital to achieving the kinds of returns that justify making the capital investment in the first place. It’s really reduced the amount of effort you need to develop a VGR system, and that’s led to more acceptance [of automated systems]. We tie a lot of systems to a robot manufacturer with custom solutions, but the system that is more tightly integrated with the robot controller often presents the greatest benefit.”

“You can certainly have templates that suggest how to do a nice 2D guidance application,” Dechow says. “But there is definitely a difference between [FANUC’s] iRVision machine vision solution and similar solutions from other robot manufacturers and third-party private-label solutions developed by machine vision companies for robot OEMs. Basically, the more complex the application, the better off you are using a fully integrated solution from the robot manufacturer.”

Integration Considerations
A number of factors make integration easier and result in a system that is easier to use, explains Dechow. For one, he cites iRVision’s depth in vision processes and algorithms developed specifically for VGR applications. Consider, Dechow says, simple point-and-click menu structures and automated transformations that make it possible to handle multi-camera imaging, robot-arm vision, multi-frame imaging, and imaging across multiple robot user frames. “These can be and ultimately will be increased in a lot of robot systems,” he says. Today, these functions are native to a product like iRVision and some of the other vision systems developed directly by robot manufacturers.

“For example, I might want to have a single camera on the end of a robot arm, have it capture multiple images at different locations, and combine the feature offsets to get a highly accurate composite 2D or 3D position,” Dechow says. “It is a real benefit to have the robotic vision system handle all of the position transformations entirely in the background automatically, without the need for complex math and additional calibration.”
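A hedged sketch of that idea: once every per-image offset is expressed in a common frame (the transform work Dechow says the integrated system does in the background), combining them can be as simple as averaging, with the angle averaged on the unit circle to avoid wraparound errors. The function below is illustrative, not FANUC’s actual method:

```python
import math

def composite_offset(samples):
    """Combine per-image feature offsets (x, y, theta) into one composite
    estimate. Assumes all samples are already expressed in a common robot
    user frame; averaging then damps per-image measurement noise.
    """
    n = len(samples)
    if n == 0:
        raise ValueError("no measurements to combine")
    sx = sum(s[0] for s in samples) / n
    sy = sum(s[1] for s in samples) / n
    # Average the angle on the unit circle so values near +/-pi
    # don't cancel incorrectly.
    st = math.atan2(sum(math.sin(s[2]) for s in samples) / n,
                    sum(math.cos(s[2]) for s in samples) / n)
    return sx, sy, st
```

A production system would typically weight samples by measurement confidence or solve a least-squares fit rather than take a plain mean, but the principle is the same: more views, lower uncertainty.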

While there are benefits to using machine vision systems with a long history of solving a specific application — such as a robot OEM’s vision system for VGR — that doesn’t mean that robot OEM–supplied machine vision systems are best for every task, or even every VGR task.

“While you always want the robot to be the master of the vision system,” LEONI’s Tebeau adds, “that doesn’t mean a robot controller can handle every vision task.” Advanced 3D algorithms, for example, are more memory intensive, Tebeau says. In some cases, the best approach is to use a machine vision system with its own processing unit separate from the robot controller.

As things stand today, the robot controller sets the computational priority. A designer can raise the priority of a given process, but vision and motion cannot both run at full speed simultaneously.

“It’s not like you can fully run motion and vision at the same time from a single controller without any hiccups,” Tebeau says. “That’s why, for memory-intensive applications, we suggest using a separate controller for vision.”
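The division of labor Tebeau suggests can be sketched in miniature: hand frames off to a separate vision worker so the motion loop never blocks on a heavy match. Here a Python thread with a placeholder delay stands in for a dedicated vision processor; everything below is illustrative rather than a real controller architecture:

```python
import queue
import threading
import time

def vision_worker(jobs: "queue.Queue", results: "queue.Queue"):
    """Stand-in for a separate vision processor: the heavy work happens
    here, off the motion controller's cycle."""
    while True:
        frame = jobs.get()
        if frame is None:                 # shutdown sentinel
            break
        time.sleep(0.01)                  # pretend: memory-intensive 3D match
        results.put((frame, frame * 2))   # dummy "pose" for this frame

jobs, results = queue.Queue(), queue.Queue()
worker = threading.Thread(target=vision_worker, args=(jobs, results), daemon=True)
worker.start()

for frame_id in range(3):   # the "motion loop" keeps cycling...
    jobs.put(frame_id)      # ...handing frames off without blocking
jobs.put(None)
worker.join()
poses = sorted(results.get() for _ in range(3))
```

On real hardware the queue would be the Ethernet link between robot controller and vision PC, but the decoupling benefit is the same: motion timing is never held hostage to image processing.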

According to Wardell, a company like ATS Automation often becomes involved when a VGR system is designed to accommodate 80% of the applications but falls short for the remaining 20%. In those cases, “you still need someone who understands the overall optics and lighting requirements to make it a good robust system” that delivers the best image, he says. And that expertise doesn’t necessarily come with the software offered by the robot providers, which covers primarily camera interface issues. Work still has to be done to set up the lighting and optics to get the best image, Wardell notes.

As for industrial applications, Wardell says that his company has not had a big presence in food-processing applications. “But we’re seeing more applications coming,” especially in meat handling and meat cutting. “From our perspective, it’s productivity [driven].” And since productivity drives manufacturing success, the marriage between machine vision and robotics is expected to be a long-lasting evolutionary one — regardless of who’s in charge at any given moment.