Industry Insights
3D Machine Vision: Flexibility is the Name of the Game
POSTED 03/11/2004 | By: Winn Hardin, Contributing Editor
Business success is short-lived. Today’s success prompts tomorrow’s question: what have you done for me today? The machine vision industry, unfortunately, is no exception.
Today, simple-to-use smart cameras are one example of the growing ubiquity of machine vision. These systems pack sufficient functionality to meet the majority of standard automated optical inspection (AOI) applications – including 2D inspection, measurement and quality assurance – in an easy-to-use package. However, the real world is multidimensional. Bolstered by the success of machine vision in a 2D world, customers are no longer asking for systems that mimic parts of human vision; they are demanding systems capable of dealing with our entire 3D world. Robotic guidance is one application driving these developments. At the same time, the familiar refrain of the 2D world – make it faster and simpler! – is driving development of 3D technologies, including both standard imaging approaches and laser-based scanning systems.
As this story will show, machine vision engineers are always ready to answer the question: What have you done for me today?
Robots versus hard tooling
One application that goes hand-in-hand with enhanced 3D vision systems is robotic guidance. According to Walt Pastorius, technical and marketing advisor for LMI Technologies (Delta, B.C.), robot guidance is more than just using a sensor to determine the location of a part and provide feedback to a robot controller; these systems provide both guidance and inspection at the same time.
‘‘In Europe’s automotive industry, for example, most production lines have more than one model,’‘ Pastorius said. The same is becoming true in North America, Pastorius added. ‘‘Production lines are becoming more flexible to accommodate more than one product, and hard-tooled fixtures simply cannot provide the necessary flexibility. Four robots can examine any car body and reach all points. The measurements are put into a common coordinate frame, so all measurements reference the same zero point. The measurements are compared to the CAD design, and the vehicle should have the same geometry as the common reference frame,’‘ Pastorius explained.
LMI Technologies produces an optical inspection system that places a laser scanner at the end of a robotic arm to create a work cell that determines 3D object coordinates while performing inspection. Filtering out ambient light is critical to high-speed laser-scanning 3D inspection, Pastorius explained. ‘‘The sensors are designed to minimize effects of ambient light on sensor performance. This is particularly important for robot-mounted sensors, where the robot can move the sensor to any place in the work envelope in essentially any orientation, potentially causing changes in ambient lighting at the sensor,’‘ said Pastorius.
LMI Technologies has built a specially designed sensor housing for the CCD camera, complete with notch filters and an enclosure, to eliminate the influence of ambient light. The sensor acquires images of the laser light as it reflects off the object under test. Distortions in the laser line reveal Z, or offset, values for all points along the laser line.
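The triangulation arithmetic behind this kind of laser-line sensor can be sketched in a few lines. The snippet below is a simplified, hypothetical model: the viewing angle, pixel scale, and profile values are illustrative assumptions, not LMI’s actual sensor geometry.

```python
import math

def line_offset_to_height(pixel_shift, mm_per_pixel, triangulation_angle_deg):
    """Convert the lateral shift of an imaged laser line into a height (Z) value.

    Simple triangulation model: the camera views the laser plane at a known
    angle, so a surface raised by z shifts the imaged line sideways by
    z * tan(angle). Parameters are illustrative, not a real sensor's geometry.
    """
    lateral_shift_mm = pixel_shift * mm_per_pixel
    return lateral_shift_mm / math.tan(math.radians(triangulation_angle_deg))

# A 3D profile is just this conversion applied at every point along the line.
profile_px = [0.0, 1.5, 3.0, 3.0, 1.5, 0.0]   # line shift at each column, pixels
heights_mm = [line_offset_to_height(p, 0.1, 45.0) for p in profile_px]
```

Scanning the sensor across the part (or moving the part under the sensor) stacks these one-line profiles into a full 3D surface map.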
High resolution robotics
Robot manufacturers are also realizing that for manufacturing processes with extremely tight tolerances, perhaps at the limits of the robot’s minimum repeatability specification (a few millimeters) or less, thermal influences may need to be compensated for by the vision guidance system. ‘‘Temperature compensation is typically done by placing known artifacts or targets in the robot’s workspace. These artifacts are measured from time to time, often during idle time, such as when parts are being transferred in and out of the workspace, and software uses this information to compensate for thermal growth. It’s automated, so the user doesn’t need to be involved,’‘ said Pastorius. In some cases, where guidance or measurement applications require only ‘‘relative’‘ information, such as how far one part is from another, thermal expansion of the robot is not important, Pastorius added.
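The compensation scheme Pastorius describes (measure fixed reference targets during idle time, attribute any apparent drift to thermal growth, and subtract it from subsequent measurements) can be sketched as follows. This is a hypothetical single-target simplification; a production system would fit a full 3D transform from several artifacts.

```python
class ThermalDriftCompensator:
    """Track apparent drift of a fixed reference target and subtract it.

    Hypothetical sketch: the vision system periodically measures an artifact
    whose true position is known; any deviation is attributed to thermal
    growth of the robot and removed from later part measurements.
    """

    def __init__(self, nominal_xyz):
        self.nominal = nominal_xyz          # known true position, mm
        self.offset = (0.0, 0.0, 0.0)       # current drift estimate, mm

    def update(self, measured_xyz):
        # Drift = measured - nominal; refreshed during robot idle time.
        self.offset = tuple(m - n for m, n in zip(measured_xyz, self.nominal))

    def correct(self, point_xyz):
        # Remove the estimated drift from a part measurement.
        return tuple(p - o for p, o in zip(point_xyz, self.offset))

comp = ThermalDriftCompensator(nominal_xyz=(100.0, 50.0, 0.0))
comp.update((100.2, 50.1, 0.0))                  # apparent growth: 0.2 mm X, 0.1 mm Y
corrected = comp.correct((250.2, 80.1, 10.0))    # drift subtracted from a part point
```

As the quote notes, the whole loop runs automatically, so the operator never sees the correction happen.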
Smart Cameras Provide Easy Installation and Lower Cost
In another nod to flexibility, manufacturers are combining laser line scanners – a 3D vision technology – with smart cameras. Smart cameras are self-contained units that include the imager as well as the ‘‘intelligence’‘ and related I/O capabilities.
According to Francois Martin, an engineer in the Industrial Vision and 3D Sensors Division of INO (Sainte-Foy, Québec), his company’s Smart Laser Profiler (SLP) is enabling a multitude of 3D inspections in wood and other manufacturing industries. To measure sawdust on a conveyor, INO’s SLP uses a smart camera with an embedded digital signal processor (DSP) to compare the profile of the empty conveyor with the profile of the conveyor covered in sawdust. The area between the two curves multiplied by the speed of the conveyor gives the sawdust production rate. ‘‘Once the application is configured, the system is ready to go. A traditional system would require a PC, a frame grabber and a considerable amount of programming,’‘ said Martin.
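The calculation Martin describes reduces to integrating the cross-sectional area between the two profiles and scaling by belt speed. A minimal sketch, with invented profile values and millimeter units (not INO’s actual algorithm):

```python
def sawdust_rate(empty_profile_mm, loaded_profile_mm, point_spacing_mm, belt_speed_mm_s):
    """Estimate volumetric sawdust flow from two laser-line profiles.

    The cross-sectional area between the loaded and empty conveyor profiles
    (trapezoidal integration across the line) times belt speed gives the
    production rate. Illustrative sketch only.
    """
    heights = [max(l - e, 0.0) for l, e in zip(loaded_profile_mm, empty_profile_mm)]
    # Trapezoidal rule across the profile points.
    area_mm2 = sum((a + b) / 2.0 * point_spacing_mm
                   for a, b in zip(heights, heights[1:]))
    return area_mm2 * belt_speed_mm_s   # mm^3 per second

empty = [0.0] * 5                        # reference profile of the bare belt
loaded = [0.0, 10.0, 20.0, 10.0, 0.0]    # a small pile of sawdust
rate = sawdust_rate(empty, loaded, point_spacing_mm=1.0, belt_speed_mm_s=500.0)
```

In the real system this loop runs on the SLP’s embedded DSP for every captured profile, which is what lets it dispense with the PC and frame grabber.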
Laser Profiling Captures High-Speed Objects
Another use for the SLP is the inspection of wood flooring. INO developed a system that utilizes two SLPs communicating results to a PC with a touch screen through an Ethernet link. ‘‘Profile analysis is performed on the SLPs and requires very simple programming,’‘ Martin added.
The SLP can provide up to 200 lines/profiles per second, each containing 640 data points. All electronic and optical components are contained in a single housing, and a PC is required for the configuration process only. The fastest laser profiling sensors available from INO can capture 900 lines/profiles per second.
For the application in the flooring industry, where tongue-and-groove notches on either side of the board need to be measured for accuracy, INO built a system to inspect the side edges of flooring using two SLPs (one on either side of a high-speed conveyor line). ‘‘We expect the precision of the measurements to be in the range from ±0.001 to ±0.003 inches,’‘ Martin said.
He added that by using technologies such as smart cameras and laser profilers, ‘‘higher speed is coupled with high resolution. While the object is moving, a denser profile capture requires a higher capture rate. For the wood industry, for example, it is necessary to capture small defects like knots and cracks while the piece of wood is moving quite fast, motivating the need for high-speed profilers,’‘ Martin said, adding that this trend toward high-speed 3D inspection is likely to continue across multiple industries.
LEDs: Are They the Future of 3D Lighting?
The emergence of light emitting diode (LED) illuminators is also adding to the arsenal of 3D inspection technologies, according to Kevin Harding, senior optical engineer for General Electric (Schenectady, NY). LEDs, unlike traditional light bulbs, don’t put out a lot of heat, which can be important in an industrial setting where heat can impact the performance of robots and other equipment, Harding said. LEDs also offer lifetimes as much as 10 times longer than a standard light bulb.
In addition to providing a cost savings through more efficient conversion of electricity to light with less heat generation, said Harding, machine vision has benefited from LEDs in another way: LED light configurations can provide more structured light to fully illuminate an entire object. They also can generate specific light colors without the use of filters or coatings.
‘‘Structured light is a method by which the part to be measured is illuminated with a pattern of light, such as lines of light, grid patterns or the like,’‘ said Harding. By viewing this pattern on the parts, 3D data is obtained by triangulation. This method of obtaining 3D data has been the most commonly used in recent years, said Harding, who added, ‘‘LEDs offer the possibility of generating more sophisticated patterns of light [compared to traditional lamps.]’‘
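The triangulation Harding refers to can be illustrated for a single projected stripe: decoding which stripe landed on the surface fixes a ray from the projector, the pixel column fixes a ray from the camera, and intersecting the two rays yields depth. The geometry and numbers below are illustrative assumptions, not any particular gage’s design.

```python
import math

def depth_from_stripe(baseline_mm, proj_angle_deg, cam_angle_deg):
    """Depth by triangulating a projected stripe against a camera ray.

    Projector and camera sit baseline_mm apart, both facing the scene.
    Each angle is measured from that device's optical axis, toward the
    other device. Intersecting the projector ray x = z*tan(a_p) with the
    camera ray x = b - z*tan(a_c) gives z = b / (tan(a_p) + tan(a_c)).
    """
    return baseline_mm / (math.tan(math.radians(proj_angle_deg))
                          + math.tan(math.radians(cam_angle_deg)))

# One decoded stripe / pixel pair; a full pattern repeats this per point.
z_mm = depth_from_stripe(baseline_mm=200.0, proj_angle_deg=30.0, cam_angle_deg=15.0)
```

A grid or multi-line pattern simply repeats this intersection for every decoded pattern element, which is why richer LED-generated patterns translate directly into denser 3D data.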
Most triangulation gages today use lasers, Harding said. When a laser beam strikes an opaque, rough surface, the microstructure of the surface can act like a range of small mirrors, all pointing in different directions.
‘‘These micro-mirrors may reflect the light off in a particular direction, as generally machine marks do, or may direct the light along the surface of the part. Depending on how random or directional the pointing of these micro-mirrors may be, the apparent spot seen on the surface will not be a direct representation of the light beam as incident on the part surface,’‘ Harding explained. The result of this type of laser reflection or ‘‘speckle’‘ is a noisy signal from some surfaces. That’s not a problem with LEDs, he added. LED vision systems can be used ‘‘to view a shiny part, a dull part, a bare metal part. You can acquire more data from more points,’‘ said Harding. Plus, said Harding, LEDs offer a lower cost than lasers, and hence provide for a lower-cost 3D system.
As illustrated above, machine vision continues to leverage technologies developed for consumer and other purposes for the benefit of advanced automation and improved manufacturing. ‘‘Computing systems or PCs, lighting systems, vision systems or cameras, all are becoming faster, better, more flexible and more affordable,’‘ Harding noted. ‘‘The changes are driven by the needs of consumers, but benefit 3D machine vision.’‘