
Machine Vision Application Analysis and Implementation -- Part 5A - Considerations in Acceptance Testing for Gaging and Part Location Applications

POSTED 12/03/2001  | By: Nello Zuech, Contributing Editor

This is the first part in the fifth of a series of articles designed to provide the framework for a successful machine vision system installation. The process described is targeted at companies that are planning the adoption of a machine vision system for the first time or that have a unique application that no one has previously attempted to implement.

As observed in Part 1, today one can find many application-specific machine vision systems for somewhat generic applications in many manufacturing industries. Purchasing these 'off-the-shelf' solutions poses little risk to any first-time buyer. In some cases, one can find application-specific software developed by suppliers of general-purpose machine vision systems, imaging board/frame grabber suppliers, software suppliers, or merchant system integrators. While these are not turnkey packages, the vision experience embodied in them makes a project less risky. Examples of these packages include alignment, OCR/OCV, LCD/LED inspection, BGA inspection, etc.

Even less risky are the turnkey machine vision systems that are industry specific; e.g. bareboard or assembled board inspection for the electronic industry, overlay registration/critical dimension inspection for the semiconductor industry, various industry specific web scanners, systems targeted at food sorting, etc. Virtually every manufacturing industry has these systems, many of which can be identified through the resources of the Automated Imaging Association.

Table 1 - Application Analysis and Implementation

Systematic planning
Know your company
Develop a team
Develop a project profile
Develop a specification
Get information on machine vision
Determine project responsibility
Write a project plan
Issue request for proposal
Conduct vendor conference
Evaluate proposals

Conduct vendor site visits

Issue purchase order
Monitor project process
Conduct systematic buy-off

Where these 'solutions' are not the answer and a machine vision application has been identified, success requires proceeding systematically and not treating the purchase as if one is purchasing a commodity item. It is not sufficient to send good and bad parts to various vendors and ask them if they can tell the difference.

The above table depicts the process that should be used as one proceeds with the deployment of a machine vision system that is uniquely defined for a company. In Part 2 we covered a process for how to assess an application's requirements with ideas of what should be included in a functional specification. In Part 3 we covered how to get information on machine vision, defining project responsibility, writing a project plan, completing a bid document and conducting vendor conference. In Part 4 we covered evaluating proposals, conducting vendor site visits, issuing a purchase order and monitoring project progress.

In this part we cover the topic in bold - conduct systematic buy-off. Because the article grew too long, however, this part deals with acceptance testing for gauging and part location applications, and the next part will deal with attribute-related applications.

The first part of any systematic buy-off requires documenting that the system operates as defined by the specifications. Each of the elements of the system must be tested - switches, camera, lighting, PLCs, shift registers, etc. Testing should include checks on:

  • Confirming components being used within ratings
  • Sequence functions
  • Interlock/alarm functions
  • Data acquisition functions
  • Shut down/safety functions
  • Report functions
  • Communications functions
  • Graphics
  • Alarm actions
  • Mass storage
  • Interface testing
  • Downloading and saving
  • Password protection
  • Etc.

This stage of the testing should verify that the system and its components consistently operate as outlined in the specification. This mechanical/electrical testing should then be followed by a structured acceptance test.

The nature of the acceptance test will depend on the nature of the application. If the application involves dimensional measurements, acceptance testing is rather straightforward. Accuracy and repeatability should be demonstrated. Absolute accuracy in making measurements refers to how well the measuring instrument's (machine vision system) readings match the actual dimensions of a calibrated standard part being measured. It is the average difference between the true value and the reported value.

Ideally, the calibrated piece should be similar in shape to the objects that the machine vision system will normally measure in production, and its measurements should in turn be traceable to the National Institute of Standards and Technology (NIST). In actuality there should be three calibration pieces: one reflecting the lowest tolerance dimensional limit, one the highest, and one the nominal or center of the tolerance range. Depending on the application, a calibration piece can itself be quite expensive. Conditions such as temperature have to be monitored when taking measurements to determine gage accuracy. A formal calibration program is the only way to ensure the accuracy of a machine vision system; this is essential for achieving and maintaining ISO 9000 certification. If the machine vision system is not properly calibrated, a systematic variation or bias may result.

In any event, by making N measurements on the known calibrated piece, one can calculate accuracy as

  Accuracy = (1/N) Σ (T.V. - xi),  i = 1 to N

where T.V. refers to the true value and xi to the i-th measurement. Ideally, accuracy should be calculated on calibrated parts representing the lowest tolerance value, the nominal or middle tolerance value, and the highest tolerance value. Each of the calibration pieces should be measured at least ten times.

Repeatability, also referred to as precision, refers to how accurately a measurement can be duplicated or repeated. Mathematically it is the standard deviation of the measurement error - how closely multiple measurements of the same feature cluster around their mean.


  Repeatability = √( Σ (xi - xavg)² / (N - 1) ),  i = 1 to N

where xavg is the mean of the N measurements.
In this case one should use a set of measured parts whose measurements have been made in a manner consistent with traceability to NIST. The set should include parts reflecting the full tolerance range - the lower tolerance limits on the dimensions, nominal, upper, and points in between. Ten parts could be used as the set, with each part run through the machine vision system 30 times, fixtured exactly as it would be during production operation. Repeatability can then be calculated from the above formula for each part and for each dimension measured on the part.
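Continuing the same sketch (again with hypothetical data), repeatability is the sample standard deviation of repeated readings:

```python
import math

# Repeatability: sqrt( sum((x_i - mean)^2) / (N - 1) ), the sample
# standard deviation of repeated measurements of the same feature.
def gage_repeatability(readings):
    n = len(readings)
    mean = sum(readings) / n
    return math.sqrt(sum((x - mean) ** 2 for x in readings) / (n - 1))

# Hypothetical: one part measured 30 times; per the procedure above,
# each of the ten parts in the set would be run this way.
runs = [1.0000 + 0.0001 * (i % 3 - 1) for i in range(30)]
sigma = gage_repeatability(runs)
```

The same function would be applied per part and per dimension across the ten-part set.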

Most machine vision applications run on dedicated lines, and although a system may have more than one operator, no changes are made when one operator replaces another; consequently, for machine vision systems used in gaging applications, the issue of measurement reproducibility between operators largely goes away. However, if physical changes are made when a new operator takes over the line, reproducibility also has to be demonstrated. Reproducibility is the variation in the average of the measurements made by different operators using the same machine vision system to measure identical characteristics of the same parts. It can be determined by having the different operators conduct the repeatability test and assessing the distribution of the average range by operator.

Comparing the gaging variation to the total tolerance of each of the characteristics measured can be done using the following formula:

Gage capability index (GCI) = (6σ of the gage system) / (total tolerance)

If GCI is less than 0.1 or 10 percent of the part tolerance, one can be confident that the machine vision gaging system is not negatively affecting results.
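As a sketch with made-up numbers (the function name and values are illustrative):

```python
# Gage capability index: six standard deviations of the gage system
# as a fraction of the total part tolerance.  A value under 0.1 means
# the gage consumes less than 10% of the tolerance band.
def gage_capability_index(gage_sigma, total_tolerance):
    return 6.0 * gage_sigma / total_tolerance

# Hypothetical: gage sigma = 0.0001", total tolerance = 0.010".
gci = gage_capability_index(0.0001, 0.010)  # 0.06
acceptable = gci < 0.1
```

Here the gage consumes 6% of the tolerance band, so it would not be a concern by the 10% criterion.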

Another concern with using machine vision for gaging applications is drift over time and temperature. Temperature drift may be due as much to the effect a change in temperature has on the part or the fixturing as to its effect on the machine vision system itself. While some may argue that changes in ambient temperature cause offsetting errors, this is not likely: parts, fixtures, and machine vision hardware usually expand or contract at differing rates as temperature changes because they have different effective coefficients of expansion. Where dimensionally accurate and repeatable measurements are critical, temperature compensation may be required. This may mean mounting temperature sensors to monitor the temperature of the part being gauged, the fixtures, and the machine vision system.


When the tolerance ratio approaches 0.1%, temperature can be a factor in shop-floor measurement accuracy. For example, if a steel component has a nominal dimension of 3.0000" +/- 0.0005", the tolerance ratio would be (0.001/3.0000) x 100 = 0.033%, which is less than 0.1% and therefore suspect. A 10° F temperature change will cause the dimension to vary by at least 20% of the total tolerance.
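The 20% figure can be checked with the linear-expansion formula ΔL = αLΔT, assuming a typical handbook coefficient for steel of about 6.5 x 10⁻⁶ per °F (my assumed value; it is not stated in the article):

```python
ALPHA_STEEL = 6.5e-6  # per deg F; typical value for steel (assumed)

def thermal_growth(length_in, delta_t_f, alpha=ALPHA_STEEL):
    # Linear expansion: delta_L = alpha * L * delta_T
    return alpha * length_in * delta_t_f

growth = thermal_growth(3.0000, 10.0)  # about 0.0002" on a 3" part
share = growth / 0.001                 # fraction of the 0.001" band
```

With these numbers the 10° F change consumes roughly a fifth of the total tolerance, matching the article's estimate.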

Until now we have avoided the term resolution. Resolution in gaging applications is the smallest interval that can be distinguished between two measurements (also referred to as discrimination). Relating this to machine vision resolution requires a fundamental understanding of how a computer samples and quantizes a television image. Understanding what happens is relatively straightforward if one understands that the TV image is very analogous to a photograph.

The computer operating on the television image in effect samples the data in object space into a finite number of spatial (2D) data points which are called pixels.  Each pixel is assigned an address in the computer and a quantized value, which typically varies from 0 to 255. The actual number of sampled data points is going to be dictated by the camera properties, the analog to digital converter sampling rate, and the memory format of the picture buffer or frame buffer as it is called.

Today, more often than not, the limiting factor is the television camera being used. Since most machine vision vendors today use cameras with solid-state photo sensor arrays on the order of 500 by 500, one can make certain judgments about an application just by knowing this figure and assuming each pixel is approximately square. For example, given that the object you are viewing takes up a one-inch field of view, the size of the smallest piece of spatial data in object space will be on the order of 2 mils, or one inch divided by 500. In other words, the data associated with a pixel in the computer will reflect a geographic region on the object on the order of 2 mils by 2 mils.

In the case of making dimensional measurements with a machine vision system one can consider the 500 pixels in each direction as if they were 500 marks as on a ruler.  Significantly, just as in making measurements with a ruler a person can interpolate where the edge of a feature falls within lines on a ruler, so, too, can a machine vision system.  This ability to interpolate, however, is very application dependent.  Today the claims of vision companies vary all the way from one fourth of a pixel to one tenth or one hundredth of a pixel.  For purposes of a rule of thumb, you can use one tenth of a pixel. So for a machine vision system that uses a camera with a nominal arrangement of 500 X 500 pixels and a 1/10 sub-pixel capability, one has the equivalent of 5000 markings on a 'ruler' that can be applied across a part. The size of the distance between sub-pixels is application dependent. Nevertheless, this distance is equivalent to the resolution/discrimination of the machine vision-based gauge.

What will this mean in conjunction with a dimensional measuring application?  Metrologists have their own rules of thumb for measuring instruments. One often-used rule is that the accuracy and repeatability of the measurement instrument should be ten times better than the tolerance range associated with the dimension being checked. Another way to say this is that as long as the combined inaccuracy and imprecision of the measurement instrument is less than 10% of the tolerance on the dimension being measured, the error can be disregarded.

So how does one establish what the repeatability of a vision system should be?  Given the sub-pixel capability of one tenth of a pixel mentioned above and, as in the example, an object that is one inch on a side, the resolution/discrimination (the smallest change in dimension detectable with the measuring instrument) associated with the machine vision system as a gauging tool would be one tenth of the smallest spatial data point (2 mils), or 0.0002". Repeatability will typically be +/- the resolution value, or 0.0002".

Accuracy, which is determined by calibration against a standard, can be expected to run about the same.  Hence, the sum of accuracy and repeatability in this example would be 0.0004".  Using the ten-to-one rule, the part tolerance should be no tighter than 0.004" for machine vision to be a reliable metrology tool.  In other words, if your part tolerance for this size part is on the order of +/-0.002" (a total tolerance band of 0.004") or greater, the vision system would be suitable for making the dimensional check.
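The pixel-size, sub-pixel, and ten-to-one arithmetic above can be sketched as follows (using the article's example numbers; the 1/10 sub-pixel factor is the rule of thumb, and the function name is mine):

```python
# Resolution of a vision gage: field of view / pixel count, scaled by
# the sub-pixel interpolation fraction (rule-of-thumb 1/10 pixel).
def vision_resolution(fov_in, pixels, subpixel=0.1):
    return (fov_in / pixels) * subpixel

res = vision_resolution(1.0, 500)    # 0.0002" for a 1" field of view
repeatability = res                  # typically +/- one resolution step
accuracy = res                       # calibration typically comparable
combined = accuracy + repeatability  # 0.0004"
min_tolerance = 10 * combined        # ten-to-one rule: 0.004" total band
```

Swapping in a 1000 x 1000 camera or an 8000-pixel line scan array changes only the `pixels` argument, which is why higher-resolution sensors extend the approach to larger parts or tighter tolerances.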

As you can see, as the parts become larger with the same type of tolerances, machine vision systems based on cameras with nominally 500 x 500 resolution might not be an appropriate means for making the dimensional check.  The same is true if the tolerances are tighter. Fortunately, machine vision systems are available today with cameras of nominally 1000 x 1000 resolution. In even more demanding applications, line scan cameras can be used with motion to either move the camera over the part or the part under the camera. Line scan cameras are available with up to 8000 pixels. With a one-part-in-ten sub-pixel capability, this essentially yields a 'ruler' with 80,000 markings that can be applied across a part.

Using machine vision to perform a part location function, one can expect to achieve basically the same results as in making dimensional checks.  That is, most vendors whose systems are suitable for performing part location claim an ability to perform that function to a repeatability and accuracy of +/- one tenth of a pixel.  Using our example again, namely a one-inch part, one would be able to use a vision system to find the position of that part to within +/-0.0002". Most part location applications involve motion or placement.

Acceptance testing for a machine vision system in a part location application should be performed in a manner similar to that described for a gauging application. Placement accuracy refers to how close the found location of a pattern or part feature is to where it is supposed to be; it is the difference between the actual placement location and the desired placement location. Accuracy should be determined using a standard whose properties have been measured in a manner traceable to NIST. Again, system accuracy should be validated at the upper and lower limits of the placement as well as at nominal. Repeatability refers to how well the actual placement returns to a taught location. Repeatability should be demonstrated on a number of parts, each time removing them from any fixture arrangement and returning them to the fixture.

Methods must be established to routinely calibrate the system and validate the accuracy and repeatability of the machine vision-based gauging station or machine vision system used for part location. Calibration serves two functions: 

  1. To determine the difference, or amount of error, between the unknown and known readings
  2. To adjust the output of the machine vision system to bring it to the desired value.

This should be standard practice whenever line changes are made, especially any that affect the optical path and/or scene of the machine vision system. Even if the line is dedicated and changes are never made, a procedure should be in place to routinely calibrate the system and periodically demonstrate its accuracy and repeatability. Calibration may be required once a day. If calibration suggests that nothing has changed, a full-blown accuracy and repeatability test may be required only once per month. Most Quality Assurance departments have standard procedures and schedules for conducting gauging system calibration and accuracy and repeatability checks; they should always be consulted about these matters.
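The two calibration functions listed above can be sketched as a simple offset correction (function names and values are illustrative, not from the article):

```python
# Step 1: determine the error between the known standard and the
# machine vision system's reading of it.
def calibration_offset(known_value, measured_value):
    return known_value - measured_value

# Step 2: adjust the system's output to bring it to the desired value.
def corrected_reading(raw_reading, offset):
    return raw_reading + offset

offset = calibration_offset(1.0000, 0.9997)   # system reads 0.0003" low
adjusted = corrected_reading(0.9997, offset)  # back to 1.0000"
```

In practice the offset would be logged each time, since a drifting offset is exactly the signal that a full accuracy and repeatability test is due.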

Having said all this about using machine vision systems for gauging applications, one has to be attentive to how the measurements are being made by the machine vision system. Unlike a contact gauge that actually provides the peak dimension over the area where the contact is being made, the edge pixel serving as the basis of a machine vision system measurement will often be microscopic and discrete. Hence, averaging the data of a number of contiguous pixels along an edge may be more like making measurements with a contact gauge. A contact gauge will generally ignore chamfers and rounded edges. A machine vision system has to be programmed to ignore those features.

Reliable performance of a machine vision system also requires attention to keeping lenses clean so debris does not affect pixel edge assignment. Ideal machine vision-based gauging applications are those in which the features to be measured can be observed in the silhouette of the object; in these cases sharply defined edges exist. Most critical gauging applications can also benefit from the use of telecentric lenses, which keep feature sizes and shapes constant regardless of movement between the part and the camera/optics stemming from vibration, positional uncertainty, etc. Similarly, collimated light arrangements can reduce the effects of light bouncing off surface edges, which might otherwise contribute to measurement errors through misassignment of pixels to boundary positions. By paying attention to application details, one can guarantee quality measurements with machine vision technology.

