Machine Vision and Lighting
POSTED 01/06/2003 | By: Nello Zuech, Contributing Editor
As in real estate, where the key to successful investment is location, location, location, in machine vision the key to success is LIGHTING! LIGHTING! LIGHTING! Significantly, the key to success with lighting is THE LOCATION OF THE LIGHTING! The principal reason machine vision succeeds is that appropriate, application-specific lighting eliminates appearance variables and yields a consistent scene. Unlike the early days of machine vision, when many of the entrepreneurial researchers at pioneering machine vision companies suggested, ‘‘We just need an image; our image processing and analysis algorithms will work for your application,’‘ today the surviving companies acknowledge the importance of lighting and scene consistency.
Today, with the industry's more than 30 years of experience in applying machine vision, many ‘‘canned’‘ lighting arrangements have been developed that suit applications widely deployed in one manufacturing industry or another. Nevertheless, for many applications one must still understand the application - appearance variables, geometry, surface issues, specular issues, positional variables, etc. - as well as the effect of lighting sources (incandescent, fluorescent, quartz halogen, HID, xenon, LED, etc.), lighting arrangements (back, side, front, structured, etc.) and lighting geometry (point, diffuse, collimated, etc.) to optimize the lighting and achieve image consistency.
This article is not intended to be a ‘‘Fundamentals of Lighting Theory for Machine Vision Applications’‘ article. One can find such fundamentals in various already published articles, the book ‘‘Understanding and Applying Machine Vision’‘ published by Marcel Dekker, at various meeting forums and even the catalogs of the lighting suppliers. Rather, this article is based on insights shared by some of the major vendors of lighting for machine vision applications, in a round table format. Those that contributed to this article were:
- Harlen Houghton and John Merva – Advanced Illumination
- Jim O’Hanley – CCS America
- Mike Muehlemann – Illumination Technologies
- Peter Niedzielski and Paul Karazuba – Perkin Elmer Optoelectronics
- Joe Smith – Schott-Fostec
- Luc Many and Alain Beauregard – StockerYale
1. How would you segment the lighting technology used in machine vision?
Most answered that lighting technology is segmented by lighting type: incandescent, fluorescent, LED, quartz halogen and other halide-based lamps, and lasers.
Alain Beauregard suggested an application-dependent segmentation - 2D vs. 3D - and Mike Muehlemann distinguished between lighting for 2D applications and lighting for 1D applications, like web scanning. Peter Niedzielski suggested the broad classes of continuously-on and pulsed.
2. How would you segment the lighting techniques used in machine vision?
Mike summarized the opinions of most, ‘‘In simple terms we look to define applications in the following order:
1) Front Lighting or Backlighting
2) Brightfield/Darkfield/Both (CDI or Dome)
3) Geometrical Structure Required to Enhance or Minimize Contrast
4) Spectral Properties Required to Enhance or Minimize Contrast
Joe Smith added bright field coaxial under front lighting, and Peter Niedzielski suggested the distinction was between
- ‘‘Direct Illumination: Backlighting, Front Lighting, UV Lighting, Structured Lighting.
- Fiber-Optic Illumination: Ring lights, Line Lights, Structured Lighting, Collimated, Backlights.
- Diffuse Illumination: On Axis and Tenting.’‘
While these overlap somewhat, Jim O’Hanley suggested,
‘‘This is a bit more difficult question, as there are many more techniques, and some do not apply across all technologies. Very broadly speaking, I would list the following:
- Low Angle
- Dark Field
In addition to these, you have many additional techniques or ‘‘tricks of the trade’‘ that can further enhance the image. Examples of these would be
- Sharp cut or band pass
- Color spectrum (utilizing different wavelengths to enhance or remove certain characteristics)
John Merva added color, or wavelength, as another class to consider.
3. What advice regarding lighting would you give to someone investigating a machine vision application?
Harlen Houghton observes, ‘‘Start with lighting. Before choosing a camera or designing an inspection station, determine if lighting is capable of fulfilling your needs, and if so, what the best type of lighting is for your application. Choosing the lighting first eliminates headaches later when you are attempting to make a light work in restricted space or trying to overcome some other problem associated with the work environment.’‘ John suggests, ‘‘Research your lighting needs by asking suppliers to provide sample images and recommendations before you purchase anything else or finalize the installation footprint.’‘
Mike makes the following points:
‘‘1) Define Geometrical Structure that drives contrast in the direction required
2) Look for Spectral properties that can enhance contrast values if possible
3) Select all technologies that can deliver the above advantages
4) Establish intensity requirements per application
5) Evaluate lifetime and stability parameters based on remaining technologies
6) Select lighting technology based on final engineering trade-offs’‘
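Mike's ordering above can be read as a filter-and-rank pass over candidate technologies. The sketch below is a hypothetical illustration: the `candidates` list, its attribute values, and the `select_lighting` helper are invented for the example, not vendor data.

```python
# A minimal sketch of the selection order above: establish the intensity
# requirement (step 4), then weigh lifetime/stability trade-offs (steps 5-6).
# All attribute values here are placeholder assumptions, not specifications.
candidates = [
    {"tech": "LED",            "intensity_lux": 50_000,  "lifetime_h": 50_000, "stable": True},
    {"tech": "quartz halogen", "intensity_lux": 300_000, "lifetime_h": 2_000,  "stable": False},
    {"tech": "fluorescent",    "intensity_lux": 20_000,  "lifetime_h": 8_000,  "stable": False},
]

def select_lighting(required_lux, candidates):
    # Keep only technologies that meet the intensity requirement.
    viable = [c for c in candidates if c["intensity_lux"] >= required_lux]
    # Prefer stable sources, then longer lifetime (the engineering trade-off).
    viable.sort(key=lambda c: (not c["stable"], -c["lifetime_h"]))
    return viable[0]["tech"] if viable else None

print(select_lighting(30_000, candidates))  # → LED
```

The geometric and spectral steps (1-3) resist this kind of tabulation; in practice they come from bench testing, as Peter notes below.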
Peter summarizes, ‘‘Don’t choose a lighting method based on light measurement claims from a manufacturer; they can be used as guidelines only. The best method is actual empirical testing using the actual product. Make sure you consider lighting up-front in the application process. Do not leave it to the end. The key to a successful MV application is to start with a good-contrast, repeatable image that is not affected by ambient light or the surroundings.’‘
4. What does one applying machine vision have to know about their application (scene) that can influence the selection of the specific lighting arrangement? Or How does one decide what lighting type/technique is most suitable for a specific application?
Peter counsels, ‘‘There are many things to consider.
- ‘‘What is the size of the part you are imaging? This will partially determine the type of lighting you will need. If you are looking at a 1-inch square part, you could use LED or fiber optic lighting. If you are looking at a 3-foot square part, you would most likely use some type of large area direct illumination system.
- Will the part be moving or static when the image is captured? If it is static, you could use some form of continuous illumination, which would be most cost-effective. If the part is moving, a strobe source may be needed to help freeze the motion.
- Is it a color or black & white/gray scale application?
- What is the shape of the part? Typically you try to match the lighting technique with the shape of part. Circular or round objects are well suited to ring lights. Square or rectangular parts require direct illumination or line lights.
- What is the surface texture? Shiny/glossy, matte, diffuse, etc.
- What measurement or defect are you trying to determine? Example: If you are looking for dimensional information – back lighting is excellent. If you are looking for surface defects on a shiny surface – Oblique lighting or darkfield illumination works best.’‘
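Peter's point about strobing a moving part can be made concrete with a back-of-envelope calculation: smear in pixels equals part speed times exposure divided by one pixel's footprint in the scene. The conveyor speed, field of view, and resolution below are illustrative assumptions, not values from the article.

```python
# Rough strobe-duration estimate for freezing motion. Numbers are assumed.
part_speed_mm_s = 500.0   # conveyor speed
fov_mm = 100.0            # field of view across the sensor
pixels = 1024             # sensor pixels spanning that field of view
max_blur_px = 1.0         # allowable smear during the exposure

mm_per_px = fov_mm / pixels
# blur_px = speed * exposure / mm_per_px  =>  exposure = blur_px * mm_per_px / speed
max_exposure_s = max_blur_px * mm_per_px / part_speed_mm_s
print(f"strobe pulse <= {max_exposure_s * 1e6:.0f} µs")  # → strobe pulse <= 195 µs
```

If the required pulse is shorter than a continuous source plus shutter can deliver with adequate signal, that is the cue for a strobe (xenon or pulsed LED).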
And John indicates, ‘‘Desired inspections, FOV, stand-off limitations, process speed, part variations and special environmental issues all impact the lighting needed. Either you have enough experience that you understand the analysis and recommendation process, or just ask the leading suppliers for help. Lighting company application engineers have much more experience than even vision company application engineers and can help you make the right choice.’‘ To which Harlen adds, ‘‘Approach the inspection environment with the understanding that a vision system requires consistency in order to function properly. Ambient lighting is a potential problem; human interaction in the area of the work site might eliminate the use of a bright light (IR, of course, could be used in this instance); other environmental factors such as space limitations, dirt, etc.’‘
While Mike summarizes, ‘‘The primary factors that influence selection are whether the object under inspection is
1) Flat or curved
2) Absorbing, transmissive or reflective
3) And the nature of the feature to be imaged in comparison with the background.’‘
5. What are the properties of lighting that can influence the selection of specific machine vision lighting arrangement?
Harlen suggests ‘‘Intensity, wavelength, speed of inspection and the ability of the light to be synchronized with the camera and production line. Working distance from light to object. Field of view.’‘ To which John adds ‘‘Geometry or structure and wavelength or color.’‘
While Jim observes ‘‘there are six common properties that the user should consider:
- Response speed
- Cost vs. performance’‘
Mike, while agreeing on geometrical structure and spectral properties as the two main components also adds ‘‘Two additional properties, polarization state and fluorescence properties, are relatively minor for the majority of machine vision applications. However, they can be used to solve inspection problems in a few strategic places where structure and spectral properties are insufficient to provide the desired contrast.’‘ And Peter adds ‘‘Continuous or strobe illumination, color temperature, intensity.’‘
Alain Beauregard, speaking from the perspective of using lasers in structured light arrangements, observes, ‘‘Directionality, wavelength, spectrum, speckle, intensity, source type (extended or point source).’‘
6. Does camera influence lighting? How?
Alain notes ‘‘Yes, mainly because of the camera optics (depth of field, aperture and resolution) and the actual detector (CCD or CMOS) optical properties. The Scheimpflug condition is rarely met in off-the-shelf cameras, to the detriment of the vision system’s performance.’‘
Joe comments, ‘‘Line scan vs. area scan cameras affect the lighting. The projected lighting shape needs to match the camera array shape.’‘
Paul Karazuba suggests, ‘‘Different cameras are sensitive to different wavelengths, based upon the construction and architecture of the camera. For example, CCD-based cameras are most sensitive around the 580nm region. Accordingly, lighting which produces a high quantity of illumination around that wavelength allows the full capabilities of the camera to be used.’‘ While Mike points out, ‘‘Variations in camera and camera lens can alter the effectiveness of the lighting. Spectral response of the camera and spectral throughput of the lens can alter the contrast produced by the part under inspection. Similarly, the numerical aperture of the system can limit the effectiveness of the incoming structure. The primary problem here is that success, as measured by human eyes or another camera-lens setup, does not translate to success with a secondary camera and lens system.’‘
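The spectral-matching point Paul and Mike make can be sketched numerically: weight the source's emission spectrum by the sensor's response and sum. Everything below - the Gaussian stand-ins for CCD response and LED emission, and the `effective_signal` helper - is invented for illustration, not measured data.

```python
import math

def gaussian(x, mu, sigma):
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2)

wavelengths = range(400, 701, 10)  # nm, visible band sampled every 10 nm

# Made-up spectra: a sensor response peaking near 580 nm, and two LEDs.
ccd_qe   = {w: gaussian(w, 580, 80) for w in wavelengths}
red_led  = {w: gaussian(w, 630, 15) for w in wavelengths}
blue_led = {w: gaussian(w, 470, 15) for w in wavelengths}

def effective_signal(source, qe):
    # Discrete overlap "integral": how much of the emitted light the sensor sees.
    return sum(source[w] * qe[w] for w in qe)

# A red LED couples better to this sensor model than a blue one.
print(effective_signal(red_led, ccd_qe) > effective_signal(blue_led, ccd_qe))  # → True
```

The same overlap calculation works in reverse: given a fixed source, it suggests which sensor (or filter) best exploits the available light.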
Jim suggests, ‘‘Cameras most definitely influence lighting. CCDs have unique spectral sensitivity characteristics, which means that they are more sensitive at some wavelengths than others. Many times this may be used to the user’s advantage. An example of this is a CCD camera that is sensitive in the near-IR range. In a case like this, it is possible to light the target with an IR light, which is virtually invisible to a human operator. This is frequently done in the real world to eliminate the distraction of a light to the operator.
There are two new technologies that are also having an effect on lighting technology. The first is TDI (Time delay and integration) for line scan cameras such as the Piranha line from DALSA. This technology has increased the sensitivity of line scan cameras to light up to 100 times vs. traditional line scan cameras. The end result has been to finally open up line scan applications to LED technology.
The second technology that will have an effect is CMOS. Many camera manufacturers are beginning to introduce CMOS cameras which, although they have many advantages, have a bit of a disadvantage from a lighting perspective. Our experience with them is that they are considerably less sensitive to light (10-20% has been mentioned). This results in slightly different considerations when specifying lights for this technology.’‘
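Jim's TDI remark follows from simple arithmetic: a TDI line-scan sensor integrates the same object line over N stages, so effective exposure scales by N. The line rate and stage count below are assumed round numbers, not DALSA specifications.

```python
# Back-of-envelope view of the TDI sensitivity gain. Values are assumed.
line_rate_hz = 20_000   # lines captured per second
tdi_stages = 96         # hypothetical TDI stage count

single_line_exposure_s = 1 / line_rate_hz
effective_exposure_s = tdi_stages * single_line_exposure_s
gain = effective_exposure_s / single_line_exposure_s
print(f"effective exposure gain: {gain:.0f}x")  # → effective exposure gain: 96x
```

A sensor with ~100 stages thus approaches the "up to 100 times" figure Jim cites, which is what brings LED intensities into range for line-scan work.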
7. Do optics influence lighting? How?
John suggests, ‘‘Optics should be chosen to complement the camera sensor and lighting chosen.’‘ And Jim adds, ‘‘The biggest influence optics will have on lighting is that the higher the magnification, the more light that is necessary.’‘ While Peter advises, ‘‘ABSOLUTELY. You could write an entire thesis on this topic. Light output, stability, image clarity, repeatability, etc. Do not underestimate the effect optics can have on success and failure in a system.’‘
And again from the perspective of using lasers in structured light applications Alain observes ‘‘Yes, more than the detector type. It is critical in many applications, especially with lasers and speckle noise.’‘
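Jim's rule of thumb that higher magnification needs more light can be quantified for a simple lens: the working f-number grows as N(1+m) with magnification m, so image-plane irradiance falls roughly as 1/(1+m)^2 at a fixed aperture. A minimal sketch, assuming a thin lens with unit pupil magnification:

```python
# Relative image-plane irradiance vs. magnification at fixed aperture,
# under the thin-lens approximation (working f-number ~ N * (1 + m)).
def relative_irradiance(m):
    # Irradiance scales as (N / N_eff)^2 = 1 / (1 + m)^2, independent of N.
    return 1.0 / (1.0 + m) ** 2

for m in (0.1, 1.0, 4.0):
    print(f"m={m}: {relative_irradiance(m):.2f}x light at the sensor")
```

At 1:1 imaging only a quarter of the low-magnification light reaches the sensor, and at 4x only 1/25th, which is why high-magnification stations so often need fiber or high-output LED sources.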
8. Within the last year, have you introduced something new in lighting for the machine vision market? If so, please describe and give details.
StockerYale: Thermo-electric cooled laser with high wavelength stability allowing the use of very selective filters for improved signal-to-noise ratio
Schott-Fostec: In the last year we have introduced smart LED lighting. Our LED head has light feedback and temperature control technology built right on the heads. Using these heads in conjunction with our new microprocessor based SC2100 controller provides the functionality needed to solve a wide variety of LED lighting applications.
Illumination Technologies: Introduced the 4900 light source series in 2002. The product is the first white light fiber optic source, which can be easily calibrated to NIST traceable standards in the field. The unit provides the industry's first Ethernet interface for seamless integration into plant automation systems and OEM product offerings. There is a detailed product sheet on the website.
CCS America: Introduced the HLV Series of LED spotlights. The HLV Series is designed to supplant halogen lights, particularly for the area scan, line scan, 2D code reading and micro-precision markets. The HLV Series combines the advantages of LED lighting (long life span, low heat generation, selectable color) with the intensity of halogen fiber lighting (300,000 lux) to open new markets for LED lighting solutions.
Advanced Illumination: More standard variations of LEDs with different colors, working distances and geometries. A robust selection of different lights is key to solving a wide variety of applications.
9. What are advantages/disadvantages of fluorescent, LED, quartz halogen, HID, Xenon, etc.?
Click Here to view a table that reflects a composite of the inputs from all the respondents.
10. How important is feedback control?
There was no consensus in the answer to this question. It is safe to say that the need for feedback control is application dependent as some suggested it is very important for machine vision applications and others suggested it was not necessary. Joe suggested, ‘‘We feel it is very important. We have feedback technology available for both our fiber optics and LED systems. Stable light is required on about 5% of the applications in the field.’‘ Mike amplified this further ‘‘Feedback control is only important in applications where long term stable light output is required. These include calibration applications (CCD & CMOS sensors, LCD displays, reflectance, colorimetry, etc.) as well as multi-sourced balanced systems typical in the web illumination markets.’‘
Jim comments from the perspective of LEDs, ‘‘Feedback control is an interesting concept, and one that has been discussed for a considerable length of time in the lighting industry. Its necessity, however, is, I think, debatable. The concept of feedback control is that the lighting system is constantly monitoring itself, and will provide minute adjustments to ensure a repeatable light source. The need for a feature like this immediately brings into question the stability and repeatability of the light source itself. Most light sources will of course degrade over time. If the degradation of the light source is so rapid that feedback control is necessary, I would say the value of the light source is questionable. With LEDs, periodic calibration should be sufficient to ensure light source stability.’‘
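The closed-loop behavior Joe and Mike describe can be sketched as a proportional controller: a photodiode reading of the actual output trims the drive level as the source ages. The degradation rate, gain, and units below are invented for illustration, not drawn from any vendor's controller.

```python
# A minimal proportional feedback loop holding light output at a setpoint
# while the source slowly degrades. All parameters are illustrative.
def run_loop(setpoint=1.0, kp=0.5, steps=200):
    drive = 1.0
    efficiency = 1.0                            # light out per unit drive
    for _ in range(steps):
        efficiency *= 0.999                     # slow source aging each cycle
        measured = drive * efficiency           # photodiode feedback signal
        drive += kp * (setpoint - measured)     # proportional correction
    return drive * efficiency                   # final delivered light output

print(f"output after aging: {run_loop():.3f}")  # stays near the 1.0 setpoint
```

Without the correction the output would have sagged to about 0.999^200, roughly 82% of its starting value, which is exactly the long-term drift that calibration and web-illumination applications cannot tolerate.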
11. What is your perception of the impact of LEDs on the machine vision field?
Harlen notes, ‘‘LEDs have been a quiet revolution for machine vision. A light source once dismissed as little more than a stereo indicator light has come a long way, and continues to be improved upon. LEDs have led machine vision applications into many areas fluorescent or fiber optic light sources could not go. Without the cumbersome fiber optic bundle and light source, LEDs can be placed remotely from the power supplies or strobe controllers. The low power usage allows some LED sources to be used in conjunction with battery power, creating opportunities for their use in dangerous situations such as the inspection of explosives.’‘ While John adds, ‘‘It has revolutionized it. Ten years ago LED lighting companies were almost non-existent, now there are many. Because of their output, cost, size, life and stability, they are the light source of choice.’‘
Jim reiterates, ‘‘LEDs have had a huge impact on machine vision. My early experience in machine vision was on the system and solution end of the food chain. At that time, lighting was always a constant headache. Lighting was a major part of the system cost, mechanical considerations, and maintenance procedures. With the advent of LED technology, most of those issues have been diminished or removed. LED lighting is relatively low-cost (although the price of lighting relative to the overall system price is still high), very flexible, robust, and extremely long-lived.’‘
Mike comments on the impact LEDs have had on the overall machine vision market: ‘‘Combined with ever-cheaper turnkey ‘‘vision in a box’‘ or ‘‘smart camera’‘ type systems, LEDs have helped open up a large volume of machine vision applications that were previously outside the price-performance curve.’‘
12. Given these advances in LED lighting, why would anyone buy any other lighting? E.G. fiber optics ring light? Or fluorescent line of light?
Peter observes, ‘‘Certain applications require certain types of lighting. You can get higher intensities with halogen fiber optic systems than with LEDs. Xenon strobes offer higher intensities and are still the best for color applications. Fluorescent lighting is still less expensive.’‘ While Mike suggests, ‘‘It is clear that the brunt of growth for new application groups will be addressed with LED technologies. The other technologies still service niche application groups where the particular fit provides benefits that LEDs cannot.’‘
Jim notes, ‘‘As much as I hate to admit it, there are still some limitations with LED technology, which means there is still a market for some of the more traditional types of lighting.
- Cost – There are still some less expensive alternatives that may be utilized for some particular applications. If you need to bathe a target in diffused white light, and it is not a high use application, you cannot beat the price of a fluorescent ring light.
- Intensity – Although the intensity of LED’s is still improving rapidly, there are still applications that are beyond the capability of LED’s. This is usually a combination of large target size, long work distance, and pricing consideration.
- UV – There seems to be a market for UV LED lights, and some of the LED Lighting suppliers are offering them, but the price performance ratio is much lower than traditional UV lights.’‘
13. What are any emerging developments that will impact machine vision lighting?
John suggests, ‘‘LEDs and electroluminescent sources are where the future lies. LED brightness levels and additional wavelengths are the hottest things there.’‘ Mike comments, ‘‘Continued improvements to lighting-tolerant software will probably play the largest role in determining lighting requirements in the future.’‘ And Peter notes, ‘‘Faster cameras that require less light; more powerful and sophisticated vision systems; and movement towards more custom solutions to lighting applications.’‘ While Joe suggests, ‘‘HID lamps could affect the market if their stability could be improved.’‘
Jim observes, ‘‘There will be constant technology advances that will change our industry. I believe that LED’s will continue to improve. Chip on Board technology is an interesting technology, and may have some relevance for some of the OEM market. I mentioned some changes in camera technology that will have an impact earlier.
I think the development that will have the biggest impact on our industry, some of it potentially fatal, is the advent of low-cost machine vision systems and smart cameras. This trend has been a terrific opportunity for the industry by the mere fact that it has broadened the market by orders of magnitude. The ominous portion of this is that prices are being driven lower and lower, and low-cost machine vision systems are being marketed with no consideration for the need for lighting, programming, integration or resident expertise. This could potentially create a negative backlash similar to that found when machine vision was first introduced. I believe that it is important to keep customers’ expectations realistic from an overall cost, performance and maintenance point of view.’‘