Vision & Imaging Blog
Understanding Optics in Machine Vision Applications
Machine vision systems are incredibly complex. In even the simplest system, hardware and software work together to produce results. Although there are many vital components, one stands out: the lens.
Today’s precise machine vision applications would be impossible without the science of optics. Since the earliest days of telescopes and looking glasses, optical science has advanced steadily, and it now underpins high-end electronic imaging.
The lens is crucial since it captures data that will ultimately be recreated by software. It locates image features, maintains focus, and maximizes contrast. However, it operates under various specifications that must be optimized for best performance.
Some of these include:
Field of View: The field of view is the object area imaged by the lens. All features the system must inspect should be covered by the FOV. In applications involving gauging and alignment, the field of view presents the image in a fixed geometry calibrated to the object’s position.
Working Distance: The distance from the lens to the object.
Depth of Field: Depth of field is the maximum object depth that can be held entirely in focus. It also determines how much working distance variation is possible while maintaining acceptable focus levels.
Sensor Size: Sensor size is the dimension of the sensor’s active area, typically measured horizontally. The ratio of sensor size to field of view is the primary magnification. Generally, a larger sensor can capture more detail over the same field of view.
Resolution: Resolution describes the vision system’s ability to reproduce object detail. Sensors with fewer or larger pixels cannot distinguish fine features. Even large, sophisticated sensors must be paired with appropriate magnification and working distance to resolve the features of interest.
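The specifications above are related by a few simple ratios. The sketch below illustrates them; the sensor dimensions, field of view, and pixel count are illustrative assumptions, not values from the article.

```python
# Basic machine vision lens calculations (illustrative sketch).

def primary_magnification(sensor_size_mm, fov_mm):
    """Primary magnification = sensor size / field of view,
    both measured along the same (typically horizontal) axis."""
    return sensor_size_mm / fov_mm

def object_space_resolution_mm(fov_mm, pixels_across):
    """Smallest object-space sampling interval: the FOV divided by
    the number of sensor pixels spanning it. A feature generally
    needs at least two pixels across it to be resolved (Nyquist)."""
    return fov_mm / pixels_across

# Example: a 1/2" sensor (6.4 mm horizontal active area) imaging a
# 100 mm field of view with 2048 pixels across.
pmag = primary_magnification(6.4, 100.0)           # 0.064x
res = object_space_resolution_mm(100.0, 2048)      # ~0.049 mm/pixel
min_feature = 2 * res                              # ~0.098 mm
print(f"PMAG: {pmag:.3f}x, resolution: {res:.4f} mm/px, "
      f"smallest resolvable feature: {min_feature:.3f} mm")
```

Calculations like these are typically the first step in selecting a lens: fix the field of view from the part size, then check that the sensor and magnification deliver enough pixels per feature.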
Contrast & Filtering
Contrast is the separation in intensity between white and black portions of an image. The greater the difference between the two, the higher the contrast. The correct lens can enhance contrast even in situations where the sensor, position, and focal length are all unchanged.
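The separation in intensity described above is commonly quantified as Michelson contrast. This is a standard formulation, shown here as a small sketch with made-up intensity values:

```python
def michelson_contrast(i_max, i_min):
    """Contrast as the normalized separation between the brightest
    and darkest intensities; ranges from 0 (flat) to 1 (maximal)."""
    return (i_max - i_min) / (i_max + i_min)

# High contrast: near-black vs near-white regions (8-bit levels).
print(michelson_contrast(240, 10))   # ~0.92
# Low contrast: two similar gray levels.
print(michelson_contrast(140, 110))  # ~0.12
```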
One method of increasing contrast is color filtering. Many basic sensors and lenses may be well-suited for specific industrial applications, yet show only subtle differences between colors. Adding a filter of the appropriate color – for example, red or green for objects that are predominantly those colors – raises contrast and compensates for environmental lighting variations.
Diffraction & Distortion
Diffraction, sometimes called lens blur, reduces contrast at high spatial frequencies and degrades image quality. The gulf between ideal and real lens behavior is called aberration. Distortion is a specific kind of aberration that produces magnification differences across the image. Some vision systems can compensate for this issue in software.
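Software compensation for distortion typically relies on a radial polynomial model. The sketch below shows the common forward form of that model; the coefficient values are illustrative assumptions, and real systems estimate them through a calibration procedure and invert the mapping (usually iteratively) to correct images.

```python
def apply_radial_distortion(x, y, k1, k2=0.0):
    """Forward radial distortion model on normalized image
    coordinates centered on the optical axis:
        x_d = x * (1 + k1*r^2 + k2*r^4)   (likewise for y)
    where r^2 = x^2 + y^2. Negative k1 gives barrel distortion,
    positive k1 gives pincushion distortion."""
    r2 = x * x + y * y
    scale = 1.0 + k1 * r2 + k2 * r2 * r2
    return x * scale, y * scale

# Barrel distortion (k1 < 0) pulls off-axis points toward the center.
print(apply_radial_distortion(1.0, 0.0, k1=-0.1))  # (0.9, 0.0)
```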
To develop the best systems possible, machine vision engineers should maintain a working knowledge of optics.