Vision Capitalizes on New Opportunities at the Ends of the Rainbow
| By: Winn Hardin, Contributing Editor
Most machine vision systems measure the reflectance or transmission of visible light at wavelengths of approximately 400 nm to 700 nm. However, a number of applications benefit from imaging systems that can measure electromagnetic radiation beyond either end of the visible spectrum: the ultraviolet (UV), from 10 nm to 400 nm, and the infrared (IR), from 700 nm to about 1 mm.
Capturing images at UV wavelengths allows surface artifacts to be resolved in greater detail because the wavelength of UV radiation is shorter. At the other end of the spectrum, the near-infrared (NIR), from approximately 800 nm to 2500 nm (the upper portion of which is often called the short-wave IR, or SWIR), the medium-wave IR (MWIR), from 3 µm to 8 µm, and the long-wave infrared (LWIR), from 8 µm to 15 µm, can be used in applications ranging from bruise detection in the food industry to remote sensing of chemical and biological species and night-vision reconnaissance.
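For quick reference, the spectral bands above can be captured in a small lookup table. The sketch below is a minimal Python classifier based on the band boundaries quoted in this article (treating the NIR as starting at the 700 nm visible edge); it is not tied to any particular camera or vendor.

```python
# Spectral bands as described above, in nanometres (1 µm = 1,000 nm).
# NIR is taken to start at the 700 nm edge of the visible range.
BANDS = [
    ("UV",   10,     400),
    ("VIS",  400,    700),
    ("NIR",  700,    2500),   # upper portion often called SWIR
    ("MWIR", 3000,   8000),
    ("LWIR", 8000,   15000),
]

def classify(wavelength_nm: float) -> str:
    """Return the band name for a wavelength, or 'gap' if it falls
    between the ranges listed here."""
    for name, lo, hi in BANDS:
        if lo <= wavelength_nm < hi:
            return name
    return "gap"

print(classify(550))    # → VIS
print(classify(1300))   # → NIR
print(classify(10000))  # → LWIR
```

Note the gap between 2500 nm and 3 µm: the article's NIR and MWIR ranges do not quite meet, and band boundaries in practice vary by author and detector technology.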
Just as with visible-wavelength imaging, different modalities exist to image an object at UV or IR wavelengths. The UV or IR radiation reflected or emitted by an object, which may or may not be actively illuminated, can be measured directly. Alternatively, in UV and IR fluorescence imaging, an object illuminated with UV or IR light may fluoresce, absorbing light at one wavelength and emitting light at a longer wavelength. In UV applications, the emitted light is typically in the visible range and can be captured using off-the-shelf CCD- or CMOS-based cameras.
Imaging in the IR
Many different types of IR detectors exist to convert IR energy into electrical signals. These can be broadly classified as either photon or thermal detectors. While photon detection is accomplished using semiconductor technologies, thermal detectors rely on capacitive (ferroelectric and pyroelectric) elements or on resistive bolometers.
To image IR radiation, different detector types can be used, ranging from photon detectors based on InGaAs (for SWIR imaging), InSb (for MWIR), and mercury cadmium telluride (for LWIR) to bolometer-based detectors (Figure 1). Microbolometer-based imagers span a wider range of wavelengths (x axis), from SWIR to LWIR, albeit with lower detectivity (y axis), a measure of the signal-to-noise ratio of an imager normalized for its pixel area and noise bandwidth.
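The detectivity figure mentioned here, usually written D*, is the square root of the product of detector area and noise bandwidth, divided by the noise-equivalent power (NEP). A minimal Python sketch, using illustrative numbers rather than any vendor's specification:

```python
# Specific detectivity D* normalizes an imager's signal-to-noise
# performance for pixel area and noise bandwidth:
#     D* = sqrt(A_d * Δf) / NEP   [units: cm·Hz^0.5/W, "Jones"]
import math

def specific_detectivity(pixel_area_cm2: float,
                         noise_bandwidth_hz: float,
                         nep_watts: float) -> float:
    """Return D* in Jones (cm·Hz^0.5/W)."""
    return math.sqrt(pixel_area_cm2 * noise_bandwidth_hz) / nep_watts

# Illustrative (made-up) numbers: a 17 µm square pixel, 100 Hz noise
# bandwidth, and an NEP of 1 pW.
d_star = specific_detectivity((17e-4) ** 2, 100.0, 1e-12)
print(f"D* = {d_star:.2e} Jones")
```

Because D* divides out pixel area and bandwidth, it allows detectors of different sizes and readout rates, such as those plotted in Figure 1, to be compared on one axis.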
When semiconductor- or bolometer-based IR detector materials are used in cameras, sensitivity is specified as the noise equivalent temperature difference (NETD). NETD is specified in millikelvins (mK), or thousandths of a degree, and represents the minimum temperature difference the camera can detect.
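NETD can be estimated from a camera's temporal noise floor and its responsivity (output counts per kelvin of scene temperature change). The sketch below is a generic illustration: the 2 counts of noise and 50 counts/K responsivity are hypothetical numbers chosen to land at the 40 mK figure typical of uncooled microbolometers, not measurements of any camera named here.

```python
def netd_mk(temporal_noise_counts: float,
            responsivity_counts_per_kelvin: float) -> float:
    """NETD in millikelvin: the scene temperature step whose signal
    just equals the camera's temporal noise floor."""
    return temporal_noise_counts / responsivity_counts_per_kelvin * 1000.0

# Hypothetical numbers: 2 counts RMS noise, 50 counts/K responsivity.
print(netd_mk(2.0, 50.0))  # → 40.0 (mK)
```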
For the 360-degree panoramic thermal imager from IRCameras, based on an uncooled 768 x 1024 VOx microbolometer, the NETD is specified at <40 mK. In the MWIR band, the company’s Niatros camera cores, designed to optically image and detect hydrocarbon gases, use a 320 x 256 InSb imager with an NETD of <15 mK.
In many industrial, machine vision, scientific, and medical applications, IR reflection techniques are used to obtain an image of the object to be inspected. In many cases, IR radiation reflected from an object is simply measured by an IR camera. In other cases, the object is illuminated by IR light and the reflected IR radiation is captured. Alternatively, IR illumination can be used to make the object fluoresce, and the emitted light is then imaged.
In applications such as scientific research and building maintenance, relatively low-cost microbolometer-based IR cameras are used to measure emitted IR energy. At Loyola Marymount University’s Center for Urban Resilience, this type of camera is being used to understand the temporary hibernation, or torpor, of hummingbirds, whose body temperatures drop from 105°F to as low as 45°F. This information may be useful in reducing the oxygen and food consumption of astronauts during long-term space travel, a human version of torpor.
Using the Vue Pro R uncooled VOx microbolometer-based drone camera from FLIR, researchers captured frequent, noncontact temperature readings by placing the camera close to the birds without disturbing them (Figure 2).
While such applications do not require direct IR illumination to capture thermal images, illumination is necessary where the objects being inspected emit too little thermal energy to image. One example is food sorting, where hidden defects in fruit, for instance, must be identified and the item rejected. In many of these systems, the wavelength of illumination is highly proprietary and not even disclosed to the lighting provider.
In its line of food processing systems, Tomra Sorting analyzes food items by observing their spectral response in specific illumination wavelengths in the visible and SWIR spectrum (Figure 3). For SWIR analysis, Xenics has developed a custom-made InGaAs detector that facilitates the classification of quality and freshness of certain food products. This allows Tomra to use biometric signature identification to identify slight chemical and molecular differences, serving as a quality differentiator.
Rather than using thermal signatures or reflected IR radiation, many medical applications use IR fluorescence imaging to capture parts of cells or other objects of interest. These methods require illuminating the subject with a specific wavelength of IR energy, chosen according to its composition, and then imaging the resulting IR fluorescence. Such IR fluorescence imaging has been used at the Children's National Health System (Washington, D.C.) in the development of a Smart Tissue Autonomous Robot (STAR) to assist physicians in performing surgery.
The system employs an LED light source from Marubeni America, which illuminates the patient at 760 nm to excite markers attached to soft tissue. Under this illumination, the markers fluoresce and are captured by an NIR camera from Basler. The markers’ coordinates are then merged into a 3D point cloud generated from an image captured by a plenoptic 3D camera from Raytrix. A robot uses the coordinates of the soft tissue to perform surgery with a laparoscopic suturing tool.
Sensing the UV
To capture images in the UV spectrum, the sensor's standard cover glass, which absorbs UV, is removed, providing sensitivity down to approximately 300 nm. Because a bare sensor is susceptible to damage, CCD and CMOS sensor vendors will often fit it with a quartz-glass cover instead, which does not block UV.
Alternatively, a Lumogen phosphor coating can be added to the sensor that re-emits absorbed UV light in the 500–600 nm range. Because UV sensors are standard monochrome sensors, this visible light is captured, increasing overall UV quantum efficiency (QE) by 30–50 percent. Customers can work effectively below 200 nm thanks to this type of sensitivity boost, says Rich Dickerson, Manager of Marketing Communications at JAI Inc.
“In most cases, microlenses are kept on the sensors to maximize the QE since they do not block UV transmission as does the cover glass,” Dickerson says. “However, the microlenses do introduce some ‘fringe’ effects, so for applications such as UV laser profiling, customers require sensors where the cover glass and microlenses have been removed. These customers are willing to sacrifice some sensitivity to avoid even the smallest amount of distortion.”
Just as reflected-radiation and fluorescence techniques are used in the IR, the same holds true in the UV band. In reflected-UV applications, radiation can be measured using a camera sensitive in the UV spectrum. In other applications, a UV lamp illuminates the object, with the reflected light captured by a CCD or CMOS camera sensitive in the UV. In UV fluorescence applications, the light impinging on the object is absorbed and re-emitted at a longer wavelength, usually in the visible part of the spectrum.
One application of UV imaging is measuring the UV light emitted by corona discharges from high-voltage electric power transmission lines. Since corona discharges represent a significant waste of energy for utility companies, they must be identified and, if necessary, reduced by increasing the conductor size or the distance between conductors.
Numerous companies manufacture handheld devices to monitor corona discharges. The UV-260 from Sonel is designed as a predictive maintenance device for overhead transmission lines and high-voltage substations. With a camera sensitive between 240 nm and 280 nm, the UV-260 features both UV and visible autofocus, a built-in GPS, and still image and video capture.
“Transparent materials such as oils, greases, and other organic compounds that have a high absorbance or a high reflectivity in the UV band relative to the material underneath can be imaged by subjecting them to UV light and measuring the reflected UV radiation,” says Dr. Austin Richards, President of Oculus Photonics.
As an example, kitchen tiles, like many other inorganic materials, do not absorb UV but instead reflect it because of the refractive index change at their surface (Figure 4). “Since grease stains absorb UV very strongly, however, these can be visualized when imaged by the Oculus UVCorder-HD, a UV camera with a spectral response from 350–400 nm,” Richards says.
As with illumination wavelengths in industrial IR applications, the wavelengths used to produce fluorescence in the UV are often a highly guarded secret. However, some documentation reveals the wavelengths of light that various materials emit when subjected to UV illumination. One such article is a summary produced to assist Museum Victoria’s Conservation team in interpreting the results of UV examination of artifacts. It lists a table of materials such as wood, glass, stone, and textiles; the type of UV illumination; and the wavelength of fluorescent light emitted.
Today’s camera makers and vision system integrators are helping customers see the unseen as the use of IR and UV across a broad range of applications evolves and expands.