
AI Improves Machine Vision System Performance and Versatility

POSTED 04/03/2023  | By: John Lewis, A3 Contributing Editor

FIGURE 1: Advanced AI technologies like Neurala VIA can reliably detect highly variable defects such as short shots or burrs on injection molded fasteners. (Image Courtesy of Neurala.)

Deep learning solves a set of problems that are either not solvable with traditional rules-based machine vision or are exceptionally difficult to solve using traditional tools. As such, it is opening up possibilities to automate inspections that previously couldn’t be automated. Deep learning is particularly useful in cases where products have high variability in terms of what is considered "good" or "defective," such as some food products. For instance, there can be massive variety in the types of features that make an apple or baked good acceptable.

AI Isn’t Magic
AI isn’t magic: as with traditional machine vision applications, the problem can be made much easier with the right sensor, image acquisition, and illumination configuration, according to Max Versace, CEO of Neurala. “For example, we’re working with a large food company that is inspecting raw material inputs into their process to identify the presence of contaminants. For this use case, hyperspectral cameras are ideal as they make it extremely apparent that there’s a foreign object in the field of view. Similarly, we have baked-good manufacturers looking to identify the presence of moisture using SWIR cameras, which make that a much easier task.”

Strong machine vision fundamentals, including image formation and lighting stability, are key to the success of any machine vision system. AI-powered work cells built from off-the-shelf 2D cameras and lenses, however, reduce the hardware expenses associated with vision-guided robotic cells. “These work cells operate in ambient light and are immune to changes in light,” explains Apera AI head of marketing Eric Petz. “This includes the ability to work outside in a variety of weather conditions, which compares favorably against conventional vision that is vulnerable to changes in light and must use specialized equipment for scanning, light control, and other tasks.”

FIGURE 2: There are fewer exceptions to the use of AI-powered vision for different object shapes and geometries. Identifying and picking clear objects—impossible for conventional vision technologies—can occur with a total vision cycle time as low as 0.3 seconds. (Image Courtesy of Apera AI.)

Software-Defined Lighting
“80% of the challenge in machine vision comes down to lighting,” says UnitX senior business development representative Cole Taylor. Some AI vision systems rely on software-defined lighting that can be tuned so that defects stand out clearly in the captured images. Coupling software-defined lighting with AI in this way improves both manufacturing yield and quality.

“[This] system can be trained on as few as five images for each defect and one image of a good part,” Taylor explains. “The system is very customizable, and thresholds for surface defect size can be tailored to meet the specific needs of [the] customers.”
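The idea of a tunable surface-defect size threshold can be illustrated with a minimal sketch. This is not UnitX's actual software; the function, parameter names, and the golden-image comparison approach are illustrative assumptions, showing only how a configurable pixel-count threshold separates acceptable blemishes from rejects.

```python
import numpy as np

def inspect_surface(image: np.ndarray, golden: np.ndarray,
                    diff_threshold: float = 30.0,
                    max_defect_pixels: int = 50) -> bool:
    """Pass/fail a grayscale part image against a 'golden' reference.

    A pixel is flagged as defective when it deviates from the reference
    by more than `diff_threshold` gray levels; the part fails when the
    total flagged area exceeds `max_defect_pixels`.
    """
    diff = np.abs(image.astype(float) - golden.astype(float))
    defect_mask = diff > diff_threshold
    return int(defect_mask.sum()) <= max_defect_pixels

# Toy example: a uniform reference part and the same part with a small blemish.
golden = np.full((64, 64), 128, dtype=np.uint8)
part = golden.copy()
part[10:14, 10:14] = 220  # 16 defective pixels

print(inspect_surface(part, golden))                        # True: within tolerance
print(inspect_surface(part, golden, max_defect_pixels=10))  # False: threshold tightened
```

Tightening `max_defect_pixels` is the kind of per-customer tailoring the quote describes: the same model and imaging setup, with only the acceptance threshold adjusted.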

Bellwether Applications
In most cases, the success of an AI-based machine vision application largely depends on choosing a suitable application. Products with highly variable “good” examples and highly variable defects play directly to the strengths of AI-powered inspection. Neurala, for instance, works with apetito, a leading meal producer in the health and social care sector. While producing in excess of 1 million meals a week, apetito wanted a solution that could efficiently detect missing ingredients in the products coming off the line without compromising efficiency or cost — better yet, improving both.

“Finding a missing ingredient is a hard task,” says Versace. “It’s much harder than we think to classify something missing. The meals include green beans, chicken curry, beef and gravy, etc. All of which look massively different, but AI does a really good job in that scenario where weighing the tray isn’t helpful as an overage of one component might offset the absence of another. We have similar scenarios in product packaging inspections, where we’re looking at label correctness checks, injection molding applications where we’re identifying short shots or burrs that could appear anywhere on the part, and so on. All highly variable issues.”

Another bellwether application example for AI-based vision includes a tier 1 automotive supplier that uses Apera AI’s system to guide a robot in sorting highly similar black plastic parts on a black conveyor belt. The objects are hard for human eyes to identify on the belt, and the difference between objects is very slight. “There are small features on the ends that are only millimeters long,” Petz says. “Conventional vision couldn’t accomplish the task, and certainly not at the speed at which it is performed by AI-powered vision.”



FIGURE 3: Off-the-shelf camera pairs couple with AI-powered robotic vision software to guide an industrial robot in picking and placing multiple bolt sizes and shapes. This type of sorting and kitting is common in automotive manufacturing. (Image Courtesy of Apera AI.)

Another tier 1 automotive supplier uses Apera AI’s system to guide a robot in placing clips into an injection-molded interior subassembly. The clips must be placed very precisely, with the robot applying pressure to seat the clips into towers. However, the task is not easy because injection-molded parts are subject to bending and dimensional variations. The Apera system can find the clip towers, place the clips, and apply gentle pressure to the clip using the robot. According to Petz, the Apera AI system drove part quality acceptance to over 99% at that company — a significant improvement since, previously, a high percentage of parts required rework or disposal (due to the robot breaking the part).

When Rule-based Machine Vision Fails
AI-based machine vision is a natural fit for any application where conventional machine vision has failed, according to Petz. Such failures may stem from slow object recognition and robotic path-planning speeds, the need to control and manipulate light, or too many exceptions caused by object finishes, geometries, and other factors.

Key Considerations
After the right project is identified for AI-powered vision, there are many important factors to consider: sensor selection, how many images will be needed for model development, and what’s needed to add vision AI to an existing inspection setup. Another key factor is whether additional skills on staff will be needed to successfully implement and maintain the vision AI system. “To get to the right answer quickly and cost-effectively, it’s best to engage with a partner that has a wealth of experience solving problems like yours,” Versace explains. “Without that, you’re likely signing up for an expensive and unnecessary ‘trial and error’ pilot program.”

The reality is, whatever data is used to train the model likely won’t account for every scenario that will occur in the future, so it’s important to embrace the notion of adapting over time. For example, slight changes in the inspection environment, such as lighting or part or camera position, can cause AI models to make erroneous predictions.
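One common way to catch this kind of drift in practice is to watch the model's prediction confidence over a rolling window and raise a flag when the average sags. The sketch below is a generic illustration of that idea, not any vendor's product; the class name, window size, and threshold are assumptions.

```python
from collections import deque

class DriftMonitor:
    """Track a rolling window of model confidence scores and flag when the
    average drops, hinting that lighting, part position, or camera pose has
    shifted since the model was trained."""

    def __init__(self, window: int = 100, min_avg_confidence: float = 0.8):
        self.scores = deque(maxlen=window)  # oldest scores fall off automatically
        self.min_avg = min_avg_confidence

    def record(self, confidence: float) -> bool:
        """Log one prediction's confidence; return True when retraining
        (or a lighting/fixture check) looks warranted."""
        self.scores.append(confidence)
        avg = sum(self.scores) / len(self.scores)
        return avg < self.min_avg

monitor = DriftMonitor(window=5, min_avg_confidence=0.8)
for c in [0.95, 0.93, 0.94, 0.92, 0.95]:   # stable line: no alert
    monitor.record(c)
for c in [0.60, 0.55, 0.58]:               # e.g. a lamp ages, scores sag
    alert = monitor.record(c)
print(alert)  # True: confidence has drifted, time to collect fresh images
```

The payoff is that "adapting over time" becomes a triggered workflow rather than a surprise: the alert prompts collecting fresh images and updating the model before erroneous predictions pile up.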

FIGURE 4: UnitX uses software-defined lighting & AI to inspect random defects. (Image Courtesy of UnitX.)

Crawl, Walk, Run
The performance of AI is highly dependent on the quality and quantity of the data powering it. Figuring out the correct proportions of data can make or break the application. As a result, Versace strongly advocates for organizations to take a “crawl, walk, run” approach to implementing AI-powered inspections.

Crawling refers to confirming the project’s feasibility as inexpensively as possible. “For example, we and our partners will often perform low or no-cost feasibility studies that demonstrate technical viability,” Versace explains.

Just because you can solve a problem with AI doesn’t mean it will be a compelling business case. With that in mind, walking means getting on site to verify what it might cost to implement in a line. According to Versace, the cost of most solutions is roughly 90% hardware and services versus actual AI software, driven in large part by motion control and integration.

“Once those expenses are buttoned down, implement one. Not 10, just one,” he emphasizes. “That allows confirmation of what happens in production. Only then should you run into a full-scale, multi-line deployment. There are things in that first deployment that will influence how you approach the next 10. Trust me!”

Upfront Feasibility
Taylor agrees that project feasibility is a key factor to consider. “We generally will have clients send parts to our lab at HQ and San Jose where we run tests and create a report indicating our ability to detect the defects of interest.”

Likewise, Apera AI uses synthetic data to construct neural networks that get progressively better as inputs are adjusted. With wave after wave of calculations, the AI can progressively improve the outcomes of the resulting robotic cell, according to Petz.
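The synthetic-data idea can be made concrete with a small domain-randomization sketch. This is not Apera AI's actual pipeline; the transforms and parameter ranges are illustrative assumptions, showing only how one reference image of a part can be expanded into many randomized training variants.

```python
import numpy as np

rng = np.random.default_rng(seed=42)

def randomize(image: np.ndarray) -> np.ndarray:
    """Produce one synthetic training variant of a grayscale part image by
    randomly jittering brightness, mirroring, and adding sensor noise."""
    out = image.astype(float)
    out += rng.uniform(-25, 25)                  # global brightness shift
    if rng.random() < 0.5:
        out = np.fliplr(out)                     # random horizontal mirror
    out += rng.normal(0, 5, size=out.shape)      # per-pixel sensor noise
    return np.clip(out, 0, 255).astype(np.uint8)

# Expand one reference image into a batch of synthetic variants.
reference = np.full((32, 32), 100, dtype=np.uint8)
batch = np.stack([randomize(reference) for _ in range(64)])
print(batch.shape)  # (64, 32, 32)
```

Each wave of generated variants can be scored against the current model, and the randomization ranges adjusted where the model is weakest, which mirrors the "progressively better as inputs are adjusted" loop described above.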

“Validation of new applications upfront is key to success, and someone using DL can do this because of how the technology works,” Petz explains. “It used to be that you would have a problem or desired improvement, then make a hypothesis that certain hardware technologies could provide a solution when custom programming tied them together. The prior model put most or all of the spending upfront. With software based in AI, the performance of the resulting cell can be simulated millions of times before committing to hardware. This is a massive shift in how automation projects work.”