Industry Insights
Comprehensive System Integration Optimizes Data Reliability
POSTED 12/11/2017
| By: Winn Hardin, Contributing Editor
When AIA introduced the GigE Vision global camera interface standard in 2006 using the Gigabit Ethernet communication protocol, machine vision customers quickly saw the advantages of fast, high-bandwidth (125 MB/s) transfer over low-cost CAT5e and CAT6 cables up to 100 m in length. Since GigE Vision’s inception, however, some users have reported data integrity problems during image transfer — specifically, that GigE cameras drop packets.
Depending on whom you ask, the issue of dropped packets runs the gamut from virtually nonexistent to low risk to frequent and problematic. While they may not agree on the regularity or severity of GigE’s data reliability issue, nearly all machine vision system integrators concur that only proper system design and development can overcome data integrity concerns.
Finding a Better Way
GigE Vision transmits image data packets via the user datagram protocol (UDP). Each UDP packet identifies the image number, packet number, and timestamp, which the receiving host uses to reconstruct the image in user memory.
Unlike its transmission control protocol (TCP) counterpart, UDP cannot guarantee that a packet is delivered or that packets arrive in order. As such, GigE Vision employs a packet recovery process to ensure that images aren’t missing data. Although packet recovery is not required for GigE Vision compliance, most high-end industrial cameras implement it.
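The mechanics are easier to see in a simplified sketch. The Python illustration below is not the actual GVSP wire format or any vendor’s API; the tuple fields and the request_resend callback are placeholders. It shows the basic idea: the receiver spots missing packet numbers within an image and asks the camera to resend them before the frame is handed to the application.

```python
# Simplified illustration of GigE Vision-style packet tracking and resend.
# Field names and the request_resend() callback are hypothetical; the real
# wire format and resend commands are defined by the GigE Vision standard.

def assemble_frame(packets, packets_per_frame, request_resend):
    """Reassemble one image from UDP packets, asking for any missing ones.

    packets           -- iterable of (image_id, packet_id, payload) tuples
    packets_per_frame -- expected number of payload packets per image
    request_resend    -- callback(image_id, missing_ids) asking the camera
                         to retransmit the listed packets
    """
    received = {}
    image_id = None
    for img_id, pkt_id, payload in packets:
        image_id = img_id
        received[pkt_id] = payload          # out-of-order arrival is fine

    missing = [i for i in range(packets_per_frame) if i not in received]
    if missing:
        request_resend(image_id, missing)   # the packet-recovery step
        return None                         # caller retries after the resend

    # Concatenate payloads in packet order to rebuild the image buffer
    return b"".join(received[i] for i in range(packets_per_frame))
```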
Even with this packet recovery process, dropped image data packets still occur. “USB3 Vision has a big advantage over GigE Vision cameras in that I haven’t seen one dropped packet of information yet,” says Robert Eastlund, president of Graftek Imaging, Inc. (Austin, Texas). “Dropped packets, which show up as blocks of missing information in an image, are common with GigE Vision cameras, in addition to various startup headaches and firewall issues.”
When testing a machine vision system before deployment, a qualified system integrator will identify missing image data, trace the source of the loss, and troubleshoot accordingly until the problem disappears. Eastlund points to two common causes of dropped packets: a bad cable — which can plague any data network — or an improper network card.
“Often these problems will happen on a network you didn’t provide the customer,” Eastlund says. “You’ll see customers attach an old GigE cable to a new camera and they get an image on screen, and a few seconds later, part of the image is gone, then the image comes back but a different part of it is missing.”
It’s one thing to troubleshoot dropped packets in a fixed environment; it’s another to do so in rugged subzero conditions. Eastlund has a customer on the northern slopes of Alaska where some images captured by the camera are missing 10% of their data. While he briefly considered outside electromagnetic interference as a possible cause of dropped packets, the cabling itself is more likely the culprit.
“I can potentially get around the problem by sending a command to the camera to tell it to slow down, or throttle back its bandwidth usage, but I’d rather get the images in time,” Eastlund says. “It’s not usually a good solution. That’s like buying a sports car and saying it runs great until you get to 80 mph and it starts to vibrate.”
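The trade-off Eastlund describes follows from simple arithmetic: if the camera’s raw output exceeds the roughly 125 MB/s a GigE link can carry, something has to give, and throttling the camera gives up frame rate. The quick check below is only illustrative; the resolution and frame-rate figures are hypothetical, not from his application, and how a throttle is applied (for example, a GenICam throughput-limit feature or an inter-packet delay) depends on the camera and SDK.

```python
# Back-of-the-envelope check: will the camera's output fit a GigE link?
# Example numbers are hypothetical; 125 MB/s is the theoretical GigE payload.

GIGE_PAYLOAD_BYTES_PER_SEC = 125_000_000  # ~125 MB/s theoretical maximum

def required_bandwidth(width, height, bytes_per_pixel, fps):
    """Raw image bandwidth the camera must sustain, in bytes per second."""
    return width * height * bytes_per_pixel * fps

if __name__ == "__main__":
    bw = required_bandwidth(width=2048, height=1536, bytes_per_pixel=1, fps=45)
    headroom = GIGE_PAYLOAD_BYTES_PER_SEC - bw
    print(f"camera needs {bw / 1e6:.0f} MB/s, "
          f"{'fits' if headroom > 0 else 'exceeds'} a GigE link "
          f"({headroom / 1e6:+.0f} MB/s headroom)")
```

When the check comes out negative, the options are exactly the ones in the quote: slow the camera down, or redesign the link so the full-rate images arrive intact.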
In the integration work he performs for customers, Andy Long, CEO of Cyth Systems (San Diego, California), says that dropped frames in themselves aren’t the problem. “The concern is not easily being able to identify the dropped frames,” he says. “The GigE Vision standard allows you to identify the frame number, but it is not obvious and should be.”
Mitigating dropped packets or frames requires “an understanding of the application and whether missing a frame and knowing you missed a frame is okay, or that you cannot afford to miss a frame at all,” Long says.
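A minimal sketch of the bookkeeping Long is calling for, assuming the SDK exposes a per-frame block or frame ID with each delivered buffer (most GigE Vision SDKs do, under varying names): track the last ID seen and treat any gap as dropped frames.

```python
# Illustrative frame accounting; the frame_id passed in is assumed to come
# from the SDK's per-buffer block/frame counter, whatever it is called there.

class DroppedFrameMonitor:
    def __init__(self):
        self.last_id = None
        self.dropped = 0      # running total of frames known to be missing

    def check(self, frame_id):
        """Call with each received frame's ID; returns frames missed since last call."""
        missed = 0
        if self.last_id is not None:
            # A production version would also handle counter wrap-around.
            missed = frame_id - self.last_id - 1
            if missed > 0:
                self.dropped += missed
        self.last_id = frame_id
        return missed
```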
Mind the Bandwidth
Brian Durand, president of i4 Solutions (St. Paul, Minnesota), doesn’t experience problems with dropped frames or packets as long as the camera’s maximum frame rate isn’t exceeded. “We never use an Ethernet switch, instead connecting each camera to its own dedicated network port,” Durand says. “The network adapters we use have multiple ports, each with its own chip optimized for and supported by the camera drivers. So each camera is independent of the other and gets full bandwidth.”
These network adapters also power the cameras via Power over Ethernet (PoE). In addition, Durand uses quality shielded Ethernet cables appropriate for industrial environments.
“The only situation where we could lose a frame would be if the image processing isn’t keeping up with the cameras,” Durand says. “We can detect this condition in our software and take appropriate action.”
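One simple way to detect the condition Durand describes is to time each frame’s processing against the camera’s frame period; the sketch below uses an illustrative 30 fps figure and hypothetical function names rather than any particular vendor’s software.

```python
# Illustrative watchdog: if processing a frame takes longer than one frame
# period, buffers will eventually back up and frames will be lost.

import time

FRAME_PERIOD_S = 1 / 30.0        # e.g., a camera running at 30 fps

def process_with_watchdog(frame, process):
    """Run the user's process() on a frame and warn if it can't keep up."""
    start = time.monotonic()
    result = process(frame)
    elapsed = time.monotonic() - start
    if elapsed > FRAME_PERIOD_S:
        # Sustained over many frames, this is exactly the lose-a-frame case.
        print(f"WARNING: processing took {elapsed * 1000:.1f} ms "
              f"(frame period is {FRAME_PERIOD_S * 1000:.1f} ms)")
    return result
```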
Durand recommends designing associated controls and equipment to assume a part is bad. For example, a part reject system should always reject a part unless the vision system indicates it is good, rather than assume a part to be good unless the vision system indicates it is bad. If a wire gets loose, or a device gets unplugged, the system will then reject all parts.
“This condition quickly gets an operator’s attention,” Durand says. “It is much preferred to not rejecting any parts, which may go unnoticed for hours or days at a time. Rejected parts spilling onto the floor, or equipment automatically stopping on excessive rejects, gets attention quicker than an email or text alert.”
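Durand’s convention reduces to a one-line rule: default to reject, and pass a part only on an explicit “good” result. A minimal sketch, with illustrative names:

```python
# Fail-safe reject logic: anything other than an explicit "good" result --
# including no result at all (loose wire, unplugged device, lost frame) --
# is treated as a bad part.

def should_reject(vision_result):
    """Return True unless the vision system explicitly reported the part good."""
    return vision_result is not True

# Examples:
assert should_reject(True) is False    # explicit "good" -> pass the part
assert should_reject(False) is True    # explicit "bad"  -> reject
assert should_reject(None) is True     # no result       -> reject (fail-safe)
```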
Mitigation Methods
A well-designed machine vision system will always find a way to get around the “catastrophic scenarios” in which a GigE Vision camera might lose a frame, writes Teledyne DALSA’s Eric Carey, chairman of the GigE Vision committee, in his blog on the subject.
One such scenario occurs when the camera simply cannot capture the image, most likely because the camera is triggered too fast and too often. The camera manufacturer determines how to handle over-triggering, which is typically addressed by ignoring the extra trigger signal altogether or storing it until it can be executed after the current frame.
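The two over-trigger policies can be sketched as follows. Real cameras implement this in firmware, so the class below is purely an illustration of the “ignore” versus “store” choice.

```python
# Illustrative trigger handling: either drop an extra trigger outright, or
# queue it and execute it after the current frame completes.

from collections import deque

class TriggerHandler:
    def __init__(self, policy="ignore", max_pending=1):
        self.policy = policy                 # "ignore" or "store"
        self.pending = deque(maxlen=max_pending)
        self.busy = False                    # True while a frame is in progress

    def on_trigger(self):
        if not self.busy:
            self.busy = True
            return "start_exposure"
        if self.policy == "store" and len(self.pending) < self.pending.maxlen:
            self.pending.append("trigger")   # execute after the current frame
            return "queued"
        return "ignored"                     # extra trigger is simply dropped

    def on_frame_complete(self):
        self.busy = False
        if self.pending:
            self.pending.popleft()
            self.busy = True
            return "start_exposure"
        return "idle"
```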
If the camera does acquire the image, Carey cites a few situations in which a frame is dropped before being transmitted. In one, the gigabit link is too slow to carry the camera’s acquisition bandwidth, so the camera has to drop frames to match its output to the available transmission bandwidth.
Another scenario involves what Carey calls a “race condition between the trigger and the stop of an acquisition,” where the PC sends a StopAcquisition command to the camera before the frame has been fully exposed. Cameras from different vendors will handle such a condition differently, as one might expect. Carey says it’s up to the integrator to design the system to “deal with such a scenario in a more elegant manner.”
Most system integrators agree that the only way to verify data 100% in real time is to buffer that data. In the case of Camera Link, a frame grabber stores the data until the system is ready for it. The GigE port on the PC, by contrast, is managed by the Windows or Linux operating system, which does not operate in real time. Even so, GigE Vision software designed to minimize CPU usage can achieve real-time image acquisition.
However, data loss can still occur if the transmission consumes too much CPU or bus bandwidth, or if other software programs on the PC block the image acquisition. Specialized drivers address this problem by taking priority over other actions scheduled by the operating system. While Carey points out that other “ill-behaved kernel drivers” could limit the performance of the optimized drivers and therefore risk losing data packets, he hasn’t seen this situation occur.
Finally, if the application software cannot process the data quickly enough and there aren’t enough buffers to accommodate the next image, that image will be discarded, according to Carey. As long as the integrator identifies and corrects this and the other aforementioned problems during system design, it won’t be a problem at all during operation.
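That last failure mode, and its usual remedy, can be sketched as a simple producer-consumer arrangement; the buffer count and names below are illustrative, not taken from any specific SDK. Acquisition hands frames to a bounded pool and counts anything it has to discard, so the shortfall is visible and can be fixed during system design.

```python
# Illustrative buffering: the acquisition side never blocks the driver, and a
# worker (typically run in its own thread) drains the pool as fast as it can.

import queue

BUFFER_COUNT = 16                      # sized for worst-case processing latency
frames = queue.Queue(maxsize=BUFFER_COUNT)
discarded = 0

def on_frame(frame):
    """Acquisition side: store the frame, or drop and count it if the pool is full."""
    global discarded
    try:
        frames.put_nowait(frame)
    except queue.Full:
        discarded += 1                 # surfaced in logs so the design gets fixed

def worker(process):
    """Processing side: consume buffered frames one at a time."""
    while True:
        frame = frames.get()
        process(frame)
        frames.task_done()
```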
While some debate remains on how much of a threat data integrity represents to a well-designed machine vision system, standards such as Camera Link, GigE Vision, and USB3 Vision are designed with data integrity and system confidence in mind. According to Cyth’s Long, you know you’re moving in the right direction when others imitate your efforts, such as the emerging embedded vision industry and its recognition and use of traditional machine vision standards.
“This is good for the industry, but I wonder whether they have been designed for this lower-power space, and are there active participants trying to determine a low-power single frame on-demand acquisition standard,” Long asks.