Industry Insights
The Latest Advancements in Vision Standards
POSTED 12/16/2024 | By: Aaron Hand, TECH B2B, A3 Contributing Editor
In the rapidly evolving world of machine vision, standards play a pivotal role in ensuring interoperability, reliability, and performance across diverse applications. Standards establish a common framework for communication between cameras, processing units, and software, enabling seamless integration and scalability in industrial, scientific, and medical imaging systems.
A variety of interface standards help manufacturers connect cameras to their operations. Before these standards were developed, industry operations were governed more by proprietary systems, a situation that was less than ideal for users needing flexible, scalable operations.
“If the system didn’t do exactly what you were looking for, you would then have to get a new system and potentially throw your old system out,” says Bob McCurrach, director of vision and imaging standards at the Association for Advancing Automation (A3). “With these interoperability standards for the interfaces, if your needs change, you can change your software and keep using your same cameras, and vice versa. It’s a very powerful concept.”
The machine vision industry began its standardization process with Camera Link. Developed in 2000 by the Automated Imaging Association (AIA), which is now part of A3, Camera Link standardized connections between cameras and frame grabbers for machine vision and other high-performance imaging applications. It not only promoted interoperability — including data transfer, camera timing, serial communications, and real-time signaling to the camera — but also addressed a growing demand for faster and more reliable data transfer.
Later, a group of manufacturers got together to address different hardware standards — namely, using native ports on PCs rather than systems based on frame grabbers. That was the genesis of the GigE Vision standard in 2006 and USB3 Vision in 2013, both managed under A3.
Other standards were released — CoaXPress in 2010 and Camera Link HS in 2012 — to enable high-performance, high-speed image transfer. Camera Link HS (from A3) operates over copper and fiber lines; CoaXPress (hosted by the Japan Industrial Imaging Association, JIIA) traditionally operated over coaxial cables but added fiber a few years ago.
“They’re slightly competitive,” McCurrach says, “but hopefully they’ll be coming together in the future in some form or fashion.”
In a recent webinar from A3, now available on demand, McCurrach and other standards chairs provide a rundown not only of how most of these important standards got their starts but also where they’re heading next.
GenICam Provides Uniform Software Standard
Before getting into the nuances of how each of the standards works, it’s useful to understand that they all operate with the software standard GenICam (Generic Interface for Cameras), which provides a uniform programming interface regardless of the camera interface type — whether Camera Link, GigE Vision, USB3 Vision, or CoaXPress.
“Importantly, GenICam, which is used for generic camera control, supports all these interface standards,” McCurrach says. “So there’s a similar look and feel as you go from one standard to another, which helps greatly.”
GenICam, developed in 2006 by the European Machine Vision Association (EMVA), helped to standardize and simplify integration across different camera types and communication protocols. It allows software developers to control and acquire images from any compliant camera without needing to know the specific hardware details, simplifying both software development and integration for machine vision applications.
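The value of that separation can be sketched in a few lines. The classes below are hypothetical stand-ins, not the actual GenICam API: the point is simply that application code addresses features by standardized name, while a transport-specific backend maps those reads and writes onto whatever wire protocol the camera actually speaks.

```python
# Illustrative sketch of the GenICam idea (hypothetical classes, not
# the real GenApi): features are addressed by name, independent of
# the underlying camera interface.

class NodeMap:
    """A minimal stand-in for a GenICam node map: named features
    backed by whatever transport the camera actually uses."""

    def __init__(self, transport):
        self.transport = transport      # e.g. "GigE Vision", "USB3 Vision"
        self._features = {}             # feature name -> value

    def set(self, name, value):
        # In a real stack, this would serialize a register write over
        # GVCP, USB3 Vision control transfers, the CoaXPress uplink, etc.
        self._features[name] = value

    def get(self, name):
        return self._features.get(name)


def configure(cam):
    # The application code is identical for every interface standard.
    cam.set("ExposureTime", 5000.0)     # microseconds
    cam.set("Gain", 2.0)


# The same configure() works whether the camera speaks GigE Vision
# or USB3 Vision -- the "similar look and feel" McCurrach describes.
gige_cam = NodeMap("GigE Vision")
usb_cam = NodeMap("USB3 Vision")
configure(gige_cam)
configure(usb_cam)
```

In a real system, the node map is generated from an XML description supplied by the camera itself, which is how compliant software can control hardware it has never seen before.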
Camera Link HS: Pushing Forward at High Speeds
To address needs for higher-speed transfer rates, a Camera Link HS subcommittee was formed in 2010, and the standard was released in 2012. Based largely on the Teledyne DALSA HS Link machine vision interface, Camera Link HS increased bandwidth, but it also improved on the Camera Link standard by using off-the-shelf cables to extend reach.
Camera Link HS v1.2 was released in September 2022, adding an MPO connector and increasing the possible line rate to 25 Gbps. High-speed command communication and video data transfer have been important from the beginning, making it easy to communicate with the camera with a very short latency, says Martin Schwarzbauer, manager of R&D camera systems at Excelitas PCO and chair of the Camera Link HS Committee.
“A unique key feature of Camera Link HS is the possibility to use asymmetric or unbalanced cables,” he adds. “If we have four down lines for video packets, it’s not necessary to have four up lines for communication.”
This benefits camera manufacturers: because every high-speed transceiver draws power, fewer up lines means the camera needs less of it. On the computer side, it makes it possible to spread the image processing over several frame grabbers.
Another unique feature is that, for both the M and X protocols, the interface is immune to bit errors. Errors can be corrected on the fly without any impact on jitter, bandwidth, or image data, making the interface very reliable, Schwarzbauer adds.
In addition to the specification itself, the Camera Link HS subcommittee has decided to provide an IP core for camera manufacturers using field-programmable gate arrays (FPGAs). An IP core allows developers to integrate Camera Link HS functionality directly into custom hardware designs, reducing development complexity and time.
“This IP core is not a complete final solution, but it is a very good starting point for using Camera Link HS,” Schwarzbauer says, noting that it’s easy to use the IP core across several FPGA vendors and families. “We have several customers which are using them on several FPGAs without any problems. And the IP core is meanwhile well-tested on 25 G hardware, so it’s futureproofing for every high-end camera.”
Several new features are planned for future releases of the standard, according to Schwarzbauer, including better integration into the GenICam Standard Features Naming Convention (SFNC).
“In addition, we figured out that we have a lack of information on how video data is transferred if we have multiple connectors with multiple lanes, so we will introduce a self-describing method of how the image data is transmitted,” Schwarzbauer says. This will make things easier whether you want to do image processing on the FPGA or transmit the image data directly to the computer memory.
The committee is also working on achieving bit rates faster than 25 Gbps. The team plans to use the existing IP core, with as little modification as possible, to reduce development time.
GigE Vision Looks to Take Advantage of Ethernet Speeds
Developed using the Ethernet (IEEE 802.3) communication standard, GigE Vision supports multiple stream channels and uses standard Ethernet cables for fast, error-free image transfer over very long distances.
Rather than an interface, GigE Vision is a protocol, running on top of UDP/IP to ensure interoperability among vision components connected through a network, explains Eric Bourbonnais, program manager at Teledyne DALSA and chair of the GigE Vision Committee since 2017.
“If it’s Ethernet, why do we need GigE Vision? Because GigE Vision allows you to identify which device is really the machine vision component that you are looking for,” he says. “The user protocol includes the discovery mechanisms that are used to detect any vision camera on the network, and you don’t need to know in advance what the IP addresses are or what the MAC addresses are.”
The GigE Vision Control Protocol (GVCP) provides camera control and configuration, ensuring efficient communication for high-performance imaging systems. Designed to work with the GenICam GenAPI, GVCP allows developers to control cameras without needing to know the hardware specifics.
The GigE Vision Stream Protocol (GVSP) defines how images can be transferred over the network. “Over time, we have added a lot of functionality for different types of images — including some 3D data and different types of compressed images,” Bourbonnais says.
The GigE Vision certification process involves several steps to ensure that the products comply with the GigE Vision standard, including a compliance matrix and validation framework. “You also need to come to a plugfest and demonstrate publicly that you are able to integrate with some other products out there,” Bourbonnais says.
The latest version of the standard, GigE Vision 2.2, released in 2022, features a new payload type to stream GenICam GenDC containers. “This is a really flexible way of streaming the more complex data structures like 3D information,” Bourbonnais says.
In 2023, GigE Vision released the latest version of the validation framework, including new tests for GenDC payload type requirements, a new Python framework, and optional GenICam device validation tests (which will become mandatory in the next major version).
In fact, the committee is working on GigE Vision 3.0 now. The key reason the committee felt the need for an update was the considerable improvement in Ethernet speed since the protocol was first developed almost 20 years ago.
“GigE Vision can theoretically work at those speeds, but it is not optimized for this,” Bourbonnais says. “So we decided to look at how we can make GigE Vision better on higher-speed Ethernet.”
To achieve this, GigE Vision 3.0 will reduce CPU utilization for transferring data, which will increase the throughput of the camera and reduce latency. A new streaming protocol, GVRSP, is based on RDMA over Converged Ethernet version 2 (RoCEv2), which has already been defined and standardized by the group, Bourbonnais says. A key reason for choosing RoCEv2 as the basis, he adds, is that most network interface cards (NICs) with speeds of 25 Gbps and higher incorporate RoCEv2 offloading in the hardware.
The RoCEv2 benefits contribute to low CPU utilization. RoCEv2-capable NICs offload data transfer tasks from the CPU, freeing up system resources for other tasks, such as image processing. RoCEv2 is also scalable, offering physical links ranging from 10 to 400 Gbps. The NIC can also offload multiple streams simultaneously.
“It’s really a flexible solution that allows you to choose or design different systems with different configurations based on the needs of your application,” Bourbonnais says.
Perhaps more compelling are the benefits in enhanced reliability, low latency, and compatibility and interoperability, Bourbonnais notes. RoCEv2 has robust error handling capabilities, he points out, delegating error detection and recovery to dedicated hardware. This minimizes disruptions in critical applications and enhances stability.
USB3 Vision Benefits From USB Speed Improvements
USB3 Vision is a transport layer specification that builds on the widely available USB3 technology to connect imaging devices to a computer. The standard came about in 2011, when a group of companies established a working group within A3 to standardize an “on the wire” protocol using the then-new USB3 standard, which allowed a data rate of 5 Gbps with very good efficiency. At the time, there were only proprietary solutions using USB in industrial cameras.
The standard has seen several enhancements since it was released in 2013, but the basic principles remain the same since the standard relies completely on USB, says Uwe Hagmaier, vice president of product development vision at Balluff Machine Vision. He is a founding member of the USB3 Vision technical committee and took over as chair this year.
“Minimal change was needed to support newer and faster versions of the USB standard. USB3 Vision mostly relies on the bulk transport protocol of USB, which is very similar to what GigE Vision now does with RoCE, so it shares the property of having very high efficiency, zero copy, and low-latency error recovery,” he says. “USB3 Vision is fully based on USB itself, so it gets all the nice things from USB, like plug and play, power over cable, everything.”
But USB3 Vision also relies on several other standards developed by the machine vision industry, Hagmaier notes. As a lower-level transport layer specification, it uses higher-level semantics. Like other standards, it employs GenICam as a generic method to talk to cameras, making it independent from the underlying transport layer.
“Then there’s the Standard Features Naming Convention, which defines a standard camera model, meaning it tries to explain every aspect of a camera and put a definition on how to control it,” Hagmaier says. “And this is really the key of having an application which can use several different cameras from different vendors, even with different transport layers, because there is that common vocabulary on camera control.”
The Pixel Format Naming Convention (PFNC) has a similar objective but for the pixel payload, defining standard payload types.
Using GenICam and SFNC over USB3 Vision, an application can control gain, exposure time, pixel format, and image size, as well as camera sequences, digital outputs on the camera, and automatic modes. It can also trigger images either by hardware or software.
“USB as a transport layer provides high bandwidth with low CPU load on the computer at a very low cost,” Hagmaier says. “And as USB is available on many different types of computers, from small embedded systems to high-performance workstations, this gives you great flexibility in the systems you can use to connect your camera.”
USB3 Vision is currently on version 1.2, which includes GenDC support and covers high-speed USB3 and USB4 standards supporting up to 20 Gbps.
“The main focus of the USB3 Vision Technical Committee currently is on testing and verification and on improving the USB3 Vision infrastructure, such as compliance rules for cable extenders,” Hagmaier says. “A good thing about USB is that, since we rely on an established standard, we can always benefit from speed improvements. Now we can build faster devices going up to 10 or 20 Gbps because there is new silicon coming, which gives even more speed to USB3 Vision cameras.”
CoaXPress Expands on Fiber Capabilities
Hosted by JIIA, the first version of CoaXPress was released in late 2010. It provides a high-speed interface between cameras and frame grabbers in imaging applications such as machine vision, medical imaging, life sciences, broadcast, and defense. The standard can use a single coaxial cable to transmit data from a camera to a frame grabber while also transmitting control data and triggers from the frame grabber to the camera.
CoaXPress supports data transfer rates of up to 12.5 Gbps per cable. Combining multiple cables leads to an even higher bandwidth, enabling the capture and transmission of ultra-high-resolution images at high frame rates.
The standard allows long cable lengths — 40 m at 12.5 Gbps, and considerably longer distances at lower speeds. With Power over CoaXPress (PoCXP), it can deliver power to the camera through the same coaxial cable used for data transmission, simplifying system design and reducing cabling complexity.
The protocol ensures minimal latency and is highly resistant to electromagnetic interference (EMI).
The current version of the CoaXPress standard, v2.1, was released in February 2021. CoaXPress v2.0 added 10 Gbps (CXP-10) and 12.5 Gbps (CXP-12) speeds, doubling the cable bandwidth. To allow for even faster cameras, v2.0 permits a single camera to send data to more than one frame grabber, even if they are in different PCs. The uplink speed is also doubled for CXP-10 and CXP-12, enabling trigger rates over 500 kHz without requiring a dedicated high-speed uplink cable.
Support for GenICam has been enhanced with the addition of GenICam-compliant event packets, which allows the camera to signal events to the PC. CoaXPress v2.1 includes support for GenDC, which defines how image data is structured, transmitted, or received independent of its format, including complex image formats such as 3D.
The committee is planning for version 3 of CoaXPress now and is working on merging the existing CoaXPress over Fiber protocol, introduced with version 2.1, into the main standard document.
Bringing Everything Together
With all these imaging standards, it might seem that they’d be jockeying for position among suppliers and users. In fact, the standards bodies work closely with one another to avoid conflict and redundancy. G3 is a collaboration that now includes A3, EMVA, JIIA, the China Machine Vision Industry Union (CMVU), and the German Mechanical and Plant Engineering Association (VDMA) — all working together to develop and promote global standards for machine vision.
“It’s important that we communicate regularly and make sure that we’re not developing standards that overlap. Developing standards is very complicated. It’s very expensive. So, for the industry, it really makes sense for us to coordinate,” McCurrach says. “We agree to promote each other’s association standards and to not overlap on other association standards. We keep each other abreast of what’s going on, and that’s helped out greatly since 2009.”
Since 2016, G3 has jointly organized twice-yearly International Vision Standards Meetings (IVSMs), rotating locations between North America, Europe, and Asia. The most recent fall meeting took place in Switzerland, and this coming spring, the meeting will be hosted in Quebec City, Canada.
An important part of each meeting is a plugfest, in which all the manufacturers and technical committee members come together to interoperate. “It’s very important for both the development of the standard, but also as part of the qualification of vendors, that their products actually meet the standard,” McCurrach says. “We also have a future standards forum, where each of the chairs gives a brief description of what they’re working on so that the whole community is in sync.”
To learn more about the developments in machine vision standards, including Camera Link HS, GigE Vision, and USB3 Vision, watch A3’s 2024 Vision Standards Update webinar, available on demand.