Event-Based Vision Delivers Faster Decisions with Less Data
Machine vision has become pervasive in the industrial landscape. Its ability to rapidly and nondestructively extract actionable information from a scene has made it indispensable for applications ranging from inspection to process control, condition monitoring to safety. Conventional CMOS cameras are a staple of vision technology, but they can be bandwidth hogs, with particular limitations when it comes to supporting fast decisions on high-speed processes. The issue is that conventional frame-based detectors capture data from the entire scene, whether anything in it has changed or not. Depending on frame rate and resolution, that can add up to massive amounts of data but limited information. Event-based sensors present an effective alternative. Inspired by the human visual system, event-based sensors capture only changes to the scene. The result is faster information delivery with minimal data, reducing latency, bandwidth demand, storage requirements, and cost.
What is event-based sensing?
The CMOS imagers used in conventional industrial machine-vision systems consist of arrays of photosensitive pixels that convert light intensity to voltage. Each pixel is equipped with circuitry to convert that voltage to (typically digital) output. The entire array transfers its output simultaneously at a preset interval known as the frame rate. Then all of that data is processed and stored, whether anything in the scene changes from frame to frame or not.
This model works well for applications that require capture of an entire image: pictures, video, even certain industrial use cases. The drawback is that recording the entire image at every time increment oversamples all the parts of the image that have not changed. Depending on the application, this can be grossly inefficient; in some cases, 90% or more of the captured data is useless. Meanwhile, it potentially undersamples what really matters in industrial use cases: what changes in the scene. Event-based sensing addresses this issue.
Event-based sensing is a dynamic visual information acquisition paradigm that records only changes in the scene rather than recording the entire scene at regular intervals. Independent pixels in the array asynchronously detect and acquire changes in the visual scene. So, unlike frame-based cameras, the pixel array in an event-based sensor, or event camera, doesn't operate at a common frame rate. In fact, there are no frames at all. Instead, each pixel sends a signal when the incident light intensity changes by an amount that exceeds a user-set threshold. If the variation doesn't exceed the threshold (for example, in the static background of a camera viewing a packaging line), the pixel stays silent. If the incident light intensity does change (for example, an object passes through the pixel's field of view), the logic circuit for the affected pixel sends a signal to the readout periphery.
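The per-pixel thresholding described above can be sketched in a few lines of Python. This is an illustrative simulation only, not sensor firmware or Prophesee's implementation: the function name, threshold value, and event tuple layout are assumptions, and real event pixels operate asynchronously in analog circuitry rather than on frame pairs.

```python
import numpy as np

def frames_to_events(prev, curr, t_us, threshold=0.15):
    """Illustrative event generation: emit an event for each pixel whose
    log-intensity change since the previous snapshot exceeds the threshold."""
    eps = 1e-6  # avoid log(0)
    dlog = np.log(curr + eps) - np.log(prev + eps)
    ys, xs = np.nonzero(np.abs(dlog) > threshold)
    polarity = (dlog[ys, xs] > 0).astype(np.int8)  # 1 = brighter, 0 = darker
    # Each event is (x, y, polarity, timestamp_us), a common address-event layout
    return [(int(x), int(y), int(p), t_us) for x, y, p in zip(xs, ys, polarity)]

# A static background produces no events; a change at one pixel produces one.
prev = np.full((4, 4), 0.5)
curr = prev.copy()
curr[2, 1] = 0.9                      # an object enters this pixel's field of view
events = frames_to_events(prev, curr, t_us=1000)
print(events)  # [(1, 2, 1, 1000)]
```

Note how the unchanged pixels contribute nothing to the output: that silence is exactly where the bandwidth savings come from.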
Figure 1: State-of-the-art sensors for event-based vision consist of a stacked architecture with a photo pixel array flip-chip bonded to a CMOS chip containing the pixel circuit array and dedicated readout circuitry.
The resultant data doesn’t represent a static image but instead provides dynamic scene information that machine algorithms can use for fast decision-making (see figure 2).
Figure 2: Data generated by an event-based sensor (right) represents dynamic scene contents in the form of pixel-individually recorded changes. The subject’s right hand, which is in motion and blurred in the conventional image (left), is clearly delineated in the event-based data (right). Meanwhile, the static background that is captured and transmitted by the conventional image sensor (left) does not even show up in the event-based data.
Hardware is not the only thing that’s different in event-based vision. The technology requires specialized software to convert input into actionable intelligence. Fortunately, a thriving ecosystem exists to support these efforts, starting with OpenEB, an open architecture for event-based software. OpenEB is the core of Prophesee’s Metavision Intelligence software suite. It consists of five fundamental software modules, available under an open-source license, that developers can use to build applications and plug-ins.
Event-based sensors or event cameras are capable of much higher dynamic range than conventional CMOS imagers because each pixel automatically adapts to the incident light. For this reason, the detectors aren’t saturated by high-intensity lighting conditions, such as afternoon sun shining through the factory windows, the way a traditional sensor might be.
The focus on events also lets sensors operate with very high temporal resolution for near-continuous image capture versus the series of discrete frames captured by a CMOS camera (see figure 3). The result is that event-based systems can cost-effectively capture high-speed motion that would normally require expensive conventional cameras running at tens of thousands of frames per second. With this approach, movement is captured as a continuous stream of information and nothing is lost between the frames.
Figure 3: Conventional frame-based sensing (left) captures discrete frames versus an event-based sensor (right), which captures motion on a quasi-continuous basis.
This brings us to what is perhaps the most important benefit: the decrease in data transfer compared to conventional systems. That, in turn, reduces demand for computing resources and bandwidth, both typically in short supply on the factory floor. Event-based sensors also require less power to operate, making the technology ideal for remote or mobile systems. This is a particularly important attribute given the increasing interest in mobile robots, UAVs and drones, autonomous vehicles, and the host of other mobile use cases that will be enabled by the rollout of 5G.
Applications of event-based sensing
Event-based machine vision can improve overall factory throughput by bringing ultra-high-speed, real-time and affordable vision to manufacturing. The technology is an effective solution for many industrial applications that require machine vision, especially those that involve highly dynamic scenes, including surveillance, tracking, counting, equipment monitoring, robotics and more. It enables more effective quality control and maintenance to ensure efficient operations.
High-speed counting
Accurate, high-speed counting is an essential task across a range of industrial operations.
The streamlined data capture model of event-based sensing enables systems to achieve counting speeds of thousands of counts per second. Event-based algorithms can track objects smoothly and extract their geometry even at very high speeds. In addition, vision processes like counting and tracking can be realized on modest computing systems, including at the edge.
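The burst-like nature of event data makes counting logic simple: an object crossing a virtual counting line fires a short burst of events from the pixels on that line, and a quiet gap separates one object from the next. A minimal sketch of that idea (the function name and gap parameter are illustrative, not taken from the Metavision suite):

```python
def count_line_crossings(timestamps_us, gap_us=500):
    """Count objects passing a virtual counting line from the timestamps of
    events fired by pixels on that line. Events closer together than gap_us
    belong to the same object; a longer quiet period ends a burst."""
    count = 0
    last_t = None
    for t in sorted(timestamps_us):
        if last_t is None or t - last_t > gap_us:
            count += 1          # a new burst of activity = a new object
        last_t = t
    return count

# Three objects crossing at ~1 ms, ~5 ms and ~9 ms, each firing a burst of events
stream = [1000, 1020, 1050, 5000, 5030, 9000, 9010, 9040]
print(count_line_crossings(stream))  # 3
```

Because only the pixels on the counting line ever fire, this runs comfortably on a small edge processor, which is the point made above.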
Vibration monitoring/predictive maintenance
Depending on the industry, the cost of downtime can range from tens of thousands to millions of dollars per hour. Predictive maintenance uses a combination of sensors and analytics to detect equipment issues before they go critical. This enables repairs to be made at a time that doesn’t interrupt production.
Vibration monitoring is a key technique for tracking changes in asset health. Every piece of equipment has its own natural resonance. Changes to the vibration spectrum, both in amplitude and frequency, can be used to diagnose equipment flaws with extraordinary specificity, isolating issues like cracked pump blades, bearing cage defects, improperly tightened belts, and more. While a range of vibration monitoring technologies exists, most devices require costly installation and physical contact with the object under test. Event-based vision provides an easy-to-deploy, cost-effective alternative.
An event-based vision system operates by detecting changes in incident light. As a motor or gearbox vibrates, its surfaces reflect light differently over the course of each oscillation, so changes in the machine's vibration modes show up directly in the event stream. Setting up an event-based system for vibration monitoring is as simple as aiming the camera at the asset and setting the appropriate thresholds. Properly configured, the system can monitor vibration at frequencies from below 1 Hz to tens of kHz, a range broad enough to provide advance warning of common equipment issues like bearing defects and lubrication breakdown.
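One common way to turn a pixel's event stream into a vibration reading is to bin the event timestamps into a rate signal and locate the peak of its spectrum. The sketch below illustrates that idea under stated assumptions; it is not Prophesee's algorithm, and the function name, bin size, and simulated event pattern are all invented for the example.

```python
import numpy as np

def dominant_vibration_hz(event_times_us, bin_us=100):
    """Estimate the dominant vibration frequency from one pixel's event
    timestamps: bin events into a rate signal, then take the FFT peak."""
    t = np.asarray(event_times_us, dtype=np.float64)
    duration = t.max() - t.min()
    n_bins = int(duration // bin_us) + 1
    rate, _ = np.histogram(t, bins=n_bins)        # events per time bin
    rate = rate - rate.mean()                     # remove the DC component
    spectrum = np.abs(np.fft.rfft(rate))
    freqs = np.fft.rfftfreq(n_bins, d=bin_us * 1e-6)  # bin width in seconds
    return freqs[np.argmax(spectrum)]

# Simulate a pixel watching a surface vibrating at 120 Hz: the event rate
# rises and falls once per oscillation over one second of observation.
f, bin_us = 120.0, 100
t_bins = np.arange(0, 1_000_000, bin_us)
n_ev = np.rint(2 + 2 * np.sin(2 * np.pi * f * t_bins * 1e-6)).astype(int)
events = np.repeat(t_bins, n_ev)                  # event timestamps in microseconds
print(round(dominant_vibration_hz(events)))  # 120
```

With 100 µs bins the sketch resolves frequencies up to 5 kHz; shrinking the bin, as the microsecond-scale timestamps allow, extends the reach toward the tens-of-kHz range mentioned above.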
This type of monitoring can also be applied to manufacturing process control by analyzing equipment process deviations through kinetic or vibration monitoring.
Changing how machines see and the value we get from them
Event-based vision can be integrated as part of a modern machine vision system and customized for specific uses and applications. It provides a new level of efficiency in how machines can discover important information from almost any process. Indeed, in many instances it can reveal to machines what was previously invisible to them. It has truly breakthrough potential to bring more value to critical tasks in manufacturing, packaging, logistics, surveillance and more.
Prophesee offers event-based sensors in board-level and packaged form factors. We also provide a comprehensive development environment that includes 95 algorithms, 67 code samples and 11 ready-to-use applications, from vibration monitoring to machine learning capabilities, that can be downloaded and used to tailor systems for specific requirements.
Prophesee is the inventor of the world’s most advanced neuromorphic vision systems.
The company developed a breakthrough Event-Based Vision approach to machine vision. This new vision category allows for significant reductions of power, latency and data processing requirements to reveal what was invisible to traditional frame-based sensors until now. Prophesee’s patented Metavision® sensors and algorithms mimic how the human eye and brain work to dramatically improve efficiency in areas such as autonomous vehicles, industrial automation, IoT, mobile and AR/VR. Prophesee is based in Paris, with local offices in Grenoble, Shanghai, Tokyo and Silicon Valley.
The company is driven by a team of more than 100 visionary engineers, holds more than 50 international patents and is backed by leading international equity and corporate investors including 360 Capital Partners, European Investment Bank, iBionext, Inno-Chip, Intel Capital, Renault Group, Robert Bosch Venture Capital, Sinovation, Supernova Invest, Xiaomi.
Learn more: www.prophesee.ai
US-EU Agency: Mike Sottak, firstname.lastname@example.org +1 650248-9597
Prophesee Global: Guillaume Butin, Marketing Communication Director: email@example.com