Industry Insights
Embedded Vision Systems Gain Traction in Multiple Industries
POSTED 04/13/2016 | By: Winn Hardin, Contributing Editor
A pile of gravel at a road construction site might be one of the last places you’d expect to find embedded vision technology at work.
But commercial drone maker Kespry uses the craft to help aggregate companies keep track of their expensive rock piles. Kespry does this by flying a drone around a gravel pile and using computer vision to assess the pile’s size. Cloud-based software analyzes the data, produces a volume estimate accurate to within 1%, and sends the details to the field technician’s portable base station device.
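The article doesn’t describe Kespry’s pipeline, but the core volume calculation is easy to sketch. Assuming the drone’s imagery has already been turned into a regular elevation grid (a common photogrammetry output), the pile’s volume is simply the height above ground integrated over the grid cells; the function and toy grid below are hypothetical:

```c
/* Minimal sketch of stockpile volume estimation from a height map.
 * Assumptions (not from the article): the drone's photogrammetry has
 * already produced a regular grid of surface elevations plus a base
 * (ground) elevation; the cell size is in meters. */
#include <stdio.h>

/* Integrate (surface - ground) over every grid cell. */
double pile_volume(const double *surface, const double *ground,
                   int rows, int cols, double cell_m)
{
    double volume = 0.0;
    double cell_area = cell_m * cell_m;          /* m^2 per cell */
    for (int i = 0; i < rows * cols; ++i) {
        double h = surface[i] - ground[i];       /* pile height, m */
        if (h > 0.0)
            volume += h * cell_area;             /* m^3 */
    }
    return volume;
}

int main(void)
{
    /* Toy 2x2 grid with 1 m cells: pile stands 2 m and 3 m tall. */
    double surface[] = { 2.0, 3.0, 0.0, 0.0 };
    double ground[]  = { 0.0, 0.0, 0.0, 0.0 };
    printf("volume: %.1f m^3\n", pile_volume(surface, ground, 2, 2, 1.0));
    return 0;
}
```

In practice the accuracy Kespry claims would come from the quality of the photogrammetric reconstruction feeding a calculation like this, not from the integration step itself.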
In around an hour, the drone finished a job that once took most of a day, and with more accurate results. Here’s why it matters: the vision-enabled drone, the tablet-sized base station that directed it, and the cloud-based analysis represent three key elements of embedded systems: vision, portability, and computational horsepower. Perhaps the only element missing was a wearable device that would let the operator see what the drone was seeing, says Jeff Bier, founder of the Embedded Vision Alliance (Walnut Creek, California).
For Bier, embedded vision is “any practical, deployable vision implementation.” That can range from a custom solution with a use-specific chip to a cloud-based solution. In the case of the gravel pile, the worker likely neither knew nor cared whether the data analysis was done in the drone, on the base station, or in the cloud, Bier says.
Embedded vision is being incorporated into a myriad of applications. Even so, a handful of industrial sectors are receiving most of the attention, largely due to economies of scale. Top sectors include automotive, medical, and retail. Taken together, they spotlight three trends in embedded vision systems today: developers are working to drive out cost, shrink device size, and offer enhanced flexibility.

In the automotive sector, most automakers offer models with interior as well as exterior vision sensors. Increasingly, vehicles are becoming connected to their environment, enabling them to adapt to road conditions, avoid obstacles, and communicate with other vehicles.
In automotive, “fast computing with low energy consumption” is important, says Ingo Lewerendt, Strategic Business Development Manager at Basler (Ahrensburg, Germany). After all, no one wants to sacrifice battery power to run the sensors and vision systems that a fully autonomous car will feature. But custom solutions seem almost inevitable as automakers offer up their own branded cabin configurations of entertainment and information systems.
“I would be amazed if all car manufacturers have the same internal interfaces,” Lewerendt says. That doesn’t negate the need for industry standards, however. Two important standards trace their roots to other technology. First, MIPI interface standards were initially developed for smartphone cameras and displays. They are now being used in the automotive sector to standardize camera and display connections, Bier says. Second, Khronos OpenVX software enables performance- and power-optimized computer vision processing in advanced driver assistance systems (ADAS) and other systems.
The bane of electronics in automotive applications is the difficulty of getting rid of the heat they generate. That difficulty often limits how much power a piece of electronics can consume. To cope, it’s common for developers to use two or more processing engines and rely on OpenVX to help manage the programming complexity, Bier says.
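To make the multi-engine point concrete, here is a minimal OpenVX graph in C. The filter choices, image sizes, and two-stage pipeline are illustrative assumptions; what matters is the graph structure, which is what lets a vendor’s runtime place each node on whichever processing engine suits it:

```c
/* Minimal sketch of an OpenVX processing graph (Khronos OpenVX 1.x C API).
 * The specific filters and 640x480 size are arbitrary choices; the graph
 * form is what allows the runtime to split work across engines. */
#include <VX/vx.h>
#include <stdio.h>

int main(void)
{
    vx_context ctx = vxCreateContext();
    vx_graph graph = vxCreateGraph(ctx);

    /* Input frame and final output owned by the application. */
    vx_image in  = vxCreateImage(ctx, 640, 480, VX_DF_IMAGE_U8);
    vx_image out = vxCreateImage(ctx, 640, 480, VX_DF_IMAGE_U8);

    /* Virtual image: intermediate data the runtime may keep on-chip. */
    vx_image tmp = vxCreateVirtualImage(graph, 640, 480, VX_DF_IMAGE_U8);

    /* Two filter stages; the runtime decides where each node runs. */
    vxGaussian3x3Node(graph, in, tmp);
    vxMedian3x3Node(graph, tmp, out);

    /* Verification is where the implementation optimizes and places work. */
    if (vxVerifyGraph(graph) == VX_SUCCESS)
        vxProcessGraph(graph);
    else
        printf("graph verification failed\n");

    vxReleaseContext(&ctx);  /* also releases the graph and images */
    return 0;
}
```

Because the application describes *what* to compute as a graph rather than *how*, a heat-constrained automotive platform can map one node to a DSP and another to a GPU without any change to this source.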
Embedded vision in life sciences has evolved considerably from five years ago, when most demand was for barcode reading, according to Christoph Wimmer, Global Business Development Manager for Microscan (Renton, Washington). “Machine vision is now an integral part of the field, but smaller devices with modular components are critical to fit into smaller and smaller spaces,” he says. Additionally, this approach “helps with component certification, such as those governed in the U.S. by FDA requirements.”
Equally important are efforts to drive down costs and make equipment more flexible and portable for medical applications, says Basler’s Lewerendt. For him, the progression in embedded vision technology is from desk-mounted equipment to mobile equipment to smartphone applications to wearables, each step smaller and more portable. “We are looking for solutions that are smaller, easier to share, and that move rather than stay fixed and installed,” he says. “Everything that provides the market with price, portability, and battery improvements is a step in the right direction.”
Embedded vision systems also are being tailored for retail settings, where they assist with everything from staffing requirements to product placement. Prism Skylabs (San Francisco, California), for example, uses surveillance video as the basis for customer analytics tools. Rather than looking for shoplifters, the application seeks to identify shopping patterns: How long do customers linger at a display? What times of day are the busiest? What products draw the most attention?

Vision systems can be put to work outside of Main Street shops, too. Pole-mounted cameras could identify vacant parking spaces and relay their availability to connected vehicles. The same cameras could determine when traffic is congested and adjust nearby traffic lights to better manage the flow. With such applications, one question for system designers, says Bier, is how much intelligence should be local and how much should be based in the cloud.
“It’s easy to say the cloud, but you need a lot of wireless capacity” to handle a continuous stream of data and video, he says. Systems that are embedded in a light pole or adjacent to a retail counter, by contrast, may offer only limited analytics and relatively terse messaging about conditions.
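A rough calculation shows why the tradeoff bites. The figures below are illustrative assumptions, not numbers from the article: raw 1080p video streamed to the cloud versus a hypothetical 16-byte parking-status message sent once per second from the light pole:

```c
/* Back-of-envelope sketch of the edge-vs-cloud bandwidth tradeoff.
 * All numbers are illustrative assumptions, not from the article. */
#include <stdio.h>

int main(void)
{
    /* Streaming uncompressed video: 1080p, 8-bit RGB, 30 fps. */
    double raw_bps = 1920.0 * 1080 * 3 * 8 * 30;   /* ~1.5 Gbit/s */

    /* Terse edge message, e.g. "space 12 is vacant", once a second. */
    double msg_bps = 16.0 * 8;                      /* 16-byte message */

    printf("raw video:   %.2f Mbit/s\n", raw_bps / 1e6);
    printf("edge events: %.4f Mbit/s\n", msg_bps / 1e6);
    printf("ratio:       %.0fx\n", raw_bps / msg_bps);
    return 0;
}
```

Even with heavy video compression, the gap spans several orders of magnitude, which is why a pole-mounted system that analyzes locally and reports only events can live on a modest wireless link.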
“Finding the right tradeoff is important,” Bier says, adding that from a user’s point of view it ultimately may not matter where the computing power is located. The question for embedded vision system developers, according to Bier, is “how can this technology be used to help people and organizations?”