New Vision System-on-Chips Could Be End of the Beginning for Machine Vision
By Winn Hardin, Contributing Editor
The tail doesn’t wag the dog.
That’s what engineers have said for the past three decades when it comes to machine vision’s unrequited love for the microchip industry.
Vision engineers would quickly acknowledge that, thanks to Moore’s Law and a stream of ever-faster microprocessors and cheaper memory, machine vision has grown from an enabling technology to a major, albeit discreet, contributor to every new cell phone model and gaming console.
In other words, machine vision loves Silicon Valley. But Silicon Valley doesn’t really know the industry exists.
But if Jeff Bier, founder of the Embedded Vision Alliance, is right, the days of machine vision being the red-headed fifth wheel are over. “Ten years ago, the SUV that won the DARPA autonomous vehicle challenge was packed full of racks of electronic equipment doing simultaneous localization and mapping [SLAM], monitoring the 3D space and the vehicle’s orientation in that space,” says Bier, who also is president of embedded processing consultancy BDTI (Walnut Creek, California). “This year, Dyson, the British company that makes autonomous vacuum cleaners, does all that with a single board that fits in a robot the size of a pasta pot. Moore’s Law has made this possible. It’s not like you wake up one day and there are two suns in the sky. It’s happened over 10 years, so we don’t notice that the world is going through what truly is a paradigm shift, a revolutionary change.
“If we could install nuclear reactors in each of our homes and never have to pay for electricity again, that would be something,” Bier continues. “But the spread of machine vision will actually touch more parts of our lives than a personal reactor would. We’re seeing a proliferation of vision into automobiles, retail stores, consumer electronics — everywhere.”
The uptake of machine vision technology by consumer markets with massive R&D budgets has caused a discontinuity in the pace of machine vision development overall. Just a decade ago, only the military had budgets deep enough to develop chips dedicated to machine vision and imaging systems. Today, thanks to cell phones, autonomous robots at Amazon, and gaming devices like Microsoft’s Kinect sensor, consumer companies are investing in system-on-chip (SoC), embedded, and highly optimized imaging systems.
“Industrial markets, like machine vision, are conservative and slow to develop new technologies,” adds Bier. “But even industrial machine vision companies are taking advantage of low-power microprocessors developed for consumer electronics to make their smart cameras both more powerful and cheaper.”
According to Ingo Lewerendt, strategic business development manager at Basler AG (Ahrensburg, Germany), the larger trend of making imaging systems that consume less power and are smaller, smarter, mobile — even wearable — is good news for the industrial vision market.
“A stationary application in a factory can pay for a €10,000 machine vision system, but to break into new markets like retail or even wearable systems in the medical industry, for example, the vision system can’t cost more than €500,” Lewerendt says. “New markets want machine vision without the PC, the GPU, the hard drive. They want the system reduced to the minimum, like Google Glass. It’s so small, you clip it on your glasses and don’t even know you’re wearing it. Between a traditional machine vision system and the Internet of Things is everything in between. That’s why some early adopters in machine vision have started looking at non-industrial applications.”
Vision-on-Chip Meets the Cloud
Even though vision SoCs are solving many of the cost challenges of an expanding machine vision marketplace, technology and price aren’t the only hurdles the machine vision industry will have to overcome to capitalize on these new markets.
“Some applications could easily be solved with half the image performance, but that runs against the traditional machine vision point of view,” says Basler’s Lewerendt. “It’s a psychological challenge for a machine vision company to say, ‘Let’s build a camera with a fraction of the image capability and sell it at €100 instead of thousands of euros.’ And then the machine vision company has to figure out how to support market demand of hundreds of thousands, even millions, of units instead of just thousands.”
Just as ultra-low-power microprocessors have enabled low-cost vision sensors and smarter “smart cameras,” both Lewerendt and Bier expect that SoC technologies will open up even more markets that have traditionally eschewed machine vision in favor of manual or lower-cost solutions.
“Instead of putting cameras above a robot workcell on a truss system or a camera at the end of the arm connected to a PC, you can put the entire vision system — including camera and processing — at the end of the robotic arm and interface directly to the robot,” says Bier.
SoC vision systems also will help emerging applications, such as autonomous robots used in automated warehouses and smart agriculture, which VisionOnline recently covered in its article on hyperspectral imaging.
“Smart agriculture isn’t a 100-million-unit-a-year market, but it’s sizeable,” adds Bier. “And these are big, expensive combines, so there’s a good potential market there.”
The cloud also could be a disruptive force for the machine vision market, according to Bier, who points to analytics provider Sight Machine, Inc. (San Francisco, California). “They’re trying to disrupt factory machine vision by sending all the video up to the cloud,” Bier relates. “You can ride the processing-power-per-dollar curve in a way you can’t with an installation you put on the factory floor once and use for 10 years.”
Bier acknowledges that while safety systems wouldn’t be able to use cloud processing, slower production lines, such as sheet metal part fabrication and inspection, could. Cloud processing also allows customers to store every bit of video generated on the floor for later recall and analysis.
“These are a couple of examples, but there are literally thousands of new applications out there,” Bier says. “In the next few years, with the aggressive development going on in this space, I expect we’re going to see a lot more market disruption.”