
Enterprise Applications for Augmented/Virtual Reality Offer Machine Vision Real-World Opportunities

POSTED 12/05/2019 | By: Dan McCarthy, Contributing Editor

Vision-empowered AR/VR tools are increasingly helping to streamline the product development process by replacing conventional scale models and prototypes with virtual representations of end products. Ford Motor Company, for example, announced this year that it is working with a 3D virtual reality headset and controller tool to develop 3D models in hours instead of weeks. Image courtesy of Ford Motor Company.

In principle, if not in practice, machine vision is not that different from augmented reality (AR) or its counterpart, virtual reality (VR). All are methods to peer through an algorithmic lens to obtain an enhanced version of reality for a specific purpose. Another similarity that vision technology shares with AR/VR is an overall positive outlook for growth that draws, in part, from overlapping enterprise applications in manufacturing and industry.

According to Allied Market Research, the global AR/VR market is on track to expand from $11.32 billion in 2017 to $571.42 billion by 2025—a compound annual growth rate (CAGR) of 63.3 percent. That spells opportunity for embedded vision, which powers many of the eye-tracking and environment-mapping functions that enable AR/VR equipment.
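Those endpoints are consistent with the quoted growth rate: compounding $11.32 billion forward over the eight years from 2017 to 2025 at 63.3 percent annually lands on roughly $572 billion. A quick back-of-the-envelope check (a minimal sketch; the function and variable names are ours, not Allied's):

    def cagr(start_value: float, end_value: float, years: int) -> float:
        """Compound annual growth rate implied by two endpoint values."""
        return (end_value / start_value) ** (1 / years) - 1

    # Allied Market Research endpoints for the global AR/VR market ($B).
    start, end = 11.32, 571.42
    rate = cagr(start, end, years=2025 - 2017)
    print(f"Implied CAGR: {rate:.1%}")                # ~63.2%, matching the cited 63.3%
    print(f"2025 value: ${start * 1.633 ** 8:.0f}B")  # ~$572B at exactly 63.3%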

Consumer devices, including smartphones and gaming and entertainment headsets, represent the largest market segments for AR/VR products. However, these devices offer limited opportunity for traditional machine vision companies' offerings, as consumer market economics favor highly integrated, near-commodity vision components with economical price points driven by high-volume production.

Traditional machine vision companies can take heart from enterprise applications, which are expected to generate upwards of 40 percent of Allied's projected AR/VR market and deliver a CAGR of 70 percent through 2025. These applications will likely demand more specialized vision technology and software to support rigorous work in design and visualization, simulated training, quality assurance, and field maintenance.

Reality at Work

In product design, for example, AR/VR tools are streamlining the development process by replacing conventional scale models and prototypes with virtual representations of end products. Ford Motor Company announced this year that it is working with Gravity Sketch — a 3D virtual reality headset and controller tool — to develop 3D models in hours instead of weeks. AR is enabling a similarly immersive approach to design at the Volvo Group, where AR headsets superimpose virtual 3D data and quality-assurance details directly onto the automotive engines traversing its production line. The headsets are powered by PTC’s Vuforia software, which enables the tracking and anchoring of digital content on physical objects viewed through AR-compatible headsets, as well as tablets or mobile devices.

Image fidelity, the hallmark of industrial machine vision components and technology, is important in these and other AR/VR applications as well, notes Doug Kreysar, Chief Solutions Officer for Radiant Vision Systems. “The more information provided by machine vision equipment like sensors, cameras, and other detectors, the better one-to-one match developers can make between digital and real-world environments, user behaviors, and digital response, thereby improving the output of the AR/VR headset and sensory experience of the user.”

Vision-based AR/VR tools are also playing a role in bridging the skills gap. Embedded cameras in AR headsets and handheld tablets increasingly enable seasoned engineers to document work processes from their own point of view, or to overlay instructions onto objects in a trainee's view to guide them through maintenance or repairs.

BAE Systems took the latter approach to accelerate maintenance operations by 30–40 percent. The company used Microsoft HoloLens 2 headsets with built-in support from Vuforia to train new employees, according to David Immerman, Business Analyst at PTC. The AR implementation also allowed BAE to deliver instructions to its workers in hours at one-tenth of the cost of previous methods and cut assembly time in half.

AR’s ability to display data over a live camera feed is also enabling value-added services from capital equipment suppliers, who can now provide remote, real-time troubleshooting support to customers. Howden, for example, empowers customers viewing its equipment through Microsoft HoloLens headsets (again powered by PTC’s Vuforia solution) to maintain process-critical industrial products by providing interactive, intuitive access to 160 years of compressor knowledge in real time.

Vision for Viewers

As referenced earlier, embedded vision is instrumental in eye-tracking technology because it allows AR/VR headsets to track in real time where a user is looking. (We covered eye-tracking in more detail earlier this year.) Eye-tracking is not only critical to accurately overlaying display data on real-world objects in AR headsets; it also helps optimize the display resolution and power consumption of AR/VR gear alike by enabling dynamic foveated rendering, which selectively sharpens the display where the eye is focused while reducing resolution in the periphery.
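To make that mechanism concrete, here is a minimal sketch of how a renderer might map gaze position to per-tile resolution. The tile coordinates, angular thresholds, and resolution fractions below are illustrative assumptions, not any vendor's implementation:

    import math

    def shading_fraction(tile_center_deg, gaze_deg, fovea_deg=5.0, mid_deg=20.0):
        """Choose a render-resolution fraction for a display tile from its
        angular distance (degrees of visual angle) to the tracked gaze point.
        Thresholds and fractions are illustrative placeholders."""
        eccentricity = math.hypot(tile_center_deg[0] - gaze_deg[0],
                                  tile_center_deg[1] - gaze_deg[1])
        if eccentricity <= fovea_deg:
            return 1.0   # full resolution where the eye is focused
        if eccentricity <= mid_deg:
            return 0.5   # half resolution in the near periphery
        return 0.25      # quarter resolution in the far periphery

    # The eye tracker reports gaze 10 degrees right of display center.
    gaze = (10.0, 0.0)
    for tile in [(10.0, 0.0), (0.0, 0.0), (-30.0, 10.0)]:
        print(tile, shading_fraction(tile, gaze))

Because the sharp region follows the eye frame to frame, the headset spends GPU cycles and display bandwidth only where the viewer can actually perceive the detail.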

Everyone who puts on an AR/VR headset is different in terms of eye and pupil position, shape, distance, and general visual acuity. Some developers, such as Canada-based North, resolve this by providing a custom-fit solution that features a limited eyebox—or viewing window—tailored to the user. But most headset developers targeting volume markets instead rely on highly sensitive eye-tracking sensors that support a large field of view (FOV) to ensure proper performance of the digital projection for a wide variety of users, said Radiant’s Kreysar.

Case in point: Microsoft’s HoloLens 2, released earlier this year, more than doubled the field of view of its first iteration, which had offered a 34-degree diagonal FOV. “This makes sense,” Kreysar added. “Humans have a pretty large FOV — about 120 degrees horizontal — and if we’re wearing an AR headset, we ideally want the digital projection to map to anything we can see. At Radiant, we have measured a range of display FOVs using photometric imaging systems and wide-FOV optics, covering up to the full 120-degree horizontal FOV.”

Capturing Reality

VR headsets, in general, need only track a user’s orientation within the virtual world they create. One notable exception is Microsoft’s DreamWalker VR system, which immerses the viewer in a reconstructed virtual model of the real-world space through which the user actually walks. DreamWalker fuses several positioning technologies to achieve this, including GPS and inside-out tracking, which relies on embedded cameras and sensors to determine the user’s position and orientation relative to the surrounding environment.

The DreamWalker notwithstanding, AR equipment poses the greater challenge for vision technology. AR gear must map real-world environments in real time, relative to the orientation of the user’s gaze, so it can accurately superimpose relevant data over the user’s view. Headset developers must therefore consider carefully where and how their product will be used to correctly specify the resolution, sensitivity, and depth of field of embedded cameras. In most cases, AR uses cameras that operate in the visible spectrum.
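The overlay step itself reduces to classic camera geometry: once the headset knows its pose, anchoring a label to a real-world point means projecting that 3D point through the camera model onto the display. A minimal pinhole-model sketch (the intrinsics, pose, and anchor point are invented for illustration):

    import numpy as np

    # Hypothetical intrinsics: 800 px focal length, principal point at (640, 360).
    K = np.array([[800.0,   0.0, 640.0],
                  [  0.0, 800.0, 360.0],
                  [  0.0,   0.0,   1.0]])

    def project(point_world, R, t):
        """Project a 3D world point to pixel coordinates for a pinhole camera
        with world-to-camera rotation R and translation t."""
        p_cam = R @ point_world + t
        if p_cam[2] <= 0:
            return None              # point is behind the camera
        uvw = K @ p_cam
        return uvw[:2] / uvw[2]      # perspective divide -> (u, v) in pixels

    # Identity pose: headset at the origin, looking down +Z.
    R, t = np.eye(3), np.zeros(3)
    anchor = np.array([0.2, -0.1, 2.0])  # a point 2 m in front of the user
    print(project(anchor, R, t))         # pixel where the overlay is drawn

In a real headset, the pose (R, t) is refreshed every frame by the environment-mapping pipeline, which is why tracking accuracy translates directly into overlay accuracy.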

However, AR systems using infrared (IR) imagers or light detection and ranging (LiDAR) sensors are finding a home in automotive applications that forgo the headset and instead project data onto head-up displays, giving drivers better situational awareness in darkness, fog, or rain. Interest in autonomous vehicles is providing further traction for LiDAR. There are tradeoffs with regard to which wavelengths of near-infrared light are optimal for machine vision sensing in these applications, said Kreysar.

Currently, state-of-the-art automotive LiDAR systems operate at either 905 or 1550 nanometers (nm). Both are effective, though 1550 nm light is strongly absorbed by water, so 1550-nm systems have exhibited performance degradation in rain, fog, and snow. By contrast, LiDAR systems based on the 905 nm wavelength require less power to operate in inclement weather and can leverage widely available, lower-cost silicon CMOS sensor technology.
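The water penalty can be framed with the Beer-Lambert law: two-way atmospheric attenuation scales a LiDAR return by exp(-2*alpha*R) for range R and extinction coefficient alpha, and alpha climbs faster in wet conditions at 1550 nm than at 905 nm. A minimal sketch; the coefficient values below are illustrative placeholders, not measured data:

    import math

    def return_fraction(alpha_per_km, range_km):
        """Fraction of an emitted LiDAR pulse surviving the round trip under
        the Beer-Lambert law (geometric and target losses ignored)."""
        return math.exp(-2.0 * alpha_per_km * range_km)

    # Illustrative extinction coefficients (1/km) in wet weather -- placeholder
    # values chosen only to show the relative trend between the two bands.
    alpha = {"905 nm": 0.6, "1550 nm": 1.8}

    for band, a in alpha.items():
        print(band, f"{return_fraction(a, range_km=0.2):.2f}")
    # At 200 m the 1550 nm return is attenuated noticeably more, so that
    # system must emit more power to see through the same weather.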

While this suggests that 905 nm–based CMOS systems will win out in automotive applications, the future is one reality that neither AR nor VR technology can disclose. The same applies to predicting how much momentum vision technologies will gain from AR/VR headsets. What is certain, however, is that opportunities abound for vision suppliers able to provide AR/VR gear with high image fidelity, high sensitivity, and low power consumption.