Time of Flight (ToF) Sensors Bring Autonomous Applications to Market
By: Winn Hardin, TechB2B, Contributing Editor
Time of flight (ToF) cameras saw their first wide commercial deployment a decade ago, when Sony, Microsoft, and Panasonic released the technology for gesture-based gaming controls (Microsoft Kinect), cell phones, and other consumer electronics. Early sensors offered low 2D, angular, and depth resolution, and they were slow: excellent for consumer electronics but not so great for safety-critical industrial applications.
Today, a few machine vision companies not only have the ability to drive ToF sensor development but, with Sony’s acquisition of SoftKinetic in 2015, followed by Teledyne’s purchase of e2v and IFM Efector’s ownership of PMD Technologies, they also own the fabs that make ToF sensor chips. The result is more commercial options and overall performance that closely matches the needs of logistics, autonomous robotics, and related applications.
What Is ToF?
ToF systems include an illuminating light source, normally laser diodes or LED-based structured light sources, and a ToF sensor. The light source sends out a pulse; each pixel measures the time the light takes to return and determines the distance to the object from the length of the round trip. A variant called indirect time of flight (iToF) offers a simpler, more cost-effective electronics design for the sensor, which measures phase changes instead of time. iToF distance accuracy declines as the distance to the object grows. While traditional ToF cameras can easily work out to 30 or 40 meters, iToF cameras typically are best suited for mid-range applications, in the 5- to 7-meter range, excellent for autonomous mobile robots (AMRs) and autonomous ground vehicles (AGVs) such as automated forklifts.
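The phase-based iToF principle above can be sketched numerically. This is an illustrative example, not vendor code: the function names and the 20 MHz modulation frequency are assumptions, chosen only because they produce an unambiguous range close to the mid-range figures quoted in the text.

```python
# Illustrative sketch: recovering distance from the phase shift
# measured by an indirect time-of-flight (iToF) pixel.
import math

C = 299_792_458.0  # speed of light, m/s


def itof_distance(phase_shift_rad: float, mod_freq_hz: float) -> float:
    """One-way distance from the phase shift of modulated light.

    The light travels a round trip, so d = c * phi / (4 * pi * f_mod).
    """
    return C * phase_shift_rad / (4 * math.pi * mod_freq_hz)


def ambiguity_range(mod_freq_hz: float) -> float:
    """Maximum unambiguous distance: the phase wraps every 2*pi,
    i.e. every c / (2 * f_mod) meters."""
    return C / (2 * mod_freq_hz)


# An assumed 20 MHz modulation frequency gives ~7.5 m of unambiguous
# range, consistent with the mid-range sweet spot described above.
print(round(ambiguity_range(20e6), 2))         # 7.49
print(round(itof_distance(math.pi, 20e6), 2))  # 3.75
```

Raising the modulation frequency improves precision but shortens the unambiguous range, which is one reason phase-based iToF suits mid-range work while pulsed ToF reaches farther.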
“Some applications, like advanced logistics, industrial safety, or mapping, are looking for higher-resolution sensors — current resolution in the market is typically VGA or lower,” explains Yoann Lochardet, marketing manager at Teledyne. “Other applications, such as industrial AGV, [intelligent transportation systems], and automotive, are looking for sensors able to capture fast-moving objects, therefore requiring no motion artifacts and often high frame rates. Furthermore, most of the previous applications (and others, like drones) are also looking for high dynamic range sensors to increase distance and/or the reflectivity range of the objects.”
Lochardet shares that Teledyne e2v’s most advanced ToF sensor is the Hydra3D, which has a resolution of 832 x 600 pixels (well above standard VGA), a maximum frame rate of 416.7 fps at 12 bits, a minimum integration time of 20 ns, and three-tap pixels that minimize motion artifacts for moving objects. Because frame rates for 3D ToF applications can vary considerably with application requirements, the Hydra3D gives the user the flexibility to trade off key parameters (resolution, distance range, precision, 3D frame rate) depending on distance range, object reflectivity, and so on. The Hydra3D also offers high dynamic range for outdoor applications, where sunlight parasitics can reduce range and resolution, allowing both low- and high-reflectivity objects to be seen regardless of distance. Finally, the Hydra3D incorporates an on-chip function that allows multiple, unconnected ToF systems to work in the same area without mutual interference, a major differentiator in autonomous systems, which typically use multiple cameras to sense large operational spaces.
Matching ToF Sensors to Application Needs
While ToF sensor development was initially driven by consumer electronics, Martin Gramatke, Basler’s product manager for 3D image acquisition, explains that the technology has matured to the point where machine vision can also make use of it. ToF 3D measurement accuracy, often specified in millimeters or as a percentage of the range, may not suffice for every industrial application, but for many robotic picking tasks it is good enough for hand-sized objects, particularly when placement does not require great precision. That makes ToF technology well suited to robotic pick-and-drop applications and less so to pick-and-place. Gramatke says that with ToF sensor acquisition times in the millisecond range, cycle times of 500 ms are easily achievable using ToF 3D sensor technology.
“Logistics applications typically have lower accuracy requirements,” said Gramatke. “A common application is dimensioning of freight objects like parcels, boxes, bags, suitcases, and pallets to optimize storage and transport space. Calculating the bounding box with an accuracy of about 2 centimeters is good enough for many logistics applications.”
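A minimal sketch of the dimensioning task Gramatke describes, assuming the ToF point cloud has already been segmented to the object: axis-aligned extents are taken from robust percentiles and checked against the 2-centimeter budget. The function name, noise level, and uniform sampling of the box are illustrative assumptions, not any vendor's pipeline.

```python
# Illustrative sketch: estimating a parcel's bounding-box dimensions
# from a segmented ToF point cloud.
import numpy as np


def bounding_box_dims(points: np.ndarray) -> np.ndarray:
    """points: (N, 3) x/y/z samples in meters, already segmented to the
    object. Percentiles instead of min/max keep depth-noise outliers
    from inflating the box."""
    lo = np.percentile(points, 0.5, axis=0)
    hi = np.percentile(points, 99.5, axis=0)
    return hi - lo


# Simulate a 0.40 x 0.30 x 0.20 m box sampled with 3 mm sensor noise.
rng = np.random.default_rng(0)
true_dims = np.array([0.40, 0.30, 0.20])
pts = rng.uniform(0.0, 1.0, size=(5000, 3)) * true_dims
pts += rng.normal(0.0, 0.003, size=pts.shape)  # depth noise

est = bounding_box_dims(pts)
assert np.all(np.abs(est - true_dims) < 0.02)  # within the 2 cm budget
```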
He continues, “Often these objects are moving while the measurement takes place. Therefore the complete acquisition time matters, which is more than just one exposure time. In the case of the Basler blaze [ToF camera], it requires at least four frames, resulting in a relatively short acquisition cycle of 15 ms.” Gramatke adds that given ToF’s cycle time, hardwired encoder triggers tend to work better than software triggers, which might capture only part of the box to be dimensioned if the target is moving.
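The four raw frames Gramatke mentions correspond to the standard four-phase iToF demodulation scheme: four correlation samples at 0, 90, 180, and 270 degrees recover the phase, and hence the distance. The sketch below shows the textbook algorithm, not necessarily the blaze's exact pipeline; the 20 MHz modulation frequency and idealized noise-free samples are assumptions.

```python
# Sketch of textbook four-phase iToF demodulation: one depth frame
# requires four raw correlation frames, one per phase offset.
import math

C = 299_792_458.0  # speed of light, m/s


def sample(phase: float, theta: float) -> float:
    """Idealized correlation sample at demodulation offset theta."""
    return 0.5 * math.cos(phase - theta)


f_mod = 20e6                                   # assumed modulation frequency
true_d = 2.5                                   # meters
true_phase = 4 * math.pi * f_mod * true_d / C  # round-trip phase shift

# Four raw frames at 0, 90, 180, 270 degrees.
a = [sample(true_phase, i * math.pi / 2) for i in range(4)]

# Recover phase, then distance, from the four samples.
phase = math.atan2(a[1] - a[3], a[0] - a[2]) % (2 * math.pi)
depth = C * phase / (4 * math.pi * f_mod)
print(round(depth, 3))  # 2.5
```

Because all four frames must see the same scene, any target motion between them shows up as artifacts, which is why total acquisition time, not just a single exposure, is what matters for moving objects.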
For bigger items, such as pallets, end users need more than one ToF camera to measure dimensions from all sides across a large working envelope. ToF cameras can interfere with one another, with one sensor mistaking another sensor’s illumination pulse for its own. While Teledyne’s Hydra3D solves this problem at the sensor level, Basler’s blaze lets the user set a different modulation frequency on each camera to reduce interference. Two cameras running at different frequencies interfere significantly less.
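Why separating modulation frequencies helps can be seen in a toy simulation (an illustration of the principle, not Basler's implementation, with all frequencies and the integration window chosen arbitrarily): each camera demodulates by correlating received light with its own reference, so an interferer at a different frequency averages toward zero over the integration window.

```python
# Toy demonstration: correlating a received signal against a camera's
# own demodulation reference over an integration window.
import numpy as np

fs = 1e9                         # simulation sample rate, 1 GHz
t = np.arange(0, 100e-6, 1 / fs)  # 100 us integration window


def correlate(received_hz: float, reference_hz: float) -> float:
    """Mean product of received signal and demodulation reference."""
    rx = np.cos(2 * np.pi * received_hz * t)
    ref = np.cos(2 * np.pi * reference_hz * t)
    return float(np.mean(rx * ref))


own = correlate(20e6, 20e6)    # camera's own return: strong response
other = correlate(21e6, 20e6)  # neighbor at 21 MHz: averages out
print(own, other)  # ~0.5 vs ~0.0
```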
Range: Autonomous and Fused Solutions
Measuring large pallets for transportation and storage can require multiple ToF sensors, similar to multi-camera 3D visible imaging solutions. But ToF’s sensing paradigm is closer to a radar transceiver than to a visible camera receiver: it measures not just intensity but also time and phase, so range becomes a bigger concern. While visible cameras can simply place magnifying optics in front of a sensor to “see” farther and can exploit any ambient light source for contrast, a ToF system must supply its own illumination, so its range, and therefore its 3D measurement accuracy, is limited by emitter power and target reflectivity.
Autonomous vehicles have a variety of tasks for ToF cameras, notably obstacle detection, localization, and picking or placing objects such as pallets. Localization can require long ranges, while obstacle detection requires low latency to react fast and, in many cases, a wider field of view. As long as the vehicle moves at walking speed, the measuring range of a ToF camera is sufficient; at higher speeds, lidar, which scans using the ToF principle, may be needed. ToF cameras are not certified safety devices, so they must be used in combination with other sensor modalities. “The Basler blaze — with its resistance to sunlight, robust design including IP67 protection, M12 connectors, shock- and vibration-proofing, and nickel coating to withstand adverse environmental conditions — makes an ideal tool for outdoor use,” said Gramatke.
Simplifying multi-head ToF camera fusion with lidar and other sensor modalities is at the heart of IFM Efector’s O3R ecosystem. O3R, which today includes Windows/Linux/ROS-compatible software development kits for optimizing IFM’s 3D robotic solutions, will soon expand to include camera and processing hardware that will reduce the “friction necessary for autonomous robot application designers to develop multi-mode AMR perception solutions for a fraction of what they cost today,” explains Garrett Place, head of business development for robotics perception at IFM.
Like Teledyne, IFM owns its own ToF sensor company, PMD Technologies. The company initially developed ToF 3D sensors for consumer applications but has recently added a line of iToF solutions that cost considerably less than most industrial ToF cameras. “Traditional ToF cameras are more accurate over longer ranges than iToF, but traditional ToF is also much more costly per pixel,” Place says. “If you don’t need super-high accuracy, then you can save a bundle with iToF. We can extend our range to 20 meters, 30 meters outside with indirect ToF, because we own the silicon and have extra-large pixels and powerful illumination sources, but it’s not a trivial accomplishment. However, the best fit for our technology may be in the 5- to 7-meter range, which is perfect for most industrial applications.”
According to Place, when it comes to furthering the case for AMR and AGV, it’s not about one camera or sensor; it’s about making it easier to combine multiple camera heads and sensing modes into a single data stream for processing. IFM’s O3R solution will allow autonomous system designers to include up to six iToF cameras and an Nvidia-based processing unit designed to accept imaging and lidar systems — for under $3,000.
“You need lidar plus the iToF camera because AMRs need to collect 3D data across large areas, not just in front of the unit but on the floor and hanging above. Today, ToF cameras by themselves are not good at identifying bumps or holes in a floor, or overhead cables that might get caught in an autonomous forklift. Our solution eliminates the friction the designer faces when trying to integrate the key components of an autonomous perception system,” says Place.
IFM “flipped the script” to reduce the cost of its cameras, taking nearly all processing out of the camera and passing raw data to the dedicated IPC. Place says, “In the past, IFM was focused on the camera, the component, a piece of hardware or software. But we realized that our customers want a holistic solution that makes developing these systems easier and cheaper. It shouldn’t only be Fortune 100 companies that can benefit from AMR solutions.”
RealSense and Real Opportunity
With Intel’s recent announcement that it intends to close its RealSense 3D sensor line — a shock to many in the vision industry — the opportunity for other cost-effective 3D solutions has never been brighter.
“Based on the current market conditions, the market share for ToF sensors will continue to increase, and ToF technology will be adopted in more and more markets,” concludes Teledyne’s Lochardet. “Not only for pure distance measurement but also to improve image segmentation in randomly illuminated scenes (surveillance, automotive, consumer, etc.). With that, the innovation on this technology will also continue to increase resolution, precision, and speed, whilst keeping or even reducing production costs.”
After all, when you can buy a 360-degree sensing solution for less than a standard industrial 3D camera, the warehouse floor may truly be the next killer application.