
Component Supplier

Member Since 2017


Eyes and brains for your robot: With its innovative hardware and software products, Munich-based Roboception GmbH is a pioneer in 3D sensor technology. Our 3D stereo sensor rc_visard enables robotic systems to reliably perceive their environment in real time; the complementary rc_reason software suite offers application-specific modules that can be added as needed for individual tasks. Intuitive user interfaces make implementation as easy as possible: even users without any prior robot-vision experience can deliver task-relevant information such as grasp points with just a few mouse clicks. We combine classical and AI-based methods to give robots eyes, and thereby deliver key elements of our customers' most forward-looking automation solutions in logistics, manufacturing, and beyond.

Automated Kitting Solution with 3D Robot Vision

POSTED 07/09/2023

Kitting cell at Danfoss factory becomes robust and fast with Roboception's 3D stereo vision solutions

Kitting is a common industrial task performed around the world every day: workers gather parts from bins, crates, or bags and prepare them as a 'kit' for assembly. The task is usually done in a warehouse, and the finished trays are then taken to the assembly site. Automating it is far from easy because of the high demands placed on the vision system.

Danfoss and their integrator VARO were faced with the challenge of improving a kitting cell that was performing poorly and experiencing frequent downtime. The existing 2D vision solution did not provide the depth information needed, so parts often could not be grasped. "We have seen many attempts to solve this kind of challenge, but no one has ever really solved it before," says Arne Lundfold Bjerring, CTO at VARO.

Retrofitting an existing cell for performance improvement

The problem: In the kitting setup, a robot assembles an assortment of different parts, picked directly from the supplier's pallets, into different trays. A 2D camera mounted on the robot was used to identify, pick, and place the required parts. However, as soon as the positioning of the parts changed, for example because a pallet was slightly tilted, its contents had shifted, or a supplier changed the way the parts were packaged, the setup ran into problems:

"We just had way too much downtime and engineering effort with the original setup, and the cycle time wasn't great either," recalls Morten Hansen, Manufacturing Technology Engineer at Danfoss. "With Roboception's vision solution, we were able to add a third dimension to our process, making it much more robust and flexible at the same time."

“The new cell runs robustly and adapts to changes autonomously. What’s more, we were able to reduce the cycle time from 40 seconds to as little as 25: a pick-and-place operation per part now takes 7 seconds instead of the 12 seconds of the previous setup.”

3D vision increases robustness and reduces cycle time

Two rc_viscore sensors are mounted on rails above the cell. They are coupled to an rc_cube, which runs both sensors, the rc_reason CADMatch software, and some custom sorting strategies. AprilTags, together with a customized software module running in the rc_cube's UserSpace (so no additional computing resources are required), ensure high-precision localization of the sensors relative to the robot at all times.

The vision solution is connected to the PLC of the KUKA robot via a REST API. The robot itself is mounted on a linear rail so that it can access the entire length of the cell.
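To illustrate what such a REST-based integration might look like on the client side, here is a minimal Python sketch. Note that the hostname, endpoint path, and payload field names below are illustrative assumptions for demonstration purposes, not Roboception's documented API:

```python
import json

# Hypothetical rc_cube address and service endpoint (assumptions, not the
# documented Roboception REST API).
RC_CUBE_HOST = "rc-cube.local"
DETECT_URL = f"http://{RC_CUBE_HOST}/api/v2/nodes/rc_cadmatch/services/detect_object"


def build_detect_request(template_id, pose_frame="external"):
    """Build the JSON body for a hypothetical CADMatch detection call."""
    return {
        "args": {
            "template_id": template_id,  # CAD template of the part to find
            "pose_frame": pose_frame,    # report poses in the robot/world frame
        }
    }


def best_grasp(response):
    """Pick the highest-scoring grasp from a (hypothetical) detection response."""
    grasps = response.get("grasps", [])
    return max(grasps, key=lambda g: g["score"]) if grasps else None


# Build and inspect a request body; an actual client would POST this to DETECT_URL.
payload = build_detect_request("coil_v1")
print(json.dumps(payload))

# Example response shape a PLC-side client might consume:
sample = {"grasps": [{"score": 0.71, "pose": {}}, {"score": 0.93, "pose": {}}]}
print(best_grasp(sample)["score"])
```

In practice the PLC or robot controller would send the request over HTTP and convert the returned grasp pose into robot coordinates; the sketch only shows the request/response handling logic.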


In fact, this implementation is one of the first operational cells based on the rc_viscore, the world's first 12-megapixel stereo sensor, introduced by Roboception in 2022. Its high resolution allows the sensors to be placed well above the rather large workspace: with two rc_viscore sensors mounted at a height of 2.9 m, the entire 5 × 3 m cell area is covered, while even small parts can still be reliably detected.

In Danfoss' current setup, the smallest part has a surface area of 1.5 × 5.5 cm and is detected with a sub-millimeter accuracy of 0.2 mm based on its CAD template. Eliminating the on-arm camera also has a direct impact on cycle time, as image processing can take place while the robot is still completing its previous task.
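A quick back-of-envelope calculation shows why high resolution matters here. Assuming roughly 4000 × 3000 pixels for a ~12-megapixel image (the actual rc_viscore resolution may differ) and treating the covered 5 × 3 m area as a single image purely for order-of-magnitude purposes:

```python
# Rough ground resolution of the overhead setup.
# Assumptions (not from the article): ~4000 x 3000 px for a ~12 MP image,
# and the full 5 x 3 m cell treated as one image for simplicity.
width_m, height_m = 5.0, 3.0   # covered cell area
px_w, px_h = 4000, 3000        # assumed pixel resolution

mm_per_px_w = width_m * 1000 / px_w   # mm per pixel along the 5 m axis
mm_per_px_h = height_m * 1000 / px_h  # mm per pixel along the 3 m axis
print(mm_per_px_w, mm_per_px_h)       # roughly 1.25 and 1.0 mm/px
```

Under these assumptions, one pixel corresponds to roughly a millimeter on the ground, so the reported 0.2 mm accuracy is finer than a single pixel; this is plausible because CAD-template matching averages over many edge pixels, which yields subpixel pose estimates.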

A sophisticated mix of machine learning and classical image processing solves the challenge


(Images: objects to be detected, a layer of coils; the left camera image; intermediate detection results)


(Images: rc_reason CADMatch detection results, including grasp points (green); 3D visualization of the detection result)

A customized solution, minimal workflow disruption and scalability

Last but not least, the Danfoss team appreciates the constructive collaboration with the Roboceptioneers: not only were the vision experts able to build on standardized products and optimize their use with customized extensions, they were also readily available to support the project with solid simulations and testing prior to implementation.

This, combined with intuitive user interfaces and regular software updates, minimized both the on-site installation time and the user training requirements. Integrating additional or replacement parts into the kitting process is easy using templates based on their CAD data, and now that the first cell has been successfully installed, it can be replicated 1:1 without any additional engineering effort.

Building on some of the elements developed in this application (e.g. the customized software module in the UserSpace or the CADMatch recognition templates), the next joint projects are already in the works.