
Photoneo is a leading provider of robotic vision and intelligence. Based on a patented 3D technology, Photoneo developed the world’s highest-resolution and highest-accuracy 3D camera, thus unlocking the full potential of powerful, reliable, and fast machine learning.

Content Filed Under:

Industry:
Automotive and Robotics

Application:
Material Handling and Vision Guidance for Robotics

Part 2: The new Bin Picking Studio 1.4.0 - Rebuild the robot environment without the need for CAD files

POSTED 05/28/2020

We’ve prepared a comprehensive yet easily understandable guide for you.

Robot

Before accessing the “Robot” tab, you first need to select a robot and gripper from the database and define the tool point. You can then virtually jog the robot and define its workspace.

Robot controls

You can choose from three motion modes:

  • Joint
    Here you can virtually jog the individual robot joints in two ways:
    • By moving the slider for the corresponding joint
    • By manually entering the joint position in the input field
  • Linear (Tool coordinate system / Robot-base coordinate system)
    You can move the robot’s tool point either in the coordinate system of the tool or of the robot base. There are three ways to move the robot:
    • By moving the marker in the robot’s tool point, dragging it to the desired position
    • By dragging the arrows to move it in each respective axis or by dragging the circles to rotate it
    • By manually entering the TCP position and orientation in the “Toolpoint position” panel
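To make the difference between the two linear modes concrete, here is a minimal Python sketch (with made-up pose values, not Bin Picking Studio code) showing that the same +Z jog step moves the TCP in different world directions depending on whether it is interpreted in the robot-base frame or in the tool frame:

```python
import math

def rot_x(deg):
    """Rotation matrix for a rotation of `deg` degrees about the X axis."""
    c, s = math.cos(math.radians(deg)), math.sin(math.radians(deg))
    return [[1, 0, 0], [0, c, -s], [0, s, c]]

def matvec(R, v):
    """Multiply a 3x3 matrix by a 3-vector."""
    return [sum(R[i][j] * v[j] for j in range(3)) for i in range(3)]

# Hypothetical state: TCP position in the base frame, tool tilted 90° about X.
tcp = [0.5, 0.0, 0.3]        # metres
tool_R = rot_x(90.0)
step = [0.0, 0.0, 0.1]       # a 10 cm jog along +Z

# Base-frame jog: the step is applied directly in base coordinates.
base_jog = [p + d for p, d in zip(tcp, step)]
# Tool-frame jog: the step is first rotated into the base frame
# by the tool's orientation.
tool_jog = [p + d for p, d in zip(tcp, matvec(tool_R, step))]

print(base_jog)  # [0.5, 0.0, 0.4]  -> moved "up" in the world
print(tool_jog)  # [0.5, -0.1, 0.3] -> moved along the tilted tool axis
```

The same logic underlies any tool-frame versus base-frame jogging: a step in the tool frame is rotated by the current tool orientation before being added to the TCP position.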

If you click the “Home pose” button, the robot will automatically be moved to its default position. To enable the collision checking feature, use the “Highlight collisions ON/OFF” toggle button.

When maneuvering the robot, joint limit restrictions are applied.

Joint limits allow you to define the path planning workspace of the robot.

The robot usually needs to move in only a fraction of its joint range. We strongly recommend setting up the joint limits so that path planning takes place only in that subspace of the robot’s full workspace. Correctly set joint limits improve both the path planning computation time and the success rate.

If you leave a joint limit field empty, the system will use the robot’s default hardware limits.

Please note that overly restrictive joint limits can lead to path planning failures, so always allow for some leeway. To verify that the robot’s reach is sufficient, use the jogging options in the “Robot controls” panel and make sure the robot can reach every place in the bin model with various gripper orientations.
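The recommendation above can be sketched in a few lines of Python. The robot model, joint values, and limits here are hypothetical examples, not Bin Picking Studio’s actual configuration format:

```python
# Default hardware range per joint, in degrees (hypothetical 6-axis robot).
HARDWARE_LIMITS = [(-180.0, 180.0)] * 6

# User-defined limits; None means the field was left empty,
# so the hardware default is used for that joint.
user_limits = [(-90.0, 90.0), (-45.0, 120.0), None, None, (-180.0, 180.0), None]

def effective_limits(user, hardware):
    """Merge user limits with hardware defaults for empty fields."""
    return [u if u is not None else h for u, h in zip(user, hardware)]

def within_limits(joints, limits):
    """Check whether a joint configuration lies inside the planning subspace."""
    return all(lo <= q <= hi for q, (lo, hi) in zip(joints, limits))

limits = effective_limits(user_limits, HARDWARE_LIMITS)
print(within_limits([0.0, 30.0, 10.0, 0.0, 90.0, 0.0], limits))   # True
print(within_limits([120.0, 0.0, 0.0, 0.0, 0.0, 0.0], limits))    # False: joint 1 out of range
```

Restricting the planner to such a subspace shrinks the search space, which is why well-chosen limits speed up planning and raise its success rate.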

Vision

You may also have encountered difficulties stemming from discrepancies between the real bin picking cell and its CAD version. This is no longer an issue: you can now trigger a scan and check whether the scanned point cloud corresponds to the modeled virtual space. Since you can also check the scanning volumes of the employed vision systems, setting up the field of view has never been easier.

The “Vision” tab enables you to manipulate the configured Vision Systems to validate the placement of the collision objects.

If scanners are mounted on the robotic arm, you first need to select a robot model for that Vision System to appear in the list. Otherwise, you get a warning message.

A Vision System can have the following statuses:

  • Available (the scanner is ready to connect and the Vision System is calibrated) 
  • Not available (the scanner is currently not ready to connect; check the power, connection, and network configuration) 
  • Not calibrated (the Vision System is not yet calibrated – you first need to perform a successful calibration to use this Vision System in the “Environment builder”) 

Get detailed information about a Vision System (scanner’s ID, mount position, and model type) by clicking on it. The bottom fields reflect the calibration of the Vision System – translation and orientation of the scanner’s camera relative to the robot base (extrinsic calibration) or robot flange (hand-eye calibration).
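To illustrate what those calibration values represent, here is a hedged Python sketch: a rigid transform (rotation plus translation) that maps a scanned point from the camera frame into the robot-base frame, as in extrinsic calibration. The calibration numbers are invented for the example:

```python
import math

def rot_z(deg):
    """Rotation matrix for a rotation of `deg` degrees about the Z axis."""
    c, s = math.cos(math.radians(deg)), math.sin(math.radians(deg))
    return [[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]]

def camera_to_base(R, t, p_cam):
    """Map a point from the camera frame to the base frame: p_base = R @ p_cam + t."""
    return [sum(R[i][j] * p_cam[j] for j in range(3)) + t[i] for i in range(3)]

# Hypothetical calibration result: camera rotated 90° about Z,
# offset 1 m in X and 0.5 m in Z from the robot base.
R = rot_z(90.0)
t = [1.0, 0.0, 0.5]

p_cam = [0.2, 0.0, 1.0]                    # a scanned point in the camera frame (metres)
p_base = camera_to_base(R, t, p_cam)
print([round(v, 3) for v in p_base])       # [1.0, 0.2, 1.5]
```

A hand-eye calibration works the same way, except the transform is expressed relative to the robot flange and therefore moves with the arm.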

The Vision System overview window contains several buttons:

  • Visibility (you can turn the visibility of each scanner model, its scan volume, and its origin – more specifically, its camera – on or off)
  • Connect (you can click the “Connect” button once the scanner is ready to connect and the Vision System has been calibrated)
  • Disconnect (once connected, you can disconnect from the scanner using this button)
  • Trigger (while connected to the scanner, you can trigger scans)

Please note that only one scanner can be connected at a time. If the same scanner is used in multiple Vision Systems, connecting it in one connects it in all of them.

The “Environment builder” shares scanner controls with the PhoXi Interface – once connected in the “Environment builder”, the same scanner is connected in the PhoXi Interface as well.

Validation

Connecting a scanner and triggering scans helps you place collision objects (mainly the bin) so that the model matches the reality precisely.

To ensure a precise object placement, a calibrated Vision System must be available to connect. You can then trigger a scan and use it to position the CAD model so that the model and the point cloud overlap.
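The overlap check can be illustrated with a small Python sketch that scores how closely a scanned point cloud matches the placed model by averaging scan-to-model distances. This is a deliberate simplification (real alignment would use something like ICP), and all point coordinates are made up:

```python
def nearest_dist(p, cloud):
    """Euclidean distance from point p to its nearest neighbour in cloud."""
    return min(sum((a - b) ** 2 for a, b in zip(p, q)) ** 0.5 for q in cloud)

def mean_overlap_error(scan, model_points):
    """Average scan-to-model distance; smaller values mean a better overlap."""
    return sum(nearest_dist(p, model_points) for p in scan) / len(scan)

# Hypothetical data: points sampled along one edge of the bin's CAD model,
# plus two scans of the real bin - one matching the model, one offset.
model = [(x * 0.1, 0.0, 0.0) for x in range(11)]
good_scan = [(x * 0.1, 0.005, 0.0) for x in range(11)]  # model placed correctly
bad_scan = [(x * 0.1, 0.08, 0.0) for x in range(11)]    # model placed ~8 cm off

print(mean_overlap_error(good_scan, model))  # small residual -> good overlap
print(mean_overlap_error(bad_scan, model))   # large residual -> adjust the model
```

In practice you would nudge the CAD model in the “Environment builder” until the point cloud and the model visually coincide; a residual metric like this is just one way to reason about when the placement is good enough.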