
Content Filed Under:

Industry:
Aerospace, Automotive, Consumer Goods/Appliances, Electronics/Electrical Components, Food & Beverage, Medical Devices, Miscellaneous Manufacturing, Packaging, Pharmaceutical, and Robotics

Application:
Vision Guidance for Robotics


High Speed Visual Servo Improves Robot Tracking Ability

POSTED 07/17/2000  | By: Jacques Gangloff

New technique is based on fast image sampling, predictive controllers

The ability of a robot to perform a task on a moving target has wide applications in the manufacturing industry, from welding car bodies on an automated assembly line to picking cookies off a conveyor belt. In many applications the position of an object cannot be known precisely, so the robot must be able to track the object as it moves in order to perform its designated task. A visual guidance system can adjust for variations in the target's speed and trajectory, permitting the completion of tasks that would be impossible with blind placement.

Researchers have been working for years to develop effective camera-based tracking systems to control a robot's position relative to a moving object. One of the most important techniques is visual servo - the use of visual feedback about a target's motion to control the robot's movement. As the target moves, its new position is captured by a camera. The image information is then acquired by a frame grabber and delivered to the visual loop controller - software that translates the image information into commands that drive the robot's joints in the appropriate direction.
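To picture how such a loop behaves, the short simulation below closes a proportional control loop around a target that drifts across the image at constant speed. It is a minimal sketch for illustration only: the target motion is simulated rather than measured by a real camera and frame grabber, and the gain and pixel values are assumptions, not figures from the Strasbourg system.

```python
import numpy as np

# Minimal sketch of a visual servo loop (illustration only).
SAMPLE_RATE_HZ = 120.0                  # visual sampling rate used in this work
DT = 1.0 / SAMPLE_RATE_HZ               # one image sample every ~8.33 ms
GAIN = 4.0                              # proportional gain (illustrative value)

desired_px = np.array([320.0, 120.0])   # where the target should sit in the image
camera_px = np.array([0.0, 0.0])        # how far the camera has moved, in pixel units

def target_px(t):
    """Simulated target drifting across the scene at constant velocity."""
    return np.array([260.0, 90.0]) + np.array([40.0, 20.0]) * t

for step in range(360):                         # simulate three seconds of tracking
    t = step * DT
    measured_px = target_px(t) - camera_px      # where the camera sees the target
    error_px = measured_px - desired_px         # image-space tracking error
    camera_velocity = GAIN * error_px           # controller: move toward the target
    camera_px += camera_velocity * DT           # the robot moves the camera

print("residual image error (pixels):",
      target_px(360 * DT) - camera_px - desired_px)
```

Note that a purely proportional loop like this one settles into a constant lag behind a constant-velocity target; shrinking that lag is exactly what the faster sampling and the predictive controller described below are meant to achieve.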

Rapid Image Transfer

A research team at the University of Strasbourg in France has developed a system that combines faster image sampling with a predictive controller to improve the ability of a robot to follow a moving target. The tests were performed on a six degrees of freedom (6 DOF) industrial manipulator, so called because its six rotational joints allow the end effector (the 'hand' at the end of the arm) to translate and rotate along all three axes.

The image sampling rate is the frequency at which updated visual information is delivered to the visual loop controller. The higher the sampling rate, the more closely the robot is able to follow the target and respond to changes in speed and direction. The system developed at the University of Strasbourg has a sampling rate of 120 Hz, or the delivery of fresh image data to the controller 120 times per second. According to a team spokesman, this is believed to be more than twice as fast as the sampling rates achieved in experiments with 6 DOF systems at other labs.

The system includes a high speed CCD camera (JAI M30) mounted on the robot's end effector, together with a PC containing the frame grabber (Coreco Imaging, Bedford, Mass.) and the visual loop controller. The camera captures images of the moving target at a rate of 120 non-interlaced images per second with a resolution of 640 x 240 pixels. The IC-FA (Fast Analog) frame grabber digitizes the analog video signal and delivers the image data at the same rate to the visual loop controller.

At a sampling rate of 120 Hz, image transfer from board memory to host memory is a critical issue (image transfer, image processing, and control must all be performed in less than 8.33 ms). Since the frame grabber transfers image information to the controller at a rate of 120 MB/sec, the time required for the transfer is small compared with that budget. However, the image acquisition process must be precisely synchronized with the visual loop controller in order to reduce delays that slow the performance of the loop.
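A rough budget makes the point. The short calculation below assumes 8-bit monochrome pixels (one byte per pixel), a detail not stated here, and reads 120 MB/sec as 120 million bytes per second.

```python
WIDTH, HEIGHT = 640, 240             # image resolution given above
BYTES_PER_PIXEL = 1                  # assumption: 8-bit monochrome pixels
SAMPLE_RATE_HZ = 120.0               # visual sampling rate
TRANSFER_RATE = 120e6                # bytes per second, per the figure above

frame_bytes = WIDTH * HEIGHT * BYTES_PER_PIXEL        # 153,600 bytes per image
budget_ms = 1000.0 / SAMPLE_RATE_HZ                   # ~8.33 ms per sample
transfer_ms = 1000.0 * frame_bytes / TRANSFER_RATE    # ~1.3 ms to move one frame

print(f"time budget per sample         : {budget_ms:.2f} ms")
print(f"frame transfer time            : {transfer_ms:.2f} ms")
print(f"left for processing and control: {budget_ms - transfer_ms:.2f} ms")
```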

Synchronized Acquisition and Sampling

Synchronization is performed by the IC-FA frame grabber, using a feature that indicates which line of the image is currently being sampled. Each controller sampling period lasts 8.33 ms, the same duration as the acquisition of one image by the frame grabber. The start of each new controller sample is synchronized with the acquisition of the middle line of the image. While the upper half of an image is being processed (transferred into host memory, filtered, and its positioning information extracted), the lower half of the image is being acquired. When acquisition of the lower half is complete, it is processed in turn.
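The interleaving can be pictured as a timeline. The snippet below simply restates the scheme just described; the only inputs are the 120 Hz rate and the mid-frame trigger.

```python
# Timeline of the half-frame pipelining described above, for three frames.
FRAME_MS = 1000.0 / 120.0        # 8.33 ms to acquire one full image
HALF_MS = FRAME_MS / 2.0         # ~4.17 ms to acquire half an image

for n in range(3):
    t0 = n * FRAME_MS
    print(f"frame {n}:")
    print(f"  {t0:6.2f} ms  start acquiring the upper half")
    print(f"  {t0 + HALF_MS:6.2f} ms  middle line reached -> new controller sample;"
          f" upper half is transferred and processed while the lower half is acquired")
    print(f"  {t0 + FRAME_MS:6.2f} ms  lower half acquired -> lower half processed")

print(f"delay saved by mid-frame sampling: about {HALF_MS:.1f} ms")
```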



 

This synchronization of image acquisition with controller sampling cuts the delay by 8.33 ms ÷ 2, or about 4 ms. The delay is critical because it represents the time between when the camera detects movement of the target and when the controller receives this information. Obviously, the controller cannot react to motion until it has information about it. In an extreme example, with a delay of 1 second, the robot would begin responding to a movement of the target only 1 second later.

Predictive Control Improves Precision

To keep pace with the rapidly changing image data, the visual loop controller must be closely calibrated to account for manipulator dynamics - the ability of the six mechanical joints to respond to velocity commands. The team solved the problem by developing a mathematical model of the 6 DOF robot that could be used to predict its behavior. The robot with its six joint velocity controls is modeled as a linear system around a sample trajectory. Then, a Generalized Predictive Controller (GPC) is designed based on a linear model of the visual feedback loop.

A GPC is a software program that analyzes previous commands and trajectory data to predict the system's future speed and direction. The GPC compares the predicted trajectory with the desired trajectory, then issues new commands to the six joints to keep the robot moving along the desired path at the desired speed. In the visual servo loop developed in Strasbourg, this process is repeated every 8.33 ms, matching the 120 Hz sampling rate.
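To show the general shape of such a controller, the sketch below implements a deliberately simplified, single-axis receding-horizon predictive controller in the spirit of a GPC. The plant model (a position that integrates a commanded velocity), the horizon length, and the control weighting are illustrative assumptions, not the model or tuning used by the Strasbourg team, whose controller is a full GPC designed on a linear model of the complete visual loop.

```python
import numpy as np

DT = 1.0 / 120.0                     # the controller runs once per image sample
N = 10                               # prediction horizon, in samples
LAM = 0.01                           # penalty on control effort

# Assumed single-axis model: the tracked position integrates the commanded
# velocity.  x = [position], u = commanded velocity, y = measured position.
A = np.array([[1.0]])
B = np.array([[DT]])
C = np.array([[1.0]])

# Prediction matrices so that the stacked future outputs are Y = F x + Phi U.
F = np.vstack([C @ np.linalg.matrix_power(A, i + 1) for i in range(N)])
Phi = np.zeros((N, N))
for i in range(N):
    for j in range(i + 1):
        Phi[i, j] = (C @ np.linalg.matrix_power(A, i - j) @ B)[0, 0]

# Unconstrained least-squares solution, precomputed once.
K = np.linalg.solve(Phi.T @ Phi + LAM * np.eye(N), Phi.T)

def predictive_step(x, reference):
    """Predict over the horizon, compare with the reference, return the first command."""
    r = np.full((N, 1), reference)           # constant reference over the horizon
    u_sequence = K @ (r - F @ x)             # optimal command sequence
    return float(u_sequence[0, 0])           # receding horizon: apply only the first move

# Tiny closed-loop test: follow a target moving at a constant 0.05 units/s.
x = np.array([[0.0]])
for k in range(240):                         # two seconds at 120 Hz
    target = 0.05 * k * DT
    u = predictive_step(x, target)
    x = A @ x + B * u
print("tracking error after 2 s:", float(target - x[0, 0]))
```

Because the controller looks ahead over the next N samples instead of reacting only to the current error, it can follow a moving reference with far less lag than the simple proportional loop sketched earlier.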

Experiments by the Strasbourg team demonstrate the capabilities of the new visual servo technique under various simulated operating conditions. One example is a profile-following application that highlights the ability of the camera to move at a constant velocity along a 3D profile whose curvature is unknown. By analyzing the images of the profile captured by the camera, the curvature of the profile can be predicted in the forward direction, helping to keep the camera's position constant with respect to the tracked profile.
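The article does not spell out how the look-ahead is computed, so the fragment below is purely an illustrative assumption: it fits a local quadratic to a few recent (invented) profile measurements and extrapolates it ahead of the camera to anticipate the upcoming curvature.

```python
import numpy as np

# Heights of the profile measured at recent camera positions (invented data).
recent_s = np.array([0.0, 1.0, 2.0, 3.0, 4.0])        # arc length travelled so far
recent_h = np.array([0.00, 0.05, 0.18, 0.40, 0.72])   # measured profile height

coeffs = np.polyfit(recent_s, recent_h, deg=2)         # local quadratic model
lookahead = recent_s[-1] + np.array([1.0, 2.0, 3.0])   # positions ahead of the camera
predicted_h = np.polyval(coeffs, lookahead)            # predicted profile heights

print("predicted heights ahead:", np.round(predicted_h, 3))
```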

Practical Applications

While the population of robots continues to rise, especially in factory settings, the lack of effective sensory capability has excluded robots from many areas where the work environment and object placement cannot be closely controlled. A high speed visual servo system combining a fast image sampling rate together with a closely calibrated visual loop controller has been shown to markedly improve the precision of 6 DOF industrial manipulators.

The new approach can boost the versatility of robots in a number of industrial applications where an operation depends directly on the accuracy of the visual sensor and the robot end effector. For example, many tasks consist of following a path along a profile, such as laying a weld seam along an edge, dispensing glue for a sealing strip on a car door, or removing burrs from the edges of a piece of metal. With an accurate visual servo system, there is no need to know the exact curvature of the profile, nor its position - the robot will find it.