
Servo Motors Give Robots Human Face

POSTED 03/23/2010 | By: Kristin Lewotsky, Contributing Editor


 

[Figure 1: The Albert Einstein humanoid robot (Albert-Hubo)]

They appear in everything from amusement park rides to motion pictures - animatronic characters that substitute electromechanical systems for muscle and sinew. Whether a 26-ft-high version of a football player, a talking lion, or a long-dead president like Abraham Lincoln, animatronic characters take us beyond the bounds of reality. With the aid of motion control elements, they make the impossible possible.

Like machine vision, animatronics provides a vivid example of how biological systems make incredibly complex problems appear simple. Let’s forget about the challenge of standing or walking. Just creating natural-looking expressions using electromechanical systems is extremely challenging. The average human face features 40 to 50 separate muscles. We can smile with barely a thought, in an instant. Now consider trying to duplicate that electromechanically, performing path planning, coordinating axes, and handing down commutation commands to 40+ separate motors in milliseconds. 

The developers of the Albert Einstein humanoid robot (Albert-Hubo) have done nearly that (see figure 1). A collaboration between Hanson Robotics and the Korea Advanced Institute of Science and Technology (KAIST), with support from the University of Texas, the Albert-Hubo can not only walk untethered and unsupported (see video here), it also features a range of natural expressions and the ability to recognize people in its field of view and turn to address them (see video here).
 
Emulating muscle movement in the face alone requires 32 DC gearmotors, says David Hanson, founder and CEO of Hanson Robotics, which built the head portion of the robot. Because the motors are reversible, each simulates the movement of two muscle groups, for a total of 64 muscles. Given the space constraints, the motors needed to be small-diameter, high-torque-density devices. The Hanson Robotics team chose three different motors, the smallest of which is less than a centimeter in diameter. This model generates 2.8 kg-cm of torque and, with an additional gearbox, provides a whopping 6000:1 reduction ratio. The motors either pull on drive linkages attached to the skin or press against it with steel pins and levers. Integrated drive electronics simplify assembly and operation, and encoders provide condition and position feedback.
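For readers keeping track of the units, small gearmotors are typically rated in kilogram-centimeters of torque. A rough Python sanity check of the figures quoted above; the ideal, lossless gearbox scaling is an illustration of the trade-off, not a Hanson Robotics spec:

    # Unit check and ideal-gearbox scaling for the figures quoted above.
    # Real gearheads lose torque to friction, so treat the scaled value
    # as an upper bound; the 6000:1 example ratio comes from the article.

    KG_CM_TO_N_M = 9.81 / 100   # 1 kg-cm = 0.0981 N-m

    def ideal_reduction(torque_kg_cm: float, speed_rpm: float, ratio: float):
        """An ideal reducer multiplies torque and divides speed by the ratio."""
        return torque_kg_cm * ratio, speed_rpm / ratio

    print(f"2.8 kg-cm = {2.8 * KG_CM_TO_N_M:.3f} N-m")  # 0.275 N-m

    # A tiny motor spinning at 6000 rpm behind a 6000:1 stack turns its
    # output at just 1 rpm: slow, strong motion suited to skin linkages.
    torque_out, speed_out = ideal_reduction(0.01, 6000, 6000)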
 
For years, the default material for simulating human flesh was rubber. Hanson Robotics has developed a more compliant, porous material dubbed Frubber that more effectively simulates the response of human tissue. The result is eerily realistic. Better yet, the force required to deform the material is 23 times lower than for rubber, cutting demands on motor torque and power and extending battery life.
 
Taking Control
However appealing smart components might be, this type of tightly coupled problem requires centralized control. Albert-Hubo incorporates two controllers: one for the facial expressions and the other to run the motors in the body that allow the robot to walk. The act of walking is essentially a controlled fall regulated about the zero moment point, involving tightly coupled weight shifts. The Einstein robot accomplishes this with a combination of inertial sensors (gyros and accelerometers) and very fast PID feedback loops. The result is natural-looking motion that escapes the artificial constant-velocity movements so often seen in animatronic devices.
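The structure of such a loop is simple even if tuning it is not. A minimal textbook PID controller, sketched in Python; the gains, the 1 kHz rate, and the sensor hook are illustrative placeholders, not Albert-Hubo's actual code:

    # A minimal PID loop of the kind described above for balance control.
    # Gains and rate are placeholders for illustration only.

    class PID:
        def __init__(self, kp: float, ki: float, kd: float, dt: float):
            self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
            self.integral = 0.0
            self.prev_error = 0.0

        def update(self, setpoint: float, measured: float) -> float:
            error = setpoint - measured
            self.integral += error * self.dt
            derivative = (error - self.prev_error) / self.dt
            self.prev_error = error
            return (self.kp * error + self.ki * self.integral
                    + self.kd * derivative)

    # Run at a fast, fixed rate so the derivative term stays meaningful.
    pid = PID(kp=40.0, ki=2.0, kd=0.5, dt=0.001)  # 1 kHz loop
    # torque_cmd = pid.update(setpoint=0.0, measured=tilt_from_imu())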
 
In addition to facial expressions, the robot can move its eyes and adjust the tilt and rotational angle of its head. Instead of a motion controller, the group uses a PC, which can handle higher-level calculations. Image sensors in the eyes allow the robot to recognize people in its field of view and turn toward them using simple visual servoing techniques, for example. "The sensors send the information and we run our perceptual algorithms on it to correlate it with a [3D space] model we call the egosphere,” says Hanson. “The robot will remember that people have been seen in that space and make some probability predictions of the likelihood of them to be in a particular position. Then it does a correlation of its head and eye kinematics relative to that three space so that it can calculate how to servo over to look at a particular location while maintaining expressive use of the head.”
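Stripped of the egosphere's memory and prediction, the underlying visual servoing step can be as simple as proportional control: nudge the pan and tilt axes toward whatever offset the target shows from the image center. A sketch, with assumed resolution, field of view, and gain:

    # Proportional ("simple") visual servoing toward an image-space target.
    # Resolution, field of view, and gain are assumptions for illustration.

    IMG_W, IMG_H = 640, 480      # assumed sensor resolution, pixels
    FOV_X, FOV_Y = 60.0, 45.0    # assumed field of view, degrees
    GAIN = 0.5                   # proportional gain; <1 damps oscillation

    def servo_step(target_px: tuple[float, float],
                   pan_deg: float, tilt_deg: float) -> tuple[float, float]:
        """One proportional step of pan/tilt toward an image-space target."""
        x, y = target_px
        err_pan = (x - IMG_W / 2) / IMG_W * FOV_X    # +right, degrees
        err_tilt = (y - IMG_H / 2) / IMG_H * FOV_Y   # +down, degrees
        return pan_deg + GAIN * err_pan, tilt_deg - GAIN * err_tilt

    # A face detected at (480, 200) pulls the head right and slightly up.
    pan, tilt = servo_step((480, 200), pan_deg=0.0, tilt_deg=0.0)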
 
[Figure 2: Animatronic lion at the Calgary Zoo]

The makers of an animatronic lion for the Calgary Zoo (see video here) took a different approach to the problem. The lion lounges on top of a building at the zoo. At the prompting of visitors, it comes to life and delivers a little performance, complete with carefully coordinated sound and lighting (see figure 2). For that reason, the engineering team chose DMX512-A, the ANSI standard for theatrical staging, says Donald Labriola, president of QuickSilver Controls Inc., which teamed with animation-control specialist Gilderfluke & Co. to provide motion control for the project. Under this standard, systems typically feature dedicated processors with sound channels plus an additional communications link known as the DMX channel, which combines multiple control signals in the same datastream. "A whole lot of the work has to do with coordinating motion with sound. If a roar is supposed to last 2 s, you don't want the motion to last 2.5 s," says Labriola. "The Gilderfluke control interface allows the artist to quickly define and refine the many coordinated motions. Using different channels, you can coordinate sound segments, spotlight intensity and direction, as well as the motions controlling the various aspects of the animated figures."
 
DMX512 by the Numbers
The DMX512 data frame consists of up to 512 "slots" (bytes) of data sent in a serial format at 250 kbaud. A data frame starts with a break character, followed by a "start code" byte that designates how the contents of the frame are to be used. Next come the data slots, nominally 1 byte each. Depending on the number of slots configured in the datastream, the frame time varies between 1.2 ms and 22.7 ms, which at full length caps the rate at 44 updates per second. "We interpolate the motion between these updates so we don't get staccato motion," Labriola says.
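The sidebar's figures fall out of simple arithmetic: at 250 kbaud, each slot occupies 11 bit times (one start bit, 8 data bits, two stop bits), or 44 µs. The break and mark-after-break values below are common minimums from the spec; the same function also quantifies the smaller-constellation trick Labriola describes later in the article:

    # Frame-time arithmetic behind the sidebar's 22.7 ms / 44 updates-per-
    # second figures. Break and mark-after-break use typical minimums.

    BIT_US = 1e6 / 250_000     # 4 us per bit at 250 kbaud
    SLOT_US = 11 * BIT_US      # 44 us per slot (1 start, 8 data, 2 stop bits)
    BREAK_US, MAB_US = 88, 8   # minimum break and mark-after-break

    def frame_time_us(data_slots: int) -> float:
        """Frame time: break + mark + start-code slot + data slots."""
        return BREAK_US + MAB_US + (1 + data_slots) * SLOT_US

    for n in (512, 128):
        t = frame_time_us(n)
        print(f"{n:3d} slots: {t/1000:.2f} ms/frame -> {1e6/t:.0f} updates/s")
    # 512 slots: 22.67 ms/frame -> 44 updates/s
    # 128 slots:  5.77 ms/frame -> 173 updates/s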
 
The basic DMX data slot, at only 1 byte, can represent just 256 states or positions. The Gilderfluke controller and the QuickSilver motor controllers can combine multiple slots into larger words of up to 32 bits, giving the user a far greater range of positions and much smoother motion than devices limited to single slots.
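The combining itself is plain byte-packing. A sketch of the idea; the big-endian (coarse byte first) order shown here is a common convention, and a given controller may differ:

    # Packing multiple 8-bit DMX slots into one wider position word,
    # and splitting it back out for transmission. Byte order is assumed.

    def combine_slots(slots: list[int]) -> int:
        """Pack big-endian 8-bit slots into one integer (2 slots -> 16 bits)."""
        value = 0
        for byte in slots:
            value = (value << 8) | (byte & 0xFF)
        return value

    def split_value(value: int, n_slots: int) -> list[int]:
        """Inverse: split a wide word back into per-slot bytes for the frame."""
        return [(value >> (8 * i)) & 0xFF for i in reversed(range(n_slots))]

    # One byte gives 256 positions; two give 65,536 over the same travel.
    pos16 = combine_slots([0x9C, 0x40])   # 40000 of 65535
    assert split_value(pos16, 2) == [0x9C, 0x40]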
 
When an application requires rapid motion, one approach is to add more DMX channels and process them in parallel. "You can run at what they call a smaller constellation," Labriola says, "with fewer data points going out in each frame. Instead of sending the full 512 slots, they can run at 128 or less. This brings the base speed of 44 updates per second up to several hundred updates per second." Because the master DMX controller pulls data from the same database, the multiple parallel DMX streams all remain synchronized to each other. "The other method allows the artist to define a channel controlling the degree of interpolation dynamically: with more smoothing for gradual motion periods and little smoothing for rapid motions," Labriola adds.
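Both remarks come back to the same mechanism: rather than jumping to each new DMX value, the drive ramps between successive setpoints at its own, much faster servo rate. A linear-ramp sketch with illustrative rates:

    # Inter-frame interpolation of the kind Labriola describes: the drive
    # ramps between DMX setpoints at its own servo rate so 44 Hz updates
    # don't look staccato. Rates and the linear ramp are assumptions.

    def interpolated_setpoints(prev: float, new: float,
                               dmx_period_s: float = 1 / 44,
                               servo_rate_hz: float = 1000):
        """Yield servo-rate setpoints ramping linearly between DMX updates."""
        steps = int(dmx_period_s * servo_rate_hz)
        for i in range(1, steps + 1):
            yield prev + (new - prev) * i / steps

    # Between DMX values 100 and 140 the motor sees ~22 small steps, not one jump.
    ramp = list(interpolated_setpoints(100.0, 140.0))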
 
As good as it is, DMX does have its drawbacks. The standard originated as a unidirectional protocol. The most recent update includes bidirectionality, but so far, little compliant equipment has reached the market, and any data must be polled, slowing the update rate for downstream data. For applications that require two-way data flow, engineers can bring in additional tools. Labriola points to a recent animated art application that combines DMX with CANopen to achieve the desired functionality. "They wanted to be able to read back motor temperatures and current so they could run diagnostics and correct things before they fail," he says. "They use CANopen for these back-channel communications and diagnostics and DMX for the dynamic motion control. Both channels run simultaneously so you can watch what's going on through the CANopen channel while the DMX is coordinating the motions with the rest of the stage functions.” The CANopen channel can also be used to change modes on the serial channel; when more extensive updates are required, the user can switch out of DMX entirely and into a diagnostic mode to download software.
 
As if the control and motion challenges weren't enough, audible noise can be a problem for this application, says Labriola. “If you’ve got a lion up there and you have servo motors whining as his mouth is going up and down, it really ruins the effect. It’s better not to make noise than to try to silence it.” Rather than starting with a high-speed motor and gearing it down to achieve more torque, he starts with high-torque, direct-drive servo motors. The devices provide significant muscle at low speed, eliminating the need for a gearbox. "At lower speeds, the motors do not make as much noise,” he observes. “It’s hard to couple low frequencies from a small structure. Up to a couple hundred rpm, they’re essentially silent, whereas if you're spinning them up to 2000 rpm and then gearing them down, you start hearing the motor whine and the gear box teeth chattering."
 
Despite the complexity, these are commercial systems and need to be as economical as possible. Forget about absolute optical encoders, for example - a simple potentiometer provides a much more economical alternative. Hanson’s group buys integrated gearmotors as a way to economize.
 
Although animatronics still has a long way to go before it replaces the real thing, the technology gets better every year. As motion control provides engineers with ever more powerful tools, robots edge closer and closer to being indistinguishable from their living counterparts.