Universal Robots Help Autodesk Push the Construction Industry's Boundaries for Human-Robot Collaboration
POSTED 12/05/2018 | By: Universal Robots
At Autodesk’s Robotics Lab in San Francisco, CA, Universal Robots are used in research projects exploring new ways to automate the construction industry. The projects span human-robot interactions, machine learning, drawing and smart assembly systems.
Crazy. That’s the response Heather Kerrick, Senior Research Engineer at Autodesk’s Robotics Lab, often encountered when she explained how her team was going to have collaborative robots and conference attendees build a pavilion out of raw bamboo and fiber string.
“There’s always this sort of apocalyptic ‘robots are going to take over and humans will have nothing to do,’” says Kerrick. “We wanted to explore the possibility of humans and robots working together to accomplish things that neither species could accomplish alone.”
In doing this, she chose a formidable challenge. Raw bamboo is an uneven, bendable material that varies in length and width. “When we started, we weren’t really sure to what extent we could work with our robot and help it understand the uncertainty and the variability that we were giving it,” explains Kerrick, adding that this question is central to the construction industry. “In manufacturing, the supply chain can allow for much smaller tolerances, but in construction the tolerances are pretty broad. So we were really proud of our ability to empower the robot by giving it sensors and decision-making abilities and then act on that accordingly.”
The HIVE pavilion was built at “winding stations” in the lobby of the Autodesk University conference. At each station, attendees fastened three random pieces of bamboo onto a Universal Robots cobot, which generated the movement sequence needed to hook fiber onto the tips of the bamboo, creating a unique, tumbleweed-looking tensegrity element.
“The UR robots were able to offer very precise movements and very precise measurements that would have been difficult for a human to do on-site, so the human didn’t need anywhere near as many measuring tools or equipment. They were able to go to the robot, get the part that they needed, and then take that back to a construction site,” says the Autodesk research engineer, whose team successfully created the HIVE in three days.
Safe robot enables “daring” research
Building the pavilion involved close collaboration between the conference goers and the Universal Robots’ collaborative robot arms, also called cobots. The UR cobots have force-limiting safety features that cause the robot arm to stop operating if it encounters obstacles in its route. This was a crucial factor in Autodesk choosing this particular robot for the high-profile project. “We’re doing experimental research where the robots are moving based on real-time sensor data, so the chance of the robot doing something unexpected is really high,” explains Kerrick. “We really wanted to engage with people that had no experience with robots, providing them with a safe and fun experience while also furthering our research,” she says, explaining that had her team used a larger, more industrial robot, they wouldn’t have been able to engage with the public in the same way and it would have been a much slower research project. “But with the Universal Robots, we were able to be a little more daring with our research because we could trust that the robot wouldn’t break itself, and wouldn’t pose a danger to others.”
Drawing the enamored robot
Being able to operate in an open space without safety guarding also landed the UR10 robot a cameo in “Artoo in Love,” a viral short film created by Evan Atherton, a research engineer at Autodesk. The film depicts an R2-D2 robot in a San Francisco park falling in love with a mailbox that it mistakes for another robot, and a UR10 that draws a portrait of the loving couple. Moving an industrial robot out into rugged terrain is rarely attempted. “Taking a robot to this unknown place with no power—we had to bring a generator—was an interesting challenge,” explains Atherton. Together with colleagues, he calibrated the robot and wrote a simple program that directed it to follow the paths of a vector drawing projected onto a canvas. “The UR10 was perfect; it was small, mobile and safe. We could bring it out in a Pelican case. Had we used one of our traditional robots, it would have required a forklift and a safety cage, so that would never have worked,” he says.
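A program of the kind Atherton describes can be sketched in a few lines: each stroke of the vector drawing becomes a list of 2D points, which are mapped onto the canvas plane and emitted as URScript `movel` waypoints. The point data, canvas origin, scale, and tool orientation below are illustrative assumptions; the actual Autodesk drawing program has not been published.

```python
# Hedged sketch: turn 2D vector-drawing strokes into URScript movel commands.
# Canvas origin, scale, and the fixed tool orientation are hypothetical values.

def path_to_waypoints(path, origin=(0.4, -0.2, 0.05), scale=0.001):
    """Convert 2D drawing points (mm) into 3D waypoints (m) on the canvas plane."""
    ox, oy, oz = origin
    return [(ox + x * scale, oy + y * scale, oz) for (x, y) in path]

def waypoints_to_urscript(waypoints, speed=0.1):
    """Emit a URScript program that moves linearly through the waypoints."""
    lines = ["def draw():"]
    for (x, y, z) in waypoints:
        # Fixed orientation (rx, ry, rz) keeps the pen perpendicular to the canvas.
        lines.append(f"  movel(p[{x:.4f}, {y:.4f}, {z:.4f}, 0, 3.14, 0], v={speed})")
    lines.append("end")
    return "\n".join(lines)

stroke = [(0, 0), (50, 0), (50, 50)]        # one L-shaped stroke, in mm
script = waypoints_to_urscript(path_to_waypoints(stroke))
print(script)
```

Pen lifts between strokes and the projector calibration step are omitted here; the point is only that each drawing path reduces to a short string of motion commands the robot can execute.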
A flexible robot assistant following you around
The same premise of a flexible mobile robot in rugged terrain applies to the construction site. “You don’t see many robots in construction today as it is not super feasible to have industrial robots that need to stay inside cages in this setting,” explains Atherton, emphasizing how working with the UR10 robot can change that norm. The Autodesk research team put a router on the end of the robot arm, gave it a camera and a projector, and developed machine learning software that enables the robot to recognize human gestures and voice commands. The UR10 can be rolled up to a piece of drywall, for instance, and project an outlet outline onto the wall; the user can adjust the outline and then tell the UR10 by voice command to go ahead and cut it out. “We can put the robot on a cart and simply roll it around the construction site and have it help out,” says Atherton.
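The project-adjust-cut workflow above reduces to a small command loop once the gesture and voice recognizers have produced discrete commands. The sketch below shows only that loop; the `CutAssistant` interface and command names are hypothetical, and Autodesk's perception models are not public.

```python
# Hedged sketch of the project-adjust-cut interaction described above.
# Gesture/voice recognition is assumed to already yield discrete commands.

class CutAssistant:
    """Tracks a projected outline on the drywall that the user can move before cutting."""

    def __init__(self, outline):
        self.outline = list(outline)   # polygon vertices on the wall, in mm
        self.cut_done = False

    def handle(self, command, *args):
        if command == "move":          # gesture: shift the projected outline
            dx, dy = args
            self.outline = [(x + dx, y + dy) for (x, y) in self.outline]
        elif command == "cut":         # voice: run the router along the outline
            self.cut_done = True       # a real system would stream waypoints here
        return self.outline

# A standard-outlet-sized rectangle, nudged 25 mm right, then cut.
outlet = CutAssistant([(0, 0), (80, 0), (80, 120), (0, 120)])
outlet.handle("move", 25, 0)
outlet.handle("cut")
```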
From hard-coded to machine learning
Another construction industry challenge now addressed in Autodesk’s research with the UR robots is smart assembly systems. Yotto Koga, software architect at Autodesk, explains that today’s assembly systems tend to be hard-coded and brittle. “They’re engineered so that the parts, as they come down the line, need to be in an exact position, and you need customized tooling and fixtures for each specific part in order for the robot to assemble them,” says Koga. “That in itself is very costly in terms of time and money. But what’s more problematic is when things become misaligned—the fixtures become damaged or parts need to be swapped out—the whole line becomes out of sorts.” This scenario prompted Autodesk to look into a “learn as you go” system: something that would be smart, flexible to change, easy to set up, and ultimately a system that Autodesk customers can use to assemble their own designs. Enter the “Brick-bot.” While having a robot pick up Lego bricks might not at first seem that impressive, the project lets the team explore recognizing and handling more than 10,000 different bricks with very tight tolerances.
Side-by-side iteration with the robot
The Brick-bot tackles three sub-problems: bin-picking, re-grasping and placement. Using vision guidance, the robot can pick out a pre-defined brick from a jumble of different sizes and colors. If the brick is grasped in the wrong position for placement, the UR10 performs a visual survey and can re-position and re-grasp the brick until it sits correctly in the gripper. The final placement is also vision-guided, with a second UR robot, a UR5, holding a camera to check the brick assembly.
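The three sub-problems chain into one loop: pick, check the grasp, re-grasp until the grip is usable, then place under the second camera. A minimal sketch of that control flow follows; every class and method name (`FakeVision`, `locate_brick`, `check_grasp`, and so on) is a hypothetical stand-in, since Autodesk's actual perception and control stack is not public.

```python
from dataclasses import dataclass, field

@dataclass
class GraspCheck:
    """Result of the visual grasp survey: is the current grip placeable?"""
    ok_for_placement: bool
    suggested_pose: tuple = (0.0, 0.0, 0.0)

def assemble_brick(robot, vision, target, max_regrasps=3):
    """Bin-pick a brick, re-grasp until the grip is usable, then place it."""
    robot.pick(vision.locate_brick(target))   # 1. bin-picking in the jumble
    for _ in range(max_regrasps):             # 2. re-grasp until placeable
        check = vision.check_grasp()
        if check.ok_for_placement:
            break
        robot.regrasp(check.suggested_pose)
    robot.place(target)                       # 3. vision-guided placement
    return vision.verify_assembly(target)     # second (UR5-held) camera checks

# Minimal stubs to exercise the loop: the first grasp is bad, the second is fine.
@dataclass
class FakeVision:
    checks: list = field(default_factory=lambda: [GraspCheck(False), GraspCheck(True)])
    def locate_brick(self, target): return (0.1, 0.2, 0.0)
    def check_grasp(self): return self.checks.pop(0)
    def verify_assembly(self, target): return True

@dataclass
class FakeRobot:
    log: list = field(default_factory=list)
    def pick(self, pose): self.log.append("pick")
    def regrasp(self, pose): self.log.append("regrasp")
    def place(self, target): self.log.append("place")

robot, vision = FakeRobot(), FakeVision()
assembled = assemble_brick(robot, vision, target="2x4_red")
```

The bounded re-grasp loop is the part that makes the system "learn as you go" friendly: a bad initial grip is recovered from with sensing rather than by demanding a perfectly fixtured part.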
“The next iteration is to actually start assembling designs, for example a house out of Legos or a toy giraffe, and then have the robot automatically build it,” explains Koga, emphasizing how the ability to work right next to the robot in this process is imperative. “One of the major reasons we chose Universal Robots is because it’s safe to work around. I could literally connect the robot to my laptop, work next to it, and quickly iterate through our experiments without worrying about safety protocols slowing things down. This was very important for us to make progress in this project.”
Open architecture enables easy command streaming
Quick progress was also facilitated by the UR robots’ open APIs. “We were able to get pretty low-level control of the UR robots using the streaming API over TCP communication, which was vital to our particular needs, as we needed to access the robot directly, bypassing the robot’s own operating system,” explains the software architect.
His colleague Heather Kerrick recounts how the HIVE project benefited from the robot’s open architecture as well. “Building the HIVE meant working in a bunch of different coding languages and environments across teams and devices. We were able to simplify all of our commands into a single string that we could send to the robot,” she says. “With our larger industrial robots, there are often extra steps or extra software required in order to sidestep whatever native controls are built into the robot, which is not the case here. The scripting language for the UR is also very, very simple to learn and to use.”
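Sending "a single string" to a UR robot is exactly how the controller's TCP client interfaces work: the secondary interface on port 30002 accepts URScript source as plain text. The sketch below shows the idea; the robot IP is a placeholder, and a real deployment would add error handling and use the controller's actual address.

```python
import socket

ROBOT_IP = "192.168.1.10"  # placeholder; use your controller's address

def movej_command(joints, a=1.2, v=0.5):
    """Format a single URScript joint-move command as one string."""
    vals = ", ".join(f"{q:.4f}" for q in joints)
    return f"movej([{vals}], a={a}, v={v})"

def send_urscript(command, host=ROBOT_IP, port=30002, timeout=2.0):
    """Stream one newline-terminated URScript line to the controller over TCP.

    Port 30002 is the UR secondary client interface, which accepts raw
    URScript text and executes it on the controller.
    """
    with socket.create_connection((host, port), timeout=timeout) as sock:
        sock.sendall((command + "\n").encode("utf-8"))

# Example (requires a reachable robot or simulator, so left commented out):
# send_urscript(movej_command([0.0, -1.57, 1.57, -1.57, -1.57, 0.0]))
```

Because the whole motion request is one self-describing string, any language or device that can open a TCP socket can drive the robot, which is what let the HIVE team unify commands across their mixed toolchain.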
About Autodesk Robotics Lab
The Autodesk Robotics Lab pursues a broad scope of inquiry around robotics — such as advanced robotic control, simulation, visualization, and machine learning. The team builds real-world prototypes to truly understand how cutting-edge technology will develop in the future, and how these developments will affect the future of manufacturing, construction and production.
About Universal Robots
Universal Robots was co-founded in 2005 by the company’s CTO, Esben Østergaard, who wanted to make robot technology accessible to all by developing small, user-friendly, reasonably priced, flexible, industrial collaborative robots. Since the first collaborative robot (cobot) was launched in 2008, the company has experienced considerable growth with the user-friendly cobot now sold in more than 50 countries worldwide. The company, which is a part of Teradyne Inc., is headquartered in Odense, Denmark, and has subsidiaries and regional offices in the USA, Spain, Germany, Italy, Czech Republic, China, Singapore, India, Japan, Taiwan and South Korea. U.S. regional offices are located in Ann Arbor, MI, Long Island, NY, Irvine, CA and Dallas, TX.