
Content Filed Under:

Industry: N/A

Application: Assembly, Material Handling, and Safety

Smarter Robot Grasping with Sensors, Software, the Cloud

POSTED 05/24/2018  | By: Tanya M. Anandan, Contributing Editor

In robotics, end effectors are where the rubber meets the road. The robot “hand” is the ultimate touchpoint for every product or part that goes out the door. Smart factories and warehouses can only achieve the agile, super-connected, collaborative environments envisioned by Industry 4.0 if all of the systems are intelligent and add value to the overall enterprise. Robot grippers need to be smarter than average.

New grasping systems have moved to the front of the class. Hardware and software advances enable safer, closer human-robot collaboration, ease of use, and flexibility for handling a wide variety of shapes and sizes. Plug-and-play features make them easier to integrate and implement, especially for small and midsize enterprises (SMEs) with day-to-day changeovers. Integrated sensors put 3D vision, tactile, and force sensing in the palm of our hand. Algorithms orchestrate this multimodal concerto into an intelligent solution that learns as it grows. Cloud-sharing, AI-enabled robots are teaching themselves how to grasp not just better – smarter.

Smart Collaborative Grippers
One of the ways robot grippers are getting smarter is by learning to play nice with humans. Automation has become less about replacing humans. Now, it’s more about humans sharing production responsibilities with their robot coworkers. Collaborative robots, or cobots, are at the center of the movement. This is focusing more attention on collaborative gripping solutions. 

SCHUNK GmbH & Co. KG is one of the leading gripping systems suppliers in the world. Established in 1945, the family-owned company is headquartered in Germany and has been designing and producing robot grippers since the early 1980s. They see great potential for collaborative grippers that enable direct interaction and communication with humans. 

Gripper designed specifically for human-robot collaborative applications has limited force and no sharp edges or pinch points for safer operator interaction. (Courtesy of SCHUNK Inc.)

Co-act, which stands for collaborative actuator, is a family of collaborative grippers made by SCHUNK. The series is based on the company’s tried-and-true gripping technology with modifications to limit force and prevent other potential hazards when working closely with people. The company took its standard, electric 2-finger parallel gripper and built a protective housing around it with rounded corners to eliminate sharp edges and pinch points. 

The Co-act EGP gripper is force limited to comply with Technical Specification ISO/TS 15066:2016 Robots and Robotic Devices – Collaborative Robots. Released in 2016, the technical spec provides data-driven guidelines for designers, integrators and users of human-robot collaborative systems to evaluate and mitigate risks. Annex A of ISO/TS 15066 contains data from a study on pain thresholds for different parts of the human body, including hands and fingers. It provides thresholds for maximum permissible pressure and force. 

Check out Collaborative Robots and Safety Hand in Hand for more info on ISO/TS 15066.

“The biggest difference between a standard EGP gripper and the EGP we use on the collaborative robot is that we safely limited the force to 140 newtons,” says Markus Walderich, Automation Group Manager at SCHUNK Inc. in Morrisville, North Carolina. “We also made sure that if something should go wrong with the power supply, there’s no way the peak force could ever surpass 140 newtons.”
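For illustration, a supervisory routine enforcing a force ceiling like the 140-newton limit described above might look like the following sketch. Only the 140 N figure comes from the article; the function names and the placeholder controller call are hypothetical, not SCHUNK's actual interface.

```python
# Hypothetical sketch: clamping a commanded grip force to a collaborative limit.
# The 140 N ceiling reflects the Co-act EGP limit described in the article;
# the gripper call below is a placeholder, not SCHUNK's actual API.

MAX_COLLABORATIVE_FORCE_N = 140.0  # force ceiling for the collaborative gripper

def command_grip(requested_force_n: float) -> float:
    """Clamp the requested grip force to the collaborative ceiling."""
    if requested_force_n <= 0:
        raise ValueError("Grip force must be positive")
    applied = min(requested_force_n, MAX_COLLABORATIVE_FORCE_N)
    # send_to_gripper(applied)  # placeholder for the real fieldbus/I-O call
    return applied

print(command_grip(200.0))  # -> 140.0, the limit is never exceeded
```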

Plug-and-Play Ready
The Co-act gripper series is compatible with a variety of cobots on the market. These include the KUKA LBR iiwa (pictured), FANUC compact CR-4iA and CR-7iA, Rethink Robotics Sawyer, Techman TM5, Universal Robots UR series, and the Yaskawa HC10. The Co-act MPG-plus was specifically adapted for the ABB YuMi dual-arm cobot.

“Our grippers are plug and play,” says Walderich. “You don’t need any adapter plate and the electrical connection is already provided so you can connect it right at the wrist. You don’t have to run a cable along the arm to wire it into the controller.”

SCHUNK stocks the Co-act grippers with different mechanical and electrical connections, making them plug-and-play ready for the different robot brands. Walderich says the gripper is easy to integrate and control.

“We use discrete signals to control it, simple 24 volt signals,” he says, explaining that by using discrete signals, there’s no need for software drivers. “You only need one signal to open the gripper and one signal to close. Every robot already comes with a digital output that can open and close our gripper.”
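As a rough illustration of the discrete-signal control Walderich describes, the sketch below toggles two digital outputs to open and close a gripper. The output channel numbers and the controller class are assumptions made for the example, not SCHUNK's documented wiring.

```python
# Sketch of discrete-signal gripper control as described above: one digital
# output opens the gripper, another closes it. Channel numbers and the
# RobotIO class are hypothetical placeholders.
import time

OPEN_OUTPUT = 0    # assumed DO channel wired to "open"
CLOSE_OUTPUT = 1   # assumed DO channel wired to "close"

class RobotIO:
    """Stand-in for a robot controller's digital I/O interface."""
    def set_digital_out(self, channel: int, state: bool) -> None:
        print(f"DO[{channel}] = {state}")

def close_gripper(io: RobotIO, settle_s: float = 0.5) -> None:
    io.set_digital_out(OPEN_OUTPUT, False)
    io.set_digital_out(CLOSE_OUTPUT, True)
    time.sleep(settle_s)   # allow the jaws to reach the part

def open_gripper(io: RobotIO, settle_s: float = 0.5) -> None:
    io.set_digital_out(CLOSE_OUTPUT, False)
    io.set_digital_out(OPEN_OUTPUT, True)
    time.sleep(settle_s)

io = RobotIO()
close_gripper(io)  # grip the part
open_gripper(io)   # release it
```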

This summer, the Co-act EGP-C will begin shipping with an integrated LED status ring that will provide visual indication of the gripper state. Different colors will indicate a proper grip or an error state. 

“Visually, you will be able to see right away if something is wrong,” says Walderich. “There’s integrated sensor feedback that tells you if the gripper is open or closed.”

The Co-act grippers are best suited for material handling, machine tending, and simple assembly tasks. The manufacturer is using them in its own factory to increase capacity and avoid worker injuries. In the largely manual assembly process for its trademark grippers, SCHUNK is using a collaborative robot equipped with a Co-act gripper to scrape components across a sharp-edged extraction plate to remove residual sealant material. 

Watch the Co-act gripper in action on SCHUNK’s shop floor.

Walderich says the gripper’s main limitation is the force-limited operation. But what if you eliminate the likelihood of accidental human-gripper contact in the first place? That opens up a whole new world of possibilities.

More Sensors, More Collaboration
You may have already seen the future on the show floor. Sophisticated sensors will bring a whole new level of smarts to the SCHUNK Co-act gripper family. The Co-act JL1 prototype is a “technology carrier” for demonstrating features that may be used in future collaborative grippers. The prototype won the prestigious Hermes Award for innovative industrial technology at the Hannover Messe exhibition in 2017. 

Prototype collaborative gripper uses multiple sensor systems, visual feedback, a built-in touchscreen interface, and other advanced features to deliver higher levels of human-robot interaction. (Courtesy of SCHUNK Inc.)

“We put many possible technologies that we are aware of today in this prototype that makes it a safe, collaborative gripper,” says Walderich. “We will use what we’ve learned to derive future Co-act grippers with higher payloads. The next evolution of collaborative grippers will be higher force grippers that detect if there’s a human finger or hand in the gripping area and then it will not apply force higher than 140 newtons.”

The Co-act JL1 prototype gripper has a suite of sophisticated sensors to track the proximity of humans and trigger evasive movements to avoid direct human contact. A capacitive sensor creates an electric field around the gripper to detect when anything containing a lot of water enters this field. That way it can distinguish between a workpiece and a human body part. And it can do it within a narrow radius of 20 cm. If a human hand comes within proximity, the gripper automatically switches into safe operating mode. 

A force-moment sensor detects unexpected force effects, such as a collision or malfunction. It also allows for manual guidance, positioning and teaching. Tactile sensors in the fingertips give the gripper a sense of touch. Then it can determine the exact gripping force acting on an object, allowing it to apply the appropriate amount of force for fragile items. 
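Putting those sensing modes together, a decision loop like the sketch below could switch the gripper between operating modes. The 20 cm proximity radius and 140 N cap come from the article; the collision threshold, the full-mode force value, and all interfaces are assumptions for illustration only.

```python
# Sketch: combining proximity, force-moment, and tactile readings into a simple
# operating-mode decision, loosely modeled on the Co-act JL1 behavior described
# above. Thresholds other than the 20 cm radius and 140 N cap are assumed.
from dataclasses import dataclass

PROXIMITY_RADIUS_M = 0.20       # capacitive field radius cited in the article
COLLISION_FORCE_N = 50.0        # assumed threshold for an unexpected contact
SAFE_MODE_FORCE_CAP_N = 140.0   # collaborative force ceiling
FULL_MODE_FORCE_CAP_N = 300.0   # assumed nominal force when no human is near

@dataclass
class SensorFrame:
    human_distance_m: float    # nearest detection from the capacitive sensor
    external_force_n: float    # magnitude reported by the force-moment sensor
    fingertip_force_n: float   # tactile reading from the fingertip sensors

def select_mode(frame: SensorFrame) -> tuple[str, float]:
    """Pick an operating mode and force cap for the current sensor frame."""
    if frame.external_force_n > COLLISION_FORCE_N:
        return "stop", 0.0                    # collision or malfunction detected
    if frame.human_distance_m < PROXIMITY_RADIUS_M:
        return "safe", SAFE_MODE_FORCE_CAP_N  # human hand inside the field
    return "full", FULL_MODE_FORCE_CAP_N      # normal operation

print(select_mode(SensorFrame(human_distance_m=0.15,
                              external_force_n=5.0,
                              fingertip_force_n=20.0)))  # -> ('safe', 140.0)
```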

The prototype gripper has built-in 3D cameras to help detect workpieces. An on-board touchscreen provides direct communication with the gripper for teaching or switching operating modes. Two different gripping types, parallel and angular, allow the JL1 to handle objects with varied geometries.  

This video illustrates the forward-looking technologies in the SCHUNK Co-act JL1 prototype gripper. 

These advanced capabilities will help enable the agile manufacturing environments required for Industry 4.0 and beyond, where humans and robots work collaboratively.

Ease of Use, Flexibility for SMEs
On Robot grippers were designed with collaborative applications in mind. The startup likes to say that its grippers are built for “plug-and-produce” automation. This is especially advantageous for small and midsize enterprises with low-volume, high-mix production that need to stay agile as needs change.

The electric 2-finger parallel grippers have on-board smarts enabled by software. This not only limits the force for use in human-robot collaborative applications, but also makes the servo grippers easy to integrate and implement.

“Beyond safety, there’s also the ease of use,” says Kristian Hulgard, VP of Sales - North America for On Robot A/S headquartered in Odense, Denmark. “When we go out and demo the product, it’s ready to pick and place items in 5 to 10 minutes. We cut a lot of engineering and programming hours out of the installation. That’s a huge part of what makes it collaborative.”

Robot grippers designed for human-robot collaborative operation load and unload metal parts in a CNC machine tending application. (Courtesy of On Robot A/S)

Hulgard says the CNC machining space is a huge market for collaborative robots, where cobots are often used to load and unload the machines. Most of these companies are small to midsize mom-and-pop shops with production schedules that often change daily.

“They are running 200 parts one day and then 300 parts the next day,” explains Hulgard. “With our smart gripper, the flexibility comes with being able to grab different sizes of objects with different force. You simply enter the size of the item you want to grab and how much force you want to apply, and away you go. The option to change the gripper’s functionality is a game changer. The return on investment is about 3 to 4 months. It’s easy to see the value in that.”
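To make that day-to-day changeover concrete, here is a minimal sketch of the kind of width-plus-force command Hulgard describes. The function name, parameters, and the assumed force limit are illustrative placeholders, not On Robot's actual API.

```python
# Sketch of a width-and-force grip command in the spirit described above.
# Names, parameters, and the 40 N limit are assumptions for illustration.

def grip(width_mm: float, force_n: float, force_limit_n: float = 40.0) -> dict:
    """Close to a target width with a chosen force, clamped to the gripper limit."""
    return {"target_width_mm": width_mm, "force_n": min(force_n, force_limit_n)}

# Day 1: 200 small machined parts, firm grip
print(grip(width_mm=25.0, force_n=40.0))
# Day 2: 300 fragile packaged items, gentle grip
print(grip(width_mm=60.0, force_n=5.0))
```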

Established in 2015 by cofounders Bilge Jacob Christiansen and Ebbe Overgaard Fuglsang, On Robot is the latest in a string of success stories to emerge from Denmark’s booming robotics cluster. The startup’s acting CEO and one of the original investors, Enrico Krog Iversen, is the former CEO of Universal Robots, one of the leading collaborative robot manufacturers in the world. Another major investor and Universal Robots alum is Thomas Visti, the CEO of Mobile Industrial Robots (MiR). Both Universal Robots and MiR experienced dramatic early growth and were acquired by Teradyne.

Drinking from the same lucky fountain, On Robot is experiencing nearly 300 percent annual growth, according to Hulgard, and looking forward to new products and global expansion. In June, On Robot will open its first North American regional office in Dallas, Texas.

Software-Enabled Smart Gripping
Currently, On Robot’s intuitive software interface is available for use with the Universal Robots UR series of cobots. On Robot grippers are part of the UR+ Solutions program for end effectors and other products certified for plug-and-play compatibility with UR cobots. 

“You can mount our gripper on any robot you want and it will work. But where we see the value is how you control the gripper in the software,” says Hulgard. “Right now, the software is only for Universal Robots, but we will introduce that software for other robot brands in the future.”

Operators use the touchscreen tablet that comes with the UR cobot to enter commands. 

“When you install our software into the robot, you cannot tell that it’s third-party software,” says Hulgard. “It becomes an integrated part of the Universal Robots PolyScope software. The same way that you teach the cobot, you can teach the gripper. It will do much of the work for you. Once you show the gripper the part to be picked by pushing the ‘close’ button on the screen, then it measures that part. With a click of a button, you then apply that size into your program. In the same programming process, you insert the force you want to apply to the part.”

Hulgard explains how you can set different force values depending on the nature of the items you’re handling. If it’s a CNC machine tending application, you might apply full force to ensure that you have a strong grip on the metal part. For a packing application with a fragile item where you need to be careful not to crush the product, you would apply a small force.


 

Check out this case study with On Robot grippers packing delicate herbs in a greenhouse.

“That’s where the flexibility of the gripper really shows. You can grip different types and different sizes of objects with the same gripper.”

The On Robot gripper comes in 2 kg (RG2) and 6 kg (RG6) payload models. The electrical connection is made directly to the tool flange, so there’s no cable running the length of the robot arm to the controller. The direct connection also makes it possible for the UR cobot to make endless rotations without getting entwined in the cable. 

A dual gripper enables the collaborative robot arm to handle more parts at a time, increasing productivity in this CNC machine tending application. (Courtesy of On Robot A/S)

A dual gripper configuration is available for both payload models. With a dual gripper, a part can be unloaded from the CNC machine in the same pass that a new part is loaded into the machine for processing. This increases productivity by reducing cycle time.

Watch the On Robot dual gripper on the job in this CNC machine tending application. 

“We go from a project time of about 29 seconds in that same cycle with a single gripper, down to about 17 seconds with the dual gripper. It’s almost half the time,” says Hulgard. “If we’re talking about huge batches where time is money, for the extra investment in the dual setup, it makes a lot of sense.”
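A quick back-of-the-envelope calculation shows what those quoted cycle times imply for hourly throughput; only the 29-second and 17-second figures come from the article, the rest follows from them.

```python
# Throughput implied by the cycle times quoted above (29 s single, 17 s dual).
single_cycle_s = 29.0
dual_cycle_s = 17.0

cycles_per_hour_single = 3600 / single_cycle_s   # ~124 cycles/hour
cycles_per_hour_dual = 3600 / dual_cycle_s       # ~212 cycles/hour

print(round(cycles_per_hour_single), round(cycles_per_hour_dual))
print(f"cycle time reduced by {(1 - dual_cycle_s / single_cycle_s):.0%}")  # ~41%
```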

By mounting the cobot on a mobile base, the customer can easily move the machine tending station from one CNC machine to another and handle a wide range of components of varying shapes and sizes. It’s important to note that no machine vision was used in this CNC machine tending application. The On Robot gripper is able to determine whether it’s gripping part A, B or C by detecting the different widths of the parts. Watch this intelligent gripper feature in action.
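A simple width-to-part lookup is enough to capture that idea; the nominal widths and tolerance below are invented for illustration and are not the customer's actual part data.

```python
# Sketch: identifying which part is in the jaws from the measured grip width,
# as described above (no machine vision). Widths and tolerance are made up.

PART_WIDTHS_MM = {"A": 20.0, "B": 35.0, "C": 50.0}
TOLERANCE_MM = 2.0

def identify_part(measured_width_mm: float) -> str | None:
    """Return the part whose nominal width is closest, within tolerance."""
    name, nominal = min(PART_WIDTHS_MM.items(),
                        key=lambda kv: abs(kv[1] - measured_width_mm))
    return name if abs(nominal - measured_width_mm) <= TOLERANCE_MM else None

print(identify_part(34.2))  # -> "B"
print(identify_part(42.0))  # -> None (no match; flag for an operator)
```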

Hulgard says On Robot welcomes feedback from users on its hardware, software, and any aspect of their application, with the goal of continuing to improve and add more value to the company’s products.

“We added depth compensation in response to customers’ suggestions. Since our fingers grip in an arcing pattern, we need to compensate for the height that the fingers are arcing. If you have to grip something very flat to a table, like a coin, then you need to move the robot upward while you’re gripping, so the fingers don’t hit the table.”
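The geometry behind that depth compensation can be sketched by assuming each fingertip sweeps a circular arc about a fixed pivot; the real gripper kinematics will differ, and the finger length and widths below are made up for the example.

```python
# Sketch of the depth-compensation idea described above: if the fingertips sweep
# a circular arc as they close, the tips reach further "down" as the grip width
# narrows, so the robot must rise by the difference. Assumes a rigid finger of
# length L pivoting about a fixed point; actual gripper kinematics will differ.
import math

def tip_drop_mm(finger_length_mm: float, width_open_mm: float,
                width_closed_mm: float) -> float:
    """Vertical travel of the fingertip as the opening width changes."""
    def vertical_reach(half_width_mm: float) -> float:
        return math.sqrt(finger_length_mm**2 - half_width_mm**2)
    return vertical_reach(width_closed_mm / 2) - vertical_reach(width_open_mm / 2)

# Example: 100 mm fingers closing from an 80 mm to a 10 mm opening on a coin;
# the robot should move up by roughly this much while closing.
print(round(tip_drop_mm(100.0, 80.0, 10.0), 1))  # ~8.2 mm
```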

With a lot of development and a little tweaking, software is making grippers smarter and more adept.

Reliable Piece-Picking
Not far from the Ivy League halls of Harvard University, another startup may get noticed for its innovative grippers. But once again, it’s the brain behind the brawn that’s noteworthy.

RightHand Robotics draws from several advanced technologies to automate individual item picking in warehouses and e-commerce fulfillment centers. They offer a hardware-software solution combining innovative grasping, advanced sensors, and artificial intelligence to ramp up the range and reliability of automated “piece-picking” in intralogistics.

Robotic piece-picking solution for intralogistics combines innovative grasping technology, intelligent sensors, computer vision, and machine learning to automate individual item picking in warehouses and fulfillment centers. (Courtesy of RightHand Robotics, Inc.)

We got our first look at the RightPick solution during last year’s Automate show. A curiously designed, 3-finger robotic hand with a suction cup protruding from its palm was proficiently picking items from bins in the Honeywell Intelligrated booth. Later, we walked across the hall to the co-located ProMat show. There, we found three cobots equipped with these hands picking random items from bins in the startup’s own booth.

They may have been tucked away in the back corner of the exhibit hall, but RightHand Robotics’ piece-picking debut stopped hundreds of fascinated showgoers in their tracks. Even more beguiling: how were they doing it? How were these robots picking hundreds of items of varied shapes and sizes with accuracy and speed?

Watch RightPick in action. From bottles, tubes and even cans of soup, to boxes, bags and shrink-wrapped multipacks, the variety of picked items is mind-boggling.

Fast-forward a year, RightHand Robotics was breaking world records for piece-picking at the MODEX supply chain event in April. RightPick workcells operating in five exhibitor booths picked and placed 131,072 items over the duration of the show. Coupled with UR cobots, the RightPick systems achieved pick rates of up to 1,000 units per hour across an assortment of items, including products that the system had never seen before.

The real story here? It’s about reliability, at rates nearly unimaginable just a few years ago.

“Over the course of the tradeshow, we can run as many picks as you would in a small warehouse over the course of a day,” says Leif Jentoft, one of the cofounders of RightHand Robotics, Inc. in Somerville, Massachusetts. “For us, this was really about the reliability of the system. Our systems are ready for primetime.”

Investors like Andy Rubin’s Playground Global are counting on it. RightHand Robotics has raised over $11 million in Series A funding.

Clever Grasping via the Cloud
The RightPick piece-picking solution relies on a host of intelligent hardware and software technologies. A compliant, rubber-jointed fingered hand with a suction cup (made by RIA member Piab) grasps various items with the aid of 3D depth cameras and other sensors. The fingers help stabilize an item, so the system can achieve a faster cycle time and pick heavier items. Computer vision helps the system figure out how to grasp the item. Artificial intelligence, specifically machine learning, is applied to improve the grasps over time. Data is shared with other robots via the cloud.
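Conceptually, the feedback loop works like the schematic sketch below: every pick attempt is logged with its outcome and pooled into a shared dataset that future grasp selection can draw on. The data structures and storage here are illustrative only, not RightHand Robotics' RightPick.AI implementation.

```python
# Schematic sketch of the cloud-pooled pick-feedback loop described above.
# Names, fields, and the in-memory "cloud" list are illustrative placeholders.
from dataclasses import dataclass, asdict

@dataclass
class PickAttempt:
    item_id: str
    grasp: dict        # e.g. suction point and finger-wrap parameters
    success: bool

SHARED_DATASET: list[dict] = []   # stands in for a cloud-hosted dataset

def log_attempt(attempt: PickAttempt) -> None:
    """Push the labeled attempt to the pooled dataset (a cloud upload in practice)."""
    SHARED_DATASET.append(asdict(attempt))

def success_rate(item_id: str) -> float:
    """Aggregate feedback across all robots that handled this item."""
    rows = [r for r in SHARED_DATASET if r["item_id"] == item_id]
    return sum(r["success"] for r in rows) / len(rows) if rows else 0.0

# Two sites report picks on the same SKU; the pooled rate informs both robots.
log_attempt(PickAttempt("sku-123", {"suction_z_mm": 12, "wrap": True}, True))
log_attempt(PickAttempt("sku-123", {"suction_z_mm": 30, "wrap": False}, False))
print(success_rate("sku-123"))  # -> 0.5
```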

“Our core product is the RightPick.AI software and then the gripper working together. The brain and the hand,” says Jentoft. “The brain gets feedback from the gripper about what works and what doesn’t work, and is able to get better. We’re at the stage now where we have systems on the ground in warehouses in Japan, Europe, Canada and the U.S. and we’re using all of those to pool into the dataset, through the cloud. 

“That’s really where the magic is. It’s about the datasets. We don’t think the winning proposition is training a robot on each different item. We think the value is in training a more general set of capabilities. We can take something we’ve never seen before and based on our experience, pick that up effectively.”

To understand how far this technology has come, it helps to take a look back. As a company, RightHand Robotics has been aggressively working on this challenge since its inception in 2014. The cofounders’ research, however, dates back much further. A closer look provides insight into some of the lesser-touted innovations behind the solution.

The RightHand Robotics technology team hails from the Harvard Biorobotics Lab, the GRAB Lab at Yale University, and the Massachusetts Institute of Technology. Many of them have doctorates in mechanical engineering, computer science, or robotics. Jentoft met his fellow cofounders, Yaro Tenzer and Lael Odhner, while the three were still in grad school.

“Yaro and I were at the Harvard Lab, and Lael was at the Yale lab, trying to build better robotic hands to handle things outside of carefully controlled factories. In the factory you have the same part presented the same way a million times. We were trying to understand how you do something where you don’t have that predictability.”

The three university researchers ended up working on a joint project with iRobot Corporation to develop an end effector for the DARPA Robotics Challenge. That work led to the iRobot iHY under-actuated hand, which later won the competition. For more on the early precursor to the RightPick hand, check out this article.

“We had these soft, compliant fingers with mechanisms that make it easier to grab items. We had been studying those since the early 2000s in Rob Howe’s lab at Harvard,” says Jentoft. “We were using tactile sensing. Every time you pick an item, you get sensor feedback as to what works and what didn’t work.”

Jentoft and Tenzer also cofounded TakkTile LLC, which makes tactile sensors. Their patent-pending technology uses microelectromechanical systems (MEMS) to provide inexpensive gram-level sensing in a robust form factor.

Robotic piece-picking system is able to grasp items it’s never seen before and share what it’s learned with other robots in the cloud. (Courtesy of RightHand Robotics, Inc.)

Multimodal Intralogistics Solutions
Since their postdoc days, Jentoft and his cohorts have redesigned the gripper mechanism to make it more industrial. They’ve also stripped it down to make it more reproducible and affordable.

“It’s important to note that we’re a hardware-enabled software company,” says Jentoft. “You need the right hardware to do the grasping even if you’re building the brain.

“In the last five years, with 3D printing and all of these off-the-shelf sensors that have been developed for the cell phone industry, it’s become much less expensive to build production-grade hardware. Hardware is always hard. But it’s getting easier.”

It’s the same for vision technology. “In the last few years, we’ve had good depth sensors out on the market. We use off-the-shelf sensors but the whole vision stack is internal. We’re able to use those 3D images to figure out how to pick up items that we’ve never seen before. When I started grad school that was a $10,000 problem. When the Kinect came out,” referring to Microsoft’s depth sensor, “it became a $150 problem.”

Jentoft stresses the importance of finding a balance between precision and speed. Their goal is not to be perfect at the expense of cycle time. They call it the 3Rs – range, rate and reliability.

“With Amazon breathing down everyone’s neck, it’s not just about did you do the right thing? It’s did you do it quickly? Can you do it scaled up? And can you do it with enormous labor shortages in the market? Stories of 20 percent absenteeism are very common. We were talking to a warehouse that has 300 percent annual turnover, also not unusual.” 

RightHand Robotics is focused on providing their RightPick solution for integration into existing warehouse technologies and workflows, including automated storage and retrieval system (AS/RS) tending, sorter induction, autobagger induction, and kitting. They partner with other intralogistics and e-commerce systems providers to deliver the whole solution. Examples include Honeywell Intelligrated robotic each picking, Tompkins Robotics t-Sort, and OPEX Corporation Perfect Pick. End users include e-commerce warehouses, retailers’ warehouses, and third-party logistics providers.

RightPick.AI software is robot agnostic. Although we usually see the RightHand Robotics gripper teamed up with a UR cobot, the hardware-software solution can be used with other collaborative robots or traditional industrial robots. The gripping system is designed to grasp items 2 kg or less, which is typical of the types of products handled in these e-commerce and intralogistics applications.

Grasping our Collaborative Future
Smart robot grasping has become a multidisciplinary endeavor. Solutions are coming from all corners of the engineer’s toolbox. Mechatronics, soft robotics, sensor technology, intuitive software, and now AI and cloud robotics – all are having an impact. The future will be collaborative at every level.

RIA Members featured in this article:
On Robot A/S
RightHand Robotics, Inc.
SCHUNK Inc.
