Force and Tactile Sensors Give Robots a Feel for the Job
By Tanya M. Anandan, Contributing Editor
Touch is one of our most valuable sensory inputs. Think about how you insert a key into a lock. You use your eyes to generally locate the position of the keyway, but you use your sense of touch to make the final adjustments as you slide the key into the opening. This comes naturally for humans. We barely think about it.
Robots also benefit from the sense of touch. Force and tactile sensors enable robots to manipulate objects in less-structured environments with greater precision and sensitivity.
Most robots designed for collaborative applications have built-in force and torque sensing capabilities for safety, especially when working in close proximity or directly with humans. But that’s not our focus here. We’re exploring force and tactile sensors, either standalone or integrated, that enable real work. The dull and repetitive, dirty and dangerous tasks we humans like to offload to our mechatronic workers.
Thanks to proven sensor solutions, new designs and smart software, we’re realizing a future with robots that “feel” what they touch.
In robotics, as in the human domain, sight and touch are complementary senses. Industry experts expect touch sensing will soon join vision technology in mainstream robotic applications.
“There aren’t many major manufacturing facilities in the world where you’re not going to find a vision system,” says Ian Stern, Product Manager, Force/Torque Sensors for ATI Industrial Automation in Apex, North Carolina. “That’s where force sensing is headed. As we start to automate jobs that require that sense of touch, force sensing will become as prevalent as vision.”
Force and torque sensing enables robotic processes such as grinding, deburring, sanding and polishing. In machine tending, a force torque sensor can help a robot locate jig stops when placing a part in a vice on a CNC machine. Force sensing also aids product testing, packaging, and robotic assembly applications.
Piston stuffing, or inserting a piston into an engine block, is a common application for force and torque sensing. The precision required by this application is beyond vision technology alone.
“These parts have a gap all the way around them that is a tenth the thickness of a human hair,” says Stern. “Vision is good, but it’s not good enough to locate a part with that kind of accuracy. And a robot is not accurate enough to align it perfectly. When you have those types of applications where the sense of touch is really the only way to get those parts together, that’s where force sensing comes in.”
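The insertion Stern describes is typically implemented as a force-guided search: the robot feeds gently along the insertion axis while making tiny lateral corrections based on the reaction forces it feels. The sketch below shows the general idea in Python; the `robot` and `sensor` interfaces are hypothetical stand-ins, not any vendor's API, and the gains and feed rates are illustrative only.

```python
import numpy as np

def force_guided_insert(robot, sensor, insert_axis=2,
                        contact_force=5.0, lateral_gain=0.0005,
                        target_depth=0.02, dt=0.01):
    """Feed gently along the insertion axis; nudge sideways against
    measured reaction forces until the part seats at target_depth (m)."""
    start = robot.get_tool_position()
    while robot.get_tool_position()[insert_axis] - start[insert_axis] < target_depth:
        f = np.array(sensor.read_force())       # [Fx, Fy, Fz] in newtons
        step = np.zeros(3)
        step[insert_axis] = 0.0002              # slow downward feed (m per cycle)
        # Lateral reaction forces indicate misalignment; move to relieve them.
        for ax in range(3):
            if ax != insert_axis:
                step[ax] = -lateral_gain * f[ax]
        if abs(f[insert_axis]) > contact_force: # jammed: back off slightly
            step[insert_axis] = -0.0001
        robot.move_relative(step, duration=dt)
```

In practice the same loop runs inside the robot controller at high rate; the point is that the lateral correction comes from the force signal, not from vision.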
Force torque sensors come in varied types based on strain gage, optical or capacitive technology, among others. Each technology has different performance levels, longevity, calibration requirements, and cost. We’ll leave the details on how force sensors work and the myriad applications, both in and out of the industrial space, to other sources that have already addressed the subject.
ATI six-axis force torque sensors measure force along the x, y and z axes and torque about each of those axes. They are machined from a single solid piece of aluminum for a very rigid, monolithic structure. ATI uses silicon strain gage technology, which can measure an extremely small amount of deflection. That minimal deflection gives the sensor both high resolution and high stiffness.
“Even when we put a very heavy load on the end, it’s deflecting a very small amount. This is really important for robots,” says Stern. “A lot of people spend extra money on these high-precision robots that have a very accurate tool center point. If the sensor is deflecting a lot under load, you’re giving up that accuracy.”
Stern says electronics assembly is driving a lot of demand for ATI’s newest force torque sensor, the Axia80. With its smaller form factor and cost-effective design, the Axia80 fits right into this niche.
Force and Torque Sensing at Scale
Even in China, where labor costs are rising, it’s become cost effective to automate. Force torque sensors are bringing that sense of touch to small parts assembly tasks that were once the exclusive domain of human fingers.
“Things like putting memory modules into a computer, you don’t really know you did it right until you feel it click,” says Stern. “Or when you place glass onto a phone, in a lot of cases those are glued down. You need to make sure you’re pressing evenly across the entire surface for it to seal properly.”
ATI was able to pack all of the electronics for the Axia80 into a smaller package with one cable that satisfies the power and communication requirements for external interfaces. The output is all digital using Ethernet, EtherCAT or USB serial communication.
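An all-digital, network-connected sensor like this can be polled directly from application code. ATI's networked force torque sensors stream readings over UDP; the sketch below follows the general shape of that kind of streaming interface, but treat the port number, packet layout and scale factors as assumptions to verify against the specific sensor's manual before use.

```python
import socket
import struct

RDT_PORT = 49152                # streaming port (verify in the sensor manual)
COUNTS_PER_FORCE = 1_000_000    # sensor-specific scale factor (assumption)
COUNTS_PER_TORQUE = 1_000_000

def build_rdt_request(sample_count=1):
    # header word, command 2 = start real-time streaming, sample count
    return struct.pack(">HHI", 0x1234, 2, sample_count)

def parse_rdt_packet(data):
    """Unpack one 36-byte response into (forces_N, torques_Nm)."""
    _seq, _ft_seq, _status, fx, fy, fz, tx, ty, tz = struct.unpack(">IIIiiiiii", data)
    forces = tuple(v / COUNTS_PER_FORCE for v in (fx, fy, fz))
    torques = tuple(v / COUNTS_PER_TORQUE for v in (tx, ty, tz))
    return forces, torques

def read_once(sensor_ip):
    """Request and parse a single force/torque sample over UDP."""
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as s:
        s.settimeout(1.0)
        s.sendto(build_rdt_request(), (sensor_ip, RDT_PORT))
        data, _ = s.recvfrom(36)
        return parse_rdt_packet(data)
```

The appeal of a digital interface is exactly this: no analog signal conditioning, just a socket and a few lines of parsing.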
This force torque sensor is shown mounted to a Universal Robots collaborative robot, but it can be used on a variety of robots that can carry payloads up to 10-12 kg. The Axia80 is part of the UR+ Solutions program for products certified for plug-and-play compatibility with UR cobots. The software is available for free download on the ATI website.
“Part of the reason Universal Robots has been successful is because they developed this URCaps platform,” says Stern. “It’s kind of like the app development platform for smartphones. It allows us to go in and write software to work on their robot.”
Force and torque sensing is keeping up with the latest trends. One of those is the robots-as-a-service (RaaS) model, where robots are leased for use instead of purchased as capital equipment. One of ATI’s customers is pairing the RaaS model with ATI force torque sensors to monitor processes.
“A company we’re working with called Hirebotics leases mobile workcells with cloud interactivity,” explains Stern. “They set up these robot cells with our sensors and whatever safety equipment and tooling they need. Through Hirebotics, manufacturing facilities can temporarily outsource various tasks, such as riveting and machine tending, to robots. The robots are rented by the second and can run the application for as long as needed.”
Stern says the customers use the force data from those applications to determine if tooling is wearing abnormally or if other issues are causing problems.
“You can either use the force sensor to guide a process, or you can use the sensor data to monitor a process,” he says. “A lot of robot manufacturers are pushing connectivity and Industry 4.0; these analytical tools give users a good picture of their entire manufacturing facility from their desktop. Force sensors are definitely starting to play a bigger role in this respect. It’s not mainstream or common today, but that’s the direction where things are headed.”
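Monitoring a process with force data, as Stern describes, often amounts to tracking a per-cycle statistic against a rolling baseline and flagging drift. Here is a minimal sketch of that idea; the class name, window size and threshold are illustrative assumptions, not any vendor's analytics product.

```python
from collections import deque
import statistics

class ForceTrendMonitor:
    """Flag cycles whose peak force drifts from a rolling baseline,
    a common signal of abnormal tool wear."""
    def __init__(self, window=50, threshold_sigma=3.0):
        self.history = deque(maxlen=window)   # peak force of recent cycles
        self.threshold = threshold_sigma

    def check_cycle(self, force_samples):
        """Record one cycle's force trace; return True if its peak
        force is a statistical outlier versus recent cycles."""
        peak = max(abs(f) for f in force_samples)
        anomalous = False
        if len(self.history) >= 10:           # need a baseline first
            mean = statistics.fmean(self.history)
            stdev = statistics.pstdev(self.history) or 1e-9
            anomalous = abs(peak - mean) / stdev > self.threshold
        self.history.append(peak)
        return anomalous
```

A flagged cycle is exactly the kind of event that would surface in a cloud dashboard as "tooling wearing abnormally at cycle N."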
Force torque sensors enable all kinds of interesting applications.
It was an ATI force torque sensor (actually two of them, one on each robot wrist) that university researchers in Singapore used to help assemble an IKEA chair. The internet was abuzz when a pair of robots accomplished in minutes what often leaves humans exasperated. The force torque sensors help the DENSO robots detect where to insert dowels into holes on different chair components.
Check out this video and article for the full story on how robots are getting better at furniture assembly, but why it’s still not as easy as it appears. Sophisticated software played a major role in this three-year research project.
Almost every robotics discussion we have these days comes down to smart software. Empowering a robot’s sense of touch is no different.
“The sensors are one thing, but doing something useful with it is something different,” says Jean-Philippe Jobin, Chief Technical Officer and Cofounder of Robotiq in Lévis, Quebec, Canada.
Robotiq got its start as one of the first manufacturers to design adaptive grippers for collaborative robots. The manufacturer just released the new Hand-E gripper. Robotiq also supplies vision and force torque sensing solutions primarily focused on the cobot market. Jobin is excited by the global explosion of collaborative robots.
“Ten years ago when we started the company, the cobots industry barely existed. Now, there’s a lot of new players and a lot of excitement about that. Next week will be our 10 year anniversary. We have about 90 employees. They will come with their families and I will see how their kids have grown. That’s exciting for me.”
“The motto of Robotiq is to free human hands from dirty, dull and dangerous jobs. We free their mind and their talent to do something with more value.”
Robotiq’s force torque sensor is freeing human hands from a dull and potentially dangerous job. The FT 300 Force Torque Sensor is a six-axis device and uses a patented capacitive technology that helps minimize electrical noise. It is fully digital and calibrated by the manufacturer. Jobin says it will keep its calibration for the sensor’s service life.
For Easier Programming
When coupled with advanced software, Robotiq’s force torque sensor makes programming a robot for certain tasks much easier and faster. One of Robotiq’s customers uses the FT sensor to record the movements of a robotic process for polishing glass.
Watch this Robotiq case study video to see how force torque sensors help program robots.
As humans, when we move, it’s usually one continuous movement, not point by point the way most robots are programmed. Processes such as polishing, deburring, sanding, gluing and painting require these fluid-like, continuous movements.
As Robotiq’s customer says, programming a robot movement that must follow a volume in space is a complicated thing to do. With the path recording function of the FT sensor, the operator can simply grasp the end-of-arm device and make the intended movements. The FT sensor records the force and the direction applied by the operator. Then the robot reproduces the operator’s motion. This helps avoid the risk of operator injuries caused by ergonomically challenging and repetitive movements.
Robotiq’s software records the trajectory as the operator demonstrates it. Then tweaks are made using the software interface to ensure the trajectory has a constant speed, something even we humans have difficulty maintaining. Other refinements to the movement are also made via the software tools.
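The constant-speed cleanup step can be understood as reparameterizing the recorded waypoints by arc length, then sampling that path at even intervals. A minimal sketch of the idea in Python follows; this is an illustration of the general technique, not Robotiq's implementation.

```python
import numpy as np

def resample_constant_speed(waypoints, speed, dt):
    """Reparameterize a hand-guided trajectory (Nx3 positions, meters)
    so the tool traverses it at a constant linear speed (m/s),
    emitting one waypoint per control period dt (s)."""
    pts = np.asarray(waypoints, dtype=float)
    seg = np.linalg.norm(np.diff(pts, axis=0), axis=1)
    s = np.concatenate([[0.0], np.cumsum(seg)])   # arc length at each waypoint
    total = s[-1]
    n_steps = max(2, round(total / (speed * dt)) + 1)
    s_new = np.linspace(0.0, total, n_steps)      # evenly spaced arc lengths
    # Interpolate each coordinate against arc length.
    return np.column_stack([np.interp(s_new, s, pts[:, k])
                            for k in range(pts.shape[1])])
```

The human demonstrator's uneven pacing disappears in the resampling: what survives is only the geometry of the path.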
Get Started with Apps/Skills
The software intelligence doesn’t stop there. Robotiq offers a new FT Mode node for the URCaps interface. It allows you to enter the direction and magnitude you want the robot to apply a certain force, making programming tasks like polishing, sanding, deburring or insertion much easier.
This video shows the FT Mode node in action on a polishing demo in Robotiq’s booth at the recent Automatica show.
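Under the hood, holding a commanded force in a given direction is typically a closed loop: compare the measured force to the target and emit a small velocity correction. The sketch below shows a generic PI version of that loop; the gains and class are illustrative assumptions, not Robotiq's controller.

```python
class ForceController:
    """PI loop: turns force error along one axis into a velocity
    command, e.g. to press a polishing tool with a constant 10 N."""
    def __init__(self, f_target, kp=0.002, ki=0.0005, v_max=0.05):
        self.f_target = f_target     # desired contact force (N)
        self.kp, self.ki = kp, ki    # proportional / integral gains
        self.v_max = v_max           # velocity limit (m/s)
        self.integral = 0.0

    def update(self, f_measured, dt):
        """One control step: return a clamped velocity command (m/s)."""
        error = self.f_target - f_measured
        self.integral += error * dt
        v = self.kp * error + self.ki * self.integral
        return max(-self.v_max, min(self.v_max, v))
```

The user-facing part, entering a direction and a magnitude in the FT Mode node, is essentially setting `f_target` and the axis for a loop like this.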
Robotiq provides instructional documentation and how-to videos for different processes. Jobin recommends that users start with the provided parameters depending on their particular application and then adjust as needed. The powerful software interface allows for extensive editing.
Robotiq also provides skills, which are basically pre-bundled apps designed to accomplish a specific task. These are similar to the way popular voice-assistant platforms use skills to control the lights in your home or order a pizza from your local delivery service. For instance, if your robot needs to push a button or insert a cable into a receptacle, such as an Ethernet cable into a port, Robotiq offers the Click Detection Skill.
As humans, we hear and feel a click when we perform these tasks. The robot senses the sudden drop in force and can infer that the task was completed properly. The skill can be used for assembly tasks or part validation.
Watch the Click Detection Skill in action with a robot depressing a button and a lever, and inserting a PCB cable.
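In a force trace, a click has a characteristic signature: force builds as the connector resists, then collapses abruptly at snap-through. A minimal detector for that signature might look like the sketch below; the thresholds are illustrative assumptions, and this is not Robotiq's skill implementation.

```python
def detect_click(force_trace, min_peak=5.0, drop_ratio=0.5, window=3):
    """Return the index where insertion force first exceeds min_peak (N)
    and then collapses to below drop_ratio * that value within `window`
    samples (the snap-through), or None if no click is found."""
    for i in range(len(force_trace) - window):
        peak = force_trace[i]
        if peak >= min_peak:
            after = min(force_trace[i + 1:i + 1 + window])
            if after < drop_ratio * peak:
                return i
    return None
```

The same signature in reverse, a missing drop, is how a tester could conclude a button stopped clicking at, say, 1.6 million cycles.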
“We have some customers testing for millions of cycles with different products having buttons on them,” says Jobin. “They can detect when a button stops working and determine it failed at 1.6 million cycles, for example.”
Robotiq has more than 20 skills available, free to anyone to download.
While force torque sensing is gaining in prevalence, especially with help from quick-start apps and sophisticated software engineering, tactile sensing is just starting to emerge in the wild.
Exploring Tactile Sensing
Tactile sensing is still largely reserved for research applications. Wear is a major issue in industrial applications, because the sensor must make direct contact with an object to capture a measurement. It must be able to withstand millions of cycles.
Still, manufacturers are starting to investigate the practicality of integrating tactile sensing into their grippers. One of those prototypes was featured in Smarter Robot Grasping with Sensors, Software, the Cloud.
Tactile sensing helps detect defects on the surface of an object, determine exact gripping force so you can adjust for fragile items, or validate whether you’ve gripped the correct part, among other applications.
Jobin says tactile sensing is where force torque sensing was about 10 years ago, when he and his cofounders were still pursuing their university educations. The applications are few, but there’s great potential. Once again, he thinks software will play a major role in what people do with that tactile sensor data.
One mountain-state robotics startup has high aspirations for tactile sensing. Robotic Materials is a spin-off from the University of Colorado Boulder. Founder and CTO Nikolaus Correll is an associate professor at CU Boulder and Director of IRT on Multi-Functional Materials. He did his postdoctoral work with Daniela Rus’ Distributed Robotics Laboratory at MIT.
Correll says he’s sold about $15,000 worth of their patent-pending tactile sensors since the company’s 2016 inception, mostly to researchers and other universities. Now the startup is on a new mission.
They have married tactile sensing with 3D vision in a novel robot gripper design and coupled that with intuitive user interfaces for an out-of-the-box grasping solution. Applications include pick and place, bin picking, machine tending, kitting and assembly. The startup is in the second round of angel investing and was awarded SBIR seed funding by NSF, so they can focus on getting product out the door.
This video captures the novelty of Robotic Materials’ solution. You see their gripper installed on a collaborative robot arm, which is then mounted on a Canvas Autonomous Cart. The mobile manipulator maneuvers around other machinery before stopping alongside a table. There it grabs small screws and washers from three different bins of randomly oriented items and drops them into another bin on the cart.
“The robot stops within a 10-centimeter accuracy circle,” says Correll. “Then it finds the bin, finds something inside of it, and then grabs the object.” (In this case, very small screws; random bin picking is no small feat.)
This was not a carefully choreographed demonstration as we’re accustomed to seeing in viral videos of robots opening doors and doing backflips. Correll says they received the mobile robot cart from Canvas Technology and had the demo the next day.
“It was very easy to put that together,” says Correll. “A Roomba (robot vacuum) is great, but imagine if it could pick up stuff that is in the way, or a hotel robot that could load and unload itself. All of these mobile applications really become valuable when they can also do manipulation. That’s something we can do now.”
Robotic Materials also provides the API and a graphical programming environment for configuring the application.
“The video you’ve seen is all programmed in our environment, where the hand really controls both the arm and the cart,” says Correll. “You can quickly drag and drop these things together, but you need that bin picking block in order to pick the things out of the bin.”
Small and midsized enterprises (SMEs) could benefit from a similar setup, where an autonomous mobile cart may be more cost effective than a fixed conveyor belt. Add a gripper arm with integrated multimodal sensors and you could have an out-of-the-box mobile bin picking solution.
Correll says the next step is to add a tactile sensing skin for safety. Turns out, that’s right up their alley. He collaborated on a paper entitled A Robotic Skin for Collision Avoidance and Affective Touch Recognition, and is now pursuing a grant to develop an aftermarket skin for robot arms.
Sensor Fusion with Proximity Sensing
Robotic Materials, the company, uses patent-pending proximity sensors for tactile perception in their gripper. The sensors use infrared proximity sensing embedded in a transparent, abrasion-resistant polymer. The technology allows for proximity sensing at a distance and enables force sensing upon contact due to the internal deformation of the polymer. This makes the sensor capable of zero-force contact sensing.
Zero-force contact sensing is important if you want to know where an object is without exerting force upon it. With a force sensor alone, you can’t tell you’ve made contact with an object until it has actually moved, even if the movement is minute. Correll says you need tactile sensing for that. Without tactile perception, if a robot attempts to pick up a marble or another very lightweight item, that object will move before the robot can sense contact.
Robotic Materials’ technology integrates both an infrared emitter-detector and a MEMS barometer to form a proximity, contact, and force sensor. Proximity sensing allows the robot to make up for inaccuracies of visual perception and manipulation. Zero-force contact sensing allows the robot to validate pose without exerting forces. Pressure (force) sensing allows the robot to confirm the object pose relative to its body.
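Fusing the two signals described above amounts to classifying each reading into a discrete state the grasp logic can act on. The sketch below illustrates that idea; the thresholds, units and function are illustrative assumptions, not Robotic Materials' firmware.

```python
def classify_contact(ir_reading, pressure, ir_near=0.8, ir_touch=0.95,
                     p_baseline=101_325.0, p_contact=50.0):
    """Fuse an IR proximity reading (0..1, rising as the object nears)
    with a MEMS barometer reading (Pa) into one of four states.
    All thresholds are illustrative, not the vendor's values."""
    dp = pressure - p_baseline
    if dp > p_contact:
        return "force"       # polymer deformed: measurable load applied
    if ir_reading >= ir_touch:
        return "contact"     # touching the object, but near-zero force
    if ir_reading >= ir_near:
        return "proximity"   # object close, not yet touched
    return "free"
```

The "contact" state, touch without force, is precisely what a pure force sensor cannot report, and it is what lets the gripper confirm a pose around a marble without moving it.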
You can see the proximity sensors at play in this video demonstrating a multi-part gearbox assembly. At first the robot gripper (not a Robotic Materials gripper) equipped with the proximity sensor-enabled fingers scans each component to determine its approximate location, shape, size and orientation. Then when it grasps the parts, you see the “touch” indicator spike. Using the sensor’s combination of proximity, contact and force sensing, the robot is able to wiggle the shaft into the hole and then screw it into the base.
“The tactile sensing technology allows the robot to see where conventional cameras cannot, for example, right before the grasp or inside a bin,” says Correll. “Because if you are 0.5 mm off, you haven’t made contact, which you can’t see. If you drive in too fast, the part has already moved. You’re already exerting force. That is really where the touch comes in.”
The benefit of tactile sensing alone, however, is marginal, says Correll. That’s why they embedded a 3D camera in the palm of their gripper.
Robotic Materials’ gripper is a four-bar linkage design with two fingers that move independently. This design plus impedance control enable the gripper to handle a variety of objects, including delicate items such as fruit.
Watch this novel gripper design in action and the interplay between tactile and force sensing as the Robotic Materials gripper “picks” strawberries.
Novel designs, precision sensors, and sophisticated software give robots a sense of touch. They do the dull and repetitive work. Humans reap the benefits again and again.