Industry Insights
Advancing Artificial Intelligence in Motion Control
POSTED 03/20/2019 | By: Ray Chalmers, Contributing Editor
However you might define it, companies are discussing and adopting artificial intelligence (AI) applications in ever-growing numbers, and the pace is only going to increase. According to Gartner’s 2019 CIO Survey of more than 3,000 executives in 89 countries, AI implementation grew 270 percent in the past four years, reaching 37 percent of organizations. That’s up from 10 percent in 2015, which isn’t too surprising considering that by some estimates, the enterprise AI market will be worth $6.14 billion by 2022.
“We still remain far from general AI that can wholly take over complex tasks, but we have now entered the realm of AI-augmented work and decision science — what we call ‘augmented intelligence,’” Chris Howard, distinguished research vice president at Gartner, said. “If you are a CIO and your organization doesn’t use AI, chances are high that your competitors do and this should be a concern.”
Among the CIOs surveyed, whose employers represent $15 trillion in revenue and public-sector budgets and $284 billion in IT spending, deployment of AI tripled in the past year, rising from 25 percent in 2018. Gartner attributes the climb to the “maturation” of AI capabilities and the rapidity with which AI has become an “integral part” of digital strategies.
That’s in alignment with Deloitte’s second State of AI in the Enterprise report, released in the fall of 2018, in which 42 percent of executives said they believed that AI would be of “critical importance” within two years. The same report showed that natural language processing outstripped all other categories in growth, with 62 percent of companies reporting having adopted it (up from 53 percent a year earlier). Machine learning came in second with 58 percent (up 5 percent year-over-year), and computer vision and deep learning followed close behind, with 57 percent and 50 percent adoption, respectively (up 16 percent from 2017).
The first practical steps towards artificial intelligence were taken in the 1940s. Today, AI is reaching a historical moment because of several converging factors:
Bigger data: Many devices have given us access to vast amounts of data to process, both structured (in databases and spreadsheets) and unstructured (such as text, audio, video, and images). As trillions of sensors are deployed in appliances, packages, clothing, autonomous vehicles, and elsewhere, “big data” will only get bigger. AI-assisted processing of this information allows us to use this data to discover historical patterns, predict more efficiently, make more effective recommendations, and more.
Processing power: Accelerating technologies such as cloud computing and graphics processing units have made it cheaper and faster to handle large volumes of data with complex AI-empowered systems through parallel processing. In the future, “deep learning” chips – a key focus of research today – will push parallel computation further.
A connected globe: Global manufacturing supply chains together with social media platforms have fundamentally changed how individuals interact and what information they can expect and when. Increased connectivity is accelerating the spread of information and encouraging the sharing of knowledge, presaging the emergence of a “collective intelligence”, including open-source communities developing AI tools and sharing applications.
Open-source software and data: Open-source software and data are accelerating the democratization and use of AI, as can be seen in the popularity of open-source machine learning standards and platforms. An open-source approach can mean less time spent on routine coding, industry standardization, and wider application of emerging AI tools.
Improved algorithms: Researchers have made advances in several aspects of AI, particularly in “deep learning”, which involves layers of neural networks designed in a fashion inspired by the way the human brain processes information. Another emerging area of research is “deep reinforcement learning,” in which the AI agent learns with little or no initial input data, by trial and error guided by a reward function (a simplified sketch of this loop follows below).
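The “deep” variants pair this idea with neural networks, but the reward-driven trial-and-error loop itself can be shown in a few lines. Below is a minimal tabular Q-learning sketch; the toy environment, reward values, and hyperparameters are illustrative assumptions, not taken from any product or system discussed in this article.

```python
# Minimal tabular Q-learning sketch: the agent starts with no data and learns
# purely by trial and error, guided by a reward signal.
# The toy "walk to the goal" environment is an illustrative placeholder.
import random

N_STATES, ACTIONS = 10, (-1, +1)          # positions 0..9; move left or right
GOAL = N_STATES - 1
q = {(s, a): 0.0 for s in range(N_STATES) for a in ACTIONS}
alpha, gamma, epsilon = 0.1, 0.9, 0.2     # learning rate, discount, exploration

for episode in range(500):
    s = 0
    while s != GOAL:
        # epsilon-greedy: mostly exploit what has been learned, sometimes explore
        a = random.choice(ACTIONS) if random.random() < epsilon \
            else max(ACTIONS, key=lambda act: q[(s, act)])
        s_next = min(max(s + a, 0), N_STATES - 1)
        reward = 1.0 if s_next == GOAL else -0.01   # the reward function drives learning
        best_next = max(q[(s_next, act)] for act in ACTIONS)
        q[(s, a)] += alpha * (reward + gamma * best_next - q[(s, a)])
        s = s_next

print("learned policy:", [max(ACTIONS, key=lambda act: q[(s, act)]) for s in range(N_STATES)])
```

Deep reinforcement learning replaces the lookup table with a neural network so the same loop can scale to continuous, high-dimensional states such as images or multi-axis joint positions.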
Research on AI algorithms has been moving quickly, especially since big data has combined with statistical machine-learning algorithms. Narrow, task-driven AI techniques, already important in many industrial applications, are now working with big data to allow pattern recognition in unstructured text and images. The potential of deep learning using neural network architecture continues to grow – as computers become faster and big data ever more prevalent.
Experts expect that supervised and unsupervised learning techniques will become increasingly blended and that such hybrid techniques will open the way for human-machine collaborative learning and for AI to develop more advanced, human-like, capabilities.
The convergence of these factors has helped AI move from in vitro (in research labs) to in vivo (in everyday lives). Established corporations and start-ups alike can now pioneer AI advances and applications. Indeed, many people are already using AI-infused systems, whether they realize it or not, to navigate cities, shop online, find entertainment recommendations, filter out unwanted emails, or share a journey to work.
The spectrum of AI definitions and focus is also expanding and now includes:
Automated intelligence systems that take repeated, labor-intensive tasks requiring intelligence and automatically complete them. Resistance welding is one example. Resistance welding used to be air- or hydraulic-driven, with sensors that told the robot to squeeze to a given pressure and fire the current. The result, according to FANUC North America, was over-welding by 30% on average.
With the luxury of a developmental R&D lab that brings together emerging research in robotics, linear motors, CNC controls, and sensors with industrial applications experience, FANUC has produced what it calls Learning Vibration Control (LVC).
Combining cameras and software, “gakushu” (studying/learning) robots equipped with the LVC package automatically compensate for fine variations in fixturing and other conditions in real time, adjusting their motion for up to 15% cycle-time improvements in spot-welding processes, all with quality checks that achieve tolerances not reachable manually.
Assisted intelligence systems that review and reveal patterns in historical data, such as unstructured social-media posts, and help people perform tasks more quickly and better by using the information gleaned. For example, techniques such as deep learning, natural language processing, and anomaly detection can uncover leading indicators of hurricanes and other major weather events.
Augmented intelligence systems that use AI to help people understand and predict an uncertain future. For example, AI-enabled management simulators can help examine scenarios involving climate policy and greenhouse gas emissions.
Autonomous intelligence systems that automate decision-making without human intervention. For example, systems that can identify patterns of high demand and high cost in home heating, adapting usage automatically to save a homeowner money.
Algorithmic Intelligence/Smart Devices
Perhaps closest to our attempt to define and discuss AI in motion-control devices and applications, Dr. Rashmi Misra, general manager for AI business development at Microsoft, added algorithmic intelligence to the list of AI definitions in her MCMA TechCon keynote in November 2018.
AI developmental highlights Misra noted include Alan Turing’s famous Turing Test in 1950, the coining of the term “artificial intelligence” by John McCarthy in 1956, and the first “AI winter” of 1971-1974, when DARPA cut all AI funding. More recent AI accomplishments include IBM’s Deep Blue defeating chess grandmaster Garry Kasparov in 1997, IBM’s Watson winning Jeopardy over Ken Jennings in 2011, and Google’s DeepMind defeating a 9th dan Go champion in 2017.
Microsoft’s AI leadership includes more than 1.2 million developers using the company’s cognitive services online. “Last year alone we released more AI research papers than any other organization,” says a company spokesperson. Dr. Misra describes AI as integral to the ongoing industrial transformation from physical production assets and systems to digital smart products and connected enterprises. AI-enabled analytics are at the heart of a digital feedback loop taking operational data, product telemetry, and employee and customer feedback and delivering more efficient operations, better products, more effective employees, and deeper customer relationships. “Modern manufacturers are embracing customer centricity, innovating faster, and becoming more agile,” she says.
In June 2018, Microsoft announced its acquisition of Bonsai, based in Berkeley, California. The company is building a general-purpose deep reinforcement learning platform especially suited for enterprises leveraging industrial control systems such as robotics, energy, HVAC, manufacturing, and autonomous systems in general. This includes unique machine-teaching innovations, automated model generation and management, as well as pre-built support for leading simulations. Using Bonsai’s AI platform and machine teaching, subject-matter experts from Siemens with no AI expertise trained an AI model to auto-calibrate a computer numerical control (CNC) machine 30 times faster than the traditional approach, a significant milestone in industrial AI with far-reaching implications across the broader sector.
More Information for Better Decisions
Don Baughan, regional sales engineer for Elmo Motion Control, echoes this decidedly real-world approach when it comes to intelligent motion-control devices. “We’re all working for somebody who’s doing their best working for their customers. Where smart devices are concerned, more than how it works, it’s critical they understand what they can do with it.”
At the component level, Baughan sees AI as the ability to detect and effect change as forces act on the device – supplying more torque, more current, more feedback – for a customer’s given system.
Control engineering is an important factor. At the most basic level, there is current mode, or “dumb brute force,” as Baughan defines it. This translates a given level of input current to a given level of torque. At the next level is velocity mode, which can detect where things are in a multi-axis three-dimensional space and recognize kinematic motion to determine where such things are going relative to the application.
The highest control level is position mode. “This gets exponentially better at enabling AI-like functionality – where motors and drives can respond to upper-level control and detect and respond to changes in real time,” Baughan adds.
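To make the three levels concrete, here is a minimal sketch of a cascaded loop in which a position controller commands velocity, a velocity controller commands current, and the current stage produces torque on a simple one-mass load. The gains, torque constant, and plant model are illustrative assumptions, not Elmo’s (or any vendor’s) implementation.

```python
# Sketch of a cascaded motion-control loop: position error -> velocity command,
# velocity error -> current command, current -> torque on a toy rigid-body load.
# All gains and plant parameters are illustrative placeholders.
KP_POS, KP_VEL, KT = 20.0, 5.0, 0.5   # position gain, velocity gain, torque constant
J, DT = 0.01, 0.001                   # load inertia [kg*m^2], servo period [s]

pos, vel = 0.0, 0.0                   # plant state
target = 1.0                          # commanded position [rad]

for _ in range(2000):                 # two seconds of simulated servo cycles
    vel_cmd = KP_POS * (target - pos)     # position mode: highest-level loop
    cur_cmd = KP_VEL * (vel_cmd - vel)    # velocity mode: mid-level loop
    torque = KT * cur_cmd                 # current mode: "dumb brute force"
    vel += (torque / J) * DT              # integrate the toy plant
    pos += vel * DT

print(f"final position: {pos:.3f} rad (target {target:.1f} rad)")
```

The AI-like behavior Baughan describes comes from layering sensing and higher-level logic on top of this structure, so commands and gains can be adjusted in real time as conditions change.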
He goes on to add that motion control is where the rubber meets the road in terms of AI. “You can talk about networks, protocols, and algorithms, but it’s all ones and zeroes. Motion control has to happen in real time. At the end of the day, something has to move under control.”
Aaron Dietrich, director of marketing and product management for Tolomatic, referenced artificial intelligence in a recent interview. “Tolomatic continues to invest in leading electric actuator technologies including integrated servo motors, integrated drives/controls, expanding force capabilities, and engineering tools,” he said. “I believe a key emerging technology is what the industry is calling ‘machine learning’ or artificial intelligence. Machines are becoming more and more intelligent and increasingly able to make decisions without human intervention. Examples of this are already in the market with the explosion of autonomous robots/vehicles for all sorts of tasks, from household chores to driverless vehicles to military applications, to name a few. As this trend continues, it will drive an even further explosion in automation growth for all different types of components and automation technologies.”
Yet Dietrich adds that device “intelligence” is not necessarily a function of the drive or servo motor. “We can build the actuators and add sensors, but that’s not necessarily artificial intelligence,” he says. For that, automation technology still needs to include the IP: the algorithms that allow a DSP (digital signal processor) to “speak” I/O (input/output). Now sensor inputs can feed more variables up to a drive or motor and form a more “intelligent” system, given the device’s ability to adjust its state relative to conditions.
Living in the Real World
Mike Chen, director of Omron’s Automation Center Americas, concurs that interest is building in an AI approach to motion control but specifies that it has to take place in the real world. “Whether it’s people or the technology making the decisions, it has to be based on real time in the real world, collecting, analyzing, and utilizing the data, with all its intellectual, ethical, and often safety-related implications.”
Such an approach makes AI an effective bridge between IT (information technology) and OT (operational technology), leveraging the intelligence of human assets (manufacturing engineers, operators, quality and maintenance personnel) with device intelligence. “The entire process becomes more robust the more we decide what to do with both the data and the decisions we can trust technology to make,” he says.
For example, devices such as Omron’s Sysmac AI controllers are able to identify abnormal machine behavior without being explicitly programmed to do so. Since there could be many different factors and measurements that indicate an issue only when observed together, automating the feature-extraction process saves a significant amount of time and resources. Leveraging machine learning results during production is key to ensuring cost savings.
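As a generic illustration of that kind of multivariate anomaly detection (not Omron’s actual algorithm), the sketch below summarizes a baseline of normal cycles by mean and covariance and scores new cycles by Mahalanobis distance, so a combination of individually unremarkable readings can still raise an alarm. The measurement names, values, and threshold are assumptions for the example.

```python
# Sketch of unsupervised anomaly scoring across several machine measurements
# (e.g., motor torque, vibration, temperature). Baseline "healthy" cycles are
# summarized by their mean and covariance; new cycles are scored by Mahalanobis
# distance. Generic illustration only; not the Sysmac implementation.
import numpy as np

rng = np.random.default_rng(0)
# 500 healthy cycles of [torque, vibration, temperature]; synthetic placeholder data
baseline = rng.normal([1.0, 0.20, 40.0], [0.05, 0.02, 1.0], size=(500, 3))

mu = baseline.mean(axis=0)
cov_inv = np.linalg.inv(np.cov(baseline, rowvar=False))

def anomaly_score(sample):
    """Distance of one measurement vector from the healthy baseline."""
    d = sample - mu
    return float(np.sqrt(d @ cov_inv @ d))

THRESHOLD = 4.0                                   # would be tuned on real validation data
healthy = np.array([1.02, 0.21, 40.5])
degraded = np.array([1.02, 0.35, 43.0])           # higher vibration and temperature together

print(f"healthy score:  {anomaly_score(healthy):.1f}")
print(f"degraded score: {anomaly_score(degraded):.1f}  alarm: {anomaly_score(degraded) > THRESHOLD}")
```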
Such analysis also has the effect of optimizing individual production processes. Chen likens this to two people owning identical cars from the same manufacturer and the same lot. “You can have the same vehicles and I’ll guarantee you won’t have the same maintenance schedules,” he explains. “Car maintenance is now based on analysis of real usage, not just time duration, and the same is happening with AI-enabled manufacturing to extend the life of machinery and equipment based on true usage data.”
Some controllers also address cybersecurity by operating with their own CPU and function blocks, requiring no internet connectivity or cloud computing. Data collection and analysis are performed within the same hardware as the controls program, improving data-processing speed and accuracy.
Collect, Analyze, and Connect
Chuck Lewin, CEO of Performance Motion Devices, outlines AI in motion control as requiring three elements: data collection, data analysis, and connectivity.
Data collection turns out to be a natural outcome of the industry's migration to software-based motion controllers, he says. Motor controllers, operating at servo rates of 10 kHz or more, continuously measure and adjust the motor drive. “It is, therefore, a simple matter to record (into memory) useful data such as servo error, drive output, average energy and much more. In fact, at such high servo rates, the challenge is not having enough data to collect; the challenge is processing the data so that it can be stored in a reasonable amount of memory space.”
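One common way to make servo-rate data fit in memory is to keep only running summaries. The sketch below reduces each second’s worth of 10 kHz samples to a handful of statistics; the signal is synthetic, and the specific statistics chosen are assumptions for illustration.

```python
# Sketch of compressing servo-rate telemetry: instead of storing every 10 kHz
# sample, keep a small per-second summary (RMS, peak, mean) of the servo error.
# The signal here is synthetic; a real controller would log its own loop variables.
import numpy as np

SERVO_HZ = 10_000
rng = np.random.default_rng(1)
summaries = []

for sec in range(60):                                   # one minute of operation
    # one second of raw samples, held only transiently; a fault appears at t = 42 s
    err = rng.normal(0.0, 1e-4, SERVO_HZ) + (1e-3 if sec == 42 else 0.0)
    summaries.append({
        "t": sec,
        "rms": float(np.sqrt(np.mean(err ** 2))),
        "peak": float(np.max(np.abs(err))),
        "mean": float(np.mean(err)),
    })

# 60 small records now stand in for 600,000 raw samples
worst = max(summaries, key=lambda rec: rec["rms"])
print(f"worst second: t={worst['t']} s, rms={worst['rms']:.2e}")
```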
Next comes data analysis. The analysis here should be thought of not as an absolute, one-time result but as a tracking of changes in machine behavior. “Predicting that a rotary bearing may fail in the future requires knowing the baseline behavior of that bearing, and then comparing it from time to time to the observed behavior. There are many mathematical techniques that can be applied to the analysis task. A brief summary includes frequency-based math such as FFTs (fast Fourier transforms), various types of observers, and even simple averaging and long-term trending.”
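A bare-bones version of the frequency-based comparison Lewin mentions might look like the following: take the spectrum of a vibration or servo-error trace, subtract a stored baseline spectrum, and report the bands that have grown. The signals, sample rate, and threshold are synthetic assumptions for illustration.

```python
# Sketch of baseline-versus-current spectral comparison for condition monitoring.
# Both signals are synthetic; the "fault" adds a new tone at 780 Hz.
import numpy as np

FS = 10_000                                   # sample rate [Hz]
t = np.arange(FS) / FS                        # one second of samples

def spectrum(x):
    return np.abs(np.fft.rfft(x)) / len(x)    # normalized magnitude spectrum

baseline_sig = np.sin(2 * np.pi * 120 * t)                       # healthy signature
current_sig = baseline_sig + 0.3 * np.sin(2 * np.pi * 780 * t)   # new fault tone

delta = spectrum(current_sig) - spectrum(baseline_sig)
freqs = np.fft.rfftfreq(FS, d=1.0 / FS)

suspect = freqs[delta > 0.05]                 # frequency bins that grew vs. the baseline
print("bands above baseline:", suspect)       # expected: [780.]
```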
Connectivity is the last element required for AI-based motion control and, in the form of the relatively new IIoT (Industrial Internet of Things), is really the fuel igniting the fire, Lewin continues. Connectivity means the results of data collection and analysis can be reported to plant supervisors and machine manufacturers. Just as important, it means fleets of operating machines in the field can be compared. “Call this metadata analysis: analysis not just of one machine against its own past behavior, but of multiple machines, thereby helping to identify patterns with component vendors, production processes, and test procedures.”
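A fleet-level comparison of that kind can start very simply: gather the same summary metric from every connected machine and flag the ones that stand apart from the rest. The machine names and numbers below are made up for illustration.

```python
# Sketch of fleet-level "metadata analysis": compare one health metric
# (here, servo-error RMS) across connected machines and flag outliers.
# Machine names and values are illustrative placeholders.
import statistics

fleet_rms = {
    "press_01": 1.1e-4, "press_02": 1.2e-4, "press_03": 1.0e-4,
    "press_04": 3.9e-4, "press_05": 1.1e-4,
}

median = statistics.median(fleet_rms.values())
outliers = {name: val for name, val in fleet_rms.items() if val > 2.5 * median}
print(f"fleet median: {median:.1e}, outliers: {outliers}")
```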
With Better Content, Better Knowledge
With these three elements (data collection, data analysis, and connectivity) starting to appear even in basic products such as single-axis motor controllers, a door is opening that will eventually shift the focus of motion-control vendors from transistors to content. Where in the past the focus was on amplifier and algorithmic efficiency, those capabilities are now assumed, and the differentiating factor among controls vendors will be the quality of their AI-based content: the software and specialized AI hardware that interprets the data the motor controller collects as it goes about its job of controlling the motor.