Converting Big Data into Actionable Intelligence

POSTED 08/18/2016  | By: Kristin Lewotsky, Contributing Editor

Editor’s note: For more information about solutions for putting big data to work in manufacturing, watch our free webinar, How to Turn Big Data into Actionable Information.

The industrial Internet of Things (IIoT), Industrie 4.0, cyber-physical systems, big data. Unless you’ve been living under a rock, you’ve heard about the trend toward intelligent factories and networked operations. It’s making news for good reason. Properly executed, it can decrease downtime, enhance productivity, reduce operating costs, increase product consistency, and support predictive maintenance. It doesn’t just boost overall equipment effectiveness (OEE), it enhances operations company-wide, whether that’s across the factory or across the globe.

Although it’s a natural fit for asset owners, uptake is currently highest among integrators and OEMs. “If you go to the automation shows like the Hanover Fair in Germany, you will see a lot of discussion around the IIoT and Industrie 4.0, but when you translate that into the end-user shows and the machinery shows like Pack Expo, there are not a lot of mentions of the technologies,” says Alex West, principal analyst for smart manufacturing and industrial communications at IHS Markit (London, UK). “It’s very much manufacturer led at the moment.”

Expect that to change in the near future. The level of insight and intelligence the technology delivers brings a significant competitive advantage to early adopters. Especially in low-margin industries like consumer packaged goods, being able to produce even one more diaper or bottle of laundry detergent per minute can make a big difference in profit margin. “Much of the [IIoT press coverage] is about the process industry because those examples are the easy ones to shout about, but a lot is going on in discrete manufacturing as well,” says Andrew Hughes, principal analyst at LNS Research (Cambridge, Massachusetts). “You’re going to see a lot more activity, especially in connectivity, from all of the big suppliers. It’s going to expand very, very rapidly this year and next.”

Plenty of articles have talked about the potential. The question is how to go from concept to reality. Getting the data from your assets is the easy part. The challenging part is converting it to actionable insights, and doing it quickly enough to make it useful.

Start small
Yes, it’s called big data for a reason, but if you indiscriminately slap sensors and meters on every piece of equipment you own, you’ll soon find yourself drowning in data. Start by identifying a specific problem that you need to tackle. Maybe you’ve been having issues with excess downtime, maybe one shift consistently turns out better product, maybe the company is focused on cutting operating costs. Set specific goals: reduce downtime by 20%, position operations to move to predictive maintenance, cut energy consumption or costs by 10%. Don’t attempt to do it all over your plant or at every facility at once. Keep your pilot program small: one machine, one production line.

Once you’ve set your scope, then install the hardware to capture the data you need. Don’t assume that you have to have all new equipment to apply this approach. You can derive plenty of benefit with brownfield installations, too. “We are in no way saying that [your old machines] need to be ripped out or replaced because we’re trying to do data acquisition and analysis,” says Andy Henderson, industry analyst for heavy industry and discrete manufacturing at GE Digital (San Ramon, California). “There are ways of knowing what’s going on with these systems.” It could be as simple as putting a current transducer on the wire going to the green light of the stack light. If the signal goes high, the machine is running.

Granted, the information available from existing equipment is much less detailed than for greenfield installations. A new milling machine, for example, can provide diagnostics on the load, which program is running, any alarm codes displayed on the machine, even the specific block of code that is running. Getting information from an older machine can be a challenge. “You’re probably not going to get that level of information but you will be able to see whether the machine is running or not,” Henderson says. “Through a combination of signals, you might be able to deduce with more granularity what’s happening in the machine – is the door open, is there a load on the spindle, etc.”
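
To make that concrete, here is a minimal Python sketch of how a few retrofit signals might be mapped to a coarse machine state. The signal names, thresholds, and logic are illustrative assumptions, not taken from any particular product.

```python
# Hypothetical mapping of retrofit signals to a coarse machine state.
RUN_LAMP_THRESHOLD_A = 0.05    # current above this implies the green stack light is lit
SPINDLE_LOAD_THRESHOLD = 0.10  # normalized load above this implies the spindle is cutting

def classify_machine_state(green_lamp_current_a, door_open, spindle_load):
    """Combine simple signals to deduce what the machine is doing."""
    running = green_lamp_current_a > RUN_LAMP_THRESHOLD_A
    if not running:
        return "door open, idle" if door_open else "stopped"
    if door_open:
        return "running with door open (check interlock)"
    return "cutting" if spindle_load > SPINDLE_LOAD_THRESHOLD else "running, no load"

# One sample from whatever data logger captures these signals
print(classify_machine_state(green_lamp_current_a=0.12, door_open=False, spindle_load=0.4))
```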

Although the rule of thumb is to define a project with a narrow focus, within the bounds of that focus, it’s essential to be as granular as possible. The value of big data lies in gathering enough information that it can be mined to provide answers to your questions – and to the questions you didn’t know you were going to ask. One of the benefits of big data analytics comes through finding unexpected correlations. With detailed data, when unusual results crop up, you can ask new questions and analyze the data in new ways to come up with the answers.
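
As a sketch of how that mining might look in practice, the snippet below uses the pandas library to compute pairwise correlations across logged signals; the file name and column names (such as reject_rate) are placeholders.

```python
# Illustrative correlation mining over a granular machine log (pandas assumed
# installed; the CSV layout and column names are hypothetical).
import pandas as pd

log = pd.read_csv("line3_machine_log.csv", parse_dates=["timestamp"])

# Pairwise correlation of all numeric signals; strong off-diagonal values flag
# relationships worth a closer look, e.g. reject rate versus ambient temperature.
corr = log.select_dtypes("number").corr()
print(corr["reject_rate"].sort_values(ascending=False))
```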

Once you’ve defined the scope of your problem and set up your equipment monitoring, there’s one more essential step: establish a baseline. After all, how do you know whether you’ve solved your problem if you don’t know exactly how bad your problem is? Being able to quantify performance will not only help you determine when you achieve return on investment, it provides a powerful tool to get buy-in from decision-makers throughout the organization. If you can demonstrate the effectiveness, it will be easier to find ways to apply the technology in other parts of the operation and to other problems.
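
A baseline can be as simple as a few summary figures computed from the data you are already logging. The sketch below assumes evenly spaced, timestamped 0/1 “running” flags exported to a CSV; both the file and the planned operating hours are illustrative.

```python
# Illustrative baseline calculation from logged run/stop flags.
import pandas as pd

samples = pd.read_csv("line3_run_state.csv", parse_dates=["timestamp"])

planned_hours = 24 * 5 * 4                        # e.g. four weeks of 24x5 operation
availability = samples["running"].mean()          # fraction of samples flagged as running
downtime_hours = (1 - availability) * planned_hours

print(f"Baseline availability: {availability:.1%}")
print(f"Unplanned downtime to beat: {downtime_hours:.0f} h over the period")
```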
 
Choose the software solutions
You can think of an intelligent factory system as a combination of a supervisory control and data acquisition (SCADA) system and a manufacturing execution system (MES), with a dollop of enterprise resource planning (ERP) tossed in. SCADA systems were developed in the 1970s as an on-site tool for monitoring the health and operations of industrial equipment. Today, they’ve grown far more sophisticated but the essential operations are the same: Gather data from the PLC and the remote terminal units (RTUs) that digitize sensor data, analyze the data, and serve up the results to operators, maintenance, and managers throughout the organization.

SCADA applications not only relay equipment status and alarms to the HMI and to mobile platforms, they can analyze performance and display it through a variety of visualization tools. The system can compare operations from line to line, shift to shift, facility to facility, and across an entire global organization.

SCADA systems are a necessary but not sufficient condition for an intelligent factory. While the SCADA software focuses on production, MESs handle tasks like production scheduling, maintenance services management, quality assurance, etc. SCADA systems provide the equipment input, but MESs are typically more integrated in nature. Next-generation SCADA systems can support decision making across multiple plants. In comparison, MESs generally run within a single plant. To encompass multiple plants, they require MES interface devices to exchange data over the internet.

A SCADA application might enable a US-based manager of an automotive supplier to view a dashboard with a map showing the status of each factory around the world. If a red dot indicating an alarm appears next to the location of a factory in Mexico, for example, the manager just needs to click on it to get more information. The software lets them explore a detailed 3-D rendering of the facility to view the source of the fault, whether it’s an open door or a faulty drive. Inside the factory, the operator and maintenance can use the same software to pinpoint the source of the fault and address it.

If the problem is a drive, maintenance staff can use the plant MES to check the spares inventory and find the exact location of the part in the warehouse. Meanwhile, the US-based manager can use that plant’s MES to check orders and reschedule processing to make up for any shortfall caused by the downtime.

This is just the start. MES software includes targeted applications such as those designed for intensive facilities-based analytics. A SCADA system manages fault tracking and alarms, but the analytics package features functionalities like fault detection and diagnostics. Based on historical readings and operating parameters, the software can monitor equipment and warn when components are displaying signs of wear. The goal is to identify problem components before they fail and cause unplanned downtime. The software can also calculate the cost of replacement and downtime so that maintenance can make an informed decision about how to spend their time and resources.
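
The sketch below illustrates the general idea behind such a warning (not any vendor’s actual algorithm): compare each new reading against the component’s own recent history and flag readings that drift well outside the norm.

```python
# Illustrative drift detector: warn when a reading strays far from its own
# recent baseline. Window sizes, limits, and the sample data are assumptions.
from collections import deque
from statistics import mean, stdev

class DriftMonitor:
    def __init__(self, window=500, sigma_limit=3.0, min_history=3):
        self.history = deque(maxlen=window)   # rolling baseline of in-range readings
        self.sigma_limit = sigma_limit
        self.min_history = min_history

    def update(self, reading):
        if len(self.history) >= self.min_history:
            mu, sd = mean(self.history), stdev(self.history)
            if sd > 0 and abs(reading - mu) > self.sigma_limit * sd:
                return f"WARN: {reading:.1f} deviates from baseline {mu:.1f} +/- {sd:.2f}"
        self.history.append(reading)          # in-range readings extend the baseline
        return None

monitor = DriftMonitor()
for temperature_c in (41.0, 41.3, 40.8, 41.1, 55.2):   # last value mimics a worn part
    alert = monitor.update(temperature_c)
    if alert:
        print(alert)
```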

Other applications are designed for energy management to reduce operating costs. Based on the output of power meters on motors and drives on a packaging line, for example, or the pumps and fans of an HVAC system, energy analytics software can uncover hidden waste. Are certain pieces of equipment drawing more current than other, identical versions? Are pumps or compressors running when that section of the plant floor should be shut down? Which processes draw the most energy and is there a way to spread them out so that the power company sees a lower level for peak usage? Energy analytics can answer all of these questions.
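
A sketch of that kind of analysis, again with hypothetical file and column names: flag assets that keep drawing power while their area is scheduled to be off, and find the interval that sets peak demand.

```python
# Illustrative energy analytics over power-meter logs (pandas assumed installed).
import pandas as pd

meters = pd.read_csv("power_meters.csv", parse_dates=["timestamp"])

# Consumption during scheduled-off hours points to hidden waste.
off_shift = meters[meters["scheduled_state"] == "off"]
waste = off_shift.groupby("asset")["kw"].mean().sort_values(ascending=False)
print("Average draw while scheduled off (kW):")
print(waste.head())

# Site-wide 15-minute peak demand, the figure many utilities bill on.
site = meters.groupby("timestamp")["kw"].sum().resample("15min").mean()
print("Peak 15-minute demand:", round(site.max(), 1), "kW at", site.idxmax())
```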



 

Use the right tools
Converting big data into actionable insights requires the right set of tools. SCADA systems have grown increasingly sophisticated in recent years. In many cases, they can support some of the functions discussed above but they are still designed primarily for production monitoring rather than analytics. It’s tempting to use a pre-existing SCADA system because it’s there, but it’s not the best path to benefit.

The typical SCADA model is to upload data from data loggers embedded in PLCs and RTUs to the SCADA database via a gateway PC. Although proprietary protocols exist, the systems are typically built around the Open Platform Communications (OPC) family of standards and specifications. OPC supports interoperability among components from different vendors. It simplifies the job of programmers and developers with blocks of pre-written code. The gateway PC approach still requires a certain degree of customization to work properly with the system, and the work needs to be repeated anytime operating systems and applications are upgraded. The bigger issue is that it may not effectively transfer the volumes and velocity of data required to truly reap the benefits of the IIoT.
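
For a sense of what that traditional polling pattern looks like, here is a minimal sketch using the open-source python-opcua client (one of several OPC UA libraries); the endpoint URL and node IDs are placeholders for whatever the PLCs and RTUs actually expose.

```python
# Illustrative gateway-style polling of an OPC UA server; endpoint and node
# IDs are placeholders. In practice the results would go to a historian or
# database rather than to print().
import time
from opcua import Client

client = Client("opc.tcp://gateway.plant.local:4840")   # placeholder endpoint
client.connect()
try:
    spindle_load = client.get_node("ns=2;s=Mill1.SpindleLoad")  # placeholder node IDs
    alarm_code = client.get_node("ns=2;s=Mill1.AlarmCode")
    for _ in range(10):
        sample = {
            "t": time.time(),
            "spindle_load": spindle_load.get_value(),
            "alarm_code": alarm_code.get_value(),
        }
        print(sample)
        time.sleep(1.0)
finally:
    client.disconnect()
```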

“Historically, because of a lack of a better solution, companies have been trying to leverage technologies already installed, primarily SCADA solutions, and tying that up into enterprise applications, like MES,” says Sloan Zupan, senior product manager at Mitsubishi Electric Automation (Vernon Hills, Illinois). “That wasn’t the intended purpose of those visualization tools. Industrial automation companies have developed specific products whose sole function is connecting the automated assets on the production floor with MES software applications.”

These data appliances can do a more effective job of handling the volumes of data and the numbers of devices involved. High-speed data loggers and MES interface modules perform edge processing, and push the output to a relational database where it can be accessed by a range of applications.
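
As a rough sketch of that edge-processing pattern, the snippet below collapses a minute of high-rate samples into one summary row and writes it to a relational database, with SQLite standing in for whatever database the interface module actually targets.

```python
# Illustrative edge aggregation: reduce high-rate samples to one-minute
# summaries and push them to a relational database (SQLite as a stand-in).
import sqlite3
from statistics import mean

def summarize(samples):
    """Collapse a minute of raw readings into one row of summary statistics."""
    values = [s["spindle_load"] for s in samples]
    return (samples[0]["t"], min(values), max(values), mean(values), len(values))

conn = sqlite3.connect("plant_history.db")
conn.execute("""CREATE TABLE IF NOT EXISTS spindle_load_1min
                (t REAL, min_load REAL, max_load REAL, avg_load REAL, n INTEGER)""")

# One minute of (simulated) samples reduced to a single row.
raw = [{"t": float(i), "spindle_load": 0.3 + 0.01 * (i % 5)} for i in range(60)]
conn.execute("INSERT INTO spindle_load_1min VALUES (?, ?, ?, ?, ?)", summarize(raw))
conn.commit()
conn.close()
```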

Data transfer is not the only issue. IIoT analytics require data historians that store performance and status data and make it available for future analysis. While some SCADA suites include data historians, they may not have the storage capacity or write speed to keep up with IIoT-scale data volumes.

Go to the cloud
By now, you may be recognizing the value, but you may also be thinking that the technique is both complex and expensive. You may not have a large IT department. You probably don’t have teams of developers. The good news is that the process is actually easier than you might think. Multiple vendors provide both hardware and software solutions designed to get manufacturers up and running quickly.

You can use these products to build a system of your own. The alternative is to go with a cloud-based solution. This is particularly beneficial in terms of supporting multi-user, web-based solutions. “You should consider streaming the data up into some sort of central cloud-based repository so that you can utilize all of that in-depth information,” says Oliver Gruner, director of cloud and IoT business development for Iconics (Foxborough, Massachusetts). “That’s where the cloud really works well. Then you can use the data for machine learning, you can do visualization, you can show historical trends. You can deploy all of that in the cloud rather than on premises. You don’t have to have a big IT department.”
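
A minimal sketch of streaming a reading to a central repository, using MQTT via the paho-mqtt library as one common transport; the broker address, topic, and payload fields are placeholders and not tied to any particular vendor’s cloud service.

```python
# Illustrative telemetry publish to a cloud broker (paho-mqtt 1.x constructor;
# version 2.x additionally takes a callback API version argument).
import json
import time
import paho.mqtt.client as mqtt

client = mqtt.Client()
client.connect("broker.example.com", 1883)   # placeholder broker address
client.loop_start()

payload = {
    "site": "mx-plant-01",
    "asset": "mill-1",
    "t": time.time(),
    "spindle_load": 0.42,
    "alarm_code": 0,
}
client.publish("plants/mx-plant-01/mill-1/telemetry", json.dumps(payload), qos=1)

client.loop_stop()
client.disconnect()
```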

Cloud-based analytic services do more than just place data and applications in a central repository. Coupled with open software initiatives, they simplify the development of new products and services. Developers can take advantage of application programming interfaces (APIs): pre-built blocks of functionality designed to perform specific tasks. A software company or automation house might publish an API for data transfer or for plotting energy usage over time, for example. An OEM or a software developer writing an application for automotive analytics doesn’t need to build the visualization subroutine from scratch. They only need to call it with a few lines of code.
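
To illustrate the “call it instead of writing it” idea, the sketch below posts a day of energy readings to a hypothetical charting endpoint; the URL, fields, and response format are invented purely for illustration.

```python
# Hypothetical call to a cloud visualization API (endpoint and response
# format are invented; the requests library is assumed to be installed).
import requests

readings = [{"t": "2016-08-18T00:00:00", "kwh": 112.4},
            {"t": "2016-08-18T01:00:00", "kwh": 98.7}]

response = requests.post(
    "https://analytics.example.com/api/v1/charts",   # hypothetical endpoint
    json={"title": "Line 3 energy usage", "series": readings},
)
print(response.json().get("chart_url"))
```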

This type of API speeds the development process. It also gives users more options while letting industrial automation houses focus on the hardware and software aspects where they excel.  “We want to keep the applications in the industrial app stores open,” says Jagannath Rao, vice president, business unit lead, Siemens Digital Factory (Norcross, Georgia). “We want others to build apps and put them in there because there is no way that we can cover all applications.”

Cloud-based services with à la carte selection and transparent pricing give users the ability to choose what they want with minimum commitment. “The customer has the flexibility of managing his cost,” Rao observes. “He might be just embarking on this journey. Maybe he just wants to visualize performance for the first few months. Maybe he doesn’t want any complex analytics.” Once the manufacturer is comfortable with the technology and has a track record of success, they can add additional applications and functionality.

Don’t go it alone
A bevy of hardware and software options exists to simplify the process. The sources interviewed for this article all agreed, however, that it’s best to look for an experienced integrator to help with the initial project. You don’t just want data, you want insights.

“You can buy off-the-shelf visualization packages that are easy to use, but you have to work with an integrator who has done that sort of work before,” says Gruner. “They will have the expertise to say this is the information that we can get for you, and will be able to visualize and create key performance indicators.”

“There are many different ways of getting data,” says Zupan. “But data isn’t really what the business systems need, and it isn’t what management needs in order to make better decisions. Providing customers with the best practices of how to aggregate and macroprocess data into useful information, and explaining how it can be used, really helps them in their decision-making process.”
 
It takes time
Creating a digital factory is a process, not a one-time activity. Gathering and even visualizing data is just the start. It takes three to six months to acquire enough data to truly understand your assets, according to Rao. “I always say the first year you are basically digitizing your assets, getting to know them, looking at data, storing data, building data models,” he says. “By the end of the year, you have a pretty firm grip on that asset. Now you’re able to run it reliably, get rid of unplanned downtime. That is the first step in the value chain.” At this point, you can move to more sophisticated tasks like machine learning, predictive models, etc. “Nothing will happen before 8 to 12 months because it takes that much time to even start making sense of the whole thing,” says Rao. “But once it does, there are many things you can measure.”

Big data in manufacturing is gradually gaining steam as an increasing number of organizations begin to see the benefits. This ranges from OEM machine builders who use it to offer their customers new value-added services to companies seeking a competitive advantage in their market sector. “Medium-sized businesses are beginning to see this as one way to differentiate themselves from their bigger rivals without adding new people,” says Rao. “It is fascinating to me the kind of things that drive businesses to adapt the technology. It’s different for each one.”

For more information about solutions for putting big data to work in manufacturing, watch our free webinar, How to Turn Big Data into Actionable Information.

Further Reading
M2M and Big Data Combine To Improve Machine Performance
Making the Intelligent Factory Today