Getting good quality data with the right machine control architecture
| By: Thomas Kunckhoff, Product Manager - Controllers
Data is incredibly powerful, and it’s the foundation of a future-ready automation strategy. As today’s Industrial Internet of Things (IIoT) technologies make it increasingly effortless to gather a wealth of data at every step of the manufacturing process, it’s crucial to ensure that this data delivers the greatest possible value. That means differentiating between normal process data, or “noise,” and data that represents a change in the fundamental process, known as a “signal.” The context around the data itself is critical to making that distinction.
In this article, we’ll look at how the right control platform can maximize the value of your data by bringing out the signal while suppressing the noise. You’ll learn what to look for when selecting a system that can collect good-quality data accurately and in sufficient quantity.
When you select a control platform, be data-focused
Is data power? Absolutely. However, without enacting change based on the data, that power goes to waste.
Value doesn’t lie in the data by itself; rather, it’s the ability to implement changes using the data that confers the value. When selecting a control platform, manufacturers should focus on one that keeps the entire context of the data embedded in the data itself. It’s also important to consider the speed at which the control platform can adjust.
A transparent machine control architecture becomes paramount here; without it, you would simply be chasing small symptoms rather than the underlying root cause. For example, data should be timestamped and aligned with video and ladder logic to allow multiple perspectives of the same event. A single perspective is often a subjective one, whereas multiple perspectives help minimize bias by surfacing the objective reality of the process.
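As a concrete illustration of aligning timestamped perspectives, the sketch below pairs hypothetical controller fault timestamps with the nearest frame in a video log. The event times, frame rate, and `nearest` helper are all illustrative assumptions, not any vendor’s API:

```python
from bisect import bisect_left

def nearest(timestamps, t):
    """Return the value in the sorted list closest to t."""
    i = bisect_left(timestamps, t)
    candidates = timestamps[max(i - 1, 0):i + 1]
    return min(candidates, key=lambda c: abs(c - t))

# Hypothetical logs: controller fault timestamps (seconds) and
# video frame timestamps captured at roughly 30 frames per second.
fault_events = [12.400, 97.015]
video_frames = [round(f / 30.0, 3) for f in range(0, 3600)]

# Align each fault with the closest video frame, so both perspectives
# of the same event can be reviewed side by side.
aligned = {t: nearest(video_frames, t) for t in fault_events}
```

The same nearest-timestamp idea extends to ladder-logic scan records or any other timestamped source, as long as clocks are synchronized across devices.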
Many manufacturing facilities understand the importance of collecting and analyzing data to make critical operational decisions. In fact, the manufacturers who are most successful in working with data are those who understand what their customers want and how to meet their needs at a world-class level. The goal when setting up data collection methodologies is not to diminish existing experience or passion but to build on it. When talent and experience are ignored, a solution that attempts to build credibility on data alone quickly results in massive amounts of cumbersome data.
The power of starting small — and gathering data that is meaningful
We don’t want a mass of data that masks true signals and dilutes its own relevance. Rather, it’s best to start small: trend a process that is well understood, establish a baseline, and use data sources that represent the key performance indicators of that process. We do this to reinforce institutional knowledge and prove out a methodology for gathering data. Although there are seemingly endless ways to pull data out of processes, a dozen or so well-chosen inputs can be enough to let organizations see in real time how successful they are.
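The baseline idea above can be sketched as a simple control-limit check: characterize the known process first, then flag only readings that fall outside its normal variation. The cycle-time numbers and the three-sigma threshold here are made-up illustrations, not a prescription for any particular process:

```python
import statistics

def find_signals(baseline, new_readings, k=3.0):
    """Flag readings more than k standard deviations from the
    baseline mean: likely signals rather than ordinary noise."""
    mean = statistics.fmean(baseline)
    sigma = statistics.stdev(baseline)
    return [x for x in new_readings if abs(x - mean) > k * sigma]

# Hypothetical cycle times (seconds) from a well-understood process.
baseline = [4.8, 5.1, 5.0, 4.9, 5.2, 5.0, 4.9, 5.1]
new_readings = [5.0, 5.1, 6.4, 4.9]

print(find_signals(baseline, new_readings))  # → [6.4]
```

With a trusted baseline in place, only the flagged readings need human attention, which is exactly the noise suppression described above.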
Starting small also helps businesses secure a few incremental wins and justify the capital spent before scaling up and gathering data from every cell and every process in the plant. Data collection should be planned with experts who can guide where the data is collected from, making it possible to build trends across the plant. That way, once an entire system is up and running, artificial intelligence will have a strong head start on learning and can escalate only a few data points instead of thousands.
However, extracting key metrics can be difficult in certain operation or production settings, whether due to the complexity of the system or the fallout from a failed data-harvesting project. Many industrial automation manufacturers have service teams that can help: application engineers, technical sales staff, and service technicians all have years of experience that can help get good, relevant data quickly.
How all-in-one control platforms make safety more effective than ever
Safety is a critical part of the manufacturing environment. Safety controllers communicate over safety protocols, receiving data from safety inputs while broadcasting data to safety outputs. This has been the solution for decades, but what makes many automated solutions cutting-edge is that they do this at the very heart of the non-safety components. If safety isn’t integrated into the design of the system through a protocol such as CIP Safety or Fail Safe over EtherCAT (FSoE), then duplicate systems that double the number of connections on critical processes can quickly become the norm. This is where all-in-one platforms can really make a difference.
Easy-to-program function blocks in an all-in-one software platform take a lot of the risk off the table, as these function blocks are created by experts and vetted by their peers. In addition, many of these all-in-one platforms allow programmers to simulate their safety programs in both two-dimensional and three-dimensional environments, so the entire program can be rigorously pressure-tested without endangering anything or anyone in the physical world. This can decrease development time, as the program can be vetted in tandem with the construction of the physical system.
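To make the simulation idea concrete, here is a toy sketch of pressure-testing interlock logic against a scenario table before it ever touches hardware. The `estop_interlock` function and its inputs are purely illustrative assumptions, far simpler than a certified safety function block, but the vetting pattern is the same:

```python
def estop_interlock(estop_ok, guard_closed, reset_pressed, latched):
    """Simplified interlock: the output may energize only when all
    safety conditions are healthy AND, after any trip, a deliberate
    manual reset has been given (no automatic restart)."""
    if not (estop_ok and guard_closed):
        return False  # any unsafe condition trips the output immediately
    if latched:
        return True   # already running and still safe
    return reset_pressed  # safe again, but require a manual reset

# Exercise the logic against a table of scenarios in simulation,
# the same idea as vetting a safety program before deployment.
cases = [
    # (estop_ok, guard_closed, reset_pressed, latched) -> expected
    ((True,  True,  False, True),  True),   # running, all healthy
    ((False, True,  False, True),  False),  # e-stop pressed: trip
    ((True,  False, False, True),  False),  # guard open: trip
    ((True,  True,  False, False), False),  # healthy again, no reset yet
    ((True,  True,  True,  False), True),   # manual reset re-enables
]
for args, expected in cases:
    assert estop_interlock(*args) == expected
```

Every scenario row documents an expected behavior, so the table doubles as a living specification that can be re-run whenever the logic changes.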
Lastly, many of these industrial automation firms have service teams that can walk through customers’ plants verifying programming and safety compliance. This gives customers not only a vetted process and peace of mind that their operators are safe but also helps them earn ISO certifications at the same time. This is becoming ever more important in today’s facilities, because when safety is threatened by one process, it is endangered for all.
Looking toward the future of industrial control technology
This is an exciting time to be discussing industrial control, and Moore’s Law has yielded some extremely fast and capable controllers. Globally open networks have taken market share here in the Americas for motion control, while software simulation allows users to troubleshoot both in ladder logic using soft HMIs and in rendered three-dimensional environments. Secure protocols and isolated networks have been bridging the gap between the data in the IT world and the control in the OT (operational technology) world.
Looking to the future, we can expect these technologies to yield even greater process improvements. Just as riding in a car today is dramatically more comfortable, safe, and efficient than riding in a car from the 1960s, the platform trends ahead will take much of today’s complexity and move it out of the limelight. We’ll see artificial intelligence distill mounds of data into actionable tasks. We’ll take advantage of greater compatibility between controllers, controller software, and cutting-edge components for vision, motion, and traceability. In short, we can expect more simplicity, which will ultimately bring more solutions to more facilities.
Machine performance and data solution integration should always be part of the conversation, but ease of use, reduced development time, future enhancements, and training advancements can be found right alongside them. What we should be looking for in a control system is the capability manufacturers need to solve their machine control problems today without jeopardizing their ability to create new solutions tomorrow.