Why Industrial Edge Computing Might Be Your Best Path to Digitalization
POSTED 10/23/2020 | By: Kristin Lewotsky, Contributing Editor
With the increasing focus on smart factories and the Industrial Internet of Things (IIoT), having a digitalization strategy is becoming just as important as the manufacturing process itself. Use cases like predictive maintenance and energy analytics offer a number of ways to increase productivity and decrease operating expenditures, but organizations don’t necessarily have the skills and computing resources in-house to execute. The alternative is the cloud model, but many companies are uneasy with the idea of interfacing equipment running real-time processes to outside networks. Edge computing provides a middle-ground alternative.
In edge computing, analysis and storage take place on-site, near the asset, without necessarily sending the data to the cloud. By performing operations as close as possible to the source of the data, edge computing minimizes latency and bandwidth use, as well as security vulnerabilities. Edge computing devices are powerful enough to run surprisingly sophisticated analytic applications such as artificial intelligence (AI) and machine learning (ML), not to mention time-sensitive use cases such as augmented reality, virtual reality, and video processing. Organizations can also take advantage of hybrid architectures that combine cloud and edge computing to deliver the benefits of both.
The approach can dramatically impact operations by reducing unscheduled downtime, improving operational equipment effectiveness (OEE), reducing operating expenses, and enabling consistent operations across the enterprise. “I think there has been some very definitive proof of the value proposition at the Edge,” says Chantal Polsonetti, VP, advisory services at ARC Advisory Group (Dedham, Massachusetts) specializing in IIoT edge hardware and software. “And I would say that it has particularly been true on the discrete side of the industry.”
Edge Computing 101
Edge computing gets its name from the location of the edge platform at the interface between the enterprise network and the Internet (the network “edge”). That doesn’t necessarily mean a single connection for the enterprise. In the era of the IIoT, the industrial network edge can be in any of the physical devices, assets, machines, processes, and applications that intersect with the Internet and/or intranets, says Polsonetti.
The initial approach to cloud computing was to aggregate data with data loggers or data historians and send it directly to the cloud for processing. The data from a collection of sensor and device nodes (e.g., encoders, temperature sensors, drives, motion controllers) was aggregated by a gateway that converted the raw data for use with modern protocols like MQTT or OPC UA. The first step toward edge processing was for these gateways to begin to perform some basic operations such as filtering and normalizing the data prior to sending it to the cloud. (Depending on the architecture, this approach is known as fog computing.)
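The gateway-level preprocessing described above can be sketched in a few lines. This is a toy illustration, not any vendor's implementation: the function names, thresholds, and sample values are invented, and a real gateway would publish the result upstream via an MQTT or OPC UA client rather than printing it.

```python
def deadband_filter(readings, threshold):
    """Keep only readings that differ from the last kept value by more
    than `threshold`, discarding redundant samples at the gateway."""
    kept = []
    last = None
    for value in readings:
        if last is None or abs(value - last) > threshold:
            kept.append(value)
            last = value
    return kept

def normalize(readings, lo, hi):
    """Scale raw sensor counts into the 0..1 range expected downstream."""
    span = hi - lo
    return [(v - lo) / span for v in readings]

# Hypothetical raw temperature samples from one sensor node; most are
# redundant and never need to leave the site.
raw = [20.0, 20.1, 20.05, 25.0, 25.1, 30.0]
filtered = deadband_filter(raw, threshold=1.0)   # [20.0, 25.0, 30.0]
payload = normalize(filtered, lo=0.0, hi=100.0)  # [0.2, 0.25, 0.3]
# At this point a real gateway would publish `payload` to the broker
# instead of forwarding all six raw samples.
```

Even this crude deadband cuts the sample count in half, which is the bandwidth saving the fog model is after.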
This intelligent data pump, as Sahil Yadav, Edge Product Manager at GE Digital calls it, minimizes the volume of data sent to the cloud, lowering bandwidth demand. It also reduces the processing burden at the cloud level. “One of the main reasons why edge computing is not being adopted as much is because it can drive up network costs,” he says. “But it actually reduces costs overall. [With preprocessing], you are not just dumping all the data into your central repository. You increase your ability to send data that drives value and is meaningful for you.”
The evolution from edge processing with gateways to edge computing with dedicated edge devices represented a major advance. Edge devices don’t just preprocess data, they can run many of the applications and analytics previously restricted to the cloud. The processing model basically shifts from centralized (cloud computing) to distributed (edge computing). The ideal deployment is a hybrid version that performs time-sensitive processing locally but also ships data and edge results up to the cloud for further analysis and for purposes of standardization across the enterprise.
The standard edge architecture consists of sensor and device nodes connected to one or more edge devices capable of being virtualized and running containerized software. Software containers are self-contained packages that enclose not only the application code but also any associated libraries or even a specific OS needed to run the software. Containers ensure that the enclosed software runs stably on any platform, making them well suited to bringing edge computing functionality to a variety of platforms.
In a sense, the containerized software and edge devices operate analogously to apps and smartphones. “You have a combination of applications,” says Ramey Miller, edge product manager for Siemens Industry factory automation. “You would have an app that is watching performance and another one for energy use, for example. You might have another one gathering information for a digital twin.” Organizations with access to developers can create their own customized apps to deploy in the same fashion.
Taking it a step further, apps can begin working with an AI system also installed on the edge device. “Now you can tie a performance insight from an app into an AI system so that you essentially train the AI,” says Miller. “Over time, it learns, ‘Hey, I’m getting ready to run out of alignment, I need to load share so that I can get this motor back within its operating tolerances and not have to stop the system because of failure coming up.’” With that, the system begins to move into an automatic predictive maintenance regime.
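The kind of on-device drift detection Miller describes can be approximated with nothing more exotic than a rolling average. The sketch below is a minimal stand-in for a trained model, with invented class names, window sizes, and tolerance values; a production system would use a learned baseline rather than a fixed threshold.

```python
from collections import deque

class DriftMonitor:
    """Flag a motor whose readings are drifting toward the edge of its
    operating tolerance, before an outright failure stops the line."""

    def __init__(self, window=5, limit=0.8):
        self.readings = deque(maxlen=window)  # recent vibration samples
        self.limit = limit                    # fraction of tolerance band

    def update(self, value, tolerance):
        """Record a reading; return True once the rolling average has
        crossed `limit` of the allowed tolerance band."""
        self.readings.append(value)
        avg = sum(self.readings) / len(self.readings)
        return abs(avg) > self.limit * tolerance

monitor = DriftMonitor(window=3, limit=0.8)
alerts = [monitor.update(v, tolerance=10.0)
          for v in [1.0, 2.0, 3.0, 9.0, 9.5, 9.8]]
# Early readings stay quiet; the rolling average crosses 8.0 of the
# 10.0 tolerance only on the final sample, which would trigger a
# load-sharing or maintenance action.
```

The point is that the decision happens on the edge device itself, so the compensating action does not wait on a cloud round trip.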
Edge devices are available as purpose-built products, but an industrial PC, with the right software installed, could do the job just as effectively. It’s helpful to think of the hardware as just an abstraction layer, says Nathan Slider, systems consultant and solution architect at Aveva. “As long as the hardware has the processing power needed by the software and enough memory available, then it can perform its functional role,” Slider says. “So that edge device could be something as simple as a PLC, provided it has enough memory and processing available, or an HMI, or even a gateway.”
Edge computing architectures typically have a supervisory layer that sits above the edge devices. In some cases, this edge computing platform manages the edge devices. In other cases, it simply stores the applications and pushes updates down to the edge devices when necessary.
Combined with containerized software, this architecture provides an opportunity for operational improvement and consistency. “If we can load these Docker containers onto this hardware, now I can use that to standardize the operating system and applications across the entire corporate enterprise,” says Slider. “Being able to push these new operating system patches out or manage the applications remotely with this technology allows us to manage these edge devices a lot more efficiently.”
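The fleet-management pattern Slider describes boils down to comparing what each device is running against a desired fleet-wide state. Here is a toy version of that reconciliation logic; the device names, app names, and version strings are all invented for illustration, and real platforms layer authentication, rollout staging, and rollback on top of this idea.

```python
# Installed app versions reported by each (hypothetical) edge device.
FLEET = {
    "press-line-1": {"energy-app": "1.2", "oee-app": "2.0"},
    "press-line-2": {"energy-app": "1.1", "oee-app": "2.0"},
    "packaging-3":  {"energy-app": "1.2", "oee-app": "1.9"},
}

def pending_updates(fleet, desired):
    """Return {device: [apps needing an update]} for every device whose
    installed version differs from the desired fleet-wide version."""
    out = {}
    for device, apps in fleet.items():
        stale = [app for app, ver in desired.items() if apps.get(app) != ver]
        if stale:
            out[device] = stale
    return out

# Standardize the whole fleet on energy-app 1.2 and oee-app 2.0.
plan = pending_updates(FLEET, {"energy-app": "1.2", "oee-app": "2.0"})
# plan → {'press-line-2': ['energy-app'], 'packaging-3': ['oee-app']}
```

The supervisory platform then pushes only the stale containers down, which is what makes managing hundreds of edge devices tractable.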
Depending on the architecture, either the edge device itself or the edge computing platform interfaces with the cloud. It’s worth noting that edge computing systems can be purely local, but that fails to take full advantage of the opportunity. It’s also important to remember that cloud doesn’t necessarily mean a public service provider. It is simply a virtualized resource that can be located in the corporate data center or even on a different floor of the same building.
The Benefits of Edge Computing
Low latency: Because edge computing doesn’t require the round-trip travel time of the cloud model, it opens the way for a new kind of real-time smart manufacturing in which machines improve their performance autonomously. In this model, the edge device runs analytics on node data, derives results from the data, then uses those results to send control signals back to the machine. “This is especially useful in some factory floors when you need to make decisions, or Edge needs to make decisions in real-time and there is no time to send data back and forth to the Cloud,” says Yadav. “I have only a limited number of customers who are interested in this use case today, of course, but most customers see this as a bigger use case eventually.”
Even in the immediate term, the ultrafast response brings important benefits for applications like predictive maintenance or even product type and quality analysis in high-speed systems. If data indicates a sudden change in condition of a crucial piece of equipment, action needs to take place as quickly as possible. Similarly, if a machine suddenly begins producing scrap because of an improperly tensioned belt, money will be lost until the line is stopped and the issue corrected. Edge computing minimizes latency by performing calculations as close as possible to the actual assets.
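The belt-tension scenario above can be sketched as a local control loop: the edge device acts on every reading immediately, and only an aggregate summary would ever travel to the cloud. The threshold, readings, and "STOP" action are illustrative assumptions, not a real controller's interface.

```python
def control_step(reading, scrap_threshold):
    """Decide locally, with no cloud round trip: signal a stop the
    moment a reading indicates the line is producing scrap."""
    if reading > scrap_threshold:
        return "STOP"   # real-time control signal back to the machine
    return "RUN"

def cloud_summary(readings):
    """What actually leaves the site: an aggregate, not raw samples."""
    return {"min": min(readings), "max": max(readings),
            "mean": sum(readings) / len(readings)}

# Hypothetical belt-tension readings; the last one means scrap.
tension = [4.8, 5.0, 5.1, 7.9]
actions = [control_step(r, scrap_threshold=6.0) for r in tension]
# actions → ['RUN', 'RUN', 'RUN', 'STOP']: the bad reading is caught
# on-device, while cloud_summary(tension) is all the cloud ever sees.
```

Splitting the work this way delivers both benefits at once: millisecond-scale reaction locally, and a compact record upstream for enterprise-wide analysis.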
Reduced bandwidth: The data sets required to support predictive maintenance or other applications of AI consume significant amounts of bandwidth, clogging the network or even requiring upgrades. The proximity of the edge device to the data nodes minimizes bandwidth requirements.
Security: Edge devices are designed with the highest levels of security to enable safe interface with both the OT network and the Internet. “There is an enormous number of security aspects that are kept in mind while designing these products, from essentially ensuring that there are no unwanted connections to the Edge device, to ensuring that the applications running on the Edge device do not malfunction or impact other devices on the network,” says Yadav.
Every installation is different, but there are a few paths to success:
Start small: Avoid the temptation to bring every device on your floor into your edge network. That just results in massive amounts of data that overwhelm the network and go unused. The system will waste time and money and fail to achieve ROI. Instead, identify a specific problem to solve – excess downtime at a pinch point, a belt that loosens and produces bad product. Add sensors and configure your edge computing solution to identify issues and potentially even respond.
Be strategic: Focus on applications most likely to benefit from the technology. “If you have data that needs to be analyzed and processed to make the right business decision in real-time, then edge computing is where you need to go,” says Craig Resnick, VP at ARC Advisory Group.
“If you’ve got a PLC that just runs a conveyor belt system, then that’s not a good use case for an edge device,” says Miller. “But if you’ve got something in your process that is mission-critical, then that’s where you want to put an edge device to collect data, monitor your inspection tests, your KPIs, and do it all right there at that focal point. Then as production needs change, you just change your apps a little bit to keep the process flowing as efficiently as possible.”
Choose a scalable solution: The pilot project will be a starting point, but it will be far from the last. Success will result in new applications, and there’s no telling what data will become important in the future. This holds equally for end users and OEMs. “A machine builder who wants to put out products that have, say, remote access or monitoring, or other IIoT-enabled incremental service revenues attached to them will certainly want scalability to be able to go from 0 to 1000 or even 100,000 nodes really quickly,” says Polsonetti. “In our minds, capabilities like zero-touch provisioning and hardware/software virtualization are key to enabling that scalability.”
Look for ease-of-use: Although it can be tempting to consider a homegrown solution, the initial development phase is just the start. The software stacks for edge platforms need constant updating, turning the end user into a software developer rather than a manufacturer. “We see a distinct emphasis on ease of use now, with turnkey operations, self-service, etc.,” says Polsonetti. “[Providers are] trying to get away from that heavy custom service or consulting component, and making it much easier for customers to implement themselves.”
Edge computing provides an effective approach for industrial users to leverage data and analytics to boost productivity, shrink unscheduled downtime, enhance product quality, and reduce cost of operations. Locating analytics near the data sources reduces latency, making the technology effective for monitoring high-speed systems and for machine learning applications. In the long term, it will enable new applications such as machines that can not only detect developing problems in the early stages but also autonomously take steps to compensate.