Volume 5 Issue 10 October 2023

A Paperitalo Publication
In this Issue

Welcome to Industree 4.0 for October, 2023, exclusively sponsored by SAP.

SAP

By Kai Aldinger, SAP, Mill Products and Mining Industry Solution Management

Leveraging Artificial Intelligence in ERP

The last few years have seen several use cases for artificial intelligence that impressively demonstrate its diverse potential applications in the paper industry. The focus was often on demonstrating the value of AI in supporting production processes, with the associated commercial business processes playing more of a secondary role. While encouraging results have been achieved, actual use in an operational environment often falls short of expectations.


What are the reasons for this? And what needs to change to transfer the successful “proof of concept” into everyday operations?

 

One important reason is related to the data.


Data is all over the place


Although many companies have made numerous efforts in the past to enable easy access to all relevant data, many have not yet reached that goal. Often, the data is located at different levels, with different access paths, and thus cannot be used directly for AI applications. For example, the sensor data of the plants is processed at the production level, whereas information about the batches, such as their quality or the suppliers of the primary products, is processed at the ERP level. This is challenging because, to infer the expected quality of a batch from the sensor data in predictive quality scenarios, data from both levels must be combined. While "static" data, e.g. provided as CSV files, may be sufficient for an initial proof of concept, the data for later productive use must be provided automatically, traceably, and in real time. This places significantly higher demands on the underlying solution architecture, which is also usually outside the scope of pure ML/AI.


How can this be solved?


There are various digital solutions that can be used as a foundation for the implementation of scenarios such as the one described above to improve product quality.


What most of these scenarios have in common is that ML/AI is used to identify patterns in the sensor data that can lead to poor quality. This enables the plant manager to take preventive measures to avoid a deterioration in quality and thus improve manufacturing quality. This scenario could be realized with the help of the following architecture:


  • Using SAP Data Warehouse Cloud, sensor data from the production plant is linked with the SAP ERP data on the batches produced (for example, a batch can represent a paper roll). For each batch, detailed sensor data as well as higher-level data from the ERP system on quality (in/out of specification), components (supplier), and upstream processes (production lines run through) are then available.


  • Based on this data, a quality prediction can be created using an inference pipeline in SAP Data Intelligence and written back to SAP Data Warehouse Cloud. The models used for the predictions are created beforehand from historical ERP data along with the associated sensor data. The model also identifies the factors (for example, speed, temperature, suppliers) that have the highest impact on quality (see the sketch after this list).


  • The predicted quality of the batch currently in production can then be displayed to the plant manager together with the factors (or variables) that influence it, along with a recommendation, for example, to increase or decrease the machine speed. The plant manager can then take preventive measures to avoid a quality deterioration.
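To make the pipeline concrete, here is a minimal sketch of the idea in Python. It is purely illustrative: the file names, column names, and model choice are hypothetical, and in the architecture described above the join would live in SAP Data Warehouse Cloud and the trained model in an SAP Data Intelligence inference pipeline.

```python
# Illustrative only: join per-batch ERP data with aggregated sensor data,
# train a model on historical batches, and surface the most influential
# factors. File names and columns are hypothetical placeholders.
import pandas as pd
from sklearn.ensemble import RandomForestClassifier

erp = pd.read_csv("erp_batches.csv")      # batch_id, supplier, line, in_spec
sensors = pd.read_csv("sensor_agg.csv")   # batch_id, speed_mean, temp_mean, ...

data = erp.merge(sensors, on="batch_id")
X = pd.get_dummies(data.drop(columns=["batch_id", "in_spec"]))
y = data["in_spec"]

# Trained on historical batches; in production the model would be served
# from an inference pipeline and its predictions written back to the DWC.
model = RandomForestClassifier(n_estimators=200, random_state=0)
model.fit(X, y)

# Factors with the highest impact on quality (speed, temperature, supplier, ...)
impact = pd.Series(model.feature_importances_, index=X.columns)
print(impact.sort_values(ascending=False).head(5))
```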


Operationalize AI


Another key challenge on the way to mainstream AI adoption is the management of AI itself. The growing availability of AI scenarios shows their increasing relevance for today's business success. However, as the amount of AI content in the enterprise grows, so do the requirements to operate and maintain it in a consistent, standardized, safe, and scalable manner across all business applications. On the other hand, given the diversity of the technology, it is important that the authoring of AI content remains open to different tools (such as JupyterLab) and technologies and is not subject to restrictions in this respect.


For example, many innovative solutions come from startups and are not necessarily based on the same runtimes. Nevertheless, it is necessary to integrate them with existing business processes via an open framework in order to quickly leverage the added value associated with their use. One example is the SAP partner Cogniac, which offers an AI-based solution for evaluating visual data, used, for example, to detect punching defects in the metals industry.


To support the management and extension of business applications with AI scenarios, SAP also offers an AI foundation with the following key components:


  • SAP AI Core provides an engine that lets you run AI workflows and model serving workloads.


  • SAP AI Launchpad manages several AI runtimes. It allows various user groups to access and manage their AI scenarios.


  • AI API provides a standard way of managing the AI scenario lifecycle on different runtimes, regardless of whether they are provided on SAP technology (such as SAP S/4HANA) or partner technology (such as Amazon Web Services). 
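For illustration only, here is a sketch of what consuming a served model over a REST interface can look like. The endpoint URL, token handling, and payload shape below are hypothetical placeholders, not the documented AI API; the point is simply that a standardized HTTP interface lets any application embed the prediction.

```python
# Hypothetical example of calling a deployed inference endpoint over REST.
# URL, token, and payload are placeholders, not the documented SAP AI API.
import requests

ENDPOINT = "https://example-inference-host/v1/predict"  # placeholder URL
TOKEN = "..."  # placeholder; obtained from your identity provider

payload = {"batch_id": "4711", "speed_mean": 812.5, "temp_mean": 93.2}
response = requests.post(
    ENDPOINT,
    json=payload,
    headers={"Authorization": f"Bearer {TOKEN}"},
    timeout=30,
)
response.raise_for_status()
print(response.json())  # e.g. a predicted quality and its top factors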


In general, the AI foundation is the central vehicle for customers, partners, and SAP’s internal teams to manage and operate the full lifecycle of AI content (versioning, deployment, and monitoring) across applications and to extend SAP’s offerings with AI capabilities. The consumption of these AI capabilities is unified by the SAP-governed AI API. This approach primarily pursues the following goals:


  • Allow seamless, easy embedding of AI capabilities into other applications


  • Leverage high-volume data from the applications to create robust machine learning models


  • Execute machine learning training on accelerated hardware


  • Serve machine learning inference with low latency and high throughput cost-effectively


  • Adhere to a compliant, explainable, and maintainable process


  • Manage all stages of the AI lifecycle using a comprehensive set of tools and services


  • Focus on the productization and operationalization of ML scenarios


There are several examples of thought-leading companies in the paper industry that are using ML/AI with high and increasing business benefits.


Steinbeis Papier


Steinbeis Papier combined data from 25,000 factory sensors with data from its commercial systems into one database and applied powerful machine learning algorithms to make monitoring seamless and entirely automated. This allows the company to run analyses without waiting hours for the results and to immediately recognize inconsistencies in product quality.


To be able to do this, they mapped the entire asset structure, production processes, business processes, and more in graph data models as digital copies. Having all this data, with its semantics, in one place allowed Steinbeis to use the information not just in production, but also for innovative solutions throughout the company in purchasing, materials management, and management accounting. Read more here:


Steinbeis Embraces Digital Circular Economy | SAP News Center
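As a small illustration of the graph-modeling idea, the sketch below represents assets, process areas, and business functions as a typed, directed graph. The node and edge names are hypothetical; Steinbeis' actual models are far richer.

```python
# Illustrative sketch: modeling asset structure and relationships as a
# typed, directed graph. Node and edge names are hypothetical.
import networkx as nx

g = nx.DiGraph()
g.add_node("PM1", kind="paper_machine")
g.add_node("Headbox-1", kind="equipment")
g.add_node("Dryer-Section-1", kind="equipment")
g.add_node("Purchasing", kind="business_process")

g.add_edge("PM1", "Headbox-1", rel="contains")
g.add_edge("PM1", "Dryer-Section-1", rel="contains")
g.add_edge("Purchasing", "PM1", rel="supplies")

# Traversals then answer questions such as "which equipment belongs to PM1?"
print(list(g.successors("PM1")))
```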


To find out more about how SAP can help you with your AI aspirations, visit: SAP.com/AI

or Chat with SAP


Building I4: Level 2, Historian, the digital landfill

By Pat Dixon, PE, PMP


Vice President of Automation, Pulmac Systems International (pulmac.com)


and


Mariana Sandin, Industry Principal, Core & Emerging Industries

Seeq Corporation



The 4th industrial era is overflowing with digital data from multiple sources. There is no lack of data in our industry. While we may think of the financial, medical, and government sectors as very data intensive, the manufacturing sector dwarfs all others in the production of digital data.


In industry, most of that data ends up in Data Historians. Data Historians have been in the market since the 1980s, and they bring benefits to the industry by:


  • Complementing the data architecture by storing operational information for long periods of time (months to years)
  • Strengthening data security by isolating access to the control system network, since historians most commonly run on the business network
  • Gathering information from several operational systems at the mill, providing a more uniform means of accessing data
  • Providing information to multiple teams and systems at the mill without overloading the production systems


Our historians are full of data, yet much of it is rarely seen. That effectively turns historians into digital landfills. Data can decompose if it isn’t used when produced. While past history can inform present day decisions, aging data will not be as pertinent as recent data.

 

At the same time, instrumentation is producing more data by adding diagnostic and configuration data alongside the signal. This piles the heap higher on the digital landfill, requiring more memory resources.


Most historians make the best use of memory resources by handling second or sub-second data with exception and compression mechanisms, or by separating long-term data from recently acquired data. These techniques aim to maintain the characteristics and accuracy (fidelity) of the source data over time.

 

Exception is, by definition, the exclusion of samples that do not meet the criteria to pass into historian storage. It is usually applied closest to the data source, at the control level. One important caveat: instrumentation has a finite accuracy and its own tuning parameters, and exception settings, to work properly, cannot be more precise than those limitations. Over time, as instrumentation ages and signals drift, this configuration needs to be revised and adjusted. A minimal sketch of the idea follows.
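Here is exception reporting expressed as a simple deadband filter, assuming timestamped samples; the function and threshold below are illustrative, not any particular historian's implementation.

```python
# Illustrative deadband filter: a sample passes to storage only when it
# differs from the last stored value by more than the deadband. The
# deadband should never be tighter than the instrument's accuracy.
def exception_filter(samples, deadband):
    """Yield (time, value) pairs that exceed the deadband."""
    last = None
    for t, v in samples:
        if last is None or abs(v - last) > deadband:
            last = v
            yield t, v

# Example: a slowly drifting signal sampled every second, 0.5-unit deadband.
raw = [(t, 100 + 0.1 * t) for t in range(10)]
print(list(exception_filter(raw, deadband=0.5)))  # keeps t=0 and t=6 only
```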


Compression is the way historians remove data that is deemed of little value because it is repetitious or does not meet a certain threshold for storage, for example data with very low variability that is captured every second for an hour. The swinging door algorithm is a classic data compression technique (a simplified sketch follows). Compression helps the performance of data extraction, whether for a visualization tool or for external systems that need to integrate with the historian. Averaging data into a separate instance of the historian database further reduces the number of samples kept in long-term storage.
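Below is a simplified, illustrative implementation of swinging-door compression; real historians add refinements such as maximum archive intervals, but the core idea is the collapsing corridor shown here.

```python
# Simplified swinging-door compression: a point is archived only when a
# straight line from the last archived point can no longer reproduce all
# intermediate samples within +/- deviation.
def swinging_door(points, deviation):
    """points: (time, value) pairs with strictly increasing times."""
    if len(points) < 3:
        return list(points)
    archived = [points[0]]
    slope_max, slope_min = float("inf"), float("-inf")
    prev = points[0]
    for t, v in points[1:]:
        t0, v0 = archived[-1]
        slope_max = min(slope_max, (v + deviation - v0) / (t - t0))  # upper door
        slope_min = max(slope_min, (v - deviation - v0) / (t - t0))  # lower door
        if slope_min > slope_max:
            # Corridor collapsed: archive the previous point and restart
            # the doors from it toward the current point.
            archived.append(prev)
            t0, v0 = prev
            slope_max = (v + deviation - v0) / (t - t0)
            slope_min = (v - deviation - v0) / (t - t0)
        prev = (t, v)
    archived.append(prev)  # always keep the most recent sample
    return archived

# A flat stretch compresses away; the step change at t=3 is preserved.
raw = [(0, 10.0), (1, 10.1), (2, 10.1), (3, 12.0), (4, 12.1)]
print(swinging_door(raw, deviation=0.2))
```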

 

Storage costs have fallen to the point that mills today need to worry far less about storing data locally, so prioritizing compression or limiting the number of tags to collect is no longer the roadblock to deploying a data historian that it was 10 or 20 years ago. Nevertheless, when vetting different historians, it is important to look for functionality that enables "backfilling" data streams and calculations that may be added at any point in the life of the historian, and that allows saving data into the future for forecasting analysis.

 

The average number of signals for a well-automated and surveilled integrated mill is, from the author's personal experience, 50,000 tags. This number of data streams is far too great for a single person or group of people to monitor constantly or make sense of efficiently, so organizing and adding context to the operational signals adds value beyond the raw signals. Tools like Asset Framework help organize the signals from the historian into groups of attributes that represent pieces of equipment, areas of the process, geographies, and so on. They are based on object-modeling principles, leveraging templates, categories, static and dynamic attributes on objects and signals, and relationships among them, all of which make information easier to find for anyone with access to the historian. A small sketch of the template idea follows.
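The sketch below illustrates the template idea in plain Python: a reusable equipment template whose attributes map friendly names to raw historian tags. The class and tag names are hypothetical, not Asset Framework's actual object model.

```python
# Illustrative only: a reusable equipment template mapping friendly
# attribute names to raw historian tags. Names are hypothetical.
from dataclasses import dataclass, field

@dataclass
class PumpAsset:
    name: str
    manufacturer: str                              # static attribute
    tags: dict = field(default_factory=dict)       # dynamic attributes -> tags

pump = PumpAsset(
    name="Stock-Pump-101",
    manufacturer="ACME",
    tags={"flow": "MILL1.PUMP101.FLOW", "speed": "MILL1.PUMP101.SPD"},
)

# A user can ask any pump for "flow" without knowing the raw tag name.
print(pump.tags["flow"])
```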

 

It is important to understand that the data in one mill is not the only data of interest. In the 4th industrial era, pulling data from multiple mills together at the enterprise level can reveal learnings that would not be apparent in isolation. This calls for an approach to data collection and accessibility that spans on-premises and off-premises systems, and the cloud is a way to solve this problem.

 

It is possible to transition your digital landfill into raw materials that help solve present day problems. A sustainable automation system in the 4th industrial era uses historians effectively to mine data for gold.

Discipline in Data Gathering and Storing

Both Kai and Pat, in the articles above, are bemoaning data problems. Kai's slant is primarily on formatting data, while Pat is talking about all the data we have stored that we can't identify. Two sides of the same coin.


How did we get here? Lack of standards can be blamed, and so can the loosening of standards.


On the second point, one need look no further than the original Microsoft operating system, MS-DOS. It restricted file names to eight characters plus a three-character extension, the old 8.3 format.


In the mid-1990s, with Windows 95, Microsoft loosened this constraint and allowed very long file names.


This was a move in the wrong direction. Shorter names required succinctness and forced individual organizations to make their own rules for file names. Consequently, in those days, file names were intelligent and had meaning within an organization.


Organizations need to do this today if they are to avoid Pat's data landfills going forward. One of the best organizers of apparently random data in the past was the famous German "Enigma" machine. It intentionally turned clear data into garbage, and then pulled it back out when needed.


I don't think we should worry much about data in the past. The cost of fretting over it is too high. We should expend our scarce resources developing standards and systems to properly classify and store data from this point forward.


Leave the data landfill for the archeologists of the future. Get on with organizing from this point forward.


The Industrial Internet of Things (IIoT): Transforming Industries with Connectivity

By IoT Business News

In today’s interconnected world, the convergence of technology and industry has given rise to the Industrial Internet of Things (IIoT), a powerful force that is revolutionizing the way we operate, optimize, and innovate within various sectors. The IIoT, a subset of the broader Internet of Things (IoT), is reshaping industrial processes, enhancing productivity, and fostering new possibilities across a wide range of industries.

Read the full article here

From Internet of Things to Internet of Everything: The Convergence of AI & 6G for Connected Intelligence

By Haziqa Sajid

Internet of Things (IoT) establishes a network for connecting physical objects, such as devices, machines, sensors, or any equipment with processing abilities that can connect to the Internet. It refers to a digitally connected universe built on smart devices like fitness trackers, home voice assistants, smart thermostats, etc.

Read the full article here

My Opinion: Use the IIoT Thoughtfully

By Ed Brown

The connected factory has a lot to offer, but getting it from the minds of the designers to a working system on the shop floor is not a simple process.

Read the full article here

How IIoT is driving the manufacturing transformation across industry segments?

By Vidushi Saxena

With the advent of new technology, everything is getting revolutionized. Keeping Industry 4.0 in mind, there's no denying that these rapid innovations have revolutionized the way companies manufacture, pack, and disseminate their products. Manufacturers are integrating smart technologies along with traditional manufacturing and industrial practices.

Read the full article here
Industree 4.0 is exclusively sponsored by SAP