Paper manufacturers are reducing input costs by using AI to predict quality in real time and identify optimal process settings.
The papermaking industry has high variability in its raw materials, especially in the production and use of wood pulp. This variability, combined with the highly complex data generated in the downstream processes of refining, forming, and pressing, makes process optimization very difficult for paper and pulp manufacturers.
For most paper products, there is also intense pressure to reduce manufacturing input costs.
The complexity of the papermaking process is the main obstacle to optimizing material and energy costs.
At each step of production, there are numerous settings that impact the final product. Typically, the exact influence that these settings have in different situations is sparsely documented.
Even the most experienced operators and engineers have limited time to devote to fully optimizing their process. Often, this means the process has considerable opportunity for improvement, but the manufacturer is unable to identify those improvements.
The first step to using AI to reduce paper manufacturing costs is to unify production data sources. Importantly, this must be done with a data architecture that is flexible and dynamic. If the only mode of unification is to collect data into a data lake, it will be very difficult to apply this data in various analytical use cases.
If this sounds like a monumental challenge, it doesn’t need to be! Oden helps manufacturers to unify and contextualize their manufacturing data.
When data is hard to access, manufacturers struggle both to create and to communicate high-level analyses of paper manufacturing data. For example, paper quality metrics like moisture content and paper strength are difficult to measure in real time, so many manufacturers rely instead on offline quality tests. These tests are often infrequent, and their results are communicated to operators in ways that are easily missed.
Using a tool to dynamically unify datasets from different sources (MES, ERP, historian, etc.) is the foundational structure for enabling meaningful AI insights. By combining process data with production data and quality test data, a unified data architecture enables analytics that capture the full set of production conditions.
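As a minimal sketch of what this unification looks like in practice, the snippet below aligns continuous process data (as it might come from a historian) with infrequent offline quality tests on a shared timeline. The column names, sources, and values are illustrative assumptions, not a real schema.

```python
# Sketch of unifying process data with offline quality tests on a shared
# timeline. Column names and values are illustrative, not a real schema.
import pandas as pd

# Continuous process data, e.g. from a historian (one row per 15 minutes)
process = pd.DataFrame({
    "timestamp": pd.date_range("2024-01-01 08:00", periods=6, freq="15min"),
    "refiner_load_kw": [410, 415, 402, 398, 420, 417],
    "steam_flow_kg_h": [5200, 5150, 5100, 5080, 5250, 5210],
})

# Infrequent offline quality tests, e.g. from a lab-system export
quality = pd.DataFrame({
    "timestamp": pd.to_datetime(["2024-01-01 08:20", "2024-01-01 09:05"]),
    "moisture_pct": [6.1, 5.8],
})

# Align each process reading with the most recent quality test before it,
# so every row carries the full set of production conditions
unified = pd.merge_asof(process.sort_values("timestamp"),
                        quality.sort_values("timestamp"),
                        on="timestamp", direction="backward")
print(unified)
```

The key design point is that the join is time-aware rather than exact: `merge_asof` attaches the latest available lab result to each process reading, which is how sparse quality data gets contextualized against dense process data.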
Establishing a foundational data architecture simplifies analytics. Often, analyses that would otherwise take weeks of data formatting and collection can be accelerated to occur within a few minutes or automatically.
One of the most powerful applications of this is to predict the results of offline quality tests in real time. For most paper manufacturers, offline quality tests are conducted periodically, and the results are critical for meeting the buyer's exact specifications.
Reducing the lag time of paper quality information also gives operations teams the confidence to change product recipes when more advanced AI models recommend ways to minimize input costs.
Quality prediction models allow manufacturers to “fill in the gaps” between offline quality tests by predicting the quality values at all times. The predictions can also be streamed in real time to the factory floor, providing early warning of quality failures. By predicting the results of offline quality tests, a reaction is possible within seconds, instead of after an interval of 90 minutes for a traditional offline quality test.
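A quality prediction model of this kind is often called a soft sensor. The sketch below, using synthetic data and an assumed linear relationship purely for illustration, fits a model on historical pairs of process settings and offline test results, then predicts moisture for a fresh reading taken between lab tests.

```python
# Minimal soft-sensor sketch: fit a linear model on historical
# (process settings -> offline test result) pairs, then predict quality
# for a process reading with no lab test. All data here is synthetic.
import numpy as np

rng = np.random.default_rng(0)

# Historical readings where an offline test exists: [steam_flow, press_load]
X_hist = rng.uniform([5000, 380], [5400, 430], size=(50, 2))

# Assumed "true" relationship, used only to generate demo lab results
true_w = np.array([-0.0031, 0.005])
y_hist = 20 + X_hist @ true_w + rng.normal(0, 0.01, size=50)

# Fit ordinary least squares with an intercept column
A = np.hstack([np.ones((len(X_hist), 1)), X_hist])
coef, *_ = np.linalg.lstsq(A, y_hist, rcond=None)

# Predict moisture for a fresh process reading taken between lab tests
x_new = np.array([1.0, 5200.0, 410.0])
predicted_moisture = x_new @ coef
print(f"predicted moisture: {predicted_moisture:.2f}%")
```

In production, a real model would be trained on the unified process, production, and quality data described above, and would likely use richer features and nonlinear methods; the point here is only the mechanic of filling in quality values between tests.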
Regardless of the source of paper quality metrics, the operator must have improved access to quality information in order to trust any insights generated by AI to adjust their process.
Predictive quality provides visibility into quality characteristics at any point in time, so that a recommendation can be made with confidence that it will result in good quality and stable production.
Many operators are accustomed to quality information arriving at regular intervals, perhaps a handful of times per shift. Making quality information available in real time fundamentally changes how operations teams engage with their processes.
Easy access to a unified data architecture enables AI models to identify complex patterns in manufacturing data. By using AI, especially with machine learning models designed for manufacturers, meaningful metric relationships and opportunities for improvement are identified far faster than a process engineer could do alone.
This is the light at the end of the tunnel for reducing paper manufacturing input costs. Two areas that are likely to yield early results are reducing energy consumption and optimizing material usage during production.
Process optimization performed by AI should be layered on top of predictive quality in order to generate optimal recommendations while keeping end product in spec. The leading tool to deliver this capability to paper manufacturers is Process AI.
Process AI focuses on minimizing the cost of production while maintaining the current line speed. It performs a process optimization, identifying process settings that save cost (and, where possible, increase speed) while maintaining good quality. Built-in Predictive Quality models provide visibility into quality throughout the production process, ensuring that recommendations correspond to good, stable production and are likely to hold up in the future.
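The layering described above can be sketched in a few lines: search candidate settings, keep only those a predictive-quality model says are in spec, and recommend the cheapest. The cost model, quality model, and spec limits below are hypothetical stand-ins, not Oden's actual implementation.

```python
# Sketch of layering optimization on predictive quality: search candidate
# settings, keep those predicted in spec, pick the cheapest.
# The models and spec limits below are illustrative assumptions.

def predict_moisture(steam_flow, press_load):
    # Stand-in for a trained predictive-quality model
    return 20 - 0.0031 * steam_flow + 0.005 * press_load

def energy_cost(steam_flow, press_load):
    # Stand-in cost model: steam dominates energy cost
    return 0.01 * steam_flow + 0.002 * press_load

SPEC_LOW, SPEC_HIGH = 5.5, 6.5  # moisture spec window (%)

# Grid of candidate (steam_flow, press_load) settings
candidates = [(s, p) for s in range(5000, 5401, 50)
                     for p in range(380, 431, 10)]

# Keep only settings the quality model predicts will stay in spec
in_spec = [(s, p) for s, p in candidates
           if SPEC_LOW <= predict_moisture(s, p) <= SPEC_HIGH]

# Recommend the in-spec setting with the lowest cost
best = min(in_spec, key=lambda sp: energy_cost(*sp))
print(f"recommended settings: steam={best[0]} kg/h, press={best[1]} kW")
```

The design point is the ordering: the quality constraint is applied before cost is minimized, so the recommendation can never trade spec compliance for savings. A real system would use learned models and a smarter search than a grid, but the constraint-then-optimize structure is the same.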