Greg: There’s been a lot of buzz in the industry regarding artificial intelligence (AI) and machine learning (ML). Vendors claim their tools can transform operations and leap tall buildings in a single bound. So, how do we get past the hype and unlock the real value for industrial plants? Julie Smith, group manager for DuPont’s Engineering Technology Center and its global automation and process control leader, is once again here to discuss these two trends. Julie, has your team seen value in AI and ML?
Julie: We certainly see potential. There’s a lot more we can do with the reams of process data we generate daily. However, any tool must be used with caution and under the right circumstances.
You can’t throw data at an ML engine and expect it to perform miracles. The dataset used to train the model is critical to success, and context is important. For example, if you want to predict equipment failures, the training dataset must include those failures. If you want to predict off-spec materials, you need data spanning the period when the off-spec material was produced. Such data can be hard to come by because we don’t want to cause equipment failures or make poor-quality material.
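As a concrete illustration of that pre-check, here's a minimal sketch assuming a hypothetical historian export with a labeled "failure" column (the file name and schema are placeholders):

```python
import pandas as pd

# Hypothetical historian export; the file name and "failure" label column
# are placeholders for whatever the plant's data infrastructure provides.
df = pd.read_csv("historian_export.csv", parse_dates=["timestamp"])

# Before training, confirm the events you want to predict actually
# appear in the training window.
counts = df["failure"].value_counts()
print(counts)
if counts.get(1, 0) == 0:
    raise ValueError("No failure events in this window; a model cannot learn them.")
```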
Greg: Agreed. Good data quality is hard to come by, particularly for multi-product plants. How do you address this issue?
Julie: For the product-quality use case, we’ve been able to turn to dynamic simulation. A high-fidelity digital twin is a great asset to an ML model in several ways. First, the digital twin helps train the ML model, generating cleaner datasets for the ML algorithms to absorb. It also fills in gaps for products made less frequently. Since a simulation runs much faster than real time, you can make hundreds of batches in a fraction of the time.
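A minimal sketch of that idea, assuming a hypothetical digital-twin interface twin.run_batch(recipe) rather than any particular simulation product:

```python
# Hypothetical digital-twin interface: twin.run_batch(recipe) runs one
# simulated batch faster than real time and returns a record of results.
def generate_training_set(twin, recipes, batches_per_recipe=100):
    """Build an ML training set from simulated batches, including
    products made too infrequently to appear often in plant data."""
    rows = []
    for recipe in recipes:
        for _ in range(batches_per_recipe):
            result = twin.run_batch(recipe)  # e.g., {"viscosity": ..., "yield": ...}
            rows.append({"recipe": recipe, **result})
    return rows
```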
Another approach is to use a trained ML model to provide parameters for a first-principles model. This is helpful when trying to estimate reaction kinetics or other non-linear phenomena. A common use case is one where the temperature profile is known, but not the Arrhenius constants. We developed a Python application programming interface (API) to allow our models to exchange variables with an ML engine for these cases, and we’re just beginning to see its power.
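As a simplified, self-contained illustration of the kinetics case (not the actual DuPont API), Arrhenius parameters can be regressed from temperature-dependent rate data with standard Python tools; the data below are synthetic:

```python
import numpy as np
from scipy.optimize import curve_fit

R = 8.314  # universal gas constant, J/(mol*K)

# Synthetic observations: temperatures (K) and apparent rate constants (1/s)
T_obs = np.array([350.0, 360.0, 370.0, 380.0, 390.0])
k_obs = np.array([0.011, 0.021, 0.038, 0.066, 0.110])

# Fit ln(k) = ln(A) - Ea/(R*T) in log space for numerical stability
popt, _ = curve_fit(
    lambda T, lnA, Ea: lnA - Ea / (R * T),
    T_obs, np.log(k_obs), p0=(0.0, 50_000.0),
)
lnA, Ea = popt
print(f"A = {np.exp(lnA):.3g} 1/s, Ea = {Ea / 1000:.1f} kJ/mol")
```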
A third possibility is using both paradigms together to model a process that works well when you know mechanisms for some, but not all, of the product properties of interest. For example, let’s say you have a polymer product that suffers from viscosity excursions and occasional light or dark spots. The first-principles model can predict the polymer viscosity as a function of the molecular weight, but it won’t have a clue about the spots. An ML model can regress years of data to find the correlation with spots, assuming the training set included batches with those spots. The two tools can complement each other.
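A minimal sketch of such a hybrid, using an illustrative Mark-Houwink-style viscosity correlation on the first-principles side and a generic classifier on the ML side; all constants, features and data here are placeholders:

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

def viscosity_from_mw(mw, K=1.2e-4, a=0.72):
    """First-principles side: Mark-Houwink-style viscosity estimate
    from molecular weight (constants are illustrative only)."""
    return K * mw ** a

# ML side: regress historical batch features against observed spot defects.
# Columns might be [reactor_temp, agitation_rpm, additive_ppm]; y is 1 if
# the batch showed spots. All values here are synthetic placeholders.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 3))
y = (X[:, 0] + 0.5 * X[:, 2] + rng.normal(scale=0.5, size=200) > 1.0).astype(int)
spot_model = RandomForestClassifier(n_estimators=100, random_state=0).fit(X, y)

# Hybrid prediction for a new batch: physics for viscosity, ML for spot risk
new_batch = np.array([[0.8, -0.2, 1.1]])
print("viscosity estimate:", viscosity_from_mw(45_000))
print("spot risk:", spot_model.predict_proba(new_batch)[0, 1])
```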
Greg: Those are great use cases. Have you had much demand for these solutions?
Julie: It’s been slow to start. Both efforts take a fair amount of investment in terms of resource hours. You must make sure the model has high enough fidelity, and there needs to be a strong enough business case to find the optimum operating point. With today’s high energy prices, particularly in Europe, the business case is becoming easier to justify.
Greg: What other ways has AI become useful in process automation?
Julie: We’ve begun to use generative AI (genAI) to help engineers find information faster. We started with classic IT applications, such as “knowledge finders” and other self-service tools. While it can save hours of searching through manuals, it can also be dangerous. All outputs must still be checked and validated. GenAI will not tell you “I don’t know”; it will make something up if it doesn’t have the data. This is where verification is key.
Another area with high potential is mining our alarm and event logs for insights. If you’ve ever tried to look at an event log from any control system, regardless of vendor, you’ve likely found it frustrating. The logs weren’t set up for human readability. Today, there’s an intermediate step required to parse the data, which is a tedious process. Could genAI make that faster? That would be a real boost to productivity.
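As a rough sketch of that intermediate parsing step, assuming a hypothetical comma-delimited log format (real formats vary by vendor):

```python
import re
from collections import Counter

# Hypothetical raw event-log lines; real formats vary widely by vendor.
raw = [
    "2024-03-01 02:14:07,AREA1,TIC-101,PVHI,ALM,ACT",
    "2024-03-01 02:14:22,AREA1,TIC-101,PVHI,ALM,RTN",
    "2024-03-01 02:15:03,AREA2,FIC-205,PVLO,ALM,ACT",
]

LINE = re.compile(
    r"(?P<ts>\S+ \S+),(?P<area>\w+),(?P<tag>[\w-]+),"
    r"(?P<condition>\w+),ALM,(?P<state>ACT|RTN)"
)

events = [m.groupdict() for line in raw if (m := LINE.match(line))]

# One simple insight once the data is structured: which tags alarm most?
print(Counter(e["tag"] for e in events if e["state"] == "ACT"))
```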
Greg: Could genAI generate process automation system (PAS) configurations?
Julie: That’s certainly been the claim from many vendors. There’s value in taking a functional requirements document, which is typically written in narrative form, and using it to generate PAS logic. We haven’t tried this yet, but the potential is there. A classic use case would be for a newer plant engineer still learning the specific control system in service. Of course, I advocate testing any such logic that AI generates offline first.
Such code generation would also seem best suited to simple tasks, such as generating a timing sequence or other logic at the control-module level. Humans must still generate the overall control strategy for the unit. For example, AI can’t tell you whether to control column top temperature with make or reflux flow because that pairing is highly dependent on the specific vapor-liquid equilibrium (VLE) of the process.
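For a sense of scale, here's a minimal sketch of the kind of control-module-level timing sequence such a tool might draft from a narrative spec; the tags, delays and write function are hypothetical, and any generated logic would still be tested offline first:

```python
import time

# Hypothetical sequence steps: (tag, action, delay in seconds before the step)
SEQUENCE = [
    ("XV-101", "open", 0),
    ("P-101", "start", 5),
    ("XV-102", "open", 10),
]

def run_sequence(write):
    """Execute each step after its delay; `write` sends a command to the PAS."""
    for tag, action, delay in SEQUENCE:
        time.sleep(delay)
        write(tag, action)

# Offline test with a stand-in write function that just prints the command
run_sequence(lambda tag, action: print(f"{tag} -> {action}"))
```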
Greg: That strategy can be found by process simulation, potentially coupled with optimization.
Julie: Exactly.
Greg: All the above use cases are aimed at engineers. How can AI and ML improve operator efficiency?
Julie: Our friends in IT have a tool called intelligent process optimization (IPO) that consists of an ML model coupled with a recommendation engine. IPO acts like an expert advisor to the operator, suggesting ways to improve yield, reduce energy use and so on, as desired. It’s like the old “expert systems” of the 1990s, but with today’s ML computing power. It’s an open-loop tool, so it’s up to the operator whether to apply the recommendations.
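A minimal sketch of the IPO pattern, assuming synthetic historical data, a generic regressor as the yield model, and a naive candidate-scan recommender; the real tool's internals will differ:

```python
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor

# Synthetic history: operating conditions (temperature, feed ratio) and yield
rng = np.random.default_rng(1)
X_hist = rng.uniform([80, 1.0], [120, 3.0], size=(500, 2))
yield_hist = (90 - 0.02 * (X_hist[:, 0] - 105) ** 2 + 2 * X_hist[:, 1]
              + rng.normal(scale=0.5, size=500))

model = GradientBoostingRegressor().fit(X_hist, yield_hist)

# Recommendation engine: scan candidate setpoints near current operation
# and surface the best prediction. Open loop: the operator decides.
current = np.array([95.0, 2.0])
candidates = current + rng.uniform([-5, -0.5], [5, 0.5], size=(200, 2))
best = candidates[np.argmax(model.predict(candidates))]
print(f"Suggested setpoints (operator to confirm): temp={best[0]:.1f}, ratio={best[1]:.2f}")
```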
The challenges are the same as 30 years ago—building trust in the advisor, and keeping it up to date with subtle changes that occur in the process. These changes are particularly challenging for multi-product plants, where new formulations are frequent. Without proper care and feeding, these tools can fall into disuse quickly.
Greg: I think there’s an important synergy between AI, ML, first-principle models and digital twins when final control elements, as detailed in ISA-TR75.25.02-2000 (R2023) Annex A, and measurement dynamics, as detailed in my books, are included. The deadtime, secondary lags, resolution and lost motion introduced by the instrumentation are important for getting a realistic dynamic response, and for recognizing the value of better control valves (true throttling valves instead of imposters) and better sensor types and installations (stepped thermowells and insertion electrodes in pipelines with appreciable velocity).
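Two of those instrumentation effects, resolution and lost motion, are easy to illustrate; this sketch quantizes a valve demand signal and applies a simple backlash model (all constants are hypothetical):

```python
import numpy as np

def apply_resolution(signal, resolution=0.5):
    """Quantize a 0-100% demand signal to the valve's resolution limit."""
    return np.round(np.asarray(signal) / resolution) * resolution

def apply_backlash(signal, deadband=1.0):
    """Lost motion: the stem only follows once the demand escapes the dead band."""
    out = np.empty(len(signal))
    out[0] = signal[0]
    for k in range(1, len(signal)):
        if signal[k] > out[k - 1] + deadband / 2:
            out[k] = signal[k] - deadband / 2
        elif signal[k] < out[k - 1] - deadband / 2:
            out[k] = signal[k] + deadband / 2
        else:
            out[k] = out[k - 1]
    return out

# Demand ramps up 2% and back down; the stem travels noticeably less
demand = np.concatenate([np.linspace(50, 52, 20), np.linspace(52, 50, 20)])
stem = apply_backlash(apply_resolution(demand))
print(f"demand swung {demand.max() - demand.min():.1f}%, stem moved {stem.max() - stem.min():.2f}%")
```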
Design of experiments (DOE) with a digital twin can provide a rich spectrum of data through extensive changes in operating conditions in simulation. This helps eliminate unrealistic AI and ML interpolations and extrapolations, and allows AI and ML correlations to be confirmed as actual cause-and-effect relationships. There are also opportunities for predictive maintenance to reduce instrumentation and equipment problems.
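A minimal sketch of that DOE loop, with a hypothetical run_twin() placeholder standing in for the first-principles dynamic simulation:

```python
from itertools import product

def run_twin(temp, rate):
    """Placeholder for a digital-twin run; a real call would execute the
    dynamic simulation at these conditions and return measured responses."""
    return 90 - 0.02 * (temp - 105) ** 2 + 1.5 * rate

temps = [90, 100, 110, 120]   # degC levels spanning the operating window
rates = [0.8, 1.0, 1.2]       # relative production-rate levels

dataset = [
    {"temp": t, "rate": r, "yield": run_twin(t, r)}
    for t, r in product(temps, rates)
]
print(f"{len(dataset)} simulated runs covering the full factorial design")
```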
The control system’s ability to handle process startups, shutdowns, transitions and abnormal operations can be improved by introducing scenarios into digital-twin, first-principle dynamic simulations that have been shown to match plant data. AI and ML can identify unsuspected relationships, resulting in improvements to the first-principle models and to solutions in terms of procedure automation and state-based control.
There are several challenges to simply using plant operations data to improve closed-loop control. Foremost is the complexity and slowness of dynamic responses in process control, including large deadtimes, negative and positive feedback time constants, unidirectional response (batch applications), integrating action, and open-loop gains that change with time, production rate, equipment conditions and stream compositions. There are also interactions between loops, the dramatic effect of tuning and algorithm choices (e.g., PID forms and structures), and the transfer of variability from controlled variables to manipulated variables. Plants are also increasingly unwilling to change setpoints or flows for a DOE. The synergy between AI, ML, first-principle models and digital twins is our hope for a better future for process control, especially considering the loss of expertise to retirements (the sketch below gives a taste of why raw operating data alone falls short). You might even be able to do a podcast.
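To make the first of those challenges concrete, here's a small first-order-plus-deadtime (FOPDT) sketch showing how far the measured response lags a step in the input; all constants are illustrative:

```python
import numpy as np

Kp, tau, theta, dt = 2.0, 30.0, 10.0, 1.0  # gain, time constant (s), deadtime (s), integration step (s)
t = np.arange(0, 200, dt)
u = (t >= 20).astype(float)  # step input at t = 20 s

# Euler integration of a first-order lag driven by the deadtime-delayed input
y = np.zeros_like(t)
for k in range(1, len(t)):
    u_delayed = u[max(k - int(theta / dt), 0)]
    y[k] = y[k - 1] + dt / tau * (Kp * u_delayed - y[k - 1])

# A naive correlation of u and y over this window would badly understate Kp
print(f"output at t = 30 s: {y[30]:.3f}; final value approaches {Kp:.1f}")
```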