Smart Technology: Artificial Intelligence in the Process Industry – A Very Hot Affair

Author / Editor: Thomas Bierweiler, Dr. Daniel Labisch and Dr. Konrad Grießinger* / Ahlam Rais

In recent years, artificial intelligence (AI) has regularly hit the headlines: in 2016, it beat Go grandmaster Lee Sedol, and in 2018, the auction house Christie's sold the first artwork created by artificial intelligence, "Portrait of Edmond de Belamy", for over 400,000 dollars. Today, the AI-based text generator GPT-3 from the research laboratory OpenAI, co-founded by Elon Musk, is said to be flooding the web with machine-generated fake news. But what about industrial applications of artificial intelligence? What are the opportunities and challenges of AI in the process industry?


AI methods are combined with mechanistic models to fully exploit the potential of the digitally available process knowledge. Such methods are already being used profitably in applications and digital services for the process industry.
(Source: Siemens)

What actually is this 'artificial intelligence'? The topic polarises, and it often frightens and unsettles people, not least because Hollywood's film machinery has shaped our idea of artificial intelligence for decades far more than scientifically based statements ever could. So let's stick to the facts: AI is an umbrella term for the way computers handle data, how they analyse, interpret and learn from data, and finally how they use what they have learned to achieve certain goals through flexible adaptation.

The more closely we look at the subject, the more blurred the boundaries between individual procedures, methods and approaches become; sometimes individual disciplines are subsets of others. However, a fundamental distinction is made between strong and weak AI. Much of what Hollywood wants us to believe is based on strong AI, because the goal of strong artificial intelligence (also called superintelligence) is to match or exceed the intellectual abilities of humans. What we can already implement today, on the other hand, is based on weak AI, which proceeds statistically, collects data and gains knowledge from it. It is not a matter of replacing people, but of supporting them - for example, in processing unimaginably large amounts of data for text or image recognition.


Whenever large amounts of data have to be mapped to a manageable number of possible results, artificial neural networks (ANNs) help. They are a class of learning algorithms whose structure is inspired by the human brain and which are characterised by an input layer, an output layer and several intermediate layers. From a technical point of view, layers of artificial neurons are combined with each other so that even complex relationships can be modelled if the network is large enough. When we talk about artificial intelligence, however, we usually mean so-called machine learning (ML) - and thus rather 'prognostic competence'.
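
To illustrate this layered structure, the following minimal sketch trains a tiny network with one hidden layer on invented data. The layer sizes, the synthetic measurements and the training loop are assumptions made purely for illustration; they are not taken from any of the applications described in this article.

```python
import numpy as np

# Minimal sketch of a feed-forward network with one hidden layer.
# Data, layer sizes and learning rate are illustrative only.
rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# Toy data: 4 hypothetical process measurements -> 1 quality indicator
X = rng.normal(size=(100, 4))
y = (X.sum(axis=1, keepdims=True) > 0).astype(float)

# Randomly initialised weights for input->hidden and hidden->output layers
W1, b1 = rng.normal(size=(4, 8)), np.zeros(8)
W2, b2 = rng.normal(size=(8, 1)), np.zeros(1)

lr = 0.5
for _ in range(2000):
    # Forward pass through the two layers
    h = sigmoid(X @ W1 + b1)
    y_hat = sigmoid(h @ W2 + b2)
    # Backward pass: gradients of the mean squared error
    d_out = (y_hat - y) * y_hat * (1 - y_hat)
    d_hid = (d_out @ W2.T) * h * (1 - h)
    W2 -= lr * h.T @ d_out / len(X)
    b2 -= lr * d_out.mean(axis=0)
    W1 -= lr * X.T @ d_hid / len(X)
    b1 -= lr * d_hid.mean(axis=0)

print("training error:", float(np.mean((y_hat - y) ** 2)))
```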

ML is the generic term for the generation of knowledge from experience: self-learning machines can, for example, recognise error patterns in industrial production or predict faults. In contrast to algorithms that work purely rule-based, ML derives probabilities from structured training data. This means that even tasks whose rules are difficult or impossible to describe can be solved. An ML system learns from hundreds of thousands of examples and, after a learning phase, can generalise them into a statistical model. Deep Learning (DL) is in turn the part of ML that deals with learning complex relationships by means of deep neural networks, i.e. networks with many layers; there is no general definition of how many layers count as 'many'. A classic example of a DL application is automatic face recognition.
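
To make the contrast with rule-based algorithms concrete, the following hedged sketch trains a classifier on labelled example data instead of hand-written rules. The scikit-learn model, the synthetic 'sensor' features and the fault label are assumptions for illustration only; real applications would use historical plant data.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(1)
# Synthetic "sensor" features: e.g. temperature, pressure, vibration
X = rng.normal(size=(5000, 3))
# Synthetic fault label driven by a pattern we never spell out as a rule
y = ((X[:, 0] * X[:, 2] > 0.5) | (X[:, 1] < -1.5)).astype(int)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
model = RandomForestClassifier(n_estimators=100, random_state=0)
model.fit(X_train, y_train)  # learning phase on labelled examples
print("generalisation accuracy:", model.score(X_test, y_test))
```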

AI Applications in the Process Industry

Artificial neural networks have been known for decades in the process industry, for example in the chemical, steel, and water and wastewater industries. ANNs are used especially in predictive control and modelling, either to replace conventional controllers or to provide them with set points which the neural network has determined from a self-developed prognosis of the process course. Fuzzy systems can also be made learnable by a bidirectional conversion into neural networks. Due to their flexibility, ANNs can be used for a variety of different applications; thus, they perform outstanding services in image recognition, time series analysis and fault detection.


ML methods are characterised by the fact that they enable computers to learn from data without the need for explicit programming. Computers are trained to recognise patterns in unstructured data sets by means of algorithms and to make decisions based on this 'knowledge'. In the process industry, ML will play an increasingly important role in, among other things, automated and autonomous anomaly detection in asset management.

When it comes to finding correlations between data and the forecasts that can be derived from them, Deep Learning is an extremely flexible and adaptable tool. DL enables algorithms to enhance their own ability to identify and classify patterns and relationships between data. Even unknown types of data can be considered automatically without the need for manual learning. This increases both the amount of data supplied to the DL-based forecast model and the reliability of the forecasts. This qualifies such systems for sophisticated asset management with predictive maintenance concepts as well as automatically implemented quality management.

Exemplary Evolution of AI Methods

At Siemens, AI is a strategically important topic, and this is not only true of the medical technology subsidiary Healthineers or the newly created Siemens Energy: as early as the 1990s, neural networks were implemented in steel plants for process optimisation. For the development and verification of AI methods, in-house process engineering test facilities are available. The development of AI-supported methods typically starts with the identification of real customer requirements and with synthetic and simulated data.

Typical process engineering research focuses at Siemens include, for example, increasing plant availability, improving the planning of maintenance work and optimising overall equipment effectiveness (OEE). In the next step, suitable AI methods are tested in the research facility, where conditions are reproducible and yet real. The results help to improve and further develop the methods and to identify new aspects. Only when validated results are available does contact with potential pilot customers begin, in order to test the methods in the real environment of process plants.

Three Siemens examples of AI solutions in the process industry

The Asset Performance Suite is an asset management solution that supports predictive maintenance and provides a way to use artificial intelligence to improve the reliability and performance of assets. It helps plant owners and operators in the process industry to get the highest possible operational efficiency from their assets.

With the software Siemens Predictive Analytics (Siepa), plant problems can be detected early and downtimes avoided. An AI model first learns the normal behaviour of the plant using historical data and then detects anomalies during operation. On the basis of the anomalies found, a root cause analysis can then be used to identify causes and derive recommendations for action.
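
For readers who want a feel for the general principle (not Siemens' actual implementation in Siepa), the sketch below fits a model to invented 'normal operation' data and then flags deviating measurements during operation. The use of scikit-learn's Isolation Forest and all values are assumptions made for illustration.

```python
import numpy as np
from sklearn.ensemble import IsolationForest

rng = np.random.default_rng(2)
# Historical data from normal operation: e.g. flow, temperature, power
normal_history = rng.normal(loc=[10.0, 80.0, 5.0], scale=[0.5, 2.0, 0.2],
                            size=(10_000, 3))

detector = IsolationForest(contamination=0.01, random_state=0)
detector.fit(normal_history)          # learn the plant's normal behaviour

# New measurements during operation; the last one deviates clearly
live = np.array([[10.1, 79.5, 5.1],
                 [10.0, 81.0, 4.9],
                 [14.0, 95.0, 7.5]])
labels = detector.predict(live)       # +1 = normal, -1 = anomaly
for sample, label in zip(live, labels):
    print(sample, "anomaly" if label == -1 else "normal")
```

Anomalies flagged in this way would then be the starting point for the root cause analysis described above.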

For individual process industries, there are software suites that combine industry-specific applications, software modules and digital services. For example, the Chemical Suite includes various modules for the analysis of data in chemical plants. It enables machine-supported monitoring of valves, pumps or heat exchangers and offers a variety of instruments for process optimisation. The success of data-driven AI models lies in the appropriate combination of expertise in the functionality, structure and instrumentation of the plant and its components.

Siemens pays particular attention to optimisation methods to improve production processes: Algorithms are used to identify system behaviour and recognise dependencies. Depending on the complexity of the process, this ranges from simple characteristic maps to state-of-the-art DL methods. In addition, AI methods are combined with mechanistic models to fully exploit the potential of the digitally available process knowledge. Such methods are already being used profitably in applications and digital services for the process industry, thus helping module manufacturers, original equipment manufacturers (OEMs) and plant operators to run their process plants more reliably, efficiently and robustly.

Future Potential and Challenges

So there is not just 'one' artificial intelligence, but many different forms of it. Nevertheless, AI opens up previously undreamt-of possibilities for the process industry to move beyond classic hard-coded approaches to adaptive, self-learning solutions based on large amounts of data and machine learning algorithms. This means that in the future, Operational Technology (OT), i.e. the production area, and Information Technology (IT), i.e. the area of data processing using hardware and software, will move even closer together.

Process and production engineers as well as data engineers and computer scientists will have to combine their respective expertise to develop optimal AI methods that also meet the highest IT security requirements. In the future, we will have to attach as much importance to topics such as databases, data architecture, modelling, statistics, data storage and logging as to data quality, access to (historical) data, and the analysis and contextualisation of data.

Despite decades of research - the academic beginnings go back to the 1960s - there is still huge potential to be tapped in artificial intelligence, and not only in the process industry. Thanks to many successfully realised applications, the advantages of AI are already measurable today - the changes are visible. And if the dystopian Hollywood scenarios are good for anything other than cosy goose bumps, then perhaps for reminding us that the introduction of new technologies should be accompanied not only by economic success but also by responsible behaviour.

* Thomas Bierweiler, Senior Key Expert, Siemens; Dr. Daniel Labisch, Project Manager, Siemens; and Dr. Konrad Grießinger, Data Scientist, Siemens.

(ID:47021229)