Quality4.0 - First project results available
The first half of this project was dedicated to requirements specification and method development, enabling the implementation of the Quality4.0 framework in the second half of the project.
The project started with an examination of the existing IT infrastructure at the industrial partners. Quality-relevant data sources with respect to the horizontal integration of information over the complete supply chain were determined and a survey was created. Several samples of the selected data were analysed to understand their structure and problems. A scenario-based approach was executed to define the functional requirements of the various stakeholders through the industrial use cases, using the standard Cockburn use-case template. Based on this background and the use cases, the functional requirements were formulated, derived from the high-level objectives of the project.
Besides the functional requirements, the non-functional ones have been defined. While functional requirements define the behaviour of a system, non-functional requirements define criteria for the operation of the system (e.g. operating system, performance, stability, safety, etc.).
In addition, existing IT standards for quality data exchange were analysed with regard to their compliance with the Quality4.0 project’s requirements. The following three solutions were identified and investigated:
• QDX, the VDA Quality Data eXchange (https://portal3.gefeg.com/vdaqdx/page/home),
• STEP, ISO 10303 (http://www.steptools.com/stds/step/),
• and the EUROFER Quality Tracking System (http://www.eurofer.eu/Issues%26Positions/Quality%20Tracking/Quality%20Tracking.fhtml)
As the consortium aims to use a free standard and, to the best of its knowledge, no free standard supports the requirements of the Quality4.0 project, it was finally proposed to define a Quality4.0-specific IT standard, taking inspiration from the tools above. Thus, in the upcoming implementation work package, a unique programmable interface exposed by the QXS service and adapted to the chosen Quality4.0 architecture will be specified and implemented.
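As a rough illustration of what such a Quality4.0-specific exchange format could look like, the following sketch defines a minimal quality record and serialises it to JSON. All field names (`coil_id`, `plausibility`, etc.) are assumptions for illustration only; the actual QXS interface will be specified in the implementation work package.

```python
from dataclasses import dataclass, asdict
import json

@dataclass
class QualityRecord:
    """One quality measurement exchanged between supply-chain partners.
    Field names are illustrative, not the actual QXS specification."""
    coil_id: str
    characteristic: str   # e.g. "yield_strength"
    value: float
    unit: str
    measured_at: str      # ISO 8601 timestamp
    plausibility: float   # 0.0 (implausible) .. 1.0 (fully plausible)

def to_message(record: QualityRecord) -> str:
    """Serialise a record for transfer via the exchange service."""
    return json.dumps(asdict(record), sort_keys=True)

def from_message(payload: str) -> QualityRecord:
    """Reconstruct a record on the receiving side."""
    return QualityRecord(**json.loads(payload))

record = QualityRecord("C-4711", "yield_strength", 355.0, "MPa",
                       "2021-03-01T12:00:00Z", 0.97)
assert from_message(to_message(record)) == record  # lossless round trip
```

Carrying the plausibility value inside every message is one possible design choice: it lets a receiving partner weigh the measurement without querying the sender again.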
After all functional and non-functional requirements had been formulated and a suitable interface had been determined, the partners developed the methods to be implemented in the Quality4.0 framework. This method development comprises, on the one hand, methods for data plausibility checking and, on the other hand, methods for adaptive quality supervision.
To guarantee information reliability, the relevant data sources at the industrial partners were analysed in depth and plausibility values (PVs) were defined for each individual measurement. By means of these PVs the Quality4.0 framework should be able to assess the reliability of information coming from the different measurement systems.
The partners developed different methods for calculating PVs. Besides simple thresholding, fuzzy-based, variation-based and data-driven methods were defined. Furthermore, unsupervised outlier detection was used, and the plausibility of measurements was expressed in terms of their outlyingness. The results of these investigations were summarised in a list of data plausibility measures and a guideline for their use and combination.
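As a minimal sketch of two of these PV families, thresholding and outlyingness, the following assumes a PV scale from 0 (implausible) to 1 (fully plausible), with a robust z-score standing in for the project's unsupervised outlier detection; the concrete scales and limits are assumptions, not the project's actual definitions.

```python
import statistics

def pv_threshold(value, lo, hi, margin):
    """Range-based PV: 1.0 inside [lo, hi], fading linearly to 0 over `margin`."""
    if lo <= value <= hi:
        return 1.0
    dist = (lo - value) if value < lo else (value - hi)
    return max(0.0, 1.0 - dist / margin)

def pv_outlyingness(value, history):
    """Outlier-based PV via a robust z-score (median/MAD) against recent history."""
    med = statistics.median(history)
    mad = statistics.median(abs(x - med) for x in history) or 1e-9
    z = 0.6745 * abs(value - med) / mad   # scales MAD to a normal-data z-score
    return 1.0 / (1.0 + max(0.0, z - 3.0))  # PV decays once |z| exceeds 3

history = [350.0, 352.0, 351.0, 353.0, 349.0, 352.0, 350.0]
pv = min(pv_threshold(351.0, 340.0, 360.0, 10.0),
         pv_outlyingness(351.0, history))  # conservative combination
assert pv == 1.0
```

Taking the minimum of several PVs is one simple conservative combination rule; the project's guideline covers further measures and combinations.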
For the implementation, it is planned to apply suitable real-time technologies dedicated to managing data streams, in which the defined plausibility checking methods can be implemented. For instance, a given method may be integrated in the speed (stream) layer of a Lambda architecture or in the stream processing layer of a Kafka-based architecture.
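To illustrate where such a check would sit in a streaming topology, the following plain-Python generator stands in for a map-style stream operator; a real deployment would use e.g. Kafka Streams, and the message fields here are hypothetical.

```python
def plausibility_stage(stream, lo, hi):
    """Stream operator: annotate each measurement message with a PV.
    Stands in for a map() step in a Kafka Streams / speed-layer topology."""
    for msg in stream:
        pv = 1.0 if lo <= msg["value"] <= hi else 0.0
        yield {**msg, "pv": pv}

raw = iter([{"coil": "C-1", "value": 351.0},
            {"coil": "C-2", "value": 500.0}])   # 500.0: implausible reading
checked = list(plausibility_stage(raw, 340.0, 360.0))
assert [m["pv"] for m in checked] == [1.0, 0.0]
```

Because the operator only enriches messages and never blocks on external state, it fits the per-record latency constraints of a stream layer.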
To prevent the repeated calculation of aggregations and interference with time-critical stream processing, an intelligent caching and pre-fetching strategy was developed for the Quality4.0 framework, based on scalable storage patterns (i.e. data lake, data warehouse), message queues for resiliency and elasticity, and modern infrastructure, deployment and development practices (e.g. DevOps). In addition, edge computing strategies that shift aggregation tasks to the sensor side where possible were evaluated and considered in the Quality4.0 architecture design.
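A minimal sketch of the caching idea, assuming a hypothetical data-lake read (`load_window`) whose aggregation results are memoised per (source, window) so that repeated requests do not hit storage again:

```python
from functools import lru_cache

READS = {"storage": 0}  # counts raw-data accesses to make caching visible

def load_window(source, start, end):
    """Hypothetical stand-in for a data-lake read of one time window."""
    READS["storage"] += 1
    return [350.0 + (i % 3) for i in range(start, end)]

@lru_cache(maxsize=1024)
def window_mean(source, start, end):
    """Cached aggregation: computed once per (source, window)."""
    data = load_window(source, start, end)
    return sum(data) / len(data)

window_mean("mill_A", 0, 100)
window_mean("mill_A", 0, 100)    # second call served from cache
assert READS["storage"] == 1     # storage was read only once
```

A production cache would also need invalidation when late or corrected measurements arrive, which is exactly why it must stay out of the time-critical stream path.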
The main approach to achieving adaptive quality supervision in the Quality4.0 framework is to model quality decision processes semantically along the production workflow and to determine customer intimacy levels for quality data exchange. Based on this information, the system can decide on the exchangeability of quality data. For this semantic modelling, the partners developed two ontologies within RP1: the first covers the supply chain and production workflow for steel production, the second covers the customer relationship.
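As a simplified sketch of the decision logic the ontologies enable: the level names and the category-to-level mapping below are pure assumptions for illustration, not the project's actual intimacy model.

```python
from enum import IntEnum

class IntimacyLevel(IntEnum):
    """Illustrative customer intimacy levels (names are assumptions)."""
    BASIC = 1      # e.g. certificates only
    EXTENDED = 2   # e.g. plus aggregated quality KPIs
    FULL = 3       # e.g. plus high-resolution measurement profiles

# Hypothetical minimum level required per quality data category.
REQUIRED_LEVEL = {
    "certificate": IntimacyLevel.BASIC,
    "kpi_summary": IntimacyLevel.EXTENDED,
    "flatness_profile": IntimacyLevel.FULL,
}

def may_exchange(customer_level, category):
    """Decide whether a data category may be shared with this customer."""
    return customer_level >= REQUIRED_LEVEL[category]

assert may_exchange(IntimacyLevel.EXTENDED, "kpi_summary")
assert not may_exchange(IntimacyLevel.BASIC, "flatness_profile")
```

In the framework itself, both the customer's level and the data categories would be derived from the two ontologies rather than hard-coded as here.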
Finally, based on these results, the partners designed the Quality4.0 framework, which will be prototypically implemented at the industrial partners in RP2.
The general concept of the Quality4.0 framework is a Service-Oriented Architecture (SOA), so that it can be combined with and integrated into existing IT infrastructures without relying on a single software vendor, product or technology. The strict privacy regulations at the industrial partners ultimately prohibited a solution in which data leaves the plant; the consortium therefore decided on an edge architecture that allows local processing of the data at plant level.
Progress beyond the state of the art, expected results until the end of the project and potential impacts
The main objective of this project is to establish a new level of customer-supplier collaboration by means of the horizontal integration of quality information over the complete supply chain of steel production. No approaches can be found in the international state of the art that adequately cover all aspects of this global concept; this project is therefore a significant step beyond the current state of the art.
Referring to data plausibility, available tools cover only certain aspects (dimensions) of data quality. Data quality monitoring in the steel industry has to consider the complex material tracking from liquid steel to rolled products. Moreover, steel production comprises a huge variety of diverse processes, from sintering and steelmaking through casting to rolling. Therefore, the Quality4.0 solution was designed to cope with various time constraints as well as various data types (from laboratory analyses to high-resolution flatness profiles).
At the end of this project, the Quality4.0 framework is expected to be put into industrial practice by means of a prototypical installation at two different industrial sites, offering the industrial partners significant potential to improve their processes and thus to optimise their overall performance.
Furthermore, the horizontal integration of quality information over the complete supply chain offers significant potential for the steel industry in general. Only if the European steel industry succeeds in winning customer trust and solidifying client intimacy will it achieve a durable competitive advantage and thus reduce pressure from worldwide imports. It is therefore of strategic importance for the European steel industry to proactively promote such a common platform instead of reacting to specific customer demands.
The main potential impacts expected from the Quality4.0 platform are improved information reliability, supply chain reactivity and customer intimacy. In RP1 the partners defined KPIs to quantify these aspects as precisely as possible. At project end, they will perform a before/after comparison to evaluate the overall effect of operating the plants with the solutions developed within this project.
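Such a before/after comparison could be computed as a relative change per KPI; the KPI names and values below are purely illustrative and not the project's actual indicators.

```python
def kpi_change(before, after):
    """Relative change per KPI between baseline and post-deployment values."""
    return {k: (after[k] - before[k]) / before[k] for k in before}

# Hypothetical example: fewer quality claims, faster response to customers.
baseline = {"claims_per_1000_t": 4.0, "response_time_h": 48.0}
post     = {"claims_per_1000_t": 3.0, "response_time_h": 24.0}
changes = kpi_change(baseline, post)
assert changes == {"claims_per_1000_t": -0.25, "response_time_h": -0.5}
```

Negative values here indicate an improvement (25 % fewer claims, response time halved); in practice each KPI would declare its own direction of improvement.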