Between agility and stability


By "digitization" we do not mean the conversion of analogue content (books, music, films) into digital media.
Rather, we understand it as a megatrend, enabled by new technologies such as sensor technology, mobile communication and extremely fast data processing, that ultimately amounts to the real-time synchronization of physical and digital reality.
This development is not being driven by technology fans and hobbyists, but by tangible economic benefits. The automated processing of real-time data creates benefits that can be measured in euros and cents.
A classic SAP business case from the field of predictive maintenance: sensors in the housing of a pump measure temperature, humidity and vibration and transmit this data to an analytical system in real time.
Powerful statistical algorithms use experience-based rules to predict the probability that a particular pump will fail within the next 24 hours.
If necessary, maintenance notifications or orders are created immediately. The assigned technician on site reports back the actual condition of the pump, enabling the analytical application to learn from correct and incorrect forecasts.
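To make this logic concrete, here is a minimal sketch in Python; the sensor limits, weights and the notification step are illustrative assumptions, not SAP code:

```python
from dataclasses import dataclass

@dataclass
class SensorReading:
    pump_id: str
    temperature: float  # degrees Celsius
    humidity: float     # percent relative humidity
    vibration: float    # mm/s RMS

def failure_probability(r: SensorReading) -> float:
    """Toy experience-based rule set: each sensor value beyond its
    empirical limit increases the predicted 24-hour failure risk."""
    score = 0.0
    if r.temperature > 80.0:  # illustrative limit
        score += 0.4
    if r.vibration > 7.0:     # illustrative limit
        score += 0.4
    if r.humidity > 90.0:     # illustrative limit
        score += 0.2
    return min(score, 1.0)

def process(reading: SensorReading, threshold: float = 0.5) -> None:
    p = failure_probability(reading)
    if p >= threshold:
        # In the real scenario this would create a maintenance
        # notification or order in the ERP system.
        print(f"Notify maintenance for {reading.pump_id}: p(failure, 24h) = {p:.2f}")

process(SensorReading("P-4711", temperature=85.0, humidity=95.0, vibration=7.5))
```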
The potential benefits are obvious: less pump downtime, fewer false alarms and defective pumps, and significantly lower operating costs.
ROI does not fall from the sky
The potential is huge, but the desired ROI does not fall from the sky. New solution architectures, new qualifications and new ways of thinking are needed in both the business departments and IT.
Our pump scenario consists of several subsystems (see figure): 1) the pump with its sensors; 2) an event stream processor (such as SAP ESP) that calculates failure probabilities and creates maintenance notifications; 3) an ERP solution that processes maintenance orders and receives confirmation data; and 4) a rule detection engine that generates and continuously improves the algorithms in the ESP.
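Reduced to its skeleton, the interplay of these subsystems could be wired up as follows; every interface in this sketch is a simplifying assumption, not an actual ESP or ERP API:

```python
class EventStreamProcessor:
    """Subsystem 2: scores incoming readings with the current rule set."""
    def __init__(self, rules):
        self.rules = rules
    def on_reading(self, reading, erp):
        p = self.rules.failure_probability(reading)
        if p >= self.rules.threshold:
            erp.create_notification(reading.pump_id, p)

class ErpSystem:
    """Subsystem 3: processes maintenance orders and collects
    technician confirmations on the actual pump condition."""
    def __init__(self):
        self.notifications, self.confirmations = [], []
    def create_notification(self, pump_id, probability):
        self.notifications.append((pump_id, probability))

class RuleDetectionEngine:
    """Subsystem 4: compares forecasts with confirmations and
    returns an improved rule set for the ESP."""
    def improve(self, rules, confirmations):
        ...  # statistical learning happens here
        return rules
```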
Event Stream Processing
Event streams - as in our example - deliver large amounts of data, often of uncertain quality. The data must be enriched and/or cleansed before further processing.
The art of designing a solution architecture lies in striking a balance between high data throughput and high fault tolerance. Both specialized architecture patterns (such as the Lambda architecture) and specialized programming paradigms are used here.
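What enrichment and cleansing can mean in practice: a small Python sketch that discards implausible values and fills sensor gaps with the last valid reading (the plausibility ranges are assumptions):

```python
# Plausible physical ranges per sensor type (assumed values).
PLAUSIBLE = {
    "temperature": (-40.0, 150.0),  # degrees Celsius
    "humidity": (0.0, 100.0),       # percent
    "vibration": (0.0, 50.0),       # mm/s
}

def cleanse(stream):
    """Yield only plausible readings; fill gaps (None values)
    with the last valid value for the same pump and sensor."""
    last_valid = {}
    for pump_id, sensor, value in stream:
        lo, hi = PLAUSIBLE[sensor]
        if value is None:                # sensor gap
            value = last_valid.get((pump_id, sensor))
            if value is None:
                continue                 # no history yet: drop
        elif not lo <= value <= hi:      # implausible: drop
            continue
        last_valid[(pump_id, sensor)] = value
        yield pump_id, sensor, value

raw = [("P-1", "temperature", 85.0), ("P-1", "temperature", None),
       ("P-1", "vibration", 999.0)]
print(list(cleanse(raw)))  # gap filled with 85.0, outlier dropped
```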
Closed feedback loops
The data from ESP and ERP is merged in the Rule Detection Engine, where it is further analyzed and used to improve the rules in the Event Stream Processor.
In this way, historical data also indirectly influences future maintenance orders, resulting in a closed feedback loop.
All of this happens automatically, continuously and with little to no latency. The overall system is therefore able to react flexibly when new insights are gained or when the "behavior" of the pumps changes (for whatever reason).
Closed control or feedback loops are not new in themselves; many process control systems work with them. The innovation lies in fully digitalized business models:
Such control loops then exist not only for individual production steps, but for complete business processes. And the underlying rules are not rigid, but can be designed to be self-learning.
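In its simplest form, such a self-learning rule can be a single alert threshold that technician feedback pushes up or down. A minimal sketch, with the update step size as an assumption:

```python
class AdaptiveThreshold:
    """Closed feedback loop: false alarms raise the alert threshold,
    missed failures lower it."""
    def __init__(self, threshold: float = 0.5, step: float = 0.01):
        self.threshold = threshold
        self.step = step

    def feedback(self, predicted_failure: bool, actual_failure: bool) -> None:
        if predicted_failure and not actual_failure:    # false alarm
            self.threshold = min(0.99, self.threshold + self.step)
        elif actual_failure and not predicted_failure:  # missed failure
            self.threshold = max(0.01, self.threshold - self.step)

loop = AdaptiveThreshold()
loop.feedback(predicted_failure=True, actual_failure=False)
print(loop.threshold)  # 0.51: the system has become slightly less alarm-prone
```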
New qualifications
Processing event streams and building self-learning control loops are in themselves a challenge for veteran Abap developers. On top of that, the solutions often employ sophisticated statistical algorithms (Bayesian networks, Arima/Armax, latent variable models).
For both SAP BW and S/4, SAP has opened the door to the world of statistics with HAP (Hana Analysis Process), SAP PAL (Predictive Analysis Library) and the integration of the R language.
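For readers outside the SAP stack: the Arima family mentioned above is also available in open-source tools. A sketch with Python's statsmodels library and synthetic data (SAP PAL offers comparable functions, but with its own interfaces):

```python
import numpy as np
from statsmodels.tsa.arima.model import ARIMA

# Ten days of hourly vibration readings for one pump (synthetic).
rng = np.random.default_rng(42)
hours = np.arange(240)
vibration = 4.0 + 0.8 * np.sin(hours / 24.0) + rng.normal(0.0, 0.2, 240)

# Fit an ARMA(2,1) model (Arima with d=0) and forecast the next 24 hours;
# a rising vibration forecast would feed into the failure probability.
result = ARIMA(vibration, order=(2, 0, 1)).fit()
forecast = result.forecast(steps=24)
print(forecast[:5])
```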
In the future, experienced programmers will be replaced by data scientists, and many universities already offer corresponding master's degree programs. In five to ten years' time, their graduates will be working either for you or for your competitors.
Induction instead of deduction
Organizations today operate in an unstable environment. Customer behavior, and the decision-making rules in business processes that build on it, no longer change annually, but daily, hourly or by the minute.
The architectures described above must therefore be extremely flexible; at the same time, it is important to maintain an overview in terms of governance and compliance.
This only works if you a) focus on correlations instead of trying to explain cause-and-effect relationships, and b) let yourself be guided by the data instead of imposing your own world view on it (induction instead of deduction).
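A small illustration of the inductive approach: rank all sensor features purely by their observed correlation with subsequent failures, with no prior hypothesis about cause and effect (the data here is synthetic):

```python
import numpy as np

rng = np.random.default_rng(0)
feature_names = ["temp_mean", "temp_max", "humidity_mean", "vibration_rms"]
X = rng.normal(size=(500, 4))                                  # sensor aggregates
y = (X[:, 3] + rng.normal(0.0, 1.0, 500) > 1.5).astype(float)  # failure flags

# Inductive step: let the data rank the candidate predictors.
for name, column in zip(feature_names, X.T):
    r = np.corrcoef(column, y)[0, 1]
    print(f"{name:14s} r = {r:+.2f}")
```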
Changing the corresponding mindset takes more time than the purely technical migration from ERP 6.0 to S/4.
IT as a composer
The CIO can play a decisive role here and accelerate the further development of the organization. With or without outsourcing:
the point is no longer to do all the development work in-house, but to develop the organization so that it can manage outsourced work efficiently and monitor it in a results-oriented manner.
The term "results-oriented" does not refer to ITIL service levels. "Results-oriented" means that IT service providers are measured on the basis of quality criteria for the performance of algorithms (example: "accuracy in terms of pump failure").
How such algorithms work in detail is of secondary importance; at most, it needs to be comprehensible to the company's own employees.
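Such quality criteria are easy to state precisely. A sketch of how "accuracy in terms of pump failure" (plus precision and recall) could be computed from forecasts and technician confirmations:

```python
def forecast_quality(predictions, actuals):
    """Quality criteria for 'accuracy in terms of pump failure':
    share of correct forecasts, plus precision and recall."""
    tp = sum(p and a for p, a in zip(predictions, actuals))
    fp = sum(p and not a for p, a in zip(predictions, actuals))
    fn = sum(not p and a for p, a in zip(predictions, actuals))
    tn = sum(not p and not a for p, a in zip(predictions, actuals))
    total = tp + fp + fn + tn
    return {
        "accuracy": (tp + tn) / total,
        "precision": tp / (tp + fp) if tp + fp else None,  # alarms that were real
        "recall": tp / (tp + fn) if tp + fn else None,     # failures that were caught
    }

# Example: four forecasts against technician confirmations.
print(forecast_quality([True, True, False, False], [True, False, True, False]))
```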
Crowdsourcing portals such as www.kaggle.com for the development of algorithms or SAP's own Idea Marketplace (ideas.sap.com) show the way here.
The balancing act between agility and stability can only succeed if in-house IT experts develop from "musicians" into "composers". Instead of cramming every conceivable functionality into Z programs or BAdIs, the openness of a platform such as Hana can be used to integrate other solutions, for example via Extended Application Services (XS).
This means that simple tasks (such as reporting via Tableau) or highly specialized work steps (calculating travel times as part of real-time customer segmentation) can be outsourced to other applications.
At the same time, when setting up data flows, it must be clarified where persistent structures are required (e.g. the new Adso in Hana-based BW) and what should be mapped via virtual (e.g. Open ODS views) or hybrid objects (e.g. composite providers).
Ultimately, the CIO is responsible for ensuring that his or her organization can make such decisions and confidently orchestrate instruments such as crowdsourcing or SaaS.
Understanding probabilities
Classic ERP systems are organized deterministically. Of course, you can store order probabilities at quotation item level in SAP CRM.
But in the next step, a quotation may or may not become a sales order. Subsequent steps (such as material requirements planning) are not controlled by probabilities.
For Hana, a platform with a wide range of stochastic functions, this is a serious limitation. In the future, it will therefore be a matter of making partially automated decisions on the basis of calculated (and much more realistic) probabilities rather than (probably incorrect) assumptions.
Our pump scenario provides an example: with a limited number of service technicians, maintenance notifications are only triggered above a certain probability threshold, which is determined automatically and continuously adjusted.
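One conceivable way to derive such a threshold automatically is from capacity: with n available technicians per shift, alert only on the n highest failure probabilities. A sketch under that simplifying assumption:

```python
def capacity_threshold(probabilities, technicians):
    """Set the alert threshold so the number of notifications per
    shift matches the number of available technicians."""
    if technicians <= 0:
        return float("inf")           # no capacity: no alerts
    ranked = sorted(probabilities, reverse=True)
    if technicians >= len(ranked):
        return 0.0                    # enough capacity for every pump
    return ranked[technicians - 1]    # n-th highest probability is the bar

# Example: 3 technicians, failure probabilities for 6 pumps this shift.
probabilities = [0.91, 0.40, 0.75, 0.12, 0.66, 0.08]
threshold = capacity_threshold(probabilities, technicians=3)
print(threshold)                                     # 0.66
print([p for p in probabilities if p >= threshold])  # [0.91, 0.75, 0.66]
```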
Conclusion
Hana is an invaluable tool for the digital design of business processes. To exploit this potential, architectures, the demands placed on employees and traditional ways of thinking must be fundamentally rethought.
The focus of IT is shifting away from negotiating service levels towards composing a symphony of applications and partners.
In the next article in our short series, we will look at identifying valuable business cases for "digitization" with Hana.