The global and independent platform for the SAP community.

The future belongs to data streaming pipelines

To facilitate access to their data, most IT departments centralize as much information as possible.
E3 Magazine
June 10, 2024
This text has been automatically translated from German to English.

They typically use point-to-point data pipelines to move data between operational databases and a centralized data warehouse or data lake. ETL (extract, transform, and load) pipelines, for example, ingest data, transform it in regular batches, and forward it to a downstream analytical data warehouse. Reverse ETL pipelines, in turn, send the results of data analyses performed in the warehouse back to operational databases and applications.
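The batch ETL pattern described above can be sketched in a few lines of Python. All names here (the order records, the aggregation, the stand-in "warehouse") are illustrative assumptions, not a real system:

```python
# Minimal batch-ETL sketch: extract rows, transform them in one batch,
# then load the results into a stand-in "warehouse" table.

def extract():
    # In practice: read from an operational database.
    return [
        {"order_id": 1, "amount_eur": 120.0, "country": "DE"},
        {"order_id": 2, "amount_eur": 80.0, "country": "AT"},
        {"order_id": 3, "amount_eur": 200.0, "country": "DE"},
    ]

def transform(rows):
    # Batch transformation: aggregate revenue per country.
    totals = {}
    for row in rows:
        totals[row["country"]] = totals.get(row["country"], 0.0) + row["amount_eur"]
    return totals

def load(totals, warehouse):
    # In practice: write to an analytical data warehouse.
    warehouse.update(totals)

warehouse = {}
load(transform(extract()), warehouse)
print(warehouse)  # {'DE': 320.0, 'AT': 80.0}
```

Note that nothing reaches the warehouse until the whole batch has been extracted and transformed — the latency and rigidity the article attributes to this style.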

Even though companies today often operate dozens to hundreds of point-to-point data pipelines, more and more IT managers are coming to the conclusion that point-to-point and batch-based data pipelines are no longer fit for purpose. Older pipelines are usually not very flexible and are perceived as "black boxes" by developers, as they cannot be adapted and are difficult to transfer to other environments. When operational processes or data need to be adapted, data developers therefore avoid changing existing pipelines. Instead, they add more pipelines and the associated technical debt. The bottom line is that traditional ETL pipelines require too much computing power and storage space, which can lead to scaling and performance issues as well as high operational costs as data volumes and requirements increase.

Data streaming pipelines are a modern approach to providing data as a self-service product. Instead of sending data to a centralized warehouse or analytics tool, data streaming pipelines can capture changes in real time, enrich them in the flow and send them to downstream systems. Teams can use their own self-service access to process, share and reuse data wherever and whenever it is needed. 
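The per-event flow described above can be contrasted with the batch approach in a short sketch. The change feed, event shapes, and downstream consumer are hypothetical stand-ins:

```python
# Minimal streaming-pipeline sketch: each change event is enriched
# in flight and handed to downstream consumers one at a time,
# instead of being collected into a batch first.

def change_stream():
    # Stand-in for a change-data-capture feed from an operational DB.
    yield {"customer_id": 7, "event": "order_created", "amount_eur": 50.0}
    yield {"customer_id": 7, "event": "order_created", "amount_eur": 30.0}

def enrich(event, customers):
    # Enrichment happens in the flow, per event.
    enriched = dict(event)
    enriched["segment"] = customers.get(event["customer_id"], "unknown")
    return enriched

customers = {7: "enterprise"}  # illustrative reference data
downstream = []  # stand-in for any number of subscribing systems

for event in change_stream():
    downstream.append(enrich(event, customers))

print(downstream[0]["segment"])  # enterprise
```

Because each event is processed as it arrives, downstream teams see changes immediately and can subscribe to the same stream independently — the self-service reuse the article describes.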

In contrast to conventional pipelines, data streaming pipelines can be defined in declarative languages such as SQL: developers specify which operations are required, and the engine determines how to execute them, which avoids much unnecessary operational work. This approach helps maintain the balance between centralized continuous observability, security, policy management, and compliance standards on the one hand, and the need for easily searchable and discoverable data on the other.
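To illustrate the declarative style, the pipeline logic below is a single SQL statement stating what to compute rather than how. Streaming SQL engines (e.g. Flink SQL or ksqlDB) run such queries continuously over streams; Python's built-in sqlite3 is used here purely as an assumption-free way to show the form:

```python
# The "pipeline" is the SQL query itself; the engine decides execution.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (country TEXT, amount_eur REAL)")
conn.executemany(
    "INSERT INTO orders VALUES (?, ?)",
    [("DE", 120.0), ("AT", 80.0), ("DE", 200.0)],
)

# Declarative pipeline logic: revenue per country.
rows = conn.execute(
    "SELECT country, SUM(amount_eur) FROM orders "
    "GROUP BY country ORDER BY country"
).fetchall()
print(rows)  # [('AT', 80.0), ('DE', 320.0)]
```

In a streaming engine the same query would emit updated aggregates as new order events arrive, instead of running once over a static table.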


Working on the SAP Basis is crucial for a successful S/4 conversion.

This gives the Competence Center strategic importance for existing SAP customers. Regardless of the S/4 Hana operating model, topics such as automation, monitoring, security, application lifecycle management, and data management form the basis for S/4 operations.

For the second time, E3 magazine is organizing a summit for the SAP community in Salzburg to provide comprehensive information on all aspects of S/4 Hana groundwork. All information about the event can be found here:

SAP Competence Center Summit 2024


Event Room, FourSide Hotel Salzburg,
Am Messezentrum 2,
A-5020 Salzburg

Event date

June 5 and 6, 2024

Regular ticket:

€ 590 excl. VAT


Event Room, Hotel Hilton Heidelberg,
Kurfürstenanlage 1,
69115 Heidelberg

Event date

February 28 and 29, 2024


Regular ticket:

€ 590 excl. VAT
The organizer is E3 magazine and its publishing house. The presentations will be accompanied by an exhibition of selected SAP partners. The ticket price includes attendance at all lectures of the Steampunk and BTP Summit 2024, access to the exhibition area, participation in the evening event, and catering during the official program. The lecture program and the list of exhibitors and sponsors (SAP partners) will be published on this website in due course.