
The future belongs to data streaming pipelines

To facilitate access to their data, most IT departments centralize as much information as possible.
E3 Magazine
June 10, 2024
This text has been automatically translated from German to English.

They typically use point-to-point data pipelines to move data between operational databases and a centralized data warehouse or data lake. ETL (extract, transform, load) pipelines, for example, ingest data, transform it in regular batches and forward it to a downstream analytical data warehouse. Reverse ETL pipelines, in turn, send the results of data analyses that take place in the warehouse back to operational databases and applications.
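
To make the pattern concrete, the following sketch shows what a minimal point-to-point batch ETL job of this kind could look like in Python. SQLite stands in for both the operational database and the warehouse, and the table and column names are illustrative assumptions rather than details from the article.

# Minimal batch ETL sketch: extract orders from an operational store,
# transform them, and load the result into an analytical store.
# SQLite stands in for both systems; table and column names are hypothetical.
import sqlite3

def run_batch_etl(source_path: str = "operational.db",
                  warehouse_path: str = "warehouse.db") -> None:
    source = sqlite3.connect(source_path)
    warehouse = sqlite3.connect(warehouse_path)

    # Extract: pull the raw rows from the operational table.
    rows = source.execute(
        "SELECT order_id, amount_cents, currency FROM orders"
    ).fetchall()

    # Transform: convert cents to a decimal amount and normalize the currency code.
    transformed = [
        (order_id, amount_cents / 100.0, currency.upper())
        for order_id, amount_cents, currency in rows
    ]

    # Load: write the batch into the downstream analytical table.
    warehouse.execute(
        "CREATE TABLE IF NOT EXISTS fact_orders "
        "(order_id INTEGER, amount REAL, currency TEXT)"
    )
    warehouse.executemany(
        "INSERT INTO fact_orders VALUES (?, ?, ?)", transformed
    )
    warehouse.commit()

    source.close()
    warehouse.close()

In practice, a job like this would be triggered in regular batches, for example by a scheduler or workflow orchestrator, which is exactly the batch cadence described above.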

Even though companies today often operate dozens or even hundreds of point-to-point data pipelines, more and more IT managers are concluding that point-to-point, batch-based data pipelines are no longer fit for purpose. Older pipelines are usually inflexible and are perceived as "black boxes" by developers, because they are hard to adapt and difficult to transfer to other environments. When operational processes or data need to change, data developers therefore avoid modifying existing pipelines; instead, they add new ones, along with the associated technical debt. The bottom line is that traditional ETL pipelines require too much computing power and storage, which leads to scaling and performance issues as well as high operating costs as data volumes and requirements grow.

Data streaming pipelines are a modern approach to providing data as a self-service product. Instead of sending data to a centralized warehouse or analytics tool, data streaming pipelines capture changes in real time, enrich them in flight and send them to downstream systems. Teams get self-service access to process, share and reuse data wherever and whenever it is needed.
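
As a rough sketch of this pattern, assuming a Kafka cluster at localhost:9092 and hypothetical topic names and fields, an in-flight enrichment step could look like this with the confluent-kafka Python client:

# Streaming pipeline step sketch: consume change events, enrich them
# in flight, and forward them to a downstream topic for self-service reuse.
# Broker address, topic names and the JSON fields are assumptions.
import json
from confluent_kafka import Consumer, Producer

consumer = Consumer({
    "bootstrap.servers": "localhost:9092",
    "group.id": "order-enrichment",
    "auto.offset.reset": "earliest",
})
producer = Producer({"bootstrap.servers": "localhost:9092"})

consumer.subscribe(["orders.changes"])  # change events from the operational database

try:
    while True:
        msg = consumer.poll(1.0)
        if msg is None or msg.error():
            continue

        event = json.loads(msg.value())
        # Enrich the change event while it is in flight.
        event["amount"] = event["amount_cents"] / 100.0
        event["currency"] = event["currency"].upper()

        # Publish the enriched event for any downstream consumer to pick up.
        producer.produce("orders.enriched", json.dumps(event).encode("utf-8"))
        producer.poll(0)
finally:
    consumer.close()
    producer.flush()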

In contrast to conventional pipelines, data streaming pipelines can be defined in declarative languages such as SQL. Because the required processing logic is specified up front, teams are spared unnecessary operational work. This approach helps balance centralized requirements, namely continuous observability, security, policy management and compliance standards, with the need for data that is easy to search and discover.
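
One possible illustration of this declarative style (the article does not name a specific engine) is a pipeline expressed as SQL and submitted from Python via Apache Flink's PyFlink Table API; the connector settings, topics and schema below are assumptions:

# Declarative streaming pipeline sketch: the pipeline is described as SQL,
# and the engine (here Apache Flink via PyFlink) derives the required operations.
# Kafka addresses, topic names and the schema are hypothetical.
from pyflink.table import EnvironmentSettings, TableEnvironment

t_env = TableEnvironment.create(EnvironmentSettings.in_streaming_mode())

# Source: a stream of raw order change events.
t_env.execute_sql("""
    CREATE TABLE orders_changes (
        order_id BIGINT,
        amount_cents BIGINT,
        currency STRING
    ) WITH (
        'connector' = 'kafka',
        'topic' = 'orders.changes',
        'properties.bootstrap.servers' = 'localhost:9092',
        'scan.startup.mode' = 'earliest-offset',
        'format' = 'json'
    )
""")

# Sink: the enriched stream offered to downstream teams as a data product.
t_env.execute_sql("""
    CREATE TABLE orders_enriched (
        order_id BIGINT,
        amount DOUBLE,
        currency STRING
    ) WITH (
        'connector' = 'kafka',
        'topic' = 'orders.enriched',
        'properties.bootstrap.servers' = 'localhost:9092',
        'format' = 'json'
    )
""")

# The pipeline itself: one declarative statement instead of hand-written glue code.
t_env.execute_sql("""
    INSERT INTO orders_enriched
    SELECT order_id,
           CAST(amount_cents AS DOUBLE) / 100.0 AS amount,
           UPPER(currency) AS currency
    FROM orders_changes
""")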

confluent.io


Working on the SAP Basis is crucial for a successful S/4 conversion.

This gives the Competence Center strategic importance for existing SAP customers. Regardless of the S/4 Hana operating model, topics such as automation, monitoring, security, application lifecycle management and data management form the basis for S/4 operations.

For the second time, E3 magazine is organizing a summit for the SAP community in Salzburg to provide comprehensive information on all aspects of S/4 Hana groundwork.

Venue

More information will follow shortly.

Event date

Wednesday, May 21, and
Thursday, May 22, 2025

Early Bird Ticket

Available until Friday, January 24, 2025
EUR 390 excl. VAT

Regular ticket
EUR 590 excl. VAT

Venue

Hotel Hilton Heidelberg
Kurfürstenanlage 1
D-69115 Heidelberg

Event date

Wednesday, March 5, and
Thursday, March 6, 2025

Tickets

Early Bird Ticket
Available until December 24, 2024
EUR 390 excl. VAT

Regular ticket
EUR 590 excl. VAT

The event is organized by the E3 magazine of the publishing house B4Bmedia.net AG. The presentations will be accompanied by an exhibition of selected SAP partners. The ticket price includes attendance at all presentations of the Steampunk and BTP Summit 2025, a visit to the exhibition area, participation in the evening event and catering during the official program. The lecture program and the list of exhibitors and sponsors (SAP partners) will be published on this website in due course.