ERP Platform Mining


The strategic realignment of ERP marks a paradigm shift with profound business, technical and organizational implications for the entire SAP community, one that analysts, DSAG user representatives and partners follow with both criticism and hope. It is a roadmap from the stable but rigid on-prem world to a dynamic cloud universe, challenging in both licensing and technology, in which SAP is no longer the sole fixed star, but one sun in a system of hyperscalers, data specialists such as Databricks and Snowflake, and powerful consulting firms.
At the heart of this ERP transformation is the realization that the digital core of the existing SAP customer must be kept clean - the so-called clean core strategy. To increase the speed of innovation and ensure the maintainability of an S/4 system or the new Business Suite, modifications and enhancements are rigorously banished from the core and moved to the SAP Business Technology Platform (BTP). The BTP acts as the technical centerpiece, a "PaaS" (Platform-as-a-Service) offering that bundles development, integration, automation, data management and artificial intelligence.
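The clean core principle can be made concrete with a toy sketch. All names here are invented for illustration - this is not SAP code: the "core" only emits a business event, and the custom logic lives in a side-by-side extension that consumes it, leaving the core untouched.

```python
# Toy illustration of the clean-core principle (all names are hypothetical,
# not SAP APIs): the core stays unmodified and only emits events; custom
# logic lives in a side-by-side extension, e.g. deployed on BTP.

def core_create_order(order_id: str, net_amount: float) -> dict:
    """Standard core logic - never modified."""
    return {"event": "OrderCreated", "order_id": order_id, "net": net_amount}

def loyalty_extension(event: dict) -> dict:
    """Custom enhancement running outside the core."""
    bonus = round(event["net"] * 0.02, 2)  # 2% loyalty credit, illustrative
    return {"order_id": event["order_id"], "loyalty_bonus": bonus}

event = core_create_order("4711", 1000.0)
enriched = loyalty_extension(event)
print(enriched)  # {'order_id': '4711', 'loyalty_bonus': 20.0}
```

An upgrade of the core replaces only `core_create_order`; the extension keeps working as long as the event contract is stable - which is exactly the maintainability argument behind clean core.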
SAP BTP is the place where the future of composable ERP can take shape, while the "traditional" ERP core in the form of S/4 becomes the stable system of record. But this brave new world of composable ERP brings massive technical and organizational challenges. SAP BTP is not a monolithic block - some say not even a platform - but a heterogeneous construction kit of runtime environments such as Cloud Foundry, Kyma (Kubernetes) and the Abap environment, which resonates particularly strongly with long-standing SAP customers and became known under the code name "Steampunk".
CAP, RAP and governance
For existing SAP customers, this means not only mastering new programming models such as the Cloud Application Programming Model (CAP) and the RESTful Application Programming Model (RAP), but also establishing a completely new form of governance so as not to lose control over the variety of ERP services and cloud offerings (SAP and hyperscalers).
SAP's data strategy, which culminated in the announcement of the SAP Business Data Cloud (BDC), is being discussed particularly critically in the community. The BDC is intended to orchestrate and harmonize the data chaos that has resulted from the fragmentation and atomization of IT landscapes and SAP's numerous acquisitions (such as SuccessFactors, Ariba, Concur). But the road to this goal is paved with the ruins of past attempts.
For a long time, SAP tried to solve data integration problems with the SAP Data Hub product, but the concept often failed in practice due to excessive resource consumption, technical complexity and a lack of cost-effectiveness. The SAP Data Warehouse Cloud, which was later renamed SAP Datasphere, was positioned as the successor to enable a "business data fabric" - an architecture that no longer physically copies data, but links it virtually and semantically, regardless of where it is located.
Business Data Cloud
The Business Data Cloud (BDC) is now the latest strategic superstructure that combines Datasphere, SAP Analytics Cloud and - this is the real sensation - deep integration with external partners such as Databricks and Snowflake. This is where the opening up of SAP's strategy becomes clearest, but also the admission that it can no longer do everything alone in the area of big data and AI.
The partnership with Databricks, a leading provider of data lakehouse architectures, is a central pillar of the BDC. SAP data is to be integrated seamlessly into the Databricks platform via "zero-copy connectivity via Delta Sharing", so that it can be processed there with powerful AI and machine learning algorithms without the data technically leaving the SAP ecosystem or being duplicated. Critics, however, complain that the BDC in its current form still leaves questions unanswered and sometimes acts as a marketing wrapper around existing products, as key issues regarding licensing and the technical depth of integration remain to be clarified.
Alongside Databricks, Snowflake plays a crucial role in modern data management for existing SAP customers. Although Snowflake is in some ways a competitor to SAP's own data warehouse solutions, many users recognize the benefits of Snowflake's cloud-agnostic architecture. SAP has responded and established partnerships to enable bidirectional data exchange. Snowflake positions itself as the platform that breaks down data silos and merges SAP data with non-SAP data in a high-performance cloud environment - often seen as an alternative to a pure SAP Datasphere strategy, especially when companies pursue a multi-cloud approach.
Google Cloud is also a powerful player in this network. The partnership between SAP and Google goes far beyond pure infrastructure (IaaS). With the open data offering, both companies are aiming to combine SAP data and Google data (e.g. from Google BigQuery) without replication in order to enable AI scenarios. This is vital for SAP's survival, as its own AI ambitions (Joule) are heavily dependent on the computing power and models of hyperscalers.
The SAP "clean core" concept is crucial for the transition from ECC 6.0 to S/4 Hana. Over the years, numerous Abap modifications have optimized and individualized SAP ERP systems, often at the expense of compatibility and upgradeability. For the consistent further development of ERP (cloud and on-prem), a new orchestration concept for standard and modifications is needed: clean core is the start.
New SAP partner system
Collibra and DataRobot complement this ecosystem as specialized partners for data governance and advanced AI to fill the gaps that SAP has in its own portfolio, especially in the area of governance of heterogeneous data landscapes and operationalization of AI models.
In this complex environment, the major consulting firms and system integrators act as indispensable navigators.
Companies such as Accenture, Deloitte, Capgemini, PwC and EY have built up massive capacities to accompany the transformation to BTP and BDC. Accenture, for example, uses its deep partnership with SAP to develop industry solutions on BTP and implement the clean core strategy at large corporations, often integrating open source components from partners such as Red Hat into the architecture.
Application and advice
PwC not only has an advisory function, but also uses S/4 Public Cloud and BTP internally for its own global organization, which gives it a high level of credibility in its advisory services. PwC uses Joule Copilot and Business AI to optimize its own processes and resells this experience. Capgemini focuses strongly on the integration of S/4 into complex, hybrid landscapes and develops its own frameworks to accelerate the transition.
Atos and Cognizant are also positioning themselves strongly in the area of managed services and the transformation of legacy systems to the new SAP platform, often bridging the gap between the old on-prem world and the new cloud reality. Bluetree Solutions, specializing in planning and analytics, plays a role in the implementation of financial planning and consolidation solutions that are increasingly based on the SAP Analytics Cloud and Datasphere, even if they operate at a more granular level compared to the "Big 4".
The role of SAP Datasphere in comparison to BDC and the partners can be differentiated as follows: Datasphere is the technical product, the evolution of the data warehouse in the cloud. It provides the tools for data modelling, virtualization and cataloguing. BDC, on the other hand, is the overarching SAP solution concept that uses Datasphere as a core component, but is expanded into a comprehensive "data fabric" through the integration of Databricks and other services (such as SAP Analytics Cloud).
While Datasphere attempts to retain semantic sovereignty over SAP data (business context), Databricks and Snowflake provide the raw computing power and open standards for data science and big data analytics that SAP cannot deliver in this depth. The transition from the failed Data Hub to Datasphere and finally to BDC is often painful for existing SAP customers and requires a migration of mindset: away from monolithic ETL (Extract, Transform, Load) towards federation and virtualization of data.
For this transition to succeed, companies need to fundamentally rethink their data architecture and understand Datasphere not just as a new BW, but as a logical layer above a distributed data landscape. However, the sword of Damocles of licensing and commercial terms hangs over all these technical visions. The licensing challenges for BTP and BDC are immense and are causing a great deal of resentment in the SAP community. The RISE with SAP model effectively forces existing customers into a subscription in which they give up ownership of their licenses and are reduced to tenants.
Classic data management has a long tradition at SAP. The connection between SAP Datasphere and the SAP Business Data Cloud (BDC) concept can be described as an evolutionary and hierarchical relationship. In short: SAP Datasphere is the technological centerpiece and the primary data management engine within the higher-level, strategic solution bundle known as the SAP Business Data Cloud.
Platform metric: FUE
The Full Use Equivalent (FUE) metric, which governs user licensing in the cloud and often appears opaque, draws particular criticism. There are complex billing models for SAP BTP, such as the Cloud Platform Enterprise Agreement (CPEA) or the newer BTP Enterprise Agreement (BTPEA), in which customers pay for "credits" in advance that can expire at the end of the year if unused. This creates high pressure to actually consume the services and makes the total cost of ownership (TCO) difficult to calculate. Another minefield is indirect access (digital access), which arises when external systems - for example via SAP BTP or third-party interfaces - access the digital core of S/4 Hana and generate documents.
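The arithmetic behind the FUE metric can be sketched in a few lines. The conversion ratios used below (one advanced user equals 1 FUE, five core users equal 1 FUE, thirty self-service users equal 1 FUE) are commonly cited values for S/4 Hana Cloud; the exact ratios depend on the individual contract.

```python
# Sketch of the FUE (Full Use Equivalent) arithmetic. The ratios are
# commonly cited values (1 advanced = 1 FUE, 5 core = 1 FUE,
# 30 self-service = 1 FUE); actual contracts may define them differently.
FUE_RATIO = {"advanced": 1.0, "core": 1 / 5, "self_service": 1 / 30}

def required_fue(users: dict) -> float:
    """Total FUEs needed for a given mix of user types."""
    return sum(count * FUE_RATIO[kind] for kind, count in users.items())

mix = {"advanced": 50, "core": 200, "self_service": 300}
print(required_fue(mix))  # 50*1 + 200/5 + 300/30 = 100.0
```

The opacity criticized in the community arises less from this arithmetic than from the question of which real user ends up classified in which category.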
There is also a lack of a clear cloud exit strategy. Once you have fully committed to Rise, BTP and BDC, you enter a vendor lock-in from which there is virtually no escape, as the data and logic (especially for developments on BTP) are closely interwoven with an SAP infrastructure. Although the EU Data Act attempts to intervene in a regulatory manner and enforce portability, the technical reality lags far behind the legal requirements.
Hybrid platform strategy
The SAP community is therefore faced with the paradoxical situation of needing modern platforms such as BTP and BDC in order to remain innovative (AI, automation, integration) on the one hand, but being forced into a commercial corset that restricts flexibility and autonomy on the other. The response of many existing customers is a hybrid strategy: they use SAP for the core (clean core), but are increasingly outsourcing innovation and data analysis to neutral platforms or hyperscalers in order to manage the dependency. At the SAP community's Steampunk and BTP Summit on April 22 and 23 in Heidelberg, an alternative data platform will be presented and discussed, among other things. The IT company Boomi offers integration scenarios and platforms, including Agentic AI, which are compatible with SAP ERP/ECC 6.0 and S/4 Hana.
The business challenges of SAP BTP lie primarily in justifying the costs. The platform is considered expensive, and the business case for migrating from established on-prem integration solutions (such as SAP PI/PO) to the Integration Suite on BTP is often difficult to demonstrate if you only consider the pure operating costs. Boomi can also provide adequate answers to this question at Summit 2026. The added value must be argued in terms of agility, faster time-to-market and the use of AI services (such as the GenAI Hub).
From a technical perspective, SAP BTP is powerful, but also complex. The large number of services (over 90) and the different environments require a broad skillset that is often not available internally and has to be bought in at great expense. Organizationally, the platform strategy is forcing companies to restructure their IT departments: away from pure SAP Basis administrators towards cloud architects and DevOps engineers who can manage a dynamic PaaS environment. Naturally, these are also tasks for the CCoE (Customer Center of Expertise).
The Competence Center Summit on CCoE and SAP for Me will be held in Salzburg on June 10 and 11, 2026. In both Heidelberg and Salzburg, SAP partner Snap will be offering an AI experience workshop to explore the possibilities of AI on SAP platforms.
The Customer Center of Expertise (CCoE) plays a decisive, changing role in this structure. It must evolve from a pure operations monitor to an orchestrator of the hybrid landscape. It must ensure governance of the BTP services, monitor costs (credits) and guarantee compliance with security standards in the cloud. Without a strong CCoE, there is a risk of a proliferation of applications and costs on the BTP, which would negate the benefits of the platform.
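A minimal sketch of the cost-governance role described above: the CCoE projects the annual credit burn per BTP subaccount and flags those heading over budget. The data model and numbers are invented; real consumption figures would come from BTP usage reports.

```python
# Toy cost-governance check a CCoE might run: flag BTP subaccounts whose
# credit burn rate projects over the annual budget. (Illustrative data
# model and numbers, not an SAP API.)
def projected_annual_burn(spent: float, months_elapsed: int) -> float:
    """Linear extrapolation of credits spent so far to a full year."""
    return spent / months_elapsed * 12

def over_budget(subaccounts: dict, budget: float) -> list:
    return [name for name, (spent, months) in subaccounts.items()
            if projected_annual_burn(spent, months) > budget]

usage = {"dev": (10_000, 6), "prod": (80_000, 6), "sandbox": (2_000, 6)}
print(over_budget(usage, budget=100_000))  # ['prod'] projects to 160k
```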
Steps in the data migration process and responsibilities for a new implementation: only master data and open items are transferred from the legacy systems. Some open documents must be processed before or after go-live. Cutover phases with posting locks or double maintenance make this possible. The S/4 Migration Cockpit supports the migration; historical data is not migrated.
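The scoping rule - master data and open items only, no history - amounts to a simple filter. The object model below is a toy illustration, not the Migration Cockpit's data structures.

```python
# Illustrative scoping filter for a new implementation: master data and
# open transactional items are selected; closed (historical) documents
# stay in the legacy system. (Toy model, not the S/4 Migration Cockpit.)
legacy_objects = [
    {"id": "CUST-1", "kind": "master", "status": None},
    {"id": "ORD-10", "kind": "transaction", "status": "open"},
    {"id": "ORD-11", "kind": "transaction", "status": "closed"},  # history
    {"id": "INV-20", "kind": "transaction", "status": "open"},
]

def migration_scope(objects):
    return [o for o in objects
            if o["kind"] == "master" or o["status"] == "open"]

print([o["id"] for o in migration_scope(legacy_objects)])
# ['CUST-1', 'ORD-10', 'INV-20']
```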
Organization and technology
In summary, it can be said that SAP's platform strategy with S/4 Hana, BTP and BDC is technically coherent and paves the way to the Intelligent Enterprise, but represents a massive hurdle for existing SAP customers. They not only have to modernize their technology, but also completely change their contracts, their organization and their way of thinking. Partners - be they hyperscalers, data specialists such as Databricks or major consultants - are indispensable helpers in this process, but they also benefit from this complexity.
The success of SAP's strategy will depend on whether it succeeds in taking existing customers along on this journey without losing them to license pressure and technical overload. The vision is clear: a modular, data-driven company that runs on a flexible platform - a composable ERP. However, the reality is still often a tough battle with migrations, costs and the search for the right architecture mix.
A detailed analysis of the Business Technology Platform reveals a duality of promise and obligation. SAP positions BTP as the indispensable link between the clean core of the ERP system and the innovative outside world. From a technical perspective, the Steampunk model (SAP BTP Abap Environment) enables Abap developers to carry their knowledge into the cloud without modifying the core of the S/4 system. This is essential for the upgradeability of cloud ERP solutions.
BTP licensing
However, technical freedom comes at a price: the licensing of BTP services is volatile. The pay-as-you-go model offers flexibility, but can lead to exploding costs if used uncontrollably, while the subscription model with fixed quotas harbors the risk of shelfware or - in the case of the BTPEA - leads to the expiry of credits at the end of the year.
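The commercial difference between the two models can be made concrete with a small calculation; the figures are invented for illustration.

```python
# Illustrative comparison of BTP commercial models (numbers invented).
# Prepaid credits not consumed by year-end expire (BTPEA-style), while
# pay-as-you-go bills only actual usage.
def prepaid_cost(credits_bought: float, credits_used: float) -> tuple:
    """Returns (cost, expired credits) - unused credits expire at year end."""
    expired = max(credits_bought - credits_used, 0.0)
    return credits_bought, expired

def payg_cost(usage: float, rate: float = 1.0) -> float:
    return usage * rate

cost, expired = prepaid_cost(credits_bought=100_000, credits_used=60_000)
print(cost, expired)      # 100000 40000.0 -> 40k of shelfware
print(payg_cost(60_000))  # 60000.0 -> pay only for actual usage
```

The flip side, not visible in this toy calculation, is that pay-as-you-go rates per unit are typically higher than prepaid rates, which is exactly the trade-off companies have to model.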
Business Data Cloud attempts to solve the historical problem of SAP data management. SAP data used to be trapped in the proprietary formats of the applications. With BW and later BW/4 Hana, SAP created powerful data warehouses, but these often ended up as monolithic silos. The SAP Data Hub was an attempt to place an orchestration layer on top of these silos, but failed due to the complexity of container orchestration and the lack of performance with large volumes of data.
Datasphere, as the core of the BDC, now takes the path of federation: data remains where it is and is linked virtually. But even here there are limits, especially in performance for complex joins across system boundaries. This is precisely where the partnership with Databricks comes in: zero-copy sharing (based on Delta Sharing) allows the data to be analyzed in Databricks' data lakehouse without physical duplication and without losing the semantic meaning from the SAP context. This is a strategic admission by SAP that specialized providers are often more efficient in big data analytics.
The role of DataRobot and Collibra fits seamlessly into this picture. DataRobot introduces automated machine learning functions (AutoML) that enable business users to train AI models on SAP data without in-depth data science knowledge. Collibra addresses the massive problem of data governance in distributed landscapes. When data is scattered across S/4, Datasphere, Databricks and various hyperscalers, users quickly lose track of lineage, quality and access authorizations. Collibra provides the necessary metadata layer to ensure compliance.
While classic SAP modeling often took place in a closed ecosystem, the Business Data Fabric is radically open. It seamlessly integrates data from SAP systems (S/4 and BW) with data from hyperscalers (Google BigQuery, AWS, Azure) and specialized platforms (Collibra, Databricks). BTP and BDC act as a strategic construct that connects the ecosystem modules to provide a unified view of enterprise data.
Portfolios of SAP partners
The major system integrators have adapted their portfolios accordingly. Accenture, for example, operates massive migration factories that industrialize the switch to S/4 and BTP in order to reduce costs. Deloitte is focusing heavily on business transformation (Kinetic Enterprise), where BTP serves as an enabler for new business models. Capgemini uses its multi-pillar S/4 architecture to show clients how they can integrate SAP and non-SAP solutions on BTP.
Atos and Cognizant contribute their strengths in infrastructure management to stabilize the operation of hybrid landscapes. Bluetree Solutions, often mentioned in the context of SAP analytics and planning, specifically supports customers in closing the gap between operational data and strategic planning in the SAP Analytics Cloud (SAC). The situation remains tense in terms of licensing law. The introduction of Rise-with-SAP has changed the license model. Existing customers are exchanging their "perpetual" on-prem licenses for a temporary subscription right. This is often sold with the promise of TCO reduction and innovation, but critics and the user association DSAG warn of the long-term cost increases.
The path to the IT platform economy with SAP is mapped out, but it is no walk in the park. It requires a clear architectural vision, robust license management and a willingness to cut out old habits (clean core). BTP and BDC are powerful tools, but they only unfold their value if they are not understood as pure technology, but as enablers for business innovation (end-to-end processes). Dependence on partners and hyperscalers will increase, making the management of vendor interfaces a core competence of IT departments and the CCoE. The SAP world is becoming more open, but also more complex and more expensive - a reality that every CIO must face up to.
S/4 conversion is a technical challenge, see graphic. However, it also requires a commitment to the existing customer's own ERP data: SAP Datasphere is the engine for data management, while the SAP Business Data Cloud is the entire vehicle, which also contains the cockpit (SAC) and the turbo (Databricks) to deliver a complete data and AI strategy for the company.
Business Data Fabric Platform and ontologies
Business Data Fabric itself fundamentally changes classic data modelling by shifting the focus from the physical consolidation and replication of data (as in classic data warehouses) to a virtual, semantically networked and decentralized data architecture. This paradigm shift addresses the limitations of traditional approaches, in which data was extracted from its source systems and often lost its business context in the process.
In traditional data modeling, data had to be extracted from operational systems using ETL processes (Extract, Transform, Load) and loaded into a central repository (data warehouse or data lake). This led to redundancies, latency times and high costs for data movement. The Business Data Fabric, on the other hand, relies on data federation and virtualization. Instead of physically moving data, it remains at its original location but is made virtually accessible via a uniform semantic layer.
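The contrast between an ETL copy and a federated view can be sketched in a few lines. The sources stand in for remote systems; none of this is a Datasphere API - the point is that the federated view resolves against the sources at read time instead of materializing a copy.

```python
import copy

# Toy contrast between ETL (physical copy) and federation (virtual view).
# The dicts stand in for remote source systems; not a Datasphere API.
erp = {"4711": {"customer": "ACME", "net": 1000.0}}
crm = {"ACME": {"region": "EMEA"}}

def etl_load(*sources):
    """Classic ETL: extract and materialize a physical copy."""
    return [copy.deepcopy(s) for s in sources]

def federated_order_view(order_id: str) -> dict:
    """Federation: resolve against the live sources at query time."""
    order = erp[order_id]
    return {**order, **crm[order["customer"]]}

print(federated_order_view("4711"))
# {'customer': 'ACME', 'net': 1000.0, 'region': 'EMEA'}

erp["4711"]["net"] = 1200.0                 # a change in the source system...
print(federated_order_view("4711")["net"])  # ...is visible immediately: 1200.0
```

An ETL snapshot taken before the change would still show the stale value - the latency and redundancy problem the Business Data Fabric is meant to avoid.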
A key technological factor here is "zero-copy" technology (e.g. via Delta Sharing in partnership with Databricks or Snowflake). It makes it possible to use and analyze SAP and non-SAP data together without having to create duplicates. A major problem with classic modeling was the loss of "business semantics" when data left the ERP system. Technical table names and relationships (e.g. currency conversions or hierarchies) were often separated during export, requiring complex recovery projects in the data warehouse.
The Business Data Fabric, implemented by SAP Datasphere, retains this business context. It uses a semantic layer that translates technical data structures into understandable business terms and actively manages relationships between data objects (e.g. order, customer, invoice). The SAP Datasphere Knowledge Graph further automates this by creating ontologies that represent relationships and context of data across the landscape, which helps AI models avoid hallucinations.
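The idea of such a semantic layer can be sketched as follows. The table names (KNA1, VBAK) follow real SAP conventions, but the mapping and the relationship graph are a toy illustration, not the Datasphere Knowledge Graph.

```python
# Sketch of a semantic layer: technical SAP table names are mapped to
# business terms, and relationships between objects are kept explicit.
# (KNA1/VBAK are real SAP table conventions; the structures here are
# illustrative, not the Datasphere Knowledge Graph.)
SEMANTIC_LAYER = {
    "KNA1": {"term": "Customer", "key": "KUNNR"},
    "VBAK": {"term": "Sales Order", "key": "VBELN"},
}
RELATIONS = [("VBAK", "placed_by", "KNA1")]  # sales order -> customer

def business_term(table: str) -> str:
    return SEMANTIC_LAYER[table]["term"]

def describe(relation) -> str:
    src, verb, dst = relation
    return f"{business_term(src)} {verb.replace('_', ' ')} {business_term(dst)}"

print(describe(RELATIONS[0]))  # Sales Order placed by Customer
```

Keeping this mapping machine-readable is what allows downstream tools - including AI models - to reason over "customers" and "orders" instead of cryptic table names.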
Traditional approaches were often monolithic and managed by a central IT department, which became a bottleneck. The Business Data Fabric supports the data mesh concept, which decentralizes responsibility for data, while BDC acts as the strategic construct holding this open ecosystem together.