Businesses today are inundated with vast amounts of data, and harnessing it to derive actionable insights has become critical to staying competitive.
Until recently, operational data stores, enterprise data warehouses and business-specific data marts, armed with a suite of reporting and dashboarding tools, were enough to produce pre-canned, function-specific reports for business insights.
Yet, with the advent of market trends such as GenAI, Big Data, cross-enterprise operations and self-service analytics, businesses are being pushed to build capabilities like common unified business data models, real-time insights, multi-platform scalable architectures and access to open ecosystems.
Subsequently, different architectural patterns and approaches, like data lakes, lakehouses, data mesh and data fabric, have emerged. What we’re seeing, as a result, is a growing trend toward adopting modern data platforms on the cloud and embracing an open data and AI ecosystem that caters to ever-evolving business appetites.
SAP has, of course, been updating its data and AI architecture to drive innovation for its customers. However, this growing enterprise technology trend presents a dilemma for SAP customers – whether to align with SAP’s direction on data and AI strategy or embrace a cloud-native approach with other vendors to build modern data and AI platforms.
The evolution of data and AI architecture in SAP
SAP has a rich history of delivering analytics solutions, beginning with its Business Intelligence (BI) and Business Warehouse (BW) offerings. Initially, SAP BW and SAP BusinessObjects provided traditional platforms for business warehousing, analytics and reporting.
Over time, SAP broadened its portfolio to encompass advanced analytics capabilities and AI-driven solutions through platforms like SAP HANA and SAP Leonardo. These developments have enabled organizations to unlock the full potential of their data, driving insights and innovation across a range of business functions.
Migrating towards cloud-native solutions
In recent years, there has been a shift towards cloud-native architectures for data and AI, with cloud platforms like AWS, Azure and GCP offering a rich ecosystem of services to scale and flex to modern data needs.
Customers are also exploring services such as Snowflake and Databricks to build modern data platforms, warehouses, lakehouses and AI platforms in the cloud. This cloud-native approach provides organizations with agility and access to open data and AI ecosystems, enabling faster innovation and time-to-market.
The dilemma
On one hand, SAP’s offerings continue to evolve, with solutions like SAP Datasphere (formerly SAP Data Warehouse Cloud) aiming to provide a unified approach to data management and analytics, potentially replacing Business Warehouse. On the other hand, the allure of other vendors’ cloud-native solutions is undeniable, offering not only agility and scalability but also ease of integrating and processing both SAP and non-SAP data.
SAP customers find themselves at a critical juncture – how to align their data strategy to meet business needs and do so cost-effectively.
Comparing SAP Datasphere with cloud-native solutions
SAP Datasphere, SAP’s business data fabric offering, aims to provide a comprehensive solution for managing, integrating and analyzing data across hybrid and multi-cloud environments. It offers features such as data virtualization, data governance and advanced analytics capabilities.
However, compared to cloud-native solutions, SAP Datasphere can be a costly option for enterprises that need to bring in large amounts of external data, and it may face challenges in agility, scalability and integration with non-SAP systems. SAP is well aware of the challenge of ingesting non-SAP data and is building partnerships with cloud providers to integrate SAP Datasphere with their platforms through a federated architecture.
Cloud providers like AWS, Microsoft Azure and Google Cloud Platform offer a more open approach to ingesting, processing, storing and governing data on their platforms. They provide modern data platform services, such as Amazon Redshift and S3, Azure Synapse Analytics and ADLS, and Google BigQuery and GCS, for storing and analyzing data.
Furthermore, these cloud providers offer comprehensive services to build data lake, lakehouse, data mesh and data fabric architectures. A common challenge for SAP customers is how to ingest SAP data into these platforms, and they have several options:
They can use SAP tools (depending on the licenses they hold) such as SAP Data Intelligence, SLT (SAP Landscape Transformation Replication Server), SAP Data Services and SAP Datasphere, or cloud-native tools such as Amazon AppFlow, Azure Data Factory and Google’s BigQuery Connector for SAP (which is based on SLT). A custom extraction pipeline is also an option, as the sketch below illustrates.
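For teams that build their own ingestion path instead of, or alongside, these tools, a common lightweight pattern is to read from an SAP OData service and land the result in cloud object storage. The following Python sketch assumes a hypothetical SAP Gateway host, OData service, entity set and S3 bucket, and uses basic authentication purely for illustration; it is not a reference implementation of any of the products named above.

```python
# Minimal sketch: pull records from an SAP OData service and land them in Amazon S3.
# Host, service path, entity set, bucket and credentials are placeholders, not real endpoints.
import json

import boto3
import requests

SAP_HOST = "https://sap-gateway.example.com"        # hypothetical SAP Gateway host
SERVICE = "/sap/opu/odata/sap/ZSALES_ORDERS_SRV"    # hypothetical OData service
ENTITY_SET = "SalesOrderSet"                        # hypothetical entity set


def extract_sap_odata(username: str, password: str) -> list[dict]:
    """Fetch all records from the OData entity set as JSON (OData v2 envelope)."""
    url = f"{SAP_HOST}{SERVICE}/{ENTITY_SET}?$format=json"
    response = requests.get(url, auth=(username, password), timeout=60)
    response.raise_for_status()
    return response.json()["d"]["results"]


def land_to_s3(records: list[dict], bucket: str, key: str) -> None:
    """Write the extracted records to S3 as newline-delimited JSON."""
    body = "\n".join(json.dumps(record) for record in records)
    boto3.client("s3").put_object(Bucket=bucket, Key=key, Body=body.encode("utf-8"))


if __name__ == "__main__":
    rows = extract_sap_odata("SAP_USER", "SAP_PASSWORD")
    land_to_s3(rows, bucket="raw-sap-landing", key="sales_orders/extract.json")
```

In practice this kind of job would be scheduled by an orchestrator (for example, a managed workflow service on the chosen cloud) and would add incremental extraction and error handling; the sketch only shows the basic extract-and-land flow.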
Cloud-agnostic data platforms such as Snowflake and Databricks offer a modular and flexible approach to building data and AI platforms. They integrate with a wide range of cloud services, support open standards and allow the use of third-party tools and services. Additionally, these platforms employ pay-as-you-go pricing models, enabling organizations to scale resources with demand and optimize costs. The challenge of ingesting SAP data into these platforms persists, however, and once the data lands it still needs to be conformed for analytics, as the sketch that follows shows.
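Once SAP extracts have landed in cloud storage, a typical next step on a platform like Databricks is to conform them into an open table format for downstream analytics and AI. The PySpark sketch below assumes Parquet extracts in a hypothetical S3 path, a few illustrative SAP field names (VBELN, NETWR, AUDAT) and a Delta-enabled Spark environment; it shows the pattern, not a prescribed pipeline.

```python
# Minimal PySpark sketch: load SAP extracts (landed as Parquet in cloud storage)
# into a Delta table for downstream analytics. Paths and table names are placeholders.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("sap-orders-to-delta").getOrCreate()

# Read the raw extract produced by an upstream ingestion tool or custom pipeline.
raw = spark.read.parquet("s3://raw-sap-landing/sales_orders/")

# Light conformance: rename SAP field names and cast the document date.
orders = (
    raw.withColumnRenamed("VBELN", "sales_order_id")   # assumed SAP field names
       .withColumnRenamed("NETWR", "net_value")
       .withColumn("order_date", F.to_date("AUDAT", "yyyyMMdd"))
)

# Persist as a Delta table so BI and AI workloads query one governed copy.
(orders.write
       .format("delta")
       .mode("overwrite")
       .saveAsTable("analytics.sap_sales_orders"))
```

The same conformed data could equally be loaded into Snowflake or queried through a warehouse engine; the point is that the modelling and governance effort sits on the open platform rather than inside a proprietary reporting layer.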
A strategic SAP course
Navigating the evolving data and AI challenge no doubt requires an assessment of an organization’s existing landscape and a thoughtful evaluation of available options that align with its data strategy.
Whether opting for an SAP-centric, cloud-native, or hybrid approach, organizations must ensure that their data and AI architecture aligns with their business objectives, enabling them to remain competitive in an increasingly data-driven world.
About the author: Mayank Madan is the head of Data and Analytics Practice, EMEA, at Lemongrass.