We know that ERP systems process workloads, channel workflows, ingest information and make multifarious connections to systems of record, systems of transaction and wider systems of cloud-native enablement.
What all that means is – ERP data is generally on the move.
Aside from the portions of an enterprise’s data estate that are consigned to longer-term storage for regulatory purposes, legislative compliance and backup, a good chunk of ERP data is on the go, pretty much 24×7.
So much is this the case that now, increasingly, we are channelling some information streams into real-time data environments.
Hyperautomation-powered workflows
Having seen his fair share of ERP data flows move, speed up and gather ever greater cadence over the years is Keith Rigg, in his position as VP for technology at data integration platform and hyperautomation-powered workflow company Jitterbit.
Talking about how vendors have now variously added real-time data functionality into their platforms, Rigg reminds us that SAP has added streaming capabilities around messages that ‘set it apart’ (in his view) from other ERP systems. It’s something that his company has the capability to work with. So what else does Jitterbit think are the key factors impacting the world of data streaming in relation to ERP systems?
“Let’s take a step back and say that a lot of the experiences we have in the consumer world where we touch technology, be it an app or smart TV or whatever, often transfer into the business world – and this leads to an increased expectation of technology in the enterprise,” said Rigg.
It’s true. Why should technology be modern and great on a user’s smartphone, then feel like the 1990s or early 2000s once they step into the business world, where it really matters?
“It’s a silly situation. But you know, there’s a move towards consumption-based pricing in our world, you know, subscriptions. So it’s a switch from a CapEx to an OpEx model. The life cycle of these contracts could be days, could be weeks, could be months. So then, the short timeframes of those contract life cycles mean that you’ve got more transactions, more events occurring on a more regular basis for the same volume of business,” explained Rigg, suggesting that the move to data streaming is growing.
More events, more data
Historically, a user might have bought something once and kept it for five or 10 years; now they are renting it daily, weekly or monthly. A single purchase kept for five years becomes, at a daily rental cadence, something like 1,800 transactions over the same period. So there’s going to be an orders-of-magnitude increase in transactions for the same volume of business with the same number of customers or consumers.
This means – in computing terms – we have more events, more data and so it’s even more important that we have event-based streaming capabilities to move data not just around an organization, but actually across organizational boundaries more effectively.
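To make the event-based streaming idea a little less abstract, here is a minimal sketch of the publishing side, written against Apache Kafka with the kafka-python client. The broker address, topic name and event fields are illustrative assumptions, not any particular vendor’s integration.

```python
# A minimal sketch of event-based streaming: publishing ERP-style
# transaction events to a Kafka topic via the kafka-python client.
# Broker address, topic name and event schema are all assumptions.
import json
import time

from kafka import KafkaProducer

producer = KafkaProducer(
    bootstrap_servers="localhost:9092",  # assumed local broker
    value_serializer=lambda v: json.dumps(v).encode("utf-8"),
)

def publish_transaction(order_id: str, event_type: str, payload: dict) -> None:
    """Emit one business event the moment it happens, rather than
    batching it into an overnight export."""
    event = {
        "order_id": order_id,
        "event_type": event_type,  # e.g. "subscription.renewed"
        "payload": payload,
        "emitted_at": time.time(),
    }
    producer.send("erp.transactions", value=event)

publish_transaction("SO-1001", "subscription.renewed", {"plan": "monthly"})
producer.flush()  # ensure the event actually leaves the client
```

The same topic can be subscribed to by systems outside an organization’s own walls, which is what makes the pattern useful across organizational boundaries.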
“The same is true in the business world. It requires a lot of push notification capability, real-time streaming and the ability to act on those events as they move around. There are also things like just-in-time production, which of course ‘came on stream’ a long time ago for businesses in the supply chain world, retail and manufacturing for example. So if you want to make a last-minute change to an order (the volume, the spec, the particulars, especially if it’s a configurable product), how do you do that? You can’t be doing that offline. You can’t be doing it with a lag. And if production has already started, you’ve got the cost to the business of that stock sitting there, which may or may not be sold,” stated Rigg.
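On the receiving end, here is a sketch of what acting on those events might look like: a consumer that picks up last-minute order-change events and applies them while the order is still in a pre-production state. Again, the topic, group and field names are assumptions for illustration, not Jitterbit’s or SAP’s actual mechanics.

```python
# A sketch of acting on events in near real time: a consumer that
# listens for last-minute order changes and applies them before
# production starts. Topic, group and field names are assumptions.
import json

from kafka import KafkaConsumer

consumer = KafkaConsumer(
    "erp.order-changes",  # assumed topic
    bootstrap_servers="localhost:9092",
    group_id="production-scheduler",
    value_deserializer=lambda b: json.loads(b.decode("utf-8")),
)

def apply_change(change: dict) -> None:
    # Placeholder for the real business logic: amend volume, spec or
    # configuration on the open order.
    print(f"Amending order {change['order_id']}: {change['fields']}")

for message in consumer:
    change = message.value
    if change.get("status") == "pre-production":
        apply_change(change)  # still time to change volume or spec
    else:
        # Production has started; the change needs a different path
        # (rework, credit or stock write-down).
        print(f"Order {change['order_id']} is already in production")
```

The point of the pattern is that the decision happens as the event arrives, not after a nightly batch run.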
Hydrating data-driven models
The Jitterbit team says that one of the things a lot of customers ask for is this kind of data hydration functionality, i.e. moving all of an enterprise’s data somewhere, aggregating it, enriching it and having the data lake accessible so that the company can start to perform, not just report. With analytical assessments of that data, the business can start to move more towards a data-driven model.
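As a loose sketch of what that hydration step can look like in code, the example below takes raw ERP order events, enriches them against a customer reference table and lands the result as partitioned Parquet files that analytics tools can query. The library choice (pandas with pyarrow), paths and column names are all invented for illustration.

```python
# A loose sketch of data lake "hydration": take raw ERP events,
# enrich them with reference data, and land the result somewhere
# analytics tools can reach. Paths and column names are invented.
import pandas as pd

# Raw events as they might arrive from the ERP stream.
orders = pd.DataFrame({
    "order_id": ["SO-1001", "SO-1002"],
    "customer_id": ["C-17", "C-42"],
    "amount": [120.0, 75.5],
    "order_date": ["2024-05-01", "2024-05-01"],
})

# Reference data used to enrich each event.
customers = pd.DataFrame({
    "customer_id": ["C-17", "C-42"],
    "segment": ["enterprise", "smb"],
    "region": ["EMEA", "AMER"],
})

# Enrich, then write partitioned Parquet so downstream analytics
# can query by date without scanning the whole lake.
enriched = orders.merge(customers, on="customer_id", how="left")
enriched.to_parquet("datalake/orders", partition_cols=["order_date"])
```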
“But it goes beyond that. Coming back full circle now, and actually with a bit of a nod to edge computing, ChatGPT and the industry’s increasing awareness of Artificial Intelligence (AI). If we can have the data in the right place with the underpinning technology so that we can, you know, analyze it quickly enough that it doesn’t take months to crunch the numbers, but milliseconds, then you can run machine learning and AI so that you can do things like predictive failure alerts or get insights that then might influence the way you direct your online marketing spend,” said Rigg.
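To give a feel for that milliseconds-not-months claim, here is a toy sketch of scoring a freshly streamed sensor reading for failure risk with scikit-learn. The model, features, threshold and data are invented purely for illustration; a real deployment would use a properly trained model served close to the data.

```python
# A toy sketch of scoring streamed readings for predictive failure
# alerts. Model, features, threshold and data are all invented; a
# real system would load a production-trained model instead.
import numpy as np
from sklearn.linear_model import LogisticRegression

# Made-up historical readings: [temperature, vibration] -> failed?
X_train = np.array([[60, 0.20], [65, 0.30], [90, 0.90],
                    [95, 1.10], [62, 0.25], [88, 0.80]])
y_train = np.array([0, 0, 1, 1, 0, 1])

model = LogisticRegression().fit(X_train, y_train)

def score_event(temperature: float, vibration: float) -> None:
    """Score one streamed reading; the scoring itself takes milliseconds."""
    risk = model.predict_proba([[temperature, vibration]])[0, 1]
    if risk > 0.8:  # assumed alert threshold
        print(f"ALERT: failure risk {risk:.0%}")
    else:
        print(f"ok: failure risk {risk:.0%}")

score_event(91.0, 0.95)  # high readings, likely an alert
score_event(61.0, 0.22)  # normal readings
```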
Responding to consumer trends in real time with data streaming functions is an integral part of the way we’re now building always-on and on-demand services that depend on Continuous Integration & Continuous Deployment (CI/CD).
The real-time stream is no pipe dream and these technologies are already ‘enjoying’ deployment across the ERP, HCM, CRM and other spaces.