Beyond batch, how ERP steers faster with data streaming


Data moves. All data at some point in its life has moved from, or moved to, one location or another. Be it from a database to an application, between a cloud-based repository and an IoT device, or simply between internal services inside an operating system and its associated connection points, data is pretty much always on the move.

Even when data has come to rest and resides in longer-term storage, it will typically have been processed through some form of transport mechanism at one stage or another. These core truisms mean that we often focus on the cadence of data and examine its ability to get to where it needs to be in a timely manner.

If we think back to pre-millennial times, data services and the users that depended on them often had to plan around “nightly builds” if they wanted their data backbone to serve them with the most up-to-date information. This was (and in fact still is) the era of batch processing, also sometimes known as workload automation or job scheduling.

According to data integration and enterprise cloud platform company Tibco, batch processing is a cost-effective way to process huge amounts of data in a small amount of time.

“A good example of batch processing is how credit card companies do their billing. When customers get their credit card bills, it isn’t a separate bill for each transaction; rather, there is one bill for the entire month. That bill is created using batch processing. All the information is collected during the month, but it is processed on a certain date, all at once,” notes Tibco, in a technical briefing document.
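The billing example Tibco describes can be reduced to a few lines of code. This is a minimal sketch (the function name and data shape are illustrative, not any vendor's API): transactions accumulate all month, then a single job processes them in one pass.

```python
from collections import defaultdict

def run_monthly_billing(transactions):
    """Batch job: aggregate a month's transactions into one bill per customer.

    `transactions` is a list of (customer_id, amount) pairs collected
    throughout the month; nothing is billed until the job runs.
    """
    bills = defaultdict(float)
    for customer_id, amount in transactions:
        bills[customer_id] += amount
    return dict(bills)

# Transactions accumulate all month...
month = [("alice", 42.50), ("bob", 10.00), ("alice", 7.25)]
# ...and are processed on the billing date, all at once.
print(run_monthly_billing(month))  # {'alice': 49.75, 'bob': 10.0}
```

The defining trait is the deferred trigger: the data exists long before the processing happens, which is exactly the latency real-time streaming sets out to remove.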


Beyond batch, into real-time

But the world now moves faster than batch; we all work in an always-on universe where smartphones have to provide timely services in seconds and enterprise applications have to serve us with continuously integrated services and connectivity. This is the age of real-time data.

Key among the enabling technologies in this space is Apache Kafka. Tracing its roots back to 2011, the technology came out of an internal project by software engineers at LinkedIn to create a messaging queue. From those “basic” messaging queue beginnings, the Apache Kafka data streaming platform can now handle millions of messages per second – which, at the largest deployments, adds up to trillions of messages per day.

This technology goes above and beyond Kafka – Jay Kreps, Confluent

While pure Apache Kafka remains open source with the operational management responsibilities left in the hands of the user, Confluent provides a fully managed cloud-native stream processing and analysis service. It’s a technology that Confluent co-founder and CEO Jay Kreps says “goes above and beyond Kafka” for modern real-time application use cases to power what he calls real-time decision-making, or at least the digital iteration of such a process.

What is data streaming?

Going back to software engineering school for a moment, we can define data streaming as a computing principle. It denotes an approach in which a time-ordered sequence of data passes through an application, component or service. Typically focused on the low-level data records generated inside any given IT system, data streaming services carry log entries relating to everything from a single keystroke to industrial machine instrumentation sensor readings.
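That definition – a time-ordered sequence of small records from many sources – can be sketched in a few lines. The record shape and source names below are illustrative assumptions, not any particular platform's format:

```python
from typing import Iterator, List, NamedTuple

class Record(NamedTuple):
    timestamp: float   # when the event occurred
    source: str        # e.g. "keyboard", "sensor-7"
    value: object      # the raw payload

def stream(records: List[Record]) -> Iterator[Record]:
    """Yield records in time order, as a streaming consumer would see them."""
    for rec in sorted(records, key=lambda r: r.timestamp):
        yield rec

# Each record is tiny on its own - a keystroke, a sensor reading -
# but the ordered sequence is what a stream processor consumes.
events = [
    Record(2.0, "sensor-7", 21.4),
    Record(1.0, "keyboard", "k"),
    Record(3.0, "sensor-7", 21.9),
]
for rec in stream(events):
    print(rec.timestamp, rec.source, rec.value)
```

The consumer never waits for a batch to close; it processes each record as it arrives in the ordered flow.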

As granular and diminutive as these records are, once aggregated they help paint a richly illustrative picture of what exactly is happening inside an IT deployment. This reality logically takes us towards ERP: data streaming analytics can provide us with a precision-engineered view of what is happening inside an enterprise software application or function. So just how do we apply this power?

“I’m stating the obvious here, but data is what keeps any organization on track – so [the] real-time access that data streaming promises should logically be viewed as a huge asset,” says Chris Gorton, SVP for EMEA north, south and emerging markets at data management platform company Syniti.

Real-time means nothing if you can’t trust the data being streamed – Chris Gorton, Syniti

“Not governing and ensuring data accuracy slows down business process operations, hinders the sales process and can leave organizations at risk of non-compliance.”

The Syniti man reminds us that consolidating key business operations through an ERP system untangles data, making it much easier to manage and clean, so that the whole business can be sure it is using the right information, at the right time. “An ERP system’s inherent automation factor reduces the opportunity for errors and it introduces a consistency that means the data produced and streamed can ensure compliance and collaboration across teams. It means that businesses can operate more efficiently, that big sale can be closed more quickly… and those difficult decisions can be made more confidently,” adds Gorton.


The ascent of events

Many in the data business have been pushing for wider adoption of these admittedly quite deep-dive technologies for a while. Keith Rigg, VP for technology at Jitterbit, openly states that his team has been advocating event-based messaging “for some time” now. His team favors real-time patterns when it takes solutions to market, an approach that reflects the company’s data integration platform, which automates workflows and boosts productivity through hyperautomation.

“There are a number of things which have happened over time,” says Rigg. “There’s too many organizations out there still relying on batch overnight scheduled jobs to take a clone of their ERP or transactional data into a data warehouse, before then reporting on outdated information – for example from an integration perspective, things being scheduled every hour or, you know, twice a day.”

He says that this inconvenient truth means there’s always a latency. There’s always the opportunity for different systems in the business to disagree with one another. So there isn’t a single version of the truth within the business. For example, finance could be saying something different to customer services. Equally, customer services could be accepting an order because it appears there’s still credit available – but finance has already put them on a credit stop because they’ve not paid an invoice.
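Rigg's credit-stop scenario is worth making concrete. The sketch below (names and structures are illustrative, not any product's API) contrasts a stale nightly snapshot with an event-driven view of the same customer record:

```python
# Two views of the same customer record: a nightly snapshot (batch)
# and a live view updated as each event arrives.

live_state = {"acme": {"credit_ok": True}}
snapshot = {k: dict(v) for k, v in live_state.items()}  # last night's batch copy

def apply_event(event):
    """Event-driven path: the live view changes the moment the event lands."""
    if event["type"] == "credit_stop":
        live_state[event["customer"]]["credit_ok"] = False

# Finance puts Acme on a credit stop during the day...
apply_event({"type": "credit_stop", "customer": "acme"})

# ...but customer services, reading the stale snapshot, would still
# accept the order. The two departments now disagree about the truth.
print(snapshot["acme"]["credit_ok"])    # True  (stale batch view)
print(live_state["acme"]["credit_ok"])  # False (real-time view)
```

Until the next batch run reconciles the snapshot, the business is operating on two versions of the truth – precisely the latency gap Rigg describes.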

“Now, those problems may have existed historically because the organization itself wasn’t mature enough to automate and have a single virtual system and a single view of the truth. Or in many cases, the technology barriers were such that it simply wasn’t possible,” explains Rigg.

It was very difficult to get to that happy place – Keith Rigg, Jitterbit

The Jitterbit VP urges us to fast-forward to where we are today. He suggests that now (and, to be honest, probably for the last five or even 10 years) the technology hasn’t been the barrier. Rather, it has been people’s awareness that those technical barriers have been removed – or their being at the right point in an investment life cycle to say: it’s now time for an upgrade or a replacement, and we can transition to this new way of working.

For the last half-decade, Jitterbit itself has advocated event-based messaging to existing customers and prospects alike. Using its API gateway as a conduit into the firm’s own integration platform, the company then makes sure that data is moved across the enterprise in real-time – or, it admits, near real-time. The foundational building blocks of the company’s “listener framework” give it a streaming capability at the base level of the platform. The Jitterbit message queue gives users the ability to persist messages and guarantee delivery from a source system to a target system.
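The persist-and-guarantee pattern described here is general, and a toy version of it fits in a short class. This is a minimal sketch of at-least-once delivery – not Jitterbit's actual implementation – where a message survives on disk until the consumer acknowledges it:

```python
import json
import os
import tempfile

class PersistentQueue:
    """Toy message queue: messages are persisted before delivery and are
    redelivered until the consumer acknowledges (at-least-once delivery)."""

    def __init__(self, path):
        self.path = path
        self.pending = []
        if os.path.exists(path):               # a restart re-reads unacked messages
            with open(path) as f:
                self.pending = json.load(f)

    def _flush(self):
        with open(self.path, "w") as f:
            json.dump(self.pending, f)

    def publish(self, message):
        self.pending.append(message)
        self._flush()                          # durable before we report success

    def deliver(self):
        # Hand out the oldest unacknowledged message; redelivered until acked.
        return self.pending[0] if self.pending else None

    def ack(self):
        # Consumer confirms processing; only then is the message dropped.
        self.pending.pop(0)
        self._flush()

path = os.path.join(tempfile.mkdtemp(), "queue.json")
q = PersistentQueue(path)
q.publish({"order": 1001})
print(q.deliver())          # {'order': 1001} - still pending until acked
q.ack()
print(q.deliver())          # None - delivery confirmed
```

Because the message is flushed to disk before `publish` returns, a crash between publish and acknowledgment loses nothing: a new instance over the same file picks up where the old one left off.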


Into the (data) logging business

As we embrace data streaming at a wider number of application touchpoints across the enterprise, we will inevitably be faced with a more complex data landscape overall. It makes sense therefore for data streaming specialists to develop ERP skills and, equally, for ERP vendors and integration partners to develop data streaming skills.

Moving forward then, we need to realize that the actual streaming of real-time data is generally the more straightforward part of the data management process. The complexity comes from then interpreting that data in a way that is actionable and adds value to the business, whilst ensuring that the authenticity of the data collected can be trusted. This is the opinion of Bob De Caux, in his role as VP for AI and Automation at IFS.

“The most efficient way of streaming data is when it is implemented as an event log that is connected to various sources. Understanding what this streamed data means in a business context allows data to be connected to the meaningful aspects of assets within an organization where value can ultimately be added.

Businesses aren’t just looking at a single stream of data anymore – Bob De Caux, IFS

“There is now technology that can handle multivariate anomaly detection – pulling data together and analyzing it from multiple streams and event logs,” says De Caux.
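Multivariate anomaly detection is simply the idea that several streams, viewed together, reveal outliers that no single stream would. A deliberately simple sketch (production systems use far richer statistical models than this z-score distance):

```python
import math

def multivariate_anomalies(streams, threshold=3.0):
    """Flag time steps whose combined z-score distance across all
    streams exceeds `threshold`. Illustrative only.

    `streams` maps a stream name to an equal-length list of readings.
    """
    names = sorted(streams)
    n = len(streams[names[0]])
    # Per-stream mean and standard deviation (std of 0 is mapped to 1.0).
    stats = {}
    for name in names:
        xs = streams[name]
        mean = sum(xs) / n
        var = sum((x - mean) ** 2 for x in xs) / n
        stats[name] = (mean, math.sqrt(var) or 1.0)
    anomalies = []
    for i in range(n):
        # Euclidean distance in z-score space, combining every stream.
        dist = math.sqrt(sum(
            ((streams[name][i] - stats[name][0]) / stats[name][1]) ** 2
            for name in names))
        if dist > threshold:
            anomalies.append(i)
    return anomalies

# A simultaneous jump in temperature and vibration stands out jointly.
readings = {"temperature": [20] * 9 + [40], "vibration": [1] * 9 + [5]}
print(multivariate_anomalies(readings))  # [9]
```

The point De Caux makes is visible in the distance calculation: each stream contributes one dimension, so a deviation that is modest in any single stream can still be flagged when several streams deviate together.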

Speaking from direct experience with the IFS customer base, De Caux says that by observing how end-users interact with an application – and the sensor data that streams in – business processes can be analyzed and automated using AI. Essentially, he explains, this creates a feedback loop that allows these processes to become progressively more automated. Where a human element remains within the business process, there is a need for a strong level of explainability: the end-user needs to understand what they’re signing off on, and the outputs from those automated decisions must be presented in a way that a human can interpret.

“If a business process is automated and something does go wrong down the line, there will need to be an auditable log that shows compliance with any future AI and automation regulation. Having access to real-time data streams puts businesses in a position to improve their processes, deliver better customer experiences and make more informed decisions – however, this can only happen if the data being used is secure and authentic,” concludes IFS’ De Caux.

Dreaming the data streaming dream has arguably become an integral part of the way the most advanced enterprise IT systems are now developing. Streaming evangelists like to talk about the “death of batch”, but that’s (again, arguably) because it sounds snappy and makes for a good keynote headline or conference breakout session. There will be an element of batch data inside most IT departments of any size, principally because data types are so multifarious and diverse, but the real-time ERP stream has moved far beyond being any kind of pipe dream.