Building ‘smart’ ERP demands continuous software intelligence

Key Takeaways

The introduction of Continuous Software Intelligence (CSI) in ERP systems is crucial for understanding and managing the complex relationships and dependencies within software, enabling quicker and more reliable innovation.

Manual mapping and traditional tools fall short in providing a dynamic and comprehensive understanding of evolving ERP systems, leading to static and inconsistent insights that hinder effective modernization efforts.

Changes in ERP systems not only affect the software itself but also impact data flow and analytics, necessitating agile data pipelines that can adapt to modifications while ensuring compliance and operational integrity.

Is it time to introduce ERPSI into the lexicon?

The practice of software intelligence (sometimes shortened to SI) is the examination of the structural condition of software assets, which may include applications, database structures, wider software frameworks/suites and source code.

In more granular terms, software intelligence is the process by which we decompose the structure of software and its constituent components, catalogue that information and then – typically – perform further analysis to look for relationships and structures between various sets of other software and components. Today we know that SI is needed because the software systems that developers are building and modifying are frequently large, interdependent and complex. When a developer fails to understand the downstream impacts of a change they are about to commit, the result is invariably unseen bugs that are only uncovered when the overall system is in use.
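
To make the idea concrete, here is a minimal sketch in Python using the open-source networkx library. The component names are invented for illustration and this shows the general technique, not any particular vendor’s implementation: dependencies become a directed graph, and ‘what could my change break?’ becomes a one-line graph query.

```python
# A minimal sketch of dependency cataloguing and impact analysis.
# Component names are invented; edge A -> B means "A depends on B",
# so changing B can affect A.
import networkx as nx

graph = nx.DiGraph()
graph.add_edges_from([
    ("invoice_ui",      "billing_service"),
    ("billing_service", "tax_calculator"),
    ("billing_service", "orders_table"),
    ("reporting_job",   "orders_table"),
])

def impact_of_change(component: str) -> set[str]:
    """Everything that directly or transitively depends on `component`."""
    return nx.ancestors(graph, component)

print(impact_of_change("orders_table"))
# {'billing_service', 'invoice_ui', 'reporting_job'}
```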

Enter ERPSI

An acronym that the technology industry should arguably have come up with by now is ERPSI (pronounced: erpsey) – ERP software intelligence – denoting the development of ERP suites with an inherent ability to exert software intelligence functions. The ERP arena is, very arguably, a perfect place to start analysing the internal relationships between different languages, frameworks, abstractions, architectures, data models and underlying (infrastructural) software services. So how should it work?

Software developers working to build enterprise-scale ERP software suites must maintain and evolve multifaceted systems which contain some software components that are often quite old (the ERP acronym itself dates back to the early 1990s).

Advancing and modernising these complex systems without ‘breaking’ them is increasingly difficult, if not impossible in some cases. This challenge is made even more daunting given that much of the software comprising ERP suites is shifting to microservices and being scattered across cloud providers. For software developers tinkering away at their ERP suites, SI tools should perhaps be mandatory to help them navigate that complexity.

But ERPSI doesn’t come in a can; this is not plug-and-play technology. So what shape does it take?

For Edwin Gnichtel, CTO of CodeLogic, the most important aspect of any such technology is the ability to comprehensively store and reference the relationships and dependencies between these elements of software. In practice, this makes SI in the ERP space highly logical, but best applied via a systematic and considered approach.

Known for its work with continuous software intelligence technology, the CodeLogic team insists that we need to properly analyse and understand the potential impacts likely to result when building or modifying any software inside an ERP system.

Naturally variegated and interwoven ERP

This complex reality is underlined by the naturally variegated and interwoven nature of ERP software, which may span from systems of record to expense management, to user experience analysis and onward into field service management, and everything in between. However, argues Gnichtel, deep analysis of these kinds of systems is not reasonably possible in any manual, human-directed fashion.

“While large-scale efforts to manually map ERP software and other systems are common objectives of software modernisation projects, the result is a time-limited, static understanding of an evolving system, usually of highly inconsistent fidelity and accuracy,” says Gnichtel.

Even domain-specific SI tools, such as application performance monitoring (APM) software, fail to peer deeply enough into how the software is actually running.

Above ‘as-built’ documentation

Outside of such projects, attempts to comprehensively catalogue the highly detailed ‘as it is built’ documentation that ERP software generates are generally limited, with intelligence and analysis tools often siloed into areas such as static source code analysis, security profiling tools and APM systems. From CodeLogic’s perspective, this results in disconnected sets of relationship and dependency data, again of highly inconsistent fidelity.

“Looking at the application of these (basic) tools to the software within ERP systems, they provide ‘degrees’ of SI at best, but a second generation of comprehensive, unifying platforms is required to bridge the gaps between these systems and end ineffective manual discovery and documentation practices,” says Gnichtel.

To understand why comprehensive profiling, aggregation and analysis are required, we have to get to grips with the scale of the problem. A typical enterprise application consists of millions of components and relationships – in graph terminology, nodes and edges. CodeLogic advocates a move to what it calls Continuous Software Intelligence (CSI). This more dynamic approach enables ERP providers to visually map out the networked relationships between all their software components and store them in a mechanically coherent form.

This, in turn, is argued to help ERP software developers innovate more quickly and more reliably. It also gives the enterprise users of those ERP software suites greater confidence in the robustness of the solutions that are, in many cases, running their businesses. Only once those relationships are mapped does the true scale of software complexity become apparent.
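
To make ‘a mechanically coherent form’ slightly more tangible, here is a hedged sketch in which the same nodes-and-edges model is persisted in an ordinary SQLite database and queried rather than read off a diagram. The schema, kinds and names are assumptions for the sake of illustration, not CodeLogic’s actual data model.

```python
# Nodes-and-edges persisted so relationships can be queried mechanically.
# Schema and names are illustrative assumptions only.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE node (
        id   INTEGER PRIMARY KEY,
        kind TEXT NOT NULL,   -- 'class', 'table', 'endpoint', ...
        name TEXT NOT NULL UNIQUE
    );
    CREATE TABLE edge (
        source INTEGER NOT NULL REFERENCES node(id),
        target INTEGER NOT NULL REFERENCES node(id),
        kind   TEXT NOT NULL  -- 'calls', 'reads', 'writes', ...
    );
""")
conn.executemany(
    "INSERT INTO node (kind, name) VALUES (?, ?)",
    [("endpoint", "/invoices"), ("table", "orders")],
)
# Ids 1 and 2 follow from the insertion order above.
conn.execute("INSERT INTO edge (source, target, kind) VALUES (1, 2, 'reads')")

# Which endpoints read the `orders` table?
for row in conn.execute("""
    SELECT n1.name FROM edge
    JOIN node n1 ON n1.id = edge.source
    JOIN node n2 ON n2.id = edge.target
    WHERE n2.name = 'orders' AND edge.kind = 'reads'
"""):
    print(row[0])   # /invoices
```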

“While these systems capture and organise information with speed and accuracy beyond that which is possible with manual or ad hoc methods, the power in such systems comes not merely from having highly detailed information; it’s the ability to rely on the CSI system to provide analysis of the data, provide focussed actionable information and allow users of the system to quickly profile for impact,” explains Gnichtel.

Blunt truths

Bluntly put, breaking code – and the often ugly unintended consequences of changing anything in a complex system like ERP software – has become one of the biggest impediments to innovation and change in this space today.

Gnichtel says that in the past, abstraction models were implemented to simplify interactions between software components, or to ease complexity when building new functionality. Increasingly, though, abstractions are being implemented for the sole purpose of fault or change isolation; the theory being that it is better to wrap new code around old code than to risk breaking unknown things further down the stack.
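
That ‘wrap new code around old code’ approach is essentially an adapter seam. A minimal sketch follows, with a hypothetical LegacyTaxEngine standing in for an old, poorly understood ERP component; the names and the flat 20% logic are invented purely for illustration.

```python
# Change isolation by wrapping: new code depends on the seam, never on
# the legacy component directly. All names here are hypothetical.
class LegacyTaxEngine:
    def calc(self, amt, cc, yr):          # opaque legacy signature
        return amt * 0.2                  # stand-in for old logic

class TaxService:
    """The seam: callers use this, not LegacyTaxEngine."""
    def __init__(self, engine=None):
        self._engine = engine or LegacyTaxEngine()

    def tax_for(self, amount: float, country: str, year: int) -> float:
        # Named, typed parameters shield callers from the legacy API;
        # if the engine is ever replaced, only this method changes.
        return self._engine.calc(amount, country, year)

print(TaxService().tax_for(100.0, "GB", 2024))  # 20.0
```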

“If the software development industry in general, and ERP software in particular, doesn’t get serious about CSI and accept that modern software requires a systematic, automated approach to capturing and understanding complexity, software will eventually be unable to move forward,” asserts Gnichtel.

Taking an equally holistic but perhaps slightly less fatalistic view of this space, Nick Jewell has seen his fair share of environments where customers have come close to breaking complex ERP systems, or have at least introduced new DNA into a software system that ultimately turned out to be an unwanted mutation.

As senior director for product marketing at data analytics platform company Incorta, Jewell has worked with customers as they update their ERP software suites and applications, a process that normally involves changes to configuration or customisation. Configuration involves optimising basic system functions and components using a framework supplied by the ERP vendor, often in the form of extension tables and well-defined integrations or software ‘hooks’ such as APIs.
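
As a purely illustrative sketch of that configuration route, the snippet below mimics a vendor-supplied hook point that customer code plugs into without touching core source. The on_before_save decorator, the entity and the extension-field names are hypothetical, not any real ERP vendor’s API.

```python
# Hypothetical vendor hook registry: customer logic attaches at a
# well-defined extension point instead of editing core ERP source.
_hooks: dict[str, list] = {}

def on_before_save(entity: str):
    """Vendor-supplied decorator: register a hook for an entity type."""
    def register(fn):
        _hooks.setdefault(entity, []).append(fn)
        return fn
    return register

# --- customer configuration, kept outside the vendor's core code ---
@on_before_save("purchase_order")
def require_cost_centre(record: dict) -> None:
    # Validates a field held in a customer extension table.
    if not record.get("x_cost_centre"):
        raise ValueError("purchase_order needs x_cost_centre")

# --- vendor core calls the hooks at a well-defined point ---
def save(entity: str, record: dict) -> None:
    for hook in _hooks.get(entity, []):
        hook(record)
    print(f"saved {entity}: {record}")

save("purchase_order", {"id": 1, "x_cost_centre": "CC-42"})
```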

Fragile modifications

“Customisation introduces new features or options that were not present in the original deployment. This process is generally more expensive and time-consuming, but it has more potential to differentiate how the company operates. Making changes to operational source code when deviating from the core ERP product introduces future risks to the deployment, especially around handling upgrades in this new, modified environment,” says Jewell.

Companies today clearly have some extra reasons to consider using ERP software intelligence platforms to augment their technical capabilities and to mitigate the risk of unexpected outages resulting from new code or configuration changes. However, the impact of changes to an ERP system’s configuration or source code has the potential to go beyond the software itself. So then, is there a knock-on impact on the health and wealth of ERP data and the analytics we perform on it?

“Naturally, any change to an enterprise software deployment at this level will also change the data that flows from within the application (‘the system of record’) into analytical systems that exist downstream in the organisation’s data flow (‘the systems of insight’). These changes can have dramatic or unintended consequences on how the data is captured, transformed or presented to end-users and decision-makers,” says Jewell.

In most legacy environments, ERP data flows into systems of insight through a complex series of data pipelines: landing in a raw data zone (often known as a data lake), being progressively refined and shaped in enterprise data warehouses, and finally being summarised and aggregated into subject-aligned data marts, which business users access through traditional BI tools.
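
The fragility of that chain is easier to see in miniature. Below is a toy sketch of the layered flow under invented field names; each stage hardcodes assumptions about the ERP’s structure, which is exactly what an upstream change invalidates.

```python
# Toy version of raw zone -> warehouse -> mart. Field names invented.
raw_zone = [{"ORDER_ID": "1", "AMT": "120.0", "CCY": "GBP"}]  # as landed

def refine(rows):
    """Warehouse layer: typed, conformed records."""
    return [{"order_id": int(r["ORDER_ID"]),
             "amount": float(r["AMT"]),
             "currency": r["CCY"]} for r in rows]

def summarise(rows):
    """Mart layer: aggregates shaped for one business subject."""
    totals = {}
    for r in rows:
        totals[r["currency"]] = totals.get(r["currency"], 0.0) + r["amount"]
    return totals

print(summarise(refine(raw_zone)))   # {'GBP': 120.0}
# Every ERP change upstream must be propagated through each layer here.
```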

“Any change to the core ERP system will mean data pipelines will need updates too. This may mean adding new data attributes and relationships, or ensuring that changing ERP logic is reflected in the data structures that are presented to analysts, data scientists and other data consumers,” explains Incorta’s Jewell.
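
One way to act on that advice, sketched here under assumed column names, is to compare the attributes actually arriving from the ERP system against the set the pipeline was built for, flagging drift before it silently breaks downstream marts.

```python
# A hedged schema-drift check for an ERP-fed pipeline. Column names
# and the x_discount_code customisation are invented for illustration.
EXPECTED = {"order_id", "amount", "currency", "created_at"}

def check_schema_drift(incoming_row: dict) -> None:
    actual = set(incoming_row)
    added   = actual - EXPECTED   # new ERP attributes the marts don't expose yet
    missing = EXPECTED - actual   # attributes a change may have dropped or renamed
    if added or missing:
        raise RuntimeError(f"schema drift: added={added}, missing={missing}")

# An ERP customisation added `x_discount_code` upstream:
try:
    check_schema_drift({"order_id": 1, "amount": 9.9, "currency": "GBP",
                        "created_at": "2024-01-01",
                        "x_discount_code": "SPRING"})
except RuntimeError as err:
    print(err)   # schema drift: added={'x_discount_code'}, missing=set()
```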

Manageable data-compliant change

Analytics teams can be caught off-guard by changes in these large source systems, leaving end-users unable to access or analyse new attributes to drive value in their work. Organisations in regulated industries especially need to ensure that their business processes remain governed and reconcilable, so they can prove compliance. 

The word from Jewell, then, is clear enough: to achieve manageable, data-compliant change in an ERP system, an organisation’s data pipeline must be as agile as its software.

“Ideally, you should minimise the presence of data pipelines themselves where possible, opting instead for real-time analytics against raw operational data derived from the ERP environment. With this approach, any changing attributes, structures or relationships can be seamlessly integrated into downstream analytics along with tracking the history of those changes, without adding technical complexity.”
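
As a minimal sketch of that approach – with DuckDB used here purely as an illustrative stand-in, not as Incorta’s technology – analysts query the raw operational structure directly, so a new ERP attribute becomes queryable as soon as it lands, with no intermediate pipeline to rework.

```python
# Querying raw operational rows directly instead of maintaining
# lake -> warehouse -> mart pipelines. Table and column names invented.
import duckdb

con = duckdb.connect()  # in-memory database
con.execute("""
    CREATE TABLE erp_orders AS
    SELECT * FROM (VALUES
        (1, 'GB', 120.0),
        (2, 'DE',  80.0),
        (3, 'GB',  40.0)
    ) AS t(order_id, country, amount)
""")

# Analysts hit the raw structure directly; a new ERP attribute would be
# queryable as soon as it lands, with no pipeline rework in between.
print(con.execute("""
    SELECT country, SUM(amount)::DOUBLE AS revenue
    FROM erp_orders GROUP BY country ORDER BY revenue DESC
""").fetchall())   # e.g. [('GB', 160.0), ('DE', 80.0)]
```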

If we fail to reach this higher level of software intelligence for ERP systems architecture and engineering management, then we a) risk making software engineers live in perpetual fear of change, b) potentially create more brittle ERP systems going forward and c) fail to coin a swanky acronym like ERPSI – and nobody wants any of these things to happen.