Building a Robust AI Strategy: The CFO’s Guide to Leveraging APIs and Data

In today’s unpredictable, technology-driven world, the office of the CFO plays a crucial role not only in generating financial value but also in bringing new opportunities to modernize the business, including understanding the role AI can play in streamlining operations.

We’ve been watching the AI paradigm shift unfold for several years but have yet to see its potential fully realized. Generative AI has made a notable impact on business today, with the world’s largest companies each putting their own spin on it. We’re also seeing companies debate how best to keep data safe and ensure AI is used responsibly. Companies like Meta are staking their claim that AI models need to be open sourced, while others argue that closed, proprietary models are the better path.

We’ve seen that CFOs have adopted AI largely to automate repeatable tasks and enhance data analysis capabilities for strategic liquidity forecasting – who can argue with AI’s ability to analyze extensive datasets covering sales, expenses, and cash flow trends? However, it appears that we as a society, and especially in finance, are not entirely ready for adoption just yet. In fact, McKinsey recently released a report on how CFOs are thinking about the future, which found that “Only a fifth of respondents said they were using generative AI, and of that group, about half (49%) said their AI projects were in the pilot or experimental phase.”

Why such a meager uptake? Several key foundations need to be put in place first.

One crucial yet often overlooked point is that a robust data strategy is indispensable to a successful AI strategy. Prioritizing data and APIs is essential before the full potential of AI can be harnessed. Put simply, there is no way to build robust AI on data that is not properly searchable.

Today’s AI discussions frequently focus on generative AI, large language models and tools such as ChatGPT. While these innovations present exciting use cases, the future of research in our field points towards the integration of generative AI with tabular data and time series analysis. This emerging trend is particularly relevant to our work, offering promising breakthroughs in areas such as liquidity forecasting and the optimization of financial resources.
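
To make this concrete, here is a minimal, hypothetical sketch of the kind of time series input such models consume: a single exponential smoothing forecast over an illustrative series of daily net cash flows. The figures and the smoothing approach are chosen purely for illustration; they are not a description of any production forecasting model.

```python
# Minimal sketch: a naive liquidity forecast from a daily net cash-flow series.
# All figures are illustrative; a production model would use far richer tabular
# features (counterparty, category, payment terms) and a proper time-series or
# ML model rather than simple exponential smoothing.

def exponential_smoothing_forecast(series, alpha=0.3, horizon=5):
    """Forecast the next `horizon` values with single exponential smoothing."""
    level = series[0]
    for value in series[1:]:
        level = alpha * value + (1 - alpha) * level
    # Single exponential smoothing projects the last smoothed level forward.
    return [round(level, 2)] * horizon

# Hypothetical daily net cash flows (inflows minus outflows, in thousands).
daily_net_cash_flow = [120.0, 95.5, 130.2, 88.7, 110.4, 102.3, 97.8]

print(exponential_smoothing_forecast(daily_net_cash_flow))
```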

The world of liquidity performance has entered the age of big data. We already observe significant growth in the volume, velocity and variety of data coming through our liquidity performance platform, including increasingly granular and detailed bank statements, payment files, financial transactions and invoices. Digital businesses are processing orders of magnitude larger volumes of data, and the rest of the economy is rapidly catching up.

So what does this mean? If you want your systems to achieve true liquidity performance and you face a high volume of transactions, cash flows and related data, here is what you need:

  • Big data processing technology. You need a system that can process big data alongside transactional data, with the right compute capabilities for format mapping, search indexation and the training of machine learning algorithms. You also need to trust your technology partner to protect your sensitive customer data, with full control over the data processors.
  • A platform that will help you process quality data. It’s important to work with a solution that can connect with all business stakeholders in order to clean and normalize data enterprise-wide (see the sketch after this list). For instance, if your company has grown by acquisition, you might have disjointed platforms and data and need a transversal solution that can connect with all your ERPs and banking partners.
  • Data access adapted to your IT environment. Customers have different needs, which is why it is important to offer both analytics as a service and standard data sharing with a customer’s internal business intelligence systems. Whether you’re looking for external support or want to build your own solutions, having a technology partner that empowers you is critical, so you don’t need to rely on large IT requests.
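
To illustrate what cleaning and normalizing data enterprise-wide can look like in practice, the sketch below maps bank statement records arriving in two hypothetical layouts onto one common schema. The field names, formats and values are invented for the example; real bank and ERP feeds are far more numerous and varied.

```python
# Minimal sketch: normalizing bank statement records that arrive in different
# shapes into one common schema. Field names and formats here are hypothetical;
# real bank and ERP feeds are far more varied.
from datetime import datetime

COMMON_SCHEMA = ("account_id", "value_date", "amount", "currency", "description")

def normalize_bank_a(record: dict) -> dict:
    """Bank A (hypothetical) sends ISO dates and signed decimal amounts."""
    return {
        "account_id": record["acct"],
        "value_date": datetime.strptime(record["date"], "%Y-%m-%d").date(),
        "amount": float(record["amt"]),
        "currency": record["ccy"],
        "description": record.get("memo", ""),
    }

def normalize_bank_b(record: dict) -> dict:
    """Bank B (hypothetical) sends US-style dates, amounts in cents and a debit/credit flag."""
    sign = -1 if record["dc_flag"] == "D" else 1
    return {
        "account_id": record["account_number"],
        "value_date": datetime.strptime(record["posting_date"], "%m/%d/%Y").date(),
        "amount": sign * record["amount_cents"] / 100,
        "currency": record["currency_code"],
        "description": record.get("narrative", ""),
    }

# Once normalized, records from every source can be indexed, searched and fed
# to analytics or machine learning in a single, consistent format.
raw_a = {"acct": "FR7612345", "date": "2024-03-01", "amt": "-1250.00", "ccy": "EUR", "memo": "Supplier payment"}
raw_b = {"account_number": "00998877", "posting_date": "03/01/2024", "amount_cents": 530000, "dc_flag": "C", "currency_code": "USD"}

for row in (normalize_bank_a(raw_a), normalize_bank_b(raw_b)):
    print({key: row[key] for key in COMMON_SCHEMA})
```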

To embrace the future of liquidity performance with an integrated, intelligent and intuitive platform, and to take advantage of AI, you need an in-platform data warehouse that gives you visibility into all transactions and assets on the balance sheet.

With data coming from banks and ERPs in thousands of different formats and over various protocols, maintaining consistency becomes a significant challenge for AI applications. We are committed to overcoming this challenge by normalizing the data, ensuring that all these varied data points are aligned to provide uniform and reliable information. It’s important that users can handle high transaction volumes effortlessly with robust big data processing and leverage cutting-edge AI through a state-of-the-art machine learning operations platform. Open APIs and big data connectors ensure seamless integration with your existing systems. Self-service controls let your team manage data flows and analytics to meet your specific needs, and provide actionable insights from operational and financial analytics, all without needing an in-house data team.
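
As an illustration of the open API integration pattern, the sketch below pulls transactions from a placeholder REST endpoint into a local analysis step. The base URL, parameters and response shape are hypothetical stand-ins, not a documented API; they only show how normalized data can flow into a team’s own tooling.

```python
# Minimal sketch: pulling normalized transactions from an open REST API into a
# local analysis step. The endpoint, parameters and response shape below are
# hypothetical placeholders that illustrate the integration pattern only.
import requests

API_BASE = "https://api.example-treasury-platform.com/v1"  # placeholder URL
API_TOKEN = "REPLACE_WITH_YOUR_TOKEN"

def fetch_transactions(account_id: str, start_date: str, end_date: str) -> list[dict]:
    """Fetch transactions for one account over a date range (hypothetical API)."""
    response = requests.get(
        f"{API_BASE}/accounts/{account_id}/transactions",
        headers={"Authorization": f"Bearer {API_TOKEN}"},
        params={"from": start_date, "to": end_date},
        timeout=30,
    )
    response.raise_for_status()
    return response.json()["items"]

if __name__ == "__main__":
    transactions = fetch_transactions("ACC-001", "2024-03-01", "2024-03-31")
    # Feed the normalized records into the BI tool or model of your choice.
    total = sum(t["amount"] for t in transactions)
    print(f"{len(transactions)} transactions, net movement {total:,.2f}")
```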

It’s clear that AI is still very much in its infancy, especially in the world of finance. As this technology develops, it’s critical that CFOs lead the way in helping their companies navigate the current landscape, balancing short-term management with long-term value creation. While most finance teams are just beginning their journey with machine learning and generative AI, these emerging technologies offer a significant opportunity for CFOs and the finance sector to boost their effectiveness and efficiency, but first they must ensure they have a data strategy in place.

Jean-Baptiste Gaudemet, SVP Data & Analytics at Kyriba