‘Glorified filing cabinet’: Cloud’s greatest failing

Why we need to move beyond the glorified filing cabinet  

As transformation initiatives accelerate, cloud continues to be a key enabler for organisations transitioning out of lockdown crisis mode to new hybrid working practices and recession-proofing strategies. As a result, cloud remains a major focus for investment: Gartner predicts that by 2026, public cloud spending will exceed 45 percent of all enterprise IT spending, up from less than 17 percent in 2021.

Mass cloud adoption combined with mass digitalisation has also had a direct knock-on effect on the volume of data generated. Today, every business is essentially a digital business, generating terabytes of data through its daily activities. From the websites and apps customers engage with when purchasing goods and services, to the IoT devices and sensors that track everything from agriculture, manufacturing and supply chain processes to the predictive maintenance of escalators, big data undoubtedly now lives in the cloud. From cloud-based data warehouses to data lakes, enterprises are turning to the cloud to manage and store massive amounts of data.

But while most businesses hold large volumes of data, many are unable to make use of it. With most business process datasets now dispersed across data lakes, data warehouses, and data silos that can’t talk to each other, profiling, preparing, and creating an insight-generating data pipeline has become a common challenge for many organisations.

Despite over a decade of hype and attention, the cloud has yet to reach its full business transformation potential. Cloud can mean several different things to different organisations. For some, cloud implementations provide additional data centres geographically close to where their users and data sit. This can save time by reducing data movement, while also meeting legal compliance requirements depending on the use case. For others, the cloud means taking full advantage of the ever-decreasing cost of data storage compared with the same storage on-premises. Unfortunately, many companies are still predominantly focussing on the second approach, and cloud adoption strategies remain centred – and siloed – around data storage.

Today’s storage costs are incredibly low and still falling, making the value proposition of creating and storing data an easy equation. But one of the cloud’s greatest benefits can rapidly develop into a problem.

With data growing at an exponential rate, businesses are already facing an overwhelming amount of fragmented raw data. Unable to unlock it for new opportunities to drive growth and create value, many are just hoarding it without a purpose or end goal in mind.  

 

Transitioning from data storage to insight generation  

As one of the critical drivers of business success, the importance of data cannot be overstated. However, clean, structured and normalised data is needed to fuel trustworthy and accurate insights. As the volatile business environment of the last couple of years has highlighted, the agility to make quick, accurate data-driven decisions is the difference between a successful business and a failed one.

Organisations and business executives need the right information to formulate action plans quickly. According to an Alteryx-commissioned survey by IDC released in February 2022, 62 percent of practitioners, and 75 percent of mid-to-upper management, are now expected to make agile and scalable data-driven decisions as part of their day-to-day work. The aim of any cloud strategy should always be to facilitate the end business result. Rather than just storing data, cloud strategies need to focus on making the best use of the greater hardware elasticity available to scale compute resources up and down when required.


The cloud combines nearly infinite storage with the elastic computing capacity needed to process and analyse data, and to automate processes, at speed and at scale – delivering a comprehensive modern data and analytics ecosystem. Effectively leveraging this scalable compute power is the key to using the big data already in cloud repositories. Building advanced analytics into your cloud strategy transitions the business away from ‘drowning in data but starving for insights’ towards analysing data from different platforms that would otherwise have remained siloed, and deriving insights from these hugely valuable resources.

 

Fast, scalable data analytics capabilities are within touching distance of every business

With so much data already in the cloud, it is far more efficient – in both time and money – to move the analytic processes to the cloud than to move terabytes of data to a single computer to be processed. However, only a fraction of today’s business data is accessed and analysed, because it is dispersed across different clouds and silos. In-cloud analytics – analysing the data where it lives – is key to unlocking the value within the vast volumes of data organisations are holding.

Cloud-based analytics’ biggest potential impact comes from combining data democratisation with enormous compute resources and analytical and data-handling power, delivering data-driven decision-making at scale. By democratising responsibility for data analytics, it helps every worker get the insights they need from huge amounts of data, so more people across the business can consume and analyse it and identify critical patterns and trends more easily.

Today’s evolving business landscape requires fast insights, built on trustworthy information and accurate decisions, for organisations to thrive. This high demand for data insight puts increased pressure on data analysts and business analysts to produce precise analysis with a rapid turnaround.

 

From data accessibility to data insights: the need to democratise data analytics for all

As business leaders think about analytics, they need to see it as a collaborative process across the whole organisation. It’s key to ensure that everyone from IT, to domain experts and business analysts, to data engineers and data scientists, can collaborate, develop models, solve problems and drive insights with data. If data is the fuel needed to drive business insights, then a data-literate, analytics-enabled workforce is the engine that powers this process, delivering new insights and innovations previously out of reach.

However, there simply aren’t enough people with data science skills to meet businesses’ demand for data analytics today, and many employees are still mired in unproductive work in inefficient legacy spreadsheets.


With only a fraction of today’s business data accessed and analysed, unlocking the value within the huge volumes of data organisations are holding requires a significantly expanded strategy – one that moves beyond the cloud’s stagnant use as a glorified and fiercely guarded filing cabinet and focusses on human intelligence and analytics for all. Cloud makes both data and analytics more accessible. By providing speed of execution and ease of use through browser-based self-service analytics across the enterprise at scale, business leaders can help the next generation of knowledge workers put data-driven insights at the heart of decision-making.

While the pace of change continues to accelerate, only those organisations that empower people with data literacy while providing faster and more accessible cloud-based analytics will be capable of delivering business insights at scale.

Alan Jacobson is chief data and analytics officer at Alteryx