AI’s Hidden Dependencies: How Energy, Compute, and Scale Change the Cost of AI

Key Takeaways

AI infrastructure costs are rising as energy demand, compute intensity, and concentrated supply chains introduce systemic risks that many enterprises have not priced into their AI strategies.

The shift from discrete AI training to continuous inference turns AI into a persistent operating expense that directly affects finance, supply chain, and HR processes.

ERP architecture decisions now shape long-term AI risk exposure, as embedded AI dependencies lock in assumptions about pricing, availability, and financial stability.

A new report from Arthur D. Little’s Blue Shift Institute examines the hidden AI infrastructure costs shaping enterprise adoption. The report, AI’s Hidden Dependencies, describes how AI’s growth depends on energy systems, water availability, and highly concentrated compute supply chains.

That dependence introduces risks most enterprises have not priced into their AI strategies. The report warns that the resulting vulnerabilities are systemic, taking hold long before most organizations recognize them.

As Blue Shift Director Dr. Albert Meige puts it, “AI feels cheap today because its real economic and environmental costs are essentially hidden. Once dependence sets in, those costs will surface. And companies should be strategically prepared.”

Why AI’s Infrastructure Footprint Is Expanding Faster Than Expected

The report’s central finding is that AI’s apparent lightness masks a rapidly expanding industrial footprint. Contributors describe how data centers already account for roughly 0.5% of global carbon emissions, with AI-driven energy demand projected to rise sharply as usage shifts from training runs into continuous inference.

That transition is critical. While training once dominated AI’s environmental impact, the report notes that inference could account for as much as 90% of lifecycle emissions as AI becomes embedded in everyday workflows.

Energy constraints emerge as the most immediate limiting factor. According to the report, AI-optimized servers consume five to eight times more electricity than conventional infrastructure, driving a surge in data center demand that grids are struggling to absorb.

The report notes that in hubs such as Northern Virginia and Dublin, data centers could consume 30% to 40% of local electricity by 2030. In some regions, grid connection queues already stretch for years, making power access a gating factor for AI deployment.
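To make those figures concrete, the back-of-envelope sketch below rolls the server multiplier up across a hypothetical fleet. Only the five-to-eight-times range comes from the report; the baseline wattage and fleet size are illustrative assumptions.

    # Back-of-envelope sketch of the "five to eight times" electricity claim.
    # Only the multiplier range comes from the report; the wattage and
    # fleet size below are illustrative assumptions.
    CONVENTIONAL_SERVER_KW = 0.5      # assumed draw of a typical enterprise server
    HOURS_PER_YEAR = 24 * 365
    FLEET_SIZE = 1_000                # hypothetical server count

    for multiplier in (5, 8):         # range cited in the report
        ai_server_kw = CONVENTIONAL_SERVER_KW * multiplier
        annual_mwh = ai_server_kw * HOURS_PER_YEAR * FLEET_SIZE / 1_000
        print(f"{multiplier}x: ~{annual_mwh:,.0f} MWh/year across {FLEET_SIZE:,} servers")

At the five-times multiplier, a thousand such servers would draw on the order of 22,000 MWh a year, which helps explain why grid connections have become a gating factor.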

Those pressures are compounded by supply chain concentration. The report documents how a small number of firms dominate AI chips, fabrication, and cloud platforms, increasing exposure to pricing shifts and geopolitical disruption.

Instead of forecasting a single outcome, the report shows how uncertainty itself becomes a risk. Constraints on compute capacity and infrastructure access may surface abruptly, often only after organizations have already embedded AI into core operations.

What AI’s Rising Energy and Compute Costs Mean for ERP Systems

When applied to ERP systems, the report’s findings help explain how AI dependency can create new cost, availability, and control risks for organizations.

Unlike experimentation layers, ERP platforms increasingly embed AI into forecasting, planning, financial close, compliance, and workflow automation. These uses rely on continuous inference, turning AI resource consumption into a persistent operational load.
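A minimal sketch makes that shift visible: unlike a one-off training project, embedded inference accrues cost on every transaction. The per-call price and workload volumes below are hypothetical placeholders, not vendor figures.

    # Minimal sketch of continuous inference as a recurring operating expense.
    # Per-call cost and workload volumes are hypothetical placeholders.
    COST_PER_CALL_USD = 0.002          # assumed blended cost per inference call

    monthly_calls = {                  # hypothetical ERP workload volumes
        "demand_forecasting":  1_200_000,
        "financial_close":       150_000,
        "compliance_checks":     400_000,
        "workflow_automation": 2_500_000,
    }

    monthly_opex = sum(monthly_calls.values()) * COST_PER_CALL_USD
    print(f"Monthly inference opex: ${monthly_opex:,.0f}")
    print(f"Annualized:             ${monthly_opex * 12:,.0f}")

Even at a modest assumed per-call price, the high-volume transactional workloads dominate the total, which is why the report treats inference rather than training as the persistent cost center.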

That shift matters because ERP customers rarely control where AI workloads run or how underlying infrastructure constraints affect pricing and availability.

As AI features become integral to systems of record, changes in compute pricing, grid capacity, or provider policies can propagate quickly through finance, supply chain, and HR processes. Upstream infrastructure issues then surface inside ERP as budget volatility, performance constraints, or contractual service limits.

The report’s emphasis on resilience over sovereignty also maps directly to ERP strategy.

Few organizations can realistically isolate ERP AI from dominant cloud and compute providers. Instead, the practical response lies in architectural and contractual flexibility.

Hybrid deployment models, workload segmentation, and clearer visibility into AI-related costs and dependencies become governance concerns. In that sense, ERP roadmaps now encode long-term assumptions about AI availability and cost stability. Those assumptions should be examined with the same rigor organizations apply to other systems of record.
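A minimal sketch suggests what workload segmentation might look like in practice, assuming a hybrid deployment; the workload attributes, routing targets, and price ceiling are all hypothetical illustrations, not a reference architecture.

    # Illustrative routing of ERP AI workloads in a hybrid deployment.
    # All names, tiers, and thresholds below are hypothetical.
    from dataclasses import dataclass

    @dataclass
    class AIWorkload:
        name: str
        latency_sensitive: bool   # must run close to the system of record?
        data_residency: bool      # regulated data that must stay on-premises?

    def route(workload: AIWorkload, cloud_price_per_1k_calls: float,
              price_ceiling: float = 0.05) -> str:
        """Pick a deployment target, falling back to on-premises capacity
        when cloud pricing drifts past a governance-set ceiling."""
        if workload.data_residency:
            return "on_prem"              # residency overrides cost
        if cloud_price_per_1k_calls > price_ceiling:
            return "on_prem"              # price guardrail triggered
        return "regional_cloud" if workload.latency_sensitive else "cloud"

    print(route(AIWorkload("demand_forecasting", False, False), 0.03))  # cloud
    print(route(AIWorkload("payroll_scoring", True, True), 0.03))       # on_prem

The routing logic itself matters less than the fact that the guardrails are explicit and auditable, which is what moves these architectural choices into governance territory.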

What This Means for ERP Insiders

AI costs migrate from IT to operations. As inference becomes embedded in transactional workflows, AI spending shifts from discretionary IT budgets into day-to-day operating costs. That migration makes AI efficiency, pricing stability, and availability matters of financial governance, not experimentation.

Infrastructure limits shape software outcomes. AI performance inside enterprise systems increasingly reflects grid capacity, cooling constraints, and provider infrastructure decisions. Software roadmaps now inherit physical limits that organizations neither own nor directly control.

ERP strategy shapes AI risk exposure. Choosing where and how AI is embedded in ERP systems commits organizations to long-lived assumptions about compute access and cost stability. Those architectural choices quietly define exposure to future pricing shifts, regulatory reforms, and disruptions in the AI supply chain.