The Deloitte Tech Trends report, now in its thirteenth year, looks at current, emerging and future technology trends. It is essential reading for any digital leader and is widely regarded as one of the most comprehensive reports of its kind. The report draws on Deloitte’s history of working with global organisations across industries, combining business and technology expertise to predict the trends and opportunities most likely to impact businesses in the next 18 to 24 months.
This year’s report shows that, as organisations continue to build on their resilience from responding to COVID-19, new challenges such as global supply chain shortages and fierce competition for talent are forcing companies to become smarter and nimbler. Those that challenge orthodoxies, apply new approaches, and engineer a tech-forward future can drive innovation, gain advantage and differentiate themselves both inside and outside the enterprise. To give the report some additional context, I sat down with Mike Bechtel, chief futurist, Deloitte Consulting LLP, to ask him some questions on the key findings.
Tech Trends 2022 is categorised into three main areas: optimising IT, advancing the enterprise and projecting the possible.
Optimising IT
By applying automation, AI and machine learning to internal processes, lean IT talent can focus on strategic, value-add projects while enhancing operations. Trends discussed in this category of Tech Trends 2022 include:
• IT, disrupt thyself: automating at scale
• Cyber AI: real defence
• The tech stack goes physical
Advancing the enterprise
Utilising powerful technology tools such as data, cloud and security, organisations can create powerful new business models to meet customer demands and gain competitive advantage. The three trends include:
• Data-sharing made easy
• Cloud goes vertical
• Blockchain: ready for business
Projecting the possible
‘Field notes from the future’ looks at technologies and trends that stand to take hold in the next five to 10 years, including quantum technologies, exponential intelligence and ambient computing.
On cyber and trust
As data becomes more distributed across multiple platforms, are we making it harder for business leaders to trust that their data is safe and secure? The sheer volume of data, 5G technologies (soon to be 6G), the sophistication of cyber attacks, the increase in data sharing and the proliferation of algorithms making decisions all add up to uncertainty for CIOs. Is the industry doing enough to ensure that the solutions it brings to market are reliable and secure? And is cyber and trust a board-level concern, or just one for the CIO/CDO?
MB Automation, abstraction and acceleration are godsends to the degree that they free up human capacity to focus on higher order opportunities. That said, it’s important to acknowledge that ‘good’ outcomes do not arise from turbo-charging ‘bad’ processes. For example, small cyber vulnerabilities, automated and accelerated, become big ones. Similarly, AI/ML models, trained on data with small tacit biases, can end up amplifying inequity. Our research suggests that a key piece in realising the benefits of automation at scale is to ensure that you’re inventorying and understanding those processes you intend to automate. Done correctly, these journeys force important discussions between tech leaders and business leaders which result in more clarity, more explicit goals, and more intentionality. In summary, automation done right doubles as an IT hygiene check-up.
We consider cyber and trust one of our enduring ‘macro technology forces’, which means that it’s always been an IT concern and always will be. For all the optimism inherent in emerging technology work, it’s important for leaders – yes, CxOs and boards of directors (BoD) too – to realise that there’s always a risk dimension resident in every reward. Our Deloitte Global Technology Leadership study reminds us that technology leaders typically serve in their roles for three to four years. Their incentives are tuned towards ‘nurturing now’ as opposed to ‘enabling new’, let alone ‘exploring next’. Interestingly, boards are the ones charged with the long view. As stewards and guides for the long haul, boards should certainly be conversant in matters of cyber and trust, which together are just another name for ‘risk management’ – very much part of a typical BoD remit.
Where is the appetite for core modernisation? Business leaders are being bombarded with new opportunities – new solutions for employee and customer experience and digital workflows, to name just a few. Has the importance of a clean, stable core been forgotten, and how do you see enterprises continuing to invest in core modernisation while maintaining appropriate investment levels in new opportunities?
MB The reason we consider core modernisation an enduring macro technology force: today’s shiny innovation will inevitably become our successors’ rusty legacy application renewal project. Core modernisation is rarely sexy, but it has one of the best business cases around, insofar as every piece of creaky legacy tech that’s replaced, remediated or put out to pasture lowers the interest rate we pay on our technical debt. Less abstractly, fewer resources nursing old systems means more resources available to pursue new opportunities. One of our trends this year, ‘Automation at scale: IT disrupt thyself’, is a recognition that less to operate means more to innovate. In short, I wouldn’t pitch a core modernisation project to my business leader as a dowdy standalone maintenance project. I’d pitch it as the first step in whatever innovative business-led programme we’re out to accomplish next.
Experience and digital reality
As we all come to expect consumer-grade experiences at work, can employers keep up with the demands and expectations of their workforces? What areas (other than HR) can business leaders focus on to deliver digital experiences for their workforce? And how ready are they to embrace the ‘enterprise metaverse’ with new technologies such as mixed reality? Is there sufficient business benefit to justify investing in these technologies? The trend seems to be capability-led rather than consumer-led – is that fair?
MB Having worked in all things new-fangled for nearly 25 years, I agree that few technologies have felt quite as capability-led or supply-side as virtual reality (VR).
As a futurist, I’m secretly an historian, which is why I can’t help but look beyond the breathless faddishness around the metaverse and instead take the long view. Our research reminds us that the whole history of human-computer interaction has been a surprisingly straightforward evolution of ever more sophisticated interfaces delivering ever simpler experiences. The punch-card era presumed users had PhDs. The command-line era, night school. For a generation, we’ve seen graphical user interfaces evolve from click and type to touch and swipe on ever smaller screens. The one thing we know is that the next evolution of the interface won’t be something smaller than a smartwatch screen. Instead, evidence points to the next frontier for simplicity being anything and everything ‘beyond the glass’.
VR and virtual worlds are certainly part of that future, but so are quieter ambient experiences like conversational voice (think smart speakers), spatial computing (think smart glasses), and predictive/proactive interfaces that infer your next need. Together, all these ambient assistants work together as a team of helpers, creating a sort of ‘digital Downton Abbey’. That’s the consumer-led, or demand-led angle: who wouldn’t want their own butler and staff at home and at work?
Data, AI and analytics
Are business leaders ready to trust their mission-critical, data-led decisions to algorithms? What are the key trends in data and analytics, and how can business leaders keep up with the volumes of data they are producing?
MB One of the things we’ve seen in our research is that the current generation of business and technology leaders grew up hearing the trope ‘computers never make mistakes’. We think of computers as calculators that give us the right answer every time. Period.
AI/ML ushers in an entirely different paradigm. As we move from ‘telling’ machines what to do via deterministic code, to ‘teaching’ machines what to do via machine learning, business leaders need to recognise that, just like humans, mechanical minds make mistakes. This doesn’t mean we reject machine intelligence outright. It means we need, as leaders, to lower our standard from perfection to merely very useful – or, more accurately, more useful than the humans who used to do this.
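The ‘telling’ versus ‘teaching’ distinction can be sketched in a few lines of Python. This is an illustrative toy, not anything from the report: the transaction-flagging scenario, function names and thresholds are all invented to show how a deterministic rule always behaves as written, while a rule learned from examples can be wrong on inputs its training data never covered.

```python
# 'Telling' a machine: a deterministic rule, correct by construction.
def told_rule(amount: float) -> bool:
    """Flag any transaction over a fixed threshold. Same input, same answer."""
    return amount > 1000.0

# 'Teaching' a machine: fit a threshold to labelled examples instead.
def teach_threshold(examples: list[tuple[float, bool]]) -> float:
    """Pick the candidate threshold that best separates the labelled examples.
    Like any trained model, the result depends entirely on the sample."""
    best_t, best_acc = 0.0, -1.0
    for t in sorted(amount for amount, _ in examples):
        acc = sum((amount > t) == flag for amount, flag in examples) / len(examples)
        if acc > best_acc:
            best_t, best_acc = t, acc
    return best_t

# A small, unrepresentative training sample (no examples between 200 and 1500)...
examples = [(50.0, False), (200.0, False), (1500.0, True), (3000.0, True)]
t = teach_threshold(examples)

# ...so the learned rule separates the sample perfectly, yet flags a 500.0
# transaction that the intended 1000.0 rule would let through: a machine
# 'taught' on biased data makes mistakes no 'told' machine would.
def learned_rule(amount: float) -> bool:
    return amount > t
```

The point of the toy is Bechtel’s: the learned rule is still very useful, just no longer ‘correct every time’, so its fitness is judged against the alternative, not against perfection.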
Take call-centre applications as an example. Organisations are beginning to implement symbolic AIs that can detect and emulate human emotions and infer human needs. These algorithms don’t need to be as charming as a movie star or as empathic as your therapist. They simply need to be more charming and empathic than your least effective call centre representative.
We are on our fourth big cloud shift already: first the move to private cloud, then public cloud, then hybrid and multi-cloud, then industry cloud. Do you get a sense from business leaders that we are building more and more complexity into our cloud architecture without the associated benefit? Most so-called vertical clouds are not much more than a vanilla public cloud with some industry-specific bolt-ons – will the cloud providers be able to continue to sell this story without genuine industry depth across multiple domains?
MB Regarding cloud complexity, our research counterintuitively suggests the inverse: the more you cloud-source, the smaller your core, and the smaller your core, the quicker you can respond to fluid markets.
When we talk with enterprise CIOs, they sometimes lament their size. We hear stories about how scrappy start-ups, with no business being in their business, are threatening to put them out of business, in part because of their comparative agility. The notion is that size is an encumbrance – aircraft carriers vs speedboat fleets, and so on.
We don’t see industry clouds or vertical clouds as an à la carte revolution so much as the recognition that even sector-specific functions (say, reservations logic for a hospitality company, or claims-management logic for an insurer) can increasingly be cloud-sourced. And not just to hyper-scalers. Traditional ERP and open-source software solutions are also starting to offer sector-specific menus.
As geeky as anything with the term API tends to sound, this is honestly as much a business trend as a tech trend. Knowing that we’ll eventually be able to cloud-source just about all our business logic forces a discussion between tech and business leaders about where we’ll compete and where we’ll win. The old Michael Porter quote is new again: “The essence of strategy is knowing what not to do.”
By hard-circling differentiated capabilities, and cloud-sourcing the rest, firms can focus their precious, scarce development talent on signal as opposed to noise. Or, as my grandma’s favourite crooner Bing Crosby used to sing: ‘Accentuate the positive, eliminate the negative, and don’t mess with Mr In-between.’ And by don’t mess with, I suspect he meant cloud-source.