ServiceNow AI momentum builds with NVIDIA partnership

In its latest Knowledge 23 announcement, ServiceNow has unveiled a partnership with NVIDIA, building on its ecosystem of enterprise-grade generative AI capabilities.

ServiceNow will leverage NVIDIA’s custom large language models, trained on data specific to its Now Platform. The models will add to the OpenAI and domain-specific LLMs announced yesterday, offering AI capabilities for employees and developers, IT departments and customer service teams.

The solution aims to improve intelligent virtual assistants and agent tools, with customizable AI chatbots fed by purpose-built large language models. Customer service teams will also be able to use generative AI for automatic issue resolution, article generation based on customer case summaries, and chat summarization for faster hand-off, resolution and wrap-up.

The employee experience will also be improved by helping to identify growth opportunities based on natural language queries and information from an employee’s profile.

For ‘hungry and humble’ ServiceNow, as CEO Bill McDermott puts it, this is the company’s first period of ‘pull’ from its user base after a decade of pushing for growth.

Driven by customer demand, this is a company stepping into new territory. Since 2019, ServiceNow has doubled its product offering, with no sign of slowing down. Loath to simply tick boxes, the company hopes its workflow management platform will lend itself well to new generative AI capabilities, rather than merely touting the latest tech at the peak of the AI hype.

Today saw some of the already-available customer, employee and developer capabilities demoed to an excitable 10,000-strong audience.

ServiceNow’s president and COO CJ Desai said in today’s keynote: “We firmly believe that this technology has reached an inflection point where we can do something meaningful. We think the future of ServiceNow and generative AI is bright. The new ServiceNow Assist will be an assistant on the side to help you with the use cases of ServiceNow. We have been inspired by all the work NVIDIA has done on deep learning, and we cannot wait to roll this out for our users.”

Jensen Huang, co-founder and CEO of NVIDIA, said: “This is the biggest computer industry platform transition of our generation. This is the first computer that can automatically generate text, video, proteins, genes, chemicals, anything with structure. We have developed state-of-the-art language model systems, large, medium and small. ServiceNow is the world’s enterprise service platform and, over time, we will develop on top of it domain-specific AIs optimized for the data and skills of each of their customers.”

The K23 news follows other recent Now AI gains, including ServiceNow’s expanded Microsoft partnership, the Utah release for AI-powered process mining and RPA, and its partnership with Hugging Face on the release of StarCoder, a 15 billion-parameter, open-access large language model (LLM).
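For developers, the Hugging Face tie-up is the most immediately hands-on of these: StarCoder is published on the Hugging Face Hub and can be loaded with the standard transformers API. Below is a minimal sketch, assuming the model’s licence has been accepted on the Hub, the transformers and accelerate packages are installed, and there is enough GPU memory for a 15 billion-parameter model; the prompt and generation settings are purely illustrative.

```python
from transformers import AutoTokenizer, AutoModelForCausalLM

# Assumes the bigcode/starcoder licence has been accepted on the Hugging Face Hub
# and that you are logged in (e.g. via `huggingface-cli login`).
model_id = "bigcode/starcoder"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    device_map="auto",   # spread the large model across available GPUs (requires accelerate)
    torch_dtype="auto",  # use the checkpoint's native precision
)

# Illustrative code-completion prompt
prompt = "def fibonacci(n):"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```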

In a K23 media and analyst panel alongside NVIDIA and Hugging Face, ServiceNow’s VP of product platform, AI, Jeremy Barnes, and CTO Kelly Steven-Waiss answered questions on generative AI developments.

What’s clear, before we all get too excited, is that there is a long way to go before businesses master this technology. To be the AI scrooge for a moment: many of the finer details remain at the experimental stage, and the guardrails for responsible use aren’t yet cemented for many firms, whatever the vendor or industry.

Jeff Boudier, VP of product and growth at Hugging Face, said: “We’re providing the tools for every company in the world to build with AI. Gen AI is very much in the early adopter phase. The same early adopters of generic generative AI were the early adopters of transfer learning a couple of years ago: financial services, healthcare, consumer tech. But I think the responsibility of explainable AI is a shared responsibility, from the platform to the companies that are making use of it.

“NVIDIA, Meta, Google and Microsoft and so on, all have huge contributions to the research community. Our responsibility is to advance the field through open science and research, putting in guardrails, evaluation models and data sets.”

In an aside with ERP Today, Barnes said: “One of the challenges with the natural language model is that no-one knows how it will work. It’s currently very exciting, and our approach is to work with customers where we can learn naturally from each other. What we’ve learned is that when we work really closely with customers early on we ensure success. Generative AI and the ServiceNow platform complement each other well and we are in a really good position to bring this to customers.”

As part of the partnership, ServiceNow is also helping NVIDIA streamline its own IT operations with these generative AI tools, using NVIDIA data to customize NVIDIA NeMo foundation models running on hybrid-cloud infrastructure made up of NVIDIA DGX Cloud and on-premises NVIDIA DGX SuperPOD AI supercomputers.

It’s certainly a time of rapid development for ServiceNow and generative AI in general. There’s a lot to figure out with this new technology, for vendors and users alike. Hopefully, everyone comes out with a jackpot-winning, healthy ROI, but the work starts now to discover how to make the most of it responsibly.