Microsoft is using Oracle Cloud Infrastructure (OCI) AI infrastructure, alongside Microsoft Azure AI infrastructure, for inferencing AI models that are being optimized to power Microsoft Bing conversational search daily.
Leveraging the Oracle Interconnect for Microsoft Azure, Microsoft is able to use managed services like Azure Kubernetes Service (AKS) to orchestrate OCI Compute at massive scale to support the increasing demand for Bing conversational search.
Bing conversational search requires powerful clusters of computing infrastructure to support the evaluation and analysis of search results conducted by Bing’s inference model.
Inference models need thousands of compute and storage instances and tens of thousands of GPUs that can operate in parallel as a single supercomputer over a multi-terabit network.
Karan Batta, senior vice president, Oracle Cloud Infrastructure, said: “GenAI is a monumental technological leap and Oracle is enabling Microsoft and thousands of other businesses to build and run new products with our OCI AI capabilities.
“By furthering our collaboration with Microsoft, we are able to help bring new experiences to more people around the world.”
Divya Kumar, global head of marketing for Search and AI at Microsoft, said: “Microsoft Bing is leveraging the latest advancements in AI to provide a dramatically better search experience for people across the world.
“Our collaboration with Oracle and use of Oracle Cloud Infrastructure, along with our Microsoft Azure AI infrastructure, will expand access to customers and improve the speed of many of our search results.”
This announcement follows Oracle’s earlier news that London Stock Exchange Group (LSEG) chose to leverage Oracle Cloud to transform and streamline its finance operations. Separately, LSEG announced last year that it had entered into a long-term strategic partnership with Microsoft to architect LSEG’s data infrastructure using Microsoft Cloud and develop new products and services for data and analytics.