Snowflake launches open LLM Arctic to cater to enterprise needs


Key Takeaways

Snowflake has introduced Snowflake Arctic, an open enterprise-grade large language model (LLM) optimized for complex workloads, operating under an Apache 2.0 license which allows for wide-ranging use.

Arctic features a Mixture-of-Experts (MoE) architecture that improves both training efficiency and model performance by activating 17 billion of its 480 billion parameters at a time, focusing on delivering high-quality results with token efficiency.

The launch of Arctic aligns with Snowflake's vision of AI as a transformational technology, providing users with powerful, flexible tools to leverage their data and integrate open-source LLMs into their AI strategies.

Snowflake has announced its large language model (LLM), Snowflake Arctic, designed to set a precedent for open enterprise-grade LLMs.

Optimized for complex enterprise workloads, Arctic is released under an Apache 2.0 license that permits ungated personal, research and commercial use, and aims to set a new openness standard for enterprise AI technology.

As part of Snowflake’s strategic efforts, Arctic’s differentiated Mixture-of-Experts (MoE) architecture promises to improve both training systems and model performance, with a data composition focused on enterprise needs.

According to the company, Arctic delivers high-quality results by activating only 17 billion of its 480 billion parameters at a time, achieving quality with token efficiency.
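To illustrate the general idea behind that sparse activation, the sketch below shows a minimal top-k Mixture-of-Experts layer in PyTorch: a router scores every expert per token and only the top-scoring experts run, so only a small slice of the total parameters is active for any one token. This is a simplified illustration, not Arctic's actual implementation; the expert count, layer sizes and `top_k` value are illustrative placeholders.

```python
import torch
import torch.nn as nn

class TopKMoELayer(nn.Module):
    """Minimal Mixture-of-Experts layer: a router picks the top-k experts
    per token, so only a fraction of the total parameters is active."""

    def __init__(self, d_model: int, d_ff: int, num_experts: int, top_k: int = 2):
        super().__init__()
        self.top_k = top_k
        self.router = nn.Linear(d_model, num_experts)
        self.experts = nn.ModuleList(
            nn.Sequential(nn.Linear(d_model, d_ff), nn.GELU(), nn.Linear(d_ff, d_model))
            for _ in range(num_experts)
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (tokens, d_model). Route each token to its top-k experts.
        scores = self.router(x)                                   # (tokens, num_experts)
        weights, indices = torch.topk(scores.softmax(dim=-1), self.top_k, dim=-1)
        out = torch.zeros_like(x)
        for slot in range(self.top_k):
            for e, expert in enumerate(self.experts):
                mask = indices[:, slot] == e                      # tokens assigned to expert e
                if mask.any():
                    out[mask] += weights[mask, slot].unsqueeze(-1) * expert(x[mask])
        return out
```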

“This is a watershed moment for Snowflake, with our AI research team innovating at the forefront of AI,” said Sridhar Ramaswamy, CEO of Snowflake. “By delivering industry-leading intelligence and efficiency in a truly open way to the AI community, we are furthering the frontiers of what open-source AI can do. Our research with Arctic will significantly enhance our capability to deliver reliable, efficient AI to our customers.”

When asked during a briefing session how this announcement aligns with Snowflake’s broader business objectives, Ramaswamy said that “this is not a positioning exercise,” and the company “genuinely thinks of AI as a transformational technology”. 

“I know people are sick of hearing that from tech leaders. But I think of this as a technology that is perhaps even more impactful than mobile and we all know how much impact that had on humanity. 

“Just the very basics of changing how information is consumed and supplied to software, but also taken away from software is a big deal. And therefore it is an important overlay component of everything that we do,” the CEO explained.

According to recent research by Forrester, approximately 46 percent of global enterprise AI decision-makers said they are leveraging existing open-source LLMs to adopt generative AI as part of their organization’s AI strategy. As Snowflake provides a data foundation to more than 9,400 companies around the world, the new model can empower users to leverage their data with open LLMs, while offering flexibility and choice over which models they work with.

In addition, Snowflake provides code templates, along with flexible inference and training options, so users can get started deploying and customizing Arctic using their preferred frameworks.

These will include NVIDIA NIM with NVIDIA TensorRT-LLM, vLLM and Hugging Face. 
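As one illustrative path, the sketch below loads Arctic's instruct checkpoint with the standard Hugging Face `transformers` API. It is a sketch rather than official Snowflake documentation: the model ID `Snowflake/snowflake-arctic-instruct` and the `trust_remote_code` flag reflect how the model was published at launch, and a 480-billion-parameter model requires a multi-GPU host rather than a single consumer GPU.

```python
# Minimal sketch: loading Snowflake Arctic's instruct checkpoint from Hugging Face.
# The model ID and trust_remote_code flag are assumptions based on the launch release;
# hardware requirements are substantial (multi-GPU server class).
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "Snowflake/snowflake-arctic-instruct"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    device_map="auto",        # shard the model across available GPUs
    torch_dtype="auto",       # use the checkpoint's native precision
    trust_remote_code=True,   # Arctic ships custom modeling code
)

inputs = tokenizer("Summarize last quarter's sales by region.", return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

The same checkpoint can also be served through vLLM or NVIDIA NIM with TensorRT-LLM, as the article notes, which is generally the more practical route for production inference.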

“Snowflake and AWS are aligned in the belief that generative AI will transform virtually every customer experience we know,” said David Brown, vice president of compute and networking, AWS. 

“Using Amazon EC2 P5 instances with Snowflake’s efficient training system and model architecture co-design, Snowflake was able to quickly develop and deliver a new, enterprise-grade model to customers. And with plans to make Snowflake Arctic available on AWS, customers will have greater choice to leverage powerful AI technology to accelerate their transformation.”

Eric Boyd, corporate vice president of Azure AI Platform, Microsoft, also said that the company is “pleased to increase enterprise customer choice in the rapidly evolving AI landscape by bringing the robust capabilities of Snowflake’s new LLM model Arctic to the Microsoft Azure AI model catalog.

“Our collaboration with Snowflake is an example of our commitment to driving open innovation and expanding the boundaries of what AI can accomplish,” he added.