Pre- to Post-AI Transition to Be ‘Bigger Than the Internet’

By Sally Ward-Foxton

AI will become as pervasive as the internet, and will support a whole industry of competitors and collaborators up and down the technology stack.

Data center AI chip and system startup SambaNova hit the headlines recently with an enormous Series D funding round of $676 million, pushing the 100-person company’s valuation above $5 billion, a staggering amount for a young company which only emerged from stealth at the end of 2020. The company is now one of the best-funded AI chip companies in the world, with more than $1.1 billion raised. EE Times sat down with SambaNova CEO Rodrigo Liang to discuss the company’s strategy and roadmap.

First, why are SambaNova and its competitors in the data center AI space attracting such huge amounts of funding? Does it take this much capital to get the product right, or to go up against the incumbents, or does it simply reflect investors' view of the market opportunity?

“As much as we know that semiconductor design is a very capital-intensive venture, a billion dollars in cash for a small company is a lot,” Liang said. “I think it reflects what people see as the market opportunity.”

The market for data center AI hardware and software systems is growing rapidly as demand increases for AI-specific processing power.

SambaNova CEO Rodrigo Liang (Image: SambaNova)

“I really believe that this transition that the world’s going through, this pre-AI to post-AI transition is probably going to be as big, if not bigger than the internet,” Liang said. “If you think about the magnitude of changes that the internet brought to the world across every industry and every customer in every industry, I think you’re going to see something similar, if not bigger.”

Liang argues that all companies will need to embrace the technology for survival in the post-AI age.

“This isn’t something that some people are going to need for certain types of use cases – everybody needs it,” he said. “We’re at a point where the cost of not doing it is extremely high. Your competitors are all transitioning their business to AI systems, so it’s becoming a corporate imperative – you have to do something or be left behind.”

Companies that embraced the internet were catapulted to the top of their industries – today, the largest companies in the world are internet-based – and AI will be no different, Liang said.

“My customer base isn’t the Fortune 20 who have all the AI expertise, it’s the ‘Fortune Everybody’,” Liang said. “Think about everybody who needs it – big or small – now they have access because we can actually get you up and running with state-of-the-art AI and maintain it for you.”

Company origins
SambaNova’s origins date back to the early 2000s, when Liang and co-founder Stanford Professor Kunle Olukotun met at a company founded by Olukotun – Afara Websystems – making multicore processors for the data center. Afara was acquired by Sun Microsystems, and later became part of Oracle, and while Olukotun returned to Stanford, Liang and the Afara team built many generations of Sun’s SPARC processor line.

Back at Stanford, Olukotun met fellow Professor Chris Ré, a MacArthur "genius grant" winner, and the two began thinking about how to run large-scale machine learning in a better way.

Their conclusion, said Liang, was that several things were needed. First, a software stack offering a high level of abstraction, "because the problem is so large that the human mind can't optimize for it." Second, a hardware system that could run dataflow applications efficiently; no dataflow processor was available at the time. And because machine learning workloads evolve very quickly, a reconfigurable system was needed to keep pace.

The next step became finding a team to design such a chip – Liang’s team at Oracle-Sun, with its track record of building high performance processors, fit the bill. With the right combination of timing, technology and team, Liang, Olukotun and Ré co-founded SambaNova in 2017.

SambaNova's DataScale rack-scale system for AI acceleration in the data center (Image: SambaNova)

The company came out of stealth mode in December 2020 with DataScale, its system-level accelerator for data center and HPC applications. DataScale is built around the company's Cardinal SN10 reconfigurable dataflow unit chip, and its scale allows the largest networks, such as natural language processing (NLP) models, to be run with high accuracy. Alongside selling the rack-based DataScale product, SambaNova also rents its hardware for a monthly fee, a business model the company calls "Dataflow-as-a-service".

SambaNova’s mission is ease of deployment for at-scale AI; the process of building an AI practice can be long and hard, even for large companies. It involves hiring scores of increasingly scarce data scientists, then figuring out the technology infrastructure they need. Gathering data and selecting or developing AI models comes next, all while keeping up with the latest research in the world of models, which moves extremely quickly.

“Those are all things that they’re not expert in, because their business is something else,” Liang said. “They just want to find a way to do it to that state-of-the-art [level] as efficiently as possible. And that’s what we structure our company for: How can I deliver that capability for you that you need for that transformation that allows you to get that value without having to build it all yourself.”

SambaNova's pitch is that it will take on many of the tasks that would otherwise require a large data science team: the company will maintain the hardware and software systems, train models to the highest accuracy, and make sure everything runs as efficiently as possible. Customers can also adopt just pieces of this pipeline; as Liang says, "it's not all or nothing."

What’s to stop other players emulating this whole-pipeline approach?

“There aren’t a lot of players that are able to tackle the problem as a whole platform,” Liang said. “You’ll see people building chips just to do an inference on one specific application, but as a general-purpose platform, there just aren’t that many players that are able to tackle the breadth of it.”

Customers and roadmap
SambaNova's strategy includes building solutions for multiple verticals on the same hardware: natural language processing, computer vision, recommendation systems, and AI for science. In the AI-for-science category, the company has already announced that its systems are installed at Argonne National Laboratory and Lawrence Livermore National Laboratory, but Liang said the company's customers also include hyperscale cloud providers and on-premises data centers across a variety of industries.

“Going back to the internet comparison, who would not be a customer of the internet?” he said.

With a customer base of the "Fortune Everybody," does one solution, however scalable, really suit all the possible use cases out there? AI workloads differ hugely in nature between models, and as more industries adopt AI, workloads will no doubt continue to diverge.

SambaNova's Cardinal SN10 reconfigurable dataflow unit processor (Image: SambaNova)

Here Liang cited the reconfigurable nature of SambaNova's hardware, arguing the company can compete across all verticals. "We're allowing at the very core on a high-performance microprocessor to reconfigure the data paths to match the need of the model," he said. "If you have to do it by hand, it's really hard, but we have a very intelligent compiler stack that allows me to actually do it all automatically. It'll figure out what I need, and then it's closely tied to the hardware. This is one of the benefits you get by having a company that's innovating from hardware all the way up the stack… we can do things that most other chip companies just don't have the capabilities to do, because they don't control those upper layers of the stack."

With a hardware and software system that can already do it all – where does SambaNova’s product roadmap go from here?

Liang expects the company to keep moving up the stack: working with customers has revealed that they increasingly don't want to write their own models; ultimately, all they want is the insight extracted from the data, he argued.

“We understand the workflow and the pipeline that it takes to do all of that,” he said. “You’ll see us continue to grow that area, certainly expanding across more and more verticals, we’ll have a lot more complete solutions to address the needs of certain types of vertical markets.”

Engineering SambaNova's technology stack to keep pace with models that are evolving quickly and growing rapidly in size is also on the cards.

“We need to catch up to what people really want to do with these systems,” Liang said. “We’re already running some of the largest models that anybody in the world can do, and we’ll continue to push the envelope.”

Shape of the industry
SambaNova is part of an elite group of data center AI chip startups, many of which have also raised large amounts of funding (Graphcore, Groq, and others). Like the rest of the group, SambaNova aims to capture part of the market from incumbents Nvidia and Intel. While there is often discussion about which company or technology will "win" this market, Liang's view is that AI will eventually become so pervasive that the market will support a whole industry of competitors and collaborators at every level of the food chain.

“There were many winners of the internet,” he said. “Over the 20 years at the various different levels like networking, storage, and other things that happened as the industry transitioned from pre-internet to post-internet – AI is the same thing. Right now we’re looking at a very specific set of things that exist in today’s infrastructure, and people try and figure out who’s going to win and lose on those particular models… it’s going to be a pervasive industry, it is going to get segmented across many different things, like the internet did before. There are going to be a lot of different players that are going to compete for those segments.”

Liang is confident SambaNova can carve out a part of this market – but with such an enormous market opportunity combined with a proven, experienced team and $1.1 billion in funding, who wouldn’t be?

This article was originally published on EE Times.

Sally Ward-Foxton covers AI technology and related issues for EETimes.com and all aspects of the European industry for EETimes Europe magazine. Sally has spent more than 15 years writing about the electronics industry from London, UK. She has written for Electronic Design, ECN, Electronic Specifier: Design, Components in Electronics, and many more. She holds a Master's degree in Electrical and Electronic Engineering from the University of Cambridge.
