Inside a cavernous room this week in a one-story building in Santa Clara, California, six-and-a-half-foot-tall machines whirred behind white cabinets. Together, the machines made up a new supercomputer that came online last month.

The supercomputer, unveiled Thursday by Cerebras, a Silicon Valley start-up, was built with the company’s specialized chips, which are designed to power artificial intelligence products. The chips stand out for their size: each is about as big as a dinner plate, or 56 times as large as a chip commonly used for AI. Each Cerebras chip contains the computing power of hundreds of traditional chips.

Cerebras said it had built the supercomputer for G42, an AI company based in Abu Dhabi. G42 said it planned to use the system to create and power artificial intelligence products for the Middle East.

“What we are showing here is that there is an opportunity to build a very large, dedicated AI supercomputer,” said Andrew Feldman, the chief executive of Cerebras. He added that his start-up wanted to “show the world that this work can be done faster, it can be done with less energy, it can be done at a lower cost.”

Demand for computing power and AI chips has skyrocketed this year, fueled by a global AI boom. Tech giants like Microsoft, Meta and Google, as well as a myriad of startups, have rushed to release AI products in recent months after the AI-powered chatbot ChatGPT went viral for the eerily human prose it could generate.

But manufacturing artificial intelligence products typically requires significant amounts of computing power and specialized chips, leading to a fierce search for more of those technologies. In May, Nvidia, the leading maker of chips used to power artificial intelligence systems, said appetite for its products, known as graphics processing units, or GPUs, was so strong that its quarterly sales would exceed Wall Street estimates by more than 50 percent. The forecast sent Nvidia’s market value above $1 trillion.

“For the first time, we are seeing a huge jump in computing requirements” due to AI technologies, said Ronen Dar, founder of Run:AI, a Tel Aviv start-up that helps companies develop AI models. That has “created a huge demand” for specialized chips, he added, and companies have been “quick to secure access” to them.

To get their hands on enough AI chips, some of the biggest tech companies, including Google, Amazon, Advanced Micro Devices, and Intel, have developed their own alternatives. Start-ups like Cerebras, Graphcore, Groq and SambaNova have also joined the race, aiming to break into the market that Nvidia has dominated.

Chips are poised to play such an important role in AI that they could change the balance of power between tech companies and even nations. The Biden administration, for its part, recently weighed restrictions on the sale of AI chips to China, with some US officials saying China’s AI capabilities could pose a threat to US national security by enhancing Beijing’s military and security apparatus.

AI supercomputers have been built before, including by Nvidia. But it’s rare for start-ups to create them.

Headquartered in Sunnyvale, California, Cerebras was founded in 2016 by Mr. Feldman and four other engineers with the goal of creating hardware that accelerates the development of AI. Over the years, the company has raised $740 million, including from Sam Altman, who runs the AI lab OpenAI, and from venture capital firms like Benchmark. Cerebras is valued at $4.1 billion.

Because the chips typically used to drive AI are small, often the size of a postage stamp, hundreds or even thousands of them are needed to process a complicated AI model. In 2019, Cerebras unveiled what it claimed was the largest computer chip ever built, and Mr. Feldman has said its chips can train AI systems 100 to 1,000 times faster than existing hardware.

G42, the Abu Dhabi company, started working with Cerebras in 2021. It used a Cerebras system in April to train an Arabic version of ChatGPT.

In May, G42 asked Cerebras to build a network of supercomputers in different parts of the world. Talal Al Kaissi, chief executive of G42 Cloud, a G42 subsidiary, said the cutting-edge technology would allow his company to build chatbots and use AI to analyze preventive care and genomic data.

But demand for GPUs was so high that it was hard to obtain enough of them to build a supercomputer. Cerebras’s technology was available and cost-effective, Mr. Al Kaissi said. So Cerebras used its chips to build the supercomputer for G42 in just 10 days, Mr. Feldman said.

“The time scale has been greatly reduced,” Mr. Al Kaissi said.

Over the next year, Cerebras said, it plans to build two more supercomputers for G42, one in Texas and one in North Carolina, and, after that, six more distributed around the world. They are calling the network Condor Galaxy.

Still, start-ups are likely to find it hard to compete against Nvidia, said Chris Manning, a Stanford computer scientist whose research focuses on AI. That’s because people who build AI models have grown accustomed to using software that runs on Nvidia’s AI chips, he said.

Other start-ups have also tried to enter the AI chip market, but many have “effectively failed,” Dr. Manning said.

But Mr. Feldman said he was hopeful. Many AI companies don’t want to be locked into Nvidia alone, he said, and there is global demand for other powerful chips like those from Cerebras.

“We hope this will move AI forward,” he said.
