
Google and Meta unveil their Nvidia challengers

by ccadm


Meta and Google just announced their intention to push further into the AI chip race, currently dominated by Nvidia. Both companies are touting their new custom AI chips as a key element of their AI strategies.

To understand the significance of these chips, consider that the current generation of AI models requires massive amounts of computing power: training the large language models that power them means crunching huge amounts of data.

Conventional CPUs simply can't keep up with the amount of processing AI models require, which has created a market for AI-specific chips. Until now, semiconductor giant Nvidia has dominated this market; by some estimates, the wait time for its flagship AI chips runs into months.


Even as they wait for Nvidia to deliver the chips, tech behemoths are working to build their own silicon. This not only reduces their dependence on a third-party supplier, but also enables them to tailor their hardware for their own AI models.

As things stand now, hardware is a key AI growth area.

Up in arms

Google recently unveiled its Arm-based processor, dubbed Axion, during its Cloud Next event in Las Vegas on April 9, 2024.

Google has been creating custom silicon for almost a decade now with its multi-generational Tensor Processing Units (TPUs). These are application-specific chips Google uses to accelerate machine learning workloads. 

However, building a full CPU is a major undertaking. While Google hasn't yet released complete details about Axion, we know it's built atop Arm's Neoverse V2 platform.

Axion supports general-purpose workloads, including CPU-based AI training, and will enhance Google's AI capabilities within its data centers.

Axion will be accessible to Google Cloud customers later in 2024.

Google is also updating its TPU AI chips, which serve as alternatives to Nvidia's GPUs for AI acceleration tasks. "TPU v5p is a next-generation accelerator that is purpose-built to train some of the largest and most demanding generative AI models," Mark Lohmeyer, Google Cloud's vice president and general manager of compute and AI infrastructure, writes in a blog post.
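For developers, much of this hardware choice is abstracted away by the software stack. As a rough illustration, here is a minimal JAX sketch (assuming access to a Cloud TPU or GPU runtime; the function and array shapes are purely illustrative) showing how the same compiled code can target whichever accelerator is available:

# Minimal JAX sketch: the same XLA-compiled function runs on CPU, GPU, or TPU,
# depending on which backend JAX detects at runtime. Names and shapes below
# are illustrative only.
import jax
import jax.numpy as jnp

print(jax.devices())  # e.g. [TpuDevice(id=0), ...] on a Cloud TPU VM

@jax.jit  # compiles via XLA for the available accelerator
def predict(weights, inputs):
    return jnp.tanh(inputs @ weights)

weights = jnp.ones((512, 512))
inputs = jnp.ones((8, 512))
print(predict(weights, inputs).shape)  # (8, 512)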

Google’s Axion comes months after Microsoft revealed its own custom chips: one built to train large language models, and another, an Arm-based processor, designed for cloud and AI workloads.

Me too

Shortly after Google announced Axion, Meta unveiled the next generation of its custom silicon “built with AI in mind”.

The custom AI chip joins Meta’s family of custom silicon, known as the Meta Training and Inference Accelerator (MTIA). Meta will reportedly use the new MTIA to rank and recommend content across Facebook and Instagram.

“As AI workloads become increasingly important to our products and services, [MTIA’s] efficiency will improve our ability to provide the best experiences for our users around the world,” writes Meta.

Meta has been steadily adding AI features to its social networking apps. Back in September 2023, it released its own ChatGPT-style conversational assistant, dubbed Meta AI, which is available in select geographies on WhatsApp, Messenger and Instagram, alongside other new generative AI features.

Soon after rolling out Meta AI, the company said it would spend about $35 billion to support its AI workloads. During an investor meeting, Meta CEO Mark Zuckerberg reportedly said that AI will be Meta’s biggest investment area in 2024.

Bargaining chips

Cost aside, it’s the lead times that are forcing tech giants to develop their own AI-capable silicon.

Nvidia, like most of the semiconductor industry, relies on Taiwanese manufacturer TSMC to fabricate its processors. This reliance on a single manufacturer is largely why lead times for these new-generation processors stretch into months.


Industry insiders reckon Google and Meta will be able to produce their AI chips without relying on TSMC. This means that even if their chips can’t keep pace with Nvidia’s offerings, they’ll have shorter wait times, if any. That alone makes them an attractive proposition over Nvidia’s AI chips.

Then there’s the fact that Nvidia’s AI offerings are general-purpose chips. While they may suit many kinds of AI workloads, they might not outpace a custom chip designed for a specific AI task.

This is why Google and Meta’s in-house chips could be the start of trouble for Nvidia: they signal that it isn’t the only game in town. If nothing else, these custom AI chips could give some of Nvidia’s biggest customers a bargaining chip.
