
Amazon Is Struggling to Challenge Nvidia’s AI Chip Supremacy



Amazon has been developing its own AI chips to cut costs, an effort that has also helped boost Amazon Web Services (AWS) profitability. However, the e-commerce giant is struggling to build AI chips that can rival Nvidia's industry-standard offerings.

Project migration issues, compatibility gaps, and low usage are among the concerns slowing the adoption of Amazon's AI chips. The situation also puts at stake the substantial revenue Amazon generates from its cloud business. The challenges were identified through confidential documents and sources familiar with the matter, as reported by Business Insider.

Amazon’s In-House AI Chips Encounter Stifled Adoption

Trainium and Inferentia are Amazon's top-of-the-line in-house chips, whose latest generations debuted near the end of last year. The publication reported that last year, Trainium's adoption rate among AWS customers was merely 0.5% of that of Nvidia's graphics processing units.

Also read: Amazon Profit Exceeds Wall Street Expectations as AWS’s Generative AI Works Wonders

Amazon made the assessment in April 2024 to measure the usage share of different AI chips across its AWS services, according to the report. The adoption rate for Inferentia was slightly higher, at 2.7%. Inferentia is a chip designed specifically for inference, the computing stage in which a trained AI model is used by end consumers. The report cites an internal document stating:

“Early attempts from customers have exposed friction points and stifled adoption.”

The statement refers to the difficulties large cloud customers have faced when transitioning to Amazon's custom chips. The report identifies Nvidia's CUDA platform, which customers find more appealing, as a key reason.

Amazon’s Custom AI Chip Development Under Internal Review

AWS, the world's largest cloud service provider, is developing its own home-grown chips to power its operations. Amazon frequently touts these AI chip efforts, but the picture painted in the internal documents differs from the one the company projects.

Singapore's Minister for Communications and Information, Tan Kiat How, with AWS executives and partners. Source: AWS.

The internal documents state that the company is struggling with slow adoption, but Amazon's CEO offers a different view. On the company's first-quarter earnings call, CEO Andy Jassy said demand for AWS chips was high.

“We have the broadest selection of NVIDIA compute instances around, but demand for our custom silicon, Trainium and Inferentia, is quite high, given its favorable price-performance benefits relative to available alternatives.”

Andy Jassy

Jassy also mentioned early adopters of AWS silicon chips in his investor letter, saying that “we already have several customers using our AI chips, including Anthropic, Airbnb, Hugging Face, Qualtrics, Ricoh, and Snap.” Anthropic, however, is a different case altogether: Amazon is the startup's heaviest backer, having invested $4 billion, and the investment deal obligates Anthropic to use AWS-designed silicon.

A Major AWS Component Leverages Nvidia GPUs

Amazon Web Services offers a variety of processors, from Nvidia's Grace Hopper chips to AMD and Intel parts. Much of its profitability comes from designing its own data center chips, which cuts costs by reducing how many GPUs it must buy from Nvidia.

Also read: Nvidia Experiences Remarkable Growth Amid Rising AI and GPU Demand

Amazon debuted its first AI chip, Inferentia, in 2018, but Nvidia still leads with solutions that are more widely adopted across industries. AWS, Microsoft, and Google are among Nvidia's largest customers, and all of these giants rent out GPUs through their cloud services.

In March, Adam Selipsky, CEO of AWS, attended Nvidia GTC 2024, where both companies made a joint announcement focused on their strategic collaboration to advance generative AI.

“The deep collaboration between our two organizations goes back more than 13 years, when together we launched the world’s first GPU cloud instance on AWS, and today we offer the widest range of NVIDIA GPU solutions for customers.”

Adam Selipsky

Nvidia's platform, called CUDA, is usually preferred by developers. Nvidia has spent many years of time and effort creating it, and the industry has adopted it widely, which makes developers' work easier. Amazon, on the other hand, still has to solve this puzzle through trial and error.
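To make that lock-in concrete, here is a minimal sketch of what CUDA code looks like. It is a generic vector-addition example, not drawn from Amazon or Nvidia materials, and the kernel and variable names are purely illustrative.

#include <cstdio>
#include <cstdlib>
#include <cuda_runtime.h>

// Each GPU thread adds one pair of elements.
__global__ void vecAdd(const float* a, const float* b, float* c, int n) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) c[i] = a[i] + b[i];
}

int main() {
    const int n = 1 << 20;                  // one million elements
    const size_t bytes = n * sizeof(float);

    // Host-side buffers.
    float* ha = (float*)malloc(bytes);
    float* hb = (float*)malloc(bytes);
    float* hc = (float*)malloc(bytes);
    for (int i = 0; i < n; ++i) { ha[i] = 1.0f; hb[i] = 2.0f; }

    // Device-side buffers and host-to-device copies.
    float *da, *db, *dc;
    cudaMalloc((void**)&da, bytes);
    cudaMalloc((void**)&db, bytes);
    cudaMalloc((void**)&dc, bytes);
    cudaMemcpy(da, ha, bytes, cudaMemcpyHostToDevice);
    cudaMemcpy(db, hb, bytes, cudaMemcpyHostToDevice);

    // Launch enough 256-thread blocks to cover all n elements.
    const int threads = 256;
    const int blocks = (n + threads - 1) / threads;
    vecAdd<<<blocks, threads>>>(da, db, dc, n);
    cudaDeviceSynchronize();

    // Copy the result back and spot-check one value (expect 3.0).
    cudaMemcpy(hc, dc, bytes, cudaMemcpyDeviceToHost);
    printf("c[0] = %f\n", hc[0]);

    cudaFree(da); cudaFree(db); cudaFree(dc);
    free(ha); free(hb); free(hc);
    return 0;
}

Building and running code like this requires Nvidia's nvcc compiler and an Nvidia GPU; moving the same workload to Trainium or Inferentia means reworking it against a different software stack, which is the kind of migration friction the internal documents describe.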


Cryptopolitan reporting by Aamir Sheikh


