Amazon racing to develop AI chips cheaper, faster than Nvidia’s, executives say: reuters.com

On July 25, Amazon.com (AMZN.O) opened a chip lab in Austin, Texas, where six engineers spent a Friday afternoon testing a new server design.

During a lab visit on Friday, Amazon executive Rami Sinno said the server was packed with Amazon's AI chips, which compete with Nvidia's market-leading processors.

Amazon is building its own processors to limit its reliance on costly Nvidia chips, the so-called Nvidia tax, which power some of the artificial intelligence cloud services at Amazon Web Services, the company's main source of growth.

With its own custom-designed chips, Amazon aims to help customers perform complex calculations and process large volumes of data at a lower cost.

Microsoft and Alphabet are pursuing the same strategy.

Sinno, who leads engineering at Amazon's Annapurna Labs, part of its cloud division AWS, said customers were asking for cheaper alternatives to Nvidia.

Amazon acquired Annapurna Labs in 2015.

While its AI chip effort is still in its early stages, Amazon has nearly a decade of experience with Graviton, its chip for traditional computing, which is now in its fourth generation. The Trainium and Inferentia AI chips are newer designs.

David Brown, vice president of compute and networking at AWS, said on Tuesday that Amazon's chips can offer up to 40%, and in some cases 50%, better price performance, which should make running a model on them roughly half as expensive as running the same model on Nvidia chips.

In the January-March quarter, AWS sales, which account for almost 20% of Amazon's total revenue, rose 17% from a year earlier to $25 billion. AWS holds roughly a third of the cloud computing market, with Microsoft holding a similar share.
