Amazon’s AI Chip Takes On Microsoft and Google: Cloud Computing Gets a Major Upgrade
The e-commerce giant Amazon has released two new artificial intelligence chips for its cloud services – Graviton4 and Trainium2. Amazon Web Services (AWS) CEO Adam Selipsky announced that Trainium2, the second generation of the chip, is designed specifically for training artificial intelligence systems.
According to the company, Graviton4 processors are based on the Arm architecture and consume less energy than Intel or AMD chips.
Graviton4 offers up to 30 percent better performance, 50 percent more cores, and 75 percent more memory bandwidth than the current-generation Graviton3 processors.
Trainium2, meanwhile, is designed to deliver up to 4x faster training than first-generation Trainium chips and can be deployed in EC2 UltraClusters of up to 100,000 chips, enabling foundation models (FMs) and large language models (LLMs) to be trained in a fraction of the time while improving energy efficiency up to 2x.
“The fourth generation we’ve shipped in just five years, Graviton4 is the most powerful and energy-efficient chip we’ve ever built for a wide range of workloads. And as interest in generative AI grows, Trainium2 will help customers train their ML models faster, more cheaply, and more energy-efficiently,” said David Brown, vice president of computing and networking at AWS.
The AWS move comes weeks after Microsoft announced its own AI chip called Maia. The Trainium2 chip also competes with AI chips from Alphabet’s Google, which has offered its Tensor Processing Unit (TPU) to its cloud computing customers since 2018.
More than 50,000 AWS customers already use Graviton chips. The startup Databricks and Amazon-backed Anthropic, an OpenAI competitor, plan to build models using the new Trainium2 chips, Amazon said.
In addition, Amazon and NVIDIA announced that they will expand their strategic collaboration to provide advanced infrastructure, software, and services to power customers’ generative AI innovation.