AMD Chief Executive Officer Lisa Su introduced a long-anticipated lineup called the MI300 at an event Wednesday held in San Jose, California. (Bloomberg)

AMD CEO Unveils Nvidia Challenger and Predicts Impressive Results

Advanced Micro Devices Inc. has introduced new accelerator chips that are designed to outperform competitor products in running artificial intelligence software, targeting a market currently dominated by Nvidia Corp.

The company unveiled the long-awaited lineup called the MI300 at an event held on Wednesday in San Jose, California. CEO Lisa Su also gave an eye-popping prediction about the size of the AI chip industry, saying it could reach more than $400 billion in the next four years. That’s more than double AMD’s forecast in August, showing how quickly expectations for AI hardware are changing.

The release is one of the most important in AMD’s five-decade history and kicked off a showdown with Nvidia in the hot market for AI accelerators. Such chips help develop AI models by bombarding them with data, a task they handle more capably than traditional computer processors.

Building artificial intelligence systems that rival human intelligence, long considered the holy grail of computing, is now within reach, Su said in an interview. But adoption of the technology is just beginning, and it will take time to assess its impact on productivity and other parts of the economy, she said.

“The truth is, we’re so early,” Su said. “This is not a fad. I believe it.”

AMD is showing growing confidence that the MI300 lineup can beat the biggest names in tech, which could steer billions in spending toward the company. Customers using the processors include Microsoft Corp., Oracle Corp. and Meta Platforms Inc., AMD said.

Shares of Nvidia fell 2.3% to $455.03 in New York on Wednesday, a sign that investors see the new chip as a threat. Still, AMD shares did not rise accordingly: on a day when tech stocks were broadly lower, they fell 1.3% to $116.82.

Growing demand for Nvidia chips from data center operators helped boost the company’s stock this year, pushing its market value above $1.1 trillion. The big question is how long Nvidia will effectively have the accelerator market to itself.

AMD sees an opening: Large language models — used by AI chatbots like OpenAI’s ChatGPT — require massive amounts of computer memory, and that’s where the chipmaker believes it has an advantage.

The new AMD chip has more than 150 billion transistors and 2.4 times more memory than Nvidia’s H100, the current market leader. It also has 1.6 times as much memory bandwidth, further improving performance, AMD said.

Su said the new chip matches Nvidia’s H100 in terms of its ability to train AI software and is much better at inference — the process by which software is run when it’s ready for real-world use.

While the company expressed confidence in its product’s performance, Su said it won’t just be a competition between the two companies. Many others are also competing for market share.

At the same time, Nvidia is developing its own next-generation chips. The H100 will be followed in the first half of next year by the H200, which adds a new high-speed memory type and should match at least part of AMD’s offering. Nvidia is then expected to release an entirely new processor architecture later that year.

AMD’s prediction that AI processors will grow into a $400 billion market underscores the boundless optimism of the AI industry. By comparison, the entire chip industry generated $597 billion in 2022, according to IDC.

Back in August, AMD had offered a more modest forecast of $150 billion for the same period. But it will take some time for the company to capture a large part of the market. AMD has said its own revenue from accelerators will top $2 billion in 2024, and analysts estimate the chipmaker’s total sales will reach about $26.5 billion.

The chips are based on semiconductors called graphics processing units, or GPUs, typically used by video gamers to get the most realistic experience. Their ability to perform many calculations simultaneously has made them a popular choice for training and running AI software.
