SAN FRANCISCO — AMD CEO Lisa Su swatted away talk that the artificial intelligence boom is a bubble, telling a WIRED audience on Dec. 4, 2025, that she sees the spending cycle as durable even as investors hunt for signs of overheating. In the same breath, she framed AMD’s next moves as a test of execution: scale supply for a landmark OpenAI buildout while absorbing a new 15% fee on licensed AI-chip shipments to China.
AMD bets on scale, not “bubble” timing
Su’s message was blunt: AMD is planning for demand that lasts longer than a hype spike. At WIRED’s Big Interview, she argued that the industry is still early in deploying AI across businesses and consumer products, which makes today’s infrastructure buildout feel less like froth and more like groundwork.
That posture matters because AMD is no longer pitching “potential” in AI accelerators — it is signing for volume. In October, OpenAI and AMD announced a multi-year agreement for OpenAI to deploy 6 gigawatts of AMD Instinct GPUs, with an initial 1 gigawatt slated to begin in the second half of 2026. The companies positioned it as a deep hardware-and-software partnership aimed at rack-scale deployments across multiple chip generations, according to OpenAI’s announcement.
For AMD, the OpenAI pact is both a trophy and a timer. The company must deliver on performance-per-watt, software tooling and steady supply — the three pressure points that have historically determined whether challengers can pry meaningful share from the market leader.
AMD’s China challenge now comes with a price tag
At the same time, AMD is navigating a geopolitical toll booth. Su said the company is prepared to pay a 15% fee to the U.S. government on shipments of certain licensed AI chips to China — a rare arrangement that effectively taxes revenue tied to exports that Washington otherwise limits. The comments were reported by Reuters.
The fee doesn’t just hit margins. It adds uncertainty for forecasting, especially as customers in China weigh domestic alternatives and as U.S. policy continues to evolve. Still, the willingness to pay signals that AMD views China as too large to abandon entirely, even in a narrowed, heavily regulated lane.
Older context: this “moment” has been years in the making
The current sprint builds on groundwork AMD laid when it publicly sharpened its AI ambitions. In late 2023, AMD raised its estimate of the AI-chip market and rolled out MI300-family data center parts as it tried to loosen Nvidia’s grip, Reuters reported. By mid-2024, Microsoft said it would offer cloud customers an AMD-based platform as an alternative to Nvidia for AI workloads — an early signal that big buyers wanted optionality, also reported by Reuters.
Now, AMD is trying to convert that optionality into commitment. Su is telling the market the AI buildout is real, then backing it with contracts that force AMD to prove it can deliver at hyperscale — even as trade friction adds cost and complexity to the chip race.