SAN FRANCISCO — Oracle Cloud Infrastructure announced Tuesday that it will deploy 50,000 artificial intelligence chips from Advanced Micro Devices beginning in the second half of 2026, marking one of the largest single AI hardware investments outside of Nvidia’s ecosystem.
The move highlights a growing willingness among cloud providers to diversify away from Nvidia’s dominant position in the AI chip market.
AMD shares rose about 2 percent following the news, while Oracle’s stock fell 4 percent and Nvidia’s slipped more than 3 percent amid investor reactions to the shifting dynamics in the AI infrastructure sector.
Karan Batta, senior vice president of Oracle Cloud Infrastructure, said the company sees strong customer interest in AMD’s latest AI chips. “We feel like customers are going to take up AMD very, very well especially in the inferencing space,” Batta said during a briefing.
The announcement comes as tech companies worldwide are racing to secure the computing power needed to support rapid advances in artificial intelligence, from large language models to enterprise automation tools.
Nvidia has long dominated the AI accelerator market, supplying roughly 80 percent of the world’s AI data center GPUs. Oracle’s partnership with AMD adds a new dimension to this landscape.
The chips, part of AMD’s Instinct MI450 series unveiled earlier this year, represent a significant leap forward for the company.
The MI450 chips are designed to work together in massive configurations, allowing 72 chips to function as a single system, a critical capability for training and deploying next-generation AI models.
The deployment will position Oracle among the first major cloud providers to adopt AMD’s MI450 architecture at scale. “This is a calculated move,” said Laura Chen, a semiconductor analyst at TechSight Research.
“Oracle has been looking for ways to differentiate itself from larger cloud rivals like Amazon and Microsoft, and backing AMD gives them both cost flexibility and strategic leverage.”
Industry analysts view Oracle’s move as a strategic challenge to Nvidia’s market leadership. While Nvidia remains the preferred choice for many AI developers due to its robust CUDA software ecosystem, AMD has been steadily narrowing the gap with its ROCm platform and open-source compatibility.
“AMD’s MI450 chips offer impressive scalability and efficiency,” said Mark Davidson, a senior hardware engineer at AI consultancy CoreVision Labs.
“If Oracle can integrate these chips seamlessly into its cloud infrastructure, it could open up a new market for enterprises seeking alternatives to Nvidia-based solutions,” he said.
Experts also noted that AMD’s partnership with OpenAI, whose CEO Sam Altman appeared alongside AMD CEO Lisa Su in June, gives the company a credibility boost in the AI sector.
“That collaboration signaled to the industry that AMD is a serious contender,” Davidson added. “This Oracle deal reinforces that momentum.”
According to industry data from Mercury Research, Nvidia’s market share in data center GPUs was about 81 percent in mid-2025, while AMD held roughly 11 percent and Intel accounted for the remainder.
However, AMD’s share has been growing steadily, driven by new chip architectures and expanding partnerships. For Oracle, deploying 50,000 AMD AI chips could reduce costs by as much as 20 percent compared with Nvidia’s equivalents, according to preliminary estimates from research firm TrendForce.
The cost advantage comes not only from hardware pricing but also from greater flexibility in licensing and software integration.
“Many cloud providers are eager to avoid being overly dependent on Nvidia,” said TrendForce’s lead analyst, Ravi Patel. “AMD’s latest chips give them a viable alternative without sacrificing too much performance.”
Customers and developers are watching closely to see how AMD-powered Oracle Cloud instances perform in real-world AI workloads. “We’ve been waiting for more options beyond Nvidia,” said Maria Gonzalez, CTO of Boston-based AI startup InferenceWorks.
“If Oracle’s AMD systems deliver competitive speed and cost, we’d absolutely consider migrating part of our compute pipeline there.” Others remain cautiously optimistic.
“Nvidia still has the software ecosystem advantage,” said Arun Menon, an AI researcher at the University of Toronto. “AMD’s challenge will be convincing developers that ROCm is ready for production scale use.”
For Oracle, customer adoption will be key. “It’s not just about the chips,” Batta emphasized. “It’s about giving customers the flexibility to choose the right compute for their AI workloads, whether that’s Nvidia, AMD, or future architectures.”
Oracle’s investment signals a broader trend across the tech industry: the pursuit of hardware diversity in the AI race. Microsoft, Google, and Amazon have all announced plans to expand their use of non-Nvidia chips, including in-house designs and partnerships with companies such as AMD and Intel.
Analysts expect AMD’s Instinct MI450 chips to gain further traction once they become available in 2026, especially as AI applications expand beyond model training to large-scale inference and enterprise deployment.
“The next wave of AI growth will be driven by inference workloads,” Chen of TechSight Research noted. “That’s where AMD’s architecture can truly shine.”
As cloud providers ramp up multi-vendor strategies, Nvidia faces growing competition, though few expect its dominance to disappear overnight.
“Nvidia remains years ahead in software,” Davidson said. “But if AMD continues executing at this pace, the balance of power could start to shift by the end of the decade.”
Oracle’s decision to deploy 50,000 AMD AI chips marks a defining moment in the evolving battle for AI infrastructure supremacy.
By embracing AMD’s Instinct MI450 technology, Oracle is positioning itself as a major force in diversifying the AI compute landscape and signaling to the industry that Nvidia’s once-unquestioned dominance may finally face credible competition.
As AI workloads grow more complex and global demand for GPUs intensifies, Oracle’s bet on AMD underscores a new era of hardware pluralism in cloud computing, one where performance, cost, and flexibility will shape the future of artificial intelligence deployment.