Amazon and Google Tip Off Nvidia CEO: Secret Briefings Reveal Who Really Controls the AI Chip Race

SAN FRANCISCO — Before announcing their latest artificial intelligence chips, Amazon and Google tip off Nvidia CEO Jensen Huang, according to a new report from The Information. 

The practice underscores Nvidia’s commanding position in the AI hardware ecosystem, where its GPUs power the majority of training and inference workloads for large-scale AI models.

Sources familiar with the matter said both Amazon and Google provide advance notice to Huang before revealing in-house silicon, reflecting a careful balancing act between competition and reliance on Nvidia’s technology.

Nvidia remains the dominant supplier of AI accelerators, serving as the backbone for major cloud providers. Its graphics processing units are essential for training large language models and increasingly handle inference workloads as well.

Despite years of investment in proprietary chips, including Amazon’s Trainium and Google’s Tensor Processing Units (TPUs), both companies still depend heavily on Nvidia’s CUDA ecosystem, which integrates hardware, software, and developer tools.

“Even after spending billions on custom designs, they can’t yet match Nvidia’s ecosystem advantages,” said Anjali Rao, a semiconductor analyst at FutureSight Research. “Alerting Huang in advance is a safeguard against supply risks.”

Why hyperscalers tread carefully

Analysts say the notification practice reflects strategic caution rather than subservience. Nvidia’s influence extends beyond chip supply: it acts as a financial anchor across the AI infrastructure chain, investing in suppliers and securing future capacity.

In September, Nvidia agreed to purchase up to $6.3 billion of unused GPU capacity from CoreWeave over seven years. It invested $700 million in British data center startup Nscale and spent $900 million acquiring talent and technology from networking firm Enfabrica.

Earlier this year, Nvidia committed $5 billion to Intel for joint chip development and backed OpenAI’s proposal for a 10-gigawatt GPU data center, valued at as much as $100 billion.

“These moves show Nvidia is locking down the global supply chain,” said Victor Delgado, a technology strategist at CapitalFront Advisory. “Any company dependent on those chips must manage the relationship carefully.”

According to Omdia research, Nvidia commands more than 80 percent of the global AI accelerator market, projected to surpass $150 billion by 2027.

By contrast, Amazon’s and Google’s custom chips hold only small shares, used mainly in their internal clouds. Even as they pursue performance and cost advantages, their systems still rely on Nvidia GPUs for large-scale AI model training.

“The biggest barrier is CUDA,” Rao said. Shifting away would require rewriting codebases and retraining engineers, risks few are ready to take.

Engineers working within cloud environments recognize Nvidia’s grip firsthand. “Our workloads are deeply tied to Nvidia’s stack,” said Omar Patel, a cloud engineer at a major US provider. “Even with Amazon’s Trainium, Nvidia GPUs remain essential for performance and reliability.”

Others view the pre-announcement briefings as standard business etiquette.

“Notifying a key supplier before a big reveal is good practice,” said Michelle Tan, a supply chain consultant. “It preserves trust and avoids disruptions in supply or pricing.”

Still, some industry watchers see it as a signal of dependency. “When customers feel obligated to alert a supplier about rival products, it reflects a real power imbalance,” Delgado said.

Challengers rising, but Nvidia steady

Amazon and Google continue to advance their custom silicon programs. Amazon’s Trainium2 is expected to deliver higher training efficiency, while Google is testing sixth-generation TPUs for multimodal AI applications.

Startups such as Cerebras, Graphcore, and SambaNova are also building alternatives, but widespread adoption remains limited. “Nvidia’s combination of hardware, software, and financial muscle is hard to unseat,” Rao said. “The ecosystem effect is strong.”

Amazon and Google tip off Nvidia CEO Jensen Huang before announcing new AI chips, a practice that reflects Nvidia’s unmatched role in AI computing.

While cloud giants pursue greater independence through custom designs, the company’s dominance in GPUs, software, and financing continues to shape the global AI infrastructure landscape.

Whether this dynamic endures will depend on how quickly competitors can scale performance, ecosystems, and trust to rival Nvidia’s entrenched position.
