In a landmark move for AI infrastructure, Nvidia has officially signed a multi-year deal to supply 1 million AI chips to Amazon Web Services (AWS) by the end of 2027. The agreement, confirmed by Nvidia executive Ian Buck on March 20, 2026, marks one of the largest hardware commitments in cloud computing history.
The deal is a cornerstone of Nvidia CEO Jensen Huang’s projected $1 trillion sales opportunity for the next-generation Blackwell and Rubin chip architectures.
The “Full-Stack” Infrastructure Deal
Unlike previous hardware purchases, this is a strategic partnership that integrates Nvidia’s entire compute and networking stack into the AWS ecosystem.
- Broad Chip Mix: The 1 million unit figure includes not only flagship GPUs but also Nvidia’s Spectrum networking chips and the newly released Groq inference chips, which Nvidia brought in-house through a $17 billion licensing deal in late 2025.
- The “Seven-Chip” Strategy: AWS plans to use a combination of seven different Nvidia chip types to optimize AI inference—the process where AI models generate real-time answers for users.
- Networking Integration: In a significant shift, AWS will deploy Nvidia’s ConnectX and Spectrum-X networking gear alongside its own custom-built infrastructure to handle massive AI workloads.
Strategic Impact: Training vs. Inference
Industry analysts at Tekedia note that this deal signals a shift in the AI economy from the “Training Arms Race” to “Inference at Scale.”
| Phase | Primary Focus | Hardware Demand |
|---|---|---|
| Training (2023–2025) | Building large language models (LLMs) | High-intensity GPU clusters |
| Inference (2026+) | Serving millions of daily user queries | Mixed-chip stacks (GPUs plus specialized inference chips) |
Market Context: The $60 Billion Revenue Potential
While the exact financial terms remain undisclosed, market estimates suggest the deal could be worth upwards of $60 billion through 2027.
- GPU Value: 1 million units at an estimated average of $35,000 each would account for $35 billion.
- Inference & Networking: The addition of Groq chips and Spectrum networking equipment is projected to add another $22 billion.
- Software Licensing: Integrated software and support services could contribute a further $3 billion.
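The arithmetic behind the $60 billion headline figure can be sketched as a quick back-of-the-envelope calculation. All line items below are the article's market estimates, not disclosed terms, and the $35,000 average unit price is itself an analyst assumption:

```python
# Back-of-the-envelope breakdown of the reported ~$60B deal value.
# Figures are market estimates from the article; Nvidia and AWS have
# not disclosed actual pricing.

units = 1_000_000          # total chips committed through 2027
avg_unit_price = 35_000    # estimated average price per unit, USD

gpu_value = units * avg_unit_price           # ~$35B GPU/chip portion
inference_networking = 22_000_000_000        # Groq chips + Spectrum gear (estimate)
software_licensing = 3_000_000_000           # software and support services (estimate)

total = gpu_value + inference_networking + software_licensing
print(f"Estimated deal value: ${total / 1e9:.0f}B")  # → Estimated deal value: $60B
```

Note that the total is sensitive to the assumed average unit price: since the 1 million units span a mix of GPUs, networking silicon, and inference chips, a lower blended price would pull the headline figure down accordingly.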
The “Co-opetition” with AWS Silicon
Despite this massive purchase, Amazon is not abandoning its in-house chip development. AWS will continue to scale its own Trainium and Inferentia chips.
“The coexistence of these efforts reflects a dual strategy: build internally where possible, but buy externally where necessary to maintain a competitive lead in the AI cloud market,” noted one industry report.
