
Amazon Unveils Impressive New AI Chip: Will Nvidia Partner?

  • Writer: Covertly AI
  • Dec 3
  • 3 min read

Updated: Dec 4

Amazon Web Services is stepping up its push to make AI infrastructure faster, cheaper, and less power-hungry with the formal launch of its third-generation training chip, Trainium3, unveiled at AWS re:Invent 2025.



The company introduced the Trainium3 UltraServer, a full system built around the 3-nanometer Trainium3 chip plus AWS’s in-house networking, framing it as a major leap not only for training large AI models but also for running AI applications during peak demand (Bort). In practical terms, AWS says the new system delivers more than four times the performance and four times the memory of the prior generation, aiming to help customers train models faster and serve real-world inference workloads more reliably (Bort).


Scale is a central part of AWS’s pitch. Each UltraServer can host 144 Trainium3 chips, and AWS says thousands of UltraServers can be linked into clusters totaling up to 1 million Trainium3 chips, roughly ten times the scale supported by the previous generation (Bort). That kind of clustering matters because today’s frontier models often require massive, tightly connected fleets of processors to train efficiently. AWS is also emphasizing efficiency, claiming Trainium3 systems are 40% more energy-efficient than the earlier generation, a notable point as cloud providers face rising scrutiny over the electricity demands of rapidly expanding data centers (Bort).



AWS argues these gains translate directly into customer savings. The company says early users, including Anthropic (in which Amazon is also an investor), Japan’s large-language-model developer Karakuri, and companies such as Splash Music and Decart, have already used Trainium3 systems to significantly reduce inference costs (Bort). Separately, Amazon said organizations including Anthropic, Karakuri, Metagenomi, NetoAI, Ricoh, and Splash Music have cut training costs by up to 50% compared with alternatives, underscoring AWS’s broader message that custom silicon is a path to lower operational costs for enterprises building and running AI (“Amazon Launches New AI Chip While Teasing Future Tech Collaboration With Nvidia”). AWS also pointed to its own platform adoption, noting that Amazon Bedrock, its managed service for foundation models, is already serving production workloads on Trainium3 as a signal of readiness for enterprise-scale deployment (Bort).


The timing is strategic. Wall Street continues to debate how effectively Amazon can win the next wave of AI-driven cloud contracts, even as AWS remains the largest cloud provider while facing intense competitive pressure from Microsoft and Google. That pressure is heightened by momentum around Google’s Gemini 3 model and broader interest in custom chips as potential challengers to Nvidia’s dominance in AI processors. Amazon CEO Andy Jassy has also highlighted the financial stakes, calling Trainium a multibillion-dollar business growing 150% quarter over quarter, and Amazon shares ticked up after the Trainium3 announcement (Bort).



Andy Jassy, President and CEO of Amazon


At the same time, AWS is signaling it is not taking an either-or approach with Nvidia. Alongside Trainium3, AWS previewed Trainium4, saying it is already in development and will support Nvidia’s NVLink Fusion, a high-speed interconnect designed for chip-to-chip communication. AWS says this will allow Trainium4-powered systems to interoperate with Nvidia GPUs while still leveraging Amazon’s lower-cost server rack technology, potentially making it easier to attract AI software stacks built with Nvidia’s CUDA ecosystem in mind (Bort). Even with no timeline announced for Trainium4, the roadmap suggests AWS wants to compete on price-performance while staying compatible with the dominant Nvidia-centered AI world, a combination that could matter as enterprises look for flexibility without betting everything on a single chip supplier (Yahoo Tech).


This article was written by the Covertly.AI team. Covertly.AI is a secure, anonymous AI chat that protects your privacy. Connect to advanced AI models without tracking, logging, or exposure of your data. Whether you’re an individual who values privacy or a business seeking enterprise-grade data protection, Covertly.AI helps you stay secure and anonymous when using AI. With Covertly.AI, you get seamless access to all popular large language models without compromising your identity or data privacy.


Try Covertly.AI today for free at www.covertly.ai, or contact us to learn more about custom privacy and security solutions for your business.  


Works Cited


Bort, Julie. “Amazon releases an impressive new AI chip and teases an Nvidia-friendly roadmap.” TechCrunch, 2 Dec. 2025, https://techcrunch.com/2025/12/02/amazon-releases-an-impressive-new-ai-chip-and-teases-a-nvidia-friendly-roadmap/.


“Amazon Launches New AI Chip While Teasing Future Tech Collaboration With Nvidia.” Investors.com, 2 Dec. 2025, https://www.investors.com/news/technology/amazon-stock-ai-chip-trainium3-nvidia/.


“Amazon releases an impressive new AI chip and teases an Nvidia-friendly roadmap.” Yahoo Tech, 2 Dec. 2025, https://tech.yahoo.com/ai/articles/amazon-releases-impressive-ai-chip-160000779.html.


HPCwire. “AWS Brings the Trainium3 Chip to Market with New EC2 UltraServers.” HPCwire, 2 Dec. 2025, https://www.hpcwire.com/2025/12/02/aws-brings-the-trainium3-chip-to-market-with-new-ec2-ultraservers/.


Bloomberg News. “Amazon CEO Andy Jassy’s Pay Package Was $212 Million in 2021.” Bloomberg, 1 Apr. 2022, https://www.bloomberg.com/news/articles/2022-04-01/amazon-ceo-andy-jassy-s-pay-package-was-212-million-in-2021. 


Data Center Dynamics. “AWS makes Trainium3 UltraServers generally available.” Data Center Dynamics, 2 Dec. 2025, https://www.datacenterdynamics.com/en/news/aws-makes-trainium3-ultraservers-generally-available/.

