Articles

Nvidia’s New AI Chips Slash Training Times for Massive AI Models

Nvidia’s latest generation of AI chips is making significant advances in training some of the world’s largest artificial intelligence systems, according to new benchmark data released on Wednesday by MLCommons, a nonprofit organization that tracks AI system performance.

The results show a dramatic drop in the number of chips required to train large language models (LLMs), highlighting Nvidia’s growing technological lead in this critical area of AI development. While much of the financial market’s current focus is on the booming sector of AI inference—where AI models answer user queries—training remains a core competitive battleground, especially for developing next-generation models with trillions of parameters.

Blackwell Chips Outperform Previous Generations

Nvidia’s new Blackwell chips demonstrated superior performance over its previous Hopper generation. In tests involving Meta Platforms’ open-source Llama 3.1 405B model, which is complex enough to simulate some of the most demanding AI training workloads, Nvidia’s Blackwell chips completed training tasks with more than double the speed per chip compared to Hopper.

In one benchmark, a system using 2,496 Blackwell chips completed the training run in just 27 minutes. In earlier tests, Hopper-based systems needed more than three times as many chips to post faster times, meaning their edge came from sheer scale rather than per-chip efficiency.

Nvidia and its partners were the only ones to submit data for models of this size, clearly demonstrating the company's leadership in training capabilities for multi-trillion-parameter models.

Changing Industry Trends in AI Training

Chetan Kapoor, chief product officer of CoreWeave, which collaborated with Nvidia on the results, noted that AI companies are moving away from building vast, homogenous data centers with 100,000 or more identical chips. Instead, they are increasingly assembling smaller, specialized subsystems that handle different aspects of the training process. This modular approach allows companies to speed up training times and manage extremely large model sizes more efficiently.

“Using a methodology like that, they’re able to continue to accelerate or reduce the time to train some of these crazy, multi-trillion parameter model sizes,” Kapoor explained at a press briefing.

Global Competition Also Heating Up

While Nvidia maintains a dominant position, competitors around the world are also pushing for breakthroughs. For example, China’s DeepSeek has recently claimed it can create competitive chatbots while using far fewer chips than many U.S. rivals, adding to the growing international race for AI supremacy.

MLCommons’ report also included results from Advanced Micro Devices (AMD) and others, though Nvidia’s Blackwell system stood out in the training category.

U.S. Nears Deal to Allow UAE Import of 500,000 Nvidia AI Chips Annually Starting 2025

The United States is moving toward a landmark agreement with the United Arab Emirates (UAE) that would allow the import of 500,000 of Nvidia’s most advanced AI chips per year, starting in 2025, according to two sources familiar with the matter. The draft deal, still under negotiation, could significantly boost the UAE’s ambitions to become a global AI hub and represents a strategic shift in U.S. technology export policy.

Under the current version of the agreement:

  • 100,000 chips per year (20%) would be allocated to G42, a major UAE tech firm backed by Abu Dhabi’s sovereign wealth fund Mubadala and chaired by national security adviser Sheikh Tahnoon bin Zayed Al Nahyan.

  • The remaining 400,000 chips would go to U.S. tech giants like Microsoft and Oracle, which are expected to build or expand data centers in the UAE.

The deal could triple or quadruple the AI computing power previously accessible to the UAE under Biden-era restrictions. However, one source noted that the agreement has encountered growing opposition in Washington in recent days, particularly over concerns the chips might eventually benefit China or other adversarial actors.

Strategic and Political Implications:

  • The deal would elevate the Gulf region, especially the UAE, as a third major AI power center alongside the U.S. and China.

  • The agreement reportedly includes a reciprocal clause: for every AI facility G42 builds in the UAE, it must construct a similar one in the U.S., promoting bilateral infrastructure investment.

  • The definition of what constitutes an “advanced AI chip” (e.g., Nvidia’s Blackwell or future Rubin GPUs) will be established later by a dedicated working group, which will also set security parameters.

Trump and Gulf AI Expansion:

Coinciding with the deal, U.S. President Donald Trump, during his tour of the Gulf this week, announced $600 billion in tech commitments from Saudi Arabia, including chip deals with Nvidia, AMD, and Qualcomm. The Trump administration also plans to rescind Biden-era AI chip export restrictions, accelerating tech collaboration with Gulf nations.

Nvidia, G42, the White House, and the U.S. Commerce Department all declined to comment publicly. However, if finalized, the deal would mark one of the most significant U.S. AI technology transfers to the Middle East to date.

Nvidia to Invest Hundreds of Billions in U.S. Chip Production Over Four Years

Nvidia (NVDA.O) plans to invest hundreds of billions of dollars in U.S.-made chips and electronics over the next four years, CEO Jensen Huang told the Financial Times. The company expects to spend around $500 billion on electronics during this period, with a substantial portion allocated to domestic manufacturing.

Huang emphasized that the U.S. AI industry could expand more rapidly with support from government policies. His comments come as Nvidia seeks to address investor concerns about demand for its high-cost AI chips, especially following the emergence of China’s DeepSeek chatbot as a potential competitor.

While Nvidia declined to comment on the FT report, Huang stated that the company can now manufacture its latest systems in the U.S. through key Taiwanese suppliers: chipmaker TSMC (2330.TW) and electronics manufacturer Foxconn (2317.TW). He also noted an increasing competitive threat from China's Huawei.

Huang highlighted that TSMC’s U.S. investments significantly strengthen Nvidia’s supply chain resilience. Earlier, at Nvidia’s developer conference in California, he told analysts that orders for 3.6 million Blackwell AI chips from four major cloud firms likely underestimate actual demand, as they do not account for customers such as Meta Platforms (META.O), smaller cloud providers, and startups.