Articles

Nvidia Warns U.S. GAIN AI Act Could Harm Competition, Echoes AI Diffusion Rule

Nvidia criticized the proposed GAIN AI Act on Friday, warning that it would restrict global competition and hurt the U.S. economy much like the AI Diffusion Rule, which limited the export of high-performance chips.

The Guaranteeing Access and Innovation for National Artificial Intelligence Act, introduced as part of the National Defense Authorization Act, would require AI chipmakers to prioritize domestic orders before fulfilling foreign contracts. Exporters would also need licenses to ship chips above certain performance thresholds, specifically processors with a total processing performance (TPP) rating of 4,800 or higher.
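The 4,800 figure appears to correspond to the "total processing performance" (TPP) metric used in existing U.S. chip export controls, conventionally computed as 2 × MAC operations per second (in trillions) × operand bit length. The sketch below assumes that definition; the chip spec in the example is an illustrative approximation, not an official figure.

```python
def tpp(mac_tops: float, bit_length: int) -> float:
    """Total processing performance: 2 x MAC ops/sec (in trillions) x bit length."""
    return 2 * mac_tops * bit_length


def needs_license(mac_tops: float, bit_length: int, threshold: float = 4800) -> bool:
    """True if a chip's TPP meets or exceeds the licensing threshold."""
    return tpp(mac_tops, bit_length) >= threshold


# Example: a chip performing ~156 trillion 16-bit MAC operations per second
print(tpp(156, 16))            # 4992
print(needs_license(156, 16))  # True -- above the 4,800 threshold
```

Under this formula, even a modest accelerator running wide (16-bit) operations can cross the line, which is why Nvidia argues the bill would sweep in mainstream computing chips.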

In a statement, Nvidia argued the bill targets a non-existent problem:

“We never deprive American customers in order to serve the rest of the world. In trying to solve a problem that does not exist, the proposed bill would restrict competition worldwide in any industry that uses mainstream computing chips.”

The Act mirrors the AI Diffusion Rule enacted under President Joe Biden, which rationed computing capacity among allies while cutting off rivals like China. Both measures reflect Washington’s effort to secure U.S. access to advanced silicon and limit China’s AI capabilities, particularly amid concerns about its military applications.

The debate comes just weeks after President Donald Trump struck a deal with Nvidia allowing the company to resume certain AI chip exports to China in exchange for the U.S. government receiving a cut of sales—an unprecedented arrangement underscoring the geopolitical stakes around advanced semiconductors.

If enacted, the GAIN AI Act could reshape the global AI hardware supply chain, tightening U.S. control over who gets access to the most powerful chips.

OpenAI’s Cash Burn Projected to Hit $115B by 2029 Amid Chip, Data Center Push

OpenAI has revised its financial outlook sharply upward, projecting it will burn through $115 billion by 2029, according to The Information. The new figure is about $80 billion higher than its earlier estimate, reflecting the surging costs of powering ChatGPT and other AI models.

The report says OpenAI expects to burn through more than $8 billion in 2025 alone, roughly $1.5 billion more than it forecast earlier this year. The company anticipates that annual burn will balloon to $17 billion next year, rising to $35 billion in 2027 and $45 billion in 2028.

To rein in costs, OpenAI is pursuing vertical integration—developing its own AI server chips and data center infrastructure. Its first in-house chip, being developed in partnership with Broadcom, is expected in 2026 and will be used internally. On the infrastructure side, OpenAI has struck major agreements, including:

  • A 4.5 GW data center expansion with Oracle announced in July.

  • The Stargate project, a planned $500 billion, 10 GW buildout backed by SoftBank.

  • Expanded computing capacity through Google Cloud.

The staggering burn rate underscores the immense capital intensity of generative AI, where costs for cloud computing, GPUs, and electricity are skyrocketing. At the same time, it highlights OpenAI’s strategy to reduce reliance on external providers like Nvidia and Amazon Web Services by building a proprietary AI stack—from chips to data centers.

Broadcom Soars on $10B AI Chip Deal, Likely With OpenAI

Broadcom shares surged 15% Friday after unveiling a $10 billion AI chip order from a new, unnamed customer—an announcement that cements its role as a key custom chip supplier in the race to expand generative AI infrastructure. The blockbuster order immediately sparked speculation that the buyer is OpenAI, with analysts at J.P. Morgan, Bernstein, and Morgan Stanley pointing to the timing and scale of the deal.

If confirmed, the partnership would mark OpenAI’s biggest move yet toward developing its own in-house processors, reducing reliance on Nvidia and AMD, whose stock prices dipped 2% and 5% respectively after Broadcom’s news. Reuters previously reported that OpenAI had been working with Broadcom on a custom chip project.

The deal highlights Big Tech’s broader trend of diversifying away from Nvidia’s costly, supply-constrained GPUs. Microsoft, Amazon, Google, and Meta are already designing their own silicon. Broadcom, which supplies custom AI chips to Google and Meta, now appears positioned to capture even more of the rapidly expanding market.

The rally added more than $200 billion to Broadcom’s valuation, boosting its market cap above $1.44 trillion. Analysts now forecast Broadcom’s AI revenue could surpass $40 billion in fiscal 2026, far above last quarter’s $30 billion projection.

Adding to investor optimism, longtime CEO Hock Tan confirmed he would remain in charge for at least another five years. Under his leadership, Broadcom has transformed into a central player in the global AI supply chain.