Articles

SK Hynix Prepares HBM4 Production to Defend Market Lead Over Rivals

SK Hynix (000660.KS) announced on Friday that it has completed internal certification of its HBM4 (high-bandwidth memory 4) chips and established a production system, positioning itself to maintain its dominance in the advanced memory market.

Key Developments

  • In March 2025, SK Hynix shipped 12-layer HBM4 chip samples to customers.

  • The company aims to complete mass production preparations for these chips within H2 2025.

  • Shares rose 7% to a record high of 328,500 won ($236.71), outperforming the benchmark KOSPI’s 1.5% gain.

Market Context

  • HBM technology: First introduced in 2013, HBM stacks DRAM vertically to save space, cut power use, and process vast data volumes required by AI workloads.

  • Market share: SK Hynix is projected to hold about 60% of the HBM market in 2026, down slightly from its current 66%, according to Meritz Securities.

  • Customers: Nvidia remains its largest client, though Samsung Electronics and Micron supply smaller volumes.

Rival Strategies

  • Samsung Electronics: Plans to use a 1c-nanometer node for HBM4, compared to SK Hynix’s 1b-nanometer process, signaling a push to catch up despite a weaker track record. Samsung already provided HBM4 samples to customers and plans to start supply in 2026.

  • Micron: Competing with custom-built logic dies (“base dies”) that make it harder for customers to switch suppliers.

Industry Impact

  • SK Hynix’s first-mover advantage in HBM4 is expected to secure early contracts with major AI players like Nvidia.

  • Analysts note that customer-specific base dies mark a technological shift that could lock buyers into long-term supplier relationships.

Market Performance YTD

  • SK Hynix: +88.9%

  • Samsung Electronics: +41.7%

  • Micron (Nasdaq): +78.9%

  • KOSPI benchmark: +41.5%

Alibaba and Baidu Turn to In-House Chips for AI Training Amid U.S. Restrictions

Alibaba and Baidu have begun using their own internally designed chips to train AI models, partly replacing Nvidia’s processors, according to a report from The Information. The move signals a major shift in China’s AI development strategy, as U.S. export controls continue to restrict access to advanced American-made semiconductors.

Key Developments

  • Alibaba has used its homegrown chips since early 2025 to train smaller AI models.

  • Baidu is testing its Kunlun P800 chip to train new versions of its Ernie AI model.

  • Both companies still rely on Nvidia for their most advanced models but are working to reduce dependence.

Impact on Nvidia

Nvidia remains dominant in AI training hardware, but China accounts for a large share of its business. The firm’s most powerful U.S.-approved chip for China, the H20, lags behind the H100 and Blackwell series — but still outperforms most Chinese alternatives.

However, employees cited by The Information said Alibaba’s latest AI chip matches the performance of Nvidia’s H20, narrowing the gap between U.S. and Chinese hardware.

An Nvidia spokesperson responded: “The competition has undeniably arrived … We’ll continue to work to earn the trust and support of mainstream developers everywhere.”

Geopolitical Pressure

  • U.S. export restrictions have pushed Chinese companies to accelerate domestic chip design.

  • Beijing has urged firms to rely on home-grown semiconductor technology as part of its strategic autonomy push.

  • Nvidia CEO Jensen Huang recently said talks with the White House over permission to sell a less advanced next-gen chip to China will take time.

According to the report, Nvidia has agreed to give the Trump administration 15% of its revenue from H20 chip sales in China in exchange for continued export licenses.

The Bigger Picture

China’s pivot toward domestic AI chips marks both a risk to Nvidia’s China revenues and a milestone for Chinese chipmakers, which are beginning to close the performance gap under intense geopolitical and economic pressure.

Nvidia Unveils “Rubin CPX” AI Chips for Video and Software Generation

Nvidia (NVDA.O) announced plans to launch a new AI chip, dubbed Rubin CPX, by the end of next year, targeting highly complex workloads such as video generation and AI-assisted software coding. The chip will be built on Nvidia’s upcoming Rubin architecture, the successor to its current Blackwell technology.

Why It Matters

  • AI systems are rapidly evolving, with tasks like video generation and “vibe coding” (AI-assisted software creation) pushing hardware to new limits.

  • Processing one hour of video can require up to 1 million tokens, a massive challenge for current GPUs.

  • Rubin CPX will integrate video decoding, encoding, and inference into a single system, making processing faster and more efficient.

Economic Angle

  • Nvidia estimates a $100 million investment in Rubin CPX systems could generate $5 billion in token revenue, a roughly 50-fold return.

  • Wall Street is closely watching the ability of AI hardware firms to turn capital spending into measurable returns.

Market Impact

  • Nvidia already dominates the AI chip market, with its high-end processors fueling the latest wave of generative AI.

  • The company’s move reflects both its defensive strategy against rivals and its offensive push to expand AI capabilities beyond text and images into full-scale video and software generation.

The Bigger Picture

  • Nvidia’s rise has made it the world’s most valuable company, but competition in AI infrastructure is intensifying.

  • With Rubin CPX, Nvidia is betting that integrated, video-ready AI chips will anchor the next phase of AI growth — and cement its lead in the sector.