Articles

SK Hynix Soars to Record High as Big Tech AI Spending Fuels Chip Demand

SK Hynix shares surged to record levels after major U.S. technology companies signaled even stronger artificial intelligence infrastructure spending, reinforcing investor confidence that the global AI semiconductor boom — particularly for advanced memory chips — is far from slowing.

The South Korean chipmaker, a major supplier of high-bandwidth memory (HBM) used in AI servers, benefited from renewed expectations that hyperscalers including Microsoft, Meta, Alphabet, and Amazon will continue aggressively expanding data center capacity despite soaring component costs. Combined AI-related capital expenditure from major U.S. tech firms is now expected to exceed $700 billion this year, significantly increasing pressure on already constrained semiconductor supply chains.

SK Hynix’s rally also reflects its strategic advantage in memory markets critical to AI accelerators. As advanced AI workloads increasingly depend on high-performance memory, SK Hynix has emerged as one of the most direct beneficiaries of infrastructure-scale AI deployment.

The company’s outperformance relative to Samsung also highlights investor preference for firms seen as more directly leveraged to current AI demand and free of comparable labor or operational uncertainty. Samsung’s labor tensions have prompted additional caution despite broader industry strength.

Executives and central bank officials are increasingly suggesting this semiconductor cycle may differ from previous boom-bust patterns because AI demand is more structurally embedded in cloud computing, enterprise software, defense systems, and future digital infrastructure than earlier consumer-driven chip surges.

A critical factor remains supply scarcity. Big Tech executives have openly acknowledged that memory shortages and price inflation are becoming defining constraints on AI expansion. This dynamic is boosting pricing power for leading memory suppliers while reinforcing investor expectations that companies like SK Hynix may sustain elevated profitability for longer than traditional semiconductor cycles would suggest.

The broader market takeaway is clear: as AI infrastructure spending accelerates globally, memory chipmakers are becoming foundational to the next phase of technological competition.

Nvidia B300 Servers Hit $1M in China as US Curbs Tighten Supply

Nvidia’s advanced B300 AI servers are reportedly selling for nearly 7 million yuan, around $1 million, in China as stricter US export controls and anti-smuggling crackdowns sharply reduce supply. According to industry sources, prices have almost doubled from roughly 4 million yuan late last year, creating a major scarcity premium in the Chinese grey market.

The B300 server, equipped with eight B300 GPUs, costs around $550,000 in the United States, but Chinese demand for high-end AI computing has pushed prices far beyond that level. Chinese technology companies are aggressively seeking cost-efficient hardware to power AI inference and token generation, while many remain cautious about directly holding Nvidia systems due to sanctions concerns.

Reuters reports that pressure increased after US authorities prosecuted Supermicro co-founder Wally Liaw in March, disrupting key black-market supply channels. Nvidia emphasized that B300 systems are restricted from sale in China and warned that unauthorized diversion would receive no support or service from the company.

Some Chinese firms unable to afford direct purchases are instead turning to rentals, with short-term contracts reaching 190,000 yuan per month. At the same time, domestic players like Huawei are trying to capitalize on Nvidia’s restricted access, challenging Nvidia’s estimated 55% share of the Chinese AI chip market.

The surge highlights how geopolitical restrictions are reshaping China’s AI infrastructure market, driving up costs while accelerating local competition in advanced computing hardware.

Intel’s SambaNova Investment Clears U.S. Antitrust Review

Intel has secured U.S. antitrust clearance for its expanded investment in AI chip startup SambaNova, removing a potential regulatory hurdle as the semiconductor giant deepens its position in one of the industry’s fastest-growing artificial intelligence infrastructure segments.

Intel invested $35 million in SambaNova earlier this year, increasing its ownership stake to 8.2% from 6.8%, and plans an additional $15 million investment. The approval signals that U.S. regulators do not currently view the deal as posing significant competitive concerns, despite Intel CEO Lip-Bu Tan also serving as chairman of SambaNova.

The move is strategically significant as Intel seeks broader exposure to AI hardware markets beyond its traditional CPU dominance. SambaNova specializes in AI accelerators and enterprise-scale machine learning systems, placing it in direct competition with other advanced AI chipmakers operating in a rapidly expanding market shaped by surging demand for generative AI, inference, and large-scale data center compute.

For Intel, the deal may serve multiple purposes: financial upside through startup growth, strategic influence in AI infrastructure, and diversification as the company works to strengthen its broader semiconductor relevance amid fierce competition from Nvidia, AMD, and emerging AI-focused firms.

Regulatory approval also highlights how government scrutiny is increasingly focused not only on large acquisitions, but also on minority strategic investments that could affect competitive dynamics in critical technology sectors. While the current transaction passed review, Intel’s growing involvement with SambaNova may continue attracting attention as AI chip competition intensifies.

The broader implication is clear: major semiconductor players are increasingly using targeted startup investments to secure positioning in the next phase of AI compute expansion, where ownership, partnerships, and ecosystem control may prove as important as chip performance itself.