Articles

Samsung and SK Hynix to Supply Chips for OpenAI’s $500 Billion Stargate Project

Samsung Electronics and SK Hynix, South Korea’s top semiconductor manufacturers, have signed letters of intent to supply memory chips for OpenAI’s massive Stargate project, marking a major step in Seoul’s growing role in global artificial intelligence infrastructure.

As part of the deal, OpenAI will collaborate with both companies to build two new AI data centers in South Korea, branded as “Korean-style Stargate,” aligning with President Lee Jae Myung’s goal of turning the country into an AI innovation hub in Asia. The decision leverages South Korea’s strong industrial base and its status as the world’s second-largest ChatGPT subscription market after the United States.

The agreements were announced on Wednesday following a high-profile meeting in Seoul between OpenAI CEO Sam Altman, President Lee Jae Myung, and the chairmen of Samsung Electronics and SK Hynix.

The Stargate project, unveiled by U.S. President Donald Trump in January, aims to invest $500 billion into developing next-generation AI infrastructure with global partners such as SoftBank, Oracle, and now the South Korean chip giants. The initiative seeks to secure the computing capacity needed to sustain AI’s rapid growth and maintain U.S. leadership in the field.

South Korea’s presidential adviser Kim Yong-beom revealed that OpenAI plans to order 900,000 semiconductor wafers by 2029 and establish joint ventures with Samsung and SK Hynix to operate two 20-megawatt-capacity data centers domestically.

“A significant part of the Stargate project would be impossible without memory chips from the two companies,” said Kim.

He added that South Korea may also participate in financing the project.

Altman, in his remarks, emphasized the strategic importance of Korea:

“Korea has an industrial base like nowhere else in the world that is critical for the development of AI. We’re very excited to build Stargate Korea with Samsung and Hynix to support the sovereign AI needs of the country.”

Together, Samsung and SK Hynix control about 70% of the global DRAM market and nearly 80% of the HBM (High Bandwidth Memory) market. HBM technology, introduced in 2013, stacks chips vertically to save space, boost performance, and reduce power consumption, making it vital for AI data processing.

Analysts estimate that 900,000 wafers of advanced DRAM could be worth more than 100 trillion won ($70 billion), though prices may fluctuate depending on market conditions.
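As a rough sanity check on that estimate, the implied per-wafer value and exchange rate follow from simple division. This is a minimal sketch using only the figures quoted above; the won/dollar rate is inferred from those figures, not stated in the article:

```python
# Sanity check of the analyst estimate quoted above.
# All inputs are from the article; the exchange rate is derived, not quoted.
total_won = 100e12   # 100 trillion won (analyst estimate)
total_usd = 70e9     # ~$70 billion equivalent
wafers = 900_000     # wafers OpenAI reportedly plans to order by 2029

implied_fx = total_won / total_usd    # implied won per U.S. dollar (~1,429)
per_wafer_usd = total_usd / wafers    # implied average value per wafer (~$77,800)

print(f"Implied exchange rate: {implied_fx:,.0f} won/USD")
print(f"Implied value per wafer: ${per_wafer_usd:,.0f}")
```

This is purely an order-of-magnitude check; actual advanced-DRAM wafer values vary with product mix and market prices, as the article itself cautions.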

In addition to the memory supply deals:

  • Samsung SDS, an IT services affiliate, signed a partnership with OpenAI to develop and operate AI data centers under the Stargate framework.

  • Samsung Heavy Industries and Samsung C&T will collaborate on floating offshore data centers, designed to reduce cooling costs and carbon emissions.

Meanwhile, Google has also been in talks with several South Korean companies to explore potential AI collaborations. In June, SK Group announced a 7 trillion won investment, including $4 billion from Amazon Web Services, to build another major data center in the country.

Despite optimism about AI’s transformative potential, some investors remain cautious, citing the risk of a tech infrastructure bubble as companies rush to build large-scale data facilities.

The Stargate project, delayed earlier by prolonged negotiations and site selection, is now poised to gain new momentum through this South Korea partnership, reinforcing the nation’s position at the heart of the global AI supply chain.

SK Hynix Prepares HBM4 Production to Defend Market Lead Over Rivals

SK Hynix (000660.KS) announced on Friday that it has completed internal certification of its HBM4 (high-bandwidth memory 4) chips and established a production system, positioning itself to maintain its dominance in the advanced memory market.

Key Developments

  • In March 2025, SK Hynix shipped 12-layer HBM4 chip samples to customers.

  • The company aims to complete mass-production preparations for these chips in the second half of 2025.

  • Shares rose 7% to a record high of 328,500 won ($236.71), outperforming the benchmark KOSPI’s 1.5% gain.

Market Context

  • HBM technology: First introduced in 2013, HBM stacks DRAM vertically to save space, cut power use, and process vast data volumes required by AI workloads.

  • Market share: SK Hynix is projected to hold about 60% of the HBM market in 2026, down slightly from its current 66%, according to Meritz Securities.

  • Customers: Nvidia remains SK Hynix’s largest client; rivals Samsung Electronics and Micron supply Nvidia with HBM in smaller volumes.

Rival Strategies

  • Samsung Electronics: Plans to use a 1c-nanometer node for HBM4, compared to SK Hynix’s 1b-nanometer process, signaling a push to catch up despite a weaker track record. Samsung has already provided HBM4 samples to customers and plans to begin supply in 2026.

  • Micron: Competing with custom-built logic dies (“base dies”) that make it harder for customers to switch suppliers.

Industry Impact

  • SK Hynix’s first-mover advantage in HBM4 is expected to secure early contracts with major AI players like Nvidia.

  • Analysts note that customer-specific base dies mark a technological shift that could lock buyers into long-term supplier relationships.

Market Performance YTD

  • SK Hynix: +88.9%

  • Samsung Electronics: +41.7%

  • Micron (Nasdaq): +78.9%

  • KOSPI benchmark: +41.5%

Micron Expands US Investment by $30 Billion Amid Trump’s Onshoring Push

Micron Technology announced on Thursday a significant expansion of its U.S. investment plans, adding $30 billion to its existing commitments as President Donald Trump intensifies efforts to bring semiconductor manufacturing back to American soil. The memory chip maker now projects total investments of $200 billion, up from previous plans of approximately $125 billion.

The new funding will support the construction of a second cutting-edge memory fabrication facility in Boise, Idaho, and the expansion of its manufacturing site in Manassas, Virginia. “These investments are designed to allow Micron to meet expected market demand, maintain share and support Micron’s goal of producing 40% of its DRAM in the U.S.,” the company stated.

Micron’s DRAM chips are widely used in personal computers, automotive systems, industrial equipment, wireless communications, and artificial intelligence. The company’s High-Bandwidth Memory (HBM) products are seen as essential for powering next-generation AI models. About $50 billion of Micron’s total investment will be dedicated to research and development.

President Trump’s administration has pushed hard for semiconductor onshoring, with Trump threatening new tariffs on chip imports and reconsidering subsidies granted under former President Joe Biden. In December, Micron secured nearly $6.2 billion in government subsidies under the 2022 CHIPS and Science Act, Biden’s $52.7 billion semiconductor funding program. Trump’s administration is now renegotiating some of those grants, according to Commerce Secretary Howard Lutnick.

The expansion aligns with broader trends in the U.S. semiconductor industry. Nvidia, a key customer of Micron, announced plans in April to build AI servers worth up to $500 billion in the U.S. over the next four years, in partnership with firms such as Taiwan’s TSMC. “Micron’s investment in advanced memory manufacturing and HBM capabilities in the U.S., with support from (the) Trump administration, is an important step forward for the AI ecosystem,” said Nvidia CEO Jensen Huang.

Micron also finalized a $275 million direct funding award under the CHIPS Act to further support its Manassas facility expansion.