
China’s CXMT eyes $4.2 billion Shanghai listing to fund DRAM expansion

China’s leading dynamic random access memory (DRAM) chipmaker ChangXin Memory Technologies (CXMT) said on Tuesday it plans to raise 29.5 billion yuan ($4.22 billion) through an initial public offering in Shanghai, as it seeks to expand production and narrow the gap with global rivals.

According to its prospectus, CXMT will issue 10.6 billion shares, with proceeds earmarked primarily for upgrading production lines, improving manufacturing technologies and boosting research and development of advanced DRAM products. The listing follows the company’s unveiling last month of its latest DDR5 DRAM chips, directly challenging established competitors in South Korea and the United States.

Founded in 2016 with strong state backing, CXMT has become a cornerstone of China’s ambition to build a domestic memory chip industry. The global DRAM market is currently dominated by Samsung Electronics, SK Hynix and Micron Technology, which together control more than 90% of the market. CXMT held around a 4% global market share in the second quarter, according to data from Omdia cited in the prospectus.


The company operates three 12-inch DRAM fabrication plants in Beijing and at its headquarters in Hefei, Anhui province. After nine funding rounds, CXMT counts major Chinese companies such as Alibaba and Xiaomi among its investors, and it has developed four generations of DRAM technology.

CXMT is also investing heavily in high-bandwidth memory (HBM), a specialised form of DRAM essential for advanced processors such as Nvidia’s graphics processing units used in generative AI. The company plans to begin HBM production by the end of 2026 at a back-end packaging facility under construction in Shanghai.

Financially, CXMT expects strong growth. It projects revenue could rise by as much as 140% year-on-year in 2025, driven by higher memory prices and increased sales volumes since July. While the company has posted heavy losses in recent years, it said it could turn profitable as early as 2026, depending on wafer shipments and average selling prices. CXMT reported losses of 8.32 billion yuan in 2022, 16.3 billion yuan in 2023 and 7.1 billion yuan in 2024, and recorded a 2.3-billion-yuan loss in the first half of this year.

Micron tops forecasts with AI-fueled HBM demand, sees strong Q1 revenue

Micron Technology projected first-quarter revenue of $12.5 billion, plus or minus $300 million, well above Wall Street’s estimate of $11.94 billion, as booming demand for its high-bandwidth memory (HBM) chips drives growth amid the AI race.

AI demand supercharges Micron

  • Q4 HBM revenue hit nearly $2 billion, putting Micron on pace for ~$8B annually, CEO Sanjay Mehrotra said.

  • HBM chips, built by stacking DRAM vertically, reduce power use while enabling massive data processing — making them indispensable for training and running advanced AI models.

  • Micron is a key HBM supplier to Nvidia, whose dominance in AI accelerators makes HBM supply one of the most competitive battlegrounds in semiconductors.
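To give a rough sense of why stacked DRAM commands such demand, HBM's peak bandwidth comes from a very wide interface running at a high per-pin data rate. The sketch below uses commonly cited HBM3E figures (a 1024-bit bus at roughly 9.2 Gb/s per pin); the numbers are illustrative, not Micron-specific specifications:

```python
def peak_bandwidth_tbps(bus_width_bits: int, pin_rate_gbps: float) -> float:
    """Peak theoretical bandwidth in TB/s.

    bytes per transfer (bus width / 8) x transfers per second (pin rate).
    """
    bytes_per_transfer = bus_width_bits / 8
    return bytes_per_transfer * pin_rate_gbps * 1e9 / 1e12

# Illustrative HBM3E-class stack: 1024-bit interface, ~9.2 Gb/s per pin
print(round(peak_bandwidth_tbps(1024, 9.2), 2))  # ~1.18 TB/s per stack
```

By comparison, a conventional 64-bit DDR5 channel at similar pin speeds delivers well under a tenth of that, which is why AI accelerators pair their processors with multiple HBM stacks.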

2026 outlook already sold out

  • Micron expects to lock in deals for all 2026 HBM capacity in the coming months.

  • HBM3E pricing agreements are nearly complete; HBM4 pricing talks are ongoing.

  • “The pricing on HBM4 is actually significantly higher than the pricing on HBM3E,” said Chief Business Officer Sumit Sadana, citing tight supply and strong ROI expectations.

  • TSMC will partner with Micron to manufacture the base logic die for Micron’s HBM4E chips.

Financial performance

  • Adjusted Q4 EPS: $3.03, topping forecasts.

  • Adjusted gross margin forecast (Q1): 51.5%, far above expectations of 45.9%.

  • Analysts said stronger-than-expected pricing drove the margin boost.

U.S. policy and subsidies

  • Micron has received $6.2B under the CHIPS and Science Act, passed under former President Joe Biden.

  • Current Commerce Secretary Howard Lutnick is exploring converting subsidies into equity stakes in chipmakers, but Sadana said Micron does not expect its grant terms to change.

  • Micron recently received a disbursement after completing a milestone at its Idaho fab, Mehrotra confirmed.

Big picture

Micron is riding the wave of AI-driven chip demand, securing long-term contracts at higher prices while boosting profitability. With HBM4 set to command premium pricing, Micron is positioning itself as a critical player alongside Nvidia, Samsung, and SK Hynix in the global AI supply chain.

Micron Expands US Investment by $30 Billion Amid Trump’s Onshoring Push

Micron Technology announced on Thursday a significant expansion of its U.S. investment plans, adding $30 billion to its existing commitments as President Donald Trump intensifies efforts to bring semiconductor manufacturing back to American soil. The memory chip maker now projects total investments of $200 billion, up from previous plans of approximately $125 billion.

The new funding will support the construction of a second cutting-edge memory fabrication facility in Boise, Idaho, and the expansion of its manufacturing site in Manassas, Virginia. “These investments are designed to allow Micron to meet expected market demand, maintain share and support Micron’s goal of producing 40% of its DRAM in the U.S.,” the company stated.

Micron’s DRAM chips are widely used in personal computers, automotive systems, industrial equipment, wireless communications, and artificial intelligence. The company’s High-Bandwidth Memory (HBM) products are seen as essential for powering next-generation AI models. About $50 billion of Micron’s total investment will be dedicated to research and development.

President Trump’s administration has pushed hard for semiconductor onshoring, with Trump threatening new tariffs on chip imports and reconsidering previous subsidies granted under former President Joe Biden. In December, Micron secured nearly $6.2 billion in government subsidies through Biden’s $52.7 billion 2022 CHIPS and Science Act. Trump’s administration is now renegotiating some of those grants, according to Commerce Secretary Howard Lutnick.

The expansion aligns with broader trends in the U.S. semiconductor industry. Nvidia, a key customer of Micron, announced plans in April to build AI servers worth up to $500 billion in the U.S. over the next four years, in partnership with firms such as Taiwan’s TSMC. “Micron’s investment in advanced memory manufacturing and HBM capabilities in the U.S., with support from (the) Trump administration, is an important step forward for the AI ecosystem,” said Nvidia CEO Jensen Huang.

Micron also finalized a $275 million direct funding award under the CHIPS Act to further support its Manassas facility expansion.