Articles

Nvidia Shares Slip on Outlook

Nvidia shares declined despite strong earnings as investors shifted attention toward long-term returns and capital allocation.

Market participants remain cautious about the company’s continued investment in expanding the artificial intelligence ecosystem rather than prioritizing shareholder distributions.

The reaction reflects broader concerns about future growth sustainability as competitors advance new technologies and major cloud firms explore custom chip solutions.

While demand for AI infrastructure remains high, expectations around profitability and return on investment are becoming more prominent in market discussions.

Leadership reaffirmed its focus on reinvesting in innovation to support the evolving computing landscape.

The development underscores growing investor scrutiny of strategic priorities within the semiconductor sector.

Broadcom Targets 3D Chip Sales

Broadcom expects to sell at least one million advanced stacked chips by 2027, signaling a major step forward in its AI hardware strategy.

The company’s technology combines multiple silicon layers into a single integrated unit, improving performance and energy efficiency for high-demand computing tasks.

Early engineering samples are already being tested by partners, with broader production planned in the coming years. The design enables greater data flow between components, supporting increasingly complex AI workloads.

Broadcom’s approach also allows flexibility in manufacturing processes, helping customers tailor chip performance to specific needs.

The initiative is expected to open a substantial new revenue stream while strengthening the company’s position in the competitive AI semiconductor landscape.

Samsung Begins Shipping HBM4 Chips to Boost AI Position

Samsung Electronics said it has started shipping its most advanced high-bandwidth memory chips, HBM4, as it seeks to close the gap with rivals in supplying critical components for artificial intelligence accelerators.

Demand for high-performance memory has surged amid the global buildout of AI data centers. HBM chips are essential for feeding large volumes of data into AI accelerators, including those developed by Nvidia. Samsung has previously trailed competitors such as SK Hynix in delivering earlier-generation HBM products.

Samsung said its HBM4 chips deliver a sustained data-transfer speed of 11.7 gigabits per second, a 22% improvement over its HBM3E predecessor, with peak speeds reaching 13 gigabits per second to address growing data bottlenecks. The company added that it plans to provide samples of next-generation HBM4E chips in the second half of the year.
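The per-pin speeds cited above translate into per-stack bandwidth once the interface width is factored in. As a rough sketch, assuming the 2048-bit-per-stack interface defined in the JEDEC HBM4 standard (a figure not stated in the article), the quoted speeds work out to roughly 3 TB/s per stack:

```python
# Back-of-the-envelope HBM4 per-stack bandwidth estimate.
# Assumption (not from the article): HBM4 uses a 2048-bit
# interface per stack, per the JEDEC HBM4 standard.
INTERFACE_BITS = 2048  # assumed interface width per stack

def stack_bandwidth_gbps(pin_speed_gbps: float) -> float:
    """Per-stack bandwidth in GB/s, given per-pin speed in Gb/s."""
    return pin_speed_gbps * INTERFACE_BITS / 8  # bits -> bytes

print(stack_bandwidth_gbps(11.7))  # sustained speed cited by Samsung -> 2995.2
print(stack_bandwidth_gbps(13.0))  # peak speed cited by Samsung -> 3328.0
```

At the sustained 11.7 Gb/s figure this is about 2,995 GB/s per stack, and about 3,328 GB/s at the 13 Gb/s peak, which is why each generation's per-pin uplift matters so much for feeding AI accelerators.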

Shares of Samsung rose following the announcement, reflecting investor optimism about its efforts to regain momentum in the competitive AI memory market. SK Hynix, which has maintained a leading position in HBM production, has said it aims to preserve its strong market share as competition intensifies. Meanwhile, U.S.-based Micron Technology has also begun high-volume production and customer shipments of HBM4.

The rollout underscores intensifying competition among memory manufacturers as AI infrastructure expansion continues to drive demand for faster, more efficient chip technologies.