Articles

CoreWeave Beats Q2 Revenue Estimates on AI Demand but Posts Larger Loss

Cloud services provider CoreWeave exceeded second-quarter revenue expectations on Tuesday, driven by strong demand for AI infrastructure, but a larger-than-expected net loss pushed its shares down 10% in after-hours trading.

REVENUE AND BACKLOG

  • Q2 revenue: $1.21 billion (est. $1.08B)

  • Revenue backlog: $30.1 billion as of June 30, up from $25.9 billion on March 31

  • Annual revenue forecast: Raised to $5.15–$5.35 billion from prior $4.9–$5.1 billion

LOSSES AND COSTS

  • Net loss: $290.5 million (est. $190.6M)

  • Operating expenses: Jumped to $1.19 billion from $317.7 million a year earlier

CEO Michael Intrator noted that the company's main challenge is securing powered shells (data center buildings with electrical capacity already in place) to support AI infrastructure at scale.

AI GROWTH AND STRATEGY
CoreWeave operates 33 AI data centers in the U.S. and Europe and provides access to Nvidia chips for enterprises training large AI models.
The company highlighted rising demand for AI inference, particularly chain-of-thought reasoning models, which significantly increase computational requirements.

M&A AND CUSTOMER CONCENTRATION

  • CoreWeave’s $9 billion all-stock acquisition of Core Scientific will secure 1.3 GW of power under contract, though some shareholders oppose the deal.

  • The company acknowledged that its reliance on large customers like OpenAI is both a strategic advantage and a potential risk.

  • Contracts with hyperscalers have been expanded to meet growing demand.

MARKET RESPONSE
Shares fell 10% after-hours to $133.71, despite nearly tripling since the March IPO. Analysts noted that strong revenue visibility is tempered by cost growth and customer concentration risks.

IBM Launches Power11 Chips and Servers to Simplify AI Deployment in Business

IBM has unveiled its latest data center innovation with the launch of the Power11 chips and accompanying server systems, targeting more energy-efficient performance and streamlined AI adoption for enterprise use. This marks IBM’s first major update to its Power chip line since 2020.

Designed to compete with Intel and AMD in data centers—especially in sectors like financial services, manufacturing, and healthcare—IBM’s Power11 systems integrate tightly coupled chips and software to enhance reliability and security.

Tom McPherson, IBM’s Power Systems general manager, highlighted the new systems’ operational resilience: available from July 25, the Power11 servers require no planned downtime for software updates, and average unplanned downtime is just over 30 seconds annually. Crucially, the systems can detect and respond to ransomware attacks within one minute.

Later this year, IBM plans to integrate the Power11 chips with Spyre, its AI accelerator chip launched last year. Unlike Nvidia’s focus on AI training, IBM’s approach centers on simplifying AI inference—the practical deployment of AI to accelerate business tasks.

McPherson explained that IBM aims to offer seamless AI inferencing capabilities that improve business processes without the high computational power needed for AI training. Early customers are already working with IBM to integrate these AI functions.

This new line reflects IBM’s strategy to provide businesses with secure, efficient, and easy-to-deploy AI solutions, emphasizing inference acceleration over raw training performance.

Dutch Chipmaker AxeleraAI Receives $66 Million EU Grant for AI Chip Development

AxeleraAI, a prominent Dutch chipmaker focused on artificial intelligence (AI), has secured a grant of up to 61.6 million euros ($66 million) to develop a new chip designed for data centres, in line with European Union efforts to strengthen its AI capabilities.

The EU’s initiative aims to close Europe’s AI competitiveness gap with the United States and China by funding domestic chipmakers and establishing publicly funded AI factories—data centres that will be accessible to European scientists, companies, and startups.

Fabrizio Del Maffeo, AxeleraAI’s CEO, expressed his pride in the grant and the opportunity to expand the company’s business. AxeleraAI, based in Eindhoven, Netherlands, won the funding from EuroHPC, the agency responsible for the EU’s supercomputer and AI factory network. The company plans to use the funds to develop a chip tailored for “inference” AI computing, a process crucial for running AI models once they have been trained.

While AxeleraAI is not aiming to challenge Nvidia’s dominance in the data centre space, particularly in training large AI models, Del Maffeo emphasized that their chip will deliver high performance for inference computing, running AI models once they have been trained and are ready for deployment.

In addition, the rise of cost-effective AI models, like China’s DeepSeek, may drive increased demand for inference computing, offering AxeleraAI a valuable market opportunity. The company’s upcoming Titania chip will be built on the open-source RISC-V standard, an alternative to proprietary architectures such as those from Intel and Arm that is gaining traction in the automotive industry and in China.

AxeleraAI’s current chip, Metis, is used in “edge AI” applications, such as analyzing CCTV footage in factories to identify safety issues. Founded in 2021, AxeleraAI has previously raised $200 million in investments, including from Samsung.