Articles

TE Connectivity to Acquire Richards Manufacturing for $2.3 Billion

TE Connectivity (TEL.N) announced it will acquire utility grid products manufacturer Richards Manufacturing Co for approximately $2.3 billion in cash, aiming to bolster its position in the electrical utilities sector amid surging power demand.

The acquisition comes as the power needs of data centers are expected to double within five years due to the rapid development and adoption of artificial intelligence. Demand is projected to rise from 176 TWh in 2023 to between 325 and 580 TWh by 2028.
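Those figures imply data-centre demand growing by roughly 1.8x to 3.3x over five years, consistent with the "expected to double" framing. A quick arithmetic check using the cited numbers:

```python
# Data-centre electricity demand figures as cited above.
demand_2023_twh = 176
low_2028_twh, high_2028_twh = 325, 580

# Growth multiples relative to the 2023 baseline.
low_factor = low_2028_twh / demand_2023_twh    # ~1.85x
high_factor = high_2028_twh / demand_2023_twh  # ~3.30x

print(f"Projected growth by 2028: {low_factor:.2f}x to {high_factor:.2f}x")
```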

President Donald Trump recently supported a $500 billion investment pledge by tech companies and investors to build infrastructure for AI facilities, highlighting the sector’s growing energy demands. Additionally, aging grid infrastructure, increased extreme weather events, and a shift toward greener energy sources are driving the need for grid upgrades and more resilient systems.

TE Connectivity CEO Terrence Curtin said, “The acquisition of Richards Manufacturing aligns with our strategy and positions us to further capitalize on an accelerating grid replacement and upgrade cycle in North America.”

Following the news, TE Connectivity’s shares rose about 4% in pre-market trading.

The Galway, Ireland-based company will acquire Richards Manufacturing from funds managed by Oaktree Capital Management, L.P., and the Bier family, long-time owners of the business. The deal is expected to close in June, financed through a combination of cash and new debt.

Once completed, Richards Manufacturing will become part of TE’s Industrial Solutions segment, contributing an estimated $400 million to annual sales. The acquisition is expected to enhance TE’s sales growth and adjusted operating margins, with projected accretion of about 10 cents to adjusted EPS in the first full year.

Goldman Sachs & Co. LLC is serving as TE Connectivity’s financial advisor, with Davis Polk & Wardwell LLP providing legal counsel.

Connection Challenge Could Hamper France’s AI Hub Ambitions Despite Nuclear Power Advantage

France’s bid to become a global leader in artificial intelligence (AI) is facing potential setbacks due to delays in connecting power-hungry data centres to the national electricity grid. Despite boasting abundant nuclear energy—critical to attracting AI investments—the time it takes to establish the necessary infrastructure could slow down the country’s growth in the sector.

Macron’s Vision and Investments:

At a recent AI summit, French President Emmanuel Macron highlighted the country's reliance on clean and reliable nuclear power as a key asset for AI development. With over 100 billion euros ($103.26 billion) in AI investment pledges, France is positioning itself as a major player in Europe's race to catch up with the U.S. The pledges include a $10 billion supercomputer facility by UK-based Fluidstack, which will require 1 gigawatt (GW) of power—equivalent to the output of one of France's smaller nuclear reactors.

Brookfield, a global asset manager, also committed to spending 20 billion euros to develop AI infrastructure, including data centres. With 57 nuclear reactors, France produces over two-thirds of its electricity from nuclear power, and last year, it exported a record amount of energy, mostly to Italy.

Grid Connection Bottleneck:

The challenge lies not in generating the electricity but in delivering it to the data centres. France's energy grid, though robust, may struggle to keep up with the surge in demand that AI data centres will bring. Experts warn that, while a data centre can be built in under a year, constructing the transmission lines needed to supply it with power could take up to five years.

Fatih Birol, executive director of the International Energy Agency, highlighted the issue at the AI summit, noting that countries with sustainable and affordable electricity supplies have a competitive edge. However, the slow pace of building the required transmission infrastructure presents a bottleneck for France’s ambitious plans.

Efforts to Expedite Construction:

Construction and permitting procedures in Europe are notably slower than in the U.S., as Anj Midha, a general partner at Andreessen Horowitz, pointed out. In response, state-owned utility EDF has identified four sites for data centres on its land, with existing grid connections and 2 GW of power already available. These sites are expected to reduce project timelines by several years, but challenges remain.

EDF is also in talks with companies to power additional 1 GW data centre projects, though the completion of these sites may still be delayed by the need for public consultation and the high costs associated with constructing new high-voltage power lines.

OpenAI Set to Finalize First Custom Chip Design This Year

OpenAI is advancing toward its goal of reducing its reliance on Nvidia by finalizing the design of its first in-house artificial intelligence (AI) chip, sources familiar with the matter told Reuters. The company plans to send its first custom-designed chip for fabrication at Taiwan Semiconductor Manufacturing Co. (TSMC) in the coming months, marking a significant step toward mass production, which is expected to begin in 2026.

The process, referred to as “taping out,” involves sending the chip design to a factory for production. While the initial tape-out can cost tens of millions of dollars and take six months to complete, there is no guarantee the first version of the chip will work. If issues arise, OpenAI would need to diagnose them and repeat the tape-out process, delaying production further.

OpenAI views this chip development as a strategic move to enhance its negotiating position with other chip suppliers. The company’s engineers plan to build upon this initial design, creating increasingly advanced processors with broader capabilities for future iterations. If the first tape-out is successful, OpenAI aims to test its custom AI chip as a potential alternative to Nvidia’s chips later this year.

OpenAI’s in-house team, led by Richard Ho, who joined from Google’s custom AI chip program, is collaborating with Broadcom to design the chip. Though the team is smaller than those at tech giants like Google and Amazon, its chip development is progressing at a remarkable pace, outpacing the years-long efforts of other companies in the space.

Currently, Nvidia dominates the AI chip market with an 80% share, but rising costs and reliance on a single supplier have prompted major companies, including OpenAI, to explore alternatives. OpenAI’s custom chip is designed to train and run AI models and will initially be deployed on a limited scale. The chip will be manufactured using TSMC’s advanced 3-nanometer process technology and will feature a systolic array architecture, high-bandwidth memory (HBM), and extensive networking capabilities—similar to Nvidia’s chips.
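As a conceptual illustration only (OpenAI's actual design is unpublished), a systolic array performs matrix multiplication by streaming operands through a grid of multiply-accumulate cells, with each cell reusing data passed from its neighbors instead of fetching from memory. A minimal software sketch of the idea:

```python
def systolic_matmul(A, B):
    """Simulate an output-stationary systolic array computing C = A @ B
    for square n x n matrices. Rows of A stream in from the left and
    columns of B stream in from the top, each skewed by one cycle per
    row/column, so matching operands meet in the correct cell."""
    n = len(A)
    acc = [[0] * n for _ in range(n)]    # per-cell accumulator (holds C)
    a_reg = [[0] * n for _ in range(n)]  # operands moving rightward
    b_reg = [[0] * n for _ in range(n)]  # operands moving downward
    for t in range(3 * n - 2):           # enough cycles to drain the array
        # Update cells bottom-right to top-left so each cell reads its
        # neighbors' values from the *previous* cycle.
        for i in reversed(range(n)):
            for j in reversed(range(n)):
                # Boundary cells receive skewed inputs; interior cells
                # receive whatever their left/top neighbor latched last cycle.
                a_in = a_reg[i][j - 1] if j > 0 else (
                    A[i][t - i] if 0 <= t - i < n else 0)
                b_in = b_reg[i - 1][j] if i > 0 else (
                    B[t - j][j] if 0 <= t - j < n else 0)
                acc[i][j] += a_in * b_in          # multiply-accumulate
                a_reg[i][j], b_reg[i][j] = a_in, b_in  # pass operands on
    return acc
```

In hardware, every cell performs its multiply-accumulate in parallel each clock cycle, which is why the pattern maps so well to the dense matrix work that dominates AI training and inference.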

While the first chip is expected to play a limited role within OpenAI’s infrastructure, the company plans to expand its AI chip program in the future. To match the scale of Google or Amazon’s AI chip programs, OpenAI would need to expand its engineering team significantly.