Articles

Nvidia unveils AI models for faster, cheaper weather forecasts

Nvidia has released three open-source artificial intelligence models designed to improve the speed and cost efficiency of weather forecasting. The announcement was made at the American Meteorological Society’s annual meeting, highlighting the chipmaker’s broader push to apply AI software beyond traditional computing workloads.

The new models aim to replace conventional weather simulations, which are often expensive and time-consuming to run. Nvidia said its AI-driven approach can match or exceed the accuracy of traditional methods while delivering results significantly faster and at a lower operational cost once the models are trained.

One of the key commercial use cases is expected to be in the insurance sector, where companies rely on large-scale weather simulations to assess rare but damaging events such as floods and hurricanes. Traditional forecasting requires running large ensembles of simulations, a process that can be slow and costly. Nvidia said AI removes this bottleneck by enabling massive ensembles to be processed at unprecedented speed.

The models are part of Nvidia’s Earth-2 initiative and include tools for 15-day global forecasts, short-term severe storm prediction over the United States, and systems that combine data from multiple weather sensors to improve forecasting accuracy.

Microsoft rolls out next generation of its AI chips, takes aim at Nvidia’s software

Microsoft has unveiled the second generation of its in-house artificial intelligence chip, Maia 200, alongside new software tools designed to challenge Nvidia’s dominance among AI developers. The chip is going live this week at a Microsoft data center in Iowa, with a second deployment planned in Arizona, marking a key step in the company’s effort to reduce reliance on external chip suppliers.

The Maia 200 follows Microsoft’s first Maia chip introduced in 2023 and arrives as major cloud providers increasingly develop their own AI hardware. Companies such as Google and Amazon Web Services, traditionally large Nvidia customers, are now rolling out custom chips that compete directly with Nvidia’s offerings. The shift reflects growing demand for tailored AI infrastructure optimized for large-scale cloud workloads.

Alongside the new chip, Microsoft announced a suite of software tools to support developers, including Triton, an open-source programming framework that performs similar functions to Nvidia’s widely used CUDA software. By strengthening its software ecosystem, Microsoft is targeting what many analysts view as Nvidia’s most significant competitive advantage.

The Maia 200 is manufactured by Taiwan Semiconductor Manufacturing Company using advanced 3-nanometer technology and incorporates high-bandwidth memory. Microsoft has also emphasized the use of SRAM, a fast memory type that can improve performance for AI systems handling large volumes of user requests, a design choice increasingly favored by Nvidia’s emerging competitors.

Nvidia invests $2 billion in CoreWeave to boost data center build-out

Nvidia has invested $2 billion in CoreWeave, becoming the AI infrastructure provider’s second-largest shareholder as the two companies deepen their partnership to expand data center capacity across the United States. The announcement pushed CoreWeave’s shares up 9% in premarket trading, highlighting investor confidence in the growing demand for AI-focused cloud infrastructure.

CoreWeave is part of a group of so-called neocloud companies that supply specialized hardware and computing capacity for artificial intelligence workloads. Demand for these services has surged as enterprises accelerate AI adoption. Nvidia’s new investment is expected to help CoreWeave speed up the acquisition of land and power needed to construct large-scale data centers, with the company targeting more than 5 gigawatts of AI data center capacity by 2030.

The investment was made at a purchase price of $87.20 per share, adding roughly 23 million shares and nearly doubling Nvidia’s stake in CoreWeave. Nvidia had previously held a 6.3% stake, making it the company’s third-largest shareholder. Despite scrutiny over Nvidia’s investments in AI firms, CoreWeave said the funds would be used for data center expansion, research and development, and workforce growth, rather than for purchasing Nvidia processors.
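The reported figures are internally consistent: a quick back-of-the-envelope check, using only the $2 billion total and the $87.20 per-share purchase price quoted above, arrives at the "roughly 23 million shares" cited in the article. A minimal sketch:

```python
# Sanity check of the reported CoreWeave share purchase.
# Inputs are the two figures quoted in the article; nothing else is assumed.
investment_usd = 2_000_000_000   # total Nvidia investment
price_per_share = 87.20          # purchase price per share

shares_acquired = investment_usd / price_per_share
print(f"{shares_acquired / 1e6:.1f} million shares")  # → 22.9 million shares
```

Rounded to the nearest million, that matches the article's figure of roughly 23 million shares.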

Once a cryptocurrency miner, CoreWeave has transformed its business to focus on leasing Nvidia GPUs to technology and AI companies. CoreWeave’s chief executive said the expanded collaboration reflects strong and growing demand for Nvidia’s computing platforms across the AI ecosystem.