Articles

Apple Hit With Lawsuit Over Use of Books in AI Training

Apple was sued Friday in federal court in Northern California by authors who accuse the company of illegally using copyrighted books to train its “OpenELM” large language models. The proposed class action, filed by writers Grady Hendrix and Jennifer Roberson, claims Apple copied protected works without consent, credit, or compensation.

“Apple has not attempted to pay these authors for their contributions to this potentially lucrative venture,” the lawsuit alleges. Neither Apple nor the plaintiffs’ lawyers immediately commented.

The case adds Apple to the growing list of tech giants—Microsoft, Meta, and OpenAI among them—facing litigation over whether training AI on copyrighted material constitutes infringement or fair use. On the same day, Anthropic agreed to a $1.5 billion settlement with authors who accused it of training its Claude chatbot on pirated books, a deal hailed as the largest copyright recovery in history.

According to the lawsuit, Apple’s models were trained on a known dataset of pirated books, allegedly including works by Hendrix and Roberson. The case seeks damages and legal recognition that Apple must compensate authors when their intellectual property is used to build AI systems.

The dispute underscores the escalating clash between AI developers and creators, as courts weigh how copyright law applies to massive datasets powering generative AI. With multiple cases now moving forward in U.S. courts, the outcome could reshape both the AI industry and protections for authors in the digital era.

OpenAI’s Cash Burn Projected to Hit $115B by 2029 Amid Chip, Data Center Push

OpenAI has revised its financial outlook sharply upward, projecting it will burn through $115 billion by 2029, according to The Information. The new figure is about $80 billion higher than its earlier estimate, reflecting the surging costs of powering ChatGPT and other AI models.

The report says OpenAI expects to lose over $8 billion in 2025 alone, roughly $1.5 billion more than it forecast earlier this year. The company anticipates that annual burn will balloon to $17 billion in 2026, rising to $35 billion in 2027 and $45 billion in 2028.
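For scale, the itemized annual figures can be summed directly. This is only a back-of-the-envelope check against the $115 billion cumulative projection; the report does not break out the remaining portion through 2029:

```python
# Itemized annual cash-burn figures from the report (USD billions).
itemized = [8, 17, 35, 45]

stated_total = sum(itemized)
print(stated_total)        # 105

# The $115B cumulative projection implies roughly $10B more through 2029,
# which the report does not itemize.
print(115 - stated_total)  # 10
```

In other words, the four stated years account for about $105 billion of the projected $115 billion total.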

To rein in costs, OpenAI is pursuing vertical integration—developing its own AI server chips and data center infrastructure. Its first in-house chip, being developed in partnership with Broadcom, is expected in 2026 and will be used internally rather than sold. On the infrastructure side, OpenAI has struck major agreements, including:

  • A 4.5 GW data center expansion with Oracle, announced in July.

  • The Stargate project, a planned $500 billion, 10 GW buildout backed by SoftBank.

  • Expanded computing capacity through Google Cloud.

The staggering burn rate underscores the immense capital intensity of generative AI, where costs for cloud computing, GPUs, and electricity are skyrocketing. At the same time, it highlights OpenAI’s strategy to reduce reliance on external providers like Nvidia and Amazon Web Services by building a proprietary AI stack—from chips to data centers.

Broadcom Soars on $10B AI Chip Deal, Likely With OpenAI

Broadcom shares surged 15% Friday after unveiling a $10 billion AI chip order from a new, unnamed customer—an announcement that cements its role as a key custom chip supplier in the race to expand generative AI infrastructure. The blockbuster order immediately sparked speculation that the buyer is OpenAI, with analysts at J.P. Morgan, Bernstein, and Morgan Stanley pointing to the timing and scale of the deal.

If confirmed, the partnership would mark OpenAI’s biggest move yet toward developing its own in-house processors, reducing reliance on Nvidia and AMD, whose stock prices dipped 2% and 5% respectively after Broadcom’s news. Reuters previously reported that OpenAI had been working with Broadcom on a custom chip project.

The deal highlights Big Tech’s broader trend of diversifying away from Nvidia’s costly, supply-constrained GPUs. Microsoft, Amazon, Google, and Meta are already designing their own silicon. Broadcom, which already supplies custom AI chips to Google and Meta, now appears positioned to capture even more of the rapidly expanding market.

The rally added more than $200 billion to Broadcom’s valuation, boosting its market cap above $1.44 trillion. Analysts now forecast Broadcom’s AI revenue could surpass $40 billion in fiscal 2026, far above last quarter’s $30 billion projection.

Adding to investor optimism, longtime CEO Hock Tan confirmed he would remain in charge for at least another five years. Under his leadership, Broadcom has transformed into a central player in the global AI supply chain.