Articles

Apple Appeals €500 Million EU Fine Over App Store Restrictions

Apple has formally appealed a €500 million ($587 million) antitrust fine imposed by the European Commission, contesting claims that it violated the Digital Markets Act (DMA). The tech giant submitted the appeal on Monday, the final day to do so, at the EU’s General Court, the bloc’s second-highest legal authority.

The Commission ruled in April 2025 that Apple had unlawfully restricted app developers from directing users to cheaper payment options outside the App Store, a practice viewed as anti-competitive under the DMA.

In a public statement, Apple argued that the decision “goes far beyond what the law requires,” adding that the imposed fine was “unprecedented” and that the Commission is now effectively dictating how Apple runs its store. Apple said it changed its policies to avoid daily fines of up to €50 million, or 5% of its average global daily revenue.

Despite modifying its App Store rules last month to comply with EU regulations, Apple insists the changes were made under protest, calling the Commission’s stance “confusing for developers and bad for users.” The company maintains that its original policies were fair and necessary for maintaining quality and user safety within the App Store ecosystem.

The European Commission has begun gathering feedback from developers to assess whether Apple’s revised App Store practices meet the obligations of the DMA. A decision on whether further changes will be required is still pending.

The case represents a significant moment in the EU’s broader campaign to rein in the influence of Big Tech, using the DMA to challenge gatekeeper platforms like Apple, Meta, Google, and Amazon. It also marks one of the first major legal battles under the DMA framework, setting a precedent for how tech firms may operate across the EU going forward.

EU’s AI Code of Practice for Firms Likely Delayed Until End of 2025

The European Commission announced on Thursday that the Code of Practice designed to help companies comply with the EU’s Artificial Intelligence Act (AI Act) may not take effect until late 2025. The code is meant to guide thousands of businesses in meeting the new AI regulations, particularly for general-purpose AI (GPAI) models such as OpenAI’s ChatGPT and the AI systems built by Google and Mistral.

Background and Delay Calls

  • The Code of Practice was originally slated for publication on May 2, 2025, but its release has been delayed.

  • Major tech companies, including Alphabet (Google), Meta, and European firms such as Mistral and ASML, alongside some EU governments, have requested postponements due to the lack of clear compliance guidelines.

  • The European AI Board is currently debating the timeline, with the end of 2025 under consideration as the new target for full implementation.

Voluntary but Important

  • Signing up for the Code is voluntary, but companies that decline to sign will forgo the legal certainty afforded to signatories.

  • The Code will clarify the expected quality standards AI service users can demand, reducing risks of misleading claims by providers, according to Nick Moës, Executive Director of AI advocacy group The Future Society.

  • The Code also provides for oversight by legally mandated authorities, which will assess the quality of AI services.

EU’s Position and Industry Reaction

  • Despite calls for delay, the Commission insists it remains committed to the AI Act’s goals of harmonized, risk-based AI regulations and market safety.

  • Critics, such as campaign group Corporate Europe Observatory, accuse Big Tech of using delay tactics to weaken crucial AI safeguards.

Enforcement Timeline

  • The AI Act’s rules on GPAI models become legally binding on August 2, 2025, but enforcement will begin only a year later, on August 2, 2026, for new models entering the market.

  • Existing AI models have until August 2, 2027, to comply fully with the regulations.

EU Faces Mounting Pressure to Delay Enforcement of AI Act as Deadline Nears

With key provisions of the EU Artificial Intelligence Act (AI Act) set to begin on August 2, major tech companies and political figures are urging the European Commission to delay enforcement. Critics say the current framework lacks sufficient guidance, placing a heavy burden on businesses—especially startups—without clear rules on how to comply.

What Happens on August 2?

Although the AI Act was passed in 2024, its rules are being phased in gradually. On August 2, some of the first obligations come into force—specifically for General Purpose AI (GPAI) models such as those developed by Google, OpenAI, Mistral, and others.

These initial provisions require AI developers to:

  • Draw up technical documentation

  • Disclose training data summaries

  • Comply with EU copyright laws

  • Conduct testing for bias, toxicity, and robustness

More rigorous rules apply to high-impact and systemic-risk models, which will need:

  • Adversarial testing

  • Incident reporting

  • Risk assessments

  • Energy efficiency disclosures

However, full enforcement—particularly penalties and oversight powers—doesn’t begin until August 2, 2026.

Why Are Companies Pushing for a Delay?

Tech companies argue that they lack clarity on how to comply with the law. A promised AI Code of Practice, meant to serve as the act’s compliance manual, was due on May 2 but has not been published. The European AI Board is now discussing pushing the guidance release to late 2025.

In an open letter, 45 European AI firms called for a two-year “clock-stop”—a suspension of the countdown to enforcement—until key standards are finalized. They also asked for simpler regulations, warning that unclear requirements could damage European innovation.

Lobbying group CCIA Europe, which represents companies like Google and Meta, said:

“A bold ‘stop-the-clock’ intervention is urgently needed to give AI developers and deployers legal certainty.”

Will the EU Postpone It?

Officially, the European Commission has not signaled a postponement. It insists that the August 2 start date for GPAI obligations stands, although the absence of finalized guidance suggests compliance expectations may be applied flexibly at first.

Some political figures—including Swedish Prime Minister Ulf Kristersson—have also expressed concern, calling the act “confusing” and backing the idea of a pause.

What Comes Next?

Even if the AI Act’s initial deadlines hold, enforcement might be soft or flexible in the early stages due to the lack of practical tools. The AI Code of Practice remains the critical next step for clarity.

Meanwhile, the tension highlights a broader EU challenge: balancing innovation with regulation, especially in fast-moving fields like artificial intelligence.