Articles

EU to make WhatsApp more responsible for tackling harmful content

The European Commission has formally designated Meta-owned WhatsApp as a “very large online platform” under the EU’s Digital Services Act, increasing its responsibility for addressing illegal and harmful content. The designation specifically applies to WhatsApp’s channels feature, not to its core private messaging service.

According to the Commission, WhatsApp channels reached an average of 51.7 million monthly active users in the European Union during the first half of 2025, exceeding the 45 million user threshold set by the DSA. Platforms above this limit are subject to stricter obligations, including enhanced risk assessments and stronger measures to limit the spread of illegal content.

The Digital Services Act requires very large platforms to invest more heavily in content moderation systems — a costly process given the scale of data involved, and one that also raises user privacy concerns. Other companies already classified in the same category include Meta’s Facebook and Instagram, Google’s YouTube, TikTok, Temu and Microsoft’s LinkedIn.

Following the designation, Meta has four months to bring WhatsApp channels into full compliance with the additional DSA requirements, setting a deadline of mid-May 2026. A WhatsApp spokesperson said the company remains committed to improving safety and integrity measures as its channels continue to grow across the EU and globally.

EU Considers Applying Tougher Content Rules to WhatsApp Under Digital Services Act

The European Union is considering making WhatsApp more accountable for tackling illegal and harmful content after the messaging platform crossed a key user threshold under the bloc’s digital regulations, a European Commission spokesperson said on Friday.

WhatsApp, owned by Meta Platforms, reported about 51.7 million average monthly active users for its WhatsApp Channels service in the European Union during the first six months of 2025. This exceeds the 45 million user threshold set by the EU’s Digital Services Act (DSA), potentially bringing the service under stricter regulatory oversight.

The DSA imposes tougher obligations on so-called “very large online platforms,” requiring them to take stronger action against illegal and harmful content. Platforms already designated under this category include Meta’s Facebook and Instagram, YouTube, TikTok, Temu and LinkedIn.

European Commission spokesperson Thomas Regnier said the Commission’s focus is on distinguishing between private messaging, which falls outside the scope of the DSA, and public-facing features such as WhatsApp Channels, which function more like social media platforms.

“The objective for the Commission is to check what is actually private messaging, which doesn’t fall under the scope of the DSA, and what are open channels that act more as a social media platform, which do fall under the scope of the DSA,” Regnier told a daily press briefing. He added that the Commission is actively examining the issue and did not rule out formally designating WhatsApp Channels under the DSA.

WhatsApp was not immediately available for comment.

If designated as a very large online platform, WhatsApp could face fines of up to 6% of its global annual revenue for breaches of the DSA.

Poland urges Brussels to probe TikTok over AI-generated content

Poland has asked the European Commission to investigate TikTok after the platform hosted AI-generated content calling for Poland to leave the European Union, which Polish authorities said was almost certainly Russian disinformation.

Polish officials said a TikTok profile featuring videos of young women dressed in Polish national colours and promoting an exit from the EU had gained traction in recent weeks before disappearing from the platform. Warsaw argues the content posed risks to public order, information security and democratic processes both in Poland and across the EU.

In a letter to the Commission, Deputy Digitalization Minister Dariusz Standerski said the use of synthetic audiovisual material and the way it was distributed suggested TikTok was failing to meet its obligations as a “Very Large Online Platform” under EU law. A Polish government spokesperson said the videos contained Russian linguistic patterns, pointing to a coordinated disinformation effort.

TikTok said it has been in contact with Polish authorities and removed content where it violated platform rules. A Commission spokesperson confirmed receipt of Poland’s request, noting that under the Digital Services Act (DSA), very large platforms must assess and mitigate risks linked to their services, including those arising from AI-generated content. The Commission added that it had already sought information from TikTok and other platforms in March 2024 on how they address AI-related risks.

EU governments have stepped up scrutiny of social media platforms amid concerns over foreign interference in elections and domestic politics. Last year, the Commission opened formal proceedings against TikTok, which is owned by ByteDance, over its handling of election-related risks, including during Romania’s 2024 presidential vote.

Poland is now urging Brussels to open new proceedings under the DSA, which requires major platforms such as TikTok, Meta’s Facebook and X to remove harmful content. Breaches can result in fines of up to 6% of a company’s global annual turnover.