Articles

European Commission Reviews Child Safety Measures on Snapchat, YouTube, and App Stores

The European Commission has begun reviewing how platforms such as Snapchat, YouTube, Apple’s App Store, and Google Play protect minors online under the EU’s Digital Services Act (DSA). The investigation focuses on whether these companies’ safeguards are sufficient to prevent young users from being exposed to illegal products or harmful content.

The Commission has requested detailed information on age verification tools and on how the platforms block access to content promoting illegal substances, including drugs and vapes, as well as to material that could encourage eating disorders.

EU technology chief Henna Virkkunen said the assessment, carried out in cooperation with national authorities, aims to determine whether platforms are truly protecting children.

Google stated it already enforces “robust parental controls” and offers “age-appropriate experiences” across its platforms. “We keep expanding these efforts and continue to engage with the Commission on this critical area,” a Google spokesperson said.

The DSA, which came into full effect in 2024, imposes strict obligations on digital platforms to identify and mitigate risks linked to illegal or harmful content — marking one of the EU’s strongest steps toward regulating online safety for minors.

Dutch Court Orders Meta to Simplify Facebook and Instagram Timelines

A Dutch court has ordered Meta Platforms to change how it presents Facebook and Instagram timelines, ruling that users must be given a simple and direct way to opt out of personalized content based on profiling.

The decision, issued on Thursday, found that elements of Meta’s current design violate the EU’s Digital Services Act (DSA), a sweeping law intended to curb manipulative digital practices and increase user control over online platforms.

Under the ruling, Meta has two weeks to implement the changes in the Netherlands. Users must be able to select a chronological timeline or another non-profiled feed, and — critically — that choice must remain active instead of resetting when users close the app or browser.

The court said Meta’s practice of automatically reverting users to the algorithmic “recommended content” feed amounted to a “dark pattern”: a manipulative design that limits free choice and infringes on the right to freedom of information.

“People in the Netherlands are not sufficiently able to make free and autonomous choices about the use of profiled recommendation systems,” the court said.

The timing of the ruling was also significant: the court noted that these design practices could influence public opinion ahead of the Dutch general election on October 29, emphasizing the importance of media neutrality and user autonomy.

META TO APPEAL

Meta said it would appeal the decision, insisting it had already made substantial adjustments to comply with the DSA and had notified Dutch users about how to view non-personalized feeds.

“We introduced substantial changes to our systems to meet our regulatory obligations under the DSA,” a Meta spokesperson said. “Proceedings like this threaten the digital single market and the harmonized regulatory regime that should underpin it.”

Meta also argued that such rulings should be handled at the EU level rather than by individual member states, warning that fragmented national court decisions could undermine the DSA’s unified enforcement goals.

DIGITAL RIGHTS GROUP CELEBRATES

The Dutch digital rights organization Bits of Freedom, which filed the case, welcomed the court’s ruling.

“It is unacceptable that a few American tech billionaires can determine how we view the world,” said spokesperson Maartje Knaap, calling the decision a major victory for digital freedom and user rights in Europe.

The ruling marks a new milestone in the EU’s effort to hold global tech firms accountable under the DSA — and could inspire similar challenges in other member states as regulators and courts push for greater transparency and user control in digital platforms.

EU Presses Apple, Google and Microsoft on Efforts to Combat Financial Scams

European Union regulators have asked Apple, Google, Microsoft, and Booking.com to detail the steps they are taking to prevent their platforms from being used for financial scams, highlighting growing concern over the rising cost of online fraud.

The inquiry falls under the Digital Services Act (DSA), the EU’s sweeping legislation that requires major tech companies to take stronger action against illegal and harmful online content.

“Today, we sent requests for information, under the DSA, to Apple, Booking.com, Google and Microsoft on how they identify and manage risks related to financial scams,” EU tech chief Henna Virkkunen wrote on X.

Virkkunen warned that online fraud has become easier than ever to launch, frequently leading to significant financial losses for consumers. She noted that scams such as fake hotel listings, fraudulent banking apps, and deepfake videos of public figures promoting false investments cost Europeans more than €4 billion ($4.7 billion) each year.

Authorities worldwide have also raised alarms that AI tools could make scams like phishing and fake investment schemes more convincing and harder to detect.

The EU’s probe underscores its heightened scrutiny of Big Tech’s responsibilities in protecting users against financial crime.