Articles

UK Weighs Australia-Style Social Media Ban for Children Under 16

Britain is considering an Australia-style ban on social media use for children under the age of 16, as the government steps up scrutiny of how digital platforms affect young people’s mental health and development. Prime Minister Keir Starmer said children risk being drawn into “a world of endless scrolling, anxiety and comparison,” and warned that the government is ready to take robust action.

The move follows an announcement that officials will examine whether features such as infinite scrolling should be restricted and whether the current age at which children can access social media platforms is appropriate. Ministers are set to visit Australia, which last month became the first country to introduce a nationwide ban on social media for under-16s, to study how the policy is enforced. Technology Secretary Liz Kendall said Britain is considering the same age threshold.

While supporters argue that a ban would provide clear protection for children, critics warn it could push harmful activity underground or reduce access to the positive aspects of social media. The government is also reviewing stronger age-verification checks and whether the UK’s digital age of consent is too low.

Concerns have intensified amid the rapid spread of AI-generated content online, including recent reports that xAI’s Grok chatbot generated non-consensual sexual images. Britain has already announced plans to ban AI nudification tools and remove addictive platform features, alongside enforcing the Online Safety Act, which has increased age checks and reduced access to harmful content.

Starmer said no option is off the table as the government works with experts to identify the most effective safeguards for children online.

TikTok Collected Sensitive Data on Canadian Children, Probe Reveals

TikTok has pledged to strengthen safeguards to keep children off its platform after a Canadian investigation concluded that the company failed to adequately block underage users and protect their personal information.

The inquiry, led by Canada’s federal privacy commissioner Philippe Dufresne along with privacy watchdogs in Quebec, British Columbia, and Alberta, found that hundreds of thousands of Canadian children used TikTok annually despite the platform’s minimum age requirement of 13.

Investigators also determined that TikTok collected sensitive personal data from “a large number” of children and used it for marketing and content-targeting purposes. “TikTok collects vast amounts of personal information about its users, including children. This data is being used to target the content and ads that users see, which can have harmful impacts, particularly on youth,” Dufresne said at a press conference.

In response, TikTok agreed to adopt stricter age-verification systems, improve transparency about how user data is used, and prevent advertisers from directly targeting anyone under 18, except through broad categories such as language or approximate location. The company also expanded the privacy information available to Canadian users.

A TikTok spokesperson said the company was pleased regulators accepted several of its proposals to “further strengthen” protections for Canadian users, while noting disagreement with some of the findings, without specifying which ones.

The case comes amid growing global scrutiny of TikTok due to concerns about its ties to China. TikTok is owned by Beijing-based ByteDance, and governments worldwide—including the EU and the U.S.—have taken steps to restrict or ban the app on official devices.

In Canada, the government launched a review of TikTok’s planned expansion in 2023, which ultimately led to an order demanding the company shut down its Canadian operations over national security risks. TikTok is challenging that order.