Articles

Iraq Bans Roblox Over Child Safety and Moral Concerns

The Iraqi government has announced a nationwide ban on the U.S.-based gaming platform Roblox (RBLX.O), citing child safety and moral concerns, as part of a wider crackdown across the Middle East on online games and virtual worlds.

Officials said the decision followed a comprehensive government study and field monitoring, which found that Roblox enabled direct communication between users — a feature they claimed exposed children and adolescents to online exploitation, cyber-extortion, and harmful behavior. The government also said the game’s content was “incompatible with Iraq’s social values and traditions.”

Roblox Corporation responded that safety was its top priority and expressed interest in working with Iraqi authorities to restore access. “We strongly contest recent claims made by the Iraqi authorities, which we believe are based on an outdated understanding of our platform,” a company spokesperson said.

The spokesperson added that Roblox had already suspended certain communication features, such as in-game chat, in Arabic-speaking regions, including Iraq, earlier this year as part of ongoing safety updates.

The Iraqi Ministry of Communications stated that the platform “involves several security, social, and behavioral risks,” emphasizing that the move was taken to protect young users.

The ban aligns Iraq with other Middle Eastern nations that have tightened regulation of digital entertainment platforms. In August 2024, Turkey similarly blocked access to Roblox, citing risks of child exploitation and abuse.

Analysts say the decision reflects a broader regional effort to regulate online gaming and interactive media, balancing youth protection with the growing popularity of global virtual platforms.

Florida Attorney General Targets Roblox With Criminal Subpoenas Over Child Safety Concerns

Florida’s Republican Attorney General James Uthmeier announced that his office has issued criminal subpoenas to Roblox Corp, accusing the gaming platform of becoming a “breeding ground for predators” that endangers children. In a statement shared on X, Uthmeier condemned the company, saying Roblox “enabled our kids to be abused” while profiting from them.

The subpoenas aim to uncover evidence about alleged criminal activities on the platform, including communications between suspected predators and victims. Roblox, which has over 70 million daily users — most of them minors — has been under growing scrutiny over child safety.

Concerns about the platform intensified after a Hindenburg Research report last year accused Roblox of failing to protect its young audience. In response, Roblox increased investments in user protection, implementing tighter messaging restrictions for children under 13, AI-based monitoring, and strict content moderation.

In a statement to Reuters, Roblox emphasized that it prohibits image and video sharing in chat, blocks personal information exchange, and is working on age estimation systems for users. “While no system is perfect, our teams and automated tools continuously monitor communications,” the company said.

The controversy extends beyond U.S. borders: Iraq banned Roblox this week, claiming its chat features expose children to exploitation and cyber-extortion. Meanwhile, Roblox faces multiple U.S. lawsuits — including in Louisiana and San Francisco — alleging the company fails to prevent sexual predators from targeting minors.

European Commission Reviews Child Safety Measures on Snapchat, YouTube, and App Stores

The European Commission has begun reviewing how platforms such as Snapchat, YouTube, Apple’s App Store, and Google Play protect minors online under the EU’s Digital Services Act (DSA). The investigation focuses on whether these companies’ safeguards are sufficient to prevent young users from being exposed to illegal products or harmful content.

The Commission has requested detailed information on age verification tools and on how the platforms block access to content promoting illegal substances, including drugs and vapes, as well as to material that could encourage eating disorders.

EU technology chief Henna Virkkunen said the assessment, carried out in cooperation with national authorities, aims to determine whether platforms are truly protecting children.

Google stated it already enforces “robust parental controls” and offers “age-appropriate experiences” across its platforms. “We keep expanding these efforts and continue to engage with the Commission on this critical area,” a Google spokesperson said.

The DSA, which came into full effect in 2024, imposes strict obligations on digital platforms to identify and mitigate risks linked to illegal or harmful content — marking one of the EU’s strongest steps toward regulating online safety for minors.