Singapore unveils new law empowering online safety commission to block harmful content
Singapore will establish a new online safety commission with authority to compel social media platforms and internet providers to block harmful online content, under a bill tabled in parliament on Wednesday.
The proposed law follows research by the Infocomm Media Development Authority (IMDA) in February, which found that more than half of verified user complaints about online harms — including child abuse, cyberbullying, and harassment — were not promptly addressed by major platforms.
The commission, which is expected to be operational by mid-2026, will have powers to order platforms to restrict or remove harmful content, ban perpetrators, and grant victims a right to reply. It will also be able to direct internet service providers to block access to harmful web pages or entire platforms within Singapore.
The new agency will oversee cases of doxxing, stalking, abuse of intimate images, and child exploitation. Powers to address non-consensual disclosure of personal data and incitement of enmity will be added in later phases.
The bill will be debated in the next parliamentary session. Minister for Digital Development and Information Josephine Teo said the initiative aims to address the persistent failure of online platforms to act on harmful content. “More often than not, platforms fail to take action to remove genuinely harmful content reported to them by victims,” Teo said.
The move expands Singapore’s regulatory oversight following the Online Criminal Harms Act, which took effect in February 2024. Under that law, the Home Affairs Ministry previously threatened Meta with fines of up to S$1 million ($771,664) for failing to combat impersonation scams on Facebook.