Australia’s eSafety Commissioner Criticizes YouTube, Apple for Failing to Address Child Abuse Material
Australia’s internet safety regulator, the eSafety Commissioner, released a report on Wednesday accusing major technology companies, notably Google’s YouTube and Apple, of “turning a blind eye” to online child sexual abuse material (CSAM). The watchdog singled out YouTube for its unresponsiveness to inquiries and its failure to track user reports of CSAM and the time taken to respond to them.
The report found that neither YouTube nor Apple could provide data on the number of user reports of child abuse content they received or how quickly they responded. The Australian government recently decided to include YouTube in its world-first ban on social media use by children under 16, reversing an earlier exemption on the Commissioner’s advice.
eSafety Commissioner Julie Inman Grant said these companies fail to prioritize child protection and are allowing serious crimes to occur unchecked on their platforms. She emphasized that no other consumer-facing industry would be permitted to operate while enabling such crimes.
In response, a Google spokesperson clarified that eSafety’s criticisms were based on reporting metrics rather than overall safety performance, noting that YouTube proactively removes over 99% of abuse content before it is flagged or viewed.
The report also assessed other platforms, including Meta (Facebook, Instagram, Threads), Apple, Discord, Microsoft, Skype, Snap, and WhatsApp, finding “safety deficiencies” such as failures to detect or block livestreaming of abuse content, inadequate reporting mechanisms, and inconsistent use of hash-matching technology to identify known abuse images.
Despite warnings in prior years, some companies have not sufficiently addressed these gaps. The report specifically noted that Apple and YouTube did not disclose how many trust and safety staff they employ or detailed information about user reports on child abuse content.