The European Union has initiated a formal investigation into TikTok under the Digital Services Act, citing concerns over child safety, risk management, and other suspected compliance failures.
The European Union has formally opened an investigation into TikTok’s compliance with the bloc’s Digital Services Act (DSA), the Commission announced.
The investigation will focus on several key areas, including the protection of minors, advertising transparency, data access for researchers, and the management of risks related to addictive design and harmful content, according to a press release from the Commission.
The DSA is the EU’s regulatory framework for online governance and content moderation. It has applied broadly to thousands of platforms and services since Saturday, but larger platforms such as TikTok have faced an extra layer of requirements since last summer, notably around algorithmic transparency and systemic risk. It is these additional rules that TikTok is suspected of breaching.
Penalties for confirmed violations of the DSA can reach up to 6% of global annual turnover.
The Commission’s decision follows several months of information gathering, including requests for information from TikTok on issues such as child protection and disinformation risks.
The EU’s concerns about TikTok’s content governance and safety predate the DSA: back in June 2022, the platform was compelled to make operational adjustments following an investigation by regional consumer protection authorities into child safety and privacy complaints.
The Commission will now step up its scrutiny of the suspected breaches, which may involve interviews, inspections, and further requests for information from TikTok.
There is no formal deadline for the conclusion of the investigation, with its duration depending on factors such as the complexity of the case and the extent of cooperation from the company under investigation.
TikTok responded to the formal investigation by stating that it has implemented features and settings to protect teenagers and prevent children under 13 from accessing the platform. The company expressed its commitment to working with experts and the industry to ensure the safety of young users and welcomed the opportunity to explain its efforts to the Commission.
TikTok confirmed receiving a document from the Commission outlining the decision to open an investigation. The company said it has responded to the Commission’s previous requests for information but has yet to receive feedback on those responses. TikTok also offered to make its internal child safety staff available to meet with Commission officials, but said the offer has not been taken up.