Articles

German Activists Sue X Over Lack of Election Influence Data

Two activist groups have filed a lawsuit against Elon Musk’s social media platform X, accusing it of violating European law by refusing to provide the data needed to track disinformation ahead of Germany’s national election on February 23. The Society for Civil Rights (GFF) and Democracy Reporting International (DRI) claim that X is not offering systematic access to key information, such as the reach of posts, likes, and shares, which other platforms have made available for monitoring.

According to Michael Meyer-Resende of DRI, the groups are entitled to access this data under the European Union’s Digital Services Act. Despite repeated requests, X has not granted access to the data needed to track public debate on the platform.

The lawsuit comes amid heightened concerns over online disinformation ahead of elections in Europe, especially after the controversial presidential election in Romania in 2024, which was allegedly influenced by a Russian-driven social media campaign, though Moscow denied any involvement.

The situation is further complicated by Musk’s endorsement of Germany’s far-right party, Alternative for Germany (AfD), and his continued control of the platform. Since taking over Twitter (now X), Musk has restricted researchers’ access to data, charging for access that was previously free and raising concerns about transparency and the platform’s potential misuse in democratic processes.

EU Mulls Expanding Investigation into Elon Musk’s X, Says Digital Chief

The European Union is considering broadening its investigation into Elon Musk’s social media platform, X, to determine whether it has violated its content moderation regulations. This development comes from Henna Virkkunen, the European Commission’s Executive Vice President for Digital Affairs, who revealed that the EU is currently assessing whether the ongoing probe into X is comprehensive enough to cover all potential breaches. The investigation focuses on whether X has complied with the requirements set by the EU’s Digital Services Act (DSA), which is designed to ensure better regulation of online platforms.

In December 2023, the European Commission initiated formal proceedings against X, accusing the platform of not adequately addressing illegal content and disinformation. The Commission highlighted concerns that the social network might be falling short in meeting its obligations to tackle harmful or misleading material that spreads online. The allegations also suggest that X could be in violation of the transparency and deceptive design provisions outlined in the DSA, which require platforms to be more transparent about how they operate and ensure their features are not misleading to users.

The EU’s Digital Services Act imposes strict rules on tech companies, especially those with significant reach and user bases, to ensure that their platforms are safe for users and that illegal content is removed in a timely manner. The ongoing investigation into X is part of the EU’s broader effort to hold online platforms accountable and ensure they meet these regulatory standards. As the probe continues, the European Commission may expand its scrutiny if it finds that the current investigation is not fully addressing all areas of concern.

The potential expansion of the investigation into X underscores the EU’s commitment to regulating the digital landscape and addressing the challenges posed by large social networks. With disinformation and online harm becoming ever more prominent issues, the outcome of this probe could have significant implications for X and other platforms operating in the EU, especially regarding their content moderation practices and user safety protocols.

EU Tech Companies Agree to Stronger Measures Against Online Hate Speech

Meta’s Facebook, Elon Musk’s X, Google’s YouTube, and other tech giants have agreed to enhance their efforts to combat online hate speech under a revised code of conduct, which will now be incorporated into the European Union’s Digital Services Act (DSA). The update aims to make these platforms more accountable in tackling harmful content.

Key Points:

  • Revised Code of Conduct: Facebook, X, YouTube, and others have committed to improving their handling of illegal hate speech on their platforms under the updated voluntary code of conduct, initially launched in May 2016. The code will now align with the requirements of the EU’s Digital Services Act (DSA), which requires tech companies to take stronger action against harmful and illegal online content.
  • Tech Companies’ Pledge: In addition to enhancing detection mechanisms, companies such as Instagram, LinkedIn, TikTok, and Twitch, alongside the larger platforms, have agreed to measures including automatic detection tools for hate speech and a commitment to review at least two-thirds of hate speech notices within 24 hours. They will also provide data on how their recommendation systems contribute to the spread of harmful content.
  • Transparency and Oversight: The updated code will also allow public and non-profit entities with expertise in hate speech to monitor how platforms handle hate speech notices. This will increase the transparency and accountability of tech companies, with a focus on issues like race, ethnicity, religion, and gender identity.
  • EU’s Position on Hate Speech: EU tech commissioner Henna Virkkunen emphasized that the European Union has no tolerance for illegal hate speech, whether online or offline. The strengthened code aligns with the DSA, which is pushing for stricter regulations on tech companies to address online harms and ensure that harmful content is swiftly removed.