1. The Code of Practice on Disinformation (COPE) dates back to 2022 and has been signed by 42 companies, including major tech giants like Google, Meta, Microsoft, and TikTok. The EU has now integrated the code into the framework of the Digital Services Act (DSA), where it is intended to serve as a benchmark for assessing compliance with the DSA. This initiative is meant to ensure platforms commit to transparency in political advertising and fair competition during elections.
2. The Code of Practice outlines voluntary commitments and measures to combat disinformation, emphasizing transparency and collaboration among all parties. The EU is taking formal steps to assess compliance with the DSA, stressing the need for transparency in political advertising. However, the EU has clarified that signing the code does not automatically confer a "presumption of innocence." The Commission aimed to complete the work by January, and Paul Gordon emphasized the importance of meaningful engagement, stating that platforms should not treat the code as a mere tick-box exercise.
3. The DSA came into effect in 2023, and the EU is conducting multiple investigations into platforms, including X, TikTok, and Meta’s Facebook and Instagram. Meta’s decision to discontinue fact-checking in the US and shift to a community notes system is raising legal challenges, and the company has opted out of the COPE. The EU expects X to address its concerns by providing detailed information about its algorithmic changes and enhancing data protection.
4. The Code of Practice on countering illegal hate speech online was first drafted in 2016 and has been signed by platforms like Facebook, Instagram, and TikTok. The EU’s review of X revealed several breaches of the DSA, including limitations on data access, non-compliance with advertising regulations, and the misuse of verification systems by fraudsters. These findings have prompted a deeper investigation into X’s potential role in amplifying far-right messaging.
5. The EU is implementing additional legal measures to address the issues identified at X, including fines and stricter oversight of its operations within the EU. These measures aim to increase scrutiny of X’s algorithmic changes and data practices. In 2025, Elon Musk’s livestream with Alice Weidel raised concerns about algorithmic amplification, prompting further legal questions. The ongoing ambiguity surrounding X’s role may further complicate the proceedings against the platform.
In summary, the COPE and the EU’s ongoing efforts to enforce stricter rules on disinformation and hate speech are creating significant complexities in the online environment. The consequences of this legal wrangling could have far-reaching effects for platforms, businesses, and citizens alike.