Russian Disinformation Campaign Uses "Doppelganger" Websites to Spread Pro-Kremlin Narratives Across Europe
A persistent disinformation campaign orchestrated by Russian agents has been spreading pro-Kremlin narratives across Europe through cloned websites that mimic reputable news sources, according to a joint report by several European press freedom organizations. These "doppelganger" websites, named after the German word for a near-identical double, copy the design and layout of legitimate news outlets and often differ only by a subtle change in the domain name, making them difficult for casual readers to distinguish from authentic sources. The tactic injects fabricated stories and propaganda aligned with Russian interests directly into the European media landscape. Active for at least two years, the campaign has intensified its methods, incorporating artificial intelligence to create deepfake images and audio clips of well-known journalists and further eroding public trust in legitimate news sources.
The recent incident involving Poland's public broadcaster, Polskie Radio, illustrates how deceptive these doppelganger websites can be. A fake site, polskieradio.icu, closely mimicked the broadcaster's genuine site, polskieradio.pl, publishing pro-Russian, Euroskeptic headlines designed to mislead readers. The incident, documented by the International Press Institute (IPI) and other press freedom watchdogs, shows how the campaign seeks to manipulate public opinion by exploiting the credibility of established news organizations. Because the fake sites are often almost indistinguishable from the real thing, they are particularly effective at spreading disinformation among unsuspecting audiences.
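The Polskie Radio case points to one simple heuristic defenders can apply: the spoofed sites reuse a genuine outlet's brand name under a different top-level domain. The sketch below is illustrative rather than a description of any watchdog's actual tooling; the allowlist, function names, and naive domain splitting are assumptions made for brevity, and a production checker would resolve registrable domains with the Public Suffix List (for example via the tldextract package).

```python
from urllib.parse import urlparse

# Hand-maintained allowlist of genuine outlet domains (illustrative).
LEGITIMATE_DOMAINS = {"polskieradio.pl"}


def registrable(hostname: str) -> str:
    """Naively keep the last two labels, e.g. 'www.polskieradio.icu' ->
    'polskieradio.icu' (ignores multi-part suffixes such as .co.uk)."""
    labels = hostname.lower().split(".")
    return ".".join(labels[-2:])


def looks_like_doppelganger(url: str) -> bool:
    """Flag URLs whose brand label matches a genuine outlet while the
    registrable domain does not, i.e. the same name under a different TLD."""
    host = urlparse(url).hostname or ""
    domain = registrable(host)
    if domain in LEGITIMATE_DOMAINS:
        return False  # this is the real outlet itself
    brand = domain.split(".")[0]
    return any(brand == real.split(".")[0] for real in LEGITIMATE_DOMAINS)


print(looks_like_doppelganger("https://www.polskieradio.icu/article"))  # True
print(looks_like_doppelganger("https://www.polskieradio.pl/article"))   # False
```

A check like this only catches same-name, different-TLD clones such as the Polskie Radio example; typosquatted spellings or homoglyph domains would require fuzzier matching, such as edit-distance or confusable-character comparisons.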
The U.S. Department of Justice has identified these websites as part of "Russian government-directed foreign malign influence campaigns," informally known as "Doppelganger." In September, the DOJ announced the seizure of 32 internet domains used in these campaigns, underscoring the U.S. government's recognition of the threat posed by the operation. The fake websites frequently publish content on politically sensitive topics, aiming to sway public opinion on key issues, and that content is then amplified on social media through both paid advertisements and bot accounts to reach wider audiences. The campaign's danger lies in its ability to weave false narratives seamlessly into the mainstream media landscape, shaping public discourse and potentially influencing political outcomes.
The Media Freedom Rapid Response (MFRR) consortium, made up of European press freedom watchdogs, has documented the evolution of the campaign. In 2023 the fake websites relied primarily on fabricated articles; in 2024 the tactics shifted to incorporate AI-generated deepfake images and fake audio clips. This escalating use of synthetic media further blurs the line between real and fabricated information, creating a "climate of chaos" for European news consumers and undermining trust in legitimate media outlets. The domains of the fake websites were purchased with cryptocurrency, which obscures the operation and makes it harder to trace the perpetrators and hold them accountable. Investigators have nonetheless linked the crypto wallets used to Russia, strengthening the evidence of the Kremlin's involvement in the campaign.
The campaign has primarily targeted Ukraine, Poland, and Germany, and earlier instances targeted France during its presidential elections. The narratives pushed through these websites often aim to undermine trust in the Ukrainian government, portray Ukraine as losing the war against Russia, or emphasize Ukraine's supposed lack of resources. A notable shift is the adaptation of language to each target country: unlike earlier Russian disinformation efforts that focused primarily on Russian-language content, the doppelganger campaign tailors its messaging to the local language of the targeted country, significantly increasing its effectiveness and reach. This localized approach enhances the credibility of the fake websites and makes the disinformation more palatable to its intended audience.
Facebook's reach in Ukraine, where roughly 54% of the population uses the platform, makes it a key vector for disseminating false information originating from these doppelganger websites and a significant challenge in containing the campaign. That reliance on social media for news consumption underscores the responsibility of tech giants like Meta, Facebook's parent company, to monitor and counter disinformation on their platforms. The ease with which fake accounts can be created and used to push links to the malicious sites remains a major hurdle. The ongoing struggle to contain these operations calls for vigilance from both individuals and platforms in distinguishing credible news sources from sophisticated imitations designed to manipulate public opinion.