In an increasingly interconnected world, where information spreads at the speed of light, it’s easy to assume that elections are simply about casting votes. However, a report by UK-based Refute, a company focused on digital threats, reveals something far more insidious: election interference has become a sophisticated, multi-layered operation. This isn’t a matter of a few last-minute tricks; it’s a meticulously planned “instrument of interference” woven through the entire election cycle, often beginning months before anyone steps into a voting booth. Vlad Galu, Refute’s Chief Technology Officer, who will present these findings at the upcoming Warsaw Resilience Conference, explains that the goal is chillingly simple yet effective: divide the electorate, and convince citizens living abroad that their home country is spiraling downwards. “If you’re a politician running to be elected, mathematically speaking, the best strategy is to divide the electorate and then only appeal to part of it,” Galu notes. “That’s a much easier way to get into office than to try to speak to everyone.” This marks a fundamental shift in political campaigning: rather than uniting a nation, the aim is to segment it strategically, leveraging existing divisions, and exploited vulnerabilities and trust, to gain power.
Take the upcoming 2025 presidential election in Romania. Refute’s team uncovered some 32,500 TikTok videos pushing populist candidates: tens of thousands of short, attention-grabbing clips, many showing telltale signs of coordination. Content was duplicated across multiple accounts, sometimes featuring AI-generated faces or voices, making it hard to tell who’s real and who’s not. The most alarming detail, though, was the engagement pattern. While roughly a quarter of Romanians live outside the country, 48% of the interactions with these videos came from abroad. That isn’t random; it’s a clear signal that the diaspora, a passionate and influential demographic, was being systematically targeted. It’s a whisper campaign amplifying anxieties and grievances, making emigrants feel their country is crumbling and subtly pushing them towards certain candidates. This calculated targeting of a crucial voter segment shows how influence operations have evolved: beyond blanket propaganda, towards tailored narratives that exploit a specific demographic’s geographical distance and emotional ties to the homeland.
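That mismatch between where a diaspora lives and where engagement comes from is itself a measurable signal. The toy sketch below compares the foreign share of a video’s interactions against the diaspora’s share of the population; the data shape, field names, and threshold are invented for illustration and are not Refute’s actual methodology.

```python
# Hypothetical illustration: flag videos whose foreign-engagement share far
# exceeds the diaspora's share of the population. Field names and the
# 1.5x threshold are invented for this sketch, not Refute's methodology.

DIASPORA_SHARE = 0.25  # roughly a quarter of Romanians live abroad (per the report)

def foreign_engagement_share(interactions):
    """interactions: list of dicts like {"country": "RO", ...}."""
    if not interactions:
        return 0.0
    abroad = sum(1 for i in interactions if i["country"] != "RO")
    return abroad / len(interactions)

def looks_targeted(interactions, baseline=DIASPORA_SHARE, factor=1.5):
    """Flag when foreign engagement exceeds the demographic baseline by `factor`."""
    return foreign_engagement_share(interactions) > baseline * factor

sample = [{"country": c}
          for c in ["DE", "IT", "ES", "RO", "UK", "RO", "FR", "DE", "IT", "RO"]]
print(foreign_engagement_share(sample))  # 0.7
print(looks_targeted(sample))            # True
```

On this logic, the 48% foreign-engagement figure in the report is nearly double the ~25% demographic baseline, which is exactly the kind of gap such a check would surface.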
The situation in Moldova, as outlined in Refute’s report, is even more elaborate and resource-intensive. Here the playbook involved a broader, more aggressive mix of tactics: old-school vote-buying networks, literally paying people for their ballots, combined with a deluge of online disinformation designed to sow confusion and distrust. Nor was it just nameless internet trolls; the report suggests involvement from embassy-linked personnel, lending a layer of official backing to the interference. Refute’s analysts found over 16,000 accounts behaving like bots during the election period alone, a digital army pushing specific messages. The estimated cost, attributed to intelligence sources, was a staggering $150 million. That isn’t pocket change; it’s a colossal investment, a measure of the stakes and of the value placed on swaying electoral outcomes. Across both Romania and Moldova, the consistent theme is a shift towards “layered operations”: an intricate mix of automated online networks, influencers amplifying particular messages, and AI-generated media, spread across all the popular platforms, TikTok, Telegram, Facebook, and blending genuine user engagement with inauthentic activity. That blend makes attribution extremely difficult, like untangling a ball of yarn where every thread looks the same.
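How does an analyst decide that 16,000 accounts are “behaving like bots”? One classic tell is posting at an inhumanly high and regular rate. The sketch below illustrates that single heuristic; the thresholds and data shape are invented for illustration, and real detection systems combine many such signals rather than relying on one.

```python
# Hypothetical sketch of one bot-behaviour heuristic: accounts posting at a
# high and near-constant rate. Thresholds are invented for illustration,
# not Refute's detection criteria.
import random
from statistics import pstdev

def looks_automated(post_times, max_interval_stdev=2.0, min_posts=50):
    """post_times: sorted timestamps (seconds) within one day of activity."""
    if len(post_times) < min_posts:
        return False
    intervals = [b - a for a, b in zip(post_times, post_times[1:])]
    # near-constant intervals between posts are a classic automation tell
    return pstdev(intervals) <= max_interval_stdev

random.seed(0)
bot = [i * 60 for i in range(60)]                              # a post every 60 s
human = sorted(random.uniform(0, 86_400) for _ in range(60))   # irregular posting
print(looks_automated(bot))    # True
print(looks_automated(human))  # False
```

A scheduler-driven account posts like clockwork; a person does not, which is why the variance of inter-post intervals separates the two toy examples cleanly.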
Galu, with a touch of weariness in his voice, emphasizes that this information warfare is not only complicated but also extremely expensive for those trying to combat it. He compares it starkly to conventional warfare: “We have to think about this information warfare as pretty much the same thing as conventional warfare. We are in a war situation, it’s just fought with different means on different grounds.” The core problem, he explains, is a massive cost imbalance. Those spreading disinformation can churn out low-quality, high-impact content with minimal effort. “So it’s a very low-effort, very high-yield activity,” Galu points out. A handful of people can create a fake video or a misleading meme in minutes, and it can go viral and sway public opinion. Organizations like Refute, by contrast, are held to a much higher standard: they need “certainty, clarity,” which demands rigorous analysis, serious computing power, and vast amounts of data. That quest for accuracy and verification is costly, creating an uneven playing field in which those wielding falsehoods hold a structural advantage. This asymmetry in effort and cost makes defending democratic processes from information attacks an uphill battle, one requiring constant innovation and significant resources to stay ahead.
These campaigns, Galu explains, are cunning in their ability to exploit existing anxieties and political narratives. They don’t invent problems; they amplify them. Common themes include framing defense spending as a trade-off against domestic welfare – essentially, “why are we spending money on tanks when our schools need funding?” They also push the idea that dialogue with Russia is the only path to stability, subtly undermining alliances and trust in Western institutions. And perhaps most dangerously, they preemptively cast doubt on the legitimacy of elections themselves, setting the stage for distrust before a single vote is counted. In Hungary, as the April 2026 parliamentary elections approach, similar tactics are already surfacing. European security sources, cited in Refute’s report, are watching an interference campaign that “follows the same blueprint” seen in Moldova. Galu, in an almost ominous tone, remarks, “Monday, April 12, is going to be a very interesting day to wake up to,” hinting at the disruption and division that may await. The pattern underscores that these operations are not isolated incidents but a repeatable strategy, adapted and redeployed across geopolitical contexts, always probing existing societal fault lines to undermine democratic institutions from within.
Refute’s most crucial takeaway is a stark warning: the biggest vulnerability in our elections isn’t the physical act of voting anymore. It’s the information environment that precedes it. Imagine a silent, invisible battle being fought in our news feeds, social media timelines, and messaging apps. Coordinated amplification, where a specific narrative is pushed by many accounts at once, dormant accounts suddenly springing to life to spread a message, and precisely targeted stories – these can subtly shape how voters think and feel long before authorities even have a legitimate reason to intervene. “Once disinformation campaigns begin, it is extremely hard to rein it back in again. Prevention is far more cost-effective than damage control,” the report unequivocally states. This isn’t just about cleaning up a mess; it’s about building resilience. Galu passionately advocates for a shift towards continuous monitoring and automated analysis of online activity. He argues that our current responses are largely reactive – we wait for a problem to emerge and then try to fix it. This is simply not enough given the scale and sophistication of modern influence operations. We need a proactive, always-on approach, a digital immune system capable of detecting and neutralizing threats before they can take root and poison the well of public discourse.
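One of the signals named above, dormant accounts suddenly springing to life, lends itself to a simple illustration: look for a long posting gap followed by a burst of activity. The sketch below is a toy version of that idea; the gap and burst thresholds are invented for illustration and are not drawn from Refute’s report.

```python
# Hypothetical sketch of the "dormant accounts springing to life" signal:
# flag accounts with a long silent gap followed by a sudden posting burst.
# Times are in days; the 180-day gap and 5-posts-in-7-days burst are
# invented thresholds, not Refute's criteria.

def dormant_then_burst(post_days, min_gap=180, burst_window=7, burst_posts=5):
    """post_days: sorted posting times in days since account creation."""
    for i in range(1, len(post_days)):
        gap = post_days[i] - post_days[i - 1]
        if gap >= min_gap:
            # count posts within burst_window days of the reactivation
            burst = [d for d in post_days[i:] if d <= post_days[i] + burst_window]
            if len(burst) >= burst_posts:
                return True
    return False

quiet_then_loud = [1, 2, 3, 400, 401, 401, 402, 403]  # silent for ~13 months
steady = list(range(0, 400, 10))                      # posts every 10 days
print(dormant_then_burst(quiet_then_loud))  # True
print(dormant_then_burst(steady))           # False
```

Continuous, automated monitoring of exactly this kind is what a proactive “digital immune system” would run across millions of accounts, flagging reactivation bursts before a narrative takes root rather than after it has gone viral.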

