The modern world, for all its dazzling interconnectedness, has also revealed a darker side: the pervasive influence of digital statecraft, in which foreign powers wield disinformation as a weapon. This is not a fleeting trend but a fundamental shift in how international relations and domestic politics are conducted. Imagine a bustling global town square where conversations flow freely and ideas are exchanged; now picture a few shadowy figures in the crowd, quietly whispering falsehoods, twisting narratives, and planting seeds of doubt. This is the phenomenon that Professor Martin Innes, a leading expert at Cardiff University and co-director of its Security, Crime and Intelligence Innovation Institute, has spent years observing and analyzing. His insights, shared with two critical UK inquiries – the Foreign Affairs Committee’s “Disinformation diplomacy: How malign actors are seeking to undermine democracy” and “The Rycroft Review – Report of the Independent Review into Countering Foreign Financial Influence and Interference in UK Politics” – paint a stark picture of a world in which truth is a battleground. He emphasizes that the UK and its allies face an ongoing, undeniable threat, especially from Russia, which actively seeks to exploit vulnerabilities in our “information ecosystems” to destabilize and control. This is not merely a matter of distant, abstract geopolitical maneuvers; it directly affects the fabric of our societies, our trust in institutions, and even our daily conversations.
Professor Innes’s research reveals the scale and sophistication of this digital manipulation. He presented sobering data to the Foreign Affairs Committee indicating that between 2011 and 2023, more than 70 countries were targeted by these clandestine operations. Imagine a global chess game with countless players, where the pieces are not armies or resources but narratives and public opinion. While many actors are involved, Russia consistently emerges as the most prominent, followed by Iran and China. What is truly unsettling is the adaptability of these campaigns. Professor Innes highlighted a recent, chilling development: Russian operations experimenting with content in Welsh and Gaelic. This is no longer grand, sweeping propaganda. It is “micro-targeting,” a precision-guided approach that uses social media to reach specific groups. Think of a master tailor custom-fitting a suit for every individual, except that the material is information and misinformation, cut to shape beliefs. The ability to target different ethnic groups, regions, and languages gives those with malign intentions a vast range of options: narratives can be tailored to resonate deeply with specific communities, exploiting their concerns, fears, and aspirations. This is not a broad-brush approach but a nuanced, insidious strategy designed to sow discord and exert influence at a granular level, which makes it incredibly difficult to detect and counter.
One of the most striking examples Professor Innes and his team uncovered involved the Russian-backed “Doppelgänger network” and its cynical exploitation of public interest in the Princess of Wales’s health. In the days before her cancer diagnosis was announced, the network actively amplified conspiracies and disinformation. Consider the immense public attention and concern surrounding such a figure: the Doppelgänger network saw an opportunity, like vultures circling a vulnerability. They posted replies on X (formerly Twitter), not just about the Princess, but strategically interwoven with material denigrating Ukraine and celebrating President Putin’s electoral “victory” – topics clearly aligned with Kremlin interests. Professor Innes explained both the ingenuity and the malevolence of the strategy: “There are two dimensions to it. First, at the time— it was probably the biggest story on social media on the planet. If you can get into that news cycle and media cycle, you suddenly get a lot more eyeballs on the content.” It is like hijacking a busy highway to deliver more sinister cargo. “They were amplifying this and then they were dropping in things about Ukraine. There was a content layer to it.” The deeper, more insidious goal, however, was to “undermine trust in institutions.” The Royal Family, a symbol of stability and tradition in the UK, became a target: by eroding trust in such a fundamental institution, these actors seek to weaken the foundations of society itself, leaving it more susceptible to external influence and internal strife.
This pattern of exploiting organic events for geopolitical gain is a recurring theme. Professor Innes illustrated it with the disinformation spread after the 2024 Southport attacks. While initial conspiracy theories may arise organically from public anxieties or misunderstandings, he explained, a dedicated group of Russian actors constantly monitors the media landscape. Imagine a specialized team of digital strategists, always on the lookout for opportunities: “They look at these and they go, ‘Right. That one looks like it has traction. We can get behind it. We can boost and amplify that, and that will help us to achieve our geopolitical aims of sowing chaos, discord and all those kinds of things.’” This is not about fabricating fake news from scratch every time; it is often about identifying existing, nascent narratives – however baseless – and pouring fuel on the fire. These actors work as digital accelerants, fanning a small spark of misinformation into a raging inferno of conspiracy and doubt, all in service of destabilizing and fracturing societies. The pattern reveals a cynical, opportunistic approach to information warfare, in which any event, big or small, can be weaponized.
Tackling this hydra-headed problem is, as Professor Innes acknowledges, incredibly complex, and the traditional tools of international relations often fall short. He points out the paradoxical effects of sanctions: “The impact of sanctions in this area is not really understood. Some of the individual organisations that we have looked at, which have been sanctioned by (the UK), the US and the EU, seem to have taken that as a badge of honour, and it has resulted in them receiving awards for their work in Russia. They have secured more contracts, and secured closed contracts, so they are not bidding on the open market.” Some malign actors, in other words, operate outside conventional frameworks, turning punishment into a perverse form of recognition and even an advantage. It is like trying to stop a ghost with a net. Beyond sanctions, Professor Innes emphasizes the crucial role of accountability for social media platforms. The platforms, for all their benefits, serve as fertile ground for disinformation and profit handsomely from the content shared on them, regardless of its veracity. “Personally, I certainly think there is a case for saying that we need to revisit the extent to which platforms are responsible and accountable for the things that are published on their services, out of which they are making considerable amounts of money.” This is not about stifling free speech; it is about establishing a baseline of responsibility for the immensely powerful tools they provide.
Professor Innes also highlights a critical feature of this modern information warfare: the outsourcing and agility of the operations. “The bit that has interested me for a while, particularly when we look at what Russia is doing, is the outsourcing and the use of contracted agencies that are more creative and agile in terms of their approach. They are like digital natives.” Traditional government departments, often structured and slow-moving, therefore face adversaries who are nimble, innovative, and constantly evolving their tactics – like fighting a swarm of unpredictable drones with a single, slow-moving tank. “Whatever (the UK puts) together, we need to appreciate that there are different dimensions to this, and it probably transcends the traditional remit of any one single government department.” Countering these threats demands a more integrated, cross-departmental, and adaptable approach. His evidence to The Rycroft Review, which concluded that the UK faces a “persistent problem of foreign interests seeking to exert influence on, and to interfere in, our politics,” underscores the urgency. The report’s recommendation of “clear lead accountability at ministerial and senior official level for leading the work to combat foreign online political interference, with resources commensurate to the challenge” is not a bureaucratic nicety; it is a call to arms in the battle for truth and the integrity of our democracies in the digital age. This problem will not solve itself. It requires dedicated resources, coordinated effort, and a deep understanding of the ever-shifting digital landscape to safeguard our societies from these pervasive and insidious influences.