A war is being waged on our screens as well as on battlefields, one in which stories and images are the weapons. Right now, Iran, backed by Russia and China, is using our own social media feeds to spread its narratives. It churns out content that mocks figures like President Trump, paints leaders as bloodthirsty, and even fabricates attack footage, including a chilling clip of a missile striking New York’s Liberty Island. Jeffrey Epstein’s name surfaces in these campaigns with telling regularity, adding another layer of intrigue and distrust. This isn’t harmless meme-making; these are deliberate attempts to exploit our anxieties and existing divisions, especially around U.S.-Israeli military involvement in ongoing conflicts. The result is a deluge of propaganda, narratives that stretch the truth or cross into outright disinformation, often built on strikingly realistic AI-generated videos and images designed to make it all seem real.
While many of these fabrications are quickly exposed, the damage is often already done: they reach millions of people on X, Facebook, and TikTok before any debunking can catch up. For Iran’s beleaguered leadership, this information war is nearly as potent as any physical weapon. The Strait of Hormuz gives Tehran leverage over global oil; digital manipulation lets it stir popular anger and unease, not just abroad but even within countries directly involved, like the United States. As the researcher Darren L. Linvill put it, Iran seems to be “winning the propaganda war,” largely because it has spent decades preparing for this kind of conflict, in stark contrast to the perceived unpreparedness of many Western administrations. Ironically, with the internet largely shut down inside Iran, the true target of this digital onslaught is not the Iranian population but those of us outside its borders, in societies where information flows freely.
Beneath the surface, this sophisticated campaign runs on a network of human-controlled accounts, not just automated bots. Researchers have uncovered a covert web of accounts on X and Instagram working to push a pro-Iranian agenda. These aren’t random users; many are linked to the Islamic Revolutionary Guards Corps and impersonate Spanish speakers in Texas and Venezuela or English speakers in the UK and Ireland. Their methods are cunning: they sometimes lift content directly from well-known Western influencers such as Jackson Hinkle and Mario Nawfal, who have massive followings and a reputation for outspoken views on global affairs. Piggybacking on such accounts amplifies their messages and lends them an air of legitimacy. One notable campaign involved a Tucker Carlson interview: a clip suggesting Israel had manipulated the U.S. into war was spread almost simultaneously by dozens of accounts, a clear sign of coordinated effort rather than organic virality.
Iran is particularly adept at leveraging the erratic statements of figures like President Trump and at exploiting the erosion of American institutions that once served as fact-checkers. One AI-generated video from an Iranian state network mocked Trump’s failure to rally allies to keep the Strait of Hormuz open, complete with fake laugh tracks of Putin and Kim Jong-un listening to a rap song. As the analyst Jonathan Ruhe puts it, Trump’s struggles with alliance-building effectively “started the fire,” and Iran’s disinformation campaigns are simply “pouring gasoline on that.” The American military and the social media platforms try to dispel these falsehoods, but new ones keep emerging, sometimes even from official government accounts, underscoring how pervasive AI-generated content and misleading narratives have become.
Russia and China, which share a disdain for unchecked American military power and maintain close ties with Iran, are active participants in this information war. They not only amplify Iranian propaganda but create their own, often in apparent coordination. Research firms have documented numerous cases in which state media and covert influence operations from both countries push narratives that align neatly with Iran’s: highlighting Iran’s ability to choke crucial shipping lanes like the Strait of Hormuz, and even claiming the war was a smokescreen to distract from sensitive information about figures like Jeffrey Epstein. In this “travel chain of narratives,” as researchers describe it, misleading broadcasts on Iranian state TV are picked up by online influencers, transformed into AI-generated media, and finally given mass circulation by Chinese and Russian bot armies.
The goal of this elaborate scheme, as one firm noted, is likely to normalize support for escalation, deflect blame onto external actors, and cement Iran’s image as a victim, thereby justifying its “defensive or retaliatory stances.” The scale is staggering: one social media monitoring company found that Iran activated a massive network of fake accounts that generated 145 million views in just the first two weeks of the war, with TikTok alone accounting for 72 percent of that total. Convincing AI-generated fakes depicting attacks on Israel testify to the campaign’s sophistication and coordination. Recurring content, consistent hashtags, and rapid bursts of posts all point to a structured operation designed to dominate online discussion at critical moments of the conflict and, in essence, to shape our perception of reality.