
Russia Used AI in 27% of Disinformation Incidents in 2025 — UNITED24 Media

By News Room · March 21, 2026 · 4 Mins Read

The information environment is undergoing a profound and unsettling transformation as nations increasingly weaponize digital tools to sway public opinion and sow discord. One of the primary actors in this evolving landscape is Russia, which, according to Ukraine's Center for Countering Disinformation, is significantly escalating its disinformation efforts by leveraging artificial intelligence. It is no longer just about spreading falsehoods; it is about doing so with a speed, scale, and sophistication previously unimaginable. This is not some distant futuristic scenario; it is happening right now, shaping perceptions and fueling narratives in real time.

A recent threat assessment from the European External Action Service (EEAS) paints a stark picture of this digital battleground. In 2025 alone, a staggering 540 instances of foreign information manipulation and interference were documented. These are not isolated incidents but a concerted effort involving an estimated 10,500 social media channels and websites. Imagine legions of digital voices, some human, many now artificially generated, all singing from the same hymn sheet of deception. Among these targeted campaigns, Ukraine remains the perennial focus. The goal is clear: to erode international support for a nation fighting for its sovereignty and to undermine the trust its citizens place in their leaders and their collective resistance. It is calculated psychological warfare, designed to weaken from within and isolate from without.

What’s truly alarming is the rapid technological evolution underpinning these operations. The EEAS findings reveal a sharp departure from traditional disinformation tactics. A significant 27% of all recorded incidents in 2025 involved sophisticated AI-generated content – be it text designed to sound authentically human, synthetic audio capable of mimicking voices, or manipulated videos that blur the lines between reality and fabrication. This isn’t just about making things look good; it’s about making them real enough to be believed, even when they’re entirely manufactured. This technological leap dramatically lowers the barrier to entry, allowing hostile actors to churn out an unprecedented volume of persuasive content with fewer human resources and at a fraction of the cost. It’s an industrialized approach to deception, making it easier and cheaper to flood the information ecosystem with misleading narratives.

While the EEAS report points to a broad array of actors, a significant portion of these manipulative efforts is directly attributable. Among the identified cases, Russia stands out, linked to 29% of the incidents, while China accounts for 6%. A substantial 65% remained unattributed, highlighting the difficulty of identifying perpetrators in a constantly evolving digital landscape where anonymity can be carefully cultivated. The report explicitly states, “Russian and Chinese actors have fully implemented AI tools to speed up content production and increase meddling activities with fewer resources.” This is not merely an observation; it is a warning that generative AI technologies are fundamentally transforming the economics of large-scale influence campaigns, making them more accessible and potent than ever before.

The EEAS also highlights specific vulnerabilities in the information environment. Major political events and significant news cycles are particularly susceptible to these influence operations, with nearly half of the recorded incidents in 2025 coinciding with elections, protests, or international crises. These are moments when public sentiment is often high, anxieties are elevated, and people are actively seeking information, making them prime targets for manipulative narratives. Examples include election-related campaigns tracked in countries such as Germany, Poland, Romania, Moldova, and the Czech Republic, where attempts were made to sway voters or destabilize political processes. It’s a strategic exploitation of critical democratic junctures, designed to inject doubt and influence outcomes.

A concrete example of this broader trend can be seen in the consistent efforts to portray Ukraine as a threat to civilians. Ukraine’s Center for Countering Disinformation recently exposed a new Russian narrative falsely alleging that Ukrainian drones were deliberately targeting civilians in the Belgorod region. This particular campaign, as the Center notes, followed a familiar propaganda template: it was built on emotionally charged claims and unverified reports, designed to evoke outrage and sympathy without providing concrete evidence. The latest iteration included fabricated stories of a drone striking a woman’s car and another supposedly chasing an elderly resident with a goat – narratives designed to be easily digestible, emotionally resonant, and quick to propagate, despite a complete lack of independent verification. These are not just stories; they are weapons wielded in the information war, aimed directly at the heart of public perception and trust.

Copyright © 2026 Web Stat. All Rights Reserved.