Russia Used AI in 27% of Disinformation Incidents in 2025 — UNITED24 Media

By News Room · March 21, 2026 · 4 min read

The world of information is undergoing a profound and somewhat unsettling transformation, as nations increasingly weaponize digital tools to sway public opinion and sow discord. One of the primary actors in this evolving landscape is Russia, which, according to insights from the Center for Countering Disinformation, is significantly escalating its disinformation efforts by leveraging the power of artificial intelligence. It’s not just about spreading falsehoods anymore; it’s about doing so with a speed, scale, and sophistication previously unimaginable. This isn’t some distant futuristic scenario; it’s happening right now, shaping perceptions and fueling narratives in real-time.

A recent threat assessment from the European External Action Service (EEAS) paints a stark picture of this digital battleground. In 2025 alone, a staggering 540 instances of foreign information manipulation and interference were documented. These aren’t isolated incidents, but rather a concerted effort involving an estimated 10,500 social media channels and websites. Imagine legions of digital voices, some human, many now artificially generated, all singing from the same hymn sheet of deception. Among the targets of these campaigns, Ukraine remains the perennial focus. The goal is clear: to erode international support for a nation fighting for its sovereignty and to undermine the trust its citizens place in their leaders and their collective resistance. It’s calculated psychological warfare, designed to weaken from within and isolate from without.

What’s truly alarming is the rapid technological evolution underpinning these operations. The EEAS findings reveal a sharp departure from traditional disinformation tactics. A significant 27% of all recorded incidents in 2025 involved sophisticated AI-generated content, whether text designed to sound authentically human, synthetic audio capable of mimicking voices, or manipulated videos that blur the lines between reality and fabrication. This isn’t just about making things look good; it’s about making them seem real enough to be believed, even when they’re entirely manufactured. This technological leap dramatically lowers the barrier to entry, allowing hostile actors to churn out an unprecedented volume of persuasive content with fewer human resources and at a fraction of the cost. It’s an industrialized approach to deception, making it easier and cheaper to flood the information ecosystem with misleading narratives.

While the EEAS report points to a broad array of actors, a significant portion of these manipulative efforts is directly attributable. Among the identified cases, Russia stands out, linked to 29% of the incidents, while China accounts for 6%. A substantial 65% remained unattributed, highlighting the difficulty of identifying perpetrators in a constantly evolving digital landscape where anonymity can be carefully cultivated. The report explicitly states, “Russian and Chinese actors have fully implemented AI tools to speed up content production and increase meddling activities with fewer resources.” This isn’t merely an observation; it’s a warning that generative AI technologies are fundamentally transforming the economics of large-scale influence campaigns, making them more accessible and potent than ever before.

The EEAS also highlights specific vulnerabilities in the information environment. Major political events and significant news cycles are particularly susceptible to these influence operations, with nearly half of the recorded incidents in 2025 coinciding with elections, protests, or international crises. These are moments when emotions run high, anxieties are elevated, and people are actively seeking information, making them prime targets for manipulative narratives. Examples include election-related campaigns tracked in countries such as Germany, Poland, Romania, Moldova, and the Czech Republic, where attempts were made to sway voters or destabilize political processes. It’s a strategic exploitation of critical democratic junctures, designed to inject doubt and influence outcomes.

A concrete example of this broader trend can be seen in the consistent efforts to portray Ukraine as a threat to civilians. Ukraine’s Center for Countering Disinformation recently exposed a new Russian narrative falsely alleging that Ukrainian drones were deliberately targeting civilians in the Belgorod region. This particular campaign, as the Center notes, followed a familiar propaganda template: it was built on emotionally charged claims and unverified reports, designed to evoke outrage and sympathy without providing concrete evidence. The latest iteration included fabricated stories of a drone striking a woman’s car and another supposedly chasing an elderly resident with a goat, narratives crafted to be easily digestible and emotionally resonant, and to propagate quickly despite a complete lack of independent verification. These aren’t just stories; they are weapons wielded in the information war, aimed directly at the heart of public perception and trust.
