AI ‘expert’ exposed: Fake Kremlin-linked analyst planted stories in African media

By News Room | March 18, 2026 (Updated: March 19, 2026) | 5 min read

Imagine a scene straight out of a spy novel, but instead of secret agents and covert meetings, picture artificial intelligence and fake online personas as the main characters. This isn’t fiction, though. It’s the alarming reality of how Russian propagandists, using sophisticated AI tools like ChatGPT, have quietly snaked their way into African media, weaving a web of misinformation and pro-Kremlin tales. The story unfolds with a seemingly credible geopolitical academic, “Dr. Manuel Godsin,” who, as investigations by OpenAI and Code for Africa (CfA) revealed, was nothing more than a ghost in the machine – a digital puppet designed to launder Russian propaganda. This elaborate scheme wasn’t just about Godsin; it was a sprawling network involving fake think tanks and social media groups, all meticulously crafted to spread disinformation far and wide, touching platforms from Facebook to Microsoft’s MSN and numerous African news outlets. It’s a stark reminder that in our increasingly digital world, the battle for truth is being fought with new, invisible weapons.

The “Dr. Manuel Godsin” operation is a textbook example of what those in the information integrity world call a “paper person” – a fabricated identity given just enough surface-level credibility to slip past overworked and under-resourced newsrooms. Imagine a meticulously crafted LinkedIn profile, complete with impressive but entirely made-up PhDs from prestigious universities like Bergen and Oslo, and a long list of supposedly published books that don’t actually exist. This fake academic persona, whose face was even lifted from a real St. Petersburg law student, then became the mouthpiece for pro-Russian, anti-Western narratives. The content itself was often generated by AI, with the orchestrators even trying to trick OpenAI’s own models into sounding less “AI-like” by instructing them to avoid tell-tale grammatical quirks. These articles, often thinly veiled opinion pieces, painted Russia in a positive light while criticizing Ukraine, the US, and the UK, sometimes even dipping into local African politics in countries like South Africa and Kenya. It’s a chilling demonstration of how easily AI can be weaponized to create persuasive, yet entirely false, narratives.

The brilliance and insidiousness of this campaign lay in its sophisticated “information laundering” technique. Narratives seeded by Russian-aligned sources, sometimes even from official state-funded agencies like African Initiative, would quickly morph and reappear under Godsin’s byline in seemingly independent African news outlets. This process adds a layer of false legitimacy, making the propaganda appear as genuine, unbiased analysis from a credible expert. Imagine a story about an alleged 72-hour ultimatum for a South African ambassador to leave the US, initially reported by African Initiative, then swiftly re-spun by “Dr. Godsin” as a sign of South Africa’s new independent international stance. These tactics were systematic, with patterns emerging around various politically charged topics. The goal was clear: take Russian narratives, dress them up as independent African commentary, and then amplify them across a wide range of platforms, effectively poisoning the well of public discourse and shaping perceptions in Africa to align with Russia’s strategic interests.

What makes this campaign particularly concerning is its reach and the willingness of some mainstream media outlets to become unwitting (or perhaps even willing) conduits for this disinformation. Of the 27 websites that published Godsin’s articles, 13 had already been flagged in previous investigations for their involvement in foreign information manipulation. A significant chunk of these articles, almost half, ended up on platforms associated with South Africa’s third-largest media conglomerate, Independent Media, a group that has itself faced scrutiny for publishing articles by other fictitious authors and for its perceived pro-Russia stance. Even reputable platforms like MSN, known for their strict editorial guidelines, were caught in the dragnet, publishing Godsin’s articles as “expert analysis” – including one advancing the outlandish claim that the US was secretly providing humanitarian aid to a whites-only town in South Africa. This illustrates a critical vulnerability: even seemingly robust media safeguards can buckle under the sophisticated pressure of a well-resourced and technologically advanced disinformation operation.

The consequences are far-reaching. When fabricated stories from anonymous sources are allowed to circulate in mainstream media, even if later debunked, they gain what CfA calls “archival authority.” The sheer act of being published lends them a deceptive air of truth and legitimacy, influencing public opinion and shaping narratives long after their falsehood has been exposed. The Godsin case highlights that the problem isn’t just about individual fake accounts; it’s about the systemic weaknesses in journalistic gatekeeping. As Sbu Ngalwa, secretary-general of The African Editors Forum, aptly noted, AI dramatically lowers the barrier to entry for disinformation. In a world saturated with information, where newsrooms are often stretched thin, the human element of journalism – critical thinking, rigorous fact-checking, and adherence to ethical standards – becomes an indispensable safeguard against foreign interference. Without these defenses, African news organizations risk becoming “useful idiots” in a larger geopolitical game, unknowingly spreading propaganda that harms their own communities.

Ultimately, the “Dr. Manuel Godsin” saga is a cautionary tale for the digital age. It’s a stark reminder that the information we consume, particularly online, needs to be met with a healthy dose of skepticism and critical analysis. For media organizations, it underscores the urgent need to bolster their editorial safeguards, invest in fact-checking, and enforce rigorous standards of verification. In a world where AI can conjure up convincing fake personas and narratives at an unprecedented scale, the responsibility to discern truth from falsehood increasingly falls on both news producers and news consumers alike. The fight against foreign information manipulation is not just a technological challenge; it’s a battle for truth, trust, and the integrity of democratic discourse, demanding constant vigilance and a renewed commitment to ethical journalism.

Copyright © 2026 Web Stat. All Rights Reserved.