AI ‘expert’ exposed: Fake Kremlin-linked analyst planted stories in African media

By News Room | March 18, 2026 (Updated: March 19, 2026) | 5 Mins Read

Imagine a scene straight out of a spy novel, but instead of secret agents and covert meetings, picture artificial intelligence and fake online personas as the main characters. This isn’t fiction, though. It’s the alarming reality of how Russian propagandists, using sophisticated AI tools like ChatGPT, have quietly snaked their way into African media, weaving a web of misinformation and pro-Kremlin tales. The story unfolds with a seemingly credible geopolitical academic, “Dr. Manuel Godsin,” who, as investigations by OpenAI and Code for Africa (CfA) revealed, was nothing more than a ghost in the machine – a digital puppet designed to launder Russian propaganda. This elaborate scheme wasn’t just about Godsin; it was a sprawling network involving fake think tanks and social media groups, all meticulously crafted to spread disinformation far and wide, touching platforms from Facebook to Microsoft’s MSN and numerous African news outlets. It’s a stark reminder that in our increasingly digital world, the battle for truth is being fought with new, invisible weapons.

The “Dr. Manuel Godsin” operation is a textbook example of what those in the information integrity world call a “paper person”: a fabricated identity given just enough surface-level credibility to slip past overworked and under-resourced newsrooms. Imagine a meticulously crafted LinkedIn profile, complete with impressive but entirely made-up PhDs from prestigious universities like Bergen and Oslo, and a long list of supposedly published books that don’t actually exist. This fake academic persona, whose face was even lifted from a real St. Petersburg law student, then became the mouthpiece for pro-Russian, anti-Western narratives. The content itself was often generated by AI, with the orchestrators even prompting OpenAI’s own models to avoid tell-tale grammatical quirks so the output would sound less “AI-like.” These articles, often thinly veiled opinion pieces, painted Russia in a positive light while criticizing Ukraine, the US, and the UK, sometimes even dipping into local African politics in countries like South Africa and Kenya. It’s a chilling demonstration of how easily AI can be weaponized to create persuasive, yet entirely false, narratives.

The brilliance and insidiousness of this campaign lay in its sophisticated “information laundering” technique. Narratives seeded by Russian-aligned sources, sometimes even from official state-funded agencies like African Initiative, would quickly morph and reappear under Godsin’s byline in seemingly independent African news outlets. This process adds a layer of false legitimacy, making the propaganda appear as genuine, unbiased analysis from a credible expert. Imagine a story about an alleged 72-hour ultimatum for a South African ambassador to leave the US, initially reported by African Initiative, then swiftly re-spun by “Dr. Godsin” as a sign of South Africa’s new independent international stance. These tactics were systematic, with patterns emerging around various politically charged topics. The goal was clear: take Russian narratives, dress them up as independent African commentary, and then amplify them across a wide range of platforms, effectively poisoning the well of public discourse and shaping perceptions in Africa to align with Russia’s strategic interests.

What makes this campaign particularly concerning is its reach and the willingness of some mainstream media outlets to become unwitting (or perhaps even willing) conduits for this disinformation. Of the 27 websites that published Godsin’s articles, 13 had already been flagged in previous investigations for their involvement in foreign information manipulation. Nearly half of these articles ended up on platforms associated with South Africa’s third-largest media conglomerate, Independent Media, a group that has itself faced scrutiny for publishing articles by other fictitious authors and for its perceived pro-Russia stance. Even reputable platforms like MSN, known for their strict editorial guidelines, were caught in the dragnet, publishing Godsin’s articles as “expert analysis,” including an outlandish claim that the US was secretly providing humanitarian aid to a whites-only town in South Africa. This illustrates a critical vulnerability: even seemingly robust media safeguards can buckle under the sophisticated pressure of a well-resourced and technologically advanced disinformation operation.

The consequences are far-reaching. When fabricated stories from anonymous sources are allowed to circulate in mainstream media, even if later debunked, they gain what CfA calls “archival authority.” The sheer act of being published lends them a deceptive air of truth and legitimacy, influencing public opinion and shaping narratives long after their falsehood has been exposed. The Godsin case highlights that the problem isn’t just about individual fake accounts; it’s about the systemic weaknesses in journalistic gatekeeping. As Sbu Ngalwa, secretary-general of The African Editors Forum, aptly noted, AI dramatically lowers the barrier to entry for disinformation. In a world saturated with information, where newsrooms are often stretched thin, the human element of journalism – critical thinking, rigorous fact-checking, and adherence to ethical standards – becomes an indispensable safeguard against foreign interference. Without these defenses, African news organizations risk becoming “useful idiots” in a larger geopolitical game, unknowingly spreading propaganda that harms their own communities.

Ultimately, the “Dr. Manuel Godsin” saga is a cautionary tale for the digital age. It’s a stark reminder that the information we consume, particularly online, needs to be met with a healthy dose of skepticism and critical analysis. For media organizations, it underscores the urgent need to bolster their editorial safeguards, invest in fact-checking, and enforce rigorous standards of verification. In a world where AI can conjure up convincing fake personas and narratives at an unprecedented scale, the responsibility to discern truth from falsehood increasingly falls on both news producers and news consumers alike. The fight against foreign information manipulation is not just a technological challenge; it’s a battle for truth, trust, and the integrity of democratic discourse, demanding constant vigilance and a renewed commitment to ethical journalism.
