
The Role of TikTok Bots and AI in the Resurgence of UK Far-Right Violence

By News Room · December 9, 2024 · 5 Mins Read

Far-Right Exploits AI and Social Media to Fuel Unrest After Stabbing Attack

A stabbing attack that claimed the lives of three children in Southport has become a catalyst for a resurgence of far-right activity in Britain. Within hours of the tragedy, an AI-generated image depicting menacing figures in Muslim attire outside Parliament was circulating on social media, inflaming anti-immigrant sentiment. This incident highlights the alarming ways in which artificial intelligence is being weaponized by extremist groups to spread misinformation and incite violence. The image, shared by an account notorious for spreading false information, quickly gained widespread visibility, demonstrating the speed and reach of online propaganda in the current digital landscape. The inflammatory caption, "We must protect our children," further fueled anxieties and played on existing societal fears.

The use of AI extends beyond static images. AI-generated music featuring xenophobic lyrics about the Southport attack has surfaced on platforms like Suno, which lets users create songs with AI-generated vocals and instrumentals. Tracks such as "Southport Saga," with their disturbing lyrics, show how readily AI tools can produce and spread inflammatory content across multiple mediums, normalizing hateful ideologies through seemingly innocuous channels.

This surge in far-right activity is marked by a level of online coordination and mobilization not seen in years. Exploiting the Southport tragedy, extremist groups have organized more than ten protests, coordinating across platforms including X (formerly Twitter), TikTok, and Facebook. The accompanying online activity includes death threats against the prime minister, incitement to attack government buildings, and blatant antisemitism, underscoring how bold and overt online hate speech has become and how readily it can translate into real-world violence.

Experts warn that this level of mobilization hasn’t been witnessed since the rise of the English Defence League (EDL) in the 2010s. The easy accessibility of AI tools has added a new dimension to the threat, enabling extremists to create a wide range of inflammatory content, from images and songs to text-based propaganda. This democratization of content creation significantly amplifies the potential for manipulation and the dissemination of harmful narratives.

The rise of AI-generated content poses unprecedented challenges. Andrew Rogoyski, a director at the University of Surrey’s Institute for People-Centred AI, emphasizes the ease with which anyone can create compelling, yet potentially harmful, imagery. He stresses the responsibility of AI model providers to implement stronger safeguards against misuse. The current lack of effective regulation and the rapid advancement of AI technology create a dangerous gap that extremist groups are readily exploiting.

The far-right landscape has evolved beyond traditional organizational structures. While established groups like Britain First and Patriotic Alternative remain active, a significant portion of the online activity is driven by individuals unaffiliated with any specific organization. These individuals, often influenced by far-right social media "influencers," contribute time and resources to further a shared political agenda. Joe Mulhall, director of research at Hope Not Hate, describes this as a collaborative effort outside of formal hierarchies, operating through micro-donations and online networking.

The hashtag #enoughisenough, previously associated with anti-migrant activism, has been co-opted by right-wing influencers to promote protests and further their agenda. This tactic underscores the sophisticated use of existing online discourse to amplify extremist narratives and connect with a wider audience. The exploitation of established hashtags allows extremist groups to piggyback on existing conversations and inject their ideologies into mainstream discussions.

Analysts have also noted the use of bots to artificially inflate the visibility of extremist content. Tech Against Terrorism, a UN-backed initiative, identified a TikTok account created specifically after the Southport attack. Despite its recent creation, the account quickly amassed a significant number of views, suggesting that bot networks were used to promote its content. This kind of artificial amplification creates a false impression of popular support for extremist views and can help normalize hateful ideologies.

The influence of key figures like Tommy Robinson, the far-right activist currently evading court proceedings, continues to be significant. Other influential figures, including actor-turned-activist Laurence Fox and conspiracy theory websites like the Unity News Network (UNN), have also played a role in disseminating misinformation and inflammatory rhetoric related to the Southport attack. Their online platforms serve as echo chambers, reinforcing extremist views and encouraging radicalization.

The UNN’s Telegram channel, a platform known for its lack of moderation, has become a breeding ground for violent rhetoric, including calls for the burning of government buildings and the execution of political figures. This unchecked spread of violent speech poses a serious threat to public safety and democratic institutions.

Activists from Patriotic Alternative, one of the fastest-growing far-right groups, were observed at riots in Southport. Several other groups, including those divided over issues like the war in Ukraine and the Israeli-Palestinian conflict, have also attempted to capitalize on the current climate of unrest. This demonstrates the opportunistic nature of extremist groups and their willingness to exploit any event to further their agenda.

Dr. Tim Squirrell, director of communications at the Institute for Strategic Dialogue, warns that the current online information environment is "the worst it’s been in recent years." He highlights the rise of accounts, both large and small, that curate news stories appealing to anti-migrant and anti-Muslim sentiments, often disregarding factual accuracy. This proliferation of misinformation creates a fertile ground for extremist ideologies to take root and flourish.

The confluence of these factors – the exploitation of a tragic event, the weaponization of AI, the coordinated online mobilization, and the spread of misinformation – creates a volatile situation with the potential for significant real-world consequences. The current climate bears disturbing similarities to the period that saw the rise of the EDL, raising concerns about a resurgence of street-level far-right extremism. The challenge lies in addressing the complex interplay of these factors to prevent further escalation and protect vulnerable communities from targeted violence.
