Web Stat
Combating Disinformation Through an Organized Crime Framework

By News Room | January 19, 2025 | 5 Mins Read

A New Approach to Disinformation: Treating it Like Organized Crime

The Australian government’s current approach to combating disinformation, centered on content moderation and platform access control, is akin to mopping the floor while a flood rages. It addresses surface-level symptoms but ignores the underlying cause – the organized networks that drive disinformation campaigns. A more effective strategy would be to treat disinformation as a form of organized crime, focusing on dismantling the networks responsible for its creation and dissemination, rather than simply reacting to individual instances of false content. Organized crime legislation proves effective because it targets the underlying structure and patterns of criminal activity, not just the commodities being traded. Similarly, laws targeting disinformation should focus on indicators of coordinated inauthentic behavior, financial patterns, and systematic manipulation for profit or influence – regardless of the specific content. This approach would effectively target the disinformation infrastructure without compromising freedom of expression.
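As a rough illustration of what "indicators of coordinated inauthentic behavior" might mean in practice, the toy sketch below flags identical text posted by many distinct accounts within a short time window. All data, names, and thresholds here are hypothetical; real platform detection systems combine many more signals and are far more sophisticated.

```python
from collections import defaultdict
from datetime import datetime, timedelta

def flag_coordinated_posts(posts, window_minutes=10, min_accounts=5):
    """Flag texts posted by many distinct accounts within a short window.

    `posts` is a list of (account_id, text, timestamp) tuples. Thresholds
    are illustrative placeholders, not values any platform actually uses.
    """
    # Group posts that share the exact same text.
    by_text = defaultdict(list)
    for account, text, ts in posts:
        by_text[text].append((account, ts))

    flagged = []
    window = timedelta(minutes=window_minutes)
    for text, entries in by_text.items():
        entries.sort(key=lambda e: e[1])  # sort by timestamp
        accounts = {a for a, _ in entries}
        # Many distinct accounts, all posting within the window: a crude
        # coordination signal (behavior-based, content-agnostic).
        if len(accounts) >= min_accounts and entries[-1][1] - entries[0][1] <= window:
            flagged.append(text)
    return flagged
```

The point of the sketch is that the test looks only at posting *patterns* (how many accounts, how tightly synchronized), never at whether the text itself is true or false, mirroring the article's argument that enforcement should target coordination rather than content.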

Instead of the current whack-a-mole approach to content moderation, which relies on controversial community notes and ineffective access bans, governments, social media platforms, and cybersecurity partners could collaboratively target and dismantle malicious disinformation enterprises. By focusing on the networks and actors behind these campaigns, resources can be directed towards identifying and disrupting the coordinated efforts that amplify and spread disinformation. This shift in focus allows for more proactive measures, rather than reactive responses to individual pieces of content. This approach recognizes that disinformation is a deliberate act, distinct from misinformation, which is unintentionally false. While previous efforts have focused on content moderation and fact-checking, these methods have proven largely ineffective in stemming the tide of disinformation.

The limitations of current content moderation practices are evident in the challenges posed by both human and AI-driven moderation efforts. Human moderation is costly and time-consuming, while AI moderation often struggles with nuances of language, context, and intent, leading to misclassification of content and difficulty in differentiating between disinformation and legitimate discussions about disinformation. Automated systems often fail to detect harmful content in languages other than English, regional dialects, and culturally specific contexts. Furthermore, AI often misinterprets satire and humor, leading to the removal of legitimate content. Platform access control, such as age-based bans, presents another set of challenges, particularly in the realm of enforcement. More critically, such measures raise fundamental philosophical concerns by impeding young people’s development as informed digital citizens and restricting their participation in online civic discourse.

The philosophical underpinnings of free speech in liberal democracies present a significant challenge to traditional content moderation approaches. These approaches treat free speech as a technical problem solvable through algorithms and regulations, neglecting its crucial role in the democratic process. Limiting access to online platforms for specific age groups creates tension with democratic principles, as it restricts young people’s access to vital spaces for civic engagement and public discourse. It can also hinder the development of essential digital literacy skills, leaving young people unprepared for the complexities of online environments when they eventually gain unrestricted access, which is in any case difficult to prevent in the modern age.

Framing disinformation as organized crime allows for a more targeted approach, addressing the root cause of the problem: the malicious actors and networks that create and distribute harmful content. This approach avoids the pitfalls of broad restrictions on speech and platform access by focusing specifically on malicious groups, regardless of whether their origin is foreign interference, domestic coordinated inauthentic networks, or financially motivated groups profiting from fake news. By focusing on the organized nature of these efforts, law enforcement and intelligence agencies can leverage existing frameworks designed to combat organized crime to disrupt the infrastructure supporting disinformation campaigns.

To effectively prosecute disinformation as organized crime, several elements must be demonstrated: criminal intent, harm or risk to public safety, structured and coordinated efforts, and proceeds of crime. The definition of disinformation itself, which involves the intent to deceive for malicious purposes, covers the first two elements. Assessments from organizations like the Australian Security Intelligence Organisation (ASIO) and the Australian Strategic Policy Institute (ASPI) have highlighted the threat and harm posed by foreign interference and online information operations targeting Australia. These assessments provide evidence of both intent and harm, to individuals, institutions, and society, pursued for financial or political gain.

The structured and coordinated nature of disinformation campaigns is also well-documented. Social media companies like Meta, Google, Microsoft, OpenAI, and TikTok regularly detect and disrupt covert online influence operations, demonstrating the organized nature of these activities and providing insights into the tactics employed by malicious actors. Finally, the financial element of disinformation, its funding streams, can be treated as proceeds of crime, further aligning it with the organized crime model. Investigating and targeting the financial infrastructure supporting these campaigns, including shell companies, suspicious transactions, and compromised accounts, can help differentiate malicious actors from individuals expressing genuine, albeit controversial, beliefs. By focusing on the organized and financially driven nature of disinformation, this approach minimizes the risk of chilling legitimate discourse or impeding the free exchange of ideas, thereby preserving the core values of liberal democratic societies.

Copyright © 2025 Web Stat. All Rights Reserved.