A Nobel economist models how AI rots the information environment

By News Room | May 11, 2026 | 5 min read

It seems we’re all feeling it: that nagging sense that something is off with the information we encounter online. Australians in particular are worried about it, more so than the prospect of a foreign attack, according to a major national survey. We feel swamped by information, struggle to tell real from fake, and suspect that algorithms are pulling our strings. Now two economists, Nobel laureate Joseph Stiglitz and Maxim Ventura-Bolet, have supplied the economic evidence for that feeling. They show that, without intervention, the market is structured to produce more fake news and less truth, and that artificial intelligence will only pour gasoline on the fire. Their key takeaway is clear: we can’t fix this by asking people to be nicer online. We have to change the fundamental ways in which information is produced and shared.

To understand this mess, let’s simplify a bit. Think of information as something that’s bought and sold. For a long time, traditional news organizations – newspapers, journalists, broadcasters – were the main producers, and we, the public, were the consumers. Quality journalism, which isn’t cheap to make, was largely supported by advertising. This system, while imperfect, at least kept the lights on for newsrooms. But then came social media, and everything changed. Suddenly, new players emerged – citizen journalists, influencers – and sandwiched between them and us were these huge digital platforms. Remember how hard it used to be to find good information online before Google? These platforms were a godsend, gathering everything, making it searchable, and surfacing what seemed most relevant, all for free. But Stiglitz and Ventura-Bolet highlight the insidious truth: it’s not that we became lazy; it’s that these platforms deliberately made it harder to go directly to original sources because their entire business model depends on keeping our eyes glued to their screens.

The core of the problem is that platform owners make their money through advertising and data collection, both of which thrive on engagement. Every second we spend on their platforms is revenue for them; if we go directly to a news site, that revenue goes to someone else. What the platforms quickly realized was that outrage and strong emotion keep people hooked far longer than calm, factual reporting, so their algorithms began rewarding provocative content, true or not, because it generated more clicks. Nuanced discussion, thoroughly verified facts, public interest journalism – these don’t get the same algorithmic boost because they generate less engagement, and therefore less revenue. As a result, the algorithms, designed to keep us coming back, steered us away from original news sites and towards AI overviews or our social media feeds. The original content creators, the ones doing the hard work, lost the traffic and, consequently, the revenue – usually without us realizing the cost until much later.

Now imagine AI stepping into this already fragile picture. On one hand, it can legitimately summarize and synthesize information by scraping content from quality journalism – but it does so without necessarily paying the original creators, undermining their revenue streams further. On the other hand, AI can churn out incredibly convincing fake content, cheaply and at breakneck speed. Stiglitz makes a crucial point: AI is not concerned with truth or falsehood; it is optimized for efficiency. It has no inherent preference for producing accurate information over misleading content if the latter is easier to generate. That capability allows a flood of disinformation and low-quality content to overwhelm the information ecosystem, and when quality producers are no longer incentivized to create new, truthful content, the problem compounds. As Stiglitz puts it, invoking an old but still relevant adage: “garbage in, garbage out.” If AI is trained on biased or inaccurate data, it will inevitably produce distorted content, muddying the waters further.

The ripple effect isn’t just about truth; it’s also about polarization. Stiglitz and Ventura-Bolet’s model shows how the dynamic becomes self-reinforcing. Once disinformation gains traction, audiences, often driven by a need for confirmation, seek out content that reinforces their existing beliefs. That creates demand for more disinformation, which in turn becomes profitable to produce, and the cycle pushes quality news organizations, which depend on objective reporting, further out of business. The economic incentives, in other words, are structured to reward disinformation, leading to a spiral in which quality information struggles to survive. The big winners are the platforms, which profit from our attention regardless of content quality, and the misinformation producers who use cheap, AI-generated content to capture engagement. The losers are clear: quality news, public interest journalism, and, most importantly, the public that relies on them for an informed society.
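To see why a cycle like this spirals rather than settles, it helps to sketch the feedback loop as a toy simulation. To be clear, this is an illustrative sketch only, not Stiglitz and Ventura-Bolet’s actual model: the update rule and every parameter (`amplification`, `exit_rate`, the starting share `d0`) are invented for illustration. The point is simply that when algorithmic amplification and quality-producer exit both feed on the current share of disinformation, that share grows toward saturation rather than self-correcting.

```python
def simulate(steps=50, amplification=0.3, exit_rate=0.1, d0=0.1):
    """Iterate the share of circulating content that is disinformation.

    d: share of disinformation (between 0 and 1).
    Each step, engagement-optimizing algorithms amplify the provocative
    share, and quality producers exit in proportion to the revenue they
    have lost (crudely modeled as proportional to d). Both effects push
    d upward; neither pushes it back down.
    """
    d = d0
    history = [d]
    for _ in range(steps):
        # Algorithmic amplification: provocative content gains share.
        d = d + amplification * d * (1 - d)
        # Producer exit: the truthful share shrinks, so d grows again.
        d = d + exit_rate * d * (1 - d)
        history.append(d)
    return history

trajectory = simulate()
# The share rises monotonically from 10% toward saturation near 100%.
```

The takeaway from the sketch is that there is no interior equilibrium: absent an outside force (the regulatory intervention the authors call for), every step moves the system further from a healthy information mix.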

Stiglitz and Ventura-Bolet offer a stark warning but also a glimmer of hope: this is not a destiny we have to accept. They are firm, however, that the market, left to its own devices, will not fix itself – no one inside the current system is incentivized to act differently. Individuals logging off social media, while perhaps good for their mental well-being, won’t change the underlying economics for platforms or AI companies. The only way to stop the spiral, they argue, is government intervention: holding platforms accountable for the content they amplify, compelling them to tackle coordinated disinformation campaigns, and ensuring news producers have intellectual property protections for their work. Australia has shown leadership here before, with its attempt to make digital platforms pay for news content. Recent agreements with AI companies are a positive step, but a memorandum of understanding is not the same as concrete regulation, and goodwill alone is not a strong enough incentive structure. The urgent task now is to turn these efforts into binding rules before the damage to our information environment becomes irreversible and the ability to distinguish truth from fiction is lost entirely.
