AI decides what we see online. Digital platforms must tell us how they do it

By News Room · May 3, 2026 · 7 Mins Read

It feels like the world of news has been turned on its head, and we’re all scrambling to figure out what’s real, what’s reliable, and who we can actually trust. The lines have blurred, and a lot of that confusion stems from the digital platforms we use every day – you know, social media, search engines, and now, those increasingly common AI chatbots. These platforms, through their hidden algorithms, are making invisible choices about the news we see. It’s like they’re the silent editors of our digital world, deciding what gets a spotlight and what gets buried, often without any real accountability or consideration for accuracy. This isn’t just a minor inconvenience; it’s fundamentally reshaping how we access information, impacting everything from civic discourse to our very trust in the news.

We’re all feeling it – that constant barrage of information, the nagging doubt about whether something we read online is actually true. Australians, in particular, are becoming so overwhelmed that they’re starting to switch off from traditional news sources, drifting instead towards the seemingly endless feeds of social media, the opinions of influencers, and now, the slick summaries offered by AI. It’s a murky, polluted landscape where these opaque algorithms are the gatekeepers, and honestly, they don’t seem to care much about facts or quality. This is a huge problem, because a healthy society relies on solid, evidence-based reporting to make informed decisions and thrive. It’s like we’re trying to navigate a dense fog, and the only maps we have are constantly changing and often misleading.

And just when we thought it couldn’t get worse, local journalism – that essential bedrock of community information – is struggling to survive, disappearing from our towns and cities. The distrust many people feel towards mainstream news is at an all-time high, and the rise of “zero-click” AI search results is pouring gasoline on the fire. Instead of offering links to actual news articles, these AI systems are serving up answers directly, which means fewer people visit news websites. This directly impacts their audience, their ability to get subscribers, and their revenue, pushing an already fragile news industry right to the brink. It’s a domino effect: less traffic, less revenue, less journalism, and ultimately, less reliable information for all of us.

Recognizing this growing crisis, a diverse group of leaders – folks from industry, government, non-profits, digital platforms, and academia – gathered recently for a “News Futures: Media Policy Roundtable.” The resounding consensus? The secrecy surrounding how algorithms on social media, search engines, and AI platforms operate – how they decide what to show, what to rank, and what to hide, all with minimal accountability – has become a major threat to the very existence of journalism and to the public’s trust. The report that came out of this meeting isn’t just a call for minor tweaks; it’s a demand for a complete overhaul, a “paradigm shift,” in how we define and support journalism in Australia. It’s a pretty bold statement, and it speaks to the urgency of the situation.

It’s no surprise that misinformation is absolutely thriving in this environment. When people are hungry for information but can’t find enough verified, credible sources, that’s when the lies and half-truths take root. Quality news, readily available, acts as a crucial counterweight to this misinformation. Our research vividly shows a strong connection between regularly consuming news and people’s ability to tell the difference between fact and fiction. But the content landscape is evolving faster than our ability to regulate it. Laws and civic education can’t keep up with things like “deepfakes” created by AI, which can be incredibly convincing. There are no clear rules about where online content comes from or how to check if it’s real. And because many AI systems are like “black boxes,” it’s incredibly hard to pinpoint who’s responsible when they spread false information or show biases.

The statistics are pretty stark. Australians already have very little confidence in their own ability to spot misinformation. Only about 40% feel confident they can determine if a website or social media post is trustworthy, and just 43% believe they can verify the truthfulness of information found online. This problem is getting worse with the increasing prevalence of “AI slop” – that low-quality, often factually incorrect content generated by AI. It’s led to a situation where Australians are among the most concerned globally about online misinformation. When everything starts to feel unreliable, it’s easy to just switch off, and a significant 69% of Australians admit to avoiding news often, sometimes, or occasionally. This disengagement is a dangerous trend, leaving us more susceptible to manipulation and less equipped to make informed decisions.

The problem, at its core, is that digital platforms are fundamentally unreliable gateways to news. Their algorithms, those invisible decision-makers, are constantly shaping what we see and don’t see. They act as filters, elevating some content and demoting the rest, frequently without much regard for quality or accuracy. They create “winners” and “losers” in the information ecosystem. The frustrating part is, there’s no real incentive for these platforms to tell us how their algorithms work, when they change, or how they decide which news gets prioritized (or de-prioritized). They certainly aren’t keen on explaining how AI-generated content is produced. There’s an urgent, pressing need for transparency in how algorithms select and present information, and a clear requirement for mandatory labeling of anything generated by AI. We need to know who or what is behind the information we consume.

Thankfully, the roundtable participants didn’t just identify problems; they also laid out five vital priorities to significantly improve our current information ecosystem. Three of these zero in on AI directly. First, we need much greater transparency from the big technology companies. Australians deserve to know how algorithms on search engines, social media, and AI chatbots curate news. We need clear labels and disclosures whenever AI is involved in creating content. This kind of transparency would be a massive step towards rebuilding trust and giving users more control over their information stream.

Second, there need to be fair rules for how AI uses news content. It’s simply not right for AI companies to take journalism for free to train their systems. We need industry-wide licensing agreements, copyright reform, and stronger competition laws to ensure news organizations are properly compensated when their hard work is used by generative AI tools. This would help sustain the very news that these AI models rely on. Third, and perhaps most crucial, is prioritizing media and AI literacy education across the entire nation. Teaching people how algorithms work, how to recognize bias, and how to spot misinformation is one of the fastest and most cost-effective ways to intervene. And it’s not just for kids in schools; adults need ongoing opportunities to learn these essential digital skills too.

Beyond AI, the roundtable highlighted two other critical areas. Fourth, journalism funding should reflect its role as a public good. One-off grants aren’t enough to sustain a vital industry. Proposals like a tax offset for journalists’ salaries could offer a more sustainable way to support newsrooms directly, especially smaller and regional outlets, while ensuring accountability. Finally, there’s a need for journalism training for news influencers, content creators, and digital-first outlets. As the media landscape diversifies, a common industry code is essential to maintain the quality and integrity of the entire news ecosystem. This isn’t a task for one player; the industry needs to work together to establish these standards.

Ultimately, society simply cannot afford to live in an information environment where invisible AI dictates what we see, hear, and believe. Without decisive action, the public interest journalism that forms the very backbone of our democracy and social cohesion will continue to weaken and, eventually, crumble. The stakes are incredibly high, and it’s up to all of us – technologists, journalists, policymakers, and citizens – to demand a more transparent, accountable, and trustworthy information world.
