MSU Museum panel teaches about AI, politics and misinformation

By News Room · March 20, 2026 · 5 Mins Read

In an age where information swirls around us like a digital storm, it’s becoming harder to tell what’s real and what’s not. The MSU Museum, with its “Blurred Realities” exhibit, is stepping into this challenging space, inviting us not just to see but to think deeply about the AI-generated content we’re constantly consuming. This isn’t just about cool tech; it’s about understanding how AI is quietly, and sometimes not so quietly, shaping our perceptions, especially in critical areas like elections. A recent panel, “AI, Elections, and the Fight for Facts,” a collaboration between the museum and MSU’s Department of Political Science, brought together experts to unravel this complex knot. The message was unmistakable: this isn’t just theory; it’s happening now, and we need to talk about it.

At the heart of the “Blurred Realities” series lies “Generative Persuasion,” an exhibit crafted by Dr. Jennifer Gradecki and Dr. Derek Curry, associate professors at Northeastern University. They didn’t just create an exhibit; they built a mirror reflecting how generative AI can subtly twist our understanding, crafting compelling yet entirely false narratives that can push people toward extreme views. Their inspiration wasn’t theoretical; it was ripped straight from the headlines. Gradecki recounted unsettling parallels, pointing to the infamous Cambridge Analytica scandal of 2018, in which personal data from millions of Facebook profiles was used without consent to microtarget voters, influencing pivotal elections like Brexit and the 2016 US presidential race. While the exact impact remains debated, the success of the campaigns it supported speaks volumes. She also highlighted OpenAI’s threat reports, which document how state and non-state actors are actively employing ChatGPT and other AI tools to concoct disinformation campaigns. Even more concerning, these actors aren’t just using paid services; they’re leveraging local, open-source AI models to operate in secrecy, making their influence even harder to trace.

Despite its fictional framework, “Generative Persuasion” is a stark portrayal of a very real threat. Gradecki emphasized that while the exhibit is an artwork, it’s not speculative. We know, she asserted, that influence campaigns are already using microtargeting and generative AI to rapidly produce persuasive disinformation. She made a crucial distinction: disinformation isn’t just outright lies; it’s also distorted truths, half-truths, and emotionally charged judgments. The core purpose, she explained, is to manipulate and influence – to ignite strong emotions like pride, hatred, or outrage, or even to divert attention and stifle dissent. For Curry, the exhibit’s mission is clear and urgent: to equip us with the skills to navigate this digital landscape, fostering “media, data and AI literacies” and encouraging a healthy skepticism towards all online content. It’s about empowering us to discern, question, and ultimately, stand firm against manipulative narratives.

Ashlee Smith, Senior Director of Content and Education for WKAR, moderated the discussion, bringing her media perspective to the panel. She underscored the critical timing of these conversations, especially during an election year, as the proliferation of AI content blurs the line between fact and fiction, making it increasingly difficult for ordinary people to tell what’s real. Navigating AI’s rapid evolution in her daily work at WKAR, Smith shared firsthand experience of its pace of change. She articulated WKAR’s commitment to remaining a trusted source of information, emphasizing that its offerings are “human, factual, and editorially sound” in a vast “sea of synthetic media.” Her resolve to uphold journalistic integrity reminds us of the enduring value of authentic reporting.

Smith’s message extends beyond the immediate challenge of identifying AI-generated content; she hopes to ignite a deeper understanding among students about the profound societal implications of AI. She acknowledged that AI can feel like an intimidating subject, yet stressed its paramount importance. “One of the most important things we can do as citizens and consumers is to be media literate,” she urged, emphasizing the need to grasp AI’s effects and implications so we can actively seek truth rather than passively accepting what’s presented. Her hope is that students will engage with these crucial conversations, be inspired to learn more, and share this vital information with their peers, fostering a ripple effect of informed citizenry. Claire Urban, an international relations and comparative cultures and politics sophomore, echoed this sentiment. She shared how her studies at James Madison College have honed her critical thinking skills, enabling her to scrutinize news with a discerning eye. Urban articulated her growing concern about AI’s increasing presence in politics, acknowledging its potential to profoundly shape future elections.

Urban’s concerns about AI’s impact on elections are not just abstract. She cited a Washington Post article detailing how AI companies have already influenced or aligned with dozens of candidates in primary elections. The recent news of OpenAI’s partnership with the Department of Defense, followed swiftly by a campaign against Anthropic labeling it a “supply chain risk,” further illuminates the complex power dynamics at play. As AI technologies advance, she observed, distinguishing between what’s AI and what’s not becomes an increasingly daunting task. Despite this growing influence, Urban firmly believes that “AI should not have a hand in our elections.” She eloquently highlighted the irreplaceable human elements in politics that AI can never replicate. Diplomacy, she argued, is our primary defense, preceding weapons and conflict. Relying on AI for electoral insights or world information, she warned, provides a “watered-down version” that lacks the critical thinking and empathy essential for understanding complex human situations. Urban urged people to engage directly with politicians’ words, to understand their legislative motivations, something AI simply cannot convey. Her powerful call to action stressed the urgent need to fact-check everything in this new era of constantly forged information, warning that without it, we risk misunderstanding conflicts, cultures, and the very ideas that shape our world.

Copyright © 2026 Web Stat. All Rights Reserved.