It’s a perplexing reality in our digitally saturated world: why does “fake news” often spread like wildfire while well-researched, factual information struggles to gain traction? This question has long occupied Kennesaw State University researcher Aaron French, and his latest study, published in the journal Information Systems Frontiers, sheds revealing light on the matter. French, an associate professor in the Michael J. Coles College of Business, partnered with Amrita George and Veda C. Storey of Georgia State University and Joshua Madden of the University of Tennessee to delve into the psychology behind our susceptibility to misinformation. Their findings point to a powerful, often subconscious force at play: human emotion. When a piece of information strikes a deep emotional chord, whether by tapping into our anger, our fear, or a sense of personal relevance, we’re far more likely to embrace it, endorse it, and enthusiastically share it, even if its factual basis is tenuous at best. This emotional connection, they discovered, is a far more potent driver of belief and dissemination than mere truth. It speaks to our yearning for stories that resonate with our inner world, even when those stories are fabrications.
To understand this phenomenon, French and his team embarked on an ambitious endeavor, meticulously analyzing approximately 10,000 social media posts collected during the turbulent 2020 coronavirus pandemic. This period, rife with uncertainty and heightened emotions, provided a fertile ground for both genuine and deceptive information to flourish. From this wealth of data, they crafted an innovative framework they call the Content Dimensions–Overton Window–Perceived Utility model. Imagine it as a lens through which we can better understand how people sift through questionable information when the ground beneath them feels shaky. This model homes in on three crucial dimensions of any news item: its veracity, or how true it purports to be; its emotional appeal, essentially how it makes you feel; and its relevance, or how closely it touches upon your personal life and concerns. Their analysis revealed a critical insight: when faced with the deluge of online content, individuals are remarkably attentive to the tone of a message and the emotional impact it carries. This sensitivity often outweighs a rigorous assessment of its factual accuracy. It highlights a common human tendency to prioritize how information feels over how true it is, a vulnerability that purveyors of fake news skillfully exploit.
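The model’s three dimensions can be made concrete with a toy scoring sketch. To be clear about what is and isn’t from the study: the dimension names (veracity, emotional appeal, relevance), the idea of perceived utility, and the “true enough” tolerance window come from the paper as described above, but the numeric scales, the weights favoring emotion over accuracy, and the tolerance value below are invented for illustration and are not the authors’ actual formulation.

```python
from dataclasses import dataclass

@dataclass
class NewsItem:
    """Hypothetical 0-to-1 scores on the study's three content dimensions."""
    veracity: float          # 0.0 (fabricated) .. 1.0 (fully verified)
    emotional_appeal: float  # 0.0 (neutral) .. 1.0 (highly charged)
    relevance: float         # 0.0 (distant) .. 1.0 (personally relevant)

def perceived_utility(item: NewsItem,
                      w_emotion: float = 0.45,
                      w_relevance: float = 0.35,
                      w_veracity: float = 0.20) -> float:
    """Weighted score reflecting the finding that emotional impact and
    personal relevance outweigh factual accuracy in how readers value a
    message. The weights are illustrative assumptions, not fitted values."""
    return (w_emotion * item.emotional_appeal
            + w_relevance * item.relevance
            + w_veracity * item.veracity)

def within_overton_window(item: NewsItem, tolerance: float = 0.4) -> bool:
    """Treats the Overton Window as a veracity tolerance: a wider
    tolerance means lower-veracity items still count as 'true enough'
    to accept without verification."""
    return item.veracity >= 1.0 - tolerance

# A low-veracity but emotionally charged, personally relevant rumor can
# score higher perceived utility than an accurate but dry report.
rumor = NewsItem(veracity=0.2, emotional_appeal=0.9, relevance=0.8)
report = NewsItem(veracity=0.95, emotional_appeal=0.2, relevance=0.3)
print(perceived_utility(rumor) > perceived_utility(report))  # True
```

Widening the `tolerance` parameter mirrors the widening window the researchers describe: at the default tolerance the rumor falls outside the window, but at a much wider tolerance the same low-veracity item is accepted as “true enough.”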
A particularly striking revelation from their research concerns the “Overton Window.” Coined by policy analyst Joseph P. Overton, this concept refers to the range of ideas or policies that are considered acceptable within a society. In the context of news, the “Overton Window” represents the boundaries of what people are willing to accept as “true enough.” Professor Veda C. Storey, a computer information systems expert at GSU and a co-author of the study, explained that people often have a certain tolerance for information that is “close enough” to the truth, sparing them the effort of thorough verification. What’s concerning, she notes, is that this “window” appears to be widening significantly. This means that a growing number of people are becoming more open to believing information that is increasingly extreme or deviates further from verifiable facts. As Storey puts it, this widening window creates a fertile environment for fake news to become not only more extreme in its content but also to exert a far broader and more disruptive influence on society. It’s a worrying trend, indicating a potential erosion of collective critical thinking and an increasing susceptibility to narratives that might once have been dismissed out of hand.
French draws a helpful distinction between tabloid news and fake news to underscore the deceptive nature of the latter. “Fake news resembles tabloid news in almost every way,” he observes. “The difference is ambiguity.” He elaborates that when people encounter tabloid headlines about celebrities having alien babies, there’s a clear understanding that the story is entertainment, not factual reporting. The intention is humor or shock value, and the audience implicitly recognizes the fantastical nature of the claims. Fake news, however, operates with a far more insidious agenda. “But fake news pretends to be real reporting,” French emphasizes. This pretense is crucial: fake news deliberately mimics the aesthetics and conventions of legitimate journalism, blurring the line between credible information and outright fabrication. This deliberate ambiguity is what makes it so dangerous and effective. It leverages our trust in established news formats while injecting verifiably false information, intending to mislead and manipulate our perceptions. It is a wolf in sheep’s clothing, masquerading as a reliable source to push an agenda, sow discord, or simply deceive.
Given the complexities of this landscape, French offers some straightforward yet powerful advice: education and verification are our best defenses. His simplest recommendation is to “go to the source.” He urges us to resist the impulse to react immediately when we encounter something that provokes a strong emotional response. “Don’t go to social media and start posting or reacting to it,” he advises. Instead, if a piece of information truly stirs your emotions, French counsels a pause for reflection and diligent research. “Check the source first and learn about it. Listen to the actual speech or read the actual document for yourself. Don’t just read one quick story, have an emotional reaction, and share the story without knowing what’s taking place.” He points to Finland’s national media-literacy curriculum, which teaches students from a young age how information can be manipulated, as an exemplary model for fostering critical thinking. This proactive approach, teaching individuals to be discerning consumers of information, is crucial in an age of rampant misinformation. It empowers people to take control of their information diet: to question, to verify, and ultimately to make informed decisions rather than being swayed by emotional manipulation.
The advent of artificial intelligence, French acknowledges, has added another layer of complexity to this already challenging environment. While he emphatically states that “AI doesn’t create fake news. People do,” he concedes that AI has made disinformation significantly more potent and pervasive. The ability of AI to generate highly realistic images, videos, and even text makes deception far easier and more convincing. “Some videos are so real you can’t tell the difference,” he notes, highlighting the sophisticated nature of AI-powered misinformation campaigns. This technological advancement presents a formidable challenge, making it increasingly difficult for the average person to distinguish between genuine and fabricated content. Despite these daunting obstacles, French remains optimistic about the potential of understanding the psychological underpinnings of fake news. He believes that by delving into why people believe misinformation, we can develop effective strategies to combat it. “This study is important because once we understand those causes of why people start believing information, we can try to reduce misinformation and help people verify what they read,” he asserts. The ultimate goal, he concludes, is to “develop approaches to overcome that and reduce that believability, reduce that viral impact. That’s how we get back to real conversations instead of emotional reactions.” By dissecting the emotional levers that fake news manipulates, we can equip ourselves to be more conscious, critical consumers of information, moving away from knee-jerk emotional responses toward a more reasoned, factual engagement with the world.
