Melanie Trecek-King is a powerhouse in the world of science education, a speaker, and a writer passionate about helping us all think more clearly, understand science better, and spot the misinformation that clutters our digital lives. She’s the brilliant mind behind “Thinking Is Power,” an Associate Professor of Biology at Massasoit Community College, and even the Education Director for the Mental Immunity Project – a title that sounds as intriguing as her work. Plus, she’s a Fellow of the Committee for Skeptical Inquiry, showing her deep commitment to critical analysis. Soon, her new book, A Field Guide to Spotting Misinformation, will hit the shelves, promising to be an essential tool for anyone trying to navigate the often-tricky landscape of online information. It’s clear she’s on a mission to empower people with the skills to discern truth from falsehood, a skill more vital now than ever.
In a recent chat, Scott Douglas Jacobsen sat down with Melanie to dig into the story behind her upcoming book and how her years of teaching science to students who weren’t necessarily going to be scientists profoundly shaped her approach. Melanie explained that it’s not enough to just pile on more facts; what people truly need are the skills to evaluate information, especially when faced with a torrent of online falsehoods. She illustrated her point with compelling examples, from her own experience with creationism to historical accounts like that of Ed Graves, and the fascinating tale of Leon Festinger’s “Seekers.” These stories highlight how deeply our identity, that uneasy feeling of cognitive dissonance, and our natural tendency to favor information that confirms what we already believe (motivated reasoning) can fuel pseudoscience and outright science denial across all walks of life. It’s a powerful reminder that our beliefs are often more complex than a simple grasp of facts, rooted instead in our very sense of self and community.
When asked the classic “Why did you write the book?” question, Melanie’s answer was disarmingly simple, yet profound: “I wrote the book I wanted to read. I wrote it to help me understand the information environment.” She then delved into a more detailed and engaging explanation, painting a vivid picture of her transformation as an educator. For two decades, she’d been a biology professor, specializing in teaching non-majors – students who needed a science course but weren’t aiming for a career in science. She recalled the frustration of trying to make complex topics like the cell cycle and cancer both useful and engaging, only to see her students feel overwhelmed. They’d memorize the material for the exam, then promptly forget it, often leaving with an even greater aversion to science. Melanie had a lightbulb moment: her students already had access to infinite information on their phones. If they needed to know about a scientific term, they could look it up instantly. But there was a darker side to this accessibility: they were also just one click away from a deluge of misinformation about cancer and countless other topics. She realized that giving them more facts wasn’t the answer. What they desperately needed was the ability to understand and evaluate the information staring back at them. This epiphany completely reshaped her teaching philosophy and, ultimately, led to the book. She wanted to help students decode what they were seeing, understand why they were seeing it, distinguish between reliable and unreliable sources, and find trustworthy information when it truly mattered. She observed that too often, people use search engines not to seek truth, but to feed their existing biases, reinforcing what they already believe. True clarity, she argued, comes from understanding how information actually works and how our own deeply held beliefs can color our interpretation of it.
Crucially, she also recognized the importance of including misinformation in teaching, not avoiding it. While educators often shy away from false claims, Melanie saw them as powerful tools for learning how to critically evaluate evidence. By confronting misleading information in a safe, structured environment, students could develop the chops to recognize it in the wild. This hands-on approach made her material far more engaging and useful, cementing her conviction that these skills aren’t just for students, but for everyone grappling with our complex modern information landscape.
Melanie then shared a powerful example from her book where pseudoscience and science denial tragically converge: her own upbringing as a young Earth creationist. She candidly explained that pseudoscience is believing in something without scientific support, while science denial is actively rejecting well-established scientific evidence. In her case, as a young Earth creationist, she embraced a belief system that lacked evidence and couldn’t be disproven (pseudoscience). Simultaneously, she was denying a cornerstone of biology – evolutionary theory – which provides a unifying framework for understanding all life sciences. To uphold creationism, she had to reject evolution, and the two beliefs, in a powerful feedback loop, reinforced each other. She pointed out that many examples in her book follow this pattern, often stemming from our deep-seated desires or aversions. People embrace pseudoscience because they want something to be true – a miraculous cure, perhaps, or a connection to loved ones who have passed. Conversely, they deny science because they don’t want certain conclusions to be true. Escaping this trap, she stressed, demands self-awareness and an understanding of our own motivations. Some of the most dangerous cases, like the story of Ed Graves (an extreme example she cites without elaborating on it here), involve beliefs so intertwined with our identity, our communities, and our sense of purpose that rejecting them feels like losing a piece of ourselves. Melanie’s journey away from creationism, for instance, meant distancing herself from a community with values she no longer accepted, including misogyny. The realization that she had been wrong was incredibly liberating, but it also came with the painful cost of losing those deep connections. This profound personal cost, she explained, is a massive barrier for many.
To truly engage readers with these complex ideas – reaching deep into their identity, emotions, and worldview – she realized she couldn’t directly challenge their core beliefs. Such a frontal assault would only trigger defensiveness, shifting the focus from critical process to fixed conclusions. Instead, she strategically uses examples that are historical, funny, or culturally distant – cases unlikely to provoke a personal emotional reaction. This clever approach allows readers to hone their critical thinking skills in a non-threatening environment, with the hope that they will then organically apply those hard-won skills to their own deeply held beliefs, fostering genuine, internal change.
The conversation naturally flowed into what motivates people to cling to pseudoscientific beliefs or deny established scientific concepts. While there’s no single magic bullet, Melanie highlighted a fascinating and less emotionally charged story from her book that perfectly illustrates the underlying principles: the captivating tale of Leon Festinger’s research into the “Seekers.” This particular story centers on Dorothy Martin, a housewife in the early 1950s who believed she was a psychic. She claimed to receive messages through automatic writing, first from her deceased father, then from other entities, and eventually from “Sananda,” an extraterrestrial from the planet Clarion whom she equated with Jesus. Sananda, she proclaimed, instructed her to gather a group of followers, dubbed the Seekers, to receive and share prophecies. The core message was apocalyptic: the world was on the brink of destruction by flood or earthquake, but the Seekers would be miraculously rescued by extraterrestrials in a flying saucer. This outlandish premise piqued the interest of social psychologist Leon Festinger, who, along with his colleagues, infiltrated the group to observe how they’d react when the prophecy inevitably failed. The Seekers, deeply committed, followed bizarre instructions, removing all metal objects from their clothes to avoid interfering with the spacecraft. As the predicted doomsday approached, members made enormous sacrifices – quitting jobs, selling possessions, and ostracizing themselves from family – fully convinced of the world’s imminent end. There were even comedic false alarms, like a prank call from someone pretending to be “Captain Video,” which the group, oblivious, didn’t recognize as a hoax, continuing to hold vigil. On the fateful date of December 21st, the group gathered, anticipating their vindication. 
But as the hours dragged on with no sign of aliens or apocalypse, the mood shifted dramatically from feverish excitement to confusion, then profound distress. These individuals had invested everything, and the possibility of being wrong was emotionally devastating. Several agonizing hours later, Martin announced that she had received a new message: their unwavering faith and dedication had actually saved the world from destruction! This ingenious reinterpretation allowed the group to maintain their core beliefs despite the undeniable failure of the prophecy. Melanie explained that this case became a foundational example of cognitive dissonance – the intense psychological discomfort that arises when our reality clashes with our beliefs or actions. To alleviate this discomfort, people often twist reality to preserve their existing worldview. The critical takeaway, she emphasized, is that people rarely change their minds, especially when maintaining a belief carries significant personal costs. The Seekers resolved their dissonance through motivated reasoning, skillfully constructing explanations that allowed them to remain “correct.” The depth of their commitment was paramount: they had invested their reputations, relationships, and material wealth. For them, being wrong was simply not an option. As one follower, a college professor, reportedly confessed, he had sacrificed too much to admit error.
Melanie’s insights beautifully resonated with Scott’s own journalistic experience: true persuasion is rarely instantaneous and almost never about a direct frontal assault. Instead, it’s about engaging people in conversation, allowing them the space to reflect, and letting their views shift incrementally over time. The more someone has invested – whether emotionally, socially, or financially – in a belief, the harder it is for them to change their mind. This powerful principle has critical implications for how we interact with others. Being publicly wrong is profoundly difficult and often humiliating. Therefore, rather than mocking or shaming people for their beliefs, we must extend them the dignity and space to revise their views without losing face. Another crucial lesson from the Seekers story is that persuasion isn’t primarily about facts. In science education, there’s a common misconception that disagreement stems solely from a lack of information, leading us to believe that simply providing more facts will solve the problem. However, as a former young Earth creationist herself, Melanie knows firsthand that facts alone couldn’t have swayed her. Her beliefs weren’t rooted in evidence, even though she thought they were, but profoundly in her identity, emotions, and social connections. When people hold false beliefs, merely presenting them with more data is often ineffective because the core issue isn’t factual; it lies in deeper, underlying motivations. If our goal is to genuinely help people revise their perspectives, we must address these deeper, often intangible factors, rather than just focusing on surface-level arguments. It’s a call for empathy, patience, and a nuanced understanding of human psychology in the pursuit of truth.