Our world is grappling with a relentless surge of health misinformation, and as an epidemiologist, I find myself at the forefront of this battle. It’s a strange new reality, where my social media followers have become an early-warning system, alerting me to the latest waves of false claims before they even hit my radar. Just this week, a hantavirus outbreak on a cruise ship, the MV Hondius, ignited a familiar and troubling pattern. Within hours of the first news headlines, my direct messages were flooded with screenshots. One particularly egregious example came from a Texas doctor infamous for promoting ivermectin during the COVID-19 pandemic. She was already advocating ivermectin as a hantavirus treatment, a claim utterly devoid of scientific basis. The messages weren’t just from concerned strangers; some came from parents whose family members had already been swayed by these claims. Others came from long-time followers, reporting the unfolding misinformation as if they were witnessing an emergency. By the time I sat down to record a video addressing the outbreak, I had a backlog of misinformation to debunk, much of which hadn’t even reached my personal feed yet. This constant influx has transformed my role: I no longer simply debunk false claims; I rely on my audience as a real-time defense against the spread of false health information. The sheer speed of it all no longer surprises me; it’s become the grim norm.
The actual hantavirus outbreak is serious, yes, but thankfully, it’s also contained. Eight cases have been linked to the MV Hondius, and sadly, three have been fatal. The strain involved, Andes hantavirus, is notable because, unlike most hantaviruses, it can spread person-to-person. However, this spread typically requires prolonged, close contact, meaning the general public health risk, as assessed by the World Health Organization, remains low. Crucially, there is no specific antiviral treatment for hantavirus; supportive care is the standard medical approach. And, to reiterate, ivermectin does not treat hantavirus. But in the digital realm, these vital facts quickly became irrelevant. Within a single day, social media accounts were pushing narratives blaming the outbreak on COVID-19 vaccines. Even prominent figures like former Representative Marjorie Taylor Greene echoed this dangerous framing, suggesting that pharmaceutical companies “manipulate the virus, make the vaccine, and then make the profits.” Other accounts issued warnings against a non-existent hantavirus vaccine. Some labeled the outbreak a “pharmaceutical scheme,” while others declared it a “Chinese bioweapon,” with posts even referring to it as “Covid 26.” The hashtag #HantaVirusHoax rapidly filled with new content, often from the same accounts that had fueled similar disinformation campaigns during the COVID-19 pandemic, mpox outbreaks, and even avian flu scares. The irony is that these claims often contradict each other, yet this blatant inconsistency does nothing to slow their spread.
Many people still don’t grasp how modern health misinformation actually works. It’s no longer a random collection of isolated rumors; it functions more like a sophisticated infrastructure. Picture a standing army of influencers, conspiracy theorists, partisan figures, and outrage-driven pages, all poised to pounce. They present themselves as authorities and rapidly attach to any new outbreak or health scare. Their modus operandi is depressingly predictable: spread misinformation, watch it go viral, then monetize the “cures” or “solutions” peddled in their bios. The individual claims become almost secondary to this cycle. It’s a well-rehearsed script: a new disease emerges, the official explanation is immediately distrusted, a cover-up is assumed, ivermectin is invariably suggested, a hidden profit motive is identified, and the entire cycle repeats. During the early days of COVID-19, this process could take weeks to gain momentum; now it unfolds within hours. I’ve even seen scientists on social media making grim jokes about the imminent surge of conspiracy theories before any significant misinformation posts have appeared. We all know the script by heart. One particularly bizarre example circulating in my messages wasn’t even current: people were reposting a social media prediction from 2022 that stated, “Corona ended, 2026: Hantavirus.” This was then presented as “proof” that the current outbreak had been meticulously planned years in advance. But conspiracy accounts make countless predictions, and most are forgotten; if one aligns even vaguely with reality years later, it’s resurrected as irrefutable evidence.
My deepest concern extends beyond the immediate impact of these misinformation cycles. I worry that people fundamentally underestimate the psychological toll these repeated assaults are taking on the public. Most public health experts, understandably, focus on the quantifiable aspects of an outbreak: its size, its containment, its spread. But every outbreak now also serves as a potent opportunity for others to profit from viral misinformation. Each cycle subtly conditions more and more people to approach infectious disease stories with a set of preloaded assumptions: the real cure is being withheld, the government is lying, scientists are inherently corrupt, and vaccines are the true danger. What’s particularly alarming is that these messages are no longer confined to the fringes of the internet. New data from Pew Research, published just this week, reveals that half of all Americans under the age of 50 now get their health and wellness information from influencers and podcasts. Many of the individuals shaping these conversations present themselves as medical experts, despite often possessing little to no relevant training or legitimate expertise. This audience is not only vast but continually expanding.
Consider the implications: when a future outbreak with genuine pandemic potential inevitably emerges—and one surely will—millions upon millions of people will encounter it within an information ecosystem already meticulously primed to distrust official public health guidance before it even arrives. The narratives are now prewritten. The audience has already internalized the cues and suspicions. What truly keeps me up at night isn’t the mere existence of misinformation; that’s a battle we’ve always fought. It’s the alarming realization that we, as a society, have begun to normalize this environment. We are becoming accustomed to a world where truth and evidence are constantly under siege, and where the most dangerous lies gain traction with alarming speed.
As an epidemiologist, researcher, and professor at the University of Illinois Chicago School of Public Health, I dedicate my platform to dispelling misinformation and educating the public about disease outbreaks. My social media persona, “Dr. Kat, Epidemiologist,” serves as a direct line to those seeking accurate and evidence-based information. But the fight is not mine alone. It requires a collective awakening to the dangers of this new information landscape, a recognition that the steady erosion of trust in science and public health is a threat as potent as any virus. We must find new ways to fortify our collective defenses against this insidious infrastructure of lies, because the health of our communities, and indeed the world, depends on it.