It’s truly unsettling to witness how easily misinformation can spread, especially when it comes from trusted voices. We’ve all seen, or perhaps even been swayed by, some of the wild claims circulating about COVID-19, its vaccines, and treatments. Imagine hearing a doctor, someone you’re taught to trust implicitly with your health, declare that the pandemic was a “planned operation” or that most vaccinated people would be gone by 2025. That’s precisely what Dr. Rashid Buttar, for example, shared on Twitter. It’s a statement so jarring that it immediately raises alarms, but it’s just one example in a distressing chorus of similar, unfounded claims.
Think about Dr. Sherri Jane Tenpenny, who suggested to Ohio lawmakers that the vaccine could somehow make people magnetic – a claim so outlandish it went viral, leaving many scratching their heads or, more concerningly, believing her. Or Dr. Joseph Mercola, who, in early 2020, even suggested inhaling hydrogen peroxide as a COVID-19 cure. These aren’t just random individuals; these are doctors. Their medical degrees, something we typically associate with rigorous training and evidence-based practice, lend an air of legitimacy to their pronouncements, making them all the more dangerous. It’s almost like a magician performing a trick – you know it’s not real, but the way it’s presented makes you doubt your own senses. These doctors, unfortunately, are part of a group dubbed the “Disinformation Dozen,” identified by the nonprofit Center for Countering Digital Hate as major contributors to vaccine misinformation online. They’re a stark reminder of how easily authority can be misused, especially when it comes to something as vital as public health.
The core of the problem, and what makes these doctors so uniquely problematic, is that their medical credentials act as a sort of shield, giving their unproven or even harmful statements an undeserved weight. When a doctor speaks, people listen, often without questioning. Most of them still hold their medical licenses, despite these highly questionable public statements. This reality has sparked a growing outcry from legitimate medical organizations, who are now urging medical oversight boards to step up and take a more assertive stance. It’s a clear call for accountability, a recognition that the traditional slow pace of disciplinary action simply isn’t adequate when public health is on the line. The Federation of State Medical Boards, for instance, has explicitly stated that doctors spreading COVID-19 misinformation could face serious consequences, including losing their licenses. This is a big deal, signaling a shift from a more lenient approach to a stronger stance against medical professionals who actively undermine public trust and health.
It’s not just the “Disinformation Dozen” either; the problem runs deeper than a small group. We’ve seen other doctors making similarly misleading claims. Take Dr. Dan Stock, who, at an Indiana school board meeting, attributed a summer surge in COVID-19 cases to vaccinated individuals, a claim PolitiFact quickly rated “Pants on Fire” false. Or Dr. Stella Immanuel, who famously claimed in a widely circulated video that masks weren’t necessary because hydroxychloroquine could cure COVID-19. Her website still promotes a range of unproven remedies. These examples highlight a disturbing trend: medical professionals, often with little to no expertise in infectious diseases, using their platforms to promote fringe ideas and unproven treatments. It’s a confusing landscape for the average person, trying to decipher truth from fiction, especially when a doctor’s trusted voice is advocating for something so far outside mainstream medical consensus.
The impact of this misinformation is profound, reaching far beyond individual health choices. It erodes trust in medical institutions, divides communities, and leaves a lasting emotional toll. When experts are sounding the alarm that conspiracy theories are a major reason why over half of unvaccinated Americans are hesitant, it becomes clear that these misleading narratives have a powerful and dangerous “sticking power.” It’s an uphill battle for public health officials when a doctor, someone traditionally held in high esteem, is promoting ivermectin, an anti-parasitic drug with no proven benefit against COVID-19, as a cure. People inherently trust doctors; it’s almost ingrained in our societal fabric. As Rachel Moran, a University of Washington researcher who studies COVID-19 misinformation, aptly puts it, there’s an assumption that doctors have “insider info that we don’t,” especially during confusing times. This belief, while understandable, makes their misguided messages all the more potent and damaging.
While many might assume state medical boards would swiftly address such issues, the reality is often much slower and more complex. These boards, traditionally responsible for licensing and investigating complaints, are often overwhelmed with more egregious cases like malpractice or criminal activity. Misinformation, while incredibly harmful, sometimes falls lower on the priority list. Arthur Caplan, a prominent medical ethicist, points out the “long, slow process” involved in revoking a license. However, there are signs of change. Some doctors have faced consequences, like Dr. Steven LaTulippe in Oregon, whose license was suspended after he refused to wear a mask in his clinic and spread false information about masks’ effectiveness. But these cases are still relatively rare, and confidentiality laws in many states make it difficult to even track the true extent of complaints and investigations.

Social media companies, too, bear some responsibility. While some accounts spreading misinformation have been removed, inconsistent enforcement of the platforms’ own rules means that many still actively promote these harmful narratives, allowing them to continue influencing public opinion and, ultimately, public health outcomes. It’s a complex web of challenges, and finding a workable path to accountability is a critical public health imperative.