It’s like a silent epidemic, but instead of viruses, it’s misleading information that’s making us sick. Experts at the London School of Hygiene & Tropical Medicine (LSHTM) are waving a big red flag about the flood of bad health advice circulating online, especially on crucial topics like vaccines, reproductive health, disease outbreaks, and the health implications of climate change. What’s truly alarming is how many people now turn to new, often unverified sources for this advice – dazzling influencers on social media, trendy ‘wellness’ apps that promise cures but deliver little, and AI chatbots that answer questions confidently but sometimes simply make things up. This isn’t just about making a bad choice; it’s about making choices that can put our own lives, and the lives of those we love, at risk.
Imagine a parent, worried about their child, stumbling upon misinformation online that persuades them to skip childhood vaccinations – vaccinations that have kept devastating diseases at bay for generations. Or picture someone with a serious illness, swayed by an unproven ‘miracle cure’ advertised by an influencer, forgoing legitimate medical treatment. And then there are AI chatbots, those seemingly helpful digital assistants, which sometimes “hallucinate” – a fancy word for making up – clinical details, presenting them as fact. The LSHTM is sounding the alarm, stating unequivocally that this goes beyond individual well-being: rampant misinformation is paving the way for a less healthy, less productive United Kingdom as a whole, undermining decades of scientific and medical progress.
Professor Liam Smeeth, the Director of LSHTM, painted a vivid picture of how things have shifted. Not so long ago, he explained, if you had a health concern you’d call your GP – your family doctor – or consult the trusted NHS website. These were the reliable pillars of health information. Now, he notes, a growing number of people are bypassing these established, trustworthy sources for the siren song of social media and AI. And that’s where the danger lies. On these newer platforms, individuals and organizations with their own agendas – often commercial or political rather than genuinely concerned with public health – dispense advice without medical training, without oversight, and, perhaps most critically, without any accountability if that advice leads to harm. It’s a Wild West of health advice, and there’s no sheriff to be found.
Professor Smeeth further stressed that the impact of this misinformation extends far beyond personal choices. Consider the decision to vaccinate: it’s not just about one person; it protects an entire community, including unborn babies still developing in the womb. Or think about climate change – misinformation about its impacts can lead to policy decisions that ignore the growing risks of extreme heat and the spread of mosquito-borne diseases like dengue and chikungunya. He warned that if we don’t actively combat inaccurate and misleading health information, we risk “turning the clocks back” and undoing the immense progress made over decades, particularly the life-saving benefits of routine vaccinations. But there’s a glimmer of hope: with global bodies like the World Economic Forum and the UN recognizing disinformation and misinformation as a top global risk, Professor Smeeth feels that organizations are finally “waking up to the threat.” That’s why LSHTM has issued a call to action: it wants to unite universities, health organizations, charities, and funders in a new network to fight dangerous health misinformation in the UK – before it’s truly too late.
Professor Adam Kucharski, Co-Director of the Centre for Epidemic Preparedness and Response (CEPR), echoed these concerns, emphasizing the speed at which harmful ideas can spread in today’s interconnected world. “Today ideas spread faster than pathogens,” he stated starkly, arguing that harmful information can no longer be treated as a mere secondary threat, especially when it undermines the vital work of frontline healthcare workers battling disease outbreaks. He also raised an often overlooked point: it’s not just outright falsehoods that are the problem. There is also a category of “technically-true-but-potentially-misleading content” which, because of its subtle nature, can have an even more widespread and damaging impact on social media. It’s like a half-truth – more dangerous than an outright lie because it’s harder to spot.
Professor Kucharski believes we need to learn from past experiences, particularly the COVID-19 pandemic and the current situation in the US. He stressed the importance of acknowledging and communicating uncertainties: in a world where information is constantly evolving, experts shouldn’t fear admitting what they don’t yet fully know. Failure to do so risks creating a “lost generation” who won’t know whom to trust for reliable health advice. He argues that it’s no longer enough for health experts and institutions to simply say, ‘trust us.’ Instead, they have to actively rebuild that trust by providing accessible, high-quality health information that directly addresses people’s concerns and answers their pressing questions.

Looking ahead, to prepare for the next “infodemic,” Britain could learn from countries like Finland, which has a pioneering national media literacy program teaching both children and adults how to identify and resist misinformation – a vital skill in our information-saturated age. In July 2025, Parliament’s Communications and Digital Committee also recommended embedding media literacy into the national curriculum, a move that LSHTM experts believe would be crucial, though it raises the challenge of how best to support teachers in navigating these complex and sensitive topics in the classroom.

Currently, there is no unified effort to track the dangerous health misinformation spreading across the UK. One of the main goals of LSHTM’s proposed network would be to actively monitor these “health misinformation attacks” on key topics and then collaborate on effective social media and online interventions to counter them, creating a united front against the spread of harmful advice.