Early in the COVID-19 pandemic—well before much of the public understood the extent of the crisis—large numbers of mainstream Facebook users had already become entangled with online communities opposed to best-science guidance, undercutting public confidence in expert advice on everything from masks to vaccines.
That’s the finding of a new study led by Neil Johnson, a physics professor at GW’s Columbian College of Arts and Sciences, that mapped global Facebook conversations starting in December 2019. The researchers also found almost identical online behavior in the case of monkeypox.
“This is a real problem that extends far beyond the COVID-19 pandemic,” Johnson said. “If left unaddressed, we risk losing the battle over hearts and minds when it comes to other crises such as monkeypox, abortion misinformation, climate change—and even trust in upcoming elections.”
The study, titled “Losing the battle over best-science guidance early in a crisis: COVID-19 and beyond” and published in the journal Science Advances, found that while public health authorities were still trying to decipher the novel coronavirus, and social media platforms were only beginning to promote official health-information banners, many Facebook users were already looking elsewhere for information about how to cope.
As early as January 2020, Facebook parenting communities became intertwined with smaller communities whose members were passionate about providing health information but who resisted or opposed expert health and scientific guidance. By mid-February, these parenting communities began sharing their own COVID-19 guidance with similar communities.
In addition, the researchers found that while official health, medical and science communities were engaging online throughout this time, they were mostly talking and listening to one another.
Johnson and his team mapped online conversations in which parenting communities on Facebook clearly commingled with groups promoting everything from distrust of vaccines and alternative health to more conspiracy-type content around climate change, 5G, fluoride, chemtrails and genetically modified foods. The team’s map also revealed how expert messaging and conversations sharing best-science guidance took place far from these communities, leaving them to rely on groups with more extreme views for information. Facebook’s targeted health messages also missed the mainstream communities, the research showed.
“This was a huge missed opportunity for effective public health messaging and intervention early in the crisis,” Johnson said. “Maps like the ones we’ve created could help public health experts and social media platforms tailor their best-science COVID-19 guidance around, for example, popular topics within the parenting communities and then introduce that guidance across the Internet globally and at scale.”
Johnson suggested social media platforms and experts avoid targeting their efforts toward more extreme groups and instead focus on mainstream groups, where public health messaging will have more impact.
The study introduces a mathematical model that allows quantitative analysis of future risk and what-if scenarios. For example, it shows that simply removing the more extreme groups would not solve the misinformation problem. Instead, doing so would generate a vacuum into which non-rigorous ideas from alternative health and social movements would flow.
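That “vacuum” effect can be illustrated with a toy sketch. This is not the study’s actual model; the network, node names, and the rewiring rule below are all hypothetical, chosen only to show how deleting one set of communities can push their former neighbors toward another set:

```python
# Toy illustration (NOT the published model): when "extreme" nodes are removed
# from a small interaction network, mainstream nodes that linked to them
# rewire toward "alternative" nodes -- the vacuum being filled.
# All node names and the rewiring rule are invented for this sketch.

edges = {
    ("parent1", "extreme1"), ("parent2", "extreme1"),
    ("parent3", "extreme2"), ("extreme1", "extreme2"),
    ("alt1", "extreme1"), ("alt2", "extreme2"),
    ("parent1", "parent2"), ("parent2", "parent3"),
}
community = {
    "parent1": "mainstream", "parent2": "mainstream", "parent3": "mainstream",
    "extreme1": "extreme", "extreme2": "extreme",
    "alt1": "alternative", "alt2": "alternative",
}

def remove_and_rewire(edges, community, banned="extreme", filler="alternative"):
    """Delete all banned nodes, then link each mainstream node that lost a
    neighbor to every filler node (a crude stand-in for the vacuum effect)."""
    orphaned = {a for a, b in edges
                if community[b] == banned and community[a] == "mainstream"}
    orphaned |= {b for a, b in edges
                 if community[a] == banned and community[b] == "mainstream"}
    kept = {(a, b) for a, b in edges
            if community[a] != banned and community[b] != banned}
    fillers = [n for n, c in community.items() if c == filler]
    return kept | {(p, f) for p in orphaned for f in fillers}

after = remove_and_rewire(edges, community)
cross = {(a, b) for a, b in after
         if {community[a], community[b]} == {"mainstream", "alternative"}}
print(len(cross))  # -> 6: each of the 3 orphaned parents now links to both alt nodes
```

Before removal there were no mainstream-alternative links at all; after it, every orphaned parenting node connects to the alternative-health cluster, which is the qualitative pattern the study warns about.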
Johnson’s research team included CCAS PhD student Lucia Illari and Nicholas J. Restrepo of ClustrX LLC. The Air Force Office for Scientific Research funded the GW portion of the research.