Infodemic – an Epistemological Problem
Few places in the world have been spared the novel coronavirus, and fewer still the onslaught of misleading information surrounding the disease. This little-understood virus, which has forced governments to shut down normal life to keep its spread at manageable levels, has inevitably led people to seek answers.
Enter conspiracy theories and alternative media stories, which have gained in popularity and helped to fuel protests against government restrictions and the flouting of public health guidance, like mask-wearing, intended to protect the population. Conspiracy theories have appeal in times of uncertainty: they give believers a clear black-and-white narrative and, importantly, someone to blame.
The problem is that fake news can lead people to behave in inappropriate and even dangerous ways. It’s also a problem that, as experience has shown, can’t be solved by policing without raising concerns about freedom of expression or censorship.
Ironically, the relatively vibrant news media landscape makes the spread of misinformation (false information shared without intent to deceive, such as rumour) and disinformation (intentionally false information) more likely.
The landscape is also changing quickly. Information manipulators are always one step ahead of information regulators, and online platforms themselves are a moving target for regulation, as users regularly change their consumption habits. Although Facebook remains the world’s most popular social networking site, the most widely used messaging service is now WhatsApp, whose closed discussions enable ‘problematic content’ to escape wider scrutiny and spread quickly through its so-called ‘echo chambers’.
It’s long been clear that the problem of ‘fake news’ is practically unsolvable. This is because our information behaviour has changed. We’re subject to what the World Health Organisation has dubbed an ‘infodemic’: an overabundance of information that makes it hard, if not impossible, for people to find trustworthy sources.
Mis/disinformation has also been a problem for the mainstream media, as they struggle with shifts to digital news consumption and plummeting advertising revenues. Against these trends, the so-called ‘legacy media’ of print and television journalism has launched a propaganda campaign aimed at separating its outlets from the bad apples.
Despite this campaign, however, trust in legacy media remains just above the global average for all media, according to the Reuters Institute.
Yet social media platforms especially, despite being vectors of misleading content, have been rather complacent in response to it. One UK study has found that such platforms fail to remove 95% of mis/disinformation reported to them.
The unprecedented COVID crisis has forced the hand of social media companies to some extent. Some have pledged to ban false information, and have removed or placed warnings on other misleading posts, Donald Trump’s included.
But much more mis/disinformation continues to circulate, threatening our democratic institutions and public health alike.
So, what’s to be done?
We can’t and, perhaps, shouldn’t rely on commercial companies or the state to moderate our content. Moderation can quickly become manipulation. As the overabundance of information only grows greater, we need to assume more responsibility for our own consumption. We need to become more critical in our thinking, cultivating our own ability to check alleged facts and to analyse and evaluate arguments.
We must all become philosophers, in other words, and sceptics to boot: assuming disbelief as our default mode, trusting nobody, and challenging anyone who lays claim to our belief to argue us out of our scepticism.
We must simply learn to be much less gullible.