The UK’s senior scientific academy, The Royal Society, set the cat amongst the media pigeons this week with the release of their report on online misinformation, in which they say the risks of removing misinformation outweigh the benefits.
I attended the launch of the report where internet pioneer Vint Cerf, often referred to as one of “the fathers of the internet”, illustrated the thinking of the scientists. Using the analogy of double-entry bookkeeping, he explained that the principle behind this accounting system is never to delete an erroneous entry but rather to reverse it on a subsequent line, “so that everyone can see the workings”.
In the same way, The Royal Society suggest that online misinformation should be corrected rather than deleted. This approach, Cerf says, is a much more effective long-term solution to misinformation, as it will “build collective resilience” in the public: by seeing how misinformation works, people can develop the skills to detect it when they encounter it in future.
Calls to remove misinformation have united disparate groups
Calls to deal with the rising volume of online disinformation have so far focused almost entirely on the most effective methods of removing it, so this is an interesting proposition. The Royal Society stress that content inciting violence, racism or child sexual abuse must be removed; however, they recommend that material which is legal but runs counter to the scientific consensus should not be banned.
The report’s authors say that removing misinformation from mainstream platforms, and potentially driving users towards more underground channels, makes it harder for scientists to engage with people such as anti-vaxxers. A more sustainable and nuanced approach is needed, one which presents the facts and evidence in response to misinformation.
Fact-checking is not enough to eradicate online misinformation
However, the authors of the report acknowledge that fact-checking is not enough in an online environment which currently rewards controversial and polarising content.
What were the other recommendations of the report?
Recognising that a multi-disciplinary approach, involving social scientists, psychologists, lawyers and economists, amongst others, is needed to combat misinformation, the report contained ten other recommendations:
- The UK Government must combat online misinformation which risks societal harm as well as personalised harm, especially when it comes to a healthy environment for scientific communication.
- To help address complex scientific misinformation, content fact checkers could highlight areas of growing scepticism or dispute for deeper consideration by organisations with strong records in carrying out evidence reviews.
- Ofcom must consider interventions for countering misinformation beyond high-risk, high-reach social media platforms.
- Online platforms and scientific authorities should consider designing interventions for countering misinformation on private messaging platforms.
- Social media platforms should establish ways to allow independent researchers access to data in a privacy compliant and secure manner.
- Collaboration is needed to develop examples of best practice for countering misinformation as well as datasets, tools, software libraries, and standardised benchmarks.
- Governments and online platforms should implement policies that support healthy and sustainable media plurality.
- The UK Government should invest in lifelong, nationwide, information literacy initiatives.
- Academic journals and institutions should continue to work together to enable open access publishing of academic research.
- The frameworks governing electronic legal deposit should be reviewed and reformed to allow better access to archived digital content.
The vast majority of adults hold views close to the scientific mainstream
One of the most encouraging findings of the report was the YouGov poll released alongside it, which found that, despite the vast proliferation of online misinformation, the majority of UK adults hold views close to the scientific mainstream. For example, of the 2,000 adults surveyed, only 7% believed that the BioNTech/Pfizer vaccine is not safe, only 11% believed that the Oxford-AstraZeneca vaccine is not safe, and 90% said human activity is changing the climate.
In the same month, Sweden launched an agency for psychological defence against misinformation, which will not only counter online falsehoods but also teach the Swedish public how to verify information. It’s a very welcome start to the new year to see so many initiatives devoted to stemming the tide of misinformation that currently floods the internet.
To educate yourself on fake news (aka misinformation), filter bubbles and other problems of the digital world, pick up a copy of my new book (or buy one for someone who needs it).