Disinformation is not a new problem. There have always been those who used whatever media was available to spread malicious disinformation for the purposes of making money, scoring political points or stirring division.
That is not to say that it is not a serious problem, just to put some perspective on it. Every new media technology brings a new wave of disinformation, a panic about its social impact and measures to control it. When printing came to Europe in the 15th century, the state and the church were terrified of losing control over the flow of information and of how this new technology could be used to spread both information and disinformation (or what they called disinformation, such as criticism of monarchies and churches). Censorship was strict and ruthless.
Similarly, the invention of radio brought fears about how it could be used by demagogues and others, leading to early bans on news on this medium. The panic was particularly intense when television arrived in the 1950s: it was often said that it would make our children stupid, ignorant and lazy, and expose them to material and ideas that would warp their minds. Tight licensing conditions followed and in South Africa, the arrival of television was put off for two decades for fear of what it would do.
The internet brought even greater fear because of its lack of gatekeepers and its speed, allowing information, disinformation, hate speech, pornography and violence to spread much more quickly and widely.
Each of these waves of fear had a basis in reality. In time, though, with each new technology we adjusted, learnt how to use and manage it – and saw that the positive benefits outweighed the dangers of it being abused for anti-social purposes.
It is worth bearing this in mind when we consider heavy-handed restrictions on disinformation that will impact on free speech, open discussion and the sharing of ideas. Clearly, social media disinformation is a problem and a threat to democracy, health and social order – as has been strongly demonstrated by the devastating effects of vaccine disinformation. But we need to be careful of being panicked into measures that themselves undermine the democratic, equalizing and educational potential of this powerful medium, and which spread distrust and fear.
Much of the debate about dealing with internet disinformation is over whether responsibility should rest with politicians, who cannot always be trusted to act in the public interest, or with the privately owned platforms, which have so far failed miserably to rise to the task.
A report released this week points in another direction, putting the consumer, the ordinary citizen, at the centre of dealing with disinformation. Prof Herman Wasserman of the University of Cape Town and Dr Dani Madrid-Morales of the University of Sheffield examined media literacy and fact-checking training in South African schools and universities. The assumption is that if citizens are more media literate, they are better equipped to critically assess the media they consume and to differentiate between information and disinformation.
Media literacy is what they call “an ecology of skills” such as the ability to read media critically, to understand how it is produced, and the relationship between media and audiences. They point to the need to add to these skills misinformation literacy – the critical and technical capacity to counter the spread of misinformation online. If this were taught systematically in schools, universities and workplaces, then consumers would be better able to make their own decisions on what is trustworthy, believable and safe to share.
Basic Education Minister Angie Motshekga, speaking at a Unesco event last year, said: “We must consider new media and information literacy programmes to help people understand the consequences of creating and sharing false and misleading content.”
But then the Minister makes many promises (some of them might even qualify as disinformation). Wasserman’s report finds “no nationwide, structured and uniform teaching of media literacy” in our high schools, though there is some ad hoc instruction in schools that have the capacity. The Western Cape, working with Google, introduced an online safety module, but its implementation was disrupted by Covid.
At universities it is taught, but mostly as part of larger modules, and with widely differing views of what the problem is and how to handle it.
It doesn’t have to be complicated. Africa Check, the fact-checking organization that commissioned the report (and where – full disclosure – I chair the board), offers tips on how to verify online information quickly and easily. (www.africacheck.org)
We are always encouraged to be careful about the food we put into our bodies. Isn’t it time we were just as careful about the information we put into our minds?
*Harber is director of the Campaign for Free Expression and Caxton Professor of Journalism at Wits.