Part four of Five Choices, a Philosophical Reflection on Scientific Knowledge
How should we deal with misinformation? We have a choice. One approach is to accept disagreement, to let people share their perspectives even if they are wrong, and thus to tolerate arguments as an essential part of democracy and free speech. But if we do this, we run risks. People may be misled into doing things that put them at grave risk, or even that put everyone in danger. People may be incited into extreme acts that undermine democratic institutions. People might even be lured into hating their neighbours for their differences. Can we bear to undertake such risks?
We have a clear alternative: censorship. We can say that whenever the consequences are sufficiently severe, we are obligated to draw a line in the sand against disinformation and prevent it from being disseminated. We could form a media power bloc - say, a Trusted News Initiative - and get all the tech companies controlling social media, and all the major players in the journalistic media to agree to prevent the dissemination of misinformation. In short, we can unite the most powerful forces in communications technology to enforce censorship in order to prevent misinformation from being spread.
But this too carries risks. The scientific process is built upon disagreement. Despite the simplistic orthodoxy of 'hypothesis, experiment, theory', the production of scientific knowledge is not a sausage machine whose handle you simply crank to reach conclusions. On the contrary, a fairer caricature of the process would be 'competing hypotheses, triangulation of evidence, validation of theories' - and at all three stages, disagreement is essential to success. In the absence of disagreement, we are in danger of drawing premature conclusions from incomplete evidence, and thus treating provisional hypotheses as robust theories without the painstaking work required to assemble an accurate picture.
Applying censorship to active scientific research topics is not a way of defending scientific knowledge; it is a way of preventing its production entirely. You simply cannot stop the spread of misinformation without knowing what the true state of affairs is, and you cannot know this without permitting the disagreements that allow the sciences to conduct effective research programmes. And given that every theory in the sciences is provisional until all objections are eventually resolved (a process that typically takes decades), there is never a viable point at which censorship could plausibly be in the service of scientific truth.
If you resort to censorship, you make it impossible to know the true state of affairs. As a result, people may be misled into doing things that put them at grave risk, or even that put everyone in danger, such as prolonging widespread panic. People may be incited into extreme acts that undermine democratic institutions, such as reneging on civil rights agreements. People may be lured into hating their neighbours for their differences, such as whether or not they have taken a vaccine. Can we bear to undertake such risks?
Next week, the final part: The Experts vs the People
The opening image is a detail from an encaustic artwork of unknown provenance. As ever, no copyright infringement is intended and I will take the image down if asked by the rightful owner of the artwork.